Documenting macOS setup with tauri and ComfyUI #104

Open
metamas opened this issue Jul 30, 2023 · 14 comments

metamas commented Jul 30, 2023

Describe the bug
When my tauri app window is launched, the "Welcome to StableStudio" screen seems to be stuck on the "Downloading ComfyUI..." loader for way too long. I inspected the downloader portion of the App file and then followed the download link, only to find that the .zip being downloaded is 2.6 GB.

await download(
  "https://pub-5e5adf378ed14628a527d735b7743e4e.r2.dev/stability-downloads/ComfyUI/ComfyUI_windows_portable.zip",
  `${comfyui_location}\\comfyui.zip`,
  (p, total) => {
    comulativeProgress += p;
    setProgress(comulativeProgress / total);
  }
);

Why so large? Are checkpoints being included in that .zip file?

This pain is particularly interesting to me right now because I'm currently on quite a slow connection.

Steps to reproduce
I followed the setup instructions for the tauri branch. Then, whether I launch via cargo tauri dev or with packages/stablestudio-ui/src-tauri/target/release/StableStudio, the loading screen (mentioned above) moves at a snail's pace and always seems to get stuck at about 30% on the progress bar.

I still have not waited for it to complete before losing patience and discontinuing the process. I tried multiple times, waiting ~1 hr before realizing the progress bar didn't seem to be making any more progress.

Expected behavior
Even with my current slow connection, I expect a faster download. But I'm also concerned by the size of the .zip being downloaded. While I was waiting on StableStudio's "Downloading ComfyUI..." step, I was able to download and set up a standalone ComfyUI installation, so I'm confused about why it's taking StableStudio so much time/space to do the same. It seems like something inefficient and/or unnecessary may be happening. 🤔

Further notes
I see the detect/download flow for ComfyUI, so I'm currently just looking into whether there's a not-too-hacky way for me to set the comfyui_location setting to the directory where I already have a working copy of ComfyUI.

const comfyui_location = await invoke("get_setting", {
  key: "comfyui_location",
})

If I can trace comfyui_location to some nicely abstracted config file/object, that will be great. Otherwise, I'll just edit references to comfyui_location with the path to my standalone copy of ComfyUI. If that works well, it might be worth providing a clear way to set up a dev environment that uses an already installed ComfyUI, rather than downloading whatever is in ComfyUI_windows_portable.zip.
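
For context, what I have in mind is just a cheap existence check before the download kicks in. A rough sketch (the needs_comfyui_download helper and the ComfyUI/main.py layout are my assumptions, not what main.rs does today):

use std::path::Path;

// Hypothetical helper: decide whether the downloader needs to run at all.
// `comfyui_location` is assumed to be the directory the setting points at.
fn needs_comfyui_download(comfyui_location: &str) -> bool {
    // If the directory already contains a ComfyUI checkout (its main.py),
    // skip the 2.6 GB portable download and reuse what is already there.
    let existing = Path::new(comfyui_location).join("ComfyUI").join("main.py");
    !existing.exists()
}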

Also, I noticed an error in the Tauri web view inspector when I looked at its console during one of the instances when the "Downloading ComfyUI..." progress bar seemed to get stuck. Sorry, I didn't copy it down. In any case, I'll look into that after I try my experiment above.

Desktop (please complete the following information):

  • macOS Ventura
  • cargo 1.70
  • tauri-cli 1.4.0
metamas added the bug label Jul 30, 2023

metamas commented Jul 30, 2023

I've downloaded ComfyUI_windows_portable.zip directly in my browser. That download did not get stuck partway through, the way it did within the StableStudio launch process.

metamas commented Jul 30, 2023

After unpacking the .zip, it appears the /python_embedded directory is the bulk of this download, at a whopping 5.5 GB. What is in there and why is it necessary? Its sibling /ComfyUI directory is only 27 MB and appears to just be a complete clone of the ComfyUI repo.

All the .bat, .dll, and .exe files I'm seeing in there make me wonder whether the downloaded files are even compatible with Linux/macOS systems. That raises the question of why it's being downloaded to my system in the first place, and whether there are Linux/macOS-compatible options implemented in the code yet.

KAJdev commented Jul 31, 2023

As noted on the releases page, the only existing portable ComfyUI builds are packaged for Windows, and additional steps are needed on macOS/Linux systems for now. The python_embedded directory is a fully portable install of the CPython interpreter, along with all the dependencies required by the ComfyUI backend (the largest being PyTorch).

Also note that the ComfyUI being installed is slightly modified: it comes with an extra core node and modifications to ComfyUI's frontend for easier compatibility with StableStudio.

KAJdev removed the bug label Jul 31, 2023

metamas commented Jul 31, 2023

My bad about not looking at the releases page. Maybe worth mentioning that in the main README also? 🤷‍♂️

MacOS and Linux are supported, but will need some custom work to get ComfyUI setup manually for now.

That's pretty vague. Has anyone documented what that "custom work" entails? If not, I'm happy to do so.

I've already set up the standalone ComfyUI, so it seems I should be able to put the modified ComfyUI in the right place for StableStudio to find it and be good to go... Or will I run into problems with StableStudio looking for the embedded Python stuff instead of using what's already globally installed? I'll give it a try.

metamas commented Jul 31, 2023

Out of curiosity, is the embedded CPython always necessary when installing StableStudio on Windows? Or is it included just in case a user doesn't already have a Python interpreter on their system? If they do already have a compatible Python interpreter, do they still get stuck with the extra/unnecessary copy taking up space?

I answered my own question. It looks like main.rs always expects to use an interpreter from python_embeded, on any OS.

let (mut rx, _child) = Command::new({
    if cfg!(unix) {
        "python_embeded/python"
    } else if cfg!(windows) {
        "python_embeded/python.exe"
    } else if cfg!(macos) {
        "python_embeded/python.app"
    } else {
        panic!("Unsupported platform")
    }
})

It might be nice to provide some documentation on how to set things up to use a global Python instance, for people who have the tech know-how and would prefer not to have an unnecessary >5 GB taking up space on their system. That's what I'm going with.
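
Roughly what I mean, as a sketch only (the python_executable helper is hypothetical and not something main.rs has today; it assumes a compatible python3 is on the PATH):

use std::path::Path;

// Hypothetical interpreter selection: prefer the bundled python_embeded
// build when it exists, otherwise fall back to the user's global python3.
fn python_executable() -> String {
    let embedded = if cfg!(windows) {
        "python_embeded/python.exe"
    } else {
        "python_embeded/python"
    };
    if Path::new(embedded).exists() {
        embedded.to_string()
    } else {
        // Assumes a compatible interpreter is installed globally.
        "python3".to_string()
    }
}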

metamas commented Jul 31, 2023

Yeah, there's stuff going on in main.rs that does not work on macOS even when all of the dependencies for ComfyUI are installed and the ComfyUI directory from the download is placed in ~/Library/Application Support/com.stabilityai.stablestudio (which is where comfy_location points). For example, python_embeded/python.app does not exist in the downloaded directory, but I assume that was put in there with future intentions.

That's as far as I got yesterday. Rust is still erroring when I have it try to launch ComfyUI with my global Python. It's right at the end of the script, so almost there. I think I may have the directory nesting it expects slightly off. Will figure it out later today.

metamas changed the title from "Why is the ComfyUI download so large and slow?" to "Documenting macOS setup with tauri and ComfyUI" Jul 31, 2023

metamas commented Jul 31, 2023

Also, using port 5000 won't work on macOS (after Monterey) when a user has AirPlay Receiver on, because its default port is 5000. Since something is already listening on that port, main.rs currently just assumes it is ComfyUI and skips launching it.

settings::create_settings(
    store_location,
    Settings {
        comfyui_location: Box::new(comfy_location.to_str().unwrap().to_string()),
        comfyui_url: Box::new("http://localhost:5000".to_string()),
    },
)

// test to make sure its not already running (just test port 5000)
println!("Checking for existing comfy process...");
let client = reqwest::Client::builder()
    .connect_timeout(std::time::Duration::from_secs(1))
    .build()
    .unwrap();
let resp = client.get(url.clone()).send().await;
if resp.is_ok() {
    println!("Comfy already running, skipping launch.");
    return Ok("completed".to_string());
}

KAJdev commented Jul 31, 2023

Thanks for this!

the download is placed in ~/Library/Application Support/com.stabilityai.stablestudio (which is where comfy_location points). For example, python_embeded/python.app does not exist in the downloaded directory, but I assume that was put in there with future intentions.

Yes; since there is no packaged Python build for macOS, that was speculative.

Also, using port 5000 won't work on macOS (after Monterey) when a user has AirPlay Receiver on, because its default port is 5000. Since something is already listening on that port, main.rs currently just assumes it is ComfyUI and skips launching it.

Good catch. It would be good to add some better port-picking logic, similar to how the app service works:

let mut context = tauri::generate_context!();
let url = format!("http://localhost:{}", port).parse().unwrap();
let window_url = WindowUrl::External(url);
// rewrite the config so the IPC is enabled on this URL
if !cfg!(dev) {
    context.config_mut().build.dist_dir = AppUrl::Url(window_url.clone());
}
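
Something as simple as binding to port 0 and letting the OS hand back a free port would probably be enough. A rough sketch (pick_free_port is hypothetical, not existing code):

use std::net::TcpListener;

// Hypothetical helper: let the OS choose an unused port instead of
// hard-coding 5000 (which collides with AirPlay Receiver on macOS).
fn pick_free_port() -> std::io::Result<u16> {
    // Binding to port 0 makes the OS pick a free ephemeral port; we read it
    // back and drop the listener so ComfyUI can bind to it right after.
    let listener = TcpListener::bind(("127.0.0.1", 0))?;
    Ok(listener.local_addr()?.port())
}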

metamas commented Aug 2, 2023

@KAJdev I now have everything set up so that StableStudio launches on macOS, using the custom ComfyUI instance that comes packaged in ComfyUI_windows_portable.zip. There are a number of things I did manually; I need to confirm how main.rs handles them and make adjustments where needed:

  1. Retry the automatic download of ComfyUI_windows_portable.zip and see if the download stalling was just a one-off or something that needs to be debugged for macOS specifically.

  2. Add some better port selection logic (as we discussed above).

    Switching off AirPlay Receiver allows ComfyUI to be launched on port 5000, but it would be better to switch to a port that none of macOS's built-in services will ever be using.

  3. Edit launch_comfy to use the global installation of python3 for macOS.

    What do you think about this as a long-term option? Is it desired that the launcher eventually uses an executable from within python_embeded, for every OS?

    What about an optional flag/config to use the global Python and delete the python_embeded directory altogether?

  4. Add logic that creates a venv and installs the Python requirements within comfyui_location/ComfyUI (see the sketch after the log below).

    This is what I'm currently doing, but it also works by removing the -s flag from the python invocation and using globally installed packages.

  5. Lastly, for some reason macOS doesn't like the use of localhost for the proxy configs in vite.config.ts. Switching all of those to explicitly be http://127.0.0.1 fixes these errors. (Judging by the ECONNREFUSED ::1:5000 errors below, localhost is resolving to the IPv6 ::1 while ComfyUI is only listening on IPv4.)

12:30:45 PM [vite] http proxy error at /style.css:
Error: connect ECONNREFUSED ::1:5000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1494:16)
12:30:45 PM [vite] http proxy error at /lib/litegraph.css:
Error: connect ECONNREFUSED ::1:5000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1494:16)
12:30:45 PM [vite] http proxy error at /lib/litegraph.extensions.js:
Error: connect ECONNREFUSED ::1:5000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1494:16)
12:30:45 PM [vite] http proxy error at /lib/litegraph.core.js:
Error: connect ECONNREFUSED ::1:5000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1494:16)
12:30:45 PM [vite] http proxy error at /scripts/app.js:
Error: connect ECONNREFUSED ::1:5000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1494:16)
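
For item 4, the manual steps I ran boil down to roughly the following. A rough sketch using std::process::Command (the setup_venv helper and the paths are my own naming, not anything currently in main.rs; it assumes python3 is on the PATH):

use std::path::Path;
use std::process::Command;

// Hypothetical sketch of item 4: create a venv inside the ComfyUI checkout
// and install its requirements with pip. `comfyui_dir` is assumed to be
// comfyui_location/ComfyUI.
fn setup_venv(comfyui_dir: &Path) -> std::io::Result<()> {
    let venv = comfyui_dir.join("venv");
    // python3 -m venv <comfyui_dir>/venv
    Command::new("python3").args(["-m", "venv"]).arg(&venv).status()?;
    // <venv>/bin/pip install -r requirements.txt
    Command::new(venv.join("bin").join("pip"))
        .args(["install", "-r", "requirements.txt"])
        .current_dir(comfyui_dir)
        .status()?;
    Ok(())
}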

metamas commented Aug 2, 2023

@KAJdev Also, I had previously done the Apple Silicon specific setup advised by the ComfyUI README. What those steps do is set up the accelerated version of PyTorch for Apple M1/M2 and ensure that comfy is launched with the --force-fp16 flag.

However, AFAIK using that version of PyTorch is not a hard requirement for running ComfyUI on macOS, but more of a best-for-performance setup. Maybe someone who has tried without it can confirm this for me?

So, my question is: do we try to bake those additional Apple-specific setup steps into main.rs, or advise them as an optional best practice for performance?
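
If we did bake them in, it might be as small as a platform-gated launch flag. A rough sketch (the comfy_args helper is my own naming, not anything in main.rs):

// Hypothetical sketch: append --force-fp16 only on Apple Silicon macOS,
// per the ComfyUI README's Apple-specific advice.
fn comfy_args() -> Vec<&'static str> {
    let mut args = vec!["main.py"];
    if cfg!(all(target_os = "macos", target_arch = "aarch64")) {
        args.push("--force-fp16");
    }
    args
}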

KAJdev commented Aug 2, 2023

What do you think about this as a long-term option? Is it desired that the launcher eventually uses an executable from within python_embeded, for every OS?

I would be wary of this. Many people will either not have a correct Python version, not have the correct dependencies, or prefer to keep system-wide site-packages clean. A packaged venv for each OS is the best way forward in the long run.

Lastly, for some reason macOS doesn't like the use of localhost for the proxy configs in vite.config.ts. Switching all of those to explicitly be http://127.0.0.1 fixes these errors.

Interesting. I'm not sure why that would be happening, but that seems like an easy fix.

So, my question is: do we try to bake those additional Apple-specific setup steps into main.rs, or advise them as an optional best practice for performance?

Ideally, I would set up another job to package and serve ComfyUI specifically for macOS, so the installation would be no different from Windows.

airtonix commented Aug 9, 2023

Many people will either not have a correct Python version, not have the correct dependencies, or prefer to keep system-wide site-packages clean.

Then integrate asdf so you get the Python version you want, rather than the system version:

https://gist.github.com/airtonix/1031cd7b4e9745c73b385204f7a9688d
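
For example, the launcher could ask asdf which interpreter to use instead of hard-coding a path. A rough sketch (assumes asdf plus a Python plugin are installed and a version is pinned, e.g. via .tool-versions; asdf_python is hypothetical):

use std::process::Command;

// Hypothetical helper: resolve the asdf-managed Python so the launcher uses
// the pinned version rather than the system interpreter.
fn asdf_python() -> Option<String> {
    let output = Command::new("asdf").args(["which", "python"]).output().ok()?;
    if !output.status.success() {
        return None;
    }
    // `asdf which python` prints the absolute path of the resolved executable.
    Some(String::from_utf8_lossy(&output.stdout).trim().to_string())
}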

davecrab commented Jan 1, 2024

Is this something that's still on the horizon? I tried to build and run the dev version of this today, but unfortunately it couldn't find the ComfyUI portable .zip. I tried to paste the path into a browser and it gave me an access denied error.
