
How to run this on Windows? #1

Open
2blackbar opened this issue Aug 30, 2023 · 8 comments

Comments

@2blackbar

Is there a way ?

@secretenergy

No. This is still in development; the creator is working on a way for you to run it on Hugging Face, or possibly locally. But even then you would need to get your Python environment set up.

@jjhaggar

jjhaggar commented Sep 3, 2023

I hope the creator makes it possible to run it locally in the future :) This is incredibly fun! :D

@jbilcke-hf
Owner

jbilcke-hf commented Sep 4, 2023

Hello, indeed. To summarize the current situation:

  • the AI Comic Factory isn't a single all-in-one Gradio app (unlike most other HF Spaces): it's a production web app that relies on multiple services to run
  • that's why it is a bit difficult (but not impossible) for people to clone it
  • same for the secret variables: we can't just give away the "keys to the cloud" (behind them are expensive servers for Llama 70B, a dozen SDXL servers, etc., which cost money)
  • I fully get that people want to be able to duplicate it (that's natural; I myself would like to try it with OpenAI, Replicate, etc. for various experiments)

So what I propose:

@jbilcke-hf
Owner

Hi,

I've started making modifications to the code to support alternative backend engines:

  • LLM: either HF Inference API or HF Inference Endpoint
  • Rendering: either my custom DIY server or Replicate

I plan to add more options in the future (such as a locally running llama.cpp implementation, and a locally running SDXL).
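As a rough sketch, choosing between these backends could come down to a couple of environment variables. This is illustrative only: `LLM_ENGINE` is an assumed variable name, while `RENDERING_ENGINE` and `AUTH_HF_API_TOKEN` appear later in this thread; check the project's actual `.env` for the real names and values.

```shell
# Hypothetical .env fragment -- LLM_ENGINE is an assumed name;
# RENDERING_ENGINE and AUTH_HF_API_TOKEN appear later in this thread.
LLM_ENGINE="INFERENCE_API"       # assumed: HF Inference API vs. Inference Endpoint
RENDERING_ENGINE="REPLICATE"     # or the custom DIY rendering server
AUTH_HF_API_TOKEN="<YOUR TOKEN>"
```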

@alarmgoose

Will this ever be able to run with SD1.5?

@jbilcke-hf
Owner

Hello everyone, I have updated the README.md and .env files to add more options:

https://twitter.com/flngr/status/1706340053041975480

Will this ever be able to run with SD1.5?

Well, in theory it should be possible if you set:

AUTH_HF_API_TOKEN="<YOUR TOKEN>"
RENDERING_ENGINE="INFERENCE_API"
RENDERING_HF_RENDERING_INFERENCE_API_MODEL="runwayml/stable-diffusion-v1-5"

However, in practice my code is designed to generate panels at sizes such as 1024x1024, 1024x768, 512x1024, etc.,
so I'm afraid you will have to change the code to account for those image size differences.

On the other hand, know that today PRO Hugging Face users can also use:

RENDERING_HF_RENDERING_INFERENCE_API_MODEL="stabilityai/stable-diffusion-xl-base-1.0"
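Since the app targets SDXL-style panel sizes (1024x1024, 1024x768, 512x1024, ...), adapting it to SD 1.5 would mostly mean scaling those sizes down toward the 512px range SD 1.5 was trained on, keeping each side a multiple of 64. A minimal sketch of such a conversion (the helper name `sd15_panel_size` is hypothetical, not part of the app's code):

```python
def sd15_panel_size(width: int, height: int, base: int = 512) -> tuple:
    """Map an SDXL panel size down to an SD 1.5-friendly size.

    Hypothetical helper: scales so the longer side equals `base`
    (SD 1.5's training resolution), then snaps each side to a
    multiple of 64, which diffusion backends generally require.
    """
    scale = base / max(width, height)

    def snap(x: float) -> int:
        # Round to the nearest multiple of 64, never below 64.
        return max(64, round(x / 64) * 64)

    return snap(width * scale), snap(height * scale)
```

For example, the 1024x768 landscape panels mentioned above would come out as 512x384, and the 512x1024 portrait panels as 256x512.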

@alarmgoose

@jbilcke-hf

Thank you for the reply. I have another question though! Will LoRA ever be usable? I have many character LoRAs that I would love to use. And thank you for creating this project, at no cost as well. This looks like something one could charge for, and IMO it is worth a subscription, especially if it becomes more versatile. I've always wanted to make my own comic but I lack the drawing skills; then AI came. But even then, creating comic panels was very hard with 1.5...

Then you created this. =)

jbilcke-hf pushed a commit that referenced this issue Oct 16, 2023
@jbilcke-hf
Owner

Sorry for the late reply @alarmgoose, it is now possible to use a LoRA (no need to fork the app; this can even be done directly from the settings panel via the "improve quality" button).


5 participants