Latent consistency model txt2img #1330

Open · wants to merge 16 commits into master

Conversation

@YToleubay (Contributor) commented Dec 5, 2023

@YToleubay marked this pull request as ready for review December 5, 2023 05:54
@kyakuno (Collaborator) commented Dec 19, 2023

Thanks for your PR. I will test it.

@kyakuno (Collaborator) commented Dec 19, 2023

@YToleubay Can you commit the tokenizer folder? I got the following error.

 INFO model_utils.py (86) : ONNX file and Prototxt file are prepared!
 INFO latent-consistency-models.py (286) : This model not optimized for macOS GPU currently. So we will use BLAS (env_id = 1).
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/utils/_errors.py", line 261, in hf_raise_for_status
    response.raise_for_status()
  File "/usr/local/lib/python3.11/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 404 Client Error: Not Found for url: https://huggingface.co/tokenizer/resolve/main/tokenizer_config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/transformers/utils/hub.py", line 430, in cached_file
    resolved_file = hf_hub_download(
                    ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 1346, in hf_hub_download
    raise head_call_error
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 1232, in hf_hub_download
    metadata = get_hf_file_metadata(
               ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 1608, in get_hf_file_metadata
    hf_raise_for_status(r)
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/utils/_errors.py", line 293, in hf_raise_for_status
    raise RepositoryNotFoundError(message, response) from e
huggingface_hub.utils._errors.RepositoryNotFoundError: 404 Client Error. (Request ID: Root=1-65819799-133b3bdd493584db45154725;a27c983d-c540-4380-9acf-c638adfe9399)

Repository Not Found for url: https://huggingface.co/tokenizer/resolve/main/tokenizer_config.json.
Please make sure you specified the correct `repo_id` and `repo_type`.
If you are trying to access a private or gated repo, make sure you are authenticated.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/kyakuno/Desktop/repos/ailia-models-ax/diffusion/latent-consistency-models/latent-consistency-models.py", line 361, in <module>
    main()  
    ^^^^^^
  File "/Users/kyakuno/Desktop/repos/ailia-models-ax/diffusion/latent-consistency-models/latent-consistency-models.py", line 292, in main
    tokenizer = CLIPTokenizer.from_pretrained(local_clip_tokenizer_path)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/tokenization_utils_base.py", line 1947, in from_pretrained
    resolved_config_file = cached_file(
                           ^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/utils/hub.py", line 451, in cached_file
    raise EnvironmentError(
OSError: tokenizer is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo either by logging in with `huggingface-cli login` or by passing `token=<your_token>`

@YToleubay (Contributor, Author) commented
> @YToleubay Can you commit the tokenizer folder? I got the following error. […]

Sorry about that; I've added the tokenizer folder.
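
For reference, a minimal sketch of the local-first tokenizer load involved here. The variable name `local_clip_tokenizer_path` comes from the traceback above; the folder name `tokenizer` and the fallback Hub repo id are assumptions for illustration, not necessarily what this script does:

```python
import os
from transformers import CLIPTokenizer

# Folder committed alongside the script (hypothetical name, per the error message).
local_clip_tokenizer_path = "tokenizer"

if os.path.isdir(local_clip_tokenizer_path):
    # Loads tokenizer_config.json, vocab.json, merges.txt from the local folder.
    tokenizer = CLIPTokenizer.from_pretrained(local_clip_tokenizer_path)
else:
    # When the folder is missing, from_pretrained treats the string as a Hub
    # repo id, which is what produced the 404 above. Falling back to an
    # explicit repo id (hypothetical choice) avoids that failure mode.
    tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
```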

@kyakuno (Collaborator) commented Dec 20, 2023

Confirmed it works on an M2 Mac.

ailia processing time 71439 ms

@kyakuno (Collaborator) commented Dec 20, 2023

On an RTX 3080, FP32 runs out of memory. FP16 just about works, but the AE ends up placed in shared GPU memory and is slow.
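
For context, a minimal sketch of one common way to produce an FP16 variant of an exported ONNX model, using onnxconverter-common. The file names are hypothetical, and whether the FP16 weights tested here were produced this way is an assumption:

```python
import onnx
from onnxconverter_common import float16

# Load the exported FP32 model (hypothetical file name).
model = onnx.load("unet.onnx")

# Convert initializers and intermediate tensors to FP16.
# keep_io_types=True leaves the model's inputs/outputs in FP32,
# so the calling code does not have to change.
model_fp16 = float16.convert_float_to_float16(model, keep_io_types=True)

onnx.save(model_fp16, "unet.fp16.onnx")
```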

@kyakuno (Collaborator) commented Dec 21, 2023

Profile:

====Profile(Grouped by LayerType)====
LayerType	TotalInferTime(Average)[us]	TimeRatio[%]
MatMul	314752	88.82
ReduceMean	12789	3.61
Eltwise	12653	3.57
Sigmoid	5587	1.58
Transpose	5013	1.41
Softmax	1745	0.49
Reshape	1251	0.35
ConvertValue	325	0.09
Gather	217	0.06
ArgMax	20	0.01
Flatten	15	0.00
====Profile(Summary)====
Predict Average Time[us]:354367	Variance:0	N:1

====Profile(Grouped by LayerType)====
LayerType	TotalInferTime(Average)[us]	TimeRatio[%]
Convolution	46335122	81.34
Eltwise	4386081	7.70
MatMul	3540882	6.22
Sigmoid	1152133	2.02
InstanceNormalization	1118233	1.96
Resize	318260	0.56
Softmax	77261	0.14
Transpose	33280	0.06
Reshape	1956	0.00
ConvertValue	96	0.00
====Profile(Summary)====
Predict Average Time[us]:56963304	Variance:0	N:1

====Profile(Grouped by LayerType)====
LayerType	TotalInferTime(Average)[us]	TimeRatio[%]
MatMul	38037558	39.65
Eltwise	26237699	27.35
Convolution	22003112	22.93
Softmax	7103627	7.40
ReduceMean	1068315	1.11
Slice	425463	0.44
Transpose	310202	0.32
Gelu	260622	0.27
Gemm	145982	0.15
InstanceNormalization	128292	0.13
Concat	96800	0.10
Sigmoid	81682	0.09
Reshape	22177	0.02
Resize	17149	0.02
ConvertValue	3304	0.00
Unsqueeze	1766	0.00
Expand	298	0.00
====Profile(Summary)====
Predict Average Time[us]:95944049	Variance:45524020220588	N:4
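
For reference, a minimal sketch of how per-layer summaries like the ones above can be collected, assuming the set_profile_mode/get_summary calls used by the --profile option elsewhere in ailia-models; the model file names, env_id, and input shape are hypothetical:

```python
import numpy as np
import ailia

# Create the network (hypothetical file names; env_id=1 selects BLAS,
# matching the log earlier in this thread).
net = ailia.Net("model.onnx.prototxt", "model.onnx", env_id=1)

# Enable layer-level profiling before running inference.
net.set_profile_mode(True)

# Run one inference with a dummy input (hypothetical shape).
dummy = np.zeros((1, 4, 64, 64), dtype=np.float32)
net.predict(dummy)

# Prints the "====Profile(...)====" summary shown above.
print(net.get_summary())
```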
