Issues: huggingface/optimum
- #555 · Community contribution - optimum.exporters.onnx support fo... · opened Dec 7, 2022 by michaelbenayoun · Open, 37 comments
- #488 · Community contribution - BetterTransformer integration for ... · opened Nov 18, 2022 by younesbelkada · Open, 24 comments
- #568 · [Quick poll] Give your opinion on the future of the Hugging F... · opened Dec 9, 2022 by LysandreJik · Open
Issues list
- #1853 · gemma-2b static quantized, generated text makes no sense · bug (Something isn't working) · opened May 10, 2024 by CHNtentes
- #1852 · no attribute '_TASKS_TO_AUTOMODELS' error · bug · opened May 9, 2024 by klutzDrawers
- #1850 · The Whisper large-v3 model exported to ONNX does not return the end timestamp for the last chunk · bug · opened May 8, 2024 by IlyaPikin
- #1845 · [Optimum ONNX export] Bug when exporting the TrOCR model with dtype BF16 · bug · opened May 6, 2024 by vu0607
- #1844 · Can't get ORTStableDiffusionPipeline to run on GPU on fresh AWS or GCP instances · bug · opened May 3, 2024 by iuliaturc
- #1839 · Why does ORTModelForCausalLM assume the new input length is 1 when past_key_values is passed? · opened Apr 29, 2024 by cyh-ustc
- #1836 · Pushing to the hub with a token · bug · opened Apr 26, 2024 by IlyasMoutawwakil
- #1833 · The transformation of the Blip2ForConditionalGeneration model to BetterTransformer failed · bug · opened Apr 25, 2024 by garyzhang99
- #1814 · Unable to export the Gemma 7b model to ONNX format in Optimum · bug · opened Apr 12, 2024 by Harini-Vemula-2382
- #1813 · Support Llava ONNX export · feature-request (New feature or request), onnx (Related to the ONNX export) · opened Apr 12, 2024 by Harini-Vemula-2382
- #1812 · Unable to export the Chatglm3 model to ONNX format in Optimum · bug · opened Apr 12, 2024 by Harini-Vemula-2382
- #1807 · Exporting tinyllama-1.1b using onnxruntime bf16 crashes · bug · opened Apr 10, 2024 by mgiessing
- #1804 · Advice for a simple onnxruntime script for ORTModelForVision2Seq (or separate encoder/decoder) · opened Apr 9, 2024 by eduardatmadenn
- #1800 · Cannot export jinaai models to ONNX format because the model is > 2 GB · bug · opened Apr 8, 2024 by clarinevong