Issues: huggingface/transformers
Issues list
- transformers 4.41.2 breaks paligemma inference (#31171, opened May 31, 2024 by kishan-character)
- Finetuning OPT models with 8bit and LoRA on QA tasks leads to NaN weight in model.qa_outputs (#31162, opened May 31, 2024 by shiningrain)
- AttributeError: 'LlamaForCausalLM' object has no attribute '_setup_cache' (#31157, opened May 31, 2024 by mobicham)
- TypeError: 'NoneType' object cannot be interpreted as an integer (#31156, opened May 31, 2024 by andysingal)
- Unable to load t5-small tokenizer saved with latest packages in older versions (#31139, opened May 30, 2024 by jpmann)
- SAM-HQ implementation in transformers [New model] (#31137, opened May 30, 2024 by IskanderNakipov)
- Slow tokenizer loading when many tokens (~17k) are added by a user (#31134, opened May 30, 2024 by jaeminSon)
- Inconsistency in logit values between generation and direct model prediction (#31127, opened May 30, 2024 by lowlypalace)
- Request for Static Cache support for the XLA compiler in Transformers [Feature request] (#31126, opened May 30, 2024 by huzama)
- Understanding loss in training LLMs [Feature request] (#31125, opened May 29, 2024 by mostafamdy)
- TypeError: Cannot convert [array([322., 1.])] to EagerTensor of dtype int64 (#31112, opened May 29, 2024 by pavi-ninjaac)
- Need better token_type_id processing in GPT2Model [Feature request] (#31105, opened May 29, 2024 by JohnHerry)
- Caching past key values of any length for vision LLMs [Feature request, Generation] (#31096, opened May 28, 2024 by saikoneru)
- Log the individual losses alongside the combined loss when a model returns a dictionary of losses [Feature request] (#31081, opened May 28, 2024 by NikhilMank)
- Add ability to specify input device for ffmpeg_microphone() [Feature request] (#31074, opened May 27, 2024 by jferments)
- from transformers import Phi3VModel, Phi3VConfig: Phi3 Vision config not added to transformers (#31073, opened May 27, 2024 by sankydesai)
- GenerationConfig throws "Object is not JSON serializable" when setting constraints [Generation] (#31070, opened May 27, 2024 by OS-leonardopratesi)
- Sigmoid instead of softmax used in documentation and AutoPipeline for SigLIP (#31064, opened May 27, 2024 by rishabh063)