Issues: huggingface/transformers
- Speed up image processors - cast to array before BatchFeature (Feature request, Good Second Issue) #31205, opened Jun 3, 2024 by amyeroberts
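The request above matches a common pattern: stacking a Python list of per-image NumPy arrays into one contiguous array before tensor conversion is much faster than converting the list element by element. A minimal sketch of the idea, with illustrative variable names rather than the actual transformers API:

```python
import numpy as np

# A batch of preprocessed images as a plain Python list of arrays,
# as an image processor might produce them one image at a time.
images = [np.random.rand(3, 224, 224).astype(np.float32) for _ in range(8)]

# Stacking first yields one contiguous (8, 3, 224, 224) array; downstream
# frameworks convert a single ndarray to a tensor far faster than they
# convert a list of separate ndarrays.
batched = np.stack(images)
print(batched.shape)  # (8, 3, 224, 224)
```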
- Control flow issue with symbolic_trace when using inputs_embeds in MistralForCausalLM #31200, opened Jun 3, 2024 by Hongjie1Chu
- Some weights of BlipModel were not initialized from the model checkpoint at Salesforce/blip-image-captioning-base #31199, opened Jun 3, 2024 by aivolcano
- Original Llama-3 tokenizer behaves differently from transformers version #31187, opened Jun 2, 2024 by chawins
- ModuleNotFoundError: No module named 'distutils' in Python 3.12 #31174, opened Jun 1, 2024 by sailfish009
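For context on the error above: distutils was removed from the standard library in Python 3.12 (PEP 632), so any code importing it fails on 3.12+ unless setuptools supplies its compatibility shim. A hedged sketch of the version boundary involved (the helper name is made up for illustration):

```python
import sys

REMOVED_IN = (3, 12)  # distutils left the stdlib in Python 3.12 (PEP 632)

def distutils_in_stdlib(version=None):
    """Return True when the interpreter still bundles distutils itself."""
    version = tuple((version or sys.version_info)[:2])
    return version < REMOVED_IN

print(distutils_in_stdlib((3, 11)))  # True
print(distutils_in_stdlib((3, 12)))  # False: install setuptools instead
```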
- transformers 4.41.2 breaks paligemma inference #31171, opened May 31, 2024 by kishan-character
- Finetuning OPT models with 8bit and LoRA on QA tasks leads to NaN weight in model.qa_outputs #31162, opened May 31, 2024 by shiningrain
- AttributeError: 'LlamaForCausalLM' object has no attribute '_setup_cache' #31157, opened May 31, 2024 by mobicham
- TypeError: 'NoneType' object cannot be interpreted as an integer #31156, opened May 31, 2024 by andysingal
- Unable to load t5-small tokenizer saved with latest packages in older versions #31139, opened May 30, 2024 by jpmann