Issues: mlc-ai/web-llm
[Announcement] Breaking changes regarding conversation template #344 · opened Mar 26, 2024 by CharlieFRuan
[Tracking] Improve error handling by having customized error types #470 · opened Jun 12, 2024 by Neet-Nestor
Running the MLCEngine completion in the service worker results in "Receiving end does not exist" #469 · opened Jun 11, 2024 by talperetz
DOM manipulation (use case: filtering web search results locally) #465 · opened Jun 9, 2024 by eMPee584
engine.interruptGenerate makes future generations return with empty content #447 · opened May 31, 2024 by 3lectrologos
[Suggested feature] Add CroissantLLM, a French-English compact model #431 · opened May 28, 2024 by neigeantre
Engine not instantiating for WebWorker #394 · bug · opened May 13, 2024 by kitzj
Models output is scrambled in Safari Technology Preview, which has WebGPU support #386 · opened May 2, 2024 by felladrin