Replies: 3 comments
-
As far as I am aware, large is equal to large-v2; I don't think there is any
other large model. :)
Thanks and Best Regards,
Nick Bento
-
WIS has always used large-v2. In configs, URI params, etc., it is "large".
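To make the naming concrete, here is a minimal sketch of how the config URI from this thread is assembled. The host placeholder, port, path, and parameter names (`model`, `beam_size`) are taken from the example in the question; the helper function name is purely illustrative.

```python
# Illustrative sketch of the Willow config URI discussed in this thread.
# Assumption: "model=large" is the only alias needed, since WIS maps
# "large" to the large-v2 weights internally (per the answer above).
from urllib.parse import urlencode

def willow_config_uri(host: str, model: str = "large", beam_size: int = 5) -> str:
    # Build the query string with the parameters shown in the thread.
    params = urlencode({"model": model, "beam_size": beam_size})
    return f"https://{host}:19000/api/willow?{params}"

print(willow_config_uri("192.168.xxx.yyy"))
# https://192.168.xxx.yyy:19000/api/willow?model=large&beam_size=5
```

In other words, there is no need to change the config to `large_v2`; passing `model=large` already selects the large-v2 weights.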
-
Thanks for clarifying!
-
I noticed that with RC1 a large_v2 model has been released.
When flashing Willow, do we need to specify large_v2, or will large still work to pick up v2?
This is my config setting: https://192.168.xxx.yyy:19000/api/willow?model=large&beam_size=5