
💄 style: Optimized MaxToken Slider #2258

Merged
merged 21 commits into lobehub:main from sxjeru:fork3 on May 8, 2024

Conversation

sxjeru
Contributor

@sxjeru sxjeru commented Apr 28, 2024

💻 变更类型 | Change Type

  • ✨ feat
  • 🐛 fix
  • ♻️ refactor
  • 💄 style
  • 🔨 chore
  • ⚡️ perf
  • 📝 docs

🔀 变更说明 | Description of Change

1k -> 0 (unlimited), 200k -> 256k (to avoid overlapping labels), k -> K.
Minimum value limited to 0; step size changed from 1k to 2k.
1k = 1000 -> 1k = 1024.

Will try adapting i18n shortly; now adapted.
In mobile testing the tick labels still overlap; will try removing the unit suffix.

  • Note to self: on mobile, after opening the custom model settings, the model list should hide automatically

📝 补充信息 | Additional Information

old:
image

new:
image


vercel bot commented Apr 28, 2024

@sxjeru is attempting to deploy a commit to the LobeHub Team on Vercel.

A member of the Team first needs to authorize it.

@lobehubbot
Member

👍 @sxjeru

Thank you for raising your pull request and contributing to our community.
Please make sure you have followed our contributing guidelines. We will review it as soon as possible.
If you encounter any problems, please feel free to connect with us.

@arvinxx
Contributor

arvinxx commented Apr 28, 2024

256k is not very reasonable; the current mainstream models are at 200k.

@sxjeru
Contributor Author

sxjeru commented Apr 28, 2024

256k may not be that practical, but the models that typically need a custom entry (the ones lobechat has not adapted yet) mostly do not reach 200k anyway.
If entering a custom model id could automatically pull the tokens value from the lobechat model library, that would presumably cover quite a few of the 200k scenarios.

Why would anyone define a duplicate custom model? Because of oneapi.

If you really want to keep 200k, I can compromise on that.

@sxjeru sxjeru marked this pull request as ready for review April 28, 2024 09:55
@arvinxx
Contributor

arvinxx commented Apr 28, 2024

> If entering a custom model id could automatically pull the tokens value from the lobechat model library, that would presumably cover quite a few of the 200k scenarios.

@sxjeru I think that part is quite reasonable. If that is supported, then swapping 200k for 256k is not a big problem.

@sxjeru
Contributor Author

sxjeru commented Apr 28, 2024

Feedback after the commit (old on top, new below).

image
image


If 200k is kept, should we consider removing the 128k axis label (while keeping its tick point)?

image

I was also considering reducing the slider step size from tick-only down to 1k granularity; not sure whether that is necessary or whether it would affect the interaction.
step takes precedence over mask, so this is no longer being considered.

image

@arvinxx
Contributor

arvinxx commented Apr 29, 2024

If 0 means unlimited, I suggest the token label should also show infinity rather than Inf.

> If 200k is kept, should we consider removing the 128k axis label (while keeping its tick point)?

That works.

> I was also considering reducing the slider step size from tick-only down to 1k granularity; not sure whether that is necessary or whether it would affect the interaction.

No need to reduce it. When a model is developed, the best practice for token limits is powers of two; the common values are 4k, 8k, 16k, 32k and so on. Oddball values like 7k or 15k are unlikely to show up. If one really does exist, the user can type it into the input on the right. On the slider track itself, I would still rather rely on the tick marks to keep the configuration flow smooth.
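As a rough illustration of the power-of-two convention described above, here is a minimal sketch of how such tick values could be generated. This is not the actual MaxTokenSlider code; the function name, step range, and upper bound are assumptions.

```ts
// Hypothetical sketch only, not lobe-chat's implementation.
// 0 stands for "unlimited"; the remaining ticks double from 2K up to 256K.
const Kibi = 1024;

const powerOfTwoTicks = (maxPower = 8): number[] => {
  const ticks = [0]; // 0 = no limit
  for (let power = 1; power <= maxPower; power += 1) {
    ticks.push(2 ** power * Kibi); // 2K, 4K, 8K, ..., 256K
  }
  return ticks;
};

console.log(powerOfTwoTicks()); // [0, 2048, 4096, 8192, 16384, 32768, 65536, 131072, 262144]
```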

codecov bot commented Apr 29, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 92.96%. Comparing base (bf8ef1f) to head (2108db4).
Report is 16 commits behind head on main.

Additional details and impacted files
@@           Coverage Diff           @@
##             main    #2258   +/-   ##
=======================================
  Coverage   92.96%   92.96%           
=======================================
  Files         301      301           
  Lines       17392    17393    +1     
  Branches     1255     1256    +1     
=======================================
+ Hits        16169    16170    +1     
  Misses       1223     1223           

☔ View full report in Codecov by Sentry.
📢 Have feedback on the report? Share it here.

@sxjeru
Contributor Author

sxjeru commented Apr 29, 2024

> If 0 means unlimited, I suggest the token label should also show infinity rather than Inf.

That might make the tag quite long in other languages and would not sit well next to 1K or 1M. There is also a tooltip as a hint.

> Considering reducing the slider step size…

It is not really about having the user drag the slider; it is just that when a number is typed in, the slider visualization could be more precise.
(The slider's built-in rounding does not really suit this scenario; flooring would fit better, e.g. 3.9k should map to 2k rather than 4k.)

Will not consider it further.


There is also the question of whether k means 1024 or 1000. By some conventions abroad: 1 kilo = 1k = 1000,
1 Kibi = 1K = 1024.
For the currently popular models, 32K and below use 1024, 128k and above use 1000, and Gemini's 1M is 1024*1024.

The current approach in ff17c88 is that values in [128000, 1M) use 1000 (lowercase k) and everything else uses 1024 (uppercase K); this can be adjusted as things change.
Not sure whether that is appropriate, or whether we should at least unify the case of K. (After all, if the only distinction is letter case, hardly anyone will understand the intent.)

image
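For illustration, here is a minimal sketch of the labeling rule described above. The exact logic in ff17c88 may differ, and the function name here is an assumption.

```ts
// Sketch of the described rule, not necessarily the code in ff17c88:
// values in [128000, 1M) use thousands with a lowercase "k",
// everything else uses Kibi units with an uppercase "K", and 1M = 1024 * 1024.
const Kibi = 1024;
const Mebi = Kibi * Kibi;

const formatTokenCount = (num: number): string => {
  if (num >= Mebi) return `${Math.floor(num / Mebi)}M`;    // 1_048_576 -> "1M"
  if (num >= 128_000) return `${Math.floor(num / 1000)}k`; // 128_000   -> "128k"
  if (num >= Kibi) return `${Math.floor(num / Kibi)}K`;    // 32_768    -> "32K"
  return String(num);
};
```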

@arvinxx
Contributor

arvinxx commented Apr 29, 2024

> That might make the tag quite long in other languages

What I meant was using the ∞ symbol.

> There is also the question of whether k means 1024 or 1000. By some conventions abroad: 1 kilo = 1k = 1000,

Your thinking here is quite thorough. I went back and forth on this before as well, but in the end decided to keep things as simple as possible. In practice, above 100k there is little point worrying about whether the unit is 1024 or 1000, since that scale is rarely reached in normal use.

@arvinxx
Contributor

arvinxx commented Apr 29, 2024

@sxjeru this needs a rebase onto main now.

Contributor

@arvinxx arvinxx left a comment

A few places need adjusting~

next.config.mjs (review thread, outdated, resolved)
src/components/ModelSelect/index.tsx (review thread, resolved)
@arvinxx arvinxx changed the title 🔨 chore: Optimized MaxToken Slider 💄 style: Optimized MaxToken Slider May 5, 2024

vercel bot commented May 5, 2024

The latest updates on your projects. Learn more about Vercel for Git ↗︎

| Name | Status | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| lobe-chat-community | ✅ Ready (Inspect) | Visit Preview | 💬 Add feedback | May 7, 2024 2:17am |

@sxjeru
Contributor Author

sxjeru commented May 5, 2024

The ∞ in the tag still feels a bit too small.
image

@arvinxx
Contributor

arvinxx commented May 5, 2024

> The ∞ in the tag still feels a bit too small.

Yes... or how about switching to an icon? https://lucide.dev/icons/infinity

@sxjeru
Contributor Author

sxjeru commented May 6, 2024

The icon works; hopefully the sizing now fits reasonably well.
image
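For reference, a minimal sketch of rendering the lucide infinity icon when the token limit is 0. The TokenTag component and its props are assumptions for illustration, not the actual lobe-chat code.

```tsx
// Illustrative sketch only; component and prop names are hypothetical.
import { Infinity as InfinityIcon } from 'lucide-react';
import { memo } from 'react';

const Kibi = 1024;

const TokenTag = memo<{ tokens: number }>(({ tokens }) => {
  // 0 means "no limit", so show the icon instead of a number such as "32K".
  if (tokens === 0) return <InfinityIcon size={14} />;

  return <span>{tokens >= Kibi ? `${Math.floor(tokens / Kibi)}K` : tokens}</span>;
});

export default TokenTag;
```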

Contributor

@arvinxx arvinxx left a comment

LGTM now, Thanks for your patience and great work!

@arvinxx arvinxx merged commit dfb892b into lobehub:main May 8, 2024
6 checks passed
@lobehubbot
Member

❤️ Great PR @sxjeru ❤️

The growth of the project is inseparable from user feedback and contributions, thank you for contributing! If you are interested in the LobeHub developer community, please join our Discord and then DM @arvinxx or @canisminor1990. They will invite you to our private developer channel, where we talk about lobe-chat development and share AI news from around the world.

github-actions bot pushed a commit that referenced this pull request May 8, 2024
### [Version 0.155.3](v0.155.2...v0.155.3)
<sup>Released on **2024-05-08**</sup>

#### 💄 Styles

- **misc**: Optimized MaxToken Slider.

<br/>

<details>
<summary><kbd>Improvements and Fixes</kbd></summary>

#### Styles

* **misc**: Optimized MaxToken Slider, closes [#2258](#2258) ([dfb892b](dfb892b))

</details>

<div align="right">

[![](https://img.shields.io/badge/-BACK_TO_TOP-151515?style=flat-square)](#readme-top)

</div>
@lobehubbot
Member

🎉 This PR is included in version 0.155.3 🎉

The release is available on:

Your semantic-release bot 📦🚀

github-actions bot pushed a commit to bentwnghk/lobe-chat that referenced this pull request May 8, 2024
### [Version&nbsp;1.35.1](v1.35.0...v1.35.1)
<sup>Released on **2024-05-08**</sup>

#### 💄 Styles

- **misc**: Optimized MaxToken Slider.

<br/>

<details>
<summary><kbd>Improvements and Fixes</kbd></summary>

#### Styles

* **misc**: Optimized MaxToken Slider, closes [lobehub#2258](https://github.com/bentwnghk/lobe-chat/issues/2258) ([dfb892b](dfb892b))

</details>

<div align="right">

[![](https://img.shields.io/badge/-BACK_TO_TOP-151515?style=flat-square)](#readme-top)

</div>
@sxjeru sxjeru deleted the fork3 branch May 14, 2024 11:40
TheNameIsNigel pushed a commit to TheNameIsNigel/lobe-chat that referenced this pull request May 15, 2024
* Update MaxTokenSlider

* k -> K

* i18n

* Small screen adaptation

* `next/image` Un-configured Host

* ModelSelect.featureTag.tokens

* 128k=128,000, 4K=4096, Kibi / kilo

* Restore llm.ts

* Inf -> ∞

* patch

* const Kibi = 1024;

* refactor marks

* Update MaxTokenSlider.tsx

* infinity icon
TheNameIsNigel pushed a commit to TheNameIsNigel/lobe-chat that referenced this pull request May 15, 2024
### [Version&nbsp;0.155.3](lobehub/lobe-chat@v0.155.2...v0.155.3)
<sup>Released on **2024-05-08**</sup>

#### 💄 Styles

- **misc**: Optimized MaxToken Slider.

<br/>

<details>
<summary><kbd>Improvements and Fixes</kbd></summary>

#### Styles

* **misc**: Optimized MaxToken Slider, closes [lobehub#2258](lobehub#2258) ([dfb892b](lobehub@dfb892b))

</details>

<div align="right">

[![](https://img.shields.io/badge/-BACK_TO_TOP-151515?style=flat-square)](#readme-top)

</div>