Currently, there is no indication of the cost of an LLM call.
It would be useful to see how much each message contributes to the cost of invoking an LLM, and even more useful when comparing one model against another.
In this mockup, the cost could be displayed at the bottom, but it could also be part of the metadata section.
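A minimal sketch of how the estimated cost could be computed from token usage. The model names and per-1K-token prices below are illustrative placeholders, not real pricing:

```python
# Hypothetical per-model pricing table: (input, output) USD per 1K tokens.
# These numbers are placeholders for illustration only.
PRICING_PER_1K = {
    "model-a": (0.003, 0.015),
    "model-b": (0.0005, 0.0015),
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of a single LLM call."""
    price_in, price_out = PRICING_PER_1K[model]
    return (input_tokens / 1000) * price_in + (output_tokens / 1000) * price_out

# Example: a message with 1200 prompt tokens and 300 completion tokens.
cost = estimate_cost("model-a", 1200, 300)
print(f"${cost:.4f}")  # 1.2*0.003 + 0.3*0.015 = 0.0081
```

The same per-message figure could be summed across the conversation to show a running total, and computed per model when comparing responses in the multi-chat playground.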
bigadsoleiman changed the title from "add execution cost for chat and multi-chat playground" to "Show tokens consumption and estimated cost for chat and multi-chat playground" on Mar 18, 2024.
Do you think it would be useful to toggle this "cost insights" feature off and on in the settings?
Do you think it would be useful to show both the individual message cost (like in your awesome mockup) and the overall conversation cost?
Do you have any favorite frameworks/libraries that you'd recommend for implementing components of this feature?
Please feel free to contribute a draft pull request if you'd like to work on this -- all contributions are welcome, and I'm happy to support you in that process.