Releases: undreamai/LLMUnity

Release v1.2.8

27 May 11:28

πŸš€ Features

  • Switch to llamafile v0.8.6 (PR: #155)
  • Add phi-3 support (PR: #156)

Release v1.2.7

19 Apr 17:04

πŸš€ Features

  • Add Llama 3 and Vicuna chat templates (PR: #145), as shown in the sketch below

πŸ“¦ General

  • Use the context size of the model by default for longer history (PR: #147)
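
A rough usage sketch for the new chat templates, assuming the package's Unity API: the SetTemplate call is mentioned in the v1.2.5 notes further down, but the template identifier "llama3" and the component wiring are assumptions rather than text from these notes.

    using UnityEngine;
    using LLMUnity;

    // Minimal sketch (assumed API): pick one of the new chat templates from code.
    public class TemplateExample : MonoBehaviour
    {
        public LLM llm;  // assign the LLM component in the Inspector

        void Start()
        {
            // "llama3" is an assumed identifier; "vicuna" would select the Vicuna template.
            llm.SetTemplate("llama3");
        }
    }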

Release v1.2.6

01 Apr 08:17

πŸš€ Features

  • Add documentation (PR: #135)

πŸ› Fixes

  • Add server security to prevent interception by external llamafile servers (PR: #132)
  • Adapt server security for macOS (PR: #137)

πŸ“¦ General

  • Add a sample demonstrating the async functionality (PR: #136), sketched below
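
A rough sketch of the async usage that the new sample demonstrates is shown below. It assumes an LLMClient-style component with an awaitable Chat call and a streaming callback; the class and parameter names are assumptions, not code copied from the sample in PR #136.

    using UnityEngine;
    using LLMUnity;

    // Minimal sketch (assumed API): query the LLM without blocking the main thread.
    public class AsyncChatExample : MonoBehaviour
    {
        public LLMClient llm;  // assumed component name; assign in the Inspector

        async void Start()
        {
            // Stream partial replies to the console as they arrive,
            // then log once the full reply has been received.
            await llm.Chat("Hello there!", reply => Debug.Log(reply));
            Debug.Log("Chat call finished");
        }
    }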

Release v1.2.5

23 Mar 09:24

πŸ› Fixes

  • Add to chat history only if the response is not null (PR: #123)
  • Allow SetTemplate function in Runtime (PR: #129)

Release v1.2.4

13 Mar 18:17

πŸš€ Features

  • Use llamafile v0.6.2 (PR: #111)
  • Pure text completion functionality (PR: #115), shown in the sketch below
  • Allow change of roles after starting the interaction (PR: #120)
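
A rough sketch of the text-completion and role-change features above, under assumed API names: the Complete method and the playerName/AIName fields are guesses at the package's interface, not text from these notes.

    using UnityEngine;
    using LLMUnity;

    // Minimal sketch (assumed API): raw completion and changing roles mid-conversation.
    public class CompletionExample : MonoBehaviour
    {
        public LLMClient llm;  // assumed component name

        async void Start()
        {
            // Pure text completion: continue a prompt without chat formatting (assumed method name).
            string continuation = await llm.Complete("Once upon a time");
            Debug.Log(continuation);

            // Switch roles after the interaction has started (assumed field names).
            llm.playerName = "Narrator";
            llm.AIName = "Storyteller";
            await llm.Chat("Continue the story.", reply => Debug.Log(reply));
        }
    }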

πŸ› Fixes

  • Use Debug.LogError instead of throwing exceptions for more verbosity (PR: #113)
  • Trim chat responses (PR: #118)
  • Fallback to CPU for macOS with unsupported GPU (PR: #119)
  • Removed duplicate EditorGUI.EndChangeCheck() (PR: #110)

πŸ“¦ General

  • Provide access to LLMUnity version (PR: #117)
  • Rename to "LLM for Unity" (PR: #121)

Release v1.2.3

09 Mar 10:43

πŸ› Fixes

  • Fix async server 2 (PR: #108)

Release v1.2.2

07 Mar 20:01

πŸ› Fixes

  • Use namespaces in all classes (PR: #104)
  • await separately in StartServer (PR: #107)

Release v1.2.1

07 Mar 12:38

πŸ› Fixes

  • Kill server after Unity crash (PR: #101)
  • Persist chat template on remote servers (PR: #103)

Release v1.2.0

29 Feb 19:39

πŸš€ Features

  • LLM server unit tests (PR: #90)
  • Implement chat templates (PR: #92)
  • Stop chat functionality (PR: #95), sketched after this list
  • Keep only the llamafile binary (PR: #97)
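
One possible use of the stop-chat feature is sketched below; the CancelRequests method name is an assumption and may differ from what PR #95 actually introduced.

    using UnityEngine;
    using LLMUnity;

    // Minimal sketch (assumed API): start a streamed chat and let the user abort it.
    public class StopChatExample : MonoBehaviour
    {
        public LLMClient llm;  // assumed component name

        void Start()
        {
            // Fire-and-forget; the reply streams into the callback as it is generated.
            _ = llm.Chat("Write a very long poem.", reply => Debug.Log(reply));
        }

        void Update()
        {
            // Abort the ongoing request when Escape is pressed (assumed method name).
            if (Input.GetKeyDown(KeyCode.Escape)) llm.CancelRequests();
        }
    }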

πŸ› Fixes

  • Fix remote server functionality (PR: #96)
  • Fix Mac issue requiring llamafile to be run manually the first time (PR: #98)

πŸ“¦ General

  • Async startup support (PR: #89)

Release v1.1.1

19 Feb 13:44

πŸ“¦ General

  • Refactoring and small enhancements (PR: #80)