## The Willow Inference Server has been released!

Willow users can now self-host the Willow Inference Server for lightning-fast language inference tasks (STT, TTS, LLM, and more) with Willow and other applications (even WebRTC)!

## Hello Willow Users!

Many users across various forums, social media, and elsewhere are starting to receive their hardware! I have enabled GitHub Discussions to centralize these great conversations - stop by, introduce yourself, and let us know how things are going with Willow! Between GitHub Discussions and Issues we can all work together to make sure our early adopters have the best experience possible!

## Documentation

Visit the official documentation at [heywillow.io](https://heywillow.io).