llm-tokenwise-inference

A token-wise, real-time display inference module for Llama 2 and other LLMs.

Getting Started

  • Move into the repository and install the dependencies:

cd llm-tokenwise-inference
pip install -r requirements.txt
  • Run the following program in IPython or Jupyter:
from llminferencepkg import TokenWiseLLM

model = TokenWiseLLM("path/to/model")  # local path or a Hugging Face repository ID
model.inference("Question")            # tokens are displayed in real time as they are generated

[Demo: tokenwisellm]
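
For context, the same token-wise display pattern can be reproduced with Hugging Face transformers directly, using TextIteratorStreamer to print text as tokens are generated. The snippet below is a minimal sketch of that idea only; it is not the repository's actual implementation, and the model path, prompt, and generation parameters are placeholders.

# Minimal sketch of token-wise streaming with Hugging Face transformers.
# This illustrates the general pattern, not the code behind TokenWiseLLM;
# "path/to/model" and max_new_tokens are placeholder values.
from threading import Thread

from transformers import AutoModelForCausalLM, AutoTokenizer, TextIteratorStreamer

tokenizer = AutoTokenizer.from_pretrained("path/to/model")
model = AutoModelForCausalLM.from_pretrained("path/to/model", device_map="auto")

prompt = "Question"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# The streamer yields decoded text piece by piece as tokens are produced.
streamer = TextIteratorStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)

# Run generation in a background thread so the stream can be consumed here.
thread = Thread(
    target=model.generate,
    kwargs=dict(**inputs, streamer=streamer, max_new_tokens=256),
)
thread.start()

# Print each chunk as soon as it arrives, giving a real-time display.
for text_chunk in streamer:
    print(text_chunk, end="", flush=True)
thread.join()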
