
llama #259

Answered by ningding97
yangYJT asked this question in Q&A
Apr 5, 2023


Of course. We have included a tutorial on using GPT-J to train a chat AI, and you can use a similar approach to train Llama. For example:

from openprompt import plms
from openprompt.plms import *
from transformers import GPTJConfig, GPTJModel, GPTJForCausalLM, GPT2Tokenizer, LlamaConfig, LlamaForCausalLM, LlamaTokenizer

# Register GPT-J and Llama in OpenPrompt's model registry so they can be
# loaded by name; both use the causal-LM tokenizer wrapper.
plms._MODEL_CLASSES["gptj"] = ModelClass(**{"config": GPTJConfig, "tokenizer": GPT2Tokenizer, "model": GPTJForCausalLM, "wrapper": LMTokenizerWrapper})
plms._MODEL_CLASSES["llama"] = ModelClass(**{"config": LlamaConfig, "tokenizer": LlamaTokenizer, "model": LlamaForCausalLM, "wrapper": LMTokenizerWrapper})

Please note that we are not distributing Llama checkpoints.
