
About the model parameter weights of the bge_m3 model adapted in mindnlp #1071

Answered by lvyufeng
PhilipGAQ asked this question in Q&A

Use this code: https://github.com/mindspore-lab/mindnlp/blob/master/llm/inference/bge_m3/run_bge_m3.py

After you have the model, save it directly:

import mindspore
# AutoTokenizer and AutoModel come from mindnlp's transformers module, as in the linked run_bge_m3.py
from mindnlp.transformers import AutoTokenizer, AutoModel

# Trust remote code is required to load the model
tokenizer = AutoTokenizer.from_pretrained('liuyanyi/bge-m3-hf', trust_remote_code=True)
model = AutoModel.from_pretrained('liuyanyi/bge-m3-hf', trust_remote_code=True)

mindspore.save_checkpoint(model, 'bge-m3.ckpt')
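
To check that the checkpoint restores cleanly, here is a minimal sketch of loading it back into a freshly built model. It assumes the same mindnlp AutoModel class as above; mindspore.load_checkpoint and mindspore.load_param_into_net are standard MindSpore APIs.

import mindspore
from mindnlp.transformers import AutoModel

# Rebuild the model architecture, then restore the saved parameters
model = AutoModel.from_pretrained('liuyanyi/bge-m3-hf', trust_remote_code=True)
param_dict = mindspore.load_checkpoint('bge-m3.ckpt')
# Any parameters that could not be matched are returned for inspection
not_loaded = mindspore.load_param_into_net(model, param_dict)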
