LLaMA model loading error: _sentencepiece.SentencePieceProcessor_LoadFromFile(self, arg) TypeError: not a string

tokenizer = LlamaTokenizer.from_pretrained(lora_model_path)
The error is raised because lora_model_path does not reach the tokenizer as a string.
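
For reference, here is a minimal sketch of the failing call with a defensive type check in front of it; the variable name lora_model_path comes from the post, but the guard itself is my assumption and not the original script's code:

from transformers import LlamaTokenizer

lora_model_path = "ziqingyang/chinese-llama-plus-lora-7b"  # the value passed via --lora_model

# sentencepiece's LoadFromFile only accepts a plain str. If the value that
# eventually reaches it is None or a Path-like object, it fails with
# "TypeError: not a string", so checking the type up front surfaces the
# problem before the call disappears into the tokenizer internals.
if not isinstance(lora_model_path, str):
    raise TypeError(f"lora_model_path must be a str, got {type(lora_model_path)}")

tokenizer = LlamaTokenizer.from_pretrained(lora_model_path)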
The command-line argument I was running with was:
--lora_model ziqingyang/chinese-llama-plus-lora-7b
Here ziqingyang/chinese-llama-plus-lora-7b is the model ID on Hugging Face. I changed the argument to
--lora_model ziqingyang\/chinese-llama-plus-lora-7b,
that is, I added a backslash before the slash, and the error is no longer reported.
I suspect this may be a bug, but I haven't dug into it. If anyone knows the cause, please leave a comment.
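
For comparison, here is a minimal argparse sketch of how a --lora_model flag is typically read; this is an assumption about the script, not its actual code. argparse already delivers the value as a plain str, which is why the backslash workaround looks more like it is masking a problem elsewhere than fixing the root cause:

import argparse

from transformers import LlamaTokenizer

parser = argparse.ArgumentParser()
parser.add_argument("--lora_model", type=str, required=True,
                    help="Hugging Face repo ID or local path of the LoRA weights")
args = parser.parse_args()

# With --lora_model ziqingyang/chinese-llama-plus-lora-7b this prints
# <class 'str'>, so the slash itself should not need escaping.
print(type(args.lora_model))
tokenizer = LlamaTokenizer.from_pretrained(args.lora_model)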

Origin blog.csdn.net/artistkeepmonkey/article/details/130476763