As the title says.
Error message:
Can't load tokenizer using from_pretrained, please update its configuration: Can't load tokenizer for 'Langboat/mengzi-gpt-neo-base'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'Langboat/mengzi-gpt-neo-base' is the correct path to a directory containing all relevant files for a GPT2TokenizerFast tokenizer.
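For reference, here is a minimal sketch of the kind of call that presumably produces this error, assuming the tokenizer is being loaded with the Hugging Face transformers library (the exact loading code is not shown in the question):

```python
# Minimal sketch of the loading attempt that seems to trigger the error.
# Assumption: the tokenizer is loaded via Hugging Face transformers;
# "Langboat/mengzi-gpt-neo-base" is the repo id quoted in the error message.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Langboat/mengzi-gpt-neo-base")
```

As the error text suggests, the same message also appears when a local directory with the same name as the repo id shadows the remote model, or when the target directory lacks the tokenizer files expected by GPT2TokenizerFast.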