Hello, I followed the openllm official documentation, but running it fails with an error:
(myenv) root@d2722d32a15a:/workspace# TRUST_REMOTE_CODE=True openllm start thudm/chatglm-6b --backend vllm
config.json: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 773/773 [00:00<00:00, 90.4kB/s]
configuration_chatglm.py: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 4.28k/4.28k [00:00<00:00, 609kB/s]
A new version of the following files was downloaded from <https://huggingface.co/thudm/chatglm-6b>:
- configuration_chatglm.py
. Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.
tokenizer_config.json: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 441/441 [00:00<00:00, 316kB/s]
tokenization_chatglm.py: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 17.0k/17.0k [00:00<00:00, 283kB/s]
A new version of the following files was downloaded from <https://huggingface.co/thudm/chatglm-6b>:
- tokenization_chatglm.py
. Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.
ice_text.model: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 2.71M/2.71M [00:08<00:00, 310kB/s]
Traceback (most recent call last):
  File "/workspace/myenv/lib/python3.9/site-packages/fs/osfs.py", line 647, in open
    return io.open(
FileNotFoundError: [Errno 2] No such file or directory: b'/root/bentoml/models/vllm-thudm--chatglm-6b/latest'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/workspace/myenv/lib/python3.9/site-packages/bentoml/_internal/store.py", line 144, in get
    _tag.version = self._fs.readtext(_tag.latest_path())
  File "/workspace/myenv/lib/python3.9/site-packages/fs/base.py", line 693, in readtext
    self.open(
  File "/workspace/myenv/lib/python3.9/site-packages/fs/osfs.py", line 647, in open
    return io.open(
  File "/workspace/myenv/lib/python3.9/site-packages/fs/error_tools.py", line 89, in __exit__
    reraise(fserror, fserror(self._path, exc=exc_value), traceback)
  File "/workspace/myenv/lib/python3.9/site-packages/six.py", line 718, in reraise
    raise value.with_traceback(tb)
  File "/workspace/myenv/lib/python3.9/site-packages/fs/osfs.py", line 647, in open
    return io.open(
fs.errors.ResourceNotFound: resource 'vllm-thudm--chatglm-6b/latest' not found

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/workspace/myenv/lib/python3.9/site-packages/bentoml/_internal/store.py", line 118, in _recreate_latest
    items = self.list(tag.name)
  File "/workspace/myenv/lib/python3.9/site-packages/bentoml/_internal/store.py", line 95, in list
    raise NotFound(
bentoml.exceptions.NotFound: no Models with name 'vllm-thudm--chatglm-6b' found

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/workspace/myenv/lib/python3.9/site-packages/openllm/_llm.py", line 198, in __init__
    model = bentoml.models.get(self.tag)
  File "/workspace/myenv/lib/python3.9/site-packages/simple_di/__init__.py", line 139, in _
    return func(*_inject_args(bind.args), **_inject_kwargs(bind.kwargs))
  File "/workspace/myenv/lib/python3.9/site-packages/bentoml/models.py", line 45, in get
    return _model_store.get(tag)
  File "/workspace/myenv/lib/python3.9/site-packages/bentoml/_internal/store.py", line 149, in get
    self._recreate_latest(_tag)
  File "/workspace/myenv/lib/python3.9/site-packages/bentoml/_internal/store.py", line 120, in _recreate_latest
    raise NotFound(
bentoml.exceptions.NotFound: no Models with name 'vllm-thudm--chatglm-6b' exist in BentoML store <osfs '/root/bentoml/models'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/workspace/myenv/bin/openllm", line 8, in <module>
    sys.exit(cli())
  File "/workspace/myenv/lib/python3.9/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/workspace/myenv/lib/python3.9/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/workspace/myenv/lib/python3.9/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/workspace/myenv/lib/python3.9/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/workspace/myenv/lib/python3.9/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/workspace/myenv/lib/python3.9/site-packages/openllm_cli/entrypoint.py", line 204, in wrapper
    return_value = func(*args, **attrs)
  File "/workspace/myenv/lib/python3.9/site-packages/click/decorators.py", line 33, in new_func
    return f(get_current_context(), *args, **kwargs)
  File "/workspace/myenv/lib/python3.9/site-packages/openllm_cli/entrypoint.py", line 183, in wrapper
    return f(*args, **attrs)
  File "/workspace/myenv/lib/python3.9/site-packages/openllm_cli/entrypoint.py", line 415, in start_command
    llm = openllm.LLM[t.Any, t.Any](
  File "/usr/local/lib/python3.9/typing.py", line 687, in __call__
    result = self.__origin__(*args, **kwargs)
  File "/workspace/myenv/lib/python3.9/site-packages/openllm/_llm.py", line 200, in __init__
    model = openllm.serialisation.import_model(self, trust_remote_code=self.trust_remote_code)
  File "/workspace/myenv/lib/python3.9/site-packages/openllm/serialisation/__init__.py", line 59, in caller
    return getattr(importlib.import_module(f'.{serde}', 'openllm.serialisation'), fn)(llm, *args, **kwargs)
  File "/workspace/myenv/lib/python3.9/site-packages/openllm/serialisation/transformers/__init__.py", line 29, in import_model
    tokenizer = get_tokenizer(llm.model_id, trust_remote_code=trust_remote_code, **hub_attrs, **tokenizer_attrs)
  File "/workspace/myenv/lib/python3.9/site-packages/openllm/serialisation/transformers/_helpers.py", line 7, in get_tokenizer
    tokenizer = transformers.AutoTokenizer.from_pretrained(model_id_or_path, trust_remote_code=trust_remote_code, **attrs)
  File "/workspace/myenv/lib/python3.9/site-packages/transformers/models/auto/tokenization_auto.py", line 755, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/workspace/myenv/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 2024, in from_pretrained
    return cls._from_pretrained(
  File "/workspace/myenv/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 2256, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/root/.cache/huggingface/modules/transformers_modules/thudm/chatglm-6b/8b7d33596d18c5e83e2da052d05ca4db02e60620/tokenization_chatglm.py", line 196, in __init__
    super().__init__(
  File "/workspace/myenv/lib/python3.9/site-packages/transformers/tokenization_utils.py", line 367, in __init__
    self._add_tokens(
  File "/workspace/myenv/lib/python3.9/site-packages/transformers/tokenization_utils.py", line 467, in _add_tokens
    current_vocab = self.get_vocab().copy()
  File "/root/.cache/huggingface/modules/transformers_modules/thudm/chatglm-6b/8b7d33596d18c5e83e2da052d05ca4db02e60620/tokenization_chatglm.py", line 248, in get_vocab
    vocab = {self._convert_id_to_token(i): i for i in range(self.vocab_size)}
  File "/root/.cache/huggingface/modules/transformers_modules/thudm/chatglm-6b/8b7d33596d18c5e83e2da052d05ca4db02e60620/tokenization_chatglm.py", line 244, in vocab_size
    return self.sp_tokenizer.num_tokens
AttributeError: 'ChatGLMTokenizer' object has no attribute 'sp_tokenizer'
(myenv) root@d2722d32a15a:/workspace# cd /root/bentoml/models/vllm-thudm--chatglm-6b
bash: cd: /root/bentoml/models/vllm-thudm--chatglm-6b: No such file or directory
The error output includes:

resource 'vllm-thudm--chatglm-6b/latest' not found

and the final traceback ends with `AttributeError: 'ChatGLMTokenizer' object has no attribute 'sp_tokenizer'`. Why does this error occur?
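For context on reading the log above: the repeated "During handling of the above exception, another exception occurred" separators are Python's implicit exception chaining. Each `NotFound` was raised while handling the previous lookup failure, so the earlier `ResourceNotFound`/`NotFound` tracebacks are intermediate steps, and the last traceback (the `AttributeError` from the tokenizer) is the exception that actually aborted the command. A minimal, self-contained sketch of that mechanism (the function and messages here are made up for illustration, not OpenLLM's actual code):

```python
# Implicit exception chaining: raising inside an `except` block attaches
# the original exception to the new one as `__context__`, which is what
# produces the "During handling of the above exception..." separators.
def load_from_store():
    # Stand-in for bentoml.models.get() failing on a missing 'latest' tag.
    raise FileNotFoundError("latest tag missing")

try:
    try:
        load_from_store()
    except FileNotFoundError:
        # While handling the store miss, the fallback import path fails too,
        # mirroring the tokenizer failure at the end of the real log.
        raise AttributeError("'ChatGLMTokenizer' object has no attribute 'sp_tokenizer'")
except AttributeError as e:
    # The original store failure is still reachable on the final exception.
    assert isinstance(e.__context__, FileNotFoundError)
    print(type(e.__context__).__name__, "->", type(e).__name__)
    # prints: FileNotFoundError -> AttributeError
```

So when triaging this log, the question to answer is why the tokenizer's `__init__` hit `sp_tokenizer` before it was set, not why the BentoML store lookup missed.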