Slackbot — 04/07/2023, 2:03 AM

孙琦 — 04/07/2023, 2:31 AM

Chaoyu — 04/07/2023, 2:37 AM
You can set the index_url option in bentofile.yaml to point at the corresponding package index: https://docs.bentoml.org/en/latest/concepts/bento.html#pip-install-options
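For reference, a bentofile.yaml fragment along these lines would route pip installs to a custom index (the package names and mirror URL below are placeholders, not from the thread):

```yaml
python:
  packages:
    - torch
    - transformers
  # Send pip installs for this Bento to a specific index;
  # replace the URL with your own mirror.
  index_url: https://pypi.tuna.tsinghua.edu.cn/simple
```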
Slackbot — 04/11/2023, 8:39 AM

Chaoyu — 04/24/2023, 7:45 AM

Pen-Hsuan Wang — 05/09/2023, 11:00 AM
# logger = logging.getLogger("bentoml")
logger = logging.getLogger("myapplogger")
and I use file rotation.
While the service is running, however, the newest records still get written into the archived log file, and the older records are overwritten.
When using my own logger like this, should BentoML's default logger be disabled?
I saw issue #1009 (https://github.com/bentoml/BentoML/issues/1009) mention the CLI flag --enable-log-to-file=false,
but that flag no longer appears in bentoml serve. How should this be configured when running with --production?
Thanks in advance
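A minimal sketch of the setup being described, assuming a RotatingFileHandler (the filename, sizes, and format string are placeholders). Setting propagate = False keeps records from also flowing up to the root/bentoml handlers, which is one common cause of log lines appearing in unexpected files:

```python
import logging
from logging.handlers import RotatingFileHandler

# Dedicated application logger, kept separate from BentoML's "bentoml" logger.
logger = logging.getLogger("myapplogger")
logger.setLevel(logging.INFO)

handler = RotatingFileHandler(
    "myapp.log",         # active file; rotated copies become myapp.log.1, .2, ...
    maxBytes=1_000_000,  # rotate once the file reaches ~1 MB
    backupCount=3,       # keep at most three archived files
)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
logger.addHandler(handler)

# Do not let records bubble up to the root logger's handlers.
logger.propagate = False

logger.info("hello from myapplogger")
```

Note that under --production the server runs multiple worker processes; each process rotates the file independently, so a worker can keep writing through its open file descriptor after another worker has rotated the file, which would produce exactly the "archived file still receives new records" symptom. In that case a per-process filename or an external rotation mechanism is usually safer than RotatingFileHandler alone.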
Slackbot — 05/18/2023, 7:45 AM

Slackbot — 05/25/2023, 5:39 AM

Slackbot — 05/25/2023, 7:38 AM

薛莲 — 05/25/2023, 7:41 AM

Slackbot — 05/25/2023, 9:11 AM

Slackbot — 05/29/2023, 8:28 AM

Slackbot — 05/30/2023, 3:03 AM

Slackbot — 09/15/2023, 10:03 AM

Slackbot — 10/08/2023, 2:20 AM

Jesen Chen — 10/08/2023, 9:10 AM

Jesen Chen — 10/09/2023, 2:22 AM

Slackbot — 10/09/2023, 2:22 AM

Slackbot — 12/06/2023, 3:44 AM

Quincy yang — 12/07/2023, 6:22 AM
(myenv) root@d2722d32a15a:/workspace# TRUST_REMOTE_CODE=True openllm start thudm/chatglm-6b --backend vllm
config.json: 100%|████████| 773/773 [00:00<00:00, 90.4kB/s]
configuration_chatglm.py: 100%|████████| 4.28k/4.28k [00:00<00:00, 609kB/s]
A new version of the following files was downloaded from https://huggingface.co/thudm/chatglm-6b:
- configuration_chatglm.py
. Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.
tokenizer_config.json: 100%|████████| 441/441 [00:00<00:00, 316kB/s]
tokenization_chatglm.py: 100%|████████| 17.0k/17.0k [00:00<00:00, 283kB/s]
A new version of the following files was downloaded from https://huggingface.co/thudm/chatglm-6b:
- tokenization_chatglm.py
. Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.
ice_text.model: 100%|████████| 2.71M/2.71M [00:08<00:00, 310kB/s]
Traceback (most recent call last):
File "/workspace/myenv/lib/python3.9/site-packages/fs/osfs.py", line 647, in open
return io.open(
FileNotFoundError: [Errno 2] No such file or directory: b'/root/bentoml/models/vllm-thudm--chatglm-6b/latest'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/workspace/myenv/lib/python3.9/site-packages/bentoml/_internal/store.py", line 144, in get
_tag.version = self._fs.readtext(_tag.latest_path())
File "/workspace/myenv/lib/python3.9/site-packages/fs/base.py", line 693, in readtext
self.open(
File "/workspace/myenv/lib/python3.9/site-packages/fs/osfs.py", line 647, in open
return io.open(
File "/workspace/myenv/lib/python3.9/site-packages/fs/error_tools.py", line 89, in __exit__
reraise(fserror, fserror(self._path, exc=exc_value), traceback)
File "/workspace/myenv/lib/python3.9/site-packages/six.py", line 718, in reraise
raise value.with_traceback(tb)
File "/workspace/myenv/lib/python3.9/site-packages/fs/osfs.py", line 647, in open
return io.open(
fs.errors.ResourceNotFound: resource 'vllm-thudm--chatglm-6b/latest' not found
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/workspace/myenv/lib/python3.9/site-packages/bentoml/_internal/store.py", line 118, in _recreate_latest
items = self.list(tag.name)
File "/workspace/myenv/lib/python3.9/site-packages/bentoml/_internal/store.py", line 95, in list
raise NotFound(
bentoml.exceptions.NotFound: no Models with name 'vllm-thudm--chatglm-6b' found
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/workspace/myenv/lib/python3.9/site-packages/openllm/_llm.py", line 198, in __init__
model = bentoml.models.get(self.tag)
File "/workspace/myenv/lib/python3.9/site-packages/simple_di/__init__.py", line 139, in _
return func(*_inject_args(bind.args), **_inject_kwargs(bind.kwargs))
File "/workspace/myenv/lib/python3.9/site-packages/bentoml/models.py", line 45, in get
return _model_store.get(tag)
File "/workspace/myenv/lib/python3.9/site-packages/bentoml/_internal/store.py", line 149, in get
self._recreate_latest(_tag)
File "/workspace/myenv/lib/python3.9/site-packages/bentoml/_internal/store.py", line 120, in _recreate_latest
raise NotFound(
bentoml.exceptions.NotFound: no Models with name 'vllm-thudm--chatglm-6b' exist in BentoML store <osfs '/root/bentoml/models'>
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/workspace/myenv/bin/openllm", line 8, in <module>
sys.exit(cli())
File "/workspace/myenv/lib/python3.9/site-packages/click/core.py", line 1157, in __call__
return self.main(*args, **kwargs)
File "/workspace/myenv/lib/python3.9/site-packages/click/core.py", line 1078, in main
rv = self.invoke(ctx)
File "/workspace/myenv/lib/python3.9/site-packages/click/core.py", line 1688, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/workspace/myenv/lib/python3.9/site-packages/click/core.py", line 1434, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/workspace/myenv/lib/python3.9/site-packages/click/core.py", line 783, in invoke
return __callback(*args, **kwargs)
File "/workspace/myenv/lib/python3.9/site-packages/openllm_cli/entrypoint.py", line 204, in wrapper
return_value = func(*args, **attrs)
File "/workspace/myenv/lib/python3.9/site-packages/click/decorators.py", line 33, in new_func
return f(get_current_context(), *args, **kwargs)
File "/workspace/myenv/lib/python3.9/site-packages/openllm_cli/entrypoint.py", line 183, in wrapper
return f(*args, **attrs)
File "/workspace/myenv/lib/python3.9/site-packages/openllm_cli/entrypoint.py", line 415, in start_command
llm = openllm.LLM[t.Any, t.Any](
File "/usr/local/lib/python3.9/typing.py", line 687, in __call__
result = self.__origin__(*args, **kwargs)
File "/workspace/myenv/lib/python3.9/site-packages/openllm/_llm.py", line 200, in __init__
model = openllm.serialisation.import_model(self, trust_remote_code=self.trust_remote_code)
File "/workspace/myenv/lib/python3.9/site-packages/openllm/serialisation/__init__.py", line 59, in caller
return getattr(importlib.import_module(f'.{serde}', 'openllm.serialisation'), fn)(llm, *args, **kwargs)
File "/workspace/myenv/lib/python3.9/site-packages/openllm/serialisation/transformers/__init__.py", line 29, in import_model
tokenizer = get_tokenizer(llm.model_id, trust_remote_code=trust_remote_code, **hub_attrs, **tokenizer_attrs)
File "/workspace/myenv/lib/python3.9/site-packages/openllm/serialisation/transformers/_helpers.py", line 7, in get_tokenizer
tokenizer = transformers.AutoTokenizer.from_pretrained(model_id_or_path, trust_remote_code=trust_remote_code, **attrs)
File "/workspace/myenv/lib/python3.9/site-packages/transformers/models/auto/tokenization_auto.py", line 755, in from_pretrained
return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
File "/workspace/myenv/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 2024, in from_pretrained
return cls._from_pretrained(
File "/workspace/myenv/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 2256, in _from_pretrained
tokenizer = cls(*init_inputs, **init_kwargs)
File "/root/.cache/huggingface/modules/transformers_modules/thudm/chatglm-6b/8b7d33596d18c5e83e2da052d05ca4db02e60620/tokenization_chatglm.py", line 196, in __init__
super().__init__(
File "/workspace/myenv/lib/python3.9/site-packages/transformers/tokenization_utils.py", line 367, in __init__
self._add_tokens(
File "/workspace/myenv/lib/python3.9/site-packages/transformers/tokenization_utils.py", line 467, in _add_tokens
current_vocab = self.get_vocab().copy()
File "/root/.cache/huggingface/modules/transformers_modules/thudm/chatglm-6b/8b7d33596d18c5e83e2da052d05ca4db02e60620/tokenization_chatglm.py", line 248, in get_vocab
vocab = {self._convert_id_to_token(i): i for i in range(self.vocab_size)}
File "/root/.cache/huggingface/modules/transformers_modules/thudm/chatglm-6b/8b7d33596d18c5e83e2da052d05ca4db02e60620/tokenization_chatglm.py", line 244, in vocab_size
return self.sp_tokenizer.num_tokens
AttributeError: 'ChatGLMTokenizer' object has no attribute 'sp_tokenizer'
(myenv) root@d2722d32a15a:/workspace# cd /root/bentoml/models/vllm-thudm--chatglm-6b
bash: cd: /root/bentoml/models/vllm-thudm--chatglm-6b: No such file or directory
The error message says:
resource 'vllm-thudm--chatglm-6b/latest' not found
Why does this error occur?
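Reading the chained tracebacks bottom-up may help here: the "resource ... not found" errors are only the first steps of a lookup-then-import fallback, and the actual failure is the final AttributeError raised inside chatglm-6b's custom tokenization_chatglm.py (commonly a sign that the remote code is incompatible with the installed transformers version). Because the import fails, no model directory is ever created under /root/bentoml/models, which is also why the cd afterwards fails. A rough sketch of the control flow, with hypothetical stand-in functions:

```python
class NotFound(Exception):
    """Stand-in for bentoml.exceptions.NotFound."""

def read_latest_pointer():
    # store.py first reads the models/<name>/latest pointer file
    raise FileNotFoundError("models/vllm-thudm--chatglm-6b/latest")

def rebuild_latest_from_listing():
    # on failure it falls back to listing the store, which is empty
    raise NotFound("no Models with name 'vllm-thudm--chatglm-6b' found")

def import_model():
    # openllm then tries to import the model from the Hub; this is where
    # the real failure happens, inside the model's custom tokenizer code
    raise AttributeError("'ChatGLMTokenizer' object has no attribute 'sp_tokenizer'")

def get_or_import_model():
    try:
        return read_latest_pointer()
    except FileNotFoundError:
        try:
            return rebuild_latest_from_listing()
        except NotFound:
            return import_model()

try:
    get_or_import_model()
except AttributeError as err:
    print("root cause:", err)
```

So the NotFound messages are expected for a first run; the thing to fix is the tokenizer error, typically by pinning an older transformers release known to work with chatglm-6b's remote code, or pinning a model revision as the download warning suggests.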
留正风 — 03/05/2024, 8:34 AM

atlas l — 04/02/2024, 6:46 AM

hj d — 04/22/2024, 7:36 AM

hj d — 04/22/2024, 7:40 AM

孙琦 — 04/23/2024, 4:24 AM

孙琦 — 04/23/2024, 4:24 AM

Manjusaka — 05/06/2024, 4:06 PM

blake — 08/21/2024, 8:39 AM

z chen — 09/17/2024, 9:17 AM

球子 — 11/25/2024, 9:35 AM