# 中文-chinese
  • 孙琦

    04/07/2023, 2:31 AM
    Right. With the Tsinghua mirror, for example, packaging often times out.
  • Chaoyu

    04/07/2023, 2:37 AM
    Yes, you can set the index_url option in bentofile.yaml to point to the mirror you want: https://docs.bentoml.org/en/latest/concepts/bento.html#pip-install-options
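    For example, a minimal bentofile.yaml sketch pointing pip at the Tsinghua mirror (the package list is illustrative):

        python:
          packages:
            - torch
          index_url: https://pypi.tuna.tsinghua.edu.cn/simple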
  • Chaoyu

    04/24/2023, 7:45 AM
    https://hashdork.com/zh-TW/how-to-deploy-stable-diffusion-on-aws/
  • Pen-Hsuan Wang

    05/09/2023, 11:00 AM
    Hi everyone, I'd like to ask about logger configuration. Because I don't want to log too much model information on the user's server, I want to implement a logger independent of 'bentoml', e.g.:

        # logger = logging.getLogger("bentoml")
        logger = logging.getLogger("myapplogger")

    I also use file rotation. But while running, the archived log file still gets the newest messages written into it, and the old messages are overwritten. When using a logger I configure myself, should I turn off bentoml's default logger? I saw issue #1009 (https://github.com/bentoml/BentoML/issues/1009) mention the CLI flag --enable-log-to-file=false, but I don't see that flag in bentoml serve now. How should I configure this under --production? Thanks in advance
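    A minimal sketch of one way to set up an independent logger with only the standard library (names and sizes are illustrative): attach a RotatingFileHandler to your own logger and set propagate = False so records do not also flow into bentoml's handlers.

        import logging
        from logging.handlers import RotatingFileHandler

        # Independent application logger; leaves the "bentoml" logger untouched.
        logger = logging.getLogger("myapplogger")
        logger.setLevel(logging.INFO)
        logger.propagate = False  # keep records out of bentoml's handlers

        # Rotate at ~1 MB, keeping 5 archives (myapp.log.1 ... myapp.log.5).
        handler = RotatingFileHandler("myapp.log", maxBytes=1_000_000, backupCount=5)
        handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
        logger.addHandler(handler)

        logger.info("request handled")

    Note that RotatingFileHandler is not safe across processes: under --production BentoML starts multiple workers, and several processes rotating the same file produces exactly the "archived file keeps receiving new messages" symptom described above. Per-worker file names (e.g. embedding os.getpid()) or a QueueHandler feeding a single writer process are the usual workarounds.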
  • 薛莲

    05/25/2023, 7:41 AM
    Is there a way to automatically switch to another mirror when downloading from one mirror fails?
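    pip has no ordered failover between indexes, but it can consult more than one. A hedged bentofile.yaml sketch using the same pip-install options as above (extra_index_url is documented alongside index_url; pip treats the extra index as an equal-priority alternative rather than a strict fallback):

        python:
          index_url: https://pypi.tuna.tsinghua.edu.cn/simple
          extra_index_url:
            - https://pypi.org/simple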
  • Jesen Chen

    10/08/2023, 9:10 AM
    Quick question: what does the models parameter here do?
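    Assuming this refers to the models argument of bentoml.Service (the message it replied to was deleted): it declares model-store entries the service depends on, so that bentoml build packages them into the bento even when they are loaded directly in service code rather than through a runner. A sketch with an illustrative tag:

        import bentoml

        model = bentoml.models.get("my_model:latest")  # hypothetical model tag
        runner = model.to_runner()

        # models= lists store models the bento must bundle, including any
        # used directly in the service rather than via a runner.
        svc = bentoml.Service("my_service", runners=[runner], models=[model])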
  • Jesen Chen

    10/09/2023, 2:22 AM
    For a custom runner, the model inside needs to be moved to CUDA manually. Is there an interface defined for this, so the model is automatically moved to the GPU when the runner starts?
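    BentoML 1.x does not document automatic device placement for custom runnables; the usual pattern is to declare GPU support on the Runnable class and do the device placement yourself in __init__, which runs when each runner worker starts. A minimal sketch assuming PyTorch (model path and method names are illustrative):

        import bentoml
        import torch

        class MyRunnable(bentoml.Runnable):
            SUPPORTED_RESOURCES = ("nvidia.com/gpu", "cpu")
            SUPPORTS_CPU_MULTI_THREADING = True

            def __init__(self):
                # Called when the runner worker starts: choose the device
                # and move the model there explicitly.
                self.device = "cuda" if torch.cuda.is_available() else "cpu"
                self.model = torch.load("model.pt", map_location=self.device)
                self.model.eval()

            @bentoml.Runnable.method(batchable=False)
            def predict(self, x: torch.Tensor) -> torch.Tensor:
                with torch.no_grad():
                    return self.model(x.to(self.device))

        runner = bentoml.Runner(MyRunnable, name="my_runner")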
  • Quincy yang

    12/07/2023, 6:22 AM
    Hi, I followed the official OpenLLM docs, but running the command below fails:
    (myenv) root@d2722d32a15a:/workspace# TRUST_REMOTE_CODE=True openllm start thudm/chatglm-6b --backend vllm
    config.json: 100%|████| 773/773 [00:00<00:00, 90.4kB/s]
    configuration_chatglm.py: 100%|████| 4.28k/4.28k [00:00<00:00, 609kB/s]
    A new version of the following files was downloaded from <https://huggingface.co/thudm/chatglm-6b>:
    - configuration_chatglm.py
    . Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.
    tokenizer_config.json: 100%|████| 441/441 [00:00<00:00, 316kB/s]
    tokenization_chatglm.py: 100%|████| 17.0k/17.0k [00:00<00:00, 283kB/s]
    A new version of the following files was downloaded from <https://huggingface.co/thudm/chatglm-6b>:
    - tokenization_chatglm.py
    . Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.
    ice_text.model: 100%|████| 2.71M/2.71M [00:08<00:00, 310kB/s]
    Traceback (most recent call last):
      File "/workspace/myenv/lib/python3.9/site-packages/fs/osfs.py", line 647, in open
        return io.open(
    FileNotFoundError: [Errno 2] No such file or directory: b'/root/bentoml/models/vllm-thudm--chatglm-6b/latest'
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/workspace/myenv/lib/python3.9/site-packages/bentoml/_internal/store.py", line 144, in get
        _tag.version = self._fs.readtext(_tag.latest_path())
      File "/workspace/myenv/lib/python3.9/site-packages/fs/base.py", line 693, in readtext
        self.open(
      File "/workspace/myenv/lib/python3.9/site-packages/fs/osfs.py", line 647, in open
        return io.open(
      File "/workspace/myenv/lib/python3.9/site-packages/fs/error_tools.py", line 89, in __exit__
        reraise(fserror, fserror(self._path, exc=exc_value), traceback)
      File "/workspace/myenv/lib/python3.9/site-packages/six.py", line 718, in reraise
        raise value.with_traceback(tb)
      File "/workspace/myenv/lib/python3.9/site-packages/fs/osfs.py", line 647, in open
        return io.open(
    fs.errors.ResourceNotFound: resource 'vllm-thudm--chatglm-6b/latest' not found
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/workspace/myenv/lib/python3.9/site-packages/bentoml/_internal/store.py", line 118, in _recreate_latest
        items = self.list(tag.name)
      File "/workspace/myenv/lib/python3.9/site-packages/bentoml/_internal/store.py", line 95, in list
        raise NotFound(
    bentoml.exceptions.NotFound: no Models with name 'vllm-thudm--chatglm-6b' found
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/workspace/myenv/lib/python3.9/site-packages/openllm/_llm.py", line 198, in __init__
        model = bentoml.models.get(self.tag)
      File "/workspace/myenv/lib/python3.9/site-packages/simple_di/__init__.py", line 139, in _
        return func(*_inject_args(bind.args), **_inject_kwargs(bind.kwargs))
      File "/workspace/myenv/lib/python3.9/site-packages/bentoml/models.py", line 45, in get
        return _model_store.get(tag)
      File "/workspace/myenv/lib/python3.9/site-packages/bentoml/_internal/store.py", line 149, in get
        self._recreate_latest(_tag)
      File "/workspace/myenv/lib/python3.9/site-packages/bentoml/_internal/store.py", line 120, in _recreate_latest
        raise NotFound(
    bentoml.exceptions.NotFound: no Models with name 'vllm-thudm--chatglm-6b' exist in BentoML store <osfs '/root/bentoml/models'>
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/workspace/myenv/bin/openllm", line 8, in <module>
        sys.exit(cli())
      File "/workspace/myenv/lib/python3.9/site-packages/click/core.py", line 1157, in __call__
        return self.main(*args, **kwargs)
      File "/workspace/myenv/lib/python3.9/site-packages/click/core.py", line 1078, in main
        rv = self.invoke(ctx)
      File "/workspace/myenv/lib/python3.9/site-packages/click/core.py", line 1688, in invoke
        return _process_result(sub_ctx.command.invoke(sub_ctx))
      File "/workspace/myenv/lib/python3.9/site-packages/click/core.py", line 1434, in invoke
        return ctx.invoke(self.callback, **ctx.params)
      File "/workspace/myenv/lib/python3.9/site-packages/click/core.py", line 783, in invoke
        return __callback(*args, **kwargs)
      File "/workspace/myenv/lib/python3.9/site-packages/openllm_cli/entrypoint.py", line 204, in wrapper
        return_value = func(*args, **attrs)
      File "/workspace/myenv/lib/python3.9/site-packages/click/decorators.py", line 33, in new_func
        return f(get_current_context(), *args, **kwargs)
      File "/workspace/myenv/lib/python3.9/site-packages/openllm_cli/entrypoint.py", line 183, in wrapper
        return f(*args, **attrs)
      File "/workspace/myenv/lib/python3.9/site-packages/openllm_cli/entrypoint.py", line 415, in start_command
        llm = openllm.LLM[t.Any, t.Any](
      File "/usr/local/lib/python3.9/typing.py", line 687, in __call__
        result = self.__origin__(*args, **kwargs)
      File "/workspace/myenv/lib/python3.9/site-packages/openllm/_llm.py", line 200, in __init__
        model = openllm.serialisation.import_model(self, trust_remote_code=self.trust_remote_code)
      File "/workspace/myenv/lib/python3.9/site-packages/openllm/serialisation/__init__.py", line 59, in caller
        return getattr(importlib.import_module(f'.{serde}', 'openllm.serialisation'), fn)(llm, *args, **kwargs)
      File "/workspace/myenv/lib/python3.9/site-packages/openllm/serialisation/transformers/__init__.py", line 29, in import_model
        tokenizer = get_tokenizer(llm.model_id, trust_remote_code=trust_remote_code, **hub_attrs, **tokenizer_attrs)
      File "/workspace/myenv/lib/python3.9/site-packages/openllm/serialisation/transformers/_helpers.py", line 7, in get_tokenizer
        tokenizer = transformers.AutoTokenizer.from_pretrained(model_id_or_path, trust_remote_code=trust_remote_code, **attrs)
      File "/workspace/myenv/lib/python3.9/site-packages/transformers/models/auto/tokenization_auto.py", line 755, in from_pretrained
        return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
      File "/workspace/myenv/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 2024, in from_pretrained
        return cls._from_pretrained(
      File "/workspace/myenv/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 2256, in _from_pretrained
        tokenizer = cls(*init_inputs, **init_kwargs)
      File "/root/.cache/huggingface/modules/transformers_modules/thudm/chatglm-6b/8b7d33596d18c5e83e2da052d05ca4db02e60620/tokenization_chatglm.py", line 196, in __init__
        super().__init__(
      File "/workspace/myenv/lib/python3.9/site-packages/transformers/tokenization_utils.py", line 367, in __init__
        self._add_tokens(
      File "/workspace/myenv/lib/python3.9/site-packages/transformers/tokenization_utils.py", line 467, in _add_tokens
        current_vocab = self.get_vocab().copy()
      File "/root/.cache/huggingface/modules/transformers_modules/thudm/chatglm-6b/8b7d33596d18c5e83e2da052d05ca4db02e60620/tokenization_chatglm.py", line 248, in get_vocab
        vocab = {self._convert_id_to_token(i): i for i in range(self.vocab_size)}
      File "/root/.cache/huggingface/modules/transformers_modules/thudm/chatglm-6b/8b7d33596d18c5e83e2da052d05ca4db02e60620/tokenization_chatglm.py", line 244, in vocab_size
        return self.sp_tokenizer.num_tokens
    AttributeError: 'ChatGLMTokenizer' object has no attribute 'sp_tokenizer'
    (myenv) root@d2722d32a15a:/workspace# cd /root/bentoml/models/vllm-thudm--chatglm-6b
    bash: cd: /root/bentoml/models/vllm-thudm--chatglm-6b: No such file or directory
    The error message says:

        resource 'vllm-thudm--chatglm-6b/latest' not found

    Why does this error occur?
  • 留正风

    03/05/2024, 8:34 AM
    This model needs to be downloaded separately into the corresponding directory; it is not bundled by default.
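    Also note that the root failure in the traceback is the ChatGLMTokenizer AttributeError raised while importing the model's remote tokenizer code; the "latest not found" message only means the import never finished, so nothing was saved to the BentoML model store. The download log itself suggests pinning a revision of the remote code. A hedged sketch of pre-downloading with huggingface_hub, pinning the revision hash that appears in the traceback:

        from huggingface_hub import snapshot_download

        # Pin the remote-code revision so tokenization_chatglm.py cannot
        # change underneath you (hash taken from the traceback above).
        snapshot_download(
            repo_id="THUDM/chatglm-6b",
            revision="8b7d33596d18c5e83e2da052d05ca4db02e60620",
        )

    If the AttributeError persists at that revision, the model's custom tokenizer code is likely incompatible with the installed transformers version, and pinning an older transformers release is a commonly reported workaround.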
  • atlas l

    04/02/2024, 6:46 AM
    When creating a token I got this message: "get user organization: cannot found organization: record not found". Where can I change the user information for this?
  • hj d

    04/22/2024, 7:36 AM
    Tried to run bentoml build and bentoml containerize. It failed with the following error:

        bentoml.exceptions.BentoMLException: Command '['/usr/bin/docker', 'build', '--tag', 'face_service:ys35eyqao24dwaav', '--file', '/tmp/tmpocxs3mwafsTempFS/env/docker/Dockerfile', '/tmp/tmpocxs3mwafsTempFS/']' returned non-zero exit status 1.
  • hj d

    04/22/2024, 7:40 AM
    Has anyone run into this issue? How do I fix it?
  • 孙琦

    04/23/2024, 4:24 AM
    How do I switch to a domestic (China) mirror when running bentoml containerize?
  • 孙琦

    04/23/2024, 4:24 AM
    1. University of Science and Technology of China: https://mirrors.ustc.edu.cn/debian/
    2. Tsinghua University: https://mirrors.tuna.tsinghua.edu.cn/debian/
    3. Beijing Foreign Studies University: https://mirror.bfsu.edu.cn/debian/
    4. Alibaba Cloud: https://mirrors.aliyun.com/debian/
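    One way to apply such a mirror during bentoml containerize, sketched under the assumption of a Debian-based base image: point bentofile.yaml at a custom Dockerfile template and rewrite the apt sources there. The template extends BentoML's documented bento_base_template; the sources.list path varies by base image.

        # bentofile.yaml
        docker:
          dockerfile_template: Dockerfile.template

        # Dockerfile.template
        {% extends bento_base_template %}
        {% block SETUP_BENTO_BASE_IMAGE %}
        {{ super() }}
        # Rewrite apt sources to the USTC mirror (illustrative choice).
        RUN sed -i 's|deb.debian.org|mirrors.ustc.edu.cn|g' /etc/apt/sources.list
        {% endblock %}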
  • Manjusaka

    05/06/2024, 4:06 PM
    By the way, where can I report bugs about the cloud? This one may be security-related.
  • blake

    08/21/2024, 8:39 AM
    👋 Hello,
  • z chen

    09/17/2024, 9:17 AM
    👋 Hi everyone!
  • 球子

    11/25/2024, 9:35 AM
    Just joined! What did I miss?