
vllm serve needs a chat template #39

@pzs19

Description


System Info

ValueError: As of transformers v4.44, default chat template is no longer allowed, so you must provide a chat template if the tokenizer does not define one.

Who can help?

No response

Information

  • The official example scripts
  • My own modified scripts

Reproduction

```shell
vllm serve THUDM/LongWriter-llama3.1-8b
curl http://localhost:8000/v1/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "THUDM/LongWriter-llama3.1-8b",
    "prompt": "San Francisco is a",
    "max_tokens": 7,
    "temperature": 0
  }'
```
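One possible workaround, assuming a recent vLLM build: supply a chat template explicitly at launch with vLLM's `--chat-template` option, which is used when the tokenizer does not define one. The template path below is a placeholder, not a file shipped with the model:

```shell
# Launch with an explicit Jinja chat template so the server does not
# fall back to the (now-removed) transformers default template.
# ./chat_template.jinja is a placeholder path; the file must contain a
# Jinja template matching the prompt format the model was trained on.
vllm serve THUDM/LongWriter-llama3.1-8b \
  --chat-template ./chat_template.jinja
```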

Expected behavior

ValueError: As of transformers v4.44, default chat template is no longer allowed, so you must provide a chat template if the tokenizer does not define one.
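For context on what the missing piece is: a chat template is just a rule (shipped as a Jinja string in `tokenizer_config.json`) that flattens a list of role/content messages into one prompt string with the model's special tokens. The sketch below mimics the Llama-3.1-style token layout in plain Python purely for illustration; the real template is Jinja and may differ for this model:

```python
# Illustrative only: mimics a Llama-3.1-style chat template in plain
# Python. Real templates are Jinja strings applied by the tokenizer.
def render_llama3_prompt(messages, add_generation_prompt=True):
    # Start of sequence, then one header/content/eot block per message.
    parts = ["<|begin_of_text|>"]
    for msg in messages:
        parts.append(
            f"<|start_header_id|>{msg['role']}<|end_header_id|>\n\n"
            f"{msg['content']}<|eot_id|>"
        )
    if add_generation_prompt:
        # Open an assistant turn so the model continues from here.
        parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)

prompt = render_llama3_prompt(
    [{"role": "user", "content": "San Francisco is a"}]
)
print(prompt)
```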
