A question about token limit #1

@0xzhouchenyu

Description

I cloned your repository and replaced the docs with mine. It works great, but sometimes it throws an error like:

2023-04-25 17:20:16.978 error_code=context_length_exceeded error_message="This model's maximum context length is 4097 tokens. However, your messages resulted in 5613 tokens. Please reduce the length of the messages." error_param=messages error_type=invalid_request_error message='OpenAI API error received' stream_error=False
Get response error: This model's maximum context length is 4097 tokens. However, your messages resulted in 5613 tokens. Please reduce the length of the messages.
Response: Connection error. Please try again later.

As I am not proficient with langchain, I would appreciate your help to resolve this issue. Thank you!
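One common workaround for this error is to trim the oldest conversation turns before each request so the total stays under the model's context window. The sketch below is not langchain-specific and the names (`trim_messages`, `estimate_tokens`, the `reserve` parameter, and the 4-characters-per-token heuristic) are illustrative assumptions, not part of the project in this issue; a real implementation would count tokens with the model's actual tokenizer (e.g. tiktoken).

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English text.
    # Replace with a real tokenizer count (e.g. tiktoken) in practice.
    return max(1, len(text) // 4)

def trim_messages(messages, max_tokens=4097, reserve=512):
    """Drop the oldest non-system messages until the estimated
    total fits within max_tokens, keeping `reserve` tokens free
    for the model's reply.
    """
    budget = max_tokens - reserve
    system, history = messages[:1], messages[1:]
    while history and sum(
        estimate_tokens(m["content"]) for m in [*system, *history]
    ) > budget:
        history.pop(0)  # discard the oldest turn first
    return [*system, *history]
```

The first message (typically the system prompt) is always kept, since dropping it would change the assistant's behavior; only the oldest user/assistant turns are discarded.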

Metadata


    Labels

    bug (Something isn't working)
