This repository was archived by the owner on Feb 21, 2025. It is now read-only.

[please test] BYOK with ollama #342

@olegklimov

Description

The ollama project makes it easy to self-host our own AI models.

You can set up bring-your-own-key (BYOK) to connect to an ollama server, then check whether StarCoder2 works for code completion and Llama models work for chat.

Does it work at all? What do we need to fix to make it better?
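For testers: a minimal sketch of talking to a locally running ollama server through its OpenAI-compatible endpoint, independent of any BYOK plumbing. It assumes ollama's default port 11434, that `ollama serve` is running, and that a model tag such as `llama3` has already been pulled; the helper names are hypothetical, not part of any existing codebase.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # ollama's default listen address (assumption)

def chat_payload(model: str, prompt: str) -> bytes:
    """Build a JSON request body for ollama's OpenAI-compatible /v1/chat/completions."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for a single complete response instead of a stream
    }).encode()

def chat(model: str, prompt: str) -> str:
    """Send one chat turn to the local ollama server and return the reply text."""
    req = urllib.request.Request(
        OLLAMA_URL + "/v1/chat/completions",
        data=chat_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

A quick smoke test would be `chat("llama3", "Say hello")` with the server up; if that round-trips, pointing BYOK at the same base URL is the next thing to try.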
