[BUG] Incorrect payload sent to OpenAI compatible endpoint after clustering #360

@nathn123

Description

Describe the bug
When using the OpenAI env vars and pointing at a llama.cpp server (which exposes the OpenAI chat completions endpoint), it incorrectly sends an Ollama-style POST body.

To Reproduce
Steps to reproduce the behavior:

  1. Set up OpenAI env vars
  2. Connect to llamacpp-server
  3. Run Clustering

Expected behavior
It should send a correctly formatted OpenAI chat completions payload, and the LLM should return a playlist name.
Screenshots

2026-03-10T22:57:00.060442898Z srv  log_server_r: done request: POST /v1/chat/completions 192.168.1.102 400
2026-03-10T22:57:00.060453708Z srv  log_server_r: request:  {"model": "Qwen3.5-9B-Q4_K_M.gguf", "prompt": "You are an expert music collector and MUST give a title to this playlist

Environment (please complete the following information):

  • OS: Linux
  • Deployment: Docker-compose
  • AudioMuse-AI Version: v0.8.14
  • Jellyfin/Navidrome Version: N/A
  • Browser: Brave
  • CPU: xeon E5
  • RAM: 128GB
  • Disk: a few :D

Additional context
This line appears to be causing the issue:

is_openai_format = api_key != "no-key-needed" or "openai" in server_url.lower() or "openrouter" in server_url.lower()

It looks like this heuristic uses the URL (or API key) to choose between the OpenAI/OpenRouter payload format and the Ollama format. Maybe split it into two functions, one for Ollama and one for OpenAI-compatible endpoints.
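A minimal sketch of what that split could look like. All function and parameter names here are illustrative, not taken from the AudioMuse-AI codebase; the point is only that the payload shape is chosen by an explicit provider setting instead of being inferred from the URL or API key:

```python
# Hypothetical sketch: build the request body explicitly per provider type
# instead of guessing the format from server_url / api_key.

def build_openai_payload(model: str, prompt: str) -> dict:
    # OpenAI-compatible /v1/chat/completions expects a "messages" list.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def build_ollama_payload(model: str, prompt: str) -> dict:
    # Ollama's /api/generate expects a flat "prompt" string.
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,
    }

def build_payload(provider: str, model: str, prompt: str) -> dict:
    # Dispatch on an explicit, user-configurable provider value.
    if provider == "openai_compatible":
        return build_openai_payload(model, prompt)
    return build_ollama_payload(model, prompt)
```

With something like this, a llama.cpp server with `api_key = "no-key-needed"` would still get the `messages`-based body as long as the provider is configured as OpenAI-compatible.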

Metadata

Assignees

No one assigned

    Labels

    bug (Something isn't working)

    Projects

    No projects

    Milestone

    No milestone
