Status: Open
Labels: bug
Description
Describe the bug
When the OpenAI environment variables point to llama.cpp's server (which exposes an OpenAI-compatible chat completions endpoint), AudioMuse-AI incorrectly sends an Ollama-style POST body.
To Reproduce
Steps to reproduce the behavior:
- Set up OpenAI env vars
- Connect to llamacpp-server
- Run Clustering
Expected behavior
AudioMuse-AI sends a correctly formatted OpenAI chat completions payload, and the LLM returns a playlist name.
Screenshots
If applicable, add screenshots to help explain your problem.
2026-03-10T22:57:00.060442898Z srv log_server_r: done request: POST /v1/chat/completions 192.168.1.102 400
2026-03-10T22:57:00.060453708Z srv log_server_r: request: {"model": "Qwen3.5-9B-Q4_K_M.gguf", "prompt": "You are an expert music collector and MUST give a title to this playlist
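For reference, an OpenAI-compatible `/v1/chat/completions` endpoint expects a `messages` array, whereas the request body in the log above uses an Ollama-style `prompt` field, which explains the 400 response. A minimal sketch of the difference (the model name is copied from the log; the prompt text is truncated in the log, so only its opening is shown here):

```python
import json

# Ollama-style body (what AudioMuse-AI appears to send, per the log above):
ollama_style = {
    "model": "Qwen3.5-9B-Q4_K_M.gguf",
    "prompt": "You are an expert music collector and MUST give a title to this playlist",
}

# OpenAI chat-completions body (what /v1/chat/completions expects):
openai_style = {
    "model": "Qwen3.5-9B-Q4_K_M.gguf",
    "messages": [
        {
            "role": "user",
            "content": "You are an expert music collector and MUST give a title to this playlist",
        },
    ],
}

print(json.dumps(openai_style, indent=2))
```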
Environment (please complete the following information):
- OS: Linux
- Deployment: Docker-compose
- AudioMuse-AI Version: v0.8.14
- Jellyfin/Navidrome Version: N/A
- Browser: Brave
- CPU: xeon E5
- RAM: 128GB
- Disk: a few :D
Additional context
The issue appears to be caused by this line (line 64 in commit 8fc2492):

is_openai_format = api_key != "no-key-needed" or "openai" in server_url.lower() or "openrouter" in server_url.lower()

It looks like the code inspects the URL to choose between the OpenAI/OpenRouter format and the Ollama format, so any OpenAI-compatible server without "openai" or "openrouter" in its URL (such as llama.cpp's server) falls through to the Ollama path. A cleaner approach might be two functions, one for Ollama and one for OpenAI-compatible endpoints.
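A sketch of the suggested split into two payload builders (all function and parameter names here are hypothetical, not taken from the AudioMuse-AI codebase):

```python
def build_ollama_payload(model: str, prompt: str) -> dict:
    """Request body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}


def build_openai_payload(model: str, prompt: str) -> dict:
    """Request body for an OpenAI-compatible /v1/chat/completions
    endpoint (OpenAI, OpenRouter, llama.cpp server, etc.)."""
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}


def build_payload(server_url: str, api_key: str, model: str, prompt: str) -> dict:
    # Checking for the chat-completions path avoids misclassifying
    # self-hosted OpenAI-compatible servers such as llama.cpp; an
    # explicit provider setting (e.g. an env var) would be more
    # robust still than any URL-based guess.
    is_openai_format = (
        "/v1/chat/completions" in server_url or api_key != "no-key-needed"
    )
    if is_openai_format:
        return build_openai_payload(model, prompt)
    return build_ollama_payload(model, prompt)
```

With this split, a llama.cpp URL ending in `/v1/chat/completions` gets the `messages`-style body even when no API key is set.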