add support for LiteLLM #5
Conversation
This will become the default backend in the future (replacing the in-house backends), so it is immediately required as a non-optional dependency.
Pull Request Overview
This PR adds support for LiteLLM as a new backend provider, which will replace in-house backends in the future. LiteLLM provides a unified interface to 100+ LLM providers through an OpenAI-compatible API format.
- Implements LiteLLMAdapter to convert between think's internal format and OpenAI-compatible format
- Adds LiteLLMClient as the main entry point for LiteLLM functionality
- Integrates LiteLLM into the existing provider system with registration and configuration
Reviewed Changes
Copilot reviewed 5 out of 5 changed files in this pull request and generated 6 comments.
| File | Description |
|---|---|
| think/llm/litellm.py | Complete implementation of LiteLLM adapter and client classes |
| think/llm/base.py | Adds "litellm" to providers list and registers LiteLLMClient |
| tests/llm/test_litellm_adapter.py | Comprehensive test suite for the LiteLLM adapter functionality |
| tests/conftest.py | Adds LiteLLM model URL to test configuration when OpenAI API key is available |
| pyproject.toml | Adds litellm>=1.75.7 as a required dependency |
think/llm/litellm.py (Outdated)
```python
arguments = tc.get("function", {}).get("arguments")
if arguments is None:
    raise ValueError(
        "Missing function arguments in assistant message: %r", tc
```
Copilot AI, Aug 16, 2025
The ValueError constructor is called with two arguments, but the second argument tc is never formatted into the string. Use string formatting, e.g. f"Missing function arguments in assistant message: {tc!r}", or pass a single pre-formatted string.

```diff
-            "Missing function arguments in assistant message: %r", tc
+            f"Missing function arguments in assistant message: {tc!r}"
         )
     name = tc.get("function", {}).get("name")
     if name is None:
         raise ValueError(
             f"Missing function name in assistant message: {tc!r}"
         )
     arguments = tc.get("function", {}).get("arguments")
     if arguments is None:
         raise ValueError(
             f"Missing function arguments in assistant message: {tc!r}"
```
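The pitfall Copilot flags can be reproduced in isolation. Unlike logging calls, the ValueError constructor does not interpolate %-style placeholders; extra positional arguments simply become part of the exception's args tuple. The tc dict below is a stand-in for illustration, not think's actual data structure:

```python
# Hypothetical tool-call dict, standing in for the real `tc` value.
tc = {"id": "call_1"}

# Two-argument form: the %r placeholder is left verbatim, and str()
# renders the whole args tuple rather than a formatted message.
err = ValueError("Missing function arguments in assistant message: %r", tc)
print(str(err))
print(err.args)

# f-string form: produces the intended, fully formatted message.
err2 = ValueError(f"Missing function arguments in assistant message: {tc!r}")
print(str(err2))
```

This is why the suggested change switches to an f-string: only logging APIs (e.g. logger.error("...: %r", tc)) perform deferred %-formatting of trailing arguments.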
think/llm/litellm.py (Outdated)
```python
name = tc.get("function", {}).get("name")
if name is None:
    raise ValueError(
        "Missing function name in assistant message: %r", tc
```
Copilot AI, Aug 16, 2025
The ValueError constructor is called with two arguments, but the second argument tc is never formatted into the string. Use string formatting, e.g. f"Missing function name in assistant message: {tc!r}", or pass a single pre-formatted string.

```diff
-            "Missing function name in assistant message: %r", tc
+            f"Missing function name in assistant message: {tc!r}"
```
think/llm/litellm.py (Outdated)
```python
arguments = tc.get("function", {}).get("arguments")
if arguments is None:
    raise ValueError(
        "Missing function arguments in assistant message: %r", tc
```
Copilot AI, Aug 16, 2025
The ValueError constructor is called with two arguments, but the second argument tc is never formatted into the string. Use string formatting, e.g. f"Missing function arguments in assistant message: {tc!r}", or pass a single pre-formatted string.

```diff
-            "Missing function arguments in assistant message: %r", tc
+            f"Missing function arguments in assistant message: {tc!r}"
```
think/llm/litellm.py (Outdated)
```python
elif role == "user":
    raw_content = message.get("content")
    if raw_content is None:
        raise ValueError("Missing content in user message: %r", message)
```
Copilot AI, Aug 16, 2025
The ValueError constructor is called with two arguments, but the second argument message is never formatted into the string. Use string formatting, e.g. f"Missing content in user message: {message!r}", or pass a single pre-formatted string.

```diff
-    raise ValueError("Missing content in user message: %r", message)
+    raise ValueError(f"Missing content in user message: {message!r}")
```
think/llm/litellm.py (Outdated)
```python
parts.append(
    ContentPart(
        type=ContentType.text,
        text=text or "",
```
Copilot AI, Aug 16, 2025
The expression text or "" is redundant, since text is already guaranteed to be non-None and non-empty by the if text: condition on line 220. The fallback to an empty string is unnecessary here.

```diff
-                text=text or "",
+                text=text,
```