Add BYOLLM support with LiteLLM integration#12

Open
recursiveAF wants to merge 2 commits into main from feature/byollm
Conversation

@recursiveAF
Contributor

Summary

  • Replace the `anthropic` dependency with `litellm`
  • Add provider selection to `noot init` (anthropic, openai, gemini, groq, ollama)
  • Add a `config.py` module for reading the model from `pyproject.toml`
  • `Flow.spawn()` now reads the model from the project config by default
  • Add a `--provider` flag for non-interactive init
  • Update templates and tests

Test plan

  • `uv run pytest tests/` passes (14 tests)
  • `uv run ruff check src/ tests/` passes
  • `uv run pyright src/` passes
  • Manually tested with an Anthropic API key
  • Manually tested with an OpenAI API key

🤖 Generated with Claude Code

recursiveAF and others added 2 commits on January 26, 2026 at 11:44