feat: add Volcengine provider with configurable Ark models#47

Closed
囍博士 (qkmaosjtu) wants to merge 2 commits into langchain-ai:main from qkmaosjtu:feat/volcengine-provider

Conversation

@qkmaosjtu

Description

Adds Volcengine (Ark) as a provider and enables users to configure custom Ark model IDs from the UI, rather than hardcoding endpoints.

Related Issue

Fixes #

Type of Change

  • Bug fix (non-breaking change that fixes an issue)
  • New feature (non-breaking change that adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)
  • Documentation update
  • Refactoring (no functional changes)

Checklist

  • I have read the Contributing Guide
  • I have tested my changes locally (npm run lint)
  • My changes generate no new warnings
  • Any dependent changes have been merged and published

Screenshots (if applicable)

Add screenshots to demonstrate UI changes.

Additional Notes

  • Volcengine Ark model IDs are user-specific, so models are now stored as custom entries rather than hardcoded.
  • Uses ARK_API_KEY for authentication and calls Ark via its OpenAI-compatible Responses API.
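
The notes above can be sketched as a provider entry. The base URL and the shape of the config object below are illustrative assumptions (the actual openwork provider schema may differ); `ARK_API_KEY` and user-supplied model IDs come from this PR.

```typescript
// Sketch of registering Volcengine Ark as an OpenAI-compatible provider.
// ProviderConfig is a hypothetical shape, not openwork's real schema.
interface ProviderConfig {
  id: string;
  name: string;
  apiKeyEnvVar: string;
  baseURL: string;
  models: string[]; // user-added Ark endpoint IDs, not hardcoded
}

function makeArkProvider(customModelIds: string[]): ProviderConfig {
  return {
    id: "volcengine",
    name: "Volcengine",
    apiKeyEnvVar: "ARK_API_KEY",
    // Assumed public base URL for Ark's OpenAI-compatible API.
    baseURL: "https://ark.cn-beijing.volces.com/api/v3",
    models: customModelIds,
  };
}

// Example: the user pastes an endpoint ID from their Ark console.
const provider = makeArkProvider(["ep-20260125-abc123"]);
console.log(provider.models.length); // 1
```

Because the model list starts empty, the UI's "Add Model" action is what populates `models` at runtime.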

@qkmaosjtu
Author

Fixes #46

@qkmaosjtu
Author

囍博士 (qkmaosjtu) commented Jan 25, 2026

The screenshot shows the model selector: Volcengine appears in the provider list and is selected; the model panel says “No models available,” with actions for “Edit API Key” and “Add Model.”

Because Ark model IDs are typically random, we need to support custom model names.
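
That point can be sketched with a simple check: Ark model IDs are per-user endpoint IDs, so they cannot ship as a fixed list. The `ep-` prefix pattern below is an assumption for illustration based on typical Ark endpoint IDs.

```typescript
// Hypothetical validator: Ark endpoint IDs look like "ep-20260125-abc123"
// and are generated per user, unlike stable names such as "gpt-4o".
function looksLikeArkEndpointId(id: string): boolean {
  return /^ep-[a-z0-9-]+$/i.test(id);
}

console.log(looksLikeArkEndpointId("ep-20260125-abc123")); // true
console.log(looksLikeArkEndpointId("gpt-4o")); // false
```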


@qkmaosjtu
Author

Aligned the Volcengine icon color with the provider-list style.

Member


LGTM 👍

@hntrl
Member

Hey 囍博士 (@qkmaosjtu)! We're looking to keep the supported provider list for openwork pretty small at the moment since things are early. I think the pecking order will go (1) add support for openai/anthropic/google model proxies, then (2) add third-party model providers on a case-by-case basis (or have some option to bring custom chat models). Closing this in lieu of that, but I'd encourage you to keep a close eye on model-related improvements coming in the near future and evaluate if that works for your needs.

@qkmaosjtu
Author

> Hey 囍博士 (@qkmaosjtu)! We're looking to keep the supported provider list for openwork pretty small at the moment since things are early. I think the pecking order will go (1) add support for openai/anthropic/google model proxies, then (2) add third-party model providers on a case-by-case basis (or have some option to bring custom chat models). Closing this in lieu of that, but I'd encourage you to keep a close eye on model-related improvements coming in the near future and evaluate if that works for your needs.

Thanks for the clarification and the context! That makes sense given where things are today. I’ll keep iterating on some tools for my own needs and will keep an eye on upcoming model-related improvements — hopefully I’ll be able to contribute something back to the community down the line.

