feat: Support ModelScope Provider (#8026)

* feat: support modelscop

* feat: support modelscop

* style: update llm.ts

* style: update modelscope
hedeqiang 2025-06-01 14:54:32 +08:00 committed by GitHub
parent 2f28a93d89
commit 7b91dfddd5
No known key found for this signature in database
GPG key ID: B5690EEEBB952194
14 changed files with 467 additions and 4 deletions

View file

@@ -186,6 +186,8 @@ ENV \
MINIMAX_API_KEY="" MINIMAX_MODEL_LIST="" \
# Mistral
MISTRAL_API_KEY="" MISTRAL_MODEL_LIST="" \
# ModelScope
MODELSCOPE_API_KEY="" MODELSCOPE_MODEL_LIST="" MODELSCOPE_PROXY_URL="" \
# Moonshot
MOONSHOT_API_KEY="" MOONSHOT_MODEL_LIST="" MOONSHOT_PROXY_URL="" \
# Novita

View file

@@ -230,6 +230,8 @@ ENV \
MINIMAX_API_KEY="" MINIMAX_MODEL_LIST="" \
# Mistral
MISTRAL_API_KEY="" MISTRAL_MODEL_LIST="" \
# ModelScope
MODELSCOPE_API_KEY="" MODELSCOPE_MODEL_LIST="" MODELSCOPE_PROXY_URL="" \
# Moonshot
MOONSHOT_API_KEY="" MOONSHOT_MODEL_LIST="" MOONSHOT_PROXY_URL="" \
# Novita

View file

@@ -188,6 +188,8 @@ ENV \
MINIMAX_API_KEY="" MINIMAX_MODEL_LIST="" \
# Mistral
MISTRAL_API_KEY="" MISTRAL_MODEL_LIST="" \
# ModelScope
MODELSCOPE_API_KEY="" MODELSCOPE_MODEL_LIST="" MODELSCOPE_PROXY_URL="" \
# Moonshot
MOONSHOT_API_KEY="" MOONSHOT_MODEL_LIST="" MOONSHOT_PROXY_URL="" \
# Novita

View file

@@ -0,0 +1,113 @@
---
title: ModelScope Provider Setup
description: Learn how to configure and use the ModelScope provider in LobeChat
tags:
- ModelScope
---
# ModelScope Provider Setup
ModelScope (魔搭社区) is Alibaba's open-source model community, providing access to a wide range of AI models. This guide will help you set up the ModelScope provider in LobeChat.
## Prerequisites
Before using ModelScope API, you need to:
1. **Create a ModelScope Account**
- Visit [ModelScope](https://www.modelscope.cn/)
- Register for an account
2. **Bind Alibaba Cloud Account**
- **Important**: ModelScope API requires binding with an Alibaba Cloud account
- Visit your [ModelScope Access Token page](https://www.modelscope.cn/my/myaccesstoken)
- Follow the instructions to bind your Alibaba Cloud account
- This step is mandatory for API access
3. **Get API Token**
- After binding your Alibaba Cloud account, generate an API token
- Copy the token for use in LobeChat
## Configuration
### Environment Variables
Add the following environment variables to your `.env` file:
```bash
# Enable ModelScope provider
ENABLED_MODELSCOPE=1
# ModelScope API key (required)
MODELSCOPE_API_KEY=your_modelscope_api_token
# Optional: Custom model list (comma-separated)
MODELSCOPE_MODEL_LIST=deepseek-ai/DeepSeek-V3-0324,Qwen/Qwen3-235B-A22B
# Optional: Proxy URL if needed
MODELSCOPE_PROXY_URL=https://your-proxy-url
```
### Docker Configuration
If using Docker, add the ModelScope environment variables to your `docker-compose.yml`:
```yaml
environment:
- ENABLED_MODELSCOPE=1
- MODELSCOPE_API_KEY=your_modelscope_api_token
- MODELSCOPE_MODEL_LIST=deepseek-ai/DeepSeek-V3-0324,Qwen/Qwen3-235B-A22B
```
## Available Models
ModelScope provides access to a variety of models, including:
- **DeepSeek Models**: DeepSeek-V3, DeepSeek-R1 series
- **Qwen Models**: Qwen3 series, Qwen2.5 series
- **Llama Models**: Meta-Llama-3 series
- **Other Models**: Various open-source models
## Troubleshooting
### Common Issues
1. **"Please bind your Alibaba Cloud account before use" Error**
- This means you haven't bound your Alibaba Cloud account to ModelScope
- Visit [ModelScope Access Token page](https://www.modelscope.cn/my/myaccesstoken)
- Complete the Alibaba Cloud account binding process
2. **401 Authentication Error**
- Check if your API token is correct
- Ensure the token hasn't expired
- Verify that your Alibaba Cloud account is properly bound
3. **Model Not Available**
- Some models may require additional permissions
- Check the model's page on ModelScope for access requirements
### Debug Mode
Enable debug mode to see detailed logs:
```bash
DEBUG_MODELSCOPE_CHAT_COMPLETION=1
```
## Notes
- The ModelScope API is compatible with the OpenAI API format
- The service is primarily designed for users in China
- Some models may have usage limitations or require additional verification
- API responses default to Chinese for some models
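Because the endpoint follows the OpenAI wire format, any OpenAI-compatible client can talk to it directly. A minimal sketch of building a request (the `buildChatPayload` helper and the `stream` default are illustrative assumptions; the base URL matches the one used by LobeChat's runtime):

```typescript
// Builds an OpenAI-style chat completion payload for ModelScope's
// OpenAI-compatible endpoint. Helper name is hypothetical; the model id is
// one of the defaults shipped in this PR.
const MODELSCOPE_BASE_URL = 'https://api-inference.modelscope.cn/v1';

interface ChatPayload {
  messages: { content: string; role: 'assistant' | 'system' | 'user' }[];
  model: string;
  stream: boolean;
}

const buildChatPayload = (model: string, prompt: string): ChatPayload => ({
  messages: [{ content: prompt, role: 'user' }],
  model,
  stream: true,
});

// To send it (requires a valid token bound to an Alibaba Cloud account):
// await fetch(`${MODELSCOPE_BASE_URL}/chat/completions`, {
//   body: JSON.stringify(buildChatPayload('deepseek-ai/DeepSeek-V3-0324', 'Hello')),
//   headers: {
//     'Authorization': `Bearer ${process.env.MODELSCOPE_API_KEY}`,
//     'Content-Type': 'application/json',
//   },
//   method: 'POST',
// });
```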
## Support
For ModelScope-specific issues:
- Visit [ModelScope Documentation](https://www.modelscope.cn/docs)
- Check [ModelScope Community](https://www.modelscope.cn/community)
For LobeChat integration issues:
- Check our [GitHub Issues](https://github.com/lobehub/lobe-chat/issues)
- Join our community discussions

View file

@@ -0,0 +1,133 @@
---
title: ModelScope Provider Setup
description: Learn how to configure and use the ModelScope provider in LobeChat
tags:
- ModelScope
---
# ModelScope Provider Setup
ModelScope (魔搭社区) is Alibaba's open-source model community, providing access to a wide range of AI models. This guide will help you set up the ModelScope provider in LobeChat.
## Prerequisites
Before using the ModelScope API, you need to:
1. **Create a ModelScope Account**
- Visit [ModelScope](https://www.modelscope.cn/)
- Register for an account
2. **Bind an Alibaba Cloud Account**
- **Important**: The ModelScope API requires binding with an Alibaba Cloud account
- Visit your [ModelScope access token page](https://www.modelscope.cn/my/myaccesstoken)
- Follow the instructions to bind your Alibaba Cloud account
- This step is mandatory for API access
3. **Get an API Token**
- After binding your Alibaba Cloud account, generate an API token
- Copy the token for use in LobeChat
## Configuration
### Environment Variables
Add the following environment variables to your `.env` file:
```bash
# Enable the ModelScope provider
ENABLED_MODELSCOPE=1
# ModelScope API key (required)
MODELSCOPE_API_KEY=your_modelscope_api_token
# Optional: custom model list (comma-separated)
MODELSCOPE_MODEL_LIST=deepseek-ai/DeepSeek-V3-0324,Qwen/Qwen3-235B-A22B
# Optional: proxy URL if needed
MODELSCOPE_PROXY_URL=https://your-proxy-url
```
### Docker Configuration
If you use Docker, add the ModelScope environment variables to your `docker-compose.yml`:
```yaml
environment:
- ENABLED_MODELSCOPE=1
- MODELSCOPE_API_KEY=your_modelscope_api_token
- MODELSCOPE_MODEL_LIST=deepseek-ai/DeepSeek-V3-0324,Qwen/Qwen3-235B-A22B
```
## Available Models
ModelScope provides access to a variety of models, including:
- **DeepSeek models**: DeepSeek-V3, DeepSeek-R1 series
- **Qwen models**: Qwen3 series, Qwen2.5 series
- **Llama models**: Meta-Llama-3 series
- **Other models**: various open-source models
## Troubleshooting
### Common Issues
1. **"Please bind your Alibaba Cloud account before use" error**
- This means you have not yet bound your Alibaba Cloud account to ModelScope
- Visit the [ModelScope access token page](https://www.modelscope.cn/my/myaccesstoken)
- Complete the Alibaba Cloud account binding process
2. **401 authentication error**
- Check whether your API token is correct
- Make sure the token has not expired
- Verify that your Alibaba Cloud account is properly bound
3. **Model not available**
- Some models may require additional permissions
- Check the model's page on ModelScope for access requirements
### Debug Mode
Enable debug mode to see detailed logs:
```bash
DEBUG_MODELSCOPE_CHAT_COMPLETION=1
```
## Notes
- The ModelScope API is compatible with the OpenAI API format
- The service is primarily designed for users in China
- Some models may have usage limitations or require additional verification
- API responses default to Chinese for some models
## Support
For ModelScope-specific issues:
- Visit the [ModelScope documentation](https://www.modelscope.cn/docs)
- Check the [ModelScope community](https://www.modelscope.cn/community)
For LobeChat integration issues:
- Check our [GitHub Issues](https://github.com/lobehub/lobe-chat/issues)
- Join our community discussions
## Model ID Format
ModelScope uses namespace-prefixed model IDs, for example:
```
deepseek-ai/DeepSeek-V3-0324
deepseek-ai/DeepSeek-R1-0528
Qwen/Qwen3-235B-A22B
Qwen/Qwen3-32B
```
Use the full model ID format when configuring the model list.
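The namespace prefix can be split off programmatically, which is handy when grouping models by owner. A small sketch (`parseModelId` is a hypothetical helper name; the ids are the ones listed above):

```typescript
// Splits a namespace-prefixed ModelScope model id into its owner and name.
// The model name itself may contain no further '/' in practice, but we join
// the remainder defensively.
const parseModelId = (id: string): { name: string; owner: string } => {
  const [owner, ...rest] = id.split('/');
  return { name: rest.join('/'), owner };
};
```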
## API Limits
- The ModelScope API is rate limited
- Some models may require special permissions
- Monitoring API usage in production is recommended
- Some premium models may require payment
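When rate limits are hit, clients typically back off and retry. A minimal exponential-backoff sketch (the base delay, cap, and function name are illustrative assumptions, not documented ModelScope values):

```typescript
// Exponential backoff with a cap: 500 ms, 1 s, 2 s, ... up to 30 s.
// Constants are illustrative; ModelScope does not publish specific values.
const backoffDelayMs = (attempt: number, baseMs = 500, maxMs = 30_000): number =>
  Math.min(baseMs * 2 ** attempt, maxMs);

// Usage inside a retry loop (sketch):
// for (let attempt = 0; attempt < 5; attempt++) {
//   const res = await fetch(url, init);
//   if (res.status !== 429) break;
//   await new Promise((resolve) => setTimeout(resolve, backoffDelayMs(attempt)));
// }
```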

View file

@@ -24,6 +24,7 @@ import { default as jina } from './jina';
import { default as lmstudio } from './lmstudio';
import { default as minimax } from './minimax';
import { default as mistral } from './mistral';
import { default as modelscope } from './modelscope';
import { default as moonshot } from './moonshot';
import { default as novita } from './novita';
import { default as nvidia } from './nvidia';
@@ -97,6 +98,7 @@ export const LOBE_DEFAULT_MODEL_LIST = buildDefaultModelList({
lmstudio,
minimax,
mistral,
modelscope,
moonshot,
novita,
nvidia,
@@ -151,6 +153,7 @@ export { default as jina } from './jina';
export { default as lmstudio } from './lmstudio';
export { default as minimax } from './minimax';
export { default as mistral } from './mistral';
export { default as modelscope } from './modelscope';
export { default as moonshot } from './moonshot';
export { default as novita } from './novita';
export { default as nvidia } from './nvidia';

View file

@@ -0,0 +1,63 @@
import { AIChatModelCard } from '@/types/aiModel';
const modelscopeChatModels: AIChatModelCard[] = [
{
abilities: {
functionCall: true,
},
contextWindowTokens: 131_072,
description: 'DeepSeek-V3是DeepSeek第三代模型在多项基准测试中表现优异。',
displayName: 'DeepSeek-V3-0324',
enabled: true,
id: 'deepseek-ai/DeepSeek-V3-0324',
type: 'chat',
},
{
abilities: {
functionCall: true,
},
contextWindowTokens: 131_072,
description: 'DeepSeek-V3是DeepSeek第三代模型的最新版本具有强大的推理和对话能力。',
displayName: 'DeepSeek-V3',
enabled: true,
id: 'deepseek-ai/DeepSeek-V3',
type: 'chat',
},
{
abilities: {
functionCall: true,
},
contextWindowTokens: 131_072,
description: 'DeepSeek-R1是DeepSeek最新的推理模型专注于复杂推理任务。',
displayName: 'DeepSeek-R1',
enabled: true,
id: 'deepseek-ai/DeepSeek-R1',
type: 'chat',
},
{
abilities: {
functionCall: true,
},
contextWindowTokens: 131_072,
description: 'Qwen3-235B-A22B是通义千问3代超大规模模型提供顶级的AI能力。',
displayName: 'Qwen3-235B-A22B',
enabled: true,
id: 'Qwen/Qwen3-235B-A22B',
type: 'chat',
},
{
abilities: {
functionCall: true,
},
contextWindowTokens: 131_072,
description: 'Qwen3-32B是通义千问3代模型具有强大的推理和对话能力。',
displayName: 'Qwen3-32B',
enabled: true,
id: 'Qwen/Qwen3-32B',
type: 'chat',
},
];
export const allModels = [...modelscopeChatModels];
export default allModels;

View file

@@ -162,6 +162,9 @@ export const getLLMConfig = () => {
ENABLED_INFINIAI: z.boolean(),
INFINIAI_API_KEY: z.string().optional(),
ENABLED_MODELSCOPE: z.boolean(),
MODELSCOPE_API_KEY: z.string().optional(),
},
runtimeEnv: {
API_KEY_SELECT_MODE: process.env.API_KEY_SELECT_MODE,
@@ -322,6 +325,9 @@ export const getLLMConfig = () => {
ENABLED_INFINIAI: !!process.env.INFINIAI_API_KEY,
INFINIAI_API_KEY: process.env.INFINIAI_API_KEY,
ENABLED_MODELSCOPE: !!process.env.MODELSCOPE_API_KEY,
MODELSCOPE_API_KEY: process.env.MODELSCOPE_API_KEY,
},
});
};

View file

@@ -24,6 +24,7 @@ import JinaProvider from './jina';
import LMStudioProvider from './lmstudio';
import MinimaxProvider from './minimax';
import MistralProvider from './mistral';
import ModelScopeProvider from './modelscope';
import MoonshotProvider from './moonshot';
import NovitaProvider from './novita';
import NvidiaProvider from './nvidia';
@@ -67,6 +68,7 @@ export const LOBE_DEFAULT_MODEL_LIST: ChatModelCard[] = [
GithubProvider.chatModels,
MinimaxProvider.chatModels,
MistralProvider.chatModels,
ModelScopeProvider.chatModels,
MoonshotProvider.chatModels,
OllamaProvider.chatModels,
VLLMProvider.chatModels,
@@ -130,6 +132,7 @@ export const DEFAULT_MODEL_PROVIDER_LIST = [
GroqProvider,
PerplexityProvider,
MistralProvider,
ModelScopeProvider,
Ai21Provider,
UpstageProvider,
XAIProvider,
@@ -194,6 +197,7 @@ export { default as JinaProviderCard } from './jina';
export { default as LMStudioProviderCard } from './lmstudio';
export { default as MinimaxProviderCard } from './minimax';
export { default as MistralProviderCard } from './mistral';
export { default as ModelScopeProviderCard } from './modelscope';
export { default as MoonshotProviderCard } from './moonshot';
export { default as NovitaProviderCard } from './novita';
export { default as NvidiaProviderCard } from './nvidia';

View file

@@ -0,0 +1,62 @@
import { ModelProviderCard } from '@/types/llm';
// ref: https://modelscope.cn/docs/model-service/API-Inference/intro
const ModelScope: ModelProviderCard = {
chatModels: [
{
contextWindowTokens: 131_072,
description: 'DeepSeek-V3是DeepSeek第三代模型在多项基准测试中表现优异。',
displayName: 'DeepSeek-V3-0324',
enabled: true,
functionCall: true,
id: 'deepseek-ai/DeepSeek-V3-0324',
},
{
contextWindowTokens: 131_072,
description: 'DeepSeek-V3是DeepSeek第三代模型的最新版本具有强大的推理和对话能力。',
displayName: 'DeepSeek-V3',
enabled: true,
functionCall: true,
id: 'deepseek-ai/DeepSeek-V3',
},
{
contextWindowTokens: 131_072,
description: 'DeepSeek-R1是DeepSeek最新的推理模型专注于复杂推理任务。',
displayName: 'DeepSeek-R1',
enabled: true,
functionCall: true,
id: 'deepseek-ai/DeepSeek-R1',
},
{
contextWindowTokens: 131_072,
description: 'Qwen3-235B-A22B是通义千问3代超大规模模型提供顶级的AI能力。',
displayName: 'Qwen3-235B-A22B',
enabled: true,
functionCall: true,
id: 'Qwen/Qwen3-235B-A22B',
},
{
contextWindowTokens: 131_072,
description: 'Qwen3-32B是通义千问3代模型具有强大的推理和对话能力。',
displayName: 'Qwen3-32B',
enabled: true,
functionCall: true,
id: 'Qwen/Qwen3-32B',
},
],
checkModel: 'Qwen/Qwen3-32B',
description: 'ModelScope是阿里云推出的模型即服务平台提供丰富的AI模型和推理服务。',
id: 'modelscope',
modelList: { showModelFetcher: true },
name: 'ModelScope',
settings: {
proxyUrl: {
placeholder: 'https://api-inference.modelscope.cn/v1',
},
sdkType: 'openai',
showModelFetcher: true,
},
url: 'https://modelscope.cn',
};
export default ModelScope;

View file

@@ -0,0 +1,69 @@
import type { ChatModelCard } from '@/types/llm';
import { ModelProvider } from '../types';
import { LobeOpenAICompatibleFactory } from '../utils/openaiCompatibleFactory';
export interface ModelScopeModelCard {
created: number;
id: string;
object: string;
owned_by: string;
}
export const LobeModelScopeAI = LobeOpenAICompatibleFactory({
baseURL: 'https://api-inference.modelscope.cn/v1',
debug: {
chatCompletion: () => process.env.DEBUG_MODELSCOPE_CHAT_COMPLETION === '1',
},
models: async ({ client }) => {
const { LOBE_DEFAULT_MODEL_LIST } = await import('@/config/aiModels');
const functionCallKeywords = ['qwen', 'deepseek', 'llama'];
const visionKeywords = ['qwen-vl', 'qwen2-vl', 'llava'];
const reasoningKeywords = ['qwq', 'deepseek-r1'];
try {
const modelsPage = (await client.models.list()) as any;
const modelList: ModelScopeModelCard[] = modelsPage.data || [];
return modelList
.map((model) => {
const knownModel = LOBE_DEFAULT_MODEL_LIST.find(
(m) => model.id.toLowerCase() === m.id.toLowerCase(),
);
const modelId = model.id.toLowerCase();
return {
contextWindowTokens: knownModel?.contextWindowTokens ?? undefined,
displayName: knownModel?.displayName ?? model.id,
enabled: knownModel?.enabled || false,
functionCall:
functionCallKeywords.some((keyword) => modelId.includes(keyword)) ||
knownModel?.abilities?.functionCall ||
false,
id: model.id,
reasoning:
reasoningKeywords.some((keyword) => modelId.includes(keyword)) ||
knownModel?.abilities?.reasoning ||
false,
vision:
visionKeywords.some((keyword) => modelId.includes(keyword)) ||
knownModel?.abilities?.vision ||
false,
};
})
.filter(Boolean) as ChatModelCard[];
} catch (error) {
console.warn(
'Failed to fetch ModelScope models. Please ensure your ModelScope API key is valid and your Alibaba Cloud account is properly bound:',
error,
);
return [];
}
},
provider: ModelProvider.ModelScope,
});
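The keyword-based capability detection in the `models` resolver above can be isolated as a pure function to make its behavior explicit. A sketch mirroring the keyword lists from the diff (`inferAbilities` is a hypothetical name; the runtime also falls back to `LOBE_DEFAULT_MODEL_LIST`, which is omitted here):

```typescript
// Same keyword lists as in the ModelScope runtime above.
const functionCallKeywords = ['qwen', 'deepseek', 'llama'];
const visionKeywords = ['qwen-vl', 'qwen2-vl', 'llava'];
const reasoningKeywords = ['qwq', 'deepseek-r1'];

// Infers model abilities from substring matches on the lowercased model id.
const inferAbilities = (id: string) => {
  const lower = id.toLowerCase();
  const has = (keywords: string[]) => keywords.some((keyword) => lower.includes(keyword));
  return {
    functionCall: has(functionCallKeywords),
    reasoning: has(reasoningKeywords),
    vision: has(visionKeywords),
  };
};
```

Note that this is a heuristic: any id containing `deepseek-r1` also contains `deepseek`, so reasoning models still get `functionCall: true` unless the known-model lookup overrides it.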

View file

@@ -1,17 +1,17 @@
import { LobeAi21AI } from './ai21';
import { LobeAi360AI } from './ai360';
import LobeAnthropicAI from './anthropic';
import { LobeAnthropicAI } from './anthropic';
import { LobeAzureOpenAI } from './azureOpenai';
import { LobeAzureAI } from './azureai';
import { LobeBaichuanAI } from './baichuan';
import LobeBedrockAI from './bedrock';
import { LobeBedrockAI } from './bedrock';
import { LobeCloudflareAI } from './cloudflare';
import { LobeCohereAI } from './cohere';
import { LobeDeepSeekAI } from './deepseek';
import { LobeFireworksAI } from './fireworksai';
import { LobeGiteeAI } from './giteeai';
import { LobeGithubAI } from './github';
import LobeGoogleAI from './google';
import { LobeGoogleAI } from './google';
import { LobeGroq } from './groq';
import { LobeHigressAI } from './higress';
import { LobeHuggingFaceAI } from './huggingface';
@@ -22,10 +22,11 @@ import { LobeJinaAI } from './jina';
import { LobeLMStudioAI } from './lmstudio';
import { LobeMinimaxAI } from './minimax';
import { LobeMistralAI } from './mistral';
import { LobeModelScopeAI } from './modelscope';
import { LobeMoonshotAI } from './moonshot';
import { LobeNovitaAI } from './novita';
import { LobeNvidiaAI } from './nvidia';
import LobeOllamaAI from './ollama';
import { LobeOllamaAI } from './ollama';
import { LobeOpenAI } from './openai';
import { LobeOpenRouterAI } from './openrouter';
import { LobePerplexityAI } from './perplexity';
@@ -75,6 +76,7 @@ export const providerRuntimeMap = {
lmstudio: LobeLMStudioAI,
minimax: LobeMinimaxAI,
mistral: LobeMistralAI,
modelscope: LobeModelScopeAI,
moonshot: LobeMoonshotAI,
novita: LobeNovitaAI,
nvidia: LobeNvidiaAI,

View file

@@ -46,6 +46,7 @@ export enum ModelProvider {
LMStudio = 'lmstudio',
Minimax = 'minimax',
Mistral = 'mistral',
ModelScope = 'modelscope',
Moonshot = 'moonshot',
Novita = 'novita',
Nvidia = 'nvidia',

View file

@@ -58,6 +58,7 @@ export interface UserKeyVaults extends SearchEngineKeyVaults {
lobehub?: any;
minimax?: OpenAICompatibleKeyVault;
mistral?: OpenAICompatibleKeyVault;
modelscope?: OpenAICompatibleKeyVault;
moonshot?: OpenAICompatibleKeyVault;
novita?: OpenAICompatibleKeyVault;
nvidia?: OpenAICompatibleKeyVault;