LocalAI/core/http/endpoints

Latest commit: 87e6de1989 by Ettore Di Giacinto
feat: wire transcription for llama.cpp, add streaming support (#9353)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2026-04-14 16:13:40 +02:00
| Directory | Last commit | Date |
| --- | --- | --- |
| anthropic | fix(anthropic): do not emit empty tokens and fix SSE tool calls (#9258) | 2026-04-07 00:38:21 +02:00 |
| elevenlabs | feat(api): Allow coding agents to interactively discover how to control and configure LocalAI (#9084) | 2026-04-04 15:14:35 +02:00 |
| explorer | feat: add distributed mode (#9124) | 2026-03-30 00:47:27 +02:00 |
| jina | feat(api): Allow coding agents to interactively discover how to control and configure LocalAI (#9084) | 2026-04-04 15:14:35 +02:00 |
| localai | feat: backend versioning, upgrade detection and auto-upgrade (#9315) | 2026-04-11 22:31:15 +02:00 |
| mcp | feat: add distributed mode (#9124) | 2026-03-30 00:47:27 +02:00 |
| ollama | feat(api): add ollama compatibility (#9284) | 2026-04-09 14:15:14 +02:00 |
| openai | feat: wire transcription for llama.cpp, add streaming support (#9353) | 2026-04-14 16:13:40 +02:00 |
| openresponses | fix(reasoning): suppress partial tag tokens during autoparser warm-up | 2026-04-04 20:45:57 +00:00 |