LocalAI/backend/go
Latest commit 87e6de1989 by Ettore Di Giacinto (2026-04-14 16:13:40 +02:00):
feat: wire transcription for llama.cpp, add streaming support (#9353)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Directory            | Last commit                                                                                | Date
acestep-cpp          | chore: ⬆️ Update ace-step/acestep.cpp to e0c8d75a672fca5684c88c68dbf6d12f58754258 (#9261) | 2026-04-07 00:39:24 +02:00
llm/llama            | feat: add distributed mode (#9124)                                                         | 2026-03-30 00:47:27 +02:00
local-store          | feat: add distributed mode (#9124)                                                         | 2026-03-30 00:47:27 +02:00
opus                 | feat: add distributed mode (#9124)                                                         | 2026-03-30 00:47:27 +02:00
piper                | fix(package.sh): drop redundant -a and -R                                                  | 2026-02-05 16:39:38 +01:00
qwen3-tts-cpp        | feat(qwen3tts.cpp): add new backend (#9316)                                                | 2026-04-11 23:14:26 +02:00
sam3-cpp             | feat(rocm): bump to 7.x (#9323)                                                            | 2026-04-12 08:51:30 +02:00
silero-vad           | fix(package.sh): drop redundant -a and -R                                                  | 2026-02-05 16:39:38 +01:00
stablediffusion-ggml | feat(rocm): bump to 7.x (#9323)                                                            | 2026-04-12 08:51:30 +02:00
voxtral              | feat: wire transcription for llama.cpp, add streaming support (#9353)                      | 2026-04-14 16:13:40 +02:00
whisper              | feat: wire transcription for llama.cpp, add streaming support (#9353)                      | 2026-04-14 16:13:40 +02:00