LocalAI/backend/python/kokoro
Latest commit: feat(mlx): add mlx backend (#6049) by Ettore Di Giacinto (1d830ce7dd), 2025-08-22 08:42:29 +02:00
Name                       Last commit                                         Date
backend.py                 feat(mlx): add mlx backend (#6049)                  2025-08-22 08:42:29 +02:00
install.sh                 feat: Add backend gallery (#5607)                   2025-06-15 14:56:52 +02:00
Makefile                   feat(mlx): add mlx backend (#6049)                  2025-08-22 08:42:29 +02:00
README.md                  feat(kokoro): complete kokoro integration (#5978)   2025-08-06 15:23:29 +02:00
requirements-cpu.txt       feat(kokoro): complete kokoro integration (#5978)   2025-08-06 15:23:29 +02:00
requirements-cublas11.txt  feat(kokoro): complete kokoro integration (#5978)   2025-08-06 15:23:29 +02:00
requirements-cublas12.txt  feat(kokoro): complete kokoro integration (#5978)   2025-08-06 15:23:29 +02:00
requirements-hipblas.txt   feat(kokoro): complete kokoro integration (#5978)   2025-08-06 15:23:29 +02:00
requirements-intel.txt     feat(kokoro): complete kokoro integration (#5978)   2025-08-06 15:23:29 +02:00
requirements.txt           feat(kokoro): complete kokoro integration (#5978)   2025-08-06 15:23:29 +02:00
run.sh                     feat: Add backend gallery (#5607)                   2025-06-15 14:56:52 +02:00
test.py                    feat(kokoro): complete kokoro integration (#5978)   2025-08-06 15:23:29 +02:00
test.sh                    feat: Add backend gallery (#5607)                   2025-06-15 14:56:52 +02:00

Kokoro TTS Backend for LocalAI

This is a gRPC server backend for LocalAI that uses the Kokoro TTS pipeline.

Creating a separate environment for the Kokoro project

make kokoro

Testing the gRPC server

make test

Features

  • Lightweight TTS model with 82 million parameters
  • Apache-licensed weights
  • Fast and cost-efficient
  • Multi-language support
  • Multiple voice options
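Once the backend is built and registered, speech can be requested through LocalAI's HTTP text-to-speech endpoint rather than by talking to the gRPC server directly. The sketch below builds such a request using only the Python standard library; the `/tts` path and the `backend`/`model`/`input` payload fields follow the LocalAI HTTP API, while the base URL and model name are placeholder assumptions, not shipped defaults.

```python
import json
from urllib import request

# Sketch: building a LocalAI /tts request routed to the kokoro backend.
# Payload fields follow the LocalAI HTTP API; the base URL and model
# name passed by the caller are placeholders, not shipped defaults.

def build_tts_request(base_url: str, text: str, model: str,
                      backend: str = "kokoro") -> request.Request:
    """Return a POST request for LocalAI's text-to-speech endpoint."""
    payload = json.dumps({
        "backend": backend,  # which TTS backend should handle the request
        "model": model,      # model/voice identifier configured in LocalAI
        "input": text,       # text to synthesize
    }).encode("utf-8")
    return request.Request(
        f"{base_url}/tts",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_tts_request("http://localhost:8080", "Hello from Kokoro", "kokoro")
# Sending the request to a running LocalAI instance would return the
# synthesized audio bytes on success:
# with request.urlopen(req) as resp:
#     audio = resp.read()
```

Keeping the request construction separate from the network call makes the payload easy to inspect or adapt before pointing it at a live server.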