LocalAI/backend/python/transformers
Ettore Di Giacinto cfd95745ed
feat: add cuda13 images (#7404)
* chore(ci): add cuda13 jobs
* Add to pipelines and to capabilities; start work on the gallery
* gallery
* capabilities: try to detect by looking at /usr/local
* neutts
* backends.yaml
* add cuda13 l4t requirements.txt
* add cuda13 requirements.txt
* Fixups
* Fixups
* Pin vllm
* Not all backends are compatible
* add vllm to requirements
* vllm is not pre-compiled for CUDA 13

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-12-02 14:24:35 +01:00
File | Last commit | Date
backend.py | feat(mlx): add mlx backend (#6049) | 2025-08-22 08:42:29 +02:00
install.sh | feat: Add backend gallery (#5607) | 2025-06-15 14:56:52 +02:00
Makefile | feat(mlx): add mlx backend (#6049) | 2025-08-22 08:42:29 +02:00
README.md | feat(transformers): add embeddings with Automodel (#1308) | 2023-11-20 21:21:17 +01:00
requirements-cpu.txt | chore(deps): bump protobuf from 6.32.0 to 6.33.1 in /backend/python/transformers (#7340) | 2025-11-24 20:12:17 +00:00
requirements-cublas11.txt | chore(deps): bump protobuf from 6.32.0 to 6.33.1 in /backend/python/transformers (#7340) | 2025-11-24 20:12:17 +00:00
requirements-cublas12.txt | chore(deps): bump protobuf from 6.32.0 to 6.33.1 in /backend/python/transformers (#7340) | 2025-11-24 20:12:17 +00:00
requirements-cublas13.txt | feat: add cuda13 images (#7404) | 2025-12-02 14:24:35 +01:00
requirements-hipblas.txt | chore(deps): bump protobuf from 6.32.0 to 6.33.1 in /backend/python/transformers (#7340) | 2025-11-24 20:12:17 +00:00
requirements-intel.txt | chore(deps): bump protobuf from 6.32.0 to 6.33.1 in /backend/python/transformers (#7340) | 2025-11-24 20:12:17 +00:00
requirements.txt | chore(deps): bump protobuf from 6.32.0 to 6.33.1 in /backend/python/transformers (#7340) | 2025-11-24 20:12:17 +00:00
run.sh | feat: Add backend gallery (#5607) | 2025-06-15 14:56:52 +02:00
test.py | feat(transformers): merge sentencetransformers backend (#4624) | 2025-01-18 18:30:30 +01:00
test.sh | feat: Add backend gallery (#5607) | 2025-06-15 14:56:52 +02:00

To create a separate environment for the transformers backend, run:

    make transformers
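The directory ships one requirements file per acceleration target (CPU, CUDA 11/12/13 via cuBLAS, ROCm via hipBLAS, Intel), and the cuda13 commit above adds `requirements-cublas13.txt` to that set. As a minimal sketch of how a build-type variable could map onto those files — the file names are real, but the selection logic and the `BUILD_TYPE` variable here are illustrative assumptions, not the actual contents of `install.sh` or the Makefile:

```shell
#!/bin/sh
# Sketch: map an acceleration target onto the per-target requirements
# files present in backend/python/transformers. The BUILD_TYPE variable
# and this case statement are assumptions for illustration only.
BUILD_TYPE="${BUILD_TYPE:-cpu}"

case "$BUILD_TYPE" in
  cpu|cublas11|cublas12|cublas13|hipblas|intel)
    # Each of these has a dedicated requirements-<target>.txt file.
    REQS="requirements-${BUILD_TYPE}.txt"
    ;;
  *)
    # Fall back to the shared base requirements.
    REQS="requirements.txt"
    ;;
esac

echo "selected: $REQS"
```

With `BUILD_TYPE=cublas13` this would select the new `requirements-cublas13.txt`; note that, per the commit message, vllm is not pre-compiled for CUDA 13, so not every backend is available for that target.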