LocalAI/backend/python/vllm-omni

Latest commit: 151ad271f2 by Ettore Di Giacinto (2026-04-12 08:51:30 +02:00)

    feat(rocm): bump to 7.x (#9323)

    feat(rocm): bump to 7.2.1

    Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
File                              Last commit                                Date
backend.py                        feat: add distributed mode (#9124)         2026-03-30 00:47:27 +02:00
install.sh                        feat(vllm-omni): add new backend (#8188)   2026-01-24 22:23:30 +01:00
Makefile                          feat(vllm-omni): add new backend (#8188)   2026-01-24 22:23:30 +01:00
requirements-after.txt            feat(vllm-omni): add new backend (#8188)   2026-01-24 22:23:30 +01:00
requirements-cublas12-after.txt   feat(vllm-omni): add new backend (#8188)   2026-01-24 22:23:30 +01:00
requirements-cublas12.txt         feat(vllm-omni): add new backend (#8188)   2026-01-24 22:23:30 +01:00
requirements-hipblas.txt          feat(rocm): bump to 7.x (#9323)            2026-04-12 08:51:30 +02:00
requirements.txt                  feat(vllm-omni): add new backend (#8188)   2026-01-24 22:23:30 +01:00
run.sh                            feat(vllm-omni): add new backend (#8188)   2026-01-24 22:23:30 +01:00
test.py                           feat(vllm-omni): add new backend (#8188)   2026-01-24 22:23:30 +01:00
test.sh                           feat(vllm-omni): add new backend (#8188)   2026-01-24 22:23:30 +01:00