# LocalAI/backend/python/vllm

Latest commit: `daf39e1efd` — chore(vllm/ci): set maximum number of jobs
Also added comments to clarify CPU usage during build.

Signed-off-by: Ettore Di Giacinto <mudler@users.noreply.github.com>
2025-11-20 15:53:32 +01:00
| File | Last commit | Date |
| --- | --- | --- |
| backend.py | fix: vllm missing logprobs (#5279) | 2025-04-30 12:55:07 +00:00 |
| install.sh | chore(vllm/ci): set maximum number of jobs | 2025-11-20 15:53:32 +01:00 |
| Makefile | feat(mlx): add mlx backend (#6049) | 2025-08-22 08:42:29 +02:00 |
| README.md | refactor: move backends into the backends directory (#1279) | 2023-11-13 22:40:16 +01:00 |
| requirements-after.txt | fix(python): move vllm to after deps, drop diffusers main deps | 2024-08-07 23:34:37 +02:00 |
| requirements-cpu.txt | chore(deps): bump pytorch to 2.7 in vllm (#5576) | 2025-06-04 08:56:45 +02:00 |
| requirements-cublas11-after.txt | feat(venv): shared env (#3195) | 2024-08-07 19:45:14 +02:00 |
| requirements-cublas11.txt | chore(deps): bump pytorch to 2.7 in vllm (#5576) | 2025-06-04 08:56:45 +02:00 |
| requirements-cublas12-after.txt | feat(venv): shared env (#3195) | 2024-08-07 19:45:14 +02:00 |
| requirements-cublas12.txt | chore(deps): bump pytorch to 2.7 in vllm (#5576) | 2025-06-04 08:56:45 +02:00 |
| requirements-hipblas.txt | feat: Add backend gallery (#5607) | 2025-06-15 14:56:52 +02:00 |
| requirements-install.txt | feat: migrate python backends from conda to uv (#2215) | 2024-05-10 15:08:08 +02:00 |
| requirements-intel.txt | chore(deps): bump pytorch to 2.7 in vllm (#5576) | 2025-06-04 08:56:45 +02:00 |
| requirements.txt | chore(deps): bump grpcio from 1.75.1 to 1.76.0 in /backend/python/vllm (#6827) | 2025-10-27 21:31:47 +01:00 |
| run.sh | feat: Add backend gallery (#5607) | 2025-06-15 14:56:52 +02:00 |
| test.py | fix: vllm missing logprobs (#5279) | 2025-04-30 12:55:07 +00:00 |
| test.sh | feat: Add backend gallery (#5607) | 2025-06-15 14:56:52 +02:00 |

## Creating a separate environment for the vllm project

```sh
make vllm
```
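In context, a minimal sketch of running that target from a LocalAI checkout. The clone URL and the assumption that the `vllm` Makefile target drives `install.sh` (suggested by the file listing above) are not stated in this page, so treat them as illustrative:

```sh
# clone LocalAI and enter this backend's directory (path taken from the listing above)
git clone https://github.com/mudler/LocalAI
cd LocalAI/backend/python/vllm

# build the backend's own environment; presumably this invokes install.sh
# to install the requirements-*.txt set matching your hardware (cpu/cublas/hipblas/intel)
make vllm
```

The per-backend environment keeps vllm's heavyweight dependency set (PyTorch, grpcio, etc.) isolated from the other Python backends in the repository.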