LocalAI/pkg
Ettore Di Giacinto a6d9988e84
feat(backend gallery): add meta packages (#5696)
* feat(backend gallery): add meta packages

This lets us define meta packages such as "vllm" that automatically
install the corresponding concrete package for the GPU currently
detected in the system.

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* feat: use a metadata file

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

---------

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-06-24 17:08:27 +02:00
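The idea behind a meta package can be sketched as a name-to-variant lookup keyed on the detected GPU vendor. The following Go snippet is an illustrative sketch only: the `MetaPackage` type, the vendor keys, and the package names are hypothetical and do not reflect LocalAI's actual gallery schema or metadata file format.

```go
package main

import "fmt"

// MetaPackage maps a meta name (e.g. "vllm") to concrete backend
// packages keyed by detected GPU vendor. Illustrative only; not
// LocalAI's real gallery schema.
type MetaPackage struct {
	Name     string
	Variants map[string]string // GPU vendor -> concrete package name
}

// Resolve picks the concrete package for the detected GPU vendor,
// falling back to a CPU build when no variant matches.
func (m MetaPackage) Resolve(gpuVendor string) string {
	if pkg, ok := m.Variants[gpuVendor]; ok {
		return pkg
	}
	return m.Name + "-cpu"
}

func main() {
	vllm := MetaPackage{
		Name: "vllm",
		Variants: map[string]string{
			"nvidia": "vllm-cuda",
			"amd":    "vllm-rocm",
			"intel":  "vllm-sycl",
		},
	}
	fmt.Println(vllm.Resolve("nvidia")) // vllm-cuda
	fmt.Println(vllm.Resolve(""))       // vllm-cpu (fallback)
}
```

In this sketch, installing the meta package "vllm" on a machine with an NVIDIA GPU would resolve to the hypothetical "vllm-cuda" package, while an unrecognized or absent GPU falls back to a CPU build.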
Directory   | Last commit                                                                             | Date
----------- | --------------------------------------------------------------------------------------- | --------------------------
assets      | fix: use rice when embedding large binaries (#5309)                                     | 2025-05-04 16:42:42 +02:00
audio       | feat: Realtime API support reboot (#5392)                                               | 2025-05-25 22:25:05 +02:00
concurrency | chore: update jobresult_test.go (#4124)                                                 | 2024-11-12 08:52:18 +01:00
downloader  | feat(backend gallery): display download progress (#5687)                                | 2025-06-18 23:49:44 +02:00
functions   | Improve Comments and Documentation for MixedMode and ParseJSON Functions (#5626)        | 2025-06-11 09:46:53 +02:00
grpc        | Fix Typos in Comments and Error Messages (#5637)                                        | 2025-06-12 18:34:32 +02:00
langchain   | feat(llama.cpp): do not specify backends to autoload and add llama.cpp variants (#2232) | 2024-05-04 17:56:12 +02:00
library     | fix: use rice when embedding large binaries (#5309)                                     | 2025-05-04 16:42:42 +02:00
model       | feat: Add backend gallery (#5607)                                                       | 2025-06-15 14:56:52 +02:00
oci         | feat(backend gallery): display download progress (#5687)                                | 2025-06-18 23:49:44 +02:00
sound       | feat: Realtime API support reboot (#5392)                                               | 2025-05-25 22:25:05 +02:00
startup     | feat(backend gallery): add meta packages (#5696)                                        | 2025-06-24 17:08:27 +02:00
store       | chore: fix go.mod module (#2635)                                                        | 2024-06-23 08:24:36 +00:00
templates   | feat(llama.cpp): add support for audio input (#5466)                                    | 2025-05-26 16:06:03 +02:00
utils       | fix: adapt test to error changes                                                        | 2025-05-30 17:43:59 +02:00
xsync       | chore: fix go.mod module (#2635)                                                        | 2024-06-23 08:24:36 +00:00
xsysinfo    | feat: improve RAM estimation by using values from summary (#5525)                       | 2025-06-05 19:16:26 +02:00