**The free, Open Source alternative to OpenAI and Anthropic. Your Complete AI Stack** - Run powerful language models, autonomous agents, and document intelligence **locally** on your hardware.
**Drop-in replacement for the OpenAI API** - a modular suite of tools that work seamlessly together or independently.
Start with **[LocalAI](https://localai.io)**'s OpenAI-compatible API, extend with **[LocalAGI](https://github.com/mudler/LocalAGI)**'s autonomous agents, and enhance with **[LocalRecall](https://github.com/mudler/LocalRecall)**'s semantic search - all running locally on your hardware.
**OpenAI API Compatible** - Run AI models locally with our modular ecosystem. From language models to autonomous agents and semantic search, build your complete AI stack without the cloud.
items:
  - title: LLM Inferencing
    icon: memory_alt
    description: LocalAI is a free, **Open Source** OpenAI alternative. Run **LLMs**, generate **images**, **audio** and more **locally** on consumer-grade hardware.
    ctaLink:
      text: learn more
      url: /basics/getting_started/
  - title: Agentic-first
    icon: smart_toy
    description: |
      Extend LocalAI with LocalAGI, an autonomous AI agent platform that runs locally, no coding required.
      Build and deploy autonomous agents with ease. Interact with them via REST APIs or use the WebUI.
    ctaLink:
      text: learn more
      url: https://github.com/mudler/LocalAGI
  - title: Memory and Knowledge Base
    icon: psychology
    description: Extend LocalAI with LocalRecall, a local REST API for semantic search and memory management. Perfect for AI applications.
    ctaLink:
      text: learn more
      url: https://github.com/mudler/LocalRecall
  - title: OpenAI Compatible
    icon: api
    description: Drop-in replacement for the OpenAI API. Compatible with existing applications and libraries.
    ctaLink:
      text: learn more
      url: /basics/getting_started/
  - title: No GPU Required
    icon: memory
    description: Runs on consumer-grade hardware. No need for expensive GPUs or cloud services.
    ctaLink:
      text: learn more
      url: /basics/getting_started/
  - title: Multiple Models
    icon: hub
    description: |
      Support for various model families, including LLMs, image generation, and audio models.
      Multiple inference backends are supported, including vLLM, llama.cpp, and more.
      You can switch between backends as needed and install them from the web interface or the CLI.
    ctaLink:
      text: learn more
      url: /model-compatibility
  - title: Privacy Focused
    icon: security
    description: Keep your data local. No data leaves your machine, ensuring complete privacy.
    ctaLink:
      text: learn more
      url: /basics/container/
  - title: Easy Setup
    icon: settings
    description: Simple installation and configuration. Get started in minutes with binaries, Docker, Podman, Kubernetes, or a local installation.
    ctaLink:
      text: learn more
      url: /basics/getting_started/
  - title: Community Driven
    icon: groups
    description: Active community support and regular updates. Contribute and help shape the future of LocalAI.
    ctaLink:
      text: learn more
      url: https://github.com/mudler/LocalAI
  - title: Extensible
    icon: extension
    description: Easy to extend and customize. Add new models and features as needed.
    ctaLink:
      text: learn more
      url: /docs/integrations/
  - title: Peer 2 Peer
    icon: hub
    description: |
      LocalAI is designed for decentralized LLM inference, powered by a peer-to-peer system based on libp2p.
      It can be used on a local or remote network and is compatible with any LLM.
      It works either in federated mode or by splitting model weights across nodes.
    ctaLink:
      text: learn more
      url: /features/distribute/
  - title: Open Source
    icon: code
    description: MIT licensed. Free to use, modify, and distribute. Community contributions welcome.

LocalAI makes it simple to run various AI models on your own hardware. From text generation to image creation, autonomous agents to semantic search - all orchestrated through a unified API.
LocalAI, created by **Ettore Di Giacinto (mudler)**, is a Free and Open Source, community-driven project that makes free, open AI accessible to everyone. The LocalAI stack is MIT licensed, and the models trained by LocalAI are available under the Apache 2.0 License.
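Because the API is OpenAI-compatible, existing clients typically only need a different base URL. A minimal sketch of what such a request looks like, using only the Python standard library; the port (LocalAI's default, 8080) and the model name are assumptions and depend on your installation:

```python
import json
import urllib.request

# Assumed defaults: LocalAI listening on port 8080, exposing the
# OpenAI-style /v1 routes. "gpt-4" here stands in for whatever
# model name you have installed locally.
BASE_URL = "http://localhost:8080/v1"

def chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request for a local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = chat_request("gpt-4", "Hello!")
print(req.full_url)  # http://localhost:8080/v1/chat/completions
```

Sending the request with `urllib.request.urlopen(req)` (or pointing the official `openai` client at the same base URL) returns the familiar chat-completion JSON, which is why existing applications work unchanged.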