---
title: "Local AI LLM"
description: "LocalAI is a popular open-source, self-hosted LLM engine with an OpenAI-compatible API that lets you download any GGUF model from HuggingFace and run it on CPU or GPU."
---

import Image from "next/image";

<Image
  src="/images/anythingllm-setup/llm-configuration/local/localai/header-image.png"
  height={1080}
  width={1920}
  quality={100}
  alt="Local AI LLM"
/>

# Local AI LLM

[LocalAI](https://localai.io) is a popular [open-source](https://github.com/mudler/LocalAI), self-hosted LLM engine with an OpenAI-compatible API that lets you download any GGUF model from HuggingFace and run it on CPU or GPU.

LocalAI supports LLMs, embedding models, and image-generation models.

## Connecting to Local AI

LocalAI ships as a Docker container image that you must configure and run yourself.
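As a minimal sketch, you can start LocalAI with Docker like this (the image tag and port follow the LocalAI quickstart defaults; check the LocalAI documentation for the current release or a GPU variant):

```shell
# Pull and start the LocalAI all-in-one CPU image in the background.
docker run -d --name local-ai -p 8080:8080 localai/localai:latest-aio-cpu

# The API is now served at http://localhost:8080 — use this as the
# LocalAI base URL when configuring the connection.
```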

You can update your model to a different model at any time in the **Settings**.
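Because LocalAI exposes an OpenAI-compatible API, you can check which models your instance has available before selecting one (the base URL assumes the default port from the quickstart, and `your-model-name` is a placeholder for a model you have downloaded):

```shell
# List the models available on a running LocalAI instance.
curl http://localhost:8080/v1/models

# Send a test chat completion to confirm the model responds.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "your-model-name", "messages": [{"role": "user", "content": "Hello"}]}'
```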

<Image
  src="/images/anythingllm-setup/llm-configuration/local/localai/localai-llm.png"
  height={1080}
  width={1920}
  quality={100}
  alt="Local AI LLM settings"
/>
