---
title: Core Concepts
description: Understanding Llama Stack's service-oriented philosophy and key concepts
sidebar_label: Overview
sidebar_position: 1
---

Llama Stack's service-oriented philosophy gives rise to a few concepts and workflows that may not feel entirely natural in the LLM landscape, especially if you come from a background in other frameworks.

## Documentation Structure

This section covers the fundamental concepts of Llama Stack:

- **[Architecture](architecture.mdx)** - Learn about Llama Stack's architectural design and principles
- **[APIs](apis/index.mdx)** - Understanding the core APIs and their stability levels
  - [API Overview](apis/index.mdx) - Core APIs available in Llama Stack
  - [API Providers](apis/api_providers.mdx) - How providers implement APIs
  - [External APIs](apis/external.mdx) - Extending Llama Stack with APIs defined outside the core
  - [API Stability Leveling](apis/api_leveling.mdx) - API stability and versioning
- **[Distributions](distributions.mdx)** - Pre-configured deployment packages
- **[Resources](resources.mdx)** - Understanding Llama Stack resources and their lifecycle

## Getting Started

If you're new to Llama Stack, we recommend starting with:

1. **[Architecture](architecture.mdx)** - Understand the overall system design
2. **[APIs](apis/index.mdx)** - Learn about the available APIs and their purpose
3. **[Distributions](distributions.mdx)** - Choose a pre-configured setup for your use case

Each concept builds upon the previous ones to give you a comprehensive understanding of how Llama Stack works and how to use it effectively.
