# Full-Stack Projects

We've created both tooling and a variety of example projects (all open-source) to help you get started building a full-stack LLM application.

## create-llama

`create-llama` is a command-line tool that generates a full-stack application template for you. It supports FastAPI, Vercel, and Node backends. This is one of the easiest ways to get started!
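
The tool is run with `npx`, which prompts you interactively for your framework and template choices. A minimal sketch (assumes Node.js and npm are installed):

```shell
# Scaffold a new LlamaIndex app; the tool walks you through
# choosing a backend (FastAPI, Vercel, or Node) and a template.
npx create-llama@latest
```

After answering the prompts, you get a ready-to-run project directory with install and start instructions in its README.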

```{toctree}
---
maxdepth: 1
---
create-llama Blog <https://blog.llamaindex.ai/create-llama-a-command-line-tool-to-generate-llamaindex-apps-8f7683021191>
create-llama Repo <https://github.com/run-llama/LlamaIndexTS/tree/main/packages/create-llama>
create-llama Additional Templates <https://github.com/jerryjliu/create_llama_projects>
```

## Full-Stack Applications

The LlamaIndex team has also built some in-house projects, all open-sourced under the MIT license, that you can use out of the box or as templates to kickstart your own project.

Check them out below.

### SEC Insights

```{toctree}
---
maxdepth: 1
---
SEC Insights App <https://secinsights.ai/>
SEC Insights Repo <https://github.com/run-llama/sec-insights>
```

### Chat LlamaIndex

```{toctree}
---
maxdepth: 1
---
Chat LlamaIndex App <https://chat-llamaindex.vercel.app/>
Chat LlamaIndex Repo <https://github.com/run-llama/chat-llamaindex>
```

### RAGs

```{toctree}
---
maxdepth: 1
---
RAGs Repo <https://github.com/run-llama/rags>
```

### RAG CLI

```{toctree}
---
maxdepth: 1
---
RAG CLI </use_cases/q_and_a/rag_cli.md>
```
