# Smart Retrieval Backend
The backend is built using Python & [FastAPI](https://fastapi.tiangolo.com/) bootstrapped with [`create-llama`](https://github.com/run-llama/LlamaIndexTS/tree/main/packages/create-llama).
## Getting Started
First, set up the environment:
```bash
poetry install
poetry shell
```
Second, run the development server:
```bash
python main.py
```
Then call the API endpoint `/api/chat` to see the result:
```bash
curl --location 'localhost:8000/api/chat' \
--header 'Content-Type: application/json' \
--data '{ "messages": [{ "role": "user", "content": "Hello" }] }'
```
You can start editing the API by modifying `app/api/routers/chat.py`. The endpoint auto-updates as you save the file.
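For orientation, here is a minimal, hypothetical sketch of what a chat route in `app/api/routers/chat.py` could look like. The router name, Pydantic models, and echo logic below are illustrative placeholders, not the actual code generated by `create-llama`:

```python
# Hypothetical sketch of a chat route; the real scaffold is more involved
# (LlamaIndex chat engine, streaming responses, etc.).
from typing import List, Literal

from fastapi import APIRouter
from pydantic import BaseModel

chat_router = APIRouter()  # illustrative name; the generated code may differ


class _Message(BaseModel):
    role: Literal["user", "assistant", "system"]
    content: str


class _ChatData(BaseModel):
    messages: List[_Message]


@chat_router.post("/api/chat")
async def chat(data: _ChatData) -> _Message:
    # Echo the last message back; replace with a real chat engine call.
    last = data.messages[-1]
    return _Message(role="assistant", content=f"You said: {last.content}")
```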
Open [http://localhost:8000/docs](http://localhost:8000/docs) with your browser to see the Swagger UI of the API.
The API allows CORS for all origins to simplify development. You can change this behavior by setting the `ENVIRONMENT` environment variable to `prod`:
```bash
ENVIRONMENT=prod uvicorn main:app
```
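As a rough sketch of how such a toggle can work (the exact code in the generated `main.py` may differ), the middleware is only added when the app is not running in production:

```python
# Sketch of an ENVIRONMENT-based CORS toggle; assumes the variable defaults
# to a permissive "dev" mode when unset.
import os

from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

environment = os.getenv("ENVIRONMENT", "dev")
if environment == "dev":
    # Allow any origin during local development only.
    app.add_middleware(
        CORSMiddleware,
        allow_origins=["*"],
        allow_methods=["*"],
        allow_headers=["*"],
    )
```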
## Learn More
To learn more about LlamaIndex, take a look at the following resources:
- [LlamaIndex Documentation](https://docs.llamaindex.ai) - learn about LlamaIndex.
- [LlamaIndexTS Documentation](https://ts.llamaindex.ai) - learn about LlamaIndexTS (TypeScript features).
- [FastAPI Documentation](https://fastapi.tiangolo.com/) - learn about FastAPI.