
Smart Retrieval Backend

The backend is built with Python and FastAPI, bootstrapped with create-llama.

Getting Started

First, set up the environment:

poetry install
poetry shell

Second, run the development server:

python main.py
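
If you are curious what python main.py does under the hood, the sketch below shows a minimal create-llama-style entry point. The router import path and server settings are assumptions for illustration, not this project's actual code.

# main.py -- minimal sketch of an entry point in the create-llama style.
# The router wiring and server settings are assumptions, not this
# project's actual code.
import uvicorn
from fastapi import FastAPI

app = FastAPI()

# The project's routers, e.g. app/api/routers/chat.py, would be mounted here:
# app.include_router(chat_router, prefix="/api/chat")

if __name__ == "__main__":
    # reload=True is what makes the endpoint auto-update when files are saved.
    uvicorn.run("main:app", host="0.0.0.0", port=8000, reload=True)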

Then call the API endpoint /api/chat to see the result:

curl --location 'localhost:8000/api/chat' \
--header 'Content-Type: application/json' \
--data '{ "messages": [{ "role": "user", "content": "Hello" }] }'
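
The same request can also be made from Python with the requests library. This is only a convenience sketch; since the README does not say whether /api/chat streams tokens or returns a single JSON body, the response is simply read back as raw text here.

import requests

resp = requests.post(
    "http://localhost:8000/api/chat",
    json={"messages": [{"role": "user", "content": "Hello"}]},
    stream=True,
)
resp.raise_for_status()

# Print whatever the endpoint sends back, chunk by chunk.
for chunk in resp.iter_content(chunk_size=None, decode_unicode=True):
    print(chunk, end="")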

You can start editing the API by modifying app/api/routers/chat.py. The endpoint auto-updates as you save the file.
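
For orientation, a router for this endpoint might look roughly like the hypothetical skeleton below, which only echoes the last message; the actual chat.py (which presumably hands the messages to a LlamaIndex chat engine) will differ in its details.

# Hypothetical skeleton of app/api/routers/chat.py -- the real file differs.
from typing import List, Literal

from fastapi import APIRouter
from pydantic import BaseModel

chat_router = APIRouter()

class Message(BaseModel):
    role: Literal["system", "user", "assistant"]
    content: str

class ChatRequest(BaseModel):
    messages: List[Message]

@chat_router.post("")
async def chat(request: ChatRequest) -> dict:
    # Placeholder logic: a real implementation would pass the messages to a
    # LlamaIndex chat engine and return (or stream) its answer.
    last = request.messages[-1].content
    return {"role": "assistant", "content": f"You said: {last}"}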

Open http://localhost:8000/docs with your browser to see the Swagger UI of the API.

The API allows CORS for all origins to simplify development. You can change this behavior by setting the ENVIRONMENT environment variable to prod:

ENVIRONMENT=prod uvicorn main:app
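
In FastAPI this kind of environment-dependent CORS policy is usually implemented with CORSMiddleware. The snippet below is an assumed sketch of how main.py might do it, not a copy of the project's code.

import os

from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()
environment = os.getenv("ENVIRONMENT", "dev")

if environment == "dev":
    # Development default: allow any origin, method and header.
    app.add_middleware(
        CORSMiddleware,
        allow_origins=["*"],
        allow_methods=["*"],
        allow_headers=["*"],
    )
# In prod no wildcard CORS middleware is added, so browsers fall back to the
# same-origin policy unless explicit origins are configured.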

Learn More

To learn more about LlamaIndex, take a look at the following resources: