---
title: Frontend
emoji: 🐢
colorFrom: pink
colorTo: blue
sdk: docker
pinned: false
license: mit
app_port: 3000
---
This is a [LlamaIndex](https://www.llamaindex.ai/) project using [Next.js](https://nextjs.org/) bootstrapped with [`create-llama`](https://github.com/run-llama/LlamaIndexTS/tree/main/packages/create-llama).

## Getting Started

First, install the dependencies:

```
npm install
```
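Embedding generation and the chat runtime need a model-provider API key. As a sketch, assuming the default `create-llama` OpenAI setup, which reads `OPENAI_API_KEY` from a `.env` file at the project root (the variable name differs for other providers):

```shell
# Create a .env file at the project root with your provider key.
# OPENAI_API_KEY is assumed from create-llama's default OpenAI setup;
# replace the placeholder value with your real key.
cat > .env <<'EOF'
OPENAI_API_KEY=your-key-here
EOF
```

This same `.env` file is what the Docker instructions below mount into the container.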
Second, generate the embeddings of the documents in the `./data` directory (if this folder exists; otherwise, skip this step):

```
npm run generate
```

Third, run the development server:

```
npm run dev
```

Open [http://localhost:3000](http://localhost:3000) with your browser to see the result.

You can start editing the page by modifying `app/page.tsx`. The page auto-updates as you edit the file.

This project uses [`next/font`](https://nextjs.org/docs/basic-features/font-optimization) to automatically optimize and load Inter, a custom Google Font.
## Using Docker

1. Build an image for the Next.js app:

```
docker build -t <your_app_image_name> .
```

2. Generate embeddings:

Parse the data and generate the vector embeddings if the `./data` folder exists (otherwise, skip this step). The `.env` and `config` mounts pull environment variables and configuration from your file system, and the `cache` mount stores the vector database there:

```
docker run \
  --rm \
  -v $(pwd)/.env:/app/.env \
  -v $(pwd)/config:/app/config \
  -v $(pwd)/data:/app/data \
  -v $(pwd)/cache:/app/cache \
  <your_app_image_name> \
  npm run generate
```
3. Start the app:

As before, the `.env` and `config` mounts provide your configuration, and the `cache` mount stores the vector database on your file system:

```
docker run \
  --rm \
  -v $(pwd)/.env:/app/.env \
  -v $(pwd)/config:/app/config \
  -v $(pwd)/cache:/app/cache \
  -p 3000:3000 \
  <your_app_image_name>
```
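If you prefer a declarative setup, the run flags above map onto a Docker Compose file. This is a sketch assuming the same directory layout (the service name is an arbitrary choice; run the `npm run generate` container step separately if `./data` exists):

```yaml
# docker-compose.yml (sketch; service name and layout are assumptions)
services:
  frontend:
    build: .
    ports:
      - "3000:3000"
    volumes:
      - ./.env:/app/.env      # env variables and configuration from your file system
      - ./config:/app/config
      - ./cache:/app/cache    # vector database stored on your file system
```

With this file in place, `docker compose up` builds the image and starts the app on port 3000.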
## Learn More

To learn more about LlamaIndex, take a look at the following resources:

- [LlamaIndex Documentation](https://docs.llamaindex.ai) - learn about LlamaIndex (Python features).
- [LlamaIndexTS Documentation](https://ts.llamaindex.ai) - learn about LlamaIndex (TypeScript features).

You can check out [the LlamaIndexTS GitHub repository](https://github.com/run-llama/LlamaIndexTS) - your feedback and contributions are welcome!