English |
简体中文
[RagFlow](http://ragflow.io) is a knowledge management platform built on a custom-built document understanding engine and LLMs,
giving reasoned and well-founded answers to your questions. Clone this repository to deploy your own knowledge management
platform and empower your business with AI.
# Key Features
- **Custom-built document understanding engine.** Our deep learning engine is built around the needs of analyzing and searching various types of documents across different domains.
  - For documents from different domains and for different purposes, the engine applies different analysis and search strategies.
  - You can easily intervene in and adjust the data processing procedure when results fall short of expectations.
  - Multimedia document understanding is supported using OCR and multi-modal LLMs.
- **State-of-the-art table structure and layout recognition.** Precisely extract and understand documents, including table content. See the [README](./deepdoc/README.md).
  - For PDF files, the layout and table structures, including rows, columns and spanning cells, are recognized.
  - Tables that cross page boundaries are stitched back together.
  - Table structure components are reconstructed into HTML tables.
- **Querying data dumped from databases is supported.** After uploading tables from any database, you can search any data record just by asking.
  - Instead of writing SQL queries, anyone can get the data they want simply by asking in natural language.
  - There is no limit on the number of uploaded records.
  - Some extra descriptions of the column headers should be provided.
- **Reasoned and well-founded answers.** The document snippets cited in the LLM's answer are provided and highlighted in the original document.
  - The answers are based on retrieved results, to which we apply hybrid vector-keyword search and reranking.
  - The parts of the document cited in the answer are presented in the most expressive way.
  - For PDF files, the cited parts can be located in the original PDF.
# Release Notification
**Star us on GitHub to be notified of new releases instantly!**

# Installation
## System Requirements
Make sure your system meets the minimum requirements before starting the installation.
- CPU >= 2 cores
- RAM >= 8GB
Then, check the current value of **vm.max_map_count** with the following command:
```bash
121:/ragflow# sysctl vm.max_map_count
vm.max_map_count = 262144
```
If **vm.max_map_count** is less than 262144, run the following command to raise it:
```bash
121:/ragflow# sudo sysctl -w vm.max_map_count=262144
```
However, this change is not persistent and will be reset after a system reboot.
To make it permanent, add or update the following line in **/etc/sysctl.conf**:
```bash
vm.max_map_count=262144
```
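As a minimal sketch (assuming the entry is not already present in **/etc/sysctl.conf**), you can append the line and reload the kernel parameters without rebooting:
```bash
# Append the setting (skip this if the line already exists in /etc/sysctl.conf)
echo 'vm.max_map_count=262144' | sudo tee -a /etc/sysctl.conf

# Reload settings from /etc/sysctl.conf and verify the new value
sudo sysctl -p
sysctl vm.max_map_count
```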
## Install Docker
If *Docker* is not installed on your machine, please refer to [Install Docker Engine](https://docs.docker.com/engine/install/).
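The quick start below uses the Docker Compose v2 plugin (`docker compose ...`), so it is worth confirming that both are available before moving on:
```bash
# Check the Docker Engine version
docker --version

# Check that the Compose v2 plugin is installed (used by the quick start below)
docker compose version
```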
## Quick Start
> If you want to change the basic setup, such as ports or passwords, please refer to [.env](./docker/.env) before starting the system.
> If you change anything in [.env](./docker/.env), please also check [service_conf.yaml](./docker/service_conf.yaml), the configuration
> of the back-end service, and keep it consistent with [.env](./docker/.env).
> - In [service_conf.yaml](./docker/service_conf.yaml), configuring the *LLM* under **user_default_llm** is strongly recommended.
> There you need to specify the LLM factory and your own _API_KEY_ (a quick way to inspect that block is shown right after these notes).
> It's OK if you don't have an _API_KEY_ yet; you can add it later on the settings page after starting the system and logging in.
> - We currently support the following LLM factories, with more coming soon:
> [OpenAI](https://platform.openai.com/login?launch), [Tongyi-Qianwen](https://dashscope.console.aliyun.com/model),
> [ZHIPU-AI](https://open.bigmodel.cn/), [Moonshot](https://platform.moonshot.cn/docs/docs)
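If you want to see what is currently configured before starting the containers, you can peek at the **user_default_llm** block. The exact keys and layout are defined by [service_conf.yaml](./docker/service_conf.yaml) itself, so treat this as a quick inspection rather than a definitive template:
```bash
# From the repository root: show the user_default_llm block and a few lines after it;
# fill in the LLM factory and your own API_KEY there (or later in the web UI settings)
grep -A 3 "user_default_llm" docker/service_conf.yaml
```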
```bash
121:/# git clone https://github.com/infiniflow/ragflow.git
121:/# cd ragflow/docker
121:/ragflow/docker# docker compose up -d
```
> The core image is about 15 GB; please be patient the first time you pull it.

After all the images have been pulled and the containers are up, use the following command to check the server status. If you see the following output,
_**Hallelujah!**_ You have successfully launched the system.
```bash
121:/ragflow# docker logs -f ragflow-server
____ ______ __
/ __ \ ____ _ ____ _ / ____// /____ _ __
/ /_/ // __ `// __ `// /_ / // __ \| | /| / /
/ _, _// /_/ // /_/ // __/ / // /_/ /| |/ |/ /
/_/ |_| \__,_/ \__, //_/ /_/ \____/ |__/|__/
/____/
* Running on all addresses (0.0.0.0)
* Running on http://127.0.0.1:9380
* Running on http://172.22.0.5:9380
INFO:werkzeug:Press CTRL+C to quit
```
Open your browser and enter the IP address of your server. _**Hallelujah**_ again!
> The default serving port is 80. If you want to change it, please refer to [docker-compose.yml](./docker-compose.yaml)
> and change the left-hand part of *'80:80'*.
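For example (8000 here is just an illustrative choice), serving the web UI on host port 8000 instead of 80 means editing only the left-hand, host side of that mapping and then recreating the containers:
```bash
# Illustrative only: in docker-compose.yml, change the port mapping
#     - 80:80
# to
#     - 8000:80
# (the right-hand 80 is the container port and stays as-is), then apply the change:
docker compose up -d
```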
# Configuration
If you need to change the default settings of the system when you deploy it, there are several ways to configure it.
Please refer to the [README](./docker/README.md) and set the configuration manually.
After changing anything, run *docker compose up -d* again.
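A minimal sketch of that workflow, assuming the repository was cloned into `ragflow` as in the quick start:
```bash
cd ragflow/docker

# Edit the settings you need (see the README in this directory for what each option does)
vi .env service_conf.yaml

# Recreate the containers so the changes take effect
docker compose up -d
```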
# RoadMap
- [ ] File manager.
- [ ] Support URLs: crawl web pages and extract the main content.
# Contributing
For those who'd like to contribute code, see our [Contribution Guide](https://github.com/infiniflow/ragflow/blob/main/CONTRIBUTING.md).
# License
This repository is available under the [Ragflow Open Source License](LICENSE), which is essentially Apache 2.0 with a few additional restrictions.