# Core dependencies for Fire Evacuation RAG System
numpy
torch
transformers
sentence-transformers
gradio
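# Note (assumption): on Linux the default torch wheel from PyPI bundles CUDA support;
# for a CPU-only machine, PyTorch documents a lighter CPU-only wheel index, e.g.:
#   pip install torch --index-url https://download.pytorch.org/whl/cpu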
# FAISS for vector similarity search
# Use faiss-cpu for CPU-only systems, or faiss-gpu for GPU systems
faiss-cpu
# faiss-gpu>=1.7.4  # Uncomment if you have a CUDA-capable GPU
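# Note (assumption): prebuilt faiss-gpu wheels on PyPI can lag behind or be missing for
# some Python/CUDA combinations; the FAISS project recommends a conda-based install, e.g.:
#   conda install -c pytorch faiss-gpu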
# Optional: For faster model loading and inference
unsloth  # Unsloth's optimized kernels speed up model loading and inference
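# Note (assumption): unsloth generally expects an NVIDIA (CUDA) GPU and a matching torch
# build; on CPU-only machines it can be omitted, as the rest of the stack does not depend on it.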
# Optional: For model quantization (4-bit/8-bit)
bitsandbytes  # Required for 4-bit/8-bit quantization
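# Note (assumption): bitsandbytes 4-bit/8-bit quantization paths expect a CUDA GPU on most
# platforms; if models are loaded in full precision, this dependency can be omitted.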
# Optional: For optimized attention (FlashAttention2)
# flash-attn>=2.0.0  # Uncomment if you want FlashAttention2 support
# Note: flash-attn requires CUDA and may need to be installed separately
# Install with: pip install flash-attn --no-build-isolation
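# Typical setup (assuming a recent Python and pip are available):
#   pip install -r requirements.txt
#   pip install flash-attn --no-build-isolation   # optional, CUDA only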