---
language:
- en
---

This repository contains the index files required for the `SearchWikipediaTool` of the [bot-with-plan](https://github.com/krasserm/bot-with-plan) project. It is based on the `krasserm/wikipedia-2023-11-en-embed-mxbai-int8-binary` dataset and contains the following files:

- `faiss-ubinary.index`: [Faiss](https://github.com/facebookresearch/faiss) index file containing the `binary` embeddings
- `usearch-int8-index`: [usearch](https://github.com/unum-cloud/usearch) index files containing the `int8` embeddings
- `document-url-mappings.sqlite`: [SQLite](https://www.sqlite.org/) database file containing mappings from document URLs to text chunk indices

The following code snippet demonstrates how to use the index files with the `SearchWikipediaTool`:

```python
from sentence_transformers import CrossEncoder, SentenceTransformer

from gba.client import Llama3Instruct, LlamaCppClient
from gba.tools import SearchWikipediaTool
from gba.utils import Scratchpad

llm_model = Llama3Instruct(llm=LlamaCppClient(url="http://localhost:8084/completion", temperature=-1))
embedding_model = SentenceTransformer("mixedbread-ai/mxbai-embed-large-v1", device="cuda")
rerank_model = CrossEncoder("mixedbread-ai/mxbai-rerank-large-v1", device="cuda")

search_wikipedia = SearchWikipediaTool(
    llm=llm_model,
    embedding_model=embedding_model,
    rerank_model=rerank_model,
    top_k_nodes=10,
    top_k_related_documents=1,
    top_k_related_nodes=3,
)

response = search_wikipedia.run(
    task="Search Wikipedia for the launch date of the first iPhone.",
    request="",
    scratchpad=Scratchpad(),
)
```
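
The tool expects the index files to be available locally. Below is a minimal sketch of fetching them from the Hugging Face Hub with `huggingface_hub`; the repository id and the local target directory are assumptions and may differ from the names used in your setup or expected by `SearchWikipediaTool`.

```python
# Minimal download sketch. The repo_id and local_dir below are assumptions;
# replace them with the actual dataset repository and the directory where
# SearchWikipediaTool expects to find the index files.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="krasserm/wikipedia-2023-11-en-index",  # assumed repository id
    repo_type="dataset",
    local_dir="index",  # assumed local directory for the index files
)
print(f"Index files downloaded to: {local_dir}")
```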