{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# ๐Ÿค– RAG Chatbot: ML/AI Knowledge Assistant\n", "\n", "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/your-username/your-repo/blob/main/rag_notebook.ipynb)\n", "\n", "## ๐Ÿ“‹ Project Overview\n", "\n", "This notebook implements a sophisticated **Retrieval-Augmented Generation (RAG) chatbot** that provides comprehensive information about machine learning, deep learning, AI, and related topics. The chatbot combines the power of modern AI technologies to deliver accurate, contextual responses.\n", "\n", "### ๐ŸŽฏ What This Notebook Does\n", "\n", "1. **Loads ML/AI Knowledge**: Accesses The Pile dataset from Hugging Face\n", "2. **Processes Text Data**: Filters and chunks relevant ML/AI content\n", "3. **Creates Vector Database**: Stores embeddings in Chroma for fast retrieval\n", "4. **Implements RAG Pipeline**: Retrieves relevant context and generates answers\n", "5. **Tests the System**: Validates functionality with sample questions\n", "\n", "### ๐Ÿ› ๏ธ Technologies Used\n", "\n", "- **๐Ÿค– Generation Model**: Google Gemini 2.5 Flash\n", "- **๐Ÿ”— RAG Framework**: LangChain\n", "- **๐Ÿ—„๏ธ Vector Database**: Chroma\n", "- **๐Ÿ“š Dataset**: The Pile (EleutherAI/the_pile) from Hugging Face\n", "- **๐Ÿง  Embeddings**: Sentence Transformers\n", "\n", "### ๐Ÿš€ How to Run This Notebook\n", "\n", "1. **Open in Colab**: Click the badge above or upload to Google Colab\n", "2. **Set API Key**: Add your Gemini API key to Colab secrets\n", "3. **Run All Cells**: Execute cells sequentially (Ctrl+F9)\n", "4. **Test Chatbot**: Try the sample questions at the end\n", "\n", "### ๐Ÿ“Š Expected Outputs\n", "\n", "- **Vector Database**: Chroma collection with ML/AI knowledge\n", "- **RAG Pipeline**: Fully functional question-answering system\n", "- **Test Results**: Sample Q&A demonstrating chatbot capabilities\n", "- **Configuration**: Settings file for deployment\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## ๐Ÿ“ฆ Step 1: Installation and Setup\n", "\n", "### ๐Ÿ”ง Required Packages\n", "\n", "This cell installs all necessary dependencies for the RAG chatbot:\n", "\n", "- **Streamlit**: Web interface framework\n", "- **LangChain**: RAG pipeline orchestration\n", "- **Chroma**: Vector database for embeddings\n", "- **Sentence Transformers**: Text embedding models\n", "- **Google Generative AI**: Gemini API integration\n", "- **Hugging Face Datasets**: Dataset access\n", "\n", "### โš ๏ธ Important Notes\n", "\n", "- Run this cell first before any other cells\n", "- Installation may take 2-3 minutes\n", "- Restart runtime if you encounter import errors\n", "- All packages are pinned to specific versions for compatibility\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Install required packages\n", "!pip install streamlit==1.28.1\n", "!pip install langchain==0.1.0\n", "!pip install langchain-community==0.0.10\n", "!pip install langchain-google-genai==0.0.6\n", "!pip install chromadb==0.4.18\n", "!pip install datasets==2.14.6\n", "!pip install transformers==4.35.2\n", "!pip install sentence-transformers==2.2.2\n", "!pip install google-generativeai==0.3.2\n", "!pip install tiktoken==0.5.1\n", "!pip install numpy==1.24.3\n", "!pip install pandas==2.0.3\n", "!pip install tqdm==4.66.1\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## ๐Ÿ”‘ Step 2: API Key Configuration\n", "\n", "### 
🔐 Google Gemini API Setup\n", "\n", "To use this chatbot, you need a Google Gemini API key:\n", "\n", "1. **Get API Key**: Visit [Google AI Studio](https://makersuite.google.com/app/apikey)\n", "2. **Create Key**: Generate a new API key\n", "3. **Add to Colab**: Use the secrets manager (🔑 icon in sidebar)\n", "4. **Set Secret Name**: `GEMINI_API_KEY`\n", "\n", "### 🛡️ Security Best Practices\n", "\n", "- Never hardcode API keys in notebooks\n", "- Use Colab secrets for secure storage\n", "- Keep your API key private and don't share it\n", "- Monitor your API usage to avoid unexpected charges\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Set up Google Gemini API key\n", "import os\n", "from google.colab import userdata\n", "\n", "# Get API key from Colab secrets\n", "try:\n", "    GEMINI_API_KEY = userdata.get('GEMINI_API_KEY')\n", "    os.environ['GOOGLE_API_KEY'] = GEMINI_API_KEY\n", "    print(\"✅ Gemini API key loaded successfully!\")\n", "except Exception:\n", "    print(\"❌ Please add your Gemini API key to Colab secrets:\")\n", "    print(\"1. Go to the key icon (🔑) in the left sidebar\")\n", "    print(\"2. Add a new secret with key 'GEMINI_API_KEY' and your API key as value\")\n", "    print(\"3. Restart the runtime and run this cell again\")\n", "\n", "    # Alternative: Set directly (not recommended for production)\n", "    # GEMINI_API_KEY = \"your_api_key_here\"\n", "    # os.environ['GOOGLE_API_KEY'] = GEMINI_API_KEY\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 📚 Step 3: Dataset Loading and Processing\n", "\n", "### 🗃️ The Pile Dataset Overview\n", "\n", "**The Pile** is a large-scale, diverse text dataset created by EleutherAI for training language models. For this project, we:\n", "\n", "- **Access via API**: Use the Hugging Face Datasets library (no local downloads)\n", "- **Filter for ML/AI**: Extract content relevant to machine learning and AI\n", "- **Process Text**: Clean, chunk, and prepare for embedding\n", "- **Create Knowledge Base**: Build a searchable vector database\n", "\n", "### 🔍 Content Filtering Strategy\n", "\n", "We filter text samples using ML/AI keywords:\n", "- Machine learning, deep learning, neural networks\n", "- Artificial intelligence, algorithms, models\n", "- Training, data, features, classification\n", "- Regression, clustering, optimization, gradient, tensor\n", "\n", "### 📊 Processing Pipeline\n", "\n", "1. **Load Dataset**: Stream data from Hugging Face\n", "2. **Filter Content**: Keep only ML/AI relevant text\n", "3. **Clean Text**: Remove extra whitespace and format\n", "4. **Chunk Text**: Split into manageable pieces (500 words)\n", "5. 
**Validate Length**: Keep chunks between 100-2000 characters\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Import required libraries\n", "import pandas as pd\n", "import numpy as np\n", "from datasets import load_dataset\n", "from tqdm import tqdm\n", "import re\n", "import os\n", "\n", "print(\"๐Ÿ“š Loading The Pile dataset...\")\n", "\n", "# Load a subset of The Pile dataset\n", "# We'll use a smaller subset for demonstration to avoid memory issues\n", "try:\n", " # Load a specific subset that contains ML/AI content\n", " dataset = load_dataset(\"EleutherAI/the_pile\", split=\"train\", streaming=True)\n", " \n", " # Take first 1000 samples for demonstration\n", " texts = []\n", " ml_keywords = ['machine learning', 'deep learning', 'neural network', 'artificial intelligence', \n", " 'algorithm', 'model', 'training', 'data', 'feature', 'classification', \n", " 'regression', 'clustering', 'optimization', 'gradient', 'tensor']\n", " \n", " print(\"๐Ÿ” Filtering ML/AI related content...\")\n", " count = 0\n", " for sample in tqdm(dataset, desc=\"Processing samples\"):\n", " if count >= 1000: # Limit to 1000 samples for Colab\n", " break\n", " \n", " text = sample['text']\n", " # Check if text contains ML/AI keywords\n", " if any(keyword in text.lower() for keyword in ml_keywords):\n", " # Clean and preprocess text\n", " text = re.sub(r'\\s+', ' ', text) # Remove extra whitespace\n", " text = text.strip()\n", " \n", " # Only keep texts that are reasonable length (not too short or too long)\n", " if 100 <= len(text) <= 2000:\n", " texts.append(text)\n", " count += 1\n", " \n", " print(f\"โœ… Loaded {len(texts)} ML/AI related text samples\")\n", " \n", "except Exception as e:\n", " print(f\"โŒ Error loading dataset: {e}\")\n", " print(\"๐Ÿ”„ Using fallback sample data...\")\n", " \n", " # Fallback sample data if The Pile is not accessible\n", " texts = [\n", " \"Machine learning is a subset of artificial intelligence that focuses on algorithms that can learn from data. Deep learning uses neural networks with multiple layers to process complex patterns in data.\",\n", " \"Neural networks are computing systems inspired by biological neural networks. They consist of interconnected nodes that process information using a connectionist approach.\",\n", " \"Supervised learning uses labeled training data to learn a mapping from inputs to outputs. Common algorithms include linear regression, decision trees, and support vector machines.\",\n", " \"Unsupervised learning finds hidden patterns in data without labeled examples. Clustering algorithms like K-means group similar data points together.\",\n", " \"Natural language processing combines computational linguistics with machine learning to help computers understand human language. It includes tasks like text classification and sentiment analysis.\",\n", " \"Computer vision enables machines to interpret and understand visual information from the world. It uses deep learning models like convolutional neural networks.\",\n", " \"Reinforcement learning is a type of machine learning where agents learn to make decisions by interacting with an environment and receiving rewards or penalties.\",\n", " \"Feature engineering is the process of selecting and transforming raw data into features that can be used by machine learning algorithms. 
Good features can significantly improve model performance.\",\n", " \"Cross-validation is a technique used to assess how well a machine learning model generalizes to new data. It involves splitting data into training and validation sets multiple times.\",\n", " \"Overfitting occurs when a model learns the training data too well and performs poorly on new data. Regularization techniques help prevent overfitting.\"\n", " ]\n", " print(f\"โœ… Using {len(texts)} sample texts\")\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## ๐Ÿง  Step 4: Vector Database and Embeddings Setup\n", "\n", "### ๐Ÿ”ง Embedding Model Selection\n", "\n", "We use **Sentence Transformers** with the `all-MiniLM-L6-v2` model:\n", "\n", "- **Lightweight**: Fast and efficient for Colab environments\n", "- **High Quality**: Good semantic understanding for ML/AI content\n", "- **Multilingual**: Handles various text formats\n", "- **Optimized**: Designed for similarity search tasks\n", "\n", "### ๐Ÿ—„๏ธ Chroma Vector Database\n", "\n", "**Chroma** is our vector database choice because:\n", "\n", "- **Easy Setup**: Simple Python API\n", "- **Persistent Storage**: Saves embeddings between sessions\n", "- **Efficient Search**: Fast similarity search capabilities\n", "- **Scalable**: Can handle large collections of documents\n", "\n", "### ๐Ÿ“Š Database Architecture\n", "\n", "- **Collection Name**: `ml_ai_knowledge`\n", "- **Storage**: Local directory `./chroma_db`\n", "- **Metadata**: Document source, chunk index, text length\n", "- **Indexing**: Automatic vector indexing for fast retrieval\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Initialize embeddings and vector database\n", "from sentence_transformers import SentenceTransformer\n", "import chromadb\n", "from chromadb.config import Settings\n", "\n", "print(\"๐Ÿง  Initializing embeddings model...\")\n", "\n", "# Use a lightweight sentence transformer model\n", "embedding_model = SentenceTransformer('all-MiniLM-L6-v2')\n", "print(\"โœ… Embedding model loaded!\")\n", "\n", "print(\"๐Ÿ—„๏ธ Setting up Chroma vector database...\")\n", "\n", "# Create Chroma client with persistent storage\n", "chroma_client = chromadb.Client(Settings(\n", " persist_directory=\"./chroma_db\",\n", " anonymized_telemetry=False\n", "))\n", "\n", "# Create or get collection\n", "collection_name = \"ml_ai_knowledge\"\n", "try:\n", " collection = chroma_client.get_collection(collection_name)\n", " print(f\"โœ… Found existing collection: {collection_name}\")\n", "except:\n", " collection = chroma_client.create_collection(\n", " name=collection_name,\n", " metadata={\"description\": \"ML/AI knowledge base from The Pile dataset\"}\n", " )\n", " print(f\"โœ… Created new collection: {collection_name}\")\n", "\n", "print(\"๐ŸŽฏ Vector database ready!\")\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## ๐Ÿ“ Step 5: Text Processing and Embedding Storage\n", "\n", "### ๐Ÿ”„ Text Chunking Strategy\n", "\n", "We implement intelligent text chunking to optimize retrieval:\n", "\n", "- **Chunk Size**: 500 words per chunk\n", "- **Overlap**: 50 words between chunks (prevents information loss)\n", "- **Minimum Length**: 50 characters (filters out empty chunks)\n", "- **Metadata**: Track source document and chunk position\n", "\n", "### ๐Ÿ’พ Batch Processing\n", "\n", "To handle large datasets efficiently:\n", "\n", "- **Batch Size**: 100 documents per batch\n", "- **Memory Management**: Process in chunks to avoid OOM 
errors\n", "- **Progress Tracking**: Visual progress bars for long operations\n", "- **Error Handling**: Graceful handling of processing errors\n", "\n", "### ๐Ÿท๏ธ Document Metadata\n", "\n", "Each document chunk includes:\n", "\n", "- **Source ID**: Original document identifier\n", "- **Chunk Index**: Position within the document\n", "- **Total Chunks**: Number of chunks in the document\n", "- **Text Length**: Character count for quality control\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Process and embed text data\n", "import uuid\n", "from tqdm import tqdm\n", "\n", "def chunk_text(text, chunk_size=500, overlap=50):\n", " \"\"\"Split text into overlapping chunks\"\"\"\n", " words = text.split()\n", " chunks = []\n", " \n", " for i in range(0, len(words), chunk_size - overlap):\n", " chunk = ' '.join(words[i:i + chunk_size])\n", " if len(chunk.strip()) > 50: # Only keep substantial chunks\n", " chunks.append(chunk)\n", " \n", " return chunks\n", "\n", "print(\"๐Ÿ“ Processing and chunking text data...\")\n", "\n", "# Check if collection already has data\n", "existing_count = collection.count()\n", "print(f\"๐Ÿ“Š Current documents in collection: {existing_count}\")\n", "\n", "if existing_count == 0:\n", " print(\"๐Ÿ”„ Adding new documents to collection...\")\n", " \n", " all_chunks = []\n", " chunk_ids = []\n", " chunk_metadatas = []\n", " \n", " for i, text in enumerate(tqdm(texts, desc=\"Processing texts\")):\n", " chunks = chunk_text(text)\n", " \n", " for j, chunk in enumerate(chunks):\n", " chunk_id = f\"doc_{i}_chunk_{j}\"\n", " metadata = {\n", " \"source\": f\"the_pile_doc_{i}\",\n", " \"chunk_index\": j,\n", " \"total_chunks\": len(chunks),\n", " \"text_length\": len(chunk)\n", " }\n", " \n", " all_chunks.append(chunk)\n", " chunk_ids.append(chunk_id)\n", " chunk_metadatas.append(metadata)\n", " \n", " print(f\"๐Ÿ“Š Created {len(all_chunks)} text chunks\")\n", " \n", " # Add documents to Chroma in batches to avoid memory issues\n", " batch_size = 100\n", " for i in tqdm(range(0, len(all_chunks), batch_size), desc=\"Adding to Chroma\"):\n", " batch_chunks = all_chunks[i:i + batch_size]\n", " batch_ids = chunk_ids[i:i + batch_size]\n", " batch_metadatas = chunk_metadatas[i:i + batch_size]\n", " \n", " collection.add(\n", " documents=batch_chunks,\n", " ids=batch_ids,\n", " metadatas=batch_metadatas\n", " )\n", " \n", " print(\"โœ… All documents added to Chroma!\")\n", "else:\n", " print(\"โœ… Collection already contains data, skipping addition\")\n", "\n", "# Verify the collection\n", "final_count = collection.count()\n", "print(f\"๐Ÿ“Š Final document count: {final_count}\")\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## ๐Ÿค– Step 6: Google Gemini Model Integration\n", "\n", "### ๐Ÿง  Model Configuration\n", "\n", "We use **Google Gemini 2.5 Flash** for text generation:\n", "\n", "- **Model**: `gemini-2.0-flash-exp` (latest available)\n", "- **Temperature**: 0.7 (balanced creativity and accuracy)\n", "- **Max Tokens**: 1024 (sufficient for detailed responses)\n", "- **System Integration**: LangChain wrapper for easy use\n", "\n", "### ๐Ÿ”ง LangChain Integration\n", "\n", "**LangChain** provides:\n", "\n", "- **Unified Interface**: Consistent API across different LLMs\n", "- **Message Handling**: System and human message management\n", "- **Error Handling**: Robust error management and retries\n", "- **Streaming**: Optional streaming responses\n", "\n", "### ๐Ÿงช Model Testing\n", "\n", "We test the 
model to ensure:\n", "\n", "- **API Connectivity**: Verify API key and connection\n", "- **Response Quality**: Check output format and content\n", "- **Error Handling**: Test error scenarios\n", "- **Performance**: Measure response times\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Initialize Gemini model\n", "from langchain_google_genai import ChatGoogleGenerativeAI\n", "from langchain.schema import HumanMessage, SystemMessage\n", "\n", "print(\"๐Ÿค– Initializing Gemini 2.5 Flash model...\")\n", "\n", "# Initialize the Gemini model\n", "llm = ChatGoogleGenerativeAI(\n", " model=\"gemini-2.0-flash-exp\", # Using the latest available model\n", " temperature=0.7,\n", " max_output_tokens=1024,\n", " convert_system_message_to_human=True\n", ")\n", "\n", "print(\"โœ… Gemini model initialized!\")\n", "\n", "# Test the model\n", "try:\n", " test_response = llm.invoke(\"Hello! Can you tell me about machine learning?\")\n", " print(\"๐Ÿงช Test response:\", test_response.content[:100] + \"...\")\n", " print(\"โœ… Gemini model is working!\")\n", "except Exception as e:\n", " print(f\"โŒ Error testing Gemini model: {e}\")\n", " print(\"Please check your API key and try again.\")\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## ๐Ÿ” Step 7: RAG Pipeline Implementation\n", "\n", "### ๐Ÿ”„ Complete RAG Workflow\n", "\n", "The RAG pipeline combines retrieval and generation:\n", "\n", "1. **Query Processing**: User question is received\n", "2. **Document Retrieval**: Similar documents are found using vector search\n", "3. **Context Assembly**: Retrieved documents are combined into context\n", "4. **Answer Generation**: Gemini generates response using context\n", "5. **Response Delivery**: Formatted answer is returned to user\n", "\n", "### ๐ŸŽฏ Retrieval Strategy\n", "\n", "- **Similarity Search**: Cosine similarity between query and documents\n", "- **Top-K Results**: Retrieve top 5 most relevant documents\n", "- **Context Length**: Combine retrieved documents for comprehensive context\n", "- **Metadata Tracking**: Track similarity scores and document sources\n", "\n", "### ๐Ÿค– Generation Strategy\n", "\n", "- **System Prompt**: Specialized instructions for ML/AI responses\n", "- **Context Integration**: Retrieved documents used as context\n", "- **Response Formatting**: Markdown support for rich text\n", "- **Error Handling**: Graceful handling of generation errors\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Create RAG pipeline\n", "def retrieve_relevant_docs(query, n_results=5):\n", " \"\"\"Retrieve relevant documents from Chroma\"\"\"\n", " try:\n", " results = collection.query(\n", " query_texts=[query],\n", " n_results=n_results\n", " )\n", " \n", " # Extract documents and metadata\n", " documents = results['documents'][0]\n", " metadatas = results['metadatas'][0]\n", " distances = results['distances'][0]\n", " \n", " return documents, metadatas, distances\n", " except Exception as e:\n", " print(f\"Error retrieving documents: {e}\")\n", " return [], [], []\n", "\n", "def create_context(documents):\n", " \"\"\"Create context string from retrieved documents\"\"\"\n", " context = \"\\n\\n\".join(documents)\n", " return context\n", "\n", "def generate_answer(query, context):\n", " \"\"\"Generate answer using Gemini with retrieved context\"\"\"\n", " system_prompt = \"\"\"You are an AI assistant specialized in machine learning, deep learning, and artificial intelligence. 
\n", "    Use the provided context to answer questions accurately and comprehensively. If the context doesn't contain enough\n", "    information, you can supplement with your general knowledge, but always prioritize the provided context.\n", "\n", "    Provide clear, well-structured answers with examples when appropriate.\"\"\"\n", "\n", "    user_prompt = f\"\"\"Context:\n", "    {context}\n", "\n", "    Question: {query}\n", "\n", "    Please provide a comprehensive answer based on the context above.\"\"\"\n", "\n", "    try:\n", "        messages = [\n", "            SystemMessage(content=system_prompt),\n", "            HumanMessage(content=user_prompt)\n", "        ]\n", "\n", "        response = llm.invoke(messages)\n", "        return response.content\n", "    except Exception as e:\n", "        return f\"Error generating answer: {e}\"\n", "\n", "def rag_pipeline(query, n_results=5):\n", "    \"\"\"Complete RAG pipeline\"\"\"\n", "    print(f\"🔍 Processing query: '{query}'\")\n", "\n", "    # Retrieve relevant documents\n", "    documents, metadatas, distances = retrieve_relevant_docs(query, n_results)\n", "\n", "    if not documents:\n", "        # Return the same 4-tuple shape as the success path so callers can unpack it safely\n", "        return \"Sorry, I couldn't find relevant information for your query.\", [], [], []\n", "\n", "    print(f\"📚 Retrieved {len(documents)} relevant documents\")\n", "\n", "    # Create context\n", "    context = create_context(documents)\n", "\n", "    # Generate answer\n", "    answer = generate_answer(query, context)\n", "\n", "    return answer, documents, metadatas, distances\n", "\n", "print(\"✅ RAG pipeline created!\")\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 🧪 Step 8: System Testing and Validation\n", "\n", "### 🎯 Test Questions\n", "\n", "We test the RAG system with diverse ML/AI questions covering:\n", "\n", "- **Basic Concepts**: Fundamental ML/AI definitions\n", "- **Algorithms**: Specific algorithm explanations\n", "- **Applications**: Real-world use cases\n", "- **Technical Details**: Deep technical concepts\n", "- **Advanced Topics**: Cutting-edge AI research\n", "\n", "### 📊 Performance Metrics\n", "\n", "During testing, we evaluate:\n", "\n", "- **Response Quality**: Accuracy and relevance of answers\n", "- **Retrieval Performance**: Quality of retrieved documents\n", "- **Response Time**: Speed of query processing\n", "- **Context Relevance**: How well retrieved context matches queries\n", "\n", "### 🔍 Debugging Information\n", "\n", "Each test shows:\n", "\n", "- **Retrieved Documents**: Number and content of retrieved chunks\n", "- **Similarity Scores**: Distance metrics for relevance assessment\n", "- **Response Content**: Generated answer quality\n", "- **Error Handling**: Any issues encountered\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Test the RAG system\n", "test_questions = [\n", "    \"What is machine learning?\",\n", "    \"How do neural networks work?\",\n", "    \"What is the difference between supervised and unsupervised learning?\",\n", "    \"Explain deep learning\",\n", "    \"What is overfitting in machine learning?\"\n", "]\n", "\n", "print(\"🧪 Testing RAG system with sample questions...\\n\")\n", "\n", "for i, question in enumerate(test_questions, 1):\n", "    print(f\"❓ Question {i}: {question}\")\n", "    print(\"-\" * 50)\n", "\n", "    try:\n", "        answer, documents, metadatas, distances = rag_pipeline(question)\n", "        print(f\"🤖 Answer: {answer}\")\n", "        print(f\"📊 Retrieved {len(documents)} documents\")\n", "        print(f\"🎯 Similarity scores: {[f'{d:.3f}' for d in distances]}\")\n", "        print(\"\\n\" + \"=\"*80 + \"\\n\")\n", "    except 
Exception as e:\n", " print(f\"โŒ Error: {e}\")\n", " print(\"\\n\" + \"=\"*80 + \"\\n\")\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## ๐Ÿ’พ Step 9: Configuration and Deployment Preparation\n", "\n", "### ๐Ÿ”ง Configuration Management\n", "\n", "We save system configuration for deployment:\n", "\n", "- **Model Settings**: Embedding model, LLM parameters\n", "- **Database Config**: Collection name, storage settings\n", "- **Pipeline Settings**: Retrieval parameters, generation settings\n", "- **Version Info**: Component versions for reproducibility\n", "\n", "### ๐Ÿ“ Output Files\n", "\n", "The notebook generates:\n", "\n", "- **Vector Database**: `./chroma_db/` directory with embeddings\n", "- **Configuration**: `rag_config.json` with system settings\n", "- **Test Results**: Validation of system functionality\n", "- **Documentation**: Setup and usage instructions\n", "\n", "### ๐Ÿš€ Deployment Readiness\n", "\n", "The system is now ready for:\n", "\n", "- **Streamlit Deployment**: Use `app.py` for web interface\n", "- **Hugging Face Spaces**: Deploy to cloud platform\n", "- **Local Development**: Run in local environment\n", "- **Production Use**: Scale for multiple users\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Save components for Streamlit app\n", "import pickle\n", "import json\n", "\n", "print(\"๐Ÿ’พ Saving components for Streamlit app...\")\n", "\n", "# Save the RAG pipeline functions and configuration\n", "rag_config = {\n", " 'collection_name': collection_name,\n", " 'embedding_model_name': 'all-MiniLM-L6-v2',\n", " 'gemini_model': 'gemini-2.0-flash-exp',\n", " 'temperature': 0.7,\n", " 'max_output_tokens': 1024,\n", " 'n_results': 5\n", "}\n", "\n", "# Save configuration\n", "with open('rag_config.json', 'w') as f:\n", " json.dump(rag_config, f, indent=2)\n", "\n", "print(\"โœ… Configuration saved to rag_config.json\")\n", "\n", "# Create a simple test to verify everything works\n", "print(\"\\n๐ŸŽฏ Final verification test...\")\n", "test_query = \"What is artificial intelligence?\"\n", "try:\n", " answer, docs, metas, dists = rag_pipeline(test_query)\n", " print(f\"โœ… Test successful! Answer length: {len(answer)} characters\")\n", " print(f\"๐Ÿ“Š Retrieved {len(docs)} documents\")\n", "except Exception as e:\n", " print(f\"โŒ Test failed: {e}\")\n", "\n", "print(\"\\n๐ŸŽ‰ RAG system is ready!\")\n", "print(\"๐Ÿ“ Files created:\")\n", "print(\" - chroma_db/ (vector database)\")\n", "print(\" - rag_config.json (configuration)\")\n", "print(\"\\n๐Ÿš€ You can now use this system in the Streamlit app!\")\n" ] }, { "cell_type": "markdown", "metadata": { "id": "iKl63WFB8KGV" }, "source": [ "# ๐Ÿค– RAG Chatbot: ML/AI Knowledge Assistant\n", "\n", "This notebook implements a Retrieval-Augmented Generation (RAG) chatbot that provides information about machine learning, deep learning, AI, and related topics using:\n", "\n", "- **Generation Model**: Google Gemini 2.5 Flash\n", "- **RAG Framework**: LangChain\n", "- **Vector Database**: Chroma\n", "- **Dataset**: The Pile (EleutherAI/the_pile) from Hugging Face\n", "\n", "## ๐ŸŽฏ Project Overview\n", "\n", "The chatbot works by:\n", "1. Loading text data from The Pile dataset\n", "2. Preprocessing and embedding the text\n", "3. Storing embeddings in Chroma vector database\n", "4. Retrieving relevant context for user queries\n", "5. 
Generating answers using Gemini 2.5 Flash with retrieved context\n" ] }, { "cell_type": "markdown", "metadata": { "id": "_ZAXLi2l8KGX" }, "source": [ "## ๐Ÿ“ฆ Installation and Setup\n", "\n", "First, let's install all required packages:\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 1000 }, "collapsed": true, "id": "fDpasd0-8KGX", "outputId": "6b2904de-54e4-4808-ab4b-074636ca272f" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Collecting streamlit==1.28.1\n", " Downloading streamlit-1.28.1-py2.py3-none-any.whl.metadata (8.1 kB)\n", "Requirement already satisfied: altair<6,>=4.0 in /usr/local/lib/python3.12/dist-packages (from streamlit==1.28.1) (5.5.0)\n", "Requirement already satisfied: blinker<2,>=1.0.0 in /usr/local/lib/python3.12/dist-packages (from streamlit==1.28.1) (1.9.0)\n", "Requirement already satisfied: cachetools<6,>=4.0 in /usr/local/lib/python3.12/dist-packages (from streamlit==1.28.1) (5.5.2)\n", "Requirement already satisfied: click<9,>=7.0 in /usr/local/lib/python3.12/dist-packages (from streamlit==1.28.1) (8.3.0)\n", "Collecting importlib-metadata<7,>=1.4 (from streamlit==1.28.1)\n", " Downloading importlib_metadata-6.11.0-py3-none-any.whl.metadata (4.9 kB)\n", "Collecting numpy<2,>=1.19.3 (from streamlit==1.28.1)\n", " Downloading numpy-1.26.4-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (61 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m61.0/61.0 kB\u001b[0m \u001b[31m2.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hCollecting packaging<24,>=16.8 (from streamlit==1.28.1)\n", " Downloading packaging-23.2-py3-none-any.whl.metadata (3.2 kB)\n", "Requirement already satisfied: pandas<3,>=1.3.0 in /usr/local/lib/python3.12/dist-packages (from streamlit==1.28.1) (2.2.2)\n", "Collecting pillow<11,>=7.1.0 (from streamlit==1.28.1)\n", " Downloading pillow-10.4.0-cp312-cp312-manylinux_2_28_x86_64.whl.metadata (9.2 kB)\n", "Collecting protobuf<5,>=3.20 (from streamlit==1.28.1)\n", " Downloading protobuf-4.25.8-cp37-abi3-manylinux2014_x86_64.whl.metadata (541 bytes)\n", "Requirement already satisfied: pyarrow>=6.0 in /usr/local/lib/python3.12/dist-packages (from streamlit==1.28.1) (18.1.0)\n", "Requirement already satisfied: python-dateutil<3,>=2.7.3 in /usr/local/lib/python3.12/dist-packages (from streamlit==1.28.1) (2.9.0.post0)\n", "Requirement already satisfied: requests<3,>=2.27 in /usr/local/lib/python3.12/dist-packages (from streamlit==1.28.1) (2.32.4)\n", "Requirement already satisfied: rich<14,>=10.14.0 in /usr/local/lib/python3.12/dist-packages (from streamlit==1.28.1) (13.9.4)\n", "Requirement already satisfied: tenacity<9,>=8.1.0 in /usr/local/lib/python3.12/dist-packages (from streamlit==1.28.1) (8.5.0)\n", "Requirement already satisfied: toml<2,>=0.10.1 in /usr/local/lib/python3.12/dist-packages (from streamlit==1.28.1) (0.10.2)\n", "Requirement already satisfied: typing-extensions<5,>=4.3.0 in /usr/local/lib/python3.12/dist-packages (from streamlit==1.28.1) (4.15.0)\n", "Requirement already satisfied: tzlocal<6,>=1.1 in /usr/local/lib/python3.12/dist-packages (from streamlit==1.28.1) (5.3.1)\n", "Collecting validators<1,>=0.2 (from streamlit==1.28.1)\n", " Downloading validators-0.35.0-py3-none-any.whl.metadata (3.9 kB)\n", "Requirement already satisfied: gitpython!=3.1.19,<4,>=3.0.7 in 
/usr/local/lib/python3.12/dist-packages (from streamlit==1.28.1) (3.1.45)\n", "Collecting pydeck<1,>=0.8.0b4 (from streamlit==1.28.1)\n", " Downloading pydeck-0.9.1-py2.py3-none-any.whl.metadata (4.1 kB)\n", "Requirement already satisfied: tornado<7,>=6.0.3 in /usr/local/lib/python3.12/dist-packages (from streamlit==1.28.1) (6.5.1)\n", "Requirement already satisfied: watchdog>=2.1.5 in /usr/local/lib/python3.12/dist-packages (from streamlit==1.28.1) (6.0.0)\n", "Requirement already satisfied: jinja2 in /usr/local/lib/python3.12/dist-packages (from altair<6,>=4.0->streamlit==1.28.1) (3.1.6)\n", "Requirement already satisfied: jsonschema>=3.0 in /usr/local/lib/python3.12/dist-packages (from altair<6,>=4.0->streamlit==1.28.1) (4.25.1)\n", "Requirement already satisfied: narwhals>=1.14.2 in /usr/local/lib/python3.12/dist-packages (from altair<6,>=4.0->streamlit==1.28.1) (2.9.0)\n", "Requirement already satisfied: gitdb<5,>=4.0.1 in /usr/local/lib/python3.12/dist-packages (from gitpython!=3.1.19,<4,>=3.0.7->streamlit==1.28.1) (4.0.12)\n", "Requirement already satisfied: zipp>=0.5 in /usr/local/lib/python3.12/dist-packages (from importlib-metadata<7,>=1.4->streamlit==1.28.1) (3.23.0)\n", "Requirement already satisfied: pytz>=2020.1 in /usr/local/lib/python3.12/dist-packages (from pandas<3,>=1.3.0->streamlit==1.28.1) (2025.2)\n", "Requirement already satisfied: tzdata>=2022.7 in /usr/local/lib/python3.12/dist-packages (from pandas<3,>=1.3.0->streamlit==1.28.1) (2025.2)\n", "Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.12/dist-packages (from python-dateutil<3,>=2.7.3->streamlit==1.28.1) (1.17.0)\n", "Requirement already satisfied: charset_normalizer<4,>=2 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2.27->streamlit==1.28.1) (3.4.4)\n", "Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2.27->streamlit==1.28.1) (3.11)\n", "Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2.27->streamlit==1.28.1) (2.5.0)\n", "Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2.27->streamlit==1.28.1) (2025.10.5)\n", "Requirement already satisfied: markdown-it-py>=2.2.0 in /usr/local/lib/python3.12/dist-packages (from rich<14,>=10.14.0->streamlit==1.28.1) (4.0.0)\n", "Requirement already satisfied: pygments<3.0.0,>=2.13.0 in /usr/local/lib/python3.12/dist-packages (from rich<14,>=10.14.0->streamlit==1.28.1) (2.19.2)\n", "Requirement already satisfied: smmap<6,>=3.0.1 in /usr/local/lib/python3.12/dist-packages (from gitdb<5,>=4.0.1->gitpython!=3.1.19,<4,>=3.0.7->streamlit==1.28.1) (5.0.2)\n", "Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.12/dist-packages (from jinja2->altair<6,>=4.0->streamlit==1.28.1) (3.0.3)\n", "Requirement already satisfied: attrs>=22.2.0 in /usr/local/lib/python3.12/dist-packages (from jsonschema>=3.0->altair<6,>=4.0->streamlit==1.28.1) (25.4.0)\n", "Requirement already satisfied: jsonschema-specifications>=2023.03.6 in /usr/local/lib/python3.12/dist-packages (from jsonschema>=3.0->altair<6,>=4.0->streamlit==1.28.1) (2025.9.1)\n", "Requirement already satisfied: referencing>=0.28.4 in /usr/local/lib/python3.12/dist-packages (from jsonschema>=3.0->altair<6,>=4.0->streamlit==1.28.1) (0.37.0)\n", "Requirement already satisfied: rpds-py>=0.7.1 in /usr/local/lib/python3.12/dist-packages (from jsonschema>=3.0->altair<6,>=4.0->streamlit==1.28.1) (0.27.1)\n", 
"Requirement already satisfied: mdurl~=0.1 in /usr/local/lib/python3.12/dist-packages (from markdown-it-py>=2.2.0->rich<14,>=10.14.0->streamlit==1.28.1) (0.1.2)\n", "Downloading streamlit-1.28.1-py2.py3-none-any.whl (8.4 MB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m8.4/8.4 MB\u001b[0m \u001b[31m26.0 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading importlib_metadata-6.11.0-py3-none-any.whl (23 kB)\n", "Downloading numpy-1.26.4-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (18.0 MB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m18.0/18.0 MB\u001b[0m \u001b[31m64.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading packaging-23.2-py3-none-any.whl (53 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m53.0/53.0 kB\u001b[0m \u001b[31m2.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading pillow-10.4.0-cp312-cp312-manylinux_2_28_x86_64.whl (4.5 MB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m4.5/4.5 MB\u001b[0m \u001b[31m92.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading protobuf-4.25.8-cp37-abi3-manylinux2014_x86_64.whl (294 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m294.9/294.9 kB\u001b[0m \u001b[31m17.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading pydeck-0.9.1-py2.py3-none-any.whl (6.9 MB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m6.9/6.9 MB\u001b[0m \u001b[31m68.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading validators-0.35.0-py3-none-any.whl (44 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m44.7/44.7 kB\u001b[0m \u001b[31m3.9 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hInstalling collected packages: validators, protobuf, pillow, packaging, numpy, importlib-metadata, pydeck, streamlit\n", " Attempting uninstall: protobuf\n", " Found existing installation: protobuf 5.29.5\n", " Uninstalling protobuf-5.29.5:\n", " Successfully uninstalled protobuf-5.29.5\n", " Attempting uninstall: pillow\n", " Found existing installation: pillow 11.3.0\n", " Uninstalling pillow-11.3.0:\n", " Successfully uninstalled pillow-11.3.0\n", " Attempting uninstall: packaging\n", " Found existing installation: packaging 25.0\n", " Uninstalling packaging-25.0:\n", " Successfully uninstalled packaging-25.0\n", " Attempting uninstall: numpy\n", " Found existing installation: numpy 2.0.2\n", " Uninstalling numpy-2.0.2:\n", " Successfully uninstalled numpy-2.0.2\n", " Attempting uninstall: importlib-metadata\n", " Found existing installation: importlib_metadata 8.7.0\n", " Uninstalling importlib_metadata-8.7.0:\n", " Successfully uninstalled importlib_metadata-8.7.0\n", "\u001b[31mERROR: pip's dependency 
resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\n", "db-dtypes 1.4.3 requires packaging>=24.2.0, but you have packaging 23.2 which is incompatible.\n", "google-cloud-bigquery 3.38.0 requires packaging>=24.2.0, but you have packaging 23.2 which is incompatible.\n", "xarray 2025.10.1 requires packaging>=24.1, but you have packaging 23.2 which is incompatible.\n", "jaxlib 0.7.2 requires numpy>=2.0, but you have numpy 1.26.4 which is incompatible.\n", "opencv-contrib-python 4.12.0.88 requires numpy<2.3.0,>=2; python_version >= \"3.9\", but you have numpy 1.26.4 which is incompatible.\n", "jax 0.7.2 requires numpy>=2.0, but you have numpy 1.26.4 which is incompatible.\n", "ydf 0.13.0 requires protobuf<7.0.0,>=5.29.1, but you have protobuf 4.25.8 which is incompatible.\n", "opencv-python-headless 4.12.0.88 requires numpy<2.3.0,>=2; python_version >= \"3.9\", but you have numpy 1.26.4 which is incompatible.\n", "opencv-python 4.12.0.88 requires numpy<2.3.0,>=2; python_version >= \"3.9\", but you have numpy 1.26.4 which is incompatible.\n", "pytensor 2.35.1 requires numpy>=2.0, but you have numpy 1.26.4 which is incompatible.\n", "grpcio-status 1.71.2 requires protobuf<6.0dev,>=5.26.1, but you have protobuf 4.25.8 which is incompatible.\n", "thinc 8.3.6 requires numpy<3.0.0,>=2.0.0, but you have numpy 1.26.4 which is incompatible.\n", "opentelemetry-proto 1.37.0 requires protobuf<7.0,>=5.0, but you have protobuf 4.25.8 which is incompatible.\u001b[0m\u001b[31m\n", "\u001b[0mSuccessfully installed importlib-metadata-6.11.0 numpy-1.26.4 packaging-23.2 pillow-10.4.0 protobuf-4.25.8 pydeck-0.9.1 streamlit-1.28.1 validators-0.35.0\n" ] }, { "data": { "application/vnd.colab-display-data+json": { "id": "5e32996eb2e54f5eb325b2894ed2cc8b", "pip_warning": { "packages": [ "PIL", "google", "numpy", "packaging" ] } } }, "metadata": {}, "output_type": "display_data" }, { "name": "stdout", "output_type": "stream", "text": [ "Collecting langchain==0.1.0\n", " Downloading langchain-0.1.0-py3-none-any.whl.metadata (13 kB)\n", "Requirement already satisfied: PyYAML>=5.3 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (6.0.3)\n", "Requirement already satisfied: SQLAlchemy<3,>=1.4 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (2.0.44)\n", "Requirement already satisfied: aiohttp<4.0.0,>=3.8.3 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (3.13.1)\n", "Collecting dataclasses-json<0.7,>=0.5.7 (from langchain==0.1.0)\n", " Downloading dataclasses_json-0.6.7-py3-none-any.whl.metadata (25 kB)\n", "Requirement already satisfied: jsonpatch<2.0,>=1.33 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (1.33)\n", "Collecting langchain-community<0.1,>=0.0.9 (from langchain==0.1.0)\n", " Downloading langchain_community-0.0.38-py3-none-any.whl.metadata (8.7 kB)\n", "Collecting langchain-core<0.2,>=0.1.7 (from langchain==0.1.0)\n", " Downloading langchain_core-0.1.53-py3-none-any.whl.metadata (5.9 kB)\n", "Collecting langsmith<0.1.0,>=0.0.77 (from langchain==0.1.0)\n", " Downloading langsmith-0.0.92-py3-none-any.whl.metadata (9.9 kB)\n", "Requirement already satisfied: numpy<2,>=1 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (1.26.4)\n", "Requirement already satisfied: pydantic<3,>=1 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (2.11.10)\n", "Requirement already satisfied: requests<3,>=2 in 
/usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (2.32.4)\n", "Requirement already satisfied: tenacity<9.0.0,>=8.1.0 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (8.5.0)\n", "Requirement already satisfied: aiohappyeyeballs>=2.5.0 in /usr/local/lib/python3.12/dist-packages (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.0) (2.6.1)\n", "Requirement already satisfied: aiosignal>=1.4.0 in /usr/local/lib/python3.12/dist-packages (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.0) (1.4.0)\n", "Requirement already satisfied: attrs>=17.3.0 in /usr/local/lib/python3.12/dist-packages (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.0) (25.4.0)\n", "Requirement already satisfied: frozenlist>=1.1.1 in /usr/local/lib/python3.12/dist-packages (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.0) (1.8.0)\n", "Requirement already satisfied: multidict<7.0,>=4.5 in /usr/local/lib/python3.12/dist-packages (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.0) (6.7.0)\n", "Requirement already satisfied: propcache>=0.2.0 in /usr/local/lib/python3.12/dist-packages (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.0) (0.4.1)\n", "Requirement already satisfied: yarl<2.0,>=1.17.0 in /usr/local/lib/python3.12/dist-packages (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.0) (1.22.0)\n", "Collecting marshmallow<4.0.0,>=3.18.0 (from dataclasses-json<0.7,>=0.5.7->langchain==0.1.0)\n", " Downloading marshmallow-3.26.1-py3-none-any.whl.metadata (7.3 kB)\n", "Collecting typing-inspect<1,>=0.4.0 (from dataclasses-json<0.7,>=0.5.7->langchain==0.1.0)\n", " Downloading typing_inspect-0.9.0-py3-none-any.whl.metadata (1.5 kB)\n", "Requirement already satisfied: jsonpointer>=1.9 in /usr/local/lib/python3.12/dist-packages (from jsonpatch<2.0,>=1.33->langchain==0.1.0) (3.0.0)\n", "INFO: pip is looking at multiple versions of langchain-community to determine which version is compatible with other requirements. This could take a while.\n", "Collecting langchain-community<0.1,>=0.0.9 (from langchain==0.1.0)\n", " Downloading langchain_community-0.0.37-py3-none-any.whl.metadata (8.7 kB)\n", " Downloading langchain_community-0.0.36-py3-none-any.whl.metadata (8.7 kB)\n", " Downloading langchain_community-0.0.35-py3-none-any.whl.metadata (8.7 kB)\n", " Downloading langchain_community-0.0.34-py3-none-any.whl.metadata (8.5 kB)\n", " Downloading langchain_community-0.0.33-py3-none-any.whl.metadata (8.5 kB)\n", " Downloading langchain_community-0.0.32-py3-none-any.whl.metadata (8.5 kB)\n", " Downloading langchain_community-0.0.31-py3-none-any.whl.metadata (8.4 kB)\n", "INFO: pip is still looking at multiple versions of langchain-community to determine which version is compatible with other requirements. This could take a while.\n", " Downloading langchain_community-0.0.30-py3-none-any.whl.metadata (8.4 kB)\n", " Downloading langchain_community-0.0.29-py3-none-any.whl.metadata (8.3 kB)\n", " Downloading langchain_community-0.0.28-py3-none-any.whl.metadata (8.3 kB)\n", " Downloading langchain_community-0.0.27-py3-none-any.whl.metadata (8.2 kB)\n", " Downloading langchain_community-0.0.26-py3-none-any.whl.metadata (8.2 kB)\n", "INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. See https://pip.pypa.io/warnings/backtracking for guidance. 
If you want to abort this run, press Ctrl + C.\n", " Downloading langchain_community-0.0.25-py3-none-any.whl.metadata (8.1 kB)\n", " Downloading langchain_community-0.0.24-py3-none-any.whl.metadata (8.1 kB)\n", " Downloading langchain_community-0.0.23-py3-none-any.whl.metadata (8.1 kB)\n", " Downloading langchain_community-0.0.22-py3-none-any.whl.metadata (8.1 kB)\n", " Downloading langchain_community-0.0.21-py3-none-any.whl.metadata (8.1 kB)\n", " Downloading langchain_community-0.0.20-py3-none-any.whl.metadata (8.1 kB)\n", "INFO: pip is looking at multiple versions of langchain-core to determine which version is compatible with other requirements. This could take a while.\n", "Collecting langchain-core<0.2,>=0.1.7 (from langchain==0.1.0)\n", " Downloading langchain_core-0.1.52-py3-none-any.whl.metadata (5.9 kB)\n", " Downloading langchain_core-0.1.51-py3-none-any.whl.metadata (5.9 kB)\n", " Downloading langchain_core-0.1.50-py3-none-any.whl.metadata (5.9 kB)\n", " Downloading langchain_core-0.1.49-py3-none-any.whl.metadata (5.9 kB)\n", " Downloading langchain_core-0.1.48-py3-none-any.whl.metadata (5.9 kB)\n", " Downloading langchain_core-0.1.47-py3-none-any.whl.metadata (5.9 kB)\n", " Downloading langchain_core-0.1.46-py3-none-any.whl.metadata (5.9 kB)\n", "INFO: pip is still looking at multiple versions of langchain-core to determine which version is compatible with other requirements. This could take a while.\n", " Downloading langchain_core-0.1.45-py3-none-any.whl.metadata (5.9 kB)\n", " Downloading langchain_core-0.1.44-py3-none-any.whl.metadata (5.9 kB)\n", " Downloading langchain_core-0.1.43-py3-none-any.whl.metadata (5.9 kB)\n", " Downloading langchain_core-0.1.42-py3-none-any.whl.metadata (5.9 kB)\n", " Downloading langchain_core-0.1.41-py3-none-any.whl.metadata (5.9 kB)\n", "INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. See https://pip.pypa.io/warnings/backtracking for guidance. 
If you want to abort this run, press Ctrl + C.\n", " Downloading langchain_core-0.1.40-py3-none-any.whl.metadata (5.9 kB)\n", " Downloading langchain_core-0.1.39-py3-none-any.whl.metadata (5.9 kB)\n", " Downloading langchain_core-0.1.38-py3-none-any.whl.metadata (6.0 kB)\n", " Downloading langchain_core-0.1.37-py3-none-any.whl.metadata (6.0 kB)\n", " Downloading langchain_core-0.1.36-py3-none-any.whl.metadata (6.0 kB)\n", " Downloading langchain_core-0.1.35-py3-none-any.whl.metadata (6.0 kB)\n", " Downloading langchain_core-0.1.34-py3-none-any.whl.metadata (6.0 kB)\n", " Downloading langchain_core-0.1.33-py3-none-any.whl.metadata (6.0 kB)\n", "Requirement already satisfied: anyio<5,>=3 in /usr/local/lib/python3.12/dist-packages (from langchain-core<0.2,>=0.1.7->langchain==0.1.0) (4.11.0)\n", " Downloading langchain_core-0.1.32-py3-none-any.whl.metadata (6.0 kB)\n", " Downloading langchain_core-0.1.31-py3-none-any.whl.metadata (6.0 kB)\n", " Downloading langchain_core-0.1.30-py3-none-any.whl.metadata (6.0 kB)\n", " Downloading langchain_core-0.1.29-py3-none-any.whl.metadata (6.0 kB)\n", " Downloading langchain_core-0.1.28-py3-none-any.whl.metadata (6.0 kB)\n", " Downloading langchain_core-0.1.27-py3-none-any.whl.metadata (6.0 kB)\n", " Downloading langchain_core-0.1.26-py3-none-any.whl.metadata (6.0 kB)\n", " Downloading langchain_core-0.1.25-py3-none-any.whl.metadata (6.0 kB)\n", " Downloading langchain_core-0.1.24-py3-none-any.whl.metadata (6.0 kB)\n", " Downloading langchain_core-0.1.23-py3-none-any.whl.metadata (6.0 kB)\n", "Collecting langsmith<0.1.0,>=0.0.77 (from langchain==0.1.0)\n", " Downloading langsmith-0.0.87-py3-none-any.whl.metadata (10 kB)\n", "Requirement already satisfied: packaging<24.0,>=23.2 in /usr/local/lib/python3.12/dist-packages (from langchain-core<0.2,>=0.1.7->langchain==0.1.0) (23.2)\n", "Requirement already satisfied: annotated-types>=0.6.0 in /usr/local/lib/python3.12/dist-packages (from pydantic<3,>=1->langchain==0.1.0) (0.7.0)\n", "Requirement already satisfied: pydantic-core==2.33.2 in /usr/local/lib/python3.12/dist-packages (from pydantic<3,>=1->langchain==0.1.0) (2.33.2)\n", "Requirement already satisfied: typing-extensions>=4.12.2 in /usr/local/lib/python3.12/dist-packages (from pydantic<3,>=1->langchain==0.1.0) (4.15.0)\n", "Requirement already satisfied: typing-inspection>=0.4.0 in /usr/local/lib/python3.12/dist-packages (from pydantic<3,>=1->langchain==0.1.0) (0.4.2)\n", "Requirement already satisfied: charset_normalizer<4,>=2 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2->langchain==0.1.0) (3.4.4)\n", "Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2->langchain==0.1.0) (3.11)\n", "Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2->langchain==0.1.0) (2.5.0)\n", "Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2->langchain==0.1.0) (2025.10.5)\n", "Requirement already satisfied: greenlet>=1 in /usr/local/lib/python3.12/dist-packages (from SQLAlchemy<3,>=1.4->langchain==0.1.0) (3.2.4)\n", "Requirement already satisfied: sniffio>=1.1 in /usr/local/lib/python3.12/dist-packages (from anyio<5,>=3->langchain-core<0.2,>=0.1.7->langchain==0.1.0) (1.3.1)\n", "Collecting mypy-extensions>=0.3.0 (from typing-inspect<1,>=0.4.0->dataclasses-json<0.7,>=0.5.7->langchain==0.1.0)\n", " Downloading mypy_extensions-1.1.0-py3-none-any.whl.metadata (1.1 kB)\n", "Downloading 
langchain-0.1.0-py3-none-any.whl (797 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m798.0/798.0 kB\u001b[0m \u001b[31m14.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading dataclasses_json-0.6.7-py3-none-any.whl (28 kB)\n", "Downloading langchain_community-0.0.20-py3-none-any.whl (1.7 MB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m1.7/1.7 MB\u001b[0m \u001b[31m63.2 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading langchain_core-0.1.23-py3-none-any.whl (241 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m241.2/241.2 kB\u001b[0m \u001b[31m13.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading langsmith-0.0.87-py3-none-any.whl (55 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m55.4/55.4 kB\u001b[0m \u001b[31m2.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading marshmallow-3.26.1-py3-none-any.whl (50 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m50.9/50.9 kB\u001b[0m \u001b[31m3.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading typing_inspect-0.9.0-py3-none-any.whl (8.8 kB)\n", "Downloading mypy_extensions-1.1.0-py3-none-any.whl (5.0 kB)\n", "Installing collected packages: mypy-extensions, marshmallow, typing-inspect, langsmith, dataclasses-json, langchain-core, langchain-community, langchain\n", " Attempting uninstall: langsmith\n", " Found existing installation: langsmith 0.4.37\n", " Uninstalling langsmith-0.4.37:\n", " Successfully uninstalled langsmith-0.4.37\n", " Attempting uninstall: langchain-core\n", " Found existing installation: langchain-core 0.3.79\n", " Uninstalling langchain-core-0.3.79:\n", " Successfully uninstalled langchain-core-0.3.79\n", " Attempting uninstall: langchain\n", " Found existing installation: langchain 0.3.27\n", " Uninstalling langchain-0.3.27:\n", " Successfully uninstalled langchain-0.3.27\n", "\u001b[31mERROR: pip's dependency resolver does not currently take into account all the packages that are installed. 
This behaviour is the source of the following dependency conflicts.\n", "langchain-text-splitters 0.3.11 requires langchain-core<2.0.0,>=0.3.75, but you have langchain-core 0.1.23 which is incompatible.\u001b[0m\u001b[31m\n", "\u001b[0mSuccessfully installed dataclasses-json-0.6.7 langchain-0.1.0 langchain-community-0.0.20 langchain-core-0.1.23 langsmith-0.0.87 marshmallow-3.26.1 mypy-extensions-1.1.0 typing-inspect-0.9.0\n", "Traceback (most recent call last):\n", " File \"\", line 1331, in _find_and_load_unlocked\n", " File \"\", line 935, in _load_unlocked\n", " File \"\", line 999, in exec_module\n", " File \"\", line 488, in _call_with_frames_removed\n", " File \"/usr/local/lib/python3.12/dist-packages/pip/_internal/cli/main_parser.py\", line 9, in \n", " from pip._internal.build_env import get_runnable_pip\n", " File \"/usr/local/lib/python3.12/dist-packages/pip/_internal/build_env.py\", line 15, in \n", " from pip._vendor.packaging.requirements import Requirement\n", " File \"/usr/local/lib/python3.12/dist-packages/pip/_vendor/packaging/requirements.py\", line 8, in \n", "^C\n" ] } ], "source": [ "# Install required packages\n", "!pip install streamlit==1.28.1\n", "!pip install langchain==0.1.0\n", "!pip install langchain-community==0.0.10\n", "!pip install langchain-google-genai==0.0.6\n", "!pip install chromadb==0.4.18\n", "!pip install datasets==2.14.6\n", "!pip install transformers==4.35.2\n", "!pip install sentence-transformers==2.2.2\n", "!pip install google-generativeai==0.3.2\n", "!pip install tiktoken==0.5.1\n", "!pip install numpy==1.24.3\n", "!pip install pandas==2.0.3\n", "!pip install tqdm==4.66.1\n" ] }, { "cell_type": "code", "execution_count": 2, "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 1000 }, "collapsed": true, "id": "kJ8PM-Tj9LNC", "outputId": "b6618d66-ef61-43e8-c0c9-fcfc6656993b" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Collecting chromadb\n", " Downloading chromadb-1.2.1-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (7.2 kB)\n", "Requirement already satisfied: build>=1.0.3 in /usr/local/lib/python3.12/dist-packages (from chromadb) (1.3.0)\n", "Requirement already satisfied: pydantic>=1.9 in /usr/local/lib/python3.12/dist-packages (from chromadb) (2.11.10)\n", "Collecting pybase64>=1.4.1 (from chromadb)\n", " Downloading pybase64-1.4.2-cp312-cp312-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl.metadata (8.7 kB)\n", "Requirement already satisfied: uvicorn>=0.18.3 in /usr/local/lib/python3.12/dist-packages (from uvicorn[standard]>=0.18.3->chromadb) (0.38.0)\n", "Requirement already satisfied: numpy>=1.22.5 in /usr/local/lib/python3.12/dist-packages (from chromadb) (1.26.4)\n", "Collecting posthog<6.0.0,>=2.4.0 (from chromadb)\n", " Downloading posthog-5.4.0-py3-none-any.whl.metadata (5.7 kB)\n", "Requirement already satisfied: typing-extensions>=4.5.0 in /usr/local/lib/python3.12/dist-packages (from chromadb) (4.15.0)\n", "Collecting onnxruntime>=1.14.1 (from chromadb)\n", " Downloading onnxruntime-1.23.2-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.metadata (5.1 kB)\n", "Requirement already satisfied: opentelemetry-api>=1.2.0 in /usr/local/lib/python3.12/dist-packages (from chromadb) (1.37.0)\n", "Collecting opentelemetry-exporter-otlp-proto-grpc>=1.2.0 (from chromadb)\n", " Downloading opentelemetry_exporter_otlp_proto_grpc-1.38.0-py3-none-any.whl.metadata (2.4 kB)\n", "Requirement already satisfied: opentelemetry-sdk>=1.2.0 
in /usr/local/lib/python3.12/dist-packages (from chromadb) (1.37.0)\n", "Requirement already satisfied: tokenizers>=0.13.2 in /usr/local/lib/python3.12/dist-packages (from chromadb) (0.22.1)\n", "Collecting pypika>=0.48.9 (from chromadb)\n", " Downloading PyPika-0.48.9.tar.gz (67 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m67.3/67.3 kB\u001b[0m \u001b[31m3.0 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25h Installing build dependencies ... \u001b[?25l\u001b[?25hdone\n", " Getting requirements to build wheel ... \u001b[?25l\u001b[?25hdone\n", " Preparing metadata (pyproject.toml) ... \u001b[?25l\u001b[?25hdone\n", "Requirement already satisfied: tqdm>=4.65.0 in /usr/local/lib/python3.12/dist-packages (from chromadb) (4.67.1)\n", "Requirement already satisfied: overrides>=7.3.1 in /usr/local/lib/python3.12/dist-packages (from chromadb) (7.7.0)\n", "Requirement already satisfied: importlib-resources in /usr/local/lib/python3.12/dist-packages (from chromadb) (6.5.2)\n", "Requirement already satisfied: grpcio>=1.58.0 in /usr/local/lib/python3.12/dist-packages (from chromadb) (1.75.1)\n", "Collecting bcrypt>=4.0.1 (from chromadb)\n", " Downloading bcrypt-5.0.0-cp39-abi3-manylinux_2_34_x86_64.whl.metadata (10 kB)\n", "Requirement already satisfied: typer>=0.9.0 in /usr/local/lib/python3.12/dist-packages (from chromadb) (0.20.0)\n", "Collecting kubernetes>=28.1.0 (from chromadb)\n", " Downloading kubernetes-34.1.0-py2.py3-none-any.whl.metadata (1.7 kB)\n", "Requirement already satisfied: tenacity>=8.2.3 in /usr/local/lib/python3.12/dist-packages (from chromadb) (8.5.0)\n", "Requirement already satisfied: pyyaml>=6.0.0 in /usr/local/lib/python3.12/dist-packages (from chromadb) (6.0.3)\n", "Collecting mmh3>=4.0.1 (from chromadb)\n", " Downloading mmh3-5.2.0-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl.metadata (14 kB)\n", "Requirement already satisfied: orjson>=3.9.12 in /usr/local/lib/python3.12/dist-packages (from chromadb) (3.11.3)\n", "Requirement already satisfied: httpx>=0.27.0 in /usr/local/lib/python3.12/dist-packages (from chromadb) (0.28.1)\n", "Requirement already satisfied: rich>=10.11.0 in /usr/local/lib/python3.12/dist-packages (from chromadb) (13.9.4)\n", "Requirement already satisfied: jsonschema>=4.19.0 in /usr/local/lib/python3.12/dist-packages (from chromadb) (4.25.1)\n", "Requirement already satisfied: packaging>=19.1 in /usr/local/lib/python3.12/dist-packages (from build>=1.0.3->chromadb) (23.2)\n", "Requirement already satisfied: pyproject_hooks in /usr/local/lib/python3.12/dist-packages (from build>=1.0.3->chromadb) (1.2.0)\n", "Requirement already satisfied: anyio in /usr/local/lib/python3.12/dist-packages (from httpx>=0.27.0->chromadb) (4.11.0)\n", "Requirement already satisfied: certifi in /usr/local/lib/python3.12/dist-packages (from httpx>=0.27.0->chromadb) (2025.10.5)\n", "Requirement already satisfied: httpcore==1.* in /usr/local/lib/python3.12/dist-packages (from httpx>=0.27.0->chromadb) (1.0.9)\n", "Requirement already satisfied: idna in /usr/local/lib/python3.12/dist-packages (from httpx>=0.27.0->chromadb) (3.11)\n", "Requirement already satisfied: h11>=0.16 in /usr/local/lib/python3.12/dist-packages (from httpcore==1.*->httpx>=0.27.0->chromadb) (0.16.0)\n", "Requirement already satisfied: attrs>=22.2.0 in /usr/local/lib/python3.12/dist-packages (from jsonschema>=4.19.0->chromadb) (25.4.0)\n", 
"Requirement already satisfied: jsonschema-specifications>=2023.03.6 in /usr/local/lib/python3.12/dist-packages (from jsonschema>=4.19.0->chromadb) (2025.9.1)\n", "Requirement already satisfied: referencing>=0.28.4 in /usr/local/lib/python3.12/dist-packages (from jsonschema>=4.19.0->chromadb) (0.37.0)\n", "Requirement already satisfied: rpds-py>=0.7.1 in /usr/local/lib/python3.12/dist-packages (from jsonschema>=4.19.0->chromadb) (0.27.1)\n", "Requirement already satisfied: six>=1.9.0 in /usr/local/lib/python3.12/dist-packages (from kubernetes>=28.1.0->chromadb) (1.17.0)\n", "Requirement already satisfied: python-dateutil>=2.5.3 in /usr/local/lib/python3.12/dist-packages (from kubernetes>=28.1.0->chromadb) (2.9.0.post0)\n", "Requirement already satisfied: google-auth>=1.0.1 in /usr/local/lib/python3.12/dist-packages (from kubernetes>=28.1.0->chromadb) (2.38.0)\n", "Requirement already satisfied: websocket-client!=0.40.0,!=0.41.*,!=0.42.*,>=0.32.0 in /usr/local/lib/python3.12/dist-packages (from kubernetes>=28.1.0->chromadb) (1.9.0)\n", "Requirement already satisfied: requests in /usr/local/lib/python3.12/dist-packages (from kubernetes>=28.1.0->chromadb) (2.32.4)\n", "Requirement already satisfied: requests-oauthlib in /usr/local/lib/python3.12/dist-packages (from kubernetes>=28.1.0->chromadb) (2.0.0)\n", "Collecting urllib3<2.4.0,>=1.24.2 (from kubernetes>=28.1.0->chromadb)\n", " Downloading urllib3-2.3.0-py3-none-any.whl.metadata (6.5 kB)\n", "Collecting durationpy>=0.7 (from kubernetes>=28.1.0->chromadb)\n", " Downloading durationpy-0.10-py3-none-any.whl.metadata (340 bytes)\n", "Collecting coloredlogs (from onnxruntime>=1.14.1->chromadb)\n", " Downloading coloredlogs-15.0.1-py2.py3-none-any.whl.metadata (12 kB)\n", "Requirement already satisfied: flatbuffers in /usr/local/lib/python3.12/dist-packages (from onnxruntime>=1.14.1->chromadb) (25.9.23)\n", "Requirement already satisfied: protobuf in /usr/local/lib/python3.12/dist-packages (from onnxruntime>=1.14.1->chromadb) (4.25.8)\n", "Requirement already satisfied: sympy in /usr/local/lib/python3.12/dist-packages (from onnxruntime>=1.14.1->chromadb) (1.13.3)\n", "Requirement already satisfied: importlib-metadata<8.8.0,>=6.0 in /usr/local/lib/python3.12/dist-packages (from opentelemetry-api>=1.2.0->chromadb) (6.11.0)\n", "Requirement already satisfied: googleapis-common-protos~=1.57 in /usr/local/lib/python3.12/dist-packages (from opentelemetry-exporter-otlp-proto-grpc>=1.2.0->chromadb) (1.71.0)\n", "Collecting opentelemetry-exporter-otlp-proto-common==1.38.0 (from opentelemetry-exporter-otlp-proto-grpc>=1.2.0->chromadb)\n", " Downloading opentelemetry_exporter_otlp_proto_common-1.38.0-py3-none-any.whl.metadata (1.8 kB)\n", "Collecting opentelemetry-proto==1.38.0 (from opentelemetry-exporter-otlp-proto-grpc>=1.2.0->chromadb)\n", " Downloading opentelemetry_proto-1.38.0-py3-none-any.whl.metadata (2.3 kB)\n", "Collecting opentelemetry-sdk>=1.2.0 (from chromadb)\n", " Downloading opentelemetry_sdk-1.38.0-py3-none-any.whl.metadata (1.5 kB)\n", "Collecting protobuf (from onnxruntime>=1.14.1->chromadb)\n", " Downloading protobuf-6.33.0-cp39-abi3-manylinux2014_x86_64.whl.metadata (593 bytes)\n", "Collecting opentelemetry-api>=1.2.0 (from chromadb)\n", " Downloading opentelemetry_api-1.38.0-py3-none-any.whl.metadata (1.5 kB)\n", "Collecting opentelemetry-semantic-conventions==0.59b0 (from opentelemetry-sdk>=1.2.0->chromadb)\n", " Downloading opentelemetry_semantic_conventions-0.59b0-py3-none-any.whl.metadata (2.4 kB)\n", "Collecting backoff>=1.10.0 
(from posthog<6.0.0,>=2.4.0->chromadb)\n", " Downloading backoff-2.2.1-py3-none-any.whl.metadata (14 kB)\n", "Requirement already satisfied: distro>=1.5.0 in /usr/local/lib/python3.12/dist-packages (from posthog<6.0.0,>=2.4.0->chromadb) (1.9.0)\n", "Requirement already satisfied: annotated-types>=0.6.0 in /usr/local/lib/python3.12/dist-packages (from pydantic>=1.9->chromadb) (0.7.0)\n", "Requirement already satisfied: pydantic-core==2.33.2 in /usr/local/lib/python3.12/dist-packages (from pydantic>=1.9->chromadb) (2.33.2)\n", "Requirement already satisfied: typing-inspection>=0.4.0 in /usr/local/lib/python3.12/dist-packages (from pydantic>=1.9->chromadb) (0.4.2)\n", "Requirement already satisfied: markdown-it-py>=2.2.0 in /usr/local/lib/python3.12/dist-packages (from rich>=10.11.0->chromadb) (4.0.0)\n", "Requirement already satisfied: pygments<3.0.0,>=2.13.0 in /usr/local/lib/python3.12/dist-packages (from rich>=10.11.0->chromadb) (2.19.2)\n", "Requirement already satisfied: huggingface-hub<2.0,>=0.16.4 in /usr/local/lib/python3.12/dist-packages (from tokenizers>=0.13.2->chromadb) (0.35.3)\n", "Requirement already satisfied: click>=8.0.0 in /usr/local/lib/python3.12/dist-packages (from typer>=0.9.0->chromadb) (8.3.0)\n", "Requirement already satisfied: shellingham>=1.3.0 in /usr/local/lib/python3.12/dist-packages (from typer>=0.9.0->chromadb) (1.5.4)\n", "Collecting httptools>=0.6.3 (from uvicorn[standard]>=0.18.3->chromadb)\n", " Downloading httptools-0.7.1-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl.metadata (3.5 kB)\n", "Requirement already satisfied: python-dotenv>=0.13 in /usr/local/lib/python3.12/dist-packages (from uvicorn[standard]>=0.18.3->chromadb) (1.1.1)\n", "Collecting uvloop>=0.15.1 (from uvicorn[standard]>=0.18.3->chromadb)\n", " Downloading uvloop-0.22.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl.metadata (4.9 kB)\n", "Collecting watchfiles>=0.13 (from uvicorn[standard]>=0.18.3->chromadb)\n", " Downloading watchfiles-1.1.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (4.9 kB)\n", "Requirement already satisfied: websockets>=10.4 in /usr/local/lib/python3.12/dist-packages (from uvicorn[standard]>=0.18.3->chromadb) (15.0.1)\n", "Requirement already satisfied: cachetools<6.0,>=2.0.0 in /usr/local/lib/python3.12/dist-packages (from google-auth>=1.0.1->kubernetes>=28.1.0->chromadb) (5.5.2)\n", "Requirement already satisfied: pyasn1-modules>=0.2.1 in /usr/local/lib/python3.12/dist-packages (from google-auth>=1.0.1->kubernetes>=28.1.0->chromadb) (0.4.2)\n", "Requirement already satisfied: rsa<5,>=3.1.4 in /usr/local/lib/python3.12/dist-packages (from google-auth>=1.0.1->kubernetes>=28.1.0->chromadb) (4.9.1)\n", "Requirement already satisfied: filelock in /usr/local/lib/python3.12/dist-packages (from huggingface-hub<2.0,>=0.16.4->tokenizers>=0.13.2->chromadb) (3.20.0)\n", "Requirement already satisfied: fsspec>=2023.5.0 in /usr/local/lib/python3.12/dist-packages (from huggingface-hub<2.0,>=0.16.4->tokenizers>=0.13.2->chromadb) (2025.3.0)\n", "Requirement already satisfied: hf-xet<2.0.0,>=1.1.3 in /usr/local/lib/python3.12/dist-packages (from huggingface-hub<2.0,>=0.16.4->tokenizers>=0.13.2->chromadb) (1.1.10)\n", "Requirement already satisfied: zipp>=0.5 in /usr/local/lib/python3.12/dist-packages (from importlib-metadata<8.8.0,>=6.0->opentelemetry-api>=1.2.0->chromadb) (3.23.0)\n", "Requirement already satisfied: mdurl~=0.1 in /usr/local/lib/python3.12/dist-packages (from 
markdown-it-py>=2.2.0->rich>=10.11.0->chromadb) (0.1.2)\n", "Requirement already satisfied: charset_normalizer<4,>=2 in /usr/local/lib/python3.12/dist-packages (from requests->kubernetes>=28.1.0->chromadb) (3.4.4)\n", "Requirement already satisfied: sniffio>=1.1 in /usr/local/lib/python3.12/dist-packages (from anyio->httpx>=0.27.0->chromadb) (1.3.1)\n", "Collecting humanfriendly>=9.1 (from coloredlogs->onnxruntime>=1.14.1->chromadb)\n", " Downloading humanfriendly-10.0-py2.py3-none-any.whl.metadata (9.2 kB)\n", "Requirement already satisfied: oauthlib>=3.0.0 in /usr/local/lib/python3.12/dist-packages (from requests-oauthlib->kubernetes>=28.1.0->chromadb) (3.3.1)\n", "Requirement already satisfied: mpmath<1.4,>=1.1.0 in /usr/local/lib/python3.12/dist-packages (from sympy->onnxruntime>=1.14.1->chromadb) (1.3.0)\n", "Requirement already satisfied: pyasn1<0.7.0,>=0.6.1 in /usr/local/lib/python3.12/dist-packages (from pyasn1-modules>=0.2.1->google-auth>=1.0.1->kubernetes>=28.1.0->chromadb) (0.6.1)\n", "Downloading chromadb-1.2.1-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (20.7 MB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m20.7/20.7 MB\u001b[0m \u001b[31m100.9 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading bcrypt-5.0.0-cp39-abi3-manylinux_2_34_x86_64.whl (278 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m278.2/278.2 kB\u001b[0m \u001b[31m23.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading kubernetes-34.1.0-py2.py3-none-any.whl (2.0 MB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m2.0/2.0 MB\u001b[0m \u001b[31m91.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading mmh3-5.2.0-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl (103 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m103.3/103.3 kB\u001b[0m \u001b[31m7.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading onnxruntime-1.23.2-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (17.4 MB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m17.4/17.4 MB\u001b[0m \u001b[31m96.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading opentelemetry_exporter_otlp_proto_grpc-1.38.0-py3-none-any.whl (19 kB)\n", "Downloading opentelemetry_exporter_otlp_proto_common-1.38.0-py3-none-any.whl (18 kB)\n", "Downloading opentelemetry_proto-1.38.0-py3-none-any.whl (72 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m72.5/72.5 kB\u001b[0m \u001b[31m6.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading opentelemetry_sdk-1.38.0-py3-none-any.whl (132 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m132.3/132.3 kB\u001b[0m \u001b[31m12.9 MB/s\u001b[0m eta 
\u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading opentelemetry_api-1.38.0-py3-none-any.whl (65 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m65.9/65.9 kB\u001b[0m \u001b[31m5.2 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading opentelemetry_semantic_conventions-0.59b0-py3-none-any.whl (207 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m208.0/208.0 kB\u001b[0m \u001b[31m18.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading posthog-5.4.0-py3-none-any.whl (105 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m105.4/105.4 kB\u001b[0m \u001b[31m10.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading pybase64-1.4.2-cp312-cp312-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl (71 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m71.6/71.6 kB\u001b[0m \u001b[31m6.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading backoff-2.2.1-py3-none-any.whl (15 kB)\n", "Downloading durationpy-0.10-py3-none-any.whl (3.9 kB)\n", "Downloading httptools-0.7.1-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl (517 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m517.7/517.7 kB\u001b[0m \u001b[31m41.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading protobuf-6.33.0-cp39-abi3-manylinux2014_x86_64.whl (323 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m323.2/323.2 kB\u001b[0m \u001b[31m29.9 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading urllib3-2.3.0-py3-none-any.whl (128 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m128.4/128.4 kB\u001b[0m \u001b[31m13.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading uvloop-0.22.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl (4.4 MB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m4.4/4.4 MB\u001b[0m \u001b[31m78.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading watchfiles-1.1.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (456 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m456.8/456.8 kB\u001b[0m \u001b[31m39.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading coloredlogs-15.0.1-py2.py3-none-any.whl (46 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m46.0/46.0 
kB\u001b[0m \u001b[31m3.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading humanfriendly-10.0-py2.py3-none-any.whl (86 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m86.8/86.8 kB\u001b[0m \u001b[31m8.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hBuilding wheels for collected packages: pypika\n", " Building wheel for pypika (pyproject.toml) ... \u001b[?25l\u001b[?25hdone\n", " Created wheel for pypika: filename=pypika-0.48.9-py2.py3-none-any.whl size=53803 sha256=516b39e779dee1661d0529c1b6cb9bb888d5692f8bd7d0d64965f2cec930571b\n", " Stored in directory: /root/.cache/pip/wheels/d5/3d/69/8d68d249cd3de2584f226e27fd431d6344f7d70fd856ebd01b\n", "Successfully built pypika\n", "Installing collected packages: pypika, durationpy, uvloop, urllib3, pybase64, protobuf, mmh3, humanfriendly, httptools, bcrypt, backoff, watchfiles, opentelemetry-proto, opentelemetry-api, coloredlogs, posthog, opentelemetry-semantic-conventions, opentelemetry-exporter-otlp-proto-common, onnxruntime, opentelemetry-sdk, kubernetes, opentelemetry-exporter-otlp-proto-grpc, chromadb\n", " Attempting uninstall: urllib3\n", " Found existing installation: urllib3 2.5.0\n", " Uninstalling urllib3-2.5.0:\n", " Successfully uninstalled urllib3-2.5.0\n", " Attempting uninstall: protobuf\n", " Found existing installation: protobuf 4.25.8\n", " Uninstalling protobuf-4.25.8:\n", " Successfully uninstalled protobuf-4.25.8\n", " Attempting uninstall: opentelemetry-proto\n", " Found existing installation: opentelemetry-proto 1.37.0\n", " Uninstalling opentelemetry-proto-1.37.0:\n", " Successfully uninstalled opentelemetry-proto-1.37.0\n", " Attempting uninstall: opentelemetry-api\n", " Found existing installation: opentelemetry-api 1.37.0\n", " Uninstalling opentelemetry-api-1.37.0:\n", " Successfully uninstalled opentelemetry-api-1.37.0\n", " Attempting uninstall: opentelemetry-semantic-conventions\n", " Found existing installation: opentelemetry-semantic-conventions 0.58b0\n", " Uninstalling opentelemetry-semantic-conventions-0.58b0:\n", " Successfully uninstalled opentelemetry-semantic-conventions-0.58b0\n", " Attempting uninstall: opentelemetry-exporter-otlp-proto-common\n", " Found existing installation: opentelemetry-exporter-otlp-proto-common 1.37.0\n", " Uninstalling opentelemetry-exporter-otlp-proto-common-1.37.0:\n", " Successfully uninstalled opentelemetry-exporter-otlp-proto-common-1.37.0\n", " Attempting uninstall: opentelemetry-sdk\n", " Found existing installation: opentelemetry-sdk 1.37.0\n", " Uninstalling opentelemetry-sdk-1.37.0:\n", " Successfully uninstalled opentelemetry-sdk-1.37.0\n", "\u001b[31mERROR: pip's dependency resolver does not currently take into account all the packages that are installed. 
This behaviour is the source of the following dependency conflicts.\n", "streamlit 1.28.1 requires protobuf<5,>=3.20, but you have protobuf 6.33.0 which is incompatible.\n", "tensorflow 2.19.0 requires protobuf!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<6.0.0dev,>=3.20.3, but you have protobuf 6.33.0 which is incompatible.\n", "google-adk 1.16.0 requires opentelemetry-api<=1.37.0,>=1.37.0, but you have opentelemetry-api 1.38.0 which is incompatible.\n", "google-adk 1.16.0 requires opentelemetry-sdk<=1.37.0,>=1.37.0, but you have opentelemetry-sdk 1.38.0 which is incompatible.\n", "opentelemetry-exporter-otlp-proto-http 1.37.0 requires opentelemetry-exporter-otlp-proto-common==1.37.0, but you have opentelemetry-exporter-otlp-proto-common 1.38.0 which is incompatible.\n", "opentelemetry-exporter-otlp-proto-http 1.37.0 requires opentelemetry-proto==1.37.0, but you have opentelemetry-proto 1.38.0 which is incompatible.\n", "opentelemetry-exporter-otlp-proto-http 1.37.0 requires opentelemetry-sdk~=1.37.0, but you have opentelemetry-sdk 1.38.0 which is incompatible.\n", "grpcio-status 1.71.2 requires protobuf<6.0dev,>=5.26.1, but you have protobuf 6.33.0 which is incompatible.\n", "google-ai-generativelanguage 0.6.15 requires protobuf!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<6.0.0dev,>=3.20.2, but you have protobuf 6.33.0 which is incompatible.\u001b[0m\u001b[31m\n", "\u001b[0mSuccessfully installed backoff-2.2.1 bcrypt-5.0.0 chromadb-1.2.1 coloredlogs-15.0.1 durationpy-0.10 httptools-0.7.1 humanfriendly-10.0 kubernetes-34.1.0 mmh3-5.2.0 onnxruntime-1.23.2 opentelemetry-api-1.38.0 opentelemetry-exporter-otlp-proto-common-1.38.0 opentelemetry-exporter-otlp-proto-grpc-1.38.0 opentelemetry-proto-1.38.0 opentelemetry-sdk-1.38.0 opentelemetry-semantic-conventions-0.59b0 posthog-5.4.0 protobuf-6.33.0 pybase64-1.4.2 pypika-0.48.9 urllib3-2.3.0 uvloop-0.22.1 watchfiles-1.1.1\n" ] }, { "data": { "application/vnd.colab-display-data+json": { "id": "c1f003adaad449fab5173c9cdfef809a", "pip_warning": { "packages": [ "google" ] } } }, "metadata": {}, "output_type": "display_data" } ], "source": [ "!pip install chromadb" ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "-JfcmDbM80N4", "outputId": "9166b955-0fbf-485e-fa95-2ea104a0eb58" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Installation successful!\n" ] } ], "source": [ "import streamlit, langchain, chromadb\n", "print(\"Installation successful!\")\n" ] }, { "cell_type": "markdown", "metadata": { "id": "1XFPdkQt8KGY" }, "source": [ "## ๐Ÿ”‘ API Key Setup\n", "\n", "Set up your Google Gemini API key. You can get one from [Google AI Studio](https://makersuite.google.com/app/apikey).\n" ] }, { "cell_type": "code", "execution_count": 2, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "Px01b50d8KGY", "outputId": "ddd2b336-842a-45f5-d376-5a718fc2ba0f" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "โœ… Gemini API key loaded successfully!\n" ] } ], "source": [ "# Set up Google Gemini API key\n", "import os\n", "from google.colab import userdata\n", "\n", "# Get API key from Colab secrets\n", "try:\n", " GEMINI_API_KEY = userdata.get('GEMINI_API_KEY')\n", " os.environ['GOOGLE_API_KEY'] = GEMINI_API_KEY\n", " print(\"โœ… Gemini API key loaded successfully!\")\n", "except Exception:\n", " print(\"โŒ Please add your Gemini API key to Colab secrets:\")\n", " print(\"1. 
Go to the key icon (๐Ÿ”‘) in the left sidebar\")\n", " print(\"2. Add a new secret with key 'GEMINI_API_KEY' and your API key as value\")\n", " print(\"3. Restart the runtime and run this cell again\")\n", "\n", " # Alternative: Set directly (not recommended for production)\n", " # GEMINI_API_KEY = \"your_api_key_here\"\n", " # os.environ['GOOGLE_API_KEY'] = GEMINI_API_KEY\n" ] }, { "cell_type": "markdown", "metadata": { "id": "BFfdwXKB8KGZ" }, "source": [ "## ๐Ÿ“š Step 1: Load Dataset from The Pile\n", "\n", "We'll load text data from The Pile dataset using Hugging Face's datasets library. We'll focus on ML/AI related content.\n" ] }, { "cell_type": "code", "execution_count": 3, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "R2yLLasv8KGZ", "outputId": "b7b1a4d9-71f7-4efb-c706-3d0844f5ab0c" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "๐Ÿ“š Loading The Pile dataset...\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "/usr/local/lib/python3.12/dist-packages/huggingface_hub/utils/_auth.py:94: UserWarning: \n", "The secret `HF_TOKEN` does not exist in your Colab secrets.\n", "To authenticate with the Hugging Face Hub, create a token in your settings tab (https://huggingface.co/settings/tokens), set it as secret in your Google Colab and restart your session.\n", "You will be able to reuse this secret in all of your notebooks.\n", "Please note that authentication is recommended but still optional to access public models or datasets.\n", " warnings.warn(\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "โŒ Error loading dataset: No (supported) data files found in EleutherAI/the_pile\n", "๐Ÿ”„ Using fallback sample data...\n", "โœ… Using 10 sample texts\n" ] } ], "source": [ "# Import required libraries\n", "import pandas as pd\n", "import numpy as np\n", "from datasets import load_dataset\n", "from tqdm import tqdm\n", "import re\n", "import os\n", "\n", "print(\"๐Ÿ“š Loading The Pile dataset...\")\n", "\n", "# Load a subset of The Pile dataset\n", "# We'll use a smaller subset for demonstration to avoid memory issues\n", "try:\n", " # Load a specific subset that contains ML/AI content\n", " dataset = load_dataset(\"EleutherAI/the_pile\", split=\"train\", streaming=True)\n", "\n", " # Take first 1000 samples for demonstration\n", " texts = []\n", " ml_keywords = ['machine learning', 'deep learning', 'neural network', 'artificial intelligence',\n", " 'algorithm', 'model', 'training', 'data', 'feature', 'classification',\n", " 'regression', 'clustering', 'optimization', 'gradient', 'tensor']\n", "\n", " print(\"๐Ÿ” Filtering ML/AI related content...\")\n", " count = 0\n", " for sample in tqdm(dataset, desc=\"Processing samples\"):\n", " if count >= 1000: # Limit to 1000 samples for Colab\n", " break\n", "\n", " text = sample['text']\n", " # Check if text contains ML/AI keywords\n", " if any(keyword in text.lower() for keyword in ml_keywords):\n", " # Clean and preprocess text\n", " text = re.sub(r'\\s+', ' ', text) # Remove extra whitespace\n", " text = text.strip()\n", "\n", " # Only keep texts that are reasonable length (not too short or too long)\n", " if 100 <= len(text) <= 2000:\n", " texts.append(text)\n", " count += 1\n", "\n", " print(f\"โœ… Loaded {len(texts)} ML/AI related text samples\")\n", "\n", "except Exception as e:\n", " print(f\"โŒ Error loading dataset: {e}\")\n", " print(\"๐Ÿ”„ Using fallback sample data...\")\n", "\n", " # Fallback sample data if The Pile is not accessible\n", " texts = [\n", " 
\"Machine learning is a subset of artificial intelligence that focuses on algorithms that can learn from data. Deep learning uses neural networks with multiple layers to process complex patterns in data.\",\n", " \"Neural networks are computing systems inspired by biological neural networks. They consist of interconnected nodes that process information using a connectionist approach.\",\n", " \"Supervised learning uses labeled training data to learn a mapping from inputs to outputs. Common algorithms include linear regression, decision trees, and support vector machines.\",\n", " \"Unsupervised learning finds hidden patterns in data without labeled examples. Clustering algorithms like K-means group similar data points together.\",\n", " \"Natural language processing combines computational linguistics with machine learning to help computers understand human language. It includes tasks like text classification and sentiment analysis.\",\n", " \"Computer vision enables machines to interpret and understand visual information from the world. It uses deep learning models like convolutional neural networks.\",\n", " \"Reinforcement learning is a type of machine learning where agents learn to make decisions by interacting with an environment and receiving rewards or penalties.\",\n", " \"Feature engineering is the process of selecting and transforming raw data into features that can be used by machine learning algorithms. Good features can significantly improve model performance.\",\n", " \"Cross-validation is a technique used to assess how well a machine learning model generalizes to new data. It involves splitting data into training and validation sets multiple times.\",\n", " \"Overfitting occurs when a model learns the training data too well and performs poorly on new data. 
Regularization techniques help prevent overfitting.\"\n", " ]\n", " print(f\"โœ… Using {len(texts)} sample texts\")\n" ] }, { "cell_type": "markdown", "metadata": { "id": "qh2CjHfG8KGa" }, "source": [ "## ๐Ÿง  Step 2: Initialize Embeddings and Vector Database\n", "\n", "We'll use sentence transformers for embeddings and Chroma for vector storage.\n" ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "R0bU_JZU8KGa", "outputId": "95cb186d-edf1-4a30-adee-c3ddc4b58753" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "๐Ÿง  Initializing embeddings model...\n", "โœ… Embedding model loaded!\n", "๐Ÿ—„๏ธ Setting up Chroma vector database...\n", "โœ… Created new collection: ml_ai_knowledge\n", "๐ŸŽฏ Vector database ready!\n" ] } ], "source": [ "# Initialize embeddings and vector database\n", "from sentence_transformers import SentenceTransformer\n", "import chromadb\n", "from chromadb.config import Settings\n", "\n", "print(\"๐Ÿง  Initializing embeddings model...\")\n", "\n", "# Use a lightweight sentence transformer model\n", "embedding_model = SentenceTransformer('all-MiniLM-L6-v2')\n", "print(\"โœ… Embedding model loaded!\")\n", "\n", "print(\"๐Ÿ—„๏ธ Setting up Chroma vector database...\")\n", "\n", "# Create Chroma client with persistent storage\n", "chroma_client = chromadb.Client(Settings(\n", " persist_directory=\"./chroma_db\",\n", " anonymized_telemetry=False\n", "))\n", "\n", "# Create or get collection\n", "collection_name = \"ml_ai_knowledge\"\n", "try:\n", " collection = chroma_client.get_collection(collection_name)\n", " print(f\"โœ… Found existing collection: {collection_name}\")\n", "except:\n", " collection = chroma_client.create_collection(\n", " name=collection_name,\n", " metadata={\"description\": \"ML/AI knowledge base from The Pile dataset\"}\n", " )\n", " print(f\"โœ… Created new collection: {collection_name}\")\n", "\n", "print(\"๐ŸŽฏ Vector database ready!\")\n" ] }, { "cell_type": "markdown", "metadata": { "id": "ZyBiQoRU8KGa" }, "source": [ "## ๐Ÿ“ Step 3: Process and Embed Text Data\n", "\n", "We'll chunk the text data and create embeddings for storage in Chroma.\n" ] }, { "cell_type": "code", "execution_count": 5, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "qPd0P6w98KGb", "outputId": "7e049617-131c-4278-fffa-a248fa78398a" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "๐Ÿ“ Processing and chunking text data...\n", "๐Ÿ“Š Current documents in collection: 0\n", "๐Ÿ”„ Adding new documents to collection...\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "Processing texts: 100%|โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆ| 10/10 [00:00<00:00, 82891.38it/s]\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "๐Ÿ“Š Created 10 text chunks\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "Adding to Chroma: 100%|โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆ| 1/1 [00:01<00:00, 1.07s/it]" ] }, { "name": "stdout", "output_type": "stream", "text": [ "โœ… All documents added to Chroma!\n", "๐Ÿ“Š Final document count: 10\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "\n" ] } ], "source": [ "# Process and embed text data\n", "import uuid\n", "from tqdm import tqdm\n", "\n", "def chunk_text(text, chunk_size=500, overlap=50):\n", " \"\"\"Split text into overlapping chunks\"\"\"\n", " words = text.split()\n", " chunks = []\n", "\n", " for i in range(0, len(words), chunk_size - overlap):\n", " chunk = ' 
'.join(words[i:i + chunk_size])\n", " if len(chunk.strip()) > 50: # Only keep substantial chunks\n", " chunks.append(chunk)\n", "\n", " return chunks\n", "\n", "print(\"๐Ÿ“ Processing and chunking text data...\")\n", "\n", "# Check if collection already has data\n", "existing_count = collection.count()\n", "print(f\"๐Ÿ“Š Current documents in collection: {existing_count}\")\n", "\n", "if existing_count == 0:\n", " print(\"๐Ÿ”„ Adding new documents to collection...\")\n", "\n", " all_chunks = []\n", " chunk_ids = []\n", " chunk_metadatas = []\n", "\n", " for i, text in enumerate(tqdm(texts, desc=\"Processing texts\")):\n", " chunks = chunk_text(text)\n", "\n", " for j, chunk in enumerate(chunks):\n", " chunk_id = f\"doc_{i}_chunk_{j}\"\n", " metadata = {\n", " \"source\": f\"the_pile_doc_{i}\",\n", " \"chunk_index\": j,\n", " \"total_chunks\": len(chunks),\n", " \"text_length\": len(chunk)\n", " }\n", "\n", " all_chunks.append(chunk)\n", " chunk_ids.append(chunk_id)\n", " chunk_metadatas.append(metadata)\n", "\n", " print(f\"๐Ÿ“Š Created {len(all_chunks)} text chunks\")\n", "\n", " # Add documents to Chroma in batches to avoid memory issues\n", " batch_size = 100\n", " for i in tqdm(range(0, len(all_chunks), batch_size), desc=\"Adding to Chroma\"):\n", " batch_chunks = all_chunks[i:i + batch_size]\n", " batch_ids = chunk_ids[i:i + batch_size]\n", " batch_metadatas = chunk_metadatas[i:i + batch_size]\n", "\n", " collection.add(\n", " documents=batch_chunks,\n", " ids=batch_ids,\n", " metadatas=batch_metadatas\n", " )\n", "\n", " print(\"โœ… All documents added to Chroma!\")\n", "else:\n", " print(\"โœ… Collection already contains data, skipping addition\")\n", "\n", "# Verify the collection\n", "final_count = collection.count()\n", "print(f\"๐Ÿ“Š Final document count: {final_count}\")\n" ] }, { "cell_type": "markdown", "metadata": { "id": "zb0xDTl98KGb" }, "source": [ "## ๐Ÿค– Step 4: Initialize Gemini Model\n", "\n", "Set up the Google Gemini 2.5 Flash model for text generation.\n" ] }, { "cell_type": "code", "execution_count": 7, "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 1000 }, "collapsed": true, "id": "dhQeFUB0_i6g", "outputId": "8f5e9ac9-2c34-45f8-9c12-d36d748a9a70" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Collecting langchain_google_genai\n", " Downloading langchain_google_genai-3.0.0-py3-none-any.whl.metadata (7.1 kB)\n", "Collecting langchain-core<2.0.0,>=1.0.0 (from langchain_google_genai)\n", " Downloading langchain_core-1.0.1-py3-none-any.whl.metadata (3.5 kB)\n", "Collecting google-ai-generativelanguage<1.0.0,>=0.7.0 (from langchain_google_genai)\n", " Downloading google_ai_generativelanguage-0.9.0-py3-none-any.whl.metadata (10 kB)\n", "Requirement already satisfied: pydantic<3.0.0,>=2.0.0 in /usr/local/lib/python3.12/dist-packages (from langchain_google_genai) (2.11.10)\n", "Collecting filetype<2.0.0,>=1.2.0 (from langchain_google_genai)\n", " Downloading filetype-1.2.0-py2.py3-none-any.whl.metadata (6.5 kB)\n", "Requirement already satisfied: google-api-core!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0,>=1.34.1 in /usr/local/lib/python3.12/dist-packages (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0,>=1.34.1->google-ai-generativelanguage<1.0.0,>=0.7.0->langchain_google_genai) (2.26.0)\n", "Requirement already satisfied: google-auth!=2.24.0,!=2.25.0,<3.0.0,>=2.14.1 in 
/usr/local/lib/python3.12/dist-packages (from google-ai-generativelanguage<1.0.0,>=0.7.0->langchain_google_genai) (2.38.0)\n", "Requirement already satisfied: grpcio<2.0.0,>=1.33.2 in /usr/local/lib/python3.12/dist-packages (from google-ai-generativelanguage<1.0.0,>=0.7.0->langchain_google_genai) (1.75.1)\n", "Requirement already satisfied: proto-plus<2.0.0,>=1.22.3 in /usr/local/lib/python3.12/dist-packages (from google-ai-generativelanguage<1.0.0,>=0.7.0->langchain_google_genai) (1.26.1)\n", "Requirement already satisfied: protobuf!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<7.0.0,>=3.20.2 in /usr/local/lib/python3.12/dist-packages (from google-ai-generativelanguage<1.0.0,>=0.7.0->langchain_google_genai) (6.33.0)\n", "Requirement already satisfied: jsonpatch<2.0.0,>=1.33.0 in /usr/local/lib/python3.12/dist-packages (from langchain-core<2.0.0,>=1.0.0->langchain_google_genai) (1.33)\n", "Collecting langsmith<1.0.0,>=0.3.45 (from langchain-core<2.0.0,>=1.0.0->langchain_google_genai)\n", " Downloading langsmith-0.4.38-py3-none-any.whl.metadata (14 kB)\n", "Requirement already satisfied: packaging<26.0.0,>=23.2.0 in /usr/local/lib/python3.12/dist-packages (from langchain-core<2.0.0,>=1.0.0->langchain_google_genai) (23.2)\n", "Requirement already satisfied: pyyaml<7.0.0,>=5.3.0 in /usr/local/lib/python3.12/dist-packages (from langchain-core<2.0.0,>=1.0.0->langchain_google_genai) (6.0.3)\n", "Requirement already satisfied: tenacity!=8.4.0,<10.0.0,>=8.1.0 in /usr/local/lib/python3.12/dist-packages (from langchain-core<2.0.0,>=1.0.0->langchain_google_genai) (8.5.0)\n", "Requirement already satisfied: typing-extensions<5.0.0,>=4.7.0 in /usr/local/lib/python3.12/dist-packages (from langchain-core<2.0.0,>=1.0.0->langchain_google_genai) (4.15.0)\n", "Requirement already satisfied: annotated-types>=0.6.0 in /usr/local/lib/python3.12/dist-packages (from pydantic<3.0.0,>=2.0.0->langchain_google_genai) (0.7.0)\n", "Requirement already satisfied: pydantic-core==2.33.2 in /usr/local/lib/python3.12/dist-packages (from pydantic<3.0.0,>=2.0.0->langchain_google_genai) (2.33.2)\n", "Requirement already satisfied: typing-inspection>=0.4.0 in /usr/local/lib/python3.12/dist-packages (from pydantic<3.0.0,>=2.0.0->langchain_google_genai) (0.4.2)\n", "Requirement already satisfied: googleapis-common-protos<2.0.0,>=1.56.2 in /usr/local/lib/python3.12/dist-packages (from google-api-core!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0,>=1.34.1->google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0,>=1.34.1->google-ai-generativelanguage<1.0.0,>=0.7.0->langchain_google_genai) (1.71.0)\n", "Requirement already satisfied: requests<3.0.0,>=2.18.0 in /usr/local/lib/python3.12/dist-packages (from google-api-core!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0,>=1.34.1->google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0,>=1.34.1->google-ai-generativelanguage<1.0.0,>=0.7.0->langchain_google_genai) (2.32.4)\n", "Requirement already satisfied: grpcio-status<2.0.0,>=1.33.2 in /usr/local/lib/python3.12/dist-packages (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0,>=1.34.1->google-ai-generativelanguage<1.0.0,>=0.7.0->langchain_google_genai) (1.71.2)\n", "Requirement already satisfied: cachetools<6.0,>=2.0.0 in 
/usr/local/lib/python3.12/dist-packages (from google-auth!=2.24.0,!=2.25.0,<3.0.0,>=2.14.1->google-ai-generativelanguage<1.0.0,>=0.7.0->langchain_google_genai) (5.5.2)\n", "Requirement already satisfied: pyasn1-modules>=0.2.1 in /usr/local/lib/python3.12/dist-packages (from google-auth!=2.24.0,!=2.25.0,<3.0.0,>=2.14.1->google-ai-generativelanguage<1.0.0,>=0.7.0->langchain_google_genai) (0.4.2)\n", "Requirement already satisfied: rsa<5,>=3.1.4 in /usr/local/lib/python3.12/dist-packages (from google-auth!=2.24.0,!=2.25.0,<3.0.0,>=2.14.1->google-ai-generativelanguage<1.0.0,>=0.7.0->langchain_google_genai) (4.9.1)\n", "Requirement already satisfied: jsonpointer>=1.9 in /usr/local/lib/python3.12/dist-packages (from jsonpatch<2.0.0,>=1.33.0->langchain-core<2.0.0,>=1.0.0->langchain_google_genai) (3.0.0)\n", "Requirement already satisfied: httpx<1,>=0.23.0 in /usr/local/lib/python3.12/dist-packages (from langsmith<1.0.0,>=0.3.45->langchain-core<2.0.0,>=1.0.0->langchain_google_genai) (0.28.1)\n", "Requirement already satisfied: orjson>=3.9.14 in /usr/local/lib/python3.12/dist-packages (from langsmith<1.0.0,>=0.3.45->langchain-core<2.0.0,>=1.0.0->langchain_google_genai) (3.11.3)\n", "Requirement already satisfied: requests-toolbelt>=1.0.0 in /usr/local/lib/python3.12/dist-packages (from langsmith<1.0.0,>=0.3.45->langchain-core<2.0.0,>=1.0.0->langchain_google_genai) (1.0.0)\n", "Requirement already satisfied: zstandard>=0.23.0 in /usr/local/lib/python3.12/dist-packages (from langsmith<1.0.0,>=0.3.45->langchain-core<2.0.0,>=1.0.0->langchain_google_genai) (0.25.0)\n", "Collecting protobuf!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<7.0.0,>=3.20.2 (from google-ai-generativelanguage<1.0.0,>=0.7.0->langchain_google_genai)\n", " Downloading protobuf-5.29.5-cp38-abi3-manylinux2014_x86_64.whl.metadata (592 bytes)\n", "Requirement already satisfied: anyio in /usr/local/lib/python3.12/dist-packages (from httpx<1,>=0.23.0->langsmith<1.0.0,>=0.3.45->langchain-core<2.0.0,>=1.0.0->langchain_google_genai) (4.11.0)\n", "Requirement already satisfied: certifi in /usr/local/lib/python3.12/dist-packages (from httpx<1,>=0.23.0->langsmith<1.0.0,>=0.3.45->langchain-core<2.0.0,>=1.0.0->langchain_google_genai) (2025.10.5)\n", "Requirement already satisfied: httpcore==1.* in /usr/local/lib/python3.12/dist-packages (from httpx<1,>=0.23.0->langsmith<1.0.0,>=0.3.45->langchain-core<2.0.0,>=1.0.0->langchain_google_genai) (1.0.9)\n", "Requirement already satisfied: idna in /usr/local/lib/python3.12/dist-packages (from httpx<1,>=0.23.0->langsmith<1.0.0,>=0.3.45->langchain-core<2.0.0,>=1.0.0->langchain_google_genai) (3.11)\n", "Requirement already satisfied: h11>=0.16 in /usr/local/lib/python3.12/dist-packages (from httpcore==1.*->httpx<1,>=0.23.0->langsmith<1.0.0,>=0.3.45->langchain-core<2.0.0,>=1.0.0->langchain_google_genai) (0.16.0)\n", "Requirement already satisfied: pyasn1<0.7.0,>=0.6.1 in /usr/local/lib/python3.12/dist-packages (from pyasn1-modules>=0.2.1->google-auth!=2.24.0,!=2.25.0,<3.0.0,>=2.14.1->google-ai-generativelanguage<1.0.0,>=0.7.0->langchain_google_genai) (0.6.1)\n", "Requirement already satisfied: charset_normalizer<4,>=2 in /usr/local/lib/python3.12/dist-packages (from 
requests<3.0.0,>=2.18.0->google-api-core!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0,>=1.34.1->google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0,>=1.34.1->google-ai-generativelanguage<1.0.0,>=0.7.0->langchain_google_genai) (3.4.4)\n", "Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/local/lib/python3.12/dist-packages (from requests<3.0.0,>=2.18.0->google-api-core!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0,>=1.34.1->google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0,>=1.34.1->google-ai-generativelanguage<1.0.0,>=0.7.0->langchain_google_genai) (2.3.0)\n", "Requirement already satisfied: sniffio>=1.1 in /usr/local/lib/python3.12/dist-packages (from anyio->httpx<1,>=0.23.0->langsmith<1.0.0,>=0.3.45->langchain-core<2.0.0,>=1.0.0->langchain_google_genai) (1.3.1)\n", "Downloading langchain_google_genai-3.0.0-py3-none-any.whl (57 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m57.8/57.8 kB\u001b[0m \u001b[31m2.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading filetype-1.2.0-py2.py3-none-any.whl (19 kB)\n", "Downloading google_ai_generativelanguage-0.9.0-py3-none-any.whl (1.4 MB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m1.4/1.4 MB\u001b[0m \u001b[31m28.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading langchain_core-1.0.1-py3-none-any.whl (467 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m467.1/467.1 kB\u001b[0m \u001b[31m36.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading langsmith-0.4.38-py3-none-any.whl (397 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m397.3/397.3 kB\u001b[0m \u001b[31m27.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading protobuf-5.29.5-cp38-abi3-manylinux2014_x86_64.whl (319 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m319.9/319.9 kB\u001b[0m \u001b[31m25.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hInstalling collected packages: filetype, protobuf, langsmith, langchain-core, google-ai-generativelanguage, langchain_google_genai\n", " Attempting uninstall: protobuf\n", " Found existing installation: protobuf 6.33.0\n", " Uninstalling protobuf-6.33.0:\n", " Successfully uninstalled protobuf-6.33.0\n", " Attempting uninstall: langsmith\n", " Found existing installation: langsmith 0.0.87\n", " Uninstalling langsmith-0.0.87:\n", " Successfully uninstalled langsmith-0.0.87\n", " Attempting uninstall: langchain-core\n", " Found existing installation: langchain-core 0.1.23\n", " Uninstalling langchain-core-0.1.23:\n", " Successfully uninstalled langchain-core-0.1.23\n", " Attempting uninstall: google-ai-generativelanguage\n", " Found existing installation: google-ai-generativelanguage 0.6.15\n", " Uninstalling 
google-ai-generativelanguage-0.6.15:\n", " Successfully uninstalled google-ai-generativelanguage-0.6.15\n", "\u001b[31mERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\n", "streamlit 1.28.1 requires protobuf<5,>=3.20, but you have protobuf 5.29.5 which is incompatible.\n", "langchain-community 0.0.20 requires langchain-core<0.2,>=0.1.21, but you have langchain-core 1.0.1 which is incompatible.\n", "langchain-community 0.0.20 requires langsmith<0.1,>=0.0.83, but you have langsmith 0.4.38 which is incompatible.\n", "langchain 0.1.0 requires langchain-core<0.2,>=0.1.7, but you have langchain-core 1.0.1 which is incompatible.\n", "langchain 0.1.0 requires langsmith<0.1.0,>=0.0.77, but you have langsmith 0.4.38 which is incompatible.\n", "google-generativeai 0.8.5 requires google-ai-generativelanguage==0.6.15, but you have google-ai-generativelanguage 0.9.0 which is incompatible.\n", "google-adk 1.16.0 requires opentelemetry-api<=1.37.0,>=1.37.0, but you have opentelemetry-api 1.38.0 which is incompatible.\n", "google-adk 1.16.0 requires opentelemetry-sdk<=1.37.0,>=1.37.0, but you have opentelemetry-sdk 1.38.0 which is incompatible.\n", "opentelemetry-exporter-otlp-proto-http 1.37.0 requires opentelemetry-exporter-otlp-proto-common==1.37.0, but you have opentelemetry-exporter-otlp-proto-common 1.38.0 which is incompatible.\n", "opentelemetry-exporter-otlp-proto-http 1.37.0 requires opentelemetry-proto==1.37.0, but you have opentelemetry-proto 1.38.0 which is incompatible.\n", "opentelemetry-exporter-otlp-proto-http 1.37.0 requires opentelemetry-sdk~=1.37.0, but you have opentelemetry-sdk 1.38.0 which is incompatible.\u001b[0m\u001b[31m\n", "\u001b[0mSuccessfully installed filetype-1.2.0 google-ai-generativelanguage-0.9.0 langchain-core-1.0.1 langchain_google_genai-3.0.0 langsmith-0.4.38 protobuf-5.29.5\n" ] }, { "data": { "application/vnd.colab-display-data+json": { "id": "7df3f39236514b8ea81591ff665a9afa", "pip_warning": { "packages": [ "google", "langchain_core" ] } } }, "metadata": {}, "output_type": "display_data" } ], "source": [ "!pip install langchain_google_genai" ] }, { "cell_type": "code", "execution_count": 12, "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 1000 }, "collapsed": true, "id": "W45gx35UBKJU", "outputId": "996b18bf-be33-4010-b9da-6fefb862e886" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Requirement already satisfied: langchain==0.1.0 in /usr/local/lib/python3.12/dist-packages (0.1.0)\n", "Requirement already satisfied: PyYAML>=5.3 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (6.0.3)\n", "Requirement already satisfied: SQLAlchemy<3,>=1.4 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (2.0.44)\n", "Requirement already satisfied: aiohttp<4.0.0,>=3.8.3 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (3.13.1)\n", "Requirement already satisfied: dataclasses-json<0.7,>=0.5.7 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (0.6.7)\n", "Requirement already satisfied: jsonpatch<2.0,>=1.33 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (1.33)\n", "Requirement already satisfied: langchain-community<0.1,>=0.0.9 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (0.0.20)\n", "Collecting langchain-core<0.2,>=0.1.7 (from langchain==0.1.0)\n", " Using cached 
langchain_core-0.1.53-py3-none-any.whl.metadata (5.9 kB)\n", "Collecting langsmith<0.1.0,>=0.0.77 (from langchain==0.1.0)\n", " Using cached langsmith-0.0.92-py3-none-any.whl.metadata (9.9 kB)\n", "Requirement already satisfied: numpy<2,>=1 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (1.26.4)\n", "Requirement already satisfied: pydantic<3,>=1 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (2.11.10)\n", "Requirement already satisfied: requests<3,>=2 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (2.32.4)\n", "Requirement already satisfied: tenacity<9.0.0,>=8.1.0 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (8.5.0)\n", "Requirement already satisfied: aiohappyeyeballs>=2.5.0 in /usr/local/lib/python3.12/dist-packages (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.0) (2.6.1)\n", "Requirement already satisfied: aiosignal>=1.4.0 in /usr/local/lib/python3.12/dist-packages (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.0) (1.4.0)\n", "Requirement already satisfied: attrs>=17.3.0 in /usr/local/lib/python3.12/dist-packages (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.0) (25.4.0)\n", "Requirement already satisfied: frozenlist>=1.1.1 in /usr/local/lib/python3.12/dist-packages (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.0) (1.8.0)\n", "Requirement already satisfied: multidict<7.0,>=4.5 in /usr/local/lib/python3.12/dist-packages (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.0) (6.7.0)\n", "Requirement already satisfied: propcache>=0.2.0 in /usr/local/lib/python3.12/dist-packages (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.0) (0.4.1)\n", "Requirement already satisfied: yarl<2.0,>=1.17.0 in /usr/local/lib/python3.12/dist-packages (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.0) (1.22.0)\n", "Requirement already satisfied: marshmallow<4.0.0,>=3.18.0 in /usr/local/lib/python3.12/dist-packages (from dataclasses-json<0.7,>=0.5.7->langchain==0.1.0) (3.26.1)\n", "Requirement already satisfied: typing-inspect<1,>=0.4.0 in /usr/local/lib/python3.12/dist-packages (from dataclasses-json<0.7,>=0.5.7->langchain==0.1.0) (0.9.0)\n", "Requirement already satisfied: jsonpointer>=1.9 in /usr/local/lib/python3.12/dist-packages (from jsonpatch<2.0,>=1.33->langchain==0.1.0) (3.0.0)\n", "INFO: pip is looking at multiple versions of langchain-core to determine which version is compatible with other requirements. This could take a while.\n", "Collecting langchain-core<0.2,>=0.1.7 (from langchain==0.1.0)\n", " Using cached langchain_core-0.1.52-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.51-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.50-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.49-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.48-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.47-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.46-py3-none-any.whl.metadata (5.9 kB)\n", "INFO: pip is still looking at multiple versions of langchain-core to determine which version is compatible with other requirements. 
This could take a while.\n", " Using cached langchain_core-0.1.45-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.44-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.43-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.42-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.41-py3-none-any.whl.metadata (5.9 kB)\n", "INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. See https://pip.pypa.io/warnings/backtracking for guidance. If you want to abort this run, press Ctrl + C.\n", " Using cached langchain_core-0.1.40-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.39-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.38-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.37-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.36-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.35-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.34-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.33-py3-none-any.whl.metadata (6.0 kB)\n", "Requirement already satisfied: anyio<5,>=3 in /usr/local/lib/python3.12/dist-packages (from langchain-core<0.2,>=0.1.7->langchain==0.1.0) (4.11.0)\n", " Using cached langchain_core-0.1.32-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.31-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.30-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.29-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.28-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.27-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.26-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.25-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.24-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.23-py3-none-any.whl.metadata (6.0 kB)\n", "Collecting langsmith<0.1.0,>=0.0.77 (from langchain==0.1.0)\n", " Using cached langsmith-0.0.87-py3-none-any.whl.metadata (10 kB)\n", "Requirement already satisfied: packaging<24.0,>=23.2 in /usr/local/lib/python3.12/dist-packages (from langchain-core<0.2,>=0.1.7->langchain==0.1.0) (23.2)\n", "Requirement already satisfied: annotated-types>=0.6.0 in /usr/local/lib/python3.12/dist-packages (from pydantic<3,>=1->langchain==0.1.0) (0.7.0)\n", "Requirement already satisfied: pydantic-core==2.33.2 in /usr/local/lib/python3.12/dist-packages (from pydantic<3,>=1->langchain==0.1.0) (2.33.2)\n", "Requirement already satisfied: typing-extensions>=4.12.2 in /usr/local/lib/python3.12/dist-packages (from pydantic<3,>=1->langchain==0.1.0) (4.15.0)\n", "Requirement already satisfied: typing-inspection>=0.4.0 in /usr/local/lib/python3.12/dist-packages (from pydantic<3,>=1->langchain==0.1.0) (0.4.2)\n", "Requirement already satisfied: charset_normalizer<4,>=2 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2->langchain==0.1.0) (3.4.4)\n", "Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2->langchain==0.1.0) (3.11)\n", "Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2->langchain==0.1.0) (2.3.0)\n", "Requirement already satisfied: 
certifi>=2017.4.17 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2->langchain==0.1.0) (2025.10.5)\n", "Requirement already satisfied: greenlet>=1 in /usr/local/lib/python3.12/dist-packages (from SQLAlchemy<3,>=1.4->langchain==0.1.0) (3.2.4)\n", "Requirement already satisfied: sniffio>=1.1 in /usr/local/lib/python3.12/dist-packages (from anyio<5,>=3->langchain-core<0.2,>=0.1.7->langchain==0.1.0) (1.3.1)\n", "Requirement already satisfied: mypy-extensions>=0.3.0 in /usr/local/lib/python3.12/dist-packages (from typing-inspect<1,>=0.4.0->dataclasses-json<0.7,>=0.5.7->langchain==0.1.0) (1.1.0)\n", "Using cached langchain_core-0.1.23-py3-none-any.whl (241 kB)\n", "Using cached langsmith-0.0.87-py3-none-any.whl (55 kB)\n", "Installing collected packages: langsmith, langchain-core\n", " Attempting uninstall: langsmith\n", " Found existing installation: langsmith 0.4.38\n", " Uninstalling langsmith-0.4.38:\n", " Successfully uninstalled langsmith-0.4.38\n", " Attempting uninstall: langchain-core\n", " Found existing installation: langchain-core 1.0.1\n", " Uninstalling langchain-core-1.0.1:\n", " Successfully uninstalled langchain-core-1.0.1\n", "\u001b[31mERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\n", "langchain-google-genai 3.0.0 requires langchain-core<2.0.0,>=1.0.0, but you have langchain-core 0.1.23 which is incompatible.\n", "langchain-text-splitters 0.3.11 requires langchain-core<2.0.0,>=0.3.75, but you have langchain-core 0.1.23 which is incompatible.\u001b[0m\u001b[31m\n", "\u001b[0mSuccessfully installed langchain-core-0.1.23 langsmith-0.0.87\n" ] }, { "data": { "application/vnd.colab-display-data+json": { "id": "0b12704309ed4801ae5cdee9a766a829", "pip_warning": { "packages": [ "langchain_core", "langsmith" ] } } }, "metadata": {}, "output_type": "display_data" }, { "name": "stdout", "output_type": "stream", "text": [ "Collecting langchain_google_genai==2.0.0\n", " Downloading langchain_google_genai-2.0.0-py3-none-any.whl.metadata (3.9 kB)\n", "Collecting google-generativeai<0.8.0,>=0.7.0 (from langchain_google_genai==2.0.0)\n", " Downloading google_generativeai-0.7.2-py3-none-any.whl.metadata (4.0 kB)\n", "Collecting langchain-core<0.4,>=0.3.0 (from langchain_google_genai==2.0.0)\n", " Downloading langchain_core-0.3.79-py3-none-any.whl.metadata (3.2 kB)\n", "Requirement already satisfied: pydantic<3,>=2 in /usr/local/lib/python3.12/dist-packages (from langchain_google_genai==2.0.0) (2.11.10)\n", "Collecting google-ai-generativelanguage==0.6.6 (from google-generativeai<0.8.0,>=0.7.0->langchain_google_genai==2.0.0)\n", " Downloading google_ai_generativelanguage-0.6.6-py3-none-any.whl.metadata (5.6 kB)\n", "Requirement already satisfied: google-api-core in /usr/local/lib/python3.12/dist-packages (from google-generativeai<0.8.0,>=0.7.0->langchain_google_genai==2.0.0) (2.26.0)\n", "Requirement already satisfied: google-api-python-client in /usr/local/lib/python3.12/dist-packages (from google-generativeai<0.8.0,>=0.7.0->langchain_google_genai==2.0.0) (2.185.0)\n", "Requirement already satisfied: google-auth>=2.15.0 in /usr/local/lib/python3.12/dist-packages (from google-generativeai<0.8.0,>=0.7.0->langchain_google_genai==2.0.0) (2.38.0)\n", "Requirement already satisfied: protobuf in /usr/local/lib/python3.12/dist-packages (from google-generativeai<0.8.0,>=0.7.0->langchain_google_genai==2.0.0) (5.29.5)\n", "Requirement already satisfied: tqdm in 
/usr/local/lib/python3.12/dist-packages (from google-generativeai<0.8.0,>=0.7.0->langchain_google_genai==2.0.0) (4.67.1)\n", "Requirement already satisfied: typing-extensions in /usr/local/lib/python3.12/dist-packages (from google-generativeai<0.8.0,>=0.7.0->langchain_google_genai==2.0.0) (4.15.0)\n", "Requirement already satisfied: proto-plus<2.0.0dev,>=1.22.3 in /usr/local/lib/python3.12/dist-packages (from google-ai-generativelanguage==0.6.6->google-generativeai<0.8.0,>=0.7.0->langchain_google_genai==2.0.0) (1.26.1)\n", "\u001b[31mERROR: Operation cancelled by user\u001b[0m\u001b[31m\n", "\u001b[0mTraceback (most recent call last):\n", " File \"/usr/local/lib/python3.12/dist-packages/pip/_internal/cli/base_command.py\", line 179, in exc_logging_wrapper\n", " status = run_func(*args)\n", " ^^^^^^^^^^^^^^^\n", " File \"/usr/local/lib/python3.12/dist-packages/pip/_internal/cli/req_command.py\", line 67, in wrapper\n", " return func(self, options, args)\n", " ^^^^^^^^^^^^^^^^^^^^^^^^^\n", " File \"/usr/local/lib/python3.12/dist-packages/pip/_internal/commands/install.py\", line 377, in run\n", "^C\n" ] } ], "source": [ "!pip install langchain==0.1.0\n", "!pip install langchain_google_genai==2.0.0" ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "collapsed": true, "id": "u3YDEYVaCE9m", "outputId": "74f9c2fd-6811-44aa-96ea-e55b90d36a49" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Found existing installation: langchain 0.1.0\n", "Uninstalling langchain-0.1.0:\n", " Successfully uninstalled langchain-0.1.0\n", "Found existing installation: langchain-core 0.1.23\n", "Uninstalling langchain-core-0.1.23:\n", " Successfully uninstalled langchain-core-0.1.23\n", "Found existing installation: langsmith 0.0.87\n", "Uninstalling langsmith-0.0.87:\n", " Successfully uninstalled langsmith-0.0.87\n", "\u001b[33mWARNING: Skipping langchain_google_genai as it is not installed.\u001b[0m\u001b[33m\n", "\u001b[0m" ] } ], "source": [ "!pip uninstall -y langchain langchain-core langsmith langchain_google_genai" ] }, { "cell_type": "code", "execution_count": 2, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "collapsed": true, "id": "7D95aJhGCHx8", "outputId": "2281333c-b4b9-48d8-8410-97019aac947b" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Collecting langchain==0.1.0\n", " Using cached langchain-0.1.0-py3-none-any.whl.metadata (13 kB)\n", "Requirement already satisfied: PyYAML>=5.3 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (6.0.3)\n", "Requirement already satisfied: SQLAlchemy<3,>=1.4 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (2.0.44)\n", "Requirement already satisfied: aiohttp<4.0.0,>=3.8.3 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (3.13.1)\n", "Requirement already satisfied: dataclasses-json<0.7,>=0.5.7 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (0.6.7)\n", "Requirement already satisfied: jsonpatch<2.0,>=1.33 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (1.33)\n", "Requirement already satisfied: langchain-community<0.1,>=0.0.9 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (0.0.20)\n", "Collecting langchain-core<0.2,>=0.1.7 (from langchain==0.1.0)\n", " Using cached langchain_core-0.1.53-py3-none-any.whl.metadata (5.9 kB)\n", "Collecting langsmith<0.1.0,>=0.0.77 (from langchain==0.1.0)\n", " Using cached 
langsmith-0.0.92-py3-none-any.whl.metadata (9.9 kB)\n", "Requirement already satisfied: numpy<2,>=1 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (1.26.4)\n", "Requirement already satisfied: pydantic<3,>=1 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (2.11.10)\n", "Requirement already satisfied: requests<3,>=2 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (2.32.4)\n", "Requirement already satisfied: tenacity<9.0.0,>=8.1.0 in /usr/local/lib/python3.12/dist-packages (from langchain==0.1.0) (8.5.0)\n", "Requirement already satisfied: aiohappyeyeballs>=2.5.0 in /usr/local/lib/python3.12/dist-packages (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.0) (2.6.1)\n", "Requirement already satisfied: aiosignal>=1.4.0 in /usr/local/lib/python3.12/dist-packages (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.0) (1.4.0)\n", "Requirement already satisfied: attrs>=17.3.0 in /usr/local/lib/python3.12/dist-packages (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.0) (25.4.0)\n", "Requirement already satisfied: frozenlist>=1.1.1 in /usr/local/lib/python3.12/dist-packages (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.0) (1.8.0)\n", "Requirement already satisfied: multidict<7.0,>=4.5 in /usr/local/lib/python3.12/dist-packages (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.0) (6.7.0)\n", "Requirement already satisfied: propcache>=0.2.0 in /usr/local/lib/python3.12/dist-packages (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.0) (0.4.1)\n", "Requirement already satisfied: yarl<2.0,>=1.17.0 in /usr/local/lib/python3.12/dist-packages (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.0) (1.22.0)\n", "Requirement already satisfied: marshmallow<4.0.0,>=3.18.0 in /usr/local/lib/python3.12/dist-packages (from dataclasses-json<0.7,>=0.5.7->langchain==0.1.0) (3.26.1)\n", "Requirement already satisfied: typing-inspect<1,>=0.4.0 in /usr/local/lib/python3.12/dist-packages (from dataclasses-json<0.7,>=0.5.7->langchain==0.1.0) (0.9.0)\n", "Requirement already satisfied: jsonpointer>=1.9 in /usr/local/lib/python3.12/dist-packages (from jsonpatch<2.0,>=1.33->langchain==0.1.0) (3.0.0)\n", "INFO: pip is looking at multiple versions of langchain-core to determine which version is compatible with other requirements. This could take a while.\n", "Collecting langchain-core<0.2,>=0.1.7 (from langchain==0.1.0)\n", " Using cached langchain_core-0.1.52-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.51-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.50-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.49-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.48-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.47-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.46-py3-none-any.whl.metadata (5.9 kB)\n", "INFO: pip is still looking at multiple versions of langchain-core to determine which version is compatible with other requirements. This could take a while.\n", " Using cached langchain_core-0.1.45-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.44-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.43-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.42-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.41-py3-none-any.whl.metadata (5.9 kB)\n", "INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. 
See https://pip.pypa.io/warnings/backtracking for guidance. If you want to abort this run, press Ctrl + C.\n", " Using cached langchain_core-0.1.40-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.39-py3-none-any.whl.metadata (5.9 kB)\n", " Using cached langchain_core-0.1.38-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.37-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.36-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.35-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.34-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.33-py3-none-any.whl.metadata (6.0 kB)\n", "Requirement already satisfied: anyio<5,>=3 in /usr/local/lib/python3.12/dist-packages (from langchain-core<0.2,>=0.1.7->langchain==0.1.0) (4.11.0)\n", " Using cached langchain_core-0.1.32-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.31-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.30-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.29-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.28-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.27-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.26-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.25-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.24-py3-none-any.whl.metadata (6.0 kB)\n", " Using cached langchain_core-0.1.23-py3-none-any.whl.metadata (6.0 kB)\n", "Collecting langsmith<0.1.0,>=0.0.77 (from langchain==0.1.0)\n", " Using cached langsmith-0.0.87-py3-none-any.whl.metadata (10 kB)\n", "Requirement already satisfied: packaging<24.0,>=23.2 in /usr/local/lib/python3.12/dist-packages (from langchain-core<0.2,>=0.1.7->langchain==0.1.0) (23.2)\n", "Requirement already satisfied: annotated-types>=0.6.0 in /usr/local/lib/python3.12/dist-packages (from pydantic<3,>=1->langchain==0.1.0) (0.7.0)\n", "Requirement already satisfied: pydantic-core==2.33.2 in /usr/local/lib/python3.12/dist-packages (from pydantic<3,>=1->langchain==0.1.0) (2.33.2)\n", "Requirement already satisfied: typing-extensions>=4.12.2 in /usr/local/lib/python3.12/dist-packages (from pydantic<3,>=1->langchain==0.1.0) (4.15.0)\n", "Requirement already satisfied: typing-inspection>=0.4.0 in /usr/local/lib/python3.12/dist-packages (from pydantic<3,>=1->langchain==0.1.0) (0.4.2)\n", "Requirement already satisfied: charset_normalizer<4,>=2 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2->langchain==0.1.0) (3.4.4)\n", "Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2->langchain==0.1.0) (3.11)\n", "Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2->langchain==0.1.0) (2.3.0)\n", "Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2->langchain==0.1.0) (2025.10.5)\n", "Requirement already satisfied: greenlet>=1 in /usr/local/lib/python3.12/dist-packages (from SQLAlchemy<3,>=1.4->langchain==0.1.0) (3.2.4)\n", "Requirement already satisfied: sniffio>=1.1 in /usr/local/lib/python3.12/dist-packages (from anyio<5,>=3->langchain-core<0.2,>=0.1.7->langchain==0.1.0) (1.3.1)\n", "Requirement already satisfied: mypy-extensions>=0.3.0 in /usr/local/lib/python3.12/dist-packages (from 
typing-inspect<1,>=0.4.0->dataclasses-json<0.7,>=0.5.7->langchain==0.1.0) (1.1.0)\n", "Using cached langchain-0.1.0-py3-none-any.whl (797 kB)\n", "Using cached langchain_core-0.1.23-py3-none-any.whl (241 kB)\n", "Using cached langsmith-0.0.87-py3-none-any.whl (55 kB)\n", "Installing collected packages: langsmith, langchain-core, langchain\n", "\u001b[31mERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\n", "langchain-text-splitters 0.3.11 requires langchain-core<2.0.0,>=0.3.75, but you have langchain-core 0.1.23 which is incompatible.\u001b[0m\u001b[31m\n", "\u001b[0mSuccessfully installed langchain-0.1.0 langchain-core-0.1.23 langsmith-0.0.87\n", "\u001b[31mERROR: Could not find a version that satisfies the requirement langchain_google_genai==1.0.0 (from versions: 0.0.1rc0, 0.0.1, 0.0.2, 0.0.3, 0.0.4, 0.0.5, 0.0.6, 0.0.7, 0.0.8, 0.0.9, 0.0.10rc0, 0.0.11, 1.0.1, 1.0.2, 1.0.3, 1.0.4, 1.0.5, 1.0.6, 1.0.7, 1.0.8, 1.0.9, 1.0.10, 2.0.0.dev1, 2.0.0, 2.0.1, 2.0.2, 2.0.3, 2.0.4, 2.0.5, 2.0.6, 2.0.7, 2.0.8, 2.0.9, 2.0.10, 2.0.11, 2.1.0, 2.1.1, 2.1.2, 2.1.3, 2.1.4, 2.1.5, 2.1.6, 2.1.7, 2.1.8, 2.1.9, 2.1.10, 2.1.11, 2.1.12, 3.0.0a1, 3.0.0rc1, 3.0.0)\u001b[0m\u001b[31m\n", "\u001b[0m\u001b[31mERROR: No matching distribution found for langchain_google_genai==1.0.0\u001b[0m\u001b[31m\n", "\u001b[0m" ] } ], "source": [ "!pip install langchain==0.1.0\n", "!pip install langchain_google_genai==1.0.0\n" ] }, { "cell_type": "code", "execution_count": 3, "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 1000 }, "collapsed": true, "id": "yUKfAmX5CVCA", "outputId": "16ca9815-a269-48eb-9e6b-8241b2d888e0" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\u001b[33mWARNING: Skipping langchain_google_genai as it is not installed.\u001b[0m\u001b[33m\n", "\u001b[0mCollecting langchain_google_genai==1.0.1\n", " Downloading langchain_google_genai-1.0.1-py3-none-any.whl.metadata (3.8 kB)\n", "Collecting google-generativeai<0.5.0,>=0.4.1 (from langchain_google_genai==1.0.1)\n", " Downloading google_generativeai-0.4.1-py3-none-any.whl.metadata (6.2 kB)\n", "Requirement already satisfied: langchain-core<0.2,>=0.1 in /usr/local/lib/python3.12/dist-packages (from langchain_google_genai==1.0.1) (0.1.23)\n", "Collecting google-ai-generativelanguage==0.4.0 (from google-generativeai<0.5.0,>=0.4.1->langchain_google_genai==1.0.1)\n", " Downloading google_ai_generativelanguage-0.4.0-py3-none-any.whl.metadata (5.1 kB)\n", "Requirement already satisfied: google-auth>=2.15.0 in /usr/local/lib/python3.12/dist-packages (from google-generativeai<0.5.0,>=0.4.1->langchain_google_genai==1.0.1) (2.38.0)\n", "Requirement already satisfied: google-api-core in /usr/local/lib/python3.12/dist-packages (from google-generativeai<0.5.0,>=0.4.1->langchain_google_genai==1.0.1) (2.26.0)\n", "Requirement already satisfied: protobuf in /usr/local/lib/python3.12/dist-packages (from google-generativeai<0.5.0,>=0.4.1->langchain_google_genai==1.0.1) (5.29.5)\n", "Requirement already satisfied: pydantic in /usr/local/lib/python3.12/dist-packages (from google-generativeai<0.5.0,>=0.4.1->langchain_google_genai==1.0.1) (2.11.10)\n", "Requirement already satisfied: tqdm in /usr/local/lib/python3.12/dist-packages (from google-generativeai<0.5.0,>=0.4.1->langchain_google_genai==1.0.1) (4.67.1)\n", "Requirement already satisfied: typing-extensions in /usr/local/lib/python3.12/dist-packages (from 
google-generativeai<0.5.0,>=0.4.1->langchain_google_genai==1.0.1) (4.15.0)\n", "Requirement already satisfied: proto-plus<2.0.0dev,>=1.22.3 in /usr/local/lib/python3.12/dist-packages (from google-ai-generativelanguage==0.4.0->google-generativeai<0.5.0,>=0.4.1->langchain_google_genai==1.0.1) (1.26.1)\n", "Collecting protobuf (from google-generativeai<0.5.0,>=0.4.1->langchain_google_genai==1.0.1)\n", " Using cached protobuf-4.25.8-cp37-abi3-manylinux2014_x86_64.whl.metadata (541 bytes)\n", "Requirement already satisfied: PyYAML>=5.3 in /usr/local/lib/python3.12/dist-packages (from langchain-core<0.2,>=0.1->langchain_google_genai==1.0.1) (6.0.3)\n", "Requirement already satisfied: anyio<5,>=3 in /usr/local/lib/python3.12/dist-packages (from langchain-core<0.2,>=0.1->langchain_google_genai==1.0.1) (4.11.0)\n", "Requirement already satisfied: jsonpatch<2.0,>=1.33 in /usr/local/lib/python3.12/dist-packages (from langchain-core<0.2,>=0.1->langchain_google_genai==1.0.1) (1.33)\n", "Requirement already satisfied: langsmith<0.0.88,>=0.0.87 in /usr/local/lib/python3.12/dist-packages (from langchain-core<0.2,>=0.1->langchain_google_genai==1.0.1) (0.0.87)\n", "Requirement already satisfied: packaging<24.0,>=23.2 in /usr/local/lib/python3.12/dist-packages (from langchain-core<0.2,>=0.1->langchain_google_genai==1.0.1) (23.2)\n", "Requirement already satisfied: requests<3,>=2 in /usr/local/lib/python3.12/dist-packages (from langchain-core<0.2,>=0.1->langchain_google_genai==1.0.1) (2.32.4)\n", "Requirement already satisfied: tenacity<9.0.0,>=8.1.0 in /usr/local/lib/python3.12/dist-packages (from langchain-core<0.2,>=0.1->langchain_google_genai==1.0.1) (8.5.0)\n", "Requirement already satisfied: idna>=2.8 in /usr/local/lib/python3.12/dist-packages (from anyio<5,>=3->langchain-core<0.2,>=0.1->langchain_google_genai==1.0.1) (3.11)\n", "Requirement already satisfied: sniffio>=1.1 in /usr/local/lib/python3.12/dist-packages (from anyio<5,>=3->langchain-core<0.2,>=0.1->langchain_google_genai==1.0.1) (1.3.1)\n", "Requirement already satisfied: googleapis-common-protos<2.0.0,>=1.56.2 in /usr/local/lib/python3.12/dist-packages (from google-api-core->google-generativeai<0.5.0,>=0.4.1->langchain_google_genai==1.0.1) (1.71.0)\n", "Requirement already satisfied: cachetools<6.0,>=2.0.0 in /usr/local/lib/python3.12/dist-packages (from google-auth>=2.15.0->google-generativeai<0.5.0,>=0.4.1->langchain_google_genai==1.0.1) (5.5.2)\n", "Requirement already satisfied: pyasn1-modules>=0.2.1 in /usr/local/lib/python3.12/dist-packages (from google-auth>=2.15.0->google-generativeai<0.5.0,>=0.4.1->langchain_google_genai==1.0.1) (0.4.2)\n", "Requirement already satisfied: rsa<5,>=3.1.4 in /usr/local/lib/python3.12/dist-packages (from google-auth>=2.15.0->google-generativeai<0.5.0,>=0.4.1->langchain_google_genai==1.0.1) (4.9.1)\n", "Requirement already satisfied: jsonpointer>=1.9 in /usr/local/lib/python3.12/dist-packages (from jsonpatch<2.0,>=1.33->langchain-core<0.2,>=0.1->langchain_google_genai==1.0.1) (3.0.0)\n", "Requirement already satisfied: annotated-types>=0.6.0 in /usr/local/lib/python3.12/dist-packages (from pydantic->google-generativeai<0.5.0,>=0.4.1->langchain_google_genai==1.0.1) (0.7.0)\n", "Requirement already satisfied: pydantic-core==2.33.2 in /usr/local/lib/python3.12/dist-packages (from pydantic->google-generativeai<0.5.0,>=0.4.1->langchain_google_genai==1.0.1) (2.33.2)\n", "Requirement already satisfied: typing-inspection>=0.4.0 in /usr/local/lib/python3.12/dist-packages (from 
pydantic->google-generativeai<0.5.0,>=0.4.1->langchain_google_genai==1.0.1) (0.4.2)\n", "Requirement already satisfied: charset_normalizer<4,>=2 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2->langchain-core<0.2,>=0.1->langchain_google_genai==1.0.1) (3.4.4)\n", "Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2->langchain-core<0.2,>=0.1->langchain_google_genai==1.0.1) (2.3.0)\n", "Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2->langchain-core<0.2,>=0.1->langchain_google_genai==1.0.1) (2025.10.5)\n", "Requirement already satisfied: grpcio<2.0.0,>=1.33.2 in /usr/local/lib/python3.12/dist-packages (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0dev,>=1.34.0->google-ai-generativelanguage==0.4.0->google-generativeai<0.5.0,>=0.4.1->langchain_google_genai==1.0.1) (1.75.1)\n", "Requirement already satisfied: grpcio-status<2.0.0,>=1.33.2 in /usr/local/lib/python3.12/dist-packages (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0dev,>=1.34.0->google-ai-generativelanguage==0.4.0->google-generativeai<0.5.0,>=0.4.1->langchain_google_genai==1.0.1) (1.71.2)\n", "Requirement already satisfied: pyasn1<0.7.0,>=0.6.1 in /usr/local/lib/python3.12/dist-packages (from pyasn1-modules>=0.2.1->google-auth>=2.15.0->google-generativeai<0.5.0,>=0.4.1->langchain_google_genai==1.0.1) (0.6.1)\n", "INFO: pip is looking at multiple versions of grpcio-status to determine which version is compatible with other requirements. This could take a while.\n", "Collecting grpcio-status<2.0.0,>=1.33.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0dev,>=1.34.0->google-ai-generativelanguage==0.4.0->google-generativeai<0.5.0,>=0.4.1->langchain_google_genai==1.0.1)\n", " Downloading grpcio_status-1.76.0-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.75.1-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.75.0-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.74.0-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.73.1-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.73.0-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.72.2-py3-none-any.whl.metadata (1.1 kB)\n", "INFO: pip is still looking at multiple versions of grpcio-status to determine which version is compatible with other requirements. This could take a while.\n", " Downloading grpcio_status-1.72.1-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.71.0-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.70.0-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.69.0-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.68.1-py3-none-any.whl.metadata (1.1 kB)\n", "INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. See https://pip.pypa.io/warnings/backtracking for guidance. 
If you want to abort this run, press Ctrl + C.\n", " Downloading grpcio_status-1.68.0-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.67.1-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.67.0-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.66.2-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.66.1-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.66.0-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.65.5-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.65.4-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.65.2-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.65.1-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.64.3-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.64.1-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.64.0-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.63.2-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.63.0-py3-none-any.whl.metadata (1.1 kB)\n", " Downloading grpcio_status-1.62.3-py3-none-any.whl.metadata (1.3 kB)\n", "Downloading langchain_google_genai-1.0.1-py3-none-any.whl (28 kB)\n", "Downloading google_generativeai-0.4.1-py3-none-any.whl (137 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m137.4/137.4 kB\u001b[0m \u001b[31m6.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading google_ai_generativelanguage-0.4.0-py3-none-any.whl (598 kB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m598.7/598.7 kB\u001b[0m \u001b[31m21.9 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hUsing cached protobuf-4.25.8-cp37-abi3-manylinux2014_x86_64.whl (294 kB)\n", "Downloading grpcio_status-1.62.3-py3-none-any.whl (14 kB)\n", "Installing collected packages: protobuf, grpcio-status, google-ai-generativelanguage, google-generativeai, langchain_google_genai\n", " Attempting uninstall: protobuf\n", " Found existing installation: protobuf 5.29.5\n", " Uninstalling protobuf-5.29.5:\n", " Successfully uninstalled protobuf-5.29.5\n", " Attempting uninstall: grpcio-status\n", " Found existing installation: grpcio-status 1.71.2\n", " Uninstalling grpcio-status-1.71.2:\n", " Successfully uninstalled grpcio-status-1.71.2\n", " Attempting uninstall: google-ai-generativelanguage\n", " Found existing installation: google-ai-generativelanguage 0.9.0\n", " Uninstalling google-ai-generativelanguage-0.9.0:\n", " Successfully uninstalled google-ai-generativelanguage-0.9.0\n", " Attempting uninstall: google-generativeai\n", " Found existing installation: google-generativeai 0.8.5\n", " Uninstalling google-generativeai-0.8.5:\n", " Successfully uninstalled google-generativeai-0.8.5\n", "\u001b[31mERROR: pip's dependency resolver does not currently take into account all the packages that are installed. 
This behaviour is the source of the following dependency conflicts.\n", "opentelemetry-proto 1.38.0 requires protobuf<7.0,>=5.0, but you have protobuf 4.25.8 which is incompatible.\n", "google-adk 1.16.0 requires opentelemetry-api<=1.37.0,>=1.37.0, but you have opentelemetry-api 1.38.0 which is incompatible.\n", "google-adk 1.16.0 requires opentelemetry-sdk<=1.37.0,>=1.37.0, but you have opentelemetry-sdk 1.38.0 which is incompatible.\n", "ydf 0.13.0 requires protobuf<7.0.0,>=5.29.1, but you have protobuf 4.25.8 which is incompatible.\n", "opentelemetry-exporter-otlp-proto-http 1.37.0 requires opentelemetry-exporter-otlp-proto-common==1.37.0, but you have opentelemetry-exporter-otlp-proto-common 1.38.0 which is incompatible.\n", "opentelemetry-exporter-otlp-proto-http 1.37.0 requires opentelemetry-proto==1.37.0, but you have opentelemetry-proto 1.38.0 which is incompatible.\n", "opentelemetry-exporter-otlp-proto-http 1.37.0 requires opentelemetry-sdk~=1.37.0, but you have opentelemetry-sdk 1.38.0 which is incompatible.\u001b[0m\u001b[31m\n", "\u001b[0mSuccessfully installed google-ai-generativelanguage-0.4.0 google-generativeai-0.4.1 grpcio-status-1.62.3 langchain_google_genai-1.0.1 protobuf-4.25.8\n" ] }, { "data": { "application/vnd.colab-display-data+json": { "id": "901a9c7b1abe4dbb929d45ecfe93b5ee", "pip_warning": { "packages": [ "google" ] } } }, "metadata": {}, "output_type": "display_data" } ], "source": [ "!pip uninstall -y langchain_google_genai\n", "!pip install langchain_google_genai==1.0.1" ] }, { "cell_type": "code", "execution_count": 6, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "v1wPtvTv8KGb", "outputId": "aa9260db-5070-4690-a403-406e9ec45f27" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "๐Ÿค– Initializing Gemini 2.5 Flash model...\n", "โœ… Gemini model initialized!\n", "๐Ÿงช Test response: Okay, let's dive into the fascinating world of machine learning!\n", "\n", "**What is Machine Learning (ML)?**...\n", "โœ… Gemini model is working!\n" ] } ], "source": [ "# Initialize Gemini model\n", "from langchain_google_genai import ChatGoogleGenerativeAI\n", "from langchain.schema import HumanMessage, SystemMessage\n", "\n", "print(\"๐Ÿค– Initializing Gemini 2.5 Flash model...\")\n", "\n", "# Initialize the Gemini model\n", "llm = ChatGoogleGenerativeAI(\n", " model=\"gemini-2.0-flash-exp\", # Using the latest available model\n", " temperature=0.7,\n", " max_output_tokens=1024,\n", " convert_system_message_to_human=True\n", ")\n", "\n", "print(\"โœ… Gemini model initialized!\")\n", "\n", "# Test the model\n", "try:\n", " test_response = llm.invoke(\"Hello! 
Can you tell me about machine learning?\")\n", " print(\"๐Ÿงช Test response:\", test_response.content[:100] + \"...\")\n", " print(\"โœ… Gemini model is working!\")\n", "except Exception as e:\n", " print(f\"โŒ Error testing Gemini model: {e}\")\n", " print(\"Please check your API key and try again.\")\n" ] }, { "cell_type": "markdown", "metadata": { "id": "RPMySqbW8KGb" }, "source": [ "## ๐Ÿ” Step 5: Create RAG Pipeline\n", "\n", "Now we'll create the complete RAG pipeline that retrieves relevant context and generates answers.\n" ] }, { "cell_type": "code", "execution_count": 7, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "pL3kfIAD8KGb", "outputId": "804d3325-f7a3-4ff0-e20b-bb8885c306b1" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "โœ… RAG pipeline created!\n" ] } ], "source": [ "# Create RAG pipeline\n", "def retrieve_relevant_docs(query, n_results=5):\n", " \"\"\"Retrieve relevant documents from Chroma\"\"\"\n", " try:\n", " results = collection.query(\n", " query_texts=[query],\n", " n_results=n_results\n", " )\n", "\n", " # Extract documents and metadata\n", " documents = results['documents'][0]\n", " metadatas = results['metadatas'][0]\n", " distances = results['distances'][0]\n", "\n", " return documents, metadatas, distances\n", " except Exception as e:\n", " print(f\"Error retrieving documents: {e}\")\n", " return [], [], []\n", "\n", "def create_context(documents):\n", " \"\"\"Create context string from retrieved documents\"\"\"\n", " context = \"\\n\\n\".join(documents)\n", " return context\n", "\n", "def generate_answer(query, context):\n", " \"\"\"Generate answer using Gemini with retrieved context\"\"\"\n", " system_prompt = \"\"\"You are an AI assistant specialized in machine learning, deep learning, and artificial intelligence.\n", " Use the provided context to answer questions accurately and comprehensively. 
If the context doesn't contain enough\n", " information, you can supplement with your general knowledge, but always prioritize the provided context.\n", "\n", " Provide clear, well-structured answers with examples when appropriate.\"\"\"\n", "\n", " user_prompt = f\"\"\"Context:\n", " {context}\n", "\n", " Question: {query}\n", "\n", " Please provide a comprehensive answer based on the context above.\"\"\"\n", "\n", " try:\n", " messages = [\n", " SystemMessage(content=system_prompt),\n", " HumanMessage(content=user_prompt)\n", " ]\n", "\n", " response = llm.invoke(messages)\n", " return response.content\n", " except Exception as e:\n", " return f\"Error generating answer: {e}\"\n", "\n", "def rag_pipeline(query, n_results=5):\n", " \"\"\"Complete RAG pipeline\"\"\"\n", " print(f\"๐Ÿ” Processing query: '{query}'\")\n", "\n", " # Retrieve relevant documents\n", " documents, metadatas, distances = retrieve_relevant_docs(query, n_results)\n", "\n", " if not documents:\n", " return \"Sorry, I couldn't find relevant information for your query.\"\n", "\n", " print(f\"๐Ÿ“š Retrieved {len(documents)} relevant documents\")\n", "\n", " # Create context\n", " context = create_context(documents)\n", "\n", " # Generate answer\n", " answer = generate_answer(query, context)\n", "\n", " return answer, documents, metadatas, distances\n", "\n", "print(\"โœ… RAG pipeline created!\")\n" ] }, { "cell_type": "markdown", "metadata": { "id": "8b0eHdTG8KGc" }, "source": [ "## ๐Ÿงช Step 6: Test the RAG System\n", "\n", "Let's test our RAG chatbot with some sample questions about ML/AI topics.\n" ] }, { "cell_type": "code", "execution_count": 8, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "Z5CsJNXC8KGc", "outputId": "dce825e4-2d3b-433b-e4fc-0878b454a1fe" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "๐Ÿงช Testing RAG system with sample questions...\n", "\n", "โ“ Question 1: What is machine learning?\n", "--------------------------------------------------\n", "๐Ÿ” Processing query: 'What is machine learning?'\n", "๐Ÿ“š Retrieved 5 relevant documents\n", "๐Ÿค– Answer: Machine learning is a subset of artificial intelligence that focuses on algorithms that can learn from data. It enables systems to improve their performance on a specific task over time without being explicitly programmed.\n", "\n", "๐Ÿ“Š Retrieved 5 documents\n", "๐ŸŽฏ Similarity scores: ['0.528', '0.741', '0.901', '0.926', '0.984']\n", "\n", "================================================================================\n", "\n", "โ“ Question 2: How do neural networks work?\n", "--------------------------------------------------\n", "๐Ÿ” Processing query: 'How do neural networks work?'\n", "๐Ÿ“š Retrieved 5 relevant documents\n", "๐Ÿค– Answer: Neural networks are computing systems inspired by biological neural networks, employing a connectionist approach to process information. They consist of interconnected nodes that work together to learn patterns from data. 
The context describes them as the basis for deep learning, where multiple layers of neural networks are used to process complex patterns.\n", "\n", "๐Ÿ“Š Retrieved 5 documents\n", "๐ŸŽฏ Similarity scores: ['0.623', '0.861', '1.019', '1.083', '1.360']\n", "\n", "================================================================================\n", "\n", "โ“ Question 3: What is the difference between supervised and unsupervised learning?\n", "--------------------------------------------------\n", "๐Ÿ” Processing query: 'What is the difference between supervised and unsupervised learning?'\n", "๐Ÿ“š Retrieved 5 relevant documents\n", "๐Ÿค– Answer: The key difference between supervised and unsupervised learning lies in the data they use and the type of problem they solve.\n", "\n", "* **Supervised learning** uses labeled data, meaning each data point has a corresponding target or outcome associated with it. The algorithm learns a mapping from inputs to outputs based on this labeled training data. Examples of supervised learning algorithms include linear regression, decision trees, and support vector machines. A practical example is predicting house prices based on features like size and location, where the labeled data consists of houses with known prices.\n", "\n", "* **Unsupervised learning**, on the other hand, uses unlabeled data. The goal is to discover hidden patterns or structures within the data without any prior knowledge of the desired outcome. Clustering algorithms like K-means are examples of unsupervised learning techniques, where the algorithm groups similar data points together. For instance, customer segmentation based on purchasing behavior, where the algorithm identifies distinct groups of customers without pre-defined labels.\n", "\n", "๐Ÿ“Š Retrieved 5 documents\n", "๐ŸŽฏ Similarity scores: ['0.779', '0.958', '1.266', '1.466', '1.476']\n", "\n", "================================================================================\n", "\n", "โ“ Question 4: Explain deep learning\n", "--------------------------------------------------\n", "๐Ÿ” Processing query: 'Explain deep learning'\n", "๐Ÿ“š Retrieved 5 relevant documents\n", "๐Ÿค– Answer: Deep learning is a subfield of machine learning that employs neural networks with multiple layers (hence, \"deep\") to analyze and extract intricate patterns from data. This approach is particularly effective for handling complex tasks such as image recognition, natural language processing, and speech recognition.\n", "\n", "Here's a breakdown of key aspects:\n", "\n", "* **Neural Networks:** Deep learning models are based on artificial neural networks, which are computing systems inspired by biological neural networks found in the brain. These networks consist of interconnected nodes (neurons) organized in layers.\n", "\n", "* **Multiple Layers:** The \"deep\" in deep learning refers to the use of many layers in the neural network. These layers enable the model to learn hierarchical representations of data, where each layer extracts increasingly abstract features. For example, in computer vision, the first layers might detect edges and corners, while deeper layers combine these features to recognize objects.\n", "\n", "* **Learning from Data:** Like other machine learning techniques, deep learning algorithms learn from data. 
The network adjusts the connections between neurons (weights) based on the input data to improve its performance on a specific task.\n", "\n", "* **Applications:** Deep learning is used in various applications, including:\n", " * **Computer Vision:** Enabling machines to \"see\" and interpret images and videos. For instance, convolutional neural networks (CNNs) are a popular deep learning architecture for image recognition.\n", " * **Natural Language Processing:** Allowing machines to understand and generate human language.\n", " * **Speech Recognition:** Converting spoken language into text.\n", " * **Reinforcement Learning:** Training agents to make decisions in complex environments.\n", "๐Ÿ“Š Retrieved 5 documents\n", "๐ŸŽฏ Similarity scores: ['0.613', '0.788', '1.096', '1.145', '1.239']\n", "\n", "================================================================================\n", "\n", "โ“ Question 5: What is overfitting in machine learning?\n", "--------------------------------------------------\n", "๐Ÿ” Processing query: 'What is overfitting in machine learning?'\n", "๐Ÿ“š Retrieved 5 relevant documents\n", "๐Ÿค– Answer: Overfitting in machine learning, as explained in the provided context, occurs when a model learns the training data too well. This means the model captures not only the underlying patterns in the data but also the noise or irrelevant details. As a result, while the model may perform exceptionally well on the training data, it fails to generalize to new, unseen data, leading to poor performance. Regularization techniques are employed to mitigate overfitting.\n", "\n", "๐Ÿ“Š Retrieved 5 documents\n", "๐ŸŽฏ Similarity scores: ['0.615', '1.125', '1.164', '1.281', '1.348']\n", "\n", "================================================================================\n", "\n" ] } ], "source": [ "# Test the RAG system\n", "test_questions = [\n", " \"What is machine learning?\",\n", " \"How do neural networks work?\",\n", " \"What is the difference between supervised and unsupervised learning?\",\n", " \"Explain deep learning\",\n", " \"What is overfitting in machine learning?\"\n", "]\n", "\n", "print(\"๐Ÿงช Testing RAG system with sample questions...\\n\")\n", "\n", "for i, question in enumerate(test_questions, 1):\n", " print(f\"โ“ Question {i}: {question}\")\n", " print(\"-\" * 50)\n", "\n", " try:\n", " answer, documents, metadatas, distances = rag_pipeline(question)\n", " print(f\"๐Ÿค– Answer: {answer}\")\n", " print(f\"๐Ÿ“Š Retrieved {len(documents)} documents\")\n", " print(f\"๐ŸŽฏ Similarity scores: {[f'{d:.3f}' for d in distances]}\")\n", " print(\"\\n\" + \"=\"*80 + \"\\n\")\n", " except Exception as e:\n", " print(f\"โŒ Error: {e}\")\n", " print(\"\\n\" + \"=\"*80 + \"\\n\")\n" ] }, { "cell_type": "markdown", "metadata": { "id": "ig2vPO8J8KGc" }, "source": [ "## ๐Ÿ’พ Step 7: Save Components for Streamlit App\n", "\n", "Save the necessary components so they can be used in the Streamlit app.\n" ] }, { "cell_type": "code", "execution_count": 9, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "7Ydbi57S8KGc", "outputId": "cb618051-152f-45ce-ddf9-69bfe4ebc405" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "๐Ÿ’พ Saving components for Streamlit app...\n", "โœ… Configuration saved to rag_config.json\n", "\n", "๐ŸŽฏ Final verification test...\n", "๐Ÿ” Processing query: 'What is artificial intelligence?'\n", "๐Ÿ“š Retrieved 5 relevant documents\n", "โœ… Test successful! 
Answer length: 255 characters\n", "๐Ÿ“Š Retrieved 5 documents\n", "\n", "๐ŸŽ‰ RAG system is ready!\n", "๐Ÿ“ Files created:\n", " - chroma_db/ (vector database)\n", " - rag_config.json (configuration)\n", "\n", "๐Ÿš€ You can now use this system in the Streamlit app!\n" ] } ], "source": [ "# Save components for Streamlit app\n", "import pickle\n", "import json\n", "\n", "print(\"๐Ÿ’พ Saving components for Streamlit app...\")\n", "\n", "# Save the RAG pipeline functions and configuration\n", "rag_config = {\n", " 'collection_name': collection_name,\n", " 'embedding_model_name': 'all-MiniLM-L6-v2',\n", " 'gemini_model': 'gemini-2.0-flash-exp',\n", " 'temperature': 0.7,\n", " 'max_output_tokens': 1024,\n", " 'n_results': 5\n", "}\n", "\n", "# Save configuration\n", "with open('rag_config.json', 'w') as f:\n", " json.dump(rag_config, f, indent=2)\n", "\n", "print(\"โœ… Configuration saved to rag_config.json\")\n", "\n", "# Create a simple test to verify everything works\n", "print(\"\\n๐ŸŽฏ Final verification test...\")\n", "test_query = \"What is artificial intelligence?\"\n", "try:\n", " answer, docs, metas, dists = rag_pipeline(test_query)\n", " print(f\"โœ… Test successful! Answer length: {len(answer)} characters\")\n", " print(f\"๐Ÿ“Š Retrieved {len(docs)} documents\")\n", "except Exception as e:\n", " print(f\"โŒ Test failed: {e}\")\n", "\n", "print(\"\\n๐ŸŽ‰ RAG system is ready!\")\n", "print(\"๐Ÿ“ Files created:\")\n", "print(\" - chroma_db/ (vector database)\")\n", "print(\" - rag_config.json (configuration)\")\n", "print(\"\\n๐Ÿš€ You can now use this system in the Streamlit app!\")\n" ] }, { "cell_type": "code", "execution_count": 25, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "collapsed": true, "id": "2Ms81G6CL5vA", "outputId": "d683e31f-84f8-4fe0-abb6-1d5afcf9230e" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Requirement already satisfied: streamlit in /usr/local/lib/python3.12/dist-packages (1.28.1)\n", "Collecting streamlit-chat\n", " Downloading streamlit_chat-0.1.1-py3-none-any.whl.metadata (4.2 kB)\n", "Requirement already satisfied: altair<6,>=4.0 in /usr/local/lib/python3.12/dist-packages (from streamlit) (5.5.0)\n", "Requirement already satisfied: blinker<2,>=1.0.0 in /usr/local/lib/python3.12/dist-packages (from streamlit) (1.9.0)\n", "Requirement already satisfied: cachetools<6,>=4.0 in /usr/local/lib/python3.12/dist-packages (from streamlit) (5.5.2)\n", "Requirement already satisfied: click<9,>=7.0 in /usr/local/lib/python3.12/dist-packages (from streamlit) (8.3.0)\n", "Requirement already satisfied: importlib-metadata<7,>=1.4 in /usr/local/lib/python3.12/dist-packages (from streamlit) (6.11.0)\n", "Requirement already satisfied: numpy<2,>=1.19.3 in /usr/local/lib/python3.12/dist-packages (from streamlit) (1.26.4)\n", "Requirement already satisfied: packaging<24,>=16.8 in /usr/local/lib/python3.12/dist-packages (from streamlit) (23.2)\n", "Requirement already satisfied: pandas<3,>=1.3.0 in /usr/local/lib/python3.12/dist-packages (from streamlit) (2.2.2)\n", "Requirement already satisfied: pillow<11,>=7.1.0 in /usr/local/lib/python3.12/dist-packages (from streamlit) (10.4.0)\n", "Requirement already satisfied: protobuf<5,>=3.20 in /usr/local/lib/python3.12/dist-packages (from streamlit) (4.25.8)\n", "Requirement already satisfied: pyarrow>=6.0 in /usr/local/lib/python3.12/dist-packages (from streamlit) (18.1.0)\n", "Requirement already satisfied: python-dateutil<3,>=2.7.3 in /usr/local/lib/python3.12/dist-packages (from 
streamlit) (2.9.0.post0)\n", "Requirement already satisfied: requests<3,>=2.27 in /usr/local/lib/python3.12/dist-packages (from streamlit) (2.32.4)\n", "Requirement already satisfied: rich<14,>=10.14.0 in /usr/local/lib/python3.12/dist-packages (from streamlit) (13.9.4)\n", "Requirement already satisfied: tenacity<9,>=8.1.0 in /usr/local/lib/python3.12/dist-packages (from streamlit) (8.5.0)\n", "Requirement already satisfied: toml<2,>=0.10.1 in /usr/local/lib/python3.12/dist-packages (from streamlit) (0.10.2)\n", "Requirement already satisfied: typing-extensions<5,>=4.3.0 in /usr/local/lib/python3.12/dist-packages (from streamlit) (4.15.0)\n", "Requirement already satisfied: tzlocal<6,>=1.1 in /usr/local/lib/python3.12/dist-packages (from streamlit) (5.3.1)\n", "Requirement already satisfied: validators<1,>=0.2 in /usr/local/lib/python3.12/dist-packages (from streamlit) (0.35.0)\n", "Requirement already satisfied: gitpython!=3.1.19,<4,>=3.0.7 in /usr/local/lib/python3.12/dist-packages (from streamlit) (3.1.45)\n", "Requirement already satisfied: pydeck<1,>=0.8.0b4 in /usr/local/lib/python3.12/dist-packages (from streamlit) (0.9.1)\n", "Requirement already satisfied: tornado<7,>=6.0.3 in /usr/local/lib/python3.12/dist-packages (from streamlit) (6.5.1)\n", "Requirement already satisfied: watchdog>=2.1.5 in /usr/local/lib/python3.12/dist-packages (from streamlit) (6.0.0)\n", "Requirement already satisfied: jinja2 in /usr/local/lib/python3.12/dist-packages (from altair<6,>=4.0->streamlit) (3.1.6)\n", "Requirement already satisfied: jsonschema>=3.0 in /usr/local/lib/python3.12/dist-packages (from altair<6,>=4.0->streamlit) (4.25.1)\n", "Requirement already satisfied: narwhals>=1.14.2 in /usr/local/lib/python3.12/dist-packages (from altair<6,>=4.0->streamlit) (2.9.0)\n", "Requirement already satisfied: gitdb<5,>=4.0.1 in /usr/local/lib/python3.12/dist-packages (from gitpython!=3.1.19,<4,>=3.0.7->streamlit) (4.0.12)\n", "Requirement already satisfied: zipp>=0.5 in /usr/local/lib/python3.12/dist-packages (from importlib-metadata<7,>=1.4->streamlit) (3.23.0)\n", "Requirement already satisfied: pytz>=2020.1 in /usr/local/lib/python3.12/dist-packages (from pandas<3,>=1.3.0->streamlit) (2025.2)\n", "Requirement already satisfied: tzdata>=2022.7 in /usr/local/lib/python3.12/dist-packages (from pandas<3,>=1.3.0->streamlit) (2025.2)\n", "Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.12/dist-packages (from python-dateutil<3,>=2.7.3->streamlit) (1.17.0)\n", "Requirement already satisfied: charset_normalizer<4,>=2 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2.27->streamlit) (3.4.4)\n", "Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2.27->streamlit) (3.11)\n", "Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2.27->streamlit) (2.3.0)\n", "Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.12/dist-packages (from requests<3,>=2.27->streamlit) (2025.10.5)\n", "Requirement already satisfied: markdown-it-py>=2.2.0 in /usr/local/lib/python3.12/dist-packages (from rich<14,>=10.14.0->streamlit) (4.0.0)\n", "Requirement already satisfied: pygments<3.0.0,>=2.13.0 in /usr/local/lib/python3.12/dist-packages (from rich<14,>=10.14.0->streamlit) (2.19.2)\n", "Requirement already satisfied: smmap<6,>=3.0.1 in /usr/local/lib/python3.12/dist-packages (from gitdb<5,>=4.0.1->gitpython!=3.1.19,<4,>=3.0.7->streamlit) (5.0.2)\n", "Requirement 
already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.12/dist-packages (from jinja2->altair<6,>=4.0->streamlit) (3.0.3)\n", "Requirement already satisfied: attrs>=22.2.0 in /usr/local/lib/python3.12/dist-packages (from jsonschema>=3.0->altair<6,>=4.0->streamlit) (25.4.0)\n", "Requirement already satisfied: jsonschema-specifications>=2023.03.6 in /usr/local/lib/python3.12/dist-packages (from jsonschema>=3.0->altair<6,>=4.0->streamlit) (2025.9.1)\n", "Requirement already satisfied: referencing>=0.28.4 in /usr/local/lib/python3.12/dist-packages (from jsonschema>=3.0->altair<6,>=4.0->streamlit) (0.37.0)\n", "Requirement already satisfied: rpds-py>=0.7.1 in /usr/local/lib/python3.12/dist-packages (from jsonschema>=3.0->altair<6,>=4.0->streamlit) (0.27.1)\n", "Requirement already satisfied: mdurl~=0.1 in /usr/local/lib/python3.12/dist-packages (from markdown-it-py>=2.2.0->rich<14,>=10.14.0->streamlit) (0.1.2)\n", "Downloading streamlit_chat-0.1.1-py3-none-any.whl (1.2 MB)\n", "\u001b[2K \u001b[90mโ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”โ”\u001b[0m \u001b[32m1.2/1.2 MB\u001b[0m \u001b[31m16.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hInstalling collected packages: streamlit-chat\n", "Successfully installed streamlit-chat-0.1.1\n" ] } ], "source": [ "!pip install streamlit streamlit-chat" ] }, { "cell_type": "code", "execution_count": 51, "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 650 }, "id": "6E6Q04tDIinz", "outputId": "e8e0da16-c9db-45bd-e807-ddcc5462ef35" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "It looks like you are running Gradio on a hosted Jupyter notebook, which requires `share=True`. Automatically setting `share=True` (you can turn this off by setting `share=False` in `launch()` explicitly).\n", "\n", "Colab notebook detected. To show errors in colab notebook, set debug=True in launch()\n", "* Running on public URL: https://97ee63e42da153d3f5.gradio.live\n", "\n", "This share link expires in 1 week. For free permanent hosting and GPU upgrades, run `gradio deploy` from the terminal in the working directory to deploy to Hugging Face Spaces (https://huggingface.co/spaces)\n" ] }, { "data": { "text/html": [ "
" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/plain": [] }, "execution_count": 51, "metadata": {}, "output_type": "execute_result" } ], "source": [ "import gradio as gr\n", "\n", "# Chat history\n", "chat_history = []\n", "\n", "# Chat logic\n", "def chat_with_rag(user_input):\n", " global chat_history\n", " if not user_input.strip():\n", " return chat_history\n", "\n", " # Call your RAG pipeline\n", " answer, _, _, _ = rag_pipeline(user_input)\n", "\n", " chat_history.append({\"role\": \"user\", \"message\": user_input})\n", " chat_history.append({\"role\": \"assistant\", \"message\": answer})\n", " return chat_history\n", "\n", "def clear_chat():\n", " global chat_history\n", " chat_history = []\n", " return chat_history\n", "\n", "# HTML message template with modern styling\n", "def format_message(message):\n", " role = message[\"role\"]\n", " text = message[\"message\"]\n", "\n", " if role == \"user\":\n", " bubble_color = \"#4CAF50\" # Green\n", " text_color = \"#FFFFFF\"\n", " justify = \"flex-end\"\n", " avatar = \"https://cdn-icons-png.flaticon.com/512/194/194938.png\"\n", " name = \"You\"\n", " else:\n", " bubble_color = \"#1E1E1E\" # Dark grey\n", " text_color = \"#F5F5F5\"\n", " justify = \"flex-start\"\n", " avatar = \"https://cdn-icons-png.flaticon.com/512/1995/1995574.png\"\n", " name = \"AI Assistant\"\n", "\n", " return f\"\"\"\n", "
\n", " \n", "
\n", "
{name}
\n", "
{text}
\n", "
\n", "
\n", " \"\"\"\n", "\n", "custom_theme = gr.themes.Soft(\n", " primary_hue=\"green\"\n", ")\n", "\n", "with gr.Blocks(theme=custom_theme, css=\"\"\"\n", " #chatbox {\n", " height: 450px;\n", " overflow-y: auto;\n", " padding: 12px;\n", " background-color: #F7F7F7;\n", " border-radius: 15px;\n", " border: 1px solid #DDD;\n", " }\n", " #user_input textarea {\n", " font-size: 16px;\n", " }\n", " .gr-button {\n", " font-weight: 600;\n", " }\n", "\"\"\") as demo:\n", "\n", "\n", "\n", " gr.Markdown(\n", " \"\"\"\n", "
\n", " \n", "
\n", "

AI Chat Assistant

\n", " \n", " Your AI assistant for Machine Learning, Deep Learning, and AI โ€” Explore insights, learn concepts, and get expert guidance.\n", " \n", "
\n", "
\n", " \"\"\",\n", " elem_id=\"title\"\n", " )\n", "\n", "\n", "\n", " chat_box = gr.HTML(elem_id=\"chatbox\")\n", " user_input = gr.Textbox(\n", " placeholder=\"Type your question...\",\n", " label=\"Your message\",\n", " lines=2\n", " )\n", " send_btn = gr.Button(\"Send\", variant=\"primary\")\n", " clear_btn = gr.Button(\"Clear Chat\", variant=\"secondary\")\n", "\n", " def update_display(user_message):\n", " chat_with_rag(user_message)\n", " html = \"\".join([format_message(m) for m in chat_history])\n", " return html, \"\"\n", "\n", " send_btn.click(update_display, inputs=user_input, outputs=[chat_box, user_input])\n", " clear_btn.click(lambda: (\"\", clear_chat()), None, outputs=chat_box)\n", "\n", "demo.launch()\n" ] } ], "metadata": { "colab": { "provenance": [] }, "kernelspec": { "display_name": "Python 3", "name": "python3" }, "language_info": { "name": "python" } }, "nbformat": 4, "nbformat_minor": 0 }