{
 "cells": [
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {},
   "outputs": [],
   "source": [
    "from dotenv import load_dotenv\n",
    "from llama_index.llms.sambanovasystems import SambaNovaCloud\n",
    "\n",
     "# Reads SAMBANOVA_API_KEY from a local .env file so the client can authenticate.\n",
     "load_dotenv()\n",
    "\n",
    "MODEL = \"Meta-Llama-3.1-405B-Instruct\"\n",
    "\n",
    "llm = SambaNovaCloud(model=MODEL)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "The history of Artificial Intelligence (AI) spans several decades, and it is a story of continuous innovation, experimentation, and advancement. Here's a comprehensive overview of the major milestones in the development of AI:\n",
      "\n",
      "**Early Beginnings (1950s)**\n",
      "\n",
      "The term \"Artificial Intelligence\" was first coined in 1956 by John McCarthy, a computer scientist and cognitive scientist, at the Dartmouth Summer Research Project on Artificial Intelligence. This conference is considered the birthplace of AI as a field of research.\n",
      "\n",
       "In the 1950s, computer scientists such as Alan Turing and Marvin Minsky began exploring the idea of creating machines that could think and learn like humans. Turing's 1950 paper, \"Computing Machinery and Intelligence,\" proposed the Turing Test, a measure of a machine's ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human.\n",
      "\n",
      "**Rule-Based Expert Systems (1960s-1970s)**\n",
      "\n",
       "In the 1960s and 1970s, AI research focused on developing rule-based expert systems. These systems mimicked human decision-making by using a set of pre-defined rules to reason and solve problems. Early expert systems such as DENDRAL (mid-1960s) and MYCIN (early 1970s) were developed at Stanford University.\n",
      "\n",
      "**Machine Learning (1980s)**\n",
      "\n",
       "Machine learning, a subset of AI, gained prominence in the 1980s. Machine learning algorithms enabled machines to learn from data and improve their performance over time. David Rumelhart, Geoffrey Hinton, and Ronald Williams popularized the backpropagation algorithm, a fundamental component of neural network training, in their influential 1986 paper.\n",
      "\n",
      "**AI Winter (1980s-1990s)**\n",
      "\n",
      "Despite the progress made in AI research, the field experienced a decline in funding and interest in the 1980s and 1990s. This period is known as the \"AI Winter.\" The lack of significant breakthroughs and the failure of many AI projects led to a decrease in investment and a shift in focus towards other areas of computer science.\n",
      "\n",
      "**Resurgence (2000s)**\n",
      "\n",
      "The 21st century saw a resurgence of interest in AI, driven by advances in computing power, data storage, and machine learning algorithms. The development of deep learning techniques, such as convolutional neural networks (CNNs) and recurrent neural networks (RNNs), enabled machines to learn complex patterns in data.\n",
      "\n",
      "**Big Data and Deep Learning (2010s)**\n",
      "\n",
      "The availability of large datasets and the development of deep learning frameworks like TensorFlow and PyTorch led to significant breakthroughs in AI research. AI applications began to emerge in various industries, including computer vision, natural language processing, and robotics.\n",
      "\n",
      "**Current State (2020s)**\n",
      "\n",
      "Today, AI is ubiquitous, with applications in:\n",
      "\n",
      "1. Virtual assistants (e.g., Siri, Alexa)\n",
      "2. Image recognition (e.g., facial recognition, object detection)\n",
      "3. Natural language processing (e.g., language translation, text summarization)\n",
      "4. Robotics (e.g., autonomous vehicles, robotic process automation)\n",
      "5. Healthcare (e.g., medical diagnosis, personalized medicine)\n",
      "\n",
      "The history of AI is marked by periods of rapid progress, followed by periods of decline and rediscovery. As AI continues to evolve, we can expect to see new breakthroughs and innovations that transform industries and improve our lives.\n",
      "\n",
      "**Key Players and Milestones:**\n",
      "\n",
      "1. Alan Turing (1950): Proposed the Turing Test\n",
      "2. John McCarthy (1956): Coined the term \"Artificial Intelligence\"\n",
       "3. David Rumelhart, Geoffrey Hinton, and Ronald Williams (1986): Popularized the backpropagation algorithm\n",
      "4. Yann LeCun, Yoshua Bengio, and Geoffrey Hinton (2010s): Developed deep learning techniques\n",
       "5. Fei-Fei Li (2009): Led the creation of the ImageNet dataset; Andrew Ng (2010s): Advanced large-scale deep learning applications\n",
      "\n",
      "**Timeline:**\n",
      "\n",
      "* 1950: Alan Turing proposes the Turing Test\n",
      "* 1956: John McCarthy coins the term \"Artificial Intelligence\"\n",
      "* 1960s: Rule-based expert systems emerge\n",
      "* 1980s: Machine learning and AI Winter\n",
      "* 2000s: Resurgence of interest in AI\n",
      "* 2010s: Big data and deep learning lead to significant breakthroughs\n",
      "* 2020s: AI becomes ubiquitous in various industries"
     ]
    }
   ],
   "source": [
    "query = \"write about the history of AI\"\n",
    "\n",
    "streaming_response = llm.stream_complete(query)\n",
    "\n",
     "for chunk in streaming_response:\n",
     "    # Each chunk is a CompletionResponse; .delta holds the newly streamed text,\n",
     "    # so there is no need to dig into the raw payload or swallow exceptions.\n",
     "    print(chunk.delta or \"\", end=\"\", flush=True)"
   ]
   }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "base",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.12.2"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
