{
  "cells": [
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "ajTjpWJ9-8qg"
      },
      "source": [
        "# h2oGPT Inference\n"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "vpiBaR_9-Fiw"
      },
      "source": [
        "| Code Credits | Link |\n",
        "| ----------- | ---- |\n",
        "| 🎉 Repository | [![GitHub Repository](https://img.shields.io/github/stars/h2oai/h2ogpt?style=social)](https://github.com/h2oai/h2ogpt) |\n",
        "| 🚀 Online inference | [gpt.h2o.ai](https://gpt.h2o.ai) |\n",
        "| 🔥 Discover More Colab Notebooks | [![GitHub Repository](https://img.shields.io/badge/GitHub-Repository-black?style=flat-square&logo=github)](https://github.com/R3gm/InsightSolver-Colab/) |"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "2EJxbcss8rtz"
      },
      "source": [
        "# Submit a file in the area marked by the red square"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "QC1kaJzI7uan"
      },
      "source": [
        "*(Screenshot: the file-upload area of the UI, marked with a red square.)*"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "s4CZSdHO7rUy"
      },
      "source": [
        "# Prompt mode\n",
        "An example of a sources-grounded prompt (retrieved document chunks tagged with their source ids) and the expected answer format:\n",
        "\n",
        "`Content or file:` This Agreement is governed by English law and the parties submit to the exclusive jurisdiction of the English courts in relation to any dispute (contractual or non-contractual) concerning this Agreement save that either party may apply to any court for an injunction or other relief to protect its Intellectual Property Rights. Source: 28-pl Content: No Waiver. Failure or delay in exercising any right or remedy under this Agreement shall not constitute a waiver of such (or any other) right or remedy.\n",
        "\n",
        "11.7 Severability. The invalidity, illegality or unenforceability of any term (or part of a term) of this Agreement shall not affect the continuation in force of the remainder of the term (if any) and this Agreement.\n",
        "\n",
        "11.8 No Agency. Except as expressly stated otherwise, nothing in this Agreement shall create an agency, partnership or joint venture of any kind between the parties.\n",
        "\n",
        "11.9 No Third-Party Beneficiaries. Source: 30-pl Content: (b) if Google believes, in good faith, that the Distributor has violated or caused Google to violate any Anti-Bribery Laws (as defined in Clause 8.5) or that such a violation is reasonably likely to occur, Source: 4-pl  \n",
        "\n",
        "\n",
        "`QUESTION:` Which state/country’s law governs the interpretation of the contract?\n",
        "\n",
        "\n",
        "`FINAL ANSWER:` This Agreement is governed by English law. SOURCES: 28-pl"
      ]
    },
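    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "The sources-and-question layout above can also be assembled programmatically. A minimal sketch in plain Python — the `build_prompt` helper below is hypothetical, for illustration only, and not part of h2oGPT's API:"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {},
      "outputs": [],
      "source": [
        "# Hypothetical helper: joins retrieved chunks, each tagged with its source id,\n",
        "# into the sources-grounded prompt layout shown above. Illustration only,\n",
        "# not an h2oGPT API.\n",
        "def build_prompt(sources, question):\n",
        "    chunks = [f\"Source: {sid} Content: {text}\" for sid, text in sources]\n",
        "    context = \"\\n\\n\".join(chunks)\n",
        "    return f\"{context}\\n\\nQUESTION: {question}\\n\\nFINAL ANSWER:\"\n",
        "\n",
        "sources = [\n",
        "    (\"28-pl\", \"This Agreement is governed by English law ...\"),\n",
        "    (\"30-pl\", \"11.9 No Third-Party Beneficiaries.\"),\n",
        "]\n",
        "print(build_prompt(sources, \"Which state/country's law governs the contract?\"))"
      ]
    },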
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "yi9F4Gb09aFp"
      },
      "source": [
        "# Setup"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "wWIkHXJnuYX2",
        "outputId": "706f1675-a4f7-4523-f52d-e537355e40a3"
      },
      "outputs": [
        {
          "name": "stdout",
          "output_type": "stream",
          "text": [
            "fatal: destination path 'h2ogpt' already exists and is not an empty directory.\n",
            "/content/h2ogpt\n",
            "  Installing build dependencies ... \u001b[?25l\u001b[?25hdone\n",
            "  Getting requirements to build wheel ... \u001b[?25l\u001b[?25hdone\n",
            "  Preparing metadata (pyproject.toml) ... \u001b[?25l\u001b[?25hdone\n",
            "\u001b[31mERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\n",
            "argilla 1.7.0 requires numpy<1.24.0, but you have numpy 1.24.2 which is incompatible.\n",
            "argilla 1.7.0 requires pandas<2.0.0,>=1.0.0, but you have pandas 2.0.0 which is incompatible.\n",
            "google-colab 1.0.0 requires pandas==1.5.3, but you have pandas 2.0.0 which is incompatible.\n",
            "google-colab 1.0.0 requires requests==2.27.1, but you have requests 2.31.0 which is incompatible.\n",
            "numba 0.56.4 requires numpy<1.24,>=1.18, but you have numpy 1.24.2 which is incompatible.\n",
            "tensorflow 2.12.0 requires numpy<1.24,>=1.22, but you have numpy 1.24.2 which is incompatible.\u001b[0m\u001b[31m\n",
            "\u001b[0m\u001b[31mERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\n",
            "google-colab 1.0.0 requires requests==2.27.1, but you have requests 2.31.0 which is incompatible.\u001b[0m\u001b[31m\n",
            "\u001b[0m"
          ]
        }
      ],
      "source": [
        "# Clone the h2oGPT repo and install its dependencies\n",
        "!pip install -q psutil\n",
        "!git clone https://github.com/h2oai/h2ogpt.git\n",
        "%cd h2ogpt\n",
        "!pip install -q -r requirements.txt\n",
        "!pip install -q -r requirements_optional_langchain.txt\n",
        "\n",
        "# Kill the Colab kernel so the runtime restarts and picks up the newly\n",
        "# installed packages. The session will appear to crash -- that is expected;\n",
        "# ignore the warnings and continue with the next cell.\n",
        "import os\n",
        "import psutil\n",
        "psutil.Process(os.getpid()).kill()"
      ]
    },
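    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "After the restart, optionally confirm that a GPU runtime is attached (this notebook is configured for a T4):"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {},
      "outputs": [],
      "source": [
        "# Shows the attached NVIDIA GPU, driver and CUDA version; fails if the\n",
        "# runtime type is not set to GPU (Runtime > Change runtime type).\n",
        "!nvidia-smi"
      ]
    },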
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "_u6Eke0sAURo"
      },
      "source": [
        "# Run gradio UI"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "oUlU9fpjAR8-",
        "outputId": "12a88992-11c2-4787-face-230f2022e58f"
      },
      "outputs": [
        {
          "name": "stdout",
          "output_type": "stream",
          "text": [
            "/content/h2ogpt\n",
            "2023-05-23 00:35:35.094648: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT\n",
            "\n",
            "===================================BUG REPORT===================================\n",
            "Welcome to bitsandbytes. For bug reports, please run\n",
            "\n",
            "python -m bitsandbytes\n",
            "\n",
            " and submit this information together with your error trace to: https://github.com/TimDettmers/bitsandbytes/issues\n",
            "================================================================================\n",
            "bin /usr/local/lib/python3.10/dist-packages/bitsandbytes/libbitsandbytes_cuda118.so\n",
            "/usr/local/lib/python3.10/dist-packages/bitsandbytes/cuda_setup/main.py:145: UserWarning: /usr/lib64-nvidia did not contain ['libcudart.so', 'libcudart.so.11.0', 'libcudart.so.12.0'] as expected! Searching further paths...\n",
            "  warn(msg)\n",
            "/usr/local/lib/python3.10/dist-packages/bitsandbytes/cuda_setup/main.py:145: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('/sys/fs/cgroup/memory.events /var/colab/cgroup/jupyter-children/memory.events')}\n",
            "  warn(msg)\n",
            "/usr/local/lib/python3.10/dist-packages/bitsandbytes/cuda_setup/main.py:145: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('8013'), PosixPath('//172.28.0.1'), PosixPath('http')}\n",
            "  warn(msg)\n",
            "/usr/local/lib/python3.10/dist-packages/bitsandbytes/cuda_setup/main.py:145: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('//colab.research.google.com/tun/m/cc48301118ce562b961b3c22d803539adc1e0c19/gpu-t4-s-2dlzv0dqlweqe --tunnel_background_save_delay=10s --tunnel_periodic_background_save_frequency=30m0s --enable_output_coalescing=true --output_coalescing_required=true'), PosixPath('--logtostderr --listen_host=172.28.0.12 --target_host=172.28.0.12 --tunnel_background_save_url=https')}\n",
            "  warn(msg)\n",
            "/usr/local/lib/python3.10/dist-packages/bitsandbytes/cuda_setup/main.py:145: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('/env/python')}\n",
            "  warn(msg)\n",
            "/usr/local/lib/python3.10/dist-packages/bitsandbytes/cuda_setup/main.py:145: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('module'), PosixPath('//ipykernel.pylab.backend_inline')}\n",
            "  warn(msg)\n",
            "CUDA_SETUP: WARNING! libcudart.so not found in any environmental path. Searching in backup paths...\n",
            "/usr/local/lib/python3.10/dist-packages/bitsandbytes/cuda_setup/main.py:145: UserWarning: Found duplicate ['libcudart.so', 'libcudart.so.11.0', 'libcudart.so.12.0'] files: {PosixPath('/usr/local/cuda/lib64/libcudart.so'), PosixPath('/usr/local/cuda/lib64/libcudart.so.11.0')}.. We'll flip a coin and try one of these, in order to fail forward.\n",
            "Either way, this might cause trouble in the future:\n",
            "If you get `CUDA error: invalid device function` errors, the above might be the cause and the solution is to make sure only one ['libcudart.so', 'libcudart.so.11.0', 'libcudart.so.12.0'] in the paths that we search based on your env.\n",
            "  warn(msg)\n",
            "CUDA SETUP: CUDA runtime path found: /usr/local/cuda/lib64/libcudart.so\n",
            "CUDA SETUP: Highest compute capability among GPUs detected: 7.5\n",
            "CUDA SETUP: Detected CUDA version 118\n",
            "CUDA SETUP: Loading binary /usr/local/lib/python3.10/dist-packages/bitsandbytes/libbitsandbytes_cuda118.so...\n",
            "Using Model h2oai/h2ogpt-oig-oasst1-512-6.9b\n",
            "Generating model with params:\n",
            "load_8bit: True\n",
            "load_half: True\n",
            "infer_devices: True\n",
            "base_model: h2oai/h2ogpt-oig-oasst1-512-6.9b\n",
            "tokenizer_base_model: \n",
            "lora_weights: \n",
            "gpu_id: 0\n",
            "prompt_type: plain\n",
            "temperature: 0.1\n",
            "top_p: 0.75\n",
            "top_k: 40\n",
            "num_beams: 1\n",
            "repetition_penalty: 1.07\n",
            "num_return_sequences: 1\n",
            "do_sample: False\n",
            "max_new_tokens: 256\n",
            "min_new_tokens: 0\n",
            "early_stopping: False\n",
            "max_time: 180\n",
            "debug: False\n",
            "save_dir: None\n",
            "share: True\n",
            "local_files_only: False\n",
            "resume_download: True\n",
            "use_auth_token: False\n",
            "trust_remote_code: True\n",
            "src_lang: English\n",
            "tgt_lang: Russian\n",
            "gradio: True\n",
            "gradio_avoid_processing_markdown: False\n",
            "chat: True\n",
            "chat_context: False\n",
            "stream_output: True\n",
            "show_examples: False\n",
            "verbose: False\n",
            "h2ocolors: True\n",
            "height: 400\n",
            "show_lora: True\n",
            "login_mode_if_model0: False\n",
            "block_gradio_exit: True\n",
            "concurrency_count: 1\n",
            "api_open: False\n",
            "allow_api: True\n",
            "input_lines: 1\n",
            "auth: None\n",
            "sanitize_user_prompt: True\n",
            "sanitize_bot_response: True\n",
            "extra_model_options: []\n",
            "extra_lora_options: []\n",
            "score_model: OpenAssistant/reward-model-deberta-v3-large-v2\n",
            "auto_score: True\n",
            "eval_sharegpt_prompts_only: 0\n",
            "eval_sharegpt_prompts_only_seed: 1234\n",
            "eval_sharegpt_as_output: False\n",
            "langchain_mode: UserData\n",
            "visible_langchain_modes: ['UserData', 'MyData']\n",
            "user_path: user_path\n",
            "load_db_if_exists: True\n",
            "keep_sources_in_context: False\n",
            "db_type: chroma\n",
            "use_openai_embedding: False\n",
            "use_openai_model: False\n",
            "hf_embedding_model: sentence-transformers/all-MiniLM-L6-v2\n",
            "allow_upload_to_user_data: True\n",
            "allow_upload_to_my_data: True\n",
            "enable_url_upload: True\n",
            "enable_text_upload: True\n",
            "enable_sources_list: True\n",
            "chunk: True\n",
            "chunk_size: 512\n",
            "k: 4\n",
            "n_jobs: -1\n",
            "enable_captions: True\n",
            "captions_model: Salesforce/blip-image-captioning-base\n",
            "pre_load_caption_model: False\n",
            "caption_gpu: True\n",
            "enable_ocr: False\n",
            "is_hf: False\n",
            "is_gpth2oai: False\n",
            "is_public: False\n",
            "is_low_mem: False\n",
            "admin_pass: None\n",
            "raise_generate_gpu_exceptions: True\n",
            "n_gpus: 1\n",
            "model_lower: h2oai/h2ogpt-oig-oasst1-512-6.9b\n",
            "first_para: False\n",
            "text_limit: None\n",
            "placeholder_instruction: Enter a question or imperative.\n",
            "placeholder_input: \n",
            "examples: [['Jeff: Can I train a ? Transformers model on Amazon SageMaker? \\nPhilipp: Sure you can use the new Hugging Face Deep Learning Container. \\nJeff: ok.\\nJeff: and how can I get started? \\nJeff: where can I find documentation? \\nPhilipp: ok, ok you can find everything here. https://huggingface.co/blog/the-partnership-amazon-sagemaker-and-hugging-face', '', '', True, 'plain', 0.1, 0.75, 40, 4, 256, 0, False, 180, 1.0, 1, False, True, '', '', 'Disabled'], ['Translate English to French', 'Good morning', '', True, 'plain', 0.1, 0.75, 40, 1, 256, 0, False, 180, 1.07, 1, False, True, '', '', 'Disabled'], ['Give detailed answer for whether Einstein or Newton is smarter.', '', '', True, 'plain', 0.1, 0.75, 40, 1, 256, 0, False, 180, 1.07, 1, False, True, '', '', 'Disabled'], ['Explain in detailed list, all the best practices for coding in python.', '', '', True, 'plain', 0.1, 0.75, 40, 1, 256, 0, False, 180, 1.07, 1, False, True, '', '', 'Disabled'], ['Create a markdown table with 3 rows for the primary colors, and 2 columns, with color name and hex codes.', '', '', True, 'plain', 0.1, 0.75, 40, 1, 256, 0, False, 180, 1.07, 1, False, True, '', '', 'Disabled'], ['Translate to German:  My name is Arthur', '', '', True, 'plain', 0.1, 0.75, 40, 1, 256, 0, False, 180, 1.07, 1, False, True, '', '', 'Disabled'], [\"Please answer to the following question. Who is going to be the next Ballon d'or?\", '', '', True, 'plain', 0.1, 0.75, 40, 1, 256, 0, False, 180, 1.07, 1, False, True, '', '', 'Disabled'], ['Can Geoffrey Hinton have a conversation with George Washington? Give the rationale before answering.', '', '', True, 'plain', 0.1, 0.75, 40, 1, 256, 0, False, 180, 1.07, 1, False, True, '', '', 'Disabled'], ['Please answer the following question. What is the boiling point of Nitrogen?', '', '', True, 'plain', 0.1, 0.75, 40, 1, 256, 0, False, 180, 1.07, 1, False, True, '', '', 'Disabled'], ['Answer the following yes/no question. 
Can you write a whole Haiku in a single tweet?', '', '', True, 'plain', 0.1, 0.75, 40, 1, 256, 0, False, 180, 1.07, 1, False, True, '', '', 'Disabled'], ['Simplify the following expression: (False or False and True). Explain your answer.', '', '', True, 'plain', 0.1, 0.75, 40, 1, 256, 0, False, 180, 1.07, 1, False, True, '', '', 'Disabled'], [\"Premise: At my age you will probably have learnt one lesson. Hypothesis:  It's not certain how many lessons you'll learn by your thirties. Does the premise entail the hypothesis?\", '', '', True, 'plain', 0.1, 0.75, 40, 1, 256, 0, False, 180, 1.07, 1, False, True, '', '', 'Disabled'], ['The square root of x is the cube root of y. What is y to the power of 2, if x = 4?', '', '', True, 'plain', 0.1, 0.75, 40, 1, 256, 0, False, 180, 1.07, 1, False, True, '', '', 'Disabled'], ['Answer the following question by reasoning step by step.  The cafeteria had 23 apples. If they used 20 for lunch, and bought 6 more, how many apple do they have?', '', '', True, 'plain', 0.1, 0.75, 40, 1, 256, 0, False, 180, 1.07, 1, False, True, '', '', 'Disabled'], ['def area_of_rectangle(a: float, b: float):\\n    \"\"\"Return the area of the rectangle.\"\"\"', '', '', True, 'plain', 0.1, 0.75, 40, 1, 256, 0, False, 180, 1.07, 1, False, True, '', '', 'Disabled'], ['# a function in native python:\\ndef mean(a):\\n    return sum(a)/len(a)\\n\\n# the same function using numpy:\\nimport numpy as np\\ndef mean(a):', '', '', True, 'plain', 0.1, 0.75, 40, 1, 256, 0, False, 180, 1.07, 1, False, True, '', '', 'Disabled'], ['X = np.random.randn(100, 100)\\ny = np.random.randint(0, 1, 100)\\n\\n# fit random forest classifier with 20 estimators', '', '', True, 'plain', 0.1, 0.75, 40, 1, 256, 0, False, 180, 1.07, 1, False, True, '', '', 'Disabled']]\n",
            "task_info: Auto-complete phrase, code, etc.\n",
            "Command: generate.py --base_model=h2oai/h2ogpt-oig-oasst1-512-6.9b --langchain_mode=UserData --user_path=user_path --load_8bit=True --share=True\n",
            "Hash: 585d7667a6d0849470feb9cfb755b4560743b130\n",
            "Prep: persist_directory=db_dir_UserData exists, using\n",
            "DO Loading db: UserData\n",
            "Using embedded DuckDB with persistence: data will be stored in: db_dir_UserData\n",
            "DONE Loading db: UserData\n",
            "Get h2oai/h2ogpt-oig-oasst1-512-6.9b model\n",
            "device_map: {'': 0}\n",
            "Loading checkpoint shards: 100% 3/3 [01:03<00:00, 21.15s/it]\n",
            "Get OpenAssistant/reward-model-deberta-v3-large-v2 model\n",
            "device_map: {'': 0}\n",
            "Running on local URL:  http://0.0.0.0:7860\n",
            "Running on public URL: https://53289ab485b17b9a8b.gradio.live\n",
            "\n",
            "This share link expires in 72 hours. For free permanent hosting and GPU upgrades (NEW!), check out Spaces: https://huggingface.co/spaces\n",
            "Started GUI\n"
          ]
        }
      ],
      "source": [
        "%cd /content/h2ogpt\n",
        "# --user_path points generate.py at a local folder of documents for LangChain\n",
        "# ingestion; here documents are uploaded through the UI instead.\n",
        "!python generate.py --base_model=h2oai/h2ogpt-oig-oasst1-512-6.9b --langchain_mode=UserData --user_path=user_path --load_8bit=True --share=True"
      ]
    }
  ],
  "metadata": {
    "accelerator": "GPU",
    "colab": {
      "gpuType": "T4",
      "provenance": []
    },
    "gpuClass": "standard",
    "kernelspec": {
      "display_name": "Python 3",
      "name": "python3"
    },
    "language_info": {
      "name": "python"
    }
  },
  "nbformat": 4,
  "nbformat_minor": 0
}