rogerkoranteng committed on
Commit
3506b46
verified
1 Parent(s): fe07957

Upload folder using huggingface_hub

This view is limited to 50 files because it contains too many changes. See raw diff
Files changed (50)
  1. .gitkeep +0 -0
  2. .ipynb_checkpoints/model fine-tunning-checkpoint.ipynb +923 -0
  3. README.md +2 -8
  4. Untitled.ipynb +138 -0
  5. Untitled1.ipynb +94 -0
  6. chatbot/.env +4 -0
  7. chatbot/.gradio/certificate.pem +31 -0
  8. chatbot/README.md +6 -0
  9. chatbot/app.py +157 -0
  10. chatbot/app2.pyn +226 -0
  11. chatbot/app2.pynnn +213 -0
  12. chatbot/bar_chart.png +0 -0
  13. chatbot/category_breakdown.png +0 -0
  14. chatbot/chat_history.json +62 -0
  15. chatbot/chroma_migrated/chroma.sqlite3 +0 -0
  16. chatbot/fine_tuned_model/config.json +2026 -0
  17. chatbot/fine_tuned_model/model.safetensors +3 -0
  18. chatbot/fined-tuned-model.lora.h5 +3 -0
  19. chatbot/heatmap.png +0 -0
  20. chatbot/inventory.db +0 -0
  21. chatbot/inventory_chart.png +0 -0
  22. chatbot/inventory_counts.png +0 -0
  23. chatbot/inventory_images.db +0 -0
  24. chatbot/inventory_log.json +91 -0
  25. chatbot/line_chart.png +0 -0
  26. chatbot/main.py.save +52 -0
  27. chatbot/main2.py +48 -0
  28. chatbot/model_config.json +27 -0
  29. chatbot/models/500/config.json +2026 -0
  30. chatbot/models/500/model.safetensors +3 -0
  31. chatbot/models/HuggingFaceTB/SmolLM2-1.7B-Instruct/config.json +37 -0
  32. chatbot/models/HuggingFaceTB/SmolLM2-1.7B-Instruct/generation_config.json +7 -0
  33. chatbot/models/HuggingFaceTB/SmolLM2-1.7B-Instruct/merges.txt +0 -0
  34. chatbot/models/HuggingFaceTB/SmolLM2-1.7B-Instruct/model-00001-of-00002.safetensors +3 -0
  35. chatbot/models/HuggingFaceTB/SmolLM2-1.7B-Instruct/model-00002-of-00002.safetensors +3 -0
  36. chatbot/models/HuggingFaceTB/SmolLM2-1.7B-Instruct/model.safetensors.index.json +225 -0
  37. chatbot/models/HuggingFaceTB/SmolLM2-1.7B-Instruct/special_tokens_map.json +34 -0
  38. chatbot/models/HuggingFaceTB/SmolLM2-1.7B-Instruct/tokenizer.json +0 -0
  39. chatbot/models/HuggingFaceTB/SmolLM2-1.7B-Instruct/tokenizer_config.json +154 -0
  40. chatbot/models/HuggingFaceTB/SmolLM2-1.7B-Instruct/vocab.json +0 -0
  41. chatbot/models/custom_model/config.json +2026 -0
  42. chatbot/models/custom_model/model.safetensors +3 -0
  43. chatbot/models/microsoft/Phi-3.5-mini-instruct/added_tokens.json +13 -0
  44. chatbot/models/microsoft/Phi-3.5-mini-instruct/config.json +138 -0
  45. chatbot/models/microsoft/Phi-3.5-mini-instruct/generation_config.json +11 -0
  46. chatbot/models/microsoft/Phi-3.5-mini-instruct/model-00001-of-00004.safetensors +3 -0
  47. chatbot/models/microsoft/Phi-3.5-mini-instruct/model-00002-of-00004.safetensors +3 -0
  48. chatbot/models/microsoft/Phi-3.5-mini-instruct/model-00003-of-00004.safetensors +3 -0
  49. chatbot/models/microsoft/Phi-3.5-mini-instruct/model-00004-of-00004.safetensors +3 -0
  50. chatbot/models/microsoft/Phi-3.5-mini-instruct/model.safetensors.index.json +202 -0
.gitkeep ADDED
File without changes
.ipynb_checkpoints/model fine-tunning-checkpoint.ipynb ADDED
@@ -0,0 +1,923 @@
1
+ {
2
+ "cells": [
3
+ {
4
+ "cell_type": "markdown",
5
+ "id": "96d10e25-bc25-4058-baaa-a74ab15af266",
6
+ "metadata": {},
7
+ "source": [
8
+ "# Kaggle API Setup for TensorFlow and Keras NLP Project\n",
9
+ "Gemma is a family of lightweight, state-of-the-art open models built from the same research and technology used to create the Gemini models.\n",
10
+ "\n",
11
+ "## Introduction\n",
12
+ "In this notebook, we will prepare the environment to download and utilize the Gemma model in a TensorFlow and Keras NLP project. We will begin by importing the necessary libraries, followed by configuring the Kaggle API credentials to enable seamless access to the required datasets.\n",
13
+ "\n",
14
+ "## 1. Importing Required Libraries\n",
15
+ "To start, we will import the essential libraries for this project, including TensorFlow, Keras, and Keras NLP, which are crucial for building and deploying NLP models.\n"
16
+ ]
17
+ },
18
+ {
19
+ "cell_type": "code",
20
+ "execution_count": null,
21
+ "id": "996572a0-edb7-49c9-89cc-27ad9b5a42d2",
22
+ "metadata": {
23
+ "scrolled": true
24
+ },
25
+ "outputs": [],
26
+ "source": [
27
+ "# Import required libraries\n",
28
+ "\n",
29
+ "import os\n",
30
+ "import tensorflow as tf\n",
31
+ "import keras_nlp\n",
32
+ "import keras\n",
33
+ "import json\n",
34
+ "\n",
35
+ "# Ignore Warnings\n",
36
+ "from silence_tensorflow import silence_tensorflow\n",
37
+ "silence_tensorflow()\n",
38
+ "import warnings\n",
39
+ "warnings.filterwarnings(\"ignore\")\n",
40
+ "os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3' "
41
+ ]
42
+ },
43
+ {
44
+ "cell_type": "code",
45
+ "execution_count": 2,
46
+ "id": "29ec8457-c4b5-4017-b190-812e0b2428f5",
47
+ "metadata": {},
48
+ "outputs": [],
49
+ "source": [
50
+ "# Set Kaggle API credentials\n",
51
+ "\n",
52
+ "os.environ[\"KAGGLE_USERNAME\"] = \"rogerkorantenng\"\n",
53
+ "os.environ[\"KAGGLE_KEY\"] = \"9a33b6e88bcb6058b1281d777fa6808d\""
54
+ ]
55
+ },
56
+ {
57
+ "cell_type": "markdown",
58
+ "id": "0145d6ca-9424-4c32-9e87-7cbac86cf65f",
59
+ "metadata": {},
60
+ "source": [
61
+ "## 2. Building and Compiling the Gemma Model\n",
62
+ "\n",
63
+ "In this section, we will build and compile the Gemma model, a language model designed for natural language processing (NLP) tasks. The process involves several key steps: loading the pre-trained model, enabling fine-tuning with LoRA (Low-Rank Adaptation), configuring the input sequence length, setting up the optimizer, and compiling the model for training.\n"
64
+ ]
65
+ },
66
+ {
67
+ "cell_type": "code",
68
+ "execution_count": 3,
69
+ "id": "e8045579-6e77-4879-b630-d1a9d2550a4d",
70
+ "metadata": {},
71
+ "outputs": [],
72
+ "source": [
73
+ "def get_compiled_model():\n",
74
+ " gemma_lm = keras_nlp.models.GemmaCausalLM.from_preset(\"gemma_2b_en\")\n",
75
+ " gemma_lm.summary()\n",
76
+ "\n",
77
+ " gemma_lm.backbone.enable_lora(rank=4)\n",
78
+ " gemma_lm.summary()\n",
79
+ " \n",
80
+ " # Set the sequence length to 128 before using the model.\n",
81
+ " gemma_lm.preprocessor.sequence_length = 256\n",
82
+ " \n",
83
+ " # Use AdamW (a common optimizer for transformer models).\n",
84
+ " optimizer = keras.optimizers.AdamW(\n",
85
+ " learning_rate=5e-5,\n",
86
+ " weight_decay=0.01,\n",
87
+ " )\n",
88
+ " \n",
89
+ " # Exclude layernorm and bias terms from decay.\n",
90
+ " optimizer.exclude_from_weight_decay(var_names=[\"bias\", \"scale\"])\n",
91
+ " \n",
92
+ " gemma_lm.compile(\n",
93
+ " loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),\n",
94
+ " optimizer=optimizer,\n",
95
+ " weighted_metrics=[keras.metrics.SparseCategoricalAccuracy()],\n",
96
+ " )\n",
97
+ "\n",
98
+ " \n",
99
+ " return gemma_lm"
100
+ ]
101
+ },
102
+ {
103
+ "cell_type": "markdown",
104
+ "id": "6475baba-ce54-40ec-a484-22629936c845",
105
+ "metadata": {},
106
+ "source": [
107
+ "## 3. Loading and Processing the Dataset\n",
108
+ "\n",
109
+ "In this section, we will define a function to load and process a JSON dataset. The dataset is read line by line, and each line is parsed and formatted according to the required structure. The function returns a list of formatted examples that can be used for training or analysis.\n"
110
+ ]
111
+ },
112
+ {
113
+ "cell_type": "code",
114
+ "execution_count": 4,
115
+ "id": "238f65b1-75c2-42a1-abd3-b4d8c7c36799",
116
+ "metadata": {},
117
+ "outputs": [],
118
+ "source": [
119
+ "def get_dataset():\n",
120
+ " # Initialize an empty list to hold the processed data.\n",
121
+ " data = []\n",
122
+ " \n",
123
+ " # Open and read the JSON file line by line.\n",
124
+ " with open('/project/data/combined_dataset.json') as file:\n",
125
+ " for line in file:\n",
126
+ " features = json.loads(line)\n",
127
+ " \n",
128
+ " # Filter out examples without \"Context\".\n",
129
+ " if not features.get(\"Context\"):\n",
130
+ " continue\n",
131
+ " \n",
132
+ " # Format the example as a string.\n",
133
+ " template = \"Instruction:\\n{Context}\\n\\nResponse:\\n{Response}\"\n",
134
+ " formatted_example = template.format(**features)\n",
135
+ " \n",
136
+ " # Append the formatted example to the data list.\n",
137
+ " data.append(formatted_example)\n",
138
+ " \n",
139
+ " return data "
140
+ ]
141
+ },
142
+ {
143
+ "cell_type": "code",
144
+ "execution_count": 5,
145
+ "id": "d589e374-cc82-4132-9613-23ee064aa10e",
146
+ "metadata": {},
147
+ "outputs": [
148
+ {
149
+ "name": "stderr",
150
+ "output_type": "stream",
151
+ "text": [
152
+ "2024-08-30 16:13:39.953235: I tensorflow/core/common_runtime/gpu/gpu_device.cc:2021] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13775 MB memory: -> device: 0, name: Tesla T4, pci bus id: 0000:18:00.0, compute capability: 7.5\n",
153
+ "2024-08-30 16:13:39.954747: I tensorflow/core/common_runtime/gpu/gpu_device.cc:2021] Created device /job:localhost/replica:0/task:0/device:GPU:1 with 13775 MB memory: -> device: 1, name: Tesla T4, pci bus id: 0000:19:00.0, compute capability: 7.5\n",
154
+ "2024-08-30 16:13:39.956103: I tensorflow/core/common_runtime/gpu/gpu_device.cc:2021] Created device /job:localhost/replica:0/task:0/device:GPU:2 with 13775 MB memory: -> device: 2, name: Tesla T4, pci bus id: 0000:35:00.0, compute capability: 7.5\n",
155
+ "2024-08-30 16:13:39.957459: I tensorflow/core/common_runtime/gpu/gpu_device.cc:2021] Created device /job:localhost/replica:0/task:0/device:GPU:3 with 13775 MB memory: -> device: 3, name: Tesla T4, pci bus id: 0000:36:00.0, compute capability: 7.5\n",
156
+ "2024-08-30 16:13:39.958812: I tensorflow/core/common_runtime/gpu/gpu_device.cc:2021] Created device /job:localhost/replica:0/task:0/device:GPU:4 with 13775 MB memory: -> device: 4, name: Tesla T4, pci bus id: 0000:e7:00.0, compute capability: 7.5\n",
157
+ "2024-08-30 16:13:39.960166: I tensorflow/core/common_runtime/gpu/gpu_device.cc:2021] Created device /job:localhost/replica:0/task:0/device:GPU:5 with 13775 MB memory: -> device: 5, name: Tesla T4, pci bus id: 0000:e8:00.0, compute capability: 7.5\n",
158
+ "2024-08-30 16:13:39.961521: I tensorflow/core/common_runtime/gpu/gpu_device.cc:2021] Created device /job:localhost/replica:0/task:0/device:GPU:6 with 13775 MB memory: -> device: 6, name: Tesla T4, pci bus id: 0000:f4:00.0, compute capability: 7.5\n",
159
+ "2024-08-30 16:13:39.962850: I tensorflow/core/common_runtime/gpu/gpu_device.cc:2021] Created device /job:localhost/replica:0/task:0/device:GPU:7 with 13775 MB memory: -> device: 7, name: Tesla T4, pci bus id: 0000:f5:00.0, compute capability: 7.5\n",
160
+ "normalizer.cc(51) LOG(INFO) precompiled_charsmap is empty. use identity normalization.\n"
161
+ ]
162
+ },
163
+ {
164
+ "data": {
165
+ "text/html": [
166
+ "<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\">Preprocessor: \"gemma_causal_lm_preprocessor\"</span>\n",
167
+ "</pre>\n"
168
+ ],
169
+ "text/plain": [
170
+ "\u001b[1mPreprocessor: \"gemma_causal_lm_preprocessor\"\u001b[0m\n"
171
+ ]
172
+ },
173
+ "metadata": {},
174
+ "output_type": "display_data"
175
+ },
176
+ {
177
+ "data": {
178
+ "text/html": [
179
+ "<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓\n",
180
+ "┃<span style=\"font-weight: bold\"> Tokenizer (type) </span>┃<span style=\"font-weight: bold\"> Vocab # </span>┃\n",
181
+ "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩\n",
182
+ "│ gemma_tokenizer (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">GemmaTokenizer</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">256,000</span> │\n",
183
+ "└────────────────────────────────────────────────────┴─────────────────────────────────────────────────────┘\n",
184
+ "</pre>\n"
185
+ ],
186
+ "text/plain": [
187
+ "┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓\n",
188
+ "┃\u001b[1m \u001b[0m\u001b[1mTokenizer (type) \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1m Vocab #\u001b[0m\u001b[1m \u001b[0m┃\n",
189
+ "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩\n",
190
+ "│ gemma_tokenizer (\u001b[38;5;33mGemmaTokenizer\u001b[0m) │ \u001b[38;5;34m256,000\u001b[0m │\n",
191
+ "└────────────────────────────────────────────────────┴─────────────────────────────────────────────────────┘\n"
192
+ ]
193
+ },
194
+ "metadata": {},
195
+ "output_type": "display_data"
196
+ },
197
+ {
198
+ "data": {
199
+ "text/html": [
200
+ "<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\">Model: \"gemma_causal_lm\"</span>\n",
201
+ "</pre>\n"
202
+ ],
203
+ "text/plain": [
204
+ "\u001b[1mModel: \"gemma_causal_lm\"\u001b[0m\n"
205
+ ]
206
+ },
207
+ "metadata": {},
208
+ "output_type": "display_data"
209
+ },
210
+ {
211
+ "data": {
212
+ "text/html": [
213
+ "<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓\n",
214
+ "┃<span style=\"font-weight: bold\"> Layer (type) </span>┃<span style=\"font-weight: bold\"> Output Shape </span>┃<span style=\"font-weight: bold\"> Param # </span>┃<span style=\"font-weight: bold\"> Connected to </span>┃\n",
215
+ "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩\n",
216
+ "│ padding_mask (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">InputLayer</span>) │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">0</span> │ - │\n",
217
+ "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n",
218
+ "│ token_ids (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">InputLayer</span>) │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">0</span> │ - │\n",
219
+ "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n",
220
+ "│ gemma_backbone │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">2048</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">2,506,172,416</span> │ padding_mask[<span style=\"color: #00af00; text-decoration-color: #00af00\">0</span>][<span style=\"color: #00af00; text-decoration-color: #00af00\">0</span>], │\n",
221
+ "│ (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">GemmaBackbone</span>) │ │ │ token_ids[<span style=\"color: #00af00; text-decoration-color: #00af00\">0</span>][<span style=\"color: #00af00; text-decoration-color: #00af00\">0</span>] │\n",
222
+ "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n",
223
+ "│ token_embedding │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">256000</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">524,288,000</span> │ gemma_backbone[<span style=\"color: #00af00; text-decoration-color: #00af00\">0</span>][<span style=\"color: #00af00; text-decoration-color: #00af00\">0</span>] │\n",
224
+ "│ (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">ReversibleEmbedding</span>) │ │ │ │\n",
225
+ "└───────────────────────────────┴───────────────────────────┴─────────────────┴────────────────────────────┘\n",
226
+ "</pre>\n"
227
+ ],
228
+ "text/plain": [
229
+ "┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓\n",
230
+ "┃\u001b[1m \u001b[0m\u001b[1mLayer (type) \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1mOutput Shape \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1m Param #\u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1mConnected to \u001b[0m\u001b[1m \u001b[0m┃\n",
231
+ "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩\n",
232
+ "│ padding_mask (\u001b[38;5;33mInputLayer\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;45mNone\u001b[0m) │ \u001b[38;5;34m0\u001b[0m │ - │\n",
233
+ "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n",
234
+ "│ token_ids (\u001b[38;5;33mInputLayer\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;45mNone\u001b[0m) │ \u001b[38;5;34m0\u001b[0m │ - │\n",
235
+ "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n",
236
+ "│ gemma_backbone │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m2048\u001b[0m) │ \u001b[38;5;34m2,506,172,416\u001b[0m │ padding_mask[\u001b[38;5;34m0\u001b[0m][\u001b[38;5;34m0\u001b[0m], │\n",
237
+ "│ (\u001b[38;5;33mGemmaBackbone\u001b[0m) │ │ │ token_ids[\u001b[38;5;34m0\u001b[0m][\u001b[38;5;34m0\u001b[0m] │\n",
238
+ "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n",
239
+ "│ token_embedding │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m256000\u001b[0m) │ \u001b[38;5;34m524,288,000\u001b[0m │ gemma_backbone[\u001b[38;5;34m0\u001b[0m][\u001b[38;5;34m0\u001b[0m] │\n",
240
+ "│ (\u001b[38;5;33mReversibleEmbedding\u001b[0m) │ │ │ │\n",
241
+ "└───────────────────────────────┴───────────────────────────┴─────────────────┴────────────────────────────┘\n"
242
+ ]
243
+ },
244
+ "metadata": {},
245
+ "output_type": "display_data"
246
+ },
247
+ {
248
+ "data": {
249
+ "text/html": [
250
+ "<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\"> Total params: </span><span style=\"color: #00af00; text-decoration-color: #00af00\">2,506,172,416</span> (9.34 GB)\n",
251
+ "</pre>\n"
252
+ ],
253
+ "text/plain": [
254
+ "\u001b[1m Total params: \u001b[0m\u001b[38;5;34m2,506,172,416\u001b[0m (9.34 GB)\n"
255
+ ]
256
+ },
257
+ "metadata": {},
258
+ "output_type": "display_data"
259
+ },
260
+ {
261
+ "data": {
262
+ "text/html": [
263
+ "<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\"> Trainable params: </span><span style=\"color: #00af00; text-decoration-color: #00af00\">2,506,172,416</span> (9.34 GB)\n",
264
+ "</pre>\n"
265
+ ],
266
+ "text/plain": [
267
+ "\u001b[1m Trainable params: \u001b[0m\u001b[38;5;34m2,506,172,416\u001b[0m (9.34 GB)\n"
268
+ ]
269
+ },
270
+ "metadata": {},
271
+ "output_type": "display_data"
272
+ },
273
+ {
274
+ "data": {
275
+ "text/html": [
276
+ "<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\"> Non-trainable params: </span><span style=\"color: #00af00; text-decoration-color: #00af00\">0</span> (0.00 B)\n",
277
+ "</pre>\n"
278
+ ],
279
+ "text/plain": [
280
+ "\u001b[1m Non-trainable params: \u001b[0m\u001b[38;5;34m0\u001b[0m (0.00 B)\n"
281
+ ]
282
+ },
283
+ "metadata": {},
284
+ "output_type": "display_data"
285
+ },
286
+ {
287
+ "data": {
288
+ "text/html": [
289
+ "<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\">Preprocessor: \"gemma_causal_lm_preprocessor\"</span>\n",
290
+ "</pre>\n"
291
+ ],
292
+ "text/plain": [
293
+ "\u001b[1mPreprocessor: \"gemma_causal_lm_preprocessor\"\u001b[0m\n"
294
+ ]
295
+ },
296
+ "metadata": {},
297
+ "output_type": "display_data"
298
+ },
299
+ {
300
+ "data": {
301
+ "text/html": [
302
+ "<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓\n",
303
+ "┃<span style=\"font-weight: bold\"> Tokenizer (type) </span>┃<span style=\"font-weight: bold\"> Vocab # </span>┃\n",
304
+ "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩\n",
305
+ "│ gemma_tokenizer (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">GemmaTokenizer</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">256,000</span> │\n",
306
+ "└────────────────────────────────────────────────────┴─────────────────────────────────────────────────────┘\n",
307
+ "</pre>\n"
308
+ ],
309
+ "text/plain": [
310
+ "┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓\n",
311
+ "┃\u001b[1m \u001b[0m\u001b[1mTokenizer (type) \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1m Vocab #\u001b[0m\u001b[1m \u001b[0m┃\n",
312
+ "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━��━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩\n",
313
+ "│ gemma_tokenizer (\u001b[38;5;33mGemmaTokenizer\u001b[0m) │ \u001b[38;5;34m256,000\u001b[0m │\n",
314
+ "└────────────────────────────────────────────────────┴─────────────────────────────────────────────────────┘\n"
315
+ ]
316
+ },
317
+ "metadata": {},
318
+ "output_type": "display_data"
319
+ },
320
+ {
321
+ "data": {
322
+ "text/html": [
323
+ "<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\">Model: \"gemma_causal_lm\"</span>\n",
324
+ "</pre>\n"
325
+ ],
326
+ "text/plain": [
327
+ "\u001b[1mModel: \"gemma_causal_lm\"\u001b[0m\n"
328
+ ]
329
+ },
330
+ "metadata": {},
331
+ "output_type": "display_data"
332
+ },
333
+ {
334
+ "data": {
335
+ "text/html": [
336
+ "<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓\n",
337
+ "┃<span style=\"font-weight: bold\"> Layer (type) </span>┃<span style=\"font-weight: bold\"> Output Shape </span>┃<span style=\"font-weight: bold\"> Param # </span>┃<span style=\"font-weight: bold\"> Connected to </span>┃\n",
338
+ "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩\n",
339
+ "│ padding_mask (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">InputLayer</span>) │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">0</span> │ - │\n",
340
+ "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n",
341
+ "│ token_ids (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">InputLayer</span>) │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">0</span> │ - │\n",
342
+ "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n",
343
+ "│ gemma_backbone │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">2048</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">2,507,536,384</span> │ padding_mask[<span style=\"color: #00af00; text-decoration-color: #00af00\">0</span>][<span style=\"color: #00af00; text-decoration-color: #00af00\">0</span>], │\n",
344
+ "│ (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">GemmaBackbone</span>) │ │ │ token_ids[<span style=\"color: #00af00; text-decoration-color: #00af00\">0</span>][<span style=\"color: #00af00; text-decoration-color: #00af00\">0</span>] │\n",
345
+ "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n",
346
+ "│ token_embedding │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">256000</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">524,288,000</span> │ gemma_backbone[<span style=\"color: #00af00; text-decoration-color: #00af00\">0</span>][<span style=\"color: #00af00; text-decoration-color: #00af00\">0</span>] │\n",
347
+ "│ (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">ReversibleEmbedding</span>) │ │ │ │\n",
348
+ "└───────────────────────────────┴───────────────────────────┴─────────────────┴────────────────────────────┘\n",
349
+ "</pre>\n"
350
+ ],
351
+ "text/plain": [
352
+ "┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓\n",
353
+ "┃\u001b[1m \u001b[0m\u001b[1mLayer (type) \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1mOutput Shape \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1m Param #\u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1mConnected to \u001b[0m\u001b[1m \u001b[0m┃\n",
354
+ "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩\n",
355
+ "│ padding_mask (\u001b[38;5;33mInputLayer\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;45mNone\u001b[0m) │ \u001b[38;5;34m0\u001b[0m │ - │\n",
356
+ "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n",
357
+ "│ token_ids (\u001b[38;5;33mInputLayer\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;45mNone\u001b[0m) │ \u001b[38;5;34m0\u001b[0m │ - │\n",
358
+ "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n",
359
+ "│ gemma_backbone │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m2048\u001b[0m) │ \u001b[38;5;34m2,507,536,384\u001b[0m │ padding_mask[\u001b[38;5;34m0\u001b[0m][\u001b[38;5;34m0\u001b[0m], │\n",
360
+ "│ (\u001b[38;5;33mGemmaBackbone\u001b[0m) │ │ │ token_ids[\u001b[38;5;34m0\u001b[0m][\u001b[38;5;34m0\u001b[0m] │\n",
361
+ "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n",
362
+ "│ token_embedding │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m256000\u001b[0m) │ \u001b[38;5;34m524,288,000\u001b[0m │ gemma_backbone[\u001b[38;5;34m0\u001b[0m][\u001b[38;5;34m0\u001b[0m] │\n",
363
+ "│ (\u001b[38;5;33mReversibleEmbedding\u001b[0m) │ │ │ │\n",
364
+ "└───────────────────────────────┴───────────────────────────┴─────────────────┴────────────────────────────┘\n"
365
+ ]
366
+ },
367
+ "metadata": {},
368
+ "output_type": "display_data"
369
+ },
370
+ {
371
+ "data": {
372
+ "text/html": [
373
+ "<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\"> Total params: </span><span style=\"color: #00af00; text-decoration-color: #00af00\">2,507,536,384</span> (9.34 GB)\n",
374
+ "</pre>\n"
375
+ ],
376
+ "text/plain": [
377
+ "\u001b[1m Total params: \u001b[0m\u001b[38;5;34m2,507,536,384\u001b[0m (9.34 GB)\n"
378
+ ]
379
+ },
380
+ "metadata": {},
381
+ "output_type": "display_data"
382
+ },
383
+ {
384
+ "data": {
385
+ "text/html": [
386
+ "<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\"> Trainable params: </span><span style=\"color: #00af00; text-decoration-color: #00af00\">1,363,968</span> (5.20 MB)\n",
387
+ "</pre>\n"
388
+ ],
389
+ "text/plain": [
390
+ "\u001b[1m Trainable params: \u001b[0m\u001b[38;5;34m1,363,968\u001b[0m (5.20 MB)\n"
391
+ ]
392
+ },
393
+ "metadata": {},
394
+ "output_type": "display_data"
395
+ },
396
+ {
397
+ "data": {
398
+ "text/html": [
399
+ "<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\"> Non-trainable params: </span><span style=\"color: #00af00; text-decoration-color: #00af00\">2,506,172,416</span> (9.34 GB)\n",
400
+ "</pre>\n"
401
+ ],
402
+ "text/plain": [
403
+ "\u001b[1m Non-trainable params: \u001b[0m\u001b[38;5;34m2,506,172,416\u001b[0m (9.34 GB)\n"
404
+ ]
405
+ },
406
+ "metadata": {},
407
+ "output_type": "display_data"
408
+ }
409
+ ],
410
+ "source": [
411
+ "# Get Model and Compile\n",
412
+ "model = get_compiled_model()\n",
413
+ "\n",
414
+ "# Get the dataset outside the strategy scope.\n",
415
+ "data = get_dataset()"
416
+ ]
417
+ },
418
+ {
419
+ "cell_type": "markdown",
420
+ "id": "50106081-cd9e-4246-974e-a4e7db99be97",
421
+ "metadata": {},
422
+ "source": [
423
+ "## 4. Defining the Prompt Template and Generating Responses\n",
424
+ "\n",
425
+ "In this section, we define a template for creating prompts that the language model will use to generate responses. The template includes placeholders for an 'instruction' and a 'response'. We then format this template with actual data to create a complete prompt. Finally, the prompt is passed to the language model to generate a response.\n"
426
+ ]
427
+ },
428
+ {
429
+ "cell_type": "code",
430
+ "execution_count": 6,
431
+ "id": "47a8c8da-b73a-4684-9199-176b4fa9bd3d",
432
+ "metadata": {},
433
+ "outputs": [
434
+ {
435
+ "name": "stderr",
436
+ "output_type": "stream",
437
+ "text": [
438
+ "2024-08-30 16:13:57.828298: E tensorflow/core/util/util.cc:131] oneDNN supports DT_INT64 only on platforms with AVX-512. Falling back to the default Eigen-based implementation if present.\n",
439
+ "WARNING: All log messages before absl::InitializeLog() is called are written to STDERR\n",
440
+ "I0000 00:00:1725034446.797146 18073 service.cc:146] XLA service 0x5a7be0cb31d0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:\n",
441
+ "I0000 00:00:1725034446.797176 18073 service.cc:154] StreamExecutor device (0): Tesla T4, Compute Capability 7.5\n",
442
+ "I0000 00:00:1725034446.797180 18073 service.cc:154] StreamExecutor device (1): Tesla T4, Compute Capability 7.5\n",
443
+ "I0000 00:00:1725034446.797182 18073 service.cc:154] StreamExecutor device (2): Tesla T4, Compute Capability 7.5\n",
444
+ "I0000 00:00:1725034446.797185 18073 service.cc:154] StreamExecutor device (3): Tesla T4, Compute Capability 7.5\n",
445
+ "I0000 00:00:1725034446.797188 18073 service.cc:154] StreamExecutor device (4): Tesla T4, Compute Capability 7.5\n",
446
+ "I0000 00:00:1725034446.797190 18073 service.cc:154] StreamExecutor device (5): Tesla T4, Compute Capability 7.5\n",
447
+ "I0000 00:00:1725034446.797193 18073 service.cc:154] StreamExecutor device (6): Tesla T4, Compute Capability 7.5\n",
448
+ "I0000 00:00:1725034446.797196 18073 service.cc:154] StreamExecutor device (7): Tesla T4, Compute Capability 7.5\n",
449
+ "2024-08-30 16:14:07.351952: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:268] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.\n",
450
+ "2024-08-30 16:14:09.094695: I external/local_xla/xla/stream_executor/cuda/cuda_dnn.cc:531] Loaded cuDNN version 8905\n",
451
+ "I0000 00:00:1725034456.429832 18073 device_compiler.h:188] Compiled cluster using XLA! This line is logged at most once for the lifetime of the process.\n"
452
+ ]
453
+ },
454
+ {
455
+ "name": "stdout",
456
+ "output_type": "stream",
457
+ "text": [
458
+ "Instruction:\n",
459
+ "I'm going through some things with my feelings and myself. I barely sleep and I do nothing but think about how I'm worthless and how I shouldn't be here.\n",
460
+ " I've never tried or contemplated suicide. I've always wanted to fix my issues, but I never get around to it.\n",
461
+ " How can I change my feeling of being worthless to everyone?\n",
462
+ "\n",
463
+ "Response:\n",
464
+ "I'm sorry to hear that you're going through some things. I'm not sure what you mean by \"I barely sleep and I do nothing but think about how I'm worthless and how I shouldn't be here.\" I'm not sure what you mean by \"I've never tried or contemplated suicide.\" I'm not sure what you mean by \"I've always wanted to fix my issues, but I never get around to it.\" I'm not sure what you mean by \"How can I change my feeling of being worthless to everyone?\"\n",
465
+ "\n",
466
+ "I'm not sure what you mean by \"I'm sorry to hear that you're going through some things.\"\n",
467
+ "\n",
468
+ "I'm not sure what you mean by \"I'm not sure what you mean by 'I barely sleep\n"
469
+ ]
470
+ }
471
+ ],
472
+ "source": [
473
+ "# Define the template with placeholders for 'instruction' and 'response'\n",
474
+ "template = \"Instruction:\\n{instruction}\\n\\nResponse:\\n{response}\"\n",
475
+ "\n",
476
+ "# Create the prompt by formatting the template with actual data\n",
477
+ "prompt = template.format(\n",
478
+ " instruction=\"I'm going through some things with my feelings and myself. I barely sleep and I do nothing but think about how I'm worthless and how I shouldn't be here.\\n I've never tried or contemplated suicide. I've always wanted to fix my issues, but I never get around to it.\\n How can I change my feeling of being worthless to everyone?\",\n",
479
+ " response=\"\",\n",
480
+ ")\n",
481
+ "\n",
482
+ "# Assuming gemma_lm is a language model that you're using to generate text\n",
483
+ "print(model.generate(prompt, max_length=256))\n"
484
+ ]
485
+ },
486
+ {
487
+ "cell_type": "markdown",
488
+ "id": "9784a448-b7a2-40e7-8162-3caa3ac64d22",
489
+ "metadata": {},
490
+ "source": [
491
+ "## 5. Model Fine Tunning\n",
492
+ "\n",
493
+ "In this section, we compile the model, prepare the dataset, and then train the model using the data. We will walk through the steps of obtaining the compiled model, loading the dataset, and fitting the model to the data.\n"
494
+ ]
495
+ },
496
+ {
497
+ "cell_type": "code",
498
+ "execution_count": 7,
499
+ "id": "9355a347-5893-4f94-bff8-09cb126d818d",
500
+ "metadata": {},
501
+ "outputs": [
502
+ {
503
+ "name": "stdout",
504
+ "output_type": "stream",
505
+ "text": [
506
+ "Epoch 1/40\n"
507
+ ]
508
+ },
509
+ {
510
+ "name": "stderr",
511
+ "output_type": "stream",
512
+ "text": [
513
+ "W0000 00:00:1725034506.217331 18546 assert_op.cc:38] Ignoring Assert operator compile_loss/sparse_categorical_crossentropy/SparseSoftmaxCrossEntropyWithLogits/assert_equal_1/Assert/Assert\n",
514
+ "2024-08-30 16:15:23.759806: I external/local_xla/xla/stream_executor/cuda/cuda_asm_compiler.cc:393] ptxas warning : Registers are spilled to local memory in function 'loop_add_subtract_fusion_2', 220 bytes spill stores, 220 bytes spill loads\n",
515
+ "\n"
516
+ ]
517
+ },
518
+ {
519
+ "name": "stdout",
520
+ "output_type": "stream",
521
+ "text": [
522
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2455s\u001b[0m 682ms/step - loss: 2.0513 - sparse_categorical_accuracy: 0.4572\n",
523
+ "Epoch 2/40\n",
524
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2398s\u001b[0m 683ms/step - loss: 1.9421 - sparse_categorical_accuracy: 0.4760\n",
525
+ "Epoch 3/40\n",
526
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2398s\u001b[0m 683ms/step - loss: 1.8639 - sparse_categorical_accuracy: 0.4929\n",
527
+ "Epoch 4/40\n",
528
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2396s\u001b[0m 682ms/step - loss: 1.7974 - sparse_categorical_accuracy: 0.5083\n",
529
+ "Epoch 5/40\n",
530
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2397s\u001b[0m 683ms/step - loss: 1.7301 - sparse_categorical_accuracy: 0.5239\n",
531
+ "Epoch 6/40\n",
532
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2396s\u001b[0m 682ms/step - loss: 1.6699 - sparse_categorical_accuracy: 0.5380\n",
533
+ "Epoch 7/40\n",
534
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2397s\u001b[0m 682ms/step - loss: 1.6151 - sparse_categorical_accuracy: 0.5512\n",
535
+ "Epoch 8/40\n",
536
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2399s\u001b[0m 683ms/step - loss: 1.5641 - sparse_categorical_accuracy: 0.5633\n",
537
+ "Epoch 9/40\n",
538
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2393s\u001b[0m 681ms/step - loss: 1.5194 - sparse_categorical_accuracy: 0.5776\n",
539
+ "Epoch 10/40\n",
540
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━��\u001b[0m\u001b[37m\u001b[0m \u001b[1m2392s\u001b[0m 681ms/step - loss: 1.4795 - sparse_categorical_accuracy: 0.5869\n",
541
+ "Epoch 11/40\n",
542
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2393s\u001b[0m 681ms/step - loss: 1.4477 - sparse_categorical_accuracy: 0.5949\n",
543
+ "Epoch 12/40\n",
544
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2395s\u001b[0m 682ms/step - loss: 1.4191 - sparse_categorical_accuracy: 0.6025\n",
545
+ "Epoch 13/40\n",
546
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2397s\u001b[0m 683ms/step - loss: 1.3948 - sparse_categorical_accuracy: 0.6080\n",
547
+ "Epoch 14/40\n",
548
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2397s\u001b[0m 682ms/step - loss: 1.3707 - sparse_categorical_accuracy: 0.6142\n",
549
+ "Epoch 15/40\n",
550
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2398s\u001b[0m 683ms/step - loss: 1.3508 - sparse_categorical_accuracy: 0.6195\n",
551
+ "Epoch 16/40\n",
552
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2397s\u001b[0m 682ms/step - loss: 1.3308 - sparse_categorical_accuracy: 0.6236\n",
553
+ "Epoch 17/40\n",
554
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2400s\u001b[0m 683ms/step - loss: 1.3068 - sparse_categorical_accuracy: 0.6303\n",
555
+ "Epoch 18/40\n",
556
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2400s\u001b[0m 683ms/step - loss: 1.2879 - sparse_categorical_accuracy: 0.6350\n",
557
+ "Epoch 19/40\n",
558
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2401s\u001b[0m 683ms/step - loss: 1.2676 - sparse_categorical_accuracy: 0.6395\n",
559
+ "Epoch 20/40\n",
560
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2396s\u001b[0m 682ms/step - loss: 1.2474 - sparse_categorical_accuracy: 0.6444\n",
561
+ "Epoch 21/40\n",
562
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2395s\u001b[0m 682ms/step - loss: 1.2283 - sparse_categorical_accuracy: 0.6491\n",
563
+ "Epoch 22/40\n",
564
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2398s\u001b[0m 683ms/step - loss: 1.2086 - sparse_categorical_accuracy: 0.6543\n",
565
+ "Epoch 23/40\n",
566
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2399s\u001b[0m 683ms/step - loss: 1.1896 - sparse_categorical_accuracy: 0.6593\n",
567
+ "Epoch 24/40\n",
568
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2400s\u001b[0m 683ms/step - loss: 1.1706 - sparse_categorical_accuracy: 0.6644\n",
569
+ "Epoch 25/40\n",
570
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2399s\u001b[0m 683ms/step - loss: 1.1508 - sparse_categorical_accuracy: 0.6695\n",
571
+ "Epoch 26/40\n",
572
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2401s\u001b[0m 683ms/step - loss: 1.1322 - sparse_categorical_accuracy: 0.6744\n",
573
+ "Epoch 27/40\n",
574
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2400s\u001b[0m 683ms/step - loss: 1.1152 - sparse_categorical_accuracy: 0.6789\n",
575
+ "Epoch 28/40\n",
576
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2397s\u001b[0m 682ms/step - loss: 1.0921 - sparse_categorical_accuracy: 0.6851\n",
577
+ "Epoch 29/40\n",
578
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2396s\u001b[0m 682ms/step - loss: 1.0791 - sparse_categorical_accuracy: 0.6881\n",
579
+ "Epoch 30/40\n",
580
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2398s\u001b[0m 683ms/step - loss: 1.0581 - sparse_categorical_accuracy: 0.6941\n",
581
+ "Epoch 31/40\n",
582
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2399s\u001b[0m 683ms/step - loss: 1.0382 - sparse_categorical_accuracy: 0.6994\n",
583
+ "Epoch 32/40\n",
584
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2401s\u001b[0m 684ms/step - loss: 1.0208 - sparse_categorical_accuracy: 0.7045\n",
585
+ "Epoch 33/40\n",
586
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2402s\u001b[0m 684ms/step - loss: 1.0037 - sparse_categorical_accuracy: 0.7089\n",
587
+ "Epoch 34/40\n",
588
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2402s\u001b[0m 684ms/step - loss: 0.9862 - sparse_categorical_accuracy: 0.7137\n",
589
+ "Epoch 35/40\n",
590
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2405s\u001b[0m 685ms/step - loss: 0.9688 - sparse_categorical_accuracy: 0.7183\n",
591
+ "Epoch 36/40\n",
592
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2402s\u001b[0m 684ms/step - loss: 0.9554 - sparse_categorical_accuracy: 0.7219\n",
593
+ "Epoch 37/40\n",
594
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2402s\u001b[0m 684ms/step - loss: 0.9479 - sparse_categorical_accuracy: 0.7239\n",
595
+ "Epoch 38/40\n",
596
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2404s\u001b[0m 685ms/step - loss: 0.9224 - sparse_categorical_accuracy: 0.7313\n",
597
+ "Epoch 39/40\n",
598
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2408s\u001b[0m 686ms/step - loss: 0.9132 - sparse_categorical_accuracy: 0.7335\n",
599
+ "Epoch 40/40\n",
600
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2410s\u001b[0m 686ms/step - loss: 0.8930 - sparse_categorical_accuracy: 0.7399\n"
601
+ ]
602
+ },
603
+ {
604
+ "data": {
605
+ "text/plain": [
606
+ "<keras.src.callbacks.history.History at 0x7b0d2c394a00>"
607
+ ]
608
+ },
609
+ "execution_count": 7,
610
+ "metadata": {},
611
+ "output_type": "execute_result"
612
+ }
613
+ ],
614
+ "source": [
615
+ "# Fit the model using the data.\n",
616
+ "model.fit(data, epochs=40, batch_size=1, verbose=1)"
617
+ ]
618
+ },
619
+ {
620
+ "cell_type": "markdown",
621
+ "id": "8ca51424-9aa2-4933-9f97-106eec5347c5",
622
+ "metadata": {},
623
+ "source": [
624
+ "## 6. Generating a Response from the Fine-Tuned Language Model\n",
625
+ "\n",
626
+ "In this section, we will define a template for generating prompts and use the language model `gemma_lm` to generate a response based on the provided instruction. This process involves creating a formatted prompt and then using the model to produce a response.\n"
627
+ ]
628
+ },
629
+ {
630
+ "cell_type": "code",
631
+ "execution_count": 9,
632
+ "id": "9199e8bd-3269-4b4a-bb8b-78399965883f",
633
+ "metadata": {},
634
+ "outputs": [
635
+ {
636
+ "name": "stdout",
637
+ "output_type": "stream",
638
+ "text": [
639
+ "Instruction:\n",
640
+ "I'm going through some things with my feelings and myself. I barely sleep and I do nothing but think about how I'm worthless and how I shouldn't be here.\n",
641
+ " I've never tried or contemplated suicide. I've always wanted to fix my issues, but I never get around to it.\n",
642
+ " How can I change my feeling of being worthless to everyone?\n",
643
+ "\n",
644
+ "Response:\n",
645
+ "It sounds like you are having a really tough time with feeling this way.  Feeling this way is not normal and it is important to talk about these feelings with someone.  You are not worthless, you are a wonderful person who is going through some tough times.  Therapy can help you to work through these issues and come to a place of self-love and acceptance.  You can do this!  \n"
646
+ ]
647
+ }
648
+ ],
649
+ "source": [
650
+ "# Define the template with placeholders for 'instruction' and 'response'\n",
651
+ "template = \"Instruction:\\n{instruction}\\n\\nResponse:\\n{response}\"\n",
652
+ "\n",
653
+ "# Create the prompt by formatting the template with actual data\n",
654
+ "prompt = template.format(\n",
655
+ " instruction=\"I'm going through some things with my feelings and myself. I barely sleep and I do nothing but think about how I'm worthless and how I shouldn't be here.\\n I've never tried or contemplated suicide. I've always wanted to fix my issues, but I never get around to it.\\n How can I change my feeling of being worthless to everyone?\",\n",
656
+ " response=\"\",\n",
657
+ ")\n",
658
+ "\n",
659
+ "# Assuming gemma_lm is a language model that you're using to generate text\n",
660
+ "print(model.generate(prompt, max_length=256))\n"
661
+ ]
662
+ },
663
+ {
664
+ "cell_type": "code",
665
+ "execution_count": null,
666
+ "id": "b418c2e7-af66-4625-887d-0c1e5d1f4464",
667
+ "metadata": {},
668
+ "outputs": [
669
+ {
670
+ "name": "stdout",
671
+ "output_type": "stream",
672
+ "text": [
673
+ "Epoch 1/90\n",
674
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2388s\u001b[0m 680ms/step - loss: 0.8785 - sparse_categorical_accuracy: 0.7438\n",
675
+ "Epoch 2/90\n",
676
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2388s\u001b[0m 680ms/step - loss: 0.8661 - sparse_categorical_accuracy: 0.7473\n",
677
+ "Epoch 3/90\n",
678
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2384s\u001b[0m 679ms/step - loss: 0.8522 - sparse_categorical_accuracy: 0.7514\n",
679
+ "Epoch 4/90\n",
680
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2380s\u001b[0m 678ms/step - loss: 0.8408 - sparse_categorical_accuracy: 0.7547\n",
681
+ "Epoch 5/90\n",
682
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2379s\u001b[0m 677ms/step - loss: 0.8266 - sparse_categorical_accuracy: 0.7585\n",
683
+ "Epoch 6/90\n",
684
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2380s\u001b[0m 678ms/step - loss: 0.8117 - sparse_categorical_accuracy: 0.7635\n",
685
+ "Epoch 7/90\n",
686
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2379s\u001b[0m 677ms/step - loss: 0.8048 - sparse_categorical_accuracy: 0.7653\n",
687
+ "Epoch 8/90\n",
688
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2382s\u001b[0m 678ms/step - loss: 0.7892 - sparse_categorical_accuracy: 0.7698\n",
689
+ "Epoch 9/90\n",
690
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2387s\u001b[0m 680ms/step - loss: 0.7790 - sparse_categorical_accuracy: 0.7724\n",
691
+ "Epoch 10/90\n",
692
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2387s\u001b[0m 680ms/step - loss: 0.7680 - sparse_categorical_accuracy: 0.7754\n",
693
+ "Epoch 11/90\n",
694
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2391s\u001b[0m 681ms/step - loss: 0.7569 - sparse_categorical_accuracy: 0.7786\n",
695
+ "Epoch 12/90\n",
696
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2392s\u001b[0m 681ms/step - loss: 0.7476 - sparse_categorical_accuracy: 0.7821\n",
697
+ "Epoch 13/90\n",
698
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2396s\u001b[0m 682ms/step - loss: 0.7373 - sparse_categorical_accuracy: 0.7844\n",
699
+ "Epoch 14/90\n",
700
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2399s\u001b[0m 683ms/step - loss: 0.7335 - sparse_categorical_accuracy: 0.7855\n",
701
+ "Epoch 15/90\n",
702
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2401s\u001b[0m 684ms/step - loss: 0.7246 - sparse_categorical_accuracy: 0.7883\n",
703
+ "Epoch 16/90\n",
704
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2405s\u001b[0m 685ms/step - loss: 0.6987 - sparse_categorical_accuracy: 0.7965\n",
705
+ "Epoch 17/90\n",
706
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2400s\u001b[0m 683ms/step - loss: 0.7001 - sparse_categorical_accuracy: 0.7956\n",
707
+ "Epoch 18/90\n",
708
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2400s\u001b[0m 683ms/step - loss: 0.6918 - sparse_categorical_accuracy: 0.7976\n",
709
+ "Epoch 19/90\n",
710
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2397s\u001b[0m 683ms/step - loss: 0.6806 - sparse_categorical_accuracy: 0.8007\n",
711
+ "Epoch 20/90\n",
712
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2395s\u001b[0m 682ms/step - loss: 0.6768 - sparse_categorical_accuracy: 0.8020\n",
713
+ "Epoch 21/90\n",
714
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2393s\u001b[0m 681ms/step - loss: 0.6644 - sparse_categorical_accuracy: 0.8058\n",
715
+ "Epoch 22/90\n",
716
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2390s\u001b[0m 680ms/step - loss: 0.6582 - sparse_categorical_accuracy: 0.8080\n",
717
+ "Epoch 23/90\n",
718
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2385s\u001b[0m 679ms/step - loss: 0.6479 - sparse_categorical_accuracy: 0.8107\n",
719
+ "Epoch 24/90\n",
720
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2380s\u001b[0m 678ms/step - loss: 0.6394 - sparse_categorical_accuracy: 0.8131\n",
721
+ "Epoch 25/90\n",
722
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2376s\u001b[0m 677ms/step - loss: 0.6351 - sparse_categorical_accuracy: 0.8141\n",
723
+ "Epoch 26/90\n",
724
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2372s\u001b[0m 675ms/step - loss: 0.6264 - sparse_categorical_accuracy: 0.8169\n",
725
+ "Epoch 27/90\n",
726
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2373s\u001b[0m 676ms/step - loss: 0.6198 - sparse_categorical_accuracy: 0.8193\n",
727
+ "Epoch 28/90\n",
728
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2367s\u001b[0m 674ms/step - loss: 0.6127 - sparse_categorical_accuracy: 0.8213\n",
729
+ "Epoch 29/90\n",
730
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2361s\u001b[0m 672ms/step - loss: 0.6048 - sparse_categorical_accuracy: 0.8236\n",
731
+ "Epoch 30/90\n",
732
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2358s\u001b[0m 671ms/step - loss: 0.6168 - sparse_categorical_accuracy: 0.8185\n",
733
+ "Epoch 31/90\n",
734
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2357s\u001b[0m 671ms/step - loss: 0.5947 - sparse_categorical_accuracy: 0.8266\n",
735
+ "Epoch 32/90\n",
736
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2356s\u001b[0m 671ms/step - loss: 0.5878 - sparse_categorical_accuracy: 0.8279\n",
737
+ "Epoch 33/90\n",
738
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2353s\u001b[0m 670ms/step - loss: 0.5781 - sparse_categorical_accuracy: 0.8315\n",
739
+ "Epoch 34/90\n",
740
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2349s\u001b[0m 669ms/step - loss: 0.5729 - sparse_categorical_accuracy: 0.8335\n",
741
+ "Epoch 35/90\n",
742
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2348s\u001b[0m 668ms/step - loss: 0.5684 - sparse_categorical_accuracy: 0.8344\n",
743
+ "Epoch 36/90\n",
744
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2346s\u001b[0m 668ms/step - loss: 0.5684 - sparse_categorical_accuracy: 0.8339\n",
745
+ "Epoch 37/90\n",
746
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2346s\u001b[0m 668ms/step - loss: 0.5550 - sparse_categorical_accuracy: 0.8389\n",
747
+ "Epoch 38/90\n",
748
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2343s\u001b[0m 667ms/step - loss: 0.5544 - sparse_categorical_accuracy: 0.8384\n",
749
+ "Epoch 39/90\n",
750
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2343s\u001b[0m 667ms/step - loss: 0.5454 - sparse_categorical_accuracy: 0.8413\n",
751
+ "Epoch 40/90\n",
752
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2343s\u001b[0m 667ms/step - loss: 0.5389 - sparse_categorical_accuracy: 0.8433\n",
753
+ "Epoch 41/90\n",
754
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2344s\u001b[0m 667ms/step - loss: 0.5456 - sparse_categorical_accuracy: 0.8406\n",
755
+ "Epoch 42/90\n",
756
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2347s\u001b[0m 668ms/step - loss: 0.5289 - sparse_categorical_accuracy: 0.8462\n",
757
+ "Epoch 43/90\n",
758
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2349s\u001b[0m 669ms/step - loss: 0.5264 - sparse_categorical_accuracy: 0.8465\n",
759
+ "Epoch 44/90\n",
760
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2356s\u001b[0m 671ms/step - loss: 0.5192 - sparse_categorical_accuracy: 0.8489\n",
761
+ "Epoch 45/90\n",
762
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2363s\u001b[0m 673ms/step - loss: 0.5127 - sparse_categorical_accuracy: 0.8513\n",
763
+ "Epoch 46/90\n",
764
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2369s\u001b[0m 674ms/step - loss: 0.5090 - sparse_categorical_accuracy: 0.8522\n",
765
+ "Epoch 47/90\n",
766
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2375s\u001b[0m 676ms/step - loss: 0.5033 - sparse_categorical_accuracy: 0.8538\n",
767
+ "Epoch 48/90\n",
768
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2378s\u001b[0m 677ms/step - loss: 0.5023 - sparse_categorical_accuracy: 0.8541\n",
769
+ "Epoch 49/90\n",
770
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2382s\u001b[0m 678ms/step - loss: 0.4946 - sparse_categorical_accuracy: 0.8565\n",
771
+ "Epoch 50/90\n",
772
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2386s\u001b[0m 679ms/step - loss: 0.4915 - sparse_categorical_accuracy: 0.8567\n",
773
+ "Epoch 51/90\n",
774
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2389s\u001b[0m 680ms/step - loss: 0.4842 - sparse_categorical_accuracy: 0.8597\n",
775
+ "Epoch 52/90\n",
776
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2392s\u001b[0m 681ms/step - loss: 0.4836 - sparse_categorical_accuracy: 0.8599\n",
777
+ "Epoch 53/90\n",
778
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2394s\u001b[0m 682ms/step - loss: 0.4772 - sparse_categorical_accuracy: 0.8611\n",
779
+ "Epoch 54/90\n",
780
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2397s\u001b[0m 683ms/step - loss: 0.4749 - sparse_categorical_accuracy: 0.8617\n",
781
+ "Epoch 55/90\n",
782
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2393s\u001b[0m 681ms/step - loss: 0.4785 - sparse_categorical_accuracy: 0.8603\n",
783
+ "Epoch 56/90\n",
784
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2389s\u001b[0m 680ms/step - loss: 0.4587 - sparse_categorical_accuracy: 0.8680\n",
785
+ "Epoch 57/90\n",
786
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2389s\u001b[0m 680ms/step - loss: 0.4649 - sparse_categorical_accuracy: 0.8655\n",
787
+ "Epoch 58/90\n",
788
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2385s\u001b[0m 679ms/step - loss: 0.4573 - sparse_categorical_accuracy: 0.8675\n",
789
+ "Epoch 59/90\n",
790
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2382s\u001b[0m 678ms/step - loss: 0.4545 - sparse_categorical_accuracy: 0.8689\n",
791
+ "Epoch 60/90\n",
792
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2376s\u001b[0m 676ms/step - loss: 0.4499 - sparse_categorical_accuracy: 0.8697\n",
793
+ "Epoch 61/90\n",
794
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2377s\u001b[0m 677ms/step - loss: 0.4482 - sparse_categorical_accuracy: 0.8698\n",
795
+ "Epoch 62/90\n",
796
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2370s\u001b[0m 675ms/step - loss: 0.4421 - sparse_categorical_accuracy: 0.8722\n",
797
+ "Epoch 63/90\n",
798
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2360s\u001b[0m 672ms/step - loss: 0.4370 - sparse_categorical_accuracy: 0.8739\n",
799
+ "Epoch 64/90\n",
800
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2352s\u001b[0m 670ms/step - loss: 0.4315 - sparse_categorical_accuracy: 0.8753\n",
801
+ "Epoch 65/90\n",
802
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2351s\u001b[0m 669ms/step - loss: 0.4342 - sparse_categorical_accuracy: 0.8747\n",
803
+ "Epoch 66/90\n",
804
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2355s\u001b[0m 670ms/step - loss: 0.4305 - sparse_categorical_accuracy: 0.8758\n",
805
+ "Epoch 67/90\n",
806
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2354s\u001b[0m 670ms/step - loss: 0.4266 - sparse_categorical_accuracy: 0.8771\n",
807
+ "Epoch 68/90\n",
808
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2351s\u001b[0m 669ms/step - loss: 0.4204 - sparse_categorical_accuracy: 0.8790\n",
809
+ "Epoch 69/90\n",
810
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2350s\u001b[0m 669ms/step - loss: 0.4176 - sparse_categorical_accuracy: 0.8795\n",
811
+ "Epoch 70/90\n",
812
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2346s\u001b[0m 668ms/step - loss: 0.4132 - sparse_categorical_accuracy: 0.8811\n",
813
+ "Epoch 71/90\n",
814
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2344s\u001b[0m 667ms/step - loss: 0.4143 - sparse_categorical_accuracy: 0.8804\n",
815
+ "Epoch 72/90\n",
816
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2342s\u001b[0m 667ms/step - loss: 0.4078 - sparse_categorical_accuracy: 0.8828\n",
817
+ "Epoch 73/90\n",
818
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2343s\u001b[0m 667ms/step - loss: 0.4041 - sparse_categorical_accuracy: 0.8837\n",
819
+ "Epoch 74/90\n",
820
+ "\u001b[1m3512/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m2341s\u001b[0m 666ms/step - loss: 0.4060 - sparse_categorical_accuracy: 0.8832\n",
821
+ "Epoch 75/90\n",
822
+ "\u001b[1m2888/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m━━━━\u001b[0m \u001b[1m6:55\u001b[0m 667ms/step - loss: 0.4078 - sparse_categorical_accuracy: 0.8826"
823
+ ]
824
+ }
825
+ ],
826
+ "source": [
827
+ "# Fit the model using the data.\n",
828
+ "model.fit(data, epochs=90, batch_size=1, verbose=1)"
829
+ ]
830
+ },
831
+ {
832
+ "cell_type": "code",
833
+ "execution_count": 20,
834
+ "id": "f5d19bd9-72c8-4519-933b-c94a25b1ee3d",
835
+ "metadata": {},
836
+ "outputs": [],
837
+ "source": [
838
+ "model.backbone.save_lora_weights(\"model2.lora.h5\")"
839
+ ]
840
+ },
841
+ {
842
+ "cell_type": "code",
843
+ "execution_count": 23,
844
+ "id": "1fda3d9d-488d-46a2-8f4e-b0b83f175426",
845
+ "metadata": {},
846
+ "outputs": [
847
+ {
848
+ "name": "stdout",
849
+ "output_type": "stream",
850
+ "text": [
851
+ "Instruction:\n",
852
+ "I'm going through some things with my feelings and myself. I barely sleep and I do nothing but think about how I'm worthless and how I shouldn't be here.\n",
853
+ " I've never tried or contemplated suicide. I've always wanted to fix my issues, but I never get around to it.\n",
854
+ " How can I change my feeling of being worthless to everyone?\n",
855
+ "\n",
856
+ "Response:\n",
857
+ "I'm glad you're willing to slow down, but it's a complicated feeling that can be challenging to identify and work through. I've heard it described as feeling like a heavy cloud or rain cloud, and it's good to know where you are when you want to know what to do about it.It is possible to know more about this feeling and why it's arising from time to time like this. It helps to know that what you're experiencing is likely to do with your sense of who you are and your sense of how important you are in people's lives. It's also helpful to know that this feeling is something that's happening deep within you, not something outside of you.It can be valuable to look at the relationships in your life and consider your place in them.\n"
858
+ ]
859
+ }
860
+ ],
861
+ "source": [
862
+ "# Define the template with placeholders for 'instruction' and 'response'\n",
863
+ "template = \"Instruction:\\n{instruction}\\n\\nResponse:\\n{response}\"\n",
864
+ "\n",
865
+ "# Create the prompt by formatting the template with actual data\n",
866
+ "prompt = template.format(\n",
867
+ " instruction=\"I'm going through some things with my feelings and myself. I barely sleep and I do nothing but think about how I'm worthless and how I shouldn't be here.\\n I've never tried or contemplated suicide. I've always wanted to fix my issues, but I never get around to it.\\n How can I change my feeling of being worthless to everyone?\",\n",
868
+ " response=\"\",\n",
869
+ ")\n",
870
+ "\n",
871
+ "# Assuming gemma_lm is a language model that you're using to generate text\n",
872
+ "print(model.generate(prompt, max_length=256))\n"
873
+ ]
874
+ },
875
+ {
876
+ "cell_type": "code",
877
+ "execution_count": null,
878
+ "id": "252c7c12-e763-4524-8cd4-7582431efb02",
879
+ "metadata": {},
880
+ "outputs": [
881
+ {
882
+ "name": "stdout",
883
+ "output_type": "stream",
884
+ "text": [
885
+ "\u001b[1m3136/3512\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m━━━\u001b[0m \u001b[1m4:11\u001b[0m 668ms/step - loss: 0.3683 - sparse_categorical_accuracy: 0.8940"
886
+ ]
887
+ }
888
+ ],
889
+ "source": [
890
+ "model.fit(data, epochs=1, batch_size=1, verbose=1)"
891
+ ]
892
+ },
893
+ {
894
+ "cell_type": "code",
895
+ "execution_count": null,
896
+ "id": "11bda333-9e55-4a62-9992-3d595d27ff54",
897
+ "metadata": {},
898
+ "outputs": [],
899
+ "source": []
900
+ }
901
+ ],
902
+ "metadata": {
903
+ "kernelspec": {
904
+ "display_name": "Python 3 (ipykernel)",
905
+ "language": "python",
906
+ "name": "python3"
907
+ },
908
+ "language_info": {
909
+ "codemirror_mode": {
910
+ "name": "ipython",
911
+ "version": 3
912
+ },
913
+ "file_extension": ".py",
914
+ "mimetype": "text/x-python",
915
+ "name": "python",
916
+ "nbconvert_exporter": "python",
917
+ "pygments_lexer": "ipython3",
918
+ "version": "3.10.12"
919
+ }
920
+ },
921
+ "nbformat": 4,
922
+ "nbformat_minor": 5
923
+ }
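
The cells above save the LoRA adapter with model.backbone.save_lora_weights("model2.lora.h5") but never show it being restored for inference. A minimal sketch of that step, not part of the original notebook: it assumes the gemma_2b_en preset used elsewhere in this repository, assumes the keras_nlp backbone exposes load_lora_weights as the counterpart of the save_lora_weights call above, and uses a placeholder LoRA rank of 4 (the rank actually used during training is not visible here).

import keras_nlp

# Rebuild the base model from the same preset used for fine-tuning.
gemma_lm = keras_nlp.models.GemmaCausalLM.from_preset("gemma_2b_en")

# Re-enable LoRA with the training-time rank, then attach the saved adapter.
gemma_lm.backbone.enable_lora(rank=4)                  # placeholder rank; must match training
gemma_lm.backbone.load_lora_weights("model2.lora.h5")  # assumed counterpart of save_lora_weights above

prompt = "Instruction:\nHow can I cope with feeling worthless?\n\nResponse:\n"
print(gemma_lm.generate(prompt, max_length=256))

A quick generation like this is a cheap sanity check that the adapter actually attached: the output should resemble the fine-tuned responses shown earlier rather than the base model's.
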
README.md CHANGED
@@ -1,12 +1,6 @@
1
  ---
2
- title: Mental Sage
3
- emoji: 🦀
4
- colorFrom: red
5
- colorTo: yellow
6
  sdk: gradio
7
  sdk_version: 5.5.0
8
- app_file: app.py
9
- pinned: false
10
  ---
11
-
12
- Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
 
1
  ---
2
+ title: Mental-Sage
3
+ app_file: main.py
 
 
4
  sdk: gradio
5
  sdk_version: 5.5.0
 
 
6
  ---
 
 
Untitled.ipynb ADDED
@@ -0,0 +1,138 @@
1
+ {
2
+ "cells": [
3
+ {
4
+ "cell_type": "code",
5
+ "execution_count": 4,
6
+ "id": "fac5022c-df54-4824-8c57-98fe045372fd",
7
+ "metadata": {},
8
+ "outputs": [
9
+ {
10
+ "name": "stderr",
11
+ "output_type": "stream",
12
+ "text": [
13
+ "2024-09-02 15:52:58.146880: I tensorflow/core/common_runtime/gpu/gpu_device.cc:2021] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 692 MB memory: -> device: 0, name: Tesla T4, pci bus id: 0000:18:00.0, compute capability: 7.5\n",
14
+ "2024-09-02 15:52:58.148409: I tensorflow/core/common_runtime/gpu/gpu_device.cc:2021] Created device /job:localhost/replica:0/task:0/device:GPU:1 with 13673 MB memory: -> device: 1, name: Tesla T4, pci bus id: 0000:19:00.0, compute capability: 7.5\n",
15
+ "2024-09-02 15:52:58.149801: I tensorflow/core/common_runtime/gpu/gpu_device.cc:2021] Created device /job:localhost/replica:0/task:0/device:GPU:2 with 13673 MB memory: -> device: 2, name: Tesla T4, pci bus id: 0000:35:00.0, compute capability: 7.5\n",
16
+ "2024-09-02 15:52:58.151206: I tensorflow/core/common_runtime/gpu/gpu_device.cc:2021] Created device /job:localhost/replica:0/task:0/device:GPU:3 with 13673 MB memory: -> device: 3, name: Tesla T4, pci bus id: 0000:36:00.0, compute capability: 7.5\n",
17
+ "2024-09-02 15:52:58.152545: I tensorflow/core/common_runtime/gpu/gpu_device.cc:2021] Created device /job:localhost/replica:0/task:0/device:GPU:4 with 13673 MB memory: -> device: 4, name: Tesla T4, pci bus id: 0000:e7:00.0, compute capability: 7.5\n",
18
+ "2024-09-02 15:52:58.153906: I tensorflow/core/common_runtime/gpu/gpu_device.cc:2021] Created device /job:localhost/replica:0/task:0/device:GPU:5 with 13673 MB memory: -> device: 5, name: Tesla T4, pci bus id: 0000:e8:00.0, compute capability: 7.5\n",
19
+ "2024-09-02 15:52:58.155220: I tensorflow/core/common_runtime/gpu/gpu_device.cc:2021] Created device /job:localhost/replica:0/task:0/device:GPU:6 with 13673 MB memory: -> device: 6, name: Tesla T4, pci bus id: 0000:f4:00.0, compute capability: 7.5\n",
20
+ "2024-09-02 15:52:58.156627: I tensorflow/core/common_runtime/gpu/gpu_device.cc:2021] Created device /job:localhost/replica:0/task:0/device:GPU:7 with 13673 MB memory: -> device: 7, name: Tesla T4, pci bus id: 0000:f5:00.0, compute capability: 7.5\n",
21
+ "2024-09-02 15:53:08.871520: W external/local_tsl/tsl/framework/bfc_allocator.cc:482] Allocator (GPU_0_bfc) ran out of memory trying to allocate 1.95GiB (rounded to 2097152000)requested by op StatelessRandomNormalV2\n",
22
+ "If the cause is memory fragmentation maybe the environment variable 'TF_GPU_ALLOCATOR=cuda_malloc_async' will improve the situation. \n",
23
+ "Current allocation summary follows.\n",
24
+ "Current allocation summary follows.\n",
25
+ "2024-09-02 15:53:08.871548: I external/local_tsl/tsl/framework/bfc_allocator.cc:1039] BFCAllocator dump for GPU_0_bfc\n",
26
+ "2024-09-02 15:53:08.871559: I external/local_tsl/tsl/framework/bfc_allocator.cc:1046] Bin (256): \tTotal Chunks: 6, Chunks in use: 6. 1.5KiB allocated for chunks. 1.5KiB in use in bin. 48B client-requested in use in bin.\n",
27
+ "2024-09-02 15:53:08.871566: I external/local_tsl/tsl/framework/bfc_allocator.cc:1046] Bin (512): \tTotal Chunks: 0, Chunks in use: 0. 0B allocated for chunks. 0B in use in bin. 0B client-requested in use in bin.\n",
28
+ "2024-09-02 15:53:08.871572: I external/local_tsl/tsl/framework/bfc_allocator.cc:1046] Bin (1024): \tTotal Chunks: 1, Chunks in use: 1. 1.2KiB allocated for chunks. 1.2KiB in use in bin. 1.0KiB client-requested in use in bin.\n",
29
+ "2024-09-02 15:53:08.871577: I external/local_tsl/tsl/framework/bfc_allocator.cc:1046] Bin (2048): \tTotal Chunks: 0, Chunks in use: 0. 0B allocated for chunks. 0B in use in bin. 0B client-requested in use in bin.\n",
30
+ "2024-09-02 15:53:08.871582: I external/local_tsl/tsl/framework/bfc_allocator.cc:1046] Bin (4096): \tTotal Chunks: 0, Chunks in use: 0. 0B allocated for chunks. 0B in use in bin. 0B client-requested in use in bin.\n",
31
+ "2024-09-02 15:53:08.871586: I external/local_tsl/tsl/framework/bfc_allocator.cc:1046] Bin (8192): \tTotal Chunks: 0, Chunks in use: 0. 0B allocated for chunks. 0B in use in bin. 0B client-requested in use in bin.\n",
32
+ "2024-09-02 15:53:08.871591: I external/local_tsl/tsl/framework/bfc_allocator.cc:1046] Bin (16384): \tTotal Chunks: 0, Chunks in use: 0. 0B allocated for chunks. 0B in use in bin. 0B client-requested in use in bin.\n",
33
+ "2024-09-02 15:53:08.871596: I external/local_tsl/tsl/framework/bfc_allocator.cc:1046] Bin (32768): \tTotal Chunks: 0, Chunks in use: 0. 0B allocated for chunks. 0B in use in bin. 0B client-requested in use in bin.\n",
34
+ "2024-09-02 15:53:08.871600: I external/local_tsl/tsl/framework/bfc_allocator.cc:1046] Bin (65536): \tTotal Chunks: 0, Chunks in use: 0. 0B allocated for chunks. 0B in use in bin. 0B client-requested in use in bin.\n",
35
+ "2024-09-02 15:53:08.871605: I external/local_tsl/tsl/framework/bfc_allocator.cc:1046] Bin (131072): \tTotal Chunks: 0, Chunks in use: 0. 0B allocated for chunks. 0B in use in bin. 0B client-requested in use in bin.\n",
36
+ "2024-09-02 15:53:08.871609: I external/local_tsl/tsl/framework/bfc_allocator.cc:1046] Bin (262144): \tTotal Chunks: 0, Chunks in use: 0. 0B allocated for chunks. 0B in use in bin. 0B client-requested in use in bin.\n",
37
+ "2024-09-02 15:53:08.871613: I external/local_tsl/tsl/framework/bfc_allocator.cc:1046] Bin (524288): \tTotal Chunks: 0, Chunks in use: 0. 0B allocated for chunks. 0B in use in bin. 0B client-requested in use in bin.\n",
38
+ "2024-09-02 15:53:08.871618: I external/local_tsl/tsl/framework/bfc_allocator.cc:1046] Bin (1048576): \tTotal Chunks: 0, Chunks in use: 0. 0B allocated for chunks. 0B in use in bin. 0B client-requested in use in bin.\n",
39
+ "2024-09-02 15:53:08.871622: I external/local_tsl/tsl/framework/bfc_allocator.cc:1046] Bin (2097152): \tTotal Chunks: 0, Chunks in use: 0. 0B allocated for chunks. 0B in use in bin. 0B client-requested in use in bin.\n",
40
+ "2024-09-02 15:53:08.871627: I external/local_tsl/tsl/framework/bfc_allocator.cc:1046] Bin (4194304): \tTotal Chunks: 0, Chunks in use: 0. 0B allocated for chunks. 0B in use in bin. 0B client-requested in use in bin.\n",
41
+ "2024-09-02 15:53:08.871631: I external/local_tsl/tsl/framework/bfc_allocator.cc:1046] Bin (8388608): \tTotal Chunks: 0, Chunks in use: 0. 0B allocated for chunks. 0B in use in bin. 0B client-requested in use in bin.\n",
42
+ "2024-09-02 15:53:08.871636: I external/local_tsl/tsl/framework/bfc_allocator.cc:1046] Bin (16777216): \tTotal Chunks: 0, Chunks in use: 0. 0B allocated for chunks. 0B in use in bin. 0B client-requested in use in bin.\n",
43
+ "2024-09-02 15:53:08.871640: I external/local_tsl/tsl/framework/bfc_allocator.cc:1046] Bin (33554432): \tTotal Chunks: 0, Chunks in use: 0. 0B allocated for chunks. 0B in use in bin. 0B client-requested in use in bin.\n",
44
+ "2024-09-02 15:53:08.871645: I external/local_tsl/tsl/framework/bfc_allocator.cc:1046] Bin (67108864): \tTotal Chunks: 0, Chunks in use: 0. 0B allocated for chunks. 0B in use in bin. 0B client-requested in use in bin.\n",
45
+ "2024-09-02 15:53:08.871650: I external/local_tsl/tsl/framework/bfc_allocator.cc:1046] Bin (134217728): \tTotal Chunks: 0, Chunks in use: 0. 0B allocated for chunks. 0B in use in bin. 0B client-requested in use in bin.\n",
46
+ "2024-09-02 15:53:08.871656: I external/local_tsl/tsl/framework/bfc_allocator.cc:1046] Bin (268435456): \tTotal Chunks: 1, Chunks in use: 0. 692.62MiB allocated for chunks. 0B in use in bin. 0B client-requested in use in bin.\n",
47
+ "2024-09-02 15:53:08.871662: I external/local_tsl/tsl/framework/bfc_allocator.cc:1062] Bin for 1.95GiB was 256.00MiB, Chunk State: \n",
48
+ "2024-09-02 15:53:08.871670: I external/local_tsl/tsl/framework/bfc_allocator.cc:1068] Size: 692.62MiB | Requested Size: 0B | in_use: 0 | bin_num: 20, prev: Size: 256B | Requested Size: 16B | in_use: 1 | bin_num: -1\n",
49
+ "2024-09-02 15:53:08.871674: I external/local_tsl/tsl/framework/bfc_allocator.cc:1075] Next region of size 726269952\n",
50
+ "2024-09-02 15:53:08.871681: I external/local_tsl/tsl/framework/bfc_allocator.cc:1095] InUse at 7f7fac000000 of size 256 next 1\n",
51
+ "2024-09-02 15:53:08.871685: I external/local_tsl/tsl/framework/bfc_allocator.cc:1095] InUse at 7f7fac000100 of size 1280 next 2\n",
52
+ "2024-09-02 15:53:08.871689: I external/local_tsl/tsl/framework/bfc_allocator.cc:1095] InUse at 7f7fac000600 of size 256 next 3\n",
53
+ "2024-09-02 15:53:08.871692: I external/local_tsl/tsl/framework/bfc_allocator.cc:1095] InUse at 7f7fac000700 of size 256 next 4\n",
54
+ "2024-09-02 15:53:08.871695: I external/local_tsl/tsl/framework/bfc_allocator.cc:1095] InUse at 7f7fac000800 of size 256 next 5\n",
55
+ "2024-09-02 15:53:08.871700: I external/local_tsl/tsl/framework/bfc_allocator.cc:1095] InUse at 7f7fac000900 of size 256 next 6\n",
56
+ "2024-09-02 15:53:08.871703: I external/local_tsl/tsl/framework/bfc_allocator.cc:1095] InUse at 7f7fac000a00 of size 256 next 7\n",
57
+ "2024-09-02 15:53:08.871708: I external/local_tsl/tsl/framework/bfc_allocator.cc:1095] Free at 7f7fac000b00 of size 726267136 next 18446744073709551615\n",
58
+ "2024-09-02 15:53:08.871713: I external/local_tsl/tsl/framework/bfc_allocator.cc:1100] Summary of in-use Chunks by size: \n",
59
+ "2024-09-02 15:53:08.871718: I external/local_tsl/tsl/framework/bfc_allocator.cc:1103] 6 Chunks of size 256 totalling 1.5KiB\n",
60
+ "2024-09-02 15:53:08.871723: I external/local_tsl/tsl/framework/bfc_allocator.cc:1103] 1 Chunks of size 1280 totalling 1.2KiB\n",
61
+ "2024-09-02 15:53:08.871727: I external/local_tsl/tsl/framework/bfc_allocator.cc:1107] Sum Total of in-use chunks: 2.8KiB\n",
62
+ "2024-09-02 15:53:08.871732: I external/local_tsl/tsl/framework/bfc_allocator.cc:1109] Total bytes in pool: 726269952 memory_limit_: 726269952 available bytes: 0 curr_region_allocation_bytes_: 1452539904\n",
63
+ "2024-09-02 15:53:08.871740: I external/local_tsl/tsl/framework/bfc_allocator.cc:1114] Stats: \n",
64
+ "Limit: 726269952\n",
65
+ "InUse: 2816\n",
66
+ "MaxInUse: 2816\n",
67
+ "NumAllocs: 9\n",
68
+ "MaxAllocSize: 1280\n",
69
+ "Reserved: 0\n",
70
+ "PeakReserved: 0\n",
71
+ "LargestFreeBlock: 0\n",
72
+ "\n",
73
+ "2024-09-02 15:53:08.871746: W external/local_tsl/tsl/framework/bfc_allocator.cc:494] *___________________________________________________________________________________________________\n",
74
+ "2024-09-02 15:53:08.871766: W tensorflow/core/framework/op_kernel.cc:1840] OP_REQUIRES failed at stateless_random_ops_v2.cc:64 : RESOURCE_EXHAUSTED: OOM when allocating tensor with shape[256000,2048] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc\n",
75
+ "2024-09-02 15:53:08.871780: I tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: RESOURCE_EXHAUSTED: OOM when allocating tensor with shape[256000,2048] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc\n"
76
+ ]
77
+ },
78
+ {
79
+ "ename": "ResourceExhaustedError",
80
+ "evalue": "{{function_node __wrapped__StatelessRandomNormalV2_device_/job:localhost/replica:0/task:0/device:GPU:0}} OOM when allocating tensor with shape[256000,2048] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc [Op:StatelessRandomNormalV2]",
81
+ "output_type": "error",
82
+ "traceback": [
83
+ "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
84
+ "\u001b[0;31mResourceExhaustedError\u001b[0m Traceback (most recent call last)",
85
+ "Cell \u001b[0;32mIn[4], line 6\u001b[0m\n\u001b[1;32m 4\u001b[0m os\u001b[38;5;241m.\u001b[39menviron[\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mKAGGLE_KEY\u001b[39m\u001b[38;5;124m\"\u001b[39m] \u001b[38;5;241m=\u001b[39m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124m9a33b6e88bcb6058b1281d777fa6808d\u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 5\u001b[0m \u001b[38;5;28;01mimport\u001b[39;00m \u001b[38;5;21;01mkeras_nlp\u001b[39;00m\n\u001b[0;32m----> 6\u001b[0m gemma_lm \u001b[38;5;241m=\u001b[39m \u001b[43mkeras_nlp\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mmodels\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mGemmaCausalLM\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mfrom_preset\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mgemma_2b_en\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m)\u001b[49m\n\u001b[1;32m 7\u001b[0m gemma_lm\u001b[38;5;241m.\u001b[39mgenerate(\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mKeras is a\u001b[39m\u001b[38;5;124m\"\u001b[39m, max_length\u001b[38;5;241m=\u001b[39m\u001b[38;5;241m30\u001b[39m)\n",
86
+ "File \u001b[0;32m~/.local/lib/python3.10/site-packages/keras_nlp/src/models/task.py:264\u001b[0m, in \u001b[0;36mTask.from_preset\u001b[0;34m(cls, preset, load_weights, **kwargs)\u001b[0m\n\u001b[1;32m 262\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mdtype\u001b[39m\u001b[38;5;124m\"\u001b[39m \u001b[38;5;129;01min\u001b[39;00m kwargs:\n\u001b[1;32m 263\u001b[0m config_overrides[\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mdtype\u001b[39m\u001b[38;5;124m\"\u001b[39m] \u001b[38;5;241m=\u001b[39m kwargs\u001b[38;5;241m.\u001b[39mpop(\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mdtype\u001b[39m\u001b[38;5;124m\"\u001b[39m)\n\u001b[0;32m--> 264\u001b[0m backbone \u001b[38;5;241m=\u001b[39m \u001b[43mbackbone_preset_cls\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mfrom_preset\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 265\u001b[0m \u001b[43m \u001b[49m\u001b[43mpreset\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 266\u001b[0m \u001b[43m \u001b[49m\u001b[43mload_weights\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mload_weights\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 267\u001b[0m \u001b[43m \u001b[49m\u001b[43mconfig_overrides\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mconfig_overrides\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 268\u001b[0m \u001b[43m\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 269\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mpreprocessor\u001b[39m\u001b[38;5;124m\"\u001b[39m \u001b[38;5;129;01min\u001b[39;00m kwargs:\n\u001b[1;32m 270\u001b[0m preprocessor \u001b[38;5;241m=\u001b[39m kwargs\u001b[38;5;241m.\u001b[39mpop(\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mpreprocessor\u001b[39m\u001b[38;5;124m\"\u001b[39m)\n",
87
+ "File \u001b[0;32m~/.local/lib/python3.10/site-packages/keras_nlp/src/models/backbone.py:190\u001b[0m, in \u001b[0;36mBackbone.from_preset\u001b[0;34m(cls, preset, load_weights, **kwargs)\u001b[0m\n\u001b[1;32m 183\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m \u001b[38;5;28missubclass\u001b[39m(preset_cls, \u001b[38;5;28mcls\u001b[39m):\n\u001b[1;32m 184\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mValueError\u001b[39;00m(\n\u001b[1;32m 185\u001b[0m \u001b[38;5;124mf\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mPreset has type `\u001b[39m\u001b[38;5;132;01m{\u001b[39;00mpreset_cls\u001b[38;5;241m.\u001b[39m\u001b[38;5;18m__name__\u001b[39m\u001b[38;5;132;01m}\u001b[39;00m\u001b[38;5;124m` which is not a \u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 186\u001b[0m \u001b[38;5;124mf\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124ma subclass of calling class `\u001b[39m\u001b[38;5;132;01m{\u001b[39;00m\u001b[38;5;28mcls\u001b[39m\u001b[38;5;241m.\u001b[39m\u001b[38;5;18m__name__\u001b[39m\u001b[38;5;132;01m}\u001b[39;00m\u001b[38;5;124m`. Call \u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 187\u001b[0m \u001b[38;5;124mf\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124m`from_preset` directly on `\u001b[39m\u001b[38;5;132;01m{\u001b[39;00mpreset_cls\u001b[38;5;241m.\u001b[39m\u001b[38;5;18m__name__\u001b[39m\u001b[38;5;132;01m}\u001b[39;00m\u001b[38;5;124m` instead.\u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 188\u001b[0m )\n\u001b[0;32m--> 190\u001b[0m backbone \u001b[38;5;241m=\u001b[39m \u001b[43mload_serialized_object\u001b[49m\u001b[43m(\u001b[49m\u001b[43mpreset\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mCONFIG_FILE\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 191\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m load_weights:\n\u001b[1;32m 192\u001b[0m jax_memory_cleanup(backbone)\n",
88
+ "File \u001b[0;32m~/.local/lib/python3.10/site-packages/keras_nlp/src/utils/preset_utils.py:569\u001b[0m, in \u001b[0;36mload_serialized_object\u001b[0;34m(preset, config_file, config_overrides)\u001b[0m\n\u001b[1;32m 567\u001b[0m config \u001b[38;5;241m=\u001b[39m load_config(preset, config_file)\n\u001b[1;32m 568\u001b[0m config[\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mconfig\u001b[39m\u001b[38;5;124m\"\u001b[39m] \u001b[38;5;241m=\u001b[39m {\u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mconfig[\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mconfig\u001b[39m\u001b[38;5;124m\"\u001b[39m], \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mconfig_overrides}\n\u001b[0;32m--> 569\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43mkeras\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43msaving\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mdeserialize_keras_object\u001b[49m\u001b[43m(\u001b[49m\u001b[43mconfig\u001b[49m\u001b[43m)\u001b[49m\n",
89
+ "File \u001b[0;32m~/.local/lib/python3.10/site-packages/keras/src/saving/serialization_lib.py:718\u001b[0m, in \u001b[0;36mdeserialize_keras_object\u001b[0;34m(config, custom_objects, safe_mode, **kwargs)\u001b[0m\n\u001b[1;32m 716\u001b[0m \u001b[38;5;28;01mwith\u001b[39;00m custom_obj_scope, safe_mode_scope:\n\u001b[1;32m 717\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m--> 718\u001b[0m instance \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mcls\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mfrom_config\u001b[49m\u001b[43m(\u001b[49m\u001b[43minner_config\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 719\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mTypeError\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[1;32m 720\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mTypeError\u001b[39;00m(\n\u001b[1;32m 721\u001b[0m \u001b[38;5;124mf\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;132;01m{\u001b[39;00m\u001b[38;5;28mcls\u001b[39m\u001b[38;5;132;01m}\u001b[39;00m\u001b[38;5;124m could not be deserialized properly. Please\u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 722\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124m ensure that components that are Python object\u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 726\u001b[0m \u001b[38;5;124mf\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;130;01m\\n\u001b[39;00m\u001b[38;5;130;01m\\n\u001b[39;00m\u001b[38;5;124mconfig=\u001b[39m\u001b[38;5;132;01m{\u001b[39;00mconfig\u001b[38;5;132;01m}\u001b[39;00m\u001b[38;5;124m.\u001b[39m\u001b[38;5;130;01m\\n\u001b[39;00m\u001b[38;5;130;01m\\n\u001b[39;00m\u001b[38;5;124mException encountered: \u001b[39m\u001b[38;5;132;01m{\u001b[39;00me\u001b[38;5;132;01m}\u001b[39;00m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 727\u001b[0m )\n",
90
+ "File \u001b[0;32m~/.local/lib/python3.10/site-packages/keras_nlp/src/models/backbone.py:119\u001b[0m, in \u001b[0;36mBackbone.from_config\u001b[0;34m(cls, config)\u001b[0m\n\u001b[1;32m 115\u001b[0m \u001b[38;5;129m@classmethod\u001b[39m\n\u001b[1;32m 116\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mfrom_config\u001b[39m(\u001b[38;5;28mcls\u001b[39m, config):\n\u001b[1;32m 117\u001b[0m \u001b[38;5;66;03m# The default `from_config()` for functional models will return a\u001b[39;00m\n\u001b[1;32m 118\u001b[0m \u001b[38;5;66;03m# vanilla `keras.Model`. We override it to get a subclass instance back.\u001b[39;00m\n\u001b[0;32m--> 119\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mcls\u001b[39;49m\u001b[43m(\u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mconfig\u001b[49m\u001b[43m)\u001b[49m\n",
91
+ "File \u001b[0;32m~/.local/lib/python3.10/site-packages/keras_nlp/src/models/gemma/gemma_backbone.py:171\u001b[0m, in \u001b[0;36mGemmaBackbone.__init__\u001b[0;34m(self, vocabulary_size, num_layers, num_query_heads, num_key_value_heads, hidden_dim, intermediate_dim, head_dim, query_head_dim_normalize, use_post_ffw_norm, use_post_attention_norm, attention_logit_soft_cap, final_logit_soft_cap, use_sliding_window_attention, sliding_window_size, layer_norm_epsilon, dropout, dtype, **kwargs)\u001b[0m\n\u001b[1;32m 165\u001b[0m token_id_input \u001b[38;5;241m=\u001b[39m keras\u001b[38;5;241m.\u001b[39mInput(\n\u001b[1;32m 166\u001b[0m shape\u001b[38;5;241m=\u001b[39m(\u001b[38;5;28;01mNone\u001b[39;00m,), dtype\u001b[38;5;241m=\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mfloat32\u001b[39m\u001b[38;5;124m\"\u001b[39m, name\u001b[38;5;241m=\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mtoken_ids\u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 167\u001b[0m )\n\u001b[1;32m 168\u001b[0m padding_mask_input \u001b[38;5;241m=\u001b[39m keras\u001b[38;5;241m.\u001b[39mInput(\n\u001b[1;32m 169\u001b[0m shape\u001b[38;5;241m=\u001b[39m(\u001b[38;5;28;01mNone\u001b[39;00m,), dtype\u001b[38;5;241m=\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mfloat32\u001b[39m\u001b[38;5;124m\"\u001b[39m, name\u001b[38;5;241m=\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mpadding_mask\u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 170\u001b[0m )\n\u001b[0;32m--> 171\u001b[0m x \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mtoken_embedding\u001b[49m\u001b[43m(\u001b[49m\u001b[43mtoken_id_input\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 172\u001b[0m x \u001b[38;5;241m=\u001b[39m x \u001b[38;5;241m*\u001b[39m ops\u001b[38;5;241m.\u001b[39mcast(ops\u001b[38;5;241m.\u001b[39msqrt(hidden_dim), x\u001b[38;5;241m.\u001b[39mdtype)\n\u001b[1;32m 173\u001b[0m \u001b[38;5;28;01mfor\u001b[39;00m transformer_layer \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mtransformer_layers:\n",
92
+ "File \u001b[0;32m~/.local/lib/python3.10/site-packages/keras/src/utils/traceback_utils.py:122\u001b[0m, in \u001b[0;36mfilter_traceback.<locals>.error_handler\u001b[0;34m(*args, **kwargs)\u001b[0m\n\u001b[1;32m 119\u001b[0m filtered_tb \u001b[38;5;241m=\u001b[39m _process_traceback_frames(e\u001b[38;5;241m.\u001b[39m__traceback__)\n\u001b[1;32m 120\u001b[0m \u001b[38;5;66;03m# To get the full stack trace, call:\u001b[39;00m\n\u001b[1;32m 121\u001b[0m \u001b[38;5;66;03m# `keras.config.disable_traceback_filtering()`\u001b[39;00m\n\u001b[0;32m--> 122\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m e\u001b[38;5;241m.\u001b[39mwith_traceback(filtered_tb) \u001b[38;5;28;01mfrom\u001b[39;00m \u001b[38;5;28;01mNone\u001b[39;00m\n\u001b[1;32m 123\u001b[0m \u001b[38;5;28;01mfinally\u001b[39;00m:\n\u001b[1;32m 124\u001b[0m \u001b[38;5;28;01mdel\u001b[39;00m filtered_tb\n",
93
+ "File \u001b[0;32m~/.local/lib/python3.10/site-packages/keras_nlp/src/layers/modeling/reversible_embedding.py:115\u001b[0m, in \u001b[0;36mReversibleEmbedding.build\u001b[0;34m(self, inputs_shape)\u001b[0m\n\u001b[1;32m 114\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mbuild\u001b[39m(\u001b[38;5;28mself\u001b[39m, inputs_shape\u001b[38;5;241m=\u001b[39m\u001b[38;5;28;01mNone\u001b[39;00m):\n\u001b[0;32m--> 115\u001b[0m \u001b[38;5;28;43msuper\u001b[39;49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mbuild\u001b[49m\u001b[43m(\u001b[49m\u001b[43minputs_shape\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 117\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mtie_weights:\n\u001b[1;32m 118\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mreverse_embeddings \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39madd_weight(\n\u001b[1;32m 119\u001b[0m name\u001b[38;5;241m=\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mreverse_embeddings\u001b[39m\u001b[38;5;124m\"\u001b[39m,\n\u001b[1;32m 120\u001b[0m shape\u001b[38;5;241m=\u001b[39m(\u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39moutput_dim, \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39minput_dim),\n\u001b[1;32m 121\u001b[0m initializer\u001b[38;5;241m=\u001b[39m\u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39membeddings_initializer,\n\u001b[1;32m 122\u001b[0m dtype\u001b[38;5;241m=\u001b[39m\u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mdtype,\n\u001b[1;32m 123\u001b[0m )\n",
94
+ "\u001b[0;31mResourceExhaustedError\u001b[0m: {{function_node __wrapped__StatelessRandomNormalV2_device_/job:localhost/replica:0/task:0/device:GPU:0}} OOM when allocating tensor with shape[256000,2048] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc [Op:StatelessRandomNormalV2]"
95
+ ]
96
+ }
97
+ ],
98
+ "source": [
99
+ "# Set Kaggle API credentials\n",
100
+ "import os\n",
101
+ "os.environ[\"KAGGLE_USERNAME\"] = \"rogerkorantenng\"\n",
102
+ "os.environ[\"KAGGLE_KEY\"] = \"9a33b6e88bcb6058b1281d777fa6808d\"\n",
103
+ "import keras_nlp\n",
104
+ "gemma_lm = keras_nlp.models.GemmaCausalLM.from_preset(\"gemma_2b_en\")\n",
105
+ "gemma_lm.generate(\"Keras is a\", max_length=30)"
106
+ ]
107
+ },
108
+ {
109
+ "cell_type": "code",
110
+ "execution_count": null,
111
+ "id": "e5488db1-48b2-4d0c-a4cd-1329a333b3f9",
112
+ "metadata": {},
113
+ "outputs": [],
114
+ "source": []
115
+ }
116
+ ],
117
+ "metadata": {
118
+ "kernelspec": {
119
+ "display_name": "Python 3 (ipykernel)",
120
+ "language": "python",
121
+ "name": "python3"
122
+ },
123
+ "language_info": {
124
+ "codemirror_mode": {
125
+ "name": "ipython",
126
+ "version": 3
127
+ },
128
+ "file_extension": ".py",
129
+ "mimetype": "text/x-python",
130
+ "name": "python",
131
+ "nbconvert_exporter": "python",
132
+ "pygments_lexer": "ipython3",
133
+ "version": "3.10.12"
134
+ }
135
+ },
136
+ "nbformat": 4,
137
+ "nbformat_minor": 5
138
+ }
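
The ResourceExhaustedError above is raised while allocating the Gemma embedding table (shape [256000, 2048]) on GPU:0, which reports only about 692 MB free, while the other seven T4s each report roughly 13.6 GB. A minimal workaround sketch, not part of the original notebook: hide the busy device and allocate GPU memory on demand before TensorFlow is imported (the device index below is an assumption; pick any GPU with free memory).

import os

# Both variables must be set before TensorFlow / keras_nlp is imported.
os.environ["CUDA_VISIBLE_DEVICES"] = "1"          # assumption: GPU 1 is idle
os.environ["TF_FORCE_GPU_ALLOW_GROWTH"] = "true"  # grab memory on demand instead of reserving it all

import keras_nlp

gemma_lm = keras_nlp.models.GemmaCausalLM.from_preset("gemma_2b_en")
print(gemma_lm.generate("Keras is a", max_length=30))
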
Untitled1.ipynb ADDED
@@ -0,0 +1,94 @@
1
+ {
2
+ "cells": [
3
+ {
4
+ "cell_type": "code",
5
+ "execution_count": 2,
6
+ "id": "be79942c-2080-452a-a349-ea19b7611c85",
7
+ "metadata": {},
8
+ "outputs": [
9
+ {
10
+ "name": "stderr",
11
+ "output_type": "stream",
12
+ "text": [
13
+ "2024-09-03 17:04:36.485116: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.\n",
14
+ "2024-09-03 17:04:36.500874: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:485] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered\n",
15
+ "2024-09-03 17:04:36.519822: E external/local_xla/xla/stream_executor/cuda/cuda_dnn.cc:8454] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered\n",
16
+ "2024-09-03 17:04:36.525533: E external/local_xla/xla/stream_executor/cuda/cuda_blas.cc:1452] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered\n",
17
+ "2024-09-03 17:04:36.539736: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.\n",
18
+ "To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.\n"
19
+ ]
20
+ },
21
+ {
22
+ "data": {
23
+ "text/plain": [
24
+ "'3.5.0'"
25
+ ]
26
+ },
27
+ "execution_count": 2,
28
+ "metadata": {},
29
+ "output_type": "execute_result"
30
+ }
31
+ ],
32
+ "source": [
33
+ "import keras\n",
34
+ "keras.__version__"
35
+ ]
36
+ },
37
+ {
38
+ "cell_type": "code",
39
+ "execution_count": 3,
40
+ "id": "00e538b8-91c1-401b-9cc9-320fc496a828",
41
+ "metadata": {},
42
+ "outputs": [
43
+ {
44
+ "name": "stdout",
45
+ "output_type": "stream",
46
+ "text": [
47
+ "Name: keras\n",
48
+ "Version: 3.5.0\n",
49
+ "Summary: Multi-backend Keras.\n",
50
+ "Home-page: https://github.com/keras-team/keras\n",
51
+ "Author: Keras team\n",
52
+ "Author-email: keras-users@googlegroups.com\n",
53
+ "License: Apache License 2.0\n",
54
+ "Location: /home/workbench/.local/lib/python3.10/site-packages\n",
55
+ "Requires: absl-py, h5py, ml-dtypes, namex, numpy, optree, packaging, rich\n",
56
+ "Required-by: tensorflow\n"
57
+ ]
58
+ }
59
+ ],
60
+ "source": [
61
+ "!pip show keras\n"
62
+ ]
63
+ },
64
+ {
65
+ "cell_type": "code",
66
+ "execution_count": null,
67
+ "id": "85cff0f4-2cb7-4c20-ba0c-a1c6198bf8dd",
68
+ "metadata": {},
69
+ "outputs": [],
70
+ "source": []
71
+ }
72
+ ],
73
+ "metadata": {
74
+ "kernelspec": {
75
+ "display_name": "Python 3 (ipykernel)",
76
+ "language": "python",
77
+ "name": "python3"
78
+ },
79
+ "language_info": {
80
+ "codemirror_mode": {
81
+ "name": "ipython",
82
+ "version": 3
83
+ },
84
+ "file_extension": ".py",
85
+ "mimetype": "text/x-python",
86
+ "name": "python",
87
+ "nbconvert_exporter": "python",
88
+ "pygments_lexer": "ipython3",
89
+ "version": "3.10.12"
90
+ }
91
+ },
92
+ "nbformat": 4,
93
+ "nbformat_minor": 5
94
+ }
chatbot/.env ADDED
@@ -0,0 +1,4 @@
1
+ GPT4_API_KEY=XTtd4wdC0mlEdbQqp0KWOe9aWrlg4z4HurxXlgBf5ASUvrB9iPkwJQQJ99AKACHYHv6XJ3w3AAABACOGaeSg
2
+ GPT4O_API_KEY=XTtd4wdC0mlEdbQqp0KWOe9aWrlg4z4HurxXlgBf5ASUvrB9iPkwJQQJ99AKACHYHv6XJ3w3AAABACOGaeSg
3
+ GPT35_TURBO_API_KEY=cfe75d17d47c40318db6230d2677d84f
4
+ GPT4_32K_API_KEY=AebO97c0w0lnrppFNpbuPVueKth7HfGOdMbKeaX5R2Uql6wvJA5qJQQJ99AKACL93NaXJ3w3AAABACOGh8MD
chatbot/.gradio/certificate.pem ADDED
@@ -0,0 +1,31 @@
1
+ -----BEGIN CERTIFICATE-----
2
+ MIIFazCCA1OgAwIBAgIRAIIQz7DSQONZRGPgu2OCiwAwDQYJKoZIhvcNAQELBQAw
3
+ TzELMAkGA1UEBhMCVVMxKTAnBgNVBAoTIEludGVybmV0IFNlY3VyaXR5IFJlc2Vh
4
+ cmNoIEdyb3VwMRUwEwYDVQQDEwxJU1JHIFJvb3QgWDEwHhcNMTUwNjA0MTEwNDM4
5
+ WhcNMzUwNjA0MTEwNDM4WjBPMQswCQYDVQQGEwJVUzEpMCcGA1UEChMgSW50ZXJu
6
+ ZXQgU2VjdXJpdHkgUmVzZWFyY2ggR3JvdXAxFTATBgNVBAMTDElTUkcgUm9vdCBY
7
+ MTCCAiIwDQYJKoZIhvcNAQEBBQADggIPADCCAgoCggIBAK3oJHP0FDfzm54rVygc
8
+ h77ct984kIxuPOZXoHj3dcKi/vVqbvYATyjb3miGbESTtrFj/RQSa78f0uoxmyF+
9
+ 0TM8ukj13Xnfs7j/EvEhmkvBioZxaUpmZmyPfjxwv60pIgbz5MDmgK7iS4+3mX6U
10
+ A5/TR5d8mUgjU+g4rk8Kb4Mu0UlXjIB0ttov0DiNewNwIRt18jA8+o+u3dpjq+sW
11
+ T8KOEUt+zwvo/7V3LvSye0rgTBIlDHCNAymg4VMk7BPZ7hm/ELNKjD+Jo2FR3qyH
12
+ B5T0Y3HsLuJvW5iB4YlcNHlsdu87kGJ55tukmi8mxdAQ4Q7e2RCOFvu396j3x+UC
13
+ B5iPNgiV5+I3lg02dZ77DnKxHZu8A/lJBdiB3QW0KtZB6awBdpUKD9jf1b0SHzUv
14
+ KBds0pjBqAlkd25HN7rOrFleaJ1/ctaJxQZBKT5ZPt0m9STJEadao0xAH0ahmbWn
15
+ OlFuhjuefXKnEgV4We0+UXgVCwOPjdAvBbI+e0ocS3MFEvzG6uBQE3xDk3SzynTn
16
+ jh8BCNAw1FtxNrQHusEwMFxIt4I7mKZ9YIqioymCzLq9gwQbooMDQaHWBfEbwrbw
17
+ qHyGO0aoSCqI3Haadr8faqU9GY/rOPNk3sgrDQoo//fb4hVC1CLQJ13hef4Y53CI
18
+ rU7m2Ys6xt0nUW7/vGT1M0NPAgMBAAGjQjBAMA4GA1UdDwEB/wQEAwIBBjAPBgNV
19
+ HRMBAf8EBTADAQH/MB0GA1UdDgQWBBR5tFnme7bl5AFzgAiIyBpY9umbbjANBgkq
20
+ hkiG9w0BAQsFAAOCAgEAVR9YqbyyqFDQDLHYGmkgJykIrGF1XIpu+ILlaS/V9lZL
21
+ ubhzEFnTIZd+50xx+7LSYK05qAvqFyFWhfFQDlnrzuBZ6brJFe+GnY+EgPbk6ZGQ
22
+ 3BebYhtF8GaV0nxvwuo77x/Py9auJ/GpsMiu/X1+mvoiBOv/2X/qkSsisRcOj/KK
23
+ NFtY2PwByVS5uCbMiogziUwthDyC3+6WVwW6LLv3xLfHTjuCvjHIInNzktHCgKQ5
24
+ ORAzI4JMPJ+GslWYHb4phowim57iaztXOoJwTdwJx4nLCgdNbOhdjsnvzqvHu7Ur
25
+ TkXWStAmzOVyyghqpZXjFaH3pO3JLF+l+/+sKAIuvtd7u+Nxe5AW0wdeRlN8NwdC
26
+ jNPElpzVmbUq4JUagEiuTDkHzsxHpFKVK7q4+63SM1N95R1NbdWhscdCb+ZAJzVc
27
+ oyi3B43njTOQ5yOf+1CceWxG1bQVs5ZufpsMljq4Ui0/1lvh+wjChP4kqKOJ2qxq
28
+ 4RgqsahDYVvTH9w7jXbyLeiNdd8XM2w9U/t7y0Ff/9yi0GE44Za4rF2LN9d11TPA
29
+ mRGunUHBcnWEvgJBQl9nJEiU0Zsnvgc/ubhPgXRR4Xq37Z0j4r7g1SgEEzwxA57d
30
+ emyPxgcYxn/eR44/KJ4EBs+lVDR3veyJm+kXQ99b21/+jh5Xos1AnX5iItreGCc=
31
+ -----END CERTIFICATE-----
chatbot/README.md ADDED
@@ -0,0 +1,6 @@
1
+ ---
2
+ title: chatbot
3
+ app_file: main.py
4
+ sdk: gradio
5
+ sdk_version: 4.42.0
6
+ ---
chatbot/app.py ADDED
@@ -0,0 +1,157 @@
1
+ import gradio as gr
2
+ import os
3
+ import requests
4
+ from dotenv import load_dotenv
5
+
6
+ # Load environment variables from .env file
7
+ load_dotenv()
8
+
9
+ # Define endpoints and securely load API keys from environment variables
10
+ model_config = {
11
+ "gpt-4": {
12
+ "endpoint": "https://roger-m38jr9pd-eastus2.openai.azure.com/openai/deployments/gpt-4/chat/completions?api-version=2024-08-01-preview",
13
+ "api_key": os.getenv("GPT4_API_KEY")
14
+ },
15
+ "gpt-4o": {
16
+ "endpoint": "https://roger-m38jr9pd-eastus2.openai.azure.com/openai/deployments/gpt-4o/chat/completions?api-version=2024-08-01-preview",
17
+ "api_key": os.getenv("GPT4O_API_KEY")
18
+ },
19
+ "gpt-35-turbo": {
20
+ "endpoint": "https://rogerkoranteng.openai.azure.com/openai/deployments/gpt-35-turbo/chat/completions?api-version=2024-08-01-preview",
21
+ "api_key": os.getenv("GPT35_TURBO_API_KEY")
22
+ },
23
+ "gpt-4-32k": {
24
+ "endpoint": "https://roger-m38orjxq-australiaeast.openai.azure.com/openai/deployments/gpt-4-32k/chat/completions?api-version=2024-08-01-preview",
25
+ "api_key": os.getenv("GPT4_32K_API_KEY")
26
+ }
27
+ }
28
+
29
+
30
+ # Function to generate a response from Azure OpenAI
31
+ def generate_response(model_choice, user_message, chat_history):
32
+ model_info = model_config.get(model_choice)
33
+
34
+ if not model_info:
35
+ return "Invalid model selection. Please choose a valid model.", chat_history
36
+
37
+ chat_history.append({"role": "user", "content": user_message})
38
+
39
+ headers = {
40
+ "Content-Type": "application/json",
41
+ "api-key": model_info['api_key'],
42
+ }
43
+
44
+ data = {
45
+ "messages": chat_history,
46
+ "max_tokens": 150,
47
+ "temperature": 0.7,
48
+ }
49
+
50
+ try:
51
+ response = requests.post(
52
+ model_info["endpoint"],
53
+ headers=headers,
54
+ json=data
55
+ )
56
+
57
+ response.raise_for_status()
58
+ response_data = response.json()
59
+
60
+ assistant_message = response_data['choices'][0]['message']['content']
61
+ chat_history.append({"role": "assistant", "content": assistant_message})
62
+
63
+ return assistant_message, chat_history
64
+
65
+ except requests.exceptions.RequestException as e:
66
+ return f"Error: {e}", chat_history
67
+
68
+
69
+ # Function to format chat history with improved readability
70
+ def format_chat_history(history):
71
+ formatted_history = ""
72
+ for message in history:
73
+ role = "User:" if message["role"] == "user" else "Sage:"
74
+ formatted_history += f"{role} {message['content']}\n\n" # Message content with a line break
75
+ return formatted_history.strip() # Remove any trailing whitespace
76
+
77
+
78
+ # List of available models
79
+ azure_models = [
80
+ "gpt-4",
81
+ "gpt-4o",
82
+ "gpt-35-turbo",
83
+ "gpt-4-32k"
84
+ ]
85
+
86
+
87
+ # Function to handle model change status update
88
+ def change_model(model_choice):
89
+ return f"Selected model: {model_choice}"
90
+
91
+
92
+ # Create the Gradio interface
93
+ with gr.Blocks() as interface:
94
+ gr.Markdown("## Sage - Your Mental Health Advisor")
95
+
96
+ with gr.Tab("Model Selection"):
97
+ gr.Markdown("### Select Model for Chat")
98
+ model_dropdown = gr.Dropdown(
99
+ choices=azure_models,
100
+ label="Choose a Model",
101
+ value=azure_models[0], # Default model
102
+ interactive=True
103
+ )
104
+
105
+ # Add status update on model change
106
+ status_textbox = gr.Textbox(label="Model Selection Status", value="Selected model: gpt-4", interactive=False)
107
+ model_dropdown.change(change_model, inputs=model_dropdown, outputs=status_textbox)
108
+
109
+ gr.Markdown("The selected model will be used for chat interaction.")
110
+
111
+ with gr.Tab("Chat Interface"):
112
+ gr.Markdown("### Chat with Sage - Your Mental Health Advisor")
113
+ chat_history_state = gr.State([]) # Store chat history
114
+ model_choice_state = gr.State(azure_models[0]) # Default model
115
+ user_message = gr.Textbox(
116
+ placeholder="Hello, I am Sage. How can I assist you today?",
117
+ label="Your Message",
118
+ lines=2,
119
+ scale=7
120
+ )
121
+
122
+ model_dropdown.change(
123
+ lambda x: x, inputs=model_dropdown, outputs=model_choice_state
124
+ )
125
+
126
+ assistant_response = gr.Textbox(label="Assistant's Response")
127
+ submit_button = gr.Button("Send Message")
128
+
129
+ submit_button.click(
130
+ generate_response,
131
+ inputs=[model_choice_state, user_message, chat_history_state],
132
+ outputs=[assistant_response, chat_history_state]
133
+ )
134
+
135
+ # Chat History Tab
136
+ with gr.Tab("Chat History"):
137
+ gr.Markdown("### Chat History")
138
+
139
+ # Add the fetch history button inside the "Chat History" tab
140
+ fetch_history_button = gr.Button("Fetch Chat History")
141
+ chat_history_display = gr.Textbox(label="Chat History", interactive=False, lines=10)
142
+
143
+
144
+ # Function to fetch and display chat history
145
+ def fetch_chat_history(chat_history):
146
+ return format_chat_history(chat_history)
147
+
148
+
149
+ # Bind the fetch history button to fetch the chat history
150
+ fetch_history_button.click(
151
+ fetch_chat_history,
152
+ inputs=chat_history_state,
153
+ outputs=chat_history_display
154
+ )
155
+
156
+ # Launch the Gradio app
157
+ interface.launch(server_name="0.0.0.0", server_port=8080, share=True)
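
generate_response above posts directly to the Azure OpenAI chat-completions REST endpoints listed in model_config. A minimal sketch for exercising one deployment outside Gradio, useful for confirming that an endpoint/key pair from .env works before wiring it into the UI; it reuses the header and payload shape from generate_response and assumes the same environment variables are available.

import os
import requests
from dotenv import load_dotenv

load_dotenv()

endpoint = ("https://rogerkoranteng.openai.azure.com/openai/deployments/"
            "gpt-35-turbo/chat/completions?api-version=2024-08-01-preview")
headers = {"Content-Type": "application/json", "api-key": os.getenv("GPT35_TURBO_API_KEY")}
payload = {
    "messages": [{"role": "user", "content": "Hello, Sage."}],
    "max_tokens": 150,
    "temperature": 0.7,
}

response = requests.post(endpoint, headers=headers, json=payload, timeout=30)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
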
chatbot/app2.pyn ADDED
@@ -0,0 +1,226 @@
1
+ import os
2
+ import gradio as gr
3
+ import pandas as pd
4
+ import numpy as np
5
+ import chromadb
6
+ from chromadb.config import Settings
7
+ from io import StringIO
8
+ from sentence_transformers import SentenceTransformer
9
+ import plotly.express as px
10
+ from sklearn.manifold import TSNE
+ import openai  # used by ask_gpt() below but missing from the original imports
11
+
12
+
13
+ # Constants for Model Configuration
14
+ MODEL_CONFIG = {
15
+ "gpt-4": {
16
+ "endpoint": "https://roger-m38jr9pd-eastus2.openai.azure.com/openai/deployments/gpt-4/chat/completions?api-version=2024-08-01-preview",
17
+ "api_key": os.getenv("GPT4_API_KEY")
18
+ },
19
+ "gpt-4o": {
20
+ "endpoint": "https://roger-m38jr9pd-eastus2.openai.azure.com/openai/deployments/gpt-4o/chat/completions?api-version=2024-08-01-preview",
21
+ "api_key": os.getenv("GPT4O_API_KEY")
22
+ },
23
+ "gpt-35-turbo": {
24
+ "endpoint": "https://rogerkoranteng.openai.azure.com/openai/deployments/gpt-35-turbo/chat/completions?api-version=2024-08-01-preview",
25
+ "api_key": os.getenv("GPT35_TURBO_API_KEY")
26
+ },
27
+ "gpt-4-32k": {
28
+ "endpoint": "https://roger-m38orjxq-australiaeast.openai.azure.com/openai/deployments/gpt-4-32k/chat/completions?api-version=2024-08-01-preview",
29
+ "api_key": os.getenv("GPT4_32K_API_KEY")
30
+ }
31
+ }
32
+
33
+ # Initialize Chroma client with DuckDB and Parquet for persistence
34
+ chroma_client = chromadb.Client()
35
+
36
+
37
+ # Functions for Data Processing and Embedding
38
+
39
+ def process_csv_text(temp_file):
40
+ """Process the uploaded CSV and return the dataframe and column options."""
41
+ if isinstance(temp_file, str):
42
+ df = pd.read_csv(StringIO(temp_file))
43
+ else:
44
+ df = pd.read_csv(temp_file.name, header='infer', sep=',')
45
+ return df, gr.update(choices=list(df.columns))  # gr.Dropdown.update() no longer exists in Gradio 4.x; gr.update() works instead
46
+
47
+
48
+ def insert_or_update_chroma(col, table, model_name, similarity_metric):
49
+ """Insert or update embeddings in ChromaDB."""
50
+ try:
51
+ collection = chroma_client.create_collection(
52
+ name="my_collection",
53
+ embedding_function=SentenceTransformer(model_name),
54
+ metadata={"hnsw:space": similarity_metric}
55
+ )
56
+ except Exception:
57
+ print("Collection exists, deleting it")
58
+ chroma_client.delete_collection(name='my_collection')
59
+ collection = chroma_client.create_collection(
60
+ name="my_collection",
61
+ embedding_function=SentenceTransformer(model_name),
62
+ metadata={"hnsw:space": similarity_metric}
63
+ )
64
+
65
+ if collection:
66
+ try:
67
+ collection.add(
68
+ documents=list(table[col]),
69
+ metadatas=[{"source": i} for i in range(len(table))],
70
+ ids=[str(i + 1) for i in range(len(table))]
71
+ )
72
+ return "Embedding calculations and insertions successful"
73
+ except Exception as e:
74
+ return f"Error in embedding calculations: {e}"
75
+
76
+
77
+ def show_fig():
78
+ """Show t-SNE 2D plot for embeddings."""
79
+ collection = chroma_client.get_collection(name="my_collection")
80
+ embeddings = collection.get(include=['embeddings', 'documents'])
81
+
82
+ df = pd.DataFrame({
83
+ 'text': embeddings['documents'],
84
+ 'embedding': embeddings['embeddings']
85
+ })
86
+
87
+ embeddings_np = np.array(df['embedding'].tolist())
88
+ tsne = TSNE(n_components=2, random_state=42)
89
+ transformed = tsne.fit_transform(embeddings_np)
90
+
91
+ df['tsne_x'] = transformed[:, 0]
92
+ df['tsne_y'] = transformed[:, 1]
93
+
94
+ fig = px.scatter(df, x='tsne_x', y='tsne_y', hover_name='text')
95
+ return fig, transformed
96
+
97
+
98
+ def show_test_string_fig(test_string, tsne, model_name, similarity_metric):
99
+ """Show t-SNE plot with test string to compare embeddings."""
100
+ collection = chroma_client.get_collection(name="my_collection",
101
+ embedding_function=SentenceTransformer(model_name))
102
+
103
+ collection.add(
104
+ documents=[test_string],
105
+ metadatas=[{"source": 'test'}],
106
+ ids=['test_sample']
107
+ )
108
+
109
+ embeddings = collection.get(include=['embeddings', 'documents'])
110
+ df = pd.DataFrame({
111
+ 'text': embeddings['documents'],
112
+ 'embedding': embeddings['embeddings'],
113
+ 'set': ['orig' if document != test_string else 'test_string' for document in embeddings["documents"]]
114
+ })
115
+
116
+ embeddings_np = np.array(df['embedding'].tolist())
117
+ transformed = tsne.transform(embeddings_np)
118
+
119
+ df['tsne_x'] = transformed[:, 0]
120
+ df['tsne_y'] = transformed[:, 1]
121
+
122
+ fig = px.scatter(df, x='tsne_x', y='tsne_y', hover_name='text', color='set')
123
+ return fig, tsne
124
+
125
+
126
+ def ask_gpt(message, messages_history, embedding_model, system_prompt, temperature, max_tokens, chatgpt_model):
127
+ """Interacts with the OpenAI API using Azure endpoint."""
128
+ if len(messages_history) < 1:
129
+ messages_history = [{"role": "system", "content": system_prompt}]
130
+
131
+ model_info = MODEL_CONFIG[chatgpt_model]
132
+ headers = {"Content-Type": "application/json", "api-key": model_info["api_key"]}
133
+
134
+ message = retrieve_similar(message, embedding_model)
135
+ messages_history += [{"role": "user", "content": message}]
136
+
137
+ response = openai.ChatCompletion.create(
138
+ model=chatgpt_model,
139
+ messages=messages_history,
140
+ temperature=temperature,
141
+ max_tokens=max_tokens
142
+ )
143
+
144
+ return response['choices'][0]['message']['content'], messages_history
145
+
146
+
147
+ def retrieve_similar(prompt, embedding_model):
148
+ """Retrieve similar documents from ChromaDB to enhance context."""
149
+ # Initialize SentenceTransformer correctly
150
+ embedding_function = SentenceTransformer(embedding_model)
151
+
152
+ collection = chroma_client.get_collection(
153
+ name="my_collection",
154
+ embedding_function=embedding_function
155
+ )
156
+
157
+ results = collection.query(query_texts=prompt, n_results=10)
158
+ additional_context = ''
159
+ for i, document in enumerate(results['documents'][0]):
160
+ additional_context += f'{i + 1}. {document}\n'
161
+
162
+ return additional_context + f'Question: {prompt}'
163
+
164
+
165
+
166
+ # Gradio Interface Setup
167
+
168
+ def build_gradio_ui():
169
+ """Setup the complete Gradio UI."""
170
+ with gr.Blocks() as demo:
171
+ # Tab 1: Upload CSV and Display Data
172
+ with gr.Tab("Upload data"):
173
+ upload_button = gr.File(label="Upload CSV", file_types=['.csv'], file_count="single")
174
+ table = gr.Dataframe(type="pandas", interactive=True)
175
+ cols = gr.Dropdown(choices=[], label='Dataframe columns')
176
+
177
+ upload_button.change(fn=process_csv_text, inputs=upload_button, outputs=[table, cols])
178
+
179
+ # Tab 2: ChromaDB, Embeddings, and Plotting
180
+ with gr.Tab("ChromaDB and Embeddings"):
181
+ cols = gr.Dropdown(choices=[], label='Dataframe columns')
182
+ embedding_model = gr.Dropdown(value='all-MiniLM-L6-v2',
183
+ choices=['all-MiniLM-L6-v2', 'intfloat/e5-small-v2', 'intfloat/e5-base-v2',
184
+ 'intfloat/e5-large-v2', 'paraphrase-multilingual-MiniLM-L12-v2'],
185
+ label='Embedding Model')
186
+ similarity_metric = gr.Dropdown(value='cosine', choices=['cosine', 'l2'], label='Similarity Metric')
187
+ embedding_button = gr.Button(value="Insert or Update Embeddings")
188
+ text = gr.Textbox(label='Process Status')
189
+
190
+ show_embeddings_button = gr.Button(value="Show Embeddings")
191
+ embeddings_plot = gr.Plot()
192
+
193
+ tsne = gr.State(value=None) # Using gr.State for intermediate results (tsne)
194
+ test_string = gr.Textbox(label='Test String')
195
+
196
+ calculate_2d_repr_button = gr.Button(value="Calculate 2D Representation")
197
+ embeddings_plot_with_text_string = gr.Plot()
198
+
199
+ embedding_button.click(insert_or_update_chroma, inputs=[cols, table, embedding_model, similarity_metric], outputs=[text])
200
+ show_embeddings_button.click(show_fig, inputs=[], outputs=[embeddings_plot, tsne])
201
+ calculate_2d_repr_button.click(show_test_string_fig, inputs=[test_string, tsne, embedding_model, similarity_metric], outputs=[embeddings_plot_with_text_string, tsne])
202
+
203
+ # Tab 3: Chat with GPT
204
+ with gr.Tab("Chat"):
205
+ system_prompt = gr.Textbox(value="You are a helpful assistant.", label="System Message")
206
+ chatgpt_model = gr.Dropdown(value="gpt-4", choices=list(MODEL_CONFIG.keys()), label="ChatGPT Model")
207
+ temperature = gr.Slider(minimum=0, maximum=2, step=0.1, value=0.7, label="Temperature")
208
+ max_tokens = gr.Slider(minimum=50, maximum=2000, step=50, value=300, label="Max Tokens")
209
+ chatbot = gr.Chatbot(label="ChatGPT Chat")
210
+ clear_button = gr.Button("Clear Chat History")
211
+ msg = gr.Textbox()
212
+ msg_log = gr.Textbox("Message history", label="History")
213
+
214
+ # Trigger ask_gpt when the user submits a message; input order matches the ask_gpt signature
215
+ msg.submit(fn=ask_gpt, inputs=[msg, chatbot, embedding_model, system_prompt, temperature, max_tokens, chatgpt_model], outputs=[msg, chatbot])
216
+ clear_button.click(fn=lambda: None, inputs=None, outputs=[chatbot])
217
+
218
+ return demo
219
+
220
+ # Launch the Gradio interface
221
+ demo = build_gradio_ui()
222
+ demo.launch(server_name="0.0.0.0", server_port=8080, share=True)
223
+
224
+
225
+
226
+ # Launch the Gradio interface
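The ask_gpt helper above calls openai.ChatCompletion.create while the Azure endpoints and keys gathered in MODEL_CONFIG go unused, so requests would target the default api.openai.com host. A minimal sketch of pointing the legacy openai (<1.0) SDK at an Azure deployment follows; the endpoint, API version, and environment variable are taken from the configuration above, and the rest is illustrative rather than the repository's code.

import os
import openai

# Assumed: openai<1.0 SDK, as used in the script above
openai.api_type = "azure"
openai.api_base = "https://roger-m38jr9pd-eastus2.openai.azure.com"  # Azure resource endpoint
openai.api_version = "2024-08-01-preview"
openai.api_key = os.getenv("GPT4_API_KEY", "")

response = openai.ChatCompletion.create(
    engine="gpt-4",  # Azure selects the deployment via `engine`, not `model`
    messages=[{"role": "user", "content": "Hello"}],
    temperature=0.7,
    max_tokens=50,
)
print(response["choices"][0]["message"]["content"])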
chatbot/app2.pynnn ADDED
@@ -0,0 +1,213 @@
1
+ import os
2
+ import gradio as gr
3
+ import pandas as pd
4
+ import numpy as np
5
+ import chromadb
6
+ from chromadb.config import Settings
7
+ from io import StringIO
8
+ from sentence_transformers import SentenceTransformer
9
+ import openai
10
+ import plotly.express as px
11
+ from sklearn.manifold import TSNE
12
+
13
+ # Initialize Chroma client with DuckDB and Parquet for persistence
14
+ chroma_client = chromadb.Client(Settings(
15
+ chroma_db_impl="duckdb+parquet",
16
+ persist_directory="./chroma_db"
17
+ ))
18
+
19
+ # Model Configuration for Dynamic Dropdown
20
+ model_config = {
21
+ "gpt-4": {
22
+ "endpoint": "https://roger-m38jr9pd-eastus2.openai.azure.com/openai/deployments/gpt-4/chat/completions?api-version=2024-08-01-preview",
23
+ "api_key": os.getenv("GPT4_API_KEY")
24
+ },
25
+ "gpt-4o": {
26
+ "endpoint": "https://roger-m38jr9pd-eastus2.openai.azure.com/openai/deployments/gpt-4o/chat/completions?api-version=2024-08-01-preview",
27
+ "api_key": os.getenv("GPT4O_API_KEY")
28
+ },
29
+ "gpt-35-turbo": {
30
+ "endpoint": "https://rogerkoranteng.openai.azure.com/openai/deployments/gpt-35-turbo/chat/completions?api-version=2024-08-01-preview",
31
+ "api_key": os.getenv("GPT35_TURBO_API_KEY")
32
+ },
33
+ "gpt-4-32k": {
34
+ "endpoint": "https://roger-m38orjxq-australiaeast.openai.azure.com/openai/deployments/gpt-4-32k/chat/completions?api-version=2024-08-01-preview",
35
+ "api_key": os.getenv("GPT4_32K_API_KEY")
36
+ }
37
+ }
38
+
39
+ # Function to process uploaded CSV
40
+ def process_csv_text(temp_file):
41
+ if isinstance(temp_file, str):
42
+ df = pd.read_csv(StringIO(temp_file))
43
+ else:
44
+ df = pd.read_csv(temp_file.name, header='infer', sep=',')
45
+ return df, gr.Dropdown.update(choices=list(df.columns))
46
+
47
+ # Insert or update ChromaDB with embeddings
48
+ def insert_or_update_chroma(col, table, model_name, similarity_metric, client=chroma_client):
49
+ try:
50
+ collection = client.create_collection(name="my_collection",
51
+ embedding_function=SentenceTransformer(model_name),
52
+ metadata={"hnsw:space": similarity_metric})
53
+ except Exception as e:
54
+ print("Collection exists, deleting it")
55
+ client.delete_collection(name='my_collection')
56
+ collection = client.create_collection(name="my_collection",
57
+ embedding_function=SentenceTransformer(model_name),
58
+ metadata={"hnsw:space": similarity_metric})
59
+ if collection:
60
+ try:
61
+ collection.add(
62
+ documents=list(table[col]),
63
+ metadatas=[{"source": i} for i in range(len(table))],
64
+ ids=[str(i + 1) for i in range(len(table))]
65
+ )
66
+ return "Embedding calculations and insertions successful"
67
+ except Exception as e:
68
+ return "Error in embedding calculations"
69
+
70
+ # Show plot with embeddings using t-SNE
71
+ def show_fig():
72
+ collection = chroma_client.get_collection(name="my_collection")
73
+ embeddings = collection.get(include=['embeddings', 'documents'])
74
+
75
+ df = pd.DataFrame({
76
+ 'text': embeddings['documents'],
77
+ 'embedding': embeddings['embeddings']
78
+ })
79
+
80
+ embeddings_np = np.array(df['embedding'].tolist())
81
+ tsne = TSNE(n_components=2, random_state=42)
82
+ transformed = tsne.fit_transform(embeddings_np)
83
+
84
+ df['tsne_x'] = transformed[:, 0]
85
+ df['tsne_y'] = transformed[:, 1]
86
+
87
+ fig = px.scatter(df, x='tsne_x', y='tsne_y', hover_name='text')
88
+ return fig, transformed
89
+
90
+ # Show test string figure
91
+ def show_test_string_fig(test_string, tsne, model_name, similarity_metric):
92
+ collection = chroma_client.get_collection(name="my_collection",
93
+ embedding_function=SentenceTransformer(model_name))
94
+
95
+ collection.add(
96
+ documents=[test_string],
97
+ metadatas=[{"source": 'test'}],
98
+ ids=['test_sample']
99
+ )
100
+
101
+ embeddings = collection.get(include=['embeddings', 'documents'])
102
+
103
+ df = pd.DataFrame({
104
+ 'text': embeddings['documents'],
105
+ 'embedding': embeddings['embeddings'],
106
+ 'set': ['orig' if document != test_string else 'test_string' for document in embeddings["documents"]]
107
+ })
108
+
109
+ embeddings_np = np.array(df['embedding'].tolist())
110
+ transformed = tsne.transform(embeddings_np)
111
+
112
+ df['tsne_x'] = transformed[:, 0]
113
+ df['tsne_y'] = transformed[:, 1]
114
+
115
+ fig = px.scatter(df, x='tsne_x', y='tsne_y', hover_name='text', color='set')
116
+ return fig, tsne
117
+
118
+ # Function to interact with OpenAI's Azure API
119
+ def ask_gpt(message, messages_history, embedding_model, system_prompt, temperature, max_tokens, chatgpt_model):
120
+ if len(messages_history) < 1:
121
+ messages_history = [{"role": "system", "content": system_prompt}]
122
+ model_info = model_config[chatgpt_model]
123
+ headers = {
124
+ "Content-Type": "application/json",
125
+ "api-key": model_info["api_key"]
126
+ }
127
+
128
+ message = retrieve_similar(message, embedding_model)
129
+
130
+ messages_history += [{"role": "user", "content": message}]
131
+ response = openai.ChatCompletion.create(
132
+ model=chatgpt_model,
133
+ messages=messages_history,
134
+ temperature=temperature,
135
+ max_tokens=max_tokens
136
+ )
137
+
138
+ return response['choices'][0]['message']['content'], messages_history
139
+
140
+ # Function to retrieve similar questions from ChromaDB
141
+ def retrieve_similar(prompt, embedding_model, client=chroma_client):
142
+ collection = client.get_collection(name="my_collection", embedding_function=SentenceTransformer(embedding_model))
143
+
144
+ results = collection.query(query_texts=prompt, n_results=10)
145
+ additional_context = ''
146
+ for i, document in enumerate(results['documents'][0]):
147
+ if i == 0:
148
+ additional_context = 'Information: \n' + str(i+1) + '. ' + document
149
+ else:
150
+ additional_context += '\n' + str(i+1) + '. ' + document
151
+
152
+ prompt_with_context = additional_context + '\nQuestion: ' + prompt
153
+ return prompt_with_context
154
+
155
+ # Gradio App Setup
156
+ with gr.Blocks() as demo:
157
+ # Tab 1: Upload CSV and Display Data
158
+ with gr.Tab("Upload data"):
159
+ upload_button = gr.UploadButton(label="Upload csv", file_types=['.csv'], file_count="single")
160
+ table = gr.Dataframe(type="pandas", max_rows='20', overflow_row_behaviour='paginate', interactive=True)
161
+ cols = gr.Dropdown(choices=[], label='Dataframe columns')
162
+
163
+ upload_button.upload(fn=process_csv_text, inputs=upload_button, outputs=[table, cols], api_name="upload_csv")
164
+
165
+ # Tab 2: ChromaDB, Embeddings, and Plotting
166
+ with gr.Tab("Select Column and insert embeddings to ChromaDb"):
167
+ with gr.Row():
168
+ gr.Markdown("<br>")
169
+ with gr.Row():
170
+ cols = gr.Dropdown(choices=['text_column_1_placeholder'], label='Dataframe columns')
171
+
172
+ with gr.Row():
173
+ embedding_model = gr.Dropdown(value='all-MiniLM-L6-v2', choices=['all-MiniLM-L6-v2', 'intfloat/e5-small-v2', 'intfloat/e5-base-v2', 'intfloat/e5-large-v2','paraphrase-multilingual-MiniLM-L12-v2'], label='Embedding model to use')
174
+ similarity_metric = gr.Dropdown(value='cosine', choices=['cosine', 'l2'], label='Similarity metric to use')
175
+
176
+ with gr.Row():
177
+ embedding_button = gr.Button(value="Insert or update rows from selected column to embeddings db")
178
+ text = gr.Textbox(label='Process status for Chroma', placeholder='This will be updated once you click "Process status for Chroma"')
179
+
180
+ with gr.Row():
181
+ show_embeddings_button = gr.Button(value="Calculate 2d values from embeddings and show scatter plot")
182
+ embeddings_plot = gr.Plot()
183
+
184
+ with gr.Row():
185
+ tsne = gr.State(value=None)
186
+ test_string = gr.Textbox(label='test string to try to embed', value="Insert test string here")
187
+
188
+ with gr.Row():
189
+ calculate_2d_repr_button = gr.Button(value="See where text string is in 2d")
190
+ embeddings_plot_with_text_string = gr.Plot()
191
+
192
+ embedding_button.click(insert_or_update_chroma, inputs=[cols, table, embedding_model, similarity_metric], outputs=[text])
193
+ show_embeddings_button.click(show_fig, inputs=[], outputs=[embeddings_plot, tsne])
194
+ calculate_2d_repr_button.click(show_test_string_fig, inputs=[test_string, tsne, embedding_model, similarity_metric], outputs=[embeddings_plot_with_text_string, tsne])
195
+
196
+ # Tab 3: Chat with GPT Models
197
+ with gr.Tab("Chat"):
198
+ system_prompt = gr.Textbox(value="You are a helpful assistant.", label="System Message")
199
+ chatgpt_model = gr.Dropdown(value="gpt-4", choices=list(model_config.keys()), label="ChatGPT Model to Use")
200
+ temperature = gr.Slider(minimum=0, maximum=2, step=0.1, value=0.7, label="Temperature")
201
+ max_tokens = gr.Slider(minimum=50, maximum=2000, step=50, value=300, label="Max Tokens")
202
+ chatbot = gr.Chatbot(label="ChatGPT Chat")
203
+ clear_button = gr.Button("Clear Chat History")
204
+
205
+ msg = gr.Textbox()
206
+ msg_log = gr.Textbox("Message history will be visible here", label='Message history')
207
+
208
+ msg.submit(ask_gpt, [msg, chatbot, embedding_model, system_prompt, temperature, max_tokens, chatgpt_model], [msg, chatbot])
209
+ # Note: gr.Chatbot has no submit event; the textbox submission above drives the chat
210
+ clear_button.click(fn=lambda: None, inputs=None, outputs=[chatbot])
211
+
212
+ # Launch Gradio interface
213
+ demo.launch()
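Both scripts pass a raw SentenceTransformer instance as the Chroma embedding_function, but Chroma expects a callable that maps a list of texts to a list of vectors. A minimal sketch follows, using the wrapper shipped in recent chromadb releases plus a hand-rolled fallback; the collection and model names mirror the scripts above, while everything else is illustrative and assumes chromadb and sentence-transformers are installed.

import chromadb
from chromadb.utils import embedding_functions
from sentence_transformers import SentenceTransformer

class STEmbedder:
    """Fallback wrapper: makes a SentenceTransformer callable like a Chroma embedding function."""
    def __init__(self, model_name):
        self.model = SentenceTransformer(model_name)
    def __call__(self, input):
        return self.model.encode(list(input)).tolist()

client = chromadb.Client()
# Preferred: the built-in wrapper, if the installed chromadb version provides it
ef = embedding_functions.SentenceTransformerEmbeddingFunction(model_name="all-MiniLM-L6-v2")
collection = client.get_or_create_collection(
    name="my_collection",
    embedding_function=ef,
    metadata={"hnsw:space": "cosine"},
)
collection.add(documents=["hello world"], ids=["1"])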
chatbot/bar_chart.png ADDED
chatbot/category_breakdown.png ADDED
chatbot/chat_history.json ADDED
@@ -0,0 +1,62 @@
1
+ [
2
+ {
3
+ "role": "assistant",
4
+ "content": "Hello, I am Sage. How can I assist you today?"
5
+ },
6
+ {
7
+ "role": "user",
8
+ "content": "Hi Sage, my boyfriend broke up with me. I'm feeling so sad. What should I do?"
9
+ },
10
+ {
11
+ "role": "assistant",
12
+ "content": "I'm sorry to hear that you're going through a tough time. Breakups can be incredibly painful, but there are ways to help you cope with the emotions and eventually move forward. Here are a few suggestions:\n\n1. **Allow Yourself to Grieve**: It's okay to feel sad, angry, or confused. Allow yourself time to process these emotions. It's an important part of healing.\n\n2. **Reach Out for Support**: Talk to friends and family who care about you. Sharing your feelings can be very therapeutic and remind you that you're not alone.\n\n3. **Stay Active**: Engaging in physical activities can help boost your mood and take your mind off the pain. Consider trying out exercises like walking, yoga, or anything you enjoy.\n\n4. **Pursue Interests**: Reconnect with hobbies and interests that you may have neglected. This can help you find joy and fulfillment in other areas of your life.\n\n5. **Avoid Contact**: It might be helpful to take a break from seeing or communicating with your ex while you heal. Constant contact can often prolong the pain.\n\n6. **Seek Professional Help**: If you find it very difficult to cope, speaking with a therapist or counselor can provide professional support and strategies to manage your feelings.\n\nRemember, it's okay to take things one day at a time. Healing is a process, and it's different for everyone."
13
+ },
14
+ {
15
+ "role": "user",
16
+ "content": "Hello, Sage. I feel like nobody likes me. What should I do?"
17
+ },
18
+ {
19
+ "role": "assistant",
20
+ "content": "Feeling like nobody likes you can be really tough, but remember, these feelings don't define your worth or your ability to be liked. Here are some steps you can take to address these feelings:\n\n1. **Reflect on Your Feelings**: Sometimes, these feelings might not reflect reality but rather our own insecurities or past experiences. Try to assess if there's any specific reason you feel this way.\n\n2. **Communicate**: Talk to people you trust about how you feel. They might provide reassurance and help you see that people do care about you.\n\n3. **Build Self-Esteem**: Engage in activities that make you feel good about yourself. This could be hobbies, sports, learning new skills, or volunteering. Accomplishments can boost your self-esteem and change how you perceive yourself.\n\n4. **Social Skills**: Sometimes, improving social skills can boost your confidence in interactions. This might include practicing small talk, being a good listener, or showing interest in others' activities.\n\n5. **Meet New People**: Sometimes the feeling of not being liked can stem from not having found the right group of friends. Try joining clubs, groups, or online communities where you can meet people with similar interests.\n\n6. **Consider Professional Help**: If these feelings are overwhelming and persistent, talking to a therapist or counselor can be very beneficial. They can help you work through these feelings and develop strategies to improve your social interactions.\n\nRemember, many people feel this way at some point in their lives. It's brave to confront these feelings and seek ways to improve your situation. You're not alone in this."
21
+ },
22
+ {
23
+ "role": "user",
24
+ "content": "Hello, I am feeling sad today, what should I do?"
25
+ },
26
+ {
27
+ "role": "assistant",
28
+ "content": "I'm sorry to hear that you're feeling sad. It's important to take care of yourself and address your emotions. Here are a few suggestions that might help improve your mood:\n\n1. **Acknowledge Your Feelings**: It's okay to feel sad. Acknowledging your emotions is the first step toward feeling better. Allow yourself some time to just feel what you're feeling without judgment.\n\n2. **Engage in a Favorite Activity**: Doing something you enjoy, such as reading a book, watching a favorite movie, or engaging in a hobby, can provide a comforting distraction and bring some joy.\n\n3. **Connect with Others**: Reach out to friends or family members who can offer support. Sometimes just talking about how you feel can lighten the burden.\n\n4. **Get Some Exercise**: Physical activity can boost endorphins, which are chemicals in the brain that can lighten your mood. Even a short walk can make a difference.\n\n5. **Listen to Music**: Music can have a powerful effect on mood. Choose songs that lift your spirits or calm and soothe you, depending on what you feel you need.\n\n6. **Practice Mindfulness or Meditation**: These practices can help you stay present and reduce negative thoughts. Even a few minutes can have a significant impact.\n\n7. **Consider Professional Help**: If sadness persists and starts to affect your daily life, it might be helpful to talk to a mental health professional who can provide guidance and support.\n\nRemember, it's perfectly okay to have days when you feel down. Taking steps to care for yourself during these times is important."
29
+ },
30
+ {
31
+ "role": "user",
32
+ "content": "okay thanks how do i contact mental health persons"
33
+ },
34
+ {
35
+ "role": "assistant",
36
+ "content": "Seeking help from a mental health professional is a positive step towards taking care of your well-being. Here\u2019s how you can get started:\n\n1. **Primary Care Doctor**: You can start by talking to your primary care doctor. They can assess your situation and may provide a referral to a mental health specialist such as a psychologist or psychiatrist.\n\n2. **Insurance Provider**: If you have health insurance, check your provider\u2019s website or call their customer service for a list of covered mental health professionals. They can guide you on how to proceed based on your plan.\n\n3. **Online Directories**: Use reputable online directories or websites that list mental health professionals. Websites like Psychology Today, GoodTherapy, or even Google can help you find therapists in your area.\n\n4. **Local Mental Health Clinics**: Look for community mental health clinics in your area. These often offer services on a sliding scale fee based on income, making them accessible to more people.\n\n5. **Telehealth Services**: Consider online therapy options, which can be more convenient and sometimes more affordable. Platforms like BetterHelp, Talkspace, or Amwell offer various online therapy services.\n\n6. **Employee Assistance Programs (EAP)**: If you are employed, check if your employer offers an Employee Assistance Program, which often includes a few free therapy sessions.\n\n7. **School Counselors**: If you are a student, your school or university likely has counselors available to help students at no extra cost.\n\nRemember, seeking help is a sign of strength, and it\u2019s important to find a professional you feel comfortable with. Don\u2019t hesitate to reach out and take that first step."
37
+ },
38
+ {
39
+ "role": "user",
40
+ "content": "Hello, I am feeling sad today, what should I do?"
41
+ },
42
+ {
43
+ "role": "assistant",
44
+ "content": "I'm sorry to hear that you're feeling sad. It's important to take care of yourself and address your emotions. Here are a few suggestions that might help improve your mood:\n\n1. **Acknowledge Your Feelings**: It's okay to feel sad. Acknowledging your emotions is the first step toward feeling better. Allow yourself some time to just feel what you're feeling without judgment.\n\n2. **Engage in a Favorite Activity**: Doing something you enjoy, such as reading a book, watching a favorite movie, or engaging in a hobby, can provide a comforting distraction and bring some joy.\n\n3. **Connect with Others**: Reach out to friends or family members who can offer support. Sometimes just talking about how you feel can lighten the burden.\n\n4. **Get Some Exercise**: Physical activity can boost endorphins, which are chemicals in the brain that can lighten your mood. Even a short walk can make a difference.\n\n5. **Listen to Music**: Music can have a powerful effect on mood. Choose songs that lift your spirits or calm and soothe you, depending on what you feel you need.\n\n6. **Practice Mindfulness or Meditation**: These practices can help you stay present and reduce negative thoughts. Even a few minutes can have a significant impact.\n\n7. **Consider Professional Help**: If sadness persists and starts to affect your daily life, it might be helpful to talk to a mental health professional who can provide guidance and support.\n\nRemember, it's perfectly okay to have days when you feel down. Taking steps to care for yourself during these times is important."
45
+ },
46
+ {
47
+ "role": "user",
48
+ "content": "I'M SORRY TO HEAR THAT YOUR FEELING SAD ITS IMPORTANT TO TAKE CARE OF YOURSELF AND ADDRESS YOUREMOTIONS ERE ARE A FU SUGGESTIONS THAT MIGHT HELP IMPROVE YOUR MOOD ONE ACKNOWLEDGE YOUR FEELINS ITS OCADE A FEEL SAD ACKNOWLEDGING YOUR EMOTIONS AS THE FIRST STEP TOWARD FEELING BETTER ALLOW YOUR SELF SOME TIME TO JUST FEEL WHAT YOUR FEELING WITHOUT JUDGMAN TWO ENGAGIN A FAVORITE ACTIVITY DOING SOMETHING YOU IN JOY SUCH AS READING A BOOK WATCHING A FVORITE MOVY OR ENGAGING IN A HOBB CAN PROVIDE A COMFORTING DISTRACTION AND BRING SOME JOY THREE CONNECT WITH OTHERS REACH OUT TO FRIENDS OUR FAMLY MEMBERS WHO CAN OFFER SUPPORT SOMETIMES JUST TALTING ABOUT HOW YOU FEEL CAN LIGHTEN THE BURDN FOUR GET SOME EXERCISE FISICAL ACTIVITY CAN BOOSED AN DORPHANS WICH OURE CHEMICULS IN THE BRAIN THAT CAN LIGHTEN YOUR MOOD EVEN A SHORT WALK CAN MAKE A DIFFERENE FIVE LISEN TO MUSIC MUSIC CAN HAVE A POWERFUL AFFECT ON MOOD CHOOSE SONGS THAT LIFT YOUR PIRITS ARE COMMAN SOOTHYOU DEPENDING ON WHAT YOU FEEL YOU NEED SIX PRACTICE MINDFULNESS OR MEDITATION VUSE PRACTICES CAN HELP YOU STAKE PRESON AND REDUCE NEGATIVE THOUGHTS EVEN A FEW MINUTES CAN HAVE US O NIFICANT IMPACT SEVEN CAN SIDER PROFESSIONAL HELP IT SADNESS PERSISTS AND STARTS TO EFFECT YOUR DAILY LIFE IT MIGHT BE HELPFUL TO TALK TO A MENTAL HELTH PROFESSIONAL WHO CAN PROVIDE GUIDANCE AND SUPORT REMEMBER ITS PERFECALY OCADE TO HAVE DAYS WHEN YOU FEEL DOWN TACING STEPS TO CARE FOR YOURSELF DURING THESE TIMES AS IMPORDANT"
49
+ },
50
+ {
51
+ "role": "assistant",
52
+ "content": "Thank you for your concern, but I am an AI language model and do not have feelings. My purpose is to provide assistance and support to those who need it. If you have any other questions or concerns, please feel free to ask."
53
+ },
54
+ {
55
+ "role": "user",
56
+ "content": "what AI model are you GPT or what"
57
+ },
58
+ {
59
+ "role": "assistant",
60
+ "content": "Yes, I am a GPT (Generative Pre-trained Transformer) AI language model developed by OpenAI. I am designed to understand natural language and generate human-like responses to various queries and prompts. My purpose is to assist and provide helpful responses to users like you."
61
+ }
62
+ ]
chatbot/chroma_migrated/chroma.sqlite3 ADDED
Binary file (98.3 kB).
 
chatbot/fine_tuned_model/config.json ADDED
@@ -0,0 +1,2026 @@
1
+ {
2
+ "_name_or_path": "google/vit-base-patch16-224",
3
+ "architectures": [
4
+ "ViTForImageClassification"
5
+ ],
6
+ "attention_probs_dropout_prob": 0.0,
7
+ "encoder_stride": 16,
8
+ "hidden_act": "gelu",
9
+ "hidden_dropout_prob": 0.0,
10
+ "hidden_size": 768,
11
+ "id2label": {
12
+ "0": "tench, Tinca tinca",
13
+ "1": "goldfish, Carassius auratus",
14
+ "2": "great white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias",
15
+ "3": "tiger shark, Galeocerdo cuvieri",
16
+ "4": "hammerhead, hammerhead shark",
17
+ "5": "electric ray, crampfish, numbfish, torpedo",
18
+ "6": "stingray",
19
+ "7": "cock",
20
+ "8": "hen",
21
+ "9": "ostrich, Struthio camelus",
22
+ "10": "brambling, Fringilla montifringilla",
23
+ "11": "goldfinch, Carduelis carduelis",
24
+ "12": "house finch, linnet, Carpodacus mexicanus",
25
+ "13": "junco, snowbird",
26
+ "14": "indigo bunting, indigo finch, indigo bird, Passerina cyanea",
27
+ "15": "robin, American robin, Turdus migratorius",
28
+ "16": "bulbul",
29
+ "17": "jay",
30
+ "18": "magpie",
31
+ "19": "chickadee",
32
+ "20": "water ouzel, dipper",
33
+ "21": "kite",
34
+ "22": "bald eagle, American eagle, Haliaeetus leucocephalus",
35
+ "23": "vulture",
36
+ "24": "great grey owl, great gray owl, Strix nebulosa",
37
+ "25": "European fire salamander, Salamandra salamandra",
38
+ "26": "common newt, Triturus vulgaris",
39
+ "27": "eft",
40
+ "28": "spotted salamander, Ambystoma maculatum",
41
+ "29": "axolotl, mud puppy, Ambystoma mexicanum",
42
+ "30": "bullfrog, Rana catesbeiana",
43
+ "31": "tree frog, tree-frog",
44
+ "32": "tailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui",
45
+ "33": "loggerhead, loggerhead turtle, Caretta caretta",
46
+ "34": "leatherback turtle, leatherback, leathery turtle, Dermochelys coriacea",
47
+ "35": "mud turtle",
48
+ "36": "terrapin",
49
+ "37": "box turtle, box tortoise",
50
+ "38": "banded gecko",
51
+ "39": "common iguana, iguana, Iguana iguana",
52
+ "40": "American chameleon, anole, Anolis carolinensis",
53
+ "41": "whiptail, whiptail lizard",
54
+ "42": "agama",
55
+ "43": "frilled lizard, Chlamydosaurus kingi",
56
+ "44": "alligator lizard",
57
+ "45": "Gila monster, Heloderma suspectum",
58
+ "46": "green lizard, Lacerta viridis",
59
+ "47": "African chameleon, Chamaeleo chamaeleon",
60
+ "48": "Komodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis",
61
+ "49": "African crocodile, Nile crocodile, Crocodylus niloticus",
62
+ "50": "American alligator, Alligator mississipiensis",
63
+ "51": "triceratops",
64
+ "52": "thunder snake, worm snake, Carphophis amoenus",
65
+ "53": "ringneck snake, ring-necked snake, ring snake",
66
+ "54": "hognose snake, puff adder, sand viper",
67
+ "55": "green snake, grass snake",
68
+ "56": "king snake, kingsnake",
69
+ "57": "garter snake, grass snake",
70
+ "58": "water snake",
71
+ "59": "vine snake",
72
+ "60": "night snake, Hypsiglena torquata",
73
+ "61": "boa constrictor, Constrictor constrictor",
74
+ "62": "rock python, rock snake, Python sebae",
75
+ "63": "Indian cobra, Naja naja",
76
+ "64": "green mamba",
77
+ "65": "sea snake",
78
+ "66": "horned viper, cerastes, sand viper, horned asp, Cerastes cornutus",
79
+ "67": "diamondback, diamondback rattlesnake, Crotalus adamanteus",
80
+ "68": "sidewinder, horned rattlesnake, Crotalus cerastes",
81
+ "69": "trilobite",
82
+ "70": "harvestman, daddy longlegs, Phalangium opilio",
83
+ "71": "scorpion",
84
+ "72": "black and gold garden spider, Argiope aurantia",
85
+ "73": "barn spider, Araneus cavaticus",
86
+ "74": "garden spider, Aranea diademata",
87
+ "75": "black widow, Latrodectus mactans",
88
+ "76": "tarantula",
89
+ "77": "wolf spider, hunting spider",
90
+ "78": "tick",
91
+ "79": "centipede",
92
+ "80": "black grouse",
93
+ "81": "ptarmigan",
94
+ "82": "ruffed grouse, partridge, Bonasa umbellus",
95
+ "83": "prairie chicken, prairie grouse, prairie fowl",
96
+ "84": "peacock",
97
+ "85": "quail",
98
+ "86": "partridge",
99
+ "87": "African grey, African gray, Psittacus erithacus",
100
+ "88": "macaw",
101
+ "89": "sulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita",
102
+ "90": "lorikeet",
103
+ "91": "coucal",
104
+ "92": "bee eater",
105
+ "93": "hornbill",
106
+ "94": "hummingbird",
107
+ "95": "jacamar",
108
+ "96": "toucan",
109
+ "97": "drake",
110
+ "98": "red-breasted merganser, Mergus serrator",
111
+ "99": "goose",
112
+ "100": "black swan, Cygnus atratus",
113
+ "101": "tusker",
114
+ "102": "echidna, spiny anteater, anteater",
115
+ "103": "platypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus",
116
+ "104": "wallaby, brush kangaroo",
117
+ "105": "koala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus",
118
+ "106": "wombat",
119
+ "107": "jellyfish",
120
+ "108": "sea anemone, anemone",
121
+ "109": "brain coral",
122
+ "110": "flatworm, platyhelminth",
123
+ "111": "nematode, nematode worm, roundworm",
124
+ "112": "conch",
125
+ "113": "snail",
126
+ "114": "slug",
127
+ "115": "sea slug, nudibranch",
128
+ "116": "chiton, coat-of-mail shell, sea cradle, polyplacophore",
129
+ "117": "chambered nautilus, pearly nautilus, nautilus",
130
+ "118": "Dungeness crab, Cancer magister",
131
+ "119": "rock crab, Cancer irroratus",
132
+ "120": "fiddler crab",
133
+ "121": "king crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica",
134
+ "122": "American lobster, Northern lobster, Maine lobster, Homarus americanus",
135
+ "123": "spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish",
136
+ "124": "crayfish, crawfish, crawdad, crawdaddy",
137
+ "125": "hermit crab",
138
+ "126": "isopod",
139
+ "127": "white stork, Ciconia ciconia",
140
+ "128": "black stork, Ciconia nigra",
141
+ "129": "spoonbill",
142
+ "130": "flamingo",
143
+ "131": "little blue heron, Egretta caerulea",
144
+ "132": "American egret, great white heron, Egretta albus",
145
+ "133": "bittern",
146
+ "134": "crane",
147
+ "135": "limpkin, Aramus pictus",
148
+ "136": "European gallinule, Porphyrio porphyrio",
149
+ "137": "American coot, marsh hen, mud hen, water hen, Fulica americana",
150
+ "138": "bustard",
151
+ "139": "ruddy turnstone, Arenaria interpres",
152
+ "140": "red-backed sandpiper, dunlin, Erolia alpina",
153
+ "141": "redshank, Tringa totanus",
154
+ "142": "dowitcher",
155
+ "143": "oystercatcher, oyster catcher",
156
+ "144": "pelican",
157
+ "145": "king penguin, Aptenodytes patagonica",
158
+ "146": "albatross, mollymawk",
159
+ "147": "grey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus",
160
+ "148": "killer whale, killer, orca, grampus, sea wolf, Orcinus orca",
161
+ "149": "dugong, Dugong dugon",
162
+ "150": "sea lion",
163
+ "151": "Chihuahua",
164
+ "152": "Japanese spaniel",
165
+ "153": "Maltese dog, Maltese terrier, Maltese",
166
+ "154": "Pekinese, Pekingese, Peke",
167
+ "155": "Shih-Tzu",
168
+ "156": "Blenheim spaniel",
169
+ "157": "papillon",
170
+ "158": "toy terrier",
171
+ "159": "Rhodesian ridgeback",
172
+ "160": "Afghan hound, Afghan",
173
+ "161": "basset, basset hound",
174
+ "162": "beagle",
175
+ "163": "bloodhound, sleuthhound",
176
+ "164": "bluetick",
177
+ "165": "black-and-tan coonhound",
178
+ "166": "Walker hound, Walker foxhound",
179
+ "167": "English foxhound",
180
+ "168": "redbone",
181
+ "169": "borzoi, Russian wolfhound",
182
+ "170": "Irish wolfhound",
183
+ "171": "Italian greyhound",
184
+ "172": "whippet",
185
+ "173": "Ibizan hound, Ibizan Podenco",
186
+ "174": "Norwegian elkhound, elkhound",
187
+ "175": "otterhound, otter hound",
188
+ "176": "Saluki, gazelle hound",
189
+ "177": "Scottish deerhound, deerhound",
190
+ "178": "Weimaraner",
191
+ "179": "Staffordshire bullterrier, Staffordshire bull terrier",
192
+ "180": "American Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier",
193
+ "181": "Bedlington terrier",
194
+ "182": "Border terrier",
195
+ "183": "Kerry blue terrier",
196
+ "184": "Irish terrier",
197
+ "185": "Norfolk terrier",
198
+ "186": "Norwich terrier",
199
+ "187": "Yorkshire terrier",
200
+ "188": "wire-haired fox terrier",
201
+ "189": "Lakeland terrier",
202
+ "190": "Sealyham terrier, Sealyham",
203
+ "191": "Airedale, Airedale terrier",
204
+ "192": "cairn, cairn terrier",
205
+ "193": "Australian terrier",
206
+ "194": "Dandie Dinmont, Dandie Dinmont terrier",
207
+ "195": "Boston bull, Boston terrier",
208
+ "196": "miniature schnauzer",
209
+ "197": "giant schnauzer",
210
+ "198": "standard schnauzer",
211
+ "199": "Scotch terrier, Scottish terrier, Scottie",
212
+ "200": "Tibetan terrier, chrysanthemum dog",
213
+ "201": "silky terrier, Sydney silky",
214
+ "202": "soft-coated wheaten terrier",
215
+ "203": "West Highland white terrier",
216
+ "204": "Lhasa, Lhasa apso",
217
+ "205": "flat-coated retriever",
218
+ "206": "curly-coated retriever",
219
+ "207": "golden retriever",
220
+ "208": "Labrador retriever",
221
+ "209": "Chesapeake Bay retriever",
222
+ "210": "German short-haired pointer",
223
+ "211": "vizsla, Hungarian pointer",
224
+ "212": "English setter",
225
+ "213": "Irish setter, red setter",
226
+ "214": "Gordon setter",
227
+ "215": "Brittany spaniel",
228
+ "216": "clumber, clumber spaniel",
229
+ "217": "English springer, English springer spaniel",
230
+ "218": "Welsh springer spaniel",
231
+ "219": "cocker spaniel, English cocker spaniel, cocker",
232
+ "220": "Sussex spaniel",
233
+ "221": "Irish water spaniel",
234
+ "222": "kuvasz",
235
+ "223": "schipperke",
236
+ "224": "groenendael",
237
+ "225": "malinois",
238
+ "226": "briard",
239
+ "227": "kelpie",
240
+ "228": "komondor",
241
+ "229": "Old English sheepdog, bobtail",
242
+ "230": "Shetland sheepdog, Shetland sheep dog, Shetland",
243
+ "231": "collie",
244
+ "232": "Border collie",
245
+ "233": "Bouvier des Flandres, Bouviers des Flandres",
246
+ "234": "Rottweiler",
247
+ "235": "German shepherd, German shepherd dog, German police dog, alsatian",
248
+ "236": "Doberman, Doberman pinscher",
249
+ "237": "miniature pinscher",
250
+ "238": "Greater Swiss Mountain dog",
251
+ "239": "Bernese mountain dog",
252
+ "240": "Appenzeller",
253
+ "241": "EntleBucher",
254
+ "242": "boxer",
255
+ "243": "bull mastiff",
256
+ "244": "Tibetan mastiff",
257
+ "245": "French bulldog",
258
+ "246": "Great Dane",
259
+ "247": "Saint Bernard, St Bernard",
260
+ "248": "Eskimo dog, husky",
261
+ "249": "malamute, malemute, Alaskan malamute",
262
+ "250": "Siberian husky",
263
+ "251": "dalmatian, coach dog, carriage dog",
264
+ "252": "affenpinscher, monkey pinscher, monkey dog",
265
+ "253": "basenji",
266
+ "254": "pug, pug-dog",
267
+ "255": "Leonberg",
268
+ "256": "Newfoundland, Newfoundland dog",
269
+ "257": "Great Pyrenees",
270
+ "258": "Samoyed, Samoyede",
271
+ "259": "Pomeranian",
272
+ "260": "chow, chow chow",
273
+ "261": "keeshond",
274
+ "262": "Brabancon griffon",
275
+ "263": "Pembroke, Pembroke Welsh corgi",
276
+ "264": "Cardigan, Cardigan Welsh corgi",
277
+ "265": "toy poodle",
278
+ "266": "miniature poodle",
279
+ "267": "standard poodle",
280
+ "268": "Mexican hairless",
281
+ "269": "timber wolf, grey wolf, gray wolf, Canis lupus",
282
+ "270": "white wolf, Arctic wolf, Canis lupus tundrarum",
283
+ "271": "red wolf, maned wolf, Canis rufus, Canis niger",
284
+ "272": "coyote, prairie wolf, brush wolf, Canis latrans",
285
+ "273": "dingo, warrigal, warragal, Canis dingo",
286
+ "274": "dhole, Cuon alpinus",
287
+ "275": "African hunting dog, hyena dog, Cape hunting dog, Lycaon pictus",
288
+ "276": "hyena, hyaena",
289
+ "277": "red fox, Vulpes vulpes",
290
+ "278": "kit fox, Vulpes macrotis",
291
+ "279": "Arctic fox, white fox, Alopex lagopus",
292
+ "280": "grey fox, gray fox, Urocyon cinereoargenteus",
293
+ "281": "tabby, tabby cat",
294
+ "282": "tiger cat",
295
+ "283": "Persian cat",
296
+ "284": "Siamese cat, Siamese",
297
+ "285": "Egyptian cat",
298
+ "286": "cougar, puma, catamount, mountain lion, painter, panther, Felis concolor",
299
+ "287": "lynx, catamount",
300
+ "288": "leopard, Panthera pardus",
301
+ "289": "snow leopard, ounce, Panthera uncia",
302
+ "290": "jaguar, panther, Panthera onca, Felis onca",
303
+ "291": "lion, king of beasts, Panthera leo",
304
+ "292": "tiger, Panthera tigris",
305
+ "293": "cheetah, chetah, Acinonyx jubatus",
306
+ "294": "brown bear, bruin, Ursus arctos",
307
+ "295": "American black bear, black bear, Ursus americanus, Euarctos americanus",
308
+ "296": "ice bear, polar bear, Ursus Maritimus, Thalarctos maritimus",
309
+ "297": "sloth bear, Melursus ursinus, Ursus ursinus",
310
+ "298": "mongoose",
311
+ "299": "meerkat, mierkat",
312
+ "300": "tiger beetle",
313
+ "301": "ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle",
314
+ "302": "ground beetle, carabid beetle",
315
+ "303": "long-horned beetle, longicorn, longicorn beetle",
316
+ "304": "leaf beetle, chrysomelid",
317
+ "305": "dung beetle",
318
+ "306": "rhinoceros beetle",
319
+ "307": "weevil",
320
+ "308": "fly",
321
+ "309": "bee",
322
+ "310": "ant, emmet, pismire",
323
+ "311": "grasshopper, hopper",
324
+ "312": "cricket",
325
+ "313": "walking stick, walkingstick, stick insect",
326
+ "314": "cockroach, roach",
327
+ "315": "mantis, mantid",
328
+ "316": "cicada, cicala",
329
+ "317": "leafhopper",
330
+ "318": "lacewing, lacewing fly",
331
+ "319": "dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk",
332
+ "320": "damselfly",
333
+ "321": "admiral",
334
+ "322": "ringlet, ringlet butterfly",
335
+ "323": "monarch, monarch butterfly, milkweed butterfly, Danaus plexippus",
336
+ "324": "cabbage butterfly",
337
+ "325": "sulphur butterfly, sulfur butterfly",
338
+ "326": "lycaenid, lycaenid butterfly",
339
+ "327": "starfish, sea star",
340
+ "328": "sea urchin",
341
+ "329": "sea cucumber, holothurian",
342
+ "330": "wood rabbit, cottontail, cottontail rabbit",
343
+ "331": "hare",
344
+ "332": "Angora, Angora rabbit",
345
+ "333": "hamster",
346
+ "334": "porcupine, hedgehog",
347
+ "335": "fox squirrel, eastern fox squirrel, Sciurus niger",
348
+ "336": "marmot",
349
+ "337": "beaver",
350
+ "338": "guinea pig, Cavia cobaya",
351
+ "339": "sorrel",
352
+ "340": "zebra",
353
+ "341": "hog, pig, grunter, squealer, Sus scrofa",
354
+ "342": "wild boar, boar, Sus scrofa",
355
+ "343": "warthog",
356
+ "344": "hippopotamus, hippo, river horse, Hippopotamus amphibius",
357
+ "345": "ox",
358
+ "346": "water buffalo, water ox, Asiatic buffalo, Bubalus bubalis",
359
+ "347": "bison",
360
+ "348": "ram, tup",
361
+ "349": "bighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis",
362
+ "350": "ibex, Capra ibex",
363
+ "351": "hartebeest",
364
+ "352": "impala, Aepyceros melampus",
365
+ "353": "gazelle",
366
+ "354": "Arabian camel, dromedary, Camelus dromedarius",
367
+ "355": "llama",
368
+ "356": "weasel",
369
+ "357": "mink",
370
+ "358": "polecat, fitch, foulmart, foumart, Mustela putorius",
371
+ "359": "black-footed ferret, ferret, Mustela nigripes",
372
+ "360": "otter",
373
+ "361": "skunk, polecat, wood pussy",
374
+ "362": "badger",
375
+ "363": "armadillo",
376
+ "364": "three-toed sloth, ai, Bradypus tridactylus",
377
+ "365": "orangutan, orang, orangutang, Pongo pygmaeus",
378
+ "366": "gorilla, Gorilla gorilla",
379
+ "367": "chimpanzee, chimp, Pan troglodytes",
380
+ "368": "gibbon, Hylobates lar",
381
+ "369": "siamang, Hylobates syndactylus, Symphalangus syndactylus",
382
+ "370": "guenon, guenon monkey",
383
+ "371": "patas, hussar monkey, Erythrocebus patas",
384
+ "372": "baboon",
385
+ "373": "macaque",
386
+ "374": "langur",
387
+ "375": "colobus, colobus monkey",
388
+ "376": "proboscis monkey, Nasalis larvatus",
389
+ "377": "marmoset",
390
+ "378": "capuchin, ringtail, Cebus capucinus",
391
+ "379": "howler monkey, howler",
392
+ "380": "titi, titi monkey",
393
+ "381": "spider monkey, Ateles geoffroyi",
394
+ "382": "squirrel monkey, Saimiri sciureus",
395
+ "383": "Madagascar cat, ring-tailed lemur, Lemur catta",
396
+ "384": "indri, indris, Indri indri, Indri brevicaudatus",
397
+ "385": "Indian elephant, Elephas maximus",
398
+ "386": "African elephant, Loxodonta africana",
399
+ "387": "lesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens",
400
+ "388": "giant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca",
401
+ "389": "barracouta, snoek",
402
+ "390": "eel",
403
+ "391": "coho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch",
404
+ "392": "rock beauty, Holocanthus tricolor",
405
+ "393": "anemone fish",
406
+ "394": "sturgeon",
407
+ "395": "gar, garfish, garpike, billfish, Lepisosteus osseus",
408
+ "396": "lionfish",
409
+ "397": "puffer, pufferfish, blowfish, globefish",
410
+ "398": "abacus",
411
+ "399": "abaya",
412
+ "400": "academic gown, academic robe, judge's robe",
413
+ "401": "accordion, piano accordion, squeeze box",
414
+ "402": "acoustic guitar",
415
+ "403": "aircraft carrier, carrier, flattop, attack aircraft carrier",
416
+ "404": "airliner",
417
+ "405": "airship, dirigible",
418
+ "406": "altar",
419
+ "407": "ambulance",
420
+ "408": "amphibian, amphibious vehicle",
421
+ "409": "analog clock",
422
+ "410": "apiary, bee house",
423
+ "411": "apron",
424
+ "412": "ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin",
425
+ "413": "assault rifle, assault gun",
426
+ "414": "backpack, back pack, knapsack, packsack, rucksack, haversack",
427
+ "415": "bakery, bakeshop, bakehouse",
428
+ "416": "balance beam, beam",
429
+ "417": "balloon",
430
+ "418": "ballpoint, ballpoint pen, ballpen, Biro",
431
+ "419": "Band Aid",
432
+ "420": "banjo",
433
+ "421": "bannister, banister, balustrade, balusters, handrail",
434
+ "422": "barbell",
435
+ "423": "barber chair",
436
+ "424": "barbershop",
437
+ "425": "barn",
438
+ "426": "barometer",
439
+ "427": "barrel, cask",
440
+ "428": "barrow, garden cart, lawn cart, wheelbarrow",
441
+ "429": "baseball",
442
+ "430": "basketball",
443
+ "431": "bassinet",
444
+ "432": "bassoon",
445
+ "433": "bathing cap, swimming cap",
446
+ "434": "bath towel",
447
+ "435": "bathtub, bathing tub, bath, tub",
448
+ "436": "beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon",
449
+ "437": "beacon, lighthouse, beacon light, pharos",
450
+ "438": "beaker",
451
+ "439": "bearskin, busby, shako",
452
+ "440": "beer bottle",
453
+ "441": "beer glass",
454
+ "442": "bell cote, bell cot",
455
+ "443": "bib",
456
+ "444": "bicycle-built-for-two, tandem bicycle, tandem",
457
+ "445": "bikini, two-piece",
458
+ "446": "binder, ring-binder",
459
+ "447": "binoculars, field glasses, opera glasses",
460
+ "448": "birdhouse",
461
+ "449": "boathouse",
462
+ "450": "bobsled, bobsleigh, bob",
463
+ "451": "bolo tie, bolo, bola tie, bola",
464
+ "452": "bonnet, poke bonnet",
465
+ "453": "bookcase",
466
+ "454": "bookshop, bookstore, bookstall",
467
+ "455": "bottlecap",
468
+ "456": "bow",
469
+ "457": "bow tie, bow-tie, bowtie",
470
+ "458": "brass, memorial tablet, plaque",
471
+ "459": "brassiere, bra, bandeau",
472
+ "460": "breakwater, groin, groyne, mole, bulwark, seawall, jetty",
473
+ "461": "breastplate, aegis, egis",
474
+ "462": "broom",
475
+ "463": "bucket, pail",
476
+ "464": "buckle",
477
+ "465": "bulletproof vest",
478
+ "466": "bullet train, bullet",
479
+ "467": "butcher shop, meat market",
480
+ "468": "cab, hack, taxi, taxicab",
481
+ "469": "caldron, cauldron",
482
+ "470": "candle, taper, wax light",
483
+ "471": "cannon",
484
+ "472": "canoe",
485
+ "473": "can opener, tin opener",
486
+ "474": "cardigan",
487
+ "475": "car mirror",
488
+ "476": "carousel, carrousel, merry-go-round, roundabout, whirligig",
489
+ "477": "carpenter's kit, tool kit",
490
+ "478": "carton",
491
+ "479": "car wheel",
492
+ "480": "cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM",
493
+ "481": "cassette",
494
+ "482": "cassette player",
495
+ "483": "castle",
496
+ "484": "catamaran",
497
+ "485": "CD player",
498
+ "486": "cello, violoncello",
499
+ "487": "cellular telephone, cellular phone, cellphone, cell, mobile phone",
500
+ "488": "chain",
501
+ "489": "chainlink fence",
502
+ "490": "chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour",
503
+ "491": "chain saw, chainsaw",
504
+ "492": "chest",
505
+ "493": "chiffonier, commode",
506
+ "494": "chime, bell, gong",
507
+ "495": "china cabinet, china closet",
508
+ "496": "Christmas stocking",
509
+ "497": "church, church building",
510
+ "498": "cinema, movie theater, movie theatre, movie house, picture palace",
511
+ "499": "cleaver, meat cleaver, chopper",
512
+ "500": "cliff dwelling",
513
+ "501": "cloak",
514
+ "502": "clog, geta, patten, sabot",
515
+ "503": "cocktail shaker",
516
+ "504": "coffee mug",
517
+ "505": "coffeepot",
518
+ "506": "coil, spiral, volute, whorl, helix",
519
+ "507": "combination lock",
520
+ "508": "computer keyboard, keypad",
521
+ "509": "confectionery, confectionary, candy store",
522
+ "510": "container ship, containership, container vessel",
523
+ "511": "convertible",
524
+ "512": "corkscrew, bottle screw",
525
+ "513": "cornet, horn, trumpet, trump",
526
+ "514": "cowboy boot",
527
+ "515": "cowboy hat, ten-gallon hat",
528
+ "516": "cradle",
529
+ "517": "crane",
530
+ "518": "crash helmet",
531
+ "519": "crate",
532
+ "520": "crib, cot",
533
+ "521": "Crock Pot",
534
+ "522": "croquet ball",
535
+ "523": "crutch",
536
+ "524": "cuirass",
537
+ "525": "dam, dike, dyke",
538
+ "526": "desk",
539
+ "527": "desktop computer",
540
+ "528": "dial telephone, dial phone",
541
+ "529": "diaper, nappy, napkin",
542
+ "530": "digital clock",
543
+ "531": "digital watch",
544
+ "532": "dining table, board",
545
+ "533": "dishrag, dishcloth",
546
+ "534": "dishwasher, dish washer, dishwashing machine",
547
+ "535": "disk brake, disc brake",
548
+ "536": "dock, dockage, docking facility",
549
+ "537": "dogsled, dog sled, dog sleigh",
550
+ "538": "dome",
551
+ "539": "doormat, welcome mat",
552
+ "540": "drilling platform, offshore rig",
553
+ "541": "drum, membranophone, tympan",
554
+ "542": "drumstick",
555
+ "543": "dumbbell",
556
+ "544": "Dutch oven",
557
+ "545": "electric fan, blower",
558
+ "546": "electric guitar",
559
+ "547": "electric locomotive",
560
+ "548": "entertainment center",
561
+ "549": "envelope",
562
+ "550": "espresso maker",
563
+ "551": "face powder",
564
+ "552": "feather boa, boa",
565
+ "553": "file, file cabinet, filing cabinet",
566
+ "554": "fireboat",
567
+ "555": "fire engine, fire truck",
568
+ "556": "fire screen, fireguard",
569
+ "557": "flagpole, flagstaff",
570
+ "558": "flute, transverse flute",
571
+ "559": "folding chair",
572
+ "560": "football helmet",
573
+ "561": "forklift",
574
+ "562": "fountain",
575
+ "563": "fountain pen",
576
+ "564": "four-poster",
577
+ "565": "freight car",
578
+ "566": "French horn, horn",
579
+ "567": "frying pan, frypan, skillet",
580
+ "568": "fur coat",
581
+ "569": "garbage truck, dustcart",
582
+ "570": "gasmask, respirator, gas helmet",
583
+ "571": "gas pump, gasoline pump, petrol pump, island dispenser",
584
+ "572": "goblet",
585
+ "573": "go-kart",
586
+ "574": "golf ball",
587
+ "575": "golfcart, golf cart",
588
+ "576": "gondola",
589
+ "577": "gong, tam-tam",
590
+ "578": "gown",
591
+ "579": "grand piano, grand",
592
+ "580": "greenhouse, nursery, glasshouse",
593
+ "581": "grille, radiator grille",
594
+ "582": "grocery store, grocery, food market, market",
595
+ "583": "guillotine",
596
+ "584": "hair slide",
597
+ "585": "hair spray",
598
+ "586": "half track",
599
+ "587": "hammer",
600
+ "588": "hamper",
601
+ "589": "hand blower, blow dryer, blow drier, hair dryer, hair drier",
602
+ "590": "hand-held computer, hand-held microcomputer",
603
+ "591": "handkerchief, hankie, hanky, hankey",
604
+ "592": "hard disc, hard disk, fixed disk",
605
+ "593": "harmonica, mouth organ, harp, mouth harp",
606
+ "594": "harp",
607
+ "595": "harvester, reaper",
608
+ "596": "hatchet",
609
+ "597": "holster",
610
+ "598": "home theater, home theatre",
611
+ "599": "honeycomb",
612
+ "600": "hook, claw",
613
+ "601": "hoopskirt, crinoline",
614
+ "602": "horizontal bar, high bar",
615
+ "603": "horse cart, horse-cart",
616
+ "604": "hourglass",
617
+ "605": "iPod",
618
+ "606": "iron, smoothing iron",
619
+ "607": "jack-o'-lantern",
620
+ "608": "jean, blue jean, denim",
621
+ "609": "jeep, landrover",
622
+ "610": "jersey, T-shirt, tee shirt",
623
+ "611": "jigsaw puzzle",
624
+ "612": "jinrikisha, ricksha, rickshaw",
625
+ "613": "joystick",
626
+ "614": "kimono",
627
+ "615": "knee pad",
628
+ "616": "knot",
629
+ "617": "lab coat, laboratory coat",
630
+ "618": "ladle",
631
+ "619": "lampshade, lamp shade",
632
+ "620": "laptop, laptop computer",
633
+ "621": "lawn mower, mower",
634
+ "622": "lens cap, lens cover",
635
+ "623": "letter opener, paper knife, paperknife",
636
+ "624": "library",
637
+ "625": "lifeboat",
638
+ "626": "lighter, light, igniter, ignitor",
639
+ "627": "limousine, limo",
640
+ "628": "liner, ocean liner",
641
+ "629": "lipstick, lip rouge",
642
+ "630": "Loafer",
643
+ "631": "lotion",
644
+ "632": "loudspeaker, speaker, speaker unit, loudspeaker system, speaker system",
645
+ "633": "loupe, jeweler's loupe",
646
+ "634": "lumbermill, sawmill",
647
+ "635": "magnetic compass",
648
+ "636": "mailbag, postbag",
649
+ "637": "mailbox, letter box",
650
+ "638": "maillot",
651
+ "639": "maillot, tank suit",
652
+ "640": "manhole cover",
653
+ "641": "maraca",
654
+ "642": "marimba, xylophone",
655
+ "643": "mask",
656
+ "644": "matchstick",
657
+ "645": "maypole",
658
+ "646": "maze, labyrinth",
659
+ "647": "measuring cup",
660
+ "648": "medicine chest, medicine cabinet",
661
+ "649": "megalith, megalithic structure",
662
+ "650": "microphone, mike",
663
+ "651": "microwave, microwave oven",
664
+ "652": "military uniform",
665
+ "653": "milk can",
666
+ "654": "minibus",
667
+ "655": "miniskirt, mini",
668
+ "656": "minivan",
669
+ "657": "missile",
670
+ "658": "mitten",
671
+ "659": "mixing bowl",
672
+ "660": "mobile home, manufactured home",
673
+ "661": "Model T",
674
+ "662": "modem",
675
+ "663": "monastery",
676
+ "664": "monitor",
677
+ "665": "moped",
678
+ "666": "mortar",
679
+ "667": "mortarboard",
680
+ "668": "mosque",
681
+ "669": "mosquito net",
682
+ "670": "motor scooter, scooter",
683
+ "671": "mountain bike, all-terrain bike, off-roader",
684
+ "672": "mountain tent",
685
+ "673": "mouse, computer mouse",
686
+ "674": "mousetrap",
687
+ "675": "moving van",
688
+ "676": "muzzle",
689
+ "677": "nail",
690
+ "678": "neck brace",
691
+ "679": "necklace",
692
+ "680": "nipple",
693
+ "681": "notebook, notebook computer",
694
+ "682": "obelisk",
695
+ "683": "oboe, hautboy, hautbois",
696
+ "684": "ocarina, sweet potato",
697
+ "685": "odometer, hodometer, mileometer, milometer",
698
+ "686": "oil filter",
699
+ "687": "organ, pipe organ",
700
+ "688": "oscilloscope, scope, cathode-ray oscilloscope, CRO",
701
+ "689": "overskirt",
702
+ "690": "oxcart",
703
+ "691": "oxygen mask",
704
+ "692": "packet",
705
+ "693": "paddle, boat paddle",
706
+ "694": "paddlewheel, paddle wheel",
707
+ "695": "padlock",
708
+ "696": "paintbrush",
709
+ "697": "pajama, pyjama, pj's, jammies",
710
+ "698": "palace",
711
+ "699": "panpipe, pandean pipe, syrinx",
712
+ "700": "paper towel",
713
+ "701": "parachute, chute",
714
+ "702": "parallel bars, bars",
715
+ "703": "park bench",
716
+ "704": "parking meter",
717
+ "705": "passenger car, coach, carriage",
718
+ "706": "patio, terrace",
719
+ "707": "pay-phone, pay-station",
720
+ "708": "pedestal, plinth, footstall",
721
+ "709": "pencil box, pencil case",
722
+ "710": "pencil sharpener",
723
+ "711": "perfume, essence",
724
+ "712": "Petri dish",
725
+ "713": "photocopier",
726
+ "714": "pick, plectrum, plectron",
727
+ "715": "pickelhaube",
728
+ "716": "picket fence, paling",
729
+ "717": "pickup, pickup truck",
730
+ "718": "pier",
731
+ "719": "piggy bank, penny bank",
732
+ "720": "pill bottle",
733
+ "721": "pillow",
734
+ "722": "ping-pong ball",
735
+ "723": "pinwheel",
736
+ "724": "pirate, pirate ship",
737
+ "725": "pitcher, ewer",
738
+ "726": "plane, carpenter's plane, woodworking plane",
739
+ "727": "planetarium",
740
+ "728": "plastic bag",
741
+ "729": "plate rack",
742
+ "730": "plow, plough",
743
+ "731": "plunger, plumber's helper",
744
+ "732": "Polaroid camera, Polaroid Land camera",
745
+ "733": "pole",
746
+ "734": "police van, police wagon, paddy wagon, patrol wagon, wagon, black Maria",
747
+ "735": "poncho",
748
+ "736": "pool table, billiard table, snooker table",
749
+ "737": "pop bottle, soda bottle",
750
+ "738": "pot, flowerpot",
751
+ "739": "potter's wheel",
752
+ "740": "power drill",
753
+ "741": "prayer rug, prayer mat",
754
+ "742": "printer",
755
+ "743": "prison, prison house",
756
+ "744": "projectile, missile",
757
+ "745": "projector",
758
+ "746": "puck, hockey puck",
759
+ "747": "punching bag, punch bag, punching ball, punchball",
760
+ "748": "purse",
761
+ "749": "quill, quill pen",
762
+ "750": "quilt, comforter, comfort, puff",
763
+ "751": "racer, race car, racing car",
764
+ "752": "racket, racquet",
765
+ "753": "radiator",
766
+ "754": "radio, wireless",
767
+ "755": "radio telescope, radio reflector",
768
+ "756": "rain barrel",
769
+ "757": "recreational vehicle, RV, R.V.",
770
+ "758": "reel",
771
+ "759": "reflex camera",
772
+ "760": "refrigerator, icebox",
773
+ "761": "remote control, remote",
774
+ "762": "restaurant, eating house, eating place, eatery",
775
+ "763": "revolver, six-gun, six-shooter",
776
+ "764": "rifle",
777
+ "765": "rocking chair, rocker",
778
+ "766": "rotisserie",
779
+ "767": "rubber eraser, rubber, pencil eraser",
780
+ "768": "rugby ball",
781
+ "769": "rule, ruler",
782
+ "770": "running shoe",
783
+ "771": "safe",
784
+ "772": "safety pin",
785
+ "773": "saltshaker, salt shaker",
786
+ "774": "sandal",
787
+ "775": "sarong",
788
+ "776": "sax, saxophone",
789
+ "777": "scabbard",
790
+ "778": "scale, weighing machine",
791
+ "779": "school bus",
792
+ "780": "schooner",
793
+ "781": "scoreboard",
794
+ "782": "screen, CRT screen",
795
+ "783": "screw",
796
+ "784": "screwdriver",
797
+ "785": "seat belt, seatbelt",
798
+ "786": "sewing machine",
799
+ "787": "shield, buckler",
800
+ "788": "shoe shop, shoe-shop, shoe store",
801
+ "789": "shoji",
802
+ "790": "shopping basket",
803
+ "791": "shopping cart",
804
+ "792": "shovel",
805
+ "793": "shower cap",
806
+ "794": "shower curtain",
807
+ "795": "ski",
808
+ "796": "ski mask",
809
+ "797": "sleeping bag",
810
+ "798": "slide rule, slipstick",
811
+ "799": "sliding door",
812
+ "800": "slot, one-armed bandit",
813
+ "801": "snorkel",
814
+ "802": "snowmobile",
815
+ "803": "snowplow, snowplough",
816
+ "804": "soap dispenser",
817
+ "805": "soccer ball",
818
+ "806": "sock",
819
+ "807": "solar dish, solar collector, solar furnace",
820
+ "808": "sombrero",
821
+ "809": "soup bowl",
822
+ "810": "space bar",
823
+ "811": "space heater",
824
+ "812": "space shuttle",
825
+ "813": "spatula",
826
+ "814": "speedboat",
827
+ "815": "spider web, spider's web",
828
+ "816": "spindle",
829
+ "817": "sports car, sport car",
830
+ "818": "spotlight, spot",
831
+ "819": "stage",
832
+ "820": "steam locomotive",
833
+ "821": "steel arch bridge",
834
+ "822": "steel drum",
835
+ "823": "stethoscope",
836
+ "824": "stole",
837
+ "825": "stone wall",
838
+ "826": "stopwatch, stop watch",
839
+ "827": "stove",
840
+ "828": "strainer",
841
+ "829": "streetcar, tram, tramcar, trolley, trolley car",
842
+ "830": "stretcher",
843
+ "831": "studio couch, day bed",
844
+ "832": "stupa, tope",
845
+ "833": "submarine, pigboat, sub, U-boat",
846
+ "834": "suit, suit of clothes",
847
+ "835": "sundial",
848
+ "836": "sunglass",
849
+ "837": "sunglasses, dark glasses, shades",
850
+ "838": "sunscreen, sunblock, sun blocker",
851
+ "839": "suspension bridge",
852
+ "840": "swab, swob, mop",
853
+ "841": "sweatshirt",
854
+ "842": "swimming trunks, bathing trunks",
855
+ "843": "swing",
856
+ "844": "switch, electric switch, electrical switch",
857
+ "845": "syringe",
858
+ "846": "table lamp",
859
+ "847": "tank, army tank, armored combat vehicle, armoured combat vehicle",
860
+ "848": "tape player",
861
+ "849": "teapot",
862
+ "850": "teddy, teddy bear",
863
+ "851": "television, television system",
864
+ "852": "tennis ball",
865
+ "853": "thatch, thatched roof",
866
+ "854": "theater curtain, theatre curtain",
867
+ "855": "thimble",
868
+ "856": "thresher, thrasher, threshing machine",
869
+ "857": "throne",
870
+ "858": "tile roof",
871
+ "859": "toaster",
872
+ "860": "tobacco shop, tobacconist shop, tobacconist",
873
+ "861": "toilet seat",
874
+ "862": "torch",
875
+ "863": "totem pole",
876
+ "864": "tow truck, tow car, wrecker",
877
+ "865": "toyshop",
878
+ "866": "tractor",
879
+ "867": "trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi",
880
+ "868": "tray",
881
+ "869": "trench coat",
882
+ "870": "tricycle, trike, velocipede",
883
+ "871": "trimaran",
884
+ "872": "tripod",
885
+ "873": "triumphal arch",
886
+ "874": "trolleybus, trolley coach, trackless trolley",
887
+ "875": "trombone",
888
+ "876": "tub, vat",
889
+ "877": "turnstile",
890
+ "878": "typewriter keyboard",
891
+ "879": "umbrella",
892
+ "880": "unicycle, monocycle",
893
+ "881": "upright, upright piano",
894
+ "882": "vacuum, vacuum cleaner",
895
+ "883": "vase",
896
+ "884": "vault",
897
+ "885": "velvet",
898
+ "886": "vending machine",
899
+ "887": "vestment",
900
+ "888": "viaduct",
901
+ "889": "violin, fiddle",
902
+ "890": "volleyball",
903
+ "891": "waffle iron",
904
+ "892": "wall clock",
905
+ "893": "wallet, billfold, notecase, pocketbook",
906
+ "894": "wardrobe, closet, press",
907
+ "895": "warplane, military plane",
908
+ "896": "washbasin, handbasin, washbowl, lavabo, wash-hand basin",
909
+ "897": "washer, automatic washer, washing machine",
910
+ "898": "water bottle",
911
+ "899": "water jug",
912
+ "900": "water tower",
913
+ "901": "whiskey jug",
914
+ "902": "whistle",
915
+ "903": "wig",
916
+ "904": "window screen",
917
+ "905": "window shade",
918
+ "906": "Windsor tie",
919
+ "907": "wine bottle",
920
+ "908": "wing",
921
+ "909": "wok",
922
+ "910": "wooden spoon",
923
+ "911": "wool, woolen, woollen",
924
+ "912": "worm fence, snake fence, snake-rail fence, Virginia fence",
925
+ "913": "wreck",
926
+ "914": "yawl",
927
+ "915": "yurt",
928
+ "916": "web site, website, internet site, site",
929
+ "917": "comic book",
930
+ "918": "crossword puzzle, crossword",
931
+ "919": "street sign",
932
+ "920": "traffic light, traffic signal, stoplight",
933
+ "921": "book jacket, dust cover, dust jacket, dust wrapper",
934
+ "922": "menu",
935
+ "923": "plate",
936
+ "924": "guacamole",
937
+ "925": "consomme",
938
+ "926": "hot pot, hotpot",
939
+ "927": "trifle",
940
+ "928": "ice cream, icecream",
941
+ "929": "ice lolly, lolly, lollipop, popsicle",
942
+ "930": "French loaf",
943
+ "931": "bagel, beigel",
944
+ "932": "pretzel",
945
+ "933": "cheeseburger",
946
+ "934": "hotdog, hot dog, red hot",
947
+ "935": "mashed potato",
948
+ "936": "head cabbage",
949
+ "937": "broccoli",
950
+ "938": "cauliflower",
951
+ "939": "zucchini, courgette",
952
+ "940": "spaghetti squash",
953
+ "941": "acorn squash",
954
+ "942": "butternut squash",
955
+ "943": "cucumber, cuke",
956
+ "944": "artichoke, globe artichoke",
957
+ "945": "bell pepper",
958
+ "946": "cardoon",
959
+ "947": "mushroom",
960
+ "948": "Granny Smith",
961
+ "949": "strawberry",
962
+ "950": "orange",
963
+ "951": "lemon",
964
+ "952": "fig",
965
+ "953": "pineapple, ananas",
966
+ "954": "banana",
967
+ "955": "jackfruit, jak, jack",
968
+ "956": "custard apple",
969
+ "957": "pomegranate",
970
+ "958": "hay",
971
+ "959": "carbonara",
972
+ "960": "chocolate sauce, chocolate syrup",
973
+ "961": "dough",
974
+ "962": "meat loaf, meatloaf",
975
+ "963": "pizza, pizza pie",
976
+ "964": "potpie",
977
+ "965": "burrito",
978
+ "966": "red wine",
979
+ "967": "espresso",
980
+ "968": "cup",
981
+ "969": "eggnog",
982
+ "970": "alp",
983
+ "971": "bubble",
984
+ "972": "cliff, drop, drop-off",
985
+ "973": "coral reef",
986
+ "974": "geyser",
987
+ "975": "lakeside, lakeshore",
988
+ "976": "promontory, headland, head, foreland",
989
+ "977": "sandbar, sand bar",
990
+ "978": "seashore, coast, seacoast, sea-coast",
991
+ "979": "valley, vale",
992
+ "980": "volcano",
993
+ "981": "ballplayer, baseball player",
994
+ "982": "groom, bridegroom",
995
+ "983": "scuba diver",
996
+ "984": "rapeseed",
997
+ "985": "daisy",
998
+ "986": "yellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum",
999
+ "987": "corn",
1000
+ "988": "acorn",
1001
+ "989": "hip, rose hip, rosehip",
1002
+ "990": "buckeye, horse chestnut, conker",
1003
+ "991": "coral fungus",
1004
+ "992": "agaric",
1005
+ "993": "gyromitra",
1006
+ "994": "stinkhorn, carrion fungus",
1007
+ "995": "earthstar",
1008
+ "996": "hen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa",
1009
+ "997": "bolete",
1010
+ "998": "ear, spike, capitulum",
1011
+ "999": "toilet tissue, toilet paper, bathroom tissue"
1012
+ },
1013
+ "image_size": 224,
1014
+ "initializer_range": 0.02,
1015
+ "intermediate_size": 3072,
1016
+ "label2id": {
1017
+ "Afghan hound, Afghan": 160,
1018
+ "African chameleon, Chamaeleo chamaeleon": 47,
1019
+ "African crocodile, Nile crocodile, Crocodylus niloticus": 49,
1020
+ "African elephant, Loxodonta africana": 386,
1021
+ "African grey, African gray, Psittacus erithacus": 87,
1022
+ "African hunting dog, hyena dog, Cape hunting dog, Lycaon pictus": 275,
1023
+ "Airedale, Airedale terrier": 191,
1024
+ "American Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier": 180,
1025
+ "American alligator, Alligator mississipiensis": 50,
1026
+ "American black bear, black bear, Ursus americanus, Euarctos americanus": 295,
1027
+ "American chameleon, anole, Anolis carolinensis": 40,
1028
+ "American coot, marsh hen, mud hen, water hen, Fulica americana": 137,
1029
+ "American egret, great white heron, Egretta albus": 132,
1030
+ "American lobster, Northern lobster, Maine lobster, Homarus americanus": 122,
1031
+ "Angora, Angora rabbit": 332,
1032
+ "Appenzeller": 240,
1033
+ "Arabian camel, dromedary, Camelus dromedarius": 354,
1034
+ "Arctic fox, white fox, Alopex lagopus": 279,
1035
+ "Australian terrier": 193,
1036
+ "Band Aid": 419,
1037
+ "Bedlington terrier": 181,
1038
+ "Bernese mountain dog": 239,
1039
+ "Blenheim spaniel": 156,
1040
+ "Border collie": 232,
1041
+ "Border terrier": 182,
1042
+ "Boston bull, Boston terrier": 195,
1043
+ "Bouvier des Flandres, Bouviers des Flandres": 233,
1044
+ "Brabancon griffon": 262,
1045
+ "Brittany spaniel": 215,
1046
+ "CD player": 485,
1047
+ "Cardigan, Cardigan Welsh corgi": 264,
1048
+ "Chesapeake Bay retriever": 209,
1049
+ "Chihuahua": 151,
1050
+ "Christmas stocking": 496,
1051
+ "Crock Pot": 521,
1052
+ "Dandie Dinmont, Dandie Dinmont terrier": 194,
1053
+ "Doberman, Doberman pinscher": 236,
1054
+ "Dungeness crab, Cancer magister": 118,
1055
+ "Dutch oven": 544,
1056
+ "Egyptian cat": 285,
1057
+ "English foxhound": 167,
1058
+ "English setter": 212,
1059
+ "English springer, English springer spaniel": 217,
1060
+ "EntleBucher": 241,
1061
+ "Eskimo dog, husky": 248,
1062
+ "European fire salamander, Salamandra salamandra": 25,
1063
+ "European gallinule, Porphyrio porphyrio": 136,
1064
+ "French bulldog": 245,
1065
+ "French horn, horn": 566,
1066
+ "French loaf": 930,
1067
+ "German shepherd, German shepherd dog, German police dog, alsatian": 235,
1068
+ "German short-haired pointer": 210,
1069
+ "Gila monster, Heloderma suspectum": 45,
1070
+ "Gordon setter": 214,
1071
+ "Granny Smith": 948,
1072
+ "Great Dane": 246,
1073
+ "Great Pyrenees": 257,
1074
+ "Greater Swiss Mountain dog": 238,
1075
+ "Ibizan hound, Ibizan Podenco": 173,
1076
+ "Indian cobra, Naja naja": 63,
1077
+ "Indian elephant, Elephas maximus": 385,
1078
+ "Irish setter, red setter": 213,
1079
+ "Irish terrier": 184,
1080
+ "Irish water spaniel": 221,
1081
+ "Irish wolfhound": 170,
1082
+ "Italian greyhound": 171,
1083
+ "Japanese spaniel": 152,
1084
+ "Kerry blue terrier": 183,
1085
+ "Komodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis": 48,
1086
+ "Labrador retriever": 208,
1087
+ "Lakeland terrier": 189,
1088
+ "Leonberg": 255,
1089
+ "Lhasa, Lhasa apso": 204,
1090
+ "Loafer": 630,
1091
+ "Madagascar cat, ring-tailed lemur, Lemur catta": 383,
1092
+ "Maltese dog, Maltese terrier, Maltese": 153,
1093
+ "Mexican hairless": 268,
1094
+ "Model T": 661,
1095
+ "Newfoundland, Newfoundland dog": 256,
1096
+ "Norfolk terrier": 185,
1097
+ "Norwegian elkhound, elkhound": 174,
1098
+ "Norwich terrier": 186,
1099
+ "Old English sheepdog, bobtail": 229,
1100
+ "Pekinese, Pekingese, Peke": 154,
1101
+ "Pembroke, Pembroke Welsh corgi": 263,
1102
+ "Persian cat": 283,
1103
+ "Petri dish": 712,
1104
+ "Polaroid camera, Polaroid Land camera": 732,
1105
+ "Pomeranian": 259,
1106
+ "Rhodesian ridgeback": 159,
1107
+ "Rottweiler": 234,
1108
+ "Saint Bernard, St Bernard": 247,
1109
+ "Saluki, gazelle hound": 176,
1110
+ "Samoyed, Samoyede": 258,
1111
+ "Scotch terrier, Scottish terrier, Scottie": 199,
1112
+ "Scottish deerhound, deerhound": 177,
1113
+ "Sealyham terrier, Sealyham": 190,
1114
+ "Shetland sheepdog, Shetland sheep dog, Shetland": 230,
1115
+ "Shih-Tzu": 155,
1116
+ "Siamese cat, Siamese": 284,
1117
+ "Siberian husky": 250,
1118
+ "Staffordshire bullterrier, Staffordshire bull terrier": 179,
1119
+ "Sussex spaniel": 220,
1120
+ "Tibetan mastiff": 244,
1121
+ "Tibetan terrier, chrysanthemum dog": 200,
1122
+ "Walker hound, Walker foxhound": 166,
1123
+ "Weimaraner": 178,
1124
+ "Welsh springer spaniel": 218,
1125
+ "West Highland white terrier": 203,
1126
+ "Windsor tie": 906,
1127
+ "Yorkshire terrier": 187,
1128
+ "abacus": 398,
1129
+ "abaya": 399,
1130
+ "academic gown, academic robe, judge's robe": 400,
1131
+ "accordion, piano accordion, squeeze box": 401,
1132
+ "acorn": 988,
1133
+ "acorn squash": 941,
1134
+ "acoustic guitar": 402,
1135
+ "admiral": 321,
1136
+ "affenpinscher, monkey pinscher, monkey dog": 252,
1137
+ "agama": 42,
1138
+ "agaric": 992,
1139
+ "aircraft carrier, carrier, flattop, attack aircraft carrier": 403,
1140
+ "airliner": 404,
1141
+ "airship, dirigible": 405,
1142
+ "albatross, mollymawk": 146,
1143
+ "alligator lizard": 44,
1144
+ "alp": 970,
1145
+ "altar": 406,
1146
+ "ambulance": 407,
1147
+ "amphibian, amphibious vehicle": 408,
1148
+ "analog clock": 409,
1149
+ "anemone fish": 393,
1150
+ "ant, emmet, pismire": 310,
1151
+ "apiary, bee house": 410,
1152
+ "apron": 411,
1153
+ "armadillo": 363,
1154
+ "artichoke, globe artichoke": 944,
1155
+ "ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin": 412,
1156
+ "assault rifle, assault gun": 413,
1157
+ "axolotl, mud puppy, Ambystoma mexicanum": 29,
1158
+ "baboon": 372,
1159
+ "backpack, back pack, knapsack, packsack, rucksack, haversack": 414,
1160
+ "badger": 362,
1161
+ "bagel, beigel": 931,
1162
+ "bakery, bakeshop, bakehouse": 415,
1163
+ "balance beam, beam": 416,
1164
+ "bald eagle, American eagle, Haliaeetus leucocephalus": 22,
1165
+ "balloon": 417,
1166
+ "ballplayer, baseball player": 981,
1167
+ "ballpoint, ballpoint pen, ballpen, Biro": 418,
1168
+ "banana": 954,
1169
+ "banded gecko": 38,
1170
+ "banjo": 420,
1171
+ "bannister, banister, balustrade, balusters, handrail": 421,
1172
+ "barbell": 422,
1173
+ "barber chair": 423,
1174
+ "barbershop": 424,
1175
+ "barn": 425,
1176
+ "barn spider, Araneus cavaticus": 73,
1177
+ "barometer": 426,
1178
+ "barracouta, snoek": 389,
1179
+ "barrel, cask": 427,
1180
+ "barrow, garden cart, lawn cart, wheelbarrow": 428,
1181
+ "baseball": 429,
1182
+ "basenji": 253,
1183
+ "basketball": 430,
1184
+ "basset, basset hound": 161,
1185
+ "bassinet": 431,
1186
+ "bassoon": 432,
1187
+ "bath towel": 434,
1188
+ "bathing cap, swimming cap": 433,
1189
+ "bathtub, bathing tub, bath, tub": 435,
1190
+ "beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon": 436,
1191
+ "beacon, lighthouse, beacon light, pharos": 437,
1192
+ "beagle": 162,
1193
+ "beaker": 438,
1194
+ "bearskin, busby, shako": 439,
1195
+ "beaver": 337,
1196
+ "bee": 309,
1197
+ "bee eater": 92,
1198
+ "beer bottle": 440,
1199
+ "beer glass": 441,
1200
+ "bell cote, bell cot": 442,
1201
+ "bell pepper": 945,
1202
+ "bib": 443,
1203
+ "bicycle-built-for-two, tandem bicycle, tandem": 444,
1204
+ "bighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis": 349,
1205
+ "bikini, two-piece": 445,
1206
+ "binder, ring-binder": 446,
1207
+ "binoculars, field glasses, opera glasses": 447,
1208
+ "birdhouse": 448,
1209
+ "bison": 347,
1210
+ "bittern": 133,
1211
+ "black and gold garden spider, Argiope aurantia": 72,
1212
+ "black grouse": 80,
1213
+ "black stork, Ciconia nigra": 128,
1214
+ "black swan, Cygnus atratus": 100,
1215
+ "black widow, Latrodectus mactans": 75,
1216
+ "black-and-tan coonhound": 165,
1217
+ "black-footed ferret, ferret, Mustela nigripes": 359,
1218
+ "bloodhound, sleuthhound": 163,
1219
+ "bluetick": 164,
1220
+ "boa constrictor, Constrictor constrictor": 61,
1221
+ "boathouse": 449,
1222
+ "bobsled, bobsleigh, bob": 450,
1223
+ "bolete": 997,
1224
+ "bolo tie, bolo, bola tie, bola": 451,
1225
+ "bonnet, poke bonnet": 452,
1226
+ "book jacket, dust cover, dust jacket, dust wrapper": 921,
1227
+ "bookcase": 453,
1228
+ "bookshop, bookstore, bookstall": 454,
1229
+ "borzoi, Russian wolfhound": 169,
1230
+ "bottlecap": 455,
1231
+ "bow": 456,
1232
+ "bow tie, bow-tie, bowtie": 457,
1233
+ "box turtle, box tortoise": 37,
1234
+ "boxer": 242,
1235
+ "brain coral": 109,
1236
+ "brambling, Fringilla montifringilla": 10,
1237
+ "brass, memorial tablet, plaque": 458,
1238
+ "brassiere, bra, bandeau": 459,
1239
+ "breakwater, groin, groyne, mole, bulwark, seawall, jetty": 460,
1240
+ "breastplate, aegis, egis": 461,
1241
+ "briard": 226,
1242
+ "broccoli": 937,
1243
+ "broom": 462,
1244
+ "brown bear, bruin, Ursus arctos": 294,
1245
+ "bubble": 971,
1246
+ "bucket, pail": 463,
1247
+ "buckeye, horse chestnut, conker": 990,
1248
+ "buckle": 464,
1249
+ "bulbul": 16,
1250
+ "bull mastiff": 243,
1251
+ "bullet train, bullet": 466,
1252
+ "bulletproof vest": 465,
1253
+ "bullfrog, Rana catesbeiana": 30,
1254
+ "burrito": 965,
1255
+ "bustard": 138,
1256
+ "butcher shop, meat market": 467,
1257
+ "butternut squash": 942,
1258
+ "cab, hack, taxi, taxicab": 468,
1259
+ "cabbage butterfly": 324,
1260
+ "cairn, cairn terrier": 192,
1261
+ "caldron, cauldron": 469,
1262
+ "can opener, tin opener": 473,
1263
+ "candle, taper, wax light": 470,
1264
+ "cannon": 471,
1265
+ "canoe": 472,
1266
+ "capuchin, ringtail, Cebus capucinus": 378,
1267
+ "car mirror": 475,
1268
+ "car wheel": 479,
1269
+ "carbonara": 959,
1270
+ "cardigan": 474,
1271
+ "cardoon": 946,
1272
+ "carousel, carrousel, merry-go-round, roundabout, whirligig": 476,
1273
+ "carpenter's kit, tool kit": 477,
1274
+ "carton": 478,
1275
+ "cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM": 480,
1276
+ "cassette": 481,
1277
+ "cassette player": 482,
1278
+ "castle": 483,
1279
+ "catamaran": 484,
1280
+ "cauliflower": 938,
1281
+ "cello, violoncello": 486,
1282
+ "cellular telephone, cellular phone, cellphone, cell, mobile phone": 487,
1283
+ "centipede": 79,
1284
+ "chain": 488,
1285
+ "chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour": 490,
1286
+ "chain saw, chainsaw": 491,
1287
+ "chainlink fence": 489,
1288
+ "chambered nautilus, pearly nautilus, nautilus": 117,
1289
+ "cheeseburger": 933,
1290
+ "cheetah, chetah, Acinonyx jubatus": 293,
1291
+ "chest": 492,
1292
+ "chickadee": 19,
1293
+ "chiffonier, commode": 493,
1294
+ "chime, bell, gong": 494,
1295
+ "chimpanzee, chimp, Pan troglodytes": 367,
1296
+ "china cabinet, china closet": 495,
1297
+ "chiton, coat-of-mail shell, sea cradle, polyplacophore": 116,
1298
+ "chocolate sauce, chocolate syrup": 960,
1299
+ "chow, chow chow": 260,
1300
+ "church, church building": 497,
1301
+ "cicada, cicala": 316,
1302
+ "cinema, movie theater, movie theatre, movie house, picture palace": 498,
1303
+ "cleaver, meat cleaver, chopper": 499,
1304
+ "cliff dwelling": 500,
1305
+ "cliff, drop, drop-off": 972,
1306
+ "cloak": 501,
1307
+ "clog, geta, patten, sabot": 502,
1308
+ "clumber, clumber spaniel": 216,
1309
+ "cock": 7,
1310
+ "cocker spaniel, English cocker spaniel, cocker": 219,
1311
+ "cockroach, roach": 314,
1312
+ "cocktail shaker": 503,
1313
+ "coffee mug": 504,
1314
+ "coffeepot": 505,
1315
+ "coho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch": 391,
1316
+ "coil, spiral, volute, whorl, helix": 506,
1317
+ "collie": 231,
1318
+ "colobus, colobus monkey": 375,
1319
+ "combination lock": 507,
1320
+ "comic book": 917,
1321
+ "common iguana, iguana, Iguana iguana": 39,
1322
+ "common newt, Triturus vulgaris": 26,
1323
+ "computer keyboard, keypad": 508,
1324
+ "conch": 112,
1325
+ "confectionery, confectionary, candy store": 509,
1326
+ "consomme": 925,
1327
+ "container ship, containership, container vessel": 510,
1328
+ "convertible": 511,
1329
+ "coral fungus": 991,
1330
+ "coral reef": 973,
1331
+ "corkscrew, bottle screw": 512,
1332
+ "corn": 987,
1333
+ "cornet, horn, trumpet, trump": 513,
1334
+ "coucal": 91,
1335
+ "cougar, puma, catamount, mountain lion, painter, panther, Felis concolor": 286,
1336
+ "cowboy boot": 514,
1337
+ "cowboy hat, ten-gallon hat": 515,
1338
+ "coyote, prairie wolf, brush wolf, Canis latrans": 272,
1339
+ "cradle": 516,
1340
+ "crane": 517,
1341
+ "crash helmet": 518,
1342
+ "crate": 519,
1343
+ "crayfish, crawfish, crawdad, crawdaddy": 124,
1344
+ "crib, cot": 520,
1345
+ "cricket": 312,
1346
+ "croquet ball": 522,
1347
+ "crossword puzzle, crossword": 918,
1348
+ "crutch": 523,
1349
+ "cucumber, cuke": 943,
1350
+ "cuirass": 524,
1351
+ "cup": 968,
1352
+ "curly-coated retriever": 206,
1353
+ "custard apple": 956,
1354
+ "daisy": 985,
1355
+ "dalmatian, coach dog, carriage dog": 251,
1356
+ "dam, dike, dyke": 525,
1357
+ "damselfly": 320,
1358
+ "desk": 526,
1359
+ "desktop computer": 527,
1360
+ "dhole, Cuon alpinus": 274,
1361
+ "dial telephone, dial phone": 528,
1362
+ "diamondback, diamondback rattlesnake, Crotalus adamanteus": 67,
1363
+ "diaper, nappy, napkin": 529,
1364
+ "digital clock": 530,
1365
+ "digital watch": 531,
1366
+ "dingo, warrigal, warragal, Canis dingo": 273,
1367
+ "dining table, board": 532,
1368
+ "dishrag, dishcloth": 533,
1369
+ "dishwasher, dish washer, dishwashing machine": 534,
1370
+ "disk brake, disc brake": 535,
1371
+ "dock, dockage, docking facility": 536,
1372
+ "dogsled, dog sled, dog sleigh": 537,
1373
+ "dome": 538,
1374
+ "doormat, welcome mat": 539,
1375
+ "dough": 961,
1376
+ "dowitcher": 142,
1377
+ "dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk": 319,
1378
+ "drake": 97,
1379
+ "drilling platform, offshore rig": 540,
1380
+ "drum, membranophone, tympan": 541,
1381
+ "drumstick": 542,
1382
+ "dugong, Dugong dugon": 149,
1383
+ "dumbbell": 543,
1384
+ "dung beetle": 305,
1385
+ "ear, spike, capitulum": 998,
1386
+ "earthstar": 995,
1387
+ "echidna, spiny anteater, anteater": 102,
1388
+ "eel": 390,
1389
+ "eft": 27,
1390
+ "eggnog": 969,
1391
+ "electric fan, blower": 545,
1392
+ "electric guitar": 546,
1393
+ "electric locomotive": 547,
1394
+ "electric ray, crampfish, numbfish, torpedo": 5,
1395
+ "entertainment center": 548,
1396
+ "envelope": 549,
1397
+ "espresso": 967,
1398
+ "espresso maker": 550,
1399
+ "face powder": 551,
1400
+ "feather boa, boa": 552,
1401
+ "fiddler crab": 120,
1402
+ "fig": 952,
1403
+ "file, file cabinet, filing cabinet": 553,
1404
+ "fire engine, fire truck": 555,
1405
+ "fire screen, fireguard": 556,
1406
+ "fireboat": 554,
1407
+ "flagpole, flagstaff": 557,
1408
+ "flamingo": 130,
1409
+ "flat-coated retriever": 205,
1410
+ "flatworm, platyhelminth": 110,
1411
+ "flute, transverse flute": 558,
1412
+ "fly": 308,
1413
+ "folding chair": 559,
1414
+ "football helmet": 560,
1415
+ "forklift": 561,
1416
+ "fountain": 562,
1417
+ "fountain pen": 563,
1418
+ "four-poster": 564,
1419
+ "fox squirrel, eastern fox squirrel, Sciurus niger": 335,
1420
+ "freight car": 565,
1421
+ "frilled lizard, Chlamydosaurus kingi": 43,
1422
+ "frying pan, frypan, skillet": 567,
1423
+ "fur coat": 568,
1424
+ "gar, garfish, garpike, billfish, Lepisosteus osseus": 395,
1425
+ "garbage truck, dustcart": 569,
1426
+ "garden spider, Aranea diademata": 74,
1427
+ "garter snake, grass snake": 57,
1428
+ "gas pump, gasoline pump, petrol pump, island dispenser": 571,
1429
+ "gasmask, respirator, gas helmet": 570,
1430
+ "gazelle": 353,
1431
+ "geyser": 974,
1432
+ "giant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca": 388,
1433
+ "giant schnauzer": 197,
1434
+ "gibbon, Hylobates lar": 368,
1435
+ "go-kart": 573,
1436
+ "goblet": 572,
1437
+ "golden retriever": 207,
1438
+ "goldfinch, Carduelis carduelis": 11,
1439
+ "goldfish, Carassius auratus": 1,
1440
+ "golf ball": 574,
1441
+ "golfcart, golf cart": 575,
1442
+ "gondola": 576,
1443
+ "gong, tam-tam": 577,
1444
+ "goose": 99,
1445
+ "gorilla, Gorilla gorilla": 366,
1446
+ "gown": 578,
1447
+ "grand piano, grand": 579,
1448
+ "grasshopper, hopper": 311,
1449
+ "great grey owl, great gray owl, Strix nebulosa": 24,
1450
+ "great white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias": 2,
1451
+ "green lizard, Lacerta viridis": 46,
1452
+ "green mamba": 64,
1453
+ "green snake, grass snake": 55,
1454
+ "greenhouse, nursery, glasshouse": 580,
1455
+ "grey fox, gray fox, Urocyon cinereoargenteus": 280,
1456
+ "grey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus": 147,
1457
+ "grille, radiator grille": 581,
1458
+ "grocery store, grocery, food market, market": 582,
1459
+ "groenendael": 224,
1460
+ "groom, bridegroom": 982,
1461
+ "ground beetle, carabid beetle": 302,
1462
+ "guacamole": 924,
1463
+ "guenon, guenon monkey": 370,
1464
+ "guillotine": 583,
1465
+ "guinea pig, Cavia cobaya": 338,
1466
+ "gyromitra": 993,
1467
+ "hair slide": 584,
1468
+ "hair spray": 585,
1469
+ "half track": 586,
1470
+ "hammer": 587,
1471
+ "hammerhead, hammerhead shark": 4,
1472
+ "hamper": 588,
1473
+ "hamster": 333,
1474
+ "hand blower, blow dryer, blow drier, hair dryer, hair drier": 589,
1475
+ "hand-held computer, hand-held microcomputer": 590,
1476
+ "handkerchief, hankie, hanky, hankey": 591,
1477
+ "hard disc, hard disk, fixed disk": 592,
1478
+ "hare": 331,
1479
+ "harmonica, mouth organ, harp, mouth harp": 593,
1480
+ "harp": 594,
1481
+ "hartebeest": 351,
1482
+ "harvester, reaper": 595,
1483
+ "harvestman, daddy longlegs, Phalangium opilio": 70,
1484
+ "hatchet": 596,
1485
+ "hay": 958,
1486
+ "head cabbage": 936,
1487
+ "hen": 8,
1488
+ "hen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa": 996,
1489
+ "hermit crab": 125,
1490
+ "hip, rose hip, rosehip": 989,
1491
+ "hippopotamus, hippo, river horse, Hippopotamus amphibius": 344,
1492
+ "hog, pig, grunter, squealer, Sus scrofa": 341,
1493
+ "hognose snake, puff adder, sand viper": 54,
1494
+ "holster": 597,
1495
+ "home theater, home theatre": 598,
1496
+ "honeycomb": 599,
1497
+ "hook, claw": 600,
1498
+ "hoopskirt, crinoline": 601,
1499
+ "horizontal bar, high bar": 602,
1500
+ "hornbill": 93,
1501
+ "horned viper, cerastes, sand viper, horned asp, Cerastes cornutus": 66,
1502
+ "horse cart, horse-cart": 603,
1503
+ "hot pot, hotpot": 926,
1504
+ "hotdog, hot dog, red hot": 934,
1505
+ "hourglass": 604,
1506
+ "house finch, linnet, Carpodacus mexicanus": 12,
1507
+ "howler monkey, howler": 379,
1508
+ "hummingbird": 94,
1509
+ "hyena, hyaena": 276,
1510
+ "iPod": 605,
1511
+ "ibex, Capra ibex": 350,
1512
+ "ice bear, polar bear, Ursus Maritimus, Thalarctos maritimus": 296,
1513
+ "ice cream, icecream": 928,
1514
+ "ice lolly, lolly, lollipop, popsicle": 929,
1515
+ "impala, Aepyceros melampus": 352,
1516
+ "indigo bunting, indigo finch, indigo bird, Passerina cyanea": 14,
1517
+ "indri, indris, Indri indri, Indri brevicaudatus": 384,
1518
+ "iron, smoothing iron": 606,
1519
+ "isopod": 126,
1520
+ "jacamar": 95,
1521
+ "jack-o'-lantern": 607,
1522
+ "jackfruit, jak, jack": 955,
1523
+ "jaguar, panther, Panthera onca, Felis onca": 290,
1524
+ "jay": 17,
1525
+ "jean, blue jean, denim": 608,
1526
+ "jeep, landrover": 609,
1527
+ "jellyfish": 107,
1528
+ "jersey, T-shirt, tee shirt": 610,
1529
+ "jigsaw puzzle": 611,
1530
+ "jinrikisha, ricksha, rickshaw": 612,
1531
+ "joystick": 613,
1532
+ "junco, snowbird": 13,
1533
+ "keeshond": 261,
1534
+ "kelpie": 227,
1535
+ "killer whale, killer, orca, grampus, sea wolf, Orcinus orca": 148,
1536
+ "kimono": 614,
1537
+ "king crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica": 121,
1538
+ "king penguin, Aptenodytes patagonica": 145,
1539
+ "king snake, kingsnake": 56,
1540
+ "kit fox, Vulpes macrotis": 278,
1541
+ "kite": 21,
1542
+ "knee pad": 615,
1543
+ "knot": 616,
1544
+ "koala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus": 105,
1545
+ "komondor": 228,
1546
+ "kuvasz": 222,
1547
+ "lab coat, laboratory coat": 617,
1548
+ "lacewing, lacewing fly": 318,
1549
+ "ladle": 618,
1550
+ "ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle": 301,
1551
+ "lakeside, lakeshore": 975,
1552
+ "lampshade, lamp shade": 619,
1553
+ "langur": 374,
1554
+ "laptop, laptop computer": 620,
1555
+ "lawn mower, mower": 621,
1556
+ "leaf beetle, chrysomelid": 304,
1557
+ "leafhopper": 317,
1558
+ "leatherback turtle, leatherback, leathery turtle, Dermochelys coriacea": 34,
1559
+ "lemon": 951,
1560
+ "lens cap, lens cover": 622,
1561
+ "leopard, Panthera pardus": 288,
1562
+ "lesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens": 387,
1563
+ "letter opener, paper knife, paperknife": 623,
1564
+ "library": 624,
1565
+ "lifeboat": 625,
1566
+ "lighter, light, igniter, ignitor": 626,
1567
+ "limousine, limo": 627,
1568
+ "limpkin, Aramus pictus": 135,
1569
+ "liner, ocean liner": 628,
1570
+ "lion, king of beasts, Panthera leo": 291,
1571
+ "lionfish": 396,
1572
+ "lipstick, lip rouge": 629,
1573
+ "little blue heron, Egretta caerulea": 131,
1574
+ "llama": 355,
1575
+ "loggerhead, loggerhead turtle, Caretta caretta": 33,
1576
+ "long-horned beetle, longicorn, longicorn beetle": 303,
1577
+ "lorikeet": 90,
1578
+ "lotion": 631,
1579
+ "loudspeaker, speaker, speaker unit, loudspeaker system, speaker system": 632,
1580
+ "loupe, jeweler's loupe": 633,
1581
+ "lumbermill, sawmill": 634,
1582
+ "lycaenid, lycaenid butterfly": 326,
1583
+ "lynx, catamount": 287,
1584
+ "macaque": 373,
1585
+ "macaw": 88,
1586
+ "magnetic compass": 635,
1587
+ "magpie": 18,
1588
+ "mailbag, postbag": 636,
1589
+ "mailbox, letter box": 637,
1590
+ "maillot": 638,
1591
+ "maillot, tank suit": 639,
1592
+ "malamute, malemute, Alaskan malamute": 249,
1593
+ "malinois": 225,
1594
+ "manhole cover": 640,
1595
+ "mantis, mantid": 315,
1596
+ "maraca": 641,
1597
+ "marimba, xylophone": 642,
1598
+ "marmoset": 377,
1599
+ "marmot": 336,
1600
+ "mashed potato": 935,
1601
+ "mask": 643,
1602
+ "matchstick": 644,
1603
+ "maypole": 645,
1604
+ "maze, labyrinth": 646,
1605
+ "measuring cup": 647,
1606
+ "meat loaf, meatloaf": 962,
1607
+ "medicine chest, medicine cabinet": 648,
1608
+ "meerkat, mierkat": 299,
1609
+ "megalith, megalithic structure": 649,
1610
+ "menu": 922,
1611
+ "microphone, mike": 650,
1612
+ "microwave, microwave oven": 651,
1613
+ "military uniform": 652,
1614
+ "milk can": 653,
1615
+ "miniature pinscher": 237,
1616
+ "miniature poodle": 266,
1617
+ "miniature schnauzer": 196,
1618
+ "minibus": 654,
1619
+ "miniskirt, mini": 655,
1620
+ "minivan": 656,
1621
+ "mink": 357,
1622
+ "missile": 657,
1623
+ "mitten": 658,
1624
+ "mixing bowl": 659,
1625
+ "mobile home, manufactured home": 660,
1626
+ "modem": 662,
1627
+ "monarch, monarch butterfly, milkweed butterfly, Danaus plexippus": 323,
1628
+ "monastery": 663,
1629
+ "mongoose": 298,
1630
+ "monitor": 664,
1631
+ "moped": 665,
1632
+ "mortar": 666,
1633
+ "mortarboard": 667,
1634
+ "mosque": 668,
1635
+ "mosquito net": 669,
1636
+ "motor scooter, scooter": 670,
1637
+ "mountain bike, all-terrain bike, off-roader": 671,
1638
+ "mountain tent": 672,
1639
+ "mouse, computer mouse": 673,
1640
+ "mousetrap": 674,
1641
+ "moving van": 675,
1642
+ "mud turtle": 35,
1643
+ "mushroom": 947,
1644
+ "muzzle": 676,
1645
+ "nail": 677,
1646
+ "neck brace": 678,
1647
+ "necklace": 679,
1648
+ "nematode, nematode worm, roundworm": 111,
1649
+ "night snake, Hypsiglena torquata": 60,
1650
+ "nipple": 680,
1651
+ "notebook, notebook computer": 681,
1652
+ "obelisk": 682,
1653
+ "oboe, hautboy, hautbois": 683,
1654
+ "ocarina, sweet potato": 684,
1655
+ "odometer, hodometer, mileometer, milometer": 685,
1656
+ "oil filter": 686,
1657
+ "orange": 950,
1658
+ "orangutan, orang, orangutang, Pongo pygmaeus": 365,
1659
+ "organ, pipe organ": 687,
1660
+ "oscilloscope, scope, cathode-ray oscilloscope, CRO": 688,
1661
+ "ostrich, Struthio camelus": 9,
1662
+ "otter": 360,
1663
+ "otterhound, otter hound": 175,
1664
+ "overskirt": 689,
1665
+ "ox": 345,
1666
+ "oxcart": 690,
1667
+ "oxygen mask": 691,
1668
+ "oystercatcher, oyster catcher": 143,
1669
+ "packet": 692,
1670
+ "paddle, boat paddle": 693,
1671
+ "paddlewheel, paddle wheel": 694,
1672
+ "padlock": 695,
1673
+ "paintbrush": 696,
1674
+ "pajama, pyjama, pj's, jammies": 697,
1675
+ "palace": 698,
1676
+ "panpipe, pandean pipe, syrinx": 699,
1677
+ "paper towel": 700,
1678
+ "papillon": 157,
1679
+ "parachute, chute": 701,
1680
+ "parallel bars, bars": 702,
1681
+ "park bench": 703,
1682
+ "parking meter": 704,
1683
+ "partridge": 86,
1684
+ "passenger car, coach, carriage": 705,
1685
+ "patas, hussar monkey, Erythrocebus patas": 371,
1686
+ "patio, terrace": 706,
1687
+ "pay-phone, pay-station": 707,
1688
+ "peacock": 84,
1689
+ "pedestal, plinth, footstall": 708,
1690
+ "pelican": 144,
1691
+ "pencil box, pencil case": 709,
1692
+ "pencil sharpener": 710,
1693
+ "perfume, essence": 711,
1694
+ "photocopier": 713,
1695
+ "pick, plectrum, plectron": 714,
1696
+ "pickelhaube": 715,
1697
+ "picket fence, paling": 716,
1698
+ "pickup, pickup truck": 717,
1699
+ "pier": 718,
1700
+ "piggy bank, penny bank": 719,
1701
+ "pill bottle": 720,
1702
+ "pillow": 721,
1703
+ "pineapple, ananas": 953,
1704
+ "ping-pong ball": 722,
1705
+ "pinwheel": 723,
1706
+ "pirate, pirate ship": 724,
1707
+ "pitcher, ewer": 725,
1708
+ "pizza, pizza pie": 963,
1709
+ "plane, carpenter's plane, woodworking plane": 726,
1710
+ "planetarium": 727,
1711
+ "plastic bag": 728,
1712
+ "plate": 923,
1713
+ "plate rack": 729,
1714
+ "platypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus": 103,
1715
+ "plow, plough": 730,
1716
+ "plunger, plumber's helper": 731,
1717
+ "pole": 733,
1718
+ "polecat, fitch, foulmart, foumart, Mustela putorius": 358,
1719
+ "police van, police wagon, paddy wagon, patrol wagon, wagon, black Maria": 734,
1720
+ "pomegranate": 957,
1721
+ "poncho": 735,
1722
+ "pool table, billiard table, snooker table": 736,
1723
+ "pop bottle, soda bottle": 737,
1724
+ "porcupine, hedgehog": 334,
1725
+ "pot, flowerpot": 738,
1726
+ "potpie": 964,
1727
+ "potter's wheel": 739,
1728
+ "power drill": 740,
1729
+ "prairie chicken, prairie grouse, prairie fowl": 83,
1730
+ "prayer rug, prayer mat": 741,
1731
+ "pretzel": 932,
1732
+ "printer": 742,
1733
+ "prison, prison house": 743,
1734
+ "proboscis monkey, Nasalis larvatus": 376,
1735
+ "projectile, missile": 744,
1736
+ "projector": 745,
1737
+ "promontory, headland, head, foreland": 976,
1738
+ "ptarmigan": 81,
1739
+ "puck, hockey puck": 746,
1740
+ "puffer, pufferfish, blowfish, globefish": 397,
1741
+ "pug, pug-dog": 254,
1742
+ "punching bag, punch bag, punching ball, punchball": 747,
1743
+ "purse": 748,
1744
+ "quail": 85,
1745
+ "quill, quill pen": 749,
1746
+ "quilt, comforter, comfort, puff": 750,
1747
+ "racer, race car, racing car": 751,
1748
+ "racket, racquet": 752,
1749
+ "radiator": 753,
1750
+ "radio telescope, radio reflector": 755,
1751
+ "radio, wireless": 754,
1752
+ "rain barrel": 756,
1753
+ "ram, tup": 348,
1754
+ "rapeseed": 984,
1755
+ "recreational vehicle, RV, R.V.": 757,
1756
+ "red fox, Vulpes vulpes": 277,
1757
+ "red wine": 966,
1758
+ "red wolf, maned wolf, Canis rufus, Canis niger": 271,
1759
+ "red-backed sandpiper, dunlin, Erolia alpina": 140,
1760
+ "red-breasted merganser, Mergus serrator": 98,
1761
+ "redbone": 168,
1762
+ "redshank, Tringa totanus": 141,
1763
+ "reel": 758,
1764
+ "reflex camera": 759,
1765
+ "refrigerator, icebox": 760,
1766
+ "remote control, remote": 761,
1767
+ "restaurant, eating house, eating place, eatery": 762,
1768
+ "revolver, six-gun, six-shooter": 763,
1769
+ "rhinoceros beetle": 306,
1770
+ "rifle": 764,
1771
+ "ringlet, ringlet butterfly": 322,
1772
+ "ringneck snake, ring-necked snake, ring snake": 53,
1773
+ "robin, American robin, Turdus migratorius": 15,
1774
+ "rock beauty, Holocanthus tricolor": 392,
1775
+ "rock crab, Cancer irroratus": 119,
1776
+ "rock python, rock snake, Python sebae": 62,
1777
+ "rocking chair, rocker": 765,
1778
+ "rotisserie": 766,
1779
+ "rubber eraser, rubber, pencil eraser": 767,
1780
+ "ruddy turnstone, Arenaria interpres": 139,
1781
+ "ruffed grouse, partridge, Bonasa umbellus": 82,
1782
+ "rugby ball": 768,
1783
+ "rule, ruler": 769,
1784
+ "running shoe": 770,
1785
+ "safe": 771,
1786
+ "safety pin": 772,
1787
+ "saltshaker, salt shaker": 773,
1788
+ "sandal": 774,
1789
+ "sandbar, sand bar": 977,
1790
+ "sarong": 775,
1791
+ "sax, saxophone": 776,
1792
+ "scabbard": 777,
1793
+ "scale, weighing machine": 778,
1794
+ "schipperke": 223,
1795
+ "school bus": 779,
1796
+ "schooner": 780,
1797
+ "scoreboard": 781,
1798
+ "scorpion": 71,
1799
+ "screen, CRT screen": 782,
1800
+ "screw": 783,
1801
+ "screwdriver": 784,
1802
+ "scuba diver": 983,
1803
+ "sea anemone, anemone": 108,
1804
+ "sea cucumber, holothurian": 329,
1805
+ "sea lion": 150,
1806
+ "sea slug, nudibranch": 115,
1807
+ "sea snake": 65,
1808
+ "sea urchin": 328,
1809
+ "seashore, coast, seacoast, sea-coast": 978,
1810
+ "seat belt, seatbelt": 785,
1811
+ "sewing machine": 786,
1812
+ "shield, buckler": 787,
1813
+ "shoe shop, shoe-shop, shoe store": 788,
1814
+ "shoji": 789,
1815
+ "shopping basket": 790,
1816
+ "shopping cart": 791,
1817
+ "shovel": 792,
1818
+ "shower cap": 793,
1819
+ "shower curtain": 794,
1820
+ "siamang, Hylobates syndactylus, Symphalangus syndactylus": 369,
1821
+ "sidewinder, horned rattlesnake, Crotalus cerastes": 68,
1822
+ "silky terrier, Sydney silky": 201,
1823
+ "ski": 795,
1824
+ "ski mask": 796,
1825
+ "skunk, polecat, wood pussy": 361,
1826
+ "sleeping bag": 797,
1827
+ "slide rule, slipstick": 798,
1828
+ "sliding door": 799,
1829
+ "slot, one-armed bandit": 800,
1830
+ "sloth bear, Melursus ursinus, Ursus ursinus": 297,
1831
+ "slug": 114,
1832
+ "snail": 113,
1833
+ "snorkel": 801,
1834
+ "snow leopard, ounce, Panthera uncia": 289,
1835
+ "snowmobile": 802,
1836
+ "snowplow, snowplough": 803,
1837
+ "soap dispenser": 804,
1838
+ "soccer ball": 805,
1839
+ "sock": 806,
1840
+ "soft-coated wheaten terrier": 202,
1841
+ "solar dish, solar collector, solar furnace": 807,
1842
+ "sombrero": 808,
1843
+ "sorrel": 339,
1844
+ "soup bowl": 809,
1845
+ "space bar": 810,
1846
+ "space heater": 811,
1847
+ "space shuttle": 812,
1848
+ "spaghetti squash": 940,
1849
+ "spatula": 813,
1850
+ "speedboat": 814,
1851
+ "spider monkey, Ateles geoffroyi": 381,
1852
+ "spider web, spider's web": 815,
1853
+ "spindle": 816,
1854
+ "spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish": 123,
1855
+ "spoonbill": 129,
1856
+ "sports car, sport car": 817,
1857
+ "spotlight, spot": 818,
1858
+ "spotted salamander, Ambystoma maculatum": 28,
1859
+ "squirrel monkey, Saimiri sciureus": 382,
1860
+ "stage": 819,
1861
+ "standard poodle": 267,
1862
+ "standard schnauzer": 198,
1863
+ "starfish, sea star": 327,
1864
+ "steam locomotive": 820,
1865
+ "steel arch bridge": 821,
1866
+ "steel drum": 822,
1867
+ "stethoscope": 823,
1868
+ "stingray": 6,
1869
+ "stinkhorn, carrion fungus": 994,
1870
+ "stole": 824,
1871
+ "stone wall": 825,
1872
+ "stopwatch, stop watch": 826,
1873
+ "stove": 827,
1874
+ "strainer": 828,
1875
+ "strawberry": 949,
1876
+ "street sign": 919,
1877
+ "streetcar, tram, tramcar, trolley, trolley car": 829,
1878
+ "stretcher": 830,
1879
+ "studio couch, day bed": 831,
1880
+ "stupa, tope": 832,
1881
+ "sturgeon": 394,
1882
+ "submarine, pigboat, sub, U-boat": 833,
1883
+ "suit, suit of clothes": 834,
1884
+ "sulphur butterfly, sulfur butterfly": 325,
1885
+ "sulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita": 89,
1886
+ "sundial": 835,
1887
+ "sunglass": 836,
1888
+ "sunglasses, dark glasses, shades": 837,
1889
+ "sunscreen, sunblock, sun blocker": 838,
1890
+ "suspension bridge": 839,
1891
+ "swab, swob, mop": 840,
1892
+ "sweatshirt": 841,
1893
+ "swimming trunks, bathing trunks": 842,
1894
+ "swing": 843,
1895
+ "switch, electric switch, electrical switch": 844,
1896
+ "syringe": 845,
1897
+ "tabby, tabby cat": 281,
1898
+ "table lamp": 846,
1899
+ "tailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui": 32,
1900
+ "tank, army tank, armored combat vehicle, armoured combat vehicle": 847,
1901
+ "tape player": 848,
1902
+ "tarantula": 76,
1903
+ "teapot": 849,
1904
+ "teddy, teddy bear": 850,
1905
+ "television, television system": 851,
1906
+ "tench, Tinca tinca": 0,
1907
+ "tennis ball": 852,
1908
+ "terrapin": 36,
1909
+ "thatch, thatched roof": 853,
1910
+ "theater curtain, theatre curtain": 854,
1911
+ "thimble": 855,
1912
+ "three-toed sloth, ai, Bradypus tridactylus": 364,
1913
+ "thresher, thrasher, threshing machine": 856,
1914
+ "throne": 857,
1915
+ "thunder snake, worm snake, Carphophis amoenus": 52,
1916
+ "tick": 78,
1917
+ "tiger beetle": 300,
1918
+ "tiger cat": 282,
1919
+ "tiger shark, Galeocerdo cuvieri": 3,
1920
+ "tiger, Panthera tigris": 292,
1921
+ "tile roof": 858,
1922
+ "timber wolf, grey wolf, gray wolf, Canis lupus": 269,
1923
+ "titi, titi monkey": 380,
1924
+ "toaster": 859,
1925
+ "tobacco shop, tobacconist shop, tobacconist": 860,
1926
+ "toilet seat": 861,
1927
+ "toilet tissue, toilet paper, bathroom tissue": 999,
1928
+ "torch": 862,
1929
+ "totem pole": 863,
1930
+ "toucan": 96,
1931
+ "tow truck, tow car, wrecker": 864,
1932
+ "toy poodle": 265,
1933
+ "toy terrier": 158,
1934
+ "toyshop": 865,
1935
+ "tractor": 866,
1936
+ "traffic light, traffic signal, stoplight": 920,
1937
+ "trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi": 867,
1938
+ "tray": 868,
1939
+ "tree frog, tree-frog": 31,
1940
+ "trench coat": 869,
1941
+ "triceratops": 51,
1942
+ "tricycle, trike, velocipede": 870,
1943
+ "trifle": 927,
1944
+ "trilobite": 69,
1945
+ "trimaran": 871,
1946
+ "tripod": 872,
1947
+ "triumphal arch": 873,
1948
+ "trolleybus, trolley coach, trackless trolley": 874,
1949
+ "trombone": 875,
1950
+ "tub, vat": 876,
1951
+ "turnstile": 877,
1952
+ "tusker": 101,
1953
+ "typewriter keyboard": 878,
1954
+ "umbrella": 879,
1955
+ "unicycle, monocycle": 880,
1956
+ "upright, upright piano": 881,
1957
+ "vacuum, vacuum cleaner": 882,
1958
+ "valley, vale": 979,
1959
+ "vase": 883,
1960
+ "vault": 884,
1961
+ "velvet": 885,
1962
+ "vending machine": 886,
1963
+ "vestment": 887,
1964
+ "viaduct": 888,
1965
+ "vine snake": 59,
1966
+ "violin, fiddle": 889,
1967
+ "vizsla, Hungarian pointer": 211,
1968
+ "volcano": 980,
1969
+ "volleyball": 890,
1970
+ "vulture": 23,
1971
+ "waffle iron": 891,
1972
+ "walking stick, walkingstick, stick insect": 313,
1973
+ "wall clock": 892,
1974
+ "wallaby, brush kangaroo": 104,
1975
+ "wallet, billfold, notecase, pocketbook": 893,
1976
+ "wardrobe, closet, press": 894,
1977
+ "warplane, military plane": 895,
1978
+ "warthog": 343,
1979
+ "washbasin, handbasin, washbowl, lavabo, wash-hand basin": 896,
1980
+ "washer, automatic washer, washing machine": 897,
1981
+ "water bottle": 898,
1982
+ "water buffalo, water ox, Asiatic buffalo, Bubalus bubalis": 346,
1983
+ "water jug": 899,
1984
+ "water ouzel, dipper": 20,
1985
+ "water snake": 58,
1986
+ "water tower": 900,
1987
+ "weasel": 356,
1988
+ "web site, website, internet site, site": 916,
1989
+ "weevil": 307,
1990
+ "whippet": 172,
1991
+ "whiptail, whiptail lizard": 41,
1992
+ "whiskey jug": 901,
1993
+ "whistle": 902,
1994
+ "white stork, Ciconia ciconia": 127,
1995
+ "white wolf, Arctic wolf, Canis lupus tundrarum": 270,
1996
+ "wig": 903,
1997
+ "wild boar, boar, Sus scrofa": 342,
1998
+ "window screen": 904,
1999
+ "window shade": 905,
2000
+ "wine bottle": 907,
2001
+ "wing": 908,
2002
+ "wire-haired fox terrier": 188,
2003
+ "wok": 909,
2004
+ "wolf spider, hunting spider": 77,
2005
+ "wombat": 106,
2006
+ "wood rabbit, cottontail, cottontail rabbit": 330,
2007
+ "wooden spoon": 910,
2008
+ "wool, woolen, woollen": 911,
2009
+ "worm fence, snake fence, snake-rail fence, Virginia fence": 912,
2010
+ "wreck": 913,
2011
+ "yawl": 914,
2012
+ "yellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum": 986,
2013
+ "yurt": 915,
2014
+ "zebra": 340,
2015
+ "zucchini, courgette": 939
2016
+ },
2017
+ "layer_norm_eps": 1e-12,
2018
+ "model_type": "vit",
2019
+ "num_attention_heads": 12,
2020
+ "num_channels": 3,
2021
+ "num_hidden_layers": 12,
2022
+ "patch_size": 16,
2023
+ "qkv_bias": true,
2024
+ "torch_dtype": "float32",
2025
+ "transformers_version": "4.44.2"
2026
+ }
chatbot/fine_tuned_model/model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:8b6d2fa2eaa34206ba3f12bf21f91a932ed30645c668e95db4512cb9189ca029
3
+ size 343220892
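
The two files above form a self-contained image classifier: `config.json` maps the 1000 ImageNet class indices to labels, and `model.safetensors` holds the fine-tuned ViT weights. Below is a minimal inference sketch with 🤗 Transformers; the folder path comes from the diff, while the input image path and the reuse of the `google/vit-base-patch16-224` processor (named in `_name_or_path`) are assumptions.

```python
from transformers import ViTForImageClassification, ViTImageProcessor
from PIL import Image
import torch

# Load the fine-tuned checkpoint uploaded above (config.json + model.safetensors).
model = ViTForImageClassification.from_pretrained("chatbot/fine_tuned_model")
model.eval()

# Assumption: preprocessing follows the google/vit-base-patch16-224 base model
# (224x224 input, 16x16 patches, as declared in the config above).
processor = ViTImageProcessor.from_pretrained("google/vit-base-patch16-224")

image = Image.open("example.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[int(logits.argmax(-1))])  # e.g. "coffee mug"
```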
chatbot/fined-tuned-model.lora.h5 ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:6fc9f1de53fe3d4eee5c536a0d566dafaf1d11d0167c526506bc9d89c7c3ebe3
3
+ size 5560280
chatbot/heatmap.png ADDED
chatbot/inventory.db ADDED
Binary file (8.19 kB). View file
 
chatbot/inventory_chart.png ADDED
chatbot/inventory_counts.png ADDED
chatbot/inventory_images.db ADDED
Binary file (12.3 kB). View file
 
chatbot/inventory_log.json ADDED
@@ -0,0 +1,91 @@
1
+ {
2
+ "coffee mug": {
3
+ "category": "coffee mug",
4
+ "count": 2,
5
+ "last_detected": "2024-11-07T12:17:46.882928",
6
+ "history": [
7
+ {
8
+ "timestamp": "2024-11-07T12:15:25.208394",
9
+ "count": 0
10
+ },
11
+ {
12
+ "timestamp": "2024-11-07T12:17:46.882928",
13
+ "count": 1
14
+ }
15
+ ]
16
+ },
17
+ "ladle": {
18
+ "category": "ladle",
19
+ "count": 1,
20
+ "last_detected": "2024-11-07T12:15:25.315552",
21
+ "history": [
22
+ {
23
+ "timestamp": "2024-11-07T12:15:25.315552",
24
+ "count": 0
25
+ }
26
+ ]
27
+ },
28
+ "wool, woolen, woollen": {
29
+ "category": "wool, woolen, woollen",
30
+ "count": 2,
31
+ "last_detected": "2024-11-07T12:17:46.998784",
32
+ "history": [
33
+ {
34
+ "timestamp": "2024-11-07T12:15:25.412992",
35
+ "count": 0
36
+ },
37
+ {
38
+ "timestamp": "2024-11-07T12:17:46.998784",
39
+ "count": 1
40
+ }
41
+ ]
42
+ },
43
+ "Windsor tie": {
44
+ "category": "Windsor tie",
45
+ "count": 1,
46
+ "last_detected": "2024-11-07T12:16:27.597227",
47
+ "history": [
48
+ {
49
+ "timestamp": "2024-11-07T12:16:27.597227",
50
+ "count": 0
51
+ }
52
+ ]
53
+ },
54
+ "ballplayer, baseball player": {
55
+ "category": "ballplayer, baseball player",
56
+ "count": 1,
57
+ "last_detected": "2024-11-07T12:16:27.646567",
58
+ "history": [
59
+ {
60
+ "timestamp": "2024-11-07T12:16:27.646567",
61
+ "count": 0
62
+ }
63
+ ]
64
+ },
65
+ "analog clock": {
66
+ "category": "analog clock",
67
+ "count": 1,
68
+ "last_detected": "2024-11-07T12:17:47.055052",
69
+ "history": [
70
+ {
71
+ "timestamp": "2024-11-07T12:17:47.055052",
72
+ "count": 0
73
+ }
74
+ ]
75
+ },
76
+ "dining table, board": {
77
+ "category": "dining table, board",
78
+ "count": 2,
79
+ "last_detected": "2024-11-07T12:17:47.188026",
80
+ "history": [
81
+ {
82
+ "timestamp": "2024-11-07T12:17:47.120347",
83
+ "count": 0
84
+ },
85
+ {
86
+ "timestamp": "2024-11-07T12:17:47.188026",
87
+ "count": 1
88
+ }
89
+ ]
90
+ }
91
+ }
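
Each entry in `inventory_log.json` keeps a running `count`, the `last_detected` timestamp, and a per-detection `history`. A minimal sketch of appending a new detection is shown below; the helper name `update_inventory` is hypothetical and simply mirrors the field names used in the file above.

```python
import json
from datetime import datetime

def update_inventory(path, category, new_count):
    """Record a detection event for `category`, following the JSON layout above."""
    with open(path) as f:
        log = json.load(f)
    now = datetime.now().isoformat()
    entry = log.setdefault(category, {"category": category, "count": 0, "history": []})
    entry["count"] = new_count
    entry["last_detected"] = now
    entry["history"].append({"timestamp": now, "count": new_count})
    with open(path, "w") as f:
        json.dump(log, f, indent=4)

update_inventory("chatbot/inventory_log.json", "coffee mug", 3)
```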
chatbot/line_chart.png ADDED
chatbot/main.py.save ADDED
@@ -0,0 +1,52 @@
1
+ import gradio as gr
2
+ import keras
3
+ import keras_nlp
4
+
5
+ import os
6
+
7
+ os.environ["KERAS_BACKEND"] = "jax"
8
+ # Avoid memory fragmentation on JAX backend.
9
+ os.environ["XLA_PYTHON_CLIENT_MEM_FRACTION"]="1.00"
10
+
11
+ from dotenv import load_dotenv
12
+
13
+ # Set Kaggle API credentials
14
+ os.environ["KAGGLE_USERNAME"] = "rogerkorantenng"
15
+ os.environ["KAGGLE_KEY"] = "9a33b6e88bcb6058b1281d777fa6808d"
16
+
17
+ # Load environment variables
18
+ load_dotenv()
19
+
20
+ # Replace this with the path or method to load your local model
21
+ gemma_lm = keras_nlp.models.GemmaCausalLM.from_preset("gemma_2b_en")
22
+
23
+ def generate_response(message, history):
24
+ # Format the conversation history for the local model
25
+ formatted_history = []
26
+ for user, assistant in history:
27
+ formatted_history.append(f"Instruction:\n{user}\n\nResponse:\n{assistant}")
28
+
29
+ # Add the latest user message to the history
30
+ formatted_history.append(f"Instruction:\n{message}\n\nResponse:\n")
31
+
32
+ # Join formatted history into a single string for input
33
+ input_text = "\n".join(formatted_history)
34
+
35
+ # Generate response from the local model
36
+ # Make sure to adjust this part according to your model's API
37
+ response = gemma_lm.generate(input_text, max_length=256)
38
+
39
+ # Extract the response text
40
+ # Adjust the response extraction based on the actual structure of your model's output
41
+ return response[0] # Change this line if necessary
42
+
43
+ # Create the Gradio interface
44
+ gr.ChatInterface(
45
+ generate_response,
46
+ chatbot=gr.Chatbot(height=300),
47
+ textbox=gr.Textbox(placeholder="You can ask me anything", container=False, scale=7),
48
+ title="Local Model Chat Bot",
49
+ retry_btn=None,
50
+ undo_btn="Delete Previous",
51
+ clear_btn="Clear"
52
+ ).launch(share=True)
chatbot/main2.py ADDED
@@ -0,0 +1,48 @@
1
+ import gradio as gr
2
+ import os
3
+ import keras_nlp
4
+ from transformers import AutoModelForCausalLM
5
+
6
+ # Set Kaggle API credentials using values from environment variables
7
+ os.environ["KAGGLE_USERNAME"] = os.environ.get("KAGGLE_USERNAME")
8
+ os.environ["KAGGLE_KEY"] = os.environ.get("KAGGLE_KEY")
9
+
10
+
11
+ # Load LoRA weights if you have them
12
+ LoRA_weights_path = "fined-tuned-model.lora.h5"
13
+ gemma_lm = keras_nlp.models.GemmaCausalLM.from_preset("gemma_2b_en")
14
+ gemma_lm.backbone.enable_lora(rank=4) # Enable LoRA with rank 4
15
+ gemma_lm.preprocessor.sequence_length = 512 # Limit sequence length
16
+ gemma_lm.backbone.load_lora_weights(LoRA_weights_path) # Load LoRA weights
17
+
18
+ # Define the response generation function
19
+ def generate_response(message):
20
+ # Create a prompt template
21
+ template = "Instruction:\n{instruction}\n\nResponse:\n{response}"
22
+
23
+ # Create the prompt with the current message
24
+ prompt = template.format(instruction=message, response="")
25
+ print("Prompt:\n", prompt)
26
+
27
+ # Generate response from the model
28
+ response = gemma_lm.generate(prompt, max_length=256)
29
+ # Only keep the generated response
30
+ response = response.split("Response:")[-1].strip()
31
+
32
+ print("Generated Response:\n", response)
33
+
34
+ # Extract and return the generated response text
35
+ return response # Adjust this if your model's output structure differs
36
+
37
+ # Create the Gradio chat interface
38
+ interface = gr.Interface(
39
+ fn=generate_response, # Function that generates responses
40
+ inputs=gr.Textbox(placeholder="Hello, I am Sage, your mental health advisor", lines=2, scale=7),
41
+ outputs=gr.Textbox(),
42
+ title="Sage, your Mental Health Advisor",
43
+ # description="Chat with Sage, your mental health advisor.",
44
+ # live=True
45
+ )
46
+ proxy_prefix = os.environ.get("PROXY_PREFIX")
47
+ # Launch the Gradio app
48
+ interface.launch(server_name="0.0.0.0", server_port=8080, root_path=proxy_prefix, share=True)
chatbot/model_config.json ADDED
@@ -0,0 +1,27 @@
1
+ {
2
+ "gpt-4": {
3
+ "endpoint": "https://roger-m38jr9pd-eastus2.openai.azure.com/openai/deployments/gpt-4/chat/completions?api-version=2024-08-01-preview",
4
+ "api_key": "XTtd4wdC0mlEdbQqp0KWOe9aWrlg4z4HurxXlgBf5ASUvrB9iPkwJQQJ99AKACHYHv6XJ3w3AAABACOGaeSg",
5
+ "model_path": null
6
+ },
7
+ "gpt-4o": {
8
+ "endpoint": "https://roger-m38jr9pd-eastus2.openai.azure.com/openai/deployments/gpt-4o/chat/completions?api-version=2024-08-01-preview",
9
+ "api_key": "XTtd4wdC0mlEdbQqp0KWOe9aWrlg4z4HurxXlgBf5ASUvrB9iPkwJQQJ99AKACHYHv6XJ3w3AAABACOGaeSg",
10
+ "model_path": null
11
+ },
12
+ "gpt-35-turbo": {
13
+ "endpoint": "https://rogerkoranteng.openai.azure.com/openai/deployments/gpt-35-turbo/chat/completions?api-version=2024-08-01-preview",
14
+ "api_key": "cfe75d17d47c40318db6230d2677d84f",
15
+ "model_path": null
16
+ },
17
+ "gpt-4-32k": {
18
+ "endpoint": "https://roger-m38orjxq-australiaeast.openai.azure.com/openai/deployments/gpt-4-32k/chat/completions?api-version=2024-08-01-preview",
19
+ "api_key": "AebO97c0w0lnrppFNpbuPVueKth7HfGOdMbKeaX5R2Uql6wvJA5qJQQJ99AKACL93NaXJ3w3AAABACOGh8MD",
20
+ "model_path": null
21
+ },
22
+ "microsoft/Phi-3.5-mini-instruct": {
23
+ "endpoint": null,
24
+ "model_path": "./models/microsoft/Phi-3.5-mini-instruct",
25
+ "api_key": null
26
+ }
27
+ }
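
`model_config.json` describes each backend either as a hosted Azure OpenAI deployment (`endpoint` plus `api_key`) or as a local checkpoint (`model_path`). A minimal dispatch sketch is shown below; the function name is hypothetical and the actual routing in the chatbot app may differ.

```python
import json
import requests
from transformers import AutoModelForCausalLM, AutoTokenizer

with open("chatbot/model_config.json") as f:
    MODELS = json.load(f)

def ask(model_name, prompt):
    entry = MODELS[model_name]
    if entry["endpoint"]:
        # Hosted deployment: Azure OpenAI chat-completions request.
        headers = {"api-key": entry["api_key"], "Content-Type": "application/json"}
        body = {"messages": [{"role": "user", "content": prompt}]}
        resp = requests.post(entry["endpoint"], headers=headers, json=body)
        return resp.json()["choices"][0]["message"]["content"]
    # Local checkpoint referenced by model_path.
    tokenizer = AutoTokenizer.from_pretrained(entry["model_path"])
    model = AutoModelForCausalLM.from_pretrained(entry["model_path"])
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```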
chatbot/models/500/config.json ADDED
@@ -0,0 +1,2026 @@
1
+ {
2
+ "_name_or_path": "google/vit-base-patch16-224",
3
+ "architectures": [
4
+ "ViTForImageClassification"
5
+ ],
6
+ "attention_probs_dropout_prob": 0.0,
7
+ "encoder_stride": 16,
8
+ "hidden_act": "gelu",
9
+ "hidden_dropout_prob": 0.0,
10
+ "hidden_size": 768,
11
+ "id2label": {
12
+ "0": "tench, Tinca tinca",
13
+ "1": "goldfish, Carassius auratus",
14
+ "2": "great white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias",
15
+ "3": "tiger shark, Galeocerdo cuvieri",
16
+ "4": "hammerhead, hammerhead shark",
17
+ "5": "electric ray, crampfish, numbfish, torpedo",
18
+ "6": "stingray",
19
+ "7": "cock",
20
+ "8": "hen",
21
+ "9": "ostrich, Struthio camelus",
22
+ "10": "brambling, Fringilla montifringilla",
23
+ "11": "goldfinch, Carduelis carduelis",
24
+ "12": "house finch, linnet, Carpodacus mexicanus",
25
+ "13": "junco, snowbird",
26
+ "14": "indigo bunting, indigo finch, indigo bird, Passerina cyanea",
27
+ "15": "robin, American robin, Turdus migratorius",
28
+ "16": "bulbul",
29
+ "17": "jay",
30
+ "18": "magpie",
31
+ "19": "chickadee",
32
+ "20": "water ouzel, dipper",
33
+ "21": "kite",
34
+ "22": "bald eagle, American eagle, Haliaeetus leucocephalus",
35
+ "23": "vulture",
36
+ "24": "great grey owl, great gray owl, Strix nebulosa",
37
+ "25": "European fire salamander, Salamandra salamandra",
38
+ "26": "common newt, Triturus vulgaris",
39
+ "27": "eft",
40
+ "28": "spotted salamander, Ambystoma maculatum",
41
+ "29": "axolotl, mud puppy, Ambystoma mexicanum",
42
+ "30": "bullfrog, Rana catesbeiana",
43
+ "31": "tree frog, tree-frog",
44
+ "32": "tailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui",
45
+ "33": "loggerhead, loggerhead turtle, Caretta caretta",
46
+ "34": "leatherback turtle, leatherback, leathery turtle, Dermochelys coriacea",
47
+ "35": "mud turtle",
48
+ "36": "terrapin",
49
+ "37": "box turtle, box tortoise",
50
+ "38": "banded gecko",
51
+ "39": "common iguana, iguana, Iguana iguana",
52
+ "40": "American chameleon, anole, Anolis carolinensis",
53
+ "41": "whiptail, whiptail lizard",
54
+ "42": "agama",
55
+ "43": "frilled lizard, Chlamydosaurus kingi",
56
+ "44": "alligator lizard",
57
+ "45": "Gila monster, Heloderma suspectum",
58
+ "46": "green lizard, Lacerta viridis",
59
+ "47": "African chameleon, Chamaeleo chamaeleon",
60
+ "48": "Komodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis",
61
+ "49": "African crocodile, Nile crocodile, Crocodylus niloticus",
62
+ "50": "American alligator, Alligator mississipiensis",
63
+ "51": "triceratops",
64
+ "52": "thunder snake, worm snake, Carphophis amoenus",
65
+ "53": "ringneck snake, ring-necked snake, ring snake",
66
+ "54": "hognose snake, puff adder, sand viper",
67
+ "55": "green snake, grass snake",
68
+ "56": "king snake, kingsnake",
69
+ "57": "garter snake, grass snake",
70
+ "58": "water snake",
71
+ "59": "vine snake",
72
+ "60": "night snake, Hypsiglena torquata",
73
+ "61": "boa constrictor, Constrictor constrictor",
74
+ "62": "rock python, rock snake, Python sebae",
75
+ "63": "Indian cobra, Naja naja",
76
+ "64": "green mamba",
77
+ "65": "sea snake",
78
+ "66": "horned viper, cerastes, sand viper, horned asp, Cerastes cornutus",
79
+ "67": "diamondback, diamondback rattlesnake, Crotalus adamanteus",
80
+ "68": "sidewinder, horned rattlesnake, Crotalus cerastes",
81
+ "69": "trilobite",
82
+ "70": "harvestman, daddy longlegs, Phalangium opilio",
83
+ "71": "scorpion",
84
+ "72": "black and gold garden spider, Argiope aurantia",
85
+ "73": "barn spider, Araneus cavaticus",
86
+ "74": "garden spider, Aranea diademata",
87
+ "75": "black widow, Latrodectus mactans",
88
+ "76": "tarantula",
89
+ "77": "wolf spider, hunting spider",
90
+ "78": "tick",
91
+ "79": "centipede",
92
+ "80": "black grouse",
93
+ "81": "ptarmigan",
94
+ "82": "ruffed grouse, partridge, Bonasa umbellus",
95
+ "83": "prairie chicken, prairie grouse, prairie fowl",
96
+ "84": "peacock",
97
+ "85": "quail",
98
+ "86": "partridge",
99
+ "87": "African grey, African gray, Psittacus erithacus",
100
+ "88": "macaw",
101
+ "89": "sulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita",
102
+ "90": "lorikeet",
103
+ "91": "coucal",
104
+ "92": "bee eater",
105
+ "93": "hornbill",
106
+ "94": "hummingbird",
107
+ "95": "jacamar",
108
+ "96": "toucan",
109
+ "97": "drake",
110
+ "98": "red-breasted merganser, Mergus serrator",
111
+ "99": "goose",
112
+ "100": "black swan, Cygnus atratus",
113
+ "101": "tusker",
114
+ "102": "echidna, spiny anteater, anteater",
115
+ "103": "platypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus",
116
+ "104": "wallaby, brush kangaroo",
117
+ "105": "koala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus",
118
+ "106": "wombat",
119
+ "107": "jellyfish",
120
+ "108": "sea anemone, anemone",
121
+ "109": "brain coral",
122
+ "110": "flatworm, platyhelminth",
123
+ "111": "nematode, nematode worm, roundworm",
124
+ "112": "conch",
125
+ "113": "snail",
126
+ "114": "slug",
127
+ "115": "sea slug, nudibranch",
128
+ "116": "chiton, coat-of-mail shell, sea cradle, polyplacophore",
129
+ "117": "chambered nautilus, pearly nautilus, nautilus",
130
+ "118": "Dungeness crab, Cancer magister",
131
+ "119": "rock crab, Cancer irroratus",
132
+ "120": "fiddler crab",
133
+ "121": "king crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica",
134
+ "122": "American lobster, Northern lobster, Maine lobster, Homarus americanus",
135
+ "123": "spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish",
136
+ "124": "crayfish, crawfish, crawdad, crawdaddy",
137
+ "125": "hermit crab",
138
+ "126": "isopod",
139
+ "127": "white stork, Ciconia ciconia",
140
+ "128": "black stork, Ciconia nigra",
141
+ "129": "spoonbill",
142
+ "130": "flamingo",
143
+ "131": "little blue heron, Egretta caerulea",
144
+ "132": "American egret, great white heron, Egretta albus",
145
+ "133": "bittern",
146
+ "134": "crane",
147
+ "135": "limpkin, Aramus pictus",
148
+ "136": "European gallinule, Porphyrio porphyrio",
149
+ "137": "American coot, marsh hen, mud hen, water hen, Fulica americana",
150
+ "138": "bustard",
151
+ "139": "ruddy turnstone, Arenaria interpres",
152
+ "140": "red-backed sandpiper, dunlin, Erolia alpina",
153
+ "141": "redshank, Tringa totanus",
154
+ "142": "dowitcher",
155
+ "143": "oystercatcher, oyster catcher",
156
+ "144": "pelican",
157
+ "145": "king penguin, Aptenodytes patagonica",
158
+ "146": "albatross, mollymawk",
159
+ "147": "grey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus",
160
+ "148": "killer whale, killer, orca, grampus, sea wolf, Orcinus orca",
161
+ "149": "dugong, Dugong dugon",
162
+ "150": "sea lion",
163
+ "151": "Chihuahua",
164
+ "152": "Japanese spaniel",
165
+ "153": "Maltese dog, Maltese terrier, Maltese",
166
+ "154": "Pekinese, Pekingese, Peke",
167
+ "155": "Shih-Tzu",
168
+ "156": "Blenheim spaniel",
169
+ "157": "papillon",
170
+ "158": "toy terrier",
171
+ "159": "Rhodesian ridgeback",
172
+ "160": "Afghan hound, Afghan",
173
+ "161": "basset, basset hound",
174
+ "162": "beagle",
175
+ "163": "bloodhound, sleuthhound",
176
+ "164": "bluetick",
177
+ "165": "black-and-tan coonhound",
178
+ "166": "Walker hound, Walker foxhound",
179
+ "167": "English foxhound",
180
+ "168": "redbone",
181
+ "169": "borzoi, Russian wolfhound",
182
+ "170": "Irish wolfhound",
183
+ "171": "Italian greyhound",
184
+ "172": "whippet",
185
+ "173": "Ibizan hound, Ibizan Podenco",
186
+ "174": "Norwegian elkhound, elkhound",
187
+ "175": "otterhound, otter hound",
188
+ "176": "Saluki, gazelle hound",
189
+ "177": "Scottish deerhound, deerhound",
190
+ "178": "Weimaraner",
191
+ "179": "Staffordshire bullterrier, Staffordshire bull terrier",
192
+ "180": "American Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier",
193
+ "181": "Bedlington terrier",
194
+ "182": "Border terrier",
195
+ "183": "Kerry blue terrier",
196
+ "184": "Irish terrier",
197
+ "185": "Norfolk terrier",
198
+ "186": "Norwich terrier",
199
+ "187": "Yorkshire terrier",
200
+ "188": "wire-haired fox terrier",
201
+ "189": "Lakeland terrier",
202
+ "190": "Sealyham terrier, Sealyham",
203
+ "191": "Airedale, Airedale terrier",
204
+ "192": "cairn, cairn terrier",
205
+ "193": "Australian terrier",
206
+ "194": "Dandie Dinmont, Dandie Dinmont terrier",
207
+ "195": "Boston bull, Boston terrier",
208
+ "196": "miniature schnauzer",
209
+ "197": "giant schnauzer",
210
+ "198": "standard schnauzer",
211
+ "199": "Scotch terrier, Scottish terrier, Scottie",
212
+ "200": "Tibetan terrier, chrysanthemum dog",
213
+ "201": "silky terrier, Sydney silky",
214
+ "202": "soft-coated wheaten terrier",
215
+ "203": "West Highland white terrier",
216
+ "204": "Lhasa, Lhasa apso",
217
+ "205": "flat-coated retriever",
218
+ "206": "curly-coated retriever",
219
+ "207": "golden retriever",
220
+ "208": "Labrador retriever",
221
+ "209": "Chesapeake Bay retriever",
222
+ "210": "German short-haired pointer",
223
+ "211": "vizsla, Hungarian pointer",
224
+ "212": "English setter",
225
+ "213": "Irish setter, red setter",
226
+ "214": "Gordon setter",
227
+ "215": "Brittany spaniel",
228
+ "216": "clumber, clumber spaniel",
229
+ "217": "English springer, English springer spaniel",
230
+ "218": "Welsh springer spaniel",
231
+ "219": "cocker spaniel, English cocker spaniel, cocker",
232
+ "220": "Sussex spaniel",
233
+ "221": "Irish water spaniel",
234
+ "222": "kuvasz",
235
+ "223": "schipperke",
236
+ "224": "groenendael",
237
+ "225": "malinois",
238
+ "226": "briard",
239
+ "227": "kelpie",
240
+ "228": "komondor",
241
+ "229": "Old English sheepdog, bobtail",
242
+ "230": "Shetland sheepdog, Shetland sheep dog, Shetland",
243
+ "231": "collie",
244
+ "232": "Border collie",
245
+ "233": "Bouvier des Flandres, Bouviers des Flandres",
246
+ "234": "Rottweiler",
247
+ "235": "German shepherd, German shepherd dog, German police dog, alsatian",
248
+ "236": "Doberman, Doberman pinscher",
249
+ "237": "miniature pinscher",
250
+ "238": "Greater Swiss Mountain dog",
251
+ "239": "Bernese mountain dog",
252
+ "240": "Appenzeller",
253
+ "241": "EntleBucher",
254
+ "242": "boxer",
255
+ "243": "bull mastiff",
256
+ "244": "Tibetan mastiff",
257
+ "245": "French bulldog",
258
+ "246": "Great Dane",
259
+ "247": "Saint Bernard, St Bernard",
260
+ "248": "Eskimo dog, husky",
261
+ "249": "malamute, malemute, Alaskan malamute",
262
+ "250": "Siberian husky",
263
+ "251": "dalmatian, coach dog, carriage dog",
264
+ "252": "affenpinscher, monkey pinscher, monkey dog",
265
+ "253": "basenji",
266
+ "254": "pug, pug-dog",
267
+ "255": "Leonberg",
268
+ "256": "Newfoundland, Newfoundland dog",
269
+ "257": "Great Pyrenees",
270
+ "258": "Samoyed, Samoyede",
271
+ "259": "Pomeranian",
272
+ "260": "chow, chow chow",
273
+ "261": "keeshond",
274
+ "262": "Brabancon griffon",
275
+ "263": "Pembroke, Pembroke Welsh corgi",
276
+ "264": "Cardigan, Cardigan Welsh corgi",
277
+ "265": "toy poodle",
278
+ "266": "miniature poodle",
279
+ "267": "standard poodle",
280
+ "268": "Mexican hairless",
281
+ "269": "timber wolf, grey wolf, gray wolf, Canis lupus",
282
+ "270": "white wolf, Arctic wolf, Canis lupus tundrarum",
283
+ "271": "red wolf, maned wolf, Canis rufus, Canis niger",
284
+ "272": "coyote, prairie wolf, brush wolf, Canis latrans",
285
+ "273": "dingo, warrigal, warragal, Canis dingo",
286
+ "274": "dhole, Cuon alpinus",
287
+ "275": "African hunting dog, hyena dog, Cape hunting dog, Lycaon pictus",
288
+ "276": "hyena, hyaena",
289
+ "277": "red fox, Vulpes vulpes",
290
+ "278": "kit fox, Vulpes macrotis",
291
+ "279": "Arctic fox, white fox, Alopex lagopus",
292
+ "280": "grey fox, gray fox, Urocyon cinereoargenteus",
293
+ "281": "tabby, tabby cat",
294
+ "282": "tiger cat",
295
+ "283": "Persian cat",
296
+ "284": "Siamese cat, Siamese",
297
+ "285": "Egyptian cat",
298
+ "286": "cougar, puma, catamount, mountain lion, painter, panther, Felis concolor",
299
+ "287": "lynx, catamount",
300
+ "288": "leopard, Panthera pardus",
301
+ "289": "snow leopard, ounce, Panthera uncia",
302
+ "290": "jaguar, panther, Panthera onca, Felis onca",
303
+ "291": "lion, king of beasts, Panthera leo",
304
+ "292": "tiger, Panthera tigris",
305
+ "293": "cheetah, chetah, Acinonyx jubatus",
306
+ "294": "brown bear, bruin, Ursus arctos",
307
+ "295": "American black bear, black bear, Ursus americanus, Euarctos americanus",
308
+ "296": "ice bear, polar bear, Ursus Maritimus, Thalarctos maritimus",
309
+ "297": "sloth bear, Melursus ursinus, Ursus ursinus",
310
+ "298": "mongoose",
311
+ "299": "meerkat, mierkat",
312
+ "300": "tiger beetle",
313
+ "301": "ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle",
314
+ "302": "ground beetle, carabid beetle",
315
+ "303": "long-horned beetle, longicorn, longicorn beetle",
316
+ "304": "leaf beetle, chrysomelid",
317
+ "305": "dung beetle",
318
+ "306": "rhinoceros beetle",
319
+ "307": "weevil",
320
+ "308": "fly",
321
+ "309": "bee",
322
+ "310": "ant, emmet, pismire",
323
+ "311": "grasshopper, hopper",
324
+ "312": "cricket",
325
+ "313": "walking stick, walkingstick, stick insect",
326
+ "314": "cockroach, roach",
327
+ "315": "mantis, mantid",
328
+ "316": "cicada, cicala",
329
+ "317": "leafhopper",
330
+ "318": "lacewing, lacewing fly",
331
+ "319": "dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk",
332
+ "320": "damselfly",
333
+ "321": "admiral",
334
+ "322": "ringlet, ringlet butterfly",
335
+ "323": "monarch, monarch butterfly, milkweed butterfly, Danaus plexippus",
336
+ "324": "cabbage butterfly",
337
+ "325": "sulphur butterfly, sulfur butterfly",
338
+ "326": "lycaenid, lycaenid butterfly",
339
+ "327": "starfish, sea star",
340
+ "328": "sea urchin",
341
+ "329": "sea cucumber, holothurian",
342
+ "330": "wood rabbit, cottontail, cottontail rabbit",
343
+ "331": "hare",
344
+ "332": "Angora, Angora rabbit",
345
+ "333": "hamster",
346
+ "334": "porcupine, hedgehog",
347
+ "335": "fox squirrel, eastern fox squirrel, Sciurus niger",
348
+ "336": "marmot",
349
+ "337": "beaver",
350
+ "338": "guinea pig, Cavia cobaya",
351
+ "339": "sorrel",
352
+ "340": "zebra",
353
+ "341": "hog, pig, grunter, squealer, Sus scrofa",
354
+ "342": "wild boar, boar, Sus scrofa",
355
+ "343": "warthog",
356
+ "344": "hippopotamus, hippo, river horse, Hippopotamus amphibius",
357
+ "345": "ox",
358
+ "346": "water buffalo, water ox, Asiatic buffalo, Bubalus bubalis",
359
+ "347": "bison",
360
+ "348": "ram, tup",
361
+ "349": "bighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis",
362
+ "350": "ibex, Capra ibex",
363
+ "351": "hartebeest",
364
+ "352": "impala, Aepyceros melampus",
365
+ "353": "gazelle",
366
+ "354": "Arabian camel, dromedary, Camelus dromedarius",
367
+ "355": "llama",
368
+ "356": "weasel",
369
+ "357": "mink",
370
+ "358": "polecat, fitch, foulmart, foumart, Mustela putorius",
371
+ "359": "black-footed ferret, ferret, Mustela nigripes",
372
+ "360": "otter",
373
+ "361": "skunk, polecat, wood pussy",
374
+ "362": "badger",
375
+ "363": "armadillo",
376
+ "364": "three-toed sloth, ai, Bradypus tridactylus",
377
+ "365": "orangutan, orang, orangutang, Pongo pygmaeus",
378
+ "366": "gorilla, Gorilla gorilla",
379
+ "367": "chimpanzee, chimp, Pan troglodytes",
380
+ "368": "gibbon, Hylobates lar",
381
+ "369": "siamang, Hylobates syndactylus, Symphalangus syndactylus",
382
+ "370": "guenon, guenon monkey",
383
+ "371": "patas, hussar monkey, Erythrocebus patas",
384
+ "372": "baboon",
385
+ "373": "macaque",
386
+ "374": "langur",
387
+ "375": "colobus, colobus monkey",
388
+ "376": "proboscis monkey, Nasalis larvatus",
389
+ "377": "marmoset",
390
+ "378": "capuchin, ringtail, Cebus capucinus",
391
+ "379": "howler monkey, howler",
392
+ "380": "titi, titi monkey",
393
+ "381": "spider monkey, Ateles geoffroyi",
394
+ "382": "squirrel monkey, Saimiri sciureus",
395
+ "383": "Madagascar cat, ring-tailed lemur, Lemur catta",
396
+ "384": "indri, indris, Indri indri, Indri brevicaudatus",
397
+ "385": "Indian elephant, Elephas maximus",
398
+ "386": "African elephant, Loxodonta africana",
399
+ "387": "lesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens",
400
+ "388": "giant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca",
401
+ "389": "barracouta, snoek",
402
+ "390": "eel",
403
+ "391": "coho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch",
404
+ "392": "rock beauty, Holocanthus tricolor",
405
+ "393": "anemone fish",
406
+ "394": "sturgeon",
407
+ "395": "gar, garfish, garpike, billfish, Lepisosteus osseus",
408
+ "396": "lionfish",
409
+ "397": "puffer, pufferfish, blowfish, globefish",
410
+ "398": "abacus",
411
+ "399": "abaya",
412
+ "400": "academic gown, academic robe, judge's robe",
413
+ "401": "accordion, piano accordion, squeeze box",
414
+ "402": "acoustic guitar",
415
+ "403": "aircraft carrier, carrier, flattop, attack aircraft carrier",
416
+ "404": "airliner",
417
+ "405": "airship, dirigible",
418
+ "406": "altar",
419
+ "407": "ambulance",
420
+ "408": "amphibian, amphibious vehicle",
421
+ "409": "analog clock",
422
+ "410": "apiary, bee house",
423
+ "411": "apron",
424
+ "412": "ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin",
425
+ "413": "assault rifle, assault gun",
426
+ "414": "backpack, back pack, knapsack, packsack, rucksack, haversack",
427
+ "415": "bakery, bakeshop, bakehouse",
428
+ "416": "balance beam, beam",
429
+ "417": "balloon",
430
+ "418": "ballpoint, ballpoint pen, ballpen, Biro",
431
+ "419": "Band Aid",
432
+ "420": "banjo",
433
+ "421": "bannister, banister, balustrade, balusters, handrail",
434
+ "422": "barbell",
435
+ "423": "barber chair",
436
+ "424": "barbershop",
437
+ "425": "barn",
438
+ "426": "barometer",
439
+ "427": "barrel, cask",
440
+ "428": "barrow, garden cart, lawn cart, wheelbarrow",
441
+ "429": "baseball",
442
+ "430": "basketball",
443
+ "431": "bassinet",
444
+ "432": "bassoon",
445
+ "433": "bathing cap, swimming cap",
446
+ "434": "bath towel",
447
+ "435": "bathtub, bathing tub, bath, tub",
448
+ "436": "beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon",
449
+ "437": "beacon, lighthouse, beacon light, pharos",
450
+ "438": "beaker",
451
+ "439": "bearskin, busby, shako",
452
+ "440": "beer bottle",
453
+ "441": "beer glass",
454
+ "442": "bell cote, bell cot",
455
+ "443": "bib",
456
+ "444": "bicycle-built-for-two, tandem bicycle, tandem",
457
+ "445": "bikini, two-piece",
458
+ "446": "binder, ring-binder",
459
+ "447": "binoculars, field glasses, opera glasses",
460
+ "448": "birdhouse",
461
+ "449": "boathouse",
462
+ "450": "bobsled, bobsleigh, bob",
463
+ "451": "bolo tie, bolo, bola tie, bola",
464
+ "452": "bonnet, poke bonnet",
465
+ "453": "bookcase",
466
+ "454": "bookshop, bookstore, bookstall",
467
+ "455": "bottlecap",
468
+ "456": "bow",
469
+ "457": "bow tie, bow-tie, bowtie",
470
+ "458": "brass, memorial tablet, plaque",
471
+ "459": "brassiere, bra, bandeau",
472
+ "460": "breakwater, groin, groyne, mole, bulwark, seawall, jetty",
473
+ "461": "breastplate, aegis, egis",
474
+ "462": "broom",
475
+ "463": "bucket, pail",
476
+ "464": "buckle",
477
+ "465": "bulletproof vest",
478
+ "466": "bullet train, bullet",
479
+ "467": "butcher shop, meat market",
480
+ "468": "cab, hack, taxi, taxicab",
481
+ "469": "caldron, cauldron",
482
+ "470": "candle, taper, wax light",
483
+ "471": "cannon",
484
+ "472": "canoe",
485
+ "473": "can opener, tin opener",
486
+ "474": "cardigan",
487
+ "475": "car mirror",
488
+ "476": "carousel, carrousel, merry-go-round, roundabout, whirligig",
489
+ "477": "carpenter's kit, tool kit",
490
+ "478": "carton",
491
+ "479": "car wheel",
492
+ "480": "cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM",
493
+ "481": "cassette",
494
+ "482": "cassette player",
495
+ "483": "castle",
496
+ "484": "catamaran",
497
+ "485": "CD player",
498
+ "486": "cello, violoncello",
499
+ "487": "cellular telephone, cellular phone, cellphone, cell, mobile phone",
500
+ "488": "chain",
501
+ "489": "chainlink fence",
502
+ "490": "chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour",
503
+ "491": "chain saw, chainsaw",
504
+ "492": "chest",
505
+ "493": "chiffonier, commode",
506
+ "494": "chime, bell, gong",
507
+ "495": "china cabinet, china closet",
508
+ "496": "Christmas stocking",
509
+ "497": "church, church building",
510
+ "498": "cinema, movie theater, movie theatre, movie house, picture palace",
511
+ "499": "cleaver, meat cleaver, chopper",
512
+ "500": "cliff dwelling",
513
+ "501": "cloak",
514
+ "502": "clog, geta, patten, sabot",
515
+ "503": "cocktail shaker",
516
+ "504": "coffee mug",
517
+ "505": "coffeepot",
518
+ "506": "coil, spiral, volute, whorl, helix",
519
+ "507": "combination lock",
520
+ "508": "computer keyboard, keypad",
521
+ "509": "confectionery, confectionary, candy store",
522
+ "510": "container ship, containership, container vessel",
523
+ "511": "convertible",
524
+ "512": "corkscrew, bottle screw",
525
+ "513": "cornet, horn, trumpet, trump",
526
+ "514": "cowboy boot",
527
+ "515": "cowboy hat, ten-gallon hat",
528
+ "516": "cradle",
529
+ "517": "crane",
530
+ "518": "crash helmet",
531
+ "519": "crate",
532
+ "520": "crib, cot",
533
+ "521": "Crock Pot",
534
+ "522": "croquet ball",
535
+ "523": "crutch",
536
+ "524": "cuirass",
537
+ "525": "dam, dike, dyke",
538
+ "526": "desk",
539
+ "527": "desktop computer",
540
+ "528": "dial telephone, dial phone",
541
+ "529": "diaper, nappy, napkin",
542
+ "530": "digital clock",
543
+ "531": "digital watch",
544
+ "532": "dining table, board",
545
+ "533": "dishrag, dishcloth",
546
+ "534": "dishwasher, dish washer, dishwashing machine",
547
+ "535": "disk brake, disc brake",
548
+ "536": "dock, dockage, docking facility",
549
+ "537": "dogsled, dog sled, dog sleigh",
550
+ "538": "dome",
551
+ "539": "doormat, welcome mat",
552
+ "540": "drilling platform, offshore rig",
553
+ "541": "drum, membranophone, tympan",
554
+ "542": "drumstick",
555
+ "543": "dumbbell",
556
+ "544": "Dutch oven",
557
+ "545": "electric fan, blower",
558
+ "546": "electric guitar",
559
+ "547": "electric locomotive",
560
+ "548": "entertainment center",
561
+ "549": "envelope",
562
+ "550": "espresso maker",
563
+ "551": "face powder",
564
+ "552": "feather boa, boa",
565
+ "553": "file, file cabinet, filing cabinet",
566
+ "554": "fireboat",
567
+ "555": "fire engine, fire truck",
568
+ "556": "fire screen, fireguard",
569
+ "557": "flagpole, flagstaff",
570
+ "558": "flute, transverse flute",
571
+ "559": "folding chair",
572
+ "560": "football helmet",
573
+ "561": "forklift",
574
+ "562": "fountain",
575
+ "563": "fountain pen",
576
+ "564": "four-poster",
577
+ "565": "freight car",
578
+ "566": "French horn, horn",
579
+ "567": "frying pan, frypan, skillet",
580
+ "568": "fur coat",
581
+ "569": "garbage truck, dustcart",
582
+ "570": "gasmask, respirator, gas helmet",
583
+ "571": "gas pump, gasoline pump, petrol pump, island dispenser",
584
+ "572": "goblet",
585
+ "573": "go-kart",
586
+ "574": "golf ball",
587
+ "575": "golfcart, golf cart",
588
+ "576": "gondola",
589
+ "577": "gong, tam-tam",
590
+ "578": "gown",
591
+ "579": "grand piano, grand",
592
+ "580": "greenhouse, nursery, glasshouse",
593
+ "581": "grille, radiator grille",
594
+ "582": "grocery store, grocery, food market, market",
595
+ "583": "guillotine",
596
+ "584": "hair slide",
597
+ "585": "hair spray",
598
+ "586": "half track",
599
+ "587": "hammer",
600
+ "588": "hamper",
601
+ "589": "hand blower, blow dryer, blow drier, hair dryer, hair drier",
602
+ "590": "hand-held computer, hand-held microcomputer",
603
+ "591": "handkerchief, hankie, hanky, hankey",
604
+ "592": "hard disc, hard disk, fixed disk",
605
+ "593": "harmonica, mouth organ, harp, mouth harp",
606
+ "594": "harp",
607
+ "595": "harvester, reaper",
608
+ "596": "hatchet",
609
+ "597": "holster",
610
+ "598": "home theater, home theatre",
611
+ "599": "honeycomb",
612
+ "600": "hook, claw",
613
+ "601": "hoopskirt, crinoline",
614
+ "602": "horizontal bar, high bar",
615
+ "603": "horse cart, horse-cart",
616
+ "604": "hourglass",
617
+ "605": "iPod",
618
+ "606": "iron, smoothing iron",
619
+ "607": "jack-o'-lantern",
620
+ "608": "jean, blue jean, denim",
621
+ "609": "jeep, landrover",
622
+ "610": "jersey, T-shirt, tee shirt",
623
+ "611": "jigsaw puzzle",
624
+ "612": "jinrikisha, ricksha, rickshaw",
625
+ "613": "joystick",
626
+ "614": "kimono",
627
+ "615": "knee pad",
628
+ "616": "knot",
629
+ "617": "lab coat, laboratory coat",
630
+ "618": "ladle",
631
+ "619": "lampshade, lamp shade",
632
+ "620": "laptop, laptop computer",
633
+ "621": "lawn mower, mower",
634
+ "622": "lens cap, lens cover",
635
+ "623": "letter opener, paper knife, paperknife",
636
+ "624": "library",
637
+ "625": "lifeboat",
638
+ "626": "lighter, light, igniter, ignitor",
639
+ "627": "limousine, limo",
640
+ "628": "liner, ocean liner",
641
+ "629": "lipstick, lip rouge",
642
+ "630": "Loafer",
643
+ "631": "lotion",
644
+ "632": "loudspeaker, speaker, speaker unit, loudspeaker system, speaker system",
645
+ "633": "loupe, jeweler's loupe",
646
+ "634": "lumbermill, sawmill",
647
+ "635": "magnetic compass",
648
+ "636": "mailbag, postbag",
649
+ "637": "mailbox, letter box",
650
+ "638": "maillot",
651
+ "639": "maillot, tank suit",
652
+ "640": "manhole cover",
653
+ "641": "maraca",
654
+ "642": "marimba, xylophone",
655
+ "643": "mask",
656
+ "644": "matchstick",
657
+ "645": "maypole",
658
+ "646": "maze, labyrinth",
659
+ "647": "measuring cup",
660
+ "648": "medicine chest, medicine cabinet",
661
+ "649": "megalith, megalithic structure",
662
+ "650": "microphone, mike",
663
+ "651": "microwave, microwave oven",
664
+ "652": "military uniform",
665
+ "653": "milk can",
666
+ "654": "minibus",
667
+ "655": "miniskirt, mini",
668
+ "656": "minivan",
669
+ "657": "missile",
670
+ "658": "mitten",
671
+ "659": "mixing bowl",
672
+ "660": "mobile home, manufactured home",
673
+ "661": "Model T",
674
+ "662": "modem",
675
+ "663": "monastery",
676
+ "664": "monitor",
677
+ "665": "moped",
678
+ "666": "mortar",
679
+ "667": "mortarboard",
680
+ "668": "mosque",
681
+ "669": "mosquito net",
682
+ "670": "motor scooter, scooter",
683
+ "671": "mountain bike, all-terrain bike, off-roader",
684
+ "672": "mountain tent",
685
+ "673": "mouse, computer mouse",
686
+ "674": "mousetrap",
687
+ "675": "moving van",
688
+ "676": "muzzle",
689
+ "677": "nail",
690
+ "678": "neck brace",
691
+ "679": "necklace",
692
+ "680": "nipple",
693
+ "681": "notebook, notebook computer",
694
+ "682": "obelisk",
695
+ "683": "oboe, hautboy, hautbois",
696
+ "684": "ocarina, sweet potato",
697
+ "685": "odometer, hodometer, mileometer, milometer",
698
+ "686": "oil filter",
699
+ "687": "organ, pipe organ",
700
+ "688": "oscilloscope, scope, cathode-ray oscilloscope, CRO",
701
+ "689": "overskirt",
702
+ "690": "oxcart",
703
+ "691": "oxygen mask",
704
+ "692": "packet",
705
+ "693": "paddle, boat paddle",
706
+ "694": "paddlewheel, paddle wheel",
707
+ "695": "padlock",
708
+ "696": "paintbrush",
709
+ "697": "pajama, pyjama, pj's, jammies",
710
+ "698": "palace",
711
+ "699": "panpipe, pandean pipe, syrinx",
712
+ "700": "paper towel",
713
+ "701": "parachute, chute",
714
+ "702": "parallel bars, bars",
715
+ "703": "park bench",
716
+ "704": "parking meter",
717
+ "705": "passenger car, coach, carriage",
718
+ "706": "patio, terrace",
719
+ "707": "pay-phone, pay-station",
720
+ "708": "pedestal, plinth, footstall",
721
+ "709": "pencil box, pencil case",
722
+ "710": "pencil sharpener",
723
+ "711": "perfume, essence",
724
+ "712": "Petri dish",
725
+ "713": "photocopier",
726
+ "714": "pick, plectrum, plectron",
727
+ "715": "pickelhaube",
728
+ "716": "picket fence, paling",
729
+ "717": "pickup, pickup truck",
730
+ "718": "pier",
731
+ "719": "piggy bank, penny bank",
732
+ "720": "pill bottle",
733
+ "721": "pillow",
734
+ "722": "ping-pong ball",
735
+ "723": "pinwheel",
736
+ "724": "pirate, pirate ship",
737
+ "725": "pitcher, ewer",
738
+ "726": "plane, carpenter's plane, woodworking plane",
739
+ "727": "planetarium",
740
+ "728": "plastic bag",
741
+ "729": "plate rack",
742
+ "730": "plow, plough",
743
+ "731": "plunger, plumber's helper",
744
+ "732": "Polaroid camera, Polaroid Land camera",
745
+ "733": "pole",
746
+ "734": "police van, police wagon, paddy wagon, patrol wagon, wagon, black Maria",
747
+ "735": "poncho",
748
+ "736": "pool table, billiard table, snooker table",
749
+ "737": "pop bottle, soda bottle",
750
+ "738": "pot, flowerpot",
751
+ "739": "potter's wheel",
752
+ "740": "power drill",
753
+ "741": "prayer rug, prayer mat",
754
+ "742": "printer",
755
+ "743": "prison, prison house",
756
+ "744": "projectile, missile",
757
+ "745": "projector",
758
+ "746": "puck, hockey puck",
759
+ "747": "punching bag, punch bag, punching ball, punchball",
760
+ "748": "purse",
761
+ "749": "quill, quill pen",
762
+ "750": "quilt, comforter, comfort, puff",
763
+ "751": "racer, race car, racing car",
764
+ "752": "racket, racquet",
765
+ "753": "radiator",
766
+ "754": "radio, wireless",
767
+ "755": "radio telescope, radio reflector",
768
+ "756": "rain barrel",
769
+ "757": "recreational vehicle, RV, R.V.",
770
+ "758": "reel",
771
+ "759": "reflex camera",
772
+ "760": "refrigerator, icebox",
773
+ "761": "remote control, remote",
774
+ "762": "restaurant, eating house, eating place, eatery",
775
+ "763": "revolver, six-gun, six-shooter",
776
+ "764": "rifle",
777
+ "765": "rocking chair, rocker",
778
+ "766": "rotisserie",
779
+ "767": "rubber eraser, rubber, pencil eraser",
780
+ "768": "rugby ball",
781
+ "769": "rule, ruler",
782
+ "770": "running shoe",
783
+ "771": "safe",
784
+ "772": "safety pin",
785
+ "773": "saltshaker, salt shaker",
786
+ "774": "sandal",
787
+ "775": "sarong",
788
+ "776": "sax, saxophone",
789
+ "777": "scabbard",
790
+ "778": "scale, weighing machine",
791
+ "779": "school bus",
792
+ "780": "schooner",
793
+ "781": "scoreboard",
794
+ "782": "screen, CRT screen",
795
+ "783": "screw",
796
+ "784": "screwdriver",
797
+ "785": "seat belt, seatbelt",
798
+ "786": "sewing machine",
799
+ "787": "shield, buckler",
800
+ "788": "shoe shop, shoe-shop, shoe store",
801
+ "789": "shoji",
802
+ "790": "shopping basket",
803
+ "791": "shopping cart",
804
+ "792": "shovel",
805
+ "793": "shower cap",
806
+ "794": "shower curtain",
807
+ "795": "ski",
808
+ "796": "ski mask",
809
+ "797": "sleeping bag",
810
+ "798": "slide rule, slipstick",
811
+ "799": "sliding door",
812
+ "800": "slot, one-armed bandit",
813
+ "801": "snorkel",
814
+ "802": "snowmobile",
815
+ "803": "snowplow, snowplough",
816
+ "804": "soap dispenser",
817
+ "805": "soccer ball",
818
+ "806": "sock",
819
+ "807": "solar dish, solar collector, solar furnace",
820
+ "808": "sombrero",
821
+ "809": "soup bowl",
822
+ "810": "space bar",
823
+ "811": "space heater",
824
+ "812": "space shuttle",
825
+ "813": "spatula",
826
+ "814": "speedboat",
827
+ "815": "spider web, spider's web",
828
+ "816": "spindle",
829
+ "817": "sports car, sport car",
830
+ "818": "spotlight, spot",
831
+ "819": "stage",
832
+ "820": "steam locomotive",
833
+ "821": "steel arch bridge",
834
+ "822": "steel drum",
835
+ "823": "stethoscope",
836
+ "824": "stole",
837
+ "825": "stone wall",
838
+ "826": "stopwatch, stop watch",
839
+ "827": "stove",
840
+ "828": "strainer",
841
+ "829": "streetcar, tram, tramcar, trolley, trolley car",
842
+ "830": "stretcher",
843
+ "831": "studio couch, day bed",
844
+ "832": "stupa, tope",
845
+ "833": "submarine, pigboat, sub, U-boat",
846
+ "834": "suit, suit of clothes",
847
+ "835": "sundial",
848
+ "836": "sunglass",
849
+ "837": "sunglasses, dark glasses, shades",
850
+ "838": "sunscreen, sunblock, sun blocker",
851
+ "839": "suspension bridge",
852
+ "840": "swab, swob, mop",
853
+ "841": "sweatshirt",
854
+ "842": "swimming trunks, bathing trunks",
855
+ "843": "swing",
856
+ "844": "switch, electric switch, electrical switch",
857
+ "845": "syringe",
858
+ "846": "table lamp",
859
+ "847": "tank, army tank, armored combat vehicle, armoured combat vehicle",
860
+ "848": "tape player",
861
+ "849": "teapot",
862
+ "850": "teddy, teddy bear",
863
+ "851": "television, television system",
864
+ "852": "tennis ball",
865
+ "853": "thatch, thatched roof",
866
+ "854": "theater curtain, theatre curtain",
867
+ "855": "thimble",
868
+ "856": "thresher, thrasher, threshing machine",
869
+ "857": "throne",
870
+ "858": "tile roof",
871
+ "859": "toaster",
872
+ "860": "tobacco shop, tobacconist shop, tobacconist",
873
+ "861": "toilet seat",
874
+ "862": "torch",
875
+ "863": "totem pole",
876
+ "864": "tow truck, tow car, wrecker",
877
+ "865": "toyshop",
878
+ "866": "tractor",
879
+ "867": "trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi",
880
+ "868": "tray",
881
+ "869": "trench coat",
882
+ "870": "tricycle, trike, velocipede",
883
+ "871": "trimaran",
884
+ "872": "tripod",
885
+ "873": "triumphal arch",
886
+ "874": "trolleybus, trolley coach, trackless trolley",
887
+ "875": "trombone",
888
+ "876": "tub, vat",
889
+ "877": "turnstile",
890
+ "878": "typewriter keyboard",
891
+ "879": "umbrella",
892
+ "880": "unicycle, monocycle",
893
+ "881": "upright, upright piano",
894
+ "882": "vacuum, vacuum cleaner",
895
+ "883": "vase",
896
+ "884": "vault",
897
+ "885": "velvet",
898
+ "886": "vending machine",
899
+ "887": "vestment",
900
+ "888": "viaduct",
901
+ "889": "violin, fiddle",
902
+ "890": "volleyball",
903
+ "891": "waffle iron",
904
+ "892": "wall clock",
905
+ "893": "wallet, billfold, notecase, pocketbook",
906
+ "894": "wardrobe, closet, press",
907
+ "895": "warplane, military plane",
908
+ "896": "washbasin, handbasin, washbowl, lavabo, wash-hand basin",
909
+ "897": "washer, automatic washer, washing machine",
910
+ "898": "water bottle",
911
+ "899": "water jug",
912
+ "900": "water tower",
913
+ "901": "whiskey jug",
914
+ "902": "whistle",
915
+ "903": "wig",
916
+ "904": "window screen",
917
+ "905": "window shade",
918
+ "906": "Windsor tie",
919
+ "907": "wine bottle",
920
+ "908": "wing",
921
+ "909": "wok",
922
+ "910": "wooden spoon",
923
+ "911": "wool, woolen, woollen",
924
+ "912": "worm fence, snake fence, snake-rail fence, Virginia fence",
925
+ "913": "wreck",
926
+ "914": "yawl",
927
+ "915": "yurt",
928
+ "916": "web site, website, internet site, site",
929
+ "917": "comic book",
930
+ "918": "crossword puzzle, crossword",
931
+ "919": "street sign",
932
+ "920": "traffic light, traffic signal, stoplight",
933
+ "921": "book jacket, dust cover, dust jacket, dust wrapper",
934
+ "922": "menu",
935
+ "923": "plate",
936
+ "924": "guacamole",
937
+ "925": "consomme",
938
+ "926": "hot pot, hotpot",
939
+ "927": "trifle",
940
+ "928": "ice cream, icecream",
941
+ "929": "ice lolly, lolly, lollipop, popsicle",
942
+ "930": "French loaf",
943
+ "931": "bagel, beigel",
944
+ "932": "pretzel",
945
+ "933": "cheeseburger",
946
+ "934": "hotdog, hot dog, red hot",
947
+ "935": "mashed potato",
948
+ "936": "head cabbage",
949
+ "937": "broccoli",
950
+ "938": "cauliflower",
951
+ "939": "zucchini, courgette",
952
+ "940": "spaghetti squash",
953
+ "941": "acorn squash",
954
+ "942": "butternut squash",
955
+ "943": "cucumber, cuke",
956
+ "944": "artichoke, globe artichoke",
957
+ "945": "bell pepper",
958
+ "946": "cardoon",
959
+ "947": "mushroom",
960
+ "948": "Granny Smith",
961
+ "949": "strawberry",
962
+ "950": "orange",
963
+ "951": "lemon",
964
+ "952": "fig",
965
+ "953": "pineapple, ananas",
966
+ "954": "banana",
967
+ "955": "jackfruit, jak, jack",
968
+ "956": "custard apple",
969
+ "957": "pomegranate",
970
+ "958": "hay",
971
+ "959": "carbonara",
972
+ "960": "chocolate sauce, chocolate syrup",
973
+ "961": "dough",
974
+ "962": "meat loaf, meatloaf",
975
+ "963": "pizza, pizza pie",
976
+ "964": "potpie",
977
+ "965": "burrito",
978
+ "966": "red wine",
979
+ "967": "espresso",
980
+ "968": "cup",
981
+ "969": "eggnog",
982
+ "970": "alp",
983
+ "971": "bubble",
984
+ "972": "cliff, drop, drop-off",
985
+ "973": "coral reef",
986
+ "974": "geyser",
987
+ "975": "lakeside, lakeshore",
988
+ "976": "promontory, headland, head, foreland",
989
+ "977": "sandbar, sand bar",
990
+ "978": "seashore, coast, seacoast, sea-coast",
991
+ "979": "valley, vale",
992
+ "980": "volcano",
993
+ "981": "ballplayer, baseball player",
994
+ "982": "groom, bridegroom",
995
+ "983": "scuba diver",
996
+ "984": "rapeseed",
997
+ "985": "daisy",
998
+ "986": "yellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum",
999
+ "987": "corn",
1000
+ "988": "acorn",
1001
+ "989": "hip, rose hip, rosehip",
1002
+ "990": "buckeye, horse chestnut, conker",
1003
+ "991": "coral fungus",
1004
+ "992": "agaric",
1005
+ "993": "gyromitra",
1006
+ "994": "stinkhorn, carrion fungus",
1007
+ "995": "earthstar",
1008
+ "996": "hen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa",
1009
+ "997": "bolete",
1010
+ "998": "ear, spike, capitulum",
1011
+ "999": "toilet tissue, toilet paper, bathroom tissue"
1012
+ },
1013
+ "image_size": 224,
1014
+ "initializer_range": 0.02,
1015
+ "intermediate_size": 3072,
1016
+ "label2id": {
1017
+ "Afghan hound, Afghan": 160,
1018
+ "African chameleon, Chamaeleo chamaeleon": 47,
1019
+ "African crocodile, Nile crocodile, Crocodylus niloticus": 49,
1020
+ "African elephant, Loxodonta africana": 386,
1021
+ "African grey, African gray, Psittacus erithacus": 87,
1022
+ "African hunting dog, hyena dog, Cape hunting dog, Lycaon pictus": 275,
1023
+ "Airedale, Airedale terrier": 191,
1024
+ "American Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier": 180,
1025
+ "American alligator, Alligator mississipiensis": 50,
1026
+ "American black bear, black bear, Ursus americanus, Euarctos americanus": 295,
1027
+ "American chameleon, anole, Anolis carolinensis": 40,
1028
+ "American coot, marsh hen, mud hen, water hen, Fulica americana": 137,
1029
+ "American egret, great white heron, Egretta albus": 132,
1030
+ "American lobster, Northern lobster, Maine lobster, Homarus americanus": 122,
1031
+ "Angora, Angora rabbit": 332,
1032
+ "Appenzeller": 240,
1033
+ "Arabian camel, dromedary, Camelus dromedarius": 354,
1034
+ "Arctic fox, white fox, Alopex lagopus": 279,
1035
+ "Australian terrier": 193,
1036
+ "Band Aid": 419,
1037
+ "Bedlington terrier": 181,
1038
+ "Bernese mountain dog": 239,
1039
+ "Blenheim spaniel": 156,
1040
+ "Border collie": 232,
1041
+ "Border terrier": 182,
1042
+ "Boston bull, Boston terrier": 195,
1043
+ "Bouvier des Flandres, Bouviers des Flandres": 233,
1044
+ "Brabancon griffon": 262,
1045
+ "Brittany spaniel": 215,
1046
+ "CD player": 485,
1047
+ "Cardigan, Cardigan Welsh corgi": 264,
1048
+ "Chesapeake Bay retriever": 209,
1049
+ "Chihuahua": 151,
1050
+ "Christmas stocking": 496,
1051
+ "Crock Pot": 521,
1052
+ "Dandie Dinmont, Dandie Dinmont terrier": 194,
1053
+ "Doberman, Doberman pinscher": 236,
1054
+ "Dungeness crab, Cancer magister": 118,
1055
+ "Dutch oven": 544,
1056
+ "Egyptian cat": 285,
1057
+ "English foxhound": 167,
1058
+ "English setter": 212,
1059
+ "English springer, English springer spaniel": 217,
1060
+ "EntleBucher": 241,
1061
+ "Eskimo dog, husky": 248,
1062
+ "European fire salamander, Salamandra salamandra": 25,
1063
+ "European gallinule, Porphyrio porphyrio": 136,
1064
+ "French bulldog": 245,
1065
+ "French horn, horn": 566,
1066
+ "French loaf": 930,
1067
+ "German shepherd, German shepherd dog, German police dog, alsatian": 235,
1068
+ "German short-haired pointer": 210,
1069
+ "Gila monster, Heloderma suspectum": 45,
1070
+ "Gordon setter": 214,
1071
+ "Granny Smith": 948,
1072
+ "Great Dane": 246,
1073
+ "Great Pyrenees": 257,
1074
+ "Greater Swiss Mountain dog": 238,
1075
+ "Ibizan hound, Ibizan Podenco": 173,
1076
+ "Indian cobra, Naja naja": 63,
1077
+ "Indian elephant, Elephas maximus": 385,
1078
+ "Irish setter, red setter": 213,
1079
+ "Irish terrier": 184,
1080
+ "Irish water spaniel": 221,
1081
+ "Irish wolfhound": 170,
1082
+ "Italian greyhound": 171,
1083
+ "Japanese spaniel": 152,
1084
+ "Kerry blue terrier": 183,
1085
+ "Komodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis": 48,
1086
+ "Labrador retriever": 208,
1087
+ "Lakeland terrier": 189,
1088
+ "Leonberg": 255,
1089
+ "Lhasa, Lhasa apso": 204,
1090
+ "Loafer": 630,
1091
+ "Madagascar cat, ring-tailed lemur, Lemur catta": 383,
1092
+ "Maltese dog, Maltese terrier, Maltese": 153,
1093
+ "Mexican hairless": 268,
1094
+ "Model T": 661,
1095
+ "Newfoundland, Newfoundland dog": 256,
1096
+ "Norfolk terrier": 185,
1097
+ "Norwegian elkhound, elkhound": 174,
1098
+ "Norwich terrier": 186,
1099
+ "Old English sheepdog, bobtail": 229,
1100
+ "Pekinese, Pekingese, Peke": 154,
1101
+ "Pembroke, Pembroke Welsh corgi": 263,
1102
+ "Persian cat": 283,
1103
+ "Petri dish": 712,
1104
+ "Polaroid camera, Polaroid Land camera": 732,
1105
+ "Pomeranian": 259,
1106
+ "Rhodesian ridgeback": 159,
1107
+ "Rottweiler": 234,
1108
+ "Saint Bernard, St Bernard": 247,
1109
+ "Saluki, gazelle hound": 176,
1110
+ "Samoyed, Samoyede": 258,
1111
+ "Scotch terrier, Scottish terrier, Scottie": 199,
1112
+ "Scottish deerhound, deerhound": 177,
1113
+ "Sealyham terrier, Sealyham": 190,
1114
+ "Shetland sheepdog, Shetland sheep dog, Shetland": 230,
1115
+ "Shih-Tzu": 155,
1116
+ "Siamese cat, Siamese": 284,
1117
+ "Siberian husky": 250,
1118
+ "Staffordshire bullterrier, Staffordshire bull terrier": 179,
1119
+ "Sussex spaniel": 220,
1120
+ "Tibetan mastiff": 244,
1121
+ "Tibetan terrier, chrysanthemum dog": 200,
1122
+ "Walker hound, Walker foxhound": 166,
1123
+ "Weimaraner": 178,
1124
+ "Welsh springer spaniel": 218,
1125
+ "West Highland white terrier": 203,
1126
+ "Windsor tie": 906,
1127
+ "Yorkshire terrier": 187,
1128
+ "abacus": 398,
1129
+ "abaya": 399,
1130
+ "academic gown, academic robe, judge's robe": 400,
1131
+ "accordion, piano accordion, squeeze box": 401,
1132
+ "acorn": 988,
1133
+ "acorn squash": 941,
1134
+ "acoustic guitar": 402,
1135
+ "admiral": 321,
1136
+ "affenpinscher, monkey pinscher, monkey dog": 252,
1137
+ "agama": 42,
1138
+ "agaric": 992,
1139
+ "aircraft carrier, carrier, flattop, attack aircraft carrier": 403,
1140
+ "airliner": 404,
1141
+ "airship, dirigible": 405,
1142
+ "albatross, mollymawk": 146,
1143
+ "alligator lizard": 44,
1144
+ "alp": 970,
1145
+ "altar": 406,
1146
+ "ambulance": 407,
1147
+ "amphibian, amphibious vehicle": 408,
1148
+ "analog clock": 409,
1149
+ "anemone fish": 393,
1150
+ "ant, emmet, pismire": 310,
1151
+ "apiary, bee house": 410,
1152
+ "apron": 411,
1153
+ "armadillo": 363,
1154
+ "artichoke, globe artichoke": 944,
1155
+ "ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin": 412,
1156
+ "assault rifle, assault gun": 413,
1157
+ "axolotl, mud puppy, Ambystoma mexicanum": 29,
1158
+ "baboon": 372,
1159
+ "backpack, back pack, knapsack, packsack, rucksack, haversack": 414,
1160
+ "badger": 362,
1161
+ "bagel, beigel": 931,
1162
+ "bakery, bakeshop, bakehouse": 415,
1163
+ "balance beam, beam": 416,
1164
+ "bald eagle, American eagle, Haliaeetus leucocephalus": 22,
1165
+ "balloon": 417,
1166
+ "ballplayer, baseball player": 981,
1167
+ "ballpoint, ballpoint pen, ballpen, Biro": 418,
1168
+ "banana": 954,
1169
+ "banded gecko": 38,
1170
+ "banjo": 420,
1171
+ "bannister, banister, balustrade, balusters, handrail": 421,
1172
+ "barbell": 422,
1173
+ "barber chair": 423,
1174
+ "barbershop": 424,
1175
+ "barn": 425,
1176
+ "barn spider, Araneus cavaticus": 73,
1177
+ "barometer": 426,
1178
+ "barracouta, snoek": 389,
1179
+ "barrel, cask": 427,
1180
+ "barrow, garden cart, lawn cart, wheelbarrow": 428,
1181
+ "baseball": 429,
1182
+ "basenji": 253,
1183
+ "basketball": 430,
1184
+ "basset, basset hound": 161,
1185
+ "bassinet": 431,
1186
+ "bassoon": 432,
1187
+ "bath towel": 434,
1188
+ "bathing cap, swimming cap": 433,
1189
+ "bathtub, bathing tub, bath, tub": 435,
1190
+ "beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon": 436,
1191
+ "beacon, lighthouse, beacon light, pharos": 437,
1192
+ "beagle": 162,
1193
+ "beaker": 438,
1194
+ "bearskin, busby, shako": 439,
1195
+ "beaver": 337,
1196
+ "bee": 309,
1197
+ "bee eater": 92,
1198
+ "beer bottle": 440,
1199
+ "beer glass": 441,
1200
+ "bell cote, bell cot": 442,
1201
+ "bell pepper": 945,
1202
+ "bib": 443,
1203
+ "bicycle-built-for-two, tandem bicycle, tandem": 444,
1204
+ "bighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis": 349,
1205
+ "bikini, two-piece": 445,
1206
+ "binder, ring-binder": 446,
1207
+ "binoculars, field glasses, opera glasses": 447,
1208
+ "birdhouse": 448,
1209
+ "bison": 347,
1210
+ "bittern": 133,
1211
+ "black and gold garden spider, Argiope aurantia": 72,
1212
+ "black grouse": 80,
1213
+ "black stork, Ciconia nigra": 128,
1214
+ "black swan, Cygnus atratus": 100,
1215
+ "black widow, Latrodectus mactans": 75,
1216
+ "black-and-tan coonhound": 165,
1217
+ "black-footed ferret, ferret, Mustela nigripes": 359,
1218
+ "bloodhound, sleuthhound": 163,
1219
+ "bluetick": 164,
1220
+ "boa constrictor, Constrictor constrictor": 61,
1221
+ "boathouse": 449,
1222
+ "bobsled, bobsleigh, bob": 450,
1223
+ "bolete": 997,
1224
+ "bolo tie, bolo, bola tie, bola": 451,
1225
+ "bonnet, poke bonnet": 452,
1226
+ "book jacket, dust cover, dust jacket, dust wrapper": 921,
1227
+ "bookcase": 453,
1228
+ "bookshop, bookstore, bookstall": 454,
1229
+ "borzoi, Russian wolfhound": 169,
1230
+ "bottlecap": 455,
1231
+ "bow": 456,
1232
+ "bow tie, bow-tie, bowtie": 457,
1233
+ "box turtle, box tortoise": 37,
1234
+ "boxer": 242,
1235
+ "brain coral": 109,
1236
+ "brambling, Fringilla montifringilla": 10,
1237
+ "brass, memorial tablet, plaque": 458,
1238
+ "brassiere, bra, bandeau": 459,
1239
+ "breakwater, groin, groyne, mole, bulwark, seawall, jetty": 460,
1240
+ "breastplate, aegis, egis": 461,
1241
+ "briard": 226,
1242
+ "broccoli": 937,
1243
+ "broom": 462,
1244
+ "brown bear, bruin, Ursus arctos": 294,
1245
+ "bubble": 971,
1246
+ "bucket, pail": 463,
1247
+ "buckeye, horse chestnut, conker": 990,
1248
+ "buckle": 464,
1249
+ "bulbul": 16,
1250
+ "bull mastiff": 243,
1251
+ "bullet train, bullet": 466,
1252
+ "bulletproof vest": 465,
1253
+ "bullfrog, Rana catesbeiana": 30,
1254
+ "burrito": 965,
1255
+ "bustard": 138,
1256
+ "butcher shop, meat market": 467,
1257
+ "butternut squash": 942,
1258
+ "cab, hack, taxi, taxicab": 468,
1259
+ "cabbage butterfly": 324,
1260
+ "cairn, cairn terrier": 192,
1261
+ "caldron, cauldron": 469,
1262
+ "can opener, tin opener": 473,
1263
+ "candle, taper, wax light": 470,
1264
+ "cannon": 471,
1265
+ "canoe": 472,
1266
+ "capuchin, ringtail, Cebus capucinus": 378,
1267
+ "car mirror": 475,
1268
+ "car wheel": 479,
1269
+ "carbonara": 959,
1270
+ "cardigan": 474,
1271
+ "cardoon": 946,
1272
+ "carousel, carrousel, merry-go-round, roundabout, whirligig": 476,
1273
+ "carpenter's kit, tool kit": 477,
1274
+ "carton": 478,
1275
+ "cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM": 480,
1276
+ "cassette": 481,
1277
+ "cassette player": 482,
1278
+ "castle": 483,
1279
+ "catamaran": 484,
1280
+ "cauliflower": 938,
1281
+ "cello, violoncello": 486,
1282
+ "cellular telephone, cellular phone, cellphone, cell, mobile phone": 487,
1283
+ "centipede": 79,
1284
+ "chain": 488,
1285
+ "chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour": 490,
1286
+ "chain saw, chainsaw": 491,
1287
+ "chainlink fence": 489,
1288
+ "chambered nautilus, pearly nautilus, nautilus": 117,
1289
+ "cheeseburger": 933,
1290
+ "cheetah, chetah, Acinonyx jubatus": 293,
1291
+ "chest": 492,
1292
+ "chickadee": 19,
1293
+ "chiffonier, commode": 493,
1294
+ "chime, bell, gong": 494,
1295
+ "chimpanzee, chimp, Pan troglodytes": 367,
1296
+ "china cabinet, china closet": 495,
1297
+ "chiton, coat-of-mail shell, sea cradle, polyplacophore": 116,
1298
+ "chocolate sauce, chocolate syrup": 960,
1299
+ "chow, chow chow": 260,
1300
+ "church, church building": 497,
1301
+ "cicada, cicala": 316,
1302
+ "cinema, movie theater, movie theatre, movie house, picture palace": 498,
1303
+ "cleaver, meat cleaver, chopper": 499,
1304
+ "cliff dwelling": 500,
1305
+ "cliff, drop, drop-off": 972,
1306
+ "cloak": 501,
1307
+ "clog, geta, patten, sabot": 502,
1308
+ "clumber, clumber spaniel": 216,
1309
+ "cock": 7,
1310
+ "cocker spaniel, English cocker spaniel, cocker": 219,
1311
+ "cockroach, roach": 314,
1312
+ "cocktail shaker": 503,
1313
+ "coffee mug": 504,
1314
+ "coffeepot": 505,
1315
+ "coho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch": 391,
1316
+ "coil, spiral, volute, whorl, helix": 506,
1317
+ "collie": 231,
1318
+ "colobus, colobus monkey": 375,
1319
+ "combination lock": 507,
1320
+ "comic book": 917,
1321
+ "common iguana, iguana, Iguana iguana": 39,
1322
+ "common newt, Triturus vulgaris": 26,
1323
+ "computer keyboard, keypad": 508,
1324
+ "conch": 112,
1325
+ "confectionery, confectionary, candy store": 509,
1326
+ "consomme": 925,
1327
+ "container ship, containership, container vessel": 510,
1328
+ "convertible": 511,
1329
+ "coral fungus": 991,
1330
+ "coral reef": 973,
1331
+ "corkscrew, bottle screw": 512,
1332
+ "corn": 987,
1333
+ "cornet, horn, trumpet, trump": 513,
1334
+ "coucal": 91,
1335
+ "cougar, puma, catamount, mountain lion, painter, panther, Felis concolor": 286,
1336
+ "cowboy boot": 514,
1337
+ "cowboy hat, ten-gallon hat": 515,
1338
+ "coyote, prairie wolf, brush wolf, Canis latrans": 272,
1339
+ "cradle": 516,
1340
+ "crane": 517,
1341
+ "crash helmet": 518,
1342
+ "crate": 519,
1343
+ "crayfish, crawfish, crawdad, crawdaddy": 124,
1344
+ "crib, cot": 520,
1345
+ "cricket": 312,
1346
+ "croquet ball": 522,
1347
+ "crossword puzzle, crossword": 918,
1348
+ "crutch": 523,
1349
+ "cucumber, cuke": 943,
1350
+ "cuirass": 524,
1351
+ "cup": 968,
1352
+ "curly-coated retriever": 206,
1353
+ "custard apple": 956,
1354
+ "daisy": 985,
1355
+ "dalmatian, coach dog, carriage dog": 251,
1356
+ "dam, dike, dyke": 525,
1357
+ "damselfly": 320,
1358
+ "desk": 526,
1359
+ "desktop computer": 527,
1360
+ "dhole, Cuon alpinus": 274,
1361
+ "dial telephone, dial phone": 528,
1362
+ "diamondback, diamondback rattlesnake, Crotalus adamanteus": 67,
1363
+ "diaper, nappy, napkin": 529,
1364
+ "digital clock": 530,
1365
+ "digital watch": 531,
1366
+ "dingo, warrigal, warragal, Canis dingo": 273,
1367
+ "dining table, board": 532,
1368
+ "dishrag, dishcloth": 533,
1369
+ "dishwasher, dish washer, dishwashing machine": 534,
1370
+ "disk brake, disc brake": 535,
1371
+ "dock, dockage, docking facility": 536,
1372
+ "dogsled, dog sled, dog sleigh": 537,
1373
+ "dome": 538,
1374
+ "doormat, welcome mat": 539,
1375
+ "dough": 961,
1376
+ "dowitcher": 142,
1377
+ "dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk": 319,
1378
+ "drake": 97,
1379
+ "drilling platform, offshore rig": 540,
1380
+ "drum, membranophone, tympan": 541,
1381
+ "drumstick": 542,
1382
+ "dugong, Dugong dugon": 149,
1383
+ "dumbbell": 543,
1384
+ "dung beetle": 305,
1385
+ "ear, spike, capitulum": 998,
1386
+ "earthstar": 995,
1387
+ "echidna, spiny anteater, anteater": 102,
1388
+ "eel": 390,
1389
+ "eft": 27,
1390
+ "eggnog": 969,
1391
+ "electric fan, blower": 545,
1392
+ "electric guitar": 546,
1393
+ "electric locomotive": 547,
1394
+ "electric ray, crampfish, numbfish, torpedo": 5,
1395
+ "entertainment center": 548,
1396
+ "envelope": 549,
1397
+ "espresso": 967,
1398
+ "espresso maker": 550,
1399
+ "face powder": 551,
1400
+ "feather boa, boa": 552,
1401
+ "fiddler crab": 120,
1402
+ "fig": 952,
1403
+ "file, file cabinet, filing cabinet": 553,
1404
+ "fire engine, fire truck": 555,
1405
+ "fire screen, fireguard": 556,
1406
+ "fireboat": 554,
1407
+ "flagpole, flagstaff": 557,
1408
+ "flamingo": 130,
1409
+ "flat-coated retriever": 205,
1410
+ "flatworm, platyhelminth": 110,
1411
+ "flute, transverse flute": 558,
1412
+ "fly": 308,
1413
+ "folding chair": 559,
1414
+ "football helmet": 560,
1415
+ "forklift": 561,
1416
+ "fountain": 562,
1417
+ "fountain pen": 563,
1418
+ "four-poster": 564,
1419
+ "fox squirrel, eastern fox squirrel, Sciurus niger": 335,
1420
+ "freight car": 565,
1421
+ "frilled lizard, Chlamydosaurus kingi": 43,
1422
+ "frying pan, frypan, skillet": 567,
1423
+ "fur coat": 568,
1424
+ "gar, garfish, garpike, billfish, Lepisosteus osseus": 395,
1425
+ "garbage truck, dustcart": 569,
1426
+ "garden spider, Aranea diademata": 74,
1427
+ "garter snake, grass snake": 57,
1428
+ "gas pump, gasoline pump, petrol pump, island dispenser": 571,
1429
+ "gasmask, respirator, gas helmet": 570,
1430
+ "gazelle": 353,
1431
+ "geyser": 974,
1432
+ "giant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca": 388,
1433
+ "giant schnauzer": 197,
1434
+ "gibbon, Hylobates lar": 368,
1435
+ "go-kart": 573,
1436
+ "goblet": 572,
1437
+ "golden retriever": 207,
1438
+ "goldfinch, Carduelis carduelis": 11,
1439
+ "goldfish, Carassius auratus": 1,
1440
+ "golf ball": 574,
1441
+ "golfcart, golf cart": 575,
1442
+ "gondola": 576,
1443
+ "gong, tam-tam": 577,
1444
+ "goose": 99,
1445
+ "gorilla, Gorilla gorilla": 366,
1446
+ "gown": 578,
1447
+ "grand piano, grand": 579,
1448
+ "grasshopper, hopper": 311,
1449
+ "great grey owl, great gray owl, Strix nebulosa": 24,
1450
+ "great white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias": 2,
1451
+ "green lizard, Lacerta viridis": 46,
1452
+ "green mamba": 64,
1453
+ "green snake, grass snake": 55,
1454
+ "greenhouse, nursery, glasshouse": 580,
1455
+ "grey fox, gray fox, Urocyon cinereoargenteus": 280,
1456
+ "grey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus": 147,
1457
+ "grille, radiator grille": 581,
1458
+ "grocery store, grocery, food market, market": 582,
1459
+ "groenendael": 224,
1460
+ "groom, bridegroom": 982,
1461
+ "ground beetle, carabid beetle": 302,
1462
+ "guacamole": 924,
1463
+ "guenon, guenon monkey": 370,
1464
+ "guillotine": 583,
1465
+ "guinea pig, Cavia cobaya": 338,
1466
+ "gyromitra": 993,
1467
+ "hair slide": 584,
1468
+ "hair spray": 585,
1469
+ "half track": 586,
1470
+ "hammer": 587,
1471
+ "hammerhead, hammerhead shark": 4,
1472
+ "hamper": 588,
1473
+ "hamster": 333,
1474
+ "hand blower, blow dryer, blow drier, hair dryer, hair drier": 589,
1475
+ "hand-held computer, hand-held microcomputer": 590,
1476
+ "handkerchief, hankie, hanky, hankey": 591,
1477
+ "hard disc, hard disk, fixed disk": 592,
1478
+ "hare": 331,
1479
+ "harmonica, mouth organ, harp, mouth harp": 593,
1480
+ "harp": 594,
1481
+ "hartebeest": 351,
1482
+ "harvester, reaper": 595,
1483
+ "harvestman, daddy longlegs, Phalangium opilio": 70,
1484
+ "hatchet": 596,
1485
+ "hay": 958,
1486
+ "head cabbage": 936,
1487
+ "hen": 8,
1488
+ "hen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa": 996,
1489
+ "hermit crab": 125,
1490
+ "hip, rose hip, rosehip": 989,
1491
+ "hippopotamus, hippo, river horse, Hippopotamus amphibius": 344,
1492
+ "hog, pig, grunter, squealer, Sus scrofa": 341,
1493
+ "hognose snake, puff adder, sand viper": 54,
1494
+ "holster": 597,
1495
+ "home theater, home theatre": 598,
1496
+ "honeycomb": 599,
1497
+ "hook, claw": 600,
1498
+ "hoopskirt, crinoline": 601,
1499
+ "horizontal bar, high bar": 602,
1500
+ "hornbill": 93,
1501
+ "horned viper, cerastes, sand viper, horned asp, Cerastes cornutus": 66,
1502
+ "horse cart, horse-cart": 603,
1503
+ "hot pot, hotpot": 926,
1504
+ "hotdog, hot dog, red hot": 934,
1505
+ "hourglass": 604,
1506
+ "house finch, linnet, Carpodacus mexicanus": 12,
1507
+ "howler monkey, howler": 379,
1508
+ "hummingbird": 94,
1509
+ "hyena, hyaena": 276,
1510
+ "iPod": 605,
1511
+ "ibex, Capra ibex": 350,
1512
+ "ice bear, polar bear, Ursus Maritimus, Thalarctos maritimus": 296,
1513
+ "ice cream, icecream": 928,
1514
+ "ice lolly, lolly, lollipop, popsicle": 929,
1515
+ "impala, Aepyceros melampus": 352,
1516
+ "indigo bunting, indigo finch, indigo bird, Passerina cyanea": 14,
1517
+ "indri, indris, Indri indri, Indri brevicaudatus": 384,
1518
+ "iron, smoothing iron": 606,
1519
+ "isopod": 126,
1520
+ "jacamar": 95,
1521
+ "jack-o'-lantern": 607,
1522
+ "jackfruit, jak, jack": 955,
1523
+ "jaguar, panther, Panthera onca, Felis onca": 290,
1524
+ "jay": 17,
1525
+ "jean, blue jean, denim": 608,
1526
+ "jeep, landrover": 609,
1527
+ "jellyfish": 107,
1528
+ "jersey, T-shirt, tee shirt": 610,
1529
+ "jigsaw puzzle": 611,
1530
+ "jinrikisha, ricksha, rickshaw": 612,
1531
+ "joystick": 613,
1532
+ "junco, snowbird": 13,
1533
+ "keeshond": 261,
1534
+ "kelpie": 227,
1535
+ "killer whale, killer, orca, grampus, sea wolf, Orcinus orca": 148,
1536
+ "kimono": 614,
1537
+ "king crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica": 121,
1538
+ "king penguin, Aptenodytes patagonica": 145,
1539
+ "king snake, kingsnake": 56,
1540
+ "kit fox, Vulpes macrotis": 278,
1541
+ "kite": 21,
1542
+ "knee pad": 615,
1543
+ "knot": 616,
1544
+ "koala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus": 105,
1545
+ "komondor": 228,
1546
+ "kuvasz": 222,
1547
+ "lab coat, laboratory coat": 617,
1548
+ "lacewing, lacewing fly": 318,
1549
+ "ladle": 618,
1550
+ "ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle": 301,
1551
+ "lakeside, lakeshore": 975,
1552
+ "lampshade, lamp shade": 619,
1553
+ "langur": 374,
1554
+ "laptop, laptop computer": 620,
1555
+ "lawn mower, mower": 621,
1556
+ "leaf beetle, chrysomelid": 304,
1557
+ "leafhopper": 317,
1558
+ "leatherback turtle, leatherback, leathery turtle, Dermochelys coriacea": 34,
1559
+ "lemon": 951,
1560
+ "lens cap, lens cover": 622,
1561
+ "leopard, Panthera pardus": 288,
1562
+ "lesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens": 387,
1563
+ "letter opener, paper knife, paperknife": 623,
1564
+ "library": 624,
1565
+ "lifeboat": 625,
1566
+ "lighter, light, igniter, ignitor": 626,
1567
+ "limousine, limo": 627,
1568
+ "limpkin, Aramus pictus": 135,
1569
+ "liner, ocean liner": 628,
1570
+ "lion, king of beasts, Panthera leo": 291,
1571
+ "lionfish": 396,
1572
+ "lipstick, lip rouge": 629,
1573
+ "little blue heron, Egretta caerulea": 131,
1574
+ "llama": 355,
1575
+ "loggerhead, loggerhead turtle, Caretta caretta": 33,
1576
+ "long-horned beetle, longicorn, longicorn beetle": 303,
1577
+ "lorikeet": 90,
1578
+ "lotion": 631,
1579
+ "loudspeaker, speaker, speaker unit, loudspeaker system, speaker system": 632,
1580
+ "loupe, jeweler's loupe": 633,
1581
+ "lumbermill, sawmill": 634,
1582
+ "lycaenid, lycaenid butterfly": 326,
1583
+ "lynx, catamount": 287,
1584
+ "macaque": 373,
1585
+ "macaw": 88,
1586
+ "magnetic compass": 635,
1587
+ "magpie": 18,
1588
+ "mailbag, postbag": 636,
1589
+ "mailbox, letter box": 637,
1590
+ "maillot": 638,
1591
+ "maillot, tank suit": 639,
1592
+ "malamute, malemute, Alaskan malamute": 249,
1593
+ "malinois": 225,
1594
+ "manhole cover": 640,
1595
+ "mantis, mantid": 315,
1596
+ "maraca": 641,
1597
+ "marimba, xylophone": 642,
1598
+ "marmoset": 377,
1599
+ "marmot": 336,
1600
+ "mashed potato": 935,
1601
+ "mask": 643,
1602
+ "matchstick": 644,
1603
+ "maypole": 645,
1604
+ "maze, labyrinth": 646,
1605
+ "measuring cup": 647,
1606
+ "meat loaf, meatloaf": 962,
1607
+ "medicine chest, medicine cabinet": 648,
1608
+ "meerkat, mierkat": 299,
1609
+ "megalith, megalithic structure": 649,
1610
+ "menu": 922,
1611
+ "microphone, mike": 650,
1612
+ "microwave, microwave oven": 651,
1613
+ "military uniform": 652,
1614
+ "milk can": 653,
1615
+ "miniature pinscher": 237,
1616
+ "miniature poodle": 266,
1617
+ "miniature schnauzer": 196,
1618
+ "minibus": 654,
1619
+ "miniskirt, mini": 655,
1620
+ "minivan": 656,
1621
+ "mink": 357,
1622
+ "missile": 657,
1623
+ "mitten": 658,
1624
+ "mixing bowl": 659,
1625
+ "mobile home, manufactured home": 660,
1626
+ "modem": 662,
1627
+ "monarch, monarch butterfly, milkweed butterfly, Danaus plexippus": 323,
1628
+ "monastery": 663,
1629
+ "mongoose": 298,
1630
+ "monitor": 664,
1631
+ "moped": 665,
1632
+ "mortar": 666,
1633
+ "mortarboard": 667,
1634
+ "mosque": 668,
1635
+ "mosquito net": 669,
1636
+ "motor scooter, scooter": 670,
1637
+ "mountain bike, all-terrain bike, off-roader": 671,
1638
+ "mountain tent": 672,
1639
+ "mouse, computer mouse": 673,
1640
+ "mousetrap": 674,
1641
+ "moving van": 675,
1642
+ "mud turtle": 35,
1643
+ "mushroom": 947,
1644
+ "muzzle": 676,
1645
+ "nail": 677,
1646
+ "neck brace": 678,
1647
+ "necklace": 679,
1648
+ "nematode, nematode worm, roundworm": 111,
1649
+ "night snake, Hypsiglena torquata": 60,
1650
+ "nipple": 680,
1651
+ "notebook, notebook computer": 681,
1652
+ "obelisk": 682,
1653
+ "oboe, hautboy, hautbois": 683,
1654
+ "ocarina, sweet potato": 684,
1655
+ "odometer, hodometer, mileometer, milometer": 685,
1656
+ "oil filter": 686,
1657
+ "orange": 950,
1658
+ "orangutan, orang, orangutang, Pongo pygmaeus": 365,
1659
+ "organ, pipe organ": 687,
1660
+ "oscilloscope, scope, cathode-ray oscilloscope, CRO": 688,
1661
+ "ostrich, Struthio camelus": 9,
1662
+ "otter": 360,
1663
+ "otterhound, otter hound": 175,
1664
+ "overskirt": 689,
1665
+ "ox": 345,
1666
+ "oxcart": 690,
1667
+ "oxygen mask": 691,
1668
+ "oystercatcher, oyster catcher": 143,
1669
+ "packet": 692,
1670
+ "paddle, boat paddle": 693,
1671
+ "paddlewheel, paddle wheel": 694,
1672
+ "padlock": 695,
1673
+ "paintbrush": 696,
1674
+ "pajama, pyjama, pj's, jammies": 697,
1675
+ "palace": 698,
1676
+ "panpipe, pandean pipe, syrinx": 699,
1677
+ "paper towel": 700,
1678
+ "papillon": 157,
1679
+ "parachute, chute": 701,
1680
+ "parallel bars, bars": 702,
1681
+ "park bench": 703,
1682
+ "parking meter": 704,
1683
+ "partridge": 86,
1684
+ "passenger car, coach, carriage": 705,
1685
+ "patas, hussar monkey, Erythrocebus patas": 371,
1686
+ "patio, terrace": 706,
1687
+ "pay-phone, pay-station": 707,
1688
+ "peacock": 84,
1689
+ "pedestal, plinth, footstall": 708,
1690
+ "pelican": 144,
1691
+ "pencil box, pencil case": 709,
1692
+ "pencil sharpener": 710,
1693
+ "perfume, essence": 711,
1694
+ "photocopier": 713,
1695
+ "pick, plectrum, plectron": 714,
1696
+ "pickelhaube": 715,
1697
+ "picket fence, paling": 716,
1698
+ "pickup, pickup truck": 717,
1699
+ "pier": 718,
1700
+ "piggy bank, penny bank": 719,
1701
+ "pill bottle": 720,
1702
+ "pillow": 721,
1703
+ "pineapple, ananas": 953,
1704
+ "ping-pong ball": 722,
1705
+ "pinwheel": 723,
1706
+ "pirate, pirate ship": 724,
1707
+ "pitcher, ewer": 725,
1708
+ "pizza, pizza pie": 963,
1709
+ "plane, carpenter's plane, woodworking plane": 726,
1710
+ "planetarium": 727,
1711
+ "plastic bag": 728,
1712
+ "plate": 923,
1713
+ "plate rack": 729,
1714
+ "platypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus": 103,
1715
+ "plow, plough": 730,
1716
+ "plunger, plumber's helper": 731,
1717
+ "pole": 733,
1718
+ "polecat, fitch, foulmart, foumart, Mustela putorius": 358,
1719
+ "police van, police wagon, paddy wagon, patrol wagon, wagon, black Maria": 734,
1720
+ "pomegranate": 957,
1721
+ "poncho": 735,
1722
+ "pool table, billiard table, snooker table": 736,
1723
+ "pop bottle, soda bottle": 737,
1724
+ "porcupine, hedgehog": 334,
1725
+ "pot, flowerpot": 738,
1726
+ "potpie": 964,
1727
+ "potter's wheel": 739,
1728
+ "power drill": 740,
1729
+ "prairie chicken, prairie grouse, prairie fowl": 83,
1730
+ "prayer rug, prayer mat": 741,
1731
+ "pretzel": 932,
1732
+ "printer": 742,
1733
+ "prison, prison house": 743,
1734
+ "proboscis monkey, Nasalis larvatus": 376,
1735
+ "projectile, missile": 744,
1736
+ "projector": 745,
1737
+ "promontory, headland, head, foreland": 976,
1738
+ "ptarmigan": 81,
1739
+ "puck, hockey puck": 746,
1740
+ "puffer, pufferfish, blowfish, globefish": 397,
1741
+ "pug, pug-dog": 254,
1742
+ "punching bag, punch bag, punching ball, punchball": 747,
1743
+ "purse": 748,
1744
+ "quail": 85,
1745
+ "quill, quill pen": 749,
1746
+ "quilt, comforter, comfort, puff": 750,
1747
+ "racer, race car, racing car": 751,
1748
+ "racket, racquet": 752,
1749
+ "radiator": 753,
1750
+ "radio telescope, radio reflector": 755,
1751
+ "radio, wireless": 754,
1752
+ "rain barrel": 756,
1753
+ "ram, tup": 348,
1754
+ "rapeseed": 984,
1755
+ "recreational vehicle, RV, R.V.": 757,
1756
+ "red fox, Vulpes vulpes": 277,
1757
+ "red wine": 966,
1758
+ "red wolf, maned wolf, Canis rufus, Canis niger": 271,
1759
+ "red-backed sandpiper, dunlin, Erolia alpina": 140,
1760
+ "red-breasted merganser, Mergus serrator": 98,
1761
+ "redbone": 168,
1762
+ "redshank, Tringa totanus": 141,
1763
+ "reel": 758,
1764
+ "reflex camera": 759,
1765
+ "refrigerator, icebox": 760,
1766
+ "remote control, remote": 761,
1767
+ "restaurant, eating house, eating place, eatery": 762,
1768
+ "revolver, six-gun, six-shooter": 763,
1769
+ "rhinoceros beetle": 306,
1770
+ "rifle": 764,
1771
+ "ringlet, ringlet butterfly": 322,
1772
+ "ringneck snake, ring-necked snake, ring snake": 53,
1773
+ "robin, American robin, Turdus migratorius": 15,
1774
+ "rock beauty, Holocanthus tricolor": 392,
1775
+ "rock crab, Cancer irroratus": 119,
1776
+ "rock python, rock snake, Python sebae": 62,
1777
+ "rocking chair, rocker": 765,
1778
+ "rotisserie": 766,
1779
+ "rubber eraser, rubber, pencil eraser": 767,
1780
+ "ruddy turnstone, Arenaria interpres": 139,
1781
+ "ruffed grouse, partridge, Bonasa umbellus": 82,
1782
+ "rugby ball": 768,
1783
+ "rule, ruler": 769,
1784
+ "running shoe": 770,
1785
+ "safe": 771,
1786
+ "safety pin": 772,
1787
+ "saltshaker, salt shaker": 773,
1788
+ "sandal": 774,
1789
+ "sandbar, sand bar": 977,
1790
+ "sarong": 775,
1791
+ "sax, saxophone": 776,
1792
+ "scabbard": 777,
1793
+ "scale, weighing machine": 778,
1794
+ "schipperke": 223,
1795
+ "school bus": 779,
1796
+ "schooner": 780,
1797
+ "scoreboard": 781,
1798
+ "scorpion": 71,
1799
+ "screen, CRT screen": 782,
1800
+ "screw": 783,
1801
+ "screwdriver": 784,
1802
+ "scuba diver": 983,
1803
+ "sea anemone, anemone": 108,
1804
+ "sea cucumber, holothurian": 329,
1805
+ "sea lion": 150,
1806
+ "sea slug, nudibranch": 115,
1807
+ "sea snake": 65,
1808
+ "sea urchin": 328,
1809
+ "seashore, coast, seacoast, sea-coast": 978,
1810
+ "seat belt, seatbelt": 785,
1811
+ "sewing machine": 786,
1812
+ "shield, buckler": 787,
1813
+ "shoe shop, shoe-shop, shoe store": 788,
1814
+ "shoji": 789,
1815
+ "shopping basket": 790,
1816
+ "shopping cart": 791,
1817
+ "shovel": 792,
1818
+ "shower cap": 793,
1819
+ "shower curtain": 794,
1820
+ "siamang, Hylobates syndactylus, Symphalangus syndactylus": 369,
1821
+ "sidewinder, horned rattlesnake, Crotalus cerastes": 68,
1822
+ "silky terrier, Sydney silky": 201,
1823
+ "ski": 795,
1824
+ "ski mask": 796,
1825
+ "skunk, polecat, wood pussy": 361,
1826
+ "sleeping bag": 797,
1827
+ "slide rule, slipstick": 798,
1828
+ "sliding door": 799,
1829
+ "slot, one-armed bandit": 800,
1830
+ "sloth bear, Melursus ursinus, Ursus ursinus": 297,
1831
+ "slug": 114,
1832
+ "snail": 113,
1833
+ "snorkel": 801,
1834
+ "snow leopard, ounce, Panthera uncia": 289,
1835
+ "snowmobile": 802,
1836
+ "snowplow, snowplough": 803,
1837
+ "soap dispenser": 804,
1838
+ "soccer ball": 805,
1839
+ "sock": 806,
1840
+ "soft-coated wheaten terrier": 202,
1841
+ "solar dish, solar collector, solar furnace": 807,
1842
+ "sombrero": 808,
1843
+ "sorrel": 339,
1844
+ "soup bowl": 809,
1845
+ "space bar": 810,
1846
+ "space heater": 811,
1847
+ "space shuttle": 812,
1848
+ "spaghetti squash": 940,
1849
+ "spatula": 813,
1850
+ "speedboat": 814,
1851
+ "spider monkey, Ateles geoffroyi": 381,
1852
+ "spider web, spider's web": 815,
1853
+ "spindle": 816,
1854
+ "spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish": 123,
1855
+ "spoonbill": 129,
1856
+ "sports car, sport car": 817,
1857
+ "spotlight, spot": 818,
1858
+ "spotted salamander, Ambystoma maculatum": 28,
1859
+ "squirrel monkey, Saimiri sciureus": 382,
1860
+ "stage": 819,
1861
+ "standard poodle": 267,
1862
+ "standard schnauzer": 198,
1863
+ "starfish, sea star": 327,
1864
+ "steam locomotive": 820,
1865
+ "steel arch bridge": 821,
1866
+ "steel drum": 822,
1867
+ "stethoscope": 823,
1868
+ "stingray": 6,
1869
+ "stinkhorn, carrion fungus": 994,
1870
+ "stole": 824,
1871
+ "stone wall": 825,
1872
+ "stopwatch, stop watch": 826,
1873
+ "stove": 827,
1874
+ "strainer": 828,
1875
+ "strawberry": 949,
1876
+ "street sign": 919,
1877
+ "streetcar, tram, tramcar, trolley, trolley car": 829,
1878
+ "stretcher": 830,
1879
+ "studio couch, day bed": 831,
1880
+ "stupa, tope": 832,
1881
+ "sturgeon": 394,
1882
+ "submarine, pigboat, sub, U-boat": 833,
1883
+ "suit, suit of clothes": 834,
1884
+ "sulphur butterfly, sulfur butterfly": 325,
1885
+ "sulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita": 89,
1886
+ "sundial": 835,
1887
+ "sunglass": 836,
1888
+ "sunglasses, dark glasses, shades": 837,
1889
+ "sunscreen, sunblock, sun blocker": 838,
1890
+ "suspension bridge": 839,
1891
+ "swab, swob, mop": 840,
1892
+ "sweatshirt": 841,
1893
+ "swimming trunks, bathing trunks": 842,
1894
+ "swing": 843,
1895
+ "switch, electric switch, electrical switch": 844,
1896
+ "syringe": 845,
1897
+ "tabby, tabby cat": 281,
1898
+ "table lamp": 846,
1899
+ "tailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui": 32,
1900
+ "tank, army tank, armored combat vehicle, armoured combat vehicle": 847,
1901
+ "tape player": 848,
1902
+ "tarantula": 76,
1903
+ "teapot": 849,
1904
+ "teddy, teddy bear": 850,
1905
+ "television, television system": 851,
1906
+ "tench, Tinca tinca": 0,
1907
+ "tennis ball": 852,
1908
+ "terrapin": 36,
1909
+ "thatch, thatched roof": 853,
1910
+ "theater curtain, theatre curtain": 854,
1911
+ "thimble": 855,
1912
+ "three-toed sloth, ai, Bradypus tridactylus": 364,
1913
+ "thresher, thrasher, threshing machine": 856,
1914
+ "throne": 857,
1915
+ "thunder snake, worm snake, Carphophis amoenus": 52,
1916
+ "tick": 78,
1917
+ "tiger beetle": 300,
1918
+ "tiger cat": 282,
1919
+ "tiger shark, Galeocerdo cuvieri": 3,
1920
+ "tiger, Panthera tigris": 292,
1921
+ "tile roof": 858,
1922
+ "timber wolf, grey wolf, gray wolf, Canis lupus": 269,
1923
+ "titi, titi monkey": 380,
1924
+ "toaster": 859,
1925
+ "tobacco shop, tobacconist shop, tobacconist": 860,
1926
+ "toilet seat": 861,
1927
+ "toilet tissue, toilet paper, bathroom tissue": 999,
1928
+ "torch": 862,
1929
+ "totem pole": 863,
1930
+ "toucan": 96,
1931
+ "tow truck, tow car, wrecker": 864,
1932
+ "toy poodle": 265,
1933
+ "toy terrier": 158,
1934
+ "toyshop": 865,
1935
+ "tractor": 866,
1936
+ "traffic light, traffic signal, stoplight": 920,
1937
+ "trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi": 867,
1938
+ "tray": 868,
1939
+ "tree frog, tree-frog": 31,
1940
+ "trench coat": 869,
1941
+ "triceratops": 51,
1942
+ "tricycle, trike, velocipede": 870,
1943
+ "trifle": 927,
1944
+ "trilobite": 69,
1945
+ "trimaran": 871,
1946
+ "tripod": 872,
1947
+ "triumphal arch": 873,
1948
+ "trolleybus, trolley coach, trackless trolley": 874,
1949
+ "trombone": 875,
1950
+ "tub, vat": 876,
1951
+ "turnstile": 877,
1952
+ "tusker": 101,
1953
+ "typewriter keyboard": 878,
1954
+ "umbrella": 879,
1955
+ "unicycle, monocycle": 880,
1956
+ "upright, upright piano": 881,
1957
+ "vacuum, vacuum cleaner": 882,
1958
+ "valley, vale": 979,
1959
+ "vase": 883,
1960
+ "vault": 884,
1961
+ "velvet": 885,
1962
+ "vending machine": 886,
1963
+ "vestment": 887,
1964
+ "viaduct": 888,
1965
+ "vine snake": 59,
1966
+ "violin, fiddle": 889,
1967
+ "vizsla, Hungarian pointer": 211,
1968
+ "volcano": 980,
1969
+ "volleyball": 890,
1970
+ "vulture": 23,
1971
+ "waffle iron": 891,
1972
+ "walking stick, walkingstick, stick insect": 313,
1973
+ "wall clock": 892,
1974
+ "wallaby, brush kangaroo": 104,
1975
+ "wallet, billfold, notecase, pocketbook": 893,
1976
+ "wardrobe, closet, press": 894,
1977
+ "warplane, military plane": 895,
1978
+ "warthog": 343,
1979
+ "washbasin, handbasin, washbowl, lavabo, wash-hand basin": 896,
1980
+ "washer, automatic washer, washing machine": 897,
1981
+ "water bottle": 898,
1982
+ "water buffalo, water ox, Asiatic buffalo, Bubalus bubalis": 346,
1983
+ "water jug": 899,
1984
+ "water ouzel, dipper": 20,
1985
+ "water snake": 58,
1986
+ "water tower": 900,
1987
+ "weasel": 356,
1988
+ "web site, website, internet site, site": 916,
1989
+ "weevil": 307,
1990
+ "whippet": 172,
1991
+ "whiptail, whiptail lizard": 41,
1992
+ "whiskey jug": 901,
1993
+ "whistle": 902,
1994
+ "white stork, Ciconia ciconia": 127,
1995
+ "white wolf, Arctic wolf, Canis lupus tundrarum": 270,
1996
+ "wig": 903,
1997
+ "wild boar, boar, Sus scrofa": 342,
1998
+ "window screen": 904,
1999
+ "window shade": 905,
2000
+ "wine bottle": 907,
2001
+ "wing": 908,
2002
+ "wire-haired fox terrier": 188,
2003
+ "wok": 909,
2004
+ "wolf spider, hunting spider": 77,
2005
+ "wombat": 106,
2006
+ "wood rabbit, cottontail, cottontail rabbit": 330,
2007
+ "wooden spoon": 910,
2008
+ "wool, woolen, woollen": 911,
2009
+ "worm fence, snake fence, snake-rail fence, Virginia fence": 912,
2010
+ "wreck": 913,
2011
+ "yawl": 914,
2012
+ "yellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum": 986,
2013
+ "yurt": 915,
2014
+ "zebra": 340,
2015
+ "zucchini, courgette": 939
2016
+ },
2017
+ "layer_norm_eps": 1e-12,
2018
+ "model_type": "vit",
2019
+ "num_attention_heads": 12,
2020
+ "num_channels": 3,
2021
+ "num_hidden_layers": 12,
2022
+ "patch_size": 16,
2023
+ "qkv_bias": true,
2024
+ "torch_dtype": "float32",
2025
+ "transformers_version": "4.44.2"
2026
+ }
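
The JSON above closes the ViT classification config stored under chatbot/models/500 (model_type "vit", 12 hidden layers, 16-pixel patches, ImageNet-1k id2label map). A minimal, hypothetical loading sketch follows — assuming transformers, torch and Pillow are installed, that the folder matches this upload (config.json plus model.safetensors, with no preprocessor_config.json), and that google/vit-base-patch16-224 is an acceptable source for the image processor; "example.jpg" is a made-up input path:

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_dir = "chatbot/models/500"  # folder added in this commit

# No preprocessor_config.json is shipped in this folder, so reuse the base
# checkpoint's image processor (an assumption; adjust if the base differs).
processor = AutoImageProcessor.from_pretrained("google/vit-base-patch16-224")
model = AutoModelForImageClassification.from_pretrained(model_dir)

image = Image.open("example.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# id2label comes from the config shown above, e.g. "goldfish, Carassius auratus".
print(model.config.id2label[logits.argmax(-1).item()])
```
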
chatbot/models/500/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d46ec8a4f83d8d75025bb7d299730df4642b3ab7082cb6594d94bcc7c69491e1
+ size 343220892
chatbot/models/HuggingFaceTB/SmolLM2-1.7B-Instruct/config.json ADDED
@@ -0,0 +1,37 @@
+ {
+ "_name_or_path": "HuggingFaceTB/SmolLM2-1.7B-Instruct",
+ "architectures": [
+ "LlamaForCausalLM"
+ ],
+ "attention_bias": false,
+ "attention_dropout": 0.0,
+ "bos_token_id": 1,
+ "eos_token_id": 2,
+ "head_dim": 64,
+ "hidden_act": "silu",
+ "hidden_size": 2048,
+ "initializer_range": 0.02,
+ "intermediate_size": 8192,
+ "max_position_embeddings": 8192,
+ "mlp_bias": false,
+ "model_type": "llama",
+ "num_attention_heads": 32,
+ "num_hidden_layers": 24,
+ "num_key_value_heads": 32,
+ "pad_token_id": 2,
+ "pretraining_tp": 1,
+ "rms_norm_eps": 1e-05,
+ "rope_scaling": null,
+ "rope_theta": 130000,
+ "tie_word_embeddings": true,
+ "torch_dtype": "float32",
+ "transformers.js_config": {
+ "kv_cache_dtype": {
+ "fp16": "float16",
+ "q4f16": "float16"
+ }
+ },
+ "transformers_version": "4.46.2",
+ "use_cache": true,
+ "vocab_size": 49152
+ }
chatbot/models/HuggingFaceTB/SmolLM2-1.7B-Instruct/generation_config.json ADDED
@@ -0,0 +1,7 @@
+ {
+ "_from_model_config": true,
+ "bos_token_id": 1,
+ "eos_token_id": 2,
+ "pad_token_id": 2,
+ "transformers_version": "4.46.2"
+ }
chatbot/models/HuggingFaceTB/SmolLM2-1.7B-Instruct/merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
chatbot/models/HuggingFaceTB/SmolLM2-1.7B-Instruct/model-00001-of-00002.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d21ed78acd64116143c09180c275689fd8eab22a8920455c1c64f63ceab9089a
+ size 4999906800
chatbot/models/HuggingFaceTB/SmolLM2-1.7B-Instruct/model-00002-of-00002.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a1eecb36a045a2ba3deff58cfa2bbb7dddfb76f260e853b442128d1799055b61
+ size 1845623728
chatbot/models/HuggingFaceTB/SmolLM2-1.7B-Instruct/model.safetensors.index.json ADDED
@@ -0,0 +1,225 @@
1
+ {
2
+ "metadata": {
3
+ "total_size": 6845505536
4
+ },
5
+ "weight_map": {
6
+ "model.embed_tokens.weight": "model-00001-of-00002.safetensors",
7
+ "model.layers.0.input_layernorm.weight": "model-00001-of-00002.safetensors",
8
+ "model.layers.0.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
9
+ "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
10
+ "model.layers.0.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
11
+ "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
12
+ "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
13
+ "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
14
+ "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
15
+ "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
16
+ "model.layers.1.input_layernorm.weight": "model-00001-of-00002.safetensors",
17
+ "model.layers.1.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
18
+ "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
19
+ "model.layers.1.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
20
+ "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
21
+ "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
22
+ "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
23
+ "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
24
+ "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
25
+ "model.layers.10.input_layernorm.weight": "model-00001-of-00002.safetensors",
26
+ "model.layers.10.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
27
+ "model.layers.10.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
28
+ "model.layers.10.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
29
+ "model.layers.10.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
30
+ "model.layers.10.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
31
+ "model.layers.10.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
32
+ "model.layers.10.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
33
+ "model.layers.10.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
34
+ "model.layers.11.input_layernorm.weight": "model-00001-of-00002.safetensors",
35
+ "model.layers.11.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
36
+ "model.layers.11.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
37
+ "model.layers.11.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
38
+ "model.layers.11.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
39
+ "model.layers.11.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
40
+ "model.layers.11.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
41
+ "model.layers.11.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
42
+ "model.layers.11.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
43
+ "model.layers.12.input_layernorm.weight": "model-00001-of-00002.safetensors",
44
+ "model.layers.12.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
45
+ "model.layers.12.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
46
+ "model.layers.12.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
47
+ "model.layers.12.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
48
+ "model.layers.12.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
49
+ "model.layers.12.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
50
+ "model.layers.12.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
51
+ "model.layers.12.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
52
+ "model.layers.13.input_layernorm.weight": "model-00001-of-00002.safetensors",
53
+ "model.layers.13.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
54
+ "model.layers.13.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
55
+ "model.layers.13.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
56
+ "model.layers.13.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
57
+ "model.layers.13.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
58
+ "model.layers.13.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
59
+ "model.layers.13.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
60
+ "model.layers.13.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
61
+ "model.layers.14.input_layernorm.weight": "model-00001-of-00002.safetensors",
62
+ "model.layers.14.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
63
+ "model.layers.14.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
64
+ "model.layers.14.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
65
+ "model.layers.14.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
66
+ "model.layers.14.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
67
+ "model.layers.14.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
68
+ "model.layers.14.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
69
+ "model.layers.14.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
70
+ "model.layers.15.input_layernorm.weight": "model-00001-of-00002.safetensors",
71
+ "model.layers.15.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
72
+ "model.layers.15.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
73
+ "model.layers.15.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
74
+ "model.layers.15.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
75
+ "model.layers.15.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
76
+ "model.layers.15.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
77
+ "model.layers.15.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
78
+ "model.layers.15.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
79
+ "model.layers.16.input_layernorm.weight": "model-00001-of-00002.safetensors",
80
+ "model.layers.16.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
81
+ "model.layers.16.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
82
+ "model.layers.16.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
83
+ "model.layers.16.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
84
+ "model.layers.16.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
85
+ "model.layers.16.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
86
+ "model.layers.16.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
87
+ "model.layers.16.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
88
+ "model.layers.17.input_layernorm.weight": "model-00002-of-00002.safetensors",
89
+ "model.layers.17.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
90
+ "model.layers.17.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
91
+ "model.layers.17.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
92
+ "model.layers.17.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
93
+ "model.layers.17.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
94
+ "model.layers.17.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
95
+ "model.layers.17.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
96
+ "model.layers.17.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
97
+ "model.layers.18.input_layernorm.weight": "model-00002-of-00002.safetensors",
98
+ "model.layers.18.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
99
+ "model.layers.18.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
100
+ "model.layers.18.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
101
+ "model.layers.18.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
102
+ "model.layers.18.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
103
+ "model.layers.18.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
104
+ "model.layers.18.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
105
+ "model.layers.18.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
106
+ "model.layers.19.input_layernorm.weight": "model-00002-of-00002.safetensors",
107
+ "model.layers.19.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
108
+ "model.layers.19.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
109
+ "model.layers.19.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
110
+ "model.layers.19.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
111
+ "model.layers.19.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
112
+ "model.layers.19.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
113
+ "model.layers.19.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
114
+ "model.layers.19.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
115
+ "model.layers.2.input_layernorm.weight": "model-00001-of-00002.safetensors",
116
+ "model.layers.2.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
117
+ "model.layers.2.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
118
+ "model.layers.2.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
119
+ "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
120
+ "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
121
+ "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
122
+ "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
123
+ "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
124
+ "model.layers.20.input_layernorm.weight": "model-00002-of-00002.safetensors",
125
+ "model.layers.20.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
126
+ "model.layers.20.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
127
+ "model.layers.20.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
128
+ "model.layers.20.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
129
+ "model.layers.20.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
130
+ "model.layers.20.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
131
+ "model.layers.20.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
132
+ "model.layers.20.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
133
+ "model.layers.21.input_layernorm.weight": "model-00002-of-00002.safetensors",
134
+ "model.layers.21.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
135
+ "model.layers.21.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
136
+ "model.layers.21.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
137
+ "model.layers.21.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
138
+ "model.layers.21.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
139
+ "model.layers.21.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
140
+ "model.layers.21.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
141
+ "model.layers.21.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
142
+ "model.layers.22.input_layernorm.weight": "model-00002-of-00002.safetensors",
143
+ "model.layers.22.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
144
+ "model.layers.22.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
145
+ "model.layers.22.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
146
+ "model.layers.22.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
147
+ "model.layers.22.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
148
+ "model.layers.22.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
149
+ "model.layers.22.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
150
+ "model.layers.22.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
151
+ "model.layers.23.input_layernorm.weight": "model-00002-of-00002.safetensors",
152
+ "model.layers.23.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
153
+ "model.layers.23.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
154
+ "model.layers.23.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
155
+ "model.layers.23.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
156
+ "model.layers.23.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
157
+ "model.layers.23.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
158
+ "model.layers.23.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
159
+ "model.layers.23.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
160
+ "model.layers.3.input_layernorm.weight": "model-00001-of-00002.safetensors",
161
+ "model.layers.3.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
162
+ "model.layers.3.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
163
+ "model.layers.3.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
164
+ "model.layers.3.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
165
+ "model.layers.3.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
166
+ "model.layers.3.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
167
+ "model.layers.3.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
168
+ "model.layers.3.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
169
+ "model.layers.4.input_layernorm.weight": "model-00001-of-00002.safetensors",
170
+ "model.layers.4.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
171
+ "model.layers.4.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
172
+ "model.layers.4.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
173
+ "model.layers.4.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
174
+ "model.layers.4.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
175
+ "model.layers.4.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
176
+ "model.layers.4.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
177
+ "model.layers.4.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
178
+ "model.layers.5.input_layernorm.weight": "model-00001-of-00002.safetensors",
179
+ "model.layers.5.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
180
+ "model.layers.5.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
181
+ "model.layers.5.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
182
+ "model.layers.5.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
183
+ "model.layers.5.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
184
+ "model.layers.5.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
185
+ "model.layers.5.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
186
+ "model.layers.5.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
187
+ "model.layers.6.input_layernorm.weight": "model-00001-of-00002.safetensors",
188
+ "model.layers.6.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
189
+ "model.layers.6.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
190
+ "model.layers.6.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
191
+ "model.layers.6.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
192
+ "model.layers.6.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
193
+ "model.layers.6.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
194
+ "model.layers.6.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
195
+ "model.layers.6.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
196
+ "model.layers.7.input_layernorm.weight": "model-00001-of-00002.safetensors",
197
+ "model.layers.7.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
198
+ "model.layers.7.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
199
+ "model.layers.7.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
200
+ "model.layers.7.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
201
+ "model.layers.7.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
202
+ "model.layers.7.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
203
+ "model.layers.7.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
204
+ "model.layers.7.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
205
+ "model.layers.8.input_layernorm.weight": "model-00001-of-00002.safetensors",
206
+ "model.layers.8.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
207
+ "model.layers.8.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
208
+ "model.layers.8.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
209
+ "model.layers.8.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
210
+ "model.layers.8.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
211
+ "model.layers.8.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
212
+ "model.layers.8.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
213
+ "model.layers.8.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
214
+ "model.layers.9.input_layernorm.weight": "model-00001-of-00002.safetensors",
215
+ "model.layers.9.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
216
+ "model.layers.9.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
217
+ "model.layers.9.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
218
+ "model.layers.9.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
219
+ "model.layers.9.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
220
+ "model.layers.9.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
221
+ "model.layers.9.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
222
+ "model.layers.9.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
223
+ "model.norm.weight": "model-00002-of-00002.safetensors"
224
+ }
225
+ }
chatbot/models/HuggingFaceTB/SmolLM2-1.7B-Instruct/special_tokens_map.json ADDED
@@ -0,0 +1,34 @@
+ {
+ "additional_special_tokens": [
+ "<|im_start|>",
+ "<|im_end|>"
+ ],
+ "bos_token": {
+ "content": "<|im_start|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "<|im_end|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "<|im_end|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "unk_token": {
+ "content": "<|endoftext|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
chatbot/models/HuggingFaceTB/SmolLM2-1.7B-Instruct/tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
chatbot/models/HuggingFaceTB/SmolLM2-1.7B-Instruct/tokenizer_config.json ADDED
@@ -0,0 +1,154 @@
1
+ {
2
+ "add_prefix_space": false,
3
+ "added_tokens_decoder": {
4
+ "0": {
5
+ "content": "<|endoftext|>",
6
+ "lstrip": false,
7
+ "normalized": false,
8
+ "rstrip": false,
9
+ "single_word": false,
10
+ "special": true
11
+ },
12
+ "1": {
13
+ "content": "<|im_start|>",
14
+ "lstrip": false,
15
+ "normalized": false,
16
+ "rstrip": false,
17
+ "single_word": false,
18
+ "special": true
19
+ },
20
+ "2": {
21
+ "content": "<|im_end|>",
22
+ "lstrip": false,
23
+ "normalized": false,
24
+ "rstrip": false,
25
+ "single_word": false,
26
+ "special": true
27
+ },
28
+ "3": {
29
+ "content": "<repo_name>",
30
+ "lstrip": false,
31
+ "normalized": false,
32
+ "rstrip": false,
33
+ "single_word": false,
34
+ "special": true
35
+ },
36
+ "4": {
37
+ "content": "<reponame>",
38
+ "lstrip": false,
39
+ "normalized": false,
40
+ "rstrip": false,
41
+ "single_word": false,
42
+ "special": true
43
+ },
44
+ "5": {
45
+ "content": "<file_sep>",
46
+ "lstrip": false,
47
+ "normalized": false,
48
+ "rstrip": false,
49
+ "single_word": false,
50
+ "special": true
51
+ },
52
+ "6": {
53
+ "content": "<filename>",
54
+ "lstrip": false,
55
+ "normalized": false,
56
+ "rstrip": false,
57
+ "single_word": false,
58
+ "special": true
59
+ },
60
+ "7": {
61
+ "content": "<gh_stars>",
62
+ "lstrip": false,
63
+ "normalized": false,
64
+ "rstrip": false,
65
+ "single_word": false,
66
+ "special": true
67
+ },
68
+ "8": {
69
+ "content": "<issue_start>",
70
+ "lstrip": false,
71
+ "normalized": false,
72
+ "rstrip": false,
73
+ "single_word": false,
74
+ "special": true
75
+ },
76
+ "9": {
77
+ "content": "<issue_comment>",
78
+ "lstrip": false,
79
+ "normalized": false,
80
+ "rstrip": false,
81
+ "single_word": false,
82
+ "special": true
83
+ },
84
+ "10": {
85
+ "content": "<issue_closed>",
86
+ "lstrip": false,
87
+ "normalized": false,
88
+ "rstrip": false,
89
+ "single_word": false,
90
+ "special": true
91
+ },
92
+ "11": {
93
+ "content": "<jupyter_start>",
94
+ "lstrip": false,
95
+ "normalized": false,
96
+ "rstrip": false,
97
+ "single_word": false,
98
+ "special": true
99
+ },
100
+ "12": {
101
+ "content": "<jupyter_text>",
102
+ "lstrip": false,
103
+ "normalized": false,
104
+ "rstrip": false,
105
+ "single_word": false,
106
+ "special": true
107
+ },
108
+ "13": {
109
+ "content": "<jupyter_code>",
110
+ "lstrip": false,
111
+ "normalized": false,
112
+ "rstrip": false,
113
+ "single_word": false,
114
+ "special": true
115
+ },
116
+ "14": {
117
+ "content": "<jupyter_output>",
118
+ "lstrip": false,
119
+ "normalized": false,
120
+ "rstrip": false,
121
+ "single_word": false,
122
+ "special": true
123
+ },
124
+ "15": {
125
+ "content": "<jupyter_script>",
126
+ "lstrip": false,
127
+ "normalized": false,
128
+ "rstrip": false,
129
+ "single_word": false,
130
+ "special": true
131
+ },
132
+ "16": {
133
+ "content": "<empty_output>",
134
+ "lstrip": false,
135
+ "normalized": false,
136
+ "rstrip": false,
137
+ "single_word": false,
138
+ "special": true
139
+ }
140
+ },
141
+ "additional_special_tokens": [
142
+ "<|im_start|>",
143
+ "<|im_end|>"
144
+ ],
145
+ "bos_token": "<|im_start|>",
146
+ "chat_template": "{% for message in messages %}{% if loop.first and messages[0]['role'] != 'system' %}{{ '<|im_start|>system\nYou are a helpful AI assistant named SmolLM, trained by Hugging Face<|im_end|>\n' }}{% endif %}{{'<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n'}}{% endfor %}{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}",
147
+ "clean_up_tokenization_spaces": false,
148
+ "eos_token": "<|im_end|>",
149
+ "model_max_length": 2048,
150
+ "pad_token": "<|im_end|>",
151
+ "tokenizer_class": "GPT2Tokenizer",
152
+ "unk_token": "<|endoftext|>",
153
+ "vocab_size": 49152
154
+ }
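
The tokenizer_config above defines the ChatML-style chat_template (<|im_start|> / <|im_end|>) that SmolLM2-1.7B-Instruct expects. A minimal usage sketch, assuming transformers and torch are installed and that chatbot/models/HuggingFaceTB/SmolLM2-1.7B-Instruct contains the files added in this commit; the example prompt is hypothetical:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_dir = "chatbot/models/HuggingFaceTB/SmolLM2-1.7B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_dir)
model = AutoModelForCausalLM.from_pretrained(model_dir)  # resolves the sharded safetensors via the index file

# Build a ChatML prompt with the chat_template defined above.
messages = [{"role": "user", "content": "Summarize the current inventory."}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=128)

# Decode only the newly generated tokens, dropping the prompt.
reply = tokenizer.decode(output_ids[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True)
print(reply)
```
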
chatbot/models/HuggingFaceTB/SmolLM2-1.7B-Instruct/vocab.json ADDED
The diff for this file is too large to render. See raw diff
 
chatbot/models/custom_model/config.json ADDED
@@ -0,0 +1,2026 @@
1
+ {
2
+ "_name_or_path": "google/vit-base-patch16-224",
3
+ "architectures": [
4
+ "ViTForImageClassification"
5
+ ],
6
+ "attention_probs_dropout_prob": 0.0,
7
+ "encoder_stride": 16,
8
+ "hidden_act": "gelu",
9
+ "hidden_dropout_prob": 0.0,
10
+ "hidden_size": 768,
11
+ "id2label": {
12
+ "0": "tench, Tinca tinca",
13
+ "1": "goldfish, Carassius auratus",
14
+ "2": "great white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias",
15
+ "3": "tiger shark, Galeocerdo cuvieri",
16
+ "4": "hammerhead, hammerhead shark",
17
+ "5": "electric ray, crampfish, numbfish, torpedo",
18
+ "6": "stingray",
19
+ "7": "cock",
20
+ "8": "hen",
21
+ "9": "ostrich, Struthio camelus",
22
+ "10": "brambling, Fringilla montifringilla",
23
+ "11": "goldfinch, Carduelis carduelis",
24
+ "12": "house finch, linnet, Carpodacus mexicanus",
25
+ "13": "junco, snowbird",
26
+ "14": "indigo bunting, indigo finch, indigo bird, Passerina cyanea",
27
+ "15": "robin, American robin, Turdus migratorius",
28
+ "16": "bulbul",
29
+ "17": "jay",
30
+ "18": "magpie",
31
+ "19": "chickadee",
32
+ "20": "water ouzel, dipper",
33
+ "21": "kite",
34
+ "22": "bald eagle, American eagle, Haliaeetus leucocephalus",
35
+ "23": "vulture",
36
+ "24": "great grey owl, great gray owl, Strix nebulosa",
37
+ "25": "European fire salamander, Salamandra salamandra",
38
+ "26": "common newt, Triturus vulgaris",
39
+ "27": "eft",
40
+ "28": "spotted salamander, Ambystoma maculatum",
41
+ "29": "axolotl, mud puppy, Ambystoma mexicanum",
42
+ "30": "bullfrog, Rana catesbeiana",
43
+ "31": "tree frog, tree-frog",
44
+ "32": "tailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui",
45
+ "33": "loggerhead, loggerhead turtle, Caretta caretta",
46
+ "34": "leatherback turtle, leatherback, leathery turtle, Dermochelys coriacea",
47
+ "35": "mud turtle",
48
+ "36": "terrapin",
49
+ "37": "box turtle, box tortoise",
50
+ "38": "banded gecko",
51
+ "39": "common iguana, iguana, Iguana iguana",
52
+ "40": "American chameleon, anole, Anolis carolinensis",
53
+ "41": "whiptail, whiptail lizard",
54
+ "42": "agama",
55
+ "43": "frilled lizard, Chlamydosaurus kingi",
56
+ "44": "alligator lizard",
57
+ "45": "Gila monster, Heloderma suspectum",
58
+ "46": "green lizard, Lacerta viridis",
59
+ "47": "African chameleon, Chamaeleo chamaeleon",
60
+ "48": "Komodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis",
61
+ "49": "African crocodile, Nile crocodile, Crocodylus niloticus",
62
+ "50": "American alligator, Alligator mississipiensis",
63
+ "51": "triceratops",
64
+ "52": "thunder snake, worm snake, Carphophis amoenus",
65
+ "53": "ringneck snake, ring-necked snake, ring snake",
66
+ "54": "hognose snake, puff adder, sand viper",
67
+ "55": "green snake, grass snake",
68
+ "56": "king snake, kingsnake",
69
+ "57": "garter snake, grass snake",
70
+ "58": "water snake",
71
+ "59": "vine snake",
72
+ "60": "night snake, Hypsiglena torquata",
73
+ "61": "boa constrictor, Constrictor constrictor",
74
+ "62": "rock python, rock snake, Python sebae",
75
+ "63": "Indian cobra, Naja naja",
76
+ "64": "green mamba",
77
+ "65": "sea snake",
78
+ "66": "horned viper, cerastes, sand viper, horned asp, Cerastes cornutus",
79
+ "67": "diamondback, diamondback rattlesnake, Crotalus adamanteus",
80
+ "68": "sidewinder, horned rattlesnake, Crotalus cerastes",
81
+ "69": "trilobite",
82
+ "70": "harvestman, daddy longlegs, Phalangium opilio",
83
+ "71": "scorpion",
84
+ "72": "black and gold garden spider, Argiope aurantia",
85
+ "73": "barn spider, Araneus cavaticus",
86
+ "74": "garden spider, Aranea diademata",
87
+ "75": "black widow, Latrodectus mactans",
88
+ "76": "tarantula",
89
+ "77": "wolf spider, hunting spider",
90
+ "78": "tick",
91
+ "79": "centipede",
92
+ "80": "black grouse",
93
+ "81": "ptarmigan",
94
+ "82": "ruffed grouse, partridge, Bonasa umbellus",
95
+ "83": "prairie chicken, prairie grouse, prairie fowl",
96
+ "84": "peacock",
97
+ "85": "quail",
98
+ "86": "partridge",
99
+ "87": "African grey, African gray, Psittacus erithacus",
100
+ "88": "macaw",
101
+ "89": "sulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita",
102
+ "90": "lorikeet",
103
+ "91": "coucal",
104
+ "92": "bee eater",
105
+ "93": "hornbill",
106
+ "94": "hummingbird",
107
+ "95": "jacamar",
108
+ "96": "toucan",
109
+ "97": "drake",
110
+ "98": "red-breasted merganser, Mergus serrator",
111
+ "99": "goose",
112
+ "100": "black swan, Cygnus atratus",
113
+ "101": "tusker",
114
+ "102": "echidna, spiny anteater, anteater",
115
+ "103": "platypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus",
116
+ "104": "wallaby, brush kangaroo",
117
+ "105": "koala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus",
118
+ "106": "wombat",
119
+ "107": "jellyfish",
120
+ "108": "sea anemone, anemone",
121
+ "109": "brain coral",
122
+ "110": "flatworm, platyhelminth",
123
+ "111": "nematode, nematode worm, roundworm",
124
+ "112": "conch",
125
+ "113": "snail",
126
+ "114": "slug",
127
+ "115": "sea slug, nudibranch",
128
+ "116": "chiton, coat-of-mail shell, sea cradle, polyplacophore",
129
+ "117": "chambered nautilus, pearly nautilus, nautilus",
130
+ "118": "Dungeness crab, Cancer magister",
131
+ "119": "rock crab, Cancer irroratus",
132
+ "120": "fiddler crab",
133
+ "121": "king crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica",
134
+ "122": "American lobster, Northern lobster, Maine lobster, Homarus americanus",
135
+ "123": "spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish",
136
+ "124": "crayfish, crawfish, crawdad, crawdaddy",
137
+ "125": "hermit crab",
138
+ "126": "isopod",
139
+ "127": "white stork, Ciconia ciconia",
140
+ "128": "black stork, Ciconia nigra",
141
+ "129": "spoonbill",
142
+ "130": "flamingo",
143
+ "131": "little blue heron, Egretta caerulea",
144
+ "132": "American egret, great white heron, Egretta albus",
145
+ "133": "bittern",
146
+ "134": "crane",
147
+ "135": "limpkin, Aramus pictus",
148
+ "136": "European gallinule, Porphyrio porphyrio",
149
+ "137": "American coot, marsh hen, mud hen, water hen, Fulica americana",
150
+ "138": "bustard",
151
+ "139": "ruddy turnstone, Arenaria interpres",
152
+ "140": "red-backed sandpiper, dunlin, Erolia alpina",
153
+ "141": "redshank, Tringa totanus",
154
+ "142": "dowitcher",
155
+ "143": "oystercatcher, oyster catcher",
156
+ "144": "pelican",
157
+ "145": "king penguin, Aptenodytes patagonica",
158
+ "146": "albatross, mollymawk",
159
+ "147": "grey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus",
160
+ "148": "killer whale, killer, orca, grampus, sea wolf, Orcinus orca",
161
+ "149": "dugong, Dugong dugon",
162
+ "150": "sea lion",
163
+ "151": "Chihuahua",
164
+ "152": "Japanese spaniel",
165
+ "153": "Maltese dog, Maltese terrier, Maltese",
166
+ "154": "Pekinese, Pekingese, Peke",
167
+ "155": "Shih-Tzu",
168
+ "156": "Blenheim spaniel",
169
+ "157": "papillon",
170
+ "158": "toy terrier",
171
+ "159": "Rhodesian ridgeback",
172
+ "160": "Afghan hound, Afghan",
173
+ "161": "basset, basset hound",
174
+ "162": "beagle",
175
+ "163": "bloodhound, sleuthhound",
176
+ "164": "bluetick",
177
+ "165": "black-and-tan coonhound",
178
+ "166": "Walker hound, Walker foxhound",
179
+ "167": "English foxhound",
180
+ "168": "redbone",
181
+ "169": "borzoi, Russian wolfhound",
182
+ "170": "Irish wolfhound",
183
+ "171": "Italian greyhound",
184
+ "172": "whippet",
185
+ "173": "Ibizan hound, Ibizan Podenco",
186
+ "174": "Norwegian elkhound, elkhound",
187
+ "175": "otterhound, otter hound",
188
+ "176": "Saluki, gazelle hound",
189
+ "177": "Scottish deerhound, deerhound",
190
+ "178": "Weimaraner",
191
+ "179": "Staffordshire bullterrier, Staffordshire bull terrier",
192
+ "180": "American Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier",
193
+ "181": "Bedlington terrier",
194
+ "182": "Border terrier",
195
+ "183": "Kerry blue terrier",
196
+ "184": "Irish terrier",
197
+ "185": "Norfolk terrier",
198
+ "186": "Norwich terrier",
199
+ "187": "Yorkshire terrier",
200
+ "188": "wire-haired fox terrier",
201
+ "189": "Lakeland terrier",
202
+ "190": "Sealyham terrier, Sealyham",
203
+ "191": "Airedale, Airedale terrier",
204
+ "192": "cairn, cairn terrier",
205
+ "193": "Australian terrier",
206
+ "194": "Dandie Dinmont, Dandie Dinmont terrier",
207
+ "195": "Boston bull, Boston terrier",
208
+ "196": "miniature schnauzer",
209
+ "197": "giant schnauzer",
210
+ "198": "standard schnauzer",
211
+ "199": "Scotch terrier, Scottish terrier, Scottie",
212
+ "200": "Tibetan terrier, chrysanthemum dog",
213
+ "201": "silky terrier, Sydney silky",
214
+ "202": "soft-coated wheaten terrier",
215
+ "203": "West Highland white terrier",
216
+ "204": "Lhasa, Lhasa apso",
217
+ "205": "flat-coated retriever",
218
+ "206": "curly-coated retriever",
219
+ "207": "golden retriever",
220
+ "208": "Labrador retriever",
221
+ "209": "Chesapeake Bay retriever",
222
+ "210": "German short-haired pointer",
223
+ "211": "vizsla, Hungarian pointer",
224
+ "212": "English setter",
225
+ "213": "Irish setter, red setter",
226
+ "214": "Gordon setter",
227
+ "215": "Brittany spaniel",
228
+ "216": "clumber, clumber spaniel",
229
+ "217": "English springer, English springer spaniel",
230
+ "218": "Welsh springer spaniel",
231
+ "219": "cocker spaniel, English cocker spaniel, cocker",
232
+ "220": "Sussex spaniel",
233
+ "221": "Irish water spaniel",
234
+ "222": "kuvasz",
235
+ "223": "schipperke",
236
+ "224": "groenendael",
237
+ "225": "malinois",
238
+ "226": "briard",
239
+ "227": "kelpie",
240
+ "228": "komondor",
241
+ "229": "Old English sheepdog, bobtail",
242
+ "230": "Shetland sheepdog, Shetland sheep dog, Shetland",
243
+ "231": "collie",
244
+ "232": "Border collie",
245
+ "233": "Bouvier des Flandres, Bouviers des Flandres",
246
+ "234": "Rottweiler",
247
+ "235": "German shepherd, German shepherd dog, German police dog, alsatian",
248
+ "236": "Doberman, Doberman pinscher",
249
+ "237": "miniature pinscher",
250
+ "238": "Greater Swiss Mountain dog",
251
+ "239": "Bernese mountain dog",
252
+ "240": "Appenzeller",
253
+ "241": "EntleBucher",
254
+ "242": "boxer",
255
+ "243": "bull mastiff",
256
+ "244": "Tibetan mastiff",
257
+ "245": "French bulldog",
258
+ "246": "Great Dane",
259
+ "247": "Saint Bernard, St Bernard",
260
+ "248": "Eskimo dog, husky",
261
+ "249": "malamute, malemute, Alaskan malamute",
262
+ "250": "Siberian husky",
263
+ "251": "dalmatian, coach dog, carriage dog",
264
+ "252": "affenpinscher, monkey pinscher, monkey dog",
265
+ "253": "basenji",
266
+ "254": "pug, pug-dog",
267
+ "255": "Leonberg",
268
+ "256": "Newfoundland, Newfoundland dog",
269
+ "257": "Great Pyrenees",
270
+ "258": "Samoyed, Samoyede",
271
+ "259": "Pomeranian",
272
+ "260": "chow, chow chow",
273
+ "261": "keeshond",
274
+ "262": "Brabancon griffon",
275
+ "263": "Pembroke, Pembroke Welsh corgi",
276
+ "264": "Cardigan, Cardigan Welsh corgi",
277
+ "265": "toy poodle",
278
+ "266": "miniature poodle",
279
+ "267": "standard poodle",
280
+ "268": "Mexican hairless",
281
+ "269": "timber wolf, grey wolf, gray wolf, Canis lupus",
282
+ "270": "white wolf, Arctic wolf, Canis lupus tundrarum",
283
+ "271": "red wolf, maned wolf, Canis rufus, Canis niger",
284
+ "272": "coyote, prairie wolf, brush wolf, Canis latrans",
285
+ "273": "dingo, warrigal, warragal, Canis dingo",
286
+ "274": "dhole, Cuon alpinus",
287
+ "275": "African hunting dog, hyena dog, Cape hunting dog, Lycaon pictus",
288
+ "276": "hyena, hyaena",
289
+ "277": "red fox, Vulpes vulpes",
290
+ "278": "kit fox, Vulpes macrotis",
291
+ "279": "Arctic fox, white fox, Alopex lagopus",
292
+ "280": "grey fox, gray fox, Urocyon cinereoargenteus",
293
+ "281": "tabby, tabby cat",
294
+ "282": "tiger cat",
295
+ "283": "Persian cat",
296
+ "284": "Siamese cat, Siamese",
297
+ "285": "Egyptian cat",
298
+ "286": "cougar, puma, catamount, mountain lion, painter, panther, Felis concolor",
299
+ "287": "lynx, catamount",
300
+ "288": "leopard, Panthera pardus",
301
+ "289": "snow leopard, ounce, Panthera uncia",
302
+ "290": "jaguar, panther, Panthera onca, Felis onca",
303
+ "291": "lion, king of beasts, Panthera leo",
304
+ "292": "tiger, Panthera tigris",
305
+ "293": "cheetah, chetah, Acinonyx jubatus",
306
+ "294": "brown bear, bruin, Ursus arctos",
307
+ "295": "American black bear, black bear, Ursus americanus, Euarctos americanus",
308
+ "296": "ice bear, polar bear, Ursus Maritimus, Thalarctos maritimus",
309
+ "297": "sloth bear, Melursus ursinus, Ursus ursinus",
310
+ "298": "mongoose",
311
+ "299": "meerkat, mierkat",
312
+ "300": "tiger beetle",
313
+ "301": "ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle",
314
+ "302": "ground beetle, carabid beetle",
315
+ "303": "long-horned beetle, longicorn, longicorn beetle",
316
+ "304": "leaf beetle, chrysomelid",
317
+ "305": "dung beetle",
318
+ "306": "rhinoceros beetle",
319
+ "307": "weevil",
320
+ "308": "fly",
321
+ "309": "bee",
322
+ "310": "ant, emmet, pismire",
323
+ "311": "grasshopper, hopper",
324
+ "312": "cricket",
325
+ "313": "walking stick, walkingstick, stick insect",
326
+ "314": "cockroach, roach",
327
+ "315": "mantis, mantid",
328
+ "316": "cicada, cicala",
329
+ "317": "leafhopper",
330
+ "318": "lacewing, lacewing fly",
331
+ "319": "dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk",
332
+ "320": "damselfly",
333
+ "321": "admiral",
334
+ "322": "ringlet, ringlet butterfly",
335
+ "323": "monarch, monarch butterfly, milkweed butterfly, Danaus plexippus",
336
+ "324": "cabbage butterfly",
337
+ "325": "sulphur butterfly, sulfur butterfly",
338
+ "326": "lycaenid, lycaenid butterfly",
339
+ "327": "starfish, sea star",
340
+ "328": "sea urchin",
341
+ "329": "sea cucumber, holothurian",
342
+ "330": "wood rabbit, cottontail, cottontail rabbit",
343
+ "331": "hare",
344
+ "332": "Angora, Angora rabbit",
345
+ "333": "hamster",
346
+ "334": "porcupine, hedgehog",
347
+ "335": "fox squirrel, eastern fox squirrel, Sciurus niger",
348
+ "336": "marmot",
349
+ "337": "beaver",
350
+ "338": "guinea pig, Cavia cobaya",
351
+ "339": "sorrel",
352
+ "340": "zebra",
353
+ "341": "hog, pig, grunter, squealer, Sus scrofa",
354
+ "342": "wild boar, boar, Sus scrofa",
355
+ "343": "warthog",
356
+ "344": "hippopotamus, hippo, river horse, Hippopotamus amphibius",
357
+ "345": "ox",
358
+ "346": "water buffalo, water ox, Asiatic buffalo, Bubalus bubalis",
359
+ "347": "bison",
360
+ "348": "ram, tup",
361
+ "349": "bighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis",
362
+ "350": "ibex, Capra ibex",
363
+ "351": "hartebeest",
364
+ "352": "impala, Aepyceros melampus",
365
+ "353": "gazelle",
366
+ "354": "Arabian camel, dromedary, Camelus dromedarius",
367
+ "355": "llama",
368
+ "356": "weasel",
369
+ "357": "mink",
370
+ "358": "polecat, fitch, foulmart, foumart, Mustela putorius",
371
+ "359": "black-footed ferret, ferret, Mustela nigripes",
372
+ "360": "otter",
373
+ "361": "skunk, polecat, wood pussy",
374
+ "362": "badger",
375
+ "363": "armadillo",
376
+ "364": "three-toed sloth, ai, Bradypus tridactylus",
377
+ "365": "orangutan, orang, orangutang, Pongo pygmaeus",
378
+ "366": "gorilla, Gorilla gorilla",
379
+ "367": "chimpanzee, chimp, Pan troglodytes",
380
+ "368": "gibbon, Hylobates lar",
381
+ "369": "siamang, Hylobates syndactylus, Symphalangus syndactylus",
382
+ "370": "guenon, guenon monkey",
383
+ "371": "patas, hussar monkey, Erythrocebus patas",
384
+ "372": "baboon",
385
+ "373": "macaque",
386
+ "374": "langur",
387
+ "375": "colobus, colobus monkey",
388
+ "376": "proboscis monkey, Nasalis larvatus",
389
+ "377": "marmoset",
390
+ "378": "capuchin, ringtail, Cebus capucinus",
391
+ "379": "howler monkey, howler",
392
+ "380": "titi, titi monkey",
393
+ "381": "spider monkey, Ateles geoffroyi",
394
+ "382": "squirrel monkey, Saimiri sciureus",
395
+ "383": "Madagascar cat, ring-tailed lemur, Lemur catta",
396
+ "384": "indri, indris, Indri indri, Indri brevicaudatus",
397
+ "385": "Indian elephant, Elephas maximus",
398
+ "386": "African elephant, Loxodonta africana",
399
+ "387": "lesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens",
400
+ "388": "giant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca",
401
+ "389": "barracouta, snoek",
402
+ "390": "eel",
403
+ "391": "coho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch",
404
+ "392": "rock beauty, Holocanthus tricolor",
405
+ "393": "anemone fish",
406
+ "394": "sturgeon",
407
+ "395": "gar, garfish, garpike, billfish, Lepisosteus osseus",
408
+ "396": "lionfish",
409
+ "397": "puffer, pufferfish, blowfish, globefish",
410
+ "398": "abacus",
411
+ "399": "abaya",
412
+ "400": "academic gown, academic robe, judge's robe",
413
+ "401": "accordion, piano accordion, squeeze box",
414
+ "402": "acoustic guitar",
415
+ "403": "aircraft carrier, carrier, flattop, attack aircraft carrier",
416
+ "404": "airliner",
417
+ "405": "airship, dirigible",
418
+ "406": "altar",
419
+ "407": "ambulance",
420
+ "408": "amphibian, amphibious vehicle",
421
+ "409": "analog clock",
422
+ "410": "apiary, bee house",
423
+ "411": "apron",
424
+ "412": "ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin",
425
+ "413": "assault rifle, assault gun",
426
+ "414": "backpack, back pack, knapsack, packsack, rucksack, haversack",
427
+ "415": "bakery, bakeshop, bakehouse",
428
+ "416": "balance beam, beam",
429
+ "417": "balloon",
430
+ "418": "ballpoint, ballpoint pen, ballpen, Biro",
431
+ "419": "Band Aid",
432
+ "420": "banjo",
433
+ "421": "bannister, banister, balustrade, balusters, handrail",
434
+ "422": "barbell",
435
+ "423": "barber chair",
436
+ "424": "barbershop",
437
+ "425": "barn",
438
+ "426": "barometer",
439
+ "427": "barrel, cask",
440
+ "428": "barrow, garden cart, lawn cart, wheelbarrow",
441
+ "429": "baseball",
442
+ "430": "basketball",
443
+ "431": "bassinet",
444
+ "432": "bassoon",
445
+ "433": "bathing cap, swimming cap",
446
+ "434": "bath towel",
447
+ "435": "bathtub, bathing tub, bath, tub",
448
+ "436": "beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon",
449
+ "437": "beacon, lighthouse, beacon light, pharos",
450
+ "438": "beaker",
451
+ "439": "bearskin, busby, shako",
452
+ "440": "beer bottle",
453
+ "441": "beer glass",
454
+ "442": "bell cote, bell cot",
455
+ "443": "bib",
456
+ "444": "bicycle-built-for-two, tandem bicycle, tandem",
457
+ "445": "bikini, two-piece",
458
+ "446": "binder, ring-binder",
459
+ "447": "binoculars, field glasses, opera glasses",
460
+ "448": "birdhouse",
461
+ "449": "boathouse",
462
+ "450": "bobsled, bobsleigh, bob",
463
+ "451": "bolo tie, bolo, bola tie, bola",
464
+ "452": "bonnet, poke bonnet",
465
+ "453": "bookcase",
466
+ "454": "bookshop, bookstore, bookstall",
467
+ "455": "bottlecap",
468
+ "456": "bow",
469
+ "457": "bow tie, bow-tie, bowtie",
470
+ "458": "brass, memorial tablet, plaque",
471
+ "459": "brassiere, bra, bandeau",
472
+ "460": "breakwater, groin, groyne, mole, bulwark, seawall, jetty",
473
+ "461": "breastplate, aegis, egis",
474
+ "462": "broom",
475
+ "463": "bucket, pail",
476
+ "464": "buckle",
477
+ "465": "bulletproof vest",
478
+ "466": "bullet train, bullet",
479
+ "467": "butcher shop, meat market",
480
+ "468": "cab, hack, taxi, taxicab",
481
+ "469": "caldron, cauldron",
482
+ "470": "candle, taper, wax light",
483
+ "471": "cannon",
484
+ "472": "canoe",
485
+ "473": "can opener, tin opener",
486
+ "474": "cardigan",
487
+ "475": "car mirror",
488
+ "476": "carousel, carrousel, merry-go-round, roundabout, whirligig",
489
+ "477": "carpenter's kit, tool kit",
490
+ "478": "carton",
491
+ "479": "car wheel",
492
+ "480": "cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM",
493
+ "481": "cassette",
494
+ "482": "cassette player",
495
+ "483": "castle",
496
+ "484": "catamaran",
497
+ "485": "CD player",
498
+ "486": "cello, violoncello",
499
+ "487": "cellular telephone, cellular phone, cellphone, cell, mobile phone",
500
+ "488": "chain",
501
+ "489": "chainlink fence",
502
+ "490": "chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour",
503
+ "491": "chain saw, chainsaw",
504
+ "492": "chest",
505
+ "493": "chiffonier, commode",
506
+ "494": "chime, bell, gong",
507
+ "495": "china cabinet, china closet",
508
+ "496": "Christmas stocking",
509
+ "497": "church, church building",
510
+ "498": "cinema, movie theater, movie theatre, movie house, picture palace",
511
+ "499": "cleaver, meat cleaver, chopper",
512
+ "500": "cliff dwelling",
513
+ "501": "cloak",
514
+ "502": "clog, geta, patten, sabot",
515
+ "503": "cocktail shaker",
516
+ "504": "coffee mug",
517
+ "505": "coffeepot",
518
+ "506": "coil, spiral, volute, whorl, helix",
519
+ "507": "combination lock",
520
+ "508": "computer keyboard, keypad",
521
+ "509": "confectionery, confectionary, candy store",
522
+ "510": "container ship, containership, container vessel",
523
+ "511": "convertible",
524
+ "512": "corkscrew, bottle screw",
525
+ "513": "cornet, horn, trumpet, trump",
526
+ "514": "cowboy boot",
527
+ "515": "cowboy hat, ten-gallon hat",
528
+ "516": "cradle",
529
+ "517": "crane",
530
+ "518": "crash helmet",
531
+ "519": "crate",
532
+ "520": "crib, cot",
533
+ "521": "Crock Pot",
534
+ "522": "croquet ball",
535
+ "523": "crutch",
536
+ "524": "cuirass",
537
+ "525": "dam, dike, dyke",
538
+ "526": "desk",
539
+ "527": "desktop computer",
540
+ "528": "dial telephone, dial phone",
541
+ "529": "diaper, nappy, napkin",
542
+ "530": "digital clock",
543
+ "531": "digital watch",
544
+ "532": "dining table, board",
545
+ "533": "dishrag, dishcloth",
546
+ "534": "dishwasher, dish washer, dishwashing machine",
547
+ "535": "disk brake, disc brake",
548
+ "536": "dock, dockage, docking facility",
549
+ "537": "dogsled, dog sled, dog sleigh",
550
+ "538": "dome",
551
+ "539": "doormat, welcome mat",
552
+ "540": "drilling platform, offshore rig",
553
+ "541": "drum, membranophone, tympan",
554
+ "542": "drumstick",
555
+ "543": "dumbbell",
556
+ "544": "Dutch oven",
557
+ "545": "electric fan, blower",
558
+ "546": "electric guitar",
559
+ "547": "electric locomotive",
560
+ "548": "entertainment center",
561
+ "549": "envelope",
562
+ "550": "espresso maker",
563
+ "551": "face powder",
564
+ "552": "feather boa, boa",
565
+ "553": "file, file cabinet, filing cabinet",
566
+ "554": "fireboat",
567
+ "555": "fire engine, fire truck",
568
+ "556": "fire screen, fireguard",
569
+ "557": "flagpole, flagstaff",
570
+ "558": "flute, transverse flute",
571
+ "559": "folding chair",
572
+ "560": "football helmet",
573
+ "561": "forklift",
574
+ "562": "fountain",
575
+ "563": "fountain pen",
576
+ "564": "four-poster",
577
+ "565": "freight car",
578
+ "566": "French horn, horn",
579
+ "567": "frying pan, frypan, skillet",
580
+ "568": "fur coat",
581
+ "569": "garbage truck, dustcart",
582
+ "570": "gasmask, respirator, gas helmet",
583
+ "571": "gas pump, gasoline pump, petrol pump, island dispenser",
584
+ "572": "goblet",
585
+ "573": "go-kart",
586
+ "574": "golf ball",
587
+ "575": "golfcart, golf cart",
588
+ "576": "gondola",
589
+ "577": "gong, tam-tam",
590
+ "578": "gown",
591
+ "579": "grand piano, grand",
592
+ "580": "greenhouse, nursery, glasshouse",
593
+ "581": "grille, radiator grille",
594
+ "582": "grocery store, grocery, food market, market",
595
+ "583": "guillotine",
596
+ "584": "hair slide",
597
+ "585": "hair spray",
598
+ "586": "half track",
599
+ "587": "hammer",
600
+ "588": "hamper",
601
+ "589": "hand blower, blow dryer, blow drier, hair dryer, hair drier",
602
+ "590": "hand-held computer, hand-held microcomputer",
603
+ "591": "handkerchief, hankie, hanky, hankey",
604
+ "592": "hard disc, hard disk, fixed disk",
605
+ "593": "harmonica, mouth organ, harp, mouth harp",
606
+ "594": "harp",
607
+ "595": "harvester, reaper",
608
+ "596": "hatchet",
609
+ "597": "holster",
610
+ "598": "home theater, home theatre",
611
+ "599": "honeycomb",
612
+ "600": "hook, claw",
613
+ "601": "hoopskirt, crinoline",
614
+ "602": "horizontal bar, high bar",
615
+ "603": "horse cart, horse-cart",
616
+ "604": "hourglass",
617
+ "605": "iPod",
618
+ "606": "iron, smoothing iron",
619
+ "607": "jack-o'-lantern",
620
+ "608": "jean, blue jean, denim",
621
+ "609": "jeep, landrover",
622
+ "610": "jersey, T-shirt, tee shirt",
623
+ "611": "jigsaw puzzle",
624
+ "612": "jinrikisha, ricksha, rickshaw",
625
+ "613": "joystick",
626
+ "614": "kimono",
627
+ "615": "knee pad",
628
+ "616": "knot",
629
+ "617": "lab coat, laboratory coat",
630
+ "618": "ladle",
631
+ "619": "lampshade, lamp shade",
632
+ "620": "laptop, laptop computer",
633
+ "621": "lawn mower, mower",
634
+ "622": "lens cap, lens cover",
635
+ "623": "letter opener, paper knife, paperknife",
636
+ "624": "library",
637
+ "625": "lifeboat",
638
+ "626": "lighter, light, igniter, ignitor",
639
+ "627": "limousine, limo",
640
+ "628": "liner, ocean liner",
641
+ "629": "lipstick, lip rouge",
642
+ "630": "Loafer",
643
+ "631": "lotion",
644
+ "632": "loudspeaker, speaker, speaker unit, loudspeaker system, speaker system",
645
+ "633": "loupe, jeweler's loupe",
646
+ "634": "lumbermill, sawmill",
647
+ "635": "magnetic compass",
648
+ "636": "mailbag, postbag",
649
+ "637": "mailbox, letter box",
650
+ "638": "maillot",
651
+ "639": "maillot, tank suit",
652
+ "640": "manhole cover",
653
+ "641": "maraca",
654
+ "642": "marimba, xylophone",
655
+ "643": "mask",
656
+ "644": "matchstick",
657
+ "645": "maypole",
658
+ "646": "maze, labyrinth",
659
+ "647": "measuring cup",
660
+ "648": "medicine chest, medicine cabinet",
661
+ "649": "megalith, megalithic structure",
662
+ "650": "microphone, mike",
663
+ "651": "microwave, microwave oven",
664
+ "652": "military uniform",
665
+ "653": "milk can",
666
+ "654": "minibus",
667
+ "655": "miniskirt, mini",
668
+ "656": "minivan",
669
+ "657": "missile",
670
+ "658": "mitten",
671
+ "659": "mixing bowl",
672
+ "660": "mobile home, manufactured home",
673
+ "661": "Model T",
674
+ "662": "modem",
675
+ "663": "monastery",
676
+ "664": "monitor",
677
+ "665": "moped",
678
+ "666": "mortar",
679
+ "667": "mortarboard",
680
+ "668": "mosque",
681
+ "669": "mosquito net",
682
+ "670": "motor scooter, scooter",
683
+ "671": "mountain bike, all-terrain bike, off-roader",
684
+ "672": "mountain tent",
685
+ "673": "mouse, computer mouse",
686
+ "674": "mousetrap",
687
+ "675": "moving van",
688
+ "676": "muzzle",
689
+ "677": "nail",
690
+ "678": "neck brace",
691
+ "679": "necklace",
692
+ "680": "nipple",
693
+ "681": "notebook, notebook computer",
694
+ "682": "obelisk",
695
+ "683": "oboe, hautboy, hautbois",
696
+ "684": "ocarina, sweet potato",
697
+ "685": "odometer, hodometer, mileometer, milometer",
698
+ "686": "oil filter",
699
+ "687": "organ, pipe organ",
700
+ "688": "oscilloscope, scope, cathode-ray oscilloscope, CRO",
701
+ "689": "overskirt",
702
+ "690": "oxcart",
703
+ "691": "oxygen mask",
704
+ "692": "packet",
705
+ "693": "paddle, boat paddle",
706
+ "694": "paddlewheel, paddle wheel",
707
+ "695": "padlock",
708
+ "696": "paintbrush",
709
+ "697": "pajama, pyjama, pj's, jammies",
710
+ "698": "palace",
711
+ "699": "panpipe, pandean pipe, syrinx",
712
+ "700": "paper towel",
713
+ "701": "parachute, chute",
714
+ "702": "parallel bars, bars",
715
+ "703": "park bench",
716
+ "704": "parking meter",
717
+ "705": "passenger car, coach, carriage",
718
+ "706": "patio, terrace",
719
+ "707": "pay-phone, pay-station",
720
+ "708": "pedestal, plinth, footstall",
721
+ "709": "pencil box, pencil case",
722
+ "710": "pencil sharpener",
723
+ "711": "perfume, essence",
724
+ "712": "Petri dish",
725
+ "713": "photocopier",
726
+ "714": "pick, plectrum, plectron",
727
+ "715": "pickelhaube",
728
+ "716": "picket fence, paling",
729
+ "717": "pickup, pickup truck",
730
+ "718": "pier",
731
+ "719": "piggy bank, penny bank",
732
+ "720": "pill bottle",
733
+ "721": "pillow",
734
+ "722": "ping-pong ball",
735
+ "723": "pinwheel",
736
+ "724": "pirate, pirate ship",
737
+ "725": "pitcher, ewer",
738
+ "726": "plane, carpenter's plane, woodworking plane",
739
+ "727": "planetarium",
740
+ "728": "plastic bag",
741
+ "729": "plate rack",
742
+ "730": "plow, plough",
743
+ "731": "plunger, plumber's helper",
744
+ "732": "Polaroid camera, Polaroid Land camera",
745
+ "733": "pole",
746
+ "734": "police van, police wagon, paddy wagon, patrol wagon, wagon, black Maria",
747
+ "735": "poncho",
748
+ "736": "pool table, billiard table, snooker table",
749
+ "737": "pop bottle, soda bottle",
750
+ "738": "pot, flowerpot",
751
+ "739": "potter's wheel",
752
+ "740": "power drill",
753
+ "741": "prayer rug, prayer mat",
754
+ "742": "printer",
755
+ "743": "prison, prison house",
756
+ "744": "projectile, missile",
757
+ "745": "projector",
758
+ "746": "puck, hockey puck",
759
+ "747": "punching bag, punch bag, punching ball, punchball",
760
+ "748": "purse",
761
+ "749": "quill, quill pen",
762
+ "750": "quilt, comforter, comfort, puff",
763
+ "751": "racer, race car, racing car",
764
+ "752": "racket, racquet",
765
+ "753": "radiator",
766
+ "754": "radio, wireless",
767
+ "755": "radio telescope, radio reflector",
768
+ "756": "rain barrel",
769
+ "757": "recreational vehicle, RV, R.V.",
770
+ "758": "reel",
771
+ "759": "reflex camera",
772
+ "760": "refrigerator, icebox",
773
+ "761": "remote control, remote",
774
+ "762": "restaurant, eating house, eating place, eatery",
775
+ "763": "revolver, six-gun, six-shooter",
776
+ "764": "rifle",
777
+ "765": "rocking chair, rocker",
778
+ "766": "rotisserie",
779
+ "767": "rubber eraser, rubber, pencil eraser",
780
+ "768": "rugby ball",
781
+ "769": "rule, ruler",
782
+ "770": "running shoe",
783
+ "771": "safe",
784
+ "772": "safety pin",
785
+ "773": "saltshaker, salt shaker",
786
+ "774": "sandal",
787
+ "775": "sarong",
788
+ "776": "sax, saxophone",
789
+ "777": "scabbard",
790
+ "778": "scale, weighing machine",
791
+ "779": "school bus",
792
+ "780": "schooner",
793
+ "781": "scoreboard",
794
+ "782": "screen, CRT screen",
795
+ "783": "screw",
796
+ "784": "screwdriver",
797
+ "785": "seat belt, seatbelt",
798
+ "786": "sewing machine",
799
+ "787": "shield, buckler",
800
+ "788": "shoe shop, shoe-shop, shoe store",
801
+ "789": "shoji",
802
+ "790": "shopping basket",
803
+ "791": "shopping cart",
804
+ "792": "shovel",
805
+ "793": "shower cap",
806
+ "794": "shower curtain",
807
+ "795": "ski",
808
+ "796": "ski mask",
809
+ "797": "sleeping bag",
810
+ "798": "slide rule, slipstick",
811
+ "799": "sliding door",
812
+ "800": "slot, one-armed bandit",
813
+ "801": "snorkel",
814
+ "802": "snowmobile",
815
+ "803": "snowplow, snowplough",
816
+ "804": "soap dispenser",
817
+ "805": "soccer ball",
818
+ "806": "sock",
819
+ "807": "solar dish, solar collector, solar furnace",
820
+ "808": "sombrero",
821
+ "809": "soup bowl",
822
+ "810": "space bar",
823
+ "811": "space heater",
824
+ "812": "space shuttle",
825
+ "813": "spatula",
826
+ "814": "speedboat",
827
+ "815": "spider web, spider's web",
828
+ "816": "spindle",
829
+ "817": "sports car, sport car",
830
+ "818": "spotlight, spot",
831
+ "819": "stage",
832
+ "820": "steam locomotive",
833
+ "821": "steel arch bridge",
834
+ "822": "steel drum",
835
+ "823": "stethoscope",
836
+ "824": "stole",
837
+ "825": "stone wall",
838
+ "826": "stopwatch, stop watch",
839
+ "827": "stove",
840
+ "828": "strainer",
841
+ "829": "streetcar, tram, tramcar, trolley, trolley car",
842
+ "830": "stretcher",
843
+ "831": "studio couch, day bed",
844
+ "832": "stupa, tope",
845
+ "833": "submarine, pigboat, sub, U-boat",
846
+ "834": "suit, suit of clothes",
847
+ "835": "sundial",
848
+ "836": "sunglass",
849
+ "837": "sunglasses, dark glasses, shades",
850
+ "838": "sunscreen, sunblock, sun blocker",
851
+ "839": "suspension bridge",
852
+ "840": "swab, swob, mop",
853
+ "841": "sweatshirt",
854
+ "842": "swimming trunks, bathing trunks",
855
+ "843": "swing",
856
+ "844": "switch, electric switch, electrical switch",
857
+ "845": "syringe",
858
+ "846": "table lamp",
859
+ "847": "tank, army tank, armored combat vehicle, armoured combat vehicle",
860
+ "848": "tape player",
861
+ "849": "teapot",
862
+ "850": "teddy, teddy bear",
863
+ "851": "television, television system",
864
+ "852": "tennis ball",
865
+ "853": "thatch, thatched roof",
866
+ "854": "theater curtain, theatre curtain",
867
+ "855": "thimble",
868
+ "856": "thresher, thrasher, threshing machine",
869
+ "857": "throne",
870
+ "858": "tile roof",
871
+ "859": "toaster",
872
+ "860": "tobacco shop, tobacconist shop, tobacconist",
873
+ "861": "toilet seat",
874
+ "862": "torch",
875
+ "863": "totem pole",
876
+ "864": "tow truck, tow car, wrecker",
877
+ "865": "toyshop",
878
+ "866": "tractor",
879
+ "867": "trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi",
880
+ "868": "tray",
881
+ "869": "trench coat",
882
+ "870": "tricycle, trike, velocipede",
883
+ "871": "trimaran",
884
+ "872": "tripod",
885
+ "873": "triumphal arch",
886
+ "874": "trolleybus, trolley coach, trackless trolley",
887
+ "875": "trombone",
888
+ "876": "tub, vat",
889
+ "877": "turnstile",
890
+ "878": "typewriter keyboard",
891
+ "879": "umbrella",
892
+ "880": "unicycle, monocycle",
893
+ "881": "upright, upright piano",
894
+ "882": "vacuum, vacuum cleaner",
895
+ "883": "vase",
896
+ "884": "vault",
897
+ "885": "velvet",
898
+ "886": "vending machine",
899
+ "887": "vestment",
900
+ "888": "viaduct",
901
+ "889": "violin, fiddle",
902
+ "890": "volleyball",
903
+ "891": "waffle iron",
904
+ "892": "wall clock",
905
+ "893": "wallet, billfold, notecase, pocketbook",
906
+ "894": "wardrobe, closet, press",
907
+ "895": "warplane, military plane",
908
+ "896": "washbasin, handbasin, washbowl, lavabo, wash-hand basin",
909
+ "897": "washer, automatic washer, washing machine",
910
+ "898": "water bottle",
911
+ "899": "water jug",
912
+ "900": "water tower",
913
+ "901": "whiskey jug",
914
+ "902": "whistle",
915
+ "903": "wig",
916
+ "904": "window screen",
917
+ "905": "window shade",
918
+ "906": "Windsor tie",
919
+ "907": "wine bottle",
920
+ "908": "wing",
921
+ "909": "wok",
922
+ "910": "wooden spoon",
923
+ "911": "wool, woolen, woollen",
924
+ "912": "worm fence, snake fence, snake-rail fence, Virginia fence",
925
+ "913": "wreck",
926
+ "914": "yawl",
927
+ "915": "yurt",
928
+ "916": "web site, website, internet site, site",
929
+ "917": "comic book",
930
+ "918": "crossword puzzle, crossword",
931
+ "919": "street sign",
932
+ "920": "traffic light, traffic signal, stoplight",
933
+ "921": "book jacket, dust cover, dust jacket, dust wrapper",
934
+ "922": "menu",
935
+ "923": "plate",
936
+ "924": "guacamole",
937
+ "925": "consomme",
938
+ "926": "hot pot, hotpot",
939
+ "927": "trifle",
940
+ "928": "ice cream, icecream",
941
+ "929": "ice lolly, lolly, lollipop, popsicle",
942
+ "930": "French loaf",
943
+ "931": "bagel, beigel",
944
+ "932": "pretzel",
945
+ "933": "cheeseburger",
946
+ "934": "hotdog, hot dog, red hot",
947
+ "935": "mashed potato",
948
+ "936": "head cabbage",
949
+ "937": "broccoli",
950
+ "938": "cauliflower",
951
+ "939": "zucchini, courgette",
952
+ "940": "spaghetti squash",
953
+ "941": "acorn squash",
954
+ "942": "butternut squash",
955
+ "943": "cucumber, cuke",
956
+ "944": "artichoke, globe artichoke",
957
+ "945": "bell pepper",
958
+ "946": "cardoon",
959
+ "947": "mushroom",
960
+ "948": "Granny Smith",
961
+ "949": "strawberry",
962
+ "950": "orange",
963
+ "951": "lemon",
964
+ "952": "fig",
965
+ "953": "pineapple, ananas",
966
+ "954": "banana",
967
+ "955": "jackfruit, jak, jack",
968
+ "956": "custard apple",
969
+ "957": "pomegranate",
970
+ "958": "hay",
971
+ "959": "carbonara",
972
+ "960": "chocolate sauce, chocolate syrup",
973
+ "961": "dough",
974
+ "962": "meat loaf, meatloaf",
975
+ "963": "pizza, pizza pie",
976
+ "964": "potpie",
977
+ "965": "burrito",
978
+ "966": "red wine",
979
+ "967": "espresso",
980
+ "968": "cup",
981
+ "969": "eggnog",
982
+ "970": "alp",
983
+ "971": "bubble",
984
+ "972": "cliff, drop, drop-off",
985
+ "973": "coral reef",
986
+ "974": "geyser",
987
+ "975": "lakeside, lakeshore",
988
+ "976": "promontory, headland, head, foreland",
989
+ "977": "sandbar, sand bar",
990
+ "978": "seashore, coast, seacoast, sea-coast",
991
+ "979": "valley, vale",
992
+ "980": "volcano",
993
+ "981": "ballplayer, baseball player",
994
+ "982": "groom, bridegroom",
995
+ "983": "scuba diver",
996
+ "984": "rapeseed",
997
+ "985": "daisy",
998
+ "986": "yellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum",
999
+ "987": "corn",
1000
+ "988": "acorn",
1001
+ "989": "hip, rose hip, rosehip",
1002
+ "990": "buckeye, horse chestnut, conker",
1003
+ "991": "coral fungus",
1004
+ "992": "agaric",
1005
+ "993": "gyromitra",
1006
+ "994": "stinkhorn, carrion fungus",
1007
+ "995": "earthstar",
1008
+ "996": "hen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa",
1009
+ "997": "bolete",
1010
+ "998": "ear, spike, capitulum",
1011
+ "999": "toilet tissue, toilet paper, bathroom tissue"
1012
+ },
+ "image_size": 224,
+ "initializer_range": 0.02,
+ "intermediate_size": 3072,
+ "label2id": {
+ "Afghan hound, Afghan": 160,
1018
+ "African chameleon, Chamaeleo chamaeleon": 47,
1019
+ "African crocodile, Nile crocodile, Crocodylus niloticus": 49,
1020
+ "African elephant, Loxodonta africana": 386,
1021
+ "African grey, African gray, Psittacus erithacus": 87,
1022
+ "African hunting dog, hyena dog, Cape hunting dog, Lycaon pictus": 275,
1023
+ "Airedale, Airedale terrier": 191,
1024
+ "American Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier": 180,
1025
+ "American alligator, Alligator mississipiensis": 50,
1026
+ "American black bear, black bear, Ursus americanus, Euarctos americanus": 295,
1027
+ "American chameleon, anole, Anolis carolinensis": 40,
1028
+ "American coot, marsh hen, mud hen, water hen, Fulica americana": 137,
1029
+ "American egret, great white heron, Egretta albus": 132,
1030
+ "American lobster, Northern lobster, Maine lobster, Homarus americanus": 122,
1031
+ "Angora, Angora rabbit": 332,
1032
+ "Appenzeller": 240,
1033
+ "Arabian camel, dromedary, Camelus dromedarius": 354,
1034
+ "Arctic fox, white fox, Alopex lagopus": 279,
1035
+ "Australian terrier": 193,
1036
+ "Band Aid": 419,
1037
+ "Bedlington terrier": 181,
1038
+ "Bernese mountain dog": 239,
1039
+ "Blenheim spaniel": 156,
1040
+ "Border collie": 232,
1041
+ "Border terrier": 182,
1042
+ "Boston bull, Boston terrier": 195,
1043
+ "Bouvier des Flandres, Bouviers des Flandres": 233,
1044
+ "Brabancon griffon": 262,
1045
+ "Brittany spaniel": 215,
1046
+ "CD player": 485,
1047
+ "Cardigan, Cardigan Welsh corgi": 264,
1048
+ "Chesapeake Bay retriever": 209,
1049
+ "Chihuahua": 151,
1050
+ "Christmas stocking": 496,
1051
+ "Crock Pot": 521,
1052
+ "Dandie Dinmont, Dandie Dinmont terrier": 194,
1053
+ "Doberman, Doberman pinscher": 236,
1054
+ "Dungeness crab, Cancer magister": 118,
1055
+ "Dutch oven": 544,
1056
+ "Egyptian cat": 285,
1057
+ "English foxhound": 167,
1058
+ "English setter": 212,
1059
+ "English springer, English springer spaniel": 217,
1060
+ "EntleBucher": 241,
1061
+ "Eskimo dog, husky": 248,
1062
+ "European fire salamander, Salamandra salamandra": 25,
1063
+ "European gallinule, Porphyrio porphyrio": 136,
1064
+ "French bulldog": 245,
1065
+ "French horn, horn": 566,
1066
+ "French loaf": 930,
1067
+ "German shepherd, German shepherd dog, German police dog, alsatian": 235,
1068
+ "German short-haired pointer": 210,
1069
+ "Gila monster, Heloderma suspectum": 45,
1070
+ "Gordon setter": 214,
1071
+ "Granny Smith": 948,
1072
+ "Great Dane": 246,
1073
+ "Great Pyrenees": 257,
1074
+ "Greater Swiss Mountain dog": 238,
1075
+ "Ibizan hound, Ibizan Podenco": 173,
1076
+ "Indian cobra, Naja naja": 63,
1077
+ "Indian elephant, Elephas maximus": 385,
1078
+ "Irish setter, red setter": 213,
1079
+ "Irish terrier": 184,
1080
+ "Irish water spaniel": 221,
1081
+ "Irish wolfhound": 170,
1082
+ "Italian greyhound": 171,
1083
+ "Japanese spaniel": 152,
1084
+ "Kerry blue terrier": 183,
1085
+ "Komodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis": 48,
1086
+ "Labrador retriever": 208,
1087
+ "Lakeland terrier": 189,
1088
+ "Leonberg": 255,
1089
+ "Lhasa, Lhasa apso": 204,
1090
+ "Loafer": 630,
1091
+ "Madagascar cat, ring-tailed lemur, Lemur catta": 383,
1092
+ "Maltese dog, Maltese terrier, Maltese": 153,
1093
+ "Mexican hairless": 268,
1094
+ "Model T": 661,
1095
+ "Newfoundland, Newfoundland dog": 256,
1096
+ "Norfolk terrier": 185,
1097
+ "Norwegian elkhound, elkhound": 174,
1098
+ "Norwich terrier": 186,
1099
+ "Old English sheepdog, bobtail": 229,
1100
+ "Pekinese, Pekingese, Peke": 154,
1101
+ "Pembroke, Pembroke Welsh corgi": 263,
1102
+ "Persian cat": 283,
1103
+ "Petri dish": 712,
1104
+ "Polaroid camera, Polaroid Land camera": 732,
1105
+ "Pomeranian": 259,
1106
+ "Rhodesian ridgeback": 159,
1107
+ "Rottweiler": 234,
1108
+ "Saint Bernard, St Bernard": 247,
1109
+ "Saluki, gazelle hound": 176,
1110
+ "Samoyed, Samoyede": 258,
1111
+ "Scotch terrier, Scottish terrier, Scottie": 199,
1112
+ "Scottish deerhound, deerhound": 177,
1113
+ "Sealyham terrier, Sealyham": 190,
1114
+ "Shetland sheepdog, Shetland sheep dog, Shetland": 230,
1115
+ "Shih-Tzu": 155,
1116
+ "Siamese cat, Siamese": 284,
1117
+ "Siberian husky": 250,
1118
+ "Staffordshire bullterrier, Staffordshire bull terrier": 179,
1119
+ "Sussex spaniel": 220,
1120
+ "Tibetan mastiff": 244,
1121
+ "Tibetan terrier, chrysanthemum dog": 200,
1122
+ "Walker hound, Walker foxhound": 166,
1123
+ "Weimaraner": 178,
1124
+ "Welsh springer spaniel": 218,
1125
+ "West Highland white terrier": 203,
1126
+ "Windsor tie": 906,
1127
+ "Yorkshire terrier": 187,
1128
+ "abacus": 398,
1129
+ "abaya": 399,
1130
+ "academic gown, academic robe, judge's robe": 400,
1131
+ "accordion, piano accordion, squeeze box": 401,
1132
+ "acorn": 988,
1133
+ "acorn squash": 941,
1134
+ "acoustic guitar": 402,
1135
+ "admiral": 321,
1136
+ "affenpinscher, monkey pinscher, monkey dog": 252,
1137
+ "agama": 42,
1138
+ "agaric": 992,
1139
+ "aircraft carrier, carrier, flattop, attack aircraft carrier": 403,
1140
+ "airliner": 404,
1141
+ "airship, dirigible": 405,
1142
+ "albatross, mollymawk": 146,
1143
+ "alligator lizard": 44,
1144
+ "alp": 970,
1145
+ "altar": 406,
1146
+ "ambulance": 407,
1147
+ "amphibian, amphibious vehicle": 408,
1148
+ "analog clock": 409,
1149
+ "anemone fish": 393,
1150
+ "ant, emmet, pismire": 310,
1151
+ "apiary, bee house": 410,
1152
+ "apron": 411,
1153
+ "armadillo": 363,
1154
+ "artichoke, globe artichoke": 944,
1155
+ "ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin": 412,
1156
+ "assault rifle, assault gun": 413,
1157
+ "axolotl, mud puppy, Ambystoma mexicanum": 29,
1158
+ "baboon": 372,
1159
+ "backpack, back pack, knapsack, packsack, rucksack, haversack": 414,
1160
+ "badger": 362,
1161
+ "bagel, beigel": 931,
1162
+ "bakery, bakeshop, bakehouse": 415,
1163
+ "balance beam, beam": 416,
1164
+ "bald eagle, American eagle, Haliaeetus leucocephalus": 22,
1165
+ "balloon": 417,
1166
+ "ballplayer, baseball player": 981,
1167
+ "ballpoint, ballpoint pen, ballpen, Biro": 418,
1168
+ "banana": 954,
1169
+ "banded gecko": 38,
1170
+ "banjo": 420,
1171
+ "bannister, banister, balustrade, balusters, handrail": 421,
1172
+ "barbell": 422,
1173
+ "barber chair": 423,
1174
+ "barbershop": 424,
1175
+ "barn": 425,
1176
+ "barn spider, Araneus cavaticus": 73,
1177
+ "barometer": 426,
1178
+ "barracouta, snoek": 389,
1179
+ "barrel, cask": 427,
1180
+ "barrow, garden cart, lawn cart, wheelbarrow": 428,
1181
+ "baseball": 429,
1182
+ "basenji": 253,
1183
+ "basketball": 430,
1184
+ "basset, basset hound": 161,
1185
+ "bassinet": 431,
1186
+ "bassoon": 432,
1187
+ "bath towel": 434,
1188
+ "bathing cap, swimming cap": 433,
1189
+ "bathtub, bathing tub, bath, tub": 435,
1190
+ "beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon": 436,
1191
+ "beacon, lighthouse, beacon light, pharos": 437,
1192
+ "beagle": 162,
1193
+ "beaker": 438,
1194
+ "bearskin, busby, shako": 439,
1195
+ "beaver": 337,
1196
+ "bee": 309,
1197
+ "bee eater": 92,
1198
+ "beer bottle": 440,
1199
+ "beer glass": 441,
1200
+ "bell cote, bell cot": 442,
1201
+ "bell pepper": 945,
1202
+ "bib": 443,
1203
+ "bicycle-built-for-two, tandem bicycle, tandem": 444,
1204
+ "bighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis": 349,
1205
+ "bikini, two-piece": 445,
1206
+ "binder, ring-binder": 446,
1207
+ "binoculars, field glasses, opera glasses": 447,
1208
+ "birdhouse": 448,
1209
+ "bison": 347,
1210
+ "bittern": 133,
1211
+ "black and gold garden spider, Argiope aurantia": 72,
1212
+ "black grouse": 80,
1213
+ "black stork, Ciconia nigra": 128,
1214
+ "black swan, Cygnus atratus": 100,
1215
+ "black widow, Latrodectus mactans": 75,
1216
+ "black-and-tan coonhound": 165,
1217
+ "black-footed ferret, ferret, Mustela nigripes": 359,
1218
+ "bloodhound, sleuthhound": 163,
1219
+ "bluetick": 164,
1220
+ "boa constrictor, Constrictor constrictor": 61,
1221
+ "boathouse": 449,
1222
+ "bobsled, bobsleigh, bob": 450,
1223
+ "bolete": 997,
1224
+ "bolo tie, bolo, bola tie, bola": 451,
1225
+ "bonnet, poke bonnet": 452,
1226
+ "book jacket, dust cover, dust jacket, dust wrapper": 921,
1227
+ "bookcase": 453,
1228
+ "bookshop, bookstore, bookstall": 454,
1229
+ "borzoi, Russian wolfhound": 169,
1230
+ "bottlecap": 455,
1231
+ "bow": 456,
1232
+ "bow tie, bow-tie, bowtie": 457,
1233
+ "box turtle, box tortoise": 37,
1234
+ "boxer": 242,
1235
+ "brain coral": 109,
1236
+ "brambling, Fringilla montifringilla": 10,
1237
+ "brass, memorial tablet, plaque": 458,
1238
+ "brassiere, bra, bandeau": 459,
1239
+ "breakwater, groin, groyne, mole, bulwark, seawall, jetty": 460,
1240
+ "breastplate, aegis, egis": 461,
1241
+ "briard": 226,
1242
+ "broccoli": 937,
1243
+ "broom": 462,
1244
+ "brown bear, bruin, Ursus arctos": 294,
1245
+ "bubble": 971,
1246
+ "bucket, pail": 463,
1247
+ "buckeye, horse chestnut, conker": 990,
1248
+ "buckle": 464,
1249
+ "bulbul": 16,
1250
+ "bull mastiff": 243,
1251
+ "bullet train, bullet": 466,
1252
+ "bulletproof vest": 465,
1253
+ "bullfrog, Rana catesbeiana": 30,
1254
+ "burrito": 965,
1255
+ "bustard": 138,
1256
+ "butcher shop, meat market": 467,
1257
+ "butternut squash": 942,
1258
+ "cab, hack, taxi, taxicab": 468,
1259
+ "cabbage butterfly": 324,
1260
+ "cairn, cairn terrier": 192,
1261
+ "caldron, cauldron": 469,
1262
+ "can opener, tin opener": 473,
1263
+ "candle, taper, wax light": 470,
1264
+ "cannon": 471,
1265
+ "canoe": 472,
1266
+ "capuchin, ringtail, Cebus capucinus": 378,
1267
+ "car mirror": 475,
1268
+ "car wheel": 479,
1269
+ "carbonara": 959,
1270
+ "cardigan": 474,
1271
+ "cardoon": 946,
1272
+ "carousel, carrousel, merry-go-round, roundabout, whirligig": 476,
1273
+ "carpenter's kit, tool kit": 477,
1274
+ "carton": 478,
1275
+ "cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM": 480,
1276
+ "cassette": 481,
1277
+ "cassette player": 482,
1278
+ "castle": 483,
1279
+ "catamaran": 484,
1280
+ "cauliflower": 938,
1281
+ "cello, violoncello": 486,
1282
+ "cellular telephone, cellular phone, cellphone, cell, mobile phone": 487,
1283
+ "centipede": 79,
1284
+ "chain": 488,
1285
+ "chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour": 490,
1286
+ "chain saw, chainsaw": 491,
1287
+ "chainlink fence": 489,
1288
+ "chambered nautilus, pearly nautilus, nautilus": 117,
1289
+ "cheeseburger": 933,
1290
+ "cheetah, chetah, Acinonyx jubatus": 293,
1291
+ "chest": 492,
1292
+ "chickadee": 19,
1293
+ "chiffonier, commode": 493,
1294
+ "chime, bell, gong": 494,
1295
+ "chimpanzee, chimp, Pan troglodytes": 367,
1296
+ "china cabinet, china closet": 495,
1297
+ "chiton, coat-of-mail shell, sea cradle, polyplacophore": 116,
1298
+ "chocolate sauce, chocolate syrup": 960,
1299
+ "chow, chow chow": 260,
1300
+ "church, church building": 497,
1301
+ "cicada, cicala": 316,
1302
+ "cinema, movie theater, movie theatre, movie house, picture palace": 498,
1303
+ "cleaver, meat cleaver, chopper": 499,
1304
+ "cliff dwelling": 500,
1305
+ "cliff, drop, drop-off": 972,
1306
+ "cloak": 501,
1307
+ "clog, geta, patten, sabot": 502,
1308
+ "clumber, clumber spaniel": 216,
1309
+ "cock": 7,
1310
+ "cocker spaniel, English cocker spaniel, cocker": 219,
1311
+ "cockroach, roach": 314,
1312
+ "cocktail shaker": 503,
1313
+ "coffee mug": 504,
1314
+ "coffeepot": 505,
1315
+ "coho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch": 391,
1316
+ "coil, spiral, volute, whorl, helix": 506,
1317
+ "collie": 231,
1318
+ "colobus, colobus monkey": 375,
1319
+ "combination lock": 507,
1320
+ "comic book": 917,
1321
+ "common iguana, iguana, Iguana iguana": 39,
1322
+ "common newt, Triturus vulgaris": 26,
1323
+ "computer keyboard, keypad": 508,
1324
+ "conch": 112,
1325
+ "confectionery, confectionary, candy store": 509,
1326
+ "consomme": 925,
1327
+ "container ship, containership, container vessel": 510,
1328
+ "convertible": 511,
1329
+ "coral fungus": 991,
1330
+ "coral reef": 973,
1331
+ "corkscrew, bottle screw": 512,
1332
+ "corn": 987,
1333
+ "cornet, horn, trumpet, trump": 513,
1334
+ "coucal": 91,
1335
+ "cougar, puma, catamount, mountain lion, painter, panther, Felis concolor": 286,
1336
+ "cowboy boot": 514,
1337
+ "cowboy hat, ten-gallon hat": 515,
1338
+ "coyote, prairie wolf, brush wolf, Canis latrans": 272,
1339
+ "cradle": 516,
1340
+ "crane": 517,
1341
+ "crash helmet": 518,
1342
+ "crate": 519,
1343
+ "crayfish, crawfish, crawdad, crawdaddy": 124,
1344
+ "crib, cot": 520,
1345
+ "cricket": 312,
1346
+ "croquet ball": 522,
1347
+ "crossword puzzle, crossword": 918,
1348
+ "crutch": 523,
1349
+ "cucumber, cuke": 943,
1350
+ "cuirass": 524,
1351
+ "cup": 968,
1352
+ "curly-coated retriever": 206,
1353
+ "custard apple": 956,
1354
+ "daisy": 985,
1355
+ "dalmatian, coach dog, carriage dog": 251,
1356
+ "dam, dike, dyke": 525,
1357
+ "damselfly": 320,
1358
+ "desk": 526,
1359
+ "desktop computer": 527,
1360
+ "dhole, Cuon alpinus": 274,
1361
+ "dial telephone, dial phone": 528,
1362
+ "diamondback, diamondback rattlesnake, Crotalus adamanteus": 67,
1363
+ "diaper, nappy, napkin": 529,
1364
+ "digital clock": 530,
1365
+ "digital watch": 531,
1366
+ "dingo, warrigal, warragal, Canis dingo": 273,
1367
+ "dining table, board": 532,
1368
+ "dishrag, dishcloth": 533,
1369
+ "dishwasher, dish washer, dishwashing machine": 534,
1370
+ "disk brake, disc brake": 535,
1371
+ "dock, dockage, docking facility": 536,
1372
+ "dogsled, dog sled, dog sleigh": 537,
1373
+ "dome": 538,
1374
+ "doormat, welcome mat": 539,
1375
+ "dough": 961,
1376
+ "dowitcher": 142,
1377
+ "dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk": 319,
1378
+ "drake": 97,
1379
+ "drilling platform, offshore rig": 540,
1380
+ "drum, membranophone, tympan": 541,
1381
+ "drumstick": 542,
1382
+ "dugong, Dugong dugon": 149,
1383
+ "dumbbell": 543,
1384
+ "dung beetle": 305,
1385
+ "ear, spike, capitulum": 998,
1386
+ "earthstar": 995,
1387
+ "echidna, spiny anteater, anteater": 102,
1388
+ "eel": 390,
1389
+ "eft": 27,
1390
+ "eggnog": 969,
1391
+ "electric fan, blower": 545,
1392
+ "electric guitar": 546,
1393
+ "electric locomotive": 547,
1394
+ "electric ray, crampfish, numbfish, torpedo": 5,
1395
+ "entertainment center": 548,
1396
+ "envelope": 549,
1397
+ "espresso": 967,
1398
+ "espresso maker": 550,
1399
+ "face powder": 551,
1400
+ "feather boa, boa": 552,
1401
+ "fiddler crab": 120,
1402
+ "fig": 952,
1403
+ "file, file cabinet, filing cabinet": 553,
1404
+ "fire engine, fire truck": 555,
1405
+ "fire screen, fireguard": 556,
1406
+ "fireboat": 554,
1407
+ "flagpole, flagstaff": 557,
1408
+ "flamingo": 130,
1409
+ "flat-coated retriever": 205,
1410
+ "flatworm, platyhelminth": 110,
1411
+ "flute, transverse flute": 558,
1412
+ "fly": 308,
1413
+ "folding chair": 559,
1414
+ "football helmet": 560,
1415
+ "forklift": 561,
1416
+ "fountain": 562,
1417
+ "fountain pen": 563,
1418
+ "four-poster": 564,
1419
+ "fox squirrel, eastern fox squirrel, Sciurus niger": 335,
1420
+ "freight car": 565,
1421
+ "frilled lizard, Chlamydosaurus kingi": 43,
1422
+ "frying pan, frypan, skillet": 567,
1423
+ "fur coat": 568,
1424
+ "gar, garfish, garpike, billfish, Lepisosteus osseus": 395,
1425
+ "garbage truck, dustcart": 569,
1426
+ "garden spider, Aranea diademata": 74,
1427
+ "garter snake, grass snake": 57,
1428
+ "gas pump, gasoline pump, petrol pump, island dispenser": 571,
1429
+ "gasmask, respirator, gas helmet": 570,
1430
+ "gazelle": 353,
1431
+ "geyser": 974,
1432
+ "giant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca": 388,
1433
+ "giant schnauzer": 197,
1434
+ "gibbon, Hylobates lar": 368,
1435
+ "go-kart": 573,
1436
+ "goblet": 572,
1437
+ "golden retriever": 207,
1438
+ "goldfinch, Carduelis carduelis": 11,
1439
+ "goldfish, Carassius auratus": 1,
1440
+ "golf ball": 574,
1441
+ "golfcart, golf cart": 575,
1442
+ "gondola": 576,
1443
+ "gong, tam-tam": 577,
1444
+ "goose": 99,
1445
+ "gorilla, Gorilla gorilla": 366,
1446
+ "gown": 578,
1447
+ "grand piano, grand": 579,
1448
+ "grasshopper, hopper": 311,
1449
+ "great grey owl, great gray owl, Strix nebulosa": 24,
1450
+ "great white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias": 2,
1451
+ "green lizard, Lacerta viridis": 46,
1452
+ "green mamba": 64,
1453
+ "green snake, grass snake": 55,
1454
+ "greenhouse, nursery, glasshouse": 580,
1455
+ "grey fox, gray fox, Urocyon cinereoargenteus": 280,
1456
+ "grey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus": 147,
1457
+ "grille, radiator grille": 581,
1458
+ "grocery store, grocery, food market, market": 582,
1459
+ "groenendael": 224,
1460
+ "groom, bridegroom": 982,
1461
+ "ground beetle, carabid beetle": 302,
1462
+ "guacamole": 924,
1463
+ "guenon, guenon monkey": 370,
1464
+ "guillotine": 583,
1465
+ "guinea pig, Cavia cobaya": 338,
1466
+ "gyromitra": 993,
1467
+ "hair slide": 584,
1468
+ "hair spray": 585,
1469
+ "half track": 586,
1470
+ "hammer": 587,
1471
+ "hammerhead, hammerhead shark": 4,
1472
+ "hamper": 588,
1473
+ "hamster": 333,
1474
+ "hand blower, blow dryer, blow drier, hair dryer, hair drier": 589,
1475
+ "hand-held computer, hand-held microcomputer": 590,
1476
+ "handkerchief, hankie, hanky, hankey": 591,
1477
+ "hard disc, hard disk, fixed disk": 592,
1478
+ "hare": 331,
1479
+ "harmonica, mouth organ, harp, mouth harp": 593,
1480
+ "harp": 594,
1481
+ "hartebeest": 351,
1482
+ "harvester, reaper": 595,
1483
+ "harvestman, daddy longlegs, Phalangium opilio": 70,
1484
+ "hatchet": 596,
1485
+ "hay": 958,
1486
+ "head cabbage": 936,
1487
+ "hen": 8,
1488
+ "hen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa": 996,
1489
+ "hermit crab": 125,
1490
+ "hip, rose hip, rosehip": 989,
1491
+ "hippopotamus, hippo, river horse, Hippopotamus amphibius": 344,
1492
+ "hog, pig, grunter, squealer, Sus scrofa": 341,
1493
+ "hognose snake, puff adder, sand viper": 54,
1494
+ "holster": 597,
1495
+ "home theater, home theatre": 598,
1496
+ "honeycomb": 599,
1497
+ "hook, claw": 600,
1498
+ "hoopskirt, crinoline": 601,
1499
+ "horizontal bar, high bar": 602,
1500
+ "hornbill": 93,
1501
+ "horned viper, cerastes, sand viper, horned asp, Cerastes cornutus": 66,
1502
+ "horse cart, horse-cart": 603,
1503
+ "hot pot, hotpot": 926,
1504
+ "hotdog, hot dog, red hot": 934,
1505
+ "hourglass": 604,
1506
+ "house finch, linnet, Carpodacus mexicanus": 12,
1507
+ "howler monkey, howler": 379,
1508
+ "hummingbird": 94,
1509
+ "hyena, hyaena": 276,
1510
+ "iPod": 605,
1511
+ "ibex, Capra ibex": 350,
1512
+ "ice bear, polar bear, Ursus Maritimus, Thalarctos maritimus": 296,
1513
+ "ice cream, icecream": 928,
1514
+ "ice lolly, lolly, lollipop, popsicle": 929,
1515
+ "impala, Aepyceros melampus": 352,
1516
+ "indigo bunting, indigo finch, indigo bird, Passerina cyanea": 14,
1517
+ "indri, indris, Indri indri, Indri brevicaudatus": 384,
1518
+ "iron, smoothing iron": 606,
1519
+ "isopod": 126,
1520
+ "jacamar": 95,
1521
+ "jack-o'-lantern": 607,
1522
+ "jackfruit, jak, jack": 955,
1523
+ "jaguar, panther, Panthera onca, Felis onca": 290,
1524
+ "jay": 17,
1525
+ "jean, blue jean, denim": 608,
1526
+ "jeep, landrover": 609,
1527
+ "jellyfish": 107,
1528
+ "jersey, T-shirt, tee shirt": 610,
1529
+ "jigsaw puzzle": 611,
1530
+ "jinrikisha, ricksha, rickshaw": 612,
1531
+ "joystick": 613,
1532
+ "junco, snowbird": 13,
1533
+ "keeshond": 261,
1534
+ "kelpie": 227,
1535
+ "killer whale, killer, orca, grampus, sea wolf, Orcinus orca": 148,
1536
+ "kimono": 614,
1537
+ "king crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica": 121,
1538
+ "king penguin, Aptenodytes patagonica": 145,
1539
+ "king snake, kingsnake": 56,
1540
+ "kit fox, Vulpes macrotis": 278,
1541
+ "kite": 21,
1542
+ "knee pad": 615,
1543
+ "knot": 616,
1544
+ "koala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus": 105,
1545
+ "komondor": 228,
1546
+ "kuvasz": 222,
1547
+ "lab coat, laboratory coat": 617,
1548
+ "lacewing, lacewing fly": 318,
1549
+ "ladle": 618,
1550
+ "ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle": 301,
1551
+ "lakeside, lakeshore": 975,
1552
+ "lampshade, lamp shade": 619,
1553
+ "langur": 374,
1554
+ "laptop, laptop computer": 620,
1555
+ "lawn mower, mower": 621,
1556
+ "leaf beetle, chrysomelid": 304,
1557
+ "leafhopper": 317,
1558
+ "leatherback turtle, leatherback, leathery turtle, Dermochelys coriacea": 34,
1559
+ "lemon": 951,
1560
+ "lens cap, lens cover": 622,
1561
+ "leopard, Panthera pardus": 288,
1562
+ "lesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens": 387,
1563
+ "letter opener, paper knife, paperknife": 623,
1564
+ "library": 624,
1565
+ "lifeboat": 625,
1566
+ "lighter, light, igniter, ignitor": 626,
1567
+ "limousine, limo": 627,
1568
+ "limpkin, Aramus pictus": 135,
1569
+ "liner, ocean liner": 628,
1570
+ "lion, king of beasts, Panthera leo": 291,
1571
+ "lionfish": 396,
1572
+ "lipstick, lip rouge": 629,
1573
+ "little blue heron, Egretta caerulea": 131,
1574
+ "llama": 355,
1575
+ "loggerhead, loggerhead turtle, Caretta caretta": 33,
1576
+ "long-horned beetle, longicorn, longicorn beetle": 303,
1577
+ "lorikeet": 90,
1578
+ "lotion": 631,
1579
+ "loudspeaker, speaker, speaker unit, loudspeaker system, speaker system": 632,
1580
+ "loupe, jeweler's loupe": 633,
1581
+ "lumbermill, sawmill": 634,
1582
+ "lycaenid, lycaenid butterfly": 326,
1583
+ "lynx, catamount": 287,
1584
+ "macaque": 373,
1585
+ "macaw": 88,
1586
+ "magnetic compass": 635,
1587
+ "magpie": 18,
1588
+ "mailbag, postbag": 636,
1589
+ "mailbox, letter box": 637,
1590
+ "maillot": 638,
1591
+ "maillot, tank suit": 639,
1592
+ "malamute, malemute, Alaskan malamute": 249,
1593
+ "malinois": 225,
1594
+ "manhole cover": 640,
1595
+ "mantis, mantid": 315,
1596
+ "maraca": 641,
1597
+ "marimba, xylophone": 642,
1598
+ "marmoset": 377,
1599
+ "marmot": 336,
1600
+ "mashed potato": 935,
1601
+ "mask": 643,
1602
+ "matchstick": 644,
1603
+ "maypole": 645,
1604
+ "maze, labyrinth": 646,
1605
+ "measuring cup": 647,
1606
+ "meat loaf, meatloaf": 962,
1607
+ "medicine chest, medicine cabinet": 648,
1608
+ "meerkat, mierkat": 299,
1609
+ "megalith, megalithic structure": 649,
1610
+ "menu": 922,
1611
+ "microphone, mike": 650,
1612
+ "microwave, microwave oven": 651,
1613
+ "military uniform": 652,
1614
+ "milk can": 653,
1615
+ "miniature pinscher": 237,
1616
+ "miniature poodle": 266,
1617
+ "miniature schnauzer": 196,
1618
+ "minibus": 654,
1619
+ "miniskirt, mini": 655,
1620
+ "minivan": 656,
1621
+ "mink": 357,
1622
+ "missile": 657,
1623
+ "mitten": 658,
1624
+ "mixing bowl": 659,
1625
+ "mobile home, manufactured home": 660,
1626
+ "modem": 662,
1627
+ "monarch, monarch butterfly, milkweed butterfly, Danaus plexippus": 323,
1628
+ "monastery": 663,
1629
+ "mongoose": 298,
1630
+ "monitor": 664,
1631
+ "moped": 665,
1632
+ "mortar": 666,
1633
+ "mortarboard": 667,
1634
+ "mosque": 668,
1635
+ "mosquito net": 669,
1636
+ "motor scooter, scooter": 670,
1637
+ "mountain bike, all-terrain bike, off-roader": 671,
1638
+ "mountain tent": 672,
1639
+ "mouse, computer mouse": 673,
1640
+ "mousetrap": 674,
1641
+ "moving van": 675,
1642
+ "mud turtle": 35,
1643
+ "mushroom": 947,
1644
+ "muzzle": 676,
1645
+ "nail": 677,
1646
+ "neck brace": 678,
1647
+ "necklace": 679,
1648
+ "nematode, nematode worm, roundworm": 111,
1649
+ "night snake, Hypsiglena torquata": 60,
1650
+ "nipple": 680,
1651
+ "notebook, notebook computer": 681,
1652
+ "obelisk": 682,
1653
+ "oboe, hautboy, hautbois": 683,
1654
+ "ocarina, sweet potato": 684,
1655
+ "odometer, hodometer, mileometer, milometer": 685,
1656
+ "oil filter": 686,
1657
+ "orange": 950,
1658
+ "orangutan, orang, orangutang, Pongo pygmaeus": 365,
1659
+ "organ, pipe organ": 687,
1660
+ "oscilloscope, scope, cathode-ray oscilloscope, CRO": 688,
1661
+ "ostrich, Struthio camelus": 9,
1662
+ "otter": 360,
1663
+ "otterhound, otter hound": 175,
1664
+ "overskirt": 689,
1665
+ "ox": 345,
1666
+ "oxcart": 690,
1667
+ "oxygen mask": 691,
1668
+ "oystercatcher, oyster catcher": 143,
1669
+ "packet": 692,
1670
+ "paddle, boat paddle": 693,
1671
+ "paddlewheel, paddle wheel": 694,
1672
+ "padlock": 695,
1673
+ "paintbrush": 696,
1674
+ "pajama, pyjama, pj's, jammies": 697,
1675
+ "palace": 698,
1676
+ "panpipe, pandean pipe, syrinx": 699,
1677
+ "paper towel": 700,
1678
+ "papillon": 157,
1679
+ "parachute, chute": 701,
1680
+ "parallel bars, bars": 702,
1681
+ "park bench": 703,
1682
+ "parking meter": 704,
1683
+ "partridge": 86,
1684
+ "passenger car, coach, carriage": 705,
1685
+ "patas, hussar monkey, Erythrocebus patas": 371,
1686
+ "patio, terrace": 706,
1687
+ "pay-phone, pay-station": 707,
1688
+ "peacock": 84,
1689
+ "pedestal, plinth, footstall": 708,
1690
+ "pelican": 144,
1691
+ "pencil box, pencil case": 709,
1692
+ "pencil sharpener": 710,
1693
+ "perfume, essence": 711,
1694
+ "photocopier": 713,
1695
+ "pick, plectrum, plectron": 714,
1696
+ "pickelhaube": 715,
1697
+ "picket fence, paling": 716,
1698
+ "pickup, pickup truck": 717,
1699
+ "pier": 718,
1700
+ "piggy bank, penny bank": 719,
1701
+ "pill bottle": 720,
1702
+ "pillow": 721,
1703
+ "pineapple, ananas": 953,
1704
+ "ping-pong ball": 722,
1705
+ "pinwheel": 723,
1706
+ "pirate, pirate ship": 724,
1707
+ "pitcher, ewer": 725,
1708
+ "pizza, pizza pie": 963,
1709
+ "plane, carpenter's plane, woodworking plane": 726,
1710
+ "planetarium": 727,
1711
+ "plastic bag": 728,
1712
+ "plate": 923,
1713
+ "plate rack": 729,
1714
+ "platypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus": 103,
1715
+ "plow, plough": 730,
1716
+ "plunger, plumber's helper": 731,
1717
+ "pole": 733,
1718
+ "polecat, fitch, foulmart, foumart, Mustela putorius": 358,
1719
+ "police van, police wagon, paddy wagon, patrol wagon, wagon, black Maria": 734,
1720
+ "pomegranate": 957,
1721
+ "poncho": 735,
1722
+ "pool table, billiard table, snooker table": 736,
1723
+ "pop bottle, soda bottle": 737,
1724
+ "porcupine, hedgehog": 334,
1725
+ "pot, flowerpot": 738,
1726
+ "potpie": 964,
1727
+ "potter's wheel": 739,
1728
+ "power drill": 740,
1729
+ "prairie chicken, prairie grouse, prairie fowl": 83,
1730
+ "prayer rug, prayer mat": 741,
1731
+ "pretzel": 932,
1732
+ "printer": 742,
1733
+ "prison, prison house": 743,
1734
+ "proboscis monkey, Nasalis larvatus": 376,
1735
+ "projectile, missile": 744,
1736
+ "projector": 745,
1737
+ "promontory, headland, head, foreland": 976,
1738
+ "ptarmigan": 81,
1739
+ "puck, hockey puck": 746,
1740
+ "puffer, pufferfish, blowfish, globefish": 397,
1741
+ "pug, pug-dog": 254,
1742
+ "punching bag, punch bag, punching ball, punchball": 747,
1743
+ "purse": 748,
1744
+ "quail": 85,
1745
+ "quill, quill pen": 749,
1746
+ "quilt, comforter, comfort, puff": 750,
1747
+ "racer, race car, racing car": 751,
1748
+ "racket, racquet": 752,
1749
+ "radiator": 753,
1750
+ "radio telescope, radio reflector": 755,
1751
+ "radio, wireless": 754,
1752
+ "rain barrel": 756,
1753
+ "ram, tup": 348,
1754
+ "rapeseed": 984,
1755
+ "recreational vehicle, RV, R.V.": 757,
1756
+ "red fox, Vulpes vulpes": 277,
1757
+ "red wine": 966,
1758
+ "red wolf, maned wolf, Canis rufus, Canis niger": 271,
1759
+ "red-backed sandpiper, dunlin, Erolia alpina": 140,
1760
+ "red-breasted merganser, Mergus serrator": 98,
1761
+ "redbone": 168,
1762
+ "redshank, Tringa totanus": 141,
1763
+ "reel": 758,
1764
+ "reflex camera": 759,
1765
+ "refrigerator, icebox": 760,
1766
+ "remote control, remote": 761,
1767
+ "restaurant, eating house, eating place, eatery": 762,
1768
+ "revolver, six-gun, six-shooter": 763,
1769
+ "rhinoceros beetle": 306,
1770
+ "rifle": 764,
1771
+ "ringlet, ringlet butterfly": 322,
1772
+ "ringneck snake, ring-necked snake, ring snake": 53,
1773
+ "robin, American robin, Turdus migratorius": 15,
1774
+ "rock beauty, Holocanthus tricolor": 392,
1775
+ "rock crab, Cancer irroratus": 119,
1776
+ "rock python, rock snake, Python sebae": 62,
1777
+ "rocking chair, rocker": 765,
1778
+ "rotisserie": 766,
1779
+ "rubber eraser, rubber, pencil eraser": 767,
1780
+ "ruddy turnstone, Arenaria interpres": 139,
1781
+ "ruffed grouse, partridge, Bonasa umbellus": 82,
1782
+ "rugby ball": 768,
1783
+ "rule, ruler": 769,
1784
+ "running shoe": 770,
1785
+ "safe": 771,
1786
+ "safety pin": 772,
1787
+ "saltshaker, salt shaker": 773,
1788
+ "sandal": 774,
1789
+ "sandbar, sand bar": 977,
1790
+ "sarong": 775,
1791
+ "sax, saxophone": 776,
1792
+ "scabbard": 777,
1793
+ "scale, weighing machine": 778,
1794
+ "schipperke": 223,
1795
+ "school bus": 779,
1796
+ "schooner": 780,
1797
+ "scoreboard": 781,
1798
+ "scorpion": 71,
1799
+ "screen, CRT screen": 782,
1800
+ "screw": 783,
1801
+ "screwdriver": 784,
1802
+ "scuba diver": 983,
1803
+ "sea anemone, anemone": 108,
1804
+ "sea cucumber, holothurian": 329,
1805
+ "sea lion": 150,
1806
+ "sea slug, nudibranch": 115,
1807
+ "sea snake": 65,
1808
+ "sea urchin": 328,
1809
+ "seashore, coast, seacoast, sea-coast": 978,
1810
+ "seat belt, seatbelt": 785,
1811
+ "sewing machine": 786,
1812
+ "shield, buckler": 787,
1813
+ "shoe shop, shoe-shop, shoe store": 788,
1814
+ "shoji": 789,
1815
+ "shopping basket": 790,
1816
+ "shopping cart": 791,
1817
+ "shovel": 792,
1818
+ "shower cap": 793,
1819
+ "shower curtain": 794,
1820
+ "siamang, Hylobates syndactylus, Symphalangus syndactylus": 369,
1821
+ "sidewinder, horned rattlesnake, Crotalus cerastes": 68,
1822
+ "silky terrier, Sydney silky": 201,
1823
+ "ski": 795,
1824
+ "ski mask": 796,
1825
+ "skunk, polecat, wood pussy": 361,
1826
+ "sleeping bag": 797,
1827
+ "slide rule, slipstick": 798,
1828
+ "sliding door": 799,
1829
+ "slot, one-armed bandit": 800,
1830
+ "sloth bear, Melursus ursinus, Ursus ursinus": 297,
1831
+ "slug": 114,
1832
+ "snail": 113,
1833
+ "snorkel": 801,
1834
+ "snow leopard, ounce, Panthera uncia": 289,
1835
+ "snowmobile": 802,
1836
+ "snowplow, snowplough": 803,
1837
+ "soap dispenser": 804,
1838
+ "soccer ball": 805,
1839
+ "sock": 806,
1840
+ "soft-coated wheaten terrier": 202,
1841
+ "solar dish, solar collector, solar furnace": 807,
1842
+ "sombrero": 808,
1843
+ "sorrel": 339,
1844
+ "soup bowl": 809,
1845
+ "space bar": 810,
1846
+ "space heater": 811,
1847
+ "space shuttle": 812,
1848
+ "spaghetti squash": 940,
1849
+ "spatula": 813,
1850
+ "speedboat": 814,
1851
+ "spider monkey, Ateles geoffroyi": 381,
1852
+ "spider web, spider's web": 815,
1853
+ "spindle": 816,
1854
+ "spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish": 123,
1855
+ "spoonbill": 129,
1856
+ "sports car, sport car": 817,
1857
+ "spotlight, spot": 818,
1858
+ "spotted salamander, Ambystoma maculatum": 28,
1859
+ "squirrel monkey, Saimiri sciureus": 382,
1860
+ "stage": 819,
1861
+ "standard poodle": 267,
1862
+ "standard schnauzer": 198,
1863
+ "starfish, sea star": 327,
1864
+ "steam locomotive": 820,
1865
+ "steel arch bridge": 821,
1866
+ "steel drum": 822,
1867
+ "stethoscope": 823,
1868
+ "stingray": 6,
1869
+ "stinkhorn, carrion fungus": 994,
1870
+ "stole": 824,
1871
+ "stone wall": 825,
1872
+ "stopwatch, stop watch": 826,
1873
+ "stove": 827,
1874
+ "strainer": 828,
1875
+ "strawberry": 949,
1876
+ "street sign": 919,
1877
+ "streetcar, tram, tramcar, trolley, trolley car": 829,
1878
+ "stretcher": 830,
1879
+ "studio couch, day bed": 831,
1880
+ "stupa, tope": 832,
1881
+ "sturgeon": 394,
1882
+ "submarine, pigboat, sub, U-boat": 833,
1883
+ "suit, suit of clothes": 834,
1884
+ "sulphur butterfly, sulfur butterfly": 325,
1885
+ "sulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita": 89,
1886
+ "sundial": 835,
1887
+ "sunglass": 836,
1888
+ "sunglasses, dark glasses, shades": 837,
1889
+ "sunscreen, sunblock, sun blocker": 838,
1890
+ "suspension bridge": 839,
1891
+ "swab, swob, mop": 840,
1892
+ "sweatshirt": 841,
1893
+ "swimming trunks, bathing trunks": 842,
1894
+ "swing": 843,
1895
+ "switch, electric switch, electrical switch": 844,
1896
+ "syringe": 845,
1897
+ "tabby, tabby cat": 281,
1898
+ "table lamp": 846,
1899
+ "tailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui": 32,
1900
+ "tank, army tank, armored combat vehicle, armoured combat vehicle": 847,
1901
+ "tape player": 848,
1902
+ "tarantula": 76,
1903
+ "teapot": 849,
1904
+ "teddy, teddy bear": 850,
1905
+ "television, television system": 851,
1906
+ "tench, Tinca tinca": 0,
1907
+ "tennis ball": 852,
1908
+ "terrapin": 36,
1909
+ "thatch, thatched roof": 853,
1910
+ "theater curtain, theatre curtain": 854,
1911
+ "thimble": 855,
1912
+ "three-toed sloth, ai, Bradypus tridactylus": 364,
1913
+ "thresher, thrasher, threshing machine": 856,
1914
+ "throne": 857,
1915
+ "thunder snake, worm snake, Carphophis amoenus": 52,
1916
+ "tick": 78,
1917
+ "tiger beetle": 300,
1918
+ "tiger cat": 282,
1919
+ "tiger shark, Galeocerdo cuvieri": 3,
1920
+ "tiger, Panthera tigris": 292,
1921
+ "tile roof": 858,
1922
+ "timber wolf, grey wolf, gray wolf, Canis lupus": 269,
1923
+ "titi, titi monkey": 380,
1924
+ "toaster": 859,
1925
+ "tobacco shop, tobacconist shop, tobacconist": 860,
1926
+ "toilet seat": 861,
1927
+ "toilet tissue, toilet paper, bathroom tissue": 999,
1928
+ "torch": 862,
1929
+ "totem pole": 863,
1930
+ "toucan": 96,
1931
+ "tow truck, tow car, wrecker": 864,
1932
+ "toy poodle": 265,
1933
+ "toy terrier": 158,
1934
+ "toyshop": 865,
1935
+ "tractor": 866,
1936
+ "traffic light, traffic signal, stoplight": 920,
1937
+ "trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi": 867,
1938
+ "tray": 868,
1939
+ "tree frog, tree-frog": 31,
1940
+ "trench coat": 869,
1941
+ "triceratops": 51,
1942
+ "tricycle, trike, velocipede": 870,
1943
+ "trifle": 927,
1944
+ "trilobite": 69,
1945
+ "trimaran": 871,
1946
+ "tripod": 872,
1947
+ "triumphal arch": 873,
1948
+ "trolleybus, trolley coach, trackless trolley": 874,
1949
+ "trombone": 875,
1950
+ "tub, vat": 876,
1951
+ "turnstile": 877,
1952
+ "tusker": 101,
1953
+ "typewriter keyboard": 878,
1954
+ "umbrella": 879,
1955
+ "unicycle, monocycle": 880,
1956
+ "upright, upright piano": 881,
1957
+ "vacuum, vacuum cleaner": 882,
1958
+ "valley, vale": 979,
1959
+ "vase": 883,
1960
+ "vault": 884,
1961
+ "velvet": 885,
1962
+ "vending machine": 886,
1963
+ "vestment": 887,
1964
+ "viaduct": 888,
1965
+ "vine snake": 59,
1966
+ "violin, fiddle": 889,
1967
+ "vizsla, Hungarian pointer": 211,
1968
+ "volcano": 980,
1969
+ "volleyball": 890,
1970
+ "vulture": 23,
1971
+ "waffle iron": 891,
1972
+ "walking stick, walkingstick, stick insect": 313,
1973
+ "wall clock": 892,
1974
+ "wallaby, brush kangaroo": 104,
1975
+ "wallet, billfold, notecase, pocketbook": 893,
1976
+ "wardrobe, closet, press": 894,
1977
+ "warplane, military plane": 895,
1978
+ "warthog": 343,
1979
+ "washbasin, handbasin, washbowl, lavabo, wash-hand basin": 896,
1980
+ "washer, automatic washer, washing machine": 897,
1981
+ "water bottle": 898,
1982
+ "water buffalo, water ox, Asiatic buffalo, Bubalus bubalis": 346,
1983
+ "water jug": 899,
1984
+ "water ouzel, dipper": 20,
1985
+ "water snake": 58,
1986
+ "water tower": 900,
1987
+ "weasel": 356,
1988
+ "web site, website, internet site, site": 916,
1989
+ "weevil": 307,
1990
+ "whippet": 172,
1991
+ "whiptail, whiptail lizard": 41,
1992
+ "whiskey jug": 901,
1993
+ "whistle": 902,
1994
+ "white stork, Ciconia ciconia": 127,
1995
+ "white wolf, Arctic wolf, Canis lupus tundrarum": 270,
1996
+ "wig": 903,
1997
+ "wild boar, boar, Sus scrofa": 342,
1998
+ "window screen": 904,
1999
+ "window shade": 905,
2000
+ "wine bottle": 907,
2001
+ "wing": 908,
2002
+ "wire-haired fox terrier": 188,
2003
+ "wok": 909,
2004
+ "wolf spider, hunting spider": 77,
2005
+ "wombat": 106,
2006
+ "wood rabbit, cottontail, cottontail rabbit": 330,
2007
+ "wooden spoon": 910,
2008
+ "wool, woolen, woollen": 911,
2009
+ "worm fence, snake fence, snake-rail fence, Virginia fence": 912,
2010
+ "wreck": 913,
2011
+ "yawl": 914,
2012
+ "yellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum": 986,
2013
+ "yurt": 915,
2014
+ "zebra": 340,
2015
+ "zucchini, courgette": 939
2016
+ },
2017
+ "layer_norm_eps": 1e-12,
2018
+ "model_type": "vit",
2019
+ "num_attention_heads": 12,
2020
+ "num_channels": 3,
2021
+ "num_hidden_layers": 12,
2022
+ "patch_size": 16,
2023
+ "qkv_bias": true,
2024
+ "torch_dtype": "float32",
2025
+ "transformers_version": "4.44.2"
2026
+ }
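The config that ends above (model_type "vit", 12 layers, 12 attention heads, 16x16 patches, and the 1000-class ImageNet label table) describes the fine-tuned image classifier saved in chatbot/models/custom_model. A minimal sketch, assuming the standard transformers API, of how such a checkpoint could be loaded and queried; the default 224x224 preprocessing and the example image path are assumptions, not values recorded in this commit:

import torch
from PIL import Image
from transformers import ViTForImageClassification, ViTImageProcessor

# Load the checkpoint stored in this folder (config.json + model.safetensors).
model = ViTForImageClassification.from_pretrained("chatbot/models/custom_model")
processor = ViTImageProcessor()  # assumption: default 224x224 preprocessing; no preprocessor_config.json is shown in this commit

image = Image.open("example.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 1000): one score per label in the table above

# The mapping shown above is label -> id, so invert it to print a class name.
id_to_label = {v: k for k, v in model.config.label2id.items()}
print(id_to_label[int(logits.argmax(-1))])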
chatbot/models/custom_model/model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:e8a70954c2250bc785d2259892c0ead906d25f5f16bbea62f55f668d286a6cfa
3
+ size 343220892
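The three lines above are not the weights themselves but a Git LFS pointer: the real 343 MB safetensors blob is stored out of band and identified by its SHA-256 and byte size. A small, hypothetical helper for reading such a pointer and checking a downloaded blob against it (parse_lfs_pointer and verify_blob are illustrative names, not part of any library):

import hashlib
from pathlib import Path

def parse_lfs_pointer(pointer_path):
    # Each non-empty line is "key value", e.g. "oid sha256:e8a7..." or "size 343220892".
    fields = dict(line.split(" ", 1) for line in Path(pointer_path).read_text().splitlines() if line.strip())
    return {"oid": fields["oid"].split(":", 1)[1], "size": int(fields["size"])}

def verify_blob(pointer_path, blob_path):
    # True if the local file has exactly the size and SHA-256 recorded in the pointer.
    meta = parse_lfs_pointer(pointer_path)
    data = Path(blob_path).read_bytes()
    return len(data) == meta["size"] and hashlib.sha256(data).hexdigest() == meta["oid"]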
chatbot/models/microsoft/Phi-3.5-mini-instruct/added_tokens.json ADDED
@@ -0,0 +1,13 @@
1
+ {
2
+ "<|assistant|>": 32001,
3
+ "<|endoftext|>": 32000,
4
+ "<|end|>": 32007,
5
+ "<|placeholder1|>": 32002,
6
+ "<|placeholder2|>": 32003,
7
+ "<|placeholder3|>": 32004,
8
+ "<|placeholder4|>": 32005,
9
+ "<|placeholder5|>": 32008,
10
+ "<|placeholder6|>": 32009,
11
+ "<|system|>": 32006,
12
+ "<|user|>": 32010
13
+ }
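added_tokens.json registers Phi-3.5's chat-role markers (<|system|>, <|user|>, <|assistant|>, <|end|>, and placeholders) at ids 32000-32010. Assuming the remaining tokenizer files sit alongside this file (the 50-file view above is truncated, so not all of them are visible), a minimal sketch of turning a conversation into a prompt built from these markers:

from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("chatbot/models/microsoft/Phi-3.5-mini-instruct")

messages = [
    {"role": "system", "content": "You are a helpful assistant."},   # illustrative content
    {"role": "user", "content": "Summarize today's inventory."},     # illustrative content
]

# apply_chat_template wraps each turn in the special tokens above,
# roughly "<|system|> ... <|end|> <|user|> ... <|end|> <|assistant|>".
prompt = tok.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)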
chatbot/models/microsoft/Phi-3.5-mini-instruct/config.json ADDED
@@ -0,0 +1,138 @@
1
+ {
2
+ "_name_or_path": "microsoft/Phi-3.5-mini-instruct",
3
+ "architectures": [
4
+ "Phi3ForCausalLM"
5
+ ],
6
+ "attention_bias": false,
7
+ "attention_dropout": 0.0,
8
+ "auto_map": {
9
+ "AutoConfig": "microsoft/Phi-3.5-mini-instruct--configuration_phi3.Phi3Config",
10
+ "AutoModelForCausalLM": "microsoft/Phi-3.5-mini-instruct--modeling_phi3.Phi3ForCausalLM"
11
+ },
12
+ "bos_token_id": 1,
13
+ "embd_pdrop": 0.0,
14
+ "eos_token_id": 32000,
15
+ "hidden_act": "silu",
16
+ "hidden_size": 3072,
17
+ "initializer_range": 0.02,
18
+ "intermediate_size": 8192,
19
+ "max_position_embeddings": 131072,
20
+ "model_type": "phi3",
21
+ "num_attention_heads": 32,
22
+ "num_hidden_layers": 32,
23
+ "num_key_value_heads": 32,
24
+ "original_max_position_embeddings": 4096,
25
+ "pad_token_id": 32000,
26
+ "resid_pdrop": 0.0,
27
+ "rms_norm_eps": 1e-05,
28
+ "rope_scaling": {
29
+ "long_factor": [
30
+ 1.0800000429153442,
31
+ 1.1100000143051147,
32
+ 1.1399999856948853,
33
+ 1.340000033378601,
34
+ 1.5899999141693115,
35
+ 1.600000023841858,
36
+ 1.6200000047683716,
37
+ 2.620000123977661,
38
+ 3.2300000190734863,
39
+ 3.2300000190734863,
40
+ 4.789999961853027,
41
+ 7.400000095367432,
42
+ 7.700000286102295,
43
+ 9.09000015258789,
44
+ 12.199999809265137,
45
+ 17.670000076293945,
46
+ 24.46000099182129,
47
+ 28.57000160217285,
48
+ 30.420001983642578,
49
+ 30.840002059936523,
50
+ 32.590003967285156,
51
+ 32.93000411987305,
52
+ 42.320003509521484,
53
+ 44.96000289916992,
54
+ 50.340003967285156,
55
+ 50.45000457763672,
56
+ 57.55000305175781,
57
+ 57.93000411987305,
58
+ 58.21000289916992,
59
+ 60.1400032043457,
60
+ 62.61000442504883,
61
+ 62.62000274658203,
62
+ 62.71000289916992,
63
+ 63.1400032043457,
64
+ 63.1400032043457,
65
+ 63.77000427246094,
66
+ 63.93000411987305,
67
+ 63.96000289916992,
68
+ 63.970001220703125,
69
+ 64.02999877929688,
70
+ 64.06999969482422,
71
+ 64.08000183105469,
72
+ 64.12000274658203,
73
+ 64.41000366210938,
74
+ 64.4800033569336,
75
+ 64.51000213623047,
76
+ 64.52999877929688,
77
+ 64.83999633789062
78
+ ],
79
+ "short_factor": [
80
+ 1.0,
81
+ 1.0199999809265137,
82
+ 1.0299999713897705,
83
+ 1.0299999713897705,
84
+ 1.0499999523162842,
85
+ 1.0499999523162842,
86
+ 1.0499999523162842,
87
+ 1.0499999523162842,
88
+ 1.0499999523162842,
89
+ 1.0699999332427979,
90
+ 1.0999999046325684,
91
+ 1.1099998950958252,
92
+ 1.1599998474121094,
93
+ 1.1599998474121094,
94
+ 1.1699998378753662,
95
+ 1.2899998426437378,
96
+ 1.339999794960022,
97
+ 1.679999828338623,
98
+ 1.7899998426437378,
99
+ 1.8199998140335083,
100
+ 1.8499997854232788,
101
+ 1.8799997568130493,
102
+ 1.9099997282028198,
103
+ 1.9399996995925903,
104
+ 1.9899996519088745,
105
+ 2.0199997425079346,
106
+ 2.0199997425079346,
107
+ 2.0199997425079346,
108
+ 2.0199997425079346,
109
+ 2.0199997425079346,
110
+ 2.0199997425079346,
111
+ 2.0299997329711914,
112
+ 2.0299997329711914,
113
+ 2.0299997329711914,
114
+ 2.0299997329711914,
115
+ 2.0299997329711914,
116
+ 2.0299997329711914,
117
+ 2.0299997329711914,
118
+ 2.0299997329711914,
119
+ 2.0299997329711914,
120
+ 2.0799996852874756,
121
+ 2.0899996757507324,
122
+ 2.189999580383301,
123
+ 2.2199995517730713,
124
+ 2.5899994373321533,
125
+ 2.729999542236328,
126
+ 2.749999523162842,
127
+ 2.8399994373321533
128
+ ],
129
+ "type": "longrope"
130
+ },
131
+ "rope_theta": 10000.0,
132
+ "sliding_window": 262144,
133
+ "tie_word_embeddings": false,
134
+ "torch_dtype": "float32",
135
+ "transformers_version": "4.46.2",
136
+ "use_cache": true,
137
+ "vocab_size": 32064
138
+ }
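Two details of this config are worth noting: auto_map points at the model's own configuration_phi3/modeling_phi3 modules, and the "longrope" rope_scaling block rescales the rotary embeddings per dimension so the model can run past its original 4096-token window (original_max_position_embeddings) up to max_position_embeddings = 131072. A hedged loading sketch; the dtype, device_map, and trust_remote_code settings are assumptions about how the chatbot might run it, not values recorded in this commit:

import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "chatbot/models/microsoft/Phi-3.5-mini-instruct",
    torch_dtype=torch.bfloat16,   # the shards below are stored as float32 (~15.3 GB); downcasting halves memory
    device_map="auto",            # assumption: let accelerate place layers across available devices
    trust_remote_code=True,       # honors the auto_map entries; recent transformers also ships a native phi3 implementation
)
print(model.config.max_position_embeddings)  # 131072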
chatbot/models/microsoft/Phi-3.5-mini-instruct/generation_config.json ADDED
@@ -0,0 +1,11 @@
1
+ {
2
+ "_from_model_config": true,
3
+ "bos_token_id": 1,
4
+ "eos_token_id": [
5
+ 32007,
6
+ 32001,
7
+ 32000
8
+ ],
9
+ "pad_token_id": 32000,
10
+ "transformers_version": "4.46.2"
11
+ }
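generation_config.json lists three stop tokens, which per added_tokens.json above are <|end|> (32007), <|assistant|> (32001), and <|endoftext|> (32000); generation halts at whichever appears first, and 32000 doubles as the pad token. A short sketch of reading these settings with the standard API:

from transformers import GenerationConfig

gen_cfg = GenerationConfig.from_pretrained("chatbot/models/microsoft/Phi-3.5-mini-instruct")
print(gen_cfg.eos_token_id)   # [32007, 32001, 32000]
print(gen_cfg.pad_token_id)   # 32000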
chatbot/models/microsoft/Phi-3.5-mini-instruct/model-00001-of-00004.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:69f5298198adf824addbd073539922a669c5405154b9b8e5cf1b9f6f9829cee4
3
+ size 4961852416
chatbot/models/microsoft/Phi-3.5-mini-instruct/model-00002-of-00004.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:8f182e86b3535d1baf86e190963197c8860911c68b6d8613776aca890731f6c0
3
+ size 4983111176
chatbot/models/microsoft/Phi-3.5-mini-instruct/model-00003-of-00004.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:d672464098f4fa6927ff602d8de7c69e9c26dced17a5530a335e51ca95caaf43
3
+ size 4945374704
chatbot/models/microsoft/Phi-3.5-mini-instruct/model-00004-of-00004.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:2445155b9094892fed9c94bac15a5f750a041a814824a4770e4acf3d454380a3
3
+ size 394002560
chatbot/models/microsoft/Phi-3.5-mini-instruct/model.safetensors.index.json ADDED
@@ -0,0 +1,202 @@
1
+ {
2
+ "metadata": {
3
+ "total_size": 15284318208
4
+ },
5
+ "weight_map": {
6
+ "lm_head.weight": "model-00004-of-00004.safetensors",
7
+ "model.embed_tokens.weight": "model-00001-of-00004.safetensors",
8
+ "model.layers.0.input_layernorm.weight": "model-00001-of-00004.safetensors",
9
+ "model.layers.0.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
10
+ "model.layers.0.mlp.gate_up_proj.weight": "model-00001-of-00004.safetensors",
11
+ "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
12
+ "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
13
+ "model.layers.0.self_attn.qkv_proj.weight": "model-00001-of-00004.safetensors",
14
+ "model.layers.1.input_layernorm.weight": "model-00001-of-00004.safetensors",
15
+ "model.layers.1.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
16
+ "model.layers.1.mlp.gate_up_proj.weight": "model-00001-of-00004.safetensors",
17
+ "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
18
+ "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
19
+ "model.layers.1.self_attn.qkv_proj.weight": "model-00001-of-00004.safetensors",
20
+ "model.layers.10.input_layernorm.weight": "model-00002-of-00004.safetensors",
21
+ "model.layers.10.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
22
+ "model.layers.10.mlp.gate_up_proj.weight": "model-00002-of-00004.safetensors",
23
+ "model.layers.10.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
24
+ "model.layers.10.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
25
+ "model.layers.10.self_attn.qkv_proj.weight": "model-00002-of-00004.safetensors",
26
+ "model.layers.11.input_layernorm.weight": "model-00002-of-00004.safetensors",
27
+ "model.layers.11.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
28
+ "model.layers.11.mlp.gate_up_proj.weight": "model-00002-of-00004.safetensors",
29
+ "model.layers.11.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
30
+ "model.layers.11.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
31
+ "model.layers.11.self_attn.qkv_proj.weight": "model-00002-of-00004.safetensors",
32
+ "model.layers.12.input_layernorm.weight": "model-00002-of-00004.safetensors",
33
+ "model.layers.12.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
34
+ "model.layers.12.mlp.gate_up_proj.weight": "model-00002-of-00004.safetensors",
35
+ "model.layers.12.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
36
+ "model.layers.12.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
37
+ "model.layers.12.self_attn.qkv_proj.weight": "model-00002-of-00004.safetensors",
38
+ "model.layers.13.input_layernorm.weight": "model-00002-of-00004.safetensors",
39
+ "model.layers.13.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
40
+ "model.layers.13.mlp.gate_up_proj.weight": "model-00002-of-00004.safetensors",
41
+ "model.layers.13.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
42
+ "model.layers.13.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
43
+ "model.layers.13.self_attn.qkv_proj.weight": "model-00002-of-00004.safetensors",
44
+ "model.layers.14.input_layernorm.weight": "model-00002-of-00004.safetensors",
45
+ "model.layers.14.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
46
+ "model.layers.14.mlp.gate_up_proj.weight": "model-00002-of-00004.safetensors",
47
+ "model.layers.14.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
48
+ "model.layers.14.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
49
+ "model.layers.14.self_attn.qkv_proj.weight": "model-00002-of-00004.safetensors",
50
+ "model.layers.15.input_layernorm.weight": "model-00002-of-00004.safetensors",
51
+ "model.layers.15.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
52
+ "model.layers.15.mlp.gate_up_proj.weight": "model-00002-of-00004.safetensors",
53
+ "model.layers.15.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
54
+ "model.layers.15.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
55
+ "model.layers.15.self_attn.qkv_proj.weight": "model-00002-of-00004.safetensors",
56
+ "model.layers.16.input_layernorm.weight": "model-00002-of-00004.safetensors",
57
+ "model.layers.16.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
58
+ "model.layers.16.mlp.gate_up_proj.weight": "model-00002-of-00004.safetensors",
59
+ "model.layers.16.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
60
+ "model.layers.16.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
61
+ "model.layers.16.self_attn.qkv_proj.weight": "model-00002-of-00004.safetensors",
62
+ "model.layers.17.input_layernorm.weight": "model-00002-of-00004.safetensors",
63
+ "model.layers.17.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
64
+ "model.layers.17.mlp.gate_up_proj.weight": "model-00002-of-00004.safetensors",
65
+ "model.layers.17.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
66
+ "model.layers.17.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
67
+ "model.layers.17.self_attn.qkv_proj.weight": "model-00002-of-00004.safetensors",
68
+ "model.layers.18.input_layernorm.weight": "model-00002-of-00004.safetensors",
69
+ "model.layers.18.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
70
+ "model.layers.18.mlp.gate_up_proj.weight": "model-00002-of-00004.safetensors",
71
+ "model.layers.18.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
72
+ "model.layers.18.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
73
+ "model.layers.18.self_attn.qkv_proj.weight": "model-00002-of-00004.safetensors",
74
+ "model.layers.19.input_layernorm.weight": "model-00002-of-00004.safetensors",
75
+ "model.layers.19.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
76
+ "model.layers.19.mlp.gate_up_proj.weight": "model-00002-of-00004.safetensors",
77
+ "model.layers.19.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
78
+ "model.layers.19.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
79
+ "model.layers.19.self_attn.qkv_proj.weight": "model-00002-of-00004.safetensors",
80
+ "model.layers.2.input_layernorm.weight": "model-00001-of-00004.safetensors",
81
+ "model.layers.2.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
82
+ "model.layers.2.mlp.gate_up_proj.weight": "model-00001-of-00004.safetensors",
83
+ "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
84
+ "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
85
+ "model.layers.2.self_attn.qkv_proj.weight": "model-00001-of-00004.safetensors",
86
+ "model.layers.20.input_layernorm.weight": "model-00002-of-00004.safetensors",
87
+ "model.layers.20.mlp.down_proj.weight": "model-00002-of-00004.safetensors",
88
+ "model.layers.20.mlp.gate_up_proj.weight": "model-00002-of-00004.safetensors",
89
+ "model.layers.20.post_attention_layernorm.weight": "model-00002-of-00004.safetensors",
90
+ "model.layers.20.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
91
+ "model.layers.20.self_attn.qkv_proj.weight": "model-00002-of-00004.safetensors",
92
+ "model.layers.21.input_layernorm.weight": "model-00003-of-00004.safetensors",
93
+ "model.layers.21.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
94
+ "model.layers.21.mlp.gate_up_proj.weight": "model-00003-of-00004.safetensors",
95
+ "model.layers.21.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
96
+ "model.layers.21.self_attn.o_proj.weight": "model-00002-of-00004.safetensors",
97
+ "model.layers.21.self_attn.qkv_proj.weight": "model-00003-of-00004.safetensors",
98
+ "model.layers.22.input_layernorm.weight": "model-00003-of-00004.safetensors",
99
+ "model.layers.22.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
100
+ "model.layers.22.mlp.gate_up_proj.weight": "model-00003-of-00004.safetensors",
101
+ "model.layers.22.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
102
+ "model.layers.22.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
103
+ "model.layers.22.self_attn.qkv_proj.weight": "model-00003-of-00004.safetensors",
104
+ "model.layers.23.input_layernorm.weight": "model-00003-of-00004.safetensors",
105
+ "model.layers.23.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
106
+ "model.layers.23.mlp.gate_up_proj.weight": "model-00003-of-00004.safetensors",
107
+ "model.layers.23.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
108
+ "model.layers.23.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
109
+ "model.layers.23.self_attn.qkv_proj.weight": "model-00003-of-00004.safetensors",
110
+ "model.layers.24.input_layernorm.weight": "model-00003-of-00004.safetensors",
111
+ "model.layers.24.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
112
+ "model.layers.24.mlp.gate_up_proj.weight": "model-00003-of-00004.safetensors",
113
+ "model.layers.24.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
114
+ "model.layers.24.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
115
+ "model.layers.24.self_attn.qkv_proj.weight": "model-00003-of-00004.safetensors",
116
+ "model.layers.25.input_layernorm.weight": "model-00003-of-00004.safetensors",
117
+ "model.layers.25.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
118
+ "model.layers.25.mlp.gate_up_proj.weight": "model-00003-of-00004.safetensors",
119
+ "model.layers.25.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
120
+ "model.layers.25.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
121
+ "model.layers.25.self_attn.qkv_proj.weight": "model-00003-of-00004.safetensors",
122
+ "model.layers.26.input_layernorm.weight": "model-00003-of-00004.safetensors",
123
+ "model.layers.26.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
124
+ "model.layers.26.mlp.gate_up_proj.weight": "model-00003-of-00004.safetensors",
125
+ "model.layers.26.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
126
+ "model.layers.26.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
127
+ "model.layers.26.self_attn.qkv_proj.weight": "model-00003-of-00004.safetensors",
128
+ "model.layers.27.input_layernorm.weight": "model-00003-of-00004.safetensors",
129
+ "model.layers.27.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
130
+ "model.layers.27.mlp.gate_up_proj.weight": "model-00003-of-00004.safetensors",
131
+ "model.layers.27.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
132
+ "model.layers.27.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
133
+ "model.layers.27.self_attn.qkv_proj.weight": "model-00003-of-00004.safetensors",
134
+ "model.layers.28.input_layernorm.weight": "model-00003-of-00004.safetensors",
135
+ "model.layers.28.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
136
+ "model.layers.28.mlp.gate_up_proj.weight": "model-00003-of-00004.safetensors",
137
+ "model.layers.28.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
138
+ "model.layers.28.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
139
+ "model.layers.28.self_attn.qkv_proj.weight": "model-00003-of-00004.safetensors",
140
+ "model.layers.29.input_layernorm.weight": "model-00003-of-00004.safetensors",
141
+ "model.layers.29.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
142
+ "model.layers.29.mlp.gate_up_proj.weight": "model-00003-of-00004.safetensors",
143
+ "model.layers.29.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
144
+ "model.layers.29.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
145
+ "model.layers.29.self_attn.qkv_proj.weight": "model-00003-of-00004.safetensors",
146
+ "model.layers.3.input_layernorm.weight": "model-00001-of-00004.safetensors",
147
+ "model.layers.3.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
148
+ "model.layers.3.mlp.gate_up_proj.weight": "model-00001-of-00004.safetensors",
149
+ "model.layers.3.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
150
+ "model.layers.3.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
151
+ "model.layers.3.self_attn.qkv_proj.weight": "model-00001-of-00004.safetensors",
152
+ "model.layers.30.input_layernorm.weight": "model-00003-of-00004.safetensors",
153
+ "model.layers.30.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
154
+ "model.layers.30.mlp.gate_up_proj.weight": "model-00003-of-00004.safetensors",
155
+ "model.layers.30.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
156
+ "model.layers.30.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
157
+ "model.layers.30.self_attn.qkv_proj.weight": "model-00003-of-00004.safetensors",
158
+ "model.layers.31.input_layernorm.weight": "model-00003-of-00004.safetensors",
159
+ "model.layers.31.mlp.down_proj.weight": "model-00003-of-00004.safetensors",
160
+ "model.layers.31.mlp.gate_up_proj.weight": "model-00003-of-00004.safetensors",
161
+ "model.layers.31.post_attention_layernorm.weight": "model-00003-of-00004.safetensors",
162
+ "model.layers.31.self_attn.o_proj.weight": "model-00003-of-00004.safetensors",
163
+ "model.layers.31.self_attn.qkv_proj.weight": "model-00003-of-00004.safetensors",
164
+ "model.layers.4.input_layernorm.weight": "model-00001-of-00004.safetensors",
165
+ "model.layers.4.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
166
+ "model.layers.4.mlp.gate_up_proj.weight": "model-00001-of-00004.safetensors",
167
+ "model.layers.4.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
168
+ "model.layers.4.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
169
+ "model.layers.4.self_attn.qkv_proj.weight": "model-00001-of-00004.safetensors",
170
+ "model.layers.5.input_layernorm.weight": "model-00001-of-00004.safetensors",
171
+ "model.layers.5.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
172
+ "model.layers.5.mlp.gate_up_proj.weight": "model-00001-of-00004.safetensors",
173
+ "model.layers.5.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
174
+ "model.layers.5.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
175
+ "model.layers.5.self_attn.qkv_proj.weight": "model-00001-of-00004.safetensors",
176
+ "model.layers.6.input_layernorm.weight": "model-00001-of-00004.safetensors",
177
+ "model.layers.6.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
178
+ "model.layers.6.mlp.gate_up_proj.weight": "model-00001-of-00004.safetensors",
179
+ "model.layers.6.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
180
+ "model.layers.6.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
181
+ "model.layers.6.self_attn.qkv_proj.weight": "model-00001-of-00004.safetensors",
182
+ "model.layers.7.input_layernorm.weight": "model-00001-of-00004.safetensors",
183
+ "model.layers.7.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
184
+ "model.layers.7.mlp.gate_up_proj.weight": "model-00001-of-00004.safetensors",
185
+ "model.layers.7.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
186
+ "model.layers.7.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
187
+ "model.layers.7.self_attn.qkv_proj.weight": "model-00001-of-00004.safetensors",
188
+ "model.layers.8.input_layernorm.weight": "model-00001-of-00004.safetensors",
189
+ "model.layers.8.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
190
+ "model.layers.8.mlp.gate_up_proj.weight": "model-00001-of-00004.safetensors",
191
+ "model.layers.8.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
192
+ "model.layers.8.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
193
+ "model.layers.8.self_attn.qkv_proj.weight": "model-00001-of-00004.safetensors",
194
+ "model.layers.9.input_layernorm.weight": "model-00001-of-00004.safetensors",
195
+ "model.layers.9.mlp.down_proj.weight": "model-00001-of-00004.safetensors",
196
+ "model.layers.9.mlp.gate_up_proj.weight": "model-00001-of-00004.safetensors",
197
+ "model.layers.9.post_attention_layernorm.weight": "model-00001-of-00004.safetensors",
198
+ "model.layers.9.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
199
+ "model.layers.9.self_attn.qkv_proj.weight": "model-00001-of-00004.safetensors",
200
+ "model.norm.weight": "model-00003-of-00004.safetensors"
201
+ }
202
+ }
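model.safetensors.index.json is what lets a loader work shard by shard: metadata.total_size records the combined tensor bytes (about 15.28 GB, in line with the four shard files above, which additionally carry small safetensors headers), and weight_map names the shard that holds each tensor. A short sketch of resolving tensors to shards from this index:

import json
from collections import defaultdict

with open("chatbot/models/microsoft/Phi-3.5-mini-instruct/model.safetensors.index.json") as f:
    index = json.load(f)

print(index["metadata"]["total_size"])                    # 15284318208 bytes of tensors
print(index["weight_map"]["model.embed_tokens.weight"])   # -> model-00001-of-00004.safetensors

# Group parameters by shard, roughly what a loader does before opening each file once.
by_shard = defaultdict(list)
for name, shard in index["weight_map"].items():
    by_shard[shard].append(name)
for shard, names in sorted(by_shard.items()):
    print(shard, len(names))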