czl committed on
Commit
7635425
1 Parent(s): 4caf524

added notebook

Files changed (1)
  1. unit3.ipynb +1335 -0
unit3.ipynb ADDED
@@ -0,0 +1,1335 @@
+ {
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "k7xBVPzoXxOg"
+ },
+ "source": [
+ "# Unit 3: Deep Q-Learning with Atari Games 👾 using RL Baselines3 Zoo\n",
+ "\n",
+ "<img src=\"https://huggingface.co/datasets/huggingface-deep-rl-course/course-images/resolve/main/en/unit4/thumbnail.jpg\" alt=\"Unit 3 Thumbnail\">\n",
+ "\n",
+ "In this notebook, **you'll train a Deep Q-Learning agent** playing Space Invaders using [RL Baselines3 Zoo](https://github.com/DLR-RM/rl-baselines3-zoo), a training framework based on [Stable-Baselines3](https://stable-baselines3.readthedocs.io/en/master/) that provides scripts for training and evaluating agents, tuning hyperparameters, plotting results, and recording videos.\n",
+ "\n",
+ "We're using the [RL-Baselines-3 Zoo integration, a vanilla version of Deep Q-Learning](https://stable-baselines3.readthedocs.io/en/master/modules/dqn.html) with no extensions such as Double-DQN, Dueling-DQN, or Prioritized Experience Replay.\n",
+ "\n",
+ "⬇️ Here is an example of what **you will achieve** ⬇️"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "J9S713biXntc"
+ },
+ "outputs": [],
+ "source": [
+ "%%html\n",
+ "<video controls autoplay><source src=\"https://huggingface.co/ThomasSimonini/ppo-SpaceInvadersNoFrameskip-v4/resolve/main/replay.mp4\" type=\"video/mp4\"></video>"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "ykJiGevCMVc5"
+ },
+ "source": [
+ "### 🎮 Environments:\n",
+ "\n",
+ "- [SpaceInvadersNoFrameskip-v4](https://gymnasium.farama.org/environments/atari/space_invaders/)\n",
+ "\n",
+ "You can see the difference between Space Invaders versions here 👉 https://gymnasium.farama.org/environments/atari/space_invaders/#variants\n",
+ "\n",
+ "### 📚 RL-Library:\n",
+ "\n",
+ "- [RL-Baselines3-Zoo](https://github.com/DLR-RM/rl-baselines3-zoo)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "wciHGjrFYz9m"
+ },
+ "source": [
+ "## Objectives of this notebook 🏆\n",
+ "At the end of the notebook, you will:\n",
+ "- Understand more deeply **how RL Baselines3 Zoo works**.\n",
+ "- Be able to **push your trained agent and the code to the Hub** with a nice video replay and an evaluation score 🔥.\n",
+ "\n",
+ "\n"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "TsnP0rjxMn1e"
+ },
+ "source": [
+ "## This notebook is from the Deep Reinforcement Learning Course\n",
+ "<img src=\"https://huggingface.co/datasets/huggingface-deep-rl-course/course-images/resolve/main/en/notebooks/deep-rl-course-illustration.jpg\" alt=\"Deep RL Course illustration\"/>"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "nw6fJHIAZd-J"
+ },
+ "source": [
+ "In this free course, you will:\n",
+ "\n",
+ "- 📖 Study Deep Reinforcement Learning in **theory and practice**.\n",
+ "- 🧑‍💻 Learn to **use famous Deep RL libraries** such as Stable Baselines3, RL Baselines3 Zoo, CleanRL and Sample Factory 2.0.\n",
+ "- 🤖 Train **agents in unique environments**.\n",
+ "\n",
+ "And more! Check 📚 the syllabus 👉 https://simoninithomas.github.io/deep-rl-course\n",
+ "\n",
+ "Don’t forget to **<a href=\"http://eepurl.com/ic5ZUD\">sign up for the course</a>** (we are collecting your email so we can **send you the links when each Unit is published and give you information about the challenges and updates**).\n",
+ "\n",
+ "\n",
+ "The best way to keep in touch is to join our Discord server to exchange ideas with the community and with us 👉🏻 https://discord.gg/ydHrjt3WP5"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "0vgANIBBZg1p"
+ },
+ "source": [
+ "## Prerequisites 🏗️\n",
+ "Before diving into the notebook, you need to:\n",
+ "\n",
+ "🔲 📚 **[Study Deep Q-Learning by reading Unit 3](https://huggingface.co/deep-rl-course/unit3/introduction)** 🤗"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "7kszpGFaRVhq"
+ },
+ "source": [
+ "We're constantly trying to improve our tutorials, so **if you find any issues in this notebook**, please [open an issue on the GitHub repo](https://github.com/huggingface/deep-rl-class/issues)."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "QR0jZtYreSI5"
+ },
+ "source": [
+ "# Let's train a Deep Q-Learning agent playing Atari's Space Invaders 👾 and upload it to the Hub.\n",
+ "\n",
+ "We strongly recommend that students **use Google Colab for the hands-on exercises instead of running them on their personal computers**.\n",
+ "\n",
+ "By using Google Colab, **you can focus on learning and experimenting without worrying about the technical aspects of setting up your environments**.\n",
+ "\n",
+ "To validate this hands-on for the certification process, you need to push your trained model to the Hub and **get a result of >= 200**.\n",
+ "\n",
+ "To find your result, go to the leaderboard and find your model: **result = mean_reward - std_reward**.\n",
+ "\n",
+ "For more information about the certification process, check this section 👉 https://huggingface.co/deep-rl-course/en/unit0/introduction#certification-process"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "Nc8BnyVEc3Ys"
+ },
+ "source": [
+ "## Some advice 💡\n",
+ "It's better to run this Colab in a copy on your Google Drive, so that **if it times out** you still have the saved notebook on your Google Drive and don't need to redo everything from scratch.\n",
+ "\n",
+ "To do that you can either do `Ctrl + S` or `File > Save a copy in Google Drive`.\n",
+ "\n",
+ "Also, we're going to **train it for 90 minutes with 1M timesteps**. Typing `!nvidia-smi` will tell you what GPU you're using.\n",
+ "\n",
+ "And if you want to train for more steps, such as 10 million, this will take about 9 hours, potentially resulting in Colab timing out. In that case, I recommend running this on your local computer (or somewhere else). Just click on `File > Download`."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "PU4FVzaoM6fC"
+ },
+ "source": [
+ "## Set the GPU 💪\n",
+ "- To **accelerate the agent's training, we'll use a GPU**. To do that, go to `Runtime > Change Runtime type`\n",
+ "\n",
+ "<img src=\"https://huggingface.co/datasets/huggingface-deep-rl-course/course-images/resolve/main/en/notebooks/gpu-step1.jpg\" alt=\"GPU Step 1\">"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "KV0NyFdQM9ZG"
+ },
+ "source": [
+ "- `Hardware Accelerator > GPU`\n",
+ "\n",
+ "<img src=\"https://huggingface.co/datasets/huggingface-deep-rl-course/course-images/resolve/main/en/notebooks/gpu-step2.jpg\" alt=\"GPU Step 2\">"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "wS_cVefO-aYg"
+ },
+ "source": [
+ "# Install RL-Baselines3 Zoo and its dependencies 📚\n",
+ "\n",
+ "If you see `ERROR: pip's dependency resolver does not currently take into account all the packages that are installed.` **this is normal and not a critical error**: there's a version conflict, but the packages we need are installed."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 1,
+ "metadata": {
+ "id": "hLTwHqIWdnPb"
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com\n",
+ "Collecting git+https://github.com/DLR-RM/rl-baselines3-zoo@update/hf\n",
+ " Cloning https://github.com/DLR-RM/rl-baselines3-zoo (to revision update/hf) to /tmp/pip-req-build-pa0fi3k1\n",
+ " Running command git clone --filter=blob:none --quiet https://github.com/DLR-RM/rl-baselines3-zoo /tmp/pip-req-build-pa0fi3k1\n",
+ " Running command git checkout -b update/hf --track origin/update/hf\n",
+ " Switched to a new branch 'update/hf'\n",
+ " branch 'update/hf' set up to track 'origin/update/hf'.\n",
+ " Resolved https://github.com/DLR-RM/rl-baselines3-zoo to commit 7dcbff7e74e7a12c052452181ff353a4dbed313a\n",
+ " Running command git submodule update --init --recursive -q\n",
+ " Installing build dependencies ... \u001b[?25ldone\n",
+ "\u001b[?25h Getting requirements to build wheel ... \u001b[?25ldone\n",
+ "\u001b[?25h Preparing metadata (pyproject.toml) ... \u001b[?25ldone\n",
+ "\u001b[?25hCollecting sb3-contrib>=2.0.0a9 (from rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for sb3-contrib>=2.0.0a9 from https://files.pythonhosted.org/packages/01/d1/5dcefd81d358d74798ba2cde7718f2f9955f7ff80854ed4392ab7f569067/sb3_contrib-2.1.0-py3-none-any.whl.metadata\n",
+ " Downloading sb3_contrib-2.1.0-py3-none-any.whl.metadata (3.6 kB)\n",
+ "Collecting gym==0.26.2 (from rl-zoo3==2.0.0a9)\n",
+ " Downloading gym-0.26.2.tar.gz (721 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m721.7/721.7 kB\u001b[0m \u001b[31m11.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0ma \u001b[36m0:00:01\u001b[0m\n",
+ "\u001b[?25h Installing build dependencies ... \u001b[?25ldone\n",
+ "\u001b[?25h Getting requirements to build wheel ... \u001b[?25ldone\n",
+ "\u001b[?25h Preparing metadata (pyproject.toml) ... \u001b[?25ldone\n",
+ "\u001b[?25hCollecting huggingface-sb3>=2.2.1 (from rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for huggingface-sb3>=2.2.1 from https://files.pythonhosted.org/packages/ca/71/ed75cf1113a80a1a79628c7a27aa185e64e8010295bc7cc399b5d2305801/huggingface_sb3-2.3-py3-none-any.whl.metadata\n",
+ " Downloading huggingface_sb3-2.3-py3-none-any.whl.metadata (6.2 kB)\n",
+ "Collecting tqdm (from rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for tqdm from https://files.pythonhosted.org/packages/00/e5/f12a80907d0884e6dff9c16d0c0114d81b8cd07dc3ae54c5e962cc83037e/tqdm-4.66.1-py3-none-any.whl.metadata\n",
+ " Downloading tqdm-4.66.1-py3-none-any.whl.metadata (57 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m57.6/57.6 kB\u001b[0m \u001b[31m336.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hCollecting rich (from rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for rich from https://files.pythonhosted.org/packages/8d/5f/21a93b2ec205f4b79853ff6e838e3c99064d5dbe85ec6b05967506f14af0/rich-13.5.2-py3-none-any.whl.metadata\n",
+ " Downloading rich-13.5.2-py3-none-any.whl.metadata (18 kB)\n",
+ "Collecting optuna (from rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for optuna from https://files.pythonhosted.org/packages/69/60/87a06ef66b34cbe2f2eb0ab66f003664404a7f40c21403a69fad7e28a82b/optuna-3.3.0-py3-none-any.whl.metadata\n",
+ " Downloading optuna-3.3.0-py3-none-any.whl.metadata (17 kB)\n",
+ "Collecting pyyaml>=5.1 (from rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for pyyaml>=5.1 from https://files.pythonhosted.org/packages/7d/39/472f2554a0f1e825bd7c5afc11c817cd7a2f3657460f7159f691fbb37c51/PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata\n",
+ " Downloading PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (2.1 kB)\n",
+ "Collecting pytablewriter~=0.64 (from rl-zoo3==2.0.0a9)\n",
+ " Downloading pytablewriter-0.64.2-py3-none-any.whl (106 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m106.6/106.6 kB\u001b[0m \u001b[31m385.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hCollecting numpy>=1.18.0 (from gym==0.26.2->rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for numpy>=1.18.0 from https://files.pythonhosted.org/packages/69/1f/c95b1108a9972a52d7b1b63ed8ca70466b59b8c1811bd121f1e667cc45d8/numpy-1.25.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata\n",
+ " Downloading numpy-1.25.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (5.6 kB)\n",
+ "Collecting cloudpickle>=1.2.0 (from gym==0.26.2->rl-zoo3==2.0.0a9)\n",
+ " Downloading cloudpickle-2.2.1-py3-none-any.whl (25 kB)\n",
+ "Collecting gym-notices>=0.0.4 (from gym==0.26.2->rl-zoo3==2.0.0a9)\n",
+ " Downloading gym_notices-0.0.8-py3-none-any.whl (3.0 kB)\n",
+ "Collecting importlib-metadata>=4.8.0 (from gym==0.26.2->rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for importlib-metadata>=4.8.0 from https://files.pythonhosted.org/packages/cc/37/db7ba97e676af155f5fcb1a35466f446eadc9104e25b83366e8088c9c926/importlib_metadata-6.8.0-py3-none-any.whl.metadata\n",
+ " Downloading importlib_metadata-6.8.0-py3-none-any.whl.metadata (5.1 kB)\n",
+ "Collecting huggingface-hub~=0.8 (from huggingface-sb3>=2.2.1->rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for huggingface-hub~=0.8 from https://files.pythonhosted.org/packages/7f/c4/adcbe9a696c135578cabcbdd7331332daad4d49b7c43688bc2d36b3a47d2/huggingface_hub-0.16.4-py3-none-any.whl.metadata\n",
+ " Downloading huggingface_hub-0.16.4-py3-none-any.whl.metadata (12 kB)\n",
+ "Collecting wasabi (from huggingface-sb3>=2.2.1->rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for wasabi from https://files.pythonhosted.org/packages/8f/69/26cbf0bad11703241cb84d5324d868097f7a8faf2f1888354dac8883f3fc/wasabi-1.1.2-py3-none-any.whl.metadata\n",
+ " Downloading wasabi-1.1.2-py3-none-any.whl.metadata (28 kB)\n",
+ "Requirement already satisfied: setuptools>=38.3.0 in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from pytablewriter~=0.64->rl-zoo3==2.0.0a9) (68.0.0)\n",
+ "Collecting DataProperty<2,>=0.55.0 (from pytablewriter~=0.64->rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for DataProperty<2,>=0.55.0 from https://files.pythonhosted.org/packages/b1/3b/90ebd66ad57c588d6087e86e327436343e9cc60776a9445b79c6e80a022d/DataProperty-1.0.1-py3-none-any.whl.metadata\n",
+ " Downloading DataProperty-1.0.1-py3-none-any.whl.metadata (11 kB)\n",
+ "Collecting mbstrdecoder<2,>=1.0.0 (from pytablewriter~=0.64->rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for mbstrdecoder<2,>=1.0.0 from https://files.pythonhosted.org/packages/c2/0f/726229136022b154895138bb10ba35e8435c4143f614cb5ad4d4e3fc21ec/mbstrdecoder-1.1.3-py3-none-any.whl.metadata\n",
+ " Downloading mbstrdecoder-1.1.3-py3-none-any.whl.metadata (4.0 kB)\n",
+ "Collecting pathvalidate<3,>=2.3.0 (from pytablewriter~=0.64->rl-zoo3==2.0.0a9)\n",
+ " Downloading pathvalidate-2.5.2-py3-none-any.whl (20 kB)\n",
+ "Collecting tabledata<2,>=1.3.0 (from pytablewriter~=0.64->rl-zoo3==2.0.0a9)\n",
+ " Downloading tabledata-1.3.1-py3-none-any.whl (11 kB)\n",
+ "Collecting tcolorpy<1,>=0.0.5 (from pytablewriter~=0.64->rl-zoo3==2.0.0a9)\n",
+ " Downloading tcolorpy-0.1.3-py3-none-any.whl (7.9 kB)\n",
+ "Collecting typepy[datetime]<2,>=1.2.0 (from pytablewriter~=0.64->rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for typepy[datetime]<2,>=1.2.0 from https://files.pythonhosted.org/packages/7f/31/0c7a66aa315cc8b2d1915fdd163283ba704307d7c0cf15b31e08c51aedba/typepy-1.3.1-py3-none-any.whl.metadata\n",
+ " Downloading typepy-1.3.1-py3-none-any.whl.metadata (9.3 kB)\n",
+ "Collecting stable-baselines3>=2.1.0 (from sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for stable-baselines3>=2.1.0 from https://files.pythonhosted.org/packages/5e/81/7a0fbfc45240ec36cc3fcfe8f135996ef03277e2305d941a6d9186eb14e8/stable_baselines3-2.1.0-py3-none-any.whl.metadata\n",
+ " Downloading stable_baselines3-2.1.0-py3-none-any.whl.metadata (5.2 kB)\n",
+ "Collecting alembic>=1.5.0 (from optuna->rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for alembic>=1.5.0 from https://files.pythonhosted.org/packages/ab/7d/b572fc6a51bc430b1fa0ef59591db32b14105093324d472eed8ea296d2df/alembic-1.11.3-py3-none-any.whl.metadata\n",
+ " Downloading alembic-1.11.3-py3-none-any.whl.metadata (7.2 kB)\n",
+ "Collecting cmaes>=0.10.0 (from optuna->rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for cmaes>=0.10.0 from https://files.pythonhosted.org/packages/f7/46/7d9544d453346f6c0c405916c95fdb653491ea2e9976cabb810ba2fe8cd4/cmaes-0.10.0-py3-none-any.whl.metadata\n",
+ " Downloading cmaes-0.10.0-py3-none-any.whl.metadata (19 kB)\n",
+ "Collecting colorlog (from optuna->rl-zoo3==2.0.0a9)\n",
+ " Downloading colorlog-6.7.0-py2.py3-none-any.whl (11 kB)\n",
+ "Requirement already satisfied: packaging>=20.0 in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from optuna->rl-zoo3==2.0.0a9) (23.1)\n",
+ "Collecting sqlalchemy>=1.3.0 (from optuna->rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for sqlalchemy>=1.3.0 from https://files.pythonhosted.org/packages/91/2b/92aadcea86b9ebd681de0b6b2cbfa75193227e607893cfb5feea0cefc461/SQLAlchemy-2.0.20-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata\n",
+ " Downloading SQLAlchemy-2.0.20-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (9.4 kB)\n",
+ "Collecting markdown-it-py>=2.2.0 (from rich->rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for markdown-it-py>=2.2.0 from https://files.pythonhosted.org/packages/42/d7/1ec15b46af6af88f19b8e5ffea08fa375d433c998b8a7639e76935c14f1f/markdown_it_py-3.0.0-py3-none-any.whl.metadata\n",
+ " Downloading markdown_it_py-3.0.0-py3-none-any.whl.metadata (6.9 kB)\n",
+ "Requirement already satisfied: pygments<3.0.0,>=2.13.0 in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from rich->rl-zoo3==2.0.0a9) (2.16.1)\n",
+ "Collecting Mako (from alembic>=1.5.0->optuna->rl-zoo3==2.0.0a9)\n",
+ " Downloading Mako-1.2.4-py3-none-any.whl (78 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m78.7/78.7 kB\u001b[0m \u001b[31m360.0 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hRequirement already satisfied: typing-extensions>=4 in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from alembic>=1.5.0->optuna->rl-zoo3==2.0.0a9) (4.7.1)\n",
+ "Collecting filelock (from huggingface-hub~=0.8->huggingface-sb3>=2.2.1->rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for filelock from https://files.pythonhosted.org/packages/52/90/45223db4e1df30ff14e8aebf9a1bf0222da2e7b49e53692c968f36817812/filelock-3.12.3-py3-none-any.whl.metadata\n",
+ " Downloading filelock-3.12.3-py3-none-any.whl.metadata (2.7 kB)\n",
+ "Collecting fsspec (from huggingface-hub~=0.8->huggingface-sb3>=2.2.1->rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for fsspec from https://files.pythonhosted.org/packages/e3/bd/4c0a4619494188a9db5d77e2100ab7d544a42e76b2447869d8e124e981d8/fsspec-2023.6.0-py3-none-any.whl.metadata\n",
+ " Downloading fsspec-2023.6.0-py3-none-any.whl.metadata (6.7 kB)\n",
+ "Collecting requests (from huggingface-hub~=0.8->huggingface-sb3>=2.2.1->rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for requests from https://files.pythonhosted.org/packages/70/8e/0e2d847013cb52cd35b38c009bb167a1a26b2ce6cd6965bf26b47bc0bf44/requests-2.31.0-py3-none-any.whl.metadata\n",
+ " Downloading requests-2.31.0-py3-none-any.whl.metadata (4.6 kB)\n",
+ "Collecting zipp>=0.5 (from importlib-metadata>=4.8.0->gym==0.26.2->rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for zipp>=0.5 from https://files.pythonhosted.org/packages/8c/08/d3006317aefe25ea79d3b76c9650afabaf6d63d1c8443b236e7405447503/zipp-3.16.2-py3-none-any.whl.metadata\n",
+ " Downloading zipp-3.16.2-py3-none-any.whl.metadata (3.7 kB)\n",
+ "Collecting mdurl~=0.1 (from markdown-it-py>=2.2.0->rich->rl-zoo3==2.0.0a9)\n",
+ " Downloading mdurl-0.1.2-py3-none-any.whl (10.0 kB)\n",
+ "Collecting chardet<6,>=3.0.4 (from mbstrdecoder<2,>=1.0.0->pytablewriter~=0.64->rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for chardet<6,>=3.0.4 from https://files.pythonhosted.org/packages/38/6f/f5fbc992a329ee4e0f288c1fe0e2ad9485ed064cac731ed2fe47dcc38cbf/chardet-5.2.0-py3-none-any.whl.metadata\n",
+ " Downloading chardet-5.2.0-py3-none-any.whl.metadata (3.4 kB)\n",
+ "Collecting greenlet!=0.4.17 (from sqlalchemy>=1.3.0->optuna->rl-zoo3==2.0.0a9)\n",
+ " Downloading greenlet-2.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (610 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m610.9/610.9 kB\u001b[0m \u001b[31m83.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hCollecting gymnasium<0.30,>=0.28.1 (from stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for gymnasium<0.30,>=0.28.1 from https://files.pythonhosted.org/packages/a8/4d/3cbfd81ed84db450dbe73a89afcd8bc405273918415649ac6683356afe92/gymnasium-0.29.1-py3-none-any.whl.metadata\n",
+ " Downloading gymnasium-0.29.1-py3-none-any.whl.metadata (10 kB)\n",
+ "Collecting torch>=1.13 (from stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
+ " Downloading torch-2.0.1-cp39-cp39-manylinux1_x86_64.whl (619.9 MB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m619.9/619.9 MB\u001b[0m \u001b[31m47.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m00:01\u001b[0m00:01\u001b[0m\n",
+ "\u001b[?25hCollecting pandas (from stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for pandas from https://files.pythonhosted.org/packages/83/f0/2765daac3c58165460b127df5c0ef7b3a039f3bfe7ea7a51f3d20b01371b/pandas-2.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata\n",
+ " Downloading pandas-2.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (18 kB)\n",
+ "Collecting matplotlib (from stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
+ " Obtaining dependency information for matplotlib from https://files.pythonhosted.org/packages/47/b9/6c0daa9b953a80b4e6933bf6a11a2d0633f257e84ee5995c5fd35de564c9/matplotlib-3.7.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata\n",
+ " Downloading matplotlib-3.7.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (5.6 kB)\n",
+ "Requirement already satisfied: python-dateutil<3.0.0,>=2.8.0 in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from typepy[datetime]<2,>=1.2.0->pytablewriter~=0.64->rl-zoo3==2.0.0a9) (2.8.2)\n",
+ "Collecting pytz>=2018.9 (from typepy[datetime]<2,>=1.2.0->pytablewriter~=0.64->rl-zoo3==2.0.0a9)\n",
+ " Downloading pytz-2023.3-py2.py3-none-any.whl (502 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m502.3/502.3 kB\u001b[0m \u001b[31m72.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hCollecting farama-notifications>=0.0.1 (from gymnasium<0.30,>=0.28.1->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
+ " Downloading Farama_Notifications-0.0.4-py3-none-any.whl (2.5 kB)\n",
+ "Requirement already satisfied: six>=1.5 in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from python-dateutil<3.0.0,>=2.8.0->typepy[datetime]<2,>=1.2.0->pytablewriter~=0.64->rl-zoo3==2.0.0a9) (1.16.0)\n",
+ "Collecting sympy (from torch>=1.13->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
+ " Downloading sympy-1.12-py3-none-any.whl (5.7 MB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m5.7/5.7 MB\u001b[0m \u001b[31m51.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0ma \u001b[36m0:00:01\u001b[0m\n",
+ "\u001b[?25hCollecting networkx (from torch>=1.13->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
+ " Downloading networkx-3.1-py3-none-any.whl (2.1 MB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m2.1/2.1 MB\u001b[0m \u001b[31m57.0 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0ma \u001b[36m0:00:01\u001b[0m\n",
+ "\u001b[?25hCollecting jinja2 (from torch>=1.13->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
+ " Downloading Jinja2-3.1.2-py3-none-any.whl (133 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m133.1/133.1 kB\u001b[0m \u001b[31m377.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hCollecting nvidia-cuda-nvrtc-cu11==11.7.99 (from torch>=1.13->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
+ " Downloading nvidia_cuda_nvrtc_cu11-11.7.99-2-py3-none-manylinux1_x86_64.whl (21.0 MB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m21.0/21.0 MB\u001b[0m \u001b[31m50.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m00:01\u001b[0m00:01\u001b[0m\n",
+ "\u001b[?25hCollecting nvidia-cuda-runtime-cu11==11.7.99 (from torch>=1.13->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
+ " Downloading nvidia_cuda_runtime_cu11-11.7.99-py3-none-manylinux1_x86_64.whl (849 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m849.3/849.3 kB\u001b[0m \u001b[31m61.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hCollecting nvidia-cuda-cupti-cu11==11.7.101 (from torch>=1.13->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
+ " Downloading nvidia_cuda_cupti_cu11-11.7.101-py3-none-manylinux1_x86_64.whl (11.8 MB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m11.8/11.8 MB\u001b[0m \u001b[31m46.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m00:01\u001b[0m00:01\u001b[0m\n",
+ "\u001b[?25hCollecting nvidia-cudnn-cu11==8.5.0.96 (from torch>=1.13->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
+ " Downloading nvidia_cudnn_cu11-8.5.0.96-2-py3-none-manylinux1_x86_64.whl (557.1 MB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m557.1/557.1 MB\u001b[0m \u001b[31m21.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m00:01\u001b[0m00:01\u001b[0m\n",
+ "\u001b[?25hCollecting nvidia-cublas-cu11==11.10.3.66 (from torch>=1.13->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
+ " Downloading nvidia_cublas_cu11-11.10.3.66-py3-none-manylinux1_x86_64.whl (317.1 MB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m317.1/317.1 MB\u001b[0m \u001b[31m43.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m00:01\u001b[0m00:01\u001b[0m\n",
+ "\u001b[?25hCollecting nvidia-cufft-cu11==10.9.0.58 (from torch>=1.13->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
+ " Downloading nvidia_cufft_cu11-10.9.0.58-py3-none-manylinux1_x86_64.whl (168.4 MB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m168.4/168.4 MB\u001b[0m \u001b[31m52.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m00:01\u001b[0m00:01\u001b[0m\n",
+ "\u001b[?25hCollecting nvidia-curand-cu11==10.2.10.91 (from torch>=1.13->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
+ " Downloading nvidia_curand_cu11-10.2.10.91-py3-none-manylinux1_x86_64.whl (54.6 MB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m54.6/54.6 MB\u001b[0m \u001b[31m57.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0ma \u001b[36m0:00:01\u001b[0m\n",
+ "\u001b[?25hCollecting nvidia-cusolver-cu11==11.4.0.1 (from torch>=1.13->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
+ " Downloading nvidia_cusolver_cu11-11.4.0.1-2-py3-none-manylinux1_x86_64.whl (102.6 MB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m102.6/102.6 MB\u001b[0m \u001b[31m66.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m00:01\u001b[0m00:01\u001b[0m\n",
+ "\u001b[?25hCollecting nvidia-cusparse-cu11==11.7.4.91 (from torch>=1.13->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
+ " Downloading nvidia_cusparse_cu11-11.7.4.91-py3-none-manylinux1_x86_64.whl (173.2 MB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m173.2/173.2 MB\u001b[0m \u001b[31m31.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m00:01\u001b[0m00:01\u001b[0m\n",
+ "\u001b[?25hCollecting nvidia-nccl-cu11==2.14.3 (from torch>=1.13->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
365
+ " Downloading nvidia_nccl_cu11-2.14.3-py3-none-manylinux1_x86_64.whl (177.1 MB)\n",
366
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m177.1/177.1 MB\u001b[0m \u001b[31m36.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m00:01\u001b[0m00:01\u001b[0m\n",
367
+ "\u001b[?25hCollecting nvidia-nvtx-cu11==11.7.91 (from torch>=1.13->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
368
+ " Downloading nvidia_nvtx_cu11-11.7.91-py3-none-manylinux1_x86_64.whl (98 kB)\n",
369
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m98.6/98.6 kB\u001b[0m \u001b[31m393.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
370
+ "\u001b[?25hCollecting triton==2.0.0 (from torch>=1.13->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
371
+ " Downloading triton-2.0.0-1-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (63.3 MB)\n",
372
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m63.3/63.3 MB\u001b[0m \u001b[31m38.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m00:01\u001b[0m00:01\u001b[0m\n",
373
+ "\u001b[?25hRequirement already satisfied: wheel in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from nvidia-cublas-cu11==11.10.3.66->torch>=1.13->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9) (0.38.4)\n",
374
+ "Collecting cmake (from triton==2.0.0->torch>=1.13->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
375
+ " Obtaining dependency information for cmake from https://files.pythonhosted.org/packages/2e/51/3a4672a819b4532a378bfefad8f886cfe71057556e0d4eefb64523fd370a/cmake-3.27.2-py2.py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl.metadata\n",
376
+ " Downloading cmake-3.27.2-py2.py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl.metadata (6.7 kB)\n",
377
+ "Collecting lit (from triton==2.0.0->torch>=1.13->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
378
+ " Downloading lit-16.0.6.tar.gz (153 kB)\n",
379
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m153.7/153.7 kB\u001b[0m \u001b[31m348.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
380
+ "\u001b[?25h Installing build dependencies ... \u001b[?25ldone\n",
381
+ "\u001b[?25h Getting requirements to build wheel ... \u001b[?25ldone\n",
382
+ "\u001b[?25h Installing backend dependencies ... \u001b[?25ldone\n",
383
+ "\u001b[?25h Preparing metadata (pyproject.toml) ... \u001b[?25ldone\n",
384
+ "\u001b[?25hCollecting MarkupSafe>=0.9.2 (from Mako->alembic>=1.5.0->optuna->rl-zoo3==2.0.0a9)\n",
385
+ " Obtaining dependency information for MarkupSafe>=0.9.2 from https://files.pythonhosted.org/packages/de/63/cb7e71984e9159ec5f45b5e81e896c8bdd0e45fe3fc6ce02ab497f0d790e/MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata\n",
386
+ " Downloading MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (3.0 kB)\n",
387
+ "Collecting contourpy>=1.0.1 (from matplotlib->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
388
+ " Obtaining dependency information for contourpy>=1.0.1 from https://files.pythonhosted.org/packages/38/6f/5382bdff9dda60cb17cef6dfa2bad3e6edacffd5c2243e282e851c63f721/contourpy-1.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata\n",
389
+ " Downloading contourpy-1.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (5.7 kB)\n",
390
+ "Collecting cycler>=0.10 (from matplotlib->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
391
+ " Downloading cycler-0.11.0-py3-none-any.whl (6.4 kB)\n",
392
+ "Collecting fonttools>=4.22.0 (from matplotlib->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
393
+ " Obtaining dependency information for fonttools>=4.22.0 from https://files.pythonhosted.org/packages/49/50/2e31753c088d364756daa5bed0dab6a5928ebfd6e6d26f975c8b6d6f754a/fonttools-4.42.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata\n",
394
+ " Downloading fonttools-4.42.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (150 kB)\n",
395
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m151.0/151.0 kB\u001b[0m \u001b[31m431.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
396
+ "\u001b[?25hCollecting kiwisolver>=1.0.1 (from matplotlib->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
397
+ " Obtaining dependency information for kiwisolver>=1.0.1 from https://files.pythonhosted.org/packages/c0/a8/841594f11d0b88d8aeb26991bc4dac38baa909dc58d0c4262a4f7893bcbf/kiwisolver-1.4.5-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl.metadata\n",
398
+ " Downloading kiwisolver-1.4.5-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl.metadata (6.4 kB)\n",
399
+ "Collecting pillow>=6.2.0 (from matplotlib->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
400
+ " Obtaining dependency information for pillow>=6.2.0 from https://files.pythonhosted.org/packages/50/e5/0d484d1ac71b934638f91b7156203ba5bf3eb12f596b616a68a85c123808/Pillow-10.0.0-cp39-cp39-manylinux_2_28_x86_64.whl.metadata\n",
401
+ " Downloading Pillow-10.0.0-cp39-cp39-manylinux_2_28_x86_64.whl.metadata (9.5 kB)\n",
402
+ "Collecting pyparsing<3.1,>=2.3.1 (from matplotlib->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
403
+ " Downloading pyparsing-3.0.9-py3-none-any.whl (98 kB)\n",
404
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m98.3/98.3 kB\u001b[0m \u001b[31m387.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
405
+ "\u001b[?25hCollecting importlib-resources>=3.2.0 (from matplotlib->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
406
+ " Obtaining dependency information for importlib-resources>=3.2.0 from https://files.pythonhosted.org/packages/25/d4/592f53ce2f8dde8be5720851bd0ab71cc2e76c55978e4163ef1ab7e389bb/importlib_resources-6.0.1-py3-none-any.whl.metadata\n",
407
+ " Downloading importlib_resources-6.0.1-py3-none-any.whl.metadata (4.0 kB)\n",
408
+ "Collecting tzdata>=2022.1 (from pandas->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
409
+ " Downloading tzdata-2023.3-py2.py3-none-any.whl (341 kB)\n",
410
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m341.8/341.8 kB\u001b[0m \u001b[31m91.9 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
411
+ "\u001b[?25hCollecting charset-normalizer<4,>=2 (from requests->huggingface-hub~=0.8->huggingface-sb3>=2.2.1->rl-zoo3==2.0.0a9)\n",
412
+ " Obtaining dependency information for charset-normalizer<4,>=2 from https://files.pythonhosted.org/packages/f9/0d/514be8597d7a96243e5467a37d337b9399cec117a513fcf9328405d911c0/charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata\n",
413
+ " Downloading charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (31 kB)\n",
414
+ "Collecting idna<4,>=2.5 (from requests->huggingface-hub~=0.8->huggingface-sb3>=2.2.1->rl-zoo3==2.0.0a9)\n",
415
+ " Downloading idna-3.4-py3-none-any.whl (61 kB)\n",
416
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m61.5/61.5 kB\u001b[0m \u001b[31m344.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
417
+ "\u001b[?25hCollecting urllib3<3,>=1.21.1 (from requests->huggingface-hub~=0.8->huggingface-sb3>=2.2.1->rl-zoo3==2.0.0a9)\n",
418
+ " Obtaining dependency information for urllib3<3,>=1.21.1 from https://files.pythonhosted.org/packages/9b/81/62fd61001fa4b9d0df6e31d47ff49cfa9de4af03adecf339c7bc30656b37/urllib3-2.0.4-py3-none-any.whl.metadata\n",
419
+ " Downloading urllib3-2.0.4-py3-none-any.whl.metadata (6.6 kB)\n",
420
+ "Collecting certifi>=2017.4.17 (from requests->huggingface-hub~=0.8->huggingface-sb3>=2.2.1->rl-zoo3==2.0.0a9)\n",
421
+ " Obtaining dependency information for certifi>=2017.4.17 from https://files.pythonhosted.org/packages/4c/dd/2234eab22353ffc7d94e8d13177aaa050113286e93e7b40eae01fbf7c3d9/certifi-2023.7.22-py3-none-any.whl.metadata\n",
422
+ " Downloading certifi-2023.7.22-py3-none-any.whl.metadata (2.2 kB)\n",
423
+ "Collecting mpmath>=0.19 (from sympy->torch>=1.13->stable-baselines3>=2.1.0->sb3-contrib>=2.0.0a9->rl-zoo3==2.0.0a9)\n",
424
+ " Downloading mpmath-1.3.0-py3-none-any.whl (536 kB)\n",
425
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m536.2/536.2 kB\u001b[0m \u001b[31m62.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
426
+ "\u001b[?25hDownloading huggingface_sb3-2.3-py3-none-any.whl (9.6 kB)\n",
427
+ "Downloading PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (738 kB)\n",
428
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m738.9/738.9 kB\u001b[0m \u001b[31m42.2 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
429
+ "\u001b[?25hDownloading sb3_contrib-2.1.0-py3-none-any.whl (80 kB)\n",
430
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m80.3/80.3 kB\u001b[0m \u001b[31m269.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
431
+ "\u001b[?25hDownloading optuna-3.3.0-py3-none-any.whl (404 kB)\n",
432
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m404.2/404.2 kB\u001b[0m \u001b[31m110.9 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
433
+ "\u001b[?25hDownloading rich-13.5.2-py3-none-any.whl (239 kB)\n",
434
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m239.7/239.7 kB\u001b[0m \u001b[31m170.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
435
+ "\u001b[?25hDownloading tqdm-4.66.1-py3-none-any.whl (78 kB)\n",
436
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m78.3/78.3 kB\u001b[0m \u001b[31m407.2 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
437
+ "\u001b[?25hDownloading alembic-1.11.3-py3-none-any.whl (225 kB)\n",
438
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m225.4/225.4 kB\u001b[0m \u001b[31m106.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
439
+ "\u001b[?25hDownloading cmaes-0.10.0-py3-none-any.whl (29 kB)\n",
440
+ "Downloading DataProperty-1.0.1-py3-none-any.whl (27 kB)\n",
441
+ "Downloading huggingface_hub-0.16.4-py3-none-any.whl (268 kB)\n",
442
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m268.8/268.8 kB\u001b[0m \u001b[31m83.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
443
+ "\u001b[?25hDownloading importlib_metadata-6.8.0-py3-none-any.whl (22 kB)\n",
444
+ "Downloading markdown_it_py-3.0.0-py3-none-any.whl (87 kB)\n",
445
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m87.5/87.5 kB\u001b[0m \u001b[31m335.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
446
+ "\u001b[?25hDownloading mbstrdecoder-1.1.3-py3-none-any.whl (7.8 kB)\n",
447
+ "Downloading numpy-1.25.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (18.3 MB)\n",
448
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m18.3/18.3 MB\u001b[0m \u001b[31m39.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0ma \u001b[36m0:00:01\u001b[0m\n",
449
+ "\u001b[?25hDownloading SQLAlchemy-2.0.20-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)\n",
450
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m3.0/3.0 MB\u001b[0m \u001b[31m42.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0ma \u001b[36m0:00:01\u001b[0m\n",
451
+ "\u001b[?25hDownloading stable_baselines3-2.1.0-py3-none-any.whl (178 kB)\n",
452
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m178.7/178.7 kB\u001b[0m \u001b[31m289.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
453
+ "\u001b[?25hDownloading wasabi-1.1.2-py3-none-any.whl (27 kB)\n",
454
+ "Downloading chardet-5.2.0-py3-none-any.whl (199 kB)\n",
455
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m199.4/199.4 kB\u001b[0m \u001b[31m179.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
456
+ "\u001b[?25hDownloading gymnasium-0.29.1-py3-none-any.whl (953 kB)\n",
457
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m953.9/953.9 kB\u001b[0m \u001b[31m40.0 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
458
+ "\u001b[?25hDownloading typepy-1.3.1-py3-none-any.whl (31 kB)\n",
459
+ "Downloading zipp-3.16.2-py3-none-any.whl (7.2 kB)\n",
460
+ "Downloading filelock-3.12.3-py3-none-any.whl (11 kB)\n",
461
+ "Downloading fsspec-2023.6.0-py3-none-any.whl (163 kB)\n",
462
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m163.8/163.8 kB\u001b[0m \u001b[31m78.9 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
463
+ "\u001b[?25hDownloading matplotlib-3.7.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (11.6 MB)\n",
464
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m11.6/11.6 MB\u001b[0m \u001b[31m30.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0ma \u001b[36m0:00:01\u001b[0m\n",
465
+ "\u001b[?25hDownloading pandas-2.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (12.7 MB)\n",
466
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m12.7/12.7 MB\u001b[0m \u001b[31m24.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m00:01\u001b[0m00:01\u001b[0m\n",
467
+ "\u001b[?25hDownloading requests-2.31.0-py3-none-any.whl (62 kB)\n",
468
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m62.6/62.6 kB\u001b[0m \u001b[31m333.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
469
+ "\u001b[?25hDownloading certifi-2023.7.22-py3-none-any.whl (158 kB)\n",
470
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m158.3/158.3 kB\u001b[0m \u001b[31m56.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
471
+ "\u001b[?25hDownloading charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (202 kB)\n",
472
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m202.1/202.1 kB\u001b[0m \u001b[31m56.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
473
+ "\u001b[?25hDownloading contourpy-1.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (300 kB)\n",
474
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m300.4/300.4 kB\u001b[0m \u001b[31m49.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
475
+ "\u001b[?25hDownloading fonttools-4.42.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.5 MB)\n",
476
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m4.5/4.5 MB\u001b[0m \u001b[31m23.9 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0ma \u001b[36m0:00:01\u001b[0m\n",
477
+ "\u001b[?25hDownloading importlib_resources-6.0.1-py3-none-any.whl (34 kB)\n",
478
+ "Downloading kiwisolver-1.4.5-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (1.6 MB)\n",
479
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m1.6/1.6 MB\u001b[0m \u001b[31m24.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0ma \u001b[36m0:00:01\u001b[0m\n",
480
+ "\u001b[?25hDownloading MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (25 kB)\n",
481
+ "Downloading Pillow-10.0.0-cp39-cp39-manylinux_2_28_x86_64.whl (3.4 MB)\n",
482
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m3.4/3.4 MB\u001b[0m \u001b[31m24.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m00:01\u001b[0m00:01\u001b[0m\n",
483
+ "\u001b[?25hDownloading urllib3-2.0.4-py3-none-any.whl (123 kB)\n",
484
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m123.9/123.9 kB\u001b[0m \u001b[31m77.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
485
+ "\u001b[?25hDownloading cmake-3.27.2-py2.py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (26.1 MB)\n",
486
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m26.1/26.1 MB\u001b[0m \u001b[31m26.0 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m00:01\u001b[0m00:01\u001b[0m\n",
487
+ "\u001b[?25hBuilding wheels for collected packages: rl-zoo3, gym, lit\n",
488
+ " Building wheel for rl-zoo3 (pyproject.toml) ... \u001b[?25ldone\n",
489
+ "\u001b[?25h Created wheel for rl-zoo3: filename=rl_zoo3-2.0.0a9-py3-none-any.whl size=76401 sha256=353bea9860f77205fe25632e85dae5e55adf610df4c5bfda0669069404ebc74c\n",
490
+ " Stored in directory: /tmp/pip-ephem-wheel-cache-l6mgejm0/wheels/2e/72/ca/842315ce52754f44dbc51302f9a394003c573ce992b71bce0e\n",
491
+ " Building wheel for gym (pyproject.toml) ... \u001b[?25ldone\n",
492
+ "\u001b[?25h Created wheel for gym: filename=gym-0.26.2-py3-none-any.whl size=827620 sha256=a2aa9bf3431831fab18c0fa3b10d66ea971df439509be25a2e55c49d07fed39d\n",
493
+ " Stored in directory: /tmp/pip-ephem-wheel-cache-l6mgejm0/wheels/af/2b/30/5e78b8b9599f2a2286a582b8da80594f654bf0e18d825a4405\n",
494
+ " Building wheel for lit (pyproject.toml) ... \u001b[?25ldone\n",
495
+ "\u001b[?25h Created wheel for lit: filename=lit-16.0.6-py3-none-any.whl size=93584 sha256=2f714142002d212723435f09c3ed2f28fadaf9c94f2df889dadf3771af353229\n",
496
+ " Stored in directory: /tmp/pip-ephem-wheel-cache-l6mgejm0/wheels/a5/36/d6/cac2e6fb891889b33a548f2fddb8b4b7726399aaa2ed32b188\n",
497
+ "Successfully built rl-zoo3 gym lit\n",
498
+ "Installing collected packages: pytz, mpmath, lit, gym-notices, farama-notifications, cmake, zipp, wasabi, urllib3, tzdata, tqdm, tcolorpy, sympy, pyyaml, pyparsing, pillow, pathvalidate, nvidia-nvtx-cu11, nvidia-nccl-cu11, nvidia-cusparse-cu11, nvidia-curand-cu11, nvidia-cufft-cu11, nvidia-cuda-runtime-cu11, nvidia-cuda-nvrtc-cu11, nvidia-cuda-cupti-cu11, nvidia-cublas-cu11, numpy, networkx, mdurl, MarkupSafe, kiwisolver, idna, greenlet, fsspec, fonttools, filelock, cycler, colorlog, cloudpickle, charset-normalizer, chardet, certifi, sqlalchemy, requests, pandas, nvidia-cusolver-cu11, nvidia-cudnn-cu11, mbstrdecoder, markdown-it-py, Mako, jinja2, importlib-resources, importlib-metadata, contourpy, cmaes, typepy, rich, matplotlib, huggingface-hub, gymnasium, gym, alembic, optuna, huggingface-sb3, DataProperty, tabledata, pytablewriter, triton, torch, stable-baselines3, sb3-contrib, rl-zoo3\n",
499
+ "Successfully installed DataProperty-1.0.1 Mako-1.2.4 MarkupSafe-2.1.3 alembic-1.11.3 certifi-2023.7.22 chardet-5.2.0 charset-normalizer-3.2.0 cloudpickle-2.2.1 cmaes-0.10.0 cmake-3.27.2 colorlog-6.7.0 contourpy-1.1.0 cycler-0.11.0 farama-notifications-0.0.4 filelock-3.12.3 fonttools-4.42.1 fsspec-2023.6.0 greenlet-2.0.2 gym-0.26.2 gym-notices-0.0.8 gymnasium-0.29.1 huggingface-hub-0.16.4 huggingface-sb3-2.3 idna-3.4 importlib-metadata-6.8.0 importlib-resources-6.0.1 jinja2-3.1.2 kiwisolver-1.4.5 lit-16.0.6 markdown-it-py-3.0.0 matplotlib-3.7.2 mbstrdecoder-1.1.3 mdurl-0.1.2 mpmath-1.3.0 networkx-3.1 numpy-1.25.2 nvidia-cublas-cu11-11.10.3.66 nvidia-cuda-cupti-cu11-11.7.101 nvidia-cuda-nvrtc-cu11-11.7.99 nvidia-cuda-runtime-cu11-11.7.99 nvidia-cudnn-cu11-8.5.0.96 nvidia-cufft-cu11-10.9.0.58 nvidia-curand-cu11-10.2.10.91 nvidia-cusolver-cu11-11.4.0.1 nvidia-cusparse-cu11-11.7.4.91 nvidia-nccl-cu11-2.14.3 nvidia-nvtx-cu11-11.7.91 optuna-3.3.0 pandas-2.1.0 pathvalidate-2.5.2 pillow-10.0.0 pyparsing-3.0.9 pytablewriter-0.64.2 pytz-2023.3 pyyaml-6.0.1 requests-2.31.0 rich-13.5.2 rl-zoo3-2.0.0a9 sb3-contrib-2.1.0 sqlalchemy-2.0.20 stable-baselines3-2.1.0 sympy-1.12 tabledata-1.3.1 tcolorpy-0.1.3 torch-2.0.1 tqdm-4.66.1 triton-2.0.0 typepy-1.3.1 tzdata-2023.3 urllib3-2.0.4 wasabi-1.1.2 zipp-3.16.2\n"
500
+ ]
501
+ }
502
+ ],
503
+ "source": [
504
+ "# For now we install this update of RL-Baselines3 Zoo\n",
505
+ "!pip install git+https://github.com/DLR-RM/rl-baselines3-zoo@update/hf"
506
+ ]
507
+ },
508
+ {
509
+ "cell_type": "markdown",
510
+ "metadata": {
511
+ "id": "p0xe2sJHdtHy"
512
+ },
513
+ "source": [
514
+ "IF AND ONLY IF THE VERSION ABOVE DOES NOT EXIST ANYMORE. UNCOMMENT AND INSTALL THE ONE BELOW"
515
+ ]
516
+ },
517
+ {
518
+ "cell_type": "code",
519
+ "execution_count": null,
520
+ "metadata": {
521
+ "id": "N0d6wy-F-f39"
522
+ },
523
+ "outputs": [],
524
+ "source": [
525
+ "#!pip install rl_zoo3==2.0.0a9"
526
+ ]
527
+ },
528
+ {
529
+ "cell_type": "code",
530
+ "execution_count": 3,
531
+ "metadata": {
532
+ "id": "8_MllY6Om1eI"
533
+ },
534
+ "outputs": [
535
+ {
536
+ "name": "stdout",
537
+ "output_type": "stream",
538
+ "text": [
539
+ "[sudo] password for zhu: \n"
540
+ ]
541
+ }
542
+ ],
543
+ "source": [
544
+ "!apt-get install swig cmake ffmpeg"
545
+ ]
546
+ },
547
+ {
548
+ "cell_type": "markdown",
549
+ "metadata": {
550
+ "id": "4S9mJiKg6SqC"
551
+ },
552
+ "source": [
553
+ "To be able to use Atari games in Gymnasium, we need to install the `atari` package, and `accept-rom-license` to download the ROM files (the game files)."
554
+ ]
555
+ },
556
+ {
557
+ "cell_type": "code",
558
+ "execution_count": 4,
559
+ "metadata": {
560
+ "id": "NsRP-lX1_2fC"
561
+ },
562
+ "outputs": [
563
+ {
564
+ "name": "stdout",
565
+ "output_type": "stream",
566
+ "text": [
567
+ "Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com\n",
568
+ "Requirement already satisfied: gymnasium[atari] in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (0.29.1)\n",
569
+ "Requirement already satisfied: numpy>=1.21.0 in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from gymnasium[atari]) (1.25.2)\n",
570
+ "Requirement already satisfied: cloudpickle>=1.2.0 in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from gymnasium[atari]) (2.2.1)\n",
571
+ "Requirement already satisfied: typing-extensions>=4.3.0 in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from gymnasium[atari]) (4.7.1)\n",
572
+ "Requirement already satisfied: farama-notifications>=0.0.1 in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from gymnasium[atari]) (0.0.4)\n",
573
+ "Requirement already satisfied: importlib-metadata>=4.8.0 in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from gymnasium[atari]) (6.8.0)\n",
574
+ "Collecting shimmy[atari]<1.0,>=0.1.0 (from gymnasium[atari])\n",
575
+ " Downloading Shimmy-0.2.1-py3-none-any.whl (25 kB)\n",
576
+ "Requirement already satisfied: zipp>=0.5 in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from importlib-metadata>=4.8.0->gymnasium[atari]) (3.16.2)\n",
577
+ "Collecting ale-py~=0.8.1 (from shimmy[atari]<1.0,>=0.1.0->gymnasium[atari])\n",
578
+ " Downloading ale_py-0.8.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.7 MB)\n",
579
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m1.7/1.7 MB\u001b[0m \u001b[31m24.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m \u001b[36m0:00:01\u001b[0m\n",
580
+ "\u001b[?25hRequirement already satisfied: importlib-resources in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from ale-py~=0.8.1->shimmy[atari]<1.0,>=0.1.0->gymnasium[atari]) (6.0.1)\n",
581
+ "Installing collected packages: ale-py, shimmy\n",
582
+ "Successfully installed ale-py-0.8.1 shimmy-0.2.1\n",
583
+ "Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com\n",
584
+ "Requirement already satisfied: gymnasium[accept-rom-license] in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (0.29.1)\n",
585
+ "Requirement already satisfied: numpy>=1.21.0 in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from gymnasium[accept-rom-license]) (1.25.2)\n",
586
+ "Requirement already satisfied: cloudpickle>=1.2.0 in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from gymnasium[accept-rom-license]) (2.2.1)\n",
587
+ "Requirement already satisfied: typing-extensions>=4.3.0 in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from gymnasium[accept-rom-license]) (4.7.1)\n",
588
+ "Requirement already satisfied: farama-notifications>=0.0.1 in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from gymnasium[accept-rom-license]) (0.0.4)\n",
589
+ "Requirement already satisfied: importlib-metadata>=4.8.0 in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from gymnasium[accept-rom-license]) (6.8.0)\n",
590
+ "Collecting autorom[accept-rom-license]~=0.4.2 (from gymnasium[accept-rom-license])\n",
591
+ " Downloading AutoROM-0.4.2-py3-none-any.whl (16 kB)\n",
592
+ "Collecting click (from autorom[accept-rom-license]~=0.4.2->gymnasium[accept-rom-license])\n",
593
+ " Obtaining dependency information for click from https://files.pythonhosted.org/packages/00/2e/d53fa4befbf2cfa713304affc7ca780ce4fc1fd8710527771b58311a3229/click-8.1.7-py3-none-any.whl.metadata\n",
594
+ " Downloading click-8.1.7-py3-none-any.whl.metadata (3.0 kB)\n",
595
+ "Requirement already satisfied: requests in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from autorom[accept-rom-license]~=0.4.2->gymnasium[accept-rom-license]) (2.31.0)\n",
596
+ "Requirement already satisfied: tqdm in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from autorom[accept-rom-license]~=0.4.2->gymnasium[accept-rom-license]) (4.66.1)\n",
597
+ "Collecting AutoROM.accept-rom-license (from autorom[accept-rom-license]~=0.4.2->gymnasium[accept-rom-license])\n",
598
+ " Downloading AutoROM.accept-rom-license-0.6.1.tar.gz (434 kB)\n",
599
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m434.7/434.7 kB\u001b[0m \u001b[31m7.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0ma \u001b[36m0:00:01\u001b[0m\n",
600
+ "\u001b[?25h Installing build dependencies ... \u001b[?25ldone\n",
601
+ "\u001b[?25h Getting requirements to build wheel ... \u001b[?25ldone\n",
602
+ "\u001b[?25h Preparing metadata (pyproject.toml) ... \u001b[?25ldone\n",
603
+ "\u001b[?25hRequirement already satisfied: zipp>=0.5 in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from importlib-metadata>=4.8.0->gymnasium[accept-rom-license]) (3.16.2)\n",
604
+ "Requirement already satisfied: charset-normalizer<4,>=2 in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from requests->autorom[accept-rom-license]~=0.4.2->gymnasium[accept-rom-license]) (3.2.0)\n",
605
+ "Requirement already satisfied: idna<4,>=2.5 in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from requests->autorom[accept-rom-license]~=0.4.2->gymnasium[accept-rom-license]) (3.4)\n",
606
+ "Requirement already satisfied: urllib3<3,>=1.21.1 in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from requests->autorom[accept-rom-license]~=0.4.2->gymnasium[accept-rom-license]) (2.0.4)\n",
607
+ "Requirement already satisfied: certifi>=2017.4.17 in /home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages (from requests->autorom[accept-rom-license]~=0.4.2->gymnasium[accept-rom-license]) (2023.7.22)\n",
608
+ "Downloading click-8.1.7-py3-none-any.whl (97 kB)\n",
609
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m97.9/97.9 kB\u001b[0m \u001b[31m305.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
610
+ "\u001b[?25hBuilding wheels for collected packages: AutoROM.accept-rom-license\n",
611
+ " Building wheel for AutoROM.accept-rom-license (pyproject.toml) ... \u001b[?25ldone\n",
612
+ "\u001b[?25h Created wheel for AutoROM.accept-rom-license: filename=AutoROM.accept_rom_license-0.6.1-py3-none-any.whl size=446660 sha256=a3d153bcb7e9a8b04468055c46336a6ca39b5ad80b31d57f3dd49f540c3bd889\n",
613
+ " Stored in directory: /tmp/pip-ephem-wheel-cache-dqje2tuh/wheels/b1/1f/f7/2da07cf4f81ea264bdaf043028749d88fe0c2227134a22cf80\n",
614
+ "Successfully built AutoROM.accept-rom-license\n",
615
+ "Installing collected packages: click, AutoROM.accept-rom-license, autorom\n",
616
+ "Successfully installed AutoROM.accept-rom-license-0.6.1 autorom-0.4.2 click-8.1.7\n"
617
+ ]
618
+ }
619
+ ],
620
+ "source": [
621
+ "!pip install gymnasium[atari]\n",
622
+ "!pip install gymnasium[accept-rom-license]"
623
+ ]
624
+ },
625
+ {
626
+ "cell_type": "markdown",
627
+ "metadata": {
628
+ "id": "bTpYcVZVMzUI"
629
+ },
630
+ "source": [
631
+ "## Create a virtual display 🔽\n",
632
+ "\n",
633
+ "During this notebook, we'll need to generate a replay video. To do so on Colab, **we need a virtual screen to be able to render the environment** (and thus record the frames).\n",
634
+ "\n",
635
+ "Hence, the following cell will install the libraries and create and run a virtual screen 🖥"
636
+ ]
637
+ },
638
+ {
639
+ "cell_type": "code",
640
+ "execution_count": null,
641
+ "metadata": {
642
+ "id": "jV6wjQ7Be7p5"
643
+ },
644
+ "outputs": [],
645
+ "source": [
646
+ "%%capture\n",
647
+ "!apt install python-opengl\n",
648
+ "!apt install ffmpeg\n",
649
+ "!apt install xvfb"
650
+ ]
651
+ },
652
+ {
653
+ "cell_type": "code",
654
+ "execution_count": 5,
655
+ "metadata": {},
656
+ "outputs": [
657
+ {
658
+ "name": "stdout",
659
+ "output_type": "stream",
660
+ "text": [
661
+ "Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com\n",
662
+ "Collecting pyvirtualdisplay\n",
663
+ " Downloading PyVirtualDisplay-3.0-py3-none-any.whl (15 kB)\n",
664
+ "Installing collected packages: pyvirtualdisplay\n",
665
+ "Successfully installed pyvirtualdisplay-3.0\n"
666
+ ]
667
+ }
668
+ ],
669
+ "source": [
670
+ "!pip3 install pyvirtualdisplay"
671
+ ]
672
+ },
673
+ {
674
+ "cell_type": "code",
675
+ "execution_count": 3,
676
+ "metadata": {
677
+ "id": "BE5JWP5rQIKf"
678
+ },
679
+ "outputs": [],
680
+ "source": [
681
+ "# Virtual display\n",
682
+ "from pyvirtualdisplay import Display\n",
683
+ "\n",
684
+ "virtual_display = Display(visible=0, size=(1400, 900))\n",
685
+ "virtual_display.start()"
686
+ ]
687
+ },
688
+ {
689
+ "cell_type": "markdown",
690
+ "metadata": {
691
+ "id": "XHGrMu07oOW0"
692
+ },
693
+ "source": [
694
+ "## Train our Deep Q-Learning Agent to Play Space Invaders 👾\n",
695
+ "\n",
696
+ "To train an agent with RL-Baselines3-Zoo, we just need to do two things:\n",
697
+ "\n",
698
+ "1. Create a hyperparameter config file named `dqn.yml` that will contain our training hyperparameters.\n",
699
+ "\n",
700
+ "This is a template example:\n",
701
+ "\n",
702
+ "```\n",
703
+ "SpaceInvadersNoFrameskip-v4:\n",
704
+ " env_wrapper:\n",
705
+ " - stable_baselines3.common.atari_wrappers.AtariWrapper\n",
706
+ " frame_stack: 4\n",
707
+ " policy: 'CnnPolicy'\n",
708
+ " n_timesteps: !!float 1e7\n",
709
+ " buffer_size: 100000\n",
710
+ " learning_rate: !!float 1e-4\n",
711
+ " batch_size: 32\n",
712
+ " learning_starts: 100000\n",
713
+ " target_update_interval: 1000\n",
714
+ " train_freq: 4\n",
715
+ " gradient_steps: 1\n",
716
+ " exploration_fraction: 0.1\n",
717
+ " exploration_final_eps: 0.01\n",
718
+ " # If True, you need to deactivate handle_timeout_termination\n",
719
+ " # in the replay_buffer_kwargs\n",
720
+ " optimize_memory_usage: False\n",
721
+ "```"
722
+ ]
723
+ },
724
+ {
725
+ "cell_type": "markdown",
726
+ "metadata": {
727
+ "id": "_VjblFSVDQOj"
728
+ },
729
+ "source": [
730
+ "Here we see that:\n",
731
+ "- We use the `AtariWrapper` that preprocesses the input (frame reduction, grayscale, stacking 4 frames)\n",
732
+ "- We use `CnnPolicy`, since we use convolutional layers to process the frames\n",
733
+ "- We train it for 10 million `n_timesteps`\n",
734
+ "- The replay memory (Experience Replay) size is 100000, i.e., the number of experience steps saved to train your agent on.\n",
735
+ "\n",
736
+ "💡 My advice is to **reduce the training timesteps to 1M,** which will take about 90 minutes on a P100. `!nvidia-smi` will tell you what GPU you're using. At 10 million steps, this will take about 9 hours, which could likely result in Colab timing out. I recommend running this on your local computer (or somewhere else). Just click on: `File>Download`."
737
+ ]
738
+ },
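To make the "stack 4 frames" point concrete, here is a toy sketch (our own illustration, not the actual `AtariWrapper` code) of what frame stacking does: the observation handed to the CNN is the last 4 preprocessed 84x84 grayscale frames, which lets the network infer motion:

```python
import numpy as np
from collections import deque

# Keep only the 4 most recent frames; older ones are dropped automatically.
frames = deque(maxlen=4)
for t in range(6):  # simulate receiving 6 preprocessed 84x84 grayscale frames
    frames.append(np.full((84, 84), t, dtype=np.uint8))

observation = np.stack(frames, axis=0)  # what the CnnPolicy would consume
print(observation.shape)                # (4, 84, 84)
print([int(f[0, 0]) for f in frames])   # [2, 3, 4, 5]: only the last 4 frames survive
```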
739
+ {
740
+ "cell_type": "markdown",
741
+ "metadata": {
742
+ "id": "5qTkbWrkECOJ"
743
+ },
744
+ "source": [
745
+ "For hyperparameter optimization, my advice is to focus on these 3 hyperparameters:\n",
746
+ "- `learning_rate`\n",
747
+ "- `buffer_size` (experience replay memory size)\n",
748
+ "- `batch_size`\n",
749
+ "\n",
750
+ "As a good practice, you need to **check the documentation to understand what each hyperparameter does**: https://stable-baselines3.readthedocs.io/en/master/modules/dqn.html#parameters\n",
751
+ "\n"
752
+ ]
753
+ },
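To build intuition for how `buffer_size` and `batch_size` interact, here is a toy sketch of an experience replay memory (our own simplification, not SB3's actual `ReplayBuffer`): new transitions evict the oldest ones once capacity is reached, and training draws random mini-batches of `batch_size` transitions:

```python
import random
from collections import deque

class ReplayMemory:
    """Toy replay memory: a ring buffer of (s, a, r, s', done) transitions."""

    def __init__(self, buffer_size):
        self.buffer = deque(maxlen=buffer_size)  # oldest transitions are evicted

    def push(self, state, action, reward, next_state, done):
        self.buffer.append((state, action, reward, next_state, done))

    def sample(self, batch_size):
        # Uniform random sampling breaks the temporal correlation of experience
        return random.sample(self.buffer, batch_size)

memory = ReplayMemory(buffer_size=3)
for t in range(5):  # push 5 transitions into a capacity-3 buffer
    memory.push(t, 0, 1.0, t + 1, False)

print(len(memory.buffer))     # 3: only the newest transitions remain
print(len(memory.sample(2)))  # 2: one mini-batch
```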
754
+ {
755
+ "cell_type": "markdown",
756
+ "metadata": {
757
+ "id": "Hn8bRTHvERRL"
758
+ },
759
+ "source": [
760
+ "2. We start the training and save the models in the `logs` folder 📁\n",
761
+ "\n",
762
+ "- Specify the algorithm after `--algo`, the folder where the model is saved after `-f`, and the hyperparameter config file after `-c`."
763
+ ]
764
+ },
765
+ {
766
+ "cell_type": "code",
767
+ "execution_count": 2,
768
+ "metadata": {
769
+ "id": "Xr1TVW4xfbz3"
770
+ },
771
+ "outputs": [
772
+ {
773
+ "name": "stdout",
774
+ "output_type": "stream",
775
+ "text": [
776
+ "========== SpaceInvadersNoFrameskip-v4 ==========\n",
777
+ "Seed: 364634905\n",
778
+ "Loading hyperparameters from: dqn.yml\n",
779
+ "Default hyperparameters for environment (ones being tuned will be overridden):\n",
780
+ "OrderedDict([('batch_size', 32),\n",
781
+ " ('buffer_size', 100000),\n",
782
+ " ('env_wrapper',\n",
783
+ " ['stable_baselines3.common.atari_wrappers.AtariWrapper']),\n",
784
+ " ('exploration_final_eps', 0.01),\n",
785
+ " ('exploration_fraction', 0.1),\n",
786
+ " ('frame_stack', 4),\n",
787
+ " ('gradient_steps', 1),\n",
788
+ " ('learning_rate', 0.0001),\n",
789
+ " ('learning_starts', 100000),\n",
790
+ " ('n_timesteps', 12000000.0),\n",
791
+ " ('optimize_memory_usage', False),\n",
792
+ " ('policy', 'CnnPolicy'),\n",
793
+ " ('target_update_interval', 1000),\n",
794
+ " ('train_freq', 4)])\n",
795
+ "Using 1 environments\n",
796
+ "Creating test environment\n",
797
+ "A.L.E: Arcade Learning Environment (version 0.8.1+53f58b7)\n",
798
+ "[Powered by Stella]\n",
799
+ "Stacking 4 frames\n",
800
+ "Wrapping the env in a VecTransposeImage.\n",
801
+ "Stacking 4 frames\n",
802
+ "Wrapping the env in a VecTransposeImage.\n",
803
+ "Using cuda device\n",
804
+ "Log path: logs//dqn/SpaceInvadersNoFrameskip-v4_1\n",
805
+ "Traceback (most recent call last):\n",
806
+ " File \"/home/zhu/miniconda3/envs/hf39/lib/python3.9/runpy.py\", line 197, in _run_module_as_main\n",
807
+ " return _run_code(code, main_globals, None,\n",
808
+ " File \"/home/zhu/miniconda3/envs/hf39/lib/python3.9/runpy.py\", line 87, in _run_code\n",
809
+ " exec(code, run_globals)\n",
810
+ " File \"/home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages/rl_zoo3/train.py\", line 274, in <module>\n",
811
+ " train()\n",
812
+ " File \"/home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages/rl_zoo3/train.py\", line 267, in train\n",
813
+ " exp_manager.learn(model)\n",
814
+ " File \"/home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages/rl_zoo3/exp_manager.py\", line 236, in learn\n",
815
+ " model.learn(self.n_timesteps, **kwargs)\n",
816
+ " File \"/home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages/stable_baselines3/dqn/dqn.py\", line 267, in learn\n",
817
+ " return super().learn(\n",
818
+ " File \"/home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages/stable_baselines3/common/off_policy_algorithm.py\", line 301, in learn\n",
819
+ " total_timesteps, callback = self._setup_learn(\n",
820
+ " File \"/home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages/stable_baselines3/common/off_policy_algorithm.py\", line 284, in _setup_learn\n",
821
+ " return super()._setup_learn(\n",
822
+ " File \"/home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages/stable_baselines3/common/base_class.py\", line 424, in _setup_learn\n",
823
+ " self._last_obs = self.env.reset() # type: ignore[assignment]\n",
824
+ " File \"/home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages/stable_baselines3/common/vec_env/vec_transpose.py\", line 110, in reset\n",
825
+ " return self.transpose_observations(self.venv.reset())\n",
826
+ " File \"/home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages/stable_baselines3/common/vec_env/vec_frame_stack.py\", line 41, in reset\n",
827
+ " observation = self.venv.reset() # pytype:disable=annotation-type-mismatch\n",
828
+ " File \"/home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages/stable_baselines3/common/vec_env/dummy_vec_env.py\", line 76, in reset\n",
829
+ " obs, self.reset_infos[env_idx] = self.envs[env_idx].reset(seed=self._seeds[env_idx])\n",
830
+ " File \"/home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages/gymnasium/core.py\", line 467, in reset\n",
831
+ " return self.env.reset(seed=seed, options=options)\n",
832
+ " File \"/home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages/gymnasium/core.py\", line 467, in reset\n",
833
+ " return self.env.reset(seed=seed, options=options)\n",
834
+ " File \"/home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages/gymnasium/core.py\", line 516, in reset\n",
835
+ " return self.observation(obs), info\n",
836
+ " File \"/home/zhu/miniconda3/envs/hf39/lib/python3.9/site-packages/stable_baselines3/common/atari_wrappers.py\", line 244, in observation\n",
837
+ " assert cv2 is not None, \"OpenCV is not installed, you can do `pip install opencv-python`\"\n",
838
+ "AssertionError: OpenCV is not installed, you can do `pip install opencv-python`\n"
839
+ ]
840
+ }
841
+ ],
842
+ "source": [
843
+ "!python -m rl_zoo3.train --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ -c dqn.yml"
844
+ ]
845
+ },
846
+ {
847
+ "cell_type": "markdown",
848
+ "metadata": {
849
+ "id": "SeChoX-3SZfP"
850
+ },
851
+ "source": [
852
+ "#### Solution"
853
+ ]
854
+ },
855
+ {
856
+ "cell_type": "code",
857
+ "execution_count": null,
858
+ "metadata": {
859
+ "id": "PuocgdokSab9"
860
+ },
861
+ "outputs": [],
862
+ "source": [
863
+ "!python -m rl_zoo3.train --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ -c dqn.yml"
864
+ ]
865
+ },
866
+ {
867
+ "cell_type": "markdown",
868
+ "metadata": {
869
+ "id": "_dLomIiMKQaf"
870
+ },
871
+ "source": [
872
+ "## Let's evaluate our agent 👀\n",
873
+ "- RL-Baselines3-Zoo provides `enjoy.py`, a Python script to evaluate our agent. In most RL libraries, the evaluation script is called `enjoy.py`.\n",
874
+ "- Let's evaluate it for 5000 timesteps 🔥"
875
+ ]
876
+ },
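The evaluation run reports the mean episode reward and its standard deviation. As a reminder of what that summary is, here is the computation on some made-up episode returns (the numbers below are hypothetical, not from a real run):

```python
import statistics

episode_rewards = [305.0, 410.0, 265.0, 520.0]  # hypothetical episode returns

mean_reward = statistics.mean(episode_rewards)
std_reward = statistics.pstdev(episode_rewards)  # population standard deviation

print(f"mean_reward={mean_reward:.2f} +/- {std_reward:.2f}")
```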
877
+ {
878
+ "cell_type": "code",
879
+ "execution_count": null,
880
+ "metadata": {
881
+ "id": "co5um_KeKbBJ"
882
+ },
883
+ "outputs": [],
884
+ "source": [
885
+ "!python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 --no-render --n-timesteps 5000 --folder logs/"
886
+ ]
887
+ },
888
+ {
889
+ "cell_type": "markdown",
890
+ "metadata": {
891
+ "id": "Q24K1tyWSj7t"
892
+ },
893
+ "source": [
894
+ "#### Solution"
895
+ ]
896
+ },
897
+ {
898
+ "cell_type": "code",
899
+ "execution_count": null,
900
+ "metadata": {
901
+ "id": "P_uSmwGRSk0z"
902
+ },
903
+ "outputs": [],
904
+ "source": [
905
+ "!python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 --no-render --n-timesteps 5000 --folder logs/"
906
+ ]
907
+ },
908
+ {
909
+ "cell_type": "markdown",
910
+ "metadata": {
911
+ "id": "liBeTltiHJtr"
912
+ },
913
+ "source": [
914
+ "## Publish our trained model on the Hub 🚀\n",
915
+ "Now that we've seen good results after training, we can publish our trained model on the Hub 🤗 with one line of code.\n",
916
+ "\n",
917
+ "<img src=\"https://huggingface.co/datasets/huggingface-deep-rl-course/course-images/resolve/main/en/notebooks/unit3/space-invaders-model.gif\" alt=\"Space Invaders model\">"
918
+ ]
919
+ },
920
+ {
921
+ "cell_type": "markdown",
922
+ "metadata": {
923
+ "id": "ezbHS1q3HYVV"
924
+ },
925
+ "source": [
926
+ "By using `rl_zoo3.push_to_hub` **you evaluate your agent, record a replay, generate a model card, and push everything to the Hub**.\n",
927
+ "\n",
928
+ "This way:\n",
929
+ "- You can **showcase your work** 🔥\n",
930
+ "- You can **visualize your agent playing** 👀\n",
931
+ "- You can **share with the community an agent that others can use** 💾\n",
932
+ "- You can **access a leaderboard 🏆 to see how well your agent is performing compared to your classmates** 👉 https://huggingface.co/spaces/huggingface-projects/Deep-Reinforcement-Learning-Leaderboard"
933
+ ]
934
+ },
935
+ {
936
+ "cell_type": "markdown",
937
+ "metadata": {
938
+ "id": "XMSeZRBiHk6X"
939
+ },
940
+ "source": [
941
+ "To be able to share your model with the community there are three more steps to follow:\n",
942
+ "\n",
943
+ "1️⃣ (If it's not already done) create an account on HF ➡ https://huggingface.co/join\n",
944
+ "\n",
945
+ "2️⃣ Sign in, then store your authentication token from the Hugging Face website.\n",
946
+ "- Create a new token (https://huggingface.co/settings/tokens) **with write role**\n",
947
+ "\n",
948
+ "<img src=\"https://huggingface.co/datasets/huggingface-deep-rl-course/course-images/resolve/main/en/notebooks/create-token.jpg\" alt=\"Create HF Token\">"
949
+ ]
950
+ },
951
+ {
952
+ "cell_type": "markdown",
953
+ "metadata": {
954
+ "id": "9O6FI0F8HnzE"
955
+ },
956
+ "source": [
957
+ "- Copy the token\n",
958
+ "- Run the cell below and paste the token"
959
+ ]
960
+ },
961
+ {
962
+ "cell_type": "code",
963
+ "execution_count": 1,
964
+ "metadata": {
965
+ "id": "Ppu9yePwHrZX"
966
+ },
967
+ "outputs": [
968
+ {
969
+ "data": {
970
+ "application/vnd.jupyter.widget-view+json": {
971
+ "model_id": "e31577a26311464dab1bca2583b252a6",
972
+ "version_major": 2,
973
+ "version_minor": 0
974
+ },
975
+ "text/plain": [
976
+ "VBox(children=(HTML(value='<center> <img\\nsrc=https://huggingface.co/front/assets/huggingface_logo-noborder.sv…"
977
+ ]
978
+ },
979
+ "metadata": {},
980
+ "output_type": "display_data"
981
+ },
982
+ {
983
+ "name": "stdout",
984
+ "output_type": "stream",
985
+ "text": [
986
+ "/bin/bash: /home/zhu/miniconda3/envs/hf39/lib/libtinfo.so.6: no version information available (required by /bin/bash)\n"
987
+ ]
988
+ }
989
+ ],
990
+ "source": [
991
+ "from huggingface_hub import notebook_login # To log to our Hugging Face account to be able to upload models to the Hub.\n",
992
+ "notebook_login()\n",
993
+ "!git config --global credential.helper store"
994
+ ]
995
+ },
996
+ {
997
+ "cell_type": "markdown",
998
+ "metadata": {
999
+ "id": "2RVEdunPHs8B"
1000
+ },
1001
+ "source": [
1002
+ "If you don't want to use a Google Colab or a Jupyter Notebook, you need to use this command instead: `huggingface-cli login`"
1003
+ ]
1004
+ },
1005
+ {
1006
+ "cell_type": "markdown",
1007
+ "metadata": {
1008
+ "id": "dSLwdmvhHvjw"
1009
+ },
1010
+ "source": [
1011
+ "3️⃣ We're now ready to push our trained agent to the 🤗 Hub 🔥"
1012
+ ]
1013
+ },
1014
+ {
1015
+ "cell_type": "markdown",
1016
+ "metadata": {
1017
+ "id": "PW436XnhHw1H"
1018
+ },
1019
+ "source": [
1020
+ "Let's run `rl_zoo3.push_to_hub` to upload our trained agent to the Hub.\n",
1021
+ "\n",
1022
+ "`--repo-name`: The name of the repo\n",
1023
+ "\n",
1024
+ "`-orga`: Your Hugging Face username\n",
1025
+ "\n",
1026
+ "`-f`: Where the trained model folder is (in our case `logs`)\n",
1027
+ "\n",
1028
+ "<img src=\"https://huggingface.co/datasets/huggingface-deep-rl-course/course-images/resolve/main/en/notebooks/unit3/select-id.png\" alt=\"Select Id\">"
1029
+ ]
1030
+ },
1031
+ {
1032
+ "cell_type": "code",
1033
+ "execution_count": 3,
1034
+ "metadata": {
1035
+ "id": "Ygk2sEktTDEw"
1036
+ },
1037
+ "outputs": [
1038
+ {
1039
+ "name": "stdout",
1040
+ "output_type": "stream",
1041
+ "text": [
1042
+ "/bin/bash: /home/zhu/miniconda3/envs/hf39/lib/libtinfo.so.6: no version information available (required by /bin/bash)\n",
1043
+ "Loading latest experiment, id=2\n",
1044
+ "Loading logs/dqn/SpaceInvadersNoFrameskip-v4_2/SpaceInvadersNoFrameskip-v4.zip\n",
1045
+ "A.L.E: Arcade Learning Environment (version 0.8.1+53f58b7)\n",
1046
+ "[Powered by Stella]\n",
1047
+ "Stacking 4 frames\n",
1048
+ "Wrapping the env in a VecTransposeImage.\n",
1049
+ "Uploading to czl/SpaceInvadersNoFrameskip-v4, make sure to have the rights\n",
1050
+ "\u001b[38;5;4mℹ This function will save, evaluate, generate a video of your agent,\n",
1051
+ "create a model card and push everything to the hub. It might take up to some\n",
1052
+ "minutes if video generation is activated. This is a work in progress: if you\n",
1053
+ "encounter a bug, please open an issue.\u001b[0m\n",
1054
+ "Cloning https://huggingface.co/czl/SpaceInvadersNoFrameskip-v4 into local empty directory.\n",
1055
+ "WARNING:huggingface_hub.repository:Cloning https://huggingface.co/czl/SpaceInvadersNoFrameskip-v4 into local empty directory.\n",
1056
+ "Saving model to: hub/SpaceInvadersNoFrameskip-v4/dqn-SpaceInvadersNoFrameskip-v4\n",
1057
+ "Could not load library libcudnn_cnn_infer.so.8. Error: libcuda.so: cannot open shared object file: No such file or directory\n"
1058
+ ]
1059
+ }
1060
+ ],
1061
+ "source": [
1062
+ "!python -m rl_zoo3.push_to_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 --repo-name SpaceInvadersNoFrameskip-v4 -orga czl -f logs/"
1063
+ ]
1064
+ },
1065
+ {
1066
+ "cell_type": "markdown",
1067
+ "metadata": {
1068
+ "id": "otgpa0rhS9wR"
1069
+ },
1070
+ "source": [
1071
+ "#### Solution"
1072
+ ]
1073
+ },
1074
+ {
1075
+ "cell_type": "code",
1076
+ "execution_count": null,
1077
+ "metadata": {
1078
+ "id": "_HQNlAXuEhci"
1079
+ },
1080
+ "outputs": [],
1081
+ "source": [
1082
+ "!python -m rl_zoo3.push_to_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 --repo-name dqn-SpaceInvadersNoFrameskip-v4 -orga ThomasSimonini -f logs/"
1083
+ ]
1084
+ },
1085
+ {
1086
+ "cell_type": "markdown",
1087
+ "metadata": {
1088
+ "id": "0D4F5zsTTJ-L"
1089
+ },
1090
+ "source": [
1091
+ ""
1092
+ ]
1093
+ },
1094
+ {
1095
+ "cell_type": "markdown",
1096
+ "metadata": {
1097
+ "id": "ff89kd2HL1_s"
1098
+ },
1099
+ "source": [
1100
+ "Congrats 🥳 you've just trained and uploaded your first Deep Q-Learning agent using RL-Baselines3-Zoo. The script above should have displayed a link to a model repository such as https://huggingface.co/ThomasSimonini/dqn-SpaceInvadersNoFrameskip-v4. When you go to this link, you can:\n",
1101
+ "\n",
1102
+ "- See a **video preview of your agent** on the right.\n",
1103
+ "- Click \"Files and versions\" to see all the files in the repository.\n",
1104
+ "- Click \"Use in stable-baselines3\" to get a code snippet that shows how to load the model.\n",
1105
+ "- Read the model card (the `README.md` file), which describes the model and the hyperparameters you used.\n",
1106
+ "\n",
1107
+ "Under the hood, the Hub uses git-based repositories (don't worry if you don't know what git is), which means you can update the model with new versions as you experiment and improve your agent.\n",
1108
+ "\n",
1109
+ "**Compare the results of your agents with your classmates** using the [leaderboard](https://huggingface.co/spaces/huggingface-projects/Deep-Reinforcement-Learning-Leaderboard) 🏆"
1110
+ ]
1111
+ },
1112
+ {
1113
+ "cell_type": "markdown",
1114
+ "metadata": {
1115
+ "id": "fyRKcCYY-dIo"
1116
+ },
1117
+ "source": [
1118
+ "## Load a powerful trained model 🔥\n",
1119
+ "- The Stable-Baselines3 team uploaded **more than 150 trained Deep Reinforcement Learning agents on the Hub**.\n",
1120
+ "\n",
1121
+ "You can find them here: 👉 https://huggingface.co/sb3\n",
1122
+ "\n",
1123
+ "Some examples:\n",
1124
+ "- Asteroids: https://huggingface.co/sb3/dqn-AsteroidsNoFrameskip-v4\n",
1125
+ "- Beam Rider: https://huggingface.co/sb3/dqn-BeamRiderNoFrameskip-v4\n",
1126
+ "- Breakout: https://huggingface.co/sb3/dqn-BreakoutNoFrameskip-v4\n",
1127
+ "- Road Runner: https://huggingface.co/sb3/dqn-RoadRunnerNoFrameskip-v4\n",
1128
+ "\n",
1129
+ "Let's load an agent playing Beam Rider: https://huggingface.co/sb3/dqn-BeamRiderNoFrameskip-v4"
1130
+ ]
1131
+ },
1132
+ {
1133
+ "cell_type": "code",
1134
+ "execution_count": null,
1135
+ "metadata": {
1136
+ "id": "B-9QVFIROI5Y"
1137
+ },
1138
+ "outputs": [],
1139
+ "source": [
1140
+ "%%html\n",
1141
+ "<video controls autoplay><source src=\"https://huggingface.co/sb3/dqn-BeamRiderNoFrameskip-v4/resolve/main/replay.mp4\" type=\"video/mp4\"></video>"
1142
+ ]
1143
+ },
1144
+ {
1145
+ "cell_type": "markdown",
1146
+ "metadata": {
1147
+ "id": "7ZQNY_r6NJtC"
1148
+ },
1149
+ "source": [
1150
+ "1. We download the model using `rl_zoo3.load_from_hub`, and place it in a new folder that we can call `rl_trained`"
1151
+ ]
1152
+ },
1153
+ {
1154
+ "cell_type": "code",
1155
+ "execution_count": null,
1156
+ "metadata": {
1157
+ "id": "OdBNZHy0NGTR"
1158
+ },
1159
+ "outputs": [],
1160
+ "source": [
1161
+ "# Download the model and save it into the rl_trained/ folder\n",
1162
+ "!python -m rl_zoo3.load_from_hub --algo dqn --env BeamRiderNoFrameskip-v4 -orga sb3 -f rl_trained/"
1163
+ ]
1164
+ },
1165
+ {
1166
+ "cell_type": "markdown",
1167
+ "metadata": {
1168
+ "id": "LFt6hmWsNdBo"
1169
+ },
1170
+ "source": [
1171
+ "2. Let's evaluate it for 5000 timesteps"
1172
+ ]
1173
+ },
1174
+ {
1175
+ "cell_type": "code",
1176
+ "execution_count": null,
1177
+ "metadata": {
1178
+ "id": "aOxs0rNuN0uS"
1179
+ },
1180
+ "outputs": [],
1181
+ "source": [
1182
+ "!python -m rl_zoo3.enjoy --algo dqn --env BeamRiderNoFrameskip-v4 -n 5000 -f rl_trained/ --no-render"
1183
+ ]
1184
+ },
1185
+ {
1186
+ "cell_type": "markdown",
1187
+ "metadata": {
1188
+ "id": "kxMDuDfPON57"
1189
+ },
1190
+ "source": [
1191
+ "Why not try to train your own **Deep Q-Learning agent to play BeamRiderNoFrameskip-v4**? 🏆\n",
1192
+ "\n",
1193
+ "If you want to try, check https://huggingface.co/sb3/dqn-BeamRiderNoFrameskip-v4#hyperparameters: **the model card lists the hyperparameters of the trained agent.**"
1194
+ ]
1195
+ },
1196
+ {
1197
+ "cell_type": "markdown",
1198
+ "metadata": {
1199
+ "id": "xL_ZtUgpOuY6"
1200
+ },
1201
+ "source": [
1202
+ "But finding hyperparameters can be a daunting task. Fortunately, in the next Unit we'll see how we can **use Optuna to optimize the hyperparameters 🔥.**\n"
1203
+ ]
1204
+ },
1205
+ {
1206
+ "cell_type": "markdown",
1207
+ "metadata": {
1208
+ "id": "-pqaco8W-huW"
1209
+ },
1210
+ "source": [
1211
+ "## Some additional challenges 🏆\n",
1212
+ "The best way to learn **is to try things on your own**!\n",
1213
+ "\n",
1214
+ "In the [Leaderboard](https://huggingface.co/spaces/huggingface-projects/Deep-Reinforcement-Learning-Leaderboard) you will find your agents. Can you get to the top?\n",
1215
+ "\n",
1216
+ "Here's a list of environments you can try to train your agent with:\n",
1217
+ "- BeamRiderNoFrameskip-v4\n",
1218
+ "- BreakoutNoFrameskip-v4\n",
1219
+ "- EnduroNoFrameskip-v4\n",
1220
+ "- PongNoFrameskip-v4\n",
1221
+ "\n",
1222
+ "Also, **if you want to learn to implement Deep Q-Learning by yourself**, you definitely should look at CleanRL implementation: https://github.com/vwxyzjn/cleanrl/blob/master/cleanrl/dqn_atari.py\n",
1223
+ "\n",
1224
+ "<img src=\"https://huggingface.co/datasets/huggingface-deep-rl-course/course-images/resolve/main/en/unit4/atari-envs.gif\" alt=\"Environments\"/>"
1225
+ ]
1226
+ },
1227
+ {
1228
+ "cell_type": "markdown",
1229
+ "metadata": {
1230
+ "id": "paS-XKo4-kmu"
1231
+ },
1232
+ "source": [
1233
+ "________________________________________________________________________\n",
1234
+ "Congrats on finishing this chapter!\n",
1235
+ "\n",
1236
+ "If you still feel confused by all these elements... it's totally normal! **This was the same for me and for everyone who has studied RL.**\n",
1237
+ "\n",
1238
+ "Take time to really **grasp the material before continuing, and try the additional challenges**. It's important to master these elements and have a solid foundation.\n",
1239
+ "\n",
1240
+ "In the next unit, **we're going to learn about [Optuna](https://optuna.org/)**. One of the most critical tasks in Deep Reinforcement Learning is finding a good set of training hyperparameters, and Optuna is a library that helps you automate that search.\n",
1241
+ "\n",
1242
+ "\n"
1243
+ ]
1244
+ },
1245
+ {
1246
+ "cell_type": "markdown",
1247
+ "metadata": {
1248
+ "id": "5WRx7tO7-mvC"
1249
+ },
1250
+ "source": [
1251
+ "\n",
1252
+ "\n",
1253
+ "### This is a course built with you 👷🏿‍♀️\n",
1254
+ "\n",
1255
+ "Finally, we want to improve and update the course iteratively with your feedback. If you have some, please fill out this form 👉 https://forms.gle/3HgA7bEHwAmmLfwh9\n",
1256
+ "\n",
1257
+ "We're constantly trying to improve our tutorials, so **if you find some issues in this notebook**, please [open an issue on the Github Repo](https://github.com/huggingface/deep-rl-class/issues)."
1258
+ ]
1259
+ },
1260
+ {
1261
+ "cell_type": "markdown",
1262
+ "metadata": {
1263
+ "id": "Kc3udPT-RcXc"
1264
+ },
1265
+ "source": [
1266
+ "See you in Bonus Unit 2! 🔥"
1267
+ ]
1268
+ },
1269
+ {
1270
+ "cell_type": "markdown",
1271
+ "metadata": {
1272
+ "id": "fS3Xerx0fIMV"
1273
+ },
1274
+ "source": [
1275
+ "### Keep Learning, Stay Awesome 🤗"
1276
+ ]
1277
+ }
1278
+ ],
1279
+ "metadata": {
1280
+ "accelerator": "GPU",
1281
+ "colab": {
1282
+ "private_outputs": true,
1283
+ "provenance": []
1284
+ },
1285
+ "gpuClass": "standard",
1286
+ "kernelspec": {
1287
+ "display_name": "Python 3 (ipykernel)",
1288
+ "language": "python",
1289
+ "name": "python3"
1290
+ },
1291
+ "language_info": {
1292
+ "codemirror_mode": {
1293
+ "name": "ipython",
1294
+ "version": 3
1295
+ },
1296
+ "file_extension": ".py",
1297
+ "mimetype": "text/x-python",
1298
+ "name": "python",
1299
+ "nbconvert_exporter": "python",
1300
+ "pygments_lexer": "ipython3",
1301
+ "version": "3.9.17"
1302
+ },
1303
+ "varInspector": {
1304
+ "cols": {
1305
+ "lenName": 16,
1306
+ "lenType": 16,
1307
+ "lenVar": 40
1308
+ },
1309
+ "kernels_config": {
1310
+ "python": {
1311
+ "delete_cmd_postfix": "",
1312
+ "delete_cmd_prefix": "del ",
1313
+ "library": "var_list.py",
1314
+ "varRefreshCmd": "print(var_dic_list())"
1315
+ },
1316
+ "r": {
1317
+ "delete_cmd_postfix": ") ",
1318
+ "delete_cmd_prefix": "rm(",
1319
+ "library": "var_list.r",
1320
+ "varRefreshCmd": "cat(var_dic_list()) "
1321
+ }
1322
+ },
1323
+ "types_to_exclude": [
1324
+ "module",
1325
+ "function",
1326
+ "builtin_function_or_method",
1327
+ "instance",
1328
+ "_Feature"
1329
+ ],
1330
+ "window_display": false
1331
+ }
1332
+ },
1333
+ "nbformat": 4,
1334
+ "nbformat_minor": 0
1335
+ }