Dean committed
Commit
715606b
1 Parent(s): ee797b2

Update readme to include google colab setup + remove problematic packages from requirements.txt

Files changed (3)
  1. Notebooks/SavtaDepth_Colab.ipynb +368 -0
  2. README.md +49 -6
  3. requirements.txt +2 -3
Notebooks/SavtaDepth_Colab.ipynb ADDED
@@ -0,0 +1,368 @@
+ {
+ "nbformat": 4,
+ "nbformat_minor": 0,
+ "metadata": {
+ "colab": {
+ "name": "SavtaDepth Colab.ipynb",
+ "provenance": [],
+ "collapsed_sections": []
+ },
+ "kernelspec": {
+ "name": "python3",
+ "display_name": "Python 3"
+ },
+ "accelerator": "GPU"
+ },
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "QKUz-9qCbHRH"
+ },
+ "source": [
+ "<center> <img alt=\"DAGsHub\" width=500px src=https://raw.githubusercontent.com/DAGsHub/client/master/dagshub_github.png> </center>\n",
+ "\n",
+ "# SavtaDepth Colab Environment\n",
+ "\n",
+ "### ***This notebook is meant to be run on Google Colab***\n",
+ "\n",
+ "This notebook is a tool to set up and orchestrate (read: run) experiments for [SavtaDepth](https://dagshub.com/OperationSavta/SavtaDepth) on Google Colab, while maintaining a relatively clean environment and using version control to promote reproducibility. It is a WIP, but following (and modifying) the cells below should let you git clone a project into Colab, download the data, run it, and push it to https://DAGsHub.com, a free platform for open source data science.\n",
+ "\n",
+ "SavtaDepth is an Open Source Data Science project. We'd love to get help from the community, so if you'd like to contribute, head over to the [project page](https://dagshub.com/OperationSavta/SavtaDepth) to get started."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "wJNCkwSIa7To"
+ },
+ "source": [
+ "# General Setup\n",
+ "Run this before anything else"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "2Tq2Dl33XDqw"
+ },
+ "source": [
+ "%reload_ext autoreload\n",
+ "%autoreload 2"
+ ],
+ "execution_count": 1,
+ "outputs": []
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "UiFeCt5xXEK1"
+ },
+ "source": [
+ "from google.colab import auth\n",
+ "auth.authenticate_user()"
+ ],
+ "execution_count": 2,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "xn8URgDmWBKm"
+ },
+ "source": [
+ "Clone Git Repo from DAGsHub"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "jxOO9c2hU_zM"
+ },
+ "source": [
+ "!git clone https://dagshub.com/OperationSavta/SavtaDepth.git\n",
+ "%cd SavtaDepth/"
+ ],
+ "execution_count": null,
+ "outputs": []
+ },
89
+ {
90
+ "cell_type": "markdown",
91
+ "metadata": {
92
+ "id": "4bndN7kVagXj"
93
+ },
94
+ "source": [
95
+ "*Note: Currently you can't see the intermediate output so it's completely opaque and long process + you can't set up credentials for google within the shell instance*\n",
96
+ "\n",
97
+ "This installs conda, creates a virtual environment and installs all relevant requirements."
98
+ ]
99
+ },
100
+ {
101
+ "cell_type": "code",
102
+ "metadata": {
103
+ "id": "JAsG5scFYjqd"
104
+ },
105
+ "source": [
106
+ "%%bash\n",
107
+ "MINICONDA_INSTALLER_SCRIPT=Miniconda3-py37_4.8.3-Linux-x86_64.sh\n",
108
+ "MINICONDA_PREFIX=/usr/local\n",
109
+ "wget https://repo.continuum.io/miniconda/$MINICONDA_INSTALLER_SCRIPT\n",
110
+ "chmod +x $MINICONDA_INSTALLER_SCRIPT\n",
111
+ "./$MINICONDA_INSTALLER_SCRIPT -b -f -p $MINICONDA_PREFIX\n",
112
+ "rm $MINICONDA_INSTALLER_SCRIPT\n",
113
+ "make env"
114
+ ],
115
+ "execution_count": null,
116
+ "outputs": []
117
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "TR0asdFLaeAV"
+ },
+ "source": [
+ "Since locally we might not be working with a GPU, here we install the PyTorch version that should use the GPU provided by Google Colab. After that, we install the rest of the requirements from the requirements.txt file."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "n3U6qqVFWYRm"
+ },
+ "source": [
+ "!bash -c \"source activate savta_depth && conda install -y pytorch torchvision cudatoolkit=10.1 -c pytorch && make load_requirements\""
+ ],
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "WOzZu9GQaaeI"
+ },
+ "source": [
+ "### Pull DVC files from our remote"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "sNmiqub0WbqT"
+ },
+ "source": [
+ "!bash -c \"source activate savta_depth && dvc pull -r dvc-remote\""
+ ],
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "qNNIfodJc_0g"
+ },
+ "source": [
+ "\n",
+ "\n",
+ "---\n",
+ "\n",
+ "\n",
+ "# Setup is done!\n",
+ "If you've made it here, everything is set up. You have the code and data in the file viewer to the left. You can edit the files located in SavtaDepth/src/code/ as you like. You can see the YAML defining the project pipeline in dvc.yaml. If you change dependencies or outputs (for example, add an additional code file for the training stage), make sure you edit the pipeline to reflect this.\n",
+ "\n",
+ "### Once you are done with your changes, run the cell below to run the pipeline end-to-end\n",
+ "* You can run this multiple times if you've made a change and want to test it\n",
+ "* If you want to run only a specific stage, you can change the `dvc repro` command to any other command you like.\n",
+ "\n",
+ "▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽▽"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "fwOfOwnrZWMe"
+ },
+ "source": [
+ "!bash -c \"source activate savta_depth && dvc repro\""
+ ],
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "PAxz-29WhN12"
+ },
+ "source": [
+ "---\n",
+ "# Committing Your Work and Pushing Back to DAGsHub\n",
+ "\n"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "kyse-hAuZY9X"
+ },
+ "source": [
+ "!git status"
+ ],
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "Ib12i6aOhbgI"
+ },
+ "source": [
+ "# Add the files you want to commit\n",
+ "!git add {your files here}"
+ ],
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "duM9An0Khr_F"
+ },
+ "source": [
+ "Run the following 2 cells without modifications. They will prompt you for a commit message, and for credentials to push back to DAGsHub."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "V9SYYA9Zh3f9"
+ },
+ "source": [
+ "**Committing**"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "_fjym_38hhgl"
+ },
+ "source": [
+ "import os\n",
+ "os.environ['COMMIT_MESSAGE'] = input('Enter the commit message for your commit: ')\n",
+ "!git commit -m \"${COMMIT_MESSAGE}\"\n",
+ "os.environ['COMMIT_MESSAGE'] = \"\""
+ ],
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "mWtgXU0ph6Dv"
+ },
+ "source": [
+ "**Pushing to DAGsHub**"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "z7LoVk9Zh9GC"
+ },
+ "source": [
+ "# If this stage fails, make sure to clear this cell's outputs, as they will show your password to whoever the notebook is shared with.\n",
+ "from getpass import getpass\n",
+ "import os\n",
+ "\n",
+ "os.environ['USER'] = input('Enter the username of your DAGsHub account: ')\n",
+ "os.environ['PASSWORD'] = getpass('Enter the password of your DAGsHub account: ')\n",
+ "os.environ['REPO_URL'] = input('Enter the url of your DAGsHub project: ').split('https://')[-1]\n",
+ "os.environ['DAGSHUB_AUTH'] = os.environ['USER'] + ':' + os.environ['PASSWORD']\n",
+ "\n",
+ "!git push https://$DAGSHUB_AUTH@$REPO_URL.git"
+ ],
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "MXlzJbuLiFWb"
+ },
+ "source": [
+ "***NOTE: ALWAYS RUN THIS CELL AFTER THE PREVIOUS ONE.*** It will delete your DAGsHub password, in case you share this notebook with someone."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "6-yZzyeAiGPI"
+ },
+ "source": [
+ "os.environ['PASSWORD'] = os.environ['DAGSHUB_AUTH'] = \"\""
+ ],
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "FImsqcvYie77"
+ },
+ "source": [
+ "## Push data back to your DVC Remote\n",
+ "For this step you must first create a DVC remote on some cloud provider. We recommend Google Cloud Storage. If you're not sure how to set up a DVC remote, [follow these instructions](https://dagshub.com/docs/getting-started/set-up-remote-storage-for-data-and-models/#create-a-storage-bucket) (you only need to go through creating a storage bucket and adding permissions)."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "L4XWSvPQjfLb"
+ },
+ "source": [
+ "Add your storage bucket to DVC (replace {bucket-name} with the bucket name you chose)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "kGx03PpijiAW"
+ },
+ "source": [
+ "!bash -c \"source activate savta_depth && dvc remote add my-dvc-remote gs://{bucket-name}\""
+ ],
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "i8uYEa64imDl"
+ },
+ "source": [
+ "from google.colab import auth\n",
+ "auth.authenticate_user()"
+ ],
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "ZsM5epDbiotM"
+ },
+ "source": [
+ "!bash -c \"source activate savta_depth && dvc push -r my-dvc-remote\""
+ ],
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "ne7pU7bMj18m"
+ },
+ "source": [
+ "# That's it, you can now create a PR on DAGsHub."
+ ]
+ }
+ ]
+ }
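
> **Note on the pipeline file:** the notebook above tells contributors to edit `dvc.yaml` whenever a stage's dependencies or outputs change. For orientation, a DVC stage entry has this general shape. This is a minimal sketch; the stage, script, and artifact names below are illustrative, not the project's actual pipeline:

```yaml
stages:
  train:                            # hypothetical stage name
    cmd: python src/code/train.py   # the command `dvc repro` runs for this stage
    deps:                           # inputs; list any newly added code files here
      - src/code/train.py
      - data/processed
    outs:                           # outputs DVC tracks and caches
      - models/model.pth
```

Listing a new code file under a stage's `deps` is what makes `dvc repro` re-run that stage when the file changes.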
README.md CHANGED
@@ -11,12 +11,53 @@ If you'd like to take part, please follow the guide.
 
 ### Setting up your environment to contribute
 * To get started, fork the repository on DAGsHub
- * Now, You have 3 way to set up your environment: Google Colab, local or docker. If you're not sure which one to go with, we recommend using Colab.
+ * Now, you have 3 ways to set up your environment: Google Colab, local, or Docker. If you're not sure which one to go with, we recommend using Colab.
 #### Google Colab
- We can treat Colab, as your web connected, GPU powered IDE. Here is a link to a well-documented Colab notebook, that will load the code and data from this repository, enabling you to modify the code and re-run training.
+ Google Colab can be looked at as your web-connected, GPU-powered IDE. Below is a link to a well-documented Colab notebook that will load the code and data from this repository, enabling you to modify the code and re-run training. Notice that you still need to modify the code within the `src/code/` folder; adding cells should be used only for testing things out.
+
+ In order to edit code files, you must save the notebook to your drive. You can do this by typing `ctrl+s` (or `cmd+s` on Mac).
+
+ \>\> **[SavtaDepth Colab Environment](https://colab.research.google.com/drive/19027P09jiiN1C99-YGk4FPj0Ol9iXUIU?usp=sharing)** \<\<
+
+ **_NOTE: The downside of this method (if you are not familiar with Colab) is that Google Colab will limit the amount of time an instance can be live, so you might be limited in your ability to train models for longer periods of time._**
+
+ This notebook is also part of this project (in the `Notebooks` folder), in case it needs modification. You should not commit your version unless your contribution is an improvement to the environment.
 
- **_NOTE: The downside of this method (if you are not familiar with Colab) is that Google Colab will limit the amount of time an instance can be live, so you might be limited in your ability to train models for longer periods of time.
 #### Local
+ * Create a virtual environment or Conda environment and activate it
+ ```bash
+ # Create the virtual environment
+ $ make env
+
+ # Activate the virtual environment
+ # VENV
+ $ source env/bin/activate
+
+ # or Conda
+ $ source activate savta_depth
+ ```
+ * Install the required libraries
+ ```bash
+ $ make load_requirements
+ ```
+ **_NOTE: Here I assume a setup without a GPU. Otherwise, you might need to modify the requirements, which is outside the scope of this README (feel free to contribute to this)._**
+ * Pull the dvc files to your workspace by typing:
+
+ ```bash
+ $ dvc pull -r dvc-remote
+ $ dvc checkout # use this to get the data, models, etc.
+ ```
+
+ **Note**: You might need to install and set up the tools to pull from a remote. Read more in [this guide](https://dagshub.com/docs/getting-started/set-up-remote-storage-for-data-and-models/) on how to set up Google Storage or AWS S3 access.
+ * After you have finished your modifications, make sure to do the following:
+   * If you modified packages, make sure to freeze your virtualenv by typing in the terminal:
+
+     ```bash
+     $ make save_requirements
+     ```
+
+   * Push your code to DAGsHub, and your dvc managed files to your dvc remote. To set up a dvc remote please refer to [this guide](https://dagshub.com/docs/getting-started/set-up-remote-storage-for-data-and-models/).
+
 #### Docker
 * Next, clone the repository you just forked by typing the following command in your terminal:
 ```bash
@@ -53,7 +94,6 @@ We can treat Colab, as your web connected, GPU powered IDE. Here is a link to a
 ```
 
 
-
 * Pull the dvc files to your workspace by typing:
 
 ```bash
@@ -70,8 +110,11 @@ We can treat Colab, as your web connected, GPU powered IDE. Here is a link to a
 ```
 
 * Push your code to DAGsHub, and your dvc managed files to your dvc remote. In order to setup a dvc remote please refer to [this guide](https://dagshub.com/docs/getting-started/set-up-remote-storage-for-data-and-models/).
- * Create a Pull Request on DAGsHub!
- * 🐶
+
+ ---
+ ### After pushing code and data to DAGsHub
+ * Create a Pull Request on DAGsHub!
+ * 🐶
 
 ### TODO:
 [ ] Web UI
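
> **Note on remote storage access:** the README's remark that you might need to install and set up the tools to pull from a remote usually comes down to giving DVC credentials for the bucket. For Google Cloud Storage, a typical sequence looks like the sketch below; the remote name follows the README's examples, and the key file path is hypothetical:

```bash
# Install DVC with Google Storage support
pip install "dvc[gs]"

# Point the existing remote at a service-account key file (example path)
dvc remote modify dvc-remote credentialpath /path/to/service-account-key.json

# Pulls and pushes against the remote can now authenticate
dvc pull -r dvc-remote
```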
requirements.txt CHANGED
@@ -17,7 +17,7 @@ decorator==4.4.2
 dictdiffer==0.8.1
 distro==1.5.0
 dpath==2.0.1
- dvc==1.6.0
+ dvc==1.9.1
 fastai==2.0.0
 fastcore==1.0.0
 fastprogress==1.0.0
@@ -38,6 +38,7 @@ grandalf==0.6
 h5py==2.10.0
 idna==2.10
 importlib-metadata==1.7.0
+ ipykernel==5.3.4
 ipython==7.17.0
 ipython-genutils==0.2.0
 jedi==0.17.2
@@ -45,8 +46,6 @@ joblib==0.16.0
 jsonpath-ng==1.5.1
 kiwisolver==1.2.0
 matplotlib==3.3.1
- mkl-random==1.1.1
- mkl-service==2.3.0
 murmurhash==1.0.2
 nanotime==0.5.2
 networkx==2.4
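
> **Note on the requirements change:** after editing `requirements.txt` by hand (as this commit does by bumping `dvc` and dropping the `mkl-*` packages), the environment and the file are kept in sync through the Makefile targets the README references. A sketch of that loop, assuming those targets behave as the README describes:

```bash
# Re-install dependencies after requirements.txt changes
bash -c "source activate savta_depth && make load_requirements"

# After installing or removing packages manually, write the
# environment back to requirements.txt
bash -c "source activate savta_depth && make save_requirements"
```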