araffin committed
Commit a164664
1 Parent(s): 92862e4

Initial commit

.gitattributes CHANGED
@@ -25,3 +25,5 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zstandard filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ *.mp4 filter=lfs diff=lfs merge=lfs -text
+ vec_normalize.pkl filter=lfs diff=lfs merge=lfs -text
README.md ADDED
@@ -0,0 +1,57 @@
+ ---
+ library_name: stable-baselines3
+ tags:
+ - Walker2d-v3
+ - deep-reinforcement-learning
+ - reinforcement-learning
+ - stable-baselines3
+ model-index:
+ - name: A2C
+   results:
+   - metrics:
+     - type: mean_reward
+       value: 577.46 +/- 43.78
+       name: mean_reward
+     task:
+       type: reinforcement-learning
+       name: reinforcement-learning
+     dataset:
+       name: Walker2d-v3
+       type: Walker2d-v3
+ ---
+
+ # **A2C** Agent playing **Walker2d-v3**
+ This is a trained model of an **A2C** agent playing **Walker2d-v3**
+ using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3)
+ and the [RL Zoo](https://github.com/DLR-RM/rl-baselines3-zoo).
+
+ The RL Zoo is a training framework for Stable Baselines3
+ reinforcement learning agents,
+ with hyperparameter optimization and pre-trained agents included.
+
+ ## Usage (with SB3 RL Zoo)
+
+ RL Zoo: https://github.com/DLR-RM/rl-baselines3-zoo<br/>
+ SB3: https://github.com/DLR-RM/stable-baselines3<br/>
+ SB3 Contrib: https://github.com/Stable-Baselines-Team/stable-baselines3-contrib
+
+ ```
+ # Download model and save it into the logs/ folder
+ python -m utils.load_from_hub --algo a2c --env Walker2d-v3 -orga sb3 -f logs/
+ python enjoy.py --algo a2c --env Walker2d-v3 -f logs/
+ ```
+
+ ## Training (with the RL Zoo)
+ ```
+ python train.py --algo a2c --env Walker2d-v3 -f logs/
+ # Upload the model and generate video (when possible)
+ python -m utils.push_to_hub --algo a2c --env Walker2d-v3 -f logs/ -orga sb3
+ ```
+
+ ## Hyperparameters
+ ```python
+ OrderedDict([('n_timesteps', 1000000.0),
+              ('normalize', True),
+              ('policy', 'MlpPolicy'),
+              ('normalize_kwargs', {'norm_obs': True, 'norm_reward': False})])
+ ```
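For reference, the checkpoint and `vec_normalize.pkl` added in this commit can also be loaded directly with stable-baselines3, without going through the RL Zoo scripts. This is a minimal sketch, assuming both files have been downloaded to the working directory and that MuJoCo and `gym==0.21` are installed:

```python
import gym

from stable_baselines3 import A2C
from stable_baselines3.common.vec_env import DummyVecEnv, VecNormalize

# Recreate the environment and restore the saved observation normalization statistics
env = DummyVecEnv([lambda: gym.make("Walker2d-v3")])
env = VecNormalize.load("vec_normalize.pkl", env)
env.training = False     # do not update the running statistics at test time
env.norm_reward = False  # rewards were not normalized during training

model = A2C.load("a2c-Walker2d-v3.zip")

obs = env.reset()
for _ in range(1000):
    action, _ = model.predict(obs, deterministic=True)
    obs, reward, done, info = env.step(action)
```

Keeping `training=False` and `norm_reward=False` applies the stored observation statistics as-is and reports the raw episode return.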
a2c-Walker2d-v3.zip ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e19151d1d20e0bef0c8759612866ff0aa83713f27752ae5175c8adc91827e715
+ size 116000
a2c-Walker2d-v3/_stable_baselines3_version ADDED
@@ -0,0 +1 @@
+ 1.5.1a8
a2c-Walker2d-v3/data ADDED
@@ -0,0 +1,100 @@
+ {
+ "policy_class": {
+ ":type:": "<class 'abc.ABCMeta'>",
+ ":serialized:": "gAWVOwAAAAAAAACMIXN0YWJsZV9iYXNlbGluZXMzLmNvbW1vbi5wb2xpY2llc5SMEUFjdG9yQ3JpdGljUG9saWN5lJOULg==",
+ "__module__": "stable_baselines3.common.policies",
+ "__doc__": "\n Policy class for actor-critic algorithms (has both policy and value prediction).\n Used by A2C, PPO and the likes.\n\n :param observation_space: Observation space\n :param action_space: Action space\n :param lr_schedule: Learning rate schedule (could be constant)\n :param net_arch: The specification of the policy and value networks.\n :param activation_fn: Activation function\n :param ortho_init: Whether to use or not orthogonal initialization\n :param use_sde: Whether to use State Dependent Exploration or not\n :param log_std_init: Initial value for the log standard deviation\n :param full_std: Whether to use (n_features x n_actions) parameters\n for the std instead of only (n_features,) when using gSDE\n :param sde_net_arch: Network architecture for extracting features\n when using gSDE. If None, the latent features from the policy will be used.\n Pass an empty list to use the states as features.\n :param use_expln: Use ``expln()`` function instead of ``exp()`` to ensure\n a positive standard deviation (cf paper). It allows to keep variance\n above zero and prevent it from growing too fast. In practice, ``exp()`` is usually enough.\n :param squash_output: Whether to squash the output using a tanh function,\n this allows to ensure boundaries when using gSDE.\n :param features_extractor_class: Features extractor to use.\n :param features_extractor_kwargs: Keyword arguments\n to pass to the features extractor.\n :param normalize_images: Whether to normalize images or not,\n dividing by 255.0 (True by default)\n :param optimizer_class: The optimizer to use,\n ``th.optim.Adam`` by default\n :param optimizer_kwargs: Additional keyword arguments,\n excluding the learning rate, to pass to the optimizer\n ",
+ "__init__": "<function ActorCriticPolicy.__init__ at 0x7fe31d985950>",
+ "_get_constructor_parameters": "<function ActorCriticPolicy._get_constructor_parameters at 0x7fe31d9859e0>",
+ "reset_noise": "<function ActorCriticPolicy.reset_noise at 0x7fe31d985a70>",
+ "_build_mlp_extractor": "<function ActorCriticPolicy._build_mlp_extractor at 0x7fe31d985b00>",
+ "_build": "<function ActorCriticPolicy._build at 0x7fe31d985b90>",
+ "forward": "<function ActorCriticPolicy.forward at 0x7fe31d985c20>",
+ "_get_action_dist_from_latent": "<function ActorCriticPolicy._get_action_dist_from_latent at 0x7fe31d985cb0>",
+ "_predict": "<function ActorCriticPolicy._predict at 0x7fe31d985d40>",
+ "evaluate_actions": "<function ActorCriticPolicy.evaluate_actions at 0x7fe31d985dd0>",
+ "get_distribution": "<function ActorCriticPolicy.get_distribution at 0x7fe31d985e60>",
+ "predict_values": "<function ActorCriticPolicy.predict_values at 0x7fe31d985ef0>",
+ "__abstractmethods__": "frozenset()",
+ "_abc_impl": "<_abc_data object at 0x7fe31d9df210>"
+ },
+ "verbose": 1,
+ "policy_kwargs": {
+ ":type:": "<class 'dict'>",
+ ":serialized:": "gAWVgQAAAAAAAAB9lCiMD29wdGltaXplcl9jbGFzc5SME3RvcmNoLm9wdGltLnJtc3Byb3CUjAdSTVNwcm9wlJOUjBBvcHRpbWl6ZXJfa3dhcmdzlH2UKIwFYWxwaGGURz/vrhR64UeujANlcHOURz7k+LWI42jxjAx3ZWlnaHRfZGVjYXmUSwB1dS4=",
+ "optimizer_class": "<class 'torch.optim.rmsprop.RMSprop'>",
+ "optimizer_kwargs": {
+ "alpha": 0.99,
+ "eps": 1e-05,
+ "weight_decay": 0
+ }
+ },
+ "observation_space": {
+ ":type:": "<class 'gym.spaces.box.Box'>",
+ ":serialized:": "gAWVgQIAAAAAAACMDmd5bS5zcGFjZXMuYm94lIwDQm94lJOUKYGUfZQojAVkdHlwZZSMBW51bXB5lGgFk5SMAmY4lImIh5RSlChLA4wBPJROTk5K/////0r/////SwB0lGKMBl9zaGFwZZRLEYWUjANsb3eUjBJudW1weS5jb3JlLm51bWVyaWOUjAtfZnJvbWJ1ZmZlcpSTlCiWiAAAAAAAAAAAAAAAAADw/wAAAAAAAPD/AAAAAAAA8P8AAAAAAADw/wAAAAAAAPD/AAAAAAAA8P8AAAAAAADw/wAAAAAAAPD/AAAAAAAA8P8AAAAAAADw/wAAAAAAAPD/AAAAAAAA8P8AAAAAAADw/wAAAAAAAPD/AAAAAAAA8P8AAAAAAADw/wAAAAAAAPD/lGgKSxGFlIwBQ5R0lFKUjARoaWdolGgSKJaIAAAAAAAAAAAAAAAAAPB/AAAAAAAA8H8AAAAAAADwfwAAAAAAAPB/AAAAAAAA8H8AAAAAAADwfwAAAAAAAPB/AAAAAAAA8H8AAAAAAADwfwAAAAAAAPB/AAAAAAAA8H8AAAAAAADwfwAAAAAAAPB/AAAAAAAA8H8AAAAAAADwfwAAAAAAAPB/AAAAAAAA8H+UaApLEYWUaBV0lFKUjA1ib3VuZGVkX2JlbG93lGgSKJYRAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAlGgHjAJiMZSJiIeUUpQoSwOMAXyUTk5OSv////9K/////0sAdJRiSxGFlGgVdJRSlIwNYm91bmRlZF9hYm92ZZRoEiiWEQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAJRoIUsRhZRoFXSUUpSMCl9ucF9yYW5kb22UTnViLg==",
+ "dtype": "float64",
+ "_shape": [
+ 17
+ ],
+ "low": "[-inf -inf -inf -inf -inf -inf -inf -inf -inf -inf -inf -inf -inf -inf\n -inf -inf -inf]",
+ "high": "[inf inf inf inf inf inf inf inf inf inf inf inf inf inf inf inf inf]",
+ "bounded_below": "[False False False False False False False False False False False False\n False False False False False]",
+ "bounded_above": "[False False False False False False False False False False False False\n False False False False False]",
+ "_np_random": null
+ },
+ "action_space": {
+ ":type:": "<class 'gym.spaces.box.Box'>",
+ ":serialized:": "gAWVEwwAAAAAAACMDmd5bS5zcGFjZXMuYm94lIwDQm94lJOUKYGUfZQojAVkdHlwZZSMBW51bXB5lGgFk5SMAmY0lImIh5RSlChLA4wBPJROTk5K/////0r/////SwB0lGKMBl9zaGFwZZRLBoWUjANsb3eUjBJudW1weS5jb3JlLm51bWVyaWOUjAtfZnJvbWJ1ZmZlcpSTlCiWGAAAAAAAAAAAAIC/AACAvwAAgL8AAIC/AACAvwAAgL+UaApLBoWUjAFDlHSUUpSMBGhpZ2iUaBIolhgAAAAAAAAAAACAPwAAgD8AAIA/AACAPwAAgD8AAIA/lGgKSwaFlGgVdJRSlIwNYm91bmRlZF9iZWxvd5RoEiiWBgAAAAAAAAABAQEBAQGUaAeMAmIxlImIh5RSlChLA4wBfJROTk5K/////0r/////SwB0lGJLBoWUaBV0lFKUjA1ib3VuZGVkX2Fib3ZllGgSKJYGAAAAAAAAAAEBAQEBAZRoIUsGhZRoFXSUUpSMCl9ucF9yYW5kb22UjBRudW1weS5yYW5kb20uX3BpY2tsZZSMEl9fcmFuZG9tc3RhdGVfY3RvcpSTlIwHTVQxOTkzN5SFlFKUfZQojA1iaXRfZ2VuZXJhdG9ylGgwjAVzdGF0ZZR9lCiMA2tleZRoEiiWwAkAAAAAAAAAAACAU8KznIcDtZNy7Ktb6Oay8s+2gdrVBu9hoTFNoGu1zNkT5hifdJx5L8ilG4DEeQFJng9D5F3gGJOSE1XM1EopZNIIlb400J5EcnoD8K2/CnObez7pYLEG2nUDRQtufdYWausENGaDt/P1pS9p70JjQ7Vc98J3UsxGRDctCIlu0I6ud/sYtoBPe575TzLsEti5jl6FqRnKrj12LWcrQoCexe7HH/UiAV1LzyQPzBlSZERXmHCdCvUSF7XpWt47xP9BzzqxX7aH3TPYWImqos1/ez/JlLdsD0MfMZl9G2CQq7cHHRlM3sj7jroA9c+pGt4l/iAGpRb80HbjwU71ykPTAVp531BXrc2qmIU6z9Fh4TAPx7fZ1kVF+L1Irlou+4Ckky7Ys59nB7KkciTI+N5jlb62ybZt0+ZWgIA6LKLvdx/mTQtB4k1aplT/C7L9/ybKCFn2quN/7YlIkxoH1U0xdabG6rgOrR+SHMmvUwvtKB+19Ibb07mSgVQyjNAvnyADPJf3pkxylZtn7f/OVpWEaWfl6BcLwy0grrEgUK+H+8P8XWMuBginXgwzn3sy4+ZOlr45op6TtuqX0Knz/SySGDlBIK8JqKObzB6fGt+ovJHEM8KlL4veKwkLkuuMWBaex3FBdWskry5qhslxMgnk2thh8DaXmAfbuI8j0SqHMW1kleITi9ekfXx/eSi5hX1GjA/M62Zixuay1H8zH9VjsTRcGacyJ0vh1hNReDFoNsXFbLfLqaIvbLDQjY7T289ZXsupvAxu2GVTbqWst+ckPPzwH7vLikULC+weAKwxarqm+ugAXgyz774meHOsvQYuu18nvrrunjZWDvwaKuYohEwUfSnpotE9XhX99yUTc8sGPQidTfXkzm/t8MWP8it4l4VSEgDLn8GW8t2DAh8EwFa/KOGoZEGjYqZ2IMA70E+F2LqgaZlQLFMONTIx3yuN5F2e1MT4v2wdBRK9R+lGMpxIiNldyOwwxLDBTRDMhd7APidmDwQBnvaIecKFa95btwHkRBEUT5g++/I0DDg685EX4OMO2YtTPqM3PQluS4puEhAQRVukNGSh4gYDgcBPKZl4ThNf+G+E7El9fmWJcP39Sifw6Mn+GEisM1RhHY05XZHUv5W4r8kD2jSLMY+IIL2+LtQrW7it7y28+sEicLoEfYOky9ZJF6l0fR+sXEawf+REH9LvtRJ4yzfxr7KisNpr1axv1ae5CDXS+XTzuOG/BJnHvt8arnY1XWH9SdkCOeok6MI8GBCtjTCxJ5JbpI5J0i0A66mJaRW9LMfP6Cil3/cVRQ9uN2KTtV3o7rJwY4XCnj7DJmqrUwofDDl7Ek0PoN7w0Hh8YHOy8qhPw7V8ALdjZn7eYtjCQIldQvHbM1I73RtCLQvQGFMXUCJ022pGRqTvZX5XWSizqbgX6TJmI6LDF9wcpYealB7cDwelfqdpzHRmyjRbIX9b+w4uj//aDRgP2SgiOAq/D/9/0SbgK/E0FQyclhNVAkbKwXhAxKGczpvJow0mFFUAt/5fT5KAsmQTAt8p0FsrGMDTfk4RzZgqZSm+ihVRS371Tx3twpGA1goo/AIfJh8slJC3hkR1OGCN7LAPGCwbM9rHlKSU4uuhJiff196h9q1kPMld6989MfKLVkvCl7ofCRurPUW46ceJKE951sQD1v8cK0HK1JmuBTCXAelCUCIFNLGk3tMXNVmuuFF3o3xb4V4T1IAYIfBdyEVHhIIZOE/JEY79daQw8njYEtQ6YwZ6kNCBYfrjq2OglITcRdwDmINL42ro6HnbWgLZQ8Ce/EiPVBtWHwhvGUHK1FNONzRzXgT1zKEg+WAigeuK4QVIxdITM4YvUyYvpQJuJd+xGD1no7BYIKXdV4aDlsRnWSMmS+zTyTvC0+TgBMCNpMvdChjaB/XTrMVsm0vgPmCYswn067MTYWfm5oCqqmNciqoRfFL2O2mxFT1VMcKDrxHBdBUhSG5UmAerx86KAEytbsCbn6OOj8Y02VwVynzXd0WJfLioeGMZISM1eneWfTc1mQ6CpdDxJqUmU86/KsBL3Bb0S2NAqFysFJZKxDwLej8xz+xH8IxEHzlkiiNH+2IIq0663FAwi6wg6dgcryDqQ+lNDwn898nylrcYShigDrtrFBNezKx3ZjpkPCnPUeQB4hJUrYCUJy5CyytC/x1UsByKez/aSNEWnlWnzYdJf2PoKL0YfmaR3KpXzi9ax3BHPgk1cdmgdVkqevFJ0DUdTBFQj/mhaKqcaT0rKJLgy/11AhWW4nX7+kAdgR0b1iAseI0TbMDtohBuqqUZfqMfUKsdI8v2aeUd0+IqOjPBFe7TZRC7OUYmf789SRTpw9gst4tzx7tLap8JnFt2keKhqd3vBgqpvlsxvx0DcPC+bo/qIldKiAn5D7TPjeWLzJ1gmpk1mVKOyWOv/ZzlRTfe8yEsMsRcgdPxbOuxLjlOwo1uFh9NjHoOz/xbnI62I49ZzT59GUCNtAL74UqjlRoyXZ5ELEjhTn+F5fYfEkY2TnSsgKO4Wwb/xD41S4mBL7LcUyF76ybV7Yx0L6V2QGoSfyhHFqMQJs/haLPPW18mWJb/UDl90ZN9TEzcdXvZsmCeqzCagC6YDHp3fop+5nAQSnT/Byt2j7z+6cnl/aZh6oKs5xrEMmuzpLFbXNVof9hNmX5E0DQ2M8uBqqeW95p6z8ySnOxURAO28oYWsbVyeYaNlWLZrOtIMZDRjjbecSSwMLlrBhw4mZVht4DgOQxI1+P7sPHZLMf89U+5ctf1rD0r1AXgyXjzOxKvCxWMhrz6Ah19+zal/bAIpw+0V7Pq85PRQO4UeScmMwOD
R8jcOfILuMmo7xXhemY/JqtOncklEaGapMeGlkiefvQkx9L5EWvLn6stI4zRP4pZXx9iOz17IKJmKOVHgCIAOiheb0bwkjNkItlfYO3LzeLLPuBDNLFg7tQu5NPWy28a4nBsE/gsyEteRvF2ECYFIOJg06dzc77IWw7o+z1Q5APxLg9uvyFniYWNuJyk7rflLCmYcg1gN657CWff8YfPr0ukKOamco94X1nFdyroxHiQlRXaP91DOqMueI1pCasyRQt0jtbWwxdEVyzP3GzUZXBWqa0xXCzwe29cxg2aiwKuuAAVfaCE/Pt1cJXq8wvliF81sMDPMbowd9+uyWuExq/e+2W3wWeV3hVofoiEySjBrJPWVJW9++UocJbC0ppNw5mtHktkZqUk6kVtUgVQ4Cj4udj/bluZzcqWjIvOCJO52M+xcQY808Ei8T/lwwS9TguuzQ3e0KR7hptgNcX1/XhCvAuUaAeMAnU0lImIh5RSlChLA2gLTk5OSv////9K/////0sAdJRiTXAChZRoFXSUUpSMA3Bvc5RNcAJ1jAloYXNfZ2F1c3OUSwCMBWdhdXNzlEcAAAAAAAAAAHVidWIu",
+ "dtype": "float32",
+ "_shape": [
+ 6
+ ],
+ "low": "[-1. -1. -1. -1. -1. -1.]",
+ "high": "[1. 1. 1. 1. 1. 1.]",
+ "bounded_below": "[ True True True True True True]",
+ "bounded_above": "[ True True True True True True]",
+ "_np_random": "RandomState(MT19937)"
+ },
+ "n_envs": 1,
+ "num_timesteps": 1000000,
+ "_total_timesteps": 1000000,
+ "_num_timesteps_at_start": 0,
+ "seed": 0,
+ "action_noise": null,
+ "start_time": 1652869968.9647288,
+ "learning_rate": 0.0007,
+ "tensorboard_log": null,
+ "lr_schedule": {
+ ":type:": "<class 'function'>",
+ ":serialized:": "gAWV0QIAAAAAAACMF2Nsb3VkcGlja2xlLmNsb3VkcGlja2xllIwNX2J1aWx0aW5fdHlwZZSTlIwKTGFtYmRhVHlwZZSFlFKUKGgCjAhDb2RlVHlwZZSFlFKUKEsBSwBLAUsBSxNDBIgAUwCUToWUKYwBX5SFlIxRL2hvbWUvYW50b25pbi9Eb2N1bWVudHMvZGxyL3JsL3RvcmNoeS1iYXNlbGluZXMvc3RhYmxlX2Jhc2VsaW5lczMvY29tbW9uL3V0aWxzLnB5lIwEZnVuY5RLgEMCAAGUjAN2YWyUhZQpdJRSlH2UKIwLX19wYWNrYWdlX1+UjBhzdGFibGVfYmFzZWxpbmVzMy5jb21tb26UjAhfX25hbWVfX5SMHnN0YWJsZV9iYXNlbGluZXMzLmNvbW1vbi51dGlsc5SMCF9fZmlsZV9flIxRL2hvbWUvYW50b25pbi9Eb2N1bWVudHMvZGxyL3JsL3RvcmNoeS1iYXNlbGluZXMvc3RhYmxlX2Jhc2VsaW5lczMvY29tbW9uL3V0aWxzLnB5lHVOTmgAjBBfbWFrZV9lbXB0eV9jZWxslJOUKVKUhZR0lFKUjBxjbG91ZHBpY2tsZS5jbG91ZHBpY2tsZV9mYXN0lIwSX2Z1bmN0aW9uX3NldHN0YXRllJOUaCB9lH2UKGgXaA6MDF9fcXVhbG5hbWVfX5SMGWNvbnN0YW50X2ZuLjxsb2NhbHM+LmZ1bmOUjA9fX2Fubm90YXRpb25zX1+UfZSMDl9fa3dkZWZhdWx0c19flE6MDF9fZGVmYXVsdHNfX5ROjApfX21vZHVsZV9flGgYjAdfX2RvY19flE6MC19fY2xvc3VyZV9flGgAjApfbWFrZV9jZWxslJOURz9G8AaNuLrHhZRSlIWUjBdfY2xvdWRwaWNrbGVfc3VibW9kdWxlc5RdlIwLX19nbG9iYWxzX1+UfZR1hpSGUjAu"
+ },
+ "_last_obs": null,
+ "_last_episode_starts": {
+ ":type:": "<class 'numpy.ndarray'>",
+ ":serialized:": "gAWVdAAAAAAAAACMEm51bXB5LmNvcmUubnVtZXJpY5SMC19mcm9tYnVmZmVylJOUKJYBAAAAAAAAAACUjAVudW1weZSMBWR0eXBllJOUjAJiMZSJiIeUUpQoSwOMAXyUTk5OSv////9K/////0sAdJRiSwGFlIwBQ5R0lFKULg=="
+ },
+ "_last_original_obs": {
+ ":type:": "<class 'numpy.ndarray'>",
+ ":serialized:": "gAWV/QAAAAAAAACMEm51bXB5LmNvcmUubnVtZXJpY5SMC19mcm9tYnVmZmVylJOUKJaIAAAAAAAAAHpl+NVF+vM/ir6IPenzVb/AtlLZ6AFNP7hjRhrxB02/5DKDw7eqYj/jHQrFk9VyP/77DTjJHm8/sDeUomAWPr8KOozpUP5xv3tEIzZK22u/tZqlKMc0dL8elOB9d3ZnP0AvEIFTpnK/Qh5qwE9KXr+eK7lN5vRgP4AM10vYJDK/H833WqCDb7+UjAVudW1weZSMBWR0eXBllJOUjAJmOJSJiIeUUpQoSwOMATyUTk5OSv////9K/////0sAdJRiSwFLEYaUjAFDlHSUUpQu"
+ },
+ "_episode_num": 0,
+ "use_sde": false,
+ "sde_sample_freq": -1,
+ "_current_progress_remaining": 0.0,
+ "ep_info_buffer": {
+ ":type:": "<class 'collections.deque'>",
+ ":serialized:": "gAWVeBAAAAAAAACMC2NvbGxlY3Rpb25zlIwFZGVxdWWUk5QpS2SGlFKUKH2UKIwBcpSMFW51bXB5LmNvcmUubXVsdGlhcnJheZSMBnNjYWxhcpSTlIwFbnVtcHmUjAVkdHlwZZSTlIwCZjiUiYiHlFKUKEsDjAE8lE5OTkr/////Sv////9LAHSUYkMIlL97Rw0yhUCUhpRSlIwBbJRNHgGMAXSUR0CT9cSzgMtsdX2UKGgGaAloD0MI0qkrn+X59b+UhpRSlGgVS3NoFkdAk/ZU+C9RJnV9lChoBmgJaA9DCLKACdy6Ky9AlIaUUpRoFUt1aBZHQJP27p/wy7B1fZQoaAZoCWgPQwi3RC44gxMyQJSGlFKUaBVLhWgWR0CT95S3b212dX2UKGgGaAloD0MIHomXp7NchkCUhpRSlGgVTTEBaBZHQJP5G4lQdjp1fZQoaAZoCWgPQwjdDDfgs0yGQJSGlFKUaBVNNgFoFkdAk/qnvx6OYXV9lChoBmgJaA9DCHWPbK46qoZAlIaUUpRoFU06AWgWR0CT/Dn+AEt/dX2UKGgGaAloD0MIrFJ6phdGhECUhpRSlGgVTSUBaBZHQJP9rzH0btJ1fZQoaAZoCWgPQwgMIHwoccKGQJSGlFKUaBVNPwFoFkdAk/9Gv8qFy3V9lChoBmgJaA9DCBwKn63DuoZAlIaUUpRoFU03AWgWR0CUANKB/ZuidX2UKGgGaAloD0MIH4ZWJ8eyhkCUhpRSlGgVTUQBaBZHQJQCby5I6Kd1fZQoaAZoCWgPQwhjRKLQcs6FQJSGlFKUaBVNLgFoFkdAlAPwMYuTR3V9lChoBmgJaA9DCLtfBfjueIZAlIaUUpRoFU1BAWgWR0CUBYUEPlMidX2UKGgGaAloD0MISGsMOkHbhUCUhpRSlGgVTTEBaBZHQJQHCKUFB6d1fZQoaAZoCWgPQwh+dOrKR7+GQJSGlFKUaBVNSQFoFkdAlAisbFS88XV9lChoBmgJaA9DCO19qgpNb4VAlIaUUpRoFU0qAWgWR0CUCig5R0lrdX2UKGgGaAloD0MIueNNfgsphkCUhpRSlGgVTTUBaBZHQJQLrYK6WgR1fZQoaAZoCWgPQwilh6HVCQKHQJSGlFKUaBVNRAFoFkdAlA1KciGFjHV9lChoBmgJaA9DCK4P643aooVAlIaUUpRoFU0nAWgWR0CUDr9Gqgh9dX2UKGgGaAloD0MIWz/9Zy3VhUCUhpRSlGgVTTIBaBZHQJQQRJ17pmp1fZQoaAZoCWgPQwgj88gfDP6FQJSGlFKUaBVNPQFoFkdAlBHbC79Q43V9lChoBmgJaA9DCO+MtiqJx1BAlIaUUpRoFUtWaBZHQJQSSPq9oOB1fZQoaAZoCWgPQwitbB/y9gyGQJSGlFKUaBVNNgFoFkdAlBPQ+Y+jd3V9lChoBmgJaA9DCF2nkZaqAIRAlIaUUpRoFU0bAWgWR0CUFT3TNMXadX2UKGgGaAloD0MIB5s6jypthECUhpRSlGgVTRwBaBZHQJQWnbxmTTx1fZQoaAZoCWgPQwgwgVt3c8KEQJSGlFKUaBVNIwFoFkdAlBgNPxhDxHV9lChoBmgJaA9DCEUOETen1INAlIaUUpRoFU0eAWgWR0CUGXJBPbfxdX2UKGgGaAloD0MIBp/m5CXKhUCUhpRSlGgVTTYBaBZHQJQa9qpLmIV1fZQoaAZoCWgPQwibcRqiiouEQJSGlFKUaBVNGQFoFkdAlBxXAEdNnHV9lChoBmgJaA9DCLNeDOWkcIVAlIaUUpRoFU0ZAWgWR0CUHbT1CgK4dX2UKGgGaAloD0MICJJ3DsUniECUhpRSlGgVTUQBaBZHQJQfSa7VawF1fZQoaAZoCWgPQwiHTs+7sQ6HQJSGlFKUaBVNOwFoFkdAlCDSa3I+4nV9lChoBmgJaA9DCAWk/Q8QwIdAlIaUUpRoFU1QAWgWR0CUInYDDCP7dX2UKGgGaAloD0MIO8JpwevLhUCUhpRSlGgVTScBaBZHQJQj8P1+RYB1fZQoaAZoCWgPQwj5SEp6eFGFQJSGlFKUaBVNIwFoFkdAlCVoRZlnRXV9lChoBmgJaA9DCOpdvB/3GoNAlIaUUpRoFU0JAWgWR0CUKd1JUYKqdX2UKGgGaAloD0MIMzffiA6zhUCUhpRSlGgVTTEBaBZHQJQrY6IWP911fZQoaAZoCWgPQwiE1sOXyVqKQJSGlFKUaBVNUQFoFkdAlC0Z8a4tpXV9lChoBmgJaA9DCL9k48F2FIZAlIaUUpRoFU0cAWgWR0CULoeenQ6ZdX2UKGgGaAloD0MInIh+bb1Ei0CUhpRSlGgVTW0BaBZHQJQwW1PWQOp1fZQoaAZoCWgPQwhRiIBDaOCIQJSGlFKUaBVNTgFoFkdAlDIHKwIMSnV9lChoBmgJaA9DCFDCTNu/yopAlIaUUpRoFU1iAWgWR0CUM824uscRdX2UKGgGaAloD0MI499nXJh5iECUhpRSlGgVTUUBaBZHQJQ1bjvNNah1fZQoaAZoCWgPQwiUh4Vac2iIQJSGlFKUaBVNRgFoFkdAlDcQq3EycnV9lChoBmgJaA9DCEzEW+evSpBAlIaUUpRoFU2gAWgWR0CUOSlOXVsldX2UKGgGaAloD0MIA7StZv2tgkCUhpRSlGgVTSUBaBZHQJQ6oQL/jsF1fZQoaAZoCWgPQwgTDr3F40qMQJSGlFKUaBVNgQFoFkdAlDyQMDwH7nV9lChoBmgJaA9DCBlybD2jy4lAlIaUUpRoFU1GAWgWR0CUPjNaQmu1dX2UKGgGaAloD0MIhgSMLk89jECUhpRSlGgVTXIBaBZHQJRADgl4TsZ1fZQoaAZoCWgPQwgzUu+pXBmLQJSGlFKUaBVNaQFoFkdAlEHltj0+T3V9lChoBmgJaA9DCLvx7sg4doxAlIaUUpRoFU2CAWgWR0CUQ9qHXVbzdX2UKGgGaAloD0MIAVEwY0rZi0CUhpRSlGgVTY0BaBZHQJRF3zbvgFZ1fZQoaAZoCWgPQwiphCf0ukWOQJSGlFKUaBVNigFoFkdAlEffa+N96XV9lChoBmgJaA9DCEpFY+2vTI1AlIaUUpRoFU2UAWgWR0CUSfCQtBfKdX2UKGgGaAloD0MIAYV6+kjEcUCUhpRSlGgVTSABaBZHQJRLZciW3Sd1fZQoaAZoCWgPQwiRY+sZokGQQJSGlFKUaBVNsgFoFkdAlE2aW1MM7XV9lChoBmgJaA9DCBjQC3fubnFAlIaUUpRoFU0qAWgWR0CUTyFFUhmodX2UKGgGaAloD0MIeZCeIofnlECUhpRSlGgVTV0CaBZHQJRSNOLzf791fZQoaAZoCWgPQwjdeHdkrMxnQJSGlFKUaBVL3mgWR0CUU1SaVlf7dX2UKGgGaAloD0MIm8dhMN+pi0CUhpRSlGgVTW0BaBZHQJRVL+1jRUp1fZQoaAZoCWgPQwj8x0J0GCWTQJSGlFKUaBVNMwJoFkdAlFgMwL3K0XV9lChoBmgJaA9DCD56w310gZVAlIaU
UpRoFU2MAmgWR0CUW2JF9a2XdX2UKGgGaAloD0MIcVZETZQrkECUhpRSlGgVTbIBaBZHQJRdmmvW6LB1fZQoaAZoCWgPQwiunL0zmsJ0QJSGlFKUaBVNWQFoFkdAlF9fNiYsunV9lChoBmgJaA9DCOGVJM/VNpJAlIaUUpRoFU0RAmgWR0CUYhQaJhvzdX2UKGgGaAloD0MIdlJfltYIkUCUhpRSlGgVTfABaBZHQJRkmrMkhRt1fZQoaAZoCWgPQwhvnBTm/T6QQJSGlFKUaBVN0QFoFkdAlGb4BV+7UXV9lChoBmgJaA9DCAQBMnTccJBAlIaUUpRoFU29AWgWR0CUaTqXF98adX2UKGgGaAloD0MIda+T+nKGj0CUhpRSlGgVTakBaBZHQJRrYuSOinJ1fZQoaAZoCWgPQwiDiqpfiaeLQJSGlFKUaBVNdAFoFkdAlG1M7dSEUXV9lChoBmgJaA9DCEinrnyWwJBAlIaUUpRoFU3TAWgWR0CUb6or4FibdX2UKGgGaAloD0MITTCca3j5jkCUhpRSlGgVTbkBaBZHQJRx5RVIZqF1fZQoaAZoCWgPQwhWfa62QneQQJSGlFKUaBVN1AFoFkdAlHRD1kDp1XV9lChoBmgJaA9DCEIHXcJx2JFAlIaUUpRoFU3nAWgWR0CUdrbm2b5NdX2UKGgGaAloD0MIxLRv7i8NdUCUhpRSlGgVTWABaBZHQJR4gSzw+dN1fZQoaAZoCWgPQwis/gjDcD+TQJSGlFKUaBVNFQJoFkdAlHs9VJcxCnV9lChoBmgJaA9DCAlU/yAyCpFAlIaUUpRoFU0UAmgWR0CUffSm65G0dX2UKGgGaAloD0MIHAbzV4ifj0CUhpRSlGgVTboBaBZHQJSANfKISDh1fZQoaAZoCWgPQwiNJ4I4f+2RQJSGlFKUaBVNDQJoFkdAlILhXfZVXHV9lChoBmgJaA9DCMO7XMQXr5RAlIaUUpRoFU12AmgWR0CUhhZTAFgVdX2UKGgGaAloD0MIRQ4RN9cMlECUhpRSlGgVTR4CaBZHQJSI1N9H+ZR1fZQoaAZoCWgPQwiMSuoEpG6XQJSGlFKUaBVNjwJoFkdAlIwoD1XeWXV9lChoBmgJaA9DCAtgysABvXZAlIaUUpRoFU1qAWgWR0CUjgLEk0JodX2UKGgGaAloD0MImGn7V9aBa0CUhpRSlGgVS+5oFkdAlI84+Sr5qXV9lChoBmgJaA9DCOG1SxtOzZBAlIaUUpRoFU3SAWgWR0CUkZn/1g6VdX2UKGgGaAloD0MIEy15PE12kECUhpRSlGgVTbwBaBZHQJST3N2TxG51fZQoaAZoCWgPQwhZhjjWpWiLQJSGlFKUaBVNaQFoFkdAlJWwnlXA/XV9lChoBmgJaA9DCKTfvg5c44ZAlIaUUpRoFU0mAWgWR0CUlzKWszVMdX2UKGgGaAloD0MI5/up8VLch0CUhpRSlGgVTVcBaBZHQJSY83bVSXN1fZQoaAZoCWgPQwhrf2d7lO2GQJSGlFKUaBVNIgFoFkdAlJptH2AXmHV9lChoBmgJaA9DCKUWSiYHSoRAlIaUUpRoFU0LAWgWR0CUm8hq0tyxdX2UKGgGaAloD0MI/TGtTWMwhECUhpRSlGgVTQYBaBZHQJSdHsu3+dd1fZQoaAZoCWgPQwiFQgQcAiOLQJSGlFKUaBVNZQFoFkdAlJ7zJhfBvnV9lChoBmgJaA9DCFESEmm7IYZAlIaUUpRoFU0XAWgWR0CUoFxxkupTdX2UKGgGaAloD0MIeO+oMaGWhECUhpRSlGgVTRMBaBZHQJShvhxYJVt1fZQoaAZoCWgPQwjjUSrh6Q+KQJSGlFKUaBVNSQFoFkdAlKNmRmseXHV9lChoBmgJaA9DCCf20D4W44hAlIaUUpRoFU04AWgWR0CUpPXqJMxodX2UKGgGaAloD0MI/5O/e0dtOkCUhpRSlGgVS5xoFkdAlKXBx5s0pHV9lChoBmgJaA9DCFXZd0Xwu4BAlIaUUpRoFUvxaBZHQJSm9zQu27Z1fZQoaAZoCWgPQwgSL0/nirIdwJSGlFKUaBVLmGgWR0CUp7yEtdzGdWUu"
+ },
+ "ep_success_buffer": {
+ ":type:": "<class 'collections.deque'>",
+ ":serialized:": "gAWVIAAAAAAAAACMC2NvbGxlY3Rpb25zlIwFZGVxdWWUk5QpS2SGlFKULg=="
+ },
+ "_n_updates": 200000,
+ "n_steps": 5,
+ "gamma": 0.99,
+ "gae_lambda": 1.0,
+ "ent_coef": 0.0,
+ "vf_coef": 0.5,
+ "max_grad_norm": 0.5,
+ "normalize_advantage": false
+ }
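The `data` file above is the serialized A2C configuration that `A2C.load` restores alongside the network weights. A short sketch of how the recorded fields (observation/action spaces, `n_steps`, `gamma`, the RMSprop optimizer settings) can be inspected after loading; the attribute names follow stable-baselines3 1.5.x and the zip is assumed to be downloaded locally:

```python
from stable_baselines3 import A2C

# Loading the zip rebuilds the fields serialized in the `data` file above
model = A2C.load("a2c-Walker2d-v3.zip")

print(model.observation_space)                       # Box(17,), float64, unbounded (Walker2d observations)
print(model.action_space)                            # Box(6,), float32, bounded in [-1, 1]
print(model.n_steps, model.gamma, model.gae_lambda)  # 5, 0.99, 1.0
print(model.policy.optimizer)                        # RMSprop with alpha=0.99, eps=1e-05, weight_decay=0
```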
a2c-Walker2d-v3/policy.optimizer.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e8490cffb0a6d9921c262f88f3ceb6b077b2ff4afe77379f10cb7f4f8de86274
+ size 47934
a2c-Walker2d-v3/policy.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:53a06389a7ff1246b82fd47938b52bbd71391f912535d416cb9fd6757b351433
+ size 48510
a2c-Walker2d-v3/pytorch_variables.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d030ad8db708280fcae77d87e973102039acd23a11bdecc3db8eb6c0ac940ee1
+ size 431
a2c-Walker2d-v3/system_info.txt ADDED
@@ -0,0 +1,7 @@
+ OS: Linux-5.4.0-113-generic-x86_64-with-debian-bullseye-sid #127-Ubuntu SMP Wed May 18 14:30:56 UTC 2022
+ Python: 3.7.12
+ Stable-Baselines3: 1.5.1a8
+ PyTorch: 1.11.0+cpu
+ GPU Enabled: False
+ Numpy: 1.21.6
+ Gym: 0.21.0
args.yml ADDED
@@ -0,0 +1,73 @@
+ !!python/object/apply:collections.OrderedDict
+ - - - algo
+ - a2c
+ - - device
+ - auto
+ - - env
+ - Walker2d-v3
+ - - env_kwargs
+ - null
+ - - eval_episodes
+ - 5
+ - - eval_freq
+ - 25000
+ - - gym_packages
+ - []
+ - - hyperparams
+ - null
+ - - log_folder
+ - logs
+ - - log_interval
+ - -1
+ - - n_eval_envs
+ - 1
+ - - n_evaluations
+ - null
+ - - n_jobs
+ - 1
+ - - n_startup_trials
+ - 10
+ - - n_timesteps
+ - -1
+ - - n_trials
+ - 500
+ - - no_optim_plots
+ - false
+ - - num_threads
+ - -1
+ - - optimization_log_path
+ - null
+ - - optimize_hyperparameters
+ - false
+ - - pruner
+ - median
+ - - sampler
+ - tpe
+ - - save_freq
+ - -1
+ - - save_replay_buffer
+ - false
+ - - seed
+ - 1103724495
+ - - storage
+ - null
+ - - study_name
+ - null
+ - - tensorboard_log
+ - ''
+ - - track
+ - false
+ - - trained_agent
+ - ''
+ - - truncate_last_trajectory
+ - true
+ - - uuid
+ - false
+ - - vec_env
+ - dummy
+ - - verbose
+ - 1
+ - - wandb_entity
+ - null
+ - - wandb_project_name
+ - sb3
config.yml ADDED
@@ -0,0 +1,7 @@
+ !!python/object/apply:collections.OrderedDict
+ - - - n_timesteps
+ - 1000000.0
+ - - normalize
+ - true
+ - - policy
+ - MlpPolicy
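Together with the `normalize_kwargs` recorded in the README hyperparameters, this config corresponds roughly to the following plain stable-baselines3 training setup. This is a hedged sketch, not the exact RL Zoo pipeline: `make_vec_env` and the default A2C arguments (e.g. the RMSprop optimizer) come from SB3 1.5.x, and evaluation callbacks, seeding details and logging are omitted:

```python
from stable_baselines3 import A2C
from stable_baselines3.common.env_util import make_vec_env
from stable_baselines3.common.vec_env import VecNormalize

# normalize: true -> wrap the env in VecNormalize; the README records
# normalize_kwargs = {'norm_obs': True, 'norm_reward': False}
env = make_vec_env("Walker2d-v3", n_envs=1)
env = VecNormalize(env, norm_obs=True, norm_reward=False)

model = A2C("MlpPolicy", env, verbose=1)
model.learn(total_timesteps=1_000_000)  # n_timesteps: 1000000.0

model.save("a2c-Walker2d-v3")
env.save("vec_normalize.pkl")  # normalization statistics shipped in this repository
```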
env_kwargs.yml ADDED
@@ -0,0 +1 @@
+ {}
replay.mp4 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b961d5cb8ecb2cb393f9b682cf1f7578536bb0d5ca5ce9c87d096f73d5cb0371
+ size 1087233
results.json ADDED
@@ -0,0 +1 @@
+ {"mean_reward": 577.4579440999998, "std_reward": 43.783385636901656, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-06-10T14:18:21.625331"}
train_eval_metrics.zip ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2e75a0d617f2e8d7357db008895e508c624e2ddbd8c7c239b17204cb1f6b273e
+ size 145561
vec_normalize.pkl ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:34fb697d2b6635e427d8e3c7f1d0537d87c54d854f26234dde79e0f5b1ff7a1f
+ size 5012
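`vec_normalize.pkl` is the pickled `VecNormalize` wrapper whose running observation statistics were used during training. Besides `VecNormalize.load` shown earlier, the stored statistics can also be inspected directly; a small sketch, assuming stable-baselines3 is importable so the pickle can be resolved:

```python
import pickle

with open("vec_normalize.pkl", "rb") as f:
    vec_normalize = pickle.load(f)  # a stable_baselines3 VecNormalize object (no env attached)

print(vec_normalize.obs_rms.mean)  # running mean of the 17-dim Walker2d observation
print(vec_normalize.obs_rms.var)   # running variance used for normalization
print(vec_normalize.clip_obs)      # observation clipping applied after normalization
```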