pietroluongo committed on
Commit
8470c9d
1 Parent(s): 20ae2fe

Initial commit

README.md ADDED
@@ -0,0 +1,37 @@
+ ---
+ library_name: stable-baselines3
+ tags:
+ - PandaReachDense-v3
+ - deep-reinforcement-learning
+ - reinforcement-learning
+ - stable-baselines3
+ model-index:
+ - name: A2C
+   results:
+   - task:
+       type: reinforcement-learning
+       name: reinforcement-learning
+     dataset:
+       name: PandaReachDense-v3
+       type: PandaReachDense-v3
+     metrics:
+     - type: mean_reward
+       value: -0.23 +/- 0.11
+       name: mean_reward
+       verified: false
+ ---
+
+ # **A2C** Agent playing **PandaReachDense-v3**
+ This is a trained model of an **A2C** agent playing **PandaReachDense-v3**
+ using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
+
+ ## Usage (with Stable-baselines3)
+ TODO: Add your code
+
+
+ ```python
+ from stable_baselines3 import ...
+ from huggingface_sb3 import load_from_hub
+
+ ...
+ ```
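
A usage sketch (not part of this commit): the snippet below shows how the checkpoint added here could be loaded with `huggingface_sb3` and `stable_baselines3`. The repo id is assumed from the commit author and the file name, and `panda_gym` is assumed to be installed so that the PandaReachDense-v3 environment is registered.

```python
import gymnasium as gym
import panda_gym  # noqa: F401 -- importing registers PandaReachDense-v3 with gymnasium
from huggingface_sb3 import load_from_hub
from stable_baselines3 import A2C

# Assumed repo id (inferred from the commit author and file name, not stated in the card).
checkpoint = load_from_hub(
    repo_id="pietroluongo/a2c-PandaReachDense-v3",
    filename="a2c-PandaReachDense-v3.zip",
)
model = A2C.load(checkpoint)

# One deterministic interaction step with the environment.
env = gym.make("PandaReachDense-v3")
obs, info = env.reset()
action, _states = model.predict(obs, deterministic=True)
obs, reward, terminated, truncated, info = env.step(action)
```
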
a2c-PandaReachDense-v3.zip ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:4471daf2b635217e4a7d0a153750efcc99b8e09fc11b9c5cb2190e1669e0949b
3
+ size 113166
a2c-PandaReachDense-v3/_stable_baselines3_version ADDED
@@ -0,0 +1 @@
1
+ 2.1.0
a2c-PandaReachDense-v3/data ADDED
@@ -0,0 +1,97 @@
1
+ {
2
+ "policy_class": {
3
+ ":type:": "<class 'abc.ABCMeta'>",
4
+ ":serialized:": "gAWVRQAAAAAAAACMIXN0YWJsZV9iYXNlbGluZXMzLmNvbW1vbi5wb2xpY2llc5SMG011bHRpSW5wdXRBY3RvckNyaXRpY1BvbGljeZSTlC4=",
5
+ "__module__": "stable_baselines3.common.policies",
6
+ "__doc__": "\n MultiInputActorClass policy class for actor-critic algorithms (has both policy and value prediction).\n Used by A2C, PPO and the likes.\n\n :param observation_space: Observation space (Tuple)\n :param action_space: Action space\n :param lr_schedule: Learning rate schedule (could be constant)\n :param net_arch: The specification of the policy and value networks.\n :param activation_fn: Activation function\n :param ortho_init: Whether to use or not orthogonal initialization\n :param use_sde: Whether to use State Dependent Exploration or not\n :param log_std_init: Initial value for the log standard deviation\n :param full_std: Whether to use (n_features x n_actions) parameters\n for the std instead of only (n_features,) when using gSDE\n :param use_expln: Use ``expln()`` function instead of ``exp()`` to ensure\n a positive standard deviation (cf paper). It allows to keep variance\n above zero and prevent it from growing too fast. In practice, ``exp()`` is usually enough.\n :param squash_output: Whether to squash the output using a tanh function,\n this allows to ensure boundaries when using gSDE.\n :param features_extractor_class: Uses the CombinedExtractor\n :param features_extractor_kwargs: Keyword arguments\n to pass to the features extractor.\n :param share_features_extractor: If True, the features extractor is shared between the policy and value networks.\n :param normalize_images: Whether to normalize images or not,\n dividing by 255.0 (True by default)\n :param optimizer_class: The optimizer to use,\n ``th.optim.Adam`` by default\n :param optimizer_kwargs: Additional keyword arguments,\n excluding the learning rate, to pass to the optimizer\n ",
7
+ "__init__": "<function MultiInputActorCriticPolicy.__init__ at 0x0000022A46AAE2A0>",
8
+ "__abstractmethods__": "frozenset()",
9
+ "_abc_impl": "<_abc._abc_data object at 0x0000022A46A9BD00>"
10
+ },
11
+ "verbose": 1,
12
+ "policy_kwargs": {
13
+ ":type:": "<class 'dict'>",
14
+ ":serialized:": "gAWVgQAAAAAAAAB9lCiMD29wdGltaXplcl9jbGFzc5SME3RvcmNoLm9wdGltLnJtc3Byb3CUjAdSTVNwcm9wlJOUjBBvcHRpbWl6ZXJfa3dhcmdzlH2UKIwFYWxwaGGURz/vrhR64UeujANlcHOURz7k+LWI42jxjAx3ZWlnaHRfZGVjYXmUSwB1dS4=",
15
+ "optimizer_class": "<class 'torch.optim.rmsprop.RMSprop'>",
16
+ "optimizer_kwargs": {
17
+ "alpha": 0.99,
18
+ "eps": 1e-05,
19
+ "weight_decay": 0
20
+ }
21
+ },
22
+ "num_timesteps": 1000000,
23
+ "_total_timesteps": 1000000,
24
+ "_num_timesteps_at_start": 0,
25
+ "seed": null,
26
+ "action_noise": null,
27
+ "start_time": 1694569862275626200,
28
+ "learning_rate": 0.0007,
29
+ "tensorboard_log": "./logs",
30
+ "_last_obs": {
31
+ ":type:": "<class 'collections.OrderedDict'>",
32
+ ":serialized:": "gAWV+wMAAAAAAACMC2NvbGxlY3Rpb25zlIwLT3JkZXJlZERpY3SUk5QpUpQojA1hY2hpZXZlZF9nb2FslIwSbnVtcHkuY29yZS5udW1lcmljlIwLX2Zyb21idWZmZXKUk5QolsAAAAAAAAAAaqV7v17Puj+L87e/EgqlPjI2pLokB/A++DEdwGzFUT7s/AjAEgqlPjI2pLokB/A+EgqlPjI2pLokB/A+EgqlPjI2pLokB/A+Y87Fv12Snz+JLJW/Bz0bP7nUC8AA0CbAvCEWP69X8z4M7bq8vjp6u8QevL/6BbI/ljmaPyu0+T7nNI0+BQaiP+plnz74zbQ/L6zQP7fzkr/0FVi/EgqlPjI2pLokB/A+AzD9vrBPAb/8Orc+YOgJP5rNsj4MKKm/lIwFbnVtcHmUjAVkdHlwZZSTlIwCZjSUiYiHlFKUKEsDjAE8lE5OTkr/////Sv////9LAHSUYksQSwOGlIwBQ5R0lFKUjAxkZXNpcmVkX2dvYWyUaAcolsAAAAAAAAAAHOEEv8UMuz/8W4O/jSQ0v3+fxb+mIeK9EBjDv5xf4ruhKk2/s9NHP1jExb/ZrYQ+TRqzPpUVh77rOvG+asrGPzub078rMWu/EtiDvw0aGj9A45I9hdlWP2Vx0r8tXqm/BtkFP6i6Vj/6UzM9GOMxPnOhzb9I+Kg/ZJ+WP7qlpD7dqOu9kqfNP5tCOT6CIsU/DoLNP6z3L7+J7Nw+8Q5wP/jjmb8XGJI/w1wyv16lvL8EnrI+Z9UmPx/qFb1yO5+/lGgOSxBLA4aUaBJ0lFKUjAtvYnNlcnZhdGlvbpRoByiWgAEAAAAAAABqpXu/Xs+6P4vzt787qFy/HgeFP0nwbr8SCqU+MjakuiQH8D7UB/o+FM8mu20xyD74MR3AbMVRPuz8CMAGUw/APLyBPnN4C74SCqU+MjakuiQH8D7UB/o+FM8mu20xyD4SCqU+MjakuiQH8D7UB/o+FM8mu20xyD4SCqU+MjakuiQH8D7UB/o+FM8mu20xyD5jzsW/XZKfP4kslb+ISJm/VJQQP8hGfb4HPRs/udQLwADQJsBnzy2+AHZhv5d8xr+8IRY/r1fzPgzturxUnFe8TUzSP/dxw7++Onq7xB68v/oFsj98uRw/I1J6v6QjwD+WOZo/K7T5Puc0jT4vxsw/TIHRP6HXkr8FBqI/6mWfPvjNtD+QG6E/aEJ2v1f6vD8vrNA/t/OSv/QVWL+dGGU/+NYTP4sjVboSCqU+MjakuiQH8D7UB/o+FM8mu20xyD4DMP2+sE8Bv/w6tz7jJ1e/wQHUv2RuQT9g6Ak/ms2yPgwoqb+Rz7I9+v9ov60str+UaA5LEEsGhpRoEnSUUpR1Lg==",
33
+ "achieved_goal": "[[-9.8299277e-01 1.4594533e+00 -1.4371198e+00]\n [ 3.2234246e-01 -1.2528358e-03 4.6880448e-01]\n [-2.4561749e+00 2.0485467e-01 -2.1404371e+00]\n [ 3.2234246e-01 -1.2528358e-03 4.6880448e-01]\n [ 3.2234246e-01 -1.2528358e-03 4.6880448e-01]\n [ 3.2234246e-01 -1.2528358e-03 4.6880448e-01]\n [-1.5453609e+00 1.2466542e+00 -1.1654216e+00]\n [ 6.0639995e-01 -2.1848586e+00 -2.6064453e+00]\n [ 5.8645225e-01 4.7527835e-01 -2.2818111e-02]\n [-3.8181986e-03 -1.4696889e+00 1.3908074e+00]\n [ 1.2048824e+00 4.8770270e-01 2.7579424e-01]\n [ 1.2658087e+00 3.1132442e-01 1.4125357e+00]\n [ 1.6302546e+00 -1.1480626e+00 -8.4408498e-01]\n [ 3.2234246e-01 -1.2528358e-03 4.6880448e-01]\n [-4.9450693e-01 -5.0512218e-01 3.5787189e-01]\n [ 5.3870201e-01 3.4922487e-01 -1.3215346e+00]]",
34
+ "desired_goal": "[[-0.5190599 1.4613272 -1.0262446 ]\n [-0.7036827 -1.5439299 -0.11041574]\n [-1.5241718 -0.00690837 -0.8014317 ]\n [ 0.780574 -1.5450544 0.25913885]\n [ 0.34981003 -0.26383653 -0.47115263]\n [ 1.5530522 -1.6531748 -0.918719 ]\n [-1.0300314 0.60196 0.07172251]\n [ 0.8392566 -1.6440855 -1.3231865 ]\n [ 0.52284276 0.83878565 0.04378126]\n [ 0.17371786 -1.6064895 1.320077 ]\n [ 1.1767392 0.3215769 -0.11506817]\n [ 1.6066763 0.18091814 1.5401156 ]\n [ 1.6055315 -0.6873729 0.43149212]\n [ 0.937728 -1.2022696 1.1413602 ]\n [-0.69672793 -1.4737966 0.3488618 ]\n [ 0.65169376 -0.03660023 -1.2440016 ]]",
35
+ "observation": "[[-9.8299277e-01 1.4594533e+00 -1.4371198e+00 -8.6194199e-01\n 1.0392797e+00 -9.3335396e-01]\n [ 3.2234246e-01 -1.2528358e-03 4.6880448e-01 4.8834097e-01\n -2.5453018e-03 3.9100209e-01]\n [-2.4561749e+00 2.0485467e-01 -2.1404371e+00 -2.2394423e+00\n 2.5338924e-01 -1.3620166e-01]\n [ 3.2234246e-01 -1.2528358e-03 4.6880448e-01 4.8834097e-01\n -2.5453018e-03 3.9100209e-01]\n [ 3.2234246e-01 -1.2528358e-03 4.6880448e-01 4.8834097e-01\n -2.5453018e-03 3.9100209e-01]\n [ 3.2234246e-01 -1.2528358e-03 4.6880448e-01 4.8834097e-01\n -2.5453018e-03 3.9100209e-01]\n [-1.5453609e+00 1.2466542e+00 -1.1654216e+00 -1.1975260e+00\n 5.6476331e-01 -2.4734032e-01]\n [ 6.0639995e-01 -2.1848586e+00 -2.6064453e+00 -1.6973649e-01\n -8.8070679e-01 -1.5506772e+00]\n [ 5.8645225e-01 4.7527835e-01 -2.2818111e-02 -1.3159830e-02\n 1.6429535e+00 -1.5269154e+00]\n [-3.8181986e-03 -1.4696889e+00 1.3908074e+00 6.1220527e-01\n -9.7781581e-01 1.5010877e+00]\n [ 1.2048824e+00 4.8770270e-01 2.7579424e-01 1.5997981e+00\n 1.6367583e+00 -1.1472055e+00]\n [ 1.2658087e+00 3.1132442e-01 1.4125357e+00 1.2586536e+00\n -9.6195078e-01 1.4763898e+00]\n [ 1.6302546e+00 -1.1480626e+00 -8.4408498e-01 8.9490682e-01\n 5.7749891e-01 -8.1306015e-04]\n [ 3.2234246e-01 -1.2528358e-03 4.6880448e-01 4.8834097e-01\n -2.5453018e-03 3.9100209e-01]\n [-4.9450693e-01 -5.0512218e-01 3.5787189e-01 -8.4045237e-01\n -1.6563035e+00 7.5559068e-01]\n [ 5.3870201e-01 3.4922487e-01 -1.3215346e+00 8.7309964e-02\n -9.1015589e-01 -1.4232384e+00]]"
36
+ },
37
+ "_last_episode_starts": {
38
+ ":type:": "<class 'numpy.ndarray'>",
39
+ ":serialized:": "gAWVgwAAAAAAAACMEm51bXB5LmNvcmUubnVtZXJpY5SMC19mcm9tYnVmZmVylJOUKJYQAAAAAAAAAAABAAEBAQAAAAAAAAABAACUjAVudW1weZSMBWR0eXBllJOUjAJiMZSJiIeUUpQoSwOMAXyUTk5OSv////9K/////0sAdJRiSxCFlIwBQ5R0lFKULg=="
40
+ },
41
+ "_last_original_obs": {
42
+ ":type:": "<class 'collections.OrderedDict'>",
43
+ ":serialized:": "gAWV+wMAAAAAAACMC2NvbGxlY3Rpb25zlIwLT3JkZXJlZERpY3SUk5QpUpQojA1hY2hpZXZlZF9nb2FslIwSbnVtcHkuY29yZS5udW1lcmljlIwLX2Zyb21idWZmZXKUk5QolsAAAAAAAAAA6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+lIwFbnVtcHmUjAVkdHlwZZSTlIwCZjSUiYiHlFKUKEsDjAE8lE5OTkr/////Sv////9LAHSUYksQSwOGlIwBQ5R0lFKUjAxkZXNpcmVkX2dvYWyUaAcolsAAAAAAAAAACGEEPRE8Aj5wdYw9XvUGvmzHmb2anos+yth6vTZn2b2VlHU+09M0vXbKm7yLdRk+nYKjvQTenjv8dwI+OlGaPR+HSr2SMVQ+8WgKPmYGHD1np1k+C4FUPN6bfr2mFFo+PfAGvXwAfz06SDQ9ufaqvS62az1snFo+yeFtPQr7urv29So+pcKGvVwuLz0eZXo7G5tpPAPkGL3OJ1g+LI+6PRpNvD26iow+nj++PNfl/ryWnnE+oKtVvb2eeT1OoJg+lGgOSxBLA4aUaBJ0lFKUjAtvYnNlcnZhdGlvbpRoByiWgAEAAAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAACUaA5LEEsGhpRoEnSUUpR1Lg==",
44
+ "achieved_goal": "[[ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]]",
45
+ "desired_goal": "[[ 0.0323191 0.12718226 0.06858337]\n [-0.13179538 -0.0750874 0.2726944 ]\n [-0.0612419 -0.10615389 0.23982461]\n [-0.04414732 -0.01901744 0.14986245]\n [-0.07983897 0.00484824 0.12741083]\n [ 0.07535024 -0.04944527 0.20722035]\n [ 0.13516594 0.03809204 0.21255265]\n [ 0.01297022 -0.06216037 0.21296939]\n [-0.03294395 0.06225632 0.04401419]\n [-0.0834784 0.05754679 0.21348733]\n [ 0.05807665 -0.0057062 0.16695389]\n [-0.06580094 0.04276882 0.00382072]\n [ 0.01425817 -0.03732682 0.21108934]\n [ 0.09109339 0.09194393 0.2744959 ]\n [ 0.02322369 -0.03111546 0.23595652]\n [-0.05216563 0.0609424 0.29809803]]",
46
+ "observation": "[[ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]]"
47
+ },
48
+ "_episode_num": 0,
49
+ "use_sde": false,
50
+ "sde_sample_freq": -1,
51
+ "_current_progress_remaining": 0.0,
52
+ "_stats_window_size": 100,
53
+ "ep_info_buffer": {
54
+ ":type:": "<class 'collections.deque'>",
55
+ ":serialized:": "gAWV4AsAAAAAAACMC2NvbGxlY3Rpb25zlIwFZGVxdWWUk5QpS2SGlFKUKH2UKIwBcpRHv7YZydWhh6WMAWyUSwKMAXSUR0CQDDVJcxCZdX2UKGgGR7/QStNi6QNkaAdLA2gIR0CQESTURWcSdX2UKGgGR7/LPfsNUfgaaAdLA2gIR0CQEI1V5rxidX2UKGgGR7/SkIomXw9aaAdLA2gIR0CQEEIN3GGVdX2UKGgGR7/IXfqHGjsVaAdLA2gIR0CQD43mFJxvdX2UKGgGR7+zULDye7L/aAdLAmgIR0CQDLF9a2WqdX2UKGgGR7+9IFvAGjbjaAdLAmgIR0CQEXw5vLowdX2UKGgGR7+eAI6bONYKaAdLAWgIR0CQETIkJKJ3dX2UKGgGR7/G62fChvitaAdLA2gIR0CQEObhWHUMdX2UKGgGR7+UC3gDRtxdaAdLAWgIR0CQEFBj4HopdX2UKGgGR7/ZCE6DGtITaAdLBGgIR0CQD//cFhXsdX2UKGgGR7+hpYcNpdrwaAdLAWgIR0CQD5s2NvOydX2UKGgGR7/RN7BwdbPhaAdLA2gIR0CQDrfZVXFMdX2UKGgGR7/DGHYYixFBaAdLAmgIR0CQDej5bhWHdX2UKGgGR7/PSofjjrAyaAdLA2gIR0CQDYyp71IzdX2UKGgGR7/HJT2nKnvVaAdLA2gIR0CQDSbAUL2IdX2UKGgGR7/ZA+pwS8J2aAdLBGgIR0CQC/iX6ZYxdX2UKGgGR7/LRSgoPTXraAdLA2gIR0CQD0KgIyCWdX2UKGgGR7/d9UCJXQt0aAdLBGgIR0CQDmNbC79RdX2UKGgGR7/U7UG3WnTBaAdLA2gIR0CQDGVrAP/adX2UKGgGR7+7f779AHE/aAdLAmgIR0CQEZv4/NaAdX2UKGgGR7+8uUUwi7kGaAdLAmgIR0CQEQemvW6LdX2UKGgGR7/RSNOuaF23aAdLA2gIR0CQELxxT850dX2UKGgGR7+xE7W/ag27aAdLAmgIR0CQDgg3Lmp3dX2UKGgGR7/GmLLpzLfUaAdLA2gIR0CQDOAXVLBbdX2UKGgGR7/MvX9R77bdaAdLA2gIR0CQEWC+UQkHdX2UKGgGR7/U3UhFEy+IaAdLA2gIR0CQEIAEMb3odX2UKGgGR7/QQPI4lyBDaAdLA2gIR0CQEC98Z1mrdX2UKGgGR7/WwEQoTfzjaAdLA2gIR0CQD8vc8DB/dX2UKGgGR7/Kmnfl6qsEaAdLA2gIR0CQDuiADq4ZdX2UKGgGR7/Zc8DB/I8yaAdLA2gIR0CQDbxJul41dX2UKGgGR7/Mdsi0OVgQaAdLA2gIR0CQDVdmg8KYdX2UKGgGR7/JKifxtpEhaAdLA2gIR0CQDCk+otL+dX2UKGgGR7+6q5sj3VTaaAdLAmgIR0CQEbowVTJhdX2UKGgGR7/RXxe9i+cpaAdLA2gIR0CQD28tPHktdX2UKGgGR7/GBJZntfG/aAdLA2gIR0CQDo/oJRfndX2UKGgGR7/B7di2DxsmaAdLAmgIR0CQDibvw3HadX2UKGgGR7/PrGipNsWPaAdLA2gIR0CQETQ0GeMAdX2UKGgGR7/UylvZRKpUaAdLA2gIR0CQEOf4REncdX2UKGgGR7/F/0dzXBgvaAdLAmgIR0CQD+iKR+z/dX2UKGgGR7/SESdvsJIEaAdLBGgIR0CQDJ9H+ZPVdX2UKGgGR7+yii7CiypraAdLAmgIR0CQDETkhib2dX2UKGgGR7/XK6nR9gF5aAdLA2gIR0CQEZDfWMCLdX2UKGgGR7/MiRnvlU6xaAdLA2gIR0CQEK8f3evZdX2UKGgGR7/Q9Jz1bqyGaAdLA2gIR0CQEF2R7qptdX2UKGgGR7/CmO2iL2pRaAdLAmgIR0CQD47r9l3AdX2UKGgGR7+6Eal1r6+GaAdLAmgIR0CQDkauOjqOdX2UKGgGR7/UTQmeDnNgaAdLA2gIR0CQDepeNT99dX2UKGgGR7/NHiFTNt65aAdLA2gIR0CQDYV6/qPfdX2UKGgGR7/ZVVghKUV0aAdLBGgIR0CQDR+UQkHEdX2UKGgGR7/K21D0Dlo2aAdLA2gIR0CQEelKsdT6dX2UKGgGR7/aHZK3/givaAdLBGgIR0CQDyTrVvuPdX2UKGgGR7/Mb70nPVuraAdLA2gIR0CQDr8D0UXYdX2UKGgGR7+8UsWfseGPaAdLAmgIR0CQDGWp6yB1dX2UKGgGR7+2GsV+I/JOaAdLAmgIR0CQEa2OQyRCdX2UKGgGR7/UIYFaB7NTaAdLA2gIR0CQEWJLM9r5dX2UKGgGR7/VJbdJrcj8aAdLA2gIR0CQEBafjCHidX2UKGgGR7/Ax/NJOFg2aAdLAmgIR0CQDaEjPfKqdX2UKGgGR7/S8ohIOH32aAdLA2gIR0CQDM5lOGj9dX2UKGgGR7+gCr92ovSMaAdLAWgIR0CQDHQBxPwedX2UKGgGR7/dRiPQv6CUaAdLBGgIR0CQESTr3TNMdX2UKGgGR7/MtbLU1AJLaAdLA2gIR0CQENmkFfRedX2UKGgGR7/QMgU1yeZoaAdLA2gIR0CQEIkcS5AhdX2UKGgGR7/Gz8gpz90jaAdLA2gIR0CQD7p2ECeVdX2UKGgGR7/QLfDUExIraAdLA2gIR0CQEhTVUdaMdX2UKGgGR7+/4BV+7UXpaAdLAmgIR0CQEcvF3pwCdX2UKGgGR7+99RaX8fmtaAdLAmgIR0CQEYCDEm6YdX2UKGgGR7+iCUX531SPaAdLAWgIR0CQD8nSfDk3dX2UKGgGR7/MFcpsoDxLaAdLA2gIR0CQD1F7D2rXdX2UKGgGR7/LTQVsUIszaAdLA2gIR0CQDuuTibUgdX2UKGgGR7/VzKcNH6MzaAdLBGgIR0CQDoKaoddWdX2UKGgGR7/auUliSaE0aAdLBGgIR0CQDiZK3/gjdX2UKGgGR7/Qc7yQPqcFaAdLBGgIR0CQDVyHVPN3dX2UKGgGR7+5kFwDNhVmaAdLAmgIR0CQDO+vhZQpdX2UKGgGR7+38yeqaPS2aAdLAmgIR0CQEUnIhhYvdX2UKGgGR7/JTF2mpEQYaAdLA2gIR0CQEEpYs/Y8dX2UKGgGR7/QFhXr+o9+aAdLA2gIR0CQDKa0hNdrdX2UKGgGR7+yOwPiDM/yaAdLAmgIR0CQEe6WgOBldX2UKGgGR7+8Kmbb1yvLaAdLAmgIR0CQEaNT987ZdX2UKGgGR7/TWZ7XxvvSaAdLA2gIR0CQEQzWPLgXdX2UKGgGR7/MnGbTc6/7aAdLA2gIR0CQELxOclPadX2UKGgGR7/Bt4zJp35faAdLAmgIR0CQD+2pyZKGdX2UKGgGR7+0+8oQWepXaAdLAmgIR0CQDkkcCHRDdX2UKGgGR7/goi9qUNayaAdLBGgIR0CQDeQ4jrzHdX2UKGgGR7+lFfAsTWXkaAdLAWgIR0CQDLYQrc0tdX2UKGgGR7/U1oQFs54oaAdLA2gIR0CQEkgIyCWedX2UKGgGR7/BH9WIXTEzaAdLAmgIR0CQ
EGgLJCBxdX2UKGgGR7/QO/L1VYITaAdLA2gIR0CQDx3BYV7AdX2UKGgGR7+ig7HQyAQQaAdLAWgIR0CQDlh4MWoFdX2UKGgGR7/HG9YfW+XaaAdLA2gIR0CQDYyoXKr8dX2UKGgGR7/J8ohIOH32aAdLA2gIR0CQDR/Q0GeMdX2UKGgGR7+iwUxmCiAUaAdLAWgIR0CQDMVs1sLwdX2UKGgGR7/V5tFa0QbuaAdLBGgIR0CQD5YXfqHHdX2UKGgGR7/Xl0o0ALiNaAdLBGgIR0CQDsYw7DEWdX2UKGgGR7+iOmzjWCmNaAdLAWgIR0CQDTE5yU9qdX2UKGgGR7/KYvWYnfEXaAdLA2gIR0CQEiDEm6XjdX2UKGgGR7/a6KtPpIMCaAdLBGgIR0CQEYnKGL1mdX2UKGgGR7/Ts9jgAIY4aAdLA2gIR0CQET6DGtITdX2UKGgGR7/UWFvhqCYkaAdLA2gIR0CQEO37UG3XdX2UKGgGR7/BDrqt5le4aAdLAmgIR0CQDa3yZrpJdWUu"
56
+ },
57
+ "ep_success_buffer": {
58
+ ":type:": "<class 'collections.deque'>",
59
+ ":serialized:": "gAWVIAAAAAAAAACMC2NvbGxlY3Rpb25zlIwFZGVxdWWUk5QpS2SGlFKULg=="
60
+ },
61
+ "_n_updates": 12500,
62
+ "n_steps": 5,
63
+ "gamma": 0.99,
64
+ "gae_lambda": 1.0,
65
+ "ent_coef": 0.0,
66
+ "vf_coef": 0.5,
67
+ "max_grad_norm": 0.5,
68
+ "normalize_advantage": false,
69
+ "observation_space": {
70
+ ":type:": "<class 'gymnasium.spaces.dict.Dict'>",
71
+ ":serialized:": "gAWVsAMAAAAAAACMFWd5bW5hc2l1bS5zcGFjZXMuZGljdJSMBERpY3SUk5QpgZR9lCiMBnNwYWNlc5SMC2NvbGxlY3Rpb25zlIwLT3JkZXJlZERpY3SUk5QpUpQojA1hY2hpZXZlZF9nb2FslIwUZ3ltbmFzaXVtLnNwYWNlcy5ib3iUjANCb3iUk5QpgZR9lCiMBWR0eXBllIwFbnVtcHmUjAVkdHlwZZSTlIwCZjSUiYiHlFKUKEsDjAE8lE5OTkr/////Sv////9LAHSUYowNYm91bmRlZF9iZWxvd5SMEm51bXB5LmNvcmUubnVtZXJpY5SMC19mcm9tYnVmZmVylJOUKJYDAAAAAAAAAAEBAZRoE4wCYjGUiYiHlFKUKEsDjAF8lE5OTkr/////Sv////9LAHSUYksDhZSMAUOUdJRSlIwNYm91bmRlZF9hYm92ZZRoHCiWAwAAAAAAAAABAQGUaCBLA4WUaCR0lFKUjAZfc2hhcGWUSwOFlIwDbG93lGgcKJYMAAAAAAAAAAAAIMEAACDBAAAgwZRoFksDhZRoJHSUUpSMBGhpZ2iUaBwolgwAAAAAAAAAAAAgQQAAIEEAACBBlGgWSwOFlGgkdJRSlIwIbG93X3JlcHKUjAUtMTAuMJSMCWhpZ2hfcmVwcpSMBDEwLjCUjApfbnBfcmFuZG9tlE51YowMZGVzaXJlZF9nb2FslGgNKYGUfZQoaBBoFmgZaBwolgMAAAAAAAAAAQEBlGggSwOFlGgkdJRSlGgnaBwolgMAAAAAAAAAAQEBlGggSwOFlGgkdJRSlGgsSwOFlGguaBwolgwAAAAAAAAAAAAgwQAAIMEAACDBlGgWSwOFlGgkdJRSlGgzaBwolgwAAAAAAAAAAAAgQQAAIEEAACBBlGgWSwOFlGgkdJRSlGg4jAUtMTAuMJRoOowEMTAuMJRoPE51YowLb2JzZXJ2YXRpb26UaA0pgZR9lChoEGgWaBloHCiWBgAAAAAAAAABAQEBAQGUaCBLBoWUaCR0lFKUaCdoHCiWBgAAAAAAAAABAQEBAQGUaCBLBoWUaCR0lFKUaCxLBoWUaC5oHCiWGAAAAAAAAAAAACDBAAAgwQAAIMEAACDBAAAgwQAAIMGUaBZLBoWUaCR0lFKUaDNoHCiWGAAAAAAAAAAAACBBAAAgQQAAIEEAACBBAAAgQQAAIEGUaBZLBoWUaCR0lFKUaDiMBS0xMC4wlGg6jAQxMC4wlGg8TnVidWgsTmgQTmg8TnViLg==",
72
+ "spaces": "OrderedDict([('achieved_goal', Box(-10.0, 10.0, (3,), float32)), ('desired_goal', Box(-10.0, 10.0, (3,), float32)), ('observation', Box(-10.0, 10.0, (6,), float32))])",
73
+ "_shape": null,
74
+ "dtype": null,
75
+ "_np_random": null
76
+ },
77
+ "action_space": {
78
+ ":type:": "<class 'gymnasium.spaces.box.Box'>",
79
+ ":serialized:": "gAWVYAIAAAAAAACMFGd5bW5hc2l1bS5zcGFjZXMuYm94lIwDQm94lJOUKYGUfZQojAVkdHlwZZSMBW51bXB5lIwFZHR5cGWUk5SMAmY0lImIh5RSlChLA4wBPJROTk5K/////0r/////SwB0lGKMDWJvdW5kZWRfYmVsb3eUjBJudW1weS5jb3JlLm51bWVyaWOUjAtfZnJvbWJ1ZmZlcpSTlCiWAwAAAAAAAAABAQGUaAiMAmIxlImIh5RSlChLA4wBfJROTk5K/////0r/////SwB0lGJLA4WUjAFDlHSUUpSMDWJvdW5kZWRfYWJvdmWUaBEolgMAAAAAAAAAAQEBlGgVSwOFlGgZdJRSlIwGX3NoYXBllEsDhZSMA2xvd5RoESiWDAAAAAAAAAAAAIC/AACAvwAAgL+UaAtLA4WUaBl0lFKUjARoaWdolGgRKJYMAAAAAAAAAAAAgD8AAIA/AACAP5RoC0sDhZRoGXSUUpSMCGxvd19yZXBylIwELTEuMJSMCWhpZ2hfcmVwcpSMAzEuMJSMCl9ucF9yYW5kb22UjBRudW1weS5yYW5kb20uX3BpY2tsZZSMEF9fZ2VuZXJhdG9yX2N0b3KUk5SMBVBDRzY0lGgyjBRfX2JpdF9nZW5lcmF0b3JfY3RvcpSTlIaUUpR9lCiMDWJpdF9nZW5lcmF0b3KUjAVQQ0c2NJSMBXN0YXRllH2UKGg9ihAv4SazqaHDf6u7n5nS1VU5jANpbmOUihCN1pCeJC84ecBpGA//tjVndYwKaGFzX3VpbnQzMpRLAIwIdWludGVnZXKUSwB1YnViLg==",
80
+ "dtype": "float32",
81
+ "bounded_below": "[ True True True]",
82
+ "bounded_above": "[ True True True]",
83
+ "_shape": [
84
+ 3
85
+ ],
86
+ "low": "[-1. -1. -1.]",
87
+ "high": "[1. 1. 1.]",
88
+ "low_repr": "-1.0",
89
+ "high_repr": "1.0",
90
+ "_np_random": "Generator(PCG64)"
91
+ },
92
+ "n_envs": 16,
93
+ "lr_schedule": {
94
+ ":type:": "<class 'function'>",
95
+ ":serialized:": "gAWVogIAAAAAAACMF2Nsb3VkcGlja2xlLmNsb3VkcGlja2xllIwOX21ha2VfZnVuY3Rpb26Uk5QoaACMDV9idWlsdGluX3R5cGWUk5SMCENvZGVUeXBllIWUUpQoSwFLAEsASwFLAUsTQwiVAZcAiQFTAJROhZQpjAFflIWUjGFDOlxVc2Vyc1xwaWV0cm9sdW9uZ29cbWluaWNvbmRhM1xlbnZzXHJlbGVhcm5cTGliXHNpdGUtcGFja2FnZXNcc3RhYmxlX2Jhc2VsaW5lczNcY29tbW9uXHV0aWxzLnB5lIwEZnVuY5SMGWNvbnN0YW50X2ZuLjxsb2NhbHM+LmZ1bmOUS4NDCPiAANgPEogKlEMAlIwDdmFslIWUKXSUUpR9lCiMC19fcGFja2FnZV9flIwYc3RhYmxlX2Jhc2VsaW5lczMuY29tbW9ulIwIX19uYW1lX1+UjB5zdGFibGVfYmFzZWxpbmVzMy5jb21tb24udXRpbHOUjAhfX2ZpbGVfX5RoDHVOTmgAjBBfbWFrZV9lbXB0eV9jZWxslJOUKVKUhZR0lFKUjBxjbG91ZHBpY2tsZS5jbG91ZHBpY2tsZV9mYXN0lIwSX2Z1bmN0aW9uX3NldHN0YXRllJOUaCB9lH2UKGgYaA2MDF9fcXVhbG5hbWVfX5RoDowPX19hbm5vdGF0aW9uc19flH2UjA5fX2t3ZGVmYXVsdHNfX5ROjAxfX2RlZmF1bHRzX1+UTowKX19tb2R1bGVfX5RoGYwHX19kb2NfX5ROjAtfX2Nsb3N1cmVfX5RoAIwKX21ha2VfY2VsbJSTlEc/RvAGjbi6x4WUUpSFlIwXX2Nsb3VkcGlja2xlX3N1Ym1vZHVsZXOUXZSMC19fZ2xvYmFsc19flH2UdYaUhlIwLg=="
96
+ }
97
+ }
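
For reference, the hyperparameters recorded in the `data` file above (MultiInputPolicy with RMSprop, learning_rate 0.0007, n_steps 5, gamma 0.99, gae_lambda 1.0, ent_coef 0.0, vf_coef 0.5, max_grad_norm 0.5, 16 parallel environments, 1,000,000 timesteps, TensorBoard logs under ./logs) correspond to a training setup roughly like the sketch below. The actual training script is not part of this commit, so this is only an illustration of a consistent configuration, not the author's code; the shipped vec_normalize.pkl suggests the real run may also have wrapped the environments in VecNormalize, which is omitted here.

```python
import panda_gym  # noqa: F401 -- registers PandaReachDense-v3
from stable_baselines3 import A2C
from stable_baselines3.common.env_util import make_vec_env

# Values mirror the hyperparameters stored in a2c-PandaReachDense-v3/data.
env = make_vec_env("PandaReachDense-v3", n_envs=16)
model = A2C(
    policy="MultiInputPolicy",
    env=env,
    learning_rate=0.0007,
    n_steps=5,
    gamma=0.99,
    gae_lambda=1.0,
    ent_coef=0.0,
    vf_coef=0.5,
    max_grad_norm=0.5,
    tensorboard_log="./logs",
    verbose=1,
)
model.learn(total_timesteps=1_000_000)
model.save("a2c-PandaReachDense-v3")
```
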
a2c-PandaReachDense-v3/policy.optimizer.pth ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:bbccacc6526914e4e740811c240ecbdc0e49da2446b3a08cfa5f4ee32bfcb1c2
3
+ size 44734
a2c-PandaReachDense-v3/policy.pth ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:4c8eb7fec435103c83eaa3fc14c697e7559d097f9460359d3a9745695b76b9c0
3
+ size 46014
a2c-PandaReachDense-v3/pytorch_variables.pth ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:d030ad8db708280fcae77d87e973102039acd23a11bdecc3db8eb6c0ac940ee1
3
+ size 431
a2c-PandaReachDense-v3/system_info.txt ADDED
@@ -0,0 +1,9 @@
+ - OS: Windows-10-10.0.22621-SP0 10.0.22621
+ - Python: 3.11.4
+ - Stable-Baselines3: 2.1.0
+ - PyTorch: 2.0.1
+ - GPU Enabled: True
+ - Numpy: 1.25.2
+ - Cloudpickle: 2.2.1
+ - Gymnasium: 0.29.1
+ - OpenAI Gym: 0.26.2
config.json ADDED
@@ -0,0 +1 @@
1
+ {"policy_class": {":type:": "<class 'abc.ABCMeta'>", ":serialized:": "gAWVRQAAAAAAAACMIXN0YWJsZV9iYXNlbGluZXMzLmNvbW1vbi5wb2xpY2llc5SMG011bHRpSW5wdXRBY3RvckNyaXRpY1BvbGljeZSTlC4=", "__module__": "stable_baselines3.common.policies", "__doc__": "\n MultiInputActorClass policy class for actor-critic algorithms (has both policy and value prediction).\n Used by A2C, PPO and the likes.\n\n :param observation_space: Observation space (Tuple)\n :param action_space: Action space\n :param lr_schedule: Learning rate schedule (could be constant)\n :param net_arch: The specification of the policy and value networks.\n :param activation_fn: Activation function\n :param ortho_init: Whether to use or not orthogonal initialization\n :param use_sde: Whether to use State Dependent Exploration or not\n :param log_std_init: Initial value for the log standard deviation\n :param full_std: Whether to use (n_features x n_actions) parameters\n for the std instead of only (n_features,) when using gSDE\n :param use_expln: Use ``expln()`` function instead of ``exp()`` to ensure\n a positive standard deviation (cf paper). It allows to keep variance\n above zero and prevent it from growing too fast. In practice, ``exp()`` is usually enough.\n :param squash_output: Whether to squash the output using a tanh function,\n this allows to ensure boundaries when using gSDE.\n :param features_extractor_class: Uses the CombinedExtractor\n :param features_extractor_kwargs: Keyword arguments\n to pass to the features extractor.\n :param share_features_extractor: If True, the features extractor is shared between the policy and value networks.\n :param normalize_images: Whether to normalize images or not,\n dividing by 255.0 (True by default)\n :param optimizer_class: The optimizer to use,\n ``th.optim.Adam`` by default\n :param optimizer_kwargs: Additional keyword arguments,\n excluding the learning rate, to pass to the optimizer\n ", "__init__": "<function MultiInputActorCriticPolicy.__init__ at 0x0000022A46AAE2A0>", "__abstractmethods__": "frozenset()", "_abc_impl": "<_abc._abc_data object at 0x0000022A46A9BD00>"}, "verbose": 1, "policy_kwargs": {":type:": "<class 'dict'>", ":serialized:": "gAWVgQAAAAAAAAB9lCiMD29wdGltaXplcl9jbGFzc5SME3RvcmNoLm9wdGltLnJtc3Byb3CUjAdSTVNwcm9wlJOUjBBvcHRpbWl6ZXJfa3dhcmdzlH2UKIwFYWxwaGGURz/vrhR64UeujANlcHOURz7k+LWI42jxjAx3ZWlnaHRfZGVjYXmUSwB1dS4=", "optimizer_class": "<class 'torch.optim.rmsprop.RMSprop'>", "optimizer_kwargs": {"alpha": 0.99, "eps": 1e-05, "weight_decay": 0}}, "num_timesteps": 1000000, "_total_timesteps": 1000000, "_num_timesteps_at_start": 0, "seed": null, "action_noise": null, "start_time": 1694569862275626200, "learning_rate": 0.0007, "tensorboard_log": "./logs", "_last_obs": {":type:": "<class 'collections.OrderedDict'>", ":serialized:": 
"gAWV+wMAAAAAAACMC2NvbGxlY3Rpb25zlIwLT3JkZXJlZERpY3SUk5QpUpQojA1hY2hpZXZlZF9nb2FslIwSbnVtcHkuY29yZS5udW1lcmljlIwLX2Zyb21idWZmZXKUk5QolsAAAAAAAAAAaqV7v17Puj+L87e/EgqlPjI2pLokB/A++DEdwGzFUT7s/AjAEgqlPjI2pLokB/A+EgqlPjI2pLokB/A+EgqlPjI2pLokB/A+Y87Fv12Snz+JLJW/Bz0bP7nUC8AA0CbAvCEWP69X8z4M7bq8vjp6u8QevL/6BbI/ljmaPyu0+T7nNI0+BQaiP+plnz74zbQ/L6zQP7fzkr/0FVi/EgqlPjI2pLokB/A+AzD9vrBPAb/8Orc+YOgJP5rNsj4MKKm/lIwFbnVtcHmUjAVkdHlwZZSTlIwCZjSUiYiHlFKUKEsDjAE8lE5OTkr/////Sv////9LAHSUYksQSwOGlIwBQ5R0lFKUjAxkZXNpcmVkX2dvYWyUaAcolsAAAAAAAAAAHOEEv8UMuz/8W4O/jSQ0v3+fxb+mIeK9EBjDv5xf4ruhKk2/s9NHP1jExb/ZrYQ+TRqzPpUVh77rOvG+asrGPzub078rMWu/EtiDvw0aGj9A45I9hdlWP2Vx0r8tXqm/BtkFP6i6Vj/6UzM9GOMxPnOhzb9I+Kg/ZJ+WP7qlpD7dqOu9kqfNP5tCOT6CIsU/DoLNP6z3L7+J7Nw+8Q5wP/jjmb8XGJI/w1wyv16lvL8EnrI+Z9UmPx/qFb1yO5+/lGgOSxBLA4aUaBJ0lFKUjAtvYnNlcnZhdGlvbpRoByiWgAEAAAAAAABqpXu/Xs+6P4vzt787qFy/HgeFP0nwbr8SCqU+MjakuiQH8D7UB/o+FM8mu20xyD74MR3AbMVRPuz8CMAGUw/APLyBPnN4C74SCqU+MjakuiQH8D7UB/o+FM8mu20xyD4SCqU+MjakuiQH8D7UB/o+FM8mu20xyD4SCqU+MjakuiQH8D7UB/o+FM8mu20xyD5jzsW/XZKfP4kslb+ISJm/VJQQP8hGfb4HPRs/udQLwADQJsBnzy2+AHZhv5d8xr+8IRY/r1fzPgzturxUnFe8TUzSP/dxw7++Onq7xB68v/oFsj98uRw/I1J6v6QjwD+WOZo/K7T5Puc0jT4vxsw/TIHRP6HXkr8FBqI/6mWfPvjNtD+QG6E/aEJ2v1f6vD8vrNA/t/OSv/QVWL+dGGU/+NYTP4sjVboSCqU+MjakuiQH8D7UB/o+FM8mu20xyD4DMP2+sE8Bv/w6tz7jJ1e/wQHUv2RuQT9g6Ak/ms2yPgwoqb+Rz7I9+v9ov60str+UaA5LEEsGhpRoEnSUUpR1Lg==", "achieved_goal": "[[-9.8299277e-01 1.4594533e+00 -1.4371198e+00]\n [ 3.2234246e-01 -1.2528358e-03 4.6880448e-01]\n [-2.4561749e+00 2.0485467e-01 -2.1404371e+00]\n [ 3.2234246e-01 -1.2528358e-03 4.6880448e-01]\n [ 3.2234246e-01 -1.2528358e-03 4.6880448e-01]\n [ 3.2234246e-01 -1.2528358e-03 4.6880448e-01]\n [-1.5453609e+00 1.2466542e+00 -1.1654216e+00]\n [ 6.0639995e-01 -2.1848586e+00 -2.6064453e+00]\n [ 5.8645225e-01 4.7527835e-01 -2.2818111e-02]\n [-3.8181986e-03 -1.4696889e+00 1.3908074e+00]\n [ 1.2048824e+00 4.8770270e-01 2.7579424e-01]\n [ 1.2658087e+00 3.1132442e-01 1.4125357e+00]\n [ 1.6302546e+00 -1.1480626e+00 -8.4408498e-01]\n [ 3.2234246e-01 -1.2528358e-03 4.6880448e-01]\n [-4.9450693e-01 -5.0512218e-01 3.5787189e-01]\n [ 5.3870201e-01 3.4922487e-01 -1.3215346e+00]]", "desired_goal": "[[-0.5190599 1.4613272 -1.0262446 ]\n [-0.7036827 -1.5439299 -0.11041574]\n [-1.5241718 -0.00690837 -0.8014317 ]\n [ 0.780574 -1.5450544 0.25913885]\n [ 0.34981003 -0.26383653 -0.47115263]\n [ 1.5530522 -1.6531748 -0.918719 ]\n [-1.0300314 0.60196 0.07172251]\n [ 0.8392566 -1.6440855 -1.3231865 ]\n [ 0.52284276 0.83878565 0.04378126]\n [ 0.17371786 -1.6064895 1.320077 ]\n [ 1.1767392 0.3215769 -0.11506817]\n [ 1.6066763 0.18091814 1.5401156 ]\n [ 1.6055315 -0.6873729 0.43149212]\n [ 0.937728 -1.2022696 1.1413602 ]\n [-0.69672793 -1.4737966 0.3488618 ]\n [ 0.65169376 -0.03660023 -1.2440016 ]]", "observation": "[[-9.8299277e-01 1.4594533e+00 -1.4371198e+00 -8.6194199e-01\n 1.0392797e+00 -9.3335396e-01]\n [ 3.2234246e-01 -1.2528358e-03 4.6880448e-01 4.8834097e-01\n -2.5453018e-03 3.9100209e-01]\n [-2.4561749e+00 2.0485467e-01 -2.1404371e+00 -2.2394423e+00\n 2.5338924e-01 -1.3620166e-01]\n [ 3.2234246e-01 -1.2528358e-03 4.6880448e-01 4.8834097e-01\n -2.5453018e-03 3.9100209e-01]\n [ 3.2234246e-01 -1.2528358e-03 4.6880448e-01 4.8834097e-01\n -2.5453018e-03 3.9100209e-01]\n [ 3.2234246e-01 -1.2528358e-03 4.6880448e-01 4.8834097e-01\n -2.5453018e-03 3.9100209e-01]\n [-1.5453609e+00 1.2466542e+00 -1.1654216e+00 -1.1975260e+00\n 5.6476331e-01 -2.4734032e-01]\n [ 6.0639995e-01 -2.1848586e+00 -2.6064453e+00 -1.6973649e-01\n -8.8070679e-01 -1.5506772e+00]\n [ 
5.8645225e-01 4.7527835e-01 -2.2818111e-02 -1.3159830e-02\n 1.6429535e+00 -1.5269154e+00]\n [-3.8181986e-03 -1.4696889e+00 1.3908074e+00 6.1220527e-01\n -9.7781581e-01 1.5010877e+00]\n [ 1.2048824e+00 4.8770270e-01 2.7579424e-01 1.5997981e+00\n 1.6367583e+00 -1.1472055e+00]\n [ 1.2658087e+00 3.1132442e-01 1.4125357e+00 1.2586536e+00\n -9.6195078e-01 1.4763898e+00]\n [ 1.6302546e+00 -1.1480626e+00 -8.4408498e-01 8.9490682e-01\n 5.7749891e-01 -8.1306015e-04]\n [ 3.2234246e-01 -1.2528358e-03 4.6880448e-01 4.8834097e-01\n -2.5453018e-03 3.9100209e-01]\n [-4.9450693e-01 -5.0512218e-01 3.5787189e-01 -8.4045237e-01\n -1.6563035e+00 7.5559068e-01]\n [ 5.3870201e-01 3.4922487e-01 -1.3215346e+00 8.7309964e-02\n -9.1015589e-01 -1.4232384e+00]]"}, "_last_episode_starts": {":type:": "<class 'numpy.ndarray'>", ":serialized:": "gAWVgwAAAAAAAACMEm51bXB5LmNvcmUubnVtZXJpY5SMC19mcm9tYnVmZmVylJOUKJYQAAAAAAAAAAABAAEBAQAAAAAAAAABAACUjAVudW1weZSMBWR0eXBllJOUjAJiMZSJiIeUUpQoSwOMAXyUTk5OSv////9K/////0sAdJRiSxCFlIwBQ5R0lFKULg=="}, "_last_original_obs": {":type:": "<class 'collections.OrderedDict'>", ":serialized:": "gAWV+wMAAAAAAACMC2NvbGxlY3Rpb25zlIwLT3JkZXJlZERpY3SUk5QpUpQojA1hY2hpZXZlZF9nb2FslIwSbnVtcHkuY29yZS5udW1lcmljlIwLX2Zyb21idWZmZXKUk5QolsAAAAAAAAAA6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+6nIdPRlsGqxDI0o+lIwFbnVtcHmUjAVkdHlwZZSTlIwCZjSUiYiHlFKUKEsDjAE8lE5OTkr/////Sv////9LAHSUYksQSwOGlIwBQ5R0lFKUjAxkZXNpcmVkX2dvYWyUaAcolsAAAAAAAAAACGEEPRE8Aj5wdYw9XvUGvmzHmb2anos+yth6vTZn2b2VlHU+09M0vXbKm7yLdRk+nYKjvQTenjv8dwI+OlGaPR+HSr2SMVQ+8WgKPmYGHD1np1k+C4FUPN6bfr2mFFo+PfAGvXwAfz06SDQ9ufaqvS62az1snFo+yeFtPQr7urv29So+pcKGvVwuLz0eZXo7G5tpPAPkGL3OJ1g+LI+6PRpNvD26iow+nj++PNfl/ryWnnE+oKtVvb2eeT1OoJg+lGgOSxBLA4aUaBJ0lFKUjAtvYnNlcnZhdGlvbpRoByiWgAEAAAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAADqch09GWwarEMjSj4AAAAAAAAAgAAAAACUaA5LEEsGhpRoEnSUUpR1Lg==", "achieved_goal": "[[ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01]]", "desired_goal": "[[ 0.0323191 0.12718226 0.06858337]\n [-0.13179538 -0.0750874 0.2726944 ]\n [-0.0612419 -0.10615389 0.23982461]\n [-0.04414732 -0.01901744 0.14986245]\n [-0.07983897 0.00484824 0.12741083]\n [ 0.07535024 -0.04944527 0.20722035]\n [ 0.13516594 0.03809204 
0.21255265]\n [ 0.01297022 -0.06216037 0.21296939]\n [-0.03294395 0.06225632 0.04401419]\n [-0.0834784 0.05754679 0.21348733]\n [ 0.05807665 -0.0057062 0.16695389]\n [-0.06580094 0.04276882 0.00382072]\n [ 0.01425817 -0.03732682 0.21108934]\n [ 0.09109339 0.09194393 0.2744959 ]\n [ 0.02322369 -0.03111546 0.23595652]\n [-0.05216563 0.0609424 0.29809803]]", "observation": "[[ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]\n [ 3.8439669e-02 -2.1944723e-12 1.9740014e-01 0.0000000e+00\n -0.0000000e+00 0.0000000e+00]]"}, "_episode_num": 0, "use_sde": false, "sde_sample_freq": -1, "_current_progress_remaining": 0.0, "_stats_window_size": 100, "ep_info_buffer": {":type:": "<class 'collections.deque'>", ":serialized:": 
"gAWV4AsAAAAAAACMC2NvbGxlY3Rpb25zlIwFZGVxdWWUk5QpS2SGlFKUKH2UKIwBcpRHv7YZydWhh6WMAWyUSwKMAXSUR0CQDDVJcxCZdX2UKGgGR7/QStNi6QNkaAdLA2gIR0CQESTURWcSdX2UKGgGR7/LPfsNUfgaaAdLA2gIR0CQEI1V5rxidX2UKGgGR7/SkIomXw9aaAdLA2gIR0CQEEIN3GGVdX2UKGgGR7/IXfqHGjsVaAdLA2gIR0CQD43mFJxvdX2UKGgGR7+zULDye7L/aAdLAmgIR0CQDLF9a2WqdX2UKGgGR7+9IFvAGjbjaAdLAmgIR0CQEXw5vLowdX2UKGgGR7+eAI6bONYKaAdLAWgIR0CQETIkJKJ3dX2UKGgGR7/G62fChvitaAdLA2gIR0CQEObhWHUMdX2UKGgGR7+UC3gDRtxdaAdLAWgIR0CQEFBj4HopdX2UKGgGR7/ZCE6DGtITaAdLBGgIR0CQD//cFhXsdX2UKGgGR7+hpYcNpdrwaAdLAWgIR0CQD5s2NvOydX2UKGgGR7/RN7BwdbPhaAdLA2gIR0CQDrfZVXFMdX2UKGgGR7/DGHYYixFBaAdLAmgIR0CQDej5bhWHdX2UKGgGR7/PSofjjrAyaAdLA2gIR0CQDYyp71IzdX2UKGgGR7/HJT2nKnvVaAdLA2gIR0CQDSbAUL2IdX2UKGgGR7/ZA+pwS8J2aAdLBGgIR0CQC/iX6ZYxdX2UKGgGR7/LRSgoPTXraAdLA2gIR0CQD0KgIyCWdX2UKGgGR7/d9UCJXQt0aAdLBGgIR0CQDmNbC79RdX2UKGgGR7/U7UG3WnTBaAdLA2gIR0CQDGVrAP/adX2UKGgGR7+7f779AHE/aAdLAmgIR0CQEZv4/NaAdX2UKGgGR7+8uUUwi7kGaAdLAmgIR0CQEQemvW6LdX2UKGgGR7/RSNOuaF23aAdLA2gIR0CQELxxT850dX2UKGgGR7+xE7W/ag27aAdLAmgIR0CQDgg3Lmp3dX2UKGgGR7/GmLLpzLfUaAdLA2gIR0CQDOAXVLBbdX2UKGgGR7/MvX9R77bdaAdLA2gIR0CQEWC+UQkHdX2UKGgGR7/U3UhFEy+IaAdLA2gIR0CQEIAEMb3odX2UKGgGR7/QQPI4lyBDaAdLA2gIR0CQEC98Z1mrdX2UKGgGR7/WwEQoTfzjaAdLA2gIR0CQD8vc8DB/dX2UKGgGR7/Kmnfl6qsEaAdLA2gIR0CQDuiADq4ZdX2UKGgGR7/Zc8DB/I8yaAdLA2gIR0CQDbxJul41dX2UKGgGR7/Mdsi0OVgQaAdLA2gIR0CQDVdmg8KYdX2UKGgGR7/JKifxtpEhaAdLA2gIR0CQDCk+otL+dX2UKGgGR7+6q5sj3VTaaAdLAmgIR0CQEbowVTJhdX2UKGgGR7/RXxe9i+cpaAdLA2gIR0CQD28tPHktdX2UKGgGR7/GBJZntfG/aAdLA2gIR0CQDo/oJRfndX2UKGgGR7/B7di2DxsmaAdLAmgIR0CQDibvw3HadX2UKGgGR7/PrGipNsWPaAdLA2gIR0CQETQ0GeMAdX2UKGgGR7/UylvZRKpUaAdLA2gIR0CQEOf4REncdX2UKGgGR7/F/0dzXBgvaAdLAmgIR0CQD+iKR+z/dX2UKGgGR7/SESdvsJIEaAdLBGgIR0CQDJ9H+ZPVdX2UKGgGR7+yii7CiypraAdLAmgIR0CQDETkhib2dX2UKGgGR7/XK6nR9gF5aAdLA2gIR0CQEZDfWMCLdX2UKGgGR7/MiRnvlU6xaAdLA2gIR0CQEK8f3evZdX2UKGgGR7/Q9Jz1bqyGaAdLA2gIR0CQEF2R7qptdX2UKGgGR7/CmO2iL2pRaAdLAmgIR0CQD47r9l3AdX2UKGgGR7+6Eal1r6+GaAdLAmgIR0CQDkauOjqOdX2UKGgGR7/UTQmeDnNgaAdLA2gIR0CQDepeNT99dX2UKGgGR7/NHiFTNt65aAdLA2gIR0CQDYV6/qPfdX2UKGgGR7/ZVVghKUV0aAdLBGgIR0CQDR+UQkHEdX2UKGgGR7/K21D0Dlo2aAdLA2gIR0CQEelKsdT6dX2UKGgGR7/aHZK3/givaAdLBGgIR0CQDyTrVvuPdX2UKGgGR7/Mb70nPVuraAdLA2gIR0CQDr8D0UXYdX2UKGgGR7+8UsWfseGPaAdLAmgIR0CQDGWp6yB1dX2UKGgGR7+2GsV+I/JOaAdLAmgIR0CQEa2OQyRCdX2UKGgGR7/UIYFaB7NTaAdLA2gIR0CQEWJLM9r5dX2UKGgGR7/VJbdJrcj8aAdLA2gIR0CQEBafjCHidX2UKGgGR7/Ax/NJOFg2aAdLAmgIR0CQDaEjPfKqdX2UKGgGR7/S8ohIOH32aAdLA2gIR0CQDM5lOGj9dX2UKGgGR7+gCr92ovSMaAdLAWgIR0CQDHQBxPwedX2UKGgGR7/dRiPQv6CUaAdLBGgIR0CQESTr3TNMdX2UKGgGR7/MtbLU1AJLaAdLA2gIR0CQENmkFfRedX2UKGgGR7/QMgU1yeZoaAdLA2gIR0CQEIkcS5AhdX2UKGgGR7/Gz8gpz90jaAdLA2gIR0CQD7p2ECeVdX2UKGgGR7/QLfDUExIraAdLA2gIR0CQEhTVUdaMdX2UKGgGR7+/4BV+7UXpaAdLAmgIR0CQEcvF3pwCdX2UKGgGR7+99RaX8fmtaAdLAmgIR0CQEYCDEm6YdX2UKGgGR7+iCUX531SPaAdLAWgIR0CQD8nSfDk3dX2UKGgGR7/MFcpsoDxLaAdLA2gIR0CQD1F7D2rXdX2UKGgGR7/LTQVsUIszaAdLA2gIR0CQDuuTibUgdX2UKGgGR7/VzKcNH6MzaAdLBGgIR0CQDoKaoddWdX2UKGgGR7/auUliSaE0aAdLBGgIR0CQDiZK3/gjdX2UKGgGR7/Qc7yQPqcFaAdLBGgIR0CQDVyHVPN3dX2UKGgGR7+5kFwDNhVmaAdLAmgIR0CQDO+vhZQpdX2UKGgGR7+38yeqaPS2aAdLAmgIR0CQEUnIhhYvdX2UKGgGR7/JTF2mpEQYaAdLA2gIR0CQEEpYs/Y8dX2UKGgGR7/QFhXr+o9+aAdLA2gIR0CQDKa0hNdrdX2UKGgGR7+yOwPiDM/yaAdLAmgIR0CQEe6WgOBldX2UKGgGR7+8Kmbb1yvLaAdLAmgIR0CQEaNT987ZdX2UKGgGR7/TWZ7XxvvSaAdLA2gIR0CQEQzWPLgXdX2UKGgGR7/MnGbTc6/7aAdLA2gIR0CQELxOclPadX2UKGgGR7/Bt4zJp35faAdLAmgIR0CQD+2pyZKGdX2UKGgGR7+0+8oQWepXaAdLAmgIR0CQDkkcCHRDdX2UKGgGR7/goi9qUNayaAdLBGgIR0CQDeQ4jrzHdX2UKGgGR7+lFfAsTWXkaAdLAWgIR0CQDLYQrc0tdX2UKGgGR7/U1oQFs54oaAdLA2gIR0CQEkgIyCWedX2UKGgGR7/BH9WIXTEzaAdLAmgIR0CQEGgLJCBxdX2UKGgGR7
/QO/L1VYITaAdLA2gIR0CQDx3BYV7AdX2UKGgGR7+ig7HQyAQQaAdLAWgIR0CQDlh4MWoFdX2UKGgGR7/HG9YfW+XaaAdLA2gIR0CQDYyoXKr8dX2UKGgGR7/J8ohIOH32aAdLA2gIR0CQDR/Q0GeMdX2UKGgGR7+iwUxmCiAUaAdLAWgIR0CQDMVs1sLwdX2UKGgGR7/V5tFa0QbuaAdLBGgIR0CQD5YXfqHHdX2UKGgGR7/Xl0o0ALiNaAdLBGgIR0CQDsYw7DEWdX2UKGgGR7+iOmzjWCmNaAdLAWgIR0CQDTE5yU9qdX2UKGgGR7/KYvWYnfEXaAdLA2gIR0CQEiDEm6XjdX2UKGgGR7/a6KtPpIMCaAdLBGgIR0CQEYnKGL1mdX2UKGgGR7/Ts9jgAIY4aAdLA2gIR0CQET6DGtITdX2UKGgGR7/UWFvhqCYkaAdLA2gIR0CQEO37UG3XdX2UKGgGR7/BDrqt5le4aAdLAmgIR0CQDa3yZrpJdWUu"}, "ep_success_buffer": {":type:": "<class 'collections.deque'>", ":serialized:": "gAWVIAAAAAAAAACMC2NvbGxlY3Rpb25zlIwFZGVxdWWUk5QpS2SGlFKULg=="}, "_n_updates": 12500, "n_steps": 5, "gamma": 0.99, "gae_lambda": 1.0, "ent_coef": 0.0, "vf_coef": 0.5, "max_grad_norm": 0.5, "normalize_advantage": false, "observation_space": {":type:": "<class 'gymnasium.spaces.dict.Dict'>", ":serialized:": "gAWVsAMAAAAAAACMFWd5bW5hc2l1bS5zcGFjZXMuZGljdJSMBERpY3SUk5QpgZR9lCiMBnNwYWNlc5SMC2NvbGxlY3Rpb25zlIwLT3JkZXJlZERpY3SUk5QpUpQojA1hY2hpZXZlZF9nb2FslIwUZ3ltbmFzaXVtLnNwYWNlcy5ib3iUjANCb3iUk5QpgZR9lCiMBWR0eXBllIwFbnVtcHmUjAVkdHlwZZSTlIwCZjSUiYiHlFKUKEsDjAE8lE5OTkr/////Sv////9LAHSUYowNYm91bmRlZF9iZWxvd5SMEm51bXB5LmNvcmUubnVtZXJpY5SMC19mcm9tYnVmZmVylJOUKJYDAAAAAAAAAAEBAZRoE4wCYjGUiYiHlFKUKEsDjAF8lE5OTkr/////Sv////9LAHSUYksDhZSMAUOUdJRSlIwNYm91bmRlZF9hYm92ZZRoHCiWAwAAAAAAAAABAQGUaCBLA4WUaCR0lFKUjAZfc2hhcGWUSwOFlIwDbG93lGgcKJYMAAAAAAAAAAAAIMEAACDBAAAgwZRoFksDhZRoJHSUUpSMBGhpZ2iUaBwolgwAAAAAAAAAAAAgQQAAIEEAACBBlGgWSwOFlGgkdJRSlIwIbG93X3JlcHKUjAUtMTAuMJSMCWhpZ2hfcmVwcpSMBDEwLjCUjApfbnBfcmFuZG9tlE51YowMZGVzaXJlZF9nb2FslGgNKYGUfZQoaBBoFmgZaBwolgMAAAAAAAAAAQEBlGggSwOFlGgkdJRSlGgnaBwolgMAAAAAAAAAAQEBlGggSwOFlGgkdJRSlGgsSwOFlGguaBwolgwAAAAAAAAAAAAgwQAAIMEAACDBlGgWSwOFlGgkdJRSlGgzaBwolgwAAAAAAAAAAAAgQQAAIEEAACBBlGgWSwOFlGgkdJRSlGg4jAUtMTAuMJRoOowEMTAuMJRoPE51YowLb2JzZXJ2YXRpb26UaA0pgZR9lChoEGgWaBloHCiWBgAAAAAAAAABAQEBAQGUaCBLBoWUaCR0lFKUaCdoHCiWBgAAAAAAAAABAQEBAQGUaCBLBoWUaCR0lFKUaCxLBoWUaC5oHCiWGAAAAAAAAAAAACDBAAAgwQAAIMEAACDBAAAgwQAAIMGUaBZLBoWUaCR0lFKUaDNoHCiWGAAAAAAAAAAAACBBAAAgQQAAIEEAACBBAAAgQQAAIEGUaBZLBoWUaCR0lFKUaDiMBS0xMC4wlGg6jAQxMC4wlGg8TnVidWgsTmgQTmg8TnViLg==", "spaces": "OrderedDict([('achieved_goal', Box(-10.0, 10.0, (3,), float32)), ('desired_goal', Box(-10.0, 10.0, (3,), float32)), ('observation', Box(-10.0, 10.0, (6,), float32))])", "_shape": null, "dtype": null, "_np_random": null}, "action_space": {":type:": "<class 'gymnasium.spaces.box.Box'>", ":serialized:": "gAWVYAIAAAAAAACMFGd5bW5hc2l1bS5zcGFjZXMuYm94lIwDQm94lJOUKYGUfZQojAVkdHlwZZSMBW51bXB5lIwFZHR5cGWUk5SMAmY0lImIh5RSlChLA4wBPJROTk5K/////0r/////SwB0lGKMDWJvdW5kZWRfYmVsb3eUjBJudW1weS5jb3JlLm51bWVyaWOUjAtfZnJvbWJ1ZmZlcpSTlCiWAwAAAAAAAAABAQGUaAiMAmIxlImIh5RSlChLA4wBfJROTk5K/////0r/////SwB0lGJLA4WUjAFDlHSUUpSMDWJvdW5kZWRfYWJvdmWUaBEolgMAAAAAAAAAAQEBlGgVSwOFlGgZdJRSlIwGX3NoYXBllEsDhZSMA2xvd5RoESiWDAAAAAAAAAAAAIC/AACAvwAAgL+UaAtLA4WUaBl0lFKUjARoaWdolGgRKJYMAAAAAAAAAAAAgD8AAIA/AACAP5RoC0sDhZRoGXSUUpSMCGxvd19yZXBylIwELTEuMJSMCWhpZ2hfcmVwcpSMAzEuMJSMCl9ucF9yYW5kb22UjBRudW1weS5yYW5kb20uX3BpY2tsZZSMEF9fZ2VuZXJhdG9yX2N0b3KUk5SMBVBDRzY0lGgyjBRfX2JpdF9nZW5lcmF0b3JfY3RvcpSTlIaUUpR9lCiMDWJpdF9nZW5lcmF0b3KUjAVQQ0c2NJSMBXN0YXRllH2UKGg9ihAv4SazqaHDf6u7n5nS1VU5jANpbmOUihCN1pCeJC84ecBpGA//tjVndYwKaGFzX3VpbnQzMpRLAIwIdWludGVnZXKUSwB1YnViLg==", "dtype": "float32", "bounded_below": "[ True True True]", "bounded_above": "[ True True True]", "_shape": [3], "low": "[-1. -1. -1.]", "high": "[1. 1. 
1.]", "low_repr": "-1.0", "high_repr": "1.0", "_np_random": "Generator(PCG64)"}, "n_envs": 16, "lr_schedule": {":type:": "<class 'function'>", ":serialized:": "gAWVogIAAAAAAACMF2Nsb3VkcGlja2xlLmNsb3VkcGlja2xllIwOX21ha2VfZnVuY3Rpb26Uk5QoaACMDV9idWlsdGluX3R5cGWUk5SMCENvZGVUeXBllIWUUpQoSwFLAEsASwFLAUsTQwiVAZcAiQFTAJROhZQpjAFflIWUjGFDOlxVc2Vyc1xwaWV0cm9sdW9uZ29cbWluaWNvbmRhM1xlbnZzXHJlbGVhcm5cTGliXHNpdGUtcGFja2FnZXNcc3RhYmxlX2Jhc2VsaW5lczNcY29tbW9uXHV0aWxzLnB5lIwEZnVuY5SMGWNvbnN0YW50X2ZuLjxsb2NhbHM+LmZ1bmOUS4NDCPiAANgPEogKlEMAlIwDdmFslIWUKXSUUpR9lCiMC19fcGFja2FnZV9flIwYc3RhYmxlX2Jhc2VsaW5lczMuY29tbW9ulIwIX19uYW1lX1+UjB5zdGFibGVfYmFzZWxpbmVzMy5jb21tb24udXRpbHOUjAhfX2ZpbGVfX5RoDHVOTmgAjBBfbWFrZV9lbXB0eV9jZWxslJOUKVKUhZR0lFKUjBxjbG91ZHBpY2tsZS5jbG91ZHBpY2tsZV9mYXN0lIwSX2Z1bmN0aW9uX3NldHN0YXRllJOUaCB9lH2UKGgYaA2MDF9fcXVhbG5hbWVfX5RoDowPX19hbm5vdGF0aW9uc19flH2UjA5fX2t3ZGVmYXVsdHNfX5ROjAxfX2RlZmF1bHRzX1+UTowKX19tb2R1bGVfX5RoGYwHX19kb2NfX5ROjAtfX2Nsb3N1cmVfX5RoAIwKX21ha2VfY2VsbJSTlEc/RvAGjbi6x4WUUpSFlIwXX2Nsb3VkcGlja2xlX3N1Ym1vZHVsZXOUXZSMC19fZ2xvYmFsc19flH2UdYaUhlIwLg=="}, "system_info": {"OS": "Windows-10-10.0.22621-SP0 10.0.22621", "Python": "3.11.4", "Stable-Baselines3": "2.1.0", "PyTorch": "2.0.1", "GPU Enabled": "True", "Numpy": "1.25.2", "Cloudpickle": "2.2.1", "Gymnasium": "0.29.1", "OpenAI Gym": "0.26.2"}}
replay.mp4 ADDED
Binary file (673 kB).
 
results.json ADDED
@@ -0,0 +1 @@
+ {"mean_reward": -0.23433016519993544, "std_reward": 0.10970732477291437, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2023-09-12T23:39:13.590465"}
vec_normalize.pkl ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:c042023358a6d975b8d7c37e700880db7580bf060361818047beae2d346cf880
3
+ size 3561
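
The presence of vec_normalize.pkl suggests the training environments were wrapped in `VecNormalize` (running observation/reward normalization statistics). If so, those statistics should be restored and frozen before evaluating the policy; a minimal sketch, assuming the file has been downloaded locally:

```python
import panda_gym  # noqa: F401 -- registers PandaReachDense-v3
from stable_baselines3.common.env_util import make_vec_env
from stable_baselines3.common.vec_env import VecNormalize

venv = make_vec_env("PandaReachDense-v3", n_envs=1)
# Restore the saved normalization statistics and freeze them for evaluation.
eval_env = VecNormalize.load("vec_normalize.pkl", venv)
eval_env.training = False      # do not update the running statistics
eval_env.norm_reward = False   # report raw (unnormalized) rewards
```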