ppo-LunarLander-v2 / results.json
{"mean_reward": 286.35920477380904, "std_reward": 12.707492647244553, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-05-04T16:09:04.816471"}