ppo-LunarLander-v2 / results.json
{"mean_reward": 292.9663419779685, "std_reward": 12.89326229241089, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-05-05T21:21:24.145952"}