ppo-LunarLander-v2-1_5m / results.json
First attempt with PPO 1.5M
1e91005
{"mean_reward": 271.23260806854313, "std_reward": 23.385529776498345, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-12-12T11:36:44.846969"}