TEST2ppo-LunarLander-v2 / results.json
Commit e2e6cdf by Lendalf: Upload PPO LunarLander-v2 with 2 million time steps
{"mean_reward": 278.17022678970727, "std_reward": 30.439213284273055, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2023-05-15T19:37:08.784276"}