ppo-LunarLander-v2 / results.json
Commit c9d42b6 (verified): mlp_100000_ppo_1022
{"mean_reward": 261.5126836, "std_reward": 19.795311217445793, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2024-10-22T09:55:07.605060"}