ppo-LunarLander-v2 / results.json
{"mean_reward": 261.3381083799153, "std_reward": 11.8498643118098, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2023-04-26T08:13:23.492520"}