ppo-LunarLander-v2 / results.json
Commit: First PPO-LunarLanderv2 model (d9b8eb2)
{"mean_reward": 270.515721, "std_reward": 21.756675541336968, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2023-06-12T23:16:21.953552"}