ppo-LunarLander-v2 / results.json
first PPO model (98e4fd3)
{"mean_reward": 259.73983516708273, "std_reward": 19.226414170715543, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-12-19T21:40:05.935239"}