ppo-LunarLander-v2 / results.json
Commit cc143da: Upload my first PPO model
{"mean_reward": 175.85867267480396, "std_reward": 106.4339725531986, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-06-13T20:48:51.053155"}