ppo-LunarLander-v2 / results.json
DBusAI
Add PPO model for LunarLander-v2 v2
7692f7b
{"mean_reward": 255.37769807852015, "std_reward": 19.489840626036145, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-05-04T15:17:00.141856"}