LunarLander-v2 / results.json
Commit a784584: Add a simple PPO model for LunarLander-v2
{"mean_reward": 257.74631629999993, "std_reward": 24.49346234129307, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2023-06-10T16:22:28.560751"}