PPO-LunarLander-v2-test / results.json
Upload model: PPO-LunarLander-v2, version: 10.000000
{
  "mean_reward": 284.367624770834,
  "std_reward": 17.26904368714923,
  "is_deterministic": true,
  "n_eval_episodes": 10,
  "eval_datetime": "2022-05-06T14:05:21.051628"
}
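A file with this shape can be produced after evaluating a trained agent. As a minimal sketch (the helper name `make_results` is hypothetical; in a Stable-Baselines3 workflow the mean and standard deviation of the reward would typically come from `evaluate_policy` over `n_eval_episodes` episodes):

```python
import json
from datetime import datetime


def make_results(mean_reward: float, std_reward: float,
                 n_eval_episodes: int, deterministic: bool) -> str:
    """Serialize evaluation metrics in the same shape as results.json.

    mean_reward / std_reward are assumed to be computed beforehand,
    e.g. by stable_baselines3.common.evaluation.evaluate_policy.
    """
    results = {
        "mean_reward": mean_reward,
        "std_reward": std_reward,
        "is_deterministic": deterministic,
        "n_eval_episodes": n_eval_episodes,
        # ISO-8601 timestamp, matching the eval_datetime field above
        "eval_datetime": datetime.now().isoformat(),
    }
    return json.dumps(results)


# Example: serialize the metrics shown in this file
payload = make_results(284.367624770834, 17.26904368714923, 10, True)
```

This produces a single-line JSON string that can be written directly to `results.json`.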