PPO_LunarLander-v2 / results.json
{"mean_reward": 281.4624835355572, "std_reward": 17.9490156006653, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-05-10T20:46:31.289718"}