ppo-lunar-lander / results.json
First model
b5d84f1
{
  "mean_reward": 254.99662633965514,
  "std_reward": 19.27607025738996,
  "is_deterministic": true,
  "n_eval_episodes": 10,
  "eval_datetime": "2023-01-08T18:27:40.642541"
}
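A results file in this schema is typically produced by averaging per-episode returns from an evaluation run. As a minimal sketch (assuming you already have a list of episode rewards, e.g. from stable-baselines3's `evaluate_policy` with `return_episode_rewards=True`; the helper name `summarize_eval` is hypothetical), the fields can be computed with the standard library alone:

```python
import json
import statistics
from datetime import datetime

def summarize_eval(episode_rewards, deterministic=True):
    """Build a results dict matching the schema of results.json.

    Uses the population standard deviation (pstdev), which matches
    NumPy's default np.std used by common evaluation helpers.
    """
    return {
        "mean_reward": statistics.fmean(episode_rewards),
        "std_reward": statistics.pstdev(episode_rewards),
        "is_deterministic": deterministic,
        "n_eval_episodes": len(episode_rewards),
        "eval_datetime": datetime.now().isoformat(),
    }

# Illustrative rewards only -- not the actual evaluation data behind this file.
rewards = [250.0, 260.0, 245.0, 270.0, 255.0]
results = summarize_eval(rewards)
print(json.dumps(results, indent=2))
```

Writing `results` out with `json.dump` to `results.json` yields a file in the same shape as the one above.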