ppo-MountainCar-v0 / results.json
RayanRen
Initial commit
b2ab4da
{"mean_reward": -109.4, "std_reward": 8.957678270623477, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-12-28T20:51:40.432935"}