PPO-BipedalWalker-v3 / config.json

Commit History

Retrain PPO model for BipedalWalker-v3 v0
cac1da8

DBusAI committed on