---
tags:
  - unity-ml-agents
  - ml-agents
  - deep-reinforcement-learning
  - reinforcement-learning
  - ML-Agents-SnowballTarget
library_name: ml-agents
---

# ppo Agent playing SnowballTarget

This is a trained model of a ppo agent playing SnowballTarget, trained with the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).

## Usage (with ML-Agents)

The documentation: https://github.com/huggingface/ml-agents#get-started

We also wrote a complete tutorial on training your first agent with ML-Agents and publishing it to the Hub.

### Resume the training

```bash
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
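For reference, a trainer configuration file for this environment typically looks like the sketch below. The hyperparameter values are illustrative, taken from a commonly used PPO setup for SnowballTarget; they are an assumption and may differ from the values used to train this particular model.

```yaml
behaviors:
  SnowballTarget:
    trainer_type: ppo            # matches the "ppo" agent this card describes
    hyperparameters:
      batch_size: 128
      buffer_size: 2048
      learning_rate: 3.0e-4
      beta: 5.0e-3               # entropy regularization strength
      epsilon: 0.2               # PPO clipping range
      lambd: 0.95                # GAE lambda
      num_epoch: 3
      learning_rate_schedule: linear
    network_settings:
      normalize: false
      hidden_units: 256
      num_layers: 2
    reward_signals:
      extrinsic:
        gamma: 0.99
        strength: 1.0
    max_steps: 200000
    time_horizon: 64
    summary_freq: 10000
```

Pass this file as `<your_configuration_file_path.yaml>` in the command above; `--run-id` must match the run you want to resume.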

### Watch your Agent play

You can watch your agent playing directly in your browser:

  1. Go to https://huggingface.co/spaces/unity/ML-Agents-SnowballTarget
  2. Write your model id: Segamboam/ppo-SnowballTarget
  3. Select your .nn or .onnx file
  4. Click on Watch the agent play 👀