[2023-02-24 17:11:14,766][2343622] Saving configuration to /home/sebas/research/hugging-face-course/vizdoom/train_dir/default_experiment/config.json...
[2023-02-24 17:11:14,767][2343622] Rollout worker 0 uses device cpu
[2023-02-24 17:11:14,767][2343622] Rollout worker 1 uses device cpu
[2023-02-24 17:11:14,767][2343622] Rollout worker 2 uses device cpu
[2023-02-24 17:11:14,767][2343622] Rollout worker 3 uses device cpu
[2023-02-24 17:11:14,767][2343622] Rollout worker 4 uses device cpu
[2023-02-24 17:11:14,767][2343622] Rollout worker 5 uses device cpu
[2023-02-24 17:11:14,768][2343622] Rollout worker 6 uses device cpu
[2023-02-24 17:11:14,768][2343622] Rollout worker 7 uses device cpu
[2023-02-24 17:11:14,797][2343622] Using GPUs [0] for process 0 (actually maps to GPUs [0])
[2023-02-24 17:11:14,797][2343622] InferenceWorker_p0-w0: min num requests: 2
[2023-02-24 17:11:14,815][2343622] Starting all processes...
[2023-02-24 17:11:14,815][2343622] Starting process learner_proc0
[2023-02-24 17:11:15,520][2343622] Starting all processes...
[2023-02-24 17:11:15,523][2343622] Starting process inference_proc0-0
[2023-02-24 17:11:15,523][2343622] Starting process rollout_proc0
[2023-02-24 17:11:15,525][2343698] Using GPUs [0] for process 0 (actually maps to GPUs [0])
[2023-02-24 17:11:15,525][2343698] Set environment var CUDA_VISIBLE_DEVICES to '0' (GPU indices [0]) for learning process 0
[2023-02-24 17:11:15,524][2343622] Starting process rollout_proc1
[2023-02-24 17:11:15,527][2343622] Starting process rollout_proc2
[2023-02-24 17:11:15,530][2343622] Starting process rollout_proc3
[2023-02-24 17:11:15,530][2343622] Starting process rollout_proc4
[2023-02-24 17:11:15,532][2343622] Starting process rollout_proc5
[2023-02-24 17:11:15,533][2343622] Starting process rollout_proc6
[2023-02-24 17:11:15,533][2343622] Starting process rollout_proc7
[2023-02-24 17:11:15,542][2343698] Num visible devices: 1
[2023-02-24 17:11:15,559][2343698] Starting seed is not provided
[2023-02-24 17:11:15,559][2343698] Using GPUs [0] for process 0 (actually maps to GPUs [0])
[2023-02-24 17:11:15,559][2343698] Initializing actor-critic model on device cuda:0
[2023-02-24 17:11:15,559][2343698] RunningMeanStd input shape: (3, 72, 128)
[2023-02-24 17:11:15,560][2343698] RunningMeanStd input shape: (1,)
[2023-02-24 17:11:15,571][2343698] ConvEncoder: input_channels=3
[2023-02-24 17:11:15,751][2343698] Conv encoder output size: 512
[2023-02-24 17:11:15,751][2343698] Policy head output size: 512
[2023-02-24 17:11:15,766][2343698] Created Actor Critic model with architecture:
[2023-02-24 17:11:15,766][2343698] ActorCriticSharedWeights(
  (obs_normalizer): ObservationNormalizer(
    (running_mean_std): RunningMeanStdDictInPlace(
      (running_mean_std): ModuleDict(
        (obs): RunningMeanStdInPlace()
      )
    )
  )
  (returns_normalizer): RecursiveScriptModule(original_name=RunningMeanStdInPlace)
  (encoder): VizdoomEncoder(
    (basic_encoder): ConvEncoder(
      (enc): RecursiveScriptModule(
        original_name=ConvEncoderImpl
        (conv_head): RecursiveScriptModule(
          original_name=Sequential
          (0): RecursiveScriptModule(original_name=Conv2d)
          (1): RecursiveScriptModule(original_name=ELU)
          (2): RecursiveScriptModule(original_name=Conv2d)
          (3): RecursiveScriptModule(original_name=ELU)
          (4): RecursiveScriptModule(original_name=Conv2d)
          (5): RecursiveScriptModule(original_name=ELU)
        )
        (mlp_layers): RecursiveScriptModule(
          original_name=Sequential
          (0): RecursiveScriptModule(original_name=Linear)
          (1): RecursiveScriptModule(original_name=ELU)
        )
      )
    )
  )
  (core): ModelCoreRNN(
    (core): GRU(512, 512)
  )
  (decoder): MlpDecoder(
    (mlp): Identity()
  )
  (critic_linear): Linear(in_features=512, out_features=1, bias=True)
  (action_parameterization): ActionParameterizationDefault(
    (distribution_linear): Linear(in_features=512, out_features=5, bias=True)
  )
)
[2023-02-24 17:11:16,608][2343747] Worker 7 uses CPU cores [14, 15]
[2023-02-24 17:11:16,626][2343746] Worker 4 uses CPU cores [8, 9]
[2023-02-24 17:11:16,638][2343750] Worker 3 uses CPU cores [6, 7]
[2023-02-24 17:11:16,640][2343744] Worker 1 uses CPU cores [2, 3]
[2023-02-24 17:11:16,669][2343745] Worker 2 uses CPU cores [4, 5]
[2023-02-24 17:11:16,708][2343749] Worker 5 uses CPU cores [10, 11]
[2023-02-24 17:11:16,715][2343726] Using GPUs [0] for process 0 (actually maps to GPUs [0])
[2023-02-24 17:11:16,715][2343726] Set environment var CUDA_VISIBLE_DEVICES to '0' (GPU indices [0]) for inference process 0
[2023-02-24 17:11:16,730][2343726] Num visible devices: 1
[2023-02-24 17:11:16,758][2343748] Worker 6 uses CPU cores [12, 13]
[2023-02-24 17:11:16,825][2343743] Worker 0 uses CPU cores [0, 1]
[2023-02-24 17:11:17,546][2343698] Using optimizer <class 'torch.optim.adam.Adam'>
[2023-02-24 17:11:17,547][2343698] No checkpoints found
[2023-02-24 17:11:17,547][2343698] Did not load from checkpoint, starting from scratch!
[2023-02-24 17:11:17,547][2343698] Initialized policy 0 weights for model version 0
[2023-02-24 17:11:17,548][2343698] LearnerWorker_p0 finished initialization!
[2023-02-24 17:11:17,548][2343698] Using GPUs [0] for process 0 (actually maps to GPUs [0])
[2023-02-24 17:11:17,640][2343726] RunningMeanStd input shape: (3, 72, 128)
[2023-02-24 17:11:17,641][2343726] RunningMeanStd input shape: (1,)
[2023-02-24 17:11:17,648][2343726] ConvEncoder: input_channels=3
[2023-02-24 17:11:17,708][2343726] Conv encoder output size: 512
[2023-02-24 17:11:17,708][2343726] Policy head output size: 512
[2023-02-24 17:11:18,236][2343622] Fps is (10 sec: nan, 60 sec: nan, 300 sec: nan). Total num frames: 0. Throughput: 0: nan. Samples: 0. Policy #0 lag: (min: -1.0, avg: -1.0, max: -1.0)
[2023-02-24 17:11:18,948][2343622] Inference worker 0-0 is ready!
[2023-02-24 17:11:18,948][2343622] All inference workers are ready! Signal rollout workers to start!
[2023-02-24 17:11:18,961][2343745] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:11:18,961][2343746] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:11:18,961][2343743] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:11:18,961][2343749] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:11:18,966][2343748] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:11:18,966][2343744] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:11:18,975][2343747] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:11:18,977][2343750] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:11:19,171][2343744] Decorrelating experience for 0 frames...
[2023-02-24 17:11:19,174][2343746] Decorrelating experience for 0 frames...
[2023-02-24 17:11:19,174][2343743] Decorrelating experience for 0 frames...
[2023-02-24 17:11:19,243][2343745] Decorrelating experience for 0 frames...
[2023-02-24 17:11:19,245][2343748] Decorrelating experience for 0 frames...
[2023-02-24 17:11:19,356][2343746] Decorrelating experience for 32 frames...
[2023-02-24 17:11:19,359][2343744] Decorrelating experience for 32 frames...
[2023-02-24 17:11:19,427][2343748] Decorrelating experience for 32 frames...
[2023-02-24 17:11:19,496][2343749] Decorrelating experience for 0 frames...
[2023-02-24 17:11:19,561][2343743] Decorrelating experience for 32 frames...
[2023-02-24 17:11:19,635][2343748] Decorrelating experience for 64 frames...
[2023-02-24 17:11:19,702][2343744] Decorrelating experience for 64 frames...
[2023-02-24 17:11:19,725][2343750] Decorrelating experience for 0 frames...
[2023-02-24 17:11:19,763][2343749] Decorrelating experience for 32 frames...
[2023-02-24 17:11:19,803][2343746] Decorrelating experience for 64 frames...
[2023-02-24 17:11:19,804][2343743] Decorrelating experience for 64 frames...
[2023-02-24 17:11:19,893][2343748] Decorrelating experience for 96 frames...
[2023-02-24 17:11:19,916][2343750] Decorrelating experience for 32 frames...
[2023-02-24 17:11:19,952][2343744] Decorrelating experience for 96 frames...
[2023-02-24 17:11:20,019][2343746] Decorrelating experience for 96 frames...
[2023-02-24 17:11:20,028][2343749] Decorrelating experience for 64 frames...
[2023-02-24 17:11:20,101][2343747] Decorrelating experience for 0 frames...
[2023-02-24 17:11:20,240][2343749] Decorrelating experience for 96 frames...
[2023-02-24 17:11:20,241][2343745] Decorrelating experience for 32 frames...
[2023-02-24 17:11:20,296][2343750] Decorrelating experience for 64 frames...
[2023-02-24 17:11:20,325][2343743] Decorrelating experience for 96 frames...
[2023-02-24 17:11:20,369][2343747] Decorrelating experience for 32 frames...
[2023-02-24 17:11:20,471][2343745] Decorrelating experience for 64 frames...
[2023-02-24 17:11:20,530][2343750] Decorrelating experience for 96 frames...
[2023-02-24 17:11:20,679][2343747] Decorrelating experience for 64 frames...
[2023-02-24 17:11:20,777][2343745] Decorrelating experience for 96 frames...
[2023-02-24 17:11:20,815][2343698] Signal inference workers to stop experience collection...
[2023-02-24 17:11:20,819][2343726] InferenceWorker_p0-w0: stopping experience collection
[2023-02-24 17:11:20,912][2343747] Decorrelating experience for 96 frames...
[2023-02-24 17:11:21,610][2343698] Signal inference workers to resume experience collection...
[2023-02-24 17:11:21,610][2343726] InferenceWorker_p0-w0: resuming experience collection
[2023-02-24 17:11:23,236][2343622] Fps is (10 sec: 7372.6, 60 sec: 7372.6, 300 sec: 7372.6). Total num frames: 36864. Throughput: 0: 626.0. Samples: 3130. Policy #0 lag: (min: 0.0, avg: 0.0, max: 0.0)
[2023-02-24 17:11:23,242][2343622] Avg episode reward: [(0, '4.086')]
[2023-02-24 17:11:23,269][2343726] Updated weights for policy 0, policy_version 10 (0.0231)
[2023-02-24 17:11:24,717][2343726] Updated weights for policy 0, policy_version 20 (0.0007)
[2023-02-24 17:11:26,117][2343726] Updated weights for policy 0, policy_version 30 (0.0007)
[2023-02-24 17:11:27,512][2343726] Updated weights for policy 0, policy_version 40 (0.0007)
[2023-02-24 17:11:28,236][2343622] Fps is (10 sec: 18431.9, 60 sec: 18431.9, 300 sec: 18431.9). Total num frames: 184320. Throughput: 0: 4463.4. Samples: 44634. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0)
[2023-02-24 17:11:28,242][2343622] Avg episode reward: [(0, '4.252')]
[2023-02-24 17:11:28,243][2343698] Saving new best policy, reward=4.252!
[2023-02-24 17:11:28,928][2343726] Updated weights for policy 0, policy_version 50 (0.0007)
[2023-02-24 17:11:30,360][2343726] Updated weights for policy 0, policy_version 60 (0.0006)
[2023-02-24 17:11:31,782][2343726] Updated weights for policy 0, policy_version 70 (0.0007)
[2023-02-24 17:11:33,190][2343726] Updated weights for policy 0, policy_version 80 (0.0007)
[2023-02-24 17:11:33,236][2343622] Fps is (10 sec: 29081.8, 60 sec: 21845.2, 300 sec: 21845.2). Total num frames: 327680. Throughput: 0: 4426.8. Samples: 66402. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-02-24 17:11:33,244][2343622] Avg episode reward: [(0, '4.526')]
[2023-02-24 17:11:33,247][2343698] Saving new best policy, reward=4.526!
[2023-02-24 17:11:34,623][2343726] Updated weights for policy 0, policy_version 90 (0.0007)
[2023-02-24 17:11:34,792][2343622] Heartbeat connected on Batcher_0
[2023-02-24 17:11:34,794][2343622] Heartbeat connected on LearnerWorker_p0
[2023-02-24 17:11:34,800][2343622] Heartbeat connected on InferenceWorker_p0-w0
[2023-02-24 17:11:34,801][2343622] Heartbeat connected on RolloutWorker_w0
[2023-02-24 17:11:34,802][2343622] Heartbeat connected on RolloutWorker_w1
[2023-02-24 17:11:34,807][2343622] Heartbeat connected on RolloutWorker_w3
[2023-02-24 17:11:34,808][2343622] Heartbeat connected on RolloutWorker_w2
[2023-02-24 17:11:34,809][2343622] Heartbeat connected on RolloutWorker_w4
[2023-02-24 17:11:34,810][2343622] Heartbeat connected on RolloutWorker_w5
[2023-02-24 17:11:34,813][2343622] Heartbeat connected on RolloutWorker_w6
[2023-02-24 17:11:34,816][2343622] Heartbeat connected on RolloutWorker_w7
[2023-02-24 17:11:36,090][2343726] Updated weights for policy 0, policy_version 100 (0.0007)
[2023-02-24 17:11:37,537][2343726] Updated weights for policy 0, policy_version 110 (0.0007)
[2023-02-24 17:11:38,236][2343622] Fps is (10 sec: 28262.5, 60 sec: 23347.2, 300 sec: 23347.2). Total num frames: 466944. Throughput: 0: 5462.9. Samples: 109258. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0)
[2023-02-24 17:11:38,246][2343622] Avg episode reward: [(0, '4.374')]
[2023-02-24 17:11:38,984][2343726] Updated weights for policy 0, policy_version 120 (0.0007)
[2023-02-24 17:11:40,452][2343726] Updated weights for policy 0, policy_version 130 (0.0007)
[2023-02-24 17:11:41,886][2343726] Updated weights for policy 0, policy_version 140 (0.0007)
[2023-02-24 17:11:43,236][2343622] Fps is (10 sec: 28262.5, 60 sec: 24412.1, 300 sec: 24412.1). Total num frames: 610304. Throughput: 0: 6064.6. Samples: 151616. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0)
[2023-02-24 17:11:43,248][2343622] Avg episode reward: [(0, '4.299')]
[2023-02-24 17:11:43,339][2343726] Updated weights for policy 0, policy_version 150 (0.0007)
[2023-02-24 17:11:44,774][2343726] Updated weights for policy 0, policy_version 160 (0.0007)
[2023-02-24 17:11:46,247][2343726] Updated weights for policy 0, policy_version 170 (0.0007)
[2023-02-24 17:11:47,693][2343726] Updated weights for policy 0, policy_version 180 (0.0007)
[2023-02-24 17:11:48,236][2343622] Fps is (10 sec: 28262.2, 60 sec: 24985.5, 300 sec: 24985.5). Total num frames: 749568. Throughput: 0: 5760.8. Samples: 172826. Policy #0 lag: (min: 0.0, avg: 0.8, max: 2.0)
[2023-02-24 17:11:48,236][2343622] Avg episode reward: [(0, '4.570')]
[2023-02-24 17:11:48,259][2343698] Saving new best policy, reward=4.570!
[2023-02-24 17:11:49,140][2343726] Updated weights for policy 0, policy_version 190 (0.0008)
[2023-02-24 17:11:50,563][2343726] Updated weights for policy 0, policy_version 200 (0.0007)
[2023-02-24 17:11:52,015][2343726] Updated weights for policy 0, policy_version 210 (0.0007)
[2023-02-24 17:11:53,236][2343622] Fps is (10 sec: 28262.4, 60 sec: 25512.2, 300 sec: 25512.2). Total num frames: 892928. Throughput: 0: 6152.8. Samples: 215350. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-02-24 17:11:53,238][2343622] Avg episode reward: [(0, '4.848')]
[2023-02-24 17:11:53,242][2343698] Saving new best policy, reward=4.848!
[2023-02-24 17:11:53,484][2343726] Updated weights for policy 0, policy_version 220 (0.0007)
[2023-02-24 17:11:54,907][2343726] Updated weights for policy 0, policy_version 230 (0.0007)
[2023-02-24 17:11:56,319][2343726] Updated weights for policy 0, policy_version 240 (0.0007)
[2023-02-24 17:11:57,782][2343726] Updated weights for policy 0, policy_version 250 (0.0006)
[2023-02-24 17:11:58,236][2343622] Fps is (10 sec: 28672.2, 60 sec: 25907.2, 300 sec: 25907.2). Total num frames: 1036288. Throughput: 0: 6446.3. Samples: 257854. Policy #0 lag: (min: 0.0, avg: 0.5, max: 2.0)
[2023-02-24 17:11:58,236][2343622] Avg episode reward: [(0, '4.846')]
[2023-02-24 17:11:59,235][2343726] Updated weights for policy 0, policy_version 260 (0.0007)
[2023-02-24 17:12:00,862][2343726] Updated weights for policy 0, policy_version 270 (0.0008)
[2023-02-24 17:12:02,356][2343726] Updated weights for policy 0, policy_version 280 (0.0007)
[2023-02-24 17:12:03,236][2343622] Fps is (10 sec: 27853.0, 60 sec: 26032.4, 300 sec: 26032.4). Total num frames: 1171456. Throughput: 0: 6174.2. Samples: 277840. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-02-24 17:12:03,236][2343622] Avg episode reward: [(0, '4.947')]
[2023-02-24 17:12:03,239][2343698] Saving new best policy, reward=4.947!
[2023-02-24 17:12:03,847][2343726] Updated weights for policy 0, policy_version 290 (0.0008)
[2023-02-24 17:12:05,268][2343726] Updated weights for policy 0, policy_version 300 (0.0007)
[2023-02-24 17:12:06,687][2343726] Updated weights for policy 0, policy_version 310 (0.0006)
[2023-02-24 17:12:08,150][2343726] Updated weights for policy 0, policy_version 320 (0.0007)
[2023-02-24 17:12:08,236][2343622] Fps is (10 sec: 27443.2, 60 sec: 26214.4, 300 sec: 26214.4). Total num frames: 1310720. Throughput: 0: 7043.1. Samples: 320070. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0)
[2023-02-24 17:12:08,236][2343622] Avg episode reward: [(0, '6.912')]
[2023-02-24 17:12:08,236][2343698] Saving new best policy, reward=6.912!
[2023-02-24 17:12:09,606][2343726] Updated weights for policy 0, policy_version 330 (0.0008)
[2023-02-24 17:12:11,175][2343726] Updated weights for policy 0, policy_version 340 (0.0007)
[2023-02-24 17:12:12,639][2343726] Updated weights for policy 0, policy_version 350 (0.0007)
[2023-02-24 17:12:13,236][2343622] Fps is (10 sec: 27443.0, 60 sec: 26288.8, 300 sec: 26288.8). Total num frames: 1445888. Throughput: 0: 7036.6. Samples: 361282. Policy #0 lag: (min: 0.0, avg: 0.8, max: 2.0)
[2023-02-24 17:12:13,236][2343622] Avg episode reward: [(0, '8.564')]
[2023-02-24 17:12:13,242][2343698] Saving new best policy, reward=8.564!
[2023-02-24 17:12:14,109][2343726] Updated weights for policy 0, policy_version 360 (0.0007)
[2023-02-24 17:12:15,599][2343726] Updated weights for policy 0, policy_version 370 (0.0007)
[2023-02-24 17:12:17,098][2343726] Updated weights for policy 0, policy_version 380 (0.0007)
[2023-02-24 17:12:18,236][2343622] Fps is (10 sec: 27443.2, 60 sec: 26419.2, 300 sec: 26419.2). Total num frames: 1585152. Throughput: 0: 7014.2. Samples: 382040. Policy #0 lag: (min: 0.0, avg: 0.8, max: 2.0)
[2023-02-24 17:12:18,236][2343622] Avg episode reward: [(0, '8.241')]
[2023-02-24 17:12:18,592][2343726] Updated weights for policy 0, policy_version 390 (0.0008)
[2023-02-24 17:12:20,082][2343726] Updated weights for policy 0, policy_version 400 (0.0007)
[2023-02-24 17:12:21,524][2343726] Updated weights for policy 0, policy_version 410 (0.0007)
[2023-02-24 17:12:22,983][2343726] Updated weights for policy 0, policy_version 420 (0.0007)
[2023-02-24 17:12:23,236][2343622] Fps is (10 sec: 27852.9, 60 sec: 28125.9, 300 sec: 26529.5). Total num frames: 1724416. Throughput: 0: 6984.0. Samples: 423538. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-02-24 17:12:23,236][2343622] Avg episode reward: [(0, '10.228')]
[2023-02-24 17:12:23,263][2343698] Saving new best policy, reward=10.228!
[2023-02-24 17:12:24,436][2343726] Updated weights for policy 0, policy_version 430 (0.0007)
[2023-02-24 17:12:25,914][2343726] Updated weights for policy 0, policy_version 440 (0.0007)
[2023-02-24 17:12:27,379][2343726] Updated weights for policy 0, policy_version 450 (0.0007)
[2023-02-24 17:12:28,236][2343622] Fps is (10 sec: 27852.7, 60 sec: 27989.3, 300 sec: 26624.0). Total num frames: 1863680. Throughput: 0: 6975.6. Samples: 465520. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-02-24 17:12:28,236][2343622] Avg episode reward: [(0, '12.021')]
[2023-02-24 17:12:28,264][2343698] Saving new best policy, reward=12.021!
[2023-02-24 17:12:28,842][2343726] Updated weights for policy 0, policy_version 460 (0.0007)
[2023-02-24 17:12:30,319][2343726] Updated weights for policy 0, policy_version 470 (0.0007)
[2023-02-24 17:12:31,757][2343726] Updated weights for policy 0, policy_version 480 (0.0007)
[2023-02-24 17:12:33,211][2343726] Updated weights for policy 0, policy_version 490 (0.0007)
[2023-02-24 17:12:33,236][2343622] Fps is (10 sec: 28262.3, 60 sec: 27989.3, 300 sec: 26760.5). Total num frames: 2007040. Throughput: 0: 6972.7. Samples: 486596. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0)
[2023-02-24 17:12:33,236][2343622] Avg episode reward: [(0, '14.858')]
[2023-02-24 17:12:33,239][2343698] Saving new best policy, reward=14.858!
[2023-02-24 17:12:34,673][2343726] Updated weights for policy 0, policy_version 500 (0.0007)
[2023-02-24 17:12:36,167][2343726] Updated weights for policy 0, policy_version 510 (0.0007)
[2023-02-24 17:12:37,680][2343726] Updated weights for policy 0, policy_version 520 (0.0007)
[2023-02-24 17:12:38,236][2343622] Fps is (10 sec: 27852.8, 60 sec: 27921.0, 300 sec: 26777.6). Total num frames: 2142208. Throughput: 0: 6958.1. Samples: 528464. Policy #0 lag: (min: 0.0, avg: 0.8, max: 2.0)
[2023-02-24 17:12:38,236][2343622] Avg episode reward: [(0, '13.124')]
[2023-02-24 17:12:39,147][2343726] Updated weights for policy 0, policy_version 530 (0.0007)
[2023-02-24 17:12:40,649][2343726] Updated weights for policy 0, policy_version 540 (0.0008)
[2023-02-24 17:12:42,118][2343726] Updated weights for policy 0, policy_version 550 (0.0007)
[2023-02-24 17:12:43,236][2343622] Fps is (10 sec: 27443.2, 60 sec: 27852.8, 300 sec: 26840.8). Total num frames: 2281472. Throughput: 0: 6928.3. Samples: 569626. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0)
[2023-02-24 17:12:43,236][2343622] Avg episode reward: [(0, '14.789')]
[2023-02-24 17:12:43,605][2343726] Updated weights for policy 0, policy_version 560 (0.0007)
[2023-02-24 17:12:45,110][2343726] Updated weights for policy 0, policy_version 570 (0.0006)
[2023-02-24 17:12:46,567][2343726] Updated weights for policy 0, policy_version 580 (0.0007)
[2023-02-24 17:12:48,043][2343726] Updated weights for policy 0, policy_version 590 (0.0007)
[2023-02-24 17:12:48,236][2343622] Fps is (10 sec: 27853.0, 60 sec: 27852.8, 300 sec: 26897.1). Total num frames: 2420736. Throughput: 0: 6946.0. Samples: 590410. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-02-24 17:12:48,236][2343622] Avg episode reward: [(0, '19.318')]
[2023-02-24 17:12:48,236][2343698] Saving new best policy, reward=19.318!
[2023-02-24 17:12:49,544][2343726] Updated weights for policy 0, policy_version 600 (0.0006)
[2023-02-24 17:12:50,998][2343726] Updated weights for policy 0, policy_version 610 (0.0007)
[2023-02-24 17:12:52,482][2343726] Updated weights for policy 0, policy_version 620 (0.0006)
[2023-02-24 17:12:53,236][2343622] Fps is (10 sec: 27852.8, 60 sec: 27784.5, 300 sec: 26947.4). Total num frames: 2560000. Throughput: 0: 6928.7. Samples: 631860. Policy #0 lag: (min: 0.0, avg: 0.5, max: 1.0)
[2023-02-24 17:12:53,236][2343622] Avg episode reward: [(0, '16.987')]
[2023-02-24 17:12:53,955][2343726] Updated weights for policy 0, policy_version 630 (0.0007)
[2023-02-24 17:12:55,438][2343726] Updated weights for policy 0, policy_version 640 (0.0007)
[2023-02-24 17:12:56,905][2343726] Updated weights for policy 0, policy_version 650 (0.0007)
[2023-02-24 17:12:58,236][2343622] Fps is (10 sec: 27852.8, 60 sec: 27716.3, 300 sec: 26992.6). Total num frames: 2699264. Throughput: 0: 6940.9. Samples: 673620. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0)
[2023-02-24 17:12:58,236][2343622] Avg episode reward: [(0, '17.090')]
[2023-02-24 17:12:58,372][2343726] Updated weights for policy 0, policy_version 660 (0.0007)
[2023-02-24 17:12:59,889][2343726] Updated weights for policy 0, policy_version 670 (0.0008)
[2023-02-24 17:13:01,368][2343726] Updated weights for policy 0, policy_version 680 (0.0007)
[2023-02-24 17:13:02,828][2343726] Updated weights for policy 0, policy_version 690 (0.0007)
[2023-02-24 17:13:03,236][2343622] Fps is (10 sec: 27443.1, 60 sec: 27716.2, 300 sec: 26994.6). Total num frames: 2834432. Throughput: 0: 6933.1. Samples: 694028. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-02-24 17:13:03,236][2343622] Avg episode reward: [(0, '22.089')]
[2023-02-24 17:13:03,240][2343698] Saving new best policy, reward=22.089!
[2023-02-24 17:13:04,318][2343726] Updated weights for policy 0, policy_version 700 (0.0008)
[2023-02-24 17:13:05,824][2343726] Updated weights for policy 0, policy_version 710 (0.0008)
[2023-02-24 17:13:07,331][2343726] Updated weights for policy 0, policy_version 720 (0.0008)
[2023-02-24 17:13:08,236][2343622] Fps is (10 sec: 27443.1, 60 sec: 27716.3, 300 sec: 27033.6). Total num frames: 2973696. Throughput: 0: 6930.7. Samples: 735420. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0)
[2023-02-24 17:13:08,236][2343622] Avg episode reward: [(0, '19.494')]
[2023-02-24 17:13:08,793][2343726] Updated weights for policy 0, policy_version 730 (0.0007)
[2023-02-24 17:13:10,279][2343726] Updated weights for policy 0, policy_version 740 (0.0007)
[2023-02-24 17:13:11,813][2343726] Updated weights for policy 0, policy_version 750 (0.0007)
[2023-02-24 17:13:13,236][2343622] Fps is (10 sec: 27443.2, 60 sec: 27716.3, 300 sec: 27033.6). Total num frames: 3108864. Throughput: 0: 6916.3. Samples: 776754. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0)
[2023-02-24 17:13:13,236][2343622] Avg episode reward: [(0, '20.543')]
[2023-02-24 17:13:13,240][2343698] Saving /home/sebas/research/hugging-face-course/vizdoom/train_dir/default_experiment/checkpoint_p0/checkpoint_000000759_3108864.pth...
[2023-02-24 17:13:13,329][2343726] Updated weights for policy 0, policy_version 760 (0.0008)
[2023-02-24 17:13:14,774][2343726] Updated weights for policy 0, policy_version 770 (0.0006)
[2023-02-24 17:13:16,248][2343726] Updated weights for policy 0, policy_version 780 (0.0008)
[2023-02-24 17:13:17,761][2343726] Updated weights for policy 0, policy_version 790 (0.0007)
[2023-02-24 17:13:18,236][2343622] Fps is (10 sec: 27443.2, 60 sec: 27716.3, 300 sec: 27067.7). Total num frames: 3248128. Throughput: 0: 6906.6. Samples: 797394. Policy #0 lag: (min: 0.0, avg: 0.6, max: 2.0)
[2023-02-24 17:13:18,236][2343622] Avg episode reward: [(0, '21.927')]
[2023-02-24 17:13:19,279][2343726] Updated weights for policy 0, policy_version 800 (0.0007)
[2023-02-24 17:13:20,759][2343726] Updated weights for policy 0, policy_version 810 (0.0007)
[2023-02-24 17:13:22,276][2343726] Updated weights for policy 0, policy_version 820 (0.0007)
[2023-02-24 17:13:23,236][2343622] Fps is (10 sec: 27443.3, 60 sec: 27648.0, 300 sec: 27066.4). Total num frames: 3383296. Throughput: 0: 6885.7. Samples: 838322. Policy #0 lag: (min: 0.0, avg: 0.8, max: 2.0)
[2023-02-24 17:13:23,236][2343622] Avg episode reward: [(0, '22.487')]
[2023-02-24 17:13:23,239][2343698] Saving new best policy, reward=22.487!
[2023-02-24 17:13:23,768][2343726] Updated weights for policy 0, policy_version 830 (0.0007)
[2023-02-24 17:13:25,246][2343726] Updated weights for policy 0, policy_version 840 (0.0007)
[2023-02-24 17:13:26,742][2343726] Updated weights for policy 0, policy_version 850 (0.0007)
[2023-02-24 17:13:28,236][2343622] Fps is (10 sec: 27034.0, 60 sec: 27579.8, 300 sec: 27065.1). Total num frames: 3518464. Throughput: 0: 6883.4. Samples: 879380. Policy #0 lag: (min: 0.0, avg: 0.8, max: 2.0)
[2023-02-24 17:13:28,236][2343622] Avg episode reward: [(0, '22.399')]
[2023-02-24 17:13:28,237][2343726] Updated weights for policy 0, policy_version 860 (0.0007)
[2023-02-24 17:13:29,767][2343726] Updated weights for policy 0, policy_version 870 (0.0007)
[2023-02-24 17:13:31,283][2343726] Updated weights for policy 0, policy_version 880 (0.0007)
[2023-02-24 17:13:32,771][2343726] Updated weights for policy 0, policy_version 890 (0.0007)
[2023-02-24 17:13:33,236][2343622] Fps is (10 sec: 27443.3, 60 sec: 27511.5, 300 sec: 27094.3). Total num frames: 3657728. Throughput: 0: 6874.0. Samples: 899738. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0)
[2023-02-24 17:13:33,236][2343622] Avg episode reward: [(0, '24.897')]
[2023-02-24 17:13:33,239][2343698] Saving new best policy, reward=24.897!
[2023-02-24 17:13:34,270][2343726] Updated weights for policy 0, policy_version 900 (0.0007)
[2023-02-24 17:13:35,771][2343726] Updated weights for policy 0, policy_version 910 (0.0007)
[2023-02-24 17:13:37,284][2343726] Updated weights for policy 0, policy_version 920 (0.0007)
[2023-02-24 17:13:38,236][2343622] Fps is (10 sec: 27442.9, 60 sec: 27511.5, 300 sec: 27092.1). Total num frames: 3792896. Throughput: 0: 6862.1. Samples: 940656. Policy #0 lag: (min: 0.0, avg: 0.8, max: 2.0)
[2023-02-24 17:13:38,236][2343622] Avg episode reward: [(0, '22.457')]
[2023-02-24 17:13:38,783][2343726] Updated weights for policy 0, policy_version 930 (0.0007)
[2023-02-24 17:13:40,263][2343726] Updated weights for policy 0, policy_version 940 (0.0007)
[2023-02-24 17:13:41,748][2343726] Updated weights for policy 0, policy_version 950 (0.0007)
[2023-02-24 17:13:43,236][2343622] Fps is (10 sec: 27033.6, 60 sec: 27443.2, 300 sec: 27090.1). Total num frames: 3928064. Throughput: 0: 6847.8. Samples: 981770. Policy #0 lag: (min: 0.0, avg: 0.7, max: 2.0)
[2023-02-24 17:13:43,236][2343622] Avg episode reward: [(0, '21.833')]
[2023-02-24 17:13:43,244][2343726] Updated weights for policy 0, policy_version 960 (0.0008)
[2023-02-24 17:13:44,723][2343726] Updated weights for policy 0, policy_version 970 (0.0007)
[2023-02-24 17:13:45,940][2343698] Stopping Batcher_0...
[2023-02-24 17:13:45,941][2343698] Loop batcher_evt_loop terminating...
[2023-02-24 17:13:45,941][2343698] Saving /home/sebas/research/hugging-face-course/vizdoom/train_dir/default_experiment/checkpoint_p0/checkpoint_000000978_4005888.pth...
[2023-02-24 17:13:45,945][2343622] Component Batcher_0 stopped!
[2023-02-24 17:13:45,949][2343744] Stopping RolloutWorker_w1...
[2023-02-24 17:13:45,949][2343745] Stopping RolloutWorker_w2...
[2023-02-24 17:13:45,949][2343622] Component RolloutWorker_w2 stopped!
[2023-02-24 17:13:45,949][2343622] Component RolloutWorker_w1 stopped!
[2023-02-24 17:13:45,949][2343744] Loop rollout_proc1_evt_loop terminating...
[2023-02-24 17:13:45,949][2343746] Stopping RolloutWorker_w4...
[2023-02-24 17:13:45,949][2343750] Stopping RolloutWorker_w3...
[2023-02-24 17:13:45,949][2343745] Loop rollout_proc2_evt_loop terminating...
[2023-02-24 17:13:45,949][2343749] Stopping RolloutWorker_w5...
[2023-02-24 17:13:45,950][2343622] Component RolloutWorker_w4 stopped!
[2023-02-24 17:13:45,950][2343746] Loop rollout_proc4_evt_loop terminating...
[2023-02-24 17:13:45,950][2343622] Component RolloutWorker_w3 stopped!
[2023-02-24 17:13:45,950][2343749] Loop rollout_proc5_evt_loop terminating...
[2023-02-24 17:13:45,950][2343750] Loop rollout_proc3_evt_loop terminating...
[2023-02-24 17:13:45,950][2343622] Component RolloutWorker_w5 stopped!
[2023-02-24 17:13:45,950][2343743] Stopping RolloutWorker_w0...
[2023-02-24 17:13:45,950][2343622] Component RolloutWorker_w0 stopped!
[2023-02-24 17:13:45,950][2343743] Loop rollout_proc0_evt_loop terminating...
[2023-02-24 17:13:45,953][2343622] Component RolloutWorker_w7 stopped!
[2023-02-24 17:13:45,953][2343747] Stopping RolloutWorker_w7...
[2023-02-24 17:13:45,953][2343747] Loop rollout_proc7_evt_loop terminating...
[2023-02-24 17:13:45,955][2343726] Weights refcount: 2 0
[2023-02-24 17:13:45,955][2343726] Stopping InferenceWorker_p0-w0...
[2023-02-24 17:13:45,956][2343622] Component InferenceWorker_p0-w0 stopped!
[2023-02-24 17:13:45,956][2343726] Loop inference_proc0-0_evt_loop terminating...
[2023-02-24 17:13:46,004][2343748] Stopping RolloutWorker_w6...
[2023-02-24 17:13:46,005][2343622] Component RolloutWorker_w6 stopped!
[2023-02-24 17:13:46,005][2343748] Loop rollout_proc6_evt_loop terminating...
[2023-02-24 17:13:46,008][2343698] Saving /home/sebas/research/hugging-face-course/vizdoom/train_dir/default_experiment/checkpoint_p0/checkpoint_000000978_4005888.pth...
[2023-02-24 17:13:46,105][2343698] Stopping LearnerWorker_p0...
[2023-02-24 17:13:46,105][2343622] Component LearnerWorker_p0 stopped!
[2023-02-24 17:13:46,106][2343698] Loop learner_proc0_evt_loop terminating...
[2023-02-24 17:13:46,106][2343622] Waiting for process learner_proc0 to stop...
[2023-02-24 17:13:46,712][2343622] Waiting for process inference_proc0-0 to join...
[2023-02-24 17:13:46,712][2343622] Waiting for process rollout_proc0 to join...
[2023-02-24 17:13:46,712][2343622] Waiting for process rollout_proc1 to join...
[2023-02-24 17:13:46,712][2343622] Waiting for process rollout_proc2 to join...
[2023-02-24 17:13:46,712][2343622] Waiting for process rollout_proc3 to join...
[2023-02-24 17:13:46,712][2343622] Waiting for process rollout_proc4 to join...
[2023-02-24 17:13:46,712][2343622] Waiting for process rollout_proc5 to join...
[2023-02-24 17:13:46,713][2343622] Waiting for process rollout_proc6 to join...
[2023-02-24 17:13:46,713][2343622] Waiting for process rollout_proc7 to join...
[2023-02-24 17:13:46,713][2343622] Batcher 0 profile tree view:
batching: 11.9572, releasing_batches: 0.0124
[2023-02-24 17:13:46,713][2343622] InferenceWorker_p0-w0 profile tree view:
wait_policy: 0.0000
wait_policy_total: 2.9043
update_model: 2.1825
weight_update: 0.0007
one_step: 0.0019
handle_policy_step: 133.4731
deserialize: 6.2447, stack: 0.7828, obs_to_device_normalize: 34.7488, forward: 55.1577, send_messages: 9.6852
prepare_outputs: 21.4301
to_cpu: 14.6154
[2023-02-24 17:13:46,714][2343622] Learner 0 profile tree view:
misc: 0.0029, prepare_batch: 6.2234
train: 18.9230
epoch_init: 0.0032, minibatch_init: 0.0035, losses_postprocess: 0.3250, kl_divergence: 0.1740, after_optimizer: 6.1827
calculate_losses: 6.6723
losses_init: 0.0020, forward_head: 0.5088, bptt_initial: 4.1259, tail: 0.3587, advantages_returns: 0.1020, losses: 0.7122
bptt: 0.7553
bptt_forward_core: 0.7253
update: 5.3342
clip: 0.8350
[2023-02-24 17:13:46,714][2343622] RolloutWorker_w0 profile tree view:
wait_for_trajectories: 0.1049, enqueue_policy_requests: 5.6350, env_step: 81.8998, overhead: 6.8628, complete_rollouts: 0.1662
save_policy_outputs: 6.0304
split_output_tensors: 2.8870
[2023-02-24 17:13:46,714][2343622] RolloutWorker_w7 profile tree view:
wait_for_trajectories: 0.1033, enqueue_policy_requests: 5.6327, env_step: 82.5081, overhead: 7.0028, complete_rollouts: 0.1684
save_policy_outputs: 6.1360
split_output_tensors: 2.9393
[2023-02-24 17:13:46,714][2343622] Loop Runner_EvtLoop terminating...
[2023-02-24 17:13:46,714][2343622] Runner profile tree view:
main_loop: 151.8999
[2023-02-24 17:13:46,715][2343622] Collected {0: 4005888}, FPS: 26371.9
[2023-02-24 17:32:32,877][2364708] Saving configuration to /home/sebas/research/hugging-face-course/vizdoom/train_dir/default_experiment/config.json...
[2023-02-24 17:32:32,878][2364708] Rollout worker 0 uses device cpu
[2023-02-24 17:32:32,878][2364708] Rollout worker 1 uses device cpu
[2023-02-24 17:32:32,878][2364708] Rollout worker 2 uses device cpu
[2023-02-24 17:32:32,878][2364708] Rollout worker 3 uses device cpu
[2023-02-24 17:32:32,878][2364708] Rollout worker 4 uses device cpu
[2023-02-24 17:32:32,879][2364708] Rollout worker 5 uses device cpu
[2023-02-24 17:32:32,879][2364708] Rollout worker 6 uses device cpu
[2023-02-24 17:32:32,879][2364708] Rollout worker 7 uses device cpu
[2023-02-24 17:32:32,879][2364708] Rollout worker 8 uses device cpu
[2023-02-24 17:32:32,879][2364708] Rollout worker 9 uses device cpu
[2023-02-24 17:32:32,879][2364708] Rollout worker 10 uses device cpu
[2023-02-24 17:32:32,879][2364708] Rollout worker 11 uses device cpu
[2023-02-24 17:32:32,879][2364708] Rollout worker 12 uses device cpu
[2023-02-24 17:32:32,880][2364708] Rollout worker 13 uses device cpu
[2023-02-24 17:32:32,880][2364708] Rollout worker 14 uses device cpu
[2023-02-24 17:32:32,880][2364708] Rollout worker 15 uses device cpu
[2023-02-24 17:32:32,936][2364708] Using GPUs [0] for process 0 (actually maps to GPUs [0])
[2023-02-24 17:32:32,936][2364708] InferenceWorker_p0-w0: min num requests: 5
[2023-02-24 17:32:32,970][2364708] Starting all processes...
[2023-02-24 17:32:32,970][2364708] Starting process learner_proc0
[2023-02-24 17:32:33,717][2364708] Starting all processes...
[2023-02-24 17:32:33,721][2364708] Starting process inference_proc0-0
[2023-02-24 17:32:33,721][2364708] Starting process rollout_proc0
[2023-02-24 17:32:33,721][2364708] Starting process rollout_proc1
[2023-02-24 17:32:33,722][2364791] Using GPUs [0] for process 0 (actually maps to GPUs [0])
[2023-02-24 17:32:33,722][2364791] Set environment var CUDA_VISIBLE_DEVICES to '0' (GPU indices [0]) for learning process 0
[2023-02-24 17:32:33,721][2364708] Starting process rollout_proc2
[2023-02-24 17:32:33,721][2364708] Starting process rollout_proc3
[2023-02-24 17:32:33,721][2364708] Starting process rollout_proc4
[2023-02-24 17:32:33,722][2364708] Starting process rollout_proc5
[2023-02-24 17:32:33,722][2364708] Starting process rollout_proc6
[2023-02-24 17:32:33,723][2364708] Starting process rollout_proc7
[2023-02-24 17:32:33,726][2364708] Starting process rollout_proc8
[2023-02-24 17:32:33,727][2364708] Starting process rollout_proc9
[2023-02-24 17:32:33,739][2364791] Num visible devices: 1
[2023-02-24 17:32:33,728][2364708] Starting process rollout_proc10
[2023-02-24 17:32:33,729][2364708] Starting process rollout_proc11
[2023-02-24 17:32:33,729][2364708] Starting process rollout_proc12
[2023-02-24 17:32:33,729][2364708] Starting process rollout_proc13
[2023-02-24 17:32:33,796][2364791] Starting seed is not provided
[2023-02-24 17:32:33,796][2364791] Using GPUs [0] for process 0 (actually maps to GPUs [0])
[2023-02-24 17:32:33,797][2364791] Initializing actor-critic model on device cuda:0
[2023-02-24 17:32:33,797][2364791] RunningMeanStd input shape: (3, 72, 128)
[2023-02-24 17:32:33,798][2364791] RunningMeanStd input shape: (1,)
[2023-02-24 17:32:33,730][2364708] Starting process rollout_proc14
[2023-02-24 17:32:33,813][2364791] ConvEncoder: input_channels=3
[2023-02-24 17:32:34,006][2364791] Conv encoder output size: 512
[2023-02-24 17:32:34,023][2364791] Policy head output size: 512
[2023-02-24 17:32:34,038][2364791] Created Actor Critic model with architecture:
[2023-02-24 17:32:34,050][2364791] ActorCriticSharedWeights(
(obs_normalizer): ObservationNormalizer(
(running_mean_std): RunningMeanStdDictInPlace(
(running_mean_std): ModuleDict(
(obs): RunningMeanStdInPlace()
)
)
)
(returns_normalizer): RecursiveScriptModule(original_name=RunningMeanStdInPlace)
(encoder): VizdoomEncoder(
(basic_encoder): ConvEncoder(
(enc): RecursiveScriptModule(
original_name=ConvEncoderImpl
(conv_head): RecursiveScriptModule(
original_name=Sequential
(0): RecursiveScriptModule(original_name=Conv2d)
(1): RecursiveScriptModule(original_name=ELU)
(2): RecursiveScriptModule(original_name=Conv2d)
(3): RecursiveScriptModule(original_name=ELU)
(4): RecursiveScriptModule(original_name=Conv2d)
(5): RecursiveScriptModule(original_name=ELU)
)
(mlp_layers): RecursiveScriptModule(
original_name=Sequential
(0): RecursiveScriptModule(original_name=Linear)
(1): RecursiveScriptModule(original_name=ELU)
)
)
)
)
(core): ModelCoreRNN(
(core): GRU(512, 512)
)
(decoder): MlpDecoder(
(mlp): Identity()
)
(critic_linear): Linear(in_features=512, out_features=1, bias=True)
(action_parameterization): ActionParameterizationDefault(
(distribution_linear): Linear(in_features=512, out_features=5, bias=True)
)
)
[2023-02-24 17:32:35,405][2364708] Starting process rollout_proc15
[2023-02-24 17:32:35,416][2364834] Worker 2 uses CPU cores [2]
[2023-02-24 17:32:35,418][2364833] Worker 1 uses CPU cores [1]
[2023-02-24 17:32:35,490][2364839] Worker 7 uses CPU cores [7]
[2023-02-24 17:32:35,500][2364855] Worker 10 uses CPU cores [10]
[2023-02-24 17:32:35,502][2364832] Worker 0 uses CPU cores [0]
[2023-02-24 17:32:35,502][2364831] Using GPUs [0] for process 0 (actually maps to GPUs [0])
[2023-02-24 17:32:35,503][2364831] Set environment var CUDA_VISIBLE_DEVICES to '0' (GPU indices [0]) for inference process 0
[2023-02-24 17:32:35,518][2364838] Worker 6 uses CPU cores [6]
[2023-02-24 17:32:35,519][2364831] Num visible devices: 1
[2023-02-24 17:32:35,530][2364862] Worker 14 uses CPU cores [14]
[2023-02-24 17:32:35,534][2364860] Worker 12 uses CPU cores [12]
[2023-02-24 17:32:35,554][2364835] Worker 4 uses CPU cores [4]
[2023-02-24 17:32:35,573][2364861] Worker 13 uses CPU cores [13]
[2023-02-24 17:32:35,598][2364859] Worker 8 uses CPU cores [8]
[2023-02-24 17:32:35,642][2364836] Worker 3 uses CPU cores [3]
[2023-02-24 17:32:35,643][2364837] Worker 5 uses CPU cores [5]
[2023-02-24 17:32:35,661][2364856] Worker 9 uses CPU cores [9]
[2023-02-24 17:32:35,712][2364857] Worker 11 uses CPU cores [11]
[2023-02-24 17:32:36,334][2365108] Worker 15 uses CPU cores [15]
[2023-02-24 17:32:36,424][2364791] Using optimizer <class 'torch.optim.adam.Adam'>
[2023-02-24 17:32:36,424][2364791] Loading state from checkpoint /home/sebas/research/hugging-face-course/vizdoom/train_dir/default_experiment/checkpoint_p0/checkpoint_000000978_4005888.pth...
[2023-02-24 17:32:36,444][2364791] Loading model from checkpoint
[2023-02-24 17:32:36,447][2364791] Loaded experiment state at self.train_step=978, self.env_steps=4005888
[2023-02-24 17:32:36,447][2364791] Initialized policy 0 weights for model version 978
[2023-02-24 17:32:36,449][2364791] LearnerWorker_p0 finished initialization!
[2023-02-24 17:32:36,449][2364791] Using GPUs [0] for process 0 (actually maps to GPUs [0])
[2023-02-24 17:32:36,525][2364831] RunningMeanStd input shape: (3, 72, 128)
[2023-02-24 17:32:36,525][2364831] RunningMeanStd input shape: (1,)
[2023-02-24 17:32:36,532][2364831] ConvEncoder: input_channels=3
[2023-02-24 17:32:36,595][2364831] Conv encoder output size: 512
[2023-02-24 17:32:36,595][2364831] Policy head output size: 512
[2023-02-24 17:32:37,849][2364708] Inference worker 0-0 is ready!
[2023-02-24 17:32:37,849][2364708] All inference workers are ready! Signal rollout workers to start!
[2023-02-24 17:32:37,887][2364862] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:32:37,887][2364836] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:32:37,888][2364833] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:32:37,888][2364835] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:32:37,888][2364832] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:32:37,888][2364859] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:32:37,887][2364857] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:32:37,889][2364856] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:32:37,889][2364837] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:32:37,889][2365108] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:32:37,889][2364839] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:32:37,889][2364860] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:32:37,887][2364855] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:32:37,888][2364861] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:32:37,888][2364834] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:32:37,892][2364838] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:32:38,292][2364855] Decorrelating experience for 0 frames...
[2023-02-24 17:32:38,293][2364833] Decorrelating experience for 0 frames...
[2023-02-24 17:32:38,294][2364857] Decorrelating experience for 0 frames...
[2023-02-24 17:32:38,294][2364836] Decorrelating experience for 0 frames...
[2023-02-24 17:32:38,295][2364834] Decorrelating experience for 0 frames...
[2023-02-24 17:32:38,295][2364856] Decorrelating experience for 0 frames...
[2023-02-24 17:32:38,296][2364859] Decorrelating experience for 0 frames...
[2023-02-24 17:32:38,296][2364832] Decorrelating experience for 0 frames...
[2023-02-24 17:32:38,484][2364856] Decorrelating experience for 32 frames...
[2023-02-24 17:32:38,511][2364862] Decorrelating experience for 0 frames...
[2023-02-24 17:32:38,516][2364861] Decorrelating experience for 0 frames...
[2023-02-24 17:32:38,517][2364835] Decorrelating experience for 0 frames...
[2023-02-24 17:32:38,517][2364839] Decorrelating experience for 0 frames...
[2023-02-24 17:32:38,585][2364834] Decorrelating experience for 32 frames...
[2023-02-24 17:32:38,617][2364857] Decorrelating experience for 32 frames...
[2023-02-24 17:32:38,617][2364836] Decorrelating experience for 32 frames...
[2023-02-24 17:32:38,628][2364855] Decorrelating experience for 32 frames...
[2023-02-24 17:32:38,697][2364856] Decorrelating experience for 64 frames...
[2023-02-24 17:32:38,700][2364862] Decorrelating experience for 32 frames...
[2023-02-24 17:32:38,719][2364832] Decorrelating experience for 32 frames...
[2023-02-24 17:32:38,821][2364857] Decorrelating experience for 64 frames...
[2023-02-24 17:32:38,827][2364835] Decorrelating experience for 32 frames...
[2023-02-24 17:32:38,834][2364860] Decorrelating experience for 0 frames...
[2023-02-24 17:32:38,844][2364834] Decorrelating experience for 64 frames...
[2023-02-24 17:32:38,858][2364861] Decorrelating experience for 32 frames...
[2023-02-24 17:32:38,877][2364859] Decorrelating experience for 32 frames...
[2023-02-24 17:32:38,926][2364856] Decorrelating experience for 96 frames...
[2023-02-24 17:32:39,022][2364860] Decorrelating experience for 32 frames...
[2023-02-24 17:32:39,026][2364837] Decorrelating experience for 0 frames...
[2023-02-24 17:32:39,041][2364832] Decorrelating experience for 64 frames...
[2023-02-24 17:32:39,129][2365108] Decorrelating experience for 0 frames...
[2023-02-24 17:32:39,155][2364839] Decorrelating experience for 32 frames...
[2023-02-24 17:32:39,180][2364861] Decorrelating experience for 64 frames...
[2023-02-24 17:32:39,263][2364832] Decorrelating experience for 96 frames...
[2023-02-24 17:32:39,298][2364837] Decorrelating experience for 32 frames...
[2023-02-24 17:32:39,332][2364835] Decorrelating experience for 64 frames...
[2023-02-24 17:32:39,336][2364836] Decorrelating experience for 64 frames...
[2023-02-24 17:32:39,359][2364857] Decorrelating experience for 96 frames...
[2023-02-24 17:32:39,365][2364838] Decorrelating experience for 0 frames...
[2023-02-24 17:32:39,378][2364834] Decorrelating experience for 96 frames...
[2023-02-24 17:32:39,523][2365108] Decorrelating experience for 32 frames...
[2023-02-24 17:32:39,535][2364833] Decorrelating experience for 32 frames...
[2023-02-24 17:32:39,552][2364862] Decorrelating experience for 64 frames...
[2023-02-24 17:32:39,593][2364857] Decorrelating experience for 128 frames...
[2023-02-24 17:32:39,623][2364861] Decorrelating experience for 96 frames...
[2023-02-24 17:32:39,682][2364835] Decorrelating experience for 96 frames...
[2023-02-24 17:32:39,695][2364860] Decorrelating experience for 64 frames...
[2023-02-24 17:32:39,708][2364837] Decorrelating experience for 64 frames...
[2023-02-24 17:32:39,748][2364856] Decorrelating experience for 128 frames...
[2023-02-24 17:32:39,783][2364862] Decorrelating experience for 96 frames...
[2023-02-24 17:32:39,789][2364832] Decorrelating experience for 128 frames...
[2023-02-24 17:32:39,875][2364857] Decorrelating experience for 160 frames...
[2023-02-24 17:32:39,884][2364855] Decorrelating experience for 64 frames...
[2023-02-24 17:32:39,898][2364861] Decorrelating experience for 128 frames...
[2023-02-24 17:32:39,924][2364859] Decorrelating experience for 64 frames...
[2023-02-24 17:32:40,010][2364838] Decorrelating experience for 32 frames...
[2023-02-24 17:32:40,043][2365108] Decorrelating experience for 64 frames...
[2023-02-24 17:32:40,047][2364856] Decorrelating experience for 160 frames...
[2023-02-24 17:32:40,051][2364860] Decorrelating experience for 96 frames...
[2023-02-24 17:32:40,070][2364839] Decorrelating experience for 64 frames...
[2023-02-24 17:32:40,088][2364835] Decorrelating experience for 128 frames...
[2023-02-24 17:32:40,123][2364834] Decorrelating experience for 128 frames...
[2023-02-24 17:32:40,183][2364833] Decorrelating experience for 64 frames...
[2023-02-24 17:32:40,213][2364859] Decorrelating experience for 96 frames...
[2023-02-24 17:32:40,319][2364838] Decorrelating experience for 64 frames...
[2023-02-24 17:32:40,333][2364835] Decorrelating experience for 160 frames...
[2023-02-24 17:32:40,337][2364832] Decorrelating experience for 160 frames...
[2023-02-24 17:32:40,372][2364862] Decorrelating experience for 128 frames...
[2023-02-24 17:32:40,404][2364861] Decorrelating experience for 160 frames...
[2023-02-24 17:32:40,429][2364837] Decorrelating experience for 96 frames...
[2023-02-24 17:32:40,514][2365108] Decorrelating experience for 96 frames...
[2023-02-24 17:32:40,535][2364839] Decorrelating experience for 96 frames...
[2023-02-24 17:32:40,595][2364834] Decorrelating experience for 160 frames...
[2023-02-24 17:32:40,629][2364860] Decorrelating experience for 128 frames...
[2023-02-24 17:32:40,640][2364833] Decorrelating experience for 96 frames...
[2023-02-24 17:32:40,643][2364859] Decorrelating experience for 128 frames...
[2023-02-24 17:32:40,748][2364838] Decorrelating experience for 96 frames...
[2023-02-24 17:32:40,777][2365108] Decorrelating experience for 128 frames...
[2023-02-24 17:32:40,793][2364836] Decorrelating experience for 96 frames...
[2023-02-24 17:32:40,912][2364837] Decorrelating experience for 128 frames...
[2023-02-24 17:32:40,918][2364791] Signal inference workers to stop experience collection...
[2023-02-24 17:32:40,922][2364831] InferenceWorker_p0-w0: stopping experience collection
[2023-02-24 17:32:40,955][2364860] Decorrelating experience for 160 frames...
[2023-02-24 17:32:40,971][2364859] Decorrelating experience for 160 frames...
[2023-02-24 17:32:41,025][2364833] Decorrelating experience for 128 frames...
[2023-02-24 17:32:41,056][2364836] Decorrelating experience for 128 frames...
[2023-02-24 17:32:41,076][2364838] Decorrelating experience for 128 frames...
[2023-02-24 17:32:41,132][2365108] Decorrelating experience for 160 frames...
[2023-02-24 17:32:41,199][2364837] Decorrelating experience for 160 frames...
[2023-02-24 17:32:41,202][2364862] Decorrelating experience for 160 frames...
[2023-02-24 17:32:41,266][2364708] Fps is (10 sec: nan, 60 sec: nan, 300 sec: nan). Total num frames: 4005888. Throughput: 0: nan. Samples: 2304. Policy #0 lag: (min: -1.0, avg: -1.0, max: -1.0)
[2023-02-24 17:32:41,266][2364708] Avg episode reward: [(0, '1.871')]
[2023-02-24 17:32:41,343][2364839] Decorrelating experience for 128 frames...
[2023-02-24 17:32:41,343][2364836] Decorrelating experience for 160 frames...
[2023-02-24 17:32:41,392][2364838] Decorrelating experience for 160 frames...
[2023-02-24 17:32:41,443][2364855] Decorrelating experience for 96 frames...
[2023-02-24 17:32:41,569][2364833] Decorrelating experience for 160 frames...
[2023-02-24 17:32:41,656][2364855] Decorrelating experience for 128 frames...
[2023-02-24 17:32:41,671][2364839] Decorrelating experience for 160 frames...
[2023-02-24 17:32:41,801][2364791] Signal inference workers to resume experience collection...
[2023-02-24 17:32:41,801][2364831] InferenceWorker_p0-w0: resuming experience collection
[2023-02-24 17:32:41,877][2364855] Decorrelating experience for 160 frames...
[2023-02-24 17:32:43,213][2364831] Updated weights for policy 0, policy_version 988 (0.0223)
[2023-02-24 17:32:44,160][2364831] Updated weights for policy 0, policy_version 998 (0.0009)
[2023-02-24 17:32:45,129][2364831] Updated weights for policy 0, policy_version 1008 (0.0011)
[2023-02-24 17:32:46,071][2364831] Updated weights for policy 0, policy_version 1018 (0.0009)
[2023-02-24 17:32:46,266][2364708] Fps is (10 sec: 33586.6, 60 sec: 33586.6, 300 sec: 33586.6). Total num frames: 4173824. Throughput: 0: 3350.3. Samples: 19056. Policy #0 lag: (min: 0.0, avg: 1.8, max: 4.0)
[2023-02-24 17:32:46,266][2364708] Avg episode reward: [(0, '18.870')]
[2023-02-24 17:32:47,062][2364831] Updated weights for policy 0, policy_version 1028 (0.0009)
[2023-02-24 17:32:48,070][2364831] Updated weights for policy 0, policy_version 1038 (0.0012)
[2023-02-24 17:32:49,031][2364831] Updated weights for policy 0, policy_version 1048 (0.0009)
[2023-02-24 17:32:50,044][2364831] Updated weights for policy 0, policy_version 1058 (0.0016)
[2023-02-24 17:32:50,993][2364831] Updated weights for policy 0, policy_version 1068 (0.0013)
[2023-02-24 17:32:51,266][2364708] Fps is (10 sec: 37683.0, 60 sec: 37683.0, 300 sec: 37683.0). Total num frames: 4382720. Throughput: 0: 7973.1. Samples: 82035. Policy #0 lag: (min: 0.0, avg: 1.7, max: 4.0)
[2023-02-24 17:32:51,266][2364708] Avg episode reward: [(0, '24.692')]
[2023-02-24 17:32:51,978][2364831] Updated weights for policy 0, policy_version 1078 (0.0010)
[2023-02-24 17:32:52,932][2364708] Heartbeat connected on Batcher_0
[2023-02-24 17:32:52,934][2364708] Heartbeat connected on LearnerWorker_p0
[2023-02-24 17:32:52,941][2364708] Heartbeat connected on InferenceWorker_p0-w0
[2023-02-24 17:32:52,944][2364708] Heartbeat connected on RolloutWorker_w2
[2023-02-24 17:32:52,945][2364708] Heartbeat connected on RolloutWorker_w3
[2023-02-24 17:32:52,949][2364708] Heartbeat connected on RolloutWorker_w4
[2023-02-24 17:32:52,949][2364708] Heartbeat connected on RolloutWorker_w5
[2023-02-24 17:32:52,953][2364708] Heartbeat connected on RolloutWorker_w6
[2023-02-24 17:32:52,953][2364708] Heartbeat connected on RolloutWorker_w7
[2023-02-24 17:32:52,955][2364708] Heartbeat connected on RolloutWorker_w0
[2023-02-24 17:32:52,955][2364708] Heartbeat connected on RolloutWorker_w1
[2023-02-24 17:32:52,957][2364708] Heartbeat connected on RolloutWorker_w8
[2023-02-24 17:32:52,958][2364708] Heartbeat connected on RolloutWorker_w9
[2023-02-24 17:32:52,962][2364708] Heartbeat connected on RolloutWorker_w10
[2023-02-24 17:32:52,963][2364708] Heartbeat connected on RolloutWorker_w11
[2023-02-24 17:32:52,966][2364708] Heartbeat connected on RolloutWorker_w13
[2023-02-24 17:32:52,967][2364708] Heartbeat connected on RolloutWorker_w12
[2023-02-24 17:32:52,968][2364708] Heartbeat connected on RolloutWorker_w14
[2023-02-24 17:32:52,981][2364708] Heartbeat connected on RolloutWorker_w15
[2023-02-24 17:32:52,981][2364831] Updated weights for policy 0, policy_version 1088 (0.0012)
[2023-02-24 17:32:53,959][2364831] Updated weights for policy 0, policy_version 1098 (0.0010)
[2023-02-24 17:32:54,954][2364831] Updated weights for policy 0, policy_version 1108 (0.0009)
[2023-02-24 17:32:55,930][2364831] Updated weights for policy 0, policy_version 1118 (0.0011)
[2023-02-24 17:32:56,266][2364708] Fps is (10 sec: 41779.4, 60 sec: 39048.4, 300 sec: 39048.4). Total num frames: 4591616. Throughput: 0: 9485.2. Samples: 144582. Policy #0 lag: (min: 0.0, avg: 2.0, max: 4.0)
[2023-02-24 17:32:56,266][2364708] Avg episode reward: [(0, '25.204')]
[2023-02-24 17:32:56,266][2364791] Saving new best policy, reward=25.204!
[2023-02-24 17:32:56,911][2364831] Updated weights for policy 0, policy_version 1128 (0.0014)
[2023-02-24 17:32:57,948][2364831] Updated weights for policy 0, policy_version 1138 (0.0011)
[2023-02-24 17:32:58,912][2364831] Updated weights for policy 0, policy_version 1148 (0.0011)
[2023-02-24 17:32:59,887][2364831] Updated weights for policy 0, policy_version 1158 (0.0012)
[2023-02-24 17:33:00,871][2364831] Updated weights for policy 0, policy_version 1168 (0.0016)
[2023-02-24 17:33:01,266][2364708] Fps is (10 sec: 41369.3, 60 sec: 39526.2, 300 sec: 39526.2). Total num frames: 4796416. Throughput: 0: 8660.6. Samples: 175518. Policy #0 lag: (min: 0.0, avg: 1.7, max: 4.0)
[2023-02-24 17:33:01,266][2364708] Avg episode reward: [(0, '24.312')]
[2023-02-24 17:33:01,888][2364831] Updated weights for policy 0, policy_version 1178 (0.0009)
[2023-02-24 17:33:02,860][2364831] Updated weights for policy 0, policy_version 1188 (0.0009)
[2023-02-24 17:33:03,875][2364831] Updated weights for policy 0, policy_version 1198 (0.0014)
[2023-02-24 17:33:04,853][2364831] Updated weights for policy 0, policy_version 1208 (0.0009)
[2023-02-24 17:33:05,857][2364831] Updated weights for policy 0, policy_version 1218 (0.0012)
[2023-02-24 17:33:06,266][2364708] Fps is (10 sec: 41368.6, 60 sec: 39976.5, 300 sec: 39976.5). Total num frames: 5005312. Throughput: 0: 9410.3. Samples: 237564. Policy #0 lag: (min: 0.0, avg: 1.8, max: 4.0)
[2023-02-24 17:33:06,266][2364708] Avg episode reward: [(0, '24.517')]
[2023-02-24 17:33:06,796][2364831] Updated weights for policy 0, policy_version 1228 (0.0011)
[2023-02-24 17:33:07,774][2364831] Updated weights for policy 0, policy_version 1238 (0.0011)
[2023-02-24 17:33:08,779][2364831] Updated weights for policy 0, policy_version 1248 (0.0012)
[2023-02-24 17:33:09,772][2364831] Updated weights for policy 0, policy_version 1258 (0.0010)
[2023-02-24 17:33:10,736][2364831] Updated weights for policy 0, policy_version 1268 (0.0014)
[2023-02-24 17:33:11,266][2364708] Fps is (10 sec: 41779.2, 60 sec: 40277.2, 300 sec: 40277.2). Total num frames: 5214208. Throughput: 0: 9920.8. Samples: 299928. Policy #0 lag: (min: 0.0, avg: 1.9, max: 4.0)
[2023-02-24 17:33:11,266][2364708] Avg episode reward: [(0, '30.044')]
[2023-02-24 17:33:11,269][2364791] Saving new best policy, reward=30.044!
[2023-02-24 17:33:11,776][2364831] Updated weights for policy 0, policy_version 1278 (0.0009)
[2023-02-24 17:33:12,732][2364831] Updated weights for policy 0, policy_version 1288 (0.0009)
[2023-02-24 17:33:13,718][2364831] Updated weights for policy 0, policy_version 1298 (0.0014)
[2023-02-24 17:33:14,743][2364831] Updated weights for policy 0, policy_version 1308 (0.0011)
[2023-02-24 17:33:15,710][2364831] Updated weights for policy 0, policy_version 1318 (0.0012)
[2023-02-24 17:33:16,266][2364708] Fps is (10 sec: 41370.3, 60 sec: 40374.7, 300 sec: 40374.7). Total num frames: 5419008. Throughput: 0: 9385.3. Samples: 330789. Policy #0 lag: (min: 0.0, avg: 1.9, max: 4.0)
[2023-02-24 17:33:16,266][2364708] Avg episode reward: [(0, '26.183')]
[2023-02-24 17:33:16,724][2364831] Updated weights for policy 0, policy_version 1328 (0.0008)
[2023-02-24 17:33:17,710][2364831] Updated weights for policy 0, policy_version 1338 (0.0014)
[2023-02-24 17:33:18,666][2364831] Updated weights for policy 0, policy_version 1348 (0.0009)
[2023-02-24 17:33:19,683][2364831] Updated weights for policy 0, policy_version 1358 (0.0009)
[2023-02-24 17:33:20,617][2364831] Updated weights for policy 0, policy_version 1368 (0.0010)
[2023-02-24 17:33:21,266][2364708] Fps is (10 sec: 41370.0, 60 sec: 40550.4, 300 sec: 40550.4). Total num frames: 5627904. Throughput: 0: 9770.6. Samples: 393129. Policy #0 lag: (min: 1.0, avg: 2.0, max: 4.0)
[2023-02-24 17:33:21,266][2364708] Avg episode reward: [(0, '25.398')]
[2023-02-24 17:33:21,637][2364831] Updated weights for policy 0, policy_version 1378 (0.0009)
[2023-02-24 17:33:22,601][2364831] Updated weights for policy 0, policy_version 1388 (0.0012)
[2023-02-24 17:33:23,610][2364831] Updated weights for policy 0, policy_version 1398 (0.0009)
[2023-02-24 17:33:24,587][2364831] Updated weights for policy 0, policy_version 1408 (0.0012)
[2023-02-24 17:33:25,585][2364831] Updated weights for policy 0, policy_version 1418 (0.0010)
[2023-02-24 17:33:26,266][2364708] Fps is (10 sec: 41369.7, 60 sec: 40595.8, 300 sec: 40595.8). Total num frames: 5832704. Throughput: 0: 10069.7. Samples: 455439. Policy #0 lag: (min: 0.0, avg: 1.7, max: 4.0)
[2023-02-24 17:33:26,266][2364708] Avg episode reward: [(0, '24.553')]
[2023-02-24 17:33:26,570][2364831] Updated weights for policy 0, policy_version 1428 (0.0009)
[2023-02-24 17:33:27,565][2364831] Updated weights for policy 0, policy_version 1438 (0.0009)
[2023-02-24 17:33:28,519][2364831] Updated weights for policy 0, policy_version 1448 (0.0012)
[2023-02-24 17:33:29,541][2364831] Updated weights for policy 0, policy_version 1458 (0.0011)
[2023-02-24 17:33:30,518][2364831] Updated weights for policy 0, policy_version 1468 (0.0012)
[2023-02-24 17:33:31,266][2364708] Fps is (10 sec: 41369.0, 60 sec: 40714.1, 300 sec: 40714.1). Total num frames: 6041600. Throughput: 0: 10387.5. Samples: 486495. Policy #0 lag: (min: 0.0, avg: 2.1, max: 5.0)
[2023-02-24 17:33:31,266][2364708] Avg episode reward: [(0, '27.755')]
[2023-02-24 17:33:31,537][2364831] Updated weights for policy 0, policy_version 1478 (0.0009)
[2023-02-24 17:33:32,488][2364831] Updated weights for policy 0, policy_version 1488 (0.0009)
[2023-02-24 17:33:33,488][2364831] Updated weights for policy 0, policy_version 1498 (0.0010)
[2023-02-24 17:33:34,494][2364831] Updated weights for policy 0, policy_version 1508 (0.0009)
[2023-02-24 17:33:35,471][2364831] Updated weights for policy 0, policy_version 1518 (0.0012)
[2023-02-24 17:33:36,266][2364708] Fps is (10 sec: 41368.5, 60 sec: 40736.3, 300 sec: 40736.3). Total num frames: 6246400. Throughput: 0: 10367.1. Samples: 548559. Policy #0 lag: (min: 0.0, avg: 1.7, max: 4.0)
[2023-02-24 17:33:36,267][2364708] Avg episode reward: [(0, '29.595')]
[2023-02-24 17:33:36,445][2364831] Updated weights for policy 0, policy_version 1528 (0.0015)
[2023-02-24 17:33:37,457][2364831] Updated weights for policy 0, policy_version 1538 (0.0010)
[2023-02-24 17:33:38,386][2364831] Updated weights for policy 0, policy_version 1548 (0.0010)
[2023-02-24 17:33:39,394][2364831] Updated weights for policy 0, policy_version 1558 (0.0009)
[2023-02-24 17:33:40,414][2364831] Updated weights for policy 0, policy_version 1568 (0.0013)
[2023-02-24 17:33:41,266][2364708] Fps is (10 sec: 41779.5, 60 sec: 40891.7, 300 sec: 40891.7). Total num frames: 6459392. Throughput: 0: 10363.3. Samples: 610929. Policy #0 lag: (min: 0.0, avg: 2.0, max: 4.0)
[2023-02-24 17:33:41,266][2364708] Avg episode reward: [(0, '27.467')]
[2023-02-24 17:33:41,392][2364831] Updated weights for policy 0, policy_version 1578 (0.0011)
[2023-02-24 17:33:42,409][2364831] Updated weights for policy 0, policy_version 1588 (0.0015)
[2023-02-24 17:33:43,364][2364831] Updated weights for policy 0, policy_version 1598 (0.0016)
[2023-02-24 17:33:44,391][2364831] Updated weights for policy 0, policy_version 1608 (0.0009)
[2023-02-24 17:33:45,344][2364831] Updated weights for policy 0, policy_version 1618 (0.0009)
[2023-02-24 17:33:46,266][2364708] Fps is (10 sec: 41371.0, 60 sec: 41437.9, 300 sec: 40834.0). Total num frames: 6660096. Throughput: 0: 10362.0. Samples: 641805. Policy #0 lag: (min: 0.0, avg: 1.9, max: 4.0)
[2023-02-24 17:33:46,266][2364708] Avg episode reward: [(0, '26.443')]
[2023-02-24 17:33:46,345][2364831] Updated weights for policy 0, policy_version 1628 (0.0009)
[2023-02-24 17:33:47,361][2364831] Updated weights for policy 0, policy_version 1638 (0.0012)
[2023-02-24 17:33:48,358][2364831] Updated weights for policy 0, policy_version 1648 (0.0015)
[2023-02-24 17:33:49,372][2364831] Updated weights for policy 0, policy_version 1658 (0.0010)
[2023-02-24 17:33:50,384][2364831] Updated weights for policy 0, policy_version 1668 (0.0009)
[2023-02-24 17:33:51,266][2364708] Fps is (10 sec: 40550.3, 60 sec: 41369.5, 300 sec: 40842.9). Total num frames: 6864896. Throughput: 0: 10346.7. Samples: 703164. Policy #0 lag: (min: 0.0, avg: 1.9, max: 5.0)
[2023-02-24 17:33:51,266][2364708] Avg episode reward: [(0, '29.470')]
[2023-02-24 17:33:51,339][2364831] Updated weights for policy 0, policy_version 1678 (0.0009)
[2023-02-24 17:33:52,333][2364831] Updated weights for policy 0, policy_version 1688 (0.0013)
[2023-02-24 17:33:53,373][2364831] Updated weights for policy 0, policy_version 1698 (0.0010)
[2023-02-24 17:33:54,326][2364831] Updated weights for policy 0, policy_version 1708 (0.0011)
[2023-02-24 17:33:55,317][2364831] Updated weights for policy 0, policy_version 1718 (0.0015)
[2023-02-24 17:33:56,266][2364708] Fps is (10 sec: 41369.2, 60 sec: 41369.5, 300 sec: 40905.3). Total num frames: 7073792. Throughput: 0: 10341.1. Samples: 765279. Policy #0 lag: (min: 0.0, avg: 2.1, max: 4.0)
[2023-02-24 17:33:56,266][2364708] Avg episode reward: [(0, '29.662')]
[2023-02-24 17:33:56,319][2364831] Updated weights for policy 0, policy_version 1728 (0.0009)
[2023-02-24 17:33:57,289][2364831] Updated weights for policy 0, policy_version 1738 (0.0014)
[2023-02-24 17:33:58,346][2364831] Updated weights for policy 0, policy_version 1748 (0.0009)
[2023-02-24 17:33:59,308][2364831] Updated weights for policy 0, policy_version 1758 (0.0009)
[2023-02-24 17:34:00,266][2364831] Updated weights for policy 0, policy_version 1768 (0.0009)
[2023-02-24 17:34:01,266][2364708] Fps is (10 sec: 41369.7, 60 sec: 41369.6, 300 sec: 40908.7). Total num frames: 7278592. Throughput: 0: 10343.3. Samples: 796236. Policy #0 lag: (min: 0.0, avg: 2.1, max: 4.0)
[2023-02-24 17:34:01,266][2364708] Avg episode reward: [(0, '28.381')]
[2023-02-24 17:34:01,272][2364831] Updated weights for policy 0, policy_version 1778 (0.0014)
[2023-02-24 17:34:02,285][2364831] Updated weights for policy 0, policy_version 1788 (0.0012)
[2023-02-24 17:34:03,229][2364831] Updated weights for policy 0, policy_version 1798 (0.0014)
[2023-02-24 17:34:04,225][2364831] Updated weights for policy 0, policy_version 1808 (0.0011)
[2023-02-24 17:34:05,252][2364831] Updated weights for policy 0, policy_version 1818 (0.0011)
[2023-02-24 17:34:06,217][2364831] Updated weights for policy 0, policy_version 1828 (0.0014)
[2023-02-24 17:34:06,266][2364708] Fps is (10 sec: 41370.1, 60 sec: 41369.8, 300 sec: 40960.0). Total num frames: 7487488. Throughput: 0: 10333.9. Samples: 858156. Policy #0 lag: (min: 0.0, avg: 1.9, max: 4.0)
[2023-02-24 17:34:06,266][2364708] Avg episode reward: [(0, '28.723')]
[2023-02-24 17:34:07,238][2364831] Updated weights for policy 0, policy_version 1838 (0.0012)
[2023-02-24 17:34:08,199][2364831] Updated weights for policy 0, policy_version 1848 (0.0009)
[2023-02-24 17:34:09,209][2364831] Updated weights for policy 0, policy_version 1858 (0.0015)
[2023-02-24 17:34:10,200][2364831] Updated weights for policy 0, policy_version 1868 (0.0013)
[2023-02-24 17:34:11,160][2364831] Updated weights for policy 0, policy_version 1878 (0.0012)
[2023-02-24 17:34:11,266][2364708] Fps is (10 sec: 41370.0, 60 sec: 41301.4, 300 sec: 40960.0). Total num frames: 7692288. Throughput: 0: 10328.4. Samples: 920217. Policy #0 lag: (min: 0.0, avg: 1.6, max: 3.0)
[2023-02-24 17:34:11,266][2364708] Avg episode reward: [(0, '27.028')]
[2023-02-24 17:34:12,189][2364831] Updated weights for policy 0, policy_version 1888 (0.0013)
[2023-02-24 17:34:13,157][2364831] Updated weights for policy 0, policy_version 1898 (0.0009)
[2023-02-24 17:34:14,140][2364831] Updated weights for policy 0, policy_version 1908 (0.0009)
[2023-02-24 17:34:15,142][2364831] Updated weights for policy 0, policy_version 1918 (0.0009)
[2023-02-24 17:34:16,149][2364831] Updated weights for policy 0, policy_version 1928 (0.0008)
[2023-02-24 17:34:16,266][2364708] Fps is (10 sec: 41368.9, 60 sec: 41369.6, 300 sec: 41003.1). Total num frames: 7901184. Throughput: 0: 10329.2. Samples: 951309. Policy #0 lag: (min: 0.0, avg: 2.0, max: 4.0)
[2023-02-24 17:34:16,266][2364708] Avg episode reward: [(0, '31.603')]
[2023-02-24 17:34:16,267][2364791] Saving new best policy, reward=31.603!
[2023-02-24 17:34:17,141][2364831] Updated weights for policy 0, policy_version 1938 (0.0011)
[2023-02-24 17:34:18,125][2364831] Updated weights for policy 0, policy_version 1948 (0.0010)
[2023-02-24 17:34:19,131][2364831] Updated weights for policy 0, policy_version 1958 (0.0012)
[2023-02-24 17:34:20,142][2364831] Updated weights for policy 0, policy_version 1968 (0.0009)
[2023-02-24 17:34:21,131][2364831] Updated weights for policy 0, policy_version 1978 (0.0014)
[2023-02-24 17:34:21,266][2364708] Fps is (10 sec: 41369.1, 60 sec: 41301.3, 300 sec: 41000.9). Total num frames: 8105984. Throughput: 0: 10320.3. Samples: 1012971. Policy #0 lag: (min: 0.0, avg: 2.0, max: 4.0)
[2023-02-24 17:34:21,266][2364708] Avg episode reward: [(0, '28.359')]
[2023-02-24 17:34:22,137][2364831] Updated weights for policy 0, policy_version 1988 (0.0009)
[2023-02-24 17:34:23,105][2364831] Updated weights for policy 0, policy_version 1998 (0.0009)
[2023-02-24 17:34:24,115][2364831] Updated weights for policy 0, policy_version 2008 (0.0012)
[2023-02-24 17:34:25,058][2364831] Updated weights for policy 0, policy_version 2018 (0.0010)
[2023-02-24 17:34:26,094][2364831] Updated weights for policy 0, policy_version 2028 (0.0010)
[2023-02-24 17:34:26,266][2364708] Fps is (10 sec: 41369.7, 60 sec: 41369.6, 300 sec: 41038.0). Total num frames: 8314880. Throughput: 0: 10310.5. Samples: 1074903. Policy #0 lag: (min: 0.0, avg: 1.9, max: 4.0)
[2023-02-24 17:34:26,266][2364708] Avg episode reward: [(0, '28.710')]
[2023-02-24 17:34:27,060][2364831] Updated weights for policy 0, policy_version 2038 (0.0011)
[2023-02-24 17:34:28,047][2364831] Updated weights for policy 0, policy_version 2048 (0.0013)
[2023-02-24 17:34:29,023][2364831] Updated weights for policy 0, policy_version 2058 (0.0009)
[2023-02-24 17:34:30,021][2364831] Updated weights for policy 0, policy_version 2068 (0.0012)
[2023-02-24 17:34:31,055][2364831] Updated weights for policy 0, policy_version 2078 (0.0009)
[2023-02-24 17:34:31,266][2364708] Fps is (10 sec: 41369.6, 60 sec: 41301.4, 300 sec: 41034.4). Total num frames: 8519680. Throughput: 0: 10314.6. Samples: 1105962. Policy #0 lag: (min: 0.0, avg: 2.0, max: 4.0)
[2023-02-24 17:34:31,266][2364708] Avg episode reward: [(0, '32.526')]
[2023-02-24 17:34:31,271][2364791] Saving /home/sebas/research/hugging-face-course/vizdoom/train_dir/default_experiment/checkpoint_p0/checkpoint_000002080_8519680.pth...
[2023-02-24 17:34:31,337][2364791] Removing /home/sebas/research/hugging-face-course/vizdoom/train_dir/default_experiment/checkpoint_p0/checkpoint_000000759_3108864.pth
[2023-02-24 17:34:31,343][2364791] Saving new best policy, reward=32.526!
[2023-02-24 17:34:32,027][2364831] Updated weights for policy 0, policy_version 2088 (0.0014)
[2023-02-24 17:34:33,002][2364831] Updated weights for policy 0, policy_version 2098 (0.0011)
[2023-02-24 17:34:34,028][2364831] Updated weights for policy 0, policy_version 2108 (0.0010)
[2023-02-24 17:34:34,994][2364831] Updated weights for policy 0, policy_version 2118 (0.0009)
[2023-02-24 17:34:36,021][2364831] Updated weights for policy 0, policy_version 2128 (0.0011)
[2023-02-24 17:34:36,266][2364708] Fps is (10 sec: 40960.1, 60 sec: 41301.5, 300 sec: 41031.2). Total num frames: 8724480. Throughput: 0: 10328.4. Samples: 1167942. Policy #0 lag: (min: 0.0, avg: 1.9, max: 4.0)
[2023-02-24 17:34:36,266][2364708] Avg episode reward: [(0, '29.338')]
[2023-02-24 17:34:37,019][2364831] Updated weights for policy 0, policy_version 2138 (0.0012)
[2023-02-24 17:34:38,029][2364831] Updated weights for policy 0, policy_version 2148 (0.0010)
[2023-02-24 17:34:38,999][2364831] Updated weights for policy 0, policy_version 2158 (0.0009)
[2023-02-24 17:34:40,008][2364831] Updated weights for policy 0, policy_version 2168 (0.0009)
[2023-02-24 17:34:40,960][2364831] Updated weights for policy 0, policy_version 2178 (0.0013)
[2023-02-24 17:34:41,266][2364708] Fps is (10 sec: 40960.0, 60 sec: 41164.8, 300 sec: 41028.2). Total num frames: 8929280. Throughput: 0: 10315.9. Samples: 1229496. Policy #0 lag: (min: 0.0, avg: 1.9, max: 4.0)
[2023-02-24 17:34:41,266][2364708] Avg episode reward: [(0, '28.569')]
[2023-02-24 17:34:41,985][2364831] Updated weights for policy 0, policy_version 2188 (0.0012)
[2023-02-24 17:34:42,962][2364831] Updated weights for policy 0, policy_version 2198 (0.0009)
[2023-02-24 17:34:43,968][2364831] Updated weights for policy 0, policy_version 2208 (0.0009)
[2023-02-24 17:34:44,995][2364831] Updated weights for policy 0, policy_version 2218 (0.0010)
[2023-02-24 17:34:45,933][2364831] Updated weights for policy 0, policy_version 2228 (0.0010)
[2023-02-24 17:34:46,266][2364708] Fps is (10 sec: 41369.4, 60 sec: 41301.2, 300 sec: 41058.3). Total num frames: 9138176. Throughput: 0: 10313.9. Samples: 1260363. Policy #0 lag: (min: 0.0, avg: 2.0, max: 4.0)
[2023-02-24 17:34:46,266][2364708] Avg episode reward: [(0, '29.705')]
[2023-02-24 17:34:46,949][2364831] Updated weights for policy 0, policy_version 2238 (0.0011)
[2023-02-24 17:34:47,940][2364831] Updated weights for policy 0, policy_version 2248 (0.0012)
[2023-02-24 17:34:48,931][2364831] Updated weights for policy 0, policy_version 2258 (0.0016)
[2023-02-24 17:34:49,903][2364831] Updated weights for policy 0, policy_version 2268 (0.0015)
[2023-02-24 17:34:50,883][2364831] Updated weights for policy 0, policy_version 2278 (0.0009)
[2023-02-24 17:34:51,266][2364708] Fps is (10 sec: 41369.7, 60 sec: 41301.4, 300 sec: 41054.5). Total num frames: 9342976. Throughput: 0: 10311.3. Samples: 1322166. Policy #0 lag: (min: 0.0, avg: 1.6, max: 4.0)
[2023-02-24 17:34:51,266][2364708] Avg episode reward: [(0, '30.602')]
[2023-02-24 17:34:51,902][2364831] Updated weights for policy 0, policy_version 2288 (0.0010)
[2023-02-24 17:34:52,909][2364831] Updated weights for policy 0, policy_version 2298 (0.0010)
[2023-02-24 17:34:53,904][2364831] Updated weights for policy 0, policy_version 2308 (0.0010)
[2023-02-24 17:34:54,860][2364831] Updated weights for policy 0, policy_version 2318 (0.0010)
[2023-02-24 17:34:55,858][2364831] Updated weights for policy 0, policy_version 2328 (0.0009)
[2023-02-24 17:34:56,266][2364708] Fps is (10 sec: 40960.2, 60 sec: 41233.1, 300 sec: 41051.0). Total num frames: 9547776. Throughput: 0: 10311.1. Samples: 1384218. Policy #0 lag: (min: 0.0, avg: 2.0, max: 4.0)
[2023-02-24 17:34:56,266][2364708] Avg episode reward: [(0, '29.322')]
[2023-02-24 17:34:56,865][2364831] Updated weights for policy 0, policy_version 2338 (0.0009)
[2023-02-24 17:34:57,857][2364831] Updated weights for policy 0, policy_version 2348 (0.0009)
[2023-02-24 17:34:58,810][2364831] Updated weights for policy 0, policy_version 2358 (0.0016)
[2023-02-24 17:34:59,854][2364831] Updated weights for policy 0, policy_version 2368 (0.0012)
[2023-02-24 17:35:00,835][2364831] Updated weights for policy 0, policy_version 2378 (0.0009)
[2023-02-24 17:35:01,266][2364708] Fps is (10 sec: 41369.5, 60 sec: 41301.3, 300 sec: 41077.0). Total num frames: 9756672. Throughput: 0: 10308.8. Samples: 1415205. Policy #0 lag: (min: 0.0, avg: 2.1, max: 4.0)
[2023-02-24 17:35:01,266][2364708] Avg episode reward: [(0, '30.241')]
[2023-02-24 17:35:01,804][2364831] Updated weights for policy 0, policy_version 2388 (0.0012)
[2023-02-24 17:35:02,822][2364831] Updated weights for policy 0, policy_version 2398 (0.0010)
[2023-02-24 17:35:03,814][2364831] Updated weights for policy 0, policy_version 2408 (0.0010)
[2023-02-24 17:35:04,799][2364831] Updated weights for policy 0, policy_version 2418 (0.0011)
[2023-02-24 17:35:05,796][2364831] Updated weights for policy 0, policy_version 2428 (0.0009)
[2023-02-24 17:35:06,266][2364708] Fps is (10 sec: 41369.5, 60 sec: 41233.0, 300 sec: 41073.0). Total num frames: 9961472. Throughput: 0: 10314.6. Samples: 1477128. Policy #0 lag: (min: 0.0, avg: 2.2, max: 4.0)
[2023-02-24 17:35:06,266][2364708] Avg episode reward: [(0, '26.511')]
[2023-02-24 17:35:06,791][2364831] Updated weights for policy 0, policy_version 2438 (0.0012)
[2023-02-24 17:35:07,790][2364831] Updated weights for policy 0, policy_version 2448 (0.0009)
[2023-02-24 17:35:08,780][2364831] Updated weights for policy 0, policy_version 2458 (0.0014)
[2023-02-24 17:35:09,752][2364831] Updated weights for policy 0, policy_version 2468 (0.0010)
[2023-02-24 17:35:10,772][2364831] Updated weights for policy 0, policy_version 2478 (0.0010)
[2023-02-24 17:35:11,266][2364708] Fps is (10 sec: 40958.9, 60 sec: 41232.8, 300 sec: 41069.1). Total num frames: 10166272. Throughput: 0: 10309.2. Samples: 1538820. Policy #0 lag: (min: 0.0, avg: 1.7, max: 4.0)
[2023-02-24 17:35:11,267][2364708] Avg episode reward: [(0, '31.927')]
[2023-02-24 17:35:11,766][2364831] Updated weights for policy 0, policy_version 2488 (0.0012)
[2023-02-24 17:35:12,753][2364831] Updated weights for policy 0, policy_version 2498 (0.0011)
[2023-02-24 17:35:13,754][2364831] Updated weights for policy 0, policy_version 2508 (0.0014)
[2023-02-24 17:35:14,735][2364831] Updated weights for policy 0, policy_version 2518 (0.0013)
[2023-02-24 17:35:15,703][2364831] Updated weights for policy 0, policy_version 2528 (0.0011)
[2023-02-24 17:35:16,266][2364708] Fps is (10 sec: 41370.2, 60 sec: 41233.2, 300 sec: 41092.1). Total num frames: 10375168. Throughput: 0: 10308.7. Samples: 1569852. Policy #0 lag: (min: 0.0, avg: 1.9, max: 4.0)
[2023-02-24 17:35:16,266][2364708] Avg episode reward: [(0, '29.333')]
[2023-02-24 17:35:16,713][2364831] Updated weights for policy 0, policy_version 2538 (0.0012)
[2023-02-24 17:35:17,728][2364831] Updated weights for policy 0, policy_version 2548 (0.0009)
[2023-02-24 17:35:18,681][2364831] Updated weights for policy 0, policy_version 2558 (0.0012)
[2023-02-24 17:35:19,719][2364831] Updated weights for policy 0, policy_version 2568 (0.0008)
[2023-02-24 17:35:20,725][2364831] Updated weights for policy 0, policy_version 2578 (0.0010)
[2023-02-24 17:35:21,266][2364708] Fps is (10 sec: 41370.6, 60 sec: 41233.1, 300 sec: 41088.0). Total num frames: 10579968. Throughput: 0: 10308.5. Samples: 1631826. Policy #0 lag: (min: 0.0, avg: 1.4, max: 3.0)
[2023-02-24 17:35:21,266][2364708] Avg episode reward: [(0, '29.221')]
[2023-02-24 17:35:21,646][2364831] Updated weights for policy 0, policy_version 2588 (0.0013)
[2023-02-24 17:35:22,678][2364831] Updated weights for policy 0, policy_version 2598 (0.0011)
[2023-02-24 17:35:23,706][2364831] Updated weights for policy 0, policy_version 2608 (0.0010)
[2023-02-24 17:35:24,640][2364831] Updated weights for policy 0, policy_version 2618 (0.0014)
[2023-02-24 17:35:25,686][2364831] Updated weights for policy 0, policy_version 2628 (0.0014)
[2023-02-24 17:35:26,266][2364708] Fps is (10 sec: 41369.5, 60 sec: 41233.2, 300 sec: 41108.9). Total num frames: 10788864. Throughput: 0: 10318.8. Samples: 1693842. Policy #0 lag: (min: 0.0, avg: 2.1, max: 4.0)
[2023-02-24 17:35:26,266][2364708] Avg episode reward: [(0, '32.770')]
[2023-02-24 17:35:26,267][2364791] Saving new best policy, reward=32.770!
[2023-02-24 17:35:26,645][2364831] Updated weights for policy 0, policy_version 2638 (0.0019)
[2023-02-24 17:35:27,596][2364831] Updated weights for policy 0, policy_version 2648 (0.0012)
[2023-02-24 17:35:28,675][2364831] Updated weights for policy 0, policy_version 2658 (0.0012)
[2023-02-24 17:35:29,615][2364831] Updated weights for policy 0, policy_version 2668 (0.0009)
[2023-02-24 17:35:30,625][2364831] Updated weights for policy 0, policy_version 2678 (0.0012)
[2023-02-24 17:35:31,266][2364708] Fps is (10 sec: 41369.4, 60 sec: 41233.0, 300 sec: 41104.5). Total num frames: 10993664. Throughput: 0: 10317.5. Samples: 1724652. Policy #0 lag: (min: 0.0, avg: 1.9, max: 4.0)
[2023-02-24 17:35:31,266][2364708] Avg episode reward: [(0, '33.019')]
[2023-02-24 17:35:31,271][2364791] Saving new best policy, reward=33.019!
[2023-02-24 17:35:31,627][2364831] Updated weights for policy 0, policy_version 2688 (0.0013)
[2023-02-24 17:35:32,607][2364831] Updated weights for policy 0, policy_version 2698 (0.0012)
[2023-02-24 17:35:33,625][2364831] Updated weights for policy 0, policy_version 2708 (0.0009)
[2023-02-24 17:35:34,595][2364831] Updated weights for policy 0, policy_version 2718 (0.0010)
[2023-02-24 17:35:35,612][2364831] Updated weights for policy 0, policy_version 2728 (0.0010)
[2023-02-24 17:35:36,266][2364708] Fps is (10 sec: 40959.8, 60 sec: 41233.1, 300 sec: 41100.4). Total num frames: 11198464. Throughput: 0: 10313.8. Samples: 1786287. Policy #0 lag: (min: 0.0, avg: 2.1, max: 4.0)
[2023-02-24 17:35:36,266][2364708] Avg episode reward: [(0, '30.687')]
[2023-02-24 17:35:36,597][2364831] Updated weights for policy 0, policy_version 2738 (0.0012)
[2023-02-24 17:35:37,588][2364831] Updated weights for policy 0, policy_version 2748 (0.0012)
[2023-02-24 17:35:38,625][2364831] Updated weights for policy 0, policy_version 2758 (0.0016)
[2023-02-24 17:35:39,609][2364831] Updated weights for policy 0, policy_version 2768 (0.0009)
[2023-02-24 17:35:40,592][2364831] Updated weights for policy 0, policy_version 2778 (0.0009)
[2023-02-24 17:35:41,266][2364708] Fps is (10 sec: 40960.2, 60 sec: 41233.0, 300 sec: 41096.5). Total num frames: 11403264. Throughput: 0: 10304.3. Samples: 1847913. Policy #0 lag: (min: 0.0, avg: 1.8, max: 4.0)
[2023-02-24 17:35:41,266][2364708] Avg episode reward: [(0, '30.382')]
[2023-02-24 17:35:41,573][2364831] Updated weights for policy 0, policy_version 2788 (0.0012)
[2023-02-24 17:35:42,590][2364831] Updated weights for policy 0, policy_version 2798 (0.0010)
[2023-02-24 17:35:43,575][2364831] Updated weights for policy 0, policy_version 2808 (0.0010)
[2023-02-24 17:35:44,572][2364831] Updated weights for policy 0, policy_version 2818 (0.0011)
[2023-02-24 17:35:45,597][2364831] Updated weights for policy 0, policy_version 2828 (0.0008)
[2023-02-24 17:35:46,266][2364708] Fps is (10 sec: 41369.3, 60 sec: 41233.1, 300 sec: 41115.0). Total num frames: 11612160. Throughput: 0: 10300.4. Samples: 1878723. Policy #0 lag: (min: 0.0, avg: 2.2, max: 4.0)
[2023-02-24 17:35:46,266][2364708] Avg episode reward: [(0, '30.394')]
[2023-02-24 17:35:46,560][2364831] Updated weights for policy 0, policy_version 2838 (0.0010)
[2023-02-24 17:35:47,531][2364831] Updated weights for policy 0, policy_version 2848 (0.0010)
[2023-02-24 17:35:48,545][2364831] Updated weights for policy 0, policy_version 2858 (0.0012)
[2023-02-24 17:35:49,500][2364831] Updated weights for policy 0, policy_version 2868 (0.0010)
[2023-02-24 17:35:50,524][2364831] Updated weights for policy 0, policy_version 2878 (0.0012)
[2023-02-24 17:35:51,266][2364708] Fps is (10 sec: 41369.7, 60 sec: 41233.0, 300 sec: 41110.9). Total num frames: 11816960. Throughput: 0: 10303.3. Samples: 1940775. Policy #0 lag: (min: 0.0, avg: 1.8, max: 4.0)
[2023-02-24 17:35:51,266][2364708] Avg episode reward: [(0, '27.697')]
[2023-02-24 17:35:51,481][2364831] Updated weights for policy 0, policy_version 2888 (0.0011)
[2023-02-24 17:35:52,493][2364831] Updated weights for policy 0, policy_version 2898 (0.0012)
[2023-02-24 17:35:53,490][2364831] Updated weights for policy 0, policy_version 2908 (0.0009)
[2023-02-24 17:35:54,455][2364831] Updated weights for policy 0, policy_version 2918 (0.0012)
[2023-02-24 17:35:55,469][2364831] Updated weights for policy 0, policy_version 2928 (0.0012)
[2023-02-24 17:35:56,266][2364708] Fps is (10 sec: 40960.4, 60 sec: 41233.1, 300 sec: 41107.0). Total num frames: 12021760. Throughput: 0: 10309.2. Samples: 2002731. Policy #0 lag: (min: 0.0, avg: 1.8, max: 4.0)
[2023-02-24 17:35:56,266][2364708] Avg episode reward: [(0, '32.058')]
[2023-02-24 17:35:56,472][2364831] Updated weights for policy 0, policy_version 2938 (0.0009)
[2023-02-24 17:35:57,467][2364831] Updated weights for policy 0, policy_version 2948 (0.0011)
[2023-02-24 17:35:58,438][2364831] Updated weights for policy 0, policy_version 2958 (0.0013)
[2023-02-24 17:35:59,455][2364831] Updated weights for policy 0, policy_version 2968 (0.0009)
[2023-02-24 17:36:00,457][2364831] Updated weights for policy 0, policy_version 2978 (0.0009)
[2023-02-24 17:36:01,266][2364708] Fps is (10 sec: 41369.4, 60 sec: 41233.0, 300 sec: 41123.8). Total num frames: 12230656. Throughput: 0: 10305.4. Samples: 2033598. Policy #0 lag: (min: 0.0, avg: 1.9, max: 4.0)
[2023-02-24 17:36:01,266][2364708] Avg episode reward: [(0, '29.736')]
[2023-02-24 17:36:01,465][2364831] Updated weights for policy 0, policy_version 2988 (0.0009)
[2023-02-24 17:36:02,473][2364831] Updated weights for policy 0, policy_version 2998 (0.0009)
[2023-02-24 17:36:03,476][2364831] Updated weights for policy 0, policy_version 3008 (0.0010)
[2023-02-24 17:36:04,459][2364831] Updated weights for policy 0, policy_version 3018 (0.0009)
[2023-02-24 17:36:05,480][2364831] Updated weights for policy 0, policy_version 3028 (0.0011)
[2023-02-24 17:36:06,266][2364708] Fps is (10 sec: 41369.1, 60 sec: 41233.0, 300 sec: 41119.8). Total num frames: 12435456. Throughput: 0: 10293.7. Samples: 2095044. Policy #0 lag: (min: 0.0, avg: 1.7, max: 4.0)
[2023-02-24 17:36:06,266][2364708] Avg episode reward: [(0, '30.393')]
[2023-02-24 17:36:06,445][2364831] Updated weights for policy 0, policy_version 3038 (0.0012)
[2023-02-24 17:36:07,421][2364831] Updated weights for policy 0, policy_version 3048 (0.0009)
[2023-02-24 17:36:08,415][2364831] Updated weights for policy 0, policy_version 3058 (0.0011)
[2023-02-24 17:36:09,436][2364831] Updated weights for policy 0, policy_version 3068 (0.0010)
[2023-02-24 17:36:10,457][2364831] Updated weights for policy 0, policy_version 3078 (0.0009)
[2023-02-24 17:36:11,266][2364708] Fps is (10 sec: 40960.7, 60 sec: 41233.3, 300 sec: 41116.0). Total num frames: 12640256. Throughput: 0: 10281.7. Samples: 2156517. Policy #0 lag: (min: 0.0, avg: 2.0, max: 4.0)
[2023-02-24 17:36:11,266][2364708] Avg episode reward: [(0, '30.391')]
[2023-02-24 17:36:11,465][2364831] Updated weights for policy 0, policy_version 3088 (0.0009)
[2023-02-24 17:36:12,457][2364831] Updated weights for policy 0, policy_version 3098 (0.0011)
[2023-02-24 17:36:13,406][2364831] Updated weights for policy 0, policy_version 3108 (0.0013)
[2023-02-24 17:36:14,404][2364831] Updated weights for policy 0, policy_version 3118 (0.0012)
[2023-02-24 17:36:15,425][2364831] Updated weights for policy 0, policy_version 3128 (0.0011)
[2023-02-24 17:36:16,266][2364708] Fps is (10 sec: 40960.3, 60 sec: 41164.7, 300 sec: 41112.4). Total num frames: 12845056. Throughput: 0: 10287.8. Samples: 2187600. Policy #0 lag: (min: 0.0, avg: 2.0, max: 4.0)
[2023-02-24 17:36:16,266][2364708] Avg episode reward: [(0, '27.913')]
[2023-02-24 17:36:16,372][2364831] Updated weights for policy 0, policy_version 3138 (0.0014)
[2023-02-24 17:36:17,447][2364831] Updated weights for policy 0, policy_version 3148 (0.0009)
[2023-02-24 17:36:18,429][2364831] Updated weights for policy 0, policy_version 3158 (0.0009)
[2023-02-24 17:36:19,407][2364831] Updated weights for policy 0, policy_version 3168 (0.0013)
[2023-02-24 17:36:20,357][2364831] Updated weights for policy 0, policy_version 3178 (0.0009)
[2023-02-24 17:36:21,266][2364708] Fps is (10 sec: 40959.6, 60 sec: 41164.8, 300 sec: 41108.9). Total num frames: 13049856. Throughput: 0: 10288.8. Samples: 2249286. Policy #0 lag: (min: 0.0, avg: 1.9, max: 3.0)
[2023-02-24 17:36:21,266][2364708] Avg episode reward: [(0, '32.921')]
[2023-02-24 17:36:21,375][2364831] Updated weights for policy 0, policy_version 3188 (0.0012)
[2023-02-24 17:36:22,374][2364831] Updated weights for policy 0, policy_version 3198 (0.0015)
[2023-02-24 17:36:23,132][2364791] Signal inference workers to stop experience collection... (50 times)
[2023-02-24 17:36:23,134][2364791] Signal inference workers to resume experience collection... (50 times)
[2023-02-24 17:36:23,142][2364831] InferenceWorker_p0-w0: stopping experience collection (50 times)
[2023-02-24 17:36:23,145][2364831] InferenceWorker_p0-w0: resuming experience collection (50 times)
[2023-02-24 17:36:23,415][2364831] Updated weights for policy 0, policy_version 3208 (0.0010)
[2023-02-24 17:36:24,357][2364831] Updated weights for policy 0, policy_version 3218 (0.0009)
[2023-02-24 17:36:25,363][2364831] Updated weights for policy 0, policy_version 3228 (0.0014)
[2023-02-24 17:36:26,266][2364708] Fps is (10 sec: 41369.4, 60 sec: 41164.7, 300 sec: 41123.8). Total num frames: 13258752. Throughput: 0: 10290.0. Samples: 2310963. Policy #0 lag: (min: 0.0, avg: 2.0, max: 4.0)
[2023-02-24 17:36:26,266][2364708] Avg episode reward: [(0, '34.018')]
[2023-02-24 17:36:26,267][2364791] Saving new best policy, reward=34.018!
[2023-02-24 17:36:26,376][2364831] Updated weights for policy 0, policy_version 3238 (0.0009)
[2023-02-24 17:36:27,385][2364831] Updated weights for policy 0, policy_version 3248 (0.0012)
[2023-02-24 17:36:28,374][2364831] Updated weights for policy 0, policy_version 3258 (0.0008)
[2023-02-24 17:36:29,381][2364831] Updated weights for policy 0, policy_version 3268 (0.0009)
[2023-02-24 17:36:30,372][2364831] Updated weights for policy 0, policy_version 3278 (0.0009)
[2023-02-24 17:36:31,266][2364708] Fps is (10 sec: 41369.5, 60 sec: 41164.8, 300 sec: 41120.3). Total num frames: 13463552. Throughput: 0: 10287.8. Samples: 2341674. Policy #0 lag: (min: 0.0, avg: 2.0, max: 4.0)
[2023-02-24 17:36:31,266][2364708] Avg episode reward: [(0, '31.155')]
[2023-02-24 17:36:31,271][2364791] Saving /home/sebas/research/hugging-face-course/vizdoom/train_dir/default_experiment/checkpoint_p0/checkpoint_000003287_13463552.pth...
[2023-02-24 17:36:31,331][2364791] Removing /home/sebas/research/hugging-face-course/vizdoom/train_dir/default_experiment/checkpoint_p0/checkpoint_000000978_4005888.pth
[2023-02-24 17:36:31,376][2364831] Updated weights for policy 0, policy_version 3288 (0.0010)
[2023-02-24 17:36:32,389][2364831] Updated weights for policy 0, policy_version 3298 (0.0012)
[2023-02-24 17:36:33,398][2364831] Updated weights for policy 0, policy_version 3308 (0.0013)
[2023-02-24 17:36:34,328][2364831] Updated weights for policy 0, policy_version 3318 (0.0009)
[2023-02-24 17:36:35,370][2364831] Updated weights for policy 0, policy_version 3328 (0.0009)
[2023-02-24 17:36:36,266][2364708] Fps is (10 sec: 40960.6, 60 sec: 41164.8, 300 sec: 41116.9). Total num frames: 13668352. Throughput: 0: 10277.8. Samples: 2403273. Policy #0 lag: (min: 0.0, avg: 2.0, max: 4.0)
[2023-02-24 17:36:36,266][2364708] Avg episode reward: [(0, '32.249')]
[2023-02-24 17:36:36,352][2364831] Updated weights for policy 0, policy_version 3338 (0.0008)
[2023-02-24 17:36:37,358][2364831] Updated weights for policy 0, policy_version 3348 (0.0012)
[2023-02-24 17:36:38,366][2364831] Updated weights for policy 0, policy_version 3358 (0.0013)
[2023-02-24 17:36:39,348][2364831] Updated weights for policy 0, policy_version 3368 (0.0010)
[2023-02-24 17:36:40,325][2364831] Updated weights for policy 0, policy_version 3378 (0.0009)
[2023-02-24 17:36:41,266][2364708] Fps is (10 sec: 40960.6, 60 sec: 41164.9, 300 sec: 41113.6). Total num frames: 13873152. Throughput: 0: 10271.1. Samples: 2464929. Policy #0 lag: (min: 0.0, avg: 1.9, max: 4.0)
[2023-02-24 17:36:41,266][2364708] Avg episode reward: [(0, '30.922')]
[2023-02-24 17:36:41,354][2364831] Updated weights for policy 0, policy_version 3388 (0.0013)
[2023-02-24 17:36:42,336][2364831] Updated weights for policy 0, policy_version 3398 (0.0011)
[2023-02-24 17:36:43,307][2364831] Updated weights for policy 0, policy_version 3408 (0.0010)
[2023-02-24 17:36:44,352][2364831] Updated weights for policy 0, policy_version 3418 (0.0017)
[2023-02-24 17:36:45,308][2364831] Updated weights for policy 0, policy_version 3428 (0.0011)
[2023-02-24 17:36:46,266][2364708] Fps is (10 sec: 40959.9, 60 sec: 41096.6, 300 sec: 41110.5). Total num frames: 14077952. Throughput: 0: 10271.3. Samples: 2495805. Policy #0 lag: (min: 0.0, avg: 2.0, max: 4.0)
[2023-02-24 17:36:46,266][2364708] Avg episode reward: [(0, '35.645')]
[2023-02-24 17:36:46,267][2364791] Saving new best policy, reward=35.645!
[2023-02-24 17:36:46,360][2364831] Updated weights for policy 0, policy_version 3438 (0.0011)
[2023-02-24 17:36:47,341][2364831] Updated weights for policy 0, policy_version 3448 (0.0010)
[2023-02-24 17:36:48,315][2364831] Updated weights for policy 0, policy_version 3458 (0.0012)
[2023-02-24 17:36:49,289][2364831] Updated weights for policy 0, policy_version 3468 (0.0008)
[2023-02-24 17:36:50,307][2364831] Updated weights for policy 0, policy_version 3478 (0.0009)
[2023-02-24 17:36:51,266][2364708] Fps is (10 sec: 40959.9, 60 sec: 41096.6, 300 sec: 41107.5). Total num frames: 14282752. Throughput: 0: 10276.8. Samples: 2557500. Policy #0 lag: (min: 0.0, avg: 1.7, max: 4.0)
[2023-02-24 17:36:51,266][2364708] Avg episode reward: [(0, '34.180')]
[2023-02-24 17:36:51,296][2364831] Updated weights for policy 0, policy_version 3488 (0.0009)
[2023-02-24 17:36:52,295][2364831] Updated weights for policy 0, policy_version 3498 (0.0014)
[2023-02-24 17:36:53,320][2364831] Updated weights for policy 0, policy_version 3508 (0.0010)
[2023-02-24 17:36:54,312][2364831] Updated weights for policy 0, policy_version 3518 (0.0013)
[2023-02-24 17:36:55,310][2364831] Updated weights for policy 0, policy_version 3528 (0.0012)
[2023-02-24 17:36:56,260][2364831] Updated weights for policy 0, policy_version 3538 (0.0011)
[2023-02-24 17:36:56,266][2364708] Fps is (10 sec: 41369.7, 60 sec: 41164.9, 300 sec: 41120.6). Total num frames: 14491648. Throughput: 0: 10279.0. Samples: 2619072. Policy #0 lag: (min: 0.0, avg: 2.0, max: 5.0)
[2023-02-24 17:36:56,266][2364708] Avg episode reward: [(0, '35.711')]
[2023-02-24 17:36:56,267][2364791] Saving new best policy, reward=35.711!
[2023-02-24 17:36:57,297][2364831] Updated weights for policy 0, policy_version 3548 (0.0014)
[2023-02-24 17:36:58,305][2364831] Updated weights for policy 0, policy_version 3558 (0.0009)
[2023-02-24 17:36:59,282][2364831] Updated weights for policy 0, policy_version 3568 (0.0009)
[2023-02-24 17:37:00,281][2364831] Updated weights for policy 0, policy_version 3578 (0.0018)
[2023-02-24 17:37:01,266][2364708] Fps is (10 sec: 40960.0, 60 sec: 41028.4, 300 sec: 41101.8). Total num frames: 14692352. Throughput: 0: 10270.1. Samples: 2649753. Policy #0 lag: (min: 0.0, avg: 1.9, max: 4.0)
[2023-02-24 17:37:01,266][2364708] Avg episode reward: [(0, '32.713')]
[2023-02-24 17:37:01,290][2364831] Updated weights for policy 0, policy_version 3588 (0.0011)
[2023-02-24 17:37:02,293][2364831] Updated weights for policy 0, policy_version 3598 (0.0012)
[2023-02-24 17:37:03,300][2364831] Updated weights for policy 0, policy_version 3608 (0.0010)
[2023-02-24 17:37:04,288][2364831] Updated weights for policy 0, policy_version 3618 (0.0012)
[2023-02-24 17:37:05,307][2364831] Updated weights for policy 0, policy_version 3628 (0.0012)
[2023-02-24 17:37:06,256][2364831] Updated weights for policy 0, policy_version 3638 (0.0011)
[2023-02-24 17:37:06,266][2364708] Fps is (10 sec: 40959.4, 60 sec: 41096.6, 300 sec: 41114.5). Total num frames: 14901248. Throughput: 0: 10265.3. Samples: 2711226. Policy #0 lag: (min: 0.0, avg: 1.9, max: 4.0)
[2023-02-24 17:37:06,266][2364708] Avg episode reward: [(0, '33.770')]
[2023-02-24 17:37:07,256][2364831] Updated weights for policy 0, policy_version 3648 (0.0010)
[2023-02-24 17:37:08,283][2364831] Updated weights for policy 0, policy_version 3658 (0.0010)
[2023-02-24 17:37:09,271][2364831] Updated weights for policy 0, policy_version 3668 (0.0010)
[2023-02-24 17:37:10,247][2364831] Updated weights for policy 0, policy_version 3678 (0.0011)
[2023-02-24 17:37:11,255][2364831] Updated weights for policy 0, policy_version 3688 (0.0009)
[2023-02-24 17:37:11,266][2364708] Fps is (10 sec: 41369.7, 60 sec: 41096.5, 300 sec: 41111.7). Total num frames: 15106048. Throughput: 0: 10267.2. Samples: 2772984. Policy #0 lag: (min: 0.0, avg: 2.1, max: 5.0)
[2023-02-24 17:37:11,266][2364708] Avg episode reward: [(0, '35.557')]
[2023-02-24 17:37:12,240][2364831] Updated weights for policy 0, policy_version 3698 (0.0010)
[2023-02-24 17:37:13,251][2364831] Updated weights for policy 0, policy_version 3708 (0.0010)
[2023-02-24 17:37:14,229][2364831] Updated weights for policy 0, policy_version 3718 (0.0009)
[2023-02-24 17:37:15,279][2364831] Updated weights for policy 0, policy_version 3728 (0.0015)
[2023-02-24 17:37:16,243][2364831] Updated weights for policy 0, policy_version 3738 (0.0011)
[2023-02-24 17:37:16,266][2364708] Fps is (10 sec: 40960.5, 60 sec: 41096.6, 300 sec: 41108.9). Total num frames: 15310848. Throughput: 0: 10269.1. Samples: 2803782. Policy #0 lag: (min: 0.0, avg: 2.2, max: 4.0)
[2023-02-24 17:37:16,266][2364708] Avg episode reward: [(0, '34.978')]
[2023-02-24 17:37:17,232][2364831] Updated weights for policy 0, policy_version 3748 (0.0011)
[2023-02-24 17:37:18,232][2364831] Updated weights for policy 0, policy_version 3758 (0.0012)
[2023-02-24 17:37:19,246][2364831] Updated weights for policy 0, policy_version 3768 (0.0011)
[2023-02-24 17:37:20,215][2364831] Updated weights for policy 0, policy_version 3778 (0.0016)
[2023-02-24 17:37:21,224][2364831] Updated weights for policy 0, policy_version 3788 (0.0009)
[2023-02-24 17:37:21,266][2364708] Fps is (10 sec: 40959.6, 60 sec: 41096.5, 300 sec: 41106.3). Total num frames: 15515648. Throughput: 0: 10267.2. Samples: 2865300. Policy #0 lag: (min: 0.0, avg: 2.0, max: 4.0)
[2023-02-24 17:37:21,266][2364708] Avg episode reward: [(0, '32.422')]
[2023-02-24 17:37:22,190][2364831] Updated weights for policy 0, policy_version 3798 (0.0009)
[2023-02-24 17:37:23,210][2364831] Updated weights for policy 0, policy_version 3808 (0.0010)
[2023-02-24 17:37:24,201][2364831] Updated weights for policy 0, policy_version 3818 (0.0009)
[2023-02-24 17:37:25,224][2364831] Updated weights for policy 0, policy_version 3828 (0.0012)
[2023-02-24 17:37:26,251][2364831] Updated weights for policy 0, policy_version 3838 (0.0012)
[2023-02-24 17:37:26,266][2364708] Fps is (10 sec: 40960.0, 60 sec: 41028.4, 300 sec: 41103.7). Total num frames: 15720448. Throughput: 0: 10264.2. Samples: 2926818. Policy #0 lag: (min: 0.0, avg: 1.9, max: 4.0)
[2023-02-24 17:37:26,266][2364708] Avg episode reward: [(0, '30.861')]
[2023-02-24 17:37:27,226][2364831] Updated weights for policy 0, policy_version 3848 (0.0014)
[2023-02-24 17:37:28,214][2364831] Updated weights for policy 0, policy_version 3858 (0.0010)
[2023-02-24 17:37:29,243][2364831] Updated weights for policy 0, policy_version 3868 (0.0013)
[2023-02-24 17:37:30,235][2364831] Updated weights for policy 0, policy_version 3878 (0.0009)
[2023-02-24 17:37:31,194][2364831] Updated weights for policy 0, policy_version 3888 (0.0010)
[2023-02-24 17:37:31,266][2364708] Fps is (10 sec: 40959.8, 60 sec: 41028.3, 300 sec: 41101.2). Total num frames: 15925248. Throughput: 0: 10260.7. Samples: 2957538. Policy #0 lag: (min: 0.0, avg: 1.9, max: 4.0)
[2023-02-24 17:37:31,266][2364708] Avg episode reward: [(0, '33.478')]
[2023-02-24 17:37:32,210][2364831] Updated weights for policy 0, policy_version 3898 (0.0011)
[2023-02-24 17:37:33,187][2364831] Updated weights for policy 0, policy_version 3908 (0.0013)
[2023-02-24 17:37:34,160][2364831] Updated weights for policy 0, policy_version 3918 (0.0012)
[2023-02-24 17:37:35,197][2364831] Updated weights for policy 0, policy_version 3928 (0.0012)
[2023-02-24 17:37:36,189][2364831] Updated weights for policy 0, policy_version 3938 (0.0012)
[2023-02-24 17:37:36,266][2364708] Fps is (10 sec: 40960.0, 60 sec: 41028.3, 300 sec: 41098.8). Total num frames: 16130048. Throughput: 0: 10267.2. Samples: 3019524. Policy #0 lag: (min: 0.0, avg: 2.1, max: 4.0)
[2023-02-24 17:37:36,266][2364708] Avg episode reward: [(0, '34.030')]
[2023-02-24 17:37:37,168][2364831] Updated weights for policy 0, policy_version 3948 (0.0010)
[2023-02-24 17:37:38,212][2364831] Updated weights for policy 0, policy_version 3958 (0.0010)
[2023-02-24 17:37:39,181][2364831] Updated weights for policy 0, policy_version 3968 (0.0009)
[2023-02-24 17:37:40,158][2364831] Updated weights for policy 0, policy_version 3978 (0.0012)
[2023-02-24 17:37:41,178][2364831] Updated weights for policy 0, policy_version 3988 (0.0011)
[2023-02-24 17:37:41,266][2364708] Fps is (10 sec: 40960.6, 60 sec: 41028.3, 300 sec: 41223.8). Total num frames: 16334848. Throughput: 0: 10262.3. Samples: 3080874. Policy #0 lag: (min: 0.0, avg: 1.8, max: 4.0)
[2023-02-24 17:37:41,266][2364708] Avg episode reward: [(0, '30.119')]
[2023-02-24 17:37:42,173][2364831] Updated weights for policy 0, policy_version 3998 (0.0009)
[2023-02-24 17:37:43,173][2364831] Updated weights for policy 0, policy_version 4008 (0.0012)
[2023-02-24 17:37:44,204][2364831] Updated weights for policy 0, policy_version 4018 (0.0013)
[2023-02-24 17:37:45,182][2364831] Updated weights for policy 0, policy_version 4028 (0.0015)
[2023-02-24 17:37:46,192][2364831] Updated weights for policy 0, policy_version 4038 (0.0010)
[2023-02-24 17:37:46,266][2364708] Fps is (10 sec: 40959.9, 60 sec: 41028.2, 300 sec: 41209.9). Total num frames: 16539648. Throughput: 0: 10264.2. Samples: 3111642. Policy #0 lag: (min: 0.0, avg: 1.8, max: 4.0)
[2023-02-24 17:37:46,266][2364708] Avg episode reward: [(0, '34.674')]
[2023-02-24 17:37:47,165][2364831] Updated weights for policy 0, policy_version 4048 (0.0009)
[2023-02-24 17:37:48,150][2364831] Updated weights for policy 0, policy_version 4058 (0.0009)
[2023-02-24 17:37:49,155][2364831] Updated weights for policy 0, policy_version 4068 (0.0010)
[2023-02-24 17:37:50,176][2364831] Updated weights for policy 0, policy_version 4078 (0.0011)
[2023-02-24 17:37:51,128][2364831] Updated weights for policy 0, policy_version 4088 (0.0013)
[2023-02-24 17:37:51,266][2364708] Fps is (10 sec: 41369.0, 60 sec: 41096.4, 300 sec: 41209.9). Total num frames: 16748544. Throughput: 0: 10268.4. Samples: 3173304. Policy #0 lag: (min: 0.0, avg: 2.0, max: 4.0)
[2023-02-24 17:37:51,267][2364708] Avg episode reward: [(0, '33.236')]
[2023-02-24 17:37:52,111][2364831] Updated weights for policy 0, policy_version 4098 (0.0009)
[2023-02-24 17:37:53,146][2364831] Updated weights for policy 0, policy_version 4108 (0.0014)
[2023-02-24 17:37:54,142][2364831] Updated weights for policy 0, policy_version 4118 (0.0012)
[2023-02-24 17:37:55,117][2364831] Updated weights for policy 0, policy_version 4128 (0.0010)
[2023-02-24 17:37:56,172][2364831] Updated weights for policy 0, policy_version 4138 (0.0011)
[2023-02-24 17:37:56,266][2364708] Fps is (10 sec: 41369.4, 60 sec: 41028.2, 300 sec: 41209.9). Total num frames: 16953344. Throughput: 0: 10265.4. Samples: 3234927. Policy #0 lag: (min: 0.0, avg: 1.7, max: 4.0)
[2023-02-24 17:37:56,266][2364708] Avg episode reward: [(0, '33.288')]
[2023-02-24 17:37:57,176][2364831] Updated weights for policy 0, policy_version 4148 (0.0009)
[2023-02-24 17:37:58,108][2364831] Updated weights for policy 0, policy_version 4158 (0.0010)
[2023-02-24 17:37:59,170][2364831] Updated weights for policy 0, policy_version 4168 (0.0016)
[2023-02-24 17:38:00,167][2364831] Updated weights for policy 0, policy_version 4178 (0.0016)
[2023-02-24 17:38:01,108][2364831] Updated weights for policy 0, policy_version 4188 (0.0011)
[2023-02-24 17:38:01,266][2364708] Fps is (10 sec: 40550.7, 60 sec: 41028.2, 300 sec: 41182.2). Total num frames: 17154048. Throughput: 0: 10261.8. Samples: 3265563. Policy #0 lag: (min: 0.0, avg: 1.5, max: 4.0)
[2023-02-24 17:38:01,266][2364708] Avg episode reward: [(0, '35.841')]
[2023-02-24 17:38:01,270][2364791] Saving new best policy, reward=35.841!
[2023-02-24 17:38:02,155][2364831] Updated weights for policy 0, policy_version 4198 (0.0012)
[2023-02-24 17:38:03,132][2364831] Updated weights for policy 0, policy_version 4208 (0.0009)
[2023-02-24 17:38:04,177][2364831] Updated weights for policy 0, policy_version 4218 (0.0012)
[2023-02-24 17:38:05,155][2364831] Updated weights for policy 0, policy_version 4228 (0.0013)
[2023-02-24 17:38:06,137][2364831] Updated weights for policy 0, policy_version 4238 (0.0010)
[2023-02-24 17:38:06,266][2364708] Fps is (10 sec: 40960.0, 60 sec: 41028.3, 300 sec: 41182.2). Total num frames: 17362944. Throughput: 0: 10260.6. Samples: 3327027. Policy #0 lag: (min: 1.0, avg: 2.0, max: 4.0)
[2023-02-24 17:38:06,266][2364708] Avg episode reward: [(0, '34.701')]
[2023-02-24 17:38:07,128][2364831] Updated weights for policy 0, policy_version 4248 (0.0010)
[2023-02-24 17:38:08,120][2364831] Updated weights for policy 0, policy_version 4258 (0.0009)
[2023-02-24 17:38:09,120][2364831] Updated weights for policy 0, policy_version 4268 (0.0010)
[2023-02-24 17:38:10,112][2364831] Updated weights for policy 0, policy_version 4278 (0.0009)
[2023-02-24 17:38:11,084][2364831] Updated weights for policy 0, policy_version 4288 (0.0009)
[2023-02-24 17:38:11,266][2364708] Fps is (10 sec: 41368.9, 60 sec: 41028.1, 300 sec: 41182.1). Total num frames: 17567744. Throughput: 0: 10268.7. Samples: 3388911. Policy #0 lag: (min: 0.0, avg: 1.9, max: 4.0)
[2023-02-24 17:38:11,267][2364708] Avg episode reward: [(0, '32.559')]
[2023-02-24 17:38:12,108][2364831] Updated weights for policy 0, policy_version 4298 (0.0009)
[2023-02-24 17:38:13,088][2364831] Updated weights for policy 0, policy_version 4308 (0.0011)
[2023-02-24 17:38:14,094][2364831] Updated weights for policy 0, policy_version 4318 (0.0009)
[2023-02-24 17:38:15,081][2364831] Updated weights for policy 0, policy_version 4328 (0.0009)
[2023-02-24 17:38:16,054][2364831] Updated weights for policy 0, policy_version 4338 (0.0010)
[2023-02-24 17:38:16,266][2364708] Fps is (10 sec: 40959.2, 60 sec: 41028.1, 300 sec: 41168.2). Total num frames: 17772544. Throughput: 0: 10274.2. Samples: 3419880. Policy #0 lag: (min: 0.0, avg: 2.0, max: 4.0)
[2023-02-24 17:38:16,267][2364708] Avg episode reward: [(0, '32.783')]
[2023-02-24 17:38:17,103][2364831] Updated weights for policy 0, policy_version 4348 (0.0012)
[2023-02-24 17:38:18,095][2364831] Updated weights for policy 0, policy_version 4358 (0.0009)
[2023-02-24 17:38:19,078][2364831] Updated weights for policy 0, policy_version 4368 (0.0011)
[2023-02-24 17:38:20,078][2364831] Updated weights for policy 0, policy_version 4378 (0.0012)
[2023-02-24 17:38:21,065][2364831] Updated weights for policy 0, policy_version 4388 (0.0016)
[2023-02-24 17:38:21,266][2364708] Fps is (10 sec: 41370.7, 60 sec: 41096.6, 300 sec: 41182.2). Total num frames: 17981440. Throughput: 0: 10262.9. Samples: 3481353. Policy #0 lag: (min: 0.0, avg: 1.9, max: 4.0)
[2023-02-24 17:38:21,266][2364708] Avg episode reward: [(0, '35.299')]
[2023-02-24 17:38:22,058][2364831] Updated weights for policy 0, policy_version 4398 (0.0012)
[2023-02-24 17:38:23,045][2364831] Updated weights for policy 0, policy_version 4408 (0.0010)
[2023-02-24 17:38:24,107][2364831] Updated weights for policy 0, policy_version 4418 (0.0011)
[2023-02-24 17:38:25,058][2364831] Updated weights for policy 0, policy_version 4428 (0.0009)
[2023-02-24 17:38:26,042][2364831] Updated weights for policy 0, policy_version 4438 (0.0012)
[2023-02-24 17:38:26,266][2364708] Fps is (10 sec: 40960.7, 60 sec: 41028.2, 300 sec: 41154.4). Total num frames: 18182144. Throughput: 0: 10268.6. Samples: 3542961. Policy #0 lag: (min: 0.0, avg: 1.8, max: 4.0)
[2023-02-24 17:38:26,266][2364708] Avg episode reward: [(0, '33.602')]
[2023-02-24 17:38:27,045][2364831] Updated weights for policy 0, policy_version 4448 (0.0013)
[2023-02-24 17:38:28,079][2364831] Updated weights for policy 0, policy_version 4458 (0.0009)
[2023-02-24 17:38:29,077][2364831] Updated weights for policy 0, policy_version 4468 (0.0012)
[2023-02-24 17:38:30,014][2364831] Updated weights for policy 0, policy_version 4478 (0.0014)
[2023-02-24 17:38:31,056][2364831] Updated weights for policy 0, policy_version 4488 (0.0014)
[2023-02-24 17:38:31,266][2364708] Fps is (10 sec: 40959.3, 60 sec: 41096.5, 300 sec: 41168.3). Total num frames: 18391040. Throughput: 0: 10269.4. Samples: 3573768. Policy #0 lag: (min: 0.0, avg: 2.0, max: 5.0)
[2023-02-24 17:38:31,266][2364708] Avg episode reward: [(0, '30.251')]
[2023-02-24 17:38:31,271][2364791] Saving /home/sebas/research/hugging-face-course/vizdoom/train_dir/default_experiment/checkpoint_p0/checkpoint_000004490_18391040.pth...
[2023-02-24 17:38:31,336][2364791] Removing /home/sebas/research/hugging-face-course/vizdoom/train_dir/default_experiment/checkpoint_p0/checkpoint_000002080_8519680.pth
[2023-02-24 17:38:32,050][2364831] Updated weights for policy 0, policy_version 4498 (0.0012)
[2023-02-24 17:38:33,042][2364831] Updated weights for policy 0, policy_version 4508 (0.0009)
[2023-02-24 17:38:34,048][2364831] Updated weights for policy 0, policy_version 4518 (0.0010)
[2023-02-24 17:38:35,020][2364831] Updated weights for policy 0, policy_version 4528 (0.0010)
[2023-02-24 17:38:36,020][2364831] Updated weights for policy 0, policy_version 4538 (0.0010)
[2023-02-24 17:38:36,266][2364708] Fps is (10 sec: 41369.6, 60 sec: 41096.5, 300 sec: 41140.5). Total num frames: 18595840. Throughput: 0: 10268.5. Samples: 3635388. Policy #0 lag: (min: 0.0, avg: 2.0, max: 5.0)
[2023-02-24 17:38:36,266][2364708] Avg episode reward: [(0, '34.728')]
[2023-02-24 17:38:37,018][2364831] Updated weights for policy 0, policy_version 4548 (0.0012)
[2023-02-24 17:38:37,996][2364831] Updated weights for policy 0, policy_version 4558 (0.0009)
[2023-02-24 17:38:39,029][2364831] Updated weights for policy 0, policy_version 4568 (0.0010)
[2023-02-24 17:38:40,017][2364831] Updated weights for policy 0, policy_version 4578 (0.0010)
[2023-02-24 17:38:40,985][2364831] Updated weights for policy 0, policy_version 4588 (0.0010)
[2023-02-24 17:38:41,266][2364708] Fps is (10 sec: 40959.9, 60 sec: 41096.4, 300 sec: 41154.4). Total num frames: 18800640. Throughput: 0: 10268.8. Samples: 3697023. Policy #0 lag: (min: 0.0, avg: 1.8, max: 4.0)
[2023-02-24 17:38:41,267][2364708] Avg episode reward: [(0, '32.483')]
[2023-02-24 17:38:42,014][2364831] Updated weights for policy 0, policy_version 4598 (0.0014)
[2023-02-24 17:38:42,975][2364831] Updated weights for policy 0, policy_version 4608 (0.0009)
[2023-02-24 17:38:44,032][2364831] Updated weights for policy 0, policy_version 4618 (0.0009)
[2023-02-24 17:38:45,014][2364831] Updated weights for policy 0, policy_version 4628 (0.0011)
[2023-02-24 17:38:45,974][2364831] Updated weights for policy 0, policy_version 4638 (0.0012)
[2023-02-24 17:38:46,266][2364708] Fps is (10 sec: 41370.5, 60 sec: 41164.9, 300 sec: 41168.3). Total num frames: 19009536. Throughput: 0: 10273.0. Samples: 3727845. Policy #0 lag: (min: 0.0, avg: 2.1, max: 4.0)
[2023-02-24 17:38:46,266][2364708] Avg episode reward: [(0, '34.188')]
[2023-02-24 17:38:47,022][2364831] Updated weights for policy 0, policy_version 4648 (0.0014)
[2023-02-24 17:38:48,027][2364831] Updated weights for policy 0, policy_version 4658 (0.0013)
[2023-02-24 17:38:49,020][2364831] Updated weights for policy 0, policy_version 4668 (0.0010)
[2023-02-24 17:38:50,024][2364831] Updated weights for policy 0, policy_version 4678 (0.0010)
[2023-02-24 17:38:51,024][2364831] Updated weights for policy 0, policy_version 4688 (0.0011)
[2023-02-24 17:38:51,266][2364708] Fps is (10 sec: 40960.4, 60 sec: 41028.3, 300 sec: 41140.5). Total num frames: 19210240. Throughput: 0: 10270.9. Samples: 3789216. Policy #0 lag: (min: 0.0, avg: 2.0, max: 4.0)
[2023-02-24 17:38:51,266][2364708] Avg episode reward: [(0, '36.140')]
[2023-02-24 17:38:51,270][2364791] Saving new best policy, reward=36.140!
[2023-02-24 17:38:52,009][2364831] Updated weights for policy 0, policy_version 4698 (0.0012)
[2023-02-24 17:38:53,006][2364831] Updated weights for policy 0, policy_version 4708 (0.0011)
[2023-02-24 17:38:54,017][2364831] Updated weights for policy 0, policy_version 4718 (0.0012)
[2023-02-24 17:38:54,981][2364831] Updated weights for policy 0, policy_version 4728 (0.0011)
[2023-02-24 17:38:55,968][2364831] Updated weights for policy 0, policy_version 4738 (0.0012)
[2023-02-24 17:38:56,266][2364708] Fps is (10 sec: 40959.4, 60 sec: 41096.6, 300 sec: 41154.4). Total num frames: 19419136. Throughput: 0: 10265.0. Samples: 3850833. Policy #0 lag: (min: 0.0, avg: 1.7, max: 4.0)
[2023-02-24 17:38:56,266][2364708] Avg episode reward: [(0, '32.669')]
[2023-02-24 17:38:56,994][2364831] Updated weights for policy 0, policy_version 4748 (0.0011)
[2023-02-24 17:38:57,985][2364831] Updated weights for policy 0, policy_version 4758 (0.0008)
[2023-02-24 17:38:58,970][2364831] Updated weights for policy 0, policy_version 4768 (0.0010)
[2023-02-24 17:38:59,953][2364831] Updated weights for policy 0, policy_version 4778 (0.0010)
[2023-02-24 17:39:00,969][2364831] Updated weights for policy 0, policy_version 4788 (0.0012)
[2023-02-24 17:39:01,266][2364708] Fps is (10 sec: 41369.3, 60 sec: 41164.8, 300 sec: 41140.5). Total num frames: 19623936. Throughput: 0: 10264.2. Samples: 3881769. Policy #0 lag: (min: 0.0, avg: 1.9, max: 4.0)
[2023-02-24 17:39:01,266][2364708] Avg episode reward: [(0, '33.503')]
[2023-02-24 17:39:01,956][2364831] Updated weights for policy 0, policy_version 4798 (0.0011)
[2023-02-24 17:39:02,959][2364831] Updated weights for policy 0, policy_version 4808 (0.0009)
[2023-02-24 17:39:03,937][2364831] Updated weights for policy 0, policy_version 4818 (0.0010)
[2023-02-24 17:39:04,994][2364831] Updated weights for policy 0, policy_version 4828 (0.0012)
[2023-02-24 17:39:05,948][2364831] Updated weights for policy 0, policy_version 4838 (0.0009)
[2023-02-24 17:39:06,266][2364708] Fps is (10 sec: 40959.8, 60 sec: 41096.5, 300 sec: 41140.5). Total num frames: 19828736. Throughput: 0: 10266.0. Samples: 3943326. Policy #0 lag: (min: 0.0, avg: 1.9, max: 4.0)
[2023-02-24 17:39:06,266][2364708] Avg episode reward: [(0, '33.048')]
[2023-02-24 17:39:06,927][2364831] Updated weights for policy 0, policy_version 4848 (0.0009)
[2023-02-24 17:39:07,963][2364831] Updated weights for policy 0, policy_version 4858 (0.0010)
[2023-02-24 17:39:08,957][2364831] Updated weights for policy 0, policy_version 4868 (0.0013)
[2023-02-24 17:39:09,952][2364831] Updated weights for policy 0, policy_version 4878 (0.0009)
[2023-02-24 17:39:10,563][2364791] Stopping Batcher_0...
[2023-02-24 17:39:10,563][2364708] Component Batcher_0 stopped!
[2023-02-24 17:39:10,564][2364791] Loop batcher_evt_loop terminating...
[2023-02-24 17:39:10,566][2364791] Saving /home/sebas/research/hugging-face-course/vizdoom/train_dir/default_experiment/checkpoint_p0/checkpoint_000004884_20004864.pth...
[2023-02-24 17:39:10,580][2365108] Stopping RolloutWorker_w15...
[2023-02-24 17:39:10,580][2364708] Component RolloutWorker_w15 stopped!
[2023-02-24 17:39:10,580][2364862] Stopping RolloutWorker_w14...
[2023-02-24 17:39:10,581][2364708] Component RolloutWorker_w14 stopped!
[2023-02-24 17:39:10,581][2364855] Stopping RolloutWorker_w10...
[2023-02-24 17:39:10,581][2364708] Component RolloutWorker_w10 stopped!
[2023-02-24 17:39:10,581][2364855] Loop rollout_proc10_evt_loop terminating...
[2023-02-24 17:39:10,581][2364708] Component RolloutWorker_w0 stopped!
[2023-02-24 17:39:10,581][2365108] Loop rollout_proc15_evt_loop terminating...
[2023-02-24 17:39:10,581][2364832] Stopping RolloutWorker_w0...
[2023-02-24 17:39:10,582][2364862] Loop rollout_proc14_evt_loop terminating...
[2023-02-24 17:39:10,582][2364708] Component RolloutWorker_w3 stopped!
[2023-02-24 17:39:10,582][2364836] Stopping RolloutWorker_w3...
[2023-02-24 17:39:10,582][2364857] Stopping RolloutWorker_w11...
[2023-02-24 17:39:10,582][2364832] Loop rollout_proc0_evt_loop terminating...
[2023-02-24 17:39:10,583][2364708] Component RolloutWorker_w11 stopped!
[2023-02-24 17:39:10,583][2364836] Loop rollout_proc3_evt_loop terminating...
[2023-02-24 17:39:10,583][2364859] Stopping RolloutWorker_w8...
[2023-02-24 17:39:10,583][2364857] Loop rollout_proc11_evt_loop terminating...
[2023-02-24 17:39:10,583][2364708] Component RolloutWorker_w8 stopped!
[2023-02-24 17:39:10,583][2364861] Stopping RolloutWorker_w13...
[2023-02-24 17:39:10,583][2364833] Stopping RolloutWorker_w1...
[2023-02-24 17:39:10,583][2364839] Stopping RolloutWorker_w7...
[2023-02-24 17:39:10,583][2364838] Stopping RolloutWorker_w6...
[2023-02-24 17:39:10,583][2364708] Component RolloutWorker_w13 stopped!
[2023-02-24 17:39:10,583][2364861] Loop rollout_proc13_evt_loop terminating...
[2023-02-24 17:39:10,583][2364708] Component RolloutWorker_w1 stopped!
[2023-02-24 17:39:10,583][2364833] Loop rollout_proc1_evt_loop terminating...
[2023-02-24 17:39:10,583][2364838] Loop rollout_proc6_evt_loop terminating...
[2023-02-24 17:39:10,583][2364839] Loop rollout_proc7_evt_loop terminating...
[2023-02-24 17:39:10,583][2364860] Stopping RolloutWorker_w12...
[2023-02-24 17:39:10,583][2364708] Component RolloutWorker_w7 stopped!
[2023-02-24 17:39:10,584][2364708] Component RolloutWorker_w6 stopped!
[2023-02-24 17:39:10,584][2364860] Loop rollout_proc12_evt_loop terminating...
[2023-02-24 17:39:10,584][2364708] Component RolloutWorker_w12 stopped!
[2023-02-24 17:39:10,583][2364859] Loop rollout_proc8_evt_loop terminating...
[2023-02-24 17:39:10,585][2364708] Component RolloutWorker_w9 stopped!
[2023-02-24 17:39:10,585][2364856] Stopping RolloutWorker_w9...
[2023-02-24 17:39:10,585][2364856] Loop rollout_proc9_evt_loop terminating...
[2023-02-24 17:39:10,585][2364708] Component RolloutWorker_w4 stopped!
[2023-02-24 17:39:10,585][2364835] Stopping RolloutWorker_w4...
[2023-02-24 17:39:10,586][2364835] Loop rollout_proc4_evt_loop terminating...
[2023-02-24 17:39:10,591][2364831] Weights refcount: 2 0
[2023-02-24 17:39:10,596][2364831] Stopping InferenceWorker_p0-w0...
[2023-02-24 17:39:10,596][2364708] Component InferenceWorker_p0-w0 stopped!
[2023-02-24 17:39:10,596][2364831] Loop inference_proc0-0_evt_loop terminating...
[2023-02-24 17:39:10,664][2364791] Removing /home/sebas/research/hugging-face-course/vizdoom/train_dir/default_experiment/checkpoint_p0/checkpoint_000003287_13463552.pth
[2023-02-24 17:39:10,667][2364837] Stopping RolloutWorker_w5...
[2023-02-24 17:39:10,667][2364708] Component RolloutWorker_w5 stopped!
[2023-02-24 17:39:10,667][2364837] Loop rollout_proc5_evt_loop terminating...
[2023-02-24 17:39:10,675][2364791] Saving /home/sebas/research/hugging-face-course/vizdoom/train_dir/default_experiment/checkpoint_p0/checkpoint_000004884_20004864.pth...
[2023-02-24 17:39:10,679][2364834] Stopping RolloutWorker_w2...
[2023-02-24 17:39:10,679][2364708] Component RolloutWorker_w2 stopped!
[2023-02-24 17:39:10,679][2364834] Loop rollout_proc2_evt_loop terminating...
[2023-02-24 17:39:10,810][2364791] Stopping LearnerWorker_p0...
[2023-02-24 17:39:10,810][2364708] Component LearnerWorker_p0 stopped!
[2023-02-24 17:39:10,811][2364791] Loop learner_proc0_evt_loop terminating...
[2023-02-24 17:39:10,811][2364708] Waiting for process learner_proc0 to stop...
[2023-02-24 17:39:11,642][2364708] Waiting for process inference_proc0-0 to join...
[2023-02-24 17:39:11,642][2364708] Waiting for process rollout_proc0 to join...
[2023-02-24 17:39:11,643][2364708] Waiting for process rollout_proc1 to join...
[2023-02-24 17:39:11,643][2364708] Waiting for process rollout_proc2 to join...
[2023-02-24 17:39:11,643][2364708] Waiting for process rollout_proc3 to join...
[2023-02-24 17:39:11,643][2364708] Waiting for process rollout_proc4 to join...
[2023-02-24 17:39:11,643][2364708] Waiting for process rollout_proc5 to join...
[2023-02-24 17:39:11,644][2364708] Waiting for process rollout_proc6 to join...
[2023-02-24 17:39:11,644][2364708] Waiting for process rollout_proc7 to join...
[2023-02-24 17:39:11,644][2364708] Waiting for process rollout_proc8 to join...
[2023-02-24 17:39:11,644][2364708] Waiting for process rollout_proc9 to join...
[2023-02-24 17:39:11,644][2364708] Waiting for process rollout_proc10 to join...
[2023-02-24 17:39:11,644][2364708] Waiting for process rollout_proc11 to join...
[2023-02-24 17:39:11,644][2364708] Waiting for process rollout_proc12 to join...
[2023-02-24 17:39:11,645][2364708] Waiting for process rollout_proc13 to join...
[2023-02-24 17:39:11,645][2364708] Waiting for process rollout_proc14 to join...
[2023-02-24 17:39:11,645][2364708] Waiting for process rollout_proc15 to join...
[2023-02-24 17:39:11,645][2364708] Batcher 0 profile tree view:
batching: 73.4519, releasing_batches: 0.1153
[2023-02-24 17:39:11,645][2364708] InferenceWorker_p0-w0 profile tree view:
wait_policy: 0.0001
wait_policy_total: 14.4354
update_model: 8.0668
weight_update: 0.0009
one_step: 0.0083
handle_policy_step: 352.4861
deserialize: 37.8441, stack: 1.8339, obs_to_device_normalize: 110.8509, forward: 112.2633, send_messages: 30.0739
prepare_outputs: 46.8292
to_cpu: 31.2023
[2023-02-24 17:39:11,646][2364708] Learner 0 profile tree view:
misc: 0.0142, prepare_batch: 21.8985
train: 88.0533
epoch_init: 0.0176, minibatch_init: 0.0202, losses_postprocess: 1.0969, kl_divergence: 0.8556, after_optimizer: 0.9595
calculate_losses: 27.5889
losses_init: 0.0102, forward_head: 2.8021, bptt_initial: 14.7305, tail: 1.9333, advantages_returns: 0.5209, losses: 3.3108
bptt: 3.7098
bptt_forward_core: 3.5507
update: 56.2303
clip: 3.0247
[2023-02-24 17:39:11,646][2364708] RolloutWorker_w0 profile tree view:
wait_for_trajectories: 0.1784, enqueue_policy_requests: 13.5703, env_step: 193.6379, overhead: 19.5570, complete_rollouts: 0.3859
save_policy_outputs: 14.2004
split_output_tensors: 6.7511
[2023-02-24 17:39:11,646][2364708] RolloutWorker_w15 profile tree view:
wait_for_trajectories: 0.1858, enqueue_policy_requests: 14.0534, env_step: 199.4115, overhead: 20.2979, complete_rollouts: 0.3205
save_policy_outputs: 14.6899
split_output_tensors: 6.9819
[2023-02-24 17:39:11,646][2364708] Loop Runner_EvtLoop terminating...
[2023-02-24 17:39:11,646][2364708] Runner profile tree view:
main_loop: 398.6762
[2023-02-24 17:39:11,647][2364708] Collected {0: 20004864}, FPS: 40130.2
[2023-02-24 17:39:11,652][2364708] Loading existing experiment configuration from train_dir/default_experiment/config.json
[2023-02-24 17:39:11,652][2364708] Overriding arg 'train_dir' with value 'train_dir' passed from command line
[2023-02-24 17:39:11,652][2364708] Overriding arg 'num_workers' with value 1 passed from command line
[2023-02-24 17:39:11,652][2364708] Adding new argument 'no_render'=True that is not in the saved config file!
[2023-02-24 17:39:11,652][2364708] Adding new argument 'save_video'=True that is not in the saved config file!
[2023-02-24 17:39:11,652][2364708] Adding new argument 'video_frames'=1000000000.0 that is not in the saved config file!
[2023-02-24 17:39:11,652][2364708] Adding new argument 'video_name'=None that is not in the saved config file!
[2023-02-24 17:39:11,652][2364708] Adding new argument 'max_num_frames'=1000000000.0 that is not in the saved config file!
[2023-02-24 17:39:11,653][2364708] Adding new argument 'max_num_episodes'=10 that is not in the saved config file!
[2023-02-24 17:39:11,653][2364708] Adding new argument 'push_to_hub'=True that is not in the saved config file!
[2023-02-24 17:39:11,653][2364708] Adding new argument 'hf_repository'='eldraco/rl_course_vizdoom_health_gathering_supreme' that is not in the saved config file!
[2023-02-24 17:39:11,653][2364708] Adding new argument 'policy_index'=0 that is not in the saved config file!
[2023-02-24 17:39:11,653][2364708] Adding new argument 'eval_deterministic'=False that is not in the saved config file!
[2023-02-24 17:39:11,653][2364708] Adding new argument 'train_script'=None that is not in the saved config file!
[2023-02-24 17:39:11,653][2364708] Adding new argument 'enjoy_script'=None that is not in the saved config file!
[2023-02-24 17:39:11,653][2364708] Using frameskip 1 and render_action_repeat=4 for evaluation
[2023-02-24 17:39:11,659][2364708] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-24 17:39:11,659][2364708] RunningMeanStd input shape: (3, 72, 128)
[2023-02-24 17:39:11,660][2364708] RunningMeanStd input shape: (1,)
[2023-02-24 17:39:11,667][2364708] ConvEncoder: input_channels=3
[2023-02-24 17:39:11,779][2364708] Conv encoder output size: 512
[2023-02-24 17:39:11,779][2364708] Policy head output size: 512
[2023-02-24 17:39:13,104][2364708] Loading state from checkpoint train_dir/default_experiment/checkpoint_p0/checkpoint_000004884_20004864.pth...
[2023-02-24 17:39:13,883][2364708] Num frames 100...
[2023-02-24 17:39:13,953][2364708] Num frames 200...
[2023-02-24 17:39:14,021][2364708] Num frames 300...
[2023-02-24 17:39:14,089][2364708] Num frames 400...
[2023-02-24 17:39:14,160][2364708] Num frames 500...
[2023-02-24 17:39:14,229][2364708] Num frames 600...
[2023-02-24 17:39:14,297][2364708] Num frames 700...
[2023-02-24 17:39:14,365][2364708] Num frames 800...
[2023-02-24 17:39:14,435][2364708] Num frames 900...
[2023-02-24 17:39:14,505][2364708] Num frames 1000...
[2023-02-24 17:39:14,575][2364708] Num frames 1100...
[2023-02-24 17:39:14,645][2364708] Num frames 1200...
[2023-02-24 17:39:14,714][2364708] Num frames 1300...
[2023-02-24 17:39:14,784][2364708] Num frames 1400...
[2023-02-24 17:39:14,886][2364708] Avg episode rewards: #0: 34.720, true rewards: #0: 14.720
[2023-02-24 17:39:14,887][2364708] Avg episode reward: 34.720, avg true_objective: 14.720
[2023-02-24 17:39:14,917][2364708] Num frames 1500...
[2023-02-24 17:39:14,998][2364708] Num frames 1600...
[2023-02-24 17:39:15,067][2364708] Num frames 1700...
[2023-02-24 17:39:15,138][2364708] Num frames 1800...
[2023-02-24 17:39:15,208][2364708] Num frames 1900...
[2023-02-24 17:39:15,277][2364708] Num frames 2000...
[2023-02-24 17:39:15,347][2364708] Num frames 2100...
[2023-02-24 17:39:15,415][2364708] Num frames 2200...
[2023-02-24 17:39:15,485][2364708] Num frames 2300...
[2023-02-24 17:39:15,554][2364708] Num frames 2400...
[2023-02-24 17:39:15,621][2364708] Num frames 2500...
[2023-02-24 17:39:15,714][2364708] Avg episode rewards: #0: 31.300, true rewards: #0: 12.800
[2023-02-24 17:39:15,715][2364708] Avg episode reward: 31.300, avg true_objective: 12.800
[2023-02-24 17:39:15,758][2364708] Num frames 2600...
[2023-02-24 17:39:15,831][2364708] Num frames 2700...
[2023-02-24 17:39:15,900][2364708] Num frames 2800...
[2023-02-24 17:39:15,969][2364708] Num frames 2900...
[2023-02-24 17:39:16,037][2364708] Num frames 3000...
[2023-02-24 17:39:16,106][2364708] Num frames 3100...
[2023-02-24 17:39:16,177][2364708] Num frames 3200...
[2023-02-24 17:39:16,247][2364708] Num frames 3300...
[2023-02-24 17:39:16,316][2364708] Num frames 3400...
[2023-02-24 17:39:16,386][2364708] Num frames 3500...
[2023-02-24 17:39:16,455][2364708] Num frames 3600...
[2023-02-24 17:39:16,526][2364708] Num frames 3700...
[2023-02-24 17:39:16,596][2364708] Num frames 3800...
[2023-02-24 17:39:16,665][2364708] Num frames 3900...
[2023-02-24 17:39:16,734][2364708] Num frames 4000...
[2023-02-24 17:39:16,802][2364708] Num frames 4100...
[2023-02-24 17:39:16,871][2364708] Num frames 4200...
[2023-02-24 17:39:16,941][2364708] Num frames 4300...
[2023-02-24 17:39:17,010][2364708] Num frames 4400...
[2023-02-24 17:39:17,078][2364708] Num frames 4500...
[2023-02-24 17:39:17,148][2364708] Num frames 4600...
[2023-02-24 17:39:17,244][2364708] Avg episode rewards: #0: 41.533, true rewards: #0: 15.533
[2023-02-24 17:39:17,244][2364708] Avg episode reward: 41.533, avg true_objective: 15.533
[2023-02-24 17:39:17,288][2364708] Num frames 4700...
[2023-02-24 17:39:17,360][2364708] Num frames 4800...
[2023-02-24 17:39:17,430][2364708] Num frames 4900...
[2023-02-24 17:39:17,499][2364708] Num frames 5000...
[2023-02-24 17:39:17,569][2364708] Num frames 5100...
[2023-02-24 17:39:17,640][2364708] Num frames 5200...
[2023-02-24 17:39:17,720][2364708] Avg episode rewards: #0: 34.100, true rewards: #0: 13.100
[2023-02-24 17:39:17,720][2364708] Avg episode reward: 34.100, avg true_objective: 13.100
[2023-02-24 17:39:17,789][2364708] Num frames 5300...
[2023-02-24 17:39:17,857][2364708] Num frames 5400...
[2023-02-24 17:39:17,926][2364708] Num frames 5500...
[2023-02-24 17:39:17,995][2364708] Num frames 5600...
[2023-02-24 17:39:18,063][2364708] Num frames 5700...
[2023-02-24 17:39:18,132][2364708] Num frames 5800...
[2023-02-24 17:39:18,195][2364708] Avg episode rewards: #0: 29.232, true rewards: #0: 11.632
[2023-02-24 17:39:18,195][2364708] Avg episode reward: 29.232, avg true_objective: 11.632
[2023-02-24 17:39:18,273][2364708] Num frames 5900...
[2023-02-24 17:39:18,342][2364708] Num frames 6000...
[2023-02-24 17:39:18,409][2364708] Num frames 6100...
[2023-02-24 17:39:18,482][2364708] Num frames 6200...
[2023-02-24 17:39:18,552][2364708] Num frames 6300...
[2023-02-24 17:39:18,646][2364708] Avg episode rewards: #0: 25.600, true rewards: #0: 10.600
[2023-02-24 17:39:18,646][2364708] Avg episode reward: 25.600, avg true_objective: 10.600
[2023-02-24 17:39:18,691][2364708] Num frames 6400...
[2023-02-24 17:39:18,767][2364708] Num frames 6500...
[2023-02-24 17:39:18,835][2364708] Num frames 6600...
[2023-02-24 17:39:18,904][2364708] Num frames 6700...
[2023-02-24 17:39:18,972][2364708] Num frames 6800...
[2023-02-24 17:39:19,041][2364708] Num frames 6900...
[2023-02-24 17:39:19,110][2364708] Num frames 7000...
[2023-02-24 17:39:19,179][2364708] Num frames 7100...
[2023-02-24 17:39:19,249][2364708] Num frames 7200...
[2023-02-24 17:39:19,318][2364708] Num frames 7300...
[2023-02-24 17:39:19,388][2364708] Num frames 7400...
[2023-02-24 17:39:19,458][2364708] Num frames 7500...
[2023-02-24 17:39:19,528][2364708] Num frames 7600...
[2023-02-24 17:39:19,598][2364708] Num frames 7700...
[2023-02-24 17:39:19,667][2364708] Num frames 7800...
[2023-02-24 17:39:19,736][2364708] Num frames 7900...
[2023-02-24 17:39:19,808][2364708] Avg episode rewards: #0: 28.040, true rewards: #0: 11.326
[2023-02-24 17:39:19,808][2364708] Avg episode reward: 28.040, avg true_objective: 11.326
[2023-02-24 17:39:19,879][2364708] Num frames 8000...
[2023-02-24 17:39:19,948][2364708] Num frames 8100...
[2023-02-24 17:39:20,017][2364708] Num frames 8200...
[2023-02-24 17:39:20,086][2364708] Num frames 8300...
[2023-02-24 17:39:20,155][2364708] Num frames 8400...
[2023-02-24 17:39:20,225][2364708] Num frames 8500...
[2023-02-24 17:39:20,294][2364708] Num frames 8600...
[2023-02-24 17:39:20,363][2364708] Num frames 8700...
[2023-02-24 17:39:20,432][2364708] Num frames 8800...
[2023-02-24 17:39:20,501][2364708] Num frames 8900...
[2023-02-24 17:39:20,571][2364708] Num frames 9000...
[2023-02-24 17:39:20,640][2364708] Num frames 9100...
[2023-02-24 17:39:20,711][2364708] Num frames 9200...
[2023-02-24 17:39:20,781][2364708] Num frames 9300...
[2023-02-24 17:39:20,850][2364708] Num frames 9400...
[2023-02-24 17:39:20,919][2364708] Num frames 9500...
[2023-02-24 17:39:20,990][2364708] Num frames 9600...
[2023-02-24 17:39:21,060][2364708] Num frames 9700...
[2023-02-24 17:39:21,130][2364708] Num frames 9800...
[2023-02-24 17:39:21,198][2364708] Num frames 9900...
[2023-02-24 17:39:21,269][2364708] Num frames 10000...
[2023-02-24 17:39:21,341][2364708] Avg episode rewards: #0: 32.035, true rewards: #0: 12.535
[2023-02-24 17:39:21,341][2364708] Avg episode reward: 32.035, avg true_objective: 12.535
[2023-02-24 17:39:21,412][2364708] Num frames 10100...
[2023-02-24 17:39:21,481][2364708] Num frames 10200...
[2023-02-24 17:39:21,551][2364708] Num frames 10300...
[2023-02-24 17:39:21,621][2364708] Num frames 10400...
[2023-02-24 17:39:21,691][2364708] Num frames 10500...
[2023-02-24 17:39:21,759][2364708] Num frames 10600...
[2023-02-24 17:39:21,828][2364708] Num frames 10700...
[2023-02-24 17:39:21,897][2364708] Num frames 10800...
[2023-02-24 17:39:21,966][2364708] Num frames 10900...
[2023-02-24 17:39:22,035][2364708] Num frames 11000...
[2023-02-24 17:39:22,117][2364708] Avg episode rewards: #0: 31.603, true rewards: #0: 12.270
[2023-02-24 17:39:22,118][2364708] Avg episode reward: 31.603, avg true_objective: 12.270
[2023-02-24 17:39:22,180][2364708] Num frames 11100...
[2023-02-24 17:39:22,251][2364708] Num frames 11200...
[2023-02-24 17:39:22,320][2364708] Num frames 11300...
[2023-02-24 17:39:22,389][2364708] Num frames 11400...
[2023-02-24 17:39:22,458][2364708] Num frames 11500...
[2023-02-24 17:39:22,527][2364708] Num frames 11600...
[2023-02-24 17:39:22,596][2364708] Num frames 11700...
[2023-02-24 17:39:22,666][2364708] Num frames 11800...
[2023-02-24 17:39:22,735][2364708] Num frames 11900...
[2023-02-24 17:39:22,804][2364708] Num frames 12000...
[2023-02-24 17:39:22,874][2364708] Num frames 12100...
[2023-02-24 17:39:22,944][2364708] Num frames 12200...
[2023-02-24 17:39:23,013][2364708] Num frames 12300...
[2023-02-24 17:39:23,082][2364708] Num frames 12400...
[2023-02-24 17:39:23,152][2364708] Num frames 12500...
[2023-02-24 17:39:23,220][2364708] Num frames 12600...
[2023-02-24 17:39:23,289][2364708] Num frames 12700...
[2023-02-24 17:39:23,360][2364708] Num frames 12800...
[2023-02-24 17:39:23,433][2364708] Avg episode rewards: #0: 33.229, true rewards: #0: 12.829
[2023-02-24 17:39:23,433][2364708] Avg episode reward: 33.229, avg true_objective: 12.829
[2023-02-24 17:39:39,087][2364708] Replay video saved to train_dir/default_experiment/replay.mp4!
[2023-02-24 17:39:45,076][2364708] The model has been pushed to https://huggingface.co/eldraco/rl_course_vizdoom_health_gathering_supreme
[2023-02-25 11:13:23,382][2391555] Saving configuration to /home/sebas/research/hugging-face-course/vizdoom/train_dir/default_experiment/config.json...
[2023-02-25 11:13:23,382][2391555] Rollout worker 0 uses device cpu
[2023-02-25 11:13:23,382][2391555] Rollout worker 1 uses device cpu
[2023-02-25 11:13:23,383][2391555] Rollout worker 2 uses device cpu
[2023-02-25 11:13:23,383][2391555] Rollout worker 3 uses device cpu
[2023-02-25 11:13:23,383][2391555] Rollout worker 4 uses device cpu
[2023-02-25 11:13:23,383][2391555] Rollout worker 5 uses device cpu
[2023-02-25 11:13:23,383][2391555] Rollout worker 6 uses device cpu
[2023-02-25 11:13:23,383][2391555] Rollout worker 7 uses device cpu
[2023-02-25 11:13:23,383][2391555] Rollout worker 8 uses device cpu
[2023-02-25 11:13:23,383][2391555] Rollout worker 9 uses device cpu
[2023-02-25 11:13:23,383][2391555] Rollout worker 10 uses device cpu
[2023-02-25 11:13:23,383][2391555] Rollout worker 11 uses device cpu
[2023-02-25 11:13:23,383][2391555] Rollout worker 12 uses device cpu
[2023-02-25 11:13:23,384][2391555] Rollout worker 13 uses device cpu
[2023-02-25 11:13:23,384][2391555] Rollout worker 14 uses device cpu
[2023-02-25 11:13:23,384][2391555] Rollout worker 15 uses device cpu
[2023-02-25 11:13:23,384][2391555] Rollout worker 16 uses device cpu
[2023-02-25 11:13:23,384][2391555] Rollout worker 17 uses device cpu
[2023-02-25 11:13:23,384][2391555] Rollout worker 18 uses device cpu
[2023-02-25 11:13:23,384][2391555] Rollout worker 19 uses device cpu
[2023-02-25 11:13:23,384][2391555] Rollout worker 20 uses device cpu
[2023-02-25 11:13:23,384][2391555] Rollout worker 21 uses device cpu
[2023-02-25 11:13:23,384][2391555] Rollout worker 22 uses device cpu
[2023-02-25 11:13:23,384][2391555] Rollout worker 23 uses device cpu
[2023-02-25 11:13:23,384][2391555] Rollout worker 24 uses device cpu
[2023-02-25 11:13:23,384][2391555] Rollout worker 25 uses device cpu
[2023-02-25 11:13:23,384][2391555] Rollout worker 26 uses device cpu
[2023-02-25 11:13:23,384][2391555] Rollout worker 27 uses device cpu
[2023-02-25 11:13:23,384][2391555] Rollout worker 28 uses device cpu
[2023-02-25 11:13:23,385][2391555] Rollout worker 29 uses device cpu
[2023-02-25 11:13:23,385][2391555] Rollout worker 30 uses device cpu
[2023-02-25 11:13:23,385][2391555] Rollout worker 31 uses device cpu
[2023-02-25 11:13:23,820][2391555] Using GPUs [0] for process 0 (actually maps to GPUs [0])
[2023-02-25 11:13:23,821][2391555] InferenceWorker_p0-w0: min num requests: 10
[2023-02-25 11:13:23,889][2391555] Starting all processes...
[2023-02-25 11:13:23,889][2391555] Starting process learner_proc0
[2023-02-25 11:13:24,631][2391555] Starting all processes...
[2023-02-25 11:13:24,637][2391555] Starting process inference_proc0-0
[2023-02-25 11:13:24,638][2391646] Using GPUs [0] for process 0 (actually maps to GPUs [0])
[2023-02-25 11:13:24,638][2391646] Set environment var CUDA_VISIBLE_DEVICES to '0' (GPU indices [0]) for learning process 0
[2023-02-25 11:13:24,639][2391555] Starting process rollout_proc0
[2023-02-25 11:13:24,639][2391555] Starting process rollout_proc1
[2023-02-25 11:13:24,639][2391555] Starting process rollout_proc2
[2023-02-25 11:13:24,640][2391555] Starting process rollout_proc3
[2023-02-25 11:13:24,641][2391555] Starting process rollout_proc4
[2023-02-25 11:13:24,642][2391555] Starting process rollout_proc5
[2023-02-25 11:13:24,643][2391555] Starting process rollout_proc6
[2023-02-25 11:13:24,645][2391555] Starting process rollout_proc7
[2023-02-25 11:13:24,648][2391555] Starting process rollout_proc8
[2023-02-25 11:13:24,654][2391646] Num visible devices: 1
[2023-02-25 11:13:24,649][2391555] Starting process rollout_proc9
[2023-02-25 11:13:24,687][2391646] Starting seed is not provided
[2023-02-25 11:13:24,687][2391646] Using GPUs [0] for process 0 (actually maps to GPUs [0])
[2023-02-25 11:13:24,688][2391646] Initializing actor-critic model on device cuda:0
[2023-02-25 11:13:24,688][2391646] RunningMeanStd input shape: (23,)
[2023-02-25 11:13:24,688][2391646] RunningMeanStd input shape: (3, 72, 128)
[2023-02-25 11:13:24,650][2391555] Starting process rollout_proc10
[2023-02-25 11:13:24,689][2391646] RunningMeanStd input shape: (1,)
[2023-02-25 11:13:24,650][2391555] Starting process rollout_proc11
[2023-02-25 11:13:24,650][2391555] Starting process rollout_proc12
[2023-02-25 11:13:24,651][2391555] Starting process rollout_proc13
[2023-02-25 11:13:24,653][2391555] Starting process rollout_proc14
[2023-02-25 11:13:24,704][2391646] ConvEncoder: input_channels=3
[2023-02-25 11:13:24,853][2391646] Conv encoder output size: 512
[2023-02-25 11:13:24,854][2391646] Policy head output size: 512
[2023-02-25 11:13:24,870][2391646] Created Actor Critic model with architecture:
[2023-02-25 11:13:24,870][2391646] ActorCriticSharedWeights(
(obs_normalizer): ObservationNormalizer(
(running_mean_std): RunningMeanStdDictInPlace(
(running_mean_std): ModuleDict(
(measurements): RunningMeanStdInPlace()
(obs): RunningMeanStdInPlace()
)
)
)
(returns_normalizer): RecursiveScriptModule(original_name=RunningMeanStdInPlace)
(encoder): VizdoomEncoder(
(basic_encoder): ConvEncoder(
(enc): RecursiveScriptModule(
original_name=ConvEncoderImpl
(conv_head): RecursiveScriptModule(
original_name=Sequential
(0): RecursiveScriptModule(original_name=Conv2d)
(1): RecursiveScriptModule(original_name=ELU)
(2): RecursiveScriptModule(original_name=Conv2d)
(3): RecursiveScriptModule(original_name=ELU)
(4): RecursiveScriptModule(original_name=Conv2d)
(5): RecursiveScriptModule(original_name=ELU)
)
(mlp_layers): RecursiveScriptModule(
original_name=Sequential
(0): RecursiveScriptModule(original_name=Linear)
(1): RecursiveScriptModule(original_name=ELU)
)
)
)
)
(core): ModelCoreRNN(
(core): GRU(512, 512)
)
(decoder): MlpDecoder(
(mlp): Identity()
)
(critic_linear): Linear(in_features=512, out_features=1, bias=True)
(action_parameterization): ActionParameterizationDefault(
(distribution_linear): Linear(in_features=512, out_features=39, bias=True)
)
)
[2023-02-25 11:13:26,317][2391555] Starting process rollout_proc15
[2023-02-25 11:13:26,327][2391692] Worker 0 uses CPU cores [0]
[2023-02-25 11:13:26,341][2391555] Starting process rollout_proc16
[2023-02-25 11:13:26,352][2391691] Using GPUs [0] for process 0 (actually maps to GPUs [0])
[2023-02-25 11:13:26,353][2391691] Set environment var CUDA_VISIBLE_DEVICES to '0' (GPU indices [0]) for inference process 0
[2023-02-25 11:13:26,370][2391555] Starting process rollout_proc17
[2023-02-25 11:13:26,376][2391691] Num visible devices: 1
[2023-02-25 11:13:26,377][2391696] Worker 5 uses CPU cores [5]
[2023-02-25 11:13:26,413][2391555] Starting process rollout_proc18
[2023-02-25 11:13:26,422][2391694] Worker 3 uses CPU cores [3]
[2023-02-25 11:13:26,423][2391555] Starting process rollout_proc19
[2023-02-25 11:13:26,434][2391695] Worker 1 uses CPU cores [1]
[2023-02-25 11:13:26,437][2391555] Starting process rollout_proc20
[2023-02-25 11:13:26,457][2391722] Worker 12 uses CPU cores [12]
[2023-02-25 11:13:26,457][2391555] Starting process rollout_proc21
[2023-02-25 11:13:26,459][2391555] Starting process rollout_proc22
[2023-02-25 11:13:26,466][2391555] Starting process rollout_proc23
[2023-02-25 11:13:26,469][2391721] Worker 14 uses CPU cores [14]
[2023-02-25 11:13:26,474][2391712] Worker 4 uses CPU cores [4]
[2023-02-25 11:13:26,486][2391718] Worker 11 uses CPU cores [11]
[2023-02-25 11:13:26,491][2391555] Starting process rollout_proc24
[2023-02-25 11:13:26,502][2391720] Worker 13 uses CPU cores [13]
[2023-02-25 11:13:26,523][2391555] Starting process rollout_proc25
[2023-02-25 11:13:26,536][2391555] Starting process rollout_proc26
[2023-02-25 11:13:26,546][2391715] Worker 7 uses CPU cores [7]
[2023-02-25 11:13:26,550][2391719] Worker 8 uses CPU cores [8]
[2023-02-25 11:13:26,558][2391555] Starting process rollout_proc27
[2023-02-25 11:13:26,570][2391693] Worker 2 uses CPU cores [2]
[2023-02-25 11:13:26,583][2391555] Starting process rollout_proc28
[2023-02-25 11:13:26,602][2391717] Worker 10 uses CPU cores [10]
[2023-02-25 11:13:26,638][2391555] Starting process rollout_proc29
[2023-02-25 11:13:26,646][2391716] Worker 9 uses CPU cores [9]
[2023-02-25 11:13:26,655][2391555] Starting process rollout_proc30
[2023-02-25 11:13:26,670][2391713] Worker 6 uses CPU cores [6]
[2023-02-25 11:13:27,941][2391646] Using optimizer <class 'torch.optim.adam.Adam'>
[2023-02-25 11:13:27,942][2391646] Loading state from checkpoint /home/sebas/research/hugging-face-course/vizdoom/train_dir/default_experiment/checkpoint_p0/checkpoint_000004884_20004864.pth...
[2023-02-25 11:13:27,976][2391646] Loading model from checkpoint
[2023-02-25 11:13:27,978][2391646] EvtLoop [learner_proc0_evt_loop, process=learner_proc0] unhandled exception in slot='init' connected to emitter=Emitter(object_id='Runner_EvtLoop', signal_name='start'), args=()
Traceback (most recent call last):
File "/home/sebas/miniconda3/envs/RL-vizdoom/lib/python3.10/site-packages/signal_slot/signal_slot.py", line 355, in _process_signal
slot_callable(*args)
File "/home/sebas/miniconda3/envs/RL-vizdoom/lib/python3.10/site-packages/sample_factory/algo/learning/learner_worker.py", line 139, in init
init_model_data = self.learner.init()
File "/home/sebas/miniconda3/envs/RL-vizdoom/lib/python3.10/site-packages/sample_factory/algo/learning/learner.py", line 243, in init
self.load_from_checkpoint(self.policy_id)
File "/home/sebas/miniconda3/envs/RL-vizdoom/lib/python3.10/site-packages/sample_factory/algo/learning/learner.py", line 305, in load_from_checkpoint
self._load_state(checkpoint_dict, load_progress=load_progress)
File "/home/sebas/miniconda3/envs/RL-vizdoom/lib/python3.10/site-packages/sample_factory/algo/learning/learner.py", line 289, in _load_state
self.actor_critic.load_state_dict(checkpoint_dict["model"])
File "/home/sebas/miniconda3/envs/RL-vizdoom/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1671, in load_state_dict
raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for ActorCriticSharedWeights:
Missing key(s) in state_dict: "obs_normalizer.running_mean_std.running_mean_std.measurements.running_mean", "obs_normalizer.running_mean_std.running_mean_std.measurements.running_var", "obs_normalizer.running_mean_std.running_mean_std.measurements.count".
size mismatch for action_parameterization.distribution_linear.weight: copying a param with shape torch.Size([5, 512]) from checkpoint, the shape in current model is torch.Size([39, 512]).
size mismatch for action_parameterization.distribution_linear.bias: copying a param with shape torch.Size([5]) from checkpoint, the shape in current model is torch.Size([39]).
[2023-02-25 11:13:27,980][2391646] Unhandled exception Error(s) in loading state_dict for ActorCriticSharedWeights:
Missing key(s) in state_dict: "obs_normalizer.running_mean_std.running_mean_std.measurements.running_mean", "obs_normalizer.running_mean_std.running_mean_std.measurements.running_var", "obs_normalizer.running_mean_std.running_mean_std.measurements.count".
size mismatch for action_parameterization.distribution_linear.weight: copying a param with shape torch.Size([5, 512]) from checkpoint, the shape in current model is torch.Size([39, 512]).
size mismatch for action_parameterization.distribution_linear.bias: copying a param with shape torch.Size([5]) from checkpoint, the shape in current model is torch.Size([39]). in evt loop learner_proc0_evt_loop
[2023-02-25 11:13:28,110][2391555] Starting process rollout_proc31
[2023-02-25 11:13:28,126][2391996] Worker 17 uses CPU cores [1]
[2023-02-25 11:13:28,242][2392076] Worker 23 uses CPU cores [7]
[2023-02-25 11:13:28,314][2392027] Worker 18 uses CPU cores [2]
[2023-02-25 11:13:28,330][2392044] Worker 20 uses CPU cores [4]
[2023-02-25 11:13:28,350][2391995] Worker 16 uses CPU cores [0]
[2023-02-25 11:13:28,378][2392028] Worker 19 uses CPU cores [3]
[2023-02-25 11:13:28,394][2392139] Worker 25 uses CPU cores [9]
[2023-02-25 11:13:28,394][2391979] Worker 15 uses CPU cores [15]
[2023-02-25 11:13:28,397][2392123] Worker 24 uses CPU cores [8]
[2023-02-25 11:13:28,407][2392171] Worker 27 uses CPU cores [11]
[2023-02-25 11:13:28,410][2392107] Worker 22 uses CPU cores [6]
[2023-02-25 11:13:28,446][2392140] Worker 26 uses CPU cores [10]
[2023-02-25 11:13:28,458][2392075] Worker 21 uses CPU cores [5]
[2023-02-25 11:13:28,490][2392203] Worker 29 uses CPU cores [13]
[2023-02-25 11:13:28,547][2392187] Worker 28 uses CPU cores [12]
[2023-02-25 11:13:28,562][2392219] Worker 30 uses CPU cores [14]
[2023-02-25 11:13:29,110][2392478] Worker 31 uses CPU cores [15]
[2023-02-25 11:13:43,817][2391555] Heartbeat connected on Batcher_0
[2023-02-25 11:13:43,822][2391555] Heartbeat connected on InferenceWorker_p0-w0
[2023-02-25 11:13:43,824][2391555] Heartbeat connected on RolloutWorker_w0
[2023-02-25 11:13:43,826][2391555] Heartbeat connected on RolloutWorker_w1
[2023-02-25 11:13:43,829][2391555] Heartbeat connected on RolloutWorker_w2
[2023-02-25 11:13:43,831][2391555] Heartbeat connected on RolloutWorker_w3
[2023-02-25 11:13:43,833][2391555] Heartbeat connected on RolloutWorker_w4
[2023-02-25 11:13:43,835][2391555] Heartbeat connected on RolloutWorker_w5
[2023-02-25 11:13:43,837][2391555] Heartbeat connected on RolloutWorker_w6
[2023-02-25 11:13:43,839][2391555] Heartbeat connected on RolloutWorker_w7
[2023-02-25 11:13:43,841][2391555] Heartbeat connected on RolloutWorker_w8
[2023-02-25 11:13:43,843][2391555] Heartbeat connected on RolloutWorker_w9
[2023-02-25 11:13:43,845][2391555] Heartbeat connected on RolloutWorker_w10
[2023-02-25 11:13:43,847][2391555] Heartbeat connected on RolloutWorker_w11
[2023-02-25 11:13:43,849][2391555] Heartbeat connected on RolloutWorker_w12
[2023-02-25 11:13:43,852][2391555] Heartbeat connected on RolloutWorker_w13
[2023-02-25 11:13:43,854][2391555] Heartbeat connected on RolloutWorker_w14
[2023-02-25 11:13:43,856][2391555] Heartbeat connected on RolloutWorker_w15
[2023-02-25 11:13:43,858][2391555] Heartbeat connected on RolloutWorker_w16
[2023-02-25 11:13:43,860][2391555] Heartbeat connected on RolloutWorker_w17
[2023-02-25 11:13:43,861][2391555] Heartbeat connected on RolloutWorker_w18
[2023-02-25 11:13:43,863][2391555] Heartbeat connected on RolloutWorker_w19
[2023-02-25 11:13:43,865][2391555] Heartbeat connected on RolloutWorker_w20
[2023-02-25 11:13:43,867][2391555] Heartbeat connected on RolloutWorker_w21
[2023-02-25 11:13:43,870][2391555] Heartbeat connected on RolloutWorker_w22
[2023-02-25 11:13:43,872][2391555] Heartbeat connected on RolloutWorker_w23
[2023-02-25 11:13:43,874][2391555] Heartbeat connected on RolloutWorker_w24
[2023-02-25 11:13:43,876][2391555] Heartbeat connected on RolloutWorker_w25
[2023-02-25 11:13:43,878][2391555] Heartbeat connected on RolloutWorker_w26
[2023-02-25 11:13:43,880][2391555] Heartbeat connected on RolloutWorker_w27
[2023-02-25 11:13:43,882][2391555] Heartbeat connected on RolloutWorker_w28
[2023-02-25 11:13:43,884][2391555] Heartbeat connected on RolloutWorker_w29
[2023-02-25 11:13:43,886][2391555] Heartbeat connected on RolloutWorker_w30
[2023-02-25 11:13:43,888][2391555] Heartbeat connected on RolloutWorker_w31
[2023-02-25 11:23:21,760][2391555] Components not started: LearnerWorker_p0, wait_time=600.0 seconds
[2023-02-25 11:33:21,760][2391555] Components not started: LearnerWorker_p0, wait_time=1200.0 seconds
[2023-02-25 11:43:21,761][2391555] Components not started: LearnerWorker_p0, wait_time=1800.0 seconds
[2023-02-25 11:43:21,762][2391555] Components take too long to start: LearnerWorker_p0. Aborting the experiment!
[2023-02-25 11:43:21,763][2391694] Stopping RolloutWorker_w3...
[2023-02-25 11:43:21,764][2391555] Component RolloutWorker_w3 stopped!
[2023-02-25 11:43:21,763][2391716] Stopping RolloutWorker_w9...
[2023-02-25 11:43:21,763][2392140] Stopping RolloutWorker_w26...
[2023-02-25 11:43:21,764][2391691] Stopping InferenceWorker_p0-w0...
[2023-02-25 11:43:21,764][2392187] Stopping RolloutWorker_w28...
[2023-02-25 11:43:21,764][2391712] Stopping RolloutWorker_w4...
[2023-02-25 11:43:21,764][2391721] Stopping RolloutWorker_w14...
[2023-02-25 11:43:21,764][2391646] Stopping Batcher_0...
[2023-02-25 11:43:21,764][2392027] Stopping RolloutWorker_w18...
[2023-02-25 11:43:21,764][2391719] Stopping RolloutWorker_w8...
[2023-02-25 11:43:21,764][2391720] Stopping RolloutWorker_w13...
[2023-02-25 11:43:21,764][2391715] Stopping RolloutWorker_w7...
[2023-02-25 11:43:21,764][2392028] Stopping RolloutWorker_w19...
[2023-02-25 11:43:21,764][2391695] Stopping RolloutWorker_w1...
[2023-02-25 11:43:21,764][2392187] Loop rollout_proc28_evt_loop terminating...
[2023-02-25 11:43:21,764][2391716] Loop rollout_proc9_evt_loop terminating...
[2023-02-25 11:43:21,764][2392140] Loop rollout_proc26_evt_loop terminating...
[2023-02-25 11:43:21,764][2391646] Loop batcher_evt_loop terminating...
[2023-02-25 11:43:21,764][2391721] Loop rollout_proc14_evt_loop terminating...
[2023-02-25 11:43:21,764][2391555] Waiting for ['Batcher_0', 'LearnerWorker_p0', 'InferenceWorker_p0-w0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w2', 'RolloutWorker_w4', 'RolloutWorker_w5', 'RolloutWorker_w6', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w9', 'RolloutWorker_w10', 'RolloutWorker_w11', 'RolloutWorker_w12', 'RolloutWorker_w13', 'RolloutWorker_w14', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w17', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w20', 'RolloutWorker_w21', 'RolloutWorker_w22', 'RolloutWorker_w23', 'RolloutWorker_w24', 'RolloutWorker_w25', 'RolloutWorker_w26', 'RolloutWorker_w27', 'RolloutWorker_w28', 'RolloutWorker_w29', 'RolloutWorker_w30', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,764][2392027] Loop rollout_proc18_evt_loop terminating...
[2023-02-25 11:43:21,764][2391719] Loop rollout_proc8_evt_loop terminating...
[2023-02-25 11:43:21,764][2391715] Loop rollout_proc7_evt_loop terminating...
[2023-02-25 11:43:21,764][2392028] Loop rollout_proc19_evt_loop terminating...
[2023-02-25 11:43:21,764][2391718] Stopping RolloutWorker_w11...
[2023-02-25 11:43:21,764][2391555] Component RolloutWorker_w26 stopped!
[2023-02-25 11:43:21,764][2392075] Stopping RolloutWorker_w21...
[2023-02-25 11:43:21,764][2391995] Stopping RolloutWorker_w16...
[2023-02-25 11:43:21,764][2392107] Stopping RolloutWorker_w22...
[2023-02-25 11:43:21,764][2391555] Waiting for ['Batcher_0', 'LearnerWorker_p0', 'InferenceWorker_p0-w0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w2', 'RolloutWorker_w4', 'RolloutWorker_w5', 'RolloutWorker_w6', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w9', 'RolloutWorker_w10', 'RolloutWorker_w11', 'RolloutWorker_w12', 'RolloutWorker_w13', 'RolloutWorker_w14', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w17', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w20', 'RolloutWorker_w21', 'RolloutWorker_w22', 'RolloutWorker_w23', 'RolloutWorker_w24', 'RolloutWorker_w25', 'RolloutWorker_w27', 'RolloutWorker_w28', 'RolloutWorker_w29', 'RolloutWorker_w30', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,764][2391718] Loop rollout_proc11_evt_loop terminating...
[2023-02-25 11:43:21,765][2391995] Loop rollout_proc16_evt_loop terminating...
[2023-02-25 11:43:21,765][2391555] Component RolloutWorker_w9 stopped!
[2023-02-25 11:43:21,765][2392075] Loop rollout_proc21_evt_loop terminating...
[2023-02-25 11:43:21,763][2391713] Stopping RolloutWorker_w6...
[2023-02-25 11:43:21,763][2392044] Stopping RolloutWorker_w20...
[2023-02-25 11:43:21,765][2391713] Loop rollout_proc6_evt_loop terminating...
[2023-02-25 11:43:21,765][2391979] Stopping RolloutWorker_w15...
[2023-02-25 11:43:21,765][2392044] Loop rollout_proc20_evt_loop terminating...
[2023-02-25 11:43:21,766][2391979] Loop rollout_proc15_evt_loop terminating...
[2023-02-25 11:43:21,763][2391722] Stopping RolloutWorker_w12...
[2023-02-25 11:43:21,771][2392076] Stopping RolloutWorker_w23...
[2023-02-25 11:43:21,771][2391722] Loop rollout_proc12_evt_loop terminating...
[2023-02-25 11:43:21,771][2392478] Stopping RolloutWorker_w31...
[2023-02-25 11:43:21,771][2392076] Loop rollout_proc23_evt_loop terminating...
[2023-02-25 11:43:21,771][2392478] Loop rollout_proc31_evt_loop terminating...
[2023-02-25 11:43:21,763][2392219] Stopping RolloutWorker_w30...
[2023-02-25 11:43:21,775][2392219] Loop rollout_proc30_evt_loop terminating...
[2023-02-25 11:43:21,763][2391693] Stopping RolloutWorker_w2...
[2023-02-25 11:43:21,779][2391693] Loop rollout_proc2_evt_loop terminating...
[2023-02-25 11:43:21,763][2392123] Stopping RolloutWorker_w24...
[2023-02-25 11:43:21,783][2392123] Loop rollout_proc24_evt_loop terminating...
[2023-02-25 11:43:21,763][2392203] Stopping RolloutWorker_w29...
[2023-02-25 11:43:21,787][2392203] Loop rollout_proc29_evt_loop terminating...
[2023-02-25 11:43:21,763][2391694] Loop rollout_proc3_evt_loop terminating...
[2023-02-25 11:43:21,764][2391720] Loop rollout_proc13_evt_loop terminating...
[2023-02-25 11:43:21,763][2391996] Stopping RolloutWorker_w17...
[2023-02-25 11:43:21,791][2391996] Loop rollout_proc17_evt_loop terminating...
[2023-02-25 11:43:21,764][2391717] Stopping RolloutWorker_w10...
[2023-02-25 11:43:21,795][2391717] Loop rollout_proc10_evt_loop terminating...
[2023-02-25 11:43:21,764][2392139] Stopping RolloutWorker_w25...
[2023-02-25 11:43:21,799][2392139] Loop rollout_proc25_evt_loop terminating...
[2023-02-25 11:43:21,764][2392171] Stopping RolloutWorker_w27...
[2023-02-25 11:43:21,764][2391691] Loop inference_proc0-0_evt_loop terminating...
[2023-02-25 11:43:21,803][2392171] Loop rollout_proc27_evt_loop terminating...
[2023-02-25 11:43:21,764][2391712] Loop rollout_proc4_evt_loop terminating...
[2023-02-25 11:43:21,764][2391695] Loop rollout_proc1_evt_loop terminating...
[2023-02-25 11:43:21,765][2392107] Loop rollout_proc22_evt_loop terminating...
[2023-02-25 11:43:21,763][2391696] Stopping RolloutWorker_w5...
[2023-02-25 11:43:21,765][2391555] Waiting for ['Batcher_0', 'LearnerWorker_p0', 'InferenceWorker_p0-w0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w2', 'RolloutWorker_w4', 'RolloutWorker_w5', 'RolloutWorker_w6', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w10', 'RolloutWorker_w11', 'RolloutWorker_w12', 'RolloutWorker_w13', 'RolloutWorker_w14', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w17', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w20', 'RolloutWorker_w21', 'RolloutWorker_w22', 'RolloutWorker_w23', 'RolloutWorker_w24', 'RolloutWorker_w25', 'RolloutWorker_w27', 'RolloutWorker_w28', 'RolloutWorker_w29', 'RolloutWorker_w30', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,815][2391555] Component RolloutWorker_w20 stopped!
[2023-02-25 11:43:21,815][2391696] Loop rollout_proc5_evt_loop terminating...
[2023-02-25 11:43:21,815][2391555] Waiting for ['Batcher_0', 'LearnerWorker_p0', 'InferenceWorker_p0-w0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w2', 'RolloutWorker_w4', 'RolloutWorker_w5', 'RolloutWorker_w6', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w10', 'RolloutWorker_w11', 'RolloutWorker_w12', 'RolloutWorker_w13', 'RolloutWorker_w14', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w17', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w21', 'RolloutWorker_w22', 'RolloutWorker_w23', 'RolloutWorker_w24', 'RolloutWorker_w25', 'RolloutWorker_w27', 'RolloutWorker_w28', 'RolloutWorker_w29', 'RolloutWorker_w30', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,816][2391555] Component RolloutWorker_w6 stopped!
[2023-02-25 11:43:21,816][2391555] Waiting for ['Batcher_0', 'LearnerWorker_p0', 'InferenceWorker_p0-w0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w2', 'RolloutWorker_w4', 'RolloutWorker_w5', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w10', 'RolloutWorker_w11', 'RolloutWorker_w12', 'RolloutWorker_w13', 'RolloutWorker_w14', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w17', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w21', 'RolloutWorker_w22', 'RolloutWorker_w23', 'RolloutWorker_w24', 'RolloutWorker_w25', 'RolloutWorker_w27', 'RolloutWorker_w28', 'RolloutWorker_w29', 'RolloutWorker_w30', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,816][2391555] Component RolloutWorker_w12 stopped!
[2023-02-25 11:43:21,816][2391555] Waiting for ['Batcher_0', 'LearnerWorker_p0', 'InferenceWorker_p0-w0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w2', 'RolloutWorker_w4', 'RolloutWorker_w5', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w10', 'RolloutWorker_w11', 'RolloutWorker_w13', 'RolloutWorker_w14', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w17', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w21', 'RolloutWorker_w22', 'RolloutWorker_w23', 'RolloutWorker_w24', 'RolloutWorker_w25', 'RolloutWorker_w27', 'RolloutWorker_w28', 'RolloutWorker_w29', 'RolloutWorker_w30', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,816][2391555] Component RolloutWorker_w30 stopped!
[2023-02-25 11:43:21,816][2391555] Waiting for ['Batcher_0', 'LearnerWorker_p0', 'InferenceWorker_p0-w0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w2', 'RolloutWorker_w4', 'RolloutWorker_w5', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w10', 'RolloutWorker_w11', 'RolloutWorker_w13', 'RolloutWorker_w14', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w17', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w21', 'RolloutWorker_w22', 'RolloutWorker_w23', 'RolloutWorker_w24', 'RolloutWorker_w25', 'RolloutWorker_w27', 'RolloutWorker_w28', 'RolloutWorker_w29', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,817][2391555] Component RolloutWorker_w2 stopped!
[2023-02-25 11:43:21,817][2391555] Waiting for ['Batcher_0', 'LearnerWorker_p0', 'InferenceWorker_p0-w0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w4', 'RolloutWorker_w5', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w10', 'RolloutWorker_w11', 'RolloutWorker_w13', 'RolloutWorker_w14', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w17', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w21', 'RolloutWorker_w22', 'RolloutWorker_w23', 'RolloutWorker_w24', 'RolloutWorker_w25', 'RolloutWorker_w27', 'RolloutWorker_w28', 'RolloutWorker_w29', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,817][2391555] Component RolloutWorker_w5 stopped!
[2023-02-25 11:43:21,817][2391555] Waiting for ['Batcher_0', 'LearnerWorker_p0', 'InferenceWorker_p0-w0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w4', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w10', 'RolloutWorker_w11', 'RolloutWorker_w13', 'RolloutWorker_w14', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w17', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w21', 'RolloutWorker_w22', 'RolloutWorker_w23', 'RolloutWorker_w24', 'RolloutWorker_w25', 'RolloutWorker_w27', 'RolloutWorker_w28', 'RolloutWorker_w29', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,817][2391555] Component RolloutWorker_w29 stopped!
[2023-02-25 11:43:21,817][2391555] Waiting for ['Batcher_0', 'LearnerWorker_p0', 'InferenceWorker_p0-w0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w4', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w10', 'RolloutWorker_w11', 'RolloutWorker_w13', 'RolloutWorker_w14', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w17', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w21', 'RolloutWorker_w22', 'RolloutWorker_w23', 'RolloutWorker_w24', 'RolloutWorker_w25', 'RolloutWorker_w27', 'RolloutWorker_w28', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,818][2391555] Component RolloutWorker_w24 stopped!
[2023-02-25 11:43:21,818][2391555] Waiting for ['Batcher_0', 'LearnerWorker_p0', 'InferenceWorker_p0-w0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w4', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w10', 'RolloutWorker_w11', 'RolloutWorker_w13', 'RolloutWorker_w14', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w17', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w21', 'RolloutWorker_w22', 'RolloutWorker_w23', 'RolloutWorker_w25', 'RolloutWorker_w27', 'RolloutWorker_w28', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,818][2391555] Component RolloutWorker_w11 stopped!
[2023-02-25 11:43:21,818][2391555] Waiting for ['Batcher_0', 'LearnerWorker_p0', 'InferenceWorker_p0-w0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w4', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w10', 'RolloutWorker_w13', 'RolloutWorker_w14', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w17', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w21', 'RolloutWorker_w22', 'RolloutWorker_w23', 'RolloutWorker_w25', 'RolloutWorker_w27', 'RolloutWorker_w28', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,818][2391555] Component RolloutWorker_w17 stopped!
[2023-02-25 11:43:21,771][2391692] Stopping RolloutWorker_w0...
[2023-02-25 11:43:21,818][2391555] Waiting for ['Batcher_0', 'LearnerWorker_p0', 'InferenceWorker_p0-w0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w4', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w10', 'RolloutWorker_w13', 'RolloutWorker_w14', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w21', 'RolloutWorker_w22', 'RolloutWorker_w23', 'RolloutWorker_w25', 'RolloutWorker_w27', 'RolloutWorker_w28', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,819][2391555] Component RolloutWorker_w10 stopped!
[2023-02-25 11:43:21,819][2391692] Loop rollout_proc0_evt_loop terminating...
[2023-02-25 11:43:21,819][2391555] Waiting for ['Batcher_0', 'LearnerWorker_p0', 'InferenceWorker_p0-w0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w4', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w13', 'RolloutWorker_w14', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w21', 'RolloutWorker_w22', 'RolloutWorker_w23', 'RolloutWorker_w25', 'RolloutWorker_w27', 'RolloutWorker_w28', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,819][2391555] Component RolloutWorker_w25 stopped!
[2023-02-25 11:43:21,819][2391555] Waiting for ['Batcher_0', 'LearnerWorker_p0', 'InferenceWorker_p0-w0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w4', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w13', 'RolloutWorker_w14', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w21', 'RolloutWorker_w22', 'RolloutWorker_w23', 'RolloutWorker_w27', 'RolloutWorker_w28', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,819][2391555] Component InferenceWorker_p0-w0 stopped!
[2023-02-25 11:43:21,819][2391555] Waiting for ['Batcher_0', 'LearnerWorker_p0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w4', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w13', 'RolloutWorker_w14', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w21', 'RolloutWorker_w22', 'RolloutWorker_w23', 'RolloutWorker_w27', 'RolloutWorker_w28', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,820][2391555] Component RolloutWorker_w27 stopped!
[2023-02-25 11:43:21,820][2391555] Waiting for ['Batcher_0', 'LearnerWorker_p0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w4', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w13', 'RolloutWorker_w14', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w21', 'RolloutWorker_w22', 'RolloutWorker_w23', 'RolloutWorker_w28', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,820][2391555] Component RolloutWorker_w28 stopped!
[2023-02-25 11:43:21,820][2391555] Waiting for ['Batcher_0', 'LearnerWorker_p0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w4', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w13', 'RolloutWorker_w14', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w21', 'RolloutWorker_w22', 'RolloutWorker_w23', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,820][2391555] Component RolloutWorker_w4 stopped!
[2023-02-25 11:43:21,820][2391555] Waiting for ['Batcher_0', 'LearnerWorker_p0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w13', 'RolloutWorker_w14', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w21', 'RolloutWorker_w22', 'RolloutWorker_w23', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,820][2391555] Component RolloutWorker_w21 stopped!
[2023-02-25 11:43:21,821][2391555] Waiting for ['Batcher_0', 'LearnerWorker_p0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w13', 'RolloutWorker_w14', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w22', 'RolloutWorker_w23', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,821][2391555] Component RolloutWorker_w22 stopped!
[2023-02-25 11:43:21,821][2391555] Waiting for ['Batcher_0', 'LearnerWorker_p0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w13', 'RolloutWorker_w14', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w23', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,821][2391555] Component RolloutWorker_w14 stopped!
[2023-02-25 11:43:21,821][2391555] Waiting for ['Batcher_0', 'LearnerWorker_p0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w13', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w23', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,821][2391555] Component Batcher_0 stopped!
[2023-02-25 11:43:21,821][2391555] Waiting for ['LearnerWorker_p0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w13', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w18', 'RolloutWorker_w19', 'RolloutWorker_w23', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,821][2391555] Component RolloutWorker_w18 stopped!
[2023-02-25 11:43:21,822][2391555] Waiting for ['LearnerWorker_p0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w13', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w19', 'RolloutWorker_w23', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,822][2391555] Component RolloutWorker_w13 stopped!
[2023-02-25 11:43:21,822][2391555] Waiting for ['LearnerWorker_p0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w7', 'RolloutWorker_w8', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w19', 'RolloutWorker_w23', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,822][2391555] Component RolloutWorker_w8 stopped!
[2023-02-25 11:43:21,822][2391555] Waiting for ['LearnerWorker_p0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w7', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w19', 'RolloutWorker_w23', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,822][2391555] Component RolloutWorker_w7 stopped!
[2023-02-25 11:43:21,822][2391555] Waiting for ['LearnerWorker_p0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w19', 'RolloutWorker_w23', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,822][2391555] Component RolloutWorker_w19 stopped!
[2023-02-25 11:43:21,822][2391555] Waiting for ['LearnerWorker_p0', 'RolloutWorker_w0', 'RolloutWorker_w1', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w23', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,823][2391555] Component RolloutWorker_w1 stopped!
[2023-02-25 11:43:21,823][2391555] Waiting for ['LearnerWorker_p0', 'RolloutWorker_w0', 'RolloutWorker_w15', 'RolloutWorker_w16', 'RolloutWorker_w23', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,823][2391555] Component RolloutWorker_w16 stopped!
[2023-02-25 11:43:21,823][2391555] Waiting for ['LearnerWorker_p0', 'RolloutWorker_w0', 'RolloutWorker_w15', 'RolloutWorker_w23', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,823][2391555] Component RolloutWorker_w15 stopped!
[2023-02-25 11:43:21,823][2391555] Waiting for ['LearnerWorker_p0', 'RolloutWorker_w0', 'RolloutWorker_w23', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,823][2391555] Component RolloutWorker_w23 stopped!
[2023-02-25 11:43:21,823][2391555] Waiting for ['LearnerWorker_p0', 'RolloutWorker_w0', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,823][2391555] Component RolloutWorker_w0 stopped!
[2023-02-25 11:43:21,824][2391555] Waiting for ['LearnerWorker_p0', 'RolloutWorker_w31'] to stop...
[2023-02-25 11:43:21,824][2391555] Component RolloutWorker_w31 stopped!
[2023-02-25 11:43:21,824][2391555] Waiting for ['LearnerWorker_p0'] to stop...
[2023-02-25 12:10:30,585][2391555] Keyboard interrupt detected in the event loop EvtLoop [Runner_EvtLoop, process=main process 2391555], exiting...
[2023-02-25 12:10:30,588][2391555] Runner profile tree view:
main_loop: 3426.6990
[2023-02-25 12:10:30,588][2391555] Collected {}, FPS: 0.0
[2023-02-25 12:10:30,595][2391555] Loading existing experiment configuration from train_dir/default_experiment/config.json
[2023-02-25 12:10:30,595][2391555] Overriding arg 'env' with value 'doom_health_gathering_supreme' passed from command line
[2023-02-25 12:10:30,596][2391555] Overriding arg 'train_dir' with value 'train_dir' passed from command line
[2023-02-25 12:10:30,596][2391555] Overriding arg 'num_workers' with value 1 passed from command line
[2023-02-25 12:10:30,596][2391555] Adding new argument 'no_render'=True that is not in the saved config file!
[2023-02-25 12:10:30,596][2391555] Adding new argument 'save_video'=True that is not in the saved config file!
[2023-02-25 12:10:30,596][2391555] Adding new argument 'video_frames'=1000000000.0 that is not in the saved config file!
[2023-02-25 12:10:30,596][2391555] Adding new argument 'video_name'=None that is not in the saved config file!
[2023-02-25 12:10:30,596][2391555] Adding new argument 'max_num_frames'=1000000000.0 that is not in the saved config file!
[2023-02-25 12:10:30,596][2391555] Adding new argument 'max_num_episodes'=10 that is not in the saved config file!
[2023-02-25 12:10:30,596][2391555] Adding new argument 'push_to_hub'=True that is not in the saved config file!
[2023-02-25 12:10:30,596][2391555] Adding new argument 'hf_repository'='eldraco/rl_course_vizdoom_health_gathering_supreme' that is not in the saved config file!
[2023-02-25 12:10:30,596][2391555] Adding new argument 'policy_index'=0 that is not in the saved config file!
[2023-02-25 12:10:30,596][2391555] Adding new argument 'eval_deterministic'=False that is not in the saved config file!
[2023-02-25 12:10:30,596][2391555] Adding new argument 'train_script'=None that is not in the saved config file!
[2023-02-25 12:10:30,597][2391555] Adding new argument 'enjoy_script'=None that is not in the saved config file!
[2023-02-25 12:10:30,597][2391555] Using frameskip 1 and render_action_repeat=4 for evaluation
[2023-02-25 12:10:30,602][2391555] Doom resolution: 160x120, resize resolution: (128, 72)
[2023-02-25 12:10:30,603][2391555] RunningMeanStd input shape: (3, 72, 128)
[2023-02-25 12:10:30,603][2391555] RunningMeanStd input shape: (1,)
[2023-02-25 12:10:30,610][2391555] ConvEncoder: input_channels=3
[2023-02-25 12:10:30,753][2391555] Conv encoder output size: 512
[2023-02-25 12:10:30,753][2391555] Policy head output size: 512
[2023-02-25 12:10:32,086][2391555] Loading state from checkpoint train_dir/default_experiment/checkpoint_p0/checkpoint_000004884_20004864.pth...
[2023-02-25 12:10:32,762][2391555] Num frames 100...
[2023-02-25 12:10:32,829][2391555] Num frames 200...
[2023-02-25 12:10:32,896][2391555] Num frames 300...
[2023-02-25 12:10:32,964][2391555] Num frames 400...
[2023-02-25 12:10:33,034][2391555] Num frames 500...
[2023-02-25 12:10:33,121][2391555] Num frames 600...
[2023-02-25 12:10:33,191][2391555] Num frames 700...
[2023-02-25 12:10:33,261][2391555] Num frames 800...
[2023-02-25 12:10:33,330][2391555] Num frames 900...
[2023-02-25 12:10:33,398][2391555] Num frames 1000...
[2023-02-25 12:10:33,466][2391555] Num frames 1100...
[2023-02-25 12:10:33,536][2391555] Num frames 1200...
[2023-02-25 12:10:33,605][2391555] Num frames 1300...
[2023-02-25 12:10:33,673][2391555] Num frames 1400...
[2023-02-25 12:10:33,741][2391555] Num frames 1500...
[2023-02-25 12:10:33,796][2391555] Avg episode rewards: #0: 40.040, true rewards: #0: 15.040
[2023-02-25 12:10:33,796][2391555] Avg episode reward: 40.040, avg true_objective: 15.040
[2023-02-25 12:10:33,886][2391555] Num frames 1600...
[2023-02-25 12:10:33,955][2391555] Num frames 1700...
[2023-02-25 12:10:34,023][2391555] Num frames 1800...
[2023-02-25 12:10:34,091][2391555] Num frames 1900...
[2023-02-25 12:10:34,158][2391555] Num frames 2000...
[2023-02-25 12:10:34,226][2391555] Num frames 2100...
[2023-02-25 12:10:34,286][2391555] Avg episode rewards: #0: 25.560, true rewards: #0: 10.560
[2023-02-25 12:10:34,287][2391555] Avg episode reward: 25.560, avg true_objective: 10.560
[2023-02-25 12:10:34,371][2391555] Num frames 2200...
[2023-02-25 12:10:34,439][2391555] Num frames 2300...
[2023-02-25 12:10:34,506][2391555] Num frames 2400...
[2023-02-25 12:10:34,573][2391555] Num frames 2500...
[2023-02-25 12:10:34,642][2391555] Num frames 2600...
[2023-02-25 12:10:34,711][2391555] Num frames 2700...
[2023-02-25 12:10:34,780][2391555] Num frames 2800...
[2023-02-25 12:10:34,842][2391555] Avg episode rewards: #0: 22.043, true rewards: #0: 9.377
[2023-02-25 12:10:34,842][2391555] Avg episode reward: 22.043, avg true_objective: 9.377
[2023-02-25 12:10:34,928][2391555] Num frames 2900...
[2023-02-25 12:10:34,995][2391555] Num frames 3000...
[2023-02-25 12:10:35,062][2391555] Num frames 3100...
[2023-02-25 12:10:35,133][2391555] Num frames 3200...
[2023-02-25 12:10:35,203][2391555] Num frames 3300...
[2023-02-25 12:10:35,271][2391555] Num frames 3400...
[2023-02-25 12:10:35,339][2391555] Num frames 3500...
[2023-02-25 12:10:35,408][2391555] Num frames 3600...
[2023-02-25 12:10:35,475][2391555] Num frames 3700...
[2023-02-25 12:10:35,545][2391555] Num frames 3800...
[2023-02-25 12:10:35,613][2391555] Num frames 3900...
[2023-02-25 12:10:35,680][2391555] Num frames 4000...
[2023-02-25 12:10:35,749][2391555] Num frames 4100...
[2023-02-25 12:10:35,819][2391555] Num frames 4200...
[2023-02-25 12:10:35,889][2391555] Num frames 4300...
[2023-02-25 12:10:35,957][2391555] Num frames 4400...
[2023-02-25 12:10:36,024][2391555] Num frames 4500...
[2023-02-25 12:10:36,093][2391555] Num frames 4600...
[2023-02-25 12:10:36,163][2391555] Num frames 4700...
[2023-02-25 12:10:36,231][2391555] Num frames 4800...
[2023-02-25 12:10:36,299][2391555] Num frames 4900...
[2023-02-25 12:10:36,361][2391555] Avg episode rewards: #0: 31.282, true rewards: #0: 12.283
[2023-02-25 12:10:36,361][2391555] Avg episode reward: 31.282, avg true_objective: 12.283
[2023-02-25 12:10:36,443][2391555] Num frames 5000...
[2023-02-25 12:10:36,511][2391555] Num frames 5100...
[2023-02-25 12:10:36,580][2391555] Num frames 5200...
[2023-02-25 12:10:36,647][2391555] Num frames 5300...
[2023-02-25 12:10:36,716][2391555] Num frames 5400...
[2023-02-25 12:10:36,786][2391555] Num frames 5500...
[2023-02-25 12:10:36,854][2391555] Num frames 5600...
[2023-02-25 12:10:36,923][2391555] Num frames 5700...
[2023-02-25 12:10:36,991][2391555] Num frames 5800...
[2023-02-25 12:10:37,060][2391555] Num frames 5900...
[2023-02-25 12:10:37,128][2391555] Num frames 6000...
[2023-02-25 12:10:37,198][2391555] Num frames 6100...
[2023-02-25 12:10:37,280][2391555] Num frames 6200...
[2023-02-25 12:10:37,350][2391555] Num frames 6300...
[2023-02-25 12:10:37,418][2391555] Num frames 6400...
[2023-02-25 12:10:37,488][2391555] Num frames 6500...
[2023-02-25 12:10:37,557][2391555] Num frames 6600...
[2023-02-25 12:10:37,625][2391555] Num frames 6700...
[2023-02-25 12:10:37,696][2391555] Num frames 6800...
[2023-02-25 12:10:37,765][2391555] Num frames 6900...
[2023-02-25 12:10:37,834][2391555] Num frames 7000...
[2023-02-25 12:10:37,895][2391555] Avg episode rewards: #0: 36.425, true rewards: #0: 14.026
[2023-02-25 12:10:37,896][2391555] Avg episode reward: 36.425, avg true_objective: 14.026
[2023-02-25 12:10:37,979][2391555] Num frames 7100...
[2023-02-25 12:10:38,045][2391555] Num frames 7200...
[2023-02-25 12:10:38,113][2391555] Num frames 7300...
[2023-02-25 12:10:38,182][2391555] Num frames 7400...
[2023-02-25 12:10:38,251][2391555] Num frames 7500...
[2023-02-25 12:10:38,318][2391555] Num frames 7600...
[2023-02-25 12:10:38,386][2391555] Num frames 7700...
[2023-02-25 12:10:38,455][2391555] Num frames 7800...
[2023-02-25 12:10:38,522][2391555] Num frames 7900...
[2023-02-25 12:10:38,591][2391555] Num frames 8000...
[2023-02-25 12:10:38,659][2391555] Num frames 8100...
[2023-02-25 12:10:38,729][2391555] Num frames 8200...
[2023-02-25 12:10:38,798][2391555] Num frames 8300...
[2023-02-25 12:10:38,866][2391555] Num frames 8400...
[2023-02-25 12:10:38,936][2391555] Num frames 8500...
[2023-02-25 12:10:39,004][2391555] Num frames 8600...
[2023-02-25 12:10:39,072][2391555] Num frames 8700...
[2023-02-25 12:10:39,131][2391555] Avg episode rewards: #0: 38.514, true rewards: #0: 14.515
[2023-02-25 12:10:39,131][2391555] Avg episode reward: 38.514, avg true_objective: 14.515
[2023-02-25 12:10:39,218][2391555] Num frames 8800...
[2023-02-25 12:10:39,287][2391555] Num frames 8900...
[2023-02-25 12:10:39,364][2391555] Num frames 9000...
[2023-02-25 12:10:39,433][2391555] Num frames 9100...
[2023-02-25 12:10:39,502][2391555] Num frames 9200...
[2023-02-25 12:10:39,570][2391555] Num frames 9300...
[2023-02-25 12:10:39,638][2391555] Num frames 9400...
[2023-02-25 12:10:39,706][2391555] Num frames 9500...
[2023-02-25 12:10:39,774][2391555] Num frames 9600...
[2023-02-25 12:10:39,842][2391555] Num frames 9700...
[2023-02-25 12:10:39,934][2391555] Avg episode rewards: #0: 36.367, true rewards: #0: 13.939
[2023-02-25 12:10:39,935][2391555] Avg episode reward: 36.367, avg true_objective: 13.939
[2023-02-25 12:10:40,022][2391555] Num frames 9800...
[2023-02-25 12:10:40,090][2391555] Num frames 9900...
[2023-02-25 12:10:40,159][2391555] Num frames 10000...
[2023-02-25 12:10:40,227][2391555] Num frames 10100...
[2023-02-25 12:10:40,296][2391555] Num frames 10200...
[2023-02-25 12:10:40,365][2391555] Num frames 10300...
[2023-02-25 12:10:40,434][2391555] Num frames 10400...
[2023-02-25 12:10:40,503][2391555] Num frames 10500...
[2023-02-25 12:10:40,571][2391555] Num frames 10600...
[2023-02-25 12:10:40,641][2391555] Num frames 10700...
[2023-02-25 12:10:40,712][2391555] Num frames 10800...
[2023-02-25 12:10:40,783][2391555] Num frames 10900...
[2023-02-25 12:10:40,852][2391555] Num frames 11000...
[2023-02-25 12:10:40,922][2391555] Num frames 11100...
[2023-02-25 12:10:40,991][2391555] Num frames 11200...
[2023-02-25 12:10:41,048][2391555] Avg episode rewards: #0: 36.757, true rewards: #0: 14.007
[2023-02-25 12:10:41,048][2391555] Avg episode reward: 36.757, avg true_objective: 14.007
[2023-02-25 12:10:41,141][2391555] Num frames 11300...
[2023-02-25 12:10:41,209][2391555] Num frames 11400...
[2023-02-25 12:10:41,278][2391555] Num frames 11500...
[2023-02-25 12:10:41,347][2391555] Num frames 11600...
[2023-02-25 12:10:41,419][2391555] Num frames 11700...
[2023-02-25 12:10:41,489][2391555] Num frames 11800...
[2023-02-25 12:10:41,559][2391555] Num frames 11900...
[2023-02-25 12:10:41,626][2391555] Num frames 12000...
[2023-02-25 12:10:41,694][2391555] Num frames 12100...
[2023-02-25 12:10:41,793][2391555] Avg episode rewards: #0: 34.740, true rewards: #0: 13.518
[2023-02-25 12:10:41,793][2391555] Avg episode reward: 34.740, avg true_objective: 13.518
[2023-02-25 12:10:41,834][2391555] Num frames 12200...
[2023-02-25 12:10:41,911][2391555] Num frames 12300...
[2023-02-25 12:10:41,981][2391555] Num frames 12400...
[2023-02-25 12:10:42,050][2391555] Num frames 12500...
[2023-02-25 12:10:42,121][2391555] Num frames 12600...
[2023-02-25 12:10:42,192][2391555] Num frames 12700...
[2023-02-25 12:10:42,261][2391555] Num frames 12800...
[2023-02-25 12:10:42,330][2391555] Num frames 12900...
[2023-02-25 12:10:42,399][2391555] Num frames 13000...
[2023-02-25 12:10:42,468][2391555] Num frames 13100...
[2023-02-25 12:10:42,538][2391555] Num frames 13200...
[2023-02-25 12:10:42,607][2391555] Num frames 13300...
[2023-02-25 12:10:42,675][2391555] Num frames 13400...
[2023-02-25 12:10:42,748][2391555] Num frames 13500...
[2023-02-25 12:10:42,820][2391555] Num frames 13600...
[2023-02-25 12:10:42,891][2391555] Num frames 13700...
[2023-02-25 12:10:42,964][2391555] Num frames 13800...
[2023-02-25 12:10:43,037][2391555] Num frames 13900...
[2023-02-25 12:10:43,108][2391555] Num frames 14000...
[2023-02-25 12:10:43,178][2391555] Num frames 14100...
[2023-02-25 12:10:43,251][2391555] Num frames 14200...
[2023-02-25 12:10:43,351][2391555] Avg episode rewards: #0: 37.566, true rewards: #0: 14.266
[2023-02-25 12:10:43,351][2391555] Avg episode reward: 37.566, avg true_objective: 14.266
[2023-02-25 12:11:00,944][2391555] Replay video saved to train_dir/default_experiment/replay.mp4!