2022-05-03 11:40:27,479 INFO [train.py:775] (0/8) Training started
2022-05-03 11:40:27,483 INFO [train.py:785] (0/8) Device: cuda:0
2022-05-03 11:40:27,485 INFO [train.py:794] (0/8) {'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': 0, 'log_interval': 50, 'reset_interval': 200, 'valid_interval': 3000, 'feature_dim': 80, 'subsampling_factor': 4, 'encoder_dim': 512, 'nhead': 8, 'dim_feedforward': 2048, 'num_encoder_layers': 12, 'decoder_dim': 512, 'joiner_dim': 512, 'model_warm_step': 3000, 'env_info': {'k2-version': '1.14', 'k2-build-type': 'Debug', 'k2-with-cuda': True, 'k2-git-sha1': '1b29f0a946f50186aaa82df46a59f492ade9692b', 'k2-git-date': 'Tue Apr 12 20:46:49 2022', 'lhotse-version': '1.1.0', 'torch-version': '1.10.1+cu111', 'torch-cuda-available': True, 'torch-cuda-version': '11.1', 'python-version': '3.8', 'icefall-git-branch': 'spgi', 'icefall-git-sha1': 'e2e5c77-dirty', 'icefall-git-date': 'Mon May 2 14:38:25 2022', 'icefall-path': '/exp/draj/mini_scale_2022/icefall', 'k2-path': '/exp/draj/mini_scale_2022/k2/k2/python/k2/__init__.py', 'lhotse-path': '/exp/draj/mini_scale_2022/lhotse/lhotse/__init__.py', 'hostname': 'r8n04', 'IP address': '10.1.8.4'}, 'world_size': 8, 'master_port': 12354, 'tensorboard': True, 'num_epochs': 20, 'start_epoch': 0, 'start_batch': 0, 'exp_dir': PosixPath('pruned_transducer_stateless2/exp/v2'), 'bpe_model': 'data/lang_bpe_500/bpe.model', 'initial_lr': 0.003, 'lr_batches': 5000, 'lr_epochs': 4, 'context_size': 2, 'prune_range': 5, 'lm_scale': 0.25, 'am_scale': 0.0, 'simple_loss_scale': 0.5, 'seed': 42, 'print_diagnostics': False, 'save_every_n': 8000, 'keep_last_k': 10, 'use_fp16': True, 'manifest_dir': PosixPath('data/manifests'), 'enable_musan': True, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'max_duration': 200, 'num_buckets': 30, 'on_the_fly_feats': False, 'shuffle': True, 'num_workers': 8, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'blank_id': 0, 'vocab_size': 500}
2022-05-03 11:40:27,486 INFO [train.py:796] (0/8) About to create model
2022-05-03 11:40:27,846 INFO [train.py:800] (0/8) Number of model parameters: 78648040
2022-05-03 11:40:33,552 INFO [train.py:806] (0/8) Using DDP
2022-05-03 11:40:34,134 INFO [asr_datamodule.py:321] (0/8) About to get SPGISpeech train cuts
2022-05-03 11:40:34,137 INFO [asr_datamodule.py:179] (0/8) About to get Musan cuts
2022-05-03 11:40:35,817 INFO [asr_datamodule.py:184] (0/8) Enable MUSAN
2022-05-03 11:40:35,817 INFO [asr_datamodule.py:207] (0/8) Enable SpecAugment
2022-05-03 11:40:35,817 INFO [asr_datamodule.py:208] (0/8) Time warp factor: 80
2022-05-03 11:40:35,817 INFO [asr_datamodule.py:221] (0/8) About to create train dataset
2022-05-03 11:40:35,817 INFO [asr_datamodule.py:234] (0/8) Using DynamicBucketingSampler.
2022-05-03 11:40:36,213 INFO [asr_datamodule.py:242] (0/8) About to create train dataloader
2022-05-03 11:40:36,214 INFO [asr_datamodule.py:326] (0/8) About to get SPGISpeech dev cuts
2022-05-03 11:40:36,215 INFO [asr_datamodule.py:274] (0/8) About to create dev dataset
2022-05-03 11:40:36,363 INFO [asr_datamodule.py:289] (0/8) About to create dev dataloader
2022-05-03 11:41:08,013 INFO [train.py:715] (0/8) Epoch 0, batch 0, loss[loss=3.35, simple_loss=6.699, pruned_loss=5.971, over 4984.00 frames.], tot_loss[loss=3.35, simple_loss=6.699, pruned_loss=5.971, over 4984.00 frames.], batch size: 14, lr: 3.00e-03
2022-05-03 11:41:08,405 INFO [distributed.py:874] (0/8) Reducer buckets have been rebuilt in this iteration.
2022-05-03 11:41:46,318 INFO [train.py:715] (0/8) Epoch 0, batch 50, loss[loss=0.4917, simple_loss=0.9833, pruned_loss=6.845, over 4856.00 frames.], tot_loss[loss=1.324, simple_loss=2.648, pruned_loss=6.46, over 219349.22 frames.], batch size: 20, lr: 3.00e-03
2022-05-03 11:42:25,580 INFO [train.py:715] (0/8) Epoch 0, batch 100, loss[loss=0.4045, simple_loss=0.8091, pruned_loss=6.698, over 4889.00 frames.], tot_loss[loss=0.818, simple_loss=1.636, pruned_loss=6.578, over 386837.66 frames.], batch size: 22, lr: 3.00e-03
2022-05-03 11:43:04,755 INFO [train.py:715] (0/8) Epoch 0, batch 150, loss[loss=0.3684, simple_loss=0.7368, pruned_loss=6.657, over 4937.00 frames.], tot_loss[loss=0.6287, simple_loss=1.257, pruned_loss=6.587, over 516922.01 frames.], batch size: 29, lr: 3.00e-03
2022-05-03 11:43:43,124 INFO [train.py:715] (0/8) Epoch 0, batch 200, loss[loss=0.3229, simple_loss=0.6458, pruned_loss=6.613, over 4789.00 frames.], tot_loss[loss=0.5306, simple_loss=1.061, pruned_loss=6.576, over 618433.26 frames.], batch size: 18, lr: 3.00e-03
2022-05-03 11:44:22,066 INFO [train.py:715] (0/8) Epoch 0, batch 250, loss[loss=0.31, simple_loss=0.6199, pruned_loss=6.486, over 4781.00 frames.], tot_loss[loss=0.4727, simple_loss=0.9453, pruned_loss=6.592, over 696675.51 frames.], batch size: 12, lr: 3.00e-03
2022-05-03 11:45:01,537 INFO [train.py:715] (0/8) Epoch 0, batch 300, loss[loss=0.3643, simple_loss=0.7285, pruned_loss=6.801, over 4962.00 frames.], tot_loss[loss=0.4331, simple_loss=0.8662, pruned_loss=6.604, over 757091.79 frames.], batch size: 21, lr: 3.00e-03
2022-05-03 11:45:41,185 INFO [train.py:715] (0/8) Epoch 0, batch 350, loss[loss=0.3418, simple_loss=0.6837, pruned_loss=6.771, over 4988.00 frames.], tot_loss[loss=0.4058, simple_loss=0.8116, pruned_loss=6.62, over 805450.46 frames.], batch size: 25, lr: 3.00e-03
2022-05-03 11:46:19,554 INFO [train.py:715] (0/8) Epoch 0, batch 400, loss[loss=0.3193, simple_loss=0.6387, pruned_loss=6.502, over 4816.00 frames.], tot_loss[loss=0.3849, simple_loss=0.7699, pruned_loss=6.631, over 842997.16 frames.], batch size: 25, lr: 3.00e-03
2022-05-03 11:46:58,907 INFO [train.py:715] (0/8) Epoch 0, batch 450, loss[loss=0.3053, simple_loss=0.6106, pruned_loss=6.644, over 4753.00 frames.], tot_loss[loss=0.3688, simple_loss=0.7375, pruned_loss=6.641, over 871893.64 frames.], batch size: 19, lr: 2.99e-03
2022-05-03 11:47:38,004 INFO [train.py:715] (0/8) Epoch 0, batch 500, loss[loss=0.3297, simple_loss=0.6595, pruned_loss=6.656, over 4783.00 frames.], tot_loss[loss=0.3568, simple_loss=0.7136, pruned_loss=6.644, over 894345.08 frames.], batch size: 17, lr: 2.99e-03
2022-05-03 11:48:17,109 INFO [train.py:715] (0/8) Epoch 0, batch 550, loss[loss=0.3036, simple_loss=0.6072, pruned_loss=6.77, over 4966.00 frames.], tot_loss[loss=0.346, simple_loss=0.6919, pruned_loss=6.649, over 911493.45 frames.], batch size: 21, lr: 2.99e-03
2022-05-03 11:48:55,926 INFO [train.py:715] (0/8) Epoch 0, batch 600, loss[loss=0.2961, simple_loss=0.5922, pruned_loss=6.696, over 4783.00 frames.], tot_loss[loss=0.3372, simple_loss=0.6744, pruned_loss=6.663, over 925310.16 frames.], batch size: 18, lr: 2.99e-03
2022-05-03 11:49:35,150 INFO [train.py:715] (0/8) Epoch 0, batch 650, loss[loss=0.2823, simple_loss=0.5647, pruned_loss=6.729, over 4939.00 frames.], tot_loss[loss=0.3261, simple_loss=0.6522, pruned_loss=6.681, over 935339.24 frames.], batch size: 21, lr: 2.99e-03
2022-05-03 11:50:14,497 INFO [train.py:715] (0/8) Epoch 0, batch 700, loss[loss=0.2555, simple_loss=0.5109, pruned_loss=6.791, over 4937.00 frames.], tot_loss[loss=0.3146, simple_loss=0.6293, pruned_loss=6.701, over 943595.29 frames.], batch size: 23, lr: 2.99e-03
2022-05-03 11:50:53,000 INFO [train.py:715] (0/8) Epoch 0, batch 750, loss[loss=0.2407, simple_loss=0.4814, pruned_loss=6.704, over 4730.00 frames.], tot_loss[loss=0.3021, simple_loss=0.6041, pruned_loss=6.715, over 949588.70 frames.], batch size: 16, lr: 2.98e-03
2022-05-03 11:51:32,782 INFO [train.py:715] (0/8) Epoch 0, batch 800, loss[loss=0.2425, simple_loss=0.485, pruned_loss=6.622, over 4842.00 frames.], tot_loss[loss=0.291, simple_loss=0.582, pruned_loss=6.719, over 954166.52 frames.], batch size: 30, lr: 2.98e-03
2022-05-03 11:52:12,746 INFO [train.py:715] (0/8) Epoch 0, batch 850, loss[loss=0.2352, simple_loss=0.4704, pruned_loss=6.764, over 4885.00 frames.], tot_loss[loss=0.2796, simple_loss=0.5592, pruned_loss=6.714, over 958183.46 frames.], batch size: 16, lr: 2.98e-03
2022-05-03 11:52:51,639 INFO [train.py:715] (0/8) Epoch 0, batch 900, loss[loss=0.224, simple_loss=0.4479, pruned_loss=6.665, over 4829.00 frames.], tot_loss[loss=0.2698, simple_loss=0.5396, pruned_loss=6.711, over 961368.52 frames.], batch size: 26, lr: 2.98e-03
2022-05-03 11:53:30,231 INFO [train.py:715] (0/8) Epoch 0, batch 950, loss[loss=0.2398, simple_loss=0.4795, pruned_loss=6.654, over 4926.00 frames.], tot_loss[loss=0.2603, simple_loss=0.5207, pruned_loss=6.71, over 964381.24 frames.], batch size: 29, lr: 2.97e-03
2022-05-03 11:54:09,542 INFO [train.py:715] (0/8) Epoch 0, batch 1000, loss[loss=0.2421, simple_loss=0.4842, pruned_loss=6.797, over 4870.00 frames.], tot_loss[loss=0.2528, simple_loss=0.5056, pruned_loss=6.717, over 965941.48 frames.], batch size: 20, lr: 2.97e-03
2022-05-03 11:54:48,898 INFO [train.py:715] (0/8) Epoch 0, batch 1050, loss[loss=0.2117, simple_loss=0.4233, pruned_loss=6.768, over 4867.00 frames.], tot_loss[loss=0.2455, simple_loss=0.4911, pruned_loss=6.72, over 967173.55 frames.], batch size: 32, lr: 2.97e-03
2022-05-03 11:55:27,470 INFO [train.py:715] (0/8) Epoch 0, batch 1100, loss[loss=0.2168, simple_loss=0.4335, pruned_loss=6.714, over 4781.00 frames.], tot_loss[loss=0.238, simple_loss=0.476, pruned_loss=6.716, over 967361.56 frames.], batch size: 14, lr: 2.96e-03
2022-05-03 11:56:07,472 INFO [train.py:715] (0/8) Epoch 0, batch 1150, loss[loss=0.1893, simple_loss=0.3787, pruned_loss=6.734, over 4768.00 frames.], tot_loss[loss=0.232, simple_loss=0.464, pruned_loss=6.72, over 967904.60 frames.], batch size: 18, lr: 2.96e-03
2022-05-03 11:56:47,809 INFO [train.py:715] (0/8) Epoch 0, batch 1200, loss[loss=0.2058, simple_loss=0.4116, pruned_loss=6.79, over 4939.00 frames.], tot_loss[loss=0.227, simple_loss=0.454, pruned_loss=6.723, over 968589.95 frames.], batch size: 21, lr: 2.96e-03
2022-05-03 11:57:28,435 INFO [train.py:715] (0/8) Epoch 0, batch 1250, loss[loss=0.2016, simple_loss=0.4031, pruned_loss=6.763, over 4910.00 frames.], tot_loss[loss=0.2219, simple_loss=0.4439, pruned_loss=6.72, over 970440.09 frames.], batch size: 17, lr: 2.95e-03
2022-05-03 11:58:07,332 INFO [train.py:715] (0/8) Epoch 0, batch 1300, loss[loss=0.2214, simple_loss=0.4429, pruned_loss=6.699, over 4852.00 frames.], tot_loss[loss=0.2175, simple_loss=0.435, pruned_loss=6.72, over 971593.59 frames.], batch size: 38, lr: 2.95e-03
2022-05-03 11:58:47,763 INFO [train.py:715] (0/8) Epoch 0, batch 1350, loss[loss=0.2165, simple_loss=0.433, pruned_loss=6.734, over 4978.00 frames.], tot_loss[loss=0.2138, simple_loss=0.4276, pruned_loss=6.718, over 972443.51 frames.], batch size: 31, lr: 2.95e-03
2022-05-03 11:59:28,709 INFO [train.py:715] (0/8) Epoch 0, batch 1400, loss[loss=0.1668, simple_loss=0.3336, pruned_loss=6.539, over 4772.00 frames.], tot_loss[loss=0.2104, simple_loss=0.4208, pruned_loss=6.716, over 972463.91 frames.], batch size: 12, lr: 2.94e-03
2022-05-03 12:00:09,324 INFO [train.py:715] (0/8) Epoch 0, batch 1450, loss[loss=0.1931, simple_loss=0.3862, pruned_loss=6.741, over 4945.00 frames.], tot_loss[loss=0.2081, simple_loss=0.4162, pruned_loss=6.715, over 972164.29 frames.], batch size: 24, lr: 2.94e-03
2022-05-03 12:00:48,853 INFO [train.py:715] (0/8) Epoch 0, batch 1500, loss[loss=0.1947, simple_loss=0.3893, pruned_loss=6.798, over 4746.00 frames.], tot_loss[loss=0.2042, simple_loss=0.4085, pruned_loss=6.705, over 971813.56 frames.], batch size: 19, lr: 2.94e-03
2022-05-03 12:01:29,916 INFO [train.py:715] (0/8) Epoch 0, batch 1550, loss[loss=0.1604, simple_loss=0.3208, pruned_loss=6.554, over 4827.00 frames.], tot_loss[loss=0.2017, simple_loss=0.4034, pruned_loss=6.704, over 972551.96 frames.], batch size: 13, lr: 2.93e-03
2022-05-03 12:02:11,267 INFO [train.py:715] (0/8) Epoch 0, batch 1600, loss[loss=0.1618, simple_loss=0.3236, pruned_loss=6.649, over 4763.00 frames.], tot_loss[loss=0.2001, simple_loss=0.4002, pruned_loss=6.701, over 972742.47 frames.], batch size: 19, lr: 2.93e-03
2022-05-03 12:02:51,032 INFO [train.py:715] (0/8) Epoch 0, batch 1650, loss[loss=0.2011, simple_loss=0.4021, pruned_loss=6.73, over 4866.00 frames.], tot_loss[loss=0.1977, simple_loss=0.3953, pruned_loss=6.7, over 972770.27 frames.], batch size: 32, lr: 2.92e-03
2022-05-03 12:03:32,804 INFO [train.py:715] (0/8) Epoch 0, batch 1700, loss[loss=0.2108, simple_loss=0.4215, pruned_loss=6.681, over 4977.00 frames.], tot_loss[loss=0.195, simple_loss=0.3899, pruned_loss=6.691, over 972048.18 frames.], batch size: 31, lr: 2.92e-03
2022-05-03 12:04:14,553 INFO [train.py:715] (0/8) Epoch 0, batch 1750, loss[loss=0.1917, simple_loss=0.3834, pruned_loss=6.71, over 4935.00 frames.], tot_loss[loss=0.1934, simple_loss=0.3869, pruned_loss=6.689, over 971598.62 frames.], batch size: 23, lr: 2.91e-03
2022-05-03 12:04:55,999 INFO [train.py:715] (0/8) Epoch 0, batch 1800, loss[loss=0.1962, simple_loss=0.3924, pruned_loss=6.706, over 4902.00 frames.], tot_loss[loss=0.1925, simple_loss=0.3851, pruned_loss=6.686, over 972719.32 frames.], batch size: 17, lr: 2.91e-03
2022-05-03 12:05:36,593 INFO [train.py:715] (0/8) Epoch 0, batch 1850, loss[loss=0.1641, simple_loss=0.3281, pruned_loss=6.627, over 4952.00 frames.], tot_loss[loss=0.1903, simple_loss=0.3805, pruned_loss=6.679, over 973664.35 frames.], batch size: 21, lr: 2.91e-03
2022-05-03 12:06:18,630 INFO [train.py:715] (0/8) Epoch 0, batch 1900, loss[loss=0.1793, simple_loss=0.3586, pruned_loss=6.591, over 4780.00 frames.], tot_loss[loss=0.1894, simple_loss=0.3789, pruned_loss=6.679, over 972688.68 frames.], batch size: 17, lr: 2.90e-03
2022-05-03 12:07:00,148 INFO [train.py:715] (0/8) Epoch 0, batch 1950, loss[loss=0.1504, simple_loss=0.3008, pruned_loss=6.598, over 4826.00 frames.], tot_loss[loss=0.1876, simple_loss=0.3752, pruned_loss=6.674, over 972340.09 frames.], batch size: 13, lr: 2.90e-03
2022-05-03 12:07:38,876 INFO [train.py:715] (0/8) Epoch 0, batch 2000, loss[loss=0.1979, simple_loss=0.3959, pruned_loss=6.615, over 4749.00 frames.], tot_loss[loss=0.1858, simple_loss=0.3715, pruned_loss=6.67, over 972194.98 frames.], batch size: 19, lr: 2.89e-03
2022-05-03 12:08:19,992 INFO [train.py:715] (0/8) Epoch 0, batch 2050, loss[loss=0.1832, simple_loss=0.3664, pruned_loss=6.758, over 4781.00 frames.], tot_loss[loss=0.1836, simple_loss=0.3673, pruned_loss=6.66, over 971573.47 frames.], batch size: 17, lr: 2.89e-03
2022-05-03 12:09:00,596 INFO [train.py:715] (0/8) Epoch 0, batch 2100, loss[loss=0.164, simple_loss=0.3281, pruned_loss=6.607, over 4831.00 frames.], tot_loss[loss=0.1823, simple_loss=0.3645, pruned_loss=6.659, over 971272.13 frames.], batch size: 15, lr: 2.88e-03
2022-05-03 12:09:41,209 INFO [train.py:715] (0/8) Epoch 0, batch 2150, loss[loss=0.1849, simple_loss=0.3699, pruned_loss=6.671, over 4886.00 frames.], tot_loss[loss=0.1817, simple_loss=0.3634, pruned_loss=6.663, over 971537.33 frames.], batch size: 22, lr: 2.88e-03
2022-05-03 12:10:20,500 INFO [train.py:715] (0/8) Epoch 0, batch 2200, loss[loss=0.1759, simple_loss=0.3518, pruned_loss=6.665, over 4850.00 frames.], tot_loss[loss=0.1803, simple_loss=0.3607, pruned_loss=6.664, over 972593.10 frames.], batch size: 32, lr: 2.87e-03
2022-05-03 12:11:01,495 INFO [train.py:715] (0/8) Epoch 0, batch 2250, loss[loss=0.178, simple_loss=0.3559, pruned_loss=6.707, over 4888.00 frames.], tot_loss[loss=0.1803, simple_loss=0.3605, pruned_loss=6.668, over 972442.24 frames.], batch size: 32, lr: 2.86e-03
2022-05-03 12:11:42,776 INFO [train.py:715] (0/8) Epoch 0, batch 2300, loss[loss=0.1692, simple_loss=0.3383, pruned_loss=6.554, over 4979.00 frames.], tot_loss[loss=0.1787, simple_loss=0.3575, pruned_loss=6.664, over 972871.32 frames.], batch size: 15, lr: 2.86e-03
2022-05-03 12:12:22,384 INFO [train.py:715] (0/8) Epoch 0, batch 2350, loss[loss=0.1691, simple_loss=0.3382, pruned_loss=6.624, over 4967.00 frames.], tot_loss[loss=0.1772, simple_loss=0.3545, pruned_loss=6.662, over 972414.42 frames.], batch size: 14, lr: 2.85e-03
2022-05-03 12:13:03,128 INFO [train.py:715] (0/8) Epoch 0, batch 2400, loss[loss=0.1782, simple_loss=0.3564, pruned_loss=6.65, over 4921.00 frames.], tot_loss[loss=0.1763, simple_loss=0.3526, pruned_loss=6.666, over 972436.09 frames.], batch size: 29, lr: 2.85e-03
2022-05-03 12:13:43,818 INFO [train.py:715] (0/8) Epoch 0, batch 2450, loss[loss=0.1737, simple_loss=0.3473, pruned_loss=6.66, over 4873.00 frames.], tot_loss[loss=0.1754, simple_loss=0.3509, pruned_loss=6.668, over 972520.48 frames.], batch size: 20, lr: 2.84e-03
2022-05-03 12:14:24,675 INFO [train.py:715] (0/8) Epoch 0, batch 2500, loss[loss=0.1759, simple_loss=0.3517, pruned_loss=6.594, over 4925.00 frames.], tot_loss[loss=0.1747, simple_loss=0.3494, pruned_loss=6.668, over 972795.65 frames.], batch size: 23, lr: 2.84e-03
2022-05-03 12:15:03,911 INFO [train.py:715] (0/8) Epoch 0, batch 2550, loss[loss=0.1757, simple_loss=0.3514, pruned_loss=6.602, over 4899.00 frames.], tot_loss[loss=0.1743, simple_loss=0.3486, pruned_loss=6.669, over 972591.26 frames.], batch size: 19, lr: 2.83e-03
2022-05-03 12:15:44,626 INFO [train.py:715] (0/8) Epoch 0, batch 2600, loss[loss=0.1501, simple_loss=0.3002, pruned_loss=6.622, over 4847.00 frames.], tot_loss[loss=0.173, simple_loss=0.346, pruned_loss=6.66, over 973165.60 frames.], batch size: 30, lr: 2.83e-03
2022-05-03 12:16:25,713 INFO [train.py:715] (0/8) Epoch 0, batch 2650, loss[loss=0.145, simple_loss=0.2899, pruned_loss=6.569, over 4698.00 frames.], tot_loss[loss=0.1726, simple_loss=0.3452, pruned_loss=6.657, over 972854.66 frames.], batch size: 15, lr: 2.82e-03
2022-05-03 12:17:08,083 INFO [train.py:715] (0/8) Epoch 0, batch 2700, loss[loss=0.1613, simple_loss=0.3226, pruned_loss=6.584, over 4818.00 frames.], tot_loss[loss=0.1716, simple_loss=0.3431, pruned_loss=6.648, over 973272.60 frames.], batch size: 13, lr: 2.81e-03
2022-05-03 12:17:48,876 INFO [train.py:715] (0/8) Epoch 0, batch 2750, loss[loss=0.1675, simple_loss=0.3349, pruned_loss=6.731, over 4877.00 frames.], tot_loss[loss=0.1714, simple_loss=0.3428, pruned_loss=6.647, over 972479.50 frames.], batch size: 22, lr: 2.81e-03
2022-05-03 12:18:29,713 INFO [train.py:715] (0/8) Epoch 0, batch 2800, loss[loss=0.176, simple_loss=0.352, pruned_loss=6.445, over 4932.00 frames.], tot_loss[loss=0.1708, simple_loss=0.3416, pruned_loss=6.644, over 973372.64 frames.], batch size: 29, lr: 2.80e-03
2022-05-03 12:19:10,261 INFO [train.py:715] (0/8) Epoch 0, batch 2850, loss[loss=0.1984, simple_loss=0.3968, pruned_loss=6.713, over 4983.00 frames.], tot_loss[loss=0.1719, simple_loss=0.3437, pruned_loss=6.645, over 973535.60 frames.], batch size: 26, lr: 2.80e-03
2022-05-03 12:19:49,115 INFO [train.py:715] (0/8) Epoch 0, batch 2900, loss[loss=0.1899, simple_loss=0.3799, pruned_loss=6.816, over 4958.00 frames.], tot_loss[loss=0.1709, simple_loss=0.3419, pruned_loss=6.64, over 973165.20 frames.], batch size: 24, lr: 2.79e-03
2022-05-03 12:20:29,367 INFO [train.py:715] (0/8) Epoch 0, batch 2950, loss[loss=0.1565, simple_loss=0.3129, pruned_loss=6.629, over 4945.00 frames.], tot_loss[loss=0.1707, simple_loss=0.3415, pruned_loss=6.643, over 972228.53 frames.], batch size: 21, lr: 2.78e-03
2022-05-03 12:21:11,358 INFO [train.py:715] (0/8) Epoch 0, batch 3000, loss[loss=0.8425, simple_loss=0.3422, pruned_loss=6.714, over 4865.00 frames.], tot_loss[loss=0.2065, simple_loss=0.3399, pruned_loss=6.644, over 972535.80 frames.], batch size: 30, lr: 2.78e-03
2022-05-03 12:21:11,359 INFO [train.py:733] (0/8) Computing validation loss
2022-05-03 12:21:21,130 INFO [train.py:742] (0/8) Epoch 0, validation: loss=2.223, simple_loss=0.2788, pruned_loss=2.083, over 914524.00 frames.
2022-05-03 12:22:02,153 INFO [train.py:715] (0/8) Epoch 0, batch 3050, loss[loss=0.2286, simple_loss=0.3269, pruned_loss=0.6521, over 4742.00 frames.], tot_loss[loss=0.2236, simple_loss=0.3429, pruned_loss=5.409, over 972561.98 frames.], batch size: 16, lr: 2.77e-03
2022-05-03 12:22:41,560 INFO [train.py:715] (0/8) Epoch 0, batch 3100, loss[loss=0.1994, simple_loss=0.3304, pruned_loss=0.342, over 4923.00 frames.], tot_loss[loss=0.2233, simple_loss=0.3431, pruned_loss=4.323, over 971965.83 frames.], batch size: 23, lr: 2.77e-03
2022-05-03 12:23:22,412 INFO [train.py:715] (0/8) Epoch 0, batch 3150, loss[loss=0.2057, simple_loss=0.352, pruned_loss=0.2974, over 4753.00 frames.], tot_loss[loss=0.2182, simple_loss=0.3421, pruned_loss=3.434, over 971583.13 frames.], batch size: 16, lr: 2.76e-03
2022-05-03 12:24:03,659 INFO [train.py:715] (0/8) Epoch 0, batch 3200, loss[loss=0.1667, simple_loss=0.2959, pruned_loss=0.1872, over 4971.00 frames.], tot_loss[loss=0.2129, simple_loss=0.3412, pruned_loss=2.728, over 972015.24 frames.], batch size: 14, lr: 2.75e-03
2022-05-03 12:24:44,873 INFO [train.py:715] (0/8) Epoch 0, batch 3250, loss[loss=0.2032, simple_loss=0.3576, pruned_loss=0.2445, over 4772.00 frames.], tot_loss[loss=0.2071, simple_loss=0.3384, pruned_loss=2.174, over 971370.41 frames.], batch size: 18, lr: 2.75e-03
2022-05-03 12:25:24,112 INFO [train.py:715] (0/8) Epoch 0, batch 3300, loss[loss=0.1757, simple_loss=0.3128, pruned_loss=0.1931, over 4863.00 frames.], tot_loss[loss=0.2031, simple_loss=0.3377, pruned_loss=1.737, over 972906.75 frames.], batch size: 20, lr: 2.74e-03
2022-05-03 12:26:05,353 INFO [train.py:715] (0/8) Epoch 0, batch 3350, loss[loss=0.1647, simple_loss=0.2977, pruned_loss=0.1581, over 4967.00 frames.], tot_loss[loss=0.1971, simple_loss=0.3327, pruned_loss=1.394, over 972262.17 frames.], batch size: 14, lr: 2.73e-03
2022-05-03 12:26:46,179 INFO [train.py:715] (0/8) Epoch 0, batch 3400, loss[loss=0.1947, simple_loss=0.3469, pruned_loss=0.2127, over 4910.00 frames.], tot_loss[loss=0.1943, simple_loss=0.3321, pruned_loss=1.128, over 972982.77 frames.], batch size: 19, lr: 2.73e-03
2022-05-03 12:27:25,316 INFO [train.py:715] (0/8) Epoch 0, batch 3450, loss[loss=0.1675, simple_loss=0.3024, pruned_loss=0.1631, over 4848.00 frames.], tot_loss[loss=0.1924, simple_loss=0.3323, pruned_loss=0.921, over 972383.13 frames.], batch size: 15, lr: 2.72e-03
2022-05-03 12:28:06,923 INFO [train.py:715] (0/8) Epoch 0, batch 3500, loss[loss=0.1523, simple_loss=0.2763, pruned_loss=0.1413, over 4926.00 frames.], tot_loss[loss=0.1897, simple_loss=0.3305, pruned_loss=0.7566, over 972570.34 frames.], batch size: 29, lr: 2.72e-03
2022-05-03 12:28:48,554 INFO [train.py:715] (0/8) Epoch 0, batch 3550, loss[loss=0.1706, simple_loss=0.3081, pruned_loss=0.1661, over 4772.00 frames.], tot_loss[loss=0.1886, simple_loss=0.331, pruned_loss=0.6298, over 972566.05 frames.], batch size: 12, lr: 2.71e-03
2022-05-03 12:29:29,807 INFO [train.py:715] (0/8) Epoch 0, batch 3600, loss[loss=0.1524, simple_loss=0.2788, pruned_loss=0.1298, over 4818.00 frames.], tot_loss[loss=0.187, simple_loss=0.3301, pruned_loss=0.5298, over 971625.38 frames.], batch size: 13, lr: 2.70e-03
2022-05-03 12:30:08,995 INFO [train.py:715] (0/8) Epoch 0, batch 3650, loss[loss=0.1346, simple_loss=0.2516, pruned_loss=0.08801, over 4837.00 frames.], tot_loss[loss=0.185, simple_loss=0.3283, pruned_loss=0.4494, over 972623.97 frames.], batch size: 13, lr: 2.70e-03
2022-05-03 12:30:50,510 INFO [train.py:715] (0/8) Epoch 0, batch 3700, loss[loss=0.2039, simple_loss=0.3671, pruned_loss=0.2029, over 4903.00 frames.], tot_loss[loss=0.1844, simple_loss=0.3285, pruned_loss=0.39, over 971198.82 frames.], batch size: 17, lr: 2.69e-03
2022-05-03 12:31:32,097 INFO [train.py:715] (0/8) Epoch 0, batch 3750, loss[loss=0.1522, simple_loss=0.2771, pruned_loss=0.1368, over 4983.00 frames.], tot_loss[loss=0.1833, simple_loss=0.3277, pruned_loss=0.3414, over 971500.66 frames.], batch size: 25, lr: 2.68e-03
2022-05-03 12:32:11,307 INFO [train.py:715] (0/8) Epoch 0, batch 3800, loss[loss=0.185, simple_loss=0.3334, pruned_loss=0.1835, over 4984.00 frames.], tot_loss[loss=0.1833, simple_loss=0.3284, pruned_loss=0.3047, over 972987.00 frames.], batch size: 35, lr: 2.68e-03
2022-05-03 12:33:05,632 INFO [train.py:715] (0/8) Epoch 0, batch 3850, loss[loss=0.1731, simple_loss=0.3175, pruned_loss=0.1431, over 4885.00 frames.], tot_loss[loss=0.1814, simple_loss=0.3259, pruned_loss=0.2727, over 973604.67 frames.], batch size: 22, lr: 2.67e-03
2022-05-03 12:33:46,700 INFO [train.py:715] (0/8) Epoch 0, batch 3900, loss[loss=0.1408, simple_loss=0.2591, pruned_loss=0.113, over 4824.00 frames.], tot_loss[loss=0.1811, simple_loss=0.3261, pruned_loss=0.2497, over 973486.36 frames.], batch size: 15, lr: 2.66e-03
2022-05-03 12:34:26,861 INFO [train.py:715] (0/8) Epoch 0, batch 3950, loss[loss=0.1708, simple_loss=0.3088, pruned_loss=0.1638, over 4866.00 frames.], tot_loss[loss=0.1808, simple_loss=0.326, pruned_loss=0.2321, over 973333.31 frames.], batch size: 32, lr: 2.66e-03
2022-05-03 12:35:06,668 INFO [train.py:715] (0/8) Epoch 0, batch 4000, loss[loss=0.1729, simple_loss=0.3157, pruned_loss=0.151, over 4957.00 frames.], tot_loss[loss=0.1787, simple_loss=0.3229, pruned_loss=0.2147, over 972328.27 frames.], batch size: 21, lr: 2.65e-03
2022-05-03 12:35:47,595 INFO [train.py:715] (0/8) Epoch 0, batch 4050, loss[loss=0.1818, simple_loss=0.3294, pruned_loss=0.1712, over 4969.00 frames.], tot_loss[loss=0.1788, simple_loss=0.3233, pruned_loss=0.2043, over 972962.70 frames.], batch size: 15, lr: 2.64e-03
2022-05-03 12:36:28,805 INFO [train.py:715] (0/8) Epoch 0, batch 4100, loss[loss=0.1661, simple_loss=0.3034, pruned_loss=0.1442, over 4685.00 frames.], tot_loss[loss=0.1779, simple_loss=0.3221, pruned_loss=0.1941, over 972140.72 frames.], batch size: 15, lr: 2.64e-03
2022-05-03 12:37:07,952 INFO [train.py:715] (0/8) Epoch 0, batch 4150, loss[loss=0.193, simple_loss=0.3488, pruned_loss=0.1861, over 4821.00 frames.], tot_loss[loss=0.1782, simple_loss=0.3229, pruned_loss=0.1874, over 972144.91 frames.], batch size: 25, lr: 2.63e-03
2022-05-03 12:37:49,188 INFO [train.py:715] (0/8) Epoch 0, batch 4200, loss[loss=0.1465, simple_loss=0.2716, pruned_loss=0.1067, over 4912.00 frames.], tot_loss[loss=0.1776, simple_loss=0.3219, pruned_loss=0.1811, over 971372.22 frames.], batch size: 17, lr: 2.63e-03
2022-05-03 12:38:30,917 INFO [train.py:715] (0/8) Epoch 0, batch 4250, loss[loss=0.1897, simple_loss=0.3437, pruned_loss=0.1785, over 4807.00 frames.], tot_loss[loss=0.1773, simple_loss=0.3218, pruned_loss=0.1762, over 970924.41 frames.], batch size: 21, lr: 2.62e-03
2022-05-03 12:39:11,498 INFO [train.py:715] (0/8) Epoch 0, batch 4300, loss[loss=0.192, simple_loss=0.3458, pruned_loss=0.1904, over 4787.00 frames.], tot_loss[loss=0.1771, simple_loss=0.3217, pruned_loss=0.1716, over 970647.18 frames.], batch size: 14, lr: 2.61e-03
2022-05-03 12:39:51,601 INFO [train.py:715] (0/8) Epoch 0, batch 4350, loss[loss=0.1833, simple_loss=0.3299, pruned_loss=0.1831, over 4984.00 frames.], tot_loss[loss=0.1776, simple_loss=0.3226, pruned_loss=0.1699, over 971324.96 frames.], batch size: 28, lr: 2.61e-03
2022-05-03 12:40:33,087 INFO [train.py:715] (0/8) Epoch 0, batch 4400, loss[loss=0.1658, simple_loss=0.3014, pruned_loss=0.1508, over 4878.00 frames.], tot_loss[loss=0.1765, simple_loss=0.321, pruned_loss=0.1658, over 971705.89 frames.], batch size: 32, lr: 2.60e-03
2022-05-03 12:41:14,315 INFO [train.py:715] (0/8) Epoch 0, batch 4450, loss[loss=0.1602, simple_loss=0.2929, pruned_loss=0.1371, over 4783.00 frames.], tot_loss[loss=0.1741, simple_loss=0.3171, pruned_loss=0.1602, over 971586.40 frames.], batch size: 18, lr: 2.59e-03
2022-05-03 12:41:53,444 INFO [train.py:715] (0/8) Epoch 0, batch 4500, loss[loss=0.1824, simple_loss=0.3311, pruned_loss=0.1686, over 4957.00 frames.], tot_loss[loss=0.1742, simple_loss=0.3175, pruned_loss=0.1583, over 972035.25 frames.], batch size: 21, lr: 2.59e-03
2022-05-03 12:42:34,814 INFO [train.py:715] (0/8) Epoch 0, batch 4550, loss[loss=0.1699, simple_loss=0.3089, pruned_loss=0.1547, over 4846.00 frames.], tot_loss[loss=0.174, simple_loss=0.317, pruned_loss=0.157, over 971607.25 frames.], batch size: 20, lr: 2.58e-03
2022-05-03 12:43:16,365 INFO [train.py:715] (0/8) Epoch 0, batch 4600, loss[loss=0.211, simple_loss=0.3809, pruned_loss=0.2051, over 4970.00 frames.], tot_loss[loss=0.1738, simple_loss=0.3168, pruned_loss=0.1558, over 971209.40 frames.], batch size: 24, lr: 2.57e-03
2022-05-03 12:43:56,532 INFO [train.py:715] (0/8) Epoch 0, batch 4650, loss[loss=0.1967, simple_loss=0.3575, pruned_loss=0.1799, over 4794.00 frames.], tot_loss[loss=0.1741, simple_loss=0.3174, pruned_loss=0.1552, over 971219.30 frames.], batch size: 21, lr: 2.57e-03
2022-05-03 12:44:36,470 INFO [train.py:715] (0/8) Epoch 0, batch 4700, loss[loss=0.17, simple_loss=0.3126, pruned_loss=0.1365, over 4812.00 frames.], tot_loss[loss=0.1743, simple_loss=0.3178, pruned_loss=0.155, over 972265.69 frames.], batch size: 21, lr: 2.56e-03
2022-05-03 12:45:17,609 INFO [train.py:715] (0/8) Epoch 0, batch 4750, loss[loss=0.1783, simple_loss=0.3218, pruned_loss=0.1741, over 4924.00 frames.], tot_loss[loss=0.1731, simple_loss=0.316, pruned_loss=0.1524, over 972489.88 frames.], batch size: 18, lr: 2.55e-03
2022-05-03 12:45:58,876 INFO [train.py:715] (0/8) Epoch 0, batch 4800, loss[loss=0.1826, simple_loss=0.333, pruned_loss=0.1607, over 4961.00 frames.], tot_loss[loss=0.1731, simple_loss=0.316, pruned_loss=0.1522, over 973638.34 frames.], batch size: 39, lr: 2.55e-03
2022-05-03 12:46:38,840 INFO [train.py:715] (0/8) Epoch 0, batch 4850, loss[loss=0.2021, simple_loss=0.3615, pruned_loss=0.2136, over 4912.00 frames.], tot_loss[loss=0.173, simple_loss=0.3159, pruned_loss=0.151, over 972572.96 frames.], batch size: 17, lr: 2.54e-03
2022-05-03 12:47:19,644 INFO [train.py:715] (0/8) Epoch 0, batch 4900, loss[loss=0.1379, simple_loss=0.256, pruned_loss=0.09907, over 4885.00 frames.], tot_loss[loss=0.1722, simple_loss=0.3146, pruned_loss=0.1493, over 972186.33 frames.], batch size: 19, lr: 2.54e-03
2022-05-03 12:48:01,141 INFO [train.py:715] (0/8) Epoch 0, batch 4950, loss[loss=0.2015, simple_loss=0.3686, pruned_loss=0.1724, over 4889.00 frames.], tot_loss[loss=0.1722, simple_loss=0.3147, pruned_loss=0.1487, over 971774.28 frames.], batch size: 22, lr: 2.53e-03
2022-05-03 12:48:41,437 INFO [train.py:715] (0/8) Epoch 0, batch 5000, loss[loss=0.16, simple_loss=0.2954, pruned_loss=0.123, over 4985.00 frames.], tot_loss[loss=0.1718, simple_loss=0.314, pruned_loss=0.148, over 971769.19 frames.], batch size: 31, lr: 2.52e-03
2022-05-03 12:49:22,164 INFO [train.py:715] (0/8) Epoch 0, batch 5050, loss[loss=0.1618, simple_loss=0.298, pruned_loss=0.1279, over 4948.00 frames.], tot_loss[loss=0.1719, simple_loss=0.3142, pruned_loss=0.1481, over 971887.50 frames.], batch size: 21, lr: 2.52e-03
2022-05-03 12:50:05,004 INFO [train.py:715] (0/8) Epoch 0, batch 5100, loss[loss=0.1797, simple_loss=0.3285, pruned_loss=0.1546, over 4862.00 frames.], tot_loss[loss=0.1713, simple_loss=0.3132, pruned_loss=0.1468, over 972054.66 frames.], batch size: 20, lr: 2.51e-03
2022-05-03 12:50:48,204 INFO [train.py:715] (0/8) Epoch 0, batch 5150, loss[loss=0.1812, simple_loss=0.3307, pruned_loss=0.159, over 4889.00 frames.], tot_loss[loss=0.1703, simple_loss=0.3117, pruned_loss=0.145, over 972250.77 frames.], batch size: 22, lr: 2.50e-03
2022-05-03 12:51:28,082 INFO [train.py:715] (0/8) Epoch 0, batch 5200, loss[loss=0.1898, simple_loss=0.3493, pruned_loss=0.1514, over 4746.00 frames.], tot_loss[loss=0.1696, simple_loss=0.3106, pruned_loss=0.1432, over 971982.87 frames.], batch size: 16, lr: 2.50e-03
2022-05-03 12:52:08,693 INFO [train.py:715] (0/8) Epoch 0, batch 5250, loss[loss=0.1333, simple_loss=0.2456, pruned_loss=0.1044, over 4772.00 frames.], tot_loss[loss=0.1698, simple_loss=0.3111, pruned_loss=0.1427, over 972741.00 frames.], batch size: 12, lr: 2.49e-03
2022-05-03 12:52:49,812 INFO [train.py:715] (0/8) Epoch 0, batch 5300, loss[loss=0.1713, simple_loss=0.3148, pruned_loss=0.1387, over 4963.00 frames.], tot_loss[loss=0.1694, simple_loss=0.3104, pruned_loss=0.1418, over 973507.44 frames.], batch size: 24, lr: 2.49e-03
2022-05-03 12:53:30,340 INFO [train.py:715] (0/8) Epoch 0, batch 5350, loss[loss=0.2015, simple_loss=0.3647, pruned_loss=0.191, over 4966.00 frames.], tot_loss[loss=0.1697, simple_loss=0.3108, pruned_loss=0.1427, over 973754.58 frames.], batch size: 15, lr: 2.48e-03
2022-05-03 12:54:10,014 INFO [train.py:715] (0/8) Epoch 0, batch 5400, loss[loss=0.2065, simple_loss=0.3686, pruned_loss=0.2222, over 4963.00 frames.], tot_loss[loss=0.1693, simple_loss=0.3102, pruned_loss=0.1423, over 974232.25 frames.], batch size: 24, lr: 2.47e-03
2022-05-03 12:54:50,451 INFO [train.py:715] (0/8) Epoch 0, batch 5450, loss[loss=0.161, simple_loss=0.2949, pruned_loss=0.1357, over 4888.00 frames.], tot_loss[loss=0.1692, simple_loss=0.31, pruned_loss=0.1419, over 973199.28 frames.], batch size: 22, lr: 2.47e-03
2022-05-03 12:55:31,404 INFO [train.py:715] (0/8) Epoch 0, batch 5500, loss[loss=0.1592, simple_loss=0.2938, pruned_loss=0.1235, over 4955.00 frames.], tot_loss[loss=0.1689, simple_loss=0.3096, pruned_loss=0.1409, over 972200.41 frames.], batch size: 35, lr: 2.46e-03
2022-05-03 12:56:11,122 INFO [train.py:715] (0/8) Epoch 0, batch 5550, loss[loss=0.1881, simple_loss=0.3411, pruned_loss=0.176, over 4829.00 frames.], tot_loss[loss=0.1692, simple_loss=0.3102, pruned_loss=0.1413, over 972196.06 frames.], batch size: 15, lr: 2.45e-03
2022-05-03 12:56:51,159 INFO [train.py:715] (0/8) Epoch 0, batch 5600, loss[loss=0.1705, simple_loss=0.3124, pruned_loss=0.1427, over 4745.00 frames.], tot_loss[loss=0.1684, simple_loss=0.3089, pruned_loss=0.1392, over 972421.62 frames.], batch size: 16, lr: 2.45e-03
2022-05-03 12:57:32,352 INFO [train.py:715] (0/8) Epoch 0, batch 5650, loss[loss=0.1508, simple_loss=0.2788, pruned_loss=0.1136, over 4813.00 frames.], tot_loss[loss=0.1671, simple_loss=0.3067, pruned_loss=0.1372, over 972779.08 frames.], batch size: 25, lr: 2.44e-03
2022-05-03 12:58:12,918 INFO [train.py:715] (0/8) Epoch 0, batch 5700, loss[loss=0.1577, simple_loss=0.2915, pruned_loss=0.1197, over 4773.00 frames.], tot_loss[loss=0.1675, simple_loss=0.3075, pruned_loss=0.1373, over 972187.00 frames.], batch size: 17, lr: 2.44e-03
2022-05-03 12:58:52,123 INFO [train.py:715] (0/8) Epoch 0, batch 5750, loss[loss=0.1845, simple_loss=0.3379, pruned_loss=0.1554, over 4918.00 frames.], tot_loss[loss=0.167, simple_loss=0.3068, pruned_loss=0.1362, over 972126.20 frames.], batch size: 29, lr: 2.43e-03
2022-05-03 12:59:33,131 INFO [train.py:715] (0/8) Epoch 0, batch 5800, loss[loss=0.1543, simple_loss=0.2831, pruned_loss=0.1274, over 4855.00 frames.], tot_loss[loss=0.1661, simple_loss=0.3052, pruned_loss=0.1346, over 972518.68 frames.], batch size: 13, lr: 2.42e-03
2022-05-03 13:00:14,315 INFO [train.py:715] (0/8) Epoch 0, batch 5850, loss[loss=0.1714, simple_loss=0.3119, pruned_loss=0.1545, over 4969.00 frames.], tot_loss[loss=0.166, simple_loss=0.3051, pruned_loss=0.135, over 972319.09 frames.], batch size: 24, lr: 2.42e-03
2022-05-03 13:00:54,231 INFO [train.py:715] (0/8) Epoch 0, batch 5900, loss[loss=0.1798, simple_loss=0.3282, pruned_loss=0.1572, over 4945.00 frames.], tot_loss[loss=0.1662, simple_loss=0.3055, pruned_loss=0.1344, over 972194.35 frames.], batch size: 24, lr: 2.41e-03
2022-05-03 13:01:33,979 INFO [train.py:715] (0/8) Epoch 0, batch 5950, loss[loss=0.142, simple_loss=0.2639, pruned_loss=0.1004, over 4741.00 frames.], tot_loss[loss=0.1665, simple_loss=0.3061, pruned_loss=0.1347, over 972757.69 frames.], batch size: 19, lr: 2.41e-03
2022-05-03 13:02:14,775 INFO [train.py:715] (0/8) Epoch 0, batch 6000, loss[loss=0.2898, simple_loss=0.3124, pruned_loss=0.1336, over 4922.00 frames.], tot_loss[loss=0.1675, simple_loss=0.3059, pruned_loss=0.1347, over 972571.79 frames.], batch size: 23, lr: 2.40e-03
2022-05-03 13:02:14,776 INFO [train.py:733] (0/8) Computing validation loss
2022-05-03 13:02:25,810 INFO [train.py:742] (0/8) Epoch 0, validation: loss=0.1779, simple_loss=0.2457, pruned_loss=0.05502, over 914524.00 frames.
2022-05-03 13:03:07,302 INFO [train.py:715] (0/8) Epoch 0, batch 6050, loss[loss=0.2597, simple_loss=0.3055, pruned_loss=0.1069, over 4779.00 frames.], tot_loss[loss=0.2002, simple_loss=0.3095, pruned_loss=0.139, over 973071.20 frames.], batch size: 18, lr: 2.39e-03
2022-05-03 13:03:47,844 INFO [train.py:715] (0/8) Epoch 0, batch 6100, loss[loss=0.2332, simple_loss=0.2614, pruned_loss=0.1026, over 4796.00 frames.], tot_loss[loss=0.2176, simple_loss=0.3074, pruned_loss=0.1367, over 973131.62 frames.], batch size: 24, lr: 2.39e-03
2022-05-03 13:04:27,373 INFO [train.py:715] (0/8) Epoch 0, batch 6150, loss[loss=0.2839, simple_loss=0.3243, pruned_loss=0.1217, over 4829.00 frames.], tot_loss[loss=0.2345, simple_loss=0.3084, pruned_loss=0.137, over 972591.44 frames.], batch size: 15, lr: 2.38e-03
2022-05-03 13:05:08,106 INFO [train.py:715] (0/8) Epoch 0, batch 6200, loss[loss=0.3514, simple_loss=0.3546, pruned_loss=0.1741, over 4835.00 frames.], tot_loss[loss=0.2476, simple_loss=0.3098, pruned_loss=0.1368, over 971858.25 frames.], batch size: 26, lr: 2.38e-03
2022-05-03 13:05:48,913 INFO [train.py:715] (0/8) Epoch 0, batch 6250, loss[loss=0.2306, simple_loss=0.2648, pruned_loss=0.09822, over 4952.00 frames.], tot_loss[loss=0.2543, simple_loss=0.3083, pruned_loss=0.1345, over 971987.52 frames.], batch size: 24, lr: 2.37e-03
2022-05-03 13:06:29,114 INFO [train.py:715] (0/8) Epoch 0, batch 6300, loss[loss=0.3315, simple_loss=0.3405, pruned_loss=0.1613, over 4794.00 frames.], tot_loss[loss=0.2605, simple_loss=0.3076, pruned_loss=0.1334, over 972311.60 frames.], batch size: 24, lr: 2.37e-03
2022-05-03 13:07:09,793 INFO [train.py:715] (0/8) Epoch 0, batch 6350, loss[loss=0.3464, simple_loss=0.3624, pruned_loss=0.1652, over 4916.00 frames.], tot_loss[loss=0.2675, simple_loss=0.309, pruned_loss=0.1338, over 971954.85 frames.], batch size: 17, lr: 2.36e-03
2022-05-03 13:07:50,717 INFO [train.py:715] (0/8) Epoch 0, batch 6400, loss[loss=0.2218, simple_loss=0.2638, pruned_loss=0.08988, over 4750.00 frames.], tot_loss[loss=0.2732, simple_loss=0.3104, pruned_loss=0.1342, over 971839.39 frames.], batch size: 19, lr: 2.35e-03
2022-05-03 13:08:30,733 INFO [train.py:715] (0/8) Epoch 0, batch 6450, loss[loss=0.2159, simple_loss=0.2653, pruned_loss=0.08323, over 4984.00 frames.], tot_loss[loss=0.2741, simple_loss=0.3092, pruned_loss=0.1321, over 971972.78 frames.], batch size: 14, lr: 2.35e-03
2022-05-03 13:09:10,066 INFO [train.py:715] (0/8) Epoch 0, batch 6500, loss[loss=0.3443, simple_loss=0.3542, pruned_loss=0.1672, over 4934.00 frames.], tot_loss[loss=0.2768, simple_loss=0.3097, pruned_loss=0.1318, over 971830.26 frames.], batch size: 29, lr: 2.34e-03
2022-05-03 13:09:50,937 INFO [train.py:715] (0/8) Epoch 0, batch 6550, loss[loss=0.3157, simple_loss=0.3192, pruned_loss=0.1561, over 4798.00 frames.], tot_loss[loss=0.2803, simple_loss=0.3112, pruned_loss=0.1323, over 972574.91 frames.], batch size: 12, lr: 2.34e-03
2022-05-03 13:10:31,738 INFO [train.py:715] (0/8) Epoch 0, batch 6600, loss[loss=0.2245, simple_loss=0.2657, pruned_loss=0.09163, over 4798.00 frames.], tot_loss[loss=0.2798, simple_loss=0.3098, pruned_loss=0.1309, over 972575.86 frames.], batch size: 14, lr: 2.33e-03
2022-05-03 13:11:11,210 INFO [train.py:715] (0/8) Epoch 0, batch 6650, loss[loss=0.329, simple_loss=0.346, pruned_loss=0.1559, over 4915.00 frames.], tot_loss[loss=0.2805, simple_loss=0.31, pruned_loss=0.1302, over 972608.12 frames.], batch size: 39, lr: 2.33e-03
2022-05-03 13:11:51,647 INFO [train.py:715] (0/8) Epoch 0, batch 6700, loss[loss=0.2928, simple_loss=0.3228, pruned_loss=0.1314, over 4898.00 frames.], tot_loss[loss=0.2806, simple_loss=0.3095, pruned_loss=0.1294, over 972863.26 frames.], batch size: 19, lr: 2.32e-03
2022-05-03 13:12:32,418 INFO [train.py:715] (0/8) Epoch 0, batch 6750, loss[loss=0.2716, simple_loss=0.3009, pruned_loss=0.1211, over 4782.00 frames.], tot_loss[loss=0.2808, simple_loss=0.3094, pruned_loss=0.1289, over 972407.88 frames.], batch size: 14, lr: 2.31e-03
2022-05-03 13:13:12,495 INFO [train.py:715] (0/8) Epoch 0, batch 6800, loss[loss=0.3307, simple_loss=0.334, pruned_loss=0.1637, over 4847.00 frames.], tot_loss[loss=0.2802, simple_loss=0.3092, pruned_loss=0.1278, over 972449.29 frames.], batch size: 30, lr: 2.31e-03
2022-05-03 13:13:52,213 INFO [train.py:715] (0/8) Epoch 0, batch 6850, loss[loss=0.3576, simple_loss=0.3669, pruned_loss=0.1742, over 4821.00 frames.], tot_loss[loss=0.2799, simple_loss=0.3088, pruned_loss=0.1272, over 972762.51 frames.], batch size: 26, lr: 2.30e-03
2022-05-03 13:14:32,488 INFO [train.py:715] (0/8) Epoch 0, batch 6900, loss[loss=0.286, simple_loss=0.3134, pruned_loss=0.1293, over 4901.00 frames.], tot_loss[loss=0.28, simple_loss=0.3087, pruned_loss=0.1269, over 972714.86 frames.], batch size: 22, lr: 2.30e-03
2022-05-03 13:15:12,911 INFO [train.py:715] (0/8) Epoch 0, batch 6950, loss[loss=0.25, simple_loss=0.2936, pruned_loss=0.1032, over 4939.00 frames.], tot_loss[loss=0.2781, simple_loss=0.3074, pruned_loss=0.1254, over 972787.86 frames.], batch size: 21, lr: 2.29e-03
2022-05-03 13:15:53,032 INFO [train.py:715] (0/8) Epoch 0, batch 7000, loss[loss=0.2507, simple_loss=0.2882, pruned_loss=0.1067, over 4830.00 frames.], tot_loss[loss=0.2778, simple_loss=0.308, pruned_loss=0.1246, over 972294.14 frames.], batch size: 13, lr: 2.29e-03
2022-05-03 13:16:33,740 INFO [train.py:715] (0/8) Epoch 0, batch 7050, loss[loss=0.2353, simple_loss=0.2737, pruned_loss=0.0985, over 4754.00 frames.], tot_loss[loss=0.2758, simple_loss=0.3065, pruned_loss=0.1232, over 972204.35 frames.], batch size: 19, lr: 2.28e-03
2022-05-03 13:17:14,925 INFO [train.py:715] (0/8) Epoch 0, batch 7100, loss[loss=0.2706, simple_loss=0.2996, pruned_loss=0.1208, over 4794.00 frames.], tot_loss[loss=0.2765, simple_loss=0.3068, pruned_loss=0.1236, over 972171.66 frames.], batch size: 24, lr: 2.28e-03
2022-05-03 13:17:55,866 INFO [train.py:715] (0/8) Epoch 0, batch 7150, loss[loss=0.2962, simple_loss=0.3207, pruned_loss=0.1358, over 4778.00 frames.], tot_loss[loss=0.2771, simple_loss=0.307, pruned_loss=0.1239, over 972061.56 frames.], batch size: 17, lr: 2.27e-03
2022-05-03 13:18:35,512 INFO [train.py:715] (0/8) Epoch 0, batch 7200, loss[loss=0.2455, simple_loss=0.2825, pruned_loss=0.1043, over 4984.00 frames.], tot_loss[loss=0.276, simple_loss=0.3064, pruned_loss=0.1231, over 972097.60 frames.], batch size: 15, lr: 2.27e-03
2022-05-03 13:19:16,089 INFO [train.py:715] (0/8) Epoch 0, batch 7250, loss[loss=0.2819, simple_loss=0.305, pruned_loss=0.1294, over 4912.00 frames.], tot_loss[loss=0.275, simple_loss=0.3061, pruned_loss=0.1222, over 972268.41 frames.], batch size: 17, lr: 2.26e-03
2022-05-03 13:19:55,974 INFO [train.py:715] (0/8) Epoch 0, batch 7300, loss[loss=0.2513, simple_loss=0.2811, pruned_loss=0.1108, over 4947.00 frames.], tot_loss[loss=0.2743, simple_loss=0.3055, pruned_loss=0.1217, over 972753.91 frames.], batch size: 35, lr: 2.26e-03
2022-05-03 13:20:36,062 INFO [train.py:715] (0/8) Epoch 0, batch 7350, loss[loss=0.2129, simple_loss=0.2567, pruned_loss=0.08454, over 4815.00 frames.], tot_loss[loss=0.2741, simple_loss=0.3055, pruned_loss=0.1215, over 972569.33 frames.], batch size: 12, lr: 2.25e-03
2022-05-03 13:21:16,439 INFO [train.py:715] (0/8) Epoch 0, batch 7400, loss[loss=0.3084, simple_loss=0.3354, pruned_loss=0.1408, over 4842.00 frames.], tot_loss[loss=0.2732, simple_loss=0.305, pruned_loss=0.1208, over 973297.52 frames.], batch size: 26, lr: 2.24e-03
2022-05-03 13:21:57,037 INFO [train.py:715] (0/8) Epoch 0, batch 7450, loss[loss=0.2356, simple_loss=0.2759, pruned_loss=0.09764, over 4915.00 frames.], tot_loss[loss=0.2725, simple_loss=0.3052, pruned_loss=0.12, over 973385.71 frames.], batch size: 23, lr: 2.24e-03
2022-05-03 13:22:36,841 INFO [train.py:715] (0/8) Epoch 0, batch 7500, loss[loss=0.2608, simple_loss=0.2916, pruned_loss=0.115, over 4795.00 frames.], tot_loss[loss=0.2699, simple_loss=0.3034, pruned_loss=0.1182, over 973087.19 frames.], batch size: 14, lr: 2.23e-03
2022-05-03 13:23:16,565 INFO [train.py:715] (0/8) Epoch 0, batch 7550, loss[loss=0.3515, simple_loss=0.3636, pruned_loss=0.1696, over 4807.00 frames.], tot_loss[loss=0.2708, simple_loss=0.3042, pruned_loss=0.1188, over 972809.80 frames.], batch size: 24, lr: 2.23e-03
2022-05-03 13:23:57,044 INFO [train.py:715] (0/8) Epoch 0, batch 7600, loss[loss=0.2593, simple_loss=0.2996, pruned_loss=0.1095, over 4754.00 frames.], tot_loss[loss=0.2698, simple_loss=0.3036, pruned_loss=0.118, over 972008.15 frames.], batch size: 19, lr: 2.22e-03
2022-05-03 13:24:37,501 INFO [train.py:715] (0/8) Epoch 0, batch 7650, loss[loss=0.2639, simple_loss=0.304, pruned_loss=0.1119, over 4906.00 frames.], tot_loss[loss=0.2679, simple_loss=0.3022, pruned_loss=0.1168, over 971548.73 frames.], batch size: 18, lr: 2.22e-03
2022-05-03 13:25:16,993 INFO [train.py:715] (0/8) Epoch 0, batch 7700, loss[loss=0.2549, simple_loss=0.2938, pruned_loss=0.108, over 4690.00 frames.], tot_loss[loss=0.2705, simple_loss=0.3044, pruned_loss=0.1183, over 971710.93 frames.], batch size: 15, lr: 2.21e-03
2022-05-03 13:25:57,322 INFO [train.py:715] (0/8) Epoch 0, batch 7750, loss[loss=0.2454, simple_loss=0.2905, pruned_loss=0.1001, over 4737.00 frames.], tot_loss[loss=0.2708, simple_loss=0.305, pruned_loss=0.1183, over 972276.47 frames.], batch size: 16, lr: 2.21e-03
2022-05-03 13:26:38,378 INFO [train.py:715] (0/8) Epoch 0, batch 7800, loss[loss=0.2784, simple_loss=0.3185, pruned_loss=0.1191, over 4932.00 frames.], tot_loss[loss=0.2706, simple_loss=0.3047, pruned_loss=0.1183, over 972815.95 frames.], batch size: 29, lr: 2.20e-03
2022-05-03 13:27:18,714 INFO [train.py:715] (0/8) Epoch 0, batch 7850, loss[loss=0.3163, simple_loss=0.324, pruned_loss=0.1543, over 4773.00 frames.], tot_loss[loss=0.2712, simple_loss=0.3054, pruned_loss=0.1185, over 972751.94 frames.], batch size: 18, lr: 2.20e-03
2022-05-03 13:27:58,879 INFO [train.py:715] (0/8) Epoch 0, batch 7900, loss[loss=0.2948, simple_loss=0.3263, pruned_loss=0.1316, over 4755.00 frames.], tot_loss[loss=0.2713, simple_loss=0.3057, pruned_loss=0.1184, over 972862.49 frames.], batch size: 14, lr: 2.19e-03
2022-05-03 13:28:39,527 INFO [train.py:715] (0/8) Epoch 0, batch 7950, loss[loss=0.2593, simple_loss=0.2919, pruned_loss=0.1133, over 4885.00 frames.], tot_loss[loss=0.2706, simple_loss=0.3049, pruned_loss=0.1182, over 972082.99 frames.], batch size: 38, lr: 2.19e-03
2022-05-03 13:29:19,141 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-8000.pt
2022-05-03 13:29:22,248 INFO [train.py:715] (0/8) Epoch 0, batch 8000, loss[loss=0.3109, simple_loss=0.3352, pruned_loss=0.1433, over 4987.00 frames.], tot_loss[loss=0.2681, simple_loss=0.3031, pruned_loss=0.1165, over 972507.45 frames.], batch size: 14, lr: 2.18e-03
2022-05-03 13:30:02,111 INFO [train.py:715] (0/8) Epoch 0, batch 8050, loss[loss=0.3451, simple_loss=0.373, pruned_loss=0.1586, over 4975.00 frames.], tot_loss[loss=0.2691, simple_loss=0.3039, pruned_loss=0.1172, over 971312.97 frames.], batch size: 28, lr: 2.18e-03
2022-05-03 13:30:41,977 INFO [train.py:715] (0/8) Epoch 0, batch 8100, loss[loss=0.2883, simple_loss=0.318, pruned_loss=0.1292, over 4899.00 frames.], tot_loss[loss=0.2696, simple_loss=0.3045, pruned_loss=0.1173, over 970884.16 frames.], batch size: 22, lr: 2.17e-03
2022-05-03 13:31:23,000 INFO [train.py:715] (0/8) Epoch 0, batch 8150, loss[loss=0.2406, simple_loss=0.2801, pruned_loss=0.1006, over 4944.00 frames.], tot_loss[loss=0.2682, simple_loss=0.3033, pruned_loss=0.1166, over 971638.10 frames.], batch size: 21, lr: 2.17e-03
2022-05-03 13:32:02,626 INFO [train.py:715] (0/8) Epoch 0, batch 8200, loss[loss=0.2546, simple_loss=0.3004, pruned_loss=0.1044, over 4914.00 frames.], tot_loss[loss=0.2656, simple_loss=0.3018, pruned_loss=0.1147, over 971415.33 frames.], batch size: 23, lr: 2.16e-03
2022-05-03 13:32:42,138 INFO [train.py:715] (0/8) Epoch 0, batch 8250, loss[loss=0.2089, simple_loss=0.2623, pruned_loss=0.07772, over 4991.00 frames.], tot_loss[loss=0.2656, simple_loss=0.3018, pruned_loss=0.1147, over 971508.27 frames.], batch size: 14, lr: 2.16e-03
2022-05-03 13:33:22,996 INFO [train.py:715] (0/8) Epoch 0, batch 8300, loss[loss=0.2816, simple_loss=0.3234, pruned_loss=0.1199, over 4870.00 frames.], tot_loss[loss=0.2648, simple_loss=0.3013, pruned_loss=0.1142, over 971675.48 frames.], batch size: 22, lr: 2.15e-03
2022-05-03 13:34:03,427 INFO [train.py:715] (0/8) Epoch 0, batch 8350, loss[loss=0.278, simple_loss=0.3218, pruned_loss=0.1172, over 4919.00 frames.], tot_loss[loss=0.2628, simple_loss=0.2998, pruned_loss=0.1129, over 971947.12 frames.], batch size: 17, lr: 2.15e-03
2022-05-03 13:34:43,100 INFO [train.py:715] (0/8) Epoch 0, batch 8400, loss[loss=0.2841, simple_loss=0.3181, pruned_loss=0.1251, over 4921.00 frames.], tot_loss[loss=0.2642, simple_loss=0.3014, pruned_loss=0.1135, over 972573.54 frames.], batch size: 17, lr: 2.15e-03
2022-05-03 13:35:23,384 INFO [train.py:715] (0/8) Epoch 0, batch 8450, loss[loss=0.2618, simple_loss=0.2949, pruned_loss=0.1144, over 4923.00 frames.], tot_loss[loss=0.2627, simple_loss=0.3001, pruned_loss=0.1127, over 972397.52 frames.], batch size: 18, lr: 2.14e-03
2022-05-03 13:36:04,655 INFO [train.py:715] (0/8) Epoch 0, batch 8500, loss[loss=0.2389, simple_loss=0.2761, pruned_loss=0.1009, over 4962.00 frames.], tot_loss[loss=0.2633, simple_loss=0.3003, pruned_loss=0.1132, over 972121.37 frames.], batch size: 35, lr: 2.14e-03
2022-05-03 13:36:45,724 INFO [train.py:715] (0/8) Epoch 0, batch 8550, loss[loss=0.253, simple_loss=0.2977, pruned_loss=0.1042, over 4984.00 frames.], tot_loss[loss=0.2628, simple_loss=0.3, pruned_loss=0.1128, over 972079.70 frames.], batch size: 28, lr: 2.13e-03
2022-05-03 13:37:25,367 INFO [train.py:715] (0/8) Epoch 0, batch 8600, loss[loss=0.2527, simple_loss=0.2943, pruned_loss=0.1056, over 4989.00 frames.], tot_loss[loss=0.263, simple_loss=0.3002, pruned_loss=0.1129, over 973001.59 frames.], batch size: 16, lr: 2.13e-03
2022-05-03 13:38:06,750 INFO [train.py:715] (0/8) Epoch 0, batch 8650, loss[loss=0.2126, simple_loss=0.2656, pruned_loss=0.07979, over 4806.00 frames.], tot_loss[loss=0.2608, simple_loss=0.2993, pruned_loss=0.1112, over 972925.32 frames.], batch size: 26, lr: 2.12e-03
2022-05-03 13:38:47,697 INFO [train.py:715] (0/8) Epoch 0, batch 8700, loss[loss=0.255, simple_loss=0.2828, pruned_loss=0.1136, over 4778.00 frames.], tot_loss[loss=0.26, simple_loss=0.2983, pruned_loss=0.1108, over 972976.78 frames.], batch size: 12, lr: 2.12e-03
2022-05-03 13:39:27,756 INFO [train.py:715] (0/8) Epoch 0, batch 8750, loss[loss=0.2171, simple_loss=0.2692, pruned_loss=0.08249, over 4748.00 frames.], tot_loss[loss=0.259, simple_loss=0.2976, pruned_loss=0.1102, over 973076.90 frames.], batch size: 19, lr: 2.11e-03
2022-05-03 13:40:08,242 INFO [train.py:715] (0/8) Epoch 0, batch 8800, loss[loss=0.2644, simple_loss=0.3188, pruned_loss=0.105, over 4754.00 frames.], tot_loss[loss=0.2592, simple_loss=0.2979, pruned_loss=0.1102, over 973196.05 frames.], batch size: 16, lr: 2.11e-03
2022-05-03 13:40:48,805 INFO [train.py:715] (0/8) Epoch 0, batch 8850, loss[loss=0.2617, simple_loss=0.2979, pruned_loss=0.1128, over 4899.00 frames.], tot_loss[loss=0.2596, simple_loss=0.2983, pruned_loss=0.1105, over 973338.07 frames.], batch size: 19, lr: 2.10e-03
2022-05-03 13:41:29,535 INFO [train.py:715] (0/8) Epoch 0, batch 8900, loss[loss=0.2333, simple_loss=0.2851, pruned_loss=0.09079, over 4775.00 frames.], tot_loss[loss=0.2578, simple_loss=0.297, pruned_loss=0.1093, over 972968.31 frames.], batch size: 18, lr: 2.10e-03
2022-05-03 13:42:09,371 INFO [train.py:715] (0/8) Epoch 0, batch 8950, loss[loss=0.2315, simple_loss=0.2705, pruned_loss=0.09627, over 4919.00 frames.], tot_loss[loss=0.2603, simple_loss=0.2987, pruned_loss=0.1109, over 973481.14 frames.], batch size: 18, lr: 2.10e-03
2022-05-03 13:42:49,921 INFO [train.py:715] (0/8) Epoch 0, batch 9000, loss[loss=0.316, simple_loss=0.3478, pruned_loss=0.142, over 4828.00 frames.], tot_loss[loss=0.2607, simple_loss=0.2991, pruned_loss=0.1112, over 972632.03 frames.], batch size: 25, lr: 2.09e-03
2022-05-03 13:42:49,922 INFO [train.py:733] (0/8) Computing validation loss
2022-05-03 13:43:03,385 INFO [train.py:742] (0/8) Epoch 0, validation: loss=0.1592, simple_loss=0.2426, pruned_loss=0.03794, over 914524.00 frames.
2022-05-03 13:43:44,297 INFO [train.py:715] (0/8) Epoch 0, batch 9050, loss[loss=0.2606, simple_loss=0.2948, pruned_loss=0.1132, over 4842.00 frames.], tot_loss[loss=0.2599, simple_loss=0.2988, pruned_loss=0.1105, over 972666.63 frames.], batch size: 30, lr: 2.09e-03
2022-05-03 13:44:24,656 INFO [train.py:715] (0/8) Epoch 0, batch 9100, loss[loss=0.2514, simple_loss=0.2969, pruned_loss=0.103, over 4812.00 frames.], tot_loss[loss=0.2597, simple_loss=0.2989, pruned_loss=0.1102, over 972893.97 frames.], batch size: 21, lr: 2.08e-03
2022-05-03 13:45:04,786 INFO [train.py:715] (0/8) Epoch 0, batch 9150, loss[loss=0.2273, simple_loss=0.2699, pruned_loss=0.09235, over 4922.00 frames.], tot_loss[loss=0.2588, simple_loss=0.2983, pruned_loss=0.1097, over 972481.53 frames.], batch size: 23, lr: 2.08e-03
2022-05-03 13:45:44,980 INFO [train.py:715] (0/8) Epoch 0, batch 9200, loss[loss=0.2504, simple_loss=0.2875, pruned_loss=0.1066, over 4843.00 frames.], tot_loss[loss=0.2577, simple_loss=0.2971, pruned_loss=0.1092, over 972424.91 frames.], batch size: 30, lr: 2.07e-03
2022-05-03 13:46:26,079 INFO [train.py:715] (0/8) Epoch 0, batch 9250, loss[loss=0.2418, simple_loss=0.2922, pruned_loss=0.09569, over 4768.00 frames.], tot_loss[loss=0.2587, simple_loss=0.2977, pruned_loss=0.1098, over 972973.70 frames.], batch size: 17, lr: 2.07e-03
2022-05-03 13:47:06,382 INFO [train.py:715] (0/8) Epoch 0, batch 9300, loss[loss=0.2408, simple_loss=0.2887, pruned_loss=0.09642, over 4854.00 frames.], tot_loss[loss=0.2598, simple_loss=0.299, pruned_loss=0.1103, over 972571.95 frames.], batch size: 13, lr: 2.06e-03
2022-05-03 13:47:45,671 INFO [train.py:715] (0/8) Epoch 0, batch 9350, loss[loss=0.2203, simple_loss=0.269, pruned_loss=0.08584, over 4947.00 frames.], tot_loss[loss=0.2591, simple_loss=0.2989, pruned_loss=0.1097, over 972934.38 frames.], batch size: 35, lr: 2.06e-03
2022-05-03 13:48:27,114 INFO [train.py:715] (0/8) Epoch 0, batch 9400, loss[loss=0.2097, simple_loss=0.261, pruned_loss=0.07916, over 4912.00 frames.], tot_loss[loss=0.258, simple_loss=0.2979, pruned_loss=0.1091, over 972715.04 frames.], batch size: 19, lr: 2.06e-03
2022-05-03 13:49:07,601 INFO [train.py:715] (0/8) Epoch 0, batch 9450, loss[loss=0.2651, simple_loss=0.2993, pruned_loss=0.1155, over 4849.00 frames.], tot_loss[loss=0.2572, simple_loss=0.2974, pruned_loss=0.1084, over 972841.63 frames.], batch size: 13, lr: 2.05e-03
2022-05-03 13:49:47,924 INFO [train.py:715] (0/8) Epoch 0, batch 9500, loss[loss=0.3349, simple_loss=0.3447, pruned_loss=0.1626, over 4936.00 frames.], tot_loss[loss=0.2564, simple_loss=0.2967, pruned_loss=0.1081, over 973308.13 frames.], batch size: 21, lr: 2.05e-03
2022-05-03 13:50:28,012 INFO [train.py:715] (0/8) Epoch 0, batch 9550, loss[loss=0.2624, simple_loss=0.298, pruned_loss=0.1133, over 4899.00 frames.], tot_loss[loss=0.2556, simple_loss=0.296, pruned_loss=0.1076, over 973089.10 frames.], batch size: 19, lr: 2.04e-03
2022-05-03 13:51:08,461 INFO [train.py:715] (0/8) Epoch 0, batch 9600, loss[loss=0.2256, simple_loss=0.2662, pruned_loss=0.09252, over 4772.00 frames.], tot_loss[loss=0.2547, simple_loss=0.2956, pruned_loss=0.1069, over 973221.75 frames.], batch size: 18, lr: 2.04e-03
2022-05-03 13:51:48,908 INFO [train.py:715] (0/8) Epoch 0, batch 9650, loss[loss=0.2422, simple_loss=0.2912, pruned_loss=0.09663, over 4792.00 frames.], tot_loss[loss=0.2548, simple_loss=0.296, pruned_loss=0.1069, over 973311.67 frames.], batch size: 17, lr: 2.03e-03
2022-05-03 13:52:27,673 INFO [train.py:715] (0/8) Epoch 0, batch 9700, loss[loss=0.2724, simple_loss=0.3038, pruned_loss=0.1205, over 4883.00 frames.], tot_loss[loss=0.2533, simple_loss=0.2951, pruned_loss=0.1057, over 973439.76 frames.], batch size: 22, lr: 2.03e-03
2022-05-03 13:53:08,241 INFO [train.py:715] (0/8) Epoch 0, batch 9750, loss[loss=0.2894, simple_loss=0.3246, pruned_loss=0.1271, over 4697.00 frames.], tot_loss[loss=0.2554, simple_loss=0.2962, pruned_loss=0.1073, over 973190.43 frames.], batch size: 15, lr: 2.03e-03
2022-05-03 13:53:47,976 INFO [train.py:715] (0/8) Epoch 0, batch 9800, loss[loss=0.3114, simple_loss=0.3368, pruned_loss=0.143, over 4973.00 frames.], tot_loss[loss=0.2565, simple_loss=0.2971, pruned_loss=0.108, over 973150.63 frames.], batch size: 15, lr: 2.02e-03
2022-05-03 13:54:27,876 INFO [train.py:715] (0/8) Epoch 0, batch 9850, loss[loss=0.2458, simple_loss=0.2883, pruned_loss=0.1016, over 4758.00 frames.], tot_loss[loss=0.256, simple_loss=0.2971, pruned_loss=0.1075, over 972657.80 frames.], batch size: 19, lr: 2.02e-03
2022-05-03 13:55:07,636 INFO [train.py:715] (0/8) Epoch 0, batch 9900, loss[loss=0.2382, simple_loss=0.2865, pruned_loss=0.095, over 4916.00 frames.], tot_loss[loss=0.2557, simple_loss=0.2972, pruned_loss=0.1071, over 972792.43 frames.], batch size: 18, lr: 2.01e-03
2022-05-03 13:55:47,712 INFO [train.py:715] (0/8) Epoch 0, batch 9950, loss[loss=0.2262, simple_loss=0.2815, pruned_loss=0.08541, over 4838.00 frames.], tot_loss[loss=0.2549, simple_loss=0.2965, pruned_loss=0.1066, over 972753.88 frames.], batch size: 27, lr: 2.01e-03
2022-05-03 13:56:27,933 INFO [train.py:715] (0/8) Epoch 0, batch 10000, loss[loss=0.2954, simple_loss=0.327, pruned_loss=0.1319, over 4780.00 frames.], tot_loss[loss=0.255, simple_loss=0.2964, pruned_loss=0.1068, over 973495.32 frames.], batch size: 14, lr: 2.01e-03
2022-05-03 13:57:07,309 INFO [train.py:715] (0/8) Epoch 0, batch 10050, loss[loss=0.219, simple_loss=0.261, pruned_loss=0.08845, over 4725.00 frames.], tot_loss[loss=0.2536, simple_loss=0.2955, pruned_loss=0.1058, over 973643.91 frames.], batch size: 16, lr: 2.00e-03
2022-05-03 13:57:47,855 INFO [train.py:715] (0/8) Epoch 0, batch 10100, loss[loss=0.2852, simple_loss=0.3094, pruned_loss=0.1305, over 4875.00 frames.], tot_loss[loss=0.2536, simple_loss=0.2955, pruned_loss=0.1058, over 974092.28 frames.], batch size: 16, lr: 2.00e-03
2022-05-03 13:58:27,704 INFO [train.py:715] (0/8) Epoch 0, batch 10150, loss[loss=0.2704, simple_loss=0.3239, pruned_loss=0.1084, over 4992.00 frames.], tot_loss[loss=0.2526, simple_loss=0.2949, pruned_loss=0.1051, over 974207.90 frames.], batch size: 20, lr: 1.99e-03
2022-05-03 13:59:07,285 INFO [train.py:715] (0/8) Epoch 0, batch 10200, loss[loss=0.2475, simple_loss=0.2895, pruned_loss=0.1028, over 4899.00 frames.], tot_loss[loss=0.2532, simple_loss=0.2953, pruned_loss=0.1055, over 973380.21 frames.], batch size: 17, lr: 1.99e-03
2022-05-03 13:59:47,206 INFO [train.py:715] (0/8) Epoch 0, batch 10250, loss[loss=0.2415, simple_loss=0.2868, pruned_loss=0.09811, over 4972.00 frames.], tot_loss[loss=0.2522, simple_loss=0.2944, pruned_loss=0.105, over 974219.85 frames.], batch size: 25, lr: 1.99e-03
2022-05-03 14:00:28,079 INFO [train.py:715] (0/8) Epoch 0, batch 10300, loss[loss=0.2624, simple_loss=0.3168, pruned_loss=0.104, over 4962.00 frames.], tot_loss[loss=0.2512, simple_loss=0.2936, pruned_loss=0.1044, over 973900.83 frames.], batch size: 29, lr: 1.98e-03
2022-05-03 14:01:08,334 INFO [train.py:715] (0/8) Epoch 0, batch 10350, loss[loss=0.3061, simple_loss=0.3429, pruned_loss=0.1346, over 4940.00 frames.], tot_loss[loss=0.2524, simple_loss=0.2945, pruned_loss=0.1051, over 973678.48 frames.], batch size: 24, lr: 1.98e-03
2022-05-03 14:01:47,793 INFO [train.py:715] (0/8) Epoch 0, batch 10400, loss[loss=0.2519, simple_loss=0.2893, pruned_loss=0.1073, over 4685.00 frames.], tot_loss[loss=0.2529, simple_loss=0.2947, pruned_loss=0.1055, over 972683.14 frames.], batch size: 15, lr: 1.97e-03
2022-05-03 14:02:28,423 INFO [train.py:715] (0/8) Epoch 0, batch 10450, loss[loss=0.2144, simple_loss=0.2564, pruned_loss=0.0862, over 4639.00 frames.], tot_loss[loss=0.254, simple_loss=0.2959, pruned_loss=0.106, over 972365.23 frames.], batch size: 13, lr: 1.97e-03
2022-05-03 14:03:09,166 INFO [train.py:715] (0/8) Epoch 0, batch 10500, loss[loss=0.253, simple_loss=0.2907, pruned_loss=0.1076, over 4961.00 frames.], tot_loss[loss=0.2543, simple_loss=0.296, pruned_loss=0.1063, over 971537.08 frames.], batch size: 21, lr: 1.97e-03
2022-05-03 14:03:48,866 INFO [train.py:715] (0/8) Epoch 0, batch 10550, loss[loss=0.2615, simple_loss=0.2889, pruned_loss=0.117, over 4823.00 frames.], tot_loss[loss=0.2542, simple_loss=0.2962, pruned_loss=0.1061, over 972116.18 frames.], batch size: 12, lr: 1.96e-03
2022-05-03 14:04:28,878 INFO [train.py:715] (0/8) Epoch 0, batch 10600, loss[loss=0.1999, simple_loss=0.2638, pruned_loss=0.06801, over 4927.00 frames.], tot_loss[loss=0.2522, simple_loss=0.295, pruned_loss=0.1047, over 972866.27 frames.], batch size: 23, lr: 1.96e-03
2022-05-03 14:05:09,750 INFO [train.py:715] (0/8) Epoch 0, batch 10650, loss[loss=0.2991, simple_loss=0.3294, pruned_loss=0.1344, over 4977.00 frames.], tot_loss[loss=0.2516, simple_loss=0.2946, pruned_loss=0.1043, over 973024.09 frames.], batch size: 35, lr: 1.96e-03
2022-05-03 14:05:49,659 INFO [train.py:715] (0/8) Epoch 0, batch 10700, loss[loss=0.2882, simple_loss=0.3223, pruned_loss=0.1271, over 4963.00 frames.], tot_loss[loss=0.2507, simple_loss=0.2946, pruned_loss=0.1034, over 972556.58 frames.], batch size: 35, lr: 1.95e-03
2022-05-03 14:06:29,548 INFO [train.py:715] (0/8) Epoch 0, batch 10750, loss[loss=0.2442, simple_loss=0.2969, pruned_loss=0.09572, over 4935.00 frames.], tot_loss[loss=0.2498, simple_loss=0.2936, pruned_loss=0.103, over 972958.86 frames.], batch size: 23, lr: 1.95e-03
2022-05-03 14:07:09,726 INFO [train.py:715] (0/8) Epoch 0, batch 10800, loss[loss=0.1968, simple_loss=0.2523, pruned_loss=0.07066, over 4911.00 frames.], tot_loss[loss=0.2484, simple_loss=0.2927, pruned_loss=0.1021, over 972165.38 frames.], batch size: 17, lr: 1.94e-03
2022-05-03 14:07:50,565 INFO [train.py:715] (0/8) Epoch 0, batch 10850, loss[loss=0.1786, simple_loss=0.2381, pruned_loss=0.05958, over 4834.00 frames.], tot_loss[loss=0.2492, simple_loss=0.293, pruned_loss=0.1027, over 972686.93 frames.], batch size: 13, lr: 1.94e-03
2022-05-03 14:08:30,103 INFO [train.py:715] (0/8) Epoch 0, batch 10900, loss[loss=0.239, simple_loss=0.2823, pruned_loss=0.09782, over 4875.00 frames.], tot_loss[loss=0.2478, simple_loss=0.2919, pruned_loss=0.1018, over 972768.60 frames.], batch size: 30, lr: 1.94e-03
2022-05-03 14:09:10,037 INFO [train.py:715] (0/8) Epoch 0, batch 10950, loss[loss=0.2576, simple_loss=0.2822, pruned_loss=0.1164, over 4842.00 frames.], tot_loss[loss=0.2483, simple_loss=0.292, pruned_loss=0.1023, over 971913.38 frames.], batch size: 13, lr: 1.93e-03
2022-05-03 14:09:50,814 INFO [train.py:715] (0/8) Epoch 0, batch 11000, loss[loss=0.2686, simple_loss=0.3162, pruned_loss=0.1105, over 4838.00 frames.], tot_loss[loss=0.245, simple_loss=0.2895, pruned_loss=0.1002, over 971791.75 frames.], batch size: 26, lr: 1.93e-03
2022-05-03 14:10:31,102 INFO [train.py:715] (0/8) Epoch 0, batch 11050, loss[loss=0.2591, simple_loss=0.2865, pruned_loss=0.1159, over 4892.00 frames.], tot_loss[loss=0.2457, simple_loss=0.2899, pruned_loss=0.1008, over 970963.90 frames.], batch size: 17, lr: 1.93e-03
2022-05-03 14:11:11,145 INFO [train.py:715] (0/8) Epoch 0, batch 11100, loss[loss=0.2912, simple_loss=0.3203, pruned_loss=0.1311, over 4844.00 frames.], tot_loss[loss=0.2446, simple_loss=0.2894, pruned_loss=0.0999, over 971237.61 frames.], batch size: 20, lr: 1.92e-03
2022-05-03 14:11:51,022 INFO [train.py:715] (0/8) Epoch 0, batch 11150, loss[loss=0.2925, simple_loss=0.3275, pruned_loss=0.1288, over 4835.00 frames.], tot_loss[loss=0.2443, simple_loss=0.289, pruned_loss=0.09978, over 971472.53 frames.], batch size: 30, lr: 1.92e-03
2022-05-03 14:12:31,464 INFO [train.py:715] (0/8) Epoch 0, batch 11200, loss[loss=0.2669, simple_loss=0.2969, pruned_loss=0.1184, over 4776.00 frames.], tot_loss[loss=0.2439, simple_loss=0.2888, pruned_loss=0.09945, over 971498.16 frames.], batch size: 14, lr: 1.92e-03
2022-05-03 14:13:10,945 INFO [train.py:715] (0/8) Epoch 0, batch 11250, loss[loss=0.2578, simple_loss=0.2871, pruned_loss=0.1142, over 4962.00 frames.], tot_loss[loss=0.2448, simple_loss=0.2893, pruned_loss=0.1002, over 972807.05 frames.], batch size: 15, lr: 1.91e-03
2022-05-03 14:13:51,038 INFO [train.py:715] (0/8) Epoch 0, batch 11300, loss[loss=0.2664, simple_loss=0.3131, pruned_loss=0.1098, over 4902.00 frames.], tot_loss[loss=0.2451, simple_loss=0.2896, pruned_loss=0.1003, over 973015.42 frames.], batch size: 17, lr: 1.91e-03
2022-05-03 14:14:31,686 INFO [train.py:715] (0/8) Epoch 0, batch 11350, loss[loss=0.2684, simple_loss=0.2908, pruned_loss=0.123, over 4984.00 frames.], tot_loss[loss=0.2475, simple_loss=0.2915, pruned_loss=0.1018, over 973363.82 frames.], batch size: 15, lr: 1.90e-03
2022-05-03 14:15:12,112 INFO [train.py:715] (0/8) Epoch 0, batch 11400, loss[loss=0.3157, simple_loss=0.3364, pruned_loss=0.1475, over 4810.00 frames.], tot_loss[loss=0.2475, simple_loss=0.2918, pruned_loss=0.1016, over 973862.52 frames.], batch size: 26, lr: 1.90e-03
2022-05-03 14:15:51,359 INFO [train.py:715] (0/8) Epoch 0, batch 11450, loss[loss=0.2652, simple_loss=0.3224, pruned_loss=0.104, over 4703.00 frames.], tot_loss[loss=0.2455, simple_loss=0.2904, pruned_loss=0.1003, over 973315.82 frames.], batch size: 15, lr: 1.90e-03
2022-05-03 14:16:32,015 INFO [train.py:715] (0/8) Epoch 0, batch 11500, loss[loss=0.2178, simple_loss=0.278, pruned_loss=0.07879, over 4949.00 frames.], tot_loss[loss=0.2449, simple_loss=0.2901, pruned_loss=0.09985, over 973204.00 frames.], batch size: 24, lr: 1.89e-03
2022-05-03 14:17:12,409 INFO [train.py:715] (0/8) Epoch 0, batch 11550, loss[loss=0.2085, simple_loss=0.2646, pruned_loss=0.07619, over 4897.00 frames.], tot_loss[loss=0.2434, simple_loss=0.2892, pruned_loss=0.09878, over 972654.71 frames.], batch size: 18, lr: 1.89e-03
2022-05-03 14:17:52,480 INFO [train.py:715] (0/8) Epoch 0, batch 11600, loss[loss=0.2504, simple_loss=0.2977, pruned_loss=0.1016, over 4799.00 frames.], tot_loss[loss=0.2439, simple_loss=0.2892, pruned_loss=0.0993, over 972519.19 frames.], batch size: 21, lr: 1.89e-03
2022-05-03 14:18:32,575 INFO [train.py:715] (0/8) Epoch 0, batch 11650, loss[loss=0.2064, simple_loss=0.2605, pruned_loss=0.0762, over 4840.00 frames.], tot_loss[loss=0.2437, simple_loss=0.2887, pruned_loss=0.09934, over 972410.47 frames.], batch
size: 13, lr: 1.88e-03 2022-05-03 14:19:13,491 INFO [train.py:715] (0/8) Epoch 0, batch 11700, loss[loss=0.3444, simple_loss=0.3645, pruned_loss=0.1622, over 4974.00 frames.], tot_loss[loss=0.2442, simple_loss=0.2892, pruned_loss=0.0996, over 972609.64 frames.], batch size: 15, lr: 1.88e-03 2022-05-03 14:19:53,844 INFO [train.py:715] (0/8) Epoch 0, batch 11750, loss[loss=0.1809, simple_loss=0.2455, pruned_loss=0.05813, over 4842.00 frames.], tot_loss[loss=0.2445, simple_loss=0.2895, pruned_loss=0.09976, over 972239.45 frames.], batch size: 12, lr: 1.88e-03 2022-05-03 14:20:34,221 INFO [train.py:715] (0/8) Epoch 0, batch 11800, loss[loss=0.2738, simple_loss=0.3094, pruned_loss=0.1191, over 4698.00 frames.], tot_loss[loss=0.2448, simple_loss=0.2892, pruned_loss=0.1002, over 972746.02 frames.], batch size: 15, lr: 1.87e-03 2022-05-03 14:21:14,577 INFO [train.py:715] (0/8) Epoch 0, batch 11850, loss[loss=0.3057, simple_loss=0.3286, pruned_loss=0.1415, over 4828.00 frames.], tot_loss[loss=0.2444, simple_loss=0.2893, pruned_loss=0.09979, over 972684.35 frames.], batch size: 15, lr: 1.87e-03 2022-05-03 14:21:55,675 INFO [train.py:715] (0/8) Epoch 0, batch 11900, loss[loss=0.2086, simple_loss=0.2661, pruned_loss=0.0755, over 4932.00 frames.], tot_loss[loss=0.2421, simple_loss=0.2877, pruned_loss=0.09826, over 972122.04 frames.], batch size: 23, lr: 1.87e-03 2022-05-03 14:22:35,861 INFO [train.py:715] (0/8) Epoch 0, batch 11950, loss[loss=0.2093, simple_loss=0.2573, pruned_loss=0.08066, over 4894.00 frames.], tot_loss[loss=0.2418, simple_loss=0.2875, pruned_loss=0.09806, over 972055.97 frames.], batch size: 32, lr: 1.86e-03 2022-05-03 14:23:15,977 INFO [train.py:715] (0/8) Epoch 0, batch 12000, loss[loss=0.2538, simple_loss=0.302, pruned_loss=0.1028, over 4985.00 frames.], tot_loss[loss=0.2402, simple_loss=0.2867, pruned_loss=0.09691, over 970770.26 frames.], batch size: 25, lr: 1.86e-03 2022-05-03 14:23:15,978 INFO [train.py:733] (0/8) Computing validation loss 2022-05-03 14:23:31,274 INFO [train.py:742] (0/8) Epoch 0, validation: loss=0.1516, simple_loss=0.2368, pruned_loss=0.03315, over 914524.00 frames. 
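The per-batch losses printed above appear to decompose as roughly 0.5 * simple_loss + 1.0 * pruned_loss once the warm-up phase is over; the 0.5 and 1.0 weights are inferred from the logged numbers, not read from the training script, so the snippet below is only a consistency check against two entries from this log, not the recipe's actual loss code.

```python
# Hypothetical consistency check: does the logged loss match
# 0.5 * simple_loss + 1.0 * pruned_loss?  Weights are inferred from the log.
entries = [
    # (loss, simple_loss, pruned_loss) copied from the entries above
    (0.2538, 0.3020, 0.1028),   # Epoch 0, batch 12000
    (0.1516, 0.2368, 0.03315),  # Epoch 0, validation at batch 12000
]

for loss, simple, pruned in entries:
    reconstructed = 0.5 * simple + 1.0 * pruned
    # Allow a small tolerance: the log rounds to about four significant digits.
    assert abs(reconstructed - loss) < 5e-4, (loss, reconstructed)
    print(f"logged {loss:.4f} ~= 0.5*{simple:.4f} + {pruned:.4f} = {reconstructed:.4f}")
```

Both entries reproduce the logged loss to within rounding, which is why simple_loss and pruned_loss are reported separately alongside the combined value.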
2022-05-03 14:24:11,266 INFO [train.py:715] (0/8) Epoch 0, batch 12050, loss[loss=0.2847, simple_loss=0.3229, pruned_loss=0.1232, over 4863.00 frames.], tot_loss[loss=0.2403, simple_loss=0.2866, pruned_loss=0.09698, over 971247.66 frames.], batch size: 34, lr: 1.86e-03 2022-05-03 14:24:51,298 INFO [train.py:715] (0/8) Epoch 0, batch 12100, loss[loss=0.199, simple_loss=0.2446, pruned_loss=0.07665, over 4818.00 frames.], tot_loss[loss=0.2399, simple_loss=0.2865, pruned_loss=0.09662, over 971357.56 frames.], batch size: 26, lr: 1.85e-03 2022-05-03 14:25:31,592 INFO [train.py:715] (0/8) Epoch 0, batch 12150, loss[loss=0.2753, simple_loss=0.3111, pruned_loss=0.1198, over 4979.00 frames.], tot_loss[loss=0.2419, simple_loss=0.2877, pruned_loss=0.09811, over 972607.12 frames.], batch size: 25, lr: 1.85e-03 2022-05-03 14:26:11,161 INFO [train.py:715] (0/8) Epoch 0, batch 12200, loss[loss=0.1778, simple_loss=0.2271, pruned_loss=0.06419, over 4879.00 frames.], tot_loss[loss=0.2429, simple_loss=0.2882, pruned_loss=0.09883, over 972876.96 frames.], batch size: 16, lr: 1.85e-03 2022-05-03 14:26:51,065 INFO [train.py:715] (0/8) Epoch 0, batch 12250, loss[loss=0.2482, simple_loss=0.2941, pruned_loss=0.1012, over 4970.00 frames.], tot_loss[loss=0.2428, simple_loss=0.2886, pruned_loss=0.09848, over 972538.81 frames.], batch size: 15, lr: 1.84e-03 2022-05-03 14:27:31,551 INFO [train.py:715] (0/8) Epoch 0, batch 12300, loss[loss=0.2828, simple_loss=0.3213, pruned_loss=0.1221, over 4951.00 frames.], tot_loss[loss=0.2417, simple_loss=0.288, pruned_loss=0.09764, over 972691.92 frames.], batch size: 23, lr: 1.84e-03 2022-05-03 14:28:10,862 INFO [train.py:715] (0/8) Epoch 0, batch 12350, loss[loss=0.2276, simple_loss=0.2798, pruned_loss=0.08771, over 4938.00 frames.], tot_loss[loss=0.2416, simple_loss=0.288, pruned_loss=0.09764, over 971774.89 frames.], batch size: 18, lr: 1.84e-03 2022-05-03 14:28:50,833 INFO [train.py:715] (0/8) Epoch 0, batch 12400, loss[loss=0.2593, simple_loss=0.2917, pruned_loss=0.1135, over 4959.00 frames.], tot_loss[loss=0.2413, simple_loss=0.2878, pruned_loss=0.09738, over 972576.69 frames.], batch size: 24, lr: 1.83e-03 2022-05-03 14:29:31,161 INFO [train.py:715] (0/8) Epoch 0, batch 12450, loss[loss=0.1954, simple_loss=0.2734, pruned_loss=0.05872, over 4819.00 frames.], tot_loss[loss=0.2428, simple_loss=0.289, pruned_loss=0.09829, over 972661.35 frames.], batch size: 25, lr: 1.83e-03 2022-05-03 14:30:11,383 INFO [train.py:715] (0/8) Epoch 0, batch 12500, loss[loss=0.2581, simple_loss=0.2972, pruned_loss=0.1095, over 4961.00 frames.], tot_loss[loss=0.2433, simple_loss=0.2896, pruned_loss=0.09846, over 972111.43 frames.], batch size: 24, lr: 1.83e-03 2022-05-03 14:30:50,307 INFO [train.py:715] (0/8) Epoch 0, batch 12550, loss[loss=0.2854, simple_loss=0.3174, pruned_loss=0.1267, over 4981.00 frames.], tot_loss[loss=0.243, simple_loss=0.2894, pruned_loss=0.09826, over 971344.67 frames.], batch size: 15, lr: 1.83e-03 2022-05-03 14:31:30,334 INFO [train.py:715] (0/8) Epoch 0, batch 12600, loss[loss=0.1827, simple_loss=0.2356, pruned_loss=0.06494, over 4647.00 frames.], tot_loss[loss=0.2405, simple_loss=0.2877, pruned_loss=0.09666, over 971608.71 frames.], batch size: 13, lr: 1.82e-03 2022-05-03 14:32:11,366 INFO [train.py:715] (0/8) Epoch 0, batch 12650, loss[loss=0.2566, simple_loss=0.2984, pruned_loss=0.1074, over 4750.00 frames.], tot_loss[loss=0.2398, simple_loss=0.2872, pruned_loss=0.09619, over 971816.50 frames.], batch size: 19, lr: 1.82e-03 2022-05-03 14:32:51,087 INFO [train.py:715] 
(0/8) Epoch 0, batch 12700, loss[loss=0.2459, simple_loss=0.2841, pruned_loss=0.1039, over 4883.00 frames.], tot_loss[loss=0.2398, simple_loss=0.2873, pruned_loss=0.09612, over 971836.74 frames.], batch size: 22, lr: 1.82e-03 2022-05-03 14:33:30,733 INFO [train.py:715] (0/8) Epoch 0, batch 12750, loss[loss=0.1666, simple_loss=0.2216, pruned_loss=0.05578, over 4815.00 frames.], tot_loss[loss=0.2382, simple_loss=0.2862, pruned_loss=0.09509, over 971752.44 frames.], batch size: 12, lr: 1.81e-03 2022-05-03 14:34:11,175 INFO [train.py:715] (0/8) Epoch 0, batch 12800, loss[loss=0.1996, simple_loss=0.2408, pruned_loss=0.07924, over 4757.00 frames.], tot_loss[loss=0.2388, simple_loss=0.2867, pruned_loss=0.09545, over 971098.59 frames.], batch size: 19, lr: 1.81e-03 2022-05-03 14:34:51,658 INFO [train.py:715] (0/8) Epoch 0, batch 12850, loss[loss=0.2086, simple_loss=0.262, pruned_loss=0.07759, over 4953.00 frames.], tot_loss[loss=0.2388, simple_loss=0.2866, pruned_loss=0.09552, over 971231.48 frames.], batch size: 21, lr: 1.81e-03 2022-05-03 14:35:31,483 INFO [train.py:715] (0/8) Epoch 0, batch 12900, loss[loss=0.2782, simple_loss=0.3143, pruned_loss=0.1211, over 4954.00 frames.], tot_loss[loss=0.2392, simple_loss=0.2866, pruned_loss=0.09588, over 971514.39 frames.], batch size: 35, lr: 1.80e-03 2022-05-03 14:36:11,741 INFO [train.py:715] (0/8) Epoch 0, batch 12950, loss[loss=0.228, simple_loss=0.2733, pruned_loss=0.09135, over 4836.00 frames.], tot_loss[loss=0.2369, simple_loss=0.2845, pruned_loss=0.09459, over 970902.93 frames.], batch size: 30, lr: 1.80e-03 2022-05-03 14:36:52,268 INFO [train.py:715] (0/8) Epoch 0, batch 13000, loss[loss=0.2178, simple_loss=0.2631, pruned_loss=0.08626, over 4982.00 frames.], tot_loss[loss=0.2394, simple_loss=0.2862, pruned_loss=0.09626, over 972226.36 frames.], batch size: 25, lr: 1.80e-03 2022-05-03 14:37:32,730 INFO [train.py:715] (0/8) Epoch 0, batch 13050, loss[loss=0.2662, simple_loss=0.2915, pruned_loss=0.1205, over 4827.00 frames.], tot_loss[loss=0.2413, simple_loss=0.288, pruned_loss=0.09728, over 972859.06 frames.], batch size: 12, lr: 1.79e-03 2022-05-03 14:38:12,069 INFO [train.py:715] (0/8) Epoch 0, batch 13100, loss[loss=0.1911, simple_loss=0.2463, pruned_loss=0.06792, over 4838.00 frames.], tot_loss[loss=0.24, simple_loss=0.2871, pruned_loss=0.09651, over 973079.14 frames.], batch size: 15, lr: 1.79e-03 2022-05-03 14:38:52,501 INFO [train.py:715] (0/8) Epoch 0, batch 13150, loss[loss=0.2023, simple_loss=0.2507, pruned_loss=0.07691, over 4917.00 frames.], tot_loss[loss=0.2406, simple_loss=0.2873, pruned_loss=0.09693, over 973150.19 frames.], batch size: 18, lr: 1.79e-03 2022-05-03 14:39:32,990 INFO [train.py:715] (0/8) Epoch 0, batch 13200, loss[loss=0.2238, simple_loss=0.2595, pruned_loss=0.09407, over 4833.00 frames.], tot_loss[loss=0.241, simple_loss=0.2876, pruned_loss=0.09725, over 972191.39 frames.], batch size: 15, lr: 1.79e-03 2022-05-03 14:40:12,566 INFO [train.py:715] (0/8) Epoch 0, batch 13250, loss[loss=0.2241, simple_loss=0.2805, pruned_loss=0.08381, over 4914.00 frames.], tot_loss[loss=0.2383, simple_loss=0.2857, pruned_loss=0.09549, over 972748.29 frames.], batch size: 23, lr: 1.78e-03 2022-05-03 14:40:52,444 INFO [train.py:715] (0/8) Epoch 0, batch 13300, loss[loss=0.2152, simple_loss=0.2728, pruned_loss=0.07883, over 4899.00 frames.], tot_loss[loss=0.2372, simple_loss=0.2851, pruned_loss=0.09461, over 972389.33 frames.], batch size: 19, lr: 1.78e-03 2022-05-03 14:41:32,818 INFO [train.py:715] (0/8) Epoch 0, batch 13350, 
loss[loss=0.1969, simple_loss=0.2607, pruned_loss=0.06657, over 4875.00 frames.], tot_loss[loss=0.2376, simple_loss=0.2852, pruned_loss=0.09501, over 971946.15 frames.], batch size: 22, lr: 1.78e-03 2022-05-03 14:42:13,154 INFO [train.py:715] (0/8) Epoch 0, batch 13400, loss[loss=0.2282, simple_loss=0.2738, pruned_loss=0.0913, over 4775.00 frames.], tot_loss[loss=0.2368, simple_loss=0.2846, pruned_loss=0.09448, over 971988.45 frames.], batch size: 17, lr: 1.77e-03 2022-05-03 14:42:52,942 INFO [train.py:715] (0/8) Epoch 0, batch 13450, loss[loss=0.294, simple_loss=0.3251, pruned_loss=0.1315, over 4962.00 frames.], tot_loss[loss=0.2349, simple_loss=0.2831, pruned_loss=0.0933, over 972535.73 frames.], batch size: 15, lr: 1.77e-03 2022-05-03 14:43:33,177 INFO [train.py:715] (0/8) Epoch 0, batch 13500, loss[loss=0.2379, simple_loss=0.2895, pruned_loss=0.09313, over 4873.00 frames.], tot_loss[loss=0.2356, simple_loss=0.2833, pruned_loss=0.09393, over 972175.30 frames.], batch size: 16, lr: 1.77e-03 2022-05-03 14:44:13,341 INFO [train.py:715] (0/8) Epoch 0, batch 13550, loss[loss=0.2538, simple_loss=0.299, pruned_loss=0.1043, over 4881.00 frames.], tot_loss[loss=0.2362, simple_loss=0.2842, pruned_loss=0.0941, over 972222.61 frames.], batch size: 22, lr: 1.77e-03 2022-05-03 14:44:52,794 INFO [train.py:715] (0/8) Epoch 0, batch 13600, loss[loss=0.2074, simple_loss=0.2607, pruned_loss=0.07704, over 4864.00 frames.], tot_loss[loss=0.235, simple_loss=0.2833, pruned_loss=0.09338, over 972698.64 frames.], batch size: 30, lr: 1.76e-03 2022-05-03 14:45:32,771 INFO [train.py:715] (0/8) Epoch 0, batch 13650, loss[loss=0.2095, simple_loss=0.2643, pruned_loss=0.07737, over 4859.00 frames.], tot_loss[loss=0.2336, simple_loss=0.2822, pruned_loss=0.09254, over 972129.72 frames.], batch size: 20, lr: 1.76e-03 2022-05-03 14:46:12,699 INFO [train.py:715] (0/8) Epoch 0, batch 13700, loss[loss=0.269, simple_loss=0.3061, pruned_loss=0.116, over 4975.00 frames.], tot_loss[loss=0.2344, simple_loss=0.2822, pruned_loss=0.09336, over 972511.26 frames.], batch size: 24, lr: 1.76e-03 2022-05-03 14:46:52,713 INFO [train.py:715] (0/8) Epoch 0, batch 13750, loss[loss=0.2972, simple_loss=0.3355, pruned_loss=0.1295, over 4916.00 frames.], tot_loss[loss=0.2358, simple_loss=0.2834, pruned_loss=0.09406, over 972785.71 frames.], batch size: 17, lr: 1.75e-03 2022-05-03 14:47:32,538 INFO [train.py:715] (0/8) Epoch 0, batch 13800, loss[loss=0.2498, simple_loss=0.2892, pruned_loss=0.1052, over 4930.00 frames.], tot_loss[loss=0.236, simple_loss=0.284, pruned_loss=0.09398, over 972907.27 frames.], batch size: 29, lr: 1.75e-03 2022-05-03 14:48:12,868 INFO [train.py:715] (0/8) Epoch 0, batch 13850, loss[loss=0.1985, simple_loss=0.2436, pruned_loss=0.07663, over 4801.00 frames.], tot_loss[loss=0.2362, simple_loss=0.2845, pruned_loss=0.09396, over 971831.83 frames.], batch size: 12, lr: 1.75e-03 2022-05-03 14:48:53,734 INFO [train.py:715] (0/8) Epoch 0, batch 13900, loss[loss=0.2196, simple_loss=0.2622, pruned_loss=0.08853, over 4799.00 frames.], tot_loss[loss=0.233, simple_loss=0.2823, pruned_loss=0.09186, over 972028.78 frames.], batch size: 12, lr: 1.75e-03 2022-05-03 14:49:33,779 INFO [train.py:715] (0/8) Epoch 0, batch 13950, loss[loss=0.1943, simple_loss=0.2573, pruned_loss=0.06564, over 4974.00 frames.], tot_loss[loss=0.2335, simple_loss=0.2827, pruned_loss=0.09216, over 971802.69 frames.], batch size: 14, lr: 1.74e-03 2022-05-03 14:50:14,374 INFO [train.py:715] (0/8) Epoch 0, batch 14000, loss[loss=0.1816, simple_loss=0.2362, 
pruned_loss=0.06344, over 4983.00 frames.], tot_loss[loss=0.2335, simple_loss=0.2828, pruned_loss=0.09214, over 972382.24 frames.], batch size: 14, lr: 1.74e-03 2022-05-03 14:50:55,235 INFO [train.py:715] (0/8) Epoch 0, batch 14050, loss[loss=0.1905, simple_loss=0.256, pruned_loss=0.06248, over 4913.00 frames.], tot_loss[loss=0.2323, simple_loss=0.2823, pruned_loss=0.09115, over 972355.08 frames.], batch size: 17, lr: 1.74e-03 2022-05-03 14:51:35,697 INFO [train.py:715] (0/8) Epoch 0, batch 14100, loss[loss=0.3029, simple_loss=0.3333, pruned_loss=0.1362, over 4794.00 frames.], tot_loss[loss=0.2334, simple_loss=0.2831, pruned_loss=0.09185, over 972237.83 frames.], batch size: 14, lr: 1.73e-03 2022-05-03 14:52:16,204 INFO [train.py:715] (0/8) Epoch 0, batch 14150, loss[loss=0.1991, simple_loss=0.2515, pruned_loss=0.0734, over 4924.00 frames.], tot_loss[loss=0.2345, simple_loss=0.2838, pruned_loss=0.09258, over 972841.20 frames.], batch size: 18, lr: 1.73e-03 2022-05-03 14:52:56,857 INFO [train.py:715] (0/8) Epoch 0, batch 14200, loss[loss=0.2291, simple_loss=0.2818, pruned_loss=0.08819, over 4806.00 frames.], tot_loss[loss=0.2335, simple_loss=0.2826, pruned_loss=0.09217, over 973478.50 frames.], batch size: 26, lr: 1.73e-03 2022-05-03 14:53:37,714 INFO [train.py:715] (0/8) Epoch 0, batch 14250, loss[loss=0.2189, simple_loss=0.2809, pruned_loss=0.0784, over 4805.00 frames.], tot_loss[loss=0.235, simple_loss=0.2836, pruned_loss=0.09319, over 973361.67 frames.], batch size: 21, lr: 1.73e-03 2022-05-03 14:54:18,412 INFO [train.py:715] (0/8) Epoch 0, batch 14300, loss[loss=0.2951, simple_loss=0.3261, pruned_loss=0.132, over 4859.00 frames.], tot_loss[loss=0.2352, simple_loss=0.2838, pruned_loss=0.09328, over 973274.45 frames.], batch size: 16, lr: 1.72e-03 2022-05-03 14:54:59,474 INFO [train.py:715] (0/8) Epoch 0, batch 14350, loss[loss=0.249, simple_loss=0.2931, pruned_loss=0.1024, over 4761.00 frames.], tot_loss[loss=0.2349, simple_loss=0.2843, pruned_loss=0.09273, over 973146.21 frames.], batch size: 14, lr: 1.72e-03 2022-05-03 14:55:40,716 INFO [train.py:715] (0/8) Epoch 0, batch 14400, loss[loss=0.1677, simple_loss=0.2417, pruned_loss=0.04684, over 4771.00 frames.], tot_loss[loss=0.2339, simple_loss=0.2835, pruned_loss=0.09217, over 972635.31 frames.], batch size: 17, lr: 1.72e-03 2022-05-03 14:56:21,190 INFO [train.py:715] (0/8) Epoch 0, batch 14450, loss[loss=0.2206, simple_loss=0.27, pruned_loss=0.08561, over 4817.00 frames.], tot_loss[loss=0.2342, simple_loss=0.2834, pruned_loss=0.09247, over 973559.93 frames.], batch size: 26, lr: 1.72e-03 2022-05-03 14:57:01,537 INFO [train.py:715] (0/8) Epoch 0, batch 14500, loss[loss=0.2152, simple_loss=0.2695, pruned_loss=0.08042, over 4815.00 frames.], tot_loss[loss=0.2339, simple_loss=0.2829, pruned_loss=0.09244, over 972935.92 frames.], batch size: 25, lr: 1.71e-03 2022-05-03 14:57:42,207 INFO [train.py:715] (0/8) Epoch 0, batch 14550, loss[loss=0.198, simple_loss=0.2482, pruned_loss=0.07388, over 4703.00 frames.], tot_loss[loss=0.2334, simple_loss=0.2827, pruned_loss=0.09202, over 972250.22 frames.], batch size: 15, lr: 1.71e-03 2022-05-03 14:58:22,167 INFO [train.py:715] (0/8) Epoch 0, batch 14600, loss[loss=0.213, simple_loss=0.2626, pruned_loss=0.08164, over 4936.00 frames.], tot_loss[loss=0.2352, simple_loss=0.2837, pruned_loss=0.09333, over 973194.92 frames.], batch size: 29, lr: 1.71e-03 2022-05-03 14:59:01,456 INFO [train.py:715] (0/8) Epoch 0, batch 14650, loss[loss=0.2875, simple_loss=0.3368, pruned_loss=0.1191, over 4775.00 frames.], 
tot_loss[loss=0.2335, simple_loss=0.2822, pruned_loss=0.09237, over 972314.25 frames.], batch size: 18, lr: 1.70e-03 2022-05-03 14:59:41,810 INFO [train.py:715] (0/8) Epoch 0, batch 14700, loss[loss=0.2066, simple_loss=0.26, pruned_loss=0.07666, over 4770.00 frames.], tot_loss[loss=0.2329, simple_loss=0.2819, pruned_loss=0.09194, over 972372.74 frames.], batch size: 18, lr: 1.70e-03 2022-05-03 15:00:22,084 INFO [train.py:715] (0/8) Epoch 0, batch 14750, loss[loss=0.1963, simple_loss=0.2633, pruned_loss=0.06462, over 4814.00 frames.], tot_loss[loss=0.2343, simple_loss=0.2831, pruned_loss=0.09279, over 971713.87 frames.], batch size: 21, lr: 1.70e-03 2022-05-03 15:01:02,120 INFO [train.py:715] (0/8) Epoch 0, batch 14800, loss[loss=0.1971, simple_loss=0.2572, pruned_loss=0.06848, over 4802.00 frames.], tot_loss[loss=0.2376, simple_loss=0.2854, pruned_loss=0.09486, over 971909.03 frames.], batch size: 13, lr: 1.70e-03 2022-05-03 15:01:41,997 INFO [train.py:715] (0/8) Epoch 0, batch 14850, loss[loss=0.3, simple_loss=0.3201, pruned_loss=0.14, over 4781.00 frames.], tot_loss[loss=0.2361, simple_loss=0.2846, pruned_loss=0.0938, over 971420.22 frames.], batch size: 14, lr: 1.69e-03 2022-05-03 15:02:22,720 INFO [train.py:715] (0/8) Epoch 0, batch 14900, loss[loss=0.2531, simple_loss=0.307, pruned_loss=0.09964, over 4806.00 frames.], tot_loss[loss=0.237, simple_loss=0.2848, pruned_loss=0.09455, over 970590.06 frames.], batch size: 25, lr: 1.69e-03 2022-05-03 15:03:02,607 INFO [train.py:715] (0/8) Epoch 0, batch 14950, loss[loss=0.2376, simple_loss=0.2754, pruned_loss=0.09987, over 4791.00 frames.], tot_loss[loss=0.236, simple_loss=0.2842, pruned_loss=0.09387, over 970712.36 frames.], batch size: 18, lr: 1.69e-03 2022-05-03 15:03:42,038 INFO [train.py:715] (0/8) Epoch 0, batch 15000, loss[loss=0.1972, simple_loss=0.2658, pruned_loss=0.0643, over 4938.00 frames.], tot_loss[loss=0.2343, simple_loss=0.2834, pruned_loss=0.09258, over 970953.50 frames.], batch size: 29, lr: 1.69e-03 2022-05-03 15:03:42,039 INFO [train.py:733] (0/8) Computing validation loss 2022-05-03 15:03:53,634 INFO [train.py:742] (0/8) Epoch 0, validation: loss=0.1454, simple_loss=0.2314, pruned_loss=0.02968, over 914524.00 frames. 
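The tot_loss frame counts hover around 972 000 frames throughout, which is consistent with a decayed running sum over roughly the last 200 batches at ~4 900 frames per batch. The window of 200 is an assumption inferred from that steady state; the sketch below only illustrates why such an accumulator plateaus near window * mean_batch_frames, and is not the training script's implementation.

```python
import random

# Illustrative decayed accumulator, not the tracker used by train.py.
WINDOW = 200                      # assumed decay window (inferred from the log)
decay = 1.0 - 1.0 / WINDOW

tot_frames = 0.0
for _ in range(5000):
    batch_frames = random.uniform(4600, 5000)  # per-batch frame counts as seen above
    tot_frames = tot_frames * decay + batch_frames

# Steady state is approximately WINDOW * mean(batch_frames) ~= 200 * 4800 = 960k,
# the same ballpark as the ~972k frame counts printed in the tot_loss entries.
print(f"accumulated frames after 5000 steps: {tot_frames:,.0f}")
```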
2022-05-03 15:04:32,996 INFO [train.py:715] (0/8) Epoch 0, batch 15050, loss[loss=0.2035, simple_loss=0.2563, pruned_loss=0.07531, over 4775.00 frames.], tot_loss[loss=0.2321, simple_loss=0.2823, pruned_loss=0.09097, over 971972.28 frames.], batch size: 14, lr: 1.68e-03 2022-05-03 15:05:13,558 INFO [train.py:715] (0/8) Epoch 0, batch 15100, loss[loss=0.2071, simple_loss=0.257, pruned_loss=0.07858, over 4885.00 frames.], tot_loss[loss=0.2323, simple_loss=0.2822, pruned_loss=0.09118, over 971837.98 frames.], batch size: 32, lr: 1.68e-03 2022-05-03 15:05:53,899 INFO [train.py:715] (0/8) Epoch 0, batch 15150, loss[loss=0.245, simple_loss=0.2757, pruned_loss=0.1072, over 4777.00 frames.], tot_loss[loss=0.2322, simple_loss=0.2822, pruned_loss=0.09114, over 972068.57 frames.], batch size: 12, lr: 1.68e-03 2022-05-03 15:06:33,825 INFO [train.py:715] (0/8) Epoch 0, batch 15200, loss[loss=0.2174, simple_loss=0.2668, pruned_loss=0.08401, over 4805.00 frames.], tot_loss[loss=0.2313, simple_loss=0.2813, pruned_loss=0.09059, over 971814.30 frames.], batch size: 12, lr: 1.68e-03 2022-05-03 15:07:13,397 INFO [train.py:715] (0/8) Epoch 0, batch 15250, loss[loss=0.2387, simple_loss=0.2885, pruned_loss=0.09448, over 4837.00 frames.], tot_loss[loss=0.2307, simple_loss=0.2809, pruned_loss=0.09024, over 972263.21 frames.], batch size: 30, lr: 1.67e-03 2022-05-03 15:07:53,257 INFO [train.py:715] (0/8) Epoch 0, batch 15300, loss[loss=0.2218, simple_loss=0.2879, pruned_loss=0.0779, over 4972.00 frames.], tot_loss[loss=0.2297, simple_loss=0.28, pruned_loss=0.0897, over 972196.95 frames.], batch size: 24, lr: 1.67e-03 2022-05-03 15:08:33,608 INFO [train.py:715] (0/8) Epoch 0, batch 15350, loss[loss=0.2118, simple_loss=0.2706, pruned_loss=0.0765, over 4765.00 frames.], tot_loss[loss=0.2308, simple_loss=0.2811, pruned_loss=0.09022, over 971428.38 frames.], batch size: 14, lr: 1.67e-03 2022-05-03 15:09:13,461 INFO [train.py:715] (0/8) Epoch 0, batch 15400, loss[loss=0.2372, simple_loss=0.2872, pruned_loss=0.09366, over 4767.00 frames.], tot_loss[loss=0.2302, simple_loss=0.2812, pruned_loss=0.08959, over 971184.61 frames.], batch size: 12, lr: 1.67e-03 2022-05-03 15:09:53,906 INFO [train.py:715] (0/8) Epoch 0, batch 15450, loss[loss=0.199, simple_loss=0.2648, pruned_loss=0.06661, over 4820.00 frames.], tot_loss[loss=0.2301, simple_loss=0.2813, pruned_loss=0.08947, over 971441.78 frames.], batch size: 13, lr: 1.66e-03 2022-05-03 15:10:33,373 INFO [train.py:715] (0/8) Epoch 0, batch 15500, loss[loss=0.1982, simple_loss=0.2584, pruned_loss=0.06904, over 4752.00 frames.], tot_loss[loss=0.2297, simple_loss=0.2815, pruned_loss=0.08899, over 971748.57 frames.], batch size: 19, lr: 1.66e-03 2022-05-03 15:11:12,572 INFO [train.py:715] (0/8) Epoch 0, batch 15550, loss[loss=0.2083, simple_loss=0.269, pruned_loss=0.07376, over 4835.00 frames.], tot_loss[loss=0.2289, simple_loss=0.2802, pruned_loss=0.08879, over 971412.89 frames.], batch size: 15, lr: 1.66e-03 2022-05-03 15:11:52,066 INFO [train.py:715] (0/8) Epoch 0, batch 15600, loss[loss=0.2004, simple_loss=0.2603, pruned_loss=0.07029, over 4982.00 frames.], tot_loss[loss=0.2296, simple_loss=0.2805, pruned_loss=0.08933, over 972145.09 frames.], batch size: 27, lr: 1.66e-03 2022-05-03 15:12:31,511 INFO [train.py:715] (0/8) Epoch 0, batch 15650, loss[loss=0.1675, simple_loss=0.2238, pruned_loss=0.0556, over 4966.00 frames.], tot_loss[loss=0.2277, simple_loss=0.279, pruned_loss=0.08817, over 971985.10 frames.], batch size: 14, lr: 1.65e-03 2022-05-03 15:13:11,299 INFO [train.py:715] 
(0/8) Epoch 0, batch 15700, loss[loss=0.2537, simple_loss=0.3033, pruned_loss=0.1021, over 4864.00 frames.], tot_loss[loss=0.2278, simple_loss=0.2791, pruned_loss=0.08819, over 971199.56 frames.], batch size: 22, lr: 1.65e-03 2022-05-03 15:13:50,904 INFO [train.py:715] (0/8) Epoch 0, batch 15750, loss[loss=0.1983, simple_loss=0.2642, pruned_loss=0.06618, over 4810.00 frames.], tot_loss[loss=0.2288, simple_loss=0.2799, pruned_loss=0.08881, over 970832.49 frames.], batch size: 21, lr: 1.65e-03 2022-05-03 15:14:30,844 INFO [train.py:715] (0/8) Epoch 0, batch 15800, loss[loss=0.2403, simple_loss=0.298, pruned_loss=0.0913, over 4868.00 frames.], tot_loss[loss=0.2286, simple_loss=0.2801, pruned_loss=0.08859, over 971626.65 frames.], batch size: 22, lr: 1.65e-03 2022-05-03 15:15:10,666 INFO [train.py:715] (0/8) Epoch 0, batch 15850, loss[loss=0.1949, simple_loss=0.2474, pruned_loss=0.07125, over 4858.00 frames.], tot_loss[loss=0.2297, simple_loss=0.2807, pruned_loss=0.08934, over 971707.43 frames.], batch size: 20, lr: 1.65e-03 2022-05-03 15:15:50,243 INFO [train.py:715] (0/8) Epoch 0, batch 15900, loss[loss=0.2745, simple_loss=0.3101, pruned_loss=0.1195, over 4808.00 frames.], tot_loss[loss=0.2279, simple_loss=0.2795, pruned_loss=0.08819, over 971299.61 frames.], batch size: 25, lr: 1.64e-03 2022-05-03 15:16:30,471 INFO [train.py:715] (0/8) Epoch 0, batch 15950, loss[loss=0.242, simple_loss=0.2889, pruned_loss=0.09759, over 4956.00 frames.], tot_loss[loss=0.2304, simple_loss=0.2815, pruned_loss=0.08963, over 971702.86 frames.], batch size: 24, lr: 1.64e-03 2022-05-03 15:17:09,743 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-16000.pt 2022-05-03 15:17:12,827 INFO [train.py:715] (0/8) Epoch 0, batch 16000, loss[loss=0.216, simple_loss=0.2722, pruned_loss=0.07994, over 4936.00 frames.], tot_loss[loss=0.2312, simple_loss=0.2819, pruned_loss=0.09027, over 971145.68 frames.], batch size: 35, lr: 1.64e-03 2022-05-03 15:17:52,705 INFO [train.py:715] (0/8) Epoch 0, batch 16050, loss[loss=0.2586, simple_loss=0.3028, pruned_loss=0.1072, over 4939.00 frames.], tot_loss[loss=0.2303, simple_loss=0.2811, pruned_loss=0.08968, over 971120.69 frames.], batch size: 23, lr: 1.64e-03 2022-05-03 15:18:33,252 INFO [train.py:715] (0/8) Epoch 0, batch 16100, loss[loss=0.2144, simple_loss=0.263, pruned_loss=0.08296, over 4988.00 frames.], tot_loss[loss=0.2296, simple_loss=0.2805, pruned_loss=0.0893, over 971826.42 frames.], batch size: 25, lr: 1.63e-03 2022-05-03 15:19:13,430 INFO [train.py:715] (0/8) Epoch 0, batch 16150, loss[loss=0.2367, simple_loss=0.2738, pruned_loss=0.09983, over 4920.00 frames.], tot_loss[loss=0.2299, simple_loss=0.2808, pruned_loss=0.08946, over 971391.47 frames.], batch size: 39, lr: 1.63e-03 2022-05-03 15:19:52,898 INFO [train.py:715] (0/8) Epoch 0, batch 16200, loss[loss=0.2104, simple_loss=0.2653, pruned_loss=0.07778, over 4981.00 frames.], tot_loss[loss=0.2292, simple_loss=0.2808, pruned_loss=0.08883, over 971203.46 frames.], batch size: 14, lr: 1.63e-03 2022-05-03 15:20:32,322 INFO [train.py:715] (0/8) Epoch 0, batch 16250, loss[loss=0.1813, simple_loss=0.2438, pruned_loss=0.05936, over 4801.00 frames.], tot_loss[loss=0.2287, simple_loss=0.2803, pruned_loss=0.0885, over 971608.98 frames.], batch size: 24, lr: 1.63e-03 2022-05-03 15:21:12,243 INFO [train.py:715] (0/8) Epoch 0, batch 16300, loss[loss=0.2472, simple_loss=0.2868, pruned_loss=0.1038, over 4921.00 frames.], tot_loss[loss=0.2265, simple_loss=0.2784, pruned_loss=0.08725, over 
972508.03 frames.], batch size: 18, lr: 1.62e-03 2022-05-03 15:21:51,670 INFO [train.py:715] (0/8) Epoch 0, batch 16350, loss[loss=0.2519, simple_loss=0.2982, pruned_loss=0.1028, over 4933.00 frames.], tot_loss[loss=0.2277, simple_loss=0.2791, pruned_loss=0.08811, over 971586.05 frames.], batch size: 29, lr: 1.62e-03 2022-05-03 15:22:31,099 INFO [train.py:715] (0/8) Epoch 0, batch 16400, loss[loss=0.2164, simple_loss=0.2673, pruned_loss=0.08272, over 4958.00 frames.], tot_loss[loss=0.228, simple_loss=0.2794, pruned_loss=0.08835, over 971309.96 frames.], batch size: 15, lr: 1.62e-03 2022-05-03 15:23:11,063 INFO [train.py:715] (0/8) Epoch 0, batch 16450, loss[loss=0.293, simple_loss=0.3229, pruned_loss=0.1316, over 4985.00 frames.], tot_loss[loss=0.2269, simple_loss=0.2786, pruned_loss=0.08757, over 971483.97 frames.], batch size: 28, lr: 1.62e-03 2022-05-03 15:23:51,596 INFO [train.py:715] (0/8) Epoch 0, batch 16500, loss[loss=0.2152, simple_loss=0.2641, pruned_loss=0.08316, over 4956.00 frames.], tot_loss[loss=0.2271, simple_loss=0.2793, pruned_loss=0.08745, over 971572.08 frames.], batch size: 15, lr: 1.62e-03 2022-05-03 15:24:31,541 INFO [train.py:715] (0/8) Epoch 0, batch 16550, loss[loss=0.2615, simple_loss=0.3138, pruned_loss=0.1046, over 4902.00 frames.], tot_loss[loss=0.2277, simple_loss=0.2793, pruned_loss=0.088, over 971983.23 frames.], batch size: 22, lr: 1.61e-03 2022-05-03 15:25:11,225 INFO [train.py:715] (0/8) Epoch 0, batch 16600, loss[loss=0.2312, simple_loss=0.2734, pruned_loss=0.09451, over 4841.00 frames.], tot_loss[loss=0.2276, simple_loss=0.279, pruned_loss=0.0881, over 973204.98 frames.], batch size: 30, lr: 1.61e-03 2022-05-03 15:25:50,680 INFO [train.py:715] (0/8) Epoch 0, batch 16650, loss[loss=0.3006, simple_loss=0.3434, pruned_loss=0.1289, over 4685.00 frames.], tot_loss[loss=0.2288, simple_loss=0.2801, pruned_loss=0.08872, over 972228.00 frames.], batch size: 15, lr: 1.61e-03 2022-05-03 15:26:30,541 INFO [train.py:715] (0/8) Epoch 0, batch 16700, loss[loss=0.261, simple_loss=0.294, pruned_loss=0.1141, over 4903.00 frames.], tot_loss[loss=0.2286, simple_loss=0.2798, pruned_loss=0.08875, over 972586.82 frames.], batch size: 17, lr: 1.61e-03 2022-05-03 15:27:09,634 INFO [train.py:715] (0/8) Epoch 0, batch 16750, loss[loss=0.223, simple_loss=0.2731, pruned_loss=0.08642, over 4838.00 frames.], tot_loss[loss=0.2287, simple_loss=0.2798, pruned_loss=0.08885, over 973080.66 frames.], batch size: 13, lr: 1.60e-03 2022-05-03 15:27:48,777 INFO [train.py:715] (0/8) Epoch 0, batch 16800, loss[loss=0.2832, simple_loss=0.3275, pruned_loss=0.1195, over 4692.00 frames.], tot_loss[loss=0.2278, simple_loss=0.2792, pruned_loss=0.08817, over 973010.64 frames.], batch size: 15, lr: 1.60e-03 2022-05-03 15:28:28,417 INFO [train.py:715] (0/8) Epoch 0, batch 16850, loss[loss=0.2475, simple_loss=0.2928, pruned_loss=0.1011, over 4935.00 frames.], tot_loss[loss=0.2275, simple_loss=0.2788, pruned_loss=0.08814, over 972802.99 frames.], batch size: 18, lr: 1.60e-03 2022-05-03 15:29:08,026 INFO [train.py:715] (0/8) Epoch 0, batch 16900, loss[loss=0.2292, simple_loss=0.2771, pruned_loss=0.09066, over 4987.00 frames.], tot_loss[loss=0.2282, simple_loss=0.2793, pruned_loss=0.08853, over 973549.48 frames.], batch size: 27, lr: 1.60e-03 2022-05-03 15:29:47,266 INFO [train.py:715] (0/8) Epoch 0, batch 16950, loss[loss=0.2221, simple_loss=0.2755, pruned_loss=0.0844, over 4860.00 frames.], tot_loss[loss=0.2286, simple_loss=0.2799, pruned_loss=0.08863, over 973901.97 frames.], batch size: 32, lr: 1.60e-03 
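The checkpoint written above to pruned_transducer_stateless2/exp/v2/checkpoint-16000.pt can be inspected offline. The sketch below assumes only that the file is a standard PyTorch checkpoint; it lists whatever top-level keys are present rather than assuming their names.

```python
import torch

# Path copied from the log entry above; adjust if your exp_dir differs.
ckpt_path = "pruned_transducer_stateless2/exp/v2/checkpoint-16000.pt"

ckpt = torch.load(ckpt_path, map_location="cpu")
if not isinstance(ckpt, dict):
    print(f"checkpoint is a {type(ckpt).__name__}, not a dict")
else:
    for key, value in ckpt.items():
        # Report a rough type/size per key so nothing is assumed about the layout.
        if isinstance(value, dict):
            print(f"{key}: dict with {len(value)} entries")
        elif torch.is_tensor(value):
            print(f"{key}: tensor of shape {tuple(value.shape)}")
        else:
            print(f"{key}: {type(value).__name__}")
```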
2022-05-03 15:30:27,235 INFO [train.py:715] (0/8) Epoch 0, batch 17000, loss[loss=0.2112, simple_loss=0.2678, pruned_loss=0.07725, over 4806.00 frames.], tot_loss[loss=0.2283, simple_loss=0.2797, pruned_loss=0.08845, over 973296.32 frames.], batch size: 21, lr: 1.59e-03 2022-05-03 15:31:07,748 INFO [train.py:715] (0/8) Epoch 0, batch 17050, loss[loss=0.1998, simple_loss=0.2651, pruned_loss=0.06731, over 4850.00 frames.], tot_loss[loss=0.2279, simple_loss=0.2796, pruned_loss=0.08812, over 973811.43 frames.], batch size: 30, lr: 1.59e-03 2022-05-03 15:31:47,487 INFO [train.py:715] (0/8) Epoch 0, batch 17100, loss[loss=0.2317, simple_loss=0.2829, pruned_loss=0.09021, over 4932.00 frames.], tot_loss[loss=0.2289, simple_loss=0.2807, pruned_loss=0.08854, over 974054.03 frames.], batch size: 23, lr: 1.59e-03 2022-05-03 15:32:26,649 INFO [train.py:715] (0/8) Epoch 0, batch 17150, loss[loss=0.1712, simple_loss=0.2369, pruned_loss=0.05273, over 4928.00 frames.], tot_loss[loss=0.2307, simple_loss=0.2824, pruned_loss=0.08951, over 973419.17 frames.], batch size: 23, lr: 1.59e-03 2022-05-03 15:33:06,907 INFO [train.py:715] (0/8) Epoch 0, batch 17200, loss[loss=0.2464, simple_loss=0.297, pruned_loss=0.09792, over 4924.00 frames.], tot_loss[loss=0.2301, simple_loss=0.2816, pruned_loss=0.08929, over 973189.46 frames.], batch size: 23, lr: 1.58e-03 2022-05-03 15:33:46,680 INFO [train.py:715] (0/8) Epoch 0, batch 17250, loss[loss=0.226, simple_loss=0.2852, pruned_loss=0.08341, over 4907.00 frames.], tot_loss[loss=0.2291, simple_loss=0.2803, pruned_loss=0.08892, over 972833.89 frames.], batch size: 18, lr: 1.58e-03 2022-05-03 15:34:26,237 INFO [train.py:715] (0/8) Epoch 0, batch 17300, loss[loss=0.2014, simple_loss=0.2664, pruned_loss=0.06814, over 4988.00 frames.], tot_loss[loss=0.2287, simple_loss=0.28, pruned_loss=0.08867, over 972782.28 frames.], batch size: 28, lr: 1.58e-03 2022-05-03 15:35:06,293 INFO [train.py:715] (0/8) Epoch 0, batch 17350, loss[loss=0.2475, simple_loss=0.2935, pruned_loss=0.1007, over 4818.00 frames.], tot_loss[loss=0.2278, simple_loss=0.2793, pruned_loss=0.08813, over 972932.66 frames.], batch size: 25, lr: 1.58e-03 2022-05-03 15:35:46,527 INFO [train.py:715] (0/8) Epoch 0, batch 17400, loss[loss=0.252, simple_loss=0.3041, pruned_loss=0.09997, over 4797.00 frames.], tot_loss[loss=0.227, simple_loss=0.2784, pruned_loss=0.08777, over 972861.64 frames.], batch size: 25, lr: 1.58e-03 2022-05-03 15:36:26,423 INFO [train.py:715] (0/8) Epoch 0, batch 17450, loss[loss=0.2219, simple_loss=0.2729, pruned_loss=0.08551, over 4965.00 frames.], tot_loss[loss=0.2262, simple_loss=0.2779, pruned_loss=0.08724, over 972859.87 frames.], batch size: 35, lr: 1.57e-03 2022-05-03 15:37:07,037 INFO [train.py:715] (0/8) Epoch 0, batch 17500, loss[loss=0.2309, simple_loss=0.2773, pruned_loss=0.09223, over 4814.00 frames.], tot_loss[loss=0.2272, simple_loss=0.2786, pruned_loss=0.08792, over 972833.22 frames.], batch size: 25, lr: 1.57e-03 2022-05-03 15:37:47,460 INFO [train.py:715] (0/8) Epoch 0, batch 17550, loss[loss=0.2117, simple_loss=0.2649, pruned_loss=0.07927, over 4858.00 frames.], tot_loss[loss=0.227, simple_loss=0.2783, pruned_loss=0.08788, over 972893.16 frames.], batch size: 20, lr: 1.57e-03 2022-05-03 15:38:27,020 INFO [train.py:715] (0/8) Epoch 0, batch 17600, loss[loss=0.2015, simple_loss=0.2575, pruned_loss=0.07275, over 4795.00 frames.], tot_loss[loss=0.226, simple_loss=0.2776, pruned_loss=0.08724, over 973013.24 frames.], batch size: 18, lr: 1.57e-03 2022-05-03 15:39:06,939 INFO 
[train.py:715] (0/8) Epoch 0, batch 17650, loss[loss=0.22, simple_loss=0.2743, pruned_loss=0.08279, over 4931.00 frames.], tot_loss[loss=0.2256, simple_loss=0.2773, pruned_loss=0.08695, over 973120.31 frames.], batch size: 39, lr: 1.57e-03 2022-05-03 15:39:47,479 INFO [train.py:715] (0/8) Epoch 0, batch 17700, loss[loss=0.1985, simple_loss=0.2605, pruned_loss=0.06825, over 4869.00 frames.], tot_loss[loss=0.2272, simple_loss=0.2782, pruned_loss=0.08808, over 972989.25 frames.], batch size: 22, lr: 1.56e-03 2022-05-03 15:40:27,382 INFO [train.py:715] (0/8) Epoch 0, batch 17750, loss[loss=0.2074, simple_loss=0.2718, pruned_loss=0.07153, over 4837.00 frames.], tot_loss[loss=0.2254, simple_loss=0.2771, pruned_loss=0.08687, over 973005.30 frames.], batch size: 30, lr: 1.56e-03 2022-05-03 15:41:07,058 INFO [train.py:715] (0/8) Epoch 0, batch 17800, loss[loss=0.209, simple_loss=0.2624, pruned_loss=0.07781, over 4983.00 frames.], tot_loss[loss=0.2239, simple_loss=0.2764, pruned_loss=0.0857, over 972885.05 frames.], batch size: 28, lr: 1.56e-03 2022-05-03 15:41:47,861 INFO [train.py:715] (0/8) Epoch 0, batch 17850, loss[loss=0.2209, simple_loss=0.2706, pruned_loss=0.08559, over 4900.00 frames.], tot_loss[loss=0.2235, simple_loss=0.2763, pruned_loss=0.08535, over 973536.95 frames.], batch size: 22, lr: 1.56e-03 2022-05-03 15:42:28,485 INFO [train.py:715] (0/8) Epoch 0, batch 17900, loss[loss=0.194, simple_loss=0.2454, pruned_loss=0.07131, over 4980.00 frames.], tot_loss[loss=0.224, simple_loss=0.2768, pruned_loss=0.0856, over 973631.78 frames.], batch size: 24, lr: 1.56e-03 2022-05-03 15:43:07,988 INFO [train.py:715] (0/8) Epoch 0, batch 17950, loss[loss=0.2323, simple_loss=0.2947, pruned_loss=0.08495, over 4898.00 frames.], tot_loss[loss=0.224, simple_loss=0.2767, pruned_loss=0.08563, over 974273.39 frames.], batch size: 22, lr: 1.55e-03 2022-05-03 15:43:48,224 INFO [train.py:715] (0/8) Epoch 0, batch 18000, loss[loss=0.2127, simple_loss=0.2752, pruned_loss=0.07508, over 4960.00 frames.], tot_loss[loss=0.2247, simple_loss=0.2774, pruned_loss=0.08604, over 974023.82 frames.], batch size: 35, lr: 1.55e-03 2022-05-03 15:43:48,225 INFO [train.py:733] (0/8) Computing validation loss 2022-05-03 15:43:57,827 INFO [train.py:742] (0/8) Epoch 0, validation: loss=0.141, simple_loss=0.228, pruned_loss=0.02706, over 914524.00 frames. 
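The learning rate in these entries decays smoothly from about 2.09e-03 near batch 9050 to 1.55e-03 at batch 18000. Those values are consistent with an Eden-style batch-wise factor, lr = initial_lr * ((batch^2 + lr_batches^2) / lr_batches^2) ** -0.25, taking initial_lr = 0.003, lr_batches = 5000, and an epoch factor of 1 within the first epoch. The formula and constants below are reverse-engineered from the logged values, not an excerpt from the scheduler, but they reproduce the printed lr at several batch indices.

```python
# Hypothetical reconstruction of the batch-wise LR decay seen in this log.
# INITIAL_LR and LR_BATCHES are assumed values, not read from the training script.
INITIAL_LR = 0.003
LR_BATCHES = 5000

def batch_lr(batch_idx: int) -> float:
    """Eden-style batch factor; the epoch factor is taken as 1.0 within epoch 0."""
    factor = ((batch_idx**2 + LR_BATCHES**2) / LR_BATCHES**2) ** -0.25
    return INITIAL_LR * factor

for batch_idx in (9050, 12000, 16000, 18000, 21000):
    # Compare against the lr values printed in the corresponding log entries:
    # 2.09e-03, 1.86e-03, 1.64e-03, 1.55e-03, 1.44e-03.
    print(f"batch {batch_idx}: lr ~ {batch_lr(batch_idx):.2e}")
```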
2022-05-03 15:44:38,093 INFO [train.py:715] (0/8) Epoch 0, batch 18050, loss[loss=0.2616, simple_loss=0.3005, pruned_loss=0.1114, over 4832.00 frames.], tot_loss[loss=0.2253, simple_loss=0.2778, pruned_loss=0.08638, over 974183.82 frames.], batch size: 15, lr: 1.55e-03 2022-05-03 15:45:18,347 INFO [train.py:715] (0/8) Epoch 0, batch 18100, loss[loss=0.185, simple_loss=0.2286, pruned_loss=0.07071, over 4836.00 frames.], tot_loss[loss=0.2242, simple_loss=0.2771, pruned_loss=0.08562, over 973935.93 frames.], batch size: 13, lr: 1.55e-03 2022-05-03 15:45:58,158 INFO [train.py:715] (0/8) Epoch 0, batch 18150, loss[loss=0.2571, simple_loss=0.3043, pruned_loss=0.105, over 4944.00 frames.], tot_loss[loss=0.2232, simple_loss=0.2764, pruned_loss=0.08499, over 974239.53 frames.], batch size: 24, lr: 1.55e-03 2022-05-03 15:46:37,570 INFO [train.py:715] (0/8) Epoch 0, batch 18200, loss[loss=0.1906, simple_loss=0.2511, pruned_loss=0.0651, over 4908.00 frames.], tot_loss[loss=0.223, simple_loss=0.276, pruned_loss=0.08498, over 974032.49 frames.], batch size: 19, lr: 1.54e-03 2022-05-03 15:47:17,748 INFO [train.py:715] (0/8) Epoch 0, batch 18250, loss[loss=0.241, simple_loss=0.2914, pruned_loss=0.09529, over 4909.00 frames.], tot_loss[loss=0.223, simple_loss=0.2758, pruned_loss=0.08512, over 973240.30 frames.], batch size: 29, lr: 1.54e-03 2022-05-03 15:47:59,027 INFO [train.py:715] (0/8) Epoch 0, batch 18300, loss[loss=0.2253, simple_loss=0.2822, pruned_loss=0.08423, over 4816.00 frames.], tot_loss[loss=0.2226, simple_loss=0.2755, pruned_loss=0.08488, over 972721.91 frames.], batch size: 25, lr: 1.54e-03 2022-05-03 15:48:38,806 INFO [train.py:715] (0/8) Epoch 0, batch 18350, loss[loss=0.2379, simple_loss=0.2832, pruned_loss=0.09633, over 4842.00 frames.], tot_loss[loss=0.2229, simple_loss=0.2756, pruned_loss=0.0851, over 972967.01 frames.], batch size: 30, lr: 1.54e-03 2022-05-03 15:49:19,073 INFO [train.py:715] (0/8) Epoch 0, batch 18400, loss[loss=0.2063, simple_loss=0.2657, pruned_loss=0.07345, over 4808.00 frames.], tot_loss[loss=0.2233, simple_loss=0.2765, pruned_loss=0.08511, over 973305.94 frames.], batch size: 25, lr: 1.54e-03 2022-05-03 15:49:59,576 INFO [train.py:715] (0/8) Epoch 0, batch 18450, loss[loss=0.232, simple_loss=0.2964, pruned_loss=0.08375, over 4774.00 frames.], tot_loss[loss=0.2244, simple_loss=0.2774, pruned_loss=0.08566, over 973442.65 frames.], batch size: 17, lr: 1.53e-03 2022-05-03 15:50:39,249 INFO [train.py:715] (0/8) Epoch 0, batch 18500, loss[loss=0.2865, simple_loss=0.3251, pruned_loss=0.124, over 4884.00 frames.], tot_loss[loss=0.2241, simple_loss=0.2772, pruned_loss=0.08552, over 973259.75 frames.], batch size: 16, lr: 1.53e-03 2022-05-03 15:51:19,783 INFO [train.py:715] (0/8) Epoch 0, batch 18550, loss[loss=0.2053, simple_loss=0.2579, pruned_loss=0.07636, over 4841.00 frames.], tot_loss[loss=0.2243, simple_loss=0.2772, pruned_loss=0.08568, over 972965.22 frames.], batch size: 13, lr: 1.53e-03 2022-05-03 15:52:00,085 INFO [train.py:715] (0/8) Epoch 0, batch 18600, loss[loss=0.1896, simple_loss=0.2564, pruned_loss=0.06144, over 4923.00 frames.], tot_loss[loss=0.2244, simple_loss=0.2774, pruned_loss=0.08567, over 972953.35 frames.], batch size: 21, lr: 1.53e-03 2022-05-03 15:52:40,191 INFO [train.py:715] (0/8) Epoch 0, batch 18650, loss[loss=0.1853, simple_loss=0.2562, pruned_loss=0.05725, over 4960.00 frames.], tot_loss[loss=0.2226, simple_loss=0.2761, pruned_loss=0.08453, over 972779.33 frames.], batch size: 24, lr: 1.53e-03 2022-05-03 15:53:19,601 INFO [train.py:715] 
(0/8) Epoch 0, batch 18700, loss[loss=0.1805, simple_loss=0.2474, pruned_loss=0.05684, over 4860.00 frames.], tot_loss[loss=0.2236, simple_loss=0.2764, pruned_loss=0.08544, over 971863.22 frames.], batch size: 13, lr: 1.52e-03 2022-05-03 15:53:59,909 INFO [train.py:715] (0/8) Epoch 0, batch 18750, loss[loss=0.2978, simple_loss=0.3271, pruned_loss=0.1342, over 4991.00 frames.], tot_loss[loss=0.2224, simple_loss=0.2751, pruned_loss=0.0848, over 971291.33 frames.], batch size: 15, lr: 1.52e-03 2022-05-03 15:54:41,177 INFO [train.py:715] (0/8) Epoch 0, batch 18800, loss[loss=0.2133, simple_loss=0.2652, pruned_loss=0.08073, over 4700.00 frames.], tot_loss[loss=0.221, simple_loss=0.2738, pruned_loss=0.08413, over 971863.23 frames.], batch size: 15, lr: 1.52e-03 2022-05-03 15:55:20,404 INFO [train.py:715] (0/8) Epoch 0, batch 18850, loss[loss=0.3007, simple_loss=0.3262, pruned_loss=0.1376, over 4797.00 frames.], tot_loss[loss=0.2221, simple_loss=0.2748, pruned_loss=0.08476, over 971926.46 frames.], batch size: 14, lr: 1.52e-03 2022-05-03 15:56:01,307 INFO [train.py:715] (0/8) Epoch 0, batch 18900, loss[loss=0.2408, simple_loss=0.2929, pruned_loss=0.09437, over 4792.00 frames.], tot_loss[loss=0.2224, simple_loss=0.2749, pruned_loss=0.08494, over 972216.20 frames.], batch size: 17, lr: 1.52e-03 2022-05-03 15:56:41,743 INFO [train.py:715] (0/8) Epoch 0, batch 18950, loss[loss=0.2088, simple_loss=0.271, pruned_loss=0.07328, over 4920.00 frames.], tot_loss[loss=0.2231, simple_loss=0.2754, pruned_loss=0.08538, over 973409.93 frames.], batch size: 18, lr: 1.52e-03 2022-05-03 15:57:21,405 INFO [train.py:715] (0/8) Epoch 0, batch 19000, loss[loss=0.2893, simple_loss=0.318, pruned_loss=0.1303, over 4972.00 frames.], tot_loss[loss=0.2239, simple_loss=0.2763, pruned_loss=0.08574, over 973215.33 frames.], batch size: 35, lr: 1.51e-03 2022-05-03 15:58:01,851 INFO [train.py:715] (0/8) Epoch 0, batch 19050, loss[loss=0.1719, simple_loss=0.2383, pruned_loss=0.05278, over 4846.00 frames.], tot_loss[loss=0.223, simple_loss=0.2755, pruned_loss=0.08527, over 973144.32 frames.], batch size: 15, lr: 1.51e-03 2022-05-03 15:58:42,185 INFO [train.py:715] (0/8) Epoch 0, batch 19100, loss[loss=0.1718, simple_loss=0.2434, pruned_loss=0.05011, over 4920.00 frames.], tot_loss[loss=0.2224, simple_loss=0.2751, pruned_loss=0.0849, over 972841.91 frames.], batch size: 29, lr: 1.51e-03 2022-05-03 15:59:22,505 INFO [train.py:715] (0/8) Epoch 0, batch 19150, loss[loss=0.2262, simple_loss=0.2809, pruned_loss=0.08579, over 4679.00 frames.], tot_loss[loss=0.2224, simple_loss=0.2753, pruned_loss=0.08478, over 972931.54 frames.], batch size: 15, lr: 1.51e-03 2022-05-03 16:00:01,718 INFO [train.py:715] (0/8) Epoch 0, batch 19200, loss[loss=0.2171, simple_loss=0.2755, pruned_loss=0.07931, over 4824.00 frames.], tot_loss[loss=0.2195, simple_loss=0.2731, pruned_loss=0.08297, over 973188.04 frames.], batch size: 26, lr: 1.51e-03 2022-05-03 16:00:42,583 INFO [train.py:715] (0/8) Epoch 0, batch 19250, loss[loss=0.2295, simple_loss=0.286, pruned_loss=0.08647, over 4953.00 frames.], tot_loss[loss=0.2206, simple_loss=0.2743, pruned_loss=0.08346, over 972008.09 frames.], batch size: 24, lr: 1.50e-03 2022-05-03 16:01:23,353 INFO [train.py:715] (0/8) Epoch 0, batch 19300, loss[loss=0.2259, simple_loss=0.2858, pruned_loss=0.083, over 4860.00 frames.], tot_loss[loss=0.2215, simple_loss=0.2748, pruned_loss=0.0841, over 971548.26 frames.], batch size: 22, lr: 1.50e-03 2022-05-03 16:02:03,058 INFO [train.py:715] (0/8) Epoch 0, batch 19350, 
loss[loss=0.1991, simple_loss=0.261, pruned_loss=0.0686, over 4896.00 frames.], tot_loss[loss=0.2216, simple_loss=0.275, pruned_loss=0.08404, over 970957.55 frames.], batch size: 22, lr: 1.50e-03 2022-05-03 16:02:43,214 INFO [train.py:715] (0/8) Epoch 0, batch 19400, loss[loss=0.1881, simple_loss=0.2497, pruned_loss=0.06331, over 4876.00 frames.], tot_loss[loss=0.2198, simple_loss=0.2737, pruned_loss=0.0829, over 971182.62 frames.], batch size: 22, lr: 1.50e-03 2022-05-03 16:03:24,061 INFO [train.py:715] (0/8) Epoch 0, batch 19450, loss[loss=0.2461, simple_loss=0.2895, pruned_loss=0.1014, over 4792.00 frames.], tot_loss[loss=0.2194, simple_loss=0.2731, pruned_loss=0.08288, over 970676.69 frames.], batch size: 14, lr: 1.50e-03 2022-05-03 16:04:03,574 INFO [train.py:715] (0/8) Epoch 0, batch 19500, loss[loss=0.2183, simple_loss=0.2688, pruned_loss=0.08393, over 4844.00 frames.], tot_loss[loss=0.2182, simple_loss=0.272, pruned_loss=0.0822, over 970664.79 frames.], batch size: 30, lr: 1.50e-03 2022-05-03 16:04:42,929 INFO [train.py:715] (0/8) Epoch 0, batch 19550, loss[loss=0.1965, simple_loss=0.2444, pruned_loss=0.07435, over 4827.00 frames.], tot_loss[loss=0.2178, simple_loss=0.2718, pruned_loss=0.08192, over 970296.63 frames.], batch size: 25, lr: 1.49e-03 2022-05-03 16:05:23,278 INFO [train.py:715] (0/8) Epoch 0, batch 19600, loss[loss=0.2044, simple_loss=0.2666, pruned_loss=0.07115, over 4900.00 frames.], tot_loss[loss=0.217, simple_loss=0.2713, pruned_loss=0.0813, over 970944.57 frames.], batch size: 18, lr: 1.49e-03 2022-05-03 16:06:03,064 INFO [train.py:715] (0/8) Epoch 0, batch 19650, loss[loss=0.252, simple_loss=0.2991, pruned_loss=0.1024, over 4781.00 frames.], tot_loss[loss=0.2165, simple_loss=0.2711, pruned_loss=0.08091, over 970825.06 frames.], batch size: 17, lr: 1.49e-03 2022-05-03 16:06:42,549 INFO [train.py:715] (0/8) Epoch 0, batch 19700, loss[loss=0.2022, simple_loss=0.2487, pruned_loss=0.07789, over 4833.00 frames.], tot_loss[loss=0.2169, simple_loss=0.2711, pruned_loss=0.08131, over 970748.40 frames.], batch size: 15, lr: 1.49e-03 2022-05-03 16:07:22,620 INFO [train.py:715] (0/8) Epoch 0, batch 19750, loss[loss=0.2924, simple_loss=0.3255, pruned_loss=0.1297, over 4936.00 frames.], tot_loss[loss=0.2163, simple_loss=0.2707, pruned_loss=0.08094, over 970746.33 frames.], batch size: 23, lr: 1.49e-03 2022-05-03 16:08:02,296 INFO [train.py:715] (0/8) Epoch 0, batch 19800, loss[loss=0.2429, simple_loss=0.2914, pruned_loss=0.09723, over 4780.00 frames.], tot_loss[loss=0.2173, simple_loss=0.2712, pruned_loss=0.0817, over 971158.28 frames.], batch size: 18, lr: 1.48e-03 2022-05-03 16:08:42,111 INFO [train.py:715] (0/8) Epoch 0, batch 19850, loss[loss=0.2051, simple_loss=0.2627, pruned_loss=0.07376, over 4844.00 frames.], tot_loss[loss=0.2192, simple_loss=0.273, pruned_loss=0.08272, over 971133.10 frames.], batch size: 20, lr: 1.48e-03 2022-05-03 16:09:21,346 INFO [train.py:715] (0/8) Epoch 0, batch 19900, loss[loss=0.2741, simple_loss=0.31, pruned_loss=0.1191, over 4836.00 frames.], tot_loss[loss=0.2196, simple_loss=0.2736, pruned_loss=0.08281, over 971597.20 frames.], batch size: 32, lr: 1.48e-03 2022-05-03 16:10:02,122 INFO [train.py:715] (0/8) Epoch 0, batch 19950, loss[loss=0.2049, simple_loss=0.2621, pruned_loss=0.07386, over 4915.00 frames.], tot_loss[loss=0.2194, simple_loss=0.273, pruned_loss=0.08285, over 973040.19 frames.], batch size: 17, lr: 1.48e-03 2022-05-03 16:10:42,169 INFO [train.py:715] (0/8) Epoch 0, batch 20000, loss[loss=0.2058, simple_loss=0.2548, 
pruned_loss=0.07838, over 4884.00 frames.], tot_loss[loss=0.2187, simple_loss=0.2726, pruned_loss=0.08242, over 973299.86 frames.], batch size: 16, lr: 1.48e-03 2022-05-03 16:11:21,525 INFO [train.py:715] (0/8) Epoch 0, batch 20050, loss[loss=0.21, simple_loss=0.2612, pruned_loss=0.07935, over 4748.00 frames.], tot_loss[loss=0.2188, simple_loss=0.2727, pruned_loss=0.08242, over 973836.31 frames.], batch size: 19, lr: 1.48e-03 2022-05-03 16:12:01,704 INFO [train.py:715] (0/8) Epoch 0, batch 20100, loss[loss=0.204, simple_loss=0.2581, pruned_loss=0.07494, over 4926.00 frames.], tot_loss[loss=0.2177, simple_loss=0.2722, pruned_loss=0.08163, over 973866.58 frames.], batch size: 29, lr: 1.47e-03 2022-05-03 16:12:41,694 INFO [train.py:715] (0/8) Epoch 0, batch 20150, loss[loss=0.2011, simple_loss=0.2596, pruned_loss=0.07135, over 4932.00 frames.], tot_loss[loss=0.2188, simple_loss=0.2731, pruned_loss=0.08228, over 974012.76 frames.], batch size: 29, lr: 1.47e-03 2022-05-03 16:13:21,728 INFO [train.py:715] (0/8) Epoch 0, batch 20200, loss[loss=0.2095, simple_loss=0.2611, pruned_loss=0.07891, over 4862.00 frames.], tot_loss[loss=0.2178, simple_loss=0.2726, pruned_loss=0.08153, over 973611.42 frames.], batch size: 32, lr: 1.47e-03 2022-05-03 16:14:01,259 INFO [train.py:715] (0/8) Epoch 0, batch 20250, loss[loss=0.1778, simple_loss=0.2317, pruned_loss=0.06196, over 4734.00 frames.], tot_loss[loss=0.2183, simple_loss=0.2727, pruned_loss=0.08195, over 973324.89 frames.], batch size: 12, lr: 1.47e-03 2022-05-03 16:14:42,006 INFO [train.py:715] (0/8) Epoch 0, batch 20300, loss[loss=0.2166, simple_loss=0.2804, pruned_loss=0.07641, over 4961.00 frames.], tot_loss[loss=0.2185, simple_loss=0.2728, pruned_loss=0.0821, over 973124.71 frames.], batch size: 24, lr: 1.47e-03 2022-05-03 16:15:21,893 INFO [train.py:715] (0/8) Epoch 0, batch 20350, loss[loss=0.229, simple_loss=0.2912, pruned_loss=0.08339, over 4979.00 frames.], tot_loss[loss=0.2183, simple_loss=0.2726, pruned_loss=0.08202, over 972897.18 frames.], batch size: 39, lr: 1.47e-03 2022-05-03 16:16:00,953 INFO [train.py:715] (0/8) Epoch 0, batch 20400, loss[loss=0.2199, simple_loss=0.2625, pruned_loss=0.08865, over 4938.00 frames.], tot_loss[loss=0.22, simple_loss=0.2734, pruned_loss=0.08334, over 973125.69 frames.], batch size: 23, lr: 1.46e-03 2022-05-03 16:16:40,899 INFO [train.py:715] (0/8) Epoch 0, batch 20450, loss[loss=0.2032, simple_loss=0.2622, pruned_loss=0.07213, over 4848.00 frames.], tot_loss[loss=0.2201, simple_loss=0.2735, pruned_loss=0.08334, over 972207.46 frames.], batch size: 30, lr: 1.46e-03 2022-05-03 16:17:20,440 INFO [train.py:715] (0/8) Epoch 0, batch 20500, loss[loss=0.1983, simple_loss=0.2462, pruned_loss=0.07519, over 4800.00 frames.], tot_loss[loss=0.2191, simple_loss=0.2728, pruned_loss=0.08267, over 972474.71 frames.], batch size: 21, lr: 1.46e-03 2022-05-03 16:18:00,515 INFO [train.py:715] (0/8) Epoch 0, batch 20550, loss[loss=0.171, simple_loss=0.2259, pruned_loss=0.05811, over 4838.00 frames.], tot_loss[loss=0.2204, simple_loss=0.274, pruned_loss=0.08338, over 971632.38 frames.], batch size: 13, lr: 1.46e-03 2022-05-03 16:18:39,957 INFO [train.py:715] (0/8) Epoch 0, batch 20600, loss[loss=0.213, simple_loss=0.281, pruned_loss=0.07255, over 4848.00 frames.], tot_loss[loss=0.2198, simple_loss=0.2738, pruned_loss=0.08287, over 971369.69 frames.], batch size: 20, lr: 1.46e-03 2022-05-03 16:19:19,647 INFO [train.py:715] (0/8) Epoch 0, batch 20650, loss[loss=0.1499, simple_loss=0.2317, pruned_loss=0.0341, over 4833.00 frames.], 
tot_loss[loss=0.2184, simple_loss=0.2728, pruned_loss=0.08194, over 970882.78 frames.], batch size: 13, lr: 1.46e-03 2022-05-03 16:20:00,379 INFO [train.py:715] (0/8) Epoch 0, batch 20700, loss[loss=0.1863, simple_loss=0.2573, pruned_loss=0.05764, over 4985.00 frames.], tot_loss[loss=0.2181, simple_loss=0.2729, pruned_loss=0.08169, over 971312.69 frames.], batch size: 28, lr: 1.45e-03 2022-05-03 16:20:39,705 INFO [train.py:715] (0/8) Epoch 0, batch 20750, loss[loss=0.2153, simple_loss=0.2716, pruned_loss=0.07947, over 4703.00 frames.], tot_loss[loss=0.2191, simple_loss=0.2737, pruned_loss=0.08229, over 970728.15 frames.], batch size: 15, lr: 1.45e-03 2022-05-03 16:21:19,884 INFO [train.py:715] (0/8) Epoch 0, batch 20800, loss[loss=0.2147, simple_loss=0.2688, pruned_loss=0.08033, over 4808.00 frames.], tot_loss[loss=0.2175, simple_loss=0.2724, pruned_loss=0.0813, over 971251.71 frames.], batch size: 21, lr: 1.45e-03 2022-05-03 16:21:59,640 INFO [train.py:715] (0/8) Epoch 0, batch 20850, loss[loss=0.1666, simple_loss=0.2371, pruned_loss=0.04803, over 4985.00 frames.], tot_loss[loss=0.2173, simple_loss=0.272, pruned_loss=0.08134, over 972611.72 frames.], batch size: 25, lr: 1.45e-03 2022-05-03 16:22:39,128 INFO [train.py:715] (0/8) Epoch 0, batch 20900, loss[loss=0.2114, simple_loss=0.2764, pruned_loss=0.07326, over 4750.00 frames.], tot_loss[loss=0.2154, simple_loss=0.2706, pruned_loss=0.08015, over 972574.44 frames.], batch size: 19, lr: 1.45e-03 2022-05-03 16:23:19,645 INFO [train.py:715] (0/8) Epoch 0, batch 20950, loss[loss=0.1957, simple_loss=0.2685, pruned_loss=0.06147, over 4809.00 frames.], tot_loss[loss=0.216, simple_loss=0.2709, pruned_loss=0.0806, over 972401.15 frames.], batch size: 25, lr: 1.45e-03 2022-05-03 16:24:00,683 INFO [train.py:715] (0/8) Epoch 0, batch 21000, loss[loss=0.2115, simple_loss=0.2709, pruned_loss=0.07606, over 4880.00 frames.], tot_loss[loss=0.2167, simple_loss=0.2713, pruned_loss=0.08103, over 972941.62 frames.], batch size: 22, lr: 1.44e-03 2022-05-03 16:24:00,685 INFO [train.py:733] (0/8) Computing validation loss 2022-05-03 16:24:16,220 INFO [train.py:742] (0/8) Epoch 0, validation: loss=0.1386, simple_loss=0.2255, pruned_loss=0.02581, over 914524.00 frames. 
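Four validation passes have been logged so far (0.1516 -> 0.1454 -> 0.141 -> 0.1386, each over the same 914 524 frames), so the validation loss is still improving at batch 21000. A small parser like the one below, pointed at a hypothetical log path, is enough to pull that curve out of a log in this format for plotting.

```python
import re

# Hypothetical path; point it at a log file in the format shown above.
LOG_PATH = "train-log.txt"

# Matches e.g. "Epoch 0, validation: loss=0.1386, simple_loss=0.2255, pruned_loss=0.02581"
val_re = re.compile(
    r"Epoch (\d+), validation: loss=([\d.]+), simple_loss=([\d.]+), pruned_loss=([\d.]+)"
)

with open(LOG_PATH) as f:
    for line in f:
        for m in val_re.finditer(line):
            epoch, loss, simple, pruned = m.groups()
            print(f"epoch {epoch}: val loss {loss} (simple {simple}, pruned {pruned})")
```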
2022-05-03 16:24:57,015 INFO [train.py:715] (0/8) Epoch 0, batch 21050, loss[loss=0.2341, simple_loss=0.283, pruned_loss=0.09257, over 4990.00 frames.], tot_loss[loss=0.2167, simple_loss=0.2715, pruned_loss=0.08093, over 972652.49 frames.], batch size: 14, lr: 1.44e-03 2022-05-03 16:25:36,603 INFO [train.py:715] (0/8) Epoch 0, batch 21100, loss[loss=0.1988, simple_loss=0.2552, pruned_loss=0.0712, over 4975.00 frames.], tot_loss[loss=0.2157, simple_loss=0.2711, pruned_loss=0.08014, over 972264.58 frames.], batch size: 24, lr: 1.44e-03 2022-05-03 16:26:16,953 INFO [train.py:715] (0/8) Epoch 0, batch 21150, loss[loss=0.246, simple_loss=0.2938, pruned_loss=0.09912, over 4872.00 frames.], tot_loss[loss=0.2155, simple_loss=0.2707, pruned_loss=0.08012, over 972368.30 frames.], batch size: 16, lr: 1.44e-03 2022-05-03 16:26:56,818 INFO [train.py:715] (0/8) Epoch 0, batch 21200, loss[loss=0.204, simple_loss=0.2551, pruned_loss=0.07644, over 4773.00 frames.], tot_loss[loss=0.2154, simple_loss=0.2709, pruned_loss=0.07994, over 972744.93 frames.], batch size: 17, lr: 1.44e-03 2022-05-03 16:27:37,356 INFO [train.py:715] (0/8) Epoch 0, batch 21250, loss[loss=0.196, simple_loss=0.2564, pruned_loss=0.06782, over 4646.00 frames.], tot_loss[loss=0.2161, simple_loss=0.2711, pruned_loss=0.08057, over 971389.29 frames.], batch size: 13, lr: 1.44e-03 2022-05-03 16:28:17,124 INFO [train.py:715] (0/8) Epoch 0, batch 21300, loss[loss=0.2227, simple_loss=0.2643, pruned_loss=0.09049, over 4865.00 frames.], tot_loss[loss=0.2162, simple_loss=0.2709, pruned_loss=0.08073, over 971305.79 frames.], batch size: 32, lr: 1.43e-03 2022-05-03 16:28:57,546 INFO [train.py:715] (0/8) Epoch 0, batch 21350, loss[loss=0.1997, simple_loss=0.2584, pruned_loss=0.07047, over 4844.00 frames.], tot_loss[loss=0.2133, simple_loss=0.2688, pruned_loss=0.0789, over 971483.35 frames.], batch size: 32, lr: 1.43e-03 2022-05-03 16:29:38,280 INFO [train.py:715] (0/8) Epoch 0, batch 21400, loss[loss=0.1879, simple_loss=0.2448, pruned_loss=0.06552, over 4781.00 frames.], tot_loss[loss=0.2127, simple_loss=0.2684, pruned_loss=0.07849, over 971552.95 frames.], batch size: 17, lr: 1.43e-03 2022-05-03 16:30:17,952 INFO [train.py:715] (0/8) Epoch 0, batch 21450, loss[loss=0.2595, simple_loss=0.3096, pruned_loss=0.1048, over 4829.00 frames.], tot_loss[loss=0.2152, simple_loss=0.2706, pruned_loss=0.07993, over 971963.87 frames.], batch size: 13, lr: 1.43e-03 2022-05-03 16:30:57,797 INFO [train.py:715] (0/8) Epoch 0, batch 21500, loss[loss=0.2218, simple_loss=0.2987, pruned_loss=0.0724, over 4944.00 frames.], tot_loss[loss=0.2162, simple_loss=0.2715, pruned_loss=0.08052, over 971221.50 frames.], batch size: 21, lr: 1.43e-03 2022-05-03 16:31:38,010 INFO [train.py:715] (0/8) Epoch 0, batch 21550, loss[loss=0.2325, simple_loss=0.2853, pruned_loss=0.08988, over 4745.00 frames.], tot_loss[loss=0.2145, simple_loss=0.2701, pruned_loss=0.07947, over 971371.34 frames.], batch size: 19, lr: 1.43e-03 2022-05-03 16:32:18,469 INFO [train.py:715] (0/8) Epoch 0, batch 21600, loss[loss=0.2814, simple_loss=0.3214, pruned_loss=0.1207, over 4815.00 frames.], tot_loss[loss=0.2154, simple_loss=0.2707, pruned_loss=0.08006, over 971229.83 frames.], batch size: 15, lr: 1.42e-03 2022-05-03 16:32:58,238 INFO [train.py:715] (0/8) Epoch 0, batch 21650, loss[loss=0.2305, simple_loss=0.2791, pruned_loss=0.09094, over 4927.00 frames.], tot_loss[loss=0.216, simple_loss=0.2709, pruned_loss=0.08056, over 971360.07 frames.], batch size: 39, lr: 1.42e-03 2022-05-03 16:33:39,050 INFO 
[train.py:715] (0/8) Epoch 0, batch 21700, loss[loss=0.2585, simple_loss=0.3028, pruned_loss=0.1072, over 4897.00 frames.], tot_loss[loss=0.2173, simple_loss=0.2719, pruned_loss=0.08138, over 972214.79 frames.], batch size: 19, lr: 1.42e-03 2022-05-03 16:34:19,220 INFO [train.py:715] (0/8) Epoch 0, batch 21750, loss[loss=0.2054, simple_loss=0.2633, pruned_loss=0.07377, over 4901.00 frames.], tot_loss[loss=0.215, simple_loss=0.27, pruned_loss=0.07997, over 971524.02 frames.], batch size: 18, lr: 1.42e-03 2022-05-03 16:34:58,792 INFO [train.py:715] (0/8) Epoch 0, batch 21800, loss[loss=0.1758, simple_loss=0.2475, pruned_loss=0.05203, over 4801.00 frames.], tot_loss[loss=0.2144, simple_loss=0.2691, pruned_loss=0.07986, over 972193.23 frames.], batch size: 21, lr: 1.42e-03 2022-05-03 16:35:38,621 INFO [train.py:715] (0/8) Epoch 0, batch 21850, loss[loss=0.2462, simple_loss=0.2885, pruned_loss=0.102, over 4977.00 frames.], tot_loss[loss=0.2164, simple_loss=0.2709, pruned_loss=0.08097, over 972558.88 frames.], batch size: 40, lr: 1.42e-03 2022-05-03 16:36:19,093 INFO [train.py:715] (0/8) Epoch 0, batch 21900, loss[loss=0.1975, simple_loss=0.2457, pruned_loss=0.07461, over 4800.00 frames.], tot_loss[loss=0.217, simple_loss=0.2714, pruned_loss=0.0813, over 972969.84 frames.], batch size: 12, lr: 1.42e-03 2022-05-03 16:36:59,004 INFO [train.py:715] (0/8) Epoch 0, batch 21950, loss[loss=0.1868, simple_loss=0.2448, pruned_loss=0.06434, over 4851.00 frames.], tot_loss[loss=0.2151, simple_loss=0.2696, pruned_loss=0.08031, over 972621.39 frames.], batch size: 20, lr: 1.41e-03 2022-05-03 16:37:38,289 INFO [train.py:715] (0/8) Epoch 0, batch 22000, loss[loss=0.2154, simple_loss=0.2673, pruned_loss=0.08177, over 4895.00 frames.], tot_loss[loss=0.2149, simple_loss=0.2699, pruned_loss=0.08001, over 972370.73 frames.], batch size: 19, lr: 1.41e-03 2022-05-03 16:38:18,443 INFO [train.py:715] (0/8) Epoch 0, batch 22050, loss[loss=0.2006, simple_loss=0.2529, pruned_loss=0.07412, over 4658.00 frames.], tot_loss[loss=0.2142, simple_loss=0.2691, pruned_loss=0.07964, over 972517.04 frames.], batch size: 13, lr: 1.41e-03 2022-05-03 16:38:58,604 INFO [train.py:715] (0/8) Epoch 0, batch 22100, loss[loss=0.199, simple_loss=0.244, pruned_loss=0.07699, over 4777.00 frames.], tot_loss[loss=0.2143, simple_loss=0.2698, pruned_loss=0.07938, over 971650.04 frames.], batch size: 14, lr: 1.41e-03 2022-05-03 16:39:38,118 INFO [train.py:715] (0/8) Epoch 0, batch 22150, loss[loss=0.2173, simple_loss=0.2785, pruned_loss=0.07803, over 4898.00 frames.], tot_loss[loss=0.2142, simple_loss=0.2697, pruned_loss=0.07933, over 972151.52 frames.], batch size: 19, lr: 1.41e-03 2022-05-03 16:40:17,928 INFO [train.py:715] (0/8) Epoch 0, batch 22200, loss[loss=0.2113, simple_loss=0.2707, pruned_loss=0.07594, over 4989.00 frames.], tot_loss[loss=0.2156, simple_loss=0.2712, pruned_loss=0.08, over 972912.04 frames.], batch size: 28, lr: 1.41e-03 2022-05-03 16:40:58,312 INFO [train.py:715] (0/8) Epoch 0, batch 22250, loss[loss=0.2313, simple_loss=0.2901, pruned_loss=0.08626, over 4739.00 frames.], tot_loss[loss=0.2154, simple_loss=0.271, pruned_loss=0.07994, over 973373.86 frames.], batch size: 16, lr: 1.40e-03 2022-05-03 16:41:38,378 INFO [train.py:715] (0/8) Epoch 0, batch 22300, loss[loss=0.2426, simple_loss=0.2921, pruned_loss=0.09651, over 4913.00 frames.], tot_loss[loss=0.2166, simple_loss=0.2716, pruned_loss=0.08084, over 972888.54 frames.], batch size: 17, lr: 1.40e-03 2022-05-03 16:42:18,080 INFO [train.py:715] (0/8) Epoch 0, batch 22350, 
loss[loss=0.2193, simple_loss=0.2866, pruned_loss=0.07601, over 4861.00 frames.], tot_loss[loss=0.216, simple_loss=0.2706, pruned_loss=0.08075, over 971863.17 frames.], batch size: 20, lr: 1.40e-03 2022-05-03 16:42:58,252 INFO [train.py:715] (0/8) Epoch 0, batch 22400, loss[loss=0.2201, simple_loss=0.2836, pruned_loss=0.07831, over 4934.00 frames.], tot_loss[loss=0.2148, simple_loss=0.27, pruned_loss=0.07981, over 972757.06 frames.], batch size: 29, lr: 1.40e-03 2022-05-03 16:43:38,085 INFO [train.py:715] (0/8) Epoch 0, batch 22450, loss[loss=0.1699, simple_loss=0.2295, pruned_loss=0.05521, over 4766.00 frames.], tot_loss[loss=0.2153, simple_loss=0.2706, pruned_loss=0.07998, over 972912.60 frames.], batch size: 14, lr: 1.40e-03 2022-05-03 16:44:17,449 INFO [train.py:715] (0/8) Epoch 0, batch 22500, loss[loss=0.1906, simple_loss=0.2502, pruned_loss=0.06554, over 4934.00 frames.], tot_loss[loss=0.2154, simple_loss=0.2705, pruned_loss=0.08017, over 973030.08 frames.], batch size: 18, lr: 1.40e-03 2022-05-03 16:44:57,227 INFO [train.py:715] (0/8) Epoch 0, batch 22550, loss[loss=0.2323, simple_loss=0.2863, pruned_loss=0.08918, over 4770.00 frames.], tot_loss[loss=0.2144, simple_loss=0.2698, pruned_loss=0.07953, over 971825.32 frames.], batch size: 14, lr: 1.40e-03 2022-05-03 16:45:37,440 INFO [train.py:715] (0/8) Epoch 0, batch 22600, loss[loss=0.2621, simple_loss=0.3076, pruned_loss=0.1084, over 4918.00 frames.], tot_loss[loss=0.2148, simple_loss=0.27, pruned_loss=0.0798, over 972170.29 frames.], batch size: 19, lr: 1.39e-03 2022-05-03 16:46:18,083 INFO [train.py:715] (0/8) Epoch 0, batch 22650, loss[loss=0.2241, simple_loss=0.2823, pruned_loss=0.08298, over 4964.00 frames.], tot_loss[loss=0.2147, simple_loss=0.27, pruned_loss=0.07969, over 971713.25 frames.], batch size: 24, lr: 1.39e-03 2022-05-03 16:46:57,299 INFO [train.py:715] (0/8) Epoch 0, batch 22700, loss[loss=0.2775, simple_loss=0.3084, pruned_loss=0.1233, over 4729.00 frames.], tot_loss[loss=0.2154, simple_loss=0.2703, pruned_loss=0.08027, over 971760.29 frames.], batch size: 16, lr: 1.39e-03 2022-05-03 16:47:37,378 INFO [train.py:715] (0/8) Epoch 0, batch 22750, loss[loss=0.1861, simple_loss=0.2583, pruned_loss=0.05701, over 4953.00 frames.], tot_loss[loss=0.2166, simple_loss=0.2714, pruned_loss=0.08093, over 971018.56 frames.], batch size: 21, lr: 1.39e-03 2022-05-03 16:48:17,858 INFO [train.py:715] (0/8) Epoch 0, batch 22800, loss[loss=0.2104, simple_loss=0.2684, pruned_loss=0.07624, over 4924.00 frames.], tot_loss[loss=0.2161, simple_loss=0.2711, pruned_loss=0.08053, over 971451.10 frames.], batch size: 23, lr: 1.39e-03 2022-05-03 16:48:57,455 INFO [train.py:715] (0/8) Epoch 0, batch 22850, loss[loss=0.2119, simple_loss=0.2757, pruned_loss=0.07402, over 4791.00 frames.], tot_loss[loss=0.2151, simple_loss=0.2703, pruned_loss=0.07996, over 971585.12 frames.], batch size: 24, lr: 1.39e-03 2022-05-03 16:49:37,569 INFO [train.py:715] (0/8) Epoch 0, batch 22900, loss[loss=0.1853, simple_loss=0.2474, pruned_loss=0.06159, over 4935.00 frames.], tot_loss[loss=0.2136, simple_loss=0.2691, pruned_loss=0.079, over 972029.89 frames.], batch size: 21, lr: 1.39e-03 2022-05-03 16:50:17,834 INFO [train.py:715] (0/8) Epoch 0, batch 22950, loss[loss=0.2197, simple_loss=0.2696, pruned_loss=0.08495, over 4903.00 frames.], tot_loss[loss=0.2125, simple_loss=0.2687, pruned_loss=0.07818, over 972457.75 frames.], batch size: 17, lr: 1.38e-03 2022-05-03 16:50:58,467 INFO [train.py:715] (0/8) Epoch 0, batch 23000, loss[loss=0.2274, simple_loss=0.2843, 
pruned_loss=0.08531, over 4776.00 frames.], tot_loss[loss=0.2136, simple_loss=0.2694, pruned_loss=0.07891, over 971526.26 frames.], batch size: 18, lr: 1.38e-03 2022-05-03 16:51:37,479 INFO [train.py:715] (0/8) Epoch 0, batch 23050, loss[loss=0.2129, simple_loss=0.2688, pruned_loss=0.07854, over 4932.00 frames.], tot_loss[loss=0.2132, simple_loss=0.2691, pruned_loss=0.07864, over 971849.01 frames.], batch size: 29, lr: 1.38e-03 2022-05-03 16:52:18,419 INFO [train.py:715] (0/8) Epoch 0, batch 23100, loss[loss=0.191, simple_loss=0.2628, pruned_loss=0.05957, over 4793.00 frames.], tot_loss[loss=0.2144, simple_loss=0.2698, pruned_loss=0.0795, over 971734.94 frames.], batch size: 24, lr: 1.38e-03 2022-05-03 16:52:59,440 INFO [train.py:715] (0/8) Epoch 0, batch 23150, loss[loss=0.2403, simple_loss=0.296, pruned_loss=0.09225, over 4751.00 frames.], tot_loss[loss=0.2146, simple_loss=0.2701, pruned_loss=0.07954, over 972008.14 frames.], batch size: 19, lr: 1.38e-03 2022-05-03 16:53:39,188 INFO [train.py:715] (0/8) Epoch 0, batch 23200, loss[loss=0.1741, simple_loss=0.2394, pruned_loss=0.05442, over 4917.00 frames.], tot_loss[loss=0.2138, simple_loss=0.2694, pruned_loss=0.07907, over 971644.49 frames.], batch size: 17, lr: 1.38e-03 2022-05-03 16:54:19,754 INFO [train.py:715] (0/8) Epoch 0, batch 23250, loss[loss=0.2199, simple_loss=0.2764, pruned_loss=0.08169, over 4905.00 frames.], tot_loss[loss=0.2129, simple_loss=0.2687, pruned_loss=0.07855, over 973100.64 frames.], batch size: 17, lr: 1.38e-03 2022-05-03 16:55:00,175 INFO [train.py:715] (0/8) Epoch 0, batch 23300, loss[loss=0.2167, simple_loss=0.2742, pruned_loss=0.07966, over 4754.00 frames.], tot_loss[loss=0.2139, simple_loss=0.2693, pruned_loss=0.07926, over 971820.69 frames.], batch size: 16, lr: 1.37e-03 2022-05-03 16:55:40,660 INFO [train.py:715] (0/8) Epoch 0, batch 23350, loss[loss=0.1698, simple_loss=0.2393, pruned_loss=0.05019, over 4986.00 frames.], tot_loss[loss=0.213, simple_loss=0.2685, pruned_loss=0.07878, over 972678.92 frames.], batch size: 14, lr: 1.37e-03 2022-05-03 16:56:21,254 INFO [train.py:715] (0/8) Epoch 0, batch 23400, loss[loss=0.2215, simple_loss=0.286, pruned_loss=0.0785, over 4893.00 frames.], tot_loss[loss=0.2119, simple_loss=0.2676, pruned_loss=0.07815, over 972233.36 frames.], batch size: 19, lr: 1.37e-03 2022-05-03 16:57:02,267 INFO [train.py:715] (0/8) Epoch 0, batch 23450, loss[loss=0.1993, simple_loss=0.2533, pruned_loss=0.07261, over 4872.00 frames.], tot_loss[loss=0.2116, simple_loss=0.2673, pruned_loss=0.078, over 971669.00 frames.], batch size: 16, lr: 1.37e-03 2022-05-03 16:57:43,373 INFO [train.py:715] (0/8) Epoch 0, batch 23500, loss[loss=0.2298, simple_loss=0.2857, pruned_loss=0.08692, over 4970.00 frames.], tot_loss[loss=0.2123, simple_loss=0.2679, pruned_loss=0.07832, over 972130.00 frames.], batch size: 14, lr: 1.37e-03 2022-05-03 16:58:23,226 INFO [train.py:715] (0/8) Epoch 0, batch 23550, loss[loss=0.2125, simple_loss=0.2616, pruned_loss=0.08169, over 4964.00 frames.], tot_loss[loss=0.213, simple_loss=0.2686, pruned_loss=0.07872, over 972625.07 frames.], batch size: 35, lr: 1.37e-03 2022-05-03 16:59:04,085 INFO [train.py:715] (0/8) Epoch 0, batch 23600, loss[loss=0.256, simple_loss=0.3167, pruned_loss=0.0977, over 4988.00 frames.], tot_loss[loss=0.213, simple_loss=0.2684, pruned_loss=0.07876, over 971274.04 frames.], batch size: 33, lr: 1.37e-03 2022-05-03 16:59:44,350 INFO [train.py:715] (0/8) Epoch 0, batch 23650, loss[loss=0.241, simple_loss=0.28, pruned_loss=0.101, over 4856.00 frames.], 
tot_loss[loss=0.2127, simple_loss=0.268, pruned_loss=0.07872, over 970980.56 frames.], batch size: 38, lr: 1.36e-03 2022-05-03 17:00:24,467 INFO [train.py:715] (0/8) Epoch 0, batch 23700, loss[loss=0.2156, simple_loss=0.2577, pruned_loss=0.08674, over 4796.00 frames.], tot_loss[loss=0.2131, simple_loss=0.2685, pruned_loss=0.07882, over 971569.46 frames.], batch size: 12, lr: 1.36e-03 2022-05-03 17:01:03,662 INFO [train.py:715] (0/8) Epoch 0, batch 23750, loss[loss=0.156, simple_loss=0.2263, pruned_loss=0.04281, over 4964.00 frames.], tot_loss[loss=0.2122, simple_loss=0.268, pruned_loss=0.07823, over 972291.20 frames.], batch size: 15, lr: 1.36e-03 2022-05-03 17:01:43,663 INFO [train.py:715] (0/8) Epoch 0, batch 23800, loss[loss=0.2211, simple_loss=0.2888, pruned_loss=0.07675, over 4922.00 frames.], tot_loss[loss=0.2126, simple_loss=0.2688, pruned_loss=0.07816, over 972582.60 frames.], batch size: 29, lr: 1.36e-03 2022-05-03 17:02:24,142 INFO [train.py:715] (0/8) Epoch 0, batch 23850, loss[loss=0.2115, simple_loss=0.2629, pruned_loss=0.08009, over 4891.00 frames.], tot_loss[loss=0.2143, simple_loss=0.2705, pruned_loss=0.07902, over 972470.76 frames.], batch size: 17, lr: 1.36e-03 2022-05-03 17:03:03,308 INFO [train.py:715] (0/8) Epoch 0, batch 23900, loss[loss=0.1929, simple_loss=0.2637, pruned_loss=0.06103, over 4912.00 frames.], tot_loss[loss=0.2133, simple_loss=0.2693, pruned_loss=0.07868, over 972312.08 frames.], batch size: 18, lr: 1.36e-03 2022-05-03 17:03:43,453 INFO [train.py:715] (0/8) Epoch 0, batch 23950, loss[loss=0.2092, simple_loss=0.2633, pruned_loss=0.07756, over 4859.00 frames.], tot_loss[loss=0.2124, simple_loss=0.2686, pruned_loss=0.07811, over 972215.90 frames.], batch size: 20, lr: 1.36e-03 2022-05-03 17:04:23,140 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-24000.pt 2022-05-03 17:04:26,569 INFO [train.py:715] (0/8) Epoch 0, batch 24000, loss[loss=0.1832, simple_loss=0.252, pruned_loss=0.0572, over 4891.00 frames.], tot_loss[loss=0.2117, simple_loss=0.2682, pruned_loss=0.07757, over 972502.58 frames.], batch size: 18, lr: 1.35e-03 2022-05-03 17:04:26,570 INFO [train.py:733] (0/8) Computing validation loss 2022-05-03 17:04:40,851 INFO [train.py:742] (0/8) Epoch 0, validation: loss=0.1357, simple_loss=0.2226, pruned_loss=0.02435, over 914524.00 frames. 
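The checkpoint-24000.pt file written above is an ordinary torch.save dictionary, so it can be inspected (or reused to resume a run) outside the training script. A rough sketch follows, assuming the file is the usual dict produced by icefall's checkpoint.py; only the keys are printed because the exact contents (typically model/optimizer state plus bookkeeping counters) are defined by that code.

import torch

# Illustrative only: peek at the batch-level checkpoint saved above.  The exact
# keys are determined by icefall's checkpoint.py, so we just list what is stored.
ckpt = torch.load(
    "pruned_transducer_stateless2/exp/v2/checkpoint-24000.pt", map_location="cpu")
print(sorted(ckpt.keys()))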
2022-05-03 17:05:21,172 INFO [train.py:715] (0/8) Epoch 0, batch 24050, loss[loss=0.191, simple_loss=0.2562, pruned_loss=0.06294, over 4772.00 frames.], tot_loss[loss=0.211, simple_loss=0.2677, pruned_loss=0.07716, over 972127.50 frames.], batch size: 18, lr: 1.35e-03 2022-05-03 17:06:00,596 INFO [train.py:715] (0/8) Epoch 0, batch 24100, loss[loss=0.1856, simple_loss=0.2433, pruned_loss=0.06393, over 4766.00 frames.], tot_loss[loss=0.2103, simple_loss=0.267, pruned_loss=0.07677, over 972323.61 frames.], batch size: 19, lr: 1.35e-03 2022-05-03 17:06:40,580 INFO [train.py:715] (0/8) Epoch 0, batch 24150, loss[loss=0.1981, simple_loss=0.2561, pruned_loss=0.07004, over 4831.00 frames.], tot_loss[loss=0.2105, simple_loss=0.2671, pruned_loss=0.07693, over 972500.97 frames.], batch size: 15, lr: 1.35e-03 2022-05-03 17:07:20,604 INFO [train.py:715] (0/8) Epoch 0, batch 24200, loss[loss=0.23, simple_loss=0.2779, pruned_loss=0.09104, over 4823.00 frames.], tot_loss[loss=0.2095, simple_loss=0.2663, pruned_loss=0.07634, over 972624.29 frames.], batch size: 15, lr: 1.35e-03 2022-05-03 17:08:01,227 INFO [train.py:715] (0/8) Epoch 0, batch 24250, loss[loss=0.2698, simple_loss=0.312, pruned_loss=0.1138, over 4990.00 frames.], tot_loss[loss=0.2095, simple_loss=0.2662, pruned_loss=0.07642, over 972690.40 frames.], batch size: 14, lr: 1.35e-03 2022-05-03 17:08:40,832 INFO [train.py:715] (0/8) Epoch 0, batch 24300, loss[loss=0.248, simple_loss=0.2962, pruned_loss=0.09992, over 4987.00 frames.], tot_loss[loss=0.2115, simple_loss=0.2676, pruned_loss=0.07772, over 972753.90 frames.], batch size: 15, lr: 1.35e-03 2022-05-03 17:09:21,015 INFO [train.py:715] (0/8) Epoch 0, batch 24350, loss[loss=0.2005, simple_loss=0.2614, pruned_loss=0.06983, over 4649.00 frames.], tot_loss[loss=0.2119, simple_loss=0.2682, pruned_loss=0.07786, over 972102.97 frames.], batch size: 13, lr: 1.35e-03 2022-05-03 17:10:01,417 INFO [train.py:715] (0/8) Epoch 0, batch 24400, loss[loss=0.2221, simple_loss=0.2773, pruned_loss=0.08346, over 4902.00 frames.], tot_loss[loss=0.2108, simple_loss=0.267, pruned_loss=0.07735, over 972647.14 frames.], batch size: 17, lr: 1.34e-03 2022-05-03 17:10:40,941 INFO [train.py:715] (0/8) Epoch 0, batch 24450, loss[loss=0.1882, simple_loss=0.2586, pruned_loss=0.05894, over 4936.00 frames.], tot_loss[loss=0.2108, simple_loss=0.2672, pruned_loss=0.07722, over 972055.26 frames.], batch size: 21, lr: 1.34e-03 2022-05-03 17:11:21,048 INFO [train.py:715] (0/8) Epoch 0, batch 24500, loss[loss=0.2173, simple_loss=0.2789, pruned_loss=0.0779, over 4935.00 frames.], tot_loss[loss=0.21, simple_loss=0.2666, pruned_loss=0.07671, over 971557.78 frames.], batch size: 35, lr: 1.34e-03 2022-05-03 17:12:01,321 INFO [train.py:715] (0/8) Epoch 0, batch 24550, loss[loss=0.1803, simple_loss=0.2368, pruned_loss=0.06188, over 4769.00 frames.], tot_loss[loss=0.2109, simple_loss=0.2673, pruned_loss=0.07726, over 971636.57 frames.], batch size: 19, lr: 1.34e-03 2022-05-03 17:12:41,511 INFO [train.py:715] (0/8) Epoch 0, batch 24600, loss[loss=0.2209, simple_loss=0.2926, pruned_loss=0.07467, over 4975.00 frames.], tot_loss[loss=0.212, simple_loss=0.2684, pruned_loss=0.07782, over 971769.09 frames.], batch size: 28, lr: 1.34e-03 2022-05-03 17:13:20,993 INFO [train.py:715] (0/8) Epoch 0, batch 24650, loss[loss=0.196, simple_loss=0.2593, pruned_loss=0.06636, over 4774.00 frames.], tot_loss[loss=0.2103, simple_loss=0.2671, pruned_loss=0.07672, over 971972.39 frames.], batch size: 18, lr: 1.34e-03 2022-05-03 17:14:01,417 INFO [train.py:715] 
(0/8) Epoch 0, batch 24700, loss[loss=0.2527, simple_loss=0.2957, pruned_loss=0.1049, over 4982.00 frames.], tot_loss[loss=0.2115, simple_loss=0.2681, pruned_loss=0.07748, over 972271.23 frames.], batch size: 15, lr: 1.34e-03 2022-05-03 17:14:42,123 INFO [train.py:715] (0/8) Epoch 0, batch 24750, loss[loss=0.2263, simple_loss=0.2791, pruned_loss=0.08678, over 4826.00 frames.], tot_loss[loss=0.2136, simple_loss=0.2694, pruned_loss=0.07896, over 973112.55 frames.], batch size: 13, lr: 1.33e-03 2022-05-03 17:15:21,176 INFO [train.py:715] (0/8) Epoch 0, batch 24800, loss[loss=0.1355, simple_loss=0.1925, pruned_loss=0.0393, over 4776.00 frames.], tot_loss[loss=0.2133, simple_loss=0.2692, pruned_loss=0.0787, over 973286.98 frames.], batch size: 12, lr: 1.33e-03 2022-05-03 17:16:01,313 INFO [train.py:715] (0/8) Epoch 0, batch 24850, loss[loss=0.2323, simple_loss=0.2868, pruned_loss=0.08885, over 4861.00 frames.], tot_loss[loss=0.2118, simple_loss=0.268, pruned_loss=0.07773, over 973299.35 frames.], batch size: 38, lr: 1.33e-03 2022-05-03 17:16:41,587 INFO [train.py:715] (0/8) Epoch 0, batch 24900, loss[loss=0.2092, simple_loss=0.2733, pruned_loss=0.07255, over 4853.00 frames.], tot_loss[loss=0.2115, simple_loss=0.268, pruned_loss=0.07751, over 972976.13 frames.], batch size: 20, lr: 1.33e-03 2022-05-03 17:17:21,632 INFO [train.py:715] (0/8) Epoch 0, batch 24950, loss[loss=0.2363, simple_loss=0.2789, pruned_loss=0.09688, over 4985.00 frames.], tot_loss[loss=0.2112, simple_loss=0.2677, pruned_loss=0.07741, over 973281.78 frames.], batch size: 28, lr: 1.33e-03 2022-05-03 17:18:01,147 INFO [train.py:715] (0/8) Epoch 0, batch 25000, loss[loss=0.2119, simple_loss=0.259, pruned_loss=0.0824, over 4695.00 frames.], tot_loss[loss=0.2106, simple_loss=0.2675, pruned_loss=0.07692, over 973444.44 frames.], batch size: 15, lr: 1.33e-03 2022-05-03 17:18:41,404 INFO [train.py:715] (0/8) Epoch 0, batch 25050, loss[loss=0.1957, simple_loss=0.2598, pruned_loss=0.06583, over 4739.00 frames.], tot_loss[loss=0.2101, simple_loss=0.2671, pruned_loss=0.07653, over 973031.43 frames.], batch size: 19, lr: 1.33e-03 2022-05-03 17:19:21,104 INFO [train.py:715] (0/8) Epoch 0, batch 25100, loss[loss=0.1827, simple_loss=0.2426, pruned_loss=0.06143, over 4792.00 frames.], tot_loss[loss=0.2092, simple_loss=0.2661, pruned_loss=0.0762, over 973846.67 frames.], batch size: 14, lr: 1.33e-03 2022-05-03 17:20:00,603 INFO [train.py:715] (0/8) Epoch 0, batch 25150, loss[loss=0.2456, simple_loss=0.2824, pruned_loss=0.1044, over 4776.00 frames.], tot_loss[loss=0.2094, simple_loss=0.2661, pruned_loss=0.07637, over 974654.60 frames.], batch size: 17, lr: 1.32e-03 2022-05-03 17:20:41,149 INFO [train.py:715] (0/8) Epoch 0, batch 25200, loss[loss=0.2204, simple_loss=0.2798, pruned_loss=0.08045, over 4824.00 frames.], tot_loss[loss=0.2092, simple_loss=0.2662, pruned_loss=0.07613, over 974495.80 frames.], batch size: 15, lr: 1.32e-03 2022-05-03 17:21:21,718 INFO [train.py:715] (0/8) Epoch 0, batch 25250, loss[loss=0.2466, simple_loss=0.298, pruned_loss=0.09762, over 4878.00 frames.], tot_loss[loss=0.2091, simple_loss=0.2662, pruned_loss=0.07598, over 973817.97 frames.], batch size: 22, lr: 1.32e-03 2022-05-03 17:22:02,265 INFO [train.py:715] (0/8) Epoch 0, batch 25300, loss[loss=0.1722, simple_loss=0.2398, pruned_loss=0.05232, over 4804.00 frames.], tot_loss[loss=0.2089, simple_loss=0.266, pruned_loss=0.0759, over 973083.79 frames.], batch size: 24, lr: 1.32e-03 2022-05-03 17:22:42,097 INFO [train.py:715] (0/8) Epoch 0, batch 25350, 
loss[loss=0.1872, simple_loss=0.2411, pruned_loss=0.06664, over 4968.00 frames.], tot_loss[loss=0.2095, simple_loss=0.2666, pruned_loss=0.07619, over 972793.55 frames.], batch size: 24, lr: 1.32e-03 2022-05-03 17:23:22,549 INFO [train.py:715] (0/8) Epoch 0, batch 25400, loss[loss=0.2187, simple_loss=0.2749, pruned_loss=0.08126, over 4872.00 frames.], tot_loss[loss=0.2092, simple_loss=0.2662, pruned_loss=0.07606, over 973166.25 frames.], batch size: 20, lr: 1.32e-03 2022-05-03 17:24:02,724 INFO [train.py:715] (0/8) Epoch 0, batch 25450, loss[loss=0.2017, simple_loss=0.2548, pruned_loss=0.07431, over 4959.00 frames.], tot_loss[loss=0.2083, simple_loss=0.2652, pruned_loss=0.0757, over 972599.61 frames.], batch size: 35, lr: 1.32e-03 2022-05-03 17:24:41,711 INFO [train.py:715] (0/8) Epoch 0, batch 25500, loss[loss=0.1457, simple_loss=0.2141, pruned_loss=0.03863, over 4794.00 frames.], tot_loss[loss=0.2082, simple_loss=0.2651, pruned_loss=0.07567, over 972483.10 frames.], batch size: 12, lr: 1.32e-03 2022-05-03 17:25:22,417 INFO [train.py:715] (0/8) Epoch 0, batch 25550, loss[loss=0.2145, simple_loss=0.2692, pruned_loss=0.07987, over 4848.00 frames.], tot_loss[loss=0.2081, simple_loss=0.2651, pruned_loss=0.07558, over 971894.49 frames.], batch size: 34, lr: 1.31e-03 2022-05-03 17:26:02,030 INFO [train.py:715] (0/8) Epoch 0, batch 25600, loss[loss=0.1946, simple_loss=0.2547, pruned_loss=0.06726, over 4918.00 frames.], tot_loss[loss=0.209, simple_loss=0.2656, pruned_loss=0.07618, over 972296.41 frames.], batch size: 23, lr: 1.31e-03 2022-05-03 17:26:41,735 INFO [train.py:715] (0/8) Epoch 0, batch 25650, loss[loss=0.262, simple_loss=0.3063, pruned_loss=0.1089, over 4752.00 frames.], tot_loss[loss=0.2075, simple_loss=0.2648, pruned_loss=0.0751, over 972249.77 frames.], batch size: 18, lr: 1.31e-03 2022-05-03 17:27:21,453 INFO [train.py:715] (0/8) Epoch 0, batch 25700, loss[loss=0.2206, simple_loss=0.2695, pruned_loss=0.08589, over 4847.00 frames.], tot_loss[loss=0.2077, simple_loss=0.2646, pruned_loss=0.07536, over 971699.65 frames.], batch size: 13, lr: 1.31e-03 2022-05-03 17:28:01,729 INFO [train.py:715] (0/8) Epoch 0, batch 25750, loss[loss=0.1878, simple_loss=0.2542, pruned_loss=0.0607, over 4781.00 frames.], tot_loss[loss=0.2088, simple_loss=0.2654, pruned_loss=0.0761, over 972558.39 frames.], batch size: 17, lr: 1.31e-03 2022-05-03 17:28:41,513 INFO [train.py:715] (0/8) Epoch 0, batch 25800, loss[loss=0.1869, simple_loss=0.2534, pruned_loss=0.06024, over 4826.00 frames.], tot_loss[loss=0.208, simple_loss=0.2645, pruned_loss=0.07576, over 972401.13 frames.], batch size: 26, lr: 1.31e-03 2022-05-03 17:29:20,762 INFO [train.py:715] (0/8) Epoch 0, batch 25850, loss[loss=0.2599, simple_loss=0.3071, pruned_loss=0.1063, over 4905.00 frames.], tot_loss[loss=0.2084, simple_loss=0.2648, pruned_loss=0.07598, over 972425.57 frames.], batch size: 38, lr: 1.31e-03 2022-05-03 17:30:01,475 INFO [train.py:715] (0/8) Epoch 0, batch 25900, loss[loss=0.2086, simple_loss=0.2677, pruned_loss=0.07478, over 4861.00 frames.], tot_loss[loss=0.2063, simple_loss=0.2633, pruned_loss=0.07466, over 971934.92 frames.], batch size: 20, lr: 1.31e-03 2022-05-03 17:30:41,212 INFO [train.py:715] (0/8) Epoch 0, batch 25950, loss[loss=0.1975, simple_loss=0.2652, pruned_loss=0.06487, over 4738.00 frames.], tot_loss[loss=0.2081, simple_loss=0.265, pruned_loss=0.0756, over 972447.18 frames.], batch size: 16, lr: 1.30e-03 2022-05-03 17:31:21,228 INFO [train.py:715] (0/8) Epoch 0, batch 26000, loss[loss=0.2157, simple_loss=0.2795, 
pruned_loss=0.07601, over 4921.00 frames.], tot_loss[loss=0.2088, simple_loss=0.2654, pruned_loss=0.07607, over 972528.60 frames.], batch size: 29, lr: 1.30e-03 2022-05-03 17:32:01,174 INFO [train.py:715] (0/8) Epoch 0, batch 26050, loss[loss=0.1881, simple_loss=0.2561, pruned_loss=0.06008, over 4985.00 frames.], tot_loss[loss=0.2103, simple_loss=0.267, pruned_loss=0.07677, over 972840.27 frames.], batch size: 28, lr: 1.30e-03 2022-05-03 17:32:41,638 INFO [train.py:715] (0/8) Epoch 0, batch 26100, loss[loss=0.2012, simple_loss=0.2664, pruned_loss=0.068, over 4968.00 frames.], tot_loss[loss=0.2106, simple_loss=0.2671, pruned_loss=0.07704, over 972433.46 frames.], batch size: 15, lr: 1.30e-03 2022-05-03 17:33:21,953 INFO [train.py:715] (0/8) Epoch 0, batch 26150, loss[loss=0.1716, simple_loss=0.242, pruned_loss=0.05065, over 4947.00 frames.], tot_loss[loss=0.2092, simple_loss=0.2653, pruned_loss=0.07657, over 972033.92 frames.], batch size: 29, lr: 1.30e-03 2022-05-03 17:34:00,860 INFO [train.py:715] (0/8) Epoch 0, batch 26200, loss[loss=0.1912, simple_loss=0.2605, pruned_loss=0.06095, over 4821.00 frames.], tot_loss[loss=0.2093, simple_loss=0.2654, pruned_loss=0.07666, over 972878.38 frames.], batch size: 27, lr: 1.30e-03 2022-05-03 17:34:41,489 INFO [train.py:715] (0/8) Epoch 0, batch 26250, loss[loss=0.2003, simple_loss=0.2641, pruned_loss=0.06827, over 4979.00 frames.], tot_loss[loss=0.2093, simple_loss=0.2656, pruned_loss=0.07647, over 972769.82 frames.], batch size: 24, lr: 1.30e-03 2022-05-03 17:35:21,440 INFO [train.py:715] (0/8) Epoch 0, batch 26300, loss[loss=0.2277, simple_loss=0.2816, pruned_loss=0.08692, over 4937.00 frames.], tot_loss[loss=0.2098, simple_loss=0.2662, pruned_loss=0.07673, over 973778.13 frames.], batch size: 23, lr: 1.30e-03 2022-05-03 17:36:01,274 INFO [train.py:715] (0/8) Epoch 0, batch 26350, loss[loss=0.2023, simple_loss=0.2478, pruned_loss=0.07841, over 4848.00 frames.], tot_loss[loss=0.2088, simple_loss=0.2654, pruned_loss=0.07603, over 973027.72 frames.], batch size: 34, lr: 1.30e-03 2022-05-03 17:36:41,217 INFO [train.py:715] (0/8) Epoch 0, batch 26400, loss[loss=0.2051, simple_loss=0.2764, pruned_loss=0.06693, over 4943.00 frames.], tot_loss[loss=0.2079, simple_loss=0.2649, pruned_loss=0.07543, over 973384.87 frames.], batch size: 21, lr: 1.29e-03 2022-05-03 17:37:21,342 INFO [train.py:715] (0/8) Epoch 0, batch 26450, loss[loss=0.171, simple_loss=0.2265, pruned_loss=0.05771, over 4984.00 frames.], tot_loss[loss=0.2072, simple_loss=0.2643, pruned_loss=0.07507, over 972275.02 frames.], batch size: 33, lr: 1.29e-03 2022-05-03 17:38:02,053 INFO [train.py:715] (0/8) Epoch 0, batch 26500, loss[loss=0.2698, simple_loss=0.3009, pruned_loss=0.1194, over 4946.00 frames.], tot_loss[loss=0.208, simple_loss=0.2651, pruned_loss=0.07546, over 972256.69 frames.], batch size: 35, lr: 1.29e-03 2022-05-03 17:38:41,413 INFO [train.py:715] (0/8) Epoch 0, batch 26550, loss[loss=0.2179, simple_loss=0.2671, pruned_loss=0.08431, over 4866.00 frames.], tot_loss[loss=0.2077, simple_loss=0.2652, pruned_loss=0.07508, over 972065.97 frames.], batch size: 20, lr: 1.29e-03 2022-05-03 17:39:21,088 INFO [train.py:715] (0/8) Epoch 0, batch 26600, loss[loss=0.2836, simple_loss=0.3082, pruned_loss=0.1295, over 4931.00 frames.], tot_loss[loss=0.2073, simple_loss=0.2652, pruned_loss=0.0747, over 971962.74 frames.], batch size: 18, lr: 1.29e-03 2022-05-03 17:40:01,332 INFO [train.py:715] (0/8) Epoch 0, batch 26650, loss[loss=0.2062, simple_loss=0.2585, pruned_loss=0.07694, over 4835.00 
frames.], tot_loss[loss=0.2069, simple_loss=0.2647, pruned_loss=0.07449, over 972211.73 frames.], batch size: 13, lr: 1.29e-03 2022-05-03 17:40:40,798 INFO [train.py:715] (0/8) Epoch 0, batch 26700, loss[loss=0.2764, simple_loss=0.3195, pruned_loss=0.1166, over 4775.00 frames.], tot_loss[loss=0.2078, simple_loss=0.2655, pruned_loss=0.07507, over 971178.94 frames.], batch size: 19, lr: 1.29e-03 2022-05-03 17:41:20,824 INFO [train.py:715] (0/8) Epoch 0, batch 26750, loss[loss=0.2174, simple_loss=0.2654, pruned_loss=0.08469, over 4971.00 frames.], tot_loss[loss=0.2092, simple_loss=0.2665, pruned_loss=0.0759, over 971933.27 frames.], batch size: 14, lr: 1.29e-03 2022-05-03 17:42:01,249 INFO [train.py:715] (0/8) Epoch 0, batch 26800, loss[loss=0.2316, simple_loss=0.2817, pruned_loss=0.0908, over 4938.00 frames.], tot_loss[loss=0.2098, simple_loss=0.2668, pruned_loss=0.07636, over 972169.74 frames.], batch size: 23, lr: 1.28e-03 2022-05-03 17:42:41,671 INFO [train.py:715] (0/8) Epoch 0, batch 26850, loss[loss=0.2123, simple_loss=0.2696, pruned_loss=0.07747, over 4928.00 frames.], tot_loss[loss=0.2094, simple_loss=0.2668, pruned_loss=0.07596, over 972746.88 frames.], batch size: 23, lr: 1.28e-03 2022-05-03 17:43:21,534 INFO [train.py:715] (0/8) Epoch 0, batch 26900, loss[loss=0.1727, simple_loss=0.2464, pruned_loss=0.0495, over 4886.00 frames.], tot_loss[loss=0.2086, simple_loss=0.2659, pruned_loss=0.07565, over 973349.80 frames.], batch size: 22, lr: 1.28e-03 2022-05-03 17:44:02,268 INFO [train.py:715] (0/8) Epoch 0, batch 26950, loss[loss=0.2483, simple_loss=0.3019, pruned_loss=0.09733, over 4956.00 frames.], tot_loss[loss=0.2077, simple_loss=0.2648, pruned_loss=0.07524, over 973256.06 frames.], batch size: 24, lr: 1.28e-03 2022-05-03 17:44:42,421 INFO [train.py:715] (0/8) Epoch 0, batch 27000, loss[loss=0.2121, simple_loss=0.2692, pruned_loss=0.07754, over 4986.00 frames.], tot_loss[loss=0.2081, simple_loss=0.2653, pruned_loss=0.07544, over 973088.33 frames.], batch size: 14, lr: 1.28e-03 2022-05-03 17:44:42,422 INFO [train.py:733] (0/8) Computing validation loss 2022-05-03 17:44:51,201 INFO [train.py:742] (0/8) Epoch 0, validation: loss=0.1338, simple_loss=0.2208, pruned_loss=0.02337, over 914524.00 frames. 
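The lr column decays smoothly over this stretch (1.48e-03 around batch 20000, 1.28e-03 by batch 27000). A small sketch, assuming an Eden-style schedule with assumed defaults initial_lr=0.003, lr_batches=5000, lr_epochs=4 and the epoch counted from 0, reproduces the printed values to within rounding.

# Sketch of an Eden-style schedule (assumed, not read from this log):
#   lr(batch, epoch) = initial_lr
#                      * ((batch**2 + B**2) / B**2) ** -0.25
#                      * ((epoch**2 + E**2) / E**2) ** -0.25
# with B = lr_batches, E = lr_epochs.
def eden_lr(batch, epoch=0, initial_lr=3e-3, lr_batches=5000, lr_epochs=4):
    batch_factor = ((batch ** 2 + lr_batches ** 2) / lr_batches ** 2) ** -0.25
    epoch_factor = ((epoch ** 2 + lr_epochs ** 2) / lr_epochs ** 2) ** -0.25
    return initial_lr * batch_factor * epoch_factor

print(round(eden_lr(27000), 5))   # 0.00128, cf. "lr: 1.28e-03" at batch 27000 above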
2022-05-03 17:45:31,274 INFO [train.py:715] (0/8) Epoch 0, batch 27050, loss[loss=0.1967, simple_loss=0.2631, pruned_loss=0.06516, over 4794.00 frames.], tot_loss[loss=0.208, simple_loss=0.2652, pruned_loss=0.07539, over 972291.22 frames.], batch size: 24, lr: 1.28e-03 2022-05-03 17:46:10,747 INFO [train.py:715] (0/8) Epoch 0, batch 27100, loss[loss=0.2191, simple_loss=0.2739, pruned_loss=0.08216, over 4841.00 frames.], tot_loss[loss=0.2081, simple_loss=0.2651, pruned_loss=0.07554, over 972844.22 frames.], batch size: 26, lr: 1.28e-03 2022-05-03 17:46:51,333 INFO [train.py:715] (0/8) Epoch 0, batch 27150, loss[loss=0.2108, simple_loss=0.2659, pruned_loss=0.07783, over 4886.00 frames.], tot_loss[loss=0.2069, simple_loss=0.2646, pruned_loss=0.07459, over 972666.22 frames.], batch size: 19, lr: 1.28e-03 2022-05-03 17:47:31,714 INFO [train.py:715] (0/8) Epoch 0, batch 27200, loss[loss=0.1762, simple_loss=0.2429, pruned_loss=0.05475, over 4959.00 frames.], tot_loss[loss=0.2058, simple_loss=0.2641, pruned_loss=0.07373, over 972775.63 frames.], batch size: 24, lr: 1.28e-03 2022-05-03 17:48:11,815 INFO [train.py:715] (0/8) Epoch 0, batch 27250, loss[loss=0.1614, simple_loss=0.2452, pruned_loss=0.0388, over 4786.00 frames.], tot_loss[loss=0.2055, simple_loss=0.2637, pruned_loss=0.07361, over 972325.02 frames.], batch size: 17, lr: 1.27e-03 2022-05-03 17:48:51,960 INFO [train.py:715] (0/8) Epoch 0, batch 27300, loss[loss=0.1788, simple_loss=0.2446, pruned_loss=0.05653, over 4963.00 frames.], tot_loss[loss=0.2054, simple_loss=0.2634, pruned_loss=0.07368, over 971932.31 frames.], batch size: 39, lr: 1.27e-03 2022-05-03 17:49:31,865 INFO [train.py:715] (0/8) Epoch 0, batch 27350, loss[loss=0.2165, simple_loss=0.2729, pruned_loss=0.08009, over 4771.00 frames.], tot_loss[loss=0.2043, simple_loss=0.2624, pruned_loss=0.07313, over 972418.99 frames.], batch size: 14, lr: 1.27e-03 2022-05-03 17:50:11,824 INFO [train.py:715] (0/8) Epoch 0, batch 27400, loss[loss=0.1914, simple_loss=0.2578, pruned_loss=0.06251, over 4924.00 frames.], tot_loss[loss=0.2043, simple_loss=0.262, pruned_loss=0.07327, over 972179.22 frames.], batch size: 23, lr: 1.27e-03 2022-05-03 17:50:51,093 INFO [train.py:715] (0/8) Epoch 0, batch 27450, loss[loss=0.1634, simple_loss=0.2175, pruned_loss=0.05465, over 4842.00 frames.], tot_loss[loss=0.2038, simple_loss=0.2616, pruned_loss=0.07295, over 973011.29 frames.], batch size: 32, lr: 1.27e-03 2022-05-03 17:51:31,243 INFO [train.py:715] (0/8) Epoch 0, batch 27500, loss[loss=0.2395, simple_loss=0.2776, pruned_loss=0.1007, over 4910.00 frames.], tot_loss[loss=0.2046, simple_loss=0.2621, pruned_loss=0.07355, over 972760.56 frames.], batch size: 17, lr: 1.27e-03 2022-05-03 17:52:11,052 INFO [train.py:715] (0/8) Epoch 0, batch 27550, loss[loss=0.1599, simple_loss=0.2252, pruned_loss=0.04728, over 4905.00 frames.], tot_loss[loss=0.2053, simple_loss=0.2627, pruned_loss=0.07393, over 973021.55 frames.], batch size: 22, lr: 1.27e-03 2022-05-03 17:52:50,539 INFO [train.py:715] (0/8) Epoch 0, batch 27600, loss[loss=0.214, simple_loss=0.2727, pruned_loss=0.07765, over 4817.00 frames.], tot_loss[loss=0.206, simple_loss=0.2632, pruned_loss=0.07437, over 972880.79 frames.], batch size: 25, lr: 1.27e-03 2022-05-03 17:53:29,971 INFO [train.py:715] (0/8) Epoch 0, batch 27650, loss[loss=0.2425, simple_loss=0.2921, pruned_loss=0.09648, over 4876.00 frames.], tot_loss[loss=0.2065, simple_loss=0.2641, pruned_loss=0.07442, over 972921.57 frames.], batch size: 16, lr: 1.27e-03 2022-05-03 17:54:09,972 INFO 
[train.py:715] (0/8) Epoch 0, batch 27700, loss[loss=0.2345, simple_loss=0.2825, pruned_loss=0.09326, over 4815.00 frames.], tot_loss[loss=0.2086, simple_loss=0.2658, pruned_loss=0.07573, over 972266.43 frames.], batch size: 14, lr: 1.26e-03 2022-05-03 17:54:50,345 INFO [train.py:715] (0/8) Epoch 0, batch 27750, loss[loss=0.2133, simple_loss=0.2727, pruned_loss=0.07698, over 4758.00 frames.], tot_loss[loss=0.208, simple_loss=0.2654, pruned_loss=0.07525, over 971421.23 frames.], batch size: 19, lr: 1.26e-03 2022-05-03 17:55:30,109 INFO [train.py:715] (0/8) Epoch 0, batch 27800, loss[loss=0.196, simple_loss=0.2575, pruned_loss=0.06726, over 4800.00 frames.], tot_loss[loss=0.2063, simple_loss=0.2638, pruned_loss=0.07439, over 972185.79 frames.], batch size: 21, lr: 1.26e-03 2022-05-03 17:56:10,361 INFO [train.py:715] (0/8) Epoch 0, batch 27850, loss[loss=0.1944, simple_loss=0.251, pruned_loss=0.06884, over 4896.00 frames.], tot_loss[loss=0.2073, simple_loss=0.2645, pruned_loss=0.07505, over 972736.57 frames.], batch size: 19, lr: 1.26e-03 2022-05-03 17:56:49,945 INFO [train.py:715] (0/8) Epoch 0, batch 27900, loss[loss=0.2383, simple_loss=0.292, pruned_loss=0.09226, over 4979.00 frames.], tot_loss[loss=0.2082, simple_loss=0.2652, pruned_loss=0.07559, over 972483.70 frames.], batch size: 28, lr: 1.26e-03 2022-05-03 17:57:29,409 INFO [train.py:715] (0/8) Epoch 0, batch 27950, loss[loss=0.2351, simple_loss=0.2802, pruned_loss=0.09499, over 4909.00 frames.], tot_loss[loss=0.2073, simple_loss=0.2646, pruned_loss=0.07495, over 972324.40 frames.], batch size: 17, lr: 1.26e-03 2022-05-03 17:58:09,433 INFO [train.py:715] (0/8) Epoch 0, batch 28000, loss[loss=0.2027, simple_loss=0.2687, pruned_loss=0.06837, over 4938.00 frames.], tot_loss[loss=0.2087, simple_loss=0.2656, pruned_loss=0.07591, over 971741.00 frames.], batch size: 29, lr: 1.26e-03 2022-05-03 17:58:49,663 INFO [train.py:715] (0/8) Epoch 0, batch 28050, loss[loss=0.2374, simple_loss=0.2855, pruned_loss=0.09466, over 4907.00 frames.], tot_loss[loss=0.2088, simple_loss=0.266, pruned_loss=0.07577, over 972064.86 frames.], batch size: 18, lr: 1.26e-03 2022-05-03 17:59:29,712 INFO [train.py:715] (0/8) Epoch 0, batch 28100, loss[loss=0.2106, simple_loss=0.2801, pruned_loss=0.07059, over 4869.00 frames.], tot_loss[loss=0.207, simple_loss=0.2646, pruned_loss=0.07475, over 972860.10 frames.], batch size: 20, lr: 1.26e-03 2022-05-03 18:00:08,966 INFO [train.py:715] (0/8) Epoch 0, batch 28150, loss[loss=0.1654, simple_loss=0.238, pruned_loss=0.04641, over 4818.00 frames.], tot_loss[loss=0.2065, simple_loss=0.2643, pruned_loss=0.07438, over 973304.91 frames.], batch size: 21, lr: 1.25e-03 2022-05-03 18:00:49,200 INFO [train.py:715] (0/8) Epoch 0, batch 28200, loss[loss=0.2071, simple_loss=0.2635, pruned_loss=0.07533, over 4743.00 frames.], tot_loss[loss=0.2059, simple_loss=0.2634, pruned_loss=0.07419, over 972534.56 frames.], batch size: 12, lr: 1.25e-03 2022-05-03 18:01:28,912 INFO [train.py:715] (0/8) Epoch 0, batch 28250, loss[loss=0.1832, simple_loss=0.2442, pruned_loss=0.0611, over 4937.00 frames.], tot_loss[loss=0.2062, simple_loss=0.264, pruned_loss=0.07425, over 971867.98 frames.], batch size: 23, lr: 1.25e-03 2022-05-03 18:02:07,675 INFO [train.py:715] (0/8) Epoch 0, batch 28300, loss[loss=0.1995, simple_loss=0.2639, pruned_loss=0.06758, over 4962.00 frames.], tot_loss[loss=0.2059, simple_loss=0.264, pruned_loss=0.0739, over 972387.13 frames.], batch size: 25, lr: 1.25e-03 2022-05-03 18:02:48,213 INFO [train.py:715] (0/8) Epoch 0, batch 28350, 
loss[loss=0.2378, simple_loss=0.295, pruned_loss=0.09026, over 4684.00 frames.], tot_loss[loss=0.2046, simple_loss=0.2626, pruned_loss=0.07328, over 971747.32 frames.], batch size: 15, lr: 1.25e-03 2022-05-03 18:03:27,713 INFO [train.py:715] (0/8) Epoch 0, batch 28400, loss[loss=0.2245, simple_loss=0.2809, pruned_loss=0.08407, over 4933.00 frames.], tot_loss[loss=0.2037, simple_loss=0.2617, pruned_loss=0.07282, over 970589.86 frames.], batch size: 21, lr: 1.25e-03 2022-05-03 18:04:07,961 INFO [train.py:715] (0/8) Epoch 0, batch 28450, loss[loss=0.2209, simple_loss=0.2792, pruned_loss=0.08128, over 4789.00 frames.], tot_loss[loss=0.2038, simple_loss=0.2621, pruned_loss=0.07278, over 970790.66 frames.], batch size: 18, lr: 1.25e-03 2022-05-03 18:04:47,635 INFO [train.py:715] (0/8) Epoch 0, batch 28500, loss[loss=0.1716, simple_loss=0.2333, pruned_loss=0.05497, over 4777.00 frames.], tot_loss[loss=0.2042, simple_loss=0.2623, pruned_loss=0.07307, over 969481.72 frames.], batch size: 12, lr: 1.25e-03 2022-05-03 18:05:28,102 INFO [train.py:715] (0/8) Epoch 0, batch 28550, loss[loss=0.2705, simple_loss=0.3082, pruned_loss=0.1164, over 4842.00 frames.], tot_loss[loss=0.2048, simple_loss=0.2628, pruned_loss=0.07338, over 969771.24 frames.], batch size: 15, lr: 1.25e-03 2022-05-03 18:06:07,736 INFO [train.py:715] (0/8) Epoch 0, batch 28600, loss[loss=0.2512, simple_loss=0.3101, pruned_loss=0.09616, over 4892.00 frames.], tot_loss[loss=0.2037, simple_loss=0.2618, pruned_loss=0.07279, over 971325.66 frames.], batch size: 16, lr: 1.24e-03 2022-05-03 18:06:46,954 INFO [train.py:715] (0/8) Epoch 0, batch 28650, loss[loss=0.1837, simple_loss=0.2528, pruned_loss=0.05726, over 4811.00 frames.], tot_loss[loss=0.2039, simple_loss=0.2622, pruned_loss=0.07284, over 970890.24 frames.], batch size: 25, lr: 1.24e-03 2022-05-03 18:07:26,844 INFO [train.py:715] (0/8) Epoch 0, batch 28700, loss[loss=0.1775, simple_loss=0.2325, pruned_loss=0.06125, over 4966.00 frames.], tot_loss[loss=0.2042, simple_loss=0.2628, pruned_loss=0.07274, over 972472.51 frames.], batch size: 24, lr: 1.24e-03 2022-05-03 18:08:06,488 INFO [train.py:715] (0/8) Epoch 0, batch 28750, loss[loss=0.1875, simple_loss=0.2456, pruned_loss=0.06475, over 4835.00 frames.], tot_loss[loss=0.2037, simple_loss=0.2624, pruned_loss=0.07251, over 972331.57 frames.], batch size: 27, lr: 1.24e-03 2022-05-03 18:08:46,802 INFO [train.py:715] (0/8) Epoch 0, batch 28800, loss[loss=0.1819, simple_loss=0.2472, pruned_loss=0.05831, over 4814.00 frames.], tot_loss[loss=0.2046, simple_loss=0.2629, pruned_loss=0.07317, over 972116.02 frames.], batch size: 25, lr: 1.24e-03 2022-05-03 18:09:25,927 INFO [train.py:715] (0/8) Epoch 0, batch 28850, loss[loss=0.2121, simple_loss=0.2727, pruned_loss=0.07572, over 4987.00 frames.], tot_loss[loss=0.2049, simple_loss=0.2631, pruned_loss=0.07334, over 971401.29 frames.], batch size: 31, lr: 1.24e-03 2022-05-03 18:10:05,956 INFO [train.py:715] (0/8) Epoch 0, batch 28900, loss[loss=0.174, simple_loss=0.2395, pruned_loss=0.05423, over 4784.00 frames.], tot_loss[loss=0.2052, simple_loss=0.2632, pruned_loss=0.07362, over 971067.78 frames.], batch size: 14, lr: 1.24e-03 2022-05-03 18:10:45,837 INFO [train.py:715] (0/8) Epoch 0, batch 28950, loss[loss=0.1926, simple_loss=0.2462, pruned_loss=0.06952, over 4859.00 frames.], tot_loss[loss=0.2047, simple_loss=0.2627, pruned_loss=0.0733, over 971605.50 frames.], batch size: 30, lr: 1.24e-03 2022-05-03 18:11:24,710 INFO [train.py:715] (0/8) Epoch 0, batch 29000, loss[loss=0.2167, 
simple_loss=0.2647, pruned_loss=0.08439, over 4647.00 frames.], tot_loss[loss=0.2045, simple_loss=0.2623, pruned_loss=0.07342, over 972145.66 frames.], batch size: 13, lr: 1.24e-03 2022-05-03 18:12:05,309 INFO [train.py:715] (0/8) Epoch 0, batch 29050, loss[loss=0.199, simple_loss=0.2622, pruned_loss=0.06788, over 4828.00 frames.], tot_loss[loss=0.2042, simple_loss=0.2621, pruned_loss=0.0732, over 972462.38 frames.], batch size: 26, lr: 1.24e-03 2022-05-03 18:12:45,445 INFO [train.py:715] (0/8) Epoch 0, batch 29100, loss[loss=0.188, simple_loss=0.2512, pruned_loss=0.06245, over 4841.00 frames.], tot_loss[loss=0.2036, simple_loss=0.2617, pruned_loss=0.0728, over 972753.11 frames.], batch size: 32, lr: 1.23e-03 2022-05-03 18:13:25,071 INFO [train.py:715] (0/8) Epoch 0, batch 29150, loss[loss=0.2167, simple_loss=0.2755, pruned_loss=0.07899, over 4817.00 frames.], tot_loss[loss=0.2037, simple_loss=0.2613, pruned_loss=0.07299, over 972475.39 frames.], batch size: 15, lr: 1.23e-03 2022-05-03 18:14:04,266 INFO [train.py:715] (0/8) Epoch 0, batch 29200, loss[loss=0.2417, simple_loss=0.2856, pruned_loss=0.09892, over 4795.00 frames.], tot_loss[loss=0.2027, simple_loss=0.2605, pruned_loss=0.07245, over 973347.12 frames.], batch size: 24, lr: 1.23e-03 2022-05-03 18:14:44,211 INFO [train.py:715] (0/8) Epoch 0, batch 29250, loss[loss=0.1873, simple_loss=0.2414, pruned_loss=0.06665, over 4639.00 frames.], tot_loss[loss=0.2013, simple_loss=0.2591, pruned_loss=0.07172, over 972623.39 frames.], batch size: 13, lr: 1.23e-03 2022-05-03 18:15:24,221 INFO [train.py:715] (0/8) Epoch 0, batch 29300, loss[loss=0.1983, simple_loss=0.2546, pruned_loss=0.07099, over 4790.00 frames.], tot_loss[loss=0.2031, simple_loss=0.2608, pruned_loss=0.07271, over 972100.79 frames.], batch size: 18, lr: 1.23e-03 2022-05-03 18:16:04,644 INFO [train.py:715] (0/8) Epoch 0, batch 29350, loss[loss=0.219, simple_loss=0.2711, pruned_loss=0.0834, over 4774.00 frames.], tot_loss[loss=0.2033, simple_loss=0.2612, pruned_loss=0.07266, over 972070.80 frames.], batch size: 18, lr: 1.23e-03 2022-05-03 18:16:44,085 INFO [train.py:715] (0/8) Epoch 0, batch 29400, loss[loss=0.1683, simple_loss=0.2313, pruned_loss=0.05266, over 4945.00 frames.], tot_loss[loss=0.203, simple_loss=0.2613, pruned_loss=0.07236, over 972838.80 frames.], batch size: 29, lr: 1.23e-03 2022-05-03 18:17:23,557 INFO [train.py:715] (0/8) Epoch 0, batch 29450, loss[loss=0.1659, simple_loss=0.2373, pruned_loss=0.04719, over 4967.00 frames.], tot_loss[loss=0.2021, simple_loss=0.2606, pruned_loss=0.07181, over 972871.70 frames.], batch size: 15, lr: 1.23e-03 2022-05-03 18:18:03,751 INFO [train.py:715] (0/8) Epoch 0, batch 29500, loss[loss=0.1493, simple_loss=0.2167, pruned_loss=0.04093, over 4822.00 frames.], tot_loss[loss=0.204, simple_loss=0.2619, pruned_loss=0.07303, over 971687.59 frames.], batch size: 26, lr: 1.23e-03 2022-05-03 18:18:42,862 INFO [train.py:715] (0/8) Epoch 0, batch 29550, loss[loss=0.2008, simple_loss=0.2653, pruned_loss=0.06813, over 4798.00 frames.], tot_loss[loss=0.2057, simple_loss=0.2633, pruned_loss=0.07403, over 970962.31 frames.], batch size: 21, lr: 1.23e-03 2022-05-03 18:19:23,023 INFO [train.py:715] (0/8) Epoch 0, batch 29600, loss[loss=0.2379, simple_loss=0.2924, pruned_loss=0.09174, over 4770.00 frames.], tot_loss[loss=0.2062, simple_loss=0.2638, pruned_loss=0.07427, over 970241.34 frames.], batch size: 18, lr: 1.22e-03 2022-05-03 18:20:02,964 INFO [train.py:715] (0/8) Epoch 0, batch 29650, loss[loss=0.1618, simple_loss=0.2166, 
pruned_loss=0.05355, over 4840.00 frames.], tot_loss[loss=0.2075, simple_loss=0.2648, pruned_loss=0.07513, over 971597.93 frames.], batch size: 12, lr: 1.22e-03 2022-05-03 18:20:42,829 INFO [train.py:715] (0/8) Epoch 0, batch 29700, loss[loss=0.2125, simple_loss=0.2652, pruned_loss=0.07992, over 4721.00 frames.], tot_loss[loss=0.2077, simple_loss=0.2648, pruned_loss=0.07533, over 971013.50 frames.], batch size: 12, lr: 1.22e-03 2022-05-03 18:21:23,327 INFO [train.py:715] (0/8) Epoch 0, batch 29750, loss[loss=0.1318, simple_loss=0.1962, pruned_loss=0.03371, over 4853.00 frames.], tot_loss[loss=0.2059, simple_loss=0.2635, pruned_loss=0.07422, over 970939.69 frames.], batch size: 12, lr: 1.22e-03 2022-05-03 18:22:03,154 INFO [train.py:715] (0/8) Epoch 0, batch 29800, loss[loss=0.195, simple_loss=0.2576, pruned_loss=0.06622, over 4915.00 frames.], tot_loss[loss=0.2054, simple_loss=0.2632, pruned_loss=0.07379, over 971359.71 frames.], batch size: 18, lr: 1.22e-03 2022-05-03 18:22:44,057 INFO [train.py:715] (0/8) Epoch 0, batch 29850, loss[loss=0.1992, simple_loss=0.2505, pruned_loss=0.07397, over 4972.00 frames.], tot_loss[loss=0.2057, simple_loss=0.2634, pruned_loss=0.07397, over 971673.43 frames.], batch size: 25, lr: 1.22e-03 2022-05-03 18:23:23,991 INFO [train.py:715] (0/8) Epoch 0, batch 29900, loss[loss=0.1821, simple_loss=0.2462, pruned_loss=0.05905, over 4829.00 frames.], tot_loss[loss=0.205, simple_loss=0.2629, pruned_loss=0.07354, over 971550.96 frames.], batch size: 26, lr: 1.22e-03 2022-05-03 18:24:03,886 INFO [train.py:715] (0/8) Epoch 0, batch 29950, loss[loss=0.1694, simple_loss=0.2375, pruned_loss=0.05063, over 4802.00 frames.], tot_loss[loss=0.2047, simple_loss=0.2629, pruned_loss=0.07325, over 971439.72 frames.], batch size: 21, lr: 1.22e-03 2022-05-03 18:24:43,770 INFO [train.py:715] (0/8) Epoch 0, batch 30000, loss[loss=0.1695, simple_loss=0.2361, pruned_loss=0.0514, over 4978.00 frames.], tot_loss[loss=0.2037, simple_loss=0.262, pruned_loss=0.07272, over 972028.44 frames.], batch size: 14, lr: 1.22e-03 2022-05-03 18:24:43,771 INFO [train.py:733] (0/8) Computing validation loss 2022-05-03 18:25:00,381 INFO [train.py:742] (0/8) Epoch 0, validation: loss=0.1316, simple_loss=0.2189, pruned_loss=0.02213, over 914524.00 frames. 
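The per-batch wall-clock cost on the logging rank can be read straight off the timestamps: consecutive entries are 50 batches apart and roughly 40 s apart. A back-of-the-envelope check, using two timestamps copied from the batch 29950 and batch 30000 entries above:

from datetime import datetime

# Rough throughput estimate from two log timestamps (rank 0 only); the values
# are copied verbatim from the batch 29950 and batch 30000 entries above.
fmt = "%Y-%m-%d %H:%M:%S,%f"
t0 = datetime.strptime("2022-05-03 18:24:03,886", fmt)
t1 = datetime.strptime("2022-05-03 18:24:43,770", fmt)
print((t1 - t0).total_seconds() / 50)   # ~0.80 s per batch on this rank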
2022-05-03 18:25:40,686 INFO [train.py:715] (0/8) Epoch 0, batch 30050, loss[loss=0.1909, simple_loss=0.2492, pruned_loss=0.06633, over 4907.00 frames.], tot_loss[loss=0.2036, simple_loss=0.2618, pruned_loss=0.07272, over 972545.40 frames.], batch size: 17, lr: 1.22e-03 2022-05-03 18:26:21,237 INFO [train.py:715] (0/8) Epoch 0, batch 30100, loss[loss=0.1591, simple_loss=0.2317, pruned_loss=0.04327, over 4769.00 frames.], tot_loss[loss=0.2041, simple_loss=0.2621, pruned_loss=0.07299, over 972870.56 frames.], batch size: 18, lr: 1.21e-03 2022-05-03 18:27:01,918 INFO [train.py:715] (0/8) Epoch 0, batch 30150, loss[loss=0.216, simple_loss=0.2955, pruned_loss=0.0683, over 4983.00 frames.], tot_loss[loss=0.2043, simple_loss=0.2625, pruned_loss=0.07305, over 972840.59 frames.], batch size: 15, lr: 1.21e-03 2022-05-03 18:27:42,054 INFO [train.py:715] (0/8) Epoch 0, batch 30200, loss[loss=0.1942, simple_loss=0.2684, pruned_loss=0.06002, over 4958.00 frames.], tot_loss[loss=0.2035, simple_loss=0.262, pruned_loss=0.07245, over 973480.08 frames.], batch size: 15, lr: 1.21e-03 2022-05-03 18:28:22,540 INFO [train.py:715] (0/8) Epoch 0, batch 30250, loss[loss=0.2276, simple_loss=0.2851, pruned_loss=0.08501, over 4910.00 frames.], tot_loss[loss=0.2024, simple_loss=0.2616, pruned_loss=0.07163, over 973392.49 frames.], batch size: 17, lr: 1.21e-03 2022-05-03 18:29:02,644 INFO [train.py:715] (0/8) Epoch 0, batch 30300, loss[loss=0.2044, simple_loss=0.2672, pruned_loss=0.07073, over 4917.00 frames.], tot_loss[loss=0.202, simple_loss=0.2611, pruned_loss=0.07149, over 973469.91 frames.], batch size: 17, lr: 1.21e-03 2022-05-03 18:29:43,075 INFO [train.py:715] (0/8) Epoch 0, batch 30350, loss[loss=0.2369, simple_loss=0.2824, pruned_loss=0.09566, over 4940.00 frames.], tot_loss[loss=0.2019, simple_loss=0.261, pruned_loss=0.07139, over 973425.29 frames.], batch size: 29, lr: 1.21e-03 2022-05-03 18:30:23,202 INFO [train.py:715] (0/8) Epoch 0, batch 30400, loss[loss=0.1962, simple_loss=0.2624, pruned_loss=0.06499, over 4731.00 frames.], tot_loss[loss=0.2033, simple_loss=0.262, pruned_loss=0.07231, over 973109.57 frames.], batch size: 15, lr: 1.21e-03 2022-05-03 18:31:02,971 INFO [train.py:715] (0/8) Epoch 0, batch 30450, loss[loss=0.16, simple_loss=0.2224, pruned_loss=0.04881, over 4907.00 frames.], tot_loss[loss=0.2023, simple_loss=0.2613, pruned_loss=0.07161, over 973745.59 frames.], batch size: 18, lr: 1.21e-03 2022-05-03 18:31:42,723 INFO [train.py:715] (0/8) Epoch 0, batch 30500, loss[loss=0.175, simple_loss=0.244, pruned_loss=0.05298, over 4879.00 frames.], tot_loss[loss=0.2021, simple_loss=0.2612, pruned_loss=0.07156, over 974103.67 frames.], batch size: 13, lr: 1.21e-03 2022-05-03 18:32:22,645 INFO [train.py:715] (0/8) Epoch 0, batch 30550, loss[loss=0.2324, simple_loss=0.2878, pruned_loss=0.0885, over 4898.00 frames.], tot_loss[loss=0.2009, simple_loss=0.2602, pruned_loss=0.0708, over 973597.22 frames.], batch size: 17, lr: 1.21e-03 2022-05-03 18:33:01,763 INFO [train.py:715] (0/8) Epoch 0, batch 30600, loss[loss=0.1891, simple_loss=0.2555, pruned_loss=0.06137, over 4984.00 frames.], tot_loss[loss=0.2008, simple_loss=0.2602, pruned_loss=0.07074, over 973392.14 frames.], batch size: 14, lr: 1.20e-03 2022-05-03 18:33:41,707 INFO [train.py:715] (0/8) Epoch 0, batch 30650, loss[loss=0.2175, simple_loss=0.2776, pruned_loss=0.07869, over 4979.00 frames.], tot_loss[loss=0.2014, simple_loss=0.2611, pruned_loss=0.07086, over 972684.32 frames.], batch size: 39, lr: 1.20e-03 2022-05-03 18:34:21,519 INFO [train.py:715] 
(0/8) Epoch 0, batch 30700, loss[loss=0.2016, simple_loss=0.2709, pruned_loss=0.06613, over 4912.00 frames.], tot_loss[loss=0.2008, simple_loss=0.2605, pruned_loss=0.07054, over 972188.92 frames.], batch size: 39, lr: 1.20e-03 2022-05-03 18:35:01,624 INFO [train.py:715] (0/8) Epoch 0, batch 30750, loss[loss=0.2035, simple_loss=0.258, pruned_loss=0.07447, over 4906.00 frames.], tot_loss[loss=0.2013, simple_loss=0.2609, pruned_loss=0.07092, over 972699.49 frames.], batch size: 17, lr: 1.20e-03 2022-05-03 18:35:40,970 INFO [train.py:715] (0/8) Epoch 0, batch 30800, loss[loss=0.1718, simple_loss=0.2411, pruned_loss=0.05125, over 4970.00 frames.], tot_loss[loss=0.2016, simple_loss=0.2614, pruned_loss=0.07097, over 972051.97 frames.], batch size: 24, lr: 1.20e-03 2022-05-03 18:36:21,307 INFO [train.py:715] (0/8) Epoch 0, batch 30850, loss[loss=0.1811, simple_loss=0.2364, pruned_loss=0.06291, over 4971.00 frames.], tot_loss[loss=0.2015, simple_loss=0.2611, pruned_loss=0.07095, over 971928.56 frames.], batch size: 14, lr: 1.20e-03 2022-05-03 18:37:01,147 INFO [train.py:715] (0/8) Epoch 0, batch 30900, loss[loss=0.2155, simple_loss=0.2728, pruned_loss=0.07911, over 4807.00 frames.], tot_loss[loss=0.2035, simple_loss=0.2625, pruned_loss=0.07227, over 972305.49 frames.], batch size: 25, lr: 1.20e-03 2022-05-03 18:37:40,866 INFO [train.py:715] (0/8) Epoch 0, batch 30950, loss[loss=0.2545, simple_loss=0.3015, pruned_loss=0.1038, over 4941.00 frames.], tot_loss[loss=0.2027, simple_loss=0.2615, pruned_loss=0.07195, over 972496.32 frames.], batch size: 39, lr: 1.20e-03 2022-05-03 18:38:20,952 INFO [train.py:715] (0/8) Epoch 0, batch 31000, loss[loss=0.1559, simple_loss=0.2272, pruned_loss=0.04231, over 4917.00 frames.], tot_loss[loss=0.2009, simple_loss=0.2603, pruned_loss=0.07073, over 973225.39 frames.], batch size: 23, lr: 1.20e-03 2022-05-03 18:39:00,969 INFO [train.py:715] (0/8) Epoch 0, batch 31050, loss[loss=0.1979, simple_loss=0.259, pruned_loss=0.06847, over 4906.00 frames.], tot_loss[loss=0.2006, simple_loss=0.2601, pruned_loss=0.07049, over 973020.51 frames.], batch size: 17, lr: 1.20e-03 2022-05-03 18:39:40,379 INFO [train.py:715] (0/8) Epoch 0, batch 31100, loss[loss=0.1914, simple_loss=0.2451, pruned_loss=0.0689, over 4750.00 frames.], tot_loss[loss=0.1995, simple_loss=0.2591, pruned_loss=0.06995, over 972295.79 frames.], batch size: 19, lr: 1.20e-03 2022-05-03 18:40:19,537 INFO [train.py:715] (0/8) Epoch 0, batch 31150, loss[loss=0.2223, simple_loss=0.2763, pruned_loss=0.0842, over 4899.00 frames.], tot_loss[loss=0.2015, simple_loss=0.2611, pruned_loss=0.07099, over 972412.54 frames.], batch size: 17, lr: 1.19e-03 2022-05-03 18:40:59,616 INFO [train.py:715] (0/8) Epoch 0, batch 31200, loss[loss=0.1983, simple_loss=0.2534, pruned_loss=0.07162, over 4869.00 frames.], tot_loss[loss=0.2024, simple_loss=0.2611, pruned_loss=0.0718, over 971905.81 frames.], batch size: 12, lr: 1.19e-03 2022-05-03 18:41:39,408 INFO [train.py:715] (0/8) Epoch 0, batch 31250, loss[loss=0.2084, simple_loss=0.2762, pruned_loss=0.0703, over 4972.00 frames.], tot_loss[loss=0.2024, simple_loss=0.261, pruned_loss=0.07188, over 972322.62 frames.], batch size: 24, lr: 1.19e-03 2022-05-03 18:42:18,880 INFO [train.py:715] (0/8) Epoch 0, batch 31300, loss[loss=0.1994, simple_loss=0.2651, pruned_loss=0.06684, over 4839.00 frames.], tot_loss[loss=0.2021, simple_loss=0.2607, pruned_loss=0.07169, over 972698.96 frames.], batch size: 26, lr: 1.19e-03 2022-05-03 18:42:59,213 INFO [train.py:715] (0/8) Epoch 0, batch 31350, 
loss[loss=0.2183, simple_loss=0.283, pruned_loss=0.07678, over 4975.00 frames.], tot_loss[loss=0.2026, simple_loss=0.2616, pruned_loss=0.07185, over 972589.89 frames.], batch size: 15, lr: 1.19e-03 2022-05-03 18:43:38,899 INFO [train.py:715] (0/8) Epoch 0, batch 31400, loss[loss=0.2112, simple_loss=0.2672, pruned_loss=0.07757, over 4831.00 frames.], tot_loss[loss=0.2021, simple_loss=0.2612, pruned_loss=0.07155, over 972196.57 frames.], batch size: 13, lr: 1.19e-03 2022-05-03 18:44:18,176 INFO [train.py:715] (0/8) Epoch 0, batch 31450, loss[loss=0.2117, simple_loss=0.2679, pruned_loss=0.07779, over 4978.00 frames.], tot_loss[loss=0.2031, simple_loss=0.262, pruned_loss=0.07207, over 972857.01 frames.], batch size: 35, lr: 1.19e-03 2022-05-03 18:44:57,277 INFO [train.py:715] (0/8) Epoch 0, batch 31500, loss[loss=0.1841, simple_loss=0.243, pruned_loss=0.06258, over 4993.00 frames.], tot_loss[loss=0.2029, simple_loss=0.2619, pruned_loss=0.07194, over 973440.53 frames.], batch size: 14, lr: 1.19e-03 2022-05-03 18:45:37,325 INFO [train.py:715] (0/8) Epoch 0, batch 31550, loss[loss=0.1798, simple_loss=0.2461, pruned_loss=0.05674, over 4884.00 frames.], tot_loss[loss=0.2031, simple_loss=0.2618, pruned_loss=0.07214, over 973264.26 frames.], batch size: 19, lr: 1.19e-03 2022-05-03 18:46:17,103 INFO [train.py:715] (0/8) Epoch 0, batch 31600, loss[loss=0.1772, simple_loss=0.2449, pruned_loss=0.05471, over 4796.00 frames.], tot_loss[loss=0.2018, simple_loss=0.2605, pruned_loss=0.07149, over 973603.07 frames.], batch size: 17, lr: 1.19e-03 2022-05-03 18:46:56,339 INFO [train.py:715] (0/8) Epoch 0, batch 31650, loss[loss=0.1938, simple_loss=0.2498, pruned_loss=0.06885, over 4817.00 frames.], tot_loss[loss=0.2027, simple_loss=0.2613, pruned_loss=0.07205, over 973336.37 frames.], batch size: 14, lr: 1.19e-03 2022-05-03 18:47:36,250 INFO [train.py:715] (0/8) Epoch 0, batch 31700, loss[loss=0.2101, simple_loss=0.2742, pruned_loss=0.07297, over 4794.00 frames.], tot_loss[loss=0.203, simple_loss=0.2615, pruned_loss=0.07227, over 972613.08 frames.], batch size: 24, lr: 1.18e-03 2022-05-03 18:48:16,474 INFO [train.py:715] (0/8) Epoch 0, batch 31750, loss[loss=0.2185, simple_loss=0.2803, pruned_loss=0.07839, over 4961.00 frames.], tot_loss[loss=0.2047, simple_loss=0.2629, pruned_loss=0.07325, over 972553.11 frames.], batch size: 39, lr: 1.18e-03 2022-05-03 18:48:56,205 INFO [train.py:715] (0/8) Epoch 0, batch 31800, loss[loss=0.1842, simple_loss=0.247, pruned_loss=0.06066, over 4857.00 frames.], tot_loss[loss=0.2048, simple_loss=0.2629, pruned_loss=0.07331, over 973281.65 frames.], batch size: 15, lr: 1.18e-03 2022-05-03 18:49:35,467 INFO [train.py:715] (0/8) Epoch 0, batch 31850, loss[loss=0.1918, simple_loss=0.2642, pruned_loss=0.0597, over 4905.00 frames.], tot_loss[loss=0.2039, simple_loss=0.2623, pruned_loss=0.07276, over 973706.25 frames.], batch size: 19, lr: 1.18e-03 2022-05-03 18:50:15,969 INFO [train.py:715] (0/8) Epoch 0, batch 31900, loss[loss=0.1695, simple_loss=0.2397, pruned_loss=0.04966, over 4837.00 frames.], tot_loss[loss=0.2028, simple_loss=0.2615, pruned_loss=0.07205, over 973000.91 frames.], batch size: 15, lr: 1.18e-03 2022-05-03 18:50:55,676 INFO [train.py:715] (0/8) Epoch 0, batch 31950, loss[loss=0.206, simple_loss=0.2759, pruned_loss=0.06807, over 4805.00 frames.], tot_loss[loss=0.2029, simple_loss=0.2616, pruned_loss=0.07214, over 972778.29 frames.], batch size: 25, lr: 1.18e-03 2022-05-03 18:51:34,289 INFO [checkpoint.py:70] (0/8) Saving checkpoint to 
pruned_transducer_stateless2/exp/v2/checkpoint-32000.pt 2022-05-03 18:51:37,234 INFO [train.py:715] (0/8) Epoch 0, batch 32000, loss[loss=0.2234, simple_loss=0.2886, pruned_loss=0.07912, over 4917.00 frames.], tot_loss[loss=0.2026, simple_loss=0.2616, pruned_loss=0.07182, over 971738.60 frames.], batch size: 29, lr: 1.18e-03 2022-05-03 18:52:17,389 INFO [train.py:715] (0/8) Epoch 0, batch 32050, loss[loss=0.2256, simple_loss=0.2789, pruned_loss=0.08619, over 4956.00 frames.], tot_loss[loss=0.2029, simple_loss=0.2618, pruned_loss=0.07201, over 972299.18 frames.], batch size: 21, lr: 1.18e-03 2022-05-03 18:52:57,281 INFO [train.py:715] (0/8) Epoch 0, batch 32100, loss[loss=0.1754, simple_loss=0.2464, pruned_loss=0.05223, over 4886.00 frames.], tot_loss[loss=0.2014, simple_loss=0.2607, pruned_loss=0.07108, over 972221.37 frames.], batch size: 19, lr: 1.18e-03 2022-05-03 18:53:36,631 INFO [train.py:715] (0/8) Epoch 0, batch 32150, loss[loss=0.1758, simple_loss=0.235, pruned_loss=0.05835, over 4823.00 frames.], tot_loss[loss=0.2008, simple_loss=0.26, pruned_loss=0.07078, over 972910.42 frames.], batch size: 15, lr: 1.18e-03 2022-05-03 18:54:15,811 INFO [train.py:715] (0/8) Epoch 0, batch 32200, loss[loss=0.168, simple_loss=0.2309, pruned_loss=0.05253, over 4752.00 frames.], tot_loss[loss=0.2001, simple_loss=0.2593, pruned_loss=0.07041, over 972101.30 frames.], batch size: 16, lr: 1.18e-03 2022-05-03 18:54:55,966 INFO [train.py:715] (0/8) Epoch 0, batch 32250, loss[loss=0.155, simple_loss=0.226, pruned_loss=0.04197, over 4974.00 frames.], tot_loss[loss=0.2007, simple_loss=0.2604, pruned_loss=0.07053, over 972633.87 frames.], batch size: 14, lr: 1.17e-03 2022-05-03 18:55:35,812 INFO [train.py:715] (0/8) Epoch 0, batch 32300, loss[loss=0.1384, simple_loss=0.197, pruned_loss=0.03997, over 4796.00 frames.], tot_loss[loss=0.201, simple_loss=0.2604, pruned_loss=0.07078, over 972420.17 frames.], batch size: 12, lr: 1.17e-03 2022-05-03 18:56:15,326 INFO [train.py:715] (0/8) Epoch 0, batch 32350, loss[loss=0.2308, simple_loss=0.2779, pruned_loss=0.09186, over 4807.00 frames.], tot_loss[loss=0.2011, simple_loss=0.2602, pruned_loss=0.07101, over 972743.94 frames.], batch size: 15, lr: 1.17e-03 2022-05-03 18:56:55,317 INFO [train.py:715] (0/8) Epoch 0, batch 32400, loss[loss=0.1652, simple_loss=0.23, pruned_loss=0.05025, over 4902.00 frames.], tot_loss[loss=0.1993, simple_loss=0.2585, pruned_loss=0.07008, over 973235.16 frames.], batch size: 17, lr: 1.17e-03 2022-05-03 18:57:35,392 INFO [train.py:715] (0/8) Epoch 0, batch 32450, loss[loss=0.2485, simple_loss=0.293, pruned_loss=0.102, over 4911.00 frames.], tot_loss[loss=0.1993, simple_loss=0.2583, pruned_loss=0.07016, over 973216.18 frames.], batch size: 18, lr: 1.17e-03 2022-05-03 18:58:15,204 INFO [train.py:715] (0/8) Epoch 0, batch 32500, loss[loss=0.1829, simple_loss=0.2526, pruned_loss=0.05656, over 4891.00 frames.], tot_loss[loss=0.198, simple_loss=0.2573, pruned_loss=0.06934, over 972778.66 frames.], batch size: 19, lr: 1.17e-03 2022-05-03 18:58:54,511 INFO [train.py:715] (0/8) Epoch 0, batch 32550, loss[loss=0.2449, simple_loss=0.2875, pruned_loss=0.1011, over 4903.00 frames.], tot_loss[loss=0.1982, simple_loss=0.2579, pruned_loss=0.06925, over 972121.75 frames.], batch size: 32, lr: 1.17e-03 2022-05-03 18:59:34,023 INFO [train.py:715] (0/8) Epoch 0, batch 32600, loss[loss=0.1758, simple_loss=0.2436, pruned_loss=0.05403, over 4970.00 frames.], tot_loss[loss=0.1972, simple_loss=0.2577, pruned_loss=0.06836, over 971941.77 frames.], batch size: 15, lr: 
1.17e-03 2022-05-03 19:00:13,284 INFO [train.py:715] (0/8) Epoch 0, batch 32650, loss[loss=0.1759, simple_loss=0.2428, pruned_loss=0.05452, over 4838.00 frames.], tot_loss[loss=0.1972, simple_loss=0.2578, pruned_loss=0.06832, over 972170.42 frames.], batch size: 15, lr: 1.17e-03 2022-05-03 19:00:52,619 INFO [train.py:715] (0/8) Epoch 0, batch 32700, loss[loss=0.2104, simple_loss=0.2642, pruned_loss=0.07825, over 4649.00 frames.], tot_loss[loss=0.1975, simple_loss=0.258, pruned_loss=0.0685, over 971816.31 frames.], batch size: 13, lr: 1.17e-03 2022-05-03 19:01:32,101 INFO [train.py:715] (0/8) Epoch 0, batch 32750, loss[loss=0.1398, simple_loss=0.2068, pruned_loss=0.03643, over 4750.00 frames.], tot_loss[loss=0.1978, simple_loss=0.2584, pruned_loss=0.06866, over 972665.65 frames.], batch size: 12, lr: 1.17e-03 2022-05-03 19:02:12,130 INFO [train.py:715] (0/8) Epoch 0, batch 32800, loss[loss=0.2266, simple_loss=0.2989, pruned_loss=0.07709, over 4927.00 frames.], tot_loss[loss=0.1976, simple_loss=0.2579, pruned_loss=0.06861, over 972429.83 frames.], batch size: 29, lr: 1.16e-03 2022-05-03 19:02:51,640 INFO [train.py:715] (0/8) Epoch 0, batch 32850, loss[loss=0.2156, simple_loss=0.267, pruned_loss=0.0821, over 4829.00 frames.], tot_loss[loss=0.198, simple_loss=0.258, pruned_loss=0.06898, over 971954.33 frames.], batch size: 27, lr: 1.16e-03 2022-05-03 19:03:31,122 INFO [train.py:715] (0/8) Epoch 0, batch 32900, loss[loss=0.2104, simple_loss=0.2724, pruned_loss=0.07423, over 4847.00 frames.], tot_loss[loss=0.198, simple_loss=0.2583, pruned_loss=0.06881, over 971197.33 frames.], batch size: 30, lr: 1.16e-03 2022-05-03 19:04:11,182 INFO [train.py:715] (0/8) Epoch 0, batch 32950, loss[loss=0.1895, simple_loss=0.2537, pruned_loss=0.06259, over 4935.00 frames.], tot_loss[loss=0.1978, simple_loss=0.2582, pruned_loss=0.06871, over 971819.85 frames.], batch size: 18, lr: 1.16e-03 2022-05-03 19:04:50,685 INFO [train.py:715] (0/8) Epoch 0, batch 33000, loss[loss=0.1743, simple_loss=0.2334, pruned_loss=0.05759, over 4790.00 frames.], tot_loss[loss=0.1987, simple_loss=0.2588, pruned_loss=0.06935, over 971846.13 frames.], batch size: 18, lr: 1.16e-03 2022-05-03 19:04:50,686 INFO [train.py:733] (0/8) Computing validation loss 2022-05-03 19:05:00,797 INFO [train.py:742] (0/8) Epoch 0, validation: loss=0.1303, simple_loss=0.2174, pruned_loss=0.02158, over 914524.00 frames. 
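The per-batch `loss` figures in these entries recombine from the two printed components. Assuming a 0.5 weight on `simple_loss` (an inference from the printed numbers; the log itself never states how the parts are combined), the validation entry just logged works out as 0.5 × 0.2174 + 0.02158 ≈ 0.1303. A minimal sketch of that recombination:

```python
# A minimal sketch, assuming the 0.5 weight inferred from the printed numbers;
# the log itself does not state how `loss` is combined from its parts.
SIMPLE_LOSS_WEIGHT = 0.5  # assumed, not stated in the log

def combined_loss(simple_loss: float, pruned_loss: float) -> float:
    """Recombine the two printed components into the single `loss` figure."""
    return SIMPLE_LOSS_WEIGHT * simple_loss + pruned_loss

# Validation entry above: loss=0.1303, simple_loss=0.2174, pruned_loss=0.02158
print(round(combined_loss(0.2174, 0.02158), 4))  # -> 0.1303
```

The same weighting also matches the training-batch entries to within rounding.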
2022-05-03 19:05:40,742 INFO [train.py:715] (0/8) Epoch 0, batch 33050, loss[loss=0.1914, simple_loss=0.2654, pruned_loss=0.05868, over 4950.00 frames.], tot_loss[loss=0.1986, simple_loss=0.2586, pruned_loss=0.06936, over 971813.25 frames.], batch size: 35, lr: 1.16e-03 2022-05-03 19:06:20,350 INFO [train.py:715] (0/8) Epoch 0, batch 33100, loss[loss=0.1883, simple_loss=0.2532, pruned_loss=0.06177, over 4924.00 frames.], tot_loss[loss=0.1994, simple_loss=0.2593, pruned_loss=0.06973, over 972476.91 frames.], batch size: 23, lr: 1.16e-03 2022-05-03 19:07:01,017 INFO [train.py:715] (0/8) Epoch 0, batch 33150, loss[loss=0.1881, simple_loss=0.2479, pruned_loss=0.06416, over 4942.00 frames.], tot_loss[loss=0.2006, simple_loss=0.2604, pruned_loss=0.07043, over 972415.69 frames.], batch size: 23, lr: 1.16e-03 2022-05-03 19:07:41,361 INFO [train.py:715] (0/8) Epoch 0, batch 33200, loss[loss=0.19, simple_loss=0.2365, pruned_loss=0.07169, over 4848.00 frames.], tot_loss[loss=0.2019, simple_loss=0.2611, pruned_loss=0.07136, over 971972.49 frames.], batch size: 13, lr: 1.16e-03 2022-05-03 19:08:21,597 INFO [train.py:715] (0/8) Epoch 0, batch 33250, loss[loss=0.205, simple_loss=0.2646, pruned_loss=0.07271, over 4785.00 frames.], tot_loss[loss=0.2, simple_loss=0.2601, pruned_loss=0.06992, over 972743.04 frames.], batch size: 17, lr: 1.16e-03 2022-05-03 19:09:01,809 INFO [train.py:715] (0/8) Epoch 0, batch 33300, loss[loss=0.187, simple_loss=0.2427, pruned_loss=0.06566, over 4821.00 frames.], tot_loss[loss=0.1997, simple_loss=0.2601, pruned_loss=0.06962, over 973124.28 frames.], batch size: 27, lr: 1.16e-03 2022-05-03 19:09:42,530 INFO [train.py:715] (0/8) Epoch 0, batch 33350, loss[loss=0.2061, simple_loss=0.266, pruned_loss=0.07308, over 4787.00 frames.], tot_loss[loss=0.1997, simple_loss=0.2597, pruned_loss=0.06981, over 972996.13 frames.], batch size: 17, lr: 1.16e-03 2022-05-03 19:10:22,675 INFO [train.py:715] (0/8) Epoch 0, batch 33400, loss[loss=0.2248, simple_loss=0.2586, pruned_loss=0.09548, over 4941.00 frames.], tot_loss[loss=0.2002, simple_loss=0.2601, pruned_loss=0.07013, over 972054.27 frames.], batch size: 39, lr: 1.15e-03 2022-05-03 19:11:02,704 INFO [train.py:715] (0/8) Epoch 0, batch 33450, loss[loss=0.1489, simple_loss=0.2277, pruned_loss=0.03504, over 4755.00 frames.], tot_loss[loss=0.2004, simple_loss=0.2601, pruned_loss=0.0703, over 972766.22 frames.], batch size: 19, lr: 1.15e-03 2022-05-03 19:11:43,362 INFO [train.py:715] (0/8) Epoch 0, batch 33500, loss[loss=0.1696, simple_loss=0.2385, pruned_loss=0.05038, over 4932.00 frames.], tot_loss[loss=0.2017, simple_loss=0.2612, pruned_loss=0.0711, over 972570.65 frames.], batch size: 23, lr: 1.15e-03 2022-05-03 19:12:23,715 INFO [train.py:715] (0/8) Epoch 0, batch 33550, loss[loss=0.1971, simple_loss=0.2584, pruned_loss=0.0679, over 4865.00 frames.], tot_loss[loss=0.2019, simple_loss=0.2611, pruned_loss=0.07131, over 973130.36 frames.], batch size: 20, lr: 1.15e-03 2022-05-03 19:13:02,899 INFO [train.py:715] (0/8) Epoch 0, batch 33600, loss[loss=0.1757, simple_loss=0.2311, pruned_loss=0.06016, over 4877.00 frames.], tot_loss[loss=0.2005, simple_loss=0.2597, pruned_loss=0.07067, over 973871.70 frames.], batch size: 32, lr: 1.15e-03 2022-05-03 19:13:43,474 INFO [train.py:715] (0/8) Epoch 0, batch 33650, loss[loss=0.1885, simple_loss=0.2462, pruned_loss=0.06538, over 4822.00 frames.], tot_loss[loss=0.1994, simple_loss=0.2589, pruned_loss=0.06999, over 974466.66 frames.], batch size: 27, lr: 1.15e-03 2022-05-03 19:14:23,807 INFO 
[train.py:715] (0/8) Epoch 0, batch 33700, loss[loss=0.2168, simple_loss=0.2817, pruned_loss=0.07596, over 4746.00 frames.], tot_loss[loss=0.1985, simple_loss=0.2581, pruned_loss=0.06944, over 973429.52 frames.], batch size: 16, lr: 1.15e-03 2022-05-03 19:15:03,034 INFO [train.py:715] (0/8) Epoch 0, batch 33750, loss[loss=0.2038, simple_loss=0.2522, pruned_loss=0.07767, over 4985.00 frames.], tot_loss[loss=0.197, simple_loss=0.2568, pruned_loss=0.06864, over 972752.01 frames.], batch size: 35, lr: 1.15e-03 2022-05-03 19:15:42,517 INFO [train.py:715] (0/8) Epoch 0, batch 33800, loss[loss=0.2267, simple_loss=0.2842, pruned_loss=0.08458, over 4775.00 frames.], tot_loss[loss=0.1983, simple_loss=0.2577, pruned_loss=0.06939, over 973489.32 frames.], batch size: 18, lr: 1.15e-03 2022-05-03 19:16:22,772 INFO [train.py:715] (0/8) Epoch 0, batch 33850, loss[loss=0.1995, simple_loss=0.2579, pruned_loss=0.07053, over 4872.00 frames.], tot_loss[loss=0.1986, simple_loss=0.2577, pruned_loss=0.06971, over 972825.10 frames.], batch size: 32, lr: 1.15e-03 2022-05-03 19:17:02,062 INFO [train.py:715] (0/8) Epoch 0, batch 33900, loss[loss=0.1919, simple_loss=0.258, pruned_loss=0.0629, over 4643.00 frames.], tot_loss[loss=0.1981, simple_loss=0.2574, pruned_loss=0.06943, over 971350.18 frames.], batch size: 13, lr: 1.15e-03 2022-05-03 19:17:41,118 INFO [train.py:715] (0/8) Epoch 0, batch 33950, loss[loss=0.2104, simple_loss=0.254, pruned_loss=0.08338, over 4823.00 frames.], tot_loss[loss=0.1993, simple_loss=0.2577, pruned_loss=0.07048, over 971161.47 frames.], batch size: 12, lr: 1.15e-03 2022-05-03 19:18:21,090 INFO [train.py:715] (0/8) Epoch 0, batch 34000, loss[loss=0.1586, simple_loss=0.2227, pruned_loss=0.04727, over 4974.00 frames.], tot_loss[loss=0.1976, simple_loss=0.2565, pruned_loss=0.06933, over 971492.89 frames.], batch size: 25, lr: 1.14e-03 2022-05-03 19:19:00,968 INFO [train.py:715] (0/8) Epoch 0, batch 34050, loss[loss=0.1647, simple_loss=0.239, pruned_loss=0.04516, over 4904.00 frames.], tot_loss[loss=0.1974, simple_loss=0.2566, pruned_loss=0.06914, over 971692.23 frames.], batch size: 19, lr: 1.14e-03 2022-05-03 19:19:40,632 INFO [train.py:715] (0/8) Epoch 0, batch 34100, loss[loss=0.1559, simple_loss=0.2306, pruned_loss=0.04056, over 4805.00 frames.], tot_loss[loss=0.1998, simple_loss=0.2589, pruned_loss=0.07032, over 971743.06 frames.], batch size: 25, lr: 1.14e-03 2022-05-03 19:20:19,830 INFO [train.py:715] (0/8) Epoch 0, batch 34150, loss[loss=0.1982, simple_loss=0.2597, pruned_loss=0.06838, over 4938.00 frames.], tot_loss[loss=0.1991, simple_loss=0.2585, pruned_loss=0.06989, over 971673.12 frames.], batch size: 21, lr: 1.14e-03 2022-05-03 19:20:59,753 INFO [train.py:715] (0/8) Epoch 0, batch 34200, loss[loss=0.2056, simple_loss=0.2754, pruned_loss=0.06793, over 4697.00 frames.], tot_loss[loss=0.199, simple_loss=0.2589, pruned_loss=0.06955, over 971169.77 frames.], batch size: 15, lr: 1.14e-03 2022-05-03 19:21:39,301 INFO [train.py:715] (0/8) Epoch 0, batch 34250, loss[loss=0.1934, simple_loss=0.2516, pruned_loss=0.06754, over 4923.00 frames.], tot_loss[loss=0.1991, simple_loss=0.259, pruned_loss=0.06957, over 971626.07 frames.], batch size: 23, lr: 1.14e-03 2022-05-03 19:22:18,620 INFO [train.py:715] (0/8) Epoch 0, batch 34300, loss[loss=0.1925, simple_loss=0.2552, pruned_loss=0.06495, over 4861.00 frames.], tot_loss[loss=0.1992, simple_loss=0.2591, pruned_loss=0.06963, over 970716.34 frames.], batch size: 15, lr: 1.14e-03 2022-05-03 19:22:58,858 INFO [train.py:715] (0/8) Epoch 0, batch 
34350, loss[loss=0.1824, simple_loss=0.2429, pruned_loss=0.06093, over 4818.00 frames.], tot_loss[loss=0.1985, simple_loss=0.2585, pruned_loss=0.06926, over 971173.83 frames.], batch size: 27, lr: 1.14e-03 2022-05-03 19:23:39,063 INFO [train.py:715] (0/8) Epoch 0, batch 34400, loss[loss=0.2498, simple_loss=0.3049, pruned_loss=0.09735, over 4987.00 frames.], tot_loss[loss=0.1986, simple_loss=0.2584, pruned_loss=0.06937, over 971653.58 frames.], batch size: 39, lr: 1.14e-03 2022-05-03 19:24:18,634 INFO [train.py:715] (0/8) Epoch 0, batch 34450, loss[loss=0.1961, simple_loss=0.2569, pruned_loss=0.06761, over 4967.00 frames.], tot_loss[loss=0.1997, simple_loss=0.2598, pruned_loss=0.06985, over 972255.24 frames.], batch size: 15, lr: 1.14e-03 2022-05-03 19:24:57,905 INFO [train.py:715] (0/8) Epoch 0, batch 34500, loss[loss=0.2028, simple_loss=0.2603, pruned_loss=0.07265, over 4946.00 frames.], tot_loss[loss=0.201, simple_loss=0.2607, pruned_loss=0.0706, over 972505.23 frames.], batch size: 21, lr: 1.14e-03 2022-05-03 19:25:38,250 INFO [train.py:715] (0/8) Epoch 0, batch 34550, loss[loss=0.1803, simple_loss=0.2388, pruned_loss=0.06088, over 4801.00 frames.], tot_loss[loss=0.2006, simple_loss=0.2605, pruned_loss=0.07034, over 972479.44 frames.], batch size: 24, lr: 1.14e-03 2022-05-03 19:26:17,985 INFO [train.py:715] (0/8) Epoch 0, batch 34600, loss[loss=0.1368, simple_loss=0.2194, pruned_loss=0.02712, over 4919.00 frames.], tot_loss[loss=0.1994, simple_loss=0.2597, pruned_loss=0.06955, over 971800.10 frames.], batch size: 18, lr: 1.13e-03 2022-05-03 19:26:57,216 INFO [train.py:715] (0/8) Epoch 0, batch 34650, loss[loss=0.1829, simple_loss=0.2464, pruned_loss=0.0597, over 4695.00 frames.], tot_loss[loss=0.2004, simple_loss=0.2604, pruned_loss=0.07019, over 973176.65 frames.], batch size: 15, lr: 1.13e-03 2022-05-03 19:27:37,743 INFO [train.py:715] (0/8) Epoch 0, batch 34700, loss[loss=0.2421, simple_loss=0.2977, pruned_loss=0.09329, over 4930.00 frames.], tot_loss[loss=0.1998, simple_loss=0.2598, pruned_loss=0.06989, over 973216.48 frames.], batch size: 18, lr: 1.13e-03 2022-05-03 19:28:15,921 INFO [train.py:715] (0/8) Epoch 0, batch 34750, loss[loss=0.1759, simple_loss=0.2503, pruned_loss=0.05075, over 4944.00 frames.], tot_loss[loss=0.1982, simple_loss=0.2586, pruned_loss=0.06886, over 973847.55 frames.], batch size: 21, lr: 1.13e-03 2022-05-03 19:28:53,216 INFO [train.py:715] (0/8) Epoch 0, batch 34800, loss[loss=0.1877, simple_loss=0.2517, pruned_loss=0.06182, over 4767.00 frames.], tot_loss[loss=0.1976, simple_loss=0.2578, pruned_loss=0.06865, over 973112.15 frames.], batch size: 12, lr: 1.13e-03 2022-05-03 19:29:02,557 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/epoch-0.pt 2022-05-03 19:29:42,572 INFO [train.py:715] (0/8) Epoch 1, batch 0, loss[loss=0.2302, simple_loss=0.2857, pruned_loss=0.08735, over 4926.00 frames.], tot_loss[loss=0.2302, simple_loss=0.2857, pruned_loss=0.08735, over 4926.00 frames.], batch size: 29, lr: 1.11e-03 2022-05-03 19:30:21,874 INFO [train.py:715] (0/8) Epoch 1, batch 50, loss[loss=0.1966, simple_loss=0.2579, pruned_loss=0.06772, over 4757.00 frames.], tot_loss[loss=0.1998, simple_loss=0.2594, pruned_loss=0.07015, over 218897.30 frames.], batch size: 19, lr: 1.11e-03 2022-05-03 19:31:01,846 INFO [train.py:715] (0/8) Epoch 1, batch 100, loss[loss=0.1916, simple_loss=0.2449, pruned_loss=0.06921, over 4875.00 frames.], tot_loss[loss=0.1986, simple_loss=0.2585, pruned_loss=0.06938, over 385932.50 frames.], batch size: 16, 
lr: 1.11e-03 2022-05-03 19:31:41,284 INFO [train.py:715] (0/8) Epoch 1, batch 150, loss[loss=0.1967, simple_loss=0.2593, pruned_loss=0.06707, over 4847.00 frames.], tot_loss[loss=0.1967, simple_loss=0.2567, pruned_loss=0.06839, over 515842.06 frames.], batch size: 30, lr: 1.11e-03 2022-05-03 19:32:20,521 INFO [train.py:715] (0/8) Epoch 1, batch 200, loss[loss=0.2065, simple_loss=0.2667, pruned_loss=0.07318, over 4897.00 frames.], tot_loss[loss=0.1967, simple_loss=0.2566, pruned_loss=0.06835, over 617315.20 frames.], batch size: 19, lr: 1.11e-03 2022-05-03 19:33:00,052 INFO [train.py:715] (0/8) Epoch 1, batch 250, loss[loss=0.1912, simple_loss=0.2524, pruned_loss=0.06502, over 4797.00 frames.], tot_loss[loss=0.1969, simple_loss=0.2568, pruned_loss=0.06848, over 695995.38 frames.], batch size: 18, lr: 1.11e-03 2022-05-03 19:33:40,739 INFO [train.py:715] (0/8) Epoch 1, batch 300, loss[loss=0.1504, simple_loss=0.2311, pruned_loss=0.03486, over 4976.00 frames.], tot_loss[loss=0.1951, simple_loss=0.256, pruned_loss=0.06713, over 756698.08 frames.], batch size: 25, lr: 1.11e-03 2022-05-03 19:34:21,114 INFO [train.py:715] (0/8) Epoch 1, batch 350, loss[loss=0.1934, simple_loss=0.2537, pruned_loss=0.06655, over 4831.00 frames.], tot_loss[loss=0.1977, simple_loss=0.2576, pruned_loss=0.06889, over 805287.06 frames.], batch size: 27, lr: 1.11e-03 2022-05-03 19:35:01,400 INFO [train.py:715] (0/8) Epoch 1, batch 400, loss[loss=0.2202, simple_loss=0.2765, pruned_loss=0.08192, over 4705.00 frames.], tot_loss[loss=0.1963, simple_loss=0.2565, pruned_loss=0.06811, over 842109.71 frames.], batch size: 15, lr: 1.11e-03 2022-05-03 19:35:42,063 INFO [train.py:715] (0/8) Epoch 1, batch 450, loss[loss=0.2143, simple_loss=0.2656, pruned_loss=0.08151, over 4852.00 frames.], tot_loss[loss=0.1965, simple_loss=0.2561, pruned_loss=0.06842, over 870835.59 frames.], batch size: 20, lr: 1.11e-03 2022-05-03 19:36:22,770 INFO [train.py:715] (0/8) Epoch 1, batch 500, loss[loss=0.1993, simple_loss=0.2517, pruned_loss=0.07349, over 4820.00 frames.], tot_loss[loss=0.1949, simple_loss=0.255, pruned_loss=0.06737, over 893252.80 frames.], batch size: 26, lr: 1.11e-03 2022-05-03 19:37:03,291 INFO [train.py:715] (0/8) Epoch 1, batch 550, loss[loss=0.2078, simple_loss=0.2637, pruned_loss=0.0759, over 4809.00 frames.], tot_loss[loss=0.1961, simple_loss=0.2563, pruned_loss=0.06801, over 910058.67 frames.], batch size: 21, lr: 1.11e-03 2022-05-03 19:37:43,270 INFO [train.py:715] (0/8) Epoch 1, batch 600, loss[loss=0.193, simple_loss=0.2585, pruned_loss=0.06372, over 4937.00 frames.], tot_loss[loss=0.196, simple_loss=0.2563, pruned_loss=0.06783, over 924125.81 frames.], batch size: 35, lr: 1.10e-03 2022-05-03 19:38:23,976 INFO [train.py:715] (0/8) Epoch 1, batch 650, loss[loss=0.2041, simple_loss=0.2706, pruned_loss=0.06884, over 4973.00 frames.], tot_loss[loss=0.1965, simple_loss=0.2566, pruned_loss=0.06816, over 933899.04 frames.], batch size: 28, lr: 1.10e-03 2022-05-03 19:39:04,146 INFO [train.py:715] (0/8) Epoch 1, batch 700, loss[loss=0.1637, simple_loss=0.235, pruned_loss=0.04613, over 4943.00 frames.], tot_loss[loss=0.1965, simple_loss=0.2566, pruned_loss=0.06818, over 942285.61 frames.], batch size: 21, lr: 1.10e-03 2022-05-03 19:39:44,138 INFO [train.py:715] (0/8) Epoch 1, batch 750, loss[loss=0.1718, simple_loss=0.22, pruned_loss=0.06177, over 4861.00 frames.], tot_loss[loss=0.1966, simple_loss=0.2569, pruned_loss=0.06819, over 949225.95 frames.], batch size: 20, lr: 1.10e-03 2022-05-03 19:40:24,221 INFO [train.py:715] (0/8) 
Epoch 1, batch 800, loss[loss=0.1497, simple_loss=0.2167, pruned_loss=0.04132, over 4876.00 frames.], tot_loss[loss=0.1975, simple_loss=0.2575, pruned_loss=0.0687, over 954075.20 frames.], batch size: 13, lr: 1.10e-03 2022-05-03 19:41:04,462 INFO [train.py:715] (0/8) Epoch 1, batch 850, loss[loss=0.1732, simple_loss=0.2419, pruned_loss=0.05226, over 4942.00 frames.], tot_loss[loss=0.1966, simple_loss=0.2569, pruned_loss=0.06815, over 958518.09 frames.], batch size: 29, lr: 1.10e-03 2022-05-03 19:41:43,687 INFO [train.py:715] (0/8) Epoch 1, batch 900, loss[loss=0.1953, simple_loss=0.2511, pruned_loss=0.06975, over 4811.00 frames.], tot_loss[loss=0.1962, simple_loss=0.2566, pruned_loss=0.06788, over 961733.87 frames.], batch size: 25, lr: 1.10e-03 2022-05-03 19:42:22,972 INFO [train.py:715] (0/8) Epoch 1, batch 950, loss[loss=0.1747, simple_loss=0.232, pruned_loss=0.0587, over 4807.00 frames.], tot_loss[loss=0.197, simple_loss=0.2574, pruned_loss=0.06833, over 963630.69 frames.], batch size: 25, lr: 1.10e-03 2022-05-03 19:43:02,563 INFO [train.py:715] (0/8) Epoch 1, batch 1000, loss[loss=0.1543, simple_loss=0.2182, pruned_loss=0.04524, over 4982.00 frames.], tot_loss[loss=0.1979, simple_loss=0.258, pruned_loss=0.06895, over 965710.76 frames.], batch size: 14, lr: 1.10e-03 2022-05-03 19:43:41,904 INFO [train.py:715] (0/8) Epoch 1, batch 1050, loss[loss=0.2173, simple_loss=0.2725, pruned_loss=0.08101, over 4952.00 frames.], tot_loss[loss=0.1982, simple_loss=0.2583, pruned_loss=0.06902, over 967307.78 frames.], batch size: 29, lr: 1.10e-03 2022-05-03 19:44:20,966 INFO [train.py:715] (0/8) Epoch 1, batch 1100, loss[loss=0.1718, simple_loss=0.2396, pruned_loss=0.05196, over 4874.00 frames.], tot_loss[loss=0.1973, simple_loss=0.2575, pruned_loss=0.06852, over 968871.38 frames.], batch size: 20, lr: 1.10e-03 2022-05-03 19:45:00,277 INFO [train.py:715] (0/8) Epoch 1, batch 1150, loss[loss=0.2013, simple_loss=0.2512, pruned_loss=0.07566, over 4873.00 frames.], tot_loss[loss=0.1973, simple_loss=0.2576, pruned_loss=0.06845, over 970435.54 frames.], batch size: 16, lr: 1.10e-03 2022-05-03 19:45:40,272 INFO [train.py:715] (0/8) Epoch 1, batch 1200, loss[loss=0.2135, simple_loss=0.2778, pruned_loss=0.07456, over 4883.00 frames.], tot_loss[loss=0.1979, simple_loss=0.2579, pruned_loss=0.06895, over 970482.59 frames.], batch size: 22, lr: 1.10e-03 2022-05-03 19:46:19,426 INFO [train.py:715] (0/8) Epoch 1, batch 1250, loss[loss=0.2248, simple_loss=0.286, pruned_loss=0.08184, over 4779.00 frames.], tot_loss[loss=0.1974, simple_loss=0.2573, pruned_loss=0.06879, over 971542.86 frames.], batch size: 14, lr: 1.10e-03 2022-05-03 19:46:58,959 INFO [train.py:715] (0/8) Epoch 1, batch 1300, loss[loss=0.1813, simple_loss=0.2398, pruned_loss=0.06136, over 4816.00 frames.], tot_loss[loss=0.1966, simple_loss=0.2562, pruned_loss=0.06853, over 970288.20 frames.], batch size: 15, lr: 1.09e-03 2022-05-03 19:47:39,272 INFO [train.py:715] (0/8) Epoch 1, batch 1350, loss[loss=0.1544, simple_loss=0.217, pruned_loss=0.04594, over 4848.00 frames.], tot_loss[loss=0.1952, simple_loss=0.2548, pruned_loss=0.06781, over 970741.39 frames.], batch size: 13, lr: 1.09e-03 2022-05-03 19:48:18,898 INFO [train.py:715] (0/8) Epoch 1, batch 1400, loss[loss=0.1732, simple_loss=0.2323, pruned_loss=0.05707, over 4698.00 frames.], tot_loss[loss=0.1958, simple_loss=0.2554, pruned_loss=0.06809, over 969845.75 frames.], batch size: 12, lr: 1.09e-03 2022-05-03 19:48:58,742 INFO [train.py:715] (0/8) Epoch 1, batch 1450, loss[loss=0.1753, 
simple_loss=0.2383, pruned_loss=0.05614, over 4888.00 frames.], tot_loss[loss=0.1955, simple_loss=0.2553, pruned_loss=0.06785, over 969991.28 frames.], batch size: 19, lr: 1.09e-03 2022-05-03 19:49:38,353 INFO [train.py:715] (0/8) Epoch 1, batch 1500, loss[loss=0.1775, simple_loss=0.2434, pruned_loss=0.05575, over 4893.00 frames.], tot_loss[loss=0.1977, simple_loss=0.2575, pruned_loss=0.06896, over 970853.42 frames.], batch size: 22, lr: 1.09e-03 2022-05-03 19:50:17,877 INFO [train.py:715] (0/8) Epoch 1, batch 1550, loss[loss=0.1914, simple_loss=0.2546, pruned_loss=0.06403, over 4987.00 frames.], tot_loss[loss=0.1978, simple_loss=0.2579, pruned_loss=0.06887, over 970679.32 frames.], batch size: 14, lr: 1.09e-03 2022-05-03 19:50:57,104 INFO [train.py:715] (0/8) Epoch 1, batch 1600, loss[loss=0.1531, simple_loss=0.2228, pruned_loss=0.04171, over 4785.00 frames.], tot_loss[loss=0.1962, simple_loss=0.2569, pruned_loss=0.06772, over 972047.06 frames.], batch size: 17, lr: 1.09e-03 2022-05-03 19:51:36,418 INFO [train.py:715] (0/8) Epoch 1, batch 1650, loss[loss=0.2006, simple_loss=0.2475, pruned_loss=0.07686, over 4835.00 frames.], tot_loss[loss=0.1971, simple_loss=0.2576, pruned_loss=0.06825, over 972845.16 frames.], batch size: 30, lr: 1.09e-03 2022-05-03 19:52:16,983 INFO [train.py:715] (0/8) Epoch 1, batch 1700, loss[loss=0.221, simple_loss=0.2738, pruned_loss=0.08405, over 4832.00 frames.], tot_loss[loss=0.1965, simple_loss=0.257, pruned_loss=0.06794, over 973044.26 frames.], batch size: 27, lr: 1.09e-03 2022-05-03 19:52:56,163 INFO [train.py:715] (0/8) Epoch 1, batch 1750, loss[loss=0.1772, simple_loss=0.2434, pruned_loss=0.05557, over 4812.00 frames.], tot_loss[loss=0.197, simple_loss=0.257, pruned_loss=0.0685, over 972578.41 frames.], batch size: 21, lr: 1.09e-03 2022-05-03 19:53:35,893 INFO [train.py:715] (0/8) Epoch 1, batch 1800, loss[loss=0.1782, simple_loss=0.2405, pruned_loss=0.0579, over 4883.00 frames.], tot_loss[loss=0.1965, simple_loss=0.2564, pruned_loss=0.06831, over 972677.11 frames.], batch size: 16, lr: 1.09e-03 2022-05-03 19:54:15,260 INFO [train.py:715] (0/8) Epoch 1, batch 1850, loss[loss=0.1996, simple_loss=0.2648, pruned_loss=0.06717, over 4901.00 frames.], tot_loss[loss=0.1955, simple_loss=0.2556, pruned_loss=0.06768, over 973053.23 frames.], batch size: 19, lr: 1.09e-03 2022-05-03 19:54:54,779 INFO [train.py:715] (0/8) Epoch 1, batch 1900, loss[loss=0.1395, simple_loss=0.1941, pruned_loss=0.04248, over 4806.00 frames.], tot_loss[loss=0.1952, simple_loss=0.2554, pruned_loss=0.06754, over 973654.35 frames.], batch size: 14, lr: 1.09e-03 2022-05-03 19:55:34,090 INFO [train.py:715] (0/8) Epoch 1, batch 1950, loss[loss=0.1958, simple_loss=0.2616, pruned_loss=0.06503, over 4824.00 frames.], tot_loss[loss=0.1964, simple_loss=0.2566, pruned_loss=0.06809, over 974059.72 frames.], batch size: 26, lr: 1.08e-03 2022-05-03 19:56:14,077 INFO [train.py:715] (0/8) Epoch 1, batch 2000, loss[loss=0.2334, simple_loss=0.2747, pruned_loss=0.09603, over 4783.00 frames.], tot_loss[loss=0.1971, simple_loss=0.257, pruned_loss=0.06854, over 974302.76 frames.], batch size: 17, lr: 1.08e-03 2022-05-03 19:56:53,569 INFO [train.py:715] (0/8) Epoch 1, batch 2050, loss[loss=0.2385, simple_loss=0.2899, pruned_loss=0.09352, over 4848.00 frames.], tot_loss[loss=0.1965, simple_loss=0.2571, pruned_loss=0.06799, over 974576.82 frames.], batch size: 32, lr: 1.08e-03 2022-05-03 19:57:33,042 INFO [train.py:715] (0/8) Epoch 1, batch 2100, loss[loss=0.2185, simple_loss=0.2727, pruned_loss=0.08213, over 
4906.00 frames.], tot_loss[loss=0.196, simple_loss=0.2568, pruned_loss=0.06761, over 974796.27 frames.], batch size: 17, lr: 1.08e-03 2022-05-03 19:58:12,716 INFO [train.py:715] (0/8) Epoch 1, batch 2150, loss[loss=0.2222, simple_loss=0.268, pruned_loss=0.08823, over 4938.00 frames.], tot_loss[loss=0.1971, simple_loss=0.2574, pruned_loss=0.06841, over 974444.36 frames.], batch size: 21, lr: 1.08e-03 2022-05-03 19:58:52,406 INFO [train.py:715] (0/8) Epoch 1, batch 2200, loss[loss=0.2174, simple_loss=0.2634, pruned_loss=0.08575, over 4881.00 frames.], tot_loss[loss=0.1951, simple_loss=0.2559, pruned_loss=0.06715, over 974990.33 frames.], batch size: 32, lr: 1.08e-03 2022-05-03 19:59:32,133 INFO [train.py:715] (0/8) Epoch 1, batch 2250, loss[loss=0.1663, simple_loss=0.2381, pruned_loss=0.04728, over 4874.00 frames.], tot_loss[loss=0.1951, simple_loss=0.2557, pruned_loss=0.06724, over 973927.71 frames.], batch size: 22, lr: 1.08e-03 2022-05-03 20:00:11,176 INFO [train.py:715] (0/8) Epoch 1, batch 2300, loss[loss=0.1986, simple_loss=0.25, pruned_loss=0.07354, over 4903.00 frames.], tot_loss[loss=0.1958, simple_loss=0.2561, pruned_loss=0.06781, over 973747.85 frames.], batch size: 19, lr: 1.08e-03 2022-05-03 20:00:51,308 INFO [train.py:715] (0/8) Epoch 1, batch 2350, loss[loss=0.2183, simple_loss=0.2805, pruned_loss=0.07805, over 4939.00 frames.], tot_loss[loss=0.1952, simple_loss=0.2559, pruned_loss=0.06727, over 973192.68 frames.], batch size: 29, lr: 1.08e-03 2022-05-03 20:01:30,587 INFO [train.py:715] (0/8) Epoch 1, batch 2400, loss[loss=0.1474, simple_loss=0.2161, pruned_loss=0.03931, over 4947.00 frames.], tot_loss[loss=0.1956, simple_loss=0.2562, pruned_loss=0.06756, over 972889.58 frames.], batch size: 21, lr: 1.08e-03 2022-05-03 20:02:09,733 INFO [train.py:715] (0/8) Epoch 1, batch 2450, loss[loss=0.1581, simple_loss=0.2256, pruned_loss=0.04526, over 4770.00 frames.], tot_loss[loss=0.1968, simple_loss=0.2571, pruned_loss=0.06829, over 972632.32 frames.], batch size: 18, lr: 1.08e-03 2022-05-03 20:02:48,981 INFO [train.py:715] (0/8) Epoch 1, batch 2500, loss[loss=0.1771, simple_loss=0.2374, pruned_loss=0.05843, over 4990.00 frames.], tot_loss[loss=0.1943, simple_loss=0.2547, pruned_loss=0.06694, over 972069.50 frames.], batch size: 15, lr: 1.08e-03 2022-05-03 20:03:28,537 INFO [train.py:715] (0/8) Epoch 1, batch 2550, loss[loss=0.2205, simple_loss=0.2741, pruned_loss=0.08346, over 4980.00 frames.], tot_loss[loss=0.1954, simple_loss=0.2555, pruned_loss=0.06764, over 972002.89 frames.], batch size: 15, lr: 1.08e-03 2022-05-03 20:04:08,267 INFO [train.py:715] (0/8) Epoch 1, batch 2600, loss[loss=0.2236, simple_loss=0.2763, pruned_loss=0.08542, over 4901.00 frames.], tot_loss[loss=0.1953, simple_loss=0.2556, pruned_loss=0.06748, over 972087.89 frames.], batch size: 22, lr: 1.08e-03 2022-05-03 20:04:47,476 INFO [train.py:715] (0/8) Epoch 1, batch 2650, loss[loss=0.1958, simple_loss=0.2441, pruned_loss=0.0737, over 4903.00 frames.], tot_loss[loss=0.1957, simple_loss=0.2558, pruned_loss=0.06783, over 972914.04 frames.], batch size: 17, lr: 1.07e-03 2022-05-03 20:05:27,543 INFO [train.py:715] (0/8) Epoch 1, batch 2700, loss[loss=0.234, simple_loss=0.2869, pruned_loss=0.09056, over 4975.00 frames.], tot_loss[loss=0.1966, simple_loss=0.2569, pruned_loss=0.06809, over 973805.11 frames.], batch size: 39, lr: 1.07e-03 2022-05-03 20:06:06,955 INFO [train.py:715] (0/8) Epoch 1, batch 2750, loss[loss=0.1776, simple_loss=0.2409, pruned_loss=0.05722, over 4737.00 frames.], tot_loss[loss=0.1951, 
simple_loss=0.2561, pruned_loss=0.0671, over 973705.45 frames.], batch size: 16, lr: 1.07e-03 2022-05-03 20:06:45,692 INFO [train.py:715] (0/8) Epoch 1, batch 2800, loss[loss=0.1955, simple_loss=0.2669, pruned_loss=0.06205, over 4814.00 frames.], tot_loss[loss=0.1945, simple_loss=0.2558, pruned_loss=0.0666, over 973489.85 frames.], batch size: 21, lr: 1.07e-03 2022-05-03 20:07:25,353 INFO [train.py:715] (0/8) Epoch 1, batch 2850, loss[loss=0.1627, simple_loss=0.2279, pruned_loss=0.04871, over 4972.00 frames.], tot_loss[loss=0.1941, simple_loss=0.2551, pruned_loss=0.06658, over 973415.96 frames.], batch size: 15, lr: 1.07e-03 2022-05-03 20:08:05,011 INFO [train.py:715] (0/8) Epoch 1, batch 2900, loss[loss=0.1999, simple_loss=0.2523, pruned_loss=0.07376, over 4993.00 frames.], tot_loss[loss=0.1938, simple_loss=0.255, pruned_loss=0.06625, over 973257.58 frames.], batch size: 14, lr: 1.07e-03 2022-05-03 20:08:44,128 INFO [train.py:715] (0/8) Epoch 1, batch 2950, loss[loss=0.1633, simple_loss=0.2312, pruned_loss=0.0477, over 4972.00 frames.], tot_loss[loss=0.1941, simple_loss=0.2554, pruned_loss=0.06637, over 973286.30 frames.], batch size: 28, lr: 1.07e-03 2022-05-03 20:09:22,836 INFO [train.py:715] (0/8) Epoch 1, batch 3000, loss[loss=0.288, simple_loss=0.3204, pruned_loss=0.1278, over 4739.00 frames.], tot_loss[loss=0.1948, simple_loss=0.2558, pruned_loss=0.06689, over 972682.86 frames.], batch size: 16, lr: 1.07e-03 2022-05-03 20:09:22,837 INFO [train.py:733] (0/8) Computing validation loss 2022-05-03 20:09:34,565 INFO [train.py:742] (0/8) Epoch 1, validation: loss=0.1276, simple_loss=0.2149, pruned_loss=0.0201, over 914524.00 frames. 2022-05-03 20:10:13,443 INFO [train.py:715] (0/8) Epoch 1, batch 3050, loss[loss=0.164, simple_loss=0.2292, pruned_loss=0.0494, over 4817.00 frames.], tot_loss[loss=0.1929, simple_loss=0.2545, pruned_loss=0.06568, over 971706.39 frames.], batch size: 21, lr: 1.07e-03 2022-05-03 20:10:53,453 INFO [train.py:715] (0/8) Epoch 1, batch 3100, loss[loss=0.2103, simple_loss=0.2622, pruned_loss=0.07925, over 4943.00 frames.], tot_loss[loss=0.1918, simple_loss=0.2536, pruned_loss=0.06496, over 971491.60 frames.], batch size: 14, lr: 1.07e-03 2022-05-03 20:11:32,601 INFO [train.py:715] (0/8) Epoch 1, batch 3150, loss[loss=0.2347, simple_loss=0.3002, pruned_loss=0.08458, over 4822.00 frames.], tot_loss[loss=0.1946, simple_loss=0.2555, pruned_loss=0.06683, over 971287.42 frames.], batch size: 26, lr: 1.07e-03 2022-05-03 20:12:11,820 INFO [train.py:715] (0/8) Epoch 1, batch 3200, loss[loss=0.2116, simple_loss=0.2691, pruned_loss=0.07708, over 4945.00 frames.], tot_loss[loss=0.1944, simple_loss=0.2553, pruned_loss=0.06676, over 970983.49 frames.], batch size: 29, lr: 1.07e-03 2022-05-03 20:12:51,458 INFO [train.py:715] (0/8) Epoch 1, batch 3250, loss[loss=0.2362, simple_loss=0.284, pruned_loss=0.09421, over 4790.00 frames.], tot_loss[loss=0.1941, simple_loss=0.2551, pruned_loss=0.0665, over 971603.31 frames.], batch size: 18, lr: 1.07e-03 2022-05-03 20:13:31,214 INFO [train.py:715] (0/8) Epoch 1, batch 3300, loss[loss=0.2075, simple_loss=0.2766, pruned_loss=0.06923, over 4933.00 frames.], tot_loss[loss=0.195, simple_loss=0.2554, pruned_loss=0.06726, over 971066.95 frames.], batch size: 21, lr: 1.07e-03 2022-05-03 20:14:10,767 INFO [train.py:715] (0/8) Epoch 1, batch 3350, loss[loss=0.176, simple_loss=0.2388, pruned_loss=0.05659, over 4803.00 frames.], tot_loss[loss=0.1947, simple_loss=0.2556, pruned_loss=0.06693, over 971124.23 frames.], batch size: 21, lr: 1.07e-03 
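Each entry prints two views of the objective: `loss[...]` for the current batch and `tot_loss[...]` as a frame-weighted running aggregate. The frame count in `tot_loss` hovers around 972k rather than growing with the epoch, which suggests the statistics are kept over a bounded window or with decay; that behaviour is not modelled in the sketch below, which only illustrates the frame weighting itself (the class name is made up, not the script's actual tracker):

```python
# Illustrative frame-weighted accumulator; `RunningLoss` is a made-up name and
# not the training script's actual metrics tracker. The bounded-window behaviour
# implied by the roughly constant frame count is not modelled here.
from dataclasses import dataclass

@dataclass
class RunningLoss:
    loss_sum: float = 0.0   # sum over batches of (batch loss * batch frames)
    frames: float = 0.0     # total frames accumulated

    def update(self, batch_loss: float, batch_frames: float) -> None:
        # Weight each batch by its frame count so long and short batches
        # contribute proportionally to the running average.
        self.loss_sum += batch_loss * batch_frames
        self.frames += batch_frames

    @property
    def average(self) -> float:
        return self.loss_sum / max(self.frames, 1.0)

tracker = RunningLoss()
tracker.update(batch_loss=0.2075, batch_frames=4933)  # epoch 1, batch 3300 above
tracker.update(batch_loss=0.176, batch_frames=4803)   # epoch 1, batch 3350 above
print(f"running loss {tracker.average:.4f} over {tracker.frames:.0f} frames")
```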
2022-05-03 20:14:50,050 INFO [train.py:715] (0/8) Epoch 1, batch 3400, loss[loss=0.2182, simple_loss=0.2715, pruned_loss=0.08247, over 4976.00 frames.], tot_loss[loss=0.1943, simple_loss=0.255, pruned_loss=0.06683, over 970917.26 frames.], batch size: 25, lr: 1.06e-03 2022-05-03 20:15:30,666 INFO [train.py:715] (0/8) Epoch 1, batch 3450, loss[loss=0.2249, simple_loss=0.2721, pruned_loss=0.08878, over 4933.00 frames.], tot_loss[loss=0.1949, simple_loss=0.2555, pruned_loss=0.06712, over 971661.09 frames.], batch size: 35, lr: 1.06e-03 2022-05-03 20:16:09,590 INFO [train.py:715] (0/8) Epoch 1, batch 3500, loss[loss=0.2162, simple_loss=0.2734, pruned_loss=0.07953, over 4953.00 frames.], tot_loss[loss=0.1961, simple_loss=0.2562, pruned_loss=0.06798, over 971366.81 frames.], batch size: 39, lr: 1.06e-03 2022-05-03 20:16:48,614 INFO [train.py:715] (0/8) Epoch 1, batch 3550, loss[loss=0.1936, simple_loss=0.2533, pruned_loss=0.06693, over 4780.00 frames.], tot_loss[loss=0.1947, simple_loss=0.2553, pruned_loss=0.06707, over 972245.45 frames.], batch size: 18, lr: 1.06e-03 2022-05-03 20:17:28,377 INFO [train.py:715] (0/8) Epoch 1, batch 3600, loss[loss=0.1851, simple_loss=0.248, pruned_loss=0.06111, over 4981.00 frames.], tot_loss[loss=0.1938, simple_loss=0.2545, pruned_loss=0.06655, over 973196.25 frames.], batch size: 25, lr: 1.06e-03 2022-05-03 20:18:08,024 INFO [train.py:715] (0/8) Epoch 1, batch 3650, loss[loss=0.1956, simple_loss=0.255, pruned_loss=0.06811, over 4765.00 frames.], tot_loss[loss=0.1937, simple_loss=0.2543, pruned_loss=0.0665, over 972817.95 frames.], batch size: 14, lr: 1.06e-03 2022-05-03 20:18:46,982 INFO [train.py:715] (0/8) Epoch 1, batch 3700, loss[loss=0.1923, simple_loss=0.2553, pruned_loss=0.06469, over 4805.00 frames.], tot_loss[loss=0.1921, simple_loss=0.2531, pruned_loss=0.06551, over 973199.12 frames.], batch size: 21, lr: 1.06e-03 2022-05-03 20:19:25,663 INFO [train.py:715] (0/8) Epoch 1, batch 3750, loss[loss=0.2022, simple_loss=0.2705, pruned_loss=0.06695, over 4902.00 frames.], tot_loss[loss=0.1923, simple_loss=0.2535, pruned_loss=0.0656, over 972844.89 frames.], batch size: 39, lr: 1.06e-03 2022-05-03 20:20:05,935 INFO [train.py:715] (0/8) Epoch 1, batch 3800, loss[loss=0.1402, simple_loss=0.2005, pruned_loss=0.04002, over 4796.00 frames.], tot_loss[loss=0.1918, simple_loss=0.2531, pruned_loss=0.06527, over 972425.03 frames.], batch size: 12, lr: 1.06e-03 2022-05-03 20:20:44,907 INFO [train.py:715] (0/8) Epoch 1, batch 3850, loss[loss=0.2355, simple_loss=0.2696, pruned_loss=0.1007, over 4635.00 frames.], tot_loss[loss=0.1917, simple_loss=0.2531, pruned_loss=0.06517, over 972101.53 frames.], batch size: 13, lr: 1.06e-03 2022-05-03 20:21:23,759 INFO [train.py:715] (0/8) Epoch 1, batch 3900, loss[loss=0.1744, simple_loss=0.2411, pruned_loss=0.05382, over 4943.00 frames.], tot_loss[loss=0.1931, simple_loss=0.2542, pruned_loss=0.06603, over 972370.70 frames.], batch size: 29, lr: 1.06e-03 2022-05-03 20:22:03,282 INFO [train.py:715] (0/8) Epoch 1, batch 3950, loss[loss=0.161, simple_loss=0.2321, pruned_loss=0.04494, over 4942.00 frames.], tot_loss[loss=0.1943, simple_loss=0.2552, pruned_loss=0.06673, over 971735.99 frames.], batch size: 29, lr: 1.06e-03 2022-05-03 20:22:42,797 INFO [train.py:715] (0/8) Epoch 1, batch 4000, loss[loss=0.203, simple_loss=0.2615, pruned_loss=0.07226, over 4810.00 frames.], tot_loss[loss=0.1936, simple_loss=0.2546, pruned_loss=0.06635, over 972089.26 frames.], batch size: 25, lr: 1.06e-03 2022-05-03 20:23:21,457 INFO [train.py:715] (0/8) 
Epoch 1, batch 4050, loss[loss=0.2991, simple_loss=0.3356, pruned_loss=0.1313, over 4913.00 frames.], tot_loss[loss=0.1941, simple_loss=0.2547, pruned_loss=0.06677, over 971787.58 frames.], batch size: 18, lr: 1.06e-03 2022-05-03 20:24:00,887 INFO [train.py:715] (0/8) Epoch 1, batch 4100, loss[loss=0.2098, simple_loss=0.2605, pruned_loss=0.07952, over 4975.00 frames.], tot_loss[loss=0.1947, simple_loss=0.255, pruned_loss=0.06725, over 971833.20 frames.], batch size: 14, lr: 1.05e-03 2022-05-03 20:24:40,537 INFO [train.py:715] (0/8) Epoch 1, batch 4150, loss[loss=0.1969, simple_loss=0.2571, pruned_loss=0.06836, over 4849.00 frames.], tot_loss[loss=0.1942, simple_loss=0.2543, pruned_loss=0.06705, over 972489.78 frames.], batch size: 15, lr: 1.05e-03 2022-05-03 20:25:19,587 INFO [train.py:715] (0/8) Epoch 1, batch 4200, loss[loss=0.172, simple_loss=0.2311, pruned_loss=0.05647, over 4792.00 frames.], tot_loss[loss=0.1942, simple_loss=0.2544, pruned_loss=0.06694, over 972143.14 frames.], batch size: 14, lr: 1.05e-03 2022-05-03 20:25:58,628 INFO [train.py:715] (0/8) Epoch 1, batch 4250, loss[loss=0.1828, simple_loss=0.2475, pruned_loss=0.05902, over 4751.00 frames.], tot_loss[loss=0.1948, simple_loss=0.255, pruned_loss=0.06729, over 972239.22 frames.], batch size: 16, lr: 1.05e-03 2022-05-03 20:26:38,142 INFO [train.py:715] (0/8) Epoch 1, batch 4300, loss[loss=0.1817, simple_loss=0.2518, pruned_loss=0.05585, over 4984.00 frames.], tot_loss[loss=0.1945, simple_loss=0.255, pruned_loss=0.067, over 972565.10 frames.], batch size: 25, lr: 1.05e-03 2022-05-03 20:27:17,803 INFO [train.py:715] (0/8) Epoch 1, batch 4350, loss[loss=0.2039, simple_loss=0.2564, pruned_loss=0.07574, over 4842.00 frames.], tot_loss[loss=0.1948, simple_loss=0.2557, pruned_loss=0.0669, over 972685.81 frames.], batch size: 30, lr: 1.05e-03 2022-05-03 20:27:56,255 INFO [train.py:715] (0/8) Epoch 1, batch 4400, loss[loss=0.2686, simple_loss=0.3182, pruned_loss=0.1095, over 4981.00 frames.], tot_loss[loss=0.1955, simple_loss=0.2563, pruned_loss=0.06732, over 972243.12 frames.], batch size: 35, lr: 1.05e-03 2022-05-03 20:28:35,847 INFO [train.py:715] (0/8) Epoch 1, batch 4450, loss[loss=0.1795, simple_loss=0.2513, pruned_loss=0.05388, over 4878.00 frames.], tot_loss[loss=0.195, simple_loss=0.2552, pruned_loss=0.06746, over 972960.06 frames.], batch size: 16, lr: 1.05e-03 2022-05-03 20:29:15,599 INFO [train.py:715] (0/8) Epoch 1, batch 4500, loss[loss=0.1913, simple_loss=0.2568, pruned_loss=0.06289, over 4970.00 frames.], tot_loss[loss=0.1947, simple_loss=0.2554, pruned_loss=0.06701, over 973183.90 frames.], batch size: 35, lr: 1.05e-03 2022-05-03 20:29:54,820 INFO [train.py:715] (0/8) Epoch 1, batch 4550, loss[loss=0.2255, simple_loss=0.2848, pruned_loss=0.0831, over 4951.00 frames.], tot_loss[loss=0.1932, simple_loss=0.2546, pruned_loss=0.06588, over 973148.40 frames.], batch size: 35, lr: 1.05e-03 2022-05-03 20:30:33,523 INFO [train.py:715] (0/8) Epoch 1, batch 4600, loss[loss=0.1916, simple_loss=0.2514, pruned_loss=0.06592, over 4883.00 frames.], tot_loss[loss=0.1943, simple_loss=0.255, pruned_loss=0.06675, over 973373.56 frames.], batch size: 22, lr: 1.05e-03 2022-05-03 20:31:13,062 INFO [train.py:715] (0/8) Epoch 1, batch 4650, loss[loss=0.1744, simple_loss=0.2257, pruned_loss=0.06155, over 4840.00 frames.], tot_loss[loss=0.1948, simple_loss=0.2555, pruned_loss=0.06704, over 972091.65 frames.], batch size: 30, lr: 1.05e-03 2022-05-03 20:31:52,506 INFO [train.py:715] (0/8) Epoch 1, batch 4700, loss[loss=0.1934, 
simple_loss=0.2563, pruned_loss=0.06525, over 4744.00 frames.], tot_loss[loss=0.1943, simple_loss=0.2551, pruned_loss=0.06673, over 972967.03 frames.], batch size: 16, lr: 1.05e-03 2022-05-03 20:32:31,322 INFO [train.py:715] (0/8) Epoch 1, batch 4750, loss[loss=0.1662, simple_loss=0.2288, pruned_loss=0.0518, over 4931.00 frames.], tot_loss[loss=0.1946, simple_loss=0.2555, pruned_loss=0.06686, over 973463.80 frames.], batch size: 18, lr: 1.05e-03 2022-05-03 20:33:11,342 INFO [train.py:715] (0/8) Epoch 1, batch 4800, loss[loss=0.1852, simple_loss=0.25, pruned_loss=0.06018, over 4895.00 frames.], tot_loss[loss=0.1943, simple_loss=0.2552, pruned_loss=0.06673, over 973616.55 frames.], batch size: 17, lr: 1.05e-03 2022-05-03 20:33:51,188 INFO [train.py:715] (0/8) Epoch 1, batch 4850, loss[loss=0.2532, simple_loss=0.3054, pruned_loss=0.1005, over 4833.00 frames.], tot_loss[loss=0.1957, simple_loss=0.2563, pruned_loss=0.06755, over 973529.31 frames.], batch size: 13, lr: 1.05e-03 2022-05-03 20:34:30,469 INFO [train.py:715] (0/8) Epoch 1, batch 4900, loss[loss=0.2147, simple_loss=0.2741, pruned_loss=0.07763, over 4866.00 frames.], tot_loss[loss=0.195, simple_loss=0.2561, pruned_loss=0.06694, over 973207.21 frames.], batch size: 20, lr: 1.04e-03 2022-05-03 20:35:09,824 INFO [train.py:715] (0/8) Epoch 1, batch 4950, loss[loss=0.1947, simple_loss=0.2442, pruned_loss=0.0726, over 4977.00 frames.], tot_loss[loss=0.1956, simple_loss=0.2563, pruned_loss=0.06741, over 973091.41 frames.], batch size: 14, lr: 1.04e-03 2022-05-03 20:35:50,165 INFO [train.py:715] (0/8) Epoch 1, batch 5000, loss[loss=0.1885, simple_loss=0.2501, pruned_loss=0.0634, over 4745.00 frames.], tot_loss[loss=0.1949, simple_loss=0.2557, pruned_loss=0.06705, over 973965.27 frames.], batch size: 16, lr: 1.04e-03 2022-05-03 20:36:29,722 INFO [train.py:715] (0/8) Epoch 1, batch 5050, loss[loss=0.1694, simple_loss=0.2277, pruned_loss=0.0555, over 4815.00 frames.], tot_loss[loss=0.1943, simple_loss=0.2551, pruned_loss=0.06673, over 973766.39 frames.], batch size: 21, lr: 1.04e-03 2022-05-03 20:37:08,720 INFO [train.py:715] (0/8) Epoch 1, batch 5100, loss[loss=0.2344, simple_loss=0.2855, pruned_loss=0.09162, over 4854.00 frames.], tot_loss[loss=0.1941, simple_loss=0.2548, pruned_loss=0.06672, over 973337.56 frames.], batch size: 34, lr: 1.04e-03 2022-05-03 20:37:48,748 INFO [train.py:715] (0/8) Epoch 1, batch 5150, loss[loss=0.1786, simple_loss=0.2528, pruned_loss=0.05221, over 4990.00 frames.], tot_loss[loss=0.1927, simple_loss=0.2538, pruned_loss=0.0658, over 973605.77 frames.], batch size: 28, lr: 1.04e-03 2022-05-03 20:38:19,038 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-40000.pt 2022-05-03 20:38:30,132 INFO [train.py:715] (0/8) Epoch 1, batch 5200, loss[loss=0.1832, simple_loss=0.2393, pruned_loss=0.06353, over 4816.00 frames.], tot_loss[loss=0.1931, simple_loss=0.254, pruned_loss=0.06606, over 973326.82 frames.], batch size: 26, lr: 1.04e-03 2022-05-03 20:39:09,108 INFO [train.py:715] (0/8) Epoch 1, batch 5250, loss[loss=0.2412, simple_loss=0.2809, pruned_loss=0.1007, over 4856.00 frames.], tot_loss[loss=0.1919, simple_loss=0.2529, pruned_loss=0.06548, over 972327.69 frames.], batch size: 30, lr: 1.04e-03 2022-05-03 20:39:48,467 INFO [train.py:715] (0/8) Epoch 1, batch 5300, loss[loss=0.2342, simple_loss=0.2845, pruned_loss=0.09199, over 4887.00 frames.], tot_loss[loss=0.192, simple_loss=0.2529, pruned_loss=0.06553, over 973246.93 frames.], batch size: 19, lr: 1.04e-03 2022-05-03 
20:40:28,106 INFO [train.py:715] (0/8) Epoch 1, batch 5350, loss[loss=0.1885, simple_loss=0.2481, pruned_loss=0.0644, over 4907.00 frames.], tot_loss[loss=0.1923, simple_loss=0.2534, pruned_loss=0.06563, over 973438.01 frames.], batch size: 19, lr: 1.04e-03 2022-05-03 20:41:07,647 INFO [train.py:715] (0/8) Epoch 1, batch 5400, loss[loss=0.2875, simple_loss=0.3127, pruned_loss=0.1311, over 4728.00 frames.], tot_loss[loss=0.1919, simple_loss=0.2532, pruned_loss=0.06529, over 973410.51 frames.], batch size: 16, lr: 1.04e-03 2022-05-03 20:41:46,697 INFO [train.py:715] (0/8) Epoch 1, batch 5450, loss[loss=0.261, simple_loss=0.322, pruned_loss=0.1, over 4929.00 frames.], tot_loss[loss=0.1923, simple_loss=0.2536, pruned_loss=0.06552, over 972935.13 frames.], batch size: 39, lr: 1.04e-03 2022-05-03 20:42:26,579 INFO [train.py:715] (0/8) Epoch 1, batch 5500, loss[loss=0.1515, simple_loss=0.2197, pruned_loss=0.0416, over 4794.00 frames.], tot_loss[loss=0.1929, simple_loss=0.2542, pruned_loss=0.06583, over 973259.09 frames.], batch size: 21, lr: 1.04e-03 2022-05-03 20:43:06,478 INFO [train.py:715] (0/8) Epoch 1, batch 5550, loss[loss=0.1768, simple_loss=0.2414, pruned_loss=0.0561, over 4745.00 frames.], tot_loss[loss=0.1926, simple_loss=0.2536, pruned_loss=0.06577, over 971651.58 frames.], batch size: 19, lr: 1.04e-03 2022-05-03 20:43:45,495 INFO [train.py:715] (0/8) Epoch 1, batch 5600, loss[loss=0.2158, simple_loss=0.2585, pruned_loss=0.0866, over 4829.00 frames.], tot_loss[loss=0.1933, simple_loss=0.2542, pruned_loss=0.06621, over 972769.96 frames.], batch size: 13, lr: 1.04e-03 2022-05-03 20:44:24,788 INFO [train.py:715] (0/8) Epoch 1, batch 5650, loss[loss=0.1771, simple_loss=0.2449, pruned_loss=0.0547, over 4962.00 frames.], tot_loss[loss=0.193, simple_loss=0.254, pruned_loss=0.06599, over 973129.84 frames.], batch size: 15, lr: 1.03e-03 2022-05-03 20:45:04,553 INFO [train.py:715] (0/8) Epoch 1, batch 5700, loss[loss=0.1787, simple_loss=0.2288, pruned_loss=0.06427, over 4710.00 frames.], tot_loss[loss=0.1934, simple_loss=0.2541, pruned_loss=0.06628, over 973615.53 frames.], batch size: 15, lr: 1.03e-03 2022-05-03 20:45:44,082 INFO [train.py:715] (0/8) Epoch 1, batch 5750, loss[loss=0.1738, simple_loss=0.2355, pruned_loss=0.0561, over 4983.00 frames.], tot_loss[loss=0.192, simple_loss=0.2531, pruned_loss=0.0655, over 974348.68 frames.], batch size: 16, lr: 1.03e-03 2022-05-03 20:46:23,092 INFO [train.py:715] (0/8) Epoch 1, batch 5800, loss[loss=0.1679, simple_loss=0.232, pruned_loss=0.05194, over 4842.00 frames.], tot_loss[loss=0.1913, simple_loss=0.2525, pruned_loss=0.06506, over 974113.40 frames.], batch size: 13, lr: 1.03e-03 2022-05-03 20:47:03,042 INFO [train.py:715] (0/8) Epoch 1, batch 5850, loss[loss=0.1847, simple_loss=0.2548, pruned_loss=0.05733, over 4814.00 frames.], tot_loss[loss=0.191, simple_loss=0.2528, pruned_loss=0.06464, over 973393.74 frames.], batch size: 25, lr: 1.03e-03 2022-05-03 20:47:42,854 INFO [train.py:715] (0/8) Epoch 1, batch 5900, loss[loss=0.176, simple_loss=0.2417, pruned_loss=0.05514, over 4847.00 frames.], tot_loss[loss=0.1916, simple_loss=0.2533, pruned_loss=0.06497, over 972693.04 frames.], batch size: 20, lr: 1.03e-03 2022-05-03 20:48:21,958 INFO [train.py:715] (0/8) Epoch 1, batch 5950, loss[loss=0.1718, simple_loss=0.2369, pruned_loss=0.05338, over 4981.00 frames.], tot_loss[loss=0.1905, simple_loss=0.2521, pruned_loss=0.06448, over 973461.99 frames.], batch size: 25, lr: 1.03e-03 2022-05-03 20:49:01,785 INFO [train.py:715] (0/8) Epoch 1, batch 6000, 
loss[loss=0.1975, simple_loss=0.2504, pruned_loss=0.07225, over 4860.00 frames.], tot_loss[loss=0.1907, simple_loss=0.2521, pruned_loss=0.06465, over 971591.97 frames.], batch size: 20, lr: 1.03e-03 2022-05-03 20:49:01,786 INFO [train.py:733] (0/8) Computing validation loss 2022-05-03 20:49:14,259 INFO [train.py:742] (0/8) Epoch 1, validation: loss=0.1267, simple_loss=0.2135, pruned_loss=0.01993, over 914524.00 frames. 2022-05-03 20:49:53,682 INFO [train.py:715] (0/8) Epoch 1, batch 6050, loss[loss=0.176, simple_loss=0.253, pruned_loss=0.04949, over 4878.00 frames.], tot_loss[loss=0.1899, simple_loss=0.2516, pruned_loss=0.06411, over 971805.83 frames.], batch size: 16, lr: 1.03e-03 2022-05-03 20:50:33,756 INFO [train.py:715] (0/8) Epoch 1, batch 6100, loss[loss=0.2047, simple_loss=0.2718, pruned_loss=0.06879, over 4946.00 frames.], tot_loss[loss=0.1899, simple_loss=0.2513, pruned_loss=0.06424, over 972083.16 frames.], batch size: 39, lr: 1.03e-03 2022-05-03 20:51:13,277 INFO [train.py:715] (0/8) Epoch 1, batch 6150, loss[loss=0.1784, simple_loss=0.2488, pruned_loss=0.05397, over 4967.00 frames.], tot_loss[loss=0.1894, simple_loss=0.251, pruned_loss=0.06392, over 972148.25 frames.], batch size: 15, lr: 1.03e-03 2022-05-03 20:51:51,973 INFO [train.py:715] (0/8) Epoch 1, batch 6200, loss[loss=0.1777, simple_loss=0.2468, pruned_loss=0.05432, over 4864.00 frames.], tot_loss[loss=0.191, simple_loss=0.2522, pruned_loss=0.06483, over 972407.55 frames.], batch size: 20, lr: 1.03e-03 2022-05-03 20:52:32,165 INFO [train.py:715] (0/8) Epoch 1, batch 6250, loss[loss=0.1962, simple_loss=0.257, pruned_loss=0.06774, over 4929.00 frames.], tot_loss[loss=0.1927, simple_loss=0.2535, pruned_loss=0.066, over 972203.43 frames.], batch size: 39, lr: 1.03e-03 2022-05-03 20:53:11,876 INFO [train.py:715] (0/8) Epoch 1, batch 6300, loss[loss=0.1859, simple_loss=0.258, pruned_loss=0.05689, over 4973.00 frames.], tot_loss[loss=0.193, simple_loss=0.2536, pruned_loss=0.06619, over 973063.78 frames.], batch size: 15, lr: 1.03e-03 2022-05-03 20:53:51,078 INFO [train.py:715] (0/8) Epoch 1, batch 6350, loss[loss=0.207, simple_loss=0.2736, pruned_loss=0.07024, over 4823.00 frames.], tot_loss[loss=0.1936, simple_loss=0.2544, pruned_loss=0.06637, over 972065.95 frames.], batch size: 15, lr: 1.03e-03 2022-05-03 20:54:30,387 INFO [train.py:715] (0/8) Epoch 1, batch 6400, loss[loss=0.1648, simple_loss=0.2332, pruned_loss=0.04821, over 4910.00 frames.], tot_loss[loss=0.1933, simple_loss=0.2545, pruned_loss=0.06602, over 971696.02 frames.], batch size: 19, lr: 1.03e-03 2022-05-03 20:55:09,943 INFO [train.py:715] (0/8) Epoch 1, batch 6450, loss[loss=0.2, simple_loss=0.2435, pruned_loss=0.07825, over 4778.00 frames.], tot_loss[loss=0.194, simple_loss=0.2546, pruned_loss=0.06671, over 971420.12 frames.], batch size: 17, lr: 1.02e-03 2022-05-03 20:55:49,580 INFO [train.py:715] (0/8) Epoch 1, batch 6500, loss[loss=0.1163, simple_loss=0.1848, pruned_loss=0.02389, over 4868.00 frames.], tot_loss[loss=0.1939, simple_loss=0.2545, pruned_loss=0.06668, over 972648.29 frames.], batch size: 12, lr: 1.02e-03 2022-05-03 20:56:28,202 INFO [train.py:715] (0/8) Epoch 1, batch 6550, loss[loss=0.2429, simple_loss=0.2927, pruned_loss=0.09657, over 4923.00 frames.], tot_loss[loss=0.195, simple_loss=0.2555, pruned_loss=0.06723, over 972755.13 frames.], batch size: 39, lr: 1.02e-03 2022-05-03 20:57:08,079 INFO [train.py:715] (0/8) Epoch 1, batch 6600, loss[loss=0.2319, simple_loss=0.2885, pruned_loss=0.08762, over 4747.00 frames.], tot_loss[loss=0.1937, 
simple_loss=0.2546, pruned_loss=0.06638, over 972341.63 frames.], batch size: 16, lr: 1.02e-03 2022-05-03 20:57:48,552 INFO [train.py:715] (0/8) Epoch 1, batch 6650, loss[loss=0.1999, simple_loss=0.2496, pruned_loss=0.07505, over 4936.00 frames.], tot_loss[loss=0.1936, simple_loss=0.2539, pruned_loss=0.06664, over 972504.99 frames.], batch size: 18, lr: 1.02e-03 2022-05-03 20:58:28,003 INFO [train.py:715] (0/8) Epoch 1, batch 6700, loss[loss=0.1901, simple_loss=0.2588, pruned_loss=0.06069, over 4881.00 frames.], tot_loss[loss=0.1927, simple_loss=0.2535, pruned_loss=0.06597, over 972665.52 frames.], batch size: 22, lr: 1.02e-03 2022-05-03 20:59:07,325 INFO [train.py:715] (0/8) Epoch 1, batch 6750, loss[loss=0.2232, simple_loss=0.2791, pruned_loss=0.08364, over 4901.00 frames.], tot_loss[loss=0.1935, simple_loss=0.2543, pruned_loss=0.06637, over 972617.09 frames.], batch size: 19, lr: 1.02e-03 2022-05-03 20:59:47,255 INFO [train.py:715] (0/8) Epoch 1, batch 6800, loss[loss=0.1818, simple_loss=0.2487, pruned_loss=0.05743, over 4893.00 frames.], tot_loss[loss=0.1932, simple_loss=0.2544, pruned_loss=0.06601, over 972671.99 frames.], batch size: 22, lr: 1.02e-03 2022-05-03 21:00:26,802 INFO [train.py:715] (0/8) Epoch 1, batch 6850, loss[loss=0.1489, simple_loss=0.2044, pruned_loss=0.04667, over 4855.00 frames.], tot_loss[loss=0.1926, simple_loss=0.2539, pruned_loss=0.06561, over 971951.18 frames.], batch size: 13, lr: 1.02e-03 2022-05-03 21:01:05,427 INFO [train.py:715] (0/8) Epoch 1, batch 6900, loss[loss=0.1771, simple_loss=0.2477, pruned_loss=0.0533, over 4648.00 frames.], tot_loss[loss=0.1924, simple_loss=0.254, pruned_loss=0.06536, over 971988.67 frames.], batch size: 13, lr: 1.02e-03 2022-05-03 21:01:44,718 INFO [train.py:715] (0/8) Epoch 1, batch 6950, loss[loss=0.1922, simple_loss=0.2471, pruned_loss=0.06864, over 4843.00 frames.], tot_loss[loss=0.1921, simple_loss=0.2538, pruned_loss=0.06522, over 972516.86 frames.], batch size: 30, lr: 1.02e-03 2022-05-03 21:02:24,797 INFO [train.py:715] (0/8) Epoch 1, batch 7000, loss[loss=0.1827, simple_loss=0.2536, pruned_loss=0.05585, over 4906.00 frames.], tot_loss[loss=0.1914, simple_loss=0.2531, pruned_loss=0.06488, over 972578.01 frames.], batch size: 22, lr: 1.02e-03 2022-05-03 21:03:03,643 INFO [train.py:715] (0/8) Epoch 1, batch 7050, loss[loss=0.1686, simple_loss=0.2329, pruned_loss=0.05214, over 4777.00 frames.], tot_loss[loss=0.1917, simple_loss=0.2532, pruned_loss=0.06509, over 972624.67 frames.], batch size: 17, lr: 1.02e-03 2022-05-03 21:03:42,610 INFO [train.py:715] (0/8) Epoch 1, batch 7100, loss[loss=0.2105, simple_loss=0.2608, pruned_loss=0.08015, over 4937.00 frames.], tot_loss[loss=0.1918, simple_loss=0.2532, pruned_loss=0.06516, over 972010.24 frames.], batch size: 23, lr: 1.02e-03 2022-05-03 21:04:22,597 INFO [train.py:715] (0/8) Epoch 1, batch 7150, loss[loss=0.2068, simple_loss=0.2737, pruned_loss=0.06998, over 4896.00 frames.], tot_loss[loss=0.1925, simple_loss=0.2539, pruned_loss=0.0656, over 972373.25 frames.], batch size: 19, lr: 1.02e-03 2022-05-03 21:05:02,516 INFO [train.py:715] (0/8) Epoch 1, batch 7200, loss[loss=0.1704, simple_loss=0.2262, pruned_loss=0.05732, over 4761.00 frames.], tot_loss[loss=0.1903, simple_loss=0.2524, pruned_loss=0.06407, over 972024.35 frames.], batch size: 12, lr: 1.02e-03 2022-05-03 21:05:41,154 INFO [train.py:715] (0/8) Epoch 1, batch 7250, loss[loss=0.1875, simple_loss=0.2439, pruned_loss=0.06555, over 4926.00 frames.], tot_loss[loss=0.1904, simple_loss=0.2522, pruned_loss=0.06429, over 
971465.31 frames.], batch size: 17, lr: 1.02e-03 2022-05-03 21:06:21,088 INFO [train.py:715] (0/8) Epoch 1, batch 7300, loss[loss=0.2003, simple_loss=0.2654, pruned_loss=0.06757, over 4973.00 frames.], tot_loss[loss=0.1912, simple_loss=0.2526, pruned_loss=0.06493, over 972567.64 frames.], batch size: 14, lr: 1.01e-03 2022-05-03 21:07:00,831 INFO [train.py:715] (0/8) Epoch 1, batch 7350, loss[loss=0.1834, simple_loss=0.2398, pruned_loss=0.06353, over 4805.00 frames.], tot_loss[loss=0.1915, simple_loss=0.2527, pruned_loss=0.06512, over 972322.78 frames.], batch size: 26, lr: 1.01e-03 2022-05-03 21:07:39,618 INFO [train.py:715] (0/8) Epoch 1, batch 7400, loss[loss=0.1932, simple_loss=0.2615, pruned_loss=0.0625, over 4718.00 frames.], tot_loss[loss=0.1923, simple_loss=0.2533, pruned_loss=0.06565, over 972639.72 frames.], batch size: 15, lr: 1.01e-03 2022-05-03 21:08:18,528 INFO [train.py:715] (0/8) Epoch 1, batch 7450, loss[loss=0.2684, simple_loss=0.307, pruned_loss=0.1149, over 4890.00 frames.], tot_loss[loss=0.1934, simple_loss=0.2543, pruned_loss=0.06622, over 972259.38 frames.], batch size: 17, lr: 1.01e-03 2022-05-03 21:08:58,346 INFO [train.py:715] (0/8) Epoch 1, batch 7500, loss[loss=0.1863, simple_loss=0.2567, pruned_loss=0.05795, over 4924.00 frames.], tot_loss[loss=0.1924, simple_loss=0.2537, pruned_loss=0.06558, over 972711.64 frames.], batch size: 23, lr: 1.01e-03 2022-05-03 21:09:38,022 INFO [train.py:715] (0/8) Epoch 1, batch 7550, loss[loss=0.1517, simple_loss=0.2235, pruned_loss=0.03998, over 4700.00 frames.], tot_loss[loss=0.1898, simple_loss=0.2516, pruned_loss=0.06405, over 972865.52 frames.], batch size: 15, lr: 1.01e-03 2022-05-03 21:10:16,234 INFO [train.py:715] (0/8) Epoch 1, batch 7600, loss[loss=0.1646, simple_loss=0.2334, pruned_loss=0.04792, over 4858.00 frames.], tot_loss[loss=0.1897, simple_loss=0.2522, pruned_loss=0.06361, over 972686.80 frames.], batch size: 32, lr: 1.01e-03 2022-05-03 21:10:55,973 INFO [train.py:715] (0/8) Epoch 1, batch 7650, loss[loss=0.1431, simple_loss=0.2155, pruned_loss=0.03536, over 4801.00 frames.], tot_loss[loss=0.1913, simple_loss=0.2532, pruned_loss=0.06469, over 973419.99 frames.], batch size: 21, lr: 1.01e-03 2022-05-03 21:11:35,787 INFO [train.py:715] (0/8) Epoch 1, batch 7700, loss[loss=0.1532, simple_loss=0.2265, pruned_loss=0.03998, over 4770.00 frames.], tot_loss[loss=0.1916, simple_loss=0.2534, pruned_loss=0.06489, over 972773.60 frames.], batch size: 18, lr: 1.01e-03 2022-05-03 21:12:14,134 INFO [train.py:715] (0/8) Epoch 1, batch 7750, loss[loss=0.211, simple_loss=0.2457, pruned_loss=0.08819, over 4927.00 frames.], tot_loss[loss=0.1904, simple_loss=0.2524, pruned_loss=0.06417, over 972392.62 frames.], batch size: 23, lr: 1.01e-03 2022-05-03 21:12:53,244 INFO [train.py:715] (0/8) Epoch 1, batch 7800, loss[loss=0.168, simple_loss=0.2346, pruned_loss=0.0507, over 4940.00 frames.], tot_loss[loss=0.1919, simple_loss=0.2537, pruned_loss=0.06504, over 972410.06 frames.], batch size: 29, lr: 1.01e-03 2022-05-03 21:13:33,314 INFO [train.py:715] (0/8) Epoch 1, batch 7850, loss[loss=0.2274, simple_loss=0.2877, pruned_loss=0.08349, over 4951.00 frames.], tot_loss[loss=0.1927, simple_loss=0.2547, pruned_loss=0.06533, over 972342.88 frames.], batch size: 23, lr: 1.01e-03 2022-05-03 21:14:12,714 INFO [train.py:715] (0/8) Epoch 1, batch 7900, loss[loss=0.223, simple_loss=0.2833, pruned_loss=0.08136, over 4826.00 frames.], tot_loss[loss=0.1918, simple_loss=0.2539, pruned_loss=0.06483, over 970915.95 frames.], batch size: 26, lr: 1.01e-03 
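The `lr` column decays smoothly with the batch count and also steps down slightly at the epoch boundary (1.13e-03 just before epoch-0.pt is saved, 1.11e-03 at the start of epoch 1). A schedule of the following form reproduces the logged values to within rounding; the constants and the functional form are assumptions inferred from those numbers, not taken from the training script:

```python
# A sketch of a batch/epoch-dependent schedule that reproduces the logged lr
# values to within rounding. base_lr=0.003, lr_batches=5000 and lr_epochs=4,
# as well as the functional form itself, are assumptions inferred from the
# printed numbers, not taken from the training script.
def scheduled_lr(global_batch: int, epoch: int,
                 base_lr: float = 0.003,
                 lr_batches: float = 5000.0,
                 lr_epochs: float = 4.0) -> float:
    batch_factor = ((global_batch ** 2 + lr_batches ** 2) / lr_batches ** 2) ** -0.25
    epoch_factor = ((epoch ** 2 + lr_epochs ** 2) / lr_epochs ** 2) ** -0.25
    return base_lr * batch_factor * epoch_factor

print(f"{scheduled_lr(30700, 0):.2e}")  # ~1.20e-03 (epoch 0, batch 30700 above)
print(f"{scheduled_lr(42750, 1):.2e}")  # ~1.01e-03 (epoch 1, batch ~7900; global batch ~42750 assumed)
```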
2022-05-03 21:14:51,157 INFO [train.py:715] (0/8) Epoch 1, batch 7950, loss[loss=0.1695, simple_loss=0.2442, pruned_loss=0.0474, over 4944.00 frames.], tot_loss[loss=0.1926, simple_loss=0.2544, pruned_loss=0.06543, over 970824.93 frames.], batch size: 23, lr: 1.01e-03 2022-05-03 21:15:31,258 INFO [train.py:715] (0/8) Epoch 1, batch 8000, loss[loss=0.2135, simple_loss=0.2618, pruned_loss=0.08257, over 4905.00 frames.], tot_loss[loss=0.1909, simple_loss=0.253, pruned_loss=0.06444, over 971626.66 frames.], batch size: 19, lr: 1.01e-03 2022-05-03 21:16:11,052 INFO [train.py:715] (0/8) Epoch 1, batch 8050, loss[loss=0.1674, simple_loss=0.2274, pruned_loss=0.05366, over 4782.00 frames.], tot_loss[loss=0.1905, simple_loss=0.2524, pruned_loss=0.0643, over 972297.47 frames.], batch size: 14, lr: 1.01e-03 2022-05-03 21:16:50,423 INFO [train.py:715] (0/8) Epoch 1, batch 8100, loss[loss=0.1888, simple_loss=0.2633, pruned_loss=0.05715, over 4927.00 frames.], tot_loss[loss=0.1895, simple_loss=0.2521, pruned_loss=0.06351, over 971804.78 frames.], batch size: 23, lr: 1.01e-03 2022-05-03 21:17:28,627 INFO [train.py:715] (0/8) Epoch 1, batch 8150, loss[loss=0.1853, simple_loss=0.2461, pruned_loss=0.06221, over 4772.00 frames.], tot_loss[loss=0.1898, simple_loss=0.2522, pruned_loss=0.06369, over 971429.86 frames.], batch size: 14, lr: 1.00e-03 2022-05-03 21:18:08,542 INFO [train.py:715] (0/8) Epoch 1, batch 8200, loss[loss=0.1915, simple_loss=0.2511, pruned_loss=0.06596, over 4971.00 frames.], tot_loss[loss=0.1911, simple_loss=0.2534, pruned_loss=0.0644, over 971416.73 frames.], batch size: 35, lr: 1.00e-03 2022-05-03 21:18:48,023 INFO [train.py:715] (0/8) Epoch 1, batch 8250, loss[loss=0.2151, simple_loss=0.2483, pruned_loss=0.09091, over 4846.00 frames.], tot_loss[loss=0.1931, simple_loss=0.2549, pruned_loss=0.06563, over 971131.73 frames.], batch size: 13, lr: 1.00e-03 2022-05-03 21:19:26,204 INFO [train.py:715] (0/8) Epoch 1, batch 8300, loss[loss=0.189, simple_loss=0.2532, pruned_loss=0.06243, over 4976.00 frames.], tot_loss[loss=0.1931, simple_loss=0.2548, pruned_loss=0.06568, over 971025.61 frames.], batch size: 31, lr: 1.00e-03 2022-05-03 21:20:06,148 INFO [train.py:715] (0/8) Epoch 1, batch 8350, loss[loss=0.2117, simple_loss=0.2671, pruned_loss=0.07816, over 4686.00 frames.], tot_loss[loss=0.1938, simple_loss=0.2552, pruned_loss=0.06622, over 970552.19 frames.], batch size: 15, lr: 1.00e-03 2022-05-03 21:20:45,731 INFO [train.py:715] (0/8) Epoch 1, batch 8400, loss[loss=0.2059, simple_loss=0.2694, pruned_loss=0.07117, over 4837.00 frames.], tot_loss[loss=0.1927, simple_loss=0.2544, pruned_loss=0.0655, over 970043.89 frames.], batch size: 15, lr: 1.00e-03 2022-05-03 21:21:25,106 INFO [train.py:715] (0/8) Epoch 1, batch 8450, loss[loss=0.1537, simple_loss=0.2262, pruned_loss=0.0406, over 4947.00 frames.], tot_loss[loss=0.1928, simple_loss=0.2542, pruned_loss=0.06568, over 971155.04 frames.], batch size: 14, lr: 1.00e-03 2022-05-03 21:22:03,500 INFO [train.py:715] (0/8) Epoch 1, batch 8500, loss[loss=0.1973, simple_loss=0.2553, pruned_loss=0.06968, over 4812.00 frames.], tot_loss[loss=0.1933, simple_loss=0.2544, pruned_loss=0.06613, over 971598.49 frames.], batch size: 15, lr: 1.00e-03 2022-05-03 21:22:43,394 INFO [train.py:715] (0/8) Epoch 1, batch 8550, loss[loss=0.1772, simple_loss=0.2349, pruned_loss=0.05973, over 4841.00 frames.], tot_loss[loss=0.1939, simple_loss=0.2547, pruned_loss=0.06651, over 971686.19 frames.], batch size: 32, lr: 1.00e-03 2022-05-03 21:23:22,908 INFO [train.py:715] (0/8) 
Epoch 1, batch 8600, loss[loss=0.1965, simple_loss=0.2681, pruned_loss=0.06245, over 4973.00 frames.], tot_loss[loss=0.1925, simple_loss=0.2536, pruned_loss=0.06569, over 971924.73 frames.], batch size: 24, lr: 1.00e-03 2022-05-03 21:24:00,903 INFO [train.py:715] (0/8) Epoch 1, batch 8650, loss[loss=0.2016, simple_loss=0.2519, pruned_loss=0.07561, over 4822.00 frames.], tot_loss[loss=0.1927, simple_loss=0.2537, pruned_loss=0.06586, over 971646.55 frames.], batch size: 15, lr: 9.99e-04 2022-05-03 21:24:41,126 INFO [train.py:715] (0/8) Epoch 1, batch 8700, loss[loss=0.1776, simple_loss=0.2425, pruned_loss=0.0563, over 4992.00 frames.], tot_loss[loss=0.1923, simple_loss=0.2535, pruned_loss=0.06555, over 971909.59 frames.], batch size: 16, lr: 9.98e-04 2022-05-03 21:25:21,115 INFO [train.py:715] (0/8) Epoch 1, batch 8750, loss[loss=0.1927, simple_loss=0.2547, pruned_loss=0.06539, over 4824.00 frames.], tot_loss[loss=0.1924, simple_loss=0.2537, pruned_loss=0.06558, over 972042.21 frames.], batch size: 30, lr: 9.98e-04 2022-05-03 21:26:00,208 INFO [train.py:715] (0/8) Epoch 1, batch 8800, loss[loss=0.1902, simple_loss=0.2528, pruned_loss=0.0638, over 4973.00 frames.], tot_loss[loss=0.1927, simple_loss=0.2537, pruned_loss=0.06586, over 971301.45 frames.], batch size: 15, lr: 9.97e-04 2022-05-03 21:26:39,533 INFO [train.py:715] (0/8) Epoch 1, batch 8850, loss[loss=0.2426, simple_loss=0.2869, pruned_loss=0.09912, over 4981.00 frames.], tot_loss[loss=0.1933, simple_loss=0.2542, pruned_loss=0.06618, over 971299.66 frames.], batch size: 39, lr: 9.97e-04 2022-05-03 21:27:19,654 INFO [train.py:715] (0/8) Epoch 1, batch 8900, loss[loss=0.2024, simple_loss=0.2616, pruned_loss=0.07154, over 4751.00 frames.], tot_loss[loss=0.1937, simple_loss=0.2544, pruned_loss=0.06646, over 971872.89 frames.], batch size: 16, lr: 9.96e-04 2022-05-03 21:27:59,351 INFO [train.py:715] (0/8) Epoch 1, batch 8950, loss[loss=0.2009, simple_loss=0.2697, pruned_loss=0.06604, over 4968.00 frames.], tot_loss[loss=0.1944, simple_loss=0.255, pruned_loss=0.06687, over 971492.09 frames.], batch size: 39, lr: 9.96e-04 2022-05-03 21:28:37,785 INFO [train.py:715] (0/8) Epoch 1, batch 9000, loss[loss=0.1689, simple_loss=0.2305, pruned_loss=0.05361, over 4828.00 frames.], tot_loss[loss=0.1927, simple_loss=0.2541, pruned_loss=0.06563, over 972850.50 frames.], batch size: 15, lr: 9.95e-04 2022-05-03 21:28:37,786 INFO [train.py:733] (0/8) Computing validation loss 2022-05-03 21:28:47,502 INFO [train.py:742] (0/8) Epoch 1, validation: loss=0.1253, simple_loss=0.2125, pruned_loss=0.01906, over 914524.00 frames. 
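One relationship worth noting in these entries: to the printed precision, each reported loss equals 0.5 * simple_loss + pruned_loss. For example, the running totals at epoch 1, batch 6650 give 0.5 * 0.2539 + 0.06664 ≈ 0.1936, and the batch-9000 validation gives 0.5 * 0.2125 + 0.01906 ≈ 0.1253, both matching the logged values. The check below is a minimal sketch; the 0.5 weight is inferred from the numbers printed here, not confirmed against the recipe code.

```python
# Sanity check (not part of the training run): the printed losses are
# consistent with  loss ≈ 0.5 * simple_loss + pruned_loss.  The 0.5 weight
# is an assumption inferred from the logged numbers themselves.
samples = [
    # (loss, simple_loss, pruned_loss) copied from entries in this log
    (0.1936, 0.2539, 0.06664),   # tot_loss at epoch 1, batch 6650
    (0.1689, 0.2305, 0.05361),   # per-batch loss at epoch 1, batch 9000
    (0.1253, 0.2125, 0.01906),   # validation at epoch 1, batch 9000
]
for loss, simple, pruned in samples:
    recon = 0.5 * simple + pruned
    print(f"logged={loss:.4f}  0.5*simple+pruned={recon:.4f}  diff={abs(loss - recon):.5f}")
```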
2022-05-03 21:29:26,001 INFO [train.py:715] (0/8) Epoch 1, batch 9050, loss[loss=0.1594, simple_loss=0.2235, pruned_loss=0.04763, over 4735.00 frames.], tot_loss[loss=0.1906, simple_loss=0.2523, pruned_loss=0.06446, over 972647.92 frames.], batch size: 12, lr: 9.94e-04 2022-05-03 21:30:06,212 INFO [train.py:715] (0/8) Epoch 1, batch 9100, loss[loss=0.1777, simple_loss=0.2451, pruned_loss=0.05519, over 4963.00 frames.], tot_loss[loss=0.1907, simple_loss=0.2525, pruned_loss=0.06448, over 972591.77 frames.], batch size: 24, lr: 9.94e-04 2022-05-03 21:30:45,847 INFO [train.py:715] (0/8) Epoch 1, batch 9150, loss[loss=0.1964, simple_loss=0.2487, pruned_loss=0.07204, over 4850.00 frames.], tot_loss[loss=0.1902, simple_loss=0.252, pruned_loss=0.06425, over 972630.26 frames.], batch size: 32, lr: 9.93e-04 2022-05-03 21:31:24,125 INFO [train.py:715] (0/8) Epoch 1, batch 9200, loss[loss=0.1803, simple_loss=0.2506, pruned_loss=0.05499, over 4927.00 frames.], tot_loss[loss=0.1893, simple_loss=0.2509, pruned_loss=0.06385, over 972354.74 frames.], batch size: 23, lr: 9.93e-04 2022-05-03 21:32:03,945 INFO [train.py:715] (0/8) Epoch 1, batch 9250, loss[loss=0.2488, simple_loss=0.3071, pruned_loss=0.09523, over 4990.00 frames.], tot_loss[loss=0.1899, simple_loss=0.2515, pruned_loss=0.06417, over 972755.21 frames.], batch size: 25, lr: 9.92e-04 2022-05-03 21:32:43,823 INFO [train.py:715] (0/8) Epoch 1, batch 9300, loss[loss=0.1474, simple_loss=0.2264, pruned_loss=0.03426, over 4915.00 frames.], tot_loss[loss=0.1893, simple_loss=0.2512, pruned_loss=0.06368, over 973097.05 frames.], batch size: 18, lr: 9.92e-04 2022-05-03 21:33:22,877 INFO [train.py:715] (0/8) Epoch 1, batch 9350, loss[loss=0.192, simple_loss=0.2761, pruned_loss=0.05394, over 4804.00 frames.], tot_loss[loss=0.1901, simple_loss=0.2517, pruned_loss=0.06429, over 972233.63 frames.], batch size: 21, lr: 9.91e-04 2022-05-03 21:34:02,362 INFO [train.py:715] (0/8) Epoch 1, batch 9400, loss[loss=0.217, simple_loss=0.2679, pruned_loss=0.08307, over 4791.00 frames.], tot_loss[loss=0.1905, simple_loss=0.252, pruned_loss=0.06448, over 971985.08 frames.], batch size: 17, lr: 9.91e-04 2022-05-03 21:34:42,535 INFO [train.py:715] (0/8) Epoch 1, batch 9450, loss[loss=0.1927, simple_loss=0.2458, pruned_loss=0.06982, over 4806.00 frames.], tot_loss[loss=0.1914, simple_loss=0.2529, pruned_loss=0.06494, over 972264.05 frames.], batch size: 21, lr: 9.90e-04 2022-05-03 21:35:22,126 INFO [train.py:715] (0/8) Epoch 1, batch 9500, loss[loss=0.2194, simple_loss=0.2774, pruned_loss=0.08065, over 4893.00 frames.], tot_loss[loss=0.1909, simple_loss=0.2522, pruned_loss=0.06484, over 972633.49 frames.], batch size: 39, lr: 9.89e-04 2022-05-03 21:36:00,394 INFO [train.py:715] (0/8) Epoch 1, batch 9550, loss[loss=0.2054, simple_loss=0.2659, pruned_loss=0.07246, over 4894.00 frames.], tot_loss[loss=0.1903, simple_loss=0.252, pruned_loss=0.06434, over 972060.70 frames.], batch size: 22, lr: 9.89e-04 2022-05-03 21:36:40,621 INFO [train.py:715] (0/8) Epoch 1, batch 9600, loss[loss=0.2228, simple_loss=0.2762, pruned_loss=0.08472, over 4901.00 frames.], tot_loss[loss=0.1901, simple_loss=0.2518, pruned_loss=0.06417, over 971857.11 frames.], batch size: 19, lr: 9.88e-04 2022-05-03 21:37:20,358 INFO [train.py:715] (0/8) Epoch 1, batch 9650, loss[loss=0.2081, simple_loss=0.2549, pruned_loss=0.08063, over 4984.00 frames.], tot_loss[loss=0.1898, simple_loss=0.2514, pruned_loss=0.06416, over 972386.45 frames.], batch size: 24, lr: 9.88e-04 2022-05-03 21:37:58,748 INFO [train.py:715] (0/8) 
Epoch 1, batch 9700, loss[loss=0.2056, simple_loss=0.2712, pruned_loss=0.06997, over 4938.00 frames.], tot_loss[loss=0.1893, simple_loss=0.2513, pruned_loss=0.06368, over 973519.00 frames.], batch size: 21, lr: 9.87e-04 2022-05-03 21:38:38,644 INFO [train.py:715] (0/8) Epoch 1, batch 9750, loss[loss=0.1671, simple_loss=0.2358, pruned_loss=0.04923, over 4919.00 frames.], tot_loss[loss=0.1903, simple_loss=0.2522, pruned_loss=0.06416, over 973635.79 frames.], batch size: 18, lr: 9.87e-04 2022-05-03 21:39:19,062 INFO [train.py:715] (0/8) Epoch 1, batch 9800, loss[loss=0.2212, simple_loss=0.2782, pruned_loss=0.08211, over 4958.00 frames.], tot_loss[loss=0.1899, simple_loss=0.252, pruned_loss=0.06389, over 972687.69 frames.], batch size: 21, lr: 9.86e-04 2022-05-03 21:39:58,299 INFO [train.py:715] (0/8) Epoch 1, batch 9850, loss[loss=0.1769, simple_loss=0.2466, pruned_loss=0.05359, over 4774.00 frames.], tot_loss[loss=0.1892, simple_loss=0.2514, pruned_loss=0.06355, over 973091.77 frames.], batch size: 19, lr: 9.86e-04 2022-05-03 21:40:37,084 INFO [train.py:715] (0/8) Epoch 1, batch 9900, loss[loss=0.1945, simple_loss=0.2517, pruned_loss=0.06868, over 4915.00 frames.], tot_loss[loss=0.1888, simple_loss=0.2515, pruned_loss=0.06308, over 973087.01 frames.], batch size: 23, lr: 9.85e-04 2022-05-03 21:41:17,364 INFO [train.py:715] (0/8) Epoch 1, batch 9950, loss[loss=0.1928, simple_loss=0.2448, pruned_loss=0.07038, over 4860.00 frames.], tot_loss[loss=0.1886, simple_loss=0.2509, pruned_loss=0.06314, over 972967.71 frames.], batch size: 32, lr: 9.85e-04 2022-05-03 21:41:57,266 INFO [train.py:715] (0/8) Epoch 1, batch 10000, loss[loss=0.1794, simple_loss=0.2441, pruned_loss=0.05733, over 4876.00 frames.], tot_loss[loss=0.1875, simple_loss=0.2499, pruned_loss=0.06256, over 972902.39 frames.], batch size: 32, lr: 9.84e-04 2022-05-03 21:42:36,327 INFO [train.py:715] (0/8) Epoch 1, batch 10050, loss[loss=0.1754, simple_loss=0.237, pruned_loss=0.05696, over 4807.00 frames.], tot_loss[loss=0.1879, simple_loss=0.2501, pruned_loss=0.0628, over 973512.91 frames.], batch size: 25, lr: 9.83e-04 2022-05-03 21:43:15,952 INFO [train.py:715] (0/8) Epoch 1, batch 10100, loss[loss=0.1631, simple_loss=0.2346, pruned_loss=0.04575, over 4777.00 frames.], tot_loss[loss=0.1883, simple_loss=0.2505, pruned_loss=0.06312, over 972062.98 frames.], batch size: 18, lr: 9.83e-04 2022-05-03 21:43:55,972 INFO [train.py:715] (0/8) Epoch 1, batch 10150, loss[loss=0.2019, simple_loss=0.2716, pruned_loss=0.06611, over 4822.00 frames.], tot_loss[loss=0.1887, simple_loss=0.2509, pruned_loss=0.06323, over 971207.49 frames.], batch size: 26, lr: 9.82e-04 2022-05-03 21:44:35,079 INFO [train.py:715] (0/8) Epoch 1, batch 10200, loss[loss=0.1649, simple_loss=0.2297, pruned_loss=0.05003, over 4788.00 frames.], tot_loss[loss=0.1883, simple_loss=0.2507, pruned_loss=0.06301, over 971615.45 frames.], batch size: 14, lr: 9.82e-04 2022-05-03 21:45:14,039 INFO [train.py:715] (0/8) Epoch 1, batch 10250, loss[loss=0.2254, simple_loss=0.2793, pruned_loss=0.08573, over 4879.00 frames.], tot_loss[loss=0.1898, simple_loss=0.2521, pruned_loss=0.06371, over 972071.30 frames.], batch size: 16, lr: 9.81e-04 2022-05-03 21:45:54,208 INFO [train.py:715] (0/8) Epoch 1, batch 10300, loss[loss=0.2147, simple_loss=0.2902, pruned_loss=0.06962, over 4880.00 frames.], tot_loss[loss=0.1896, simple_loss=0.2518, pruned_loss=0.06373, over 972272.76 frames.], batch size: 16, lr: 9.81e-04 2022-05-03 21:46:34,453 INFO [train.py:715] (0/8) Epoch 1, batch 10350, loss[loss=0.1789, 
simple_loss=0.251, pruned_loss=0.05338, over 4798.00 frames.], tot_loss[loss=0.1903, simple_loss=0.2528, pruned_loss=0.06396, over 972705.19 frames.], batch size: 25, lr: 9.80e-04 2022-05-03 21:47:13,909 INFO [train.py:715] (0/8) Epoch 1, batch 10400, loss[loss=0.1965, simple_loss=0.2679, pruned_loss=0.06254, over 4816.00 frames.], tot_loss[loss=0.1894, simple_loss=0.2521, pruned_loss=0.06337, over 972206.62 frames.], batch size: 25, lr: 9.80e-04 2022-05-03 21:47:53,945 INFO [train.py:715] (0/8) Epoch 1, batch 10450, loss[loss=0.1972, simple_loss=0.2616, pruned_loss=0.06636, over 4749.00 frames.], tot_loss[loss=0.1889, simple_loss=0.2517, pruned_loss=0.06302, over 971609.88 frames.], batch size: 19, lr: 9.79e-04 2022-05-03 21:48:34,481 INFO [train.py:715] (0/8) Epoch 1, batch 10500, loss[loss=0.1695, simple_loss=0.2413, pruned_loss=0.04881, over 4826.00 frames.], tot_loss[loss=0.1886, simple_loss=0.2515, pruned_loss=0.06281, over 971456.16 frames.], batch size: 26, lr: 9.79e-04 2022-05-03 21:49:13,765 INFO [train.py:715] (0/8) Epoch 1, batch 10550, loss[loss=0.1965, simple_loss=0.2596, pruned_loss=0.06665, over 4919.00 frames.], tot_loss[loss=0.1885, simple_loss=0.2514, pruned_loss=0.06281, over 972143.66 frames.], batch size: 39, lr: 9.78e-04 2022-05-03 21:49:52,641 INFO [train.py:715] (0/8) Epoch 1, batch 10600, loss[loss=0.1724, simple_loss=0.2351, pruned_loss=0.05482, over 4707.00 frames.], tot_loss[loss=0.1899, simple_loss=0.2524, pruned_loss=0.06369, over 971849.14 frames.], batch size: 15, lr: 9.78e-04 2022-05-03 21:50:33,180 INFO [train.py:715] (0/8) Epoch 1, batch 10650, loss[loss=0.169, simple_loss=0.2385, pruned_loss=0.04975, over 4776.00 frames.], tot_loss[loss=0.1893, simple_loss=0.2518, pruned_loss=0.06346, over 972152.07 frames.], batch size: 19, lr: 9.77e-04 2022-05-03 21:51:13,730 INFO [train.py:715] (0/8) Epoch 1, batch 10700, loss[loss=0.1999, simple_loss=0.2734, pruned_loss=0.0632, over 4784.00 frames.], tot_loss[loss=0.1897, simple_loss=0.2523, pruned_loss=0.06351, over 973002.59 frames.], batch size: 18, lr: 9.76e-04 2022-05-03 21:51:52,994 INFO [train.py:715] (0/8) Epoch 1, batch 10750, loss[loss=0.1738, simple_loss=0.2505, pruned_loss=0.04858, over 4791.00 frames.], tot_loss[loss=0.1901, simple_loss=0.2522, pruned_loss=0.06403, over 972407.34 frames.], batch size: 24, lr: 9.76e-04 2022-05-03 21:52:32,277 INFO [train.py:715] (0/8) Epoch 1, batch 10800, loss[loss=0.2411, simple_loss=0.2805, pruned_loss=0.1008, over 4781.00 frames.], tot_loss[loss=0.189, simple_loss=0.2509, pruned_loss=0.06355, over 971662.90 frames.], batch size: 14, lr: 9.75e-04 2022-05-03 21:53:12,732 INFO [train.py:715] (0/8) Epoch 1, batch 10850, loss[loss=0.1837, simple_loss=0.2477, pruned_loss=0.05988, over 4890.00 frames.], tot_loss[loss=0.1878, simple_loss=0.2499, pruned_loss=0.06286, over 971735.55 frames.], batch size: 22, lr: 9.75e-04 2022-05-03 21:53:52,222 INFO [train.py:715] (0/8) Epoch 1, batch 10900, loss[loss=0.1887, simple_loss=0.2519, pruned_loss=0.06273, over 4800.00 frames.], tot_loss[loss=0.1872, simple_loss=0.2498, pruned_loss=0.06228, over 972776.24 frames.], batch size: 21, lr: 9.74e-04 2022-05-03 21:54:30,711 INFO [train.py:715] (0/8) Epoch 1, batch 10950, loss[loss=0.1676, simple_loss=0.2323, pruned_loss=0.05144, over 4988.00 frames.], tot_loss[loss=0.1863, simple_loss=0.2492, pruned_loss=0.06173, over 972260.56 frames.], batch size: 15, lr: 9.74e-04 2022-05-03 21:55:10,759 INFO [train.py:715] (0/8) Epoch 1, batch 11000, loss[loss=0.2022, simple_loss=0.2662, 
pruned_loss=0.06911, over 4854.00 frames.], tot_loss[loss=0.1871, simple_loss=0.2499, pruned_loss=0.06213, over 972106.97 frames.], batch size: 20, lr: 9.73e-04 2022-05-03 21:55:50,516 INFO [train.py:715] (0/8) Epoch 1, batch 11050, loss[loss=0.1699, simple_loss=0.2443, pruned_loss=0.04778, over 4799.00 frames.], tot_loss[loss=0.1869, simple_loss=0.2499, pruned_loss=0.06194, over 972524.21 frames.], batch size: 21, lr: 9.73e-04 2022-05-03 21:56:29,271 INFO [train.py:715] (0/8) Epoch 1, batch 11100, loss[loss=0.1882, simple_loss=0.2489, pruned_loss=0.06375, over 4985.00 frames.], tot_loss[loss=0.1859, simple_loss=0.2488, pruned_loss=0.06145, over 972284.71 frames.], batch size: 33, lr: 9.72e-04 2022-05-03 21:57:08,676 INFO [train.py:715] (0/8) Epoch 1, batch 11150, loss[loss=0.1525, simple_loss=0.2256, pruned_loss=0.03973, over 4916.00 frames.], tot_loss[loss=0.1871, simple_loss=0.2501, pruned_loss=0.06203, over 972199.24 frames.], batch size: 18, lr: 9.72e-04 2022-05-03 21:57:48,804 INFO [train.py:715] (0/8) Epoch 1, batch 11200, loss[loss=0.183, simple_loss=0.257, pruned_loss=0.05449, over 4874.00 frames.], tot_loss[loss=0.1877, simple_loss=0.2505, pruned_loss=0.06248, over 972249.82 frames.], batch size: 16, lr: 9.71e-04 2022-05-03 21:58:28,396 INFO [train.py:715] (0/8) Epoch 1, batch 11250, loss[loss=0.156, simple_loss=0.231, pruned_loss=0.04044, over 4789.00 frames.], tot_loss[loss=0.1867, simple_loss=0.2497, pruned_loss=0.06184, over 972059.88 frames.], batch size: 17, lr: 9.71e-04 2022-05-03 21:59:06,586 INFO [train.py:715] (0/8) Epoch 1, batch 11300, loss[loss=0.1709, simple_loss=0.2263, pruned_loss=0.05773, over 4936.00 frames.], tot_loss[loss=0.1856, simple_loss=0.2485, pruned_loss=0.06139, over 971521.95 frames.], batch size: 18, lr: 9.70e-04 2022-05-03 21:59:46,984 INFO [train.py:715] (0/8) Epoch 1, batch 11350, loss[loss=0.2589, simple_loss=0.3086, pruned_loss=0.1046, over 4972.00 frames.], tot_loss[loss=0.1876, simple_loss=0.25, pruned_loss=0.06255, over 971816.63 frames.], batch size: 35, lr: 9.70e-04 2022-05-03 22:00:26,698 INFO [train.py:715] (0/8) Epoch 1, batch 11400, loss[loss=0.1627, simple_loss=0.2242, pruned_loss=0.05063, over 4943.00 frames.], tot_loss[loss=0.186, simple_loss=0.249, pruned_loss=0.06151, over 971622.88 frames.], batch size: 23, lr: 9.69e-04 2022-05-03 22:01:04,858 INFO [train.py:715] (0/8) Epoch 1, batch 11450, loss[loss=0.1748, simple_loss=0.2421, pruned_loss=0.05378, over 4942.00 frames.], tot_loss[loss=0.1873, simple_loss=0.2499, pruned_loss=0.06234, over 971740.60 frames.], batch size: 21, lr: 9.69e-04 2022-05-03 22:01:44,072 INFO [train.py:715] (0/8) Epoch 1, batch 11500, loss[loss=0.2654, simple_loss=0.3199, pruned_loss=0.1054, over 4776.00 frames.], tot_loss[loss=0.1881, simple_loss=0.2506, pruned_loss=0.06282, over 971957.96 frames.], batch size: 17, lr: 9.68e-04 2022-05-03 22:02:23,958 INFO [train.py:715] (0/8) Epoch 1, batch 11550, loss[loss=0.1778, simple_loss=0.2285, pruned_loss=0.06354, over 4782.00 frames.], tot_loss[loss=0.1874, simple_loss=0.2501, pruned_loss=0.06233, over 971653.29 frames.], batch size: 17, lr: 9.68e-04 2022-05-03 22:03:03,168 INFO [train.py:715] (0/8) Epoch 1, batch 11600, loss[loss=0.1687, simple_loss=0.2487, pruned_loss=0.04438, over 4867.00 frames.], tot_loss[loss=0.1889, simple_loss=0.2514, pruned_loss=0.06324, over 971506.63 frames.], batch size: 16, lr: 9.67e-04 2022-05-03 22:03:41,498 INFO [train.py:715] (0/8) Epoch 1, batch 11650, loss[loss=0.1558, simple_loss=0.2224, pruned_loss=0.04455, over 4802.00 
frames.], tot_loss[loss=0.1895, simple_loss=0.2518, pruned_loss=0.06357, over 972127.71 frames.], batch size: 24, lr: 9.67e-04 2022-05-03 22:04:21,436 INFO [train.py:715] (0/8) Epoch 1, batch 11700, loss[loss=0.1892, simple_loss=0.2642, pruned_loss=0.05706, over 4795.00 frames.], tot_loss[loss=0.1908, simple_loss=0.2527, pruned_loss=0.06446, over 971628.46 frames.], batch size: 18, lr: 9.66e-04 2022-05-03 22:05:01,253 INFO [train.py:715] (0/8) Epoch 1, batch 11750, loss[loss=0.1909, simple_loss=0.2516, pruned_loss=0.06515, over 4796.00 frames.], tot_loss[loss=0.1896, simple_loss=0.2512, pruned_loss=0.06398, over 971715.96 frames.], batch size: 17, lr: 9.66e-04 2022-05-03 22:05:40,551 INFO [train.py:715] (0/8) Epoch 1, batch 11800, loss[loss=0.155, simple_loss=0.2268, pruned_loss=0.04161, over 4939.00 frames.], tot_loss[loss=0.1893, simple_loss=0.2511, pruned_loss=0.06373, over 971823.56 frames.], batch size: 23, lr: 9.65e-04 2022-05-03 22:06:19,258 INFO [train.py:715] (0/8) Epoch 1, batch 11850, loss[loss=0.2169, simple_loss=0.2793, pruned_loss=0.07722, over 4877.00 frames.], tot_loss[loss=0.188, simple_loss=0.2502, pruned_loss=0.06293, over 972628.63 frames.], batch size: 22, lr: 9.65e-04 2022-05-03 22:06:59,291 INFO [train.py:715] (0/8) Epoch 1, batch 11900, loss[loss=0.1893, simple_loss=0.2545, pruned_loss=0.062, over 4865.00 frames.], tot_loss[loss=0.1891, simple_loss=0.2509, pruned_loss=0.06364, over 971487.98 frames.], batch size: 38, lr: 9.64e-04 2022-05-03 22:07:38,637 INFO [train.py:715] (0/8) Epoch 1, batch 11950, loss[loss=0.1616, simple_loss=0.2421, pruned_loss=0.04053, over 4794.00 frames.], tot_loss[loss=0.1884, simple_loss=0.2505, pruned_loss=0.06315, over 970356.20 frames.], batch size: 14, lr: 9.63e-04 2022-05-03 22:08:17,123 INFO [train.py:715] (0/8) Epoch 1, batch 12000, loss[loss=0.1919, simple_loss=0.2525, pruned_loss=0.06567, over 4821.00 frames.], tot_loss[loss=0.1891, simple_loss=0.2511, pruned_loss=0.0635, over 970602.64 frames.], batch size: 26, lr: 9.63e-04 2022-05-03 22:08:17,124 INFO [train.py:733] (0/8) Computing validation loss 2022-05-03 22:08:27,632 INFO [train.py:742] (0/8) Epoch 1, validation: loss=0.1244, simple_loss=0.2116, pruned_loss=0.01858, over 914524.00 frames. 
2022-05-03 22:09:06,368 INFO [train.py:715] (0/8) Epoch 1, batch 12050, loss[loss=0.1832, simple_loss=0.2675, pruned_loss=0.04946, over 4888.00 frames.], tot_loss[loss=0.1897, simple_loss=0.2512, pruned_loss=0.06405, over 970983.56 frames.], batch size: 22, lr: 9.62e-04 2022-05-03 22:09:46,985 INFO [train.py:715] (0/8) Epoch 1, batch 12100, loss[loss=0.2008, simple_loss=0.2553, pruned_loss=0.07322, over 4926.00 frames.], tot_loss[loss=0.1882, simple_loss=0.25, pruned_loss=0.06325, over 971097.62 frames.], batch size: 23, lr: 9.62e-04 2022-05-03 22:10:27,672 INFO [train.py:715] (0/8) Epoch 1, batch 12150, loss[loss=0.1669, simple_loss=0.2362, pruned_loss=0.04874, over 4782.00 frames.], tot_loss[loss=0.1883, simple_loss=0.2504, pruned_loss=0.06312, over 970602.13 frames.], batch size: 17, lr: 9.61e-04 2022-05-03 22:11:06,643 INFO [train.py:715] (0/8) Epoch 1, batch 12200, loss[loss=0.2694, simple_loss=0.3053, pruned_loss=0.1168, over 4771.00 frames.], tot_loss[loss=0.1896, simple_loss=0.2519, pruned_loss=0.06371, over 970365.35 frames.], batch size: 17, lr: 9.61e-04 2022-05-03 22:11:46,547 INFO [train.py:715] (0/8) Epoch 1, batch 12250, loss[loss=0.19, simple_loss=0.2559, pruned_loss=0.06206, over 4690.00 frames.], tot_loss[loss=0.1907, simple_loss=0.2527, pruned_loss=0.06432, over 969726.40 frames.], batch size: 15, lr: 9.60e-04 2022-05-03 22:12:27,156 INFO [train.py:715] (0/8) Epoch 1, batch 12300, loss[loss=0.2137, simple_loss=0.2831, pruned_loss=0.07214, over 4840.00 frames.], tot_loss[loss=0.1898, simple_loss=0.252, pruned_loss=0.06378, over 969452.11 frames.], batch size: 15, lr: 9.60e-04 2022-05-03 22:13:06,777 INFO [train.py:715] (0/8) Epoch 1, batch 12350, loss[loss=0.2178, simple_loss=0.2898, pruned_loss=0.0729, over 4925.00 frames.], tot_loss[loss=0.1892, simple_loss=0.2518, pruned_loss=0.0633, over 969695.38 frames.], batch size: 23, lr: 9.59e-04 2022-05-03 22:13:45,543 INFO [train.py:715] (0/8) Epoch 1, batch 12400, loss[loss=0.2076, simple_loss=0.2797, pruned_loss=0.06778, over 4910.00 frames.], tot_loss[loss=0.1893, simple_loss=0.2522, pruned_loss=0.06322, over 969859.03 frames.], batch size: 23, lr: 9.59e-04 2022-05-03 22:14:25,689 INFO [train.py:715] (0/8) Epoch 1, batch 12450, loss[loss=0.2145, simple_loss=0.2741, pruned_loss=0.07742, over 4869.00 frames.], tot_loss[loss=0.1885, simple_loss=0.2514, pruned_loss=0.06278, over 970132.81 frames.], batch size: 34, lr: 9.58e-04 2022-05-03 22:15:05,671 INFO [train.py:715] (0/8) Epoch 1, batch 12500, loss[loss=0.1966, simple_loss=0.2421, pruned_loss=0.07548, over 4989.00 frames.], tot_loss[loss=0.1881, simple_loss=0.2505, pruned_loss=0.06289, over 970852.98 frames.], batch size: 15, lr: 9.58e-04 2022-05-03 22:15:44,876 INFO [train.py:715] (0/8) Epoch 1, batch 12550, loss[loss=0.2007, simple_loss=0.2574, pruned_loss=0.07197, over 4944.00 frames.], tot_loss[loss=0.1895, simple_loss=0.2516, pruned_loss=0.06369, over 971404.67 frames.], batch size: 29, lr: 9.57e-04 2022-05-03 22:16:24,276 INFO [train.py:715] (0/8) Epoch 1, batch 12600, loss[loss=0.1739, simple_loss=0.225, pruned_loss=0.06144, over 4734.00 frames.], tot_loss[loss=0.1892, simple_loss=0.2517, pruned_loss=0.06334, over 970059.39 frames.], batch size: 12, lr: 9.57e-04 2022-05-03 22:17:04,548 INFO [train.py:715] (0/8) Epoch 1, batch 12650, loss[loss=0.1975, simple_loss=0.2536, pruned_loss=0.07068, over 4770.00 frames.], tot_loss[loss=0.1893, simple_loss=0.2516, pruned_loss=0.06347, over 970302.45 frames.], batch size: 17, lr: 9.56e-04 2022-05-03 22:17:43,557 INFO 
[train.py:715] (0/8) Epoch 1, batch 12700, loss[loss=0.1894, simple_loss=0.2577, pruned_loss=0.06057, over 4896.00 frames.], tot_loss[loss=0.1894, simple_loss=0.2518, pruned_loss=0.06354, over 971935.74 frames.], batch size: 39, lr: 9.56e-04 2022-05-03 22:18:22,949 INFO [train.py:715] (0/8) Epoch 1, batch 12750, loss[loss=0.2007, simple_loss=0.2574, pruned_loss=0.07197, over 4799.00 frames.], tot_loss[loss=0.1882, simple_loss=0.2508, pruned_loss=0.06283, over 972357.76 frames.], batch size: 24, lr: 9.55e-04 2022-05-03 22:19:03,056 INFO [train.py:715] (0/8) Epoch 1, batch 12800, loss[loss=0.1509, simple_loss=0.2126, pruned_loss=0.0446, over 4877.00 frames.], tot_loss[loss=0.1889, simple_loss=0.251, pruned_loss=0.06336, over 972296.48 frames.], batch size: 12, lr: 9.55e-04 2022-05-03 22:19:42,871 INFO [train.py:715] (0/8) Epoch 1, batch 12850, loss[loss=0.2056, simple_loss=0.2635, pruned_loss=0.07383, over 4818.00 frames.], tot_loss[loss=0.188, simple_loss=0.2502, pruned_loss=0.06291, over 972311.66 frames.], batch size: 25, lr: 9.54e-04 2022-05-03 22:20:21,824 INFO [train.py:715] (0/8) Epoch 1, batch 12900, loss[loss=0.1742, simple_loss=0.2357, pruned_loss=0.05637, over 4637.00 frames.], tot_loss[loss=0.1891, simple_loss=0.2512, pruned_loss=0.06351, over 972188.31 frames.], batch size: 13, lr: 9.54e-04 2022-05-03 22:21:01,116 INFO [train.py:715] (0/8) Epoch 1, batch 12950, loss[loss=0.1595, simple_loss=0.2161, pruned_loss=0.05147, over 4776.00 frames.], tot_loss[loss=0.1895, simple_loss=0.2512, pruned_loss=0.0639, over 973845.43 frames.], batch size: 12, lr: 9.53e-04 2022-05-03 22:21:41,530 INFO [train.py:715] (0/8) Epoch 1, batch 13000, loss[loss=0.1972, simple_loss=0.2605, pruned_loss=0.06692, over 4949.00 frames.], tot_loss[loss=0.1896, simple_loss=0.2517, pruned_loss=0.06376, over 973999.95 frames.], batch size: 21, lr: 9.53e-04 2022-05-03 22:22:21,099 INFO [train.py:715] (0/8) Epoch 1, batch 13050, loss[loss=0.1507, simple_loss=0.2115, pruned_loss=0.04491, over 4847.00 frames.], tot_loss[loss=0.189, simple_loss=0.2513, pruned_loss=0.06335, over 974445.73 frames.], batch size: 30, lr: 9.52e-04 2022-05-03 22:23:01,181 INFO [train.py:715] (0/8) Epoch 1, batch 13100, loss[loss=0.1825, simple_loss=0.2444, pruned_loss=0.06034, over 4978.00 frames.], tot_loss[loss=0.1901, simple_loss=0.2523, pruned_loss=0.06393, over 973522.02 frames.], batch size: 24, lr: 9.52e-04 2022-05-03 22:23:41,366 INFO [train.py:715] (0/8) Epoch 1, batch 13150, loss[loss=0.2043, simple_loss=0.2627, pruned_loss=0.07299, over 4912.00 frames.], tot_loss[loss=0.1898, simple_loss=0.252, pruned_loss=0.06375, over 973108.46 frames.], batch size: 17, lr: 9.51e-04 2022-05-03 22:24:12,349 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-48000.pt 2022-05-03 22:24:23,882 INFO [train.py:715] (0/8) Epoch 1, batch 13200, loss[loss=0.1481, simple_loss=0.2247, pruned_loss=0.03577, over 4890.00 frames.], tot_loss[loss=0.1887, simple_loss=0.2513, pruned_loss=0.063, over 972382.43 frames.], batch size: 16, lr: 9.51e-04 2022-05-03 22:25:03,006 INFO [train.py:715] (0/8) Epoch 1, batch 13250, loss[loss=0.1627, simple_loss=0.2285, pruned_loss=0.04845, over 4931.00 frames.], tot_loss[loss=0.1889, simple_loss=0.2511, pruned_loss=0.06332, over 972860.07 frames.], batch size: 23, lr: 9.51e-04 2022-05-03 22:25:41,756 INFO [train.py:715] (0/8) Epoch 1, batch 13300, loss[loss=0.2311, simple_loss=0.2801, pruned_loss=0.0911, over 4771.00 frames.], tot_loss[loss=0.1879, simple_loss=0.2505, 
pruned_loss=0.06265, over 973234.48 frames.], batch size: 18, lr: 9.50e-04 2022-05-03 22:26:21,985 INFO [train.py:715] (0/8) Epoch 1, batch 13350, loss[loss=0.1662, simple_loss=0.2328, pruned_loss=0.04983, over 4900.00 frames.], tot_loss[loss=0.1882, simple_loss=0.2507, pruned_loss=0.06286, over 972666.48 frames.], batch size: 19, lr: 9.50e-04 2022-05-03 22:27:01,386 INFO [train.py:715] (0/8) Epoch 1, batch 13400, loss[loss=0.1518, simple_loss=0.225, pruned_loss=0.03937, over 4886.00 frames.], tot_loss[loss=0.1889, simple_loss=0.2509, pruned_loss=0.0635, over 973207.96 frames.], batch size: 19, lr: 9.49e-04 2022-05-03 22:27:41,360 INFO [train.py:715] (0/8) Epoch 1, batch 13450, loss[loss=0.199, simple_loss=0.2696, pruned_loss=0.06416, over 4769.00 frames.], tot_loss[loss=0.1895, simple_loss=0.2516, pruned_loss=0.06369, over 974542.17 frames.], batch size: 16, lr: 9.49e-04 2022-05-03 22:28:21,073 INFO [train.py:715] (0/8) Epoch 1, batch 13500, loss[loss=0.1749, simple_loss=0.2371, pruned_loss=0.0563, over 4893.00 frames.], tot_loss[loss=0.1888, simple_loss=0.2506, pruned_loss=0.06349, over 973962.07 frames.], batch size: 19, lr: 9.48e-04 2022-05-03 22:29:01,041 INFO [train.py:715] (0/8) Epoch 1, batch 13550, loss[loss=0.2051, simple_loss=0.2586, pruned_loss=0.07577, over 4828.00 frames.], tot_loss[loss=0.19, simple_loss=0.2515, pruned_loss=0.06427, over 973870.65 frames.], batch size: 30, lr: 9.48e-04 2022-05-03 22:29:39,303 INFO [train.py:715] (0/8) Epoch 1, batch 13600, loss[loss=0.1766, simple_loss=0.2422, pruned_loss=0.05554, over 4973.00 frames.], tot_loss[loss=0.1893, simple_loss=0.251, pruned_loss=0.06381, over 973475.10 frames.], batch size: 24, lr: 9.47e-04 2022-05-03 22:30:18,510 INFO [train.py:715] (0/8) Epoch 1, batch 13650, loss[loss=0.1686, simple_loss=0.2308, pruned_loss=0.05318, over 4944.00 frames.], tot_loss[loss=0.1882, simple_loss=0.2502, pruned_loss=0.06313, over 973824.62 frames.], batch size: 21, lr: 9.47e-04 2022-05-03 22:30:58,747 INFO [train.py:715] (0/8) Epoch 1, batch 13700, loss[loss=0.2318, simple_loss=0.2968, pruned_loss=0.08338, over 4693.00 frames.], tot_loss[loss=0.1885, simple_loss=0.2503, pruned_loss=0.06334, over 973799.18 frames.], batch size: 15, lr: 9.46e-04 2022-05-03 22:31:38,135 INFO [train.py:715] (0/8) Epoch 1, batch 13750, loss[loss=0.1553, simple_loss=0.2254, pruned_loss=0.04258, over 4815.00 frames.], tot_loss[loss=0.1866, simple_loss=0.2488, pruned_loss=0.06222, over 973717.55 frames.], batch size: 25, lr: 9.46e-04 2022-05-03 22:32:17,285 INFO [train.py:715] (0/8) Epoch 1, batch 13800, loss[loss=0.1782, simple_loss=0.2475, pruned_loss=0.0545, over 4941.00 frames.], tot_loss[loss=0.1854, simple_loss=0.248, pruned_loss=0.06143, over 972994.71 frames.], batch size: 21, lr: 9.45e-04 2022-05-03 22:32:56,970 INFO [train.py:715] (0/8) Epoch 1, batch 13850, loss[loss=0.2017, simple_loss=0.2564, pruned_loss=0.07346, over 4826.00 frames.], tot_loss[loss=0.1868, simple_loss=0.2494, pruned_loss=0.06216, over 972846.50 frames.], batch size: 26, lr: 9.45e-04 2022-05-03 22:33:36,814 INFO [train.py:715] (0/8) Epoch 1, batch 13900, loss[loss=0.1743, simple_loss=0.2355, pruned_loss=0.05656, over 4791.00 frames.], tot_loss[loss=0.1857, simple_loss=0.2482, pruned_loss=0.06153, over 972820.46 frames.], batch size: 24, lr: 9.44e-04 2022-05-03 22:34:15,310 INFO [train.py:715] (0/8) Epoch 1, batch 13950, loss[loss=0.1503, simple_loss=0.2175, pruned_loss=0.04151, over 4880.00 frames.], tot_loss[loss=0.1853, simple_loss=0.2482, pruned_loss=0.06123, over 972090.12 
frames.], batch size: 22, lr: 9.44e-04 2022-05-03 22:34:54,571 INFO [train.py:715] (0/8) Epoch 1, batch 14000, loss[loss=0.194, simple_loss=0.2559, pruned_loss=0.06608, over 4693.00 frames.], tot_loss[loss=0.1859, simple_loss=0.249, pruned_loss=0.06139, over 971674.26 frames.], batch size: 15, lr: 9.43e-04 2022-05-03 22:35:34,717 INFO [train.py:715] (0/8) Epoch 1, batch 14050, loss[loss=0.1748, simple_loss=0.2364, pruned_loss=0.05655, over 4787.00 frames.], tot_loss[loss=0.1869, simple_loss=0.25, pruned_loss=0.0619, over 971242.47 frames.], batch size: 14, lr: 9.43e-04 2022-05-03 22:36:13,518 INFO [train.py:715] (0/8) Epoch 1, batch 14100, loss[loss=0.161, simple_loss=0.2275, pruned_loss=0.04727, over 4887.00 frames.], tot_loss[loss=0.1888, simple_loss=0.2512, pruned_loss=0.06318, over 971420.75 frames.], batch size: 22, lr: 9.42e-04 2022-05-03 22:36:52,749 INFO [train.py:715] (0/8) Epoch 1, batch 14150, loss[loss=0.1941, simple_loss=0.2606, pruned_loss=0.06378, over 4870.00 frames.], tot_loss[loss=0.1889, simple_loss=0.2511, pruned_loss=0.06331, over 971255.74 frames.], batch size: 38, lr: 9.42e-04 2022-05-03 22:37:31,984 INFO [train.py:715] (0/8) Epoch 1, batch 14200, loss[loss=0.1928, simple_loss=0.2595, pruned_loss=0.06299, over 4926.00 frames.], tot_loss[loss=0.1887, simple_loss=0.2512, pruned_loss=0.06308, over 971491.87 frames.], batch size: 23, lr: 9.41e-04 2022-05-03 22:38:12,099 INFO [train.py:715] (0/8) Epoch 1, batch 14250, loss[loss=0.2235, simple_loss=0.2711, pruned_loss=0.08794, over 4810.00 frames.], tot_loss[loss=0.1882, simple_loss=0.2505, pruned_loss=0.06288, over 972190.84 frames.], batch size: 25, lr: 9.41e-04 2022-05-03 22:38:50,574 INFO [train.py:715] (0/8) Epoch 1, batch 14300, loss[loss=0.1853, simple_loss=0.2455, pruned_loss=0.06254, over 4887.00 frames.], tot_loss[loss=0.1882, simple_loss=0.2508, pruned_loss=0.06282, over 972197.74 frames.], batch size: 17, lr: 9.40e-04 2022-05-03 22:39:29,562 INFO [train.py:715] (0/8) Epoch 1, batch 14350, loss[loss=0.195, simple_loss=0.2563, pruned_loss=0.06682, over 4872.00 frames.], tot_loss[loss=0.1879, simple_loss=0.2503, pruned_loss=0.06281, over 972446.74 frames.], batch size: 16, lr: 9.40e-04 2022-05-03 22:40:09,908 INFO [train.py:715] (0/8) Epoch 1, batch 14400, loss[loss=0.218, simple_loss=0.2715, pruned_loss=0.08229, over 4955.00 frames.], tot_loss[loss=0.1874, simple_loss=0.2501, pruned_loss=0.06228, over 973302.26 frames.], batch size: 24, lr: 9.39e-04 2022-05-03 22:40:48,731 INFO [train.py:715] (0/8) Epoch 1, batch 14450, loss[loss=0.1719, simple_loss=0.235, pruned_loss=0.05441, over 4850.00 frames.], tot_loss[loss=0.1872, simple_loss=0.2502, pruned_loss=0.06207, over 972863.63 frames.], batch size: 30, lr: 9.39e-04 2022-05-03 22:41:28,255 INFO [train.py:715] (0/8) Epoch 1, batch 14500, loss[loss=0.2324, simple_loss=0.2909, pruned_loss=0.087, over 4947.00 frames.], tot_loss[loss=0.1851, simple_loss=0.2487, pruned_loss=0.06071, over 973376.55 frames.], batch size: 39, lr: 9.39e-04 2022-05-03 22:42:08,358 INFO [train.py:715] (0/8) Epoch 1, batch 14550, loss[loss=0.1983, simple_loss=0.2621, pruned_loss=0.06728, over 4957.00 frames.], tot_loss[loss=0.1849, simple_loss=0.2482, pruned_loss=0.06083, over 972343.42 frames.], batch size: 35, lr: 9.38e-04 2022-05-03 22:42:47,871 INFO [train.py:715] (0/8) Epoch 1, batch 14600, loss[loss=0.1735, simple_loss=0.237, pruned_loss=0.05504, over 4757.00 frames.], tot_loss[loss=0.1847, simple_loss=0.2479, pruned_loss=0.06074, over 971271.65 frames.], batch size: 19, lr: 9.38e-04 
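A few entries above, around batch 13200 of epoch 1, the run writes its periodic checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-48000.pt. The sketch below only opens such a file to list what was stored, under the assumption that it is an ordinary torch.save()'d dict; the stored key names are whatever the recipe chose and are not documented in this log.

```python
# Sketch: inspect the batch-level checkpoint saved above.
# Assumes the file is a plain torch.save()'d dict (an assumption, not
# something this log states).
import torch

ckpt_path = "pruned_transducer_stateless2/exp/v2/checkpoint-48000.pt"
ckpt = torch.load(ckpt_path, map_location="cpu")
if isinstance(ckpt, dict):
    for key, value in ckpt.items():
        print(f"{key}: {type(value).__name__}")
else:
    print(type(ckpt))
```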
2022-05-03 22:43:26,830 INFO [train.py:715] (0/8) Epoch 1, batch 14650, loss[loss=0.1703, simple_loss=0.2412, pruned_loss=0.04965, over 4823.00 frames.], tot_loss[loss=0.1848, simple_loss=0.2483, pruned_loss=0.06069, over 970670.84 frames.], batch size: 26, lr: 9.37e-04 2022-05-03 22:44:05,668 INFO [train.py:715] (0/8) Epoch 1, batch 14700, loss[loss=0.2023, simple_loss=0.2698, pruned_loss=0.06746, over 4964.00 frames.], tot_loss[loss=0.1845, simple_loss=0.2479, pruned_loss=0.06052, over 971049.33 frames.], batch size: 15, lr: 9.37e-04 2022-05-03 22:44:45,797 INFO [train.py:715] (0/8) Epoch 1, batch 14750, loss[loss=0.181, simple_loss=0.2568, pruned_loss=0.05262, over 4940.00 frames.], tot_loss[loss=0.1847, simple_loss=0.2478, pruned_loss=0.06079, over 971987.33 frames.], batch size: 39, lr: 9.36e-04 2022-05-03 22:45:24,940 INFO [train.py:715] (0/8) Epoch 1, batch 14800, loss[loss=0.1636, simple_loss=0.2232, pruned_loss=0.05197, over 4980.00 frames.], tot_loss[loss=0.1869, simple_loss=0.2497, pruned_loss=0.06206, over 972607.71 frames.], batch size: 14, lr: 9.36e-04 2022-05-03 22:46:04,493 INFO [train.py:715] (0/8) Epoch 1, batch 14850, loss[loss=0.1871, simple_loss=0.2615, pruned_loss=0.05636, over 4900.00 frames.], tot_loss[loss=0.1865, simple_loss=0.2491, pruned_loss=0.06194, over 972691.47 frames.], batch size: 17, lr: 9.35e-04 2022-05-03 22:46:43,816 INFO [train.py:715] (0/8) Epoch 1, batch 14900, loss[loss=0.2398, simple_loss=0.2841, pruned_loss=0.09775, over 4828.00 frames.], tot_loss[loss=0.1868, simple_loss=0.2495, pruned_loss=0.06204, over 972324.38 frames.], batch size: 15, lr: 9.35e-04 2022-05-03 22:47:22,422 INFO [train.py:715] (0/8) Epoch 1, batch 14950, loss[loss=0.212, simple_loss=0.2761, pruned_loss=0.07397, over 4855.00 frames.], tot_loss[loss=0.1884, simple_loss=0.2503, pruned_loss=0.06324, over 972797.56 frames.], batch size: 20, lr: 9.34e-04 2022-05-03 22:48:02,038 INFO [train.py:715] (0/8) Epoch 1, batch 15000, loss[loss=0.1727, simple_loss=0.2422, pruned_loss=0.05158, over 4813.00 frames.], tot_loss[loss=0.1881, simple_loss=0.2502, pruned_loss=0.06301, over 972650.68 frames.], batch size: 27, lr: 9.34e-04 2022-05-03 22:48:02,039 INFO [train.py:733] (0/8) Computing validation loss 2022-05-03 22:48:17,510 INFO [train.py:742] (0/8) Epoch 1, validation: loss=0.1242, simple_loss=0.2115, pruned_loss=0.01842, over 914524.00 frames. 
2022-05-03 22:48:57,651 INFO [train.py:715] (0/8) Epoch 1, batch 15050, loss[loss=0.1695, simple_loss=0.2285, pruned_loss=0.05529, over 4828.00 frames.], tot_loss[loss=0.1873, simple_loss=0.2495, pruned_loss=0.06254, over 972323.30 frames.], batch size: 26, lr: 9.33e-04 2022-05-03 22:49:37,561 INFO [train.py:715] (0/8) Epoch 1, batch 15100, loss[loss=0.1982, simple_loss=0.2545, pruned_loss=0.07089, over 4800.00 frames.], tot_loss[loss=0.1884, simple_loss=0.2504, pruned_loss=0.06318, over 971957.78 frames.], batch size: 21, lr: 9.33e-04 2022-05-03 22:50:18,100 INFO [train.py:715] (0/8) Epoch 1, batch 15150, loss[loss=0.2156, simple_loss=0.2697, pruned_loss=0.08077, over 4925.00 frames.], tot_loss[loss=0.1874, simple_loss=0.2496, pruned_loss=0.06262, over 971834.55 frames.], batch size: 35, lr: 9.32e-04 2022-05-03 22:50:57,478 INFO [train.py:715] (0/8) Epoch 1, batch 15200, loss[loss=0.1791, simple_loss=0.2404, pruned_loss=0.05892, over 4914.00 frames.], tot_loss[loss=0.1878, simple_loss=0.2499, pruned_loss=0.06282, over 971465.33 frames.], batch size: 17, lr: 9.32e-04 2022-05-03 22:51:37,958 INFO [train.py:715] (0/8) Epoch 1, batch 15250, loss[loss=0.1733, simple_loss=0.2347, pruned_loss=0.056, over 4889.00 frames.], tot_loss[loss=0.1882, simple_loss=0.2503, pruned_loss=0.06303, over 972796.39 frames.], batch size: 19, lr: 9.32e-04 2022-05-03 22:52:17,874 INFO [train.py:715] (0/8) Epoch 1, batch 15300, loss[loss=0.2099, simple_loss=0.2567, pruned_loss=0.08153, over 4840.00 frames.], tot_loss[loss=0.1879, simple_loss=0.2499, pruned_loss=0.06298, over 972757.67 frames.], batch size: 13, lr: 9.31e-04 2022-05-03 22:52:57,765 INFO [train.py:715] (0/8) Epoch 1, batch 15350, loss[loss=0.2096, simple_loss=0.2794, pruned_loss=0.06992, over 4854.00 frames.], tot_loss[loss=0.1889, simple_loss=0.2513, pruned_loss=0.06319, over 973047.23 frames.], batch size: 32, lr: 9.31e-04 2022-05-03 22:53:37,903 INFO [train.py:715] (0/8) Epoch 1, batch 15400, loss[loss=0.1948, simple_loss=0.2567, pruned_loss=0.06646, over 4895.00 frames.], tot_loss[loss=0.1883, simple_loss=0.251, pruned_loss=0.06283, over 973457.49 frames.], batch size: 19, lr: 9.30e-04 2022-05-03 22:54:18,172 INFO [train.py:715] (0/8) Epoch 1, batch 15450, loss[loss=0.2045, simple_loss=0.2743, pruned_loss=0.06736, over 4699.00 frames.], tot_loss[loss=0.1875, simple_loss=0.25, pruned_loss=0.06244, over 972758.56 frames.], batch size: 15, lr: 9.30e-04 2022-05-03 22:54:58,644 INFO [train.py:715] (0/8) Epoch 1, batch 15500, loss[loss=0.1542, simple_loss=0.2212, pruned_loss=0.04362, over 4802.00 frames.], tot_loss[loss=0.1861, simple_loss=0.249, pruned_loss=0.06162, over 972339.70 frames.], batch size: 24, lr: 9.29e-04 2022-05-03 22:55:37,740 INFO [train.py:715] (0/8) Epoch 1, batch 15550, loss[loss=0.1582, simple_loss=0.2243, pruned_loss=0.04608, over 4754.00 frames.], tot_loss[loss=0.1862, simple_loss=0.249, pruned_loss=0.06172, over 971131.90 frames.], batch size: 16, lr: 9.29e-04 2022-05-03 22:56:18,062 INFO [train.py:715] (0/8) Epoch 1, batch 15600, loss[loss=0.142, simple_loss=0.2182, pruned_loss=0.03295, over 4928.00 frames.], tot_loss[loss=0.1853, simple_loss=0.2481, pruned_loss=0.06125, over 970706.27 frames.], batch size: 23, lr: 9.28e-04 2022-05-03 22:56:58,357 INFO [train.py:715] (0/8) Epoch 1, batch 15650, loss[loss=0.2014, simple_loss=0.2654, pruned_loss=0.06871, over 4909.00 frames.], tot_loss[loss=0.1853, simple_loss=0.2481, pruned_loss=0.06127, over 971474.55 frames.], batch size: 29, lr: 9.28e-04 2022-05-03 22:57:38,273 INFO 
[train.py:715] (0/8) Epoch 1, batch 15700, loss[loss=0.1488, simple_loss=0.2226, pruned_loss=0.0375, over 4930.00 frames.], tot_loss[loss=0.1847, simple_loss=0.2481, pruned_loss=0.0607, over 971049.07 frames.], batch size: 18, lr: 9.27e-04 2022-05-03 22:58:17,915 INFO [train.py:715] (0/8) Epoch 1, batch 15750, loss[loss=0.2032, simple_loss=0.2633, pruned_loss=0.07157, over 4822.00 frames.], tot_loss[loss=0.1865, simple_loss=0.2494, pruned_loss=0.06178, over 971031.82 frames.], batch size: 25, lr: 9.27e-04 2022-05-03 22:58:58,194 INFO [train.py:715] (0/8) Epoch 1, batch 15800, loss[loss=0.1731, simple_loss=0.2351, pruned_loss=0.05557, over 4971.00 frames.], tot_loss[loss=0.1858, simple_loss=0.2492, pruned_loss=0.06122, over 971446.29 frames.], batch size: 24, lr: 9.27e-04 2022-05-03 22:59:38,881 INFO [train.py:715] (0/8) Epoch 1, batch 15850, loss[loss=0.2208, simple_loss=0.2707, pruned_loss=0.08546, over 4808.00 frames.], tot_loss[loss=0.1876, simple_loss=0.2506, pruned_loss=0.0623, over 970896.32 frames.], batch size: 17, lr: 9.26e-04 2022-05-03 23:00:18,437 INFO [train.py:715] (0/8) Epoch 1, batch 15900, loss[loss=0.1775, simple_loss=0.23, pruned_loss=0.0625, over 4822.00 frames.], tot_loss[loss=0.1885, simple_loss=0.2515, pruned_loss=0.0628, over 971771.59 frames.], batch size: 26, lr: 9.26e-04 2022-05-03 23:00:58,073 INFO [train.py:715] (0/8) Epoch 1, batch 15950, loss[loss=0.1774, simple_loss=0.245, pruned_loss=0.05491, over 4950.00 frames.], tot_loss[loss=0.1872, simple_loss=0.2503, pruned_loss=0.06202, over 972369.11 frames.], batch size: 29, lr: 9.25e-04 2022-05-03 23:01:37,505 INFO [train.py:715] (0/8) Epoch 1, batch 16000, loss[loss=0.17, simple_loss=0.2374, pruned_loss=0.05136, over 4932.00 frames.], tot_loss[loss=0.1875, simple_loss=0.2507, pruned_loss=0.06211, over 972841.53 frames.], batch size: 23, lr: 9.25e-04 2022-05-03 23:02:16,263 INFO [train.py:715] (0/8) Epoch 1, batch 16050, loss[loss=0.1948, simple_loss=0.2563, pruned_loss=0.06665, over 4854.00 frames.], tot_loss[loss=0.1872, simple_loss=0.2505, pruned_loss=0.06195, over 972306.09 frames.], batch size: 20, lr: 9.24e-04 2022-05-03 23:02:55,585 INFO [train.py:715] (0/8) Epoch 1, batch 16100, loss[loss=0.2003, simple_loss=0.2549, pruned_loss=0.07285, over 4836.00 frames.], tot_loss[loss=0.1869, simple_loss=0.2505, pruned_loss=0.06172, over 971947.93 frames.], batch size: 26, lr: 9.24e-04 2022-05-03 23:03:35,234 INFO [train.py:715] (0/8) Epoch 1, batch 16150, loss[loss=0.2031, simple_loss=0.2539, pruned_loss=0.07615, over 4870.00 frames.], tot_loss[loss=0.1871, simple_loss=0.2502, pruned_loss=0.06195, over 971606.66 frames.], batch size: 16, lr: 9.23e-04 2022-05-03 23:04:15,419 INFO [train.py:715] (0/8) Epoch 1, batch 16200, loss[loss=0.1683, simple_loss=0.2247, pruned_loss=0.05594, over 4842.00 frames.], tot_loss[loss=0.1865, simple_loss=0.2497, pruned_loss=0.06169, over 971550.29 frames.], batch size: 30, lr: 9.23e-04 2022-05-03 23:04:53,730 INFO [train.py:715] (0/8) Epoch 1, batch 16250, loss[loss=0.2126, simple_loss=0.2676, pruned_loss=0.07879, over 4861.00 frames.], tot_loss[loss=0.1873, simple_loss=0.2501, pruned_loss=0.06224, over 971413.05 frames.], batch size: 32, lr: 9.22e-04 2022-05-03 23:05:33,196 INFO [train.py:715] (0/8) Epoch 1, batch 16300, loss[loss=0.189, simple_loss=0.245, pruned_loss=0.06643, over 4819.00 frames.], tot_loss[loss=0.1858, simple_loss=0.2492, pruned_loss=0.06118, over 971289.53 frames.], batch size: 13, lr: 9.22e-04 2022-05-03 23:06:12,741 INFO [train.py:715] (0/8) Epoch 1, batch 16350, 
loss[loss=0.1962, simple_loss=0.2568, pruned_loss=0.06776, over 4779.00 frames.], tot_loss[loss=0.1855, simple_loss=0.249, pruned_loss=0.061, over 970536.88 frames.], batch size: 14, lr: 9.22e-04 2022-05-03 23:06:51,403 INFO [train.py:715] (0/8) Epoch 1, batch 16400, loss[loss=0.1796, simple_loss=0.2538, pruned_loss=0.05265, over 4828.00 frames.], tot_loss[loss=0.1852, simple_loss=0.2488, pruned_loss=0.06078, over 970500.72 frames.], batch size: 15, lr: 9.21e-04 2022-05-03 23:07:30,896 INFO [train.py:715] (0/8) Epoch 1, batch 16450, loss[loss=0.1322, simple_loss=0.1966, pruned_loss=0.03393, over 4829.00 frames.], tot_loss[loss=0.1855, simple_loss=0.2489, pruned_loss=0.06101, over 971218.88 frames.], batch size: 12, lr: 9.21e-04 2022-05-03 23:08:10,544 INFO [train.py:715] (0/8) Epoch 1, batch 16500, loss[loss=0.1376, simple_loss=0.2044, pruned_loss=0.03541, over 4937.00 frames.], tot_loss[loss=0.1839, simple_loss=0.2474, pruned_loss=0.06022, over 971310.96 frames.], batch size: 29, lr: 9.20e-04 2022-05-03 23:08:50,453 INFO [train.py:715] (0/8) Epoch 1, batch 16550, loss[loss=0.1574, simple_loss=0.2325, pruned_loss=0.04114, over 4920.00 frames.], tot_loss[loss=0.1841, simple_loss=0.2479, pruned_loss=0.06017, over 971158.08 frames.], batch size: 23, lr: 9.20e-04 2022-05-03 23:09:28,849 INFO [train.py:715] (0/8) Epoch 1, batch 16600, loss[loss=0.1934, simple_loss=0.2603, pruned_loss=0.06325, over 4908.00 frames.], tot_loss[loss=0.1843, simple_loss=0.2479, pruned_loss=0.06036, over 971603.39 frames.], batch size: 19, lr: 9.19e-04 2022-05-03 23:10:09,007 INFO [train.py:715] (0/8) Epoch 1, batch 16650, loss[loss=0.1812, simple_loss=0.2514, pruned_loss=0.05548, over 4961.00 frames.], tot_loss[loss=0.1849, simple_loss=0.2485, pruned_loss=0.0606, over 970685.36 frames.], batch size: 24, lr: 9.19e-04 2022-05-03 23:10:48,686 INFO [train.py:715] (0/8) Epoch 1, batch 16700, loss[loss=0.2521, simple_loss=0.2885, pruned_loss=0.1078, over 4974.00 frames.], tot_loss[loss=0.1849, simple_loss=0.2482, pruned_loss=0.06082, over 971689.60 frames.], batch size: 15, lr: 9.18e-04 2022-05-03 23:11:28,445 INFO [train.py:715] (0/8) Epoch 1, batch 16750, loss[loss=0.1571, simple_loss=0.2186, pruned_loss=0.04778, over 4933.00 frames.], tot_loss[loss=0.1847, simple_loss=0.248, pruned_loss=0.06073, over 972430.96 frames.], batch size: 23, lr: 9.18e-04 2022-05-03 23:12:08,272 INFO [train.py:715] (0/8) Epoch 1, batch 16800, loss[loss=0.1948, simple_loss=0.2575, pruned_loss=0.06606, over 4933.00 frames.], tot_loss[loss=0.1844, simple_loss=0.2478, pruned_loss=0.0605, over 972538.42 frames.], batch size: 29, lr: 9.18e-04 2022-05-03 23:12:47,929 INFO [train.py:715] (0/8) Epoch 1, batch 16850, loss[loss=0.1817, simple_loss=0.2684, pruned_loss=0.0475, over 4913.00 frames.], tot_loss[loss=0.1841, simple_loss=0.2476, pruned_loss=0.06034, over 972042.06 frames.], batch size: 18, lr: 9.17e-04 2022-05-03 23:13:27,911 INFO [train.py:715] (0/8) Epoch 1, batch 16900, loss[loss=0.1765, simple_loss=0.2427, pruned_loss=0.05516, over 4894.00 frames.], tot_loss[loss=0.1853, simple_loss=0.2485, pruned_loss=0.06101, over 972358.30 frames.], batch size: 32, lr: 9.17e-04 2022-05-03 23:14:06,933 INFO [train.py:715] (0/8) Epoch 1, batch 16950, loss[loss=0.1787, simple_loss=0.2441, pruned_loss=0.05664, over 4829.00 frames.], tot_loss[loss=0.1853, simple_loss=0.2487, pruned_loss=0.06089, over 971926.77 frames.], batch size: 30, lr: 9.16e-04 2022-05-03 23:14:46,352 INFO [train.py:715] (0/8) Epoch 1, batch 17000, loss[loss=0.1562, simple_loss=0.2298, 
pruned_loss=0.0413, over 4882.00 frames.], tot_loss[loss=0.1853, simple_loss=0.2486, pruned_loss=0.06102, over 971875.29 frames.], batch size: 22, lr: 9.16e-04 2022-05-03 23:15:26,356 INFO [train.py:715] (0/8) Epoch 1, batch 17050, loss[loss=0.191, simple_loss=0.2634, pruned_loss=0.05928, over 4840.00 frames.], tot_loss[loss=0.1856, simple_loss=0.249, pruned_loss=0.0611, over 971848.65 frames.], batch size: 32, lr: 9.15e-04 2022-05-03 23:16:05,142 INFO [train.py:715] (0/8) Epoch 1, batch 17100, loss[loss=0.1733, simple_loss=0.2321, pruned_loss=0.05725, over 4710.00 frames.], tot_loss[loss=0.1848, simple_loss=0.2483, pruned_loss=0.0607, over 971660.71 frames.], batch size: 15, lr: 9.15e-04 2022-05-03 23:16:44,867 INFO [train.py:715] (0/8) Epoch 1, batch 17150, loss[loss=0.1424, simple_loss=0.2041, pruned_loss=0.04039, over 4737.00 frames.], tot_loss[loss=0.1854, simple_loss=0.2487, pruned_loss=0.06099, over 971382.63 frames.], batch size: 12, lr: 9.15e-04 2022-05-03 23:17:25,497 INFO [train.py:715] (0/8) Epoch 1, batch 17200, loss[loss=0.2012, simple_loss=0.268, pruned_loss=0.06721, over 4943.00 frames.], tot_loss[loss=0.185, simple_loss=0.2488, pruned_loss=0.06064, over 972026.36 frames.], batch size: 21, lr: 9.14e-04 2022-05-03 23:18:05,280 INFO [train.py:715] (0/8) Epoch 1, batch 17250, loss[loss=0.1891, simple_loss=0.2485, pruned_loss=0.06484, over 4863.00 frames.], tot_loss[loss=0.1852, simple_loss=0.2486, pruned_loss=0.06095, over 972435.87 frames.], batch size: 32, lr: 9.14e-04 2022-05-03 23:18:43,792 INFO [train.py:715] (0/8) Epoch 1, batch 17300, loss[loss=0.1893, simple_loss=0.2536, pruned_loss=0.06245, over 4935.00 frames.], tot_loss[loss=0.1839, simple_loss=0.2473, pruned_loss=0.06029, over 972877.47 frames.], batch size: 23, lr: 9.13e-04 2022-05-03 23:19:23,811 INFO [train.py:715] (0/8) Epoch 1, batch 17350, loss[loss=0.2035, simple_loss=0.2582, pruned_loss=0.07442, over 4703.00 frames.], tot_loss[loss=0.1836, simple_loss=0.2472, pruned_loss=0.05997, over 972456.89 frames.], batch size: 15, lr: 9.13e-04 2022-05-03 23:20:03,647 INFO [train.py:715] (0/8) Epoch 1, batch 17400, loss[loss=0.2079, simple_loss=0.2635, pruned_loss=0.07611, over 4831.00 frames.], tot_loss[loss=0.1829, simple_loss=0.2466, pruned_loss=0.05961, over 972961.74 frames.], batch size: 13, lr: 9.12e-04 2022-05-03 23:20:42,897 INFO [train.py:715] (0/8) Epoch 1, batch 17450, loss[loss=0.1624, simple_loss=0.2277, pruned_loss=0.04858, over 4804.00 frames.], tot_loss[loss=0.1834, simple_loss=0.2472, pruned_loss=0.05975, over 973000.95 frames.], batch size: 12, lr: 9.12e-04 2022-05-03 23:21:23,300 INFO [train.py:715] (0/8) Epoch 1, batch 17500, loss[loss=0.2037, simple_loss=0.2649, pruned_loss=0.07122, over 4788.00 frames.], tot_loss[loss=0.1849, simple_loss=0.248, pruned_loss=0.06089, over 972303.20 frames.], batch size: 12, lr: 9.11e-04 2022-05-03 23:22:03,728 INFO [train.py:715] (0/8) Epoch 1, batch 17550, loss[loss=0.1403, simple_loss=0.2176, pruned_loss=0.03151, over 4811.00 frames.], tot_loss[loss=0.1859, simple_loss=0.2487, pruned_loss=0.06152, over 972031.12 frames.], batch size: 14, lr: 9.11e-04 2022-05-03 23:22:44,351 INFO [train.py:715] (0/8) Epoch 1, batch 17600, loss[loss=0.1703, simple_loss=0.2359, pruned_loss=0.05232, over 4880.00 frames.], tot_loss[loss=0.186, simple_loss=0.2491, pruned_loss=0.06146, over 972202.66 frames.], batch size: 22, lr: 9.11e-04 2022-05-03 23:23:24,045 INFO [train.py:715] (0/8) Epoch 1, batch 17650, loss[loss=0.1708, simple_loss=0.2398, pruned_loss=0.05096, over 4943.00 
frames.], tot_loss[loss=0.186, simple_loss=0.2492, pruned_loss=0.0614, over 972403.48 frames.], batch size: 23, lr: 9.10e-04 2022-05-03 23:24:04,741 INFO [train.py:715] (0/8) Epoch 1, batch 17700, loss[loss=0.1597, simple_loss=0.2275, pruned_loss=0.04593, over 4945.00 frames.], tot_loss[loss=0.1846, simple_loss=0.2476, pruned_loss=0.06081, over 972711.50 frames.], batch size: 23, lr: 9.10e-04 2022-05-03 23:24:44,988 INFO [train.py:715] (0/8) Epoch 1, batch 17750, loss[loss=0.2347, simple_loss=0.273, pruned_loss=0.09823, over 4864.00 frames.], tot_loss[loss=0.1862, simple_loss=0.2486, pruned_loss=0.06187, over 973368.97 frames.], batch size: 38, lr: 9.09e-04 2022-05-03 23:25:24,524 INFO [train.py:715] (0/8) Epoch 1, batch 17800, loss[loss=0.1851, simple_loss=0.2507, pruned_loss=0.05974, over 4873.00 frames.], tot_loss[loss=0.1849, simple_loss=0.248, pruned_loss=0.06089, over 972742.01 frames.], batch size: 16, lr: 9.09e-04 2022-05-03 23:26:04,932 INFO [train.py:715] (0/8) Epoch 1, batch 17850, loss[loss=0.2195, simple_loss=0.2885, pruned_loss=0.0752, over 4873.00 frames.], tot_loss[loss=0.1848, simple_loss=0.2481, pruned_loss=0.06078, over 971607.51 frames.], batch size: 20, lr: 9.08e-04 2022-05-03 23:26:44,327 INFO [train.py:715] (0/8) Epoch 1, batch 17900, loss[loss=0.1535, simple_loss=0.2115, pruned_loss=0.04779, over 4913.00 frames.], tot_loss[loss=0.1848, simple_loss=0.2478, pruned_loss=0.06086, over 972428.63 frames.], batch size: 17, lr: 9.08e-04 2022-05-03 23:27:23,560 INFO [train.py:715] (0/8) Epoch 1, batch 17950, loss[loss=0.1776, simple_loss=0.2376, pruned_loss=0.05881, over 4914.00 frames.], tot_loss[loss=0.1855, simple_loss=0.2485, pruned_loss=0.0613, over 972623.86 frames.], batch size: 17, lr: 9.08e-04 2022-05-03 23:28:02,862 INFO [train.py:715] (0/8) Epoch 1, batch 18000, loss[loss=0.1895, simple_loss=0.2561, pruned_loss=0.06139, over 4983.00 frames.], tot_loss[loss=0.1851, simple_loss=0.2482, pruned_loss=0.06099, over 973193.45 frames.], batch size: 14, lr: 9.07e-04 2022-05-03 23:28:02,863 INFO [train.py:733] (0/8) Computing validation loss 2022-05-03 23:28:17,471 INFO [train.py:742] (0/8) Epoch 1, validation: loss=0.123, simple_loss=0.21, pruned_loss=0.01804, over 914524.00 frames. 
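The validation passes logged so far (batches 9000, 12000, 15000, and 18000, each over the same 914524 frames) drift down slowly: 0.1253, 0.1244, 0.1242, 0.1230. A small sketch for extracting that trend from a saved copy of this log follows; the file name train-log.txt is a placeholder for wherever the log is stored.

```python
# Sketch: collect the validation losses from a saved copy of this log.
# "train-log.txt" is a placeholder path, not a file named in the log.
import re

VALID_RE = re.compile(
    r"validation: loss=([\d.]+), simple_loss=([\d.]+), pruned_loss=([\d.]+)"
)

with open("train-log.txt") as f:
    for line in f:
        m = VALID_RE.search(line)
        if m:
            loss, simple, pruned = map(float, m.groups())
            print(f"valid loss={loss:.4f} (simple={simple:.4f}, pruned={pruned:.5f})")
```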
2022-05-03 23:28:56,685 INFO [train.py:715] (0/8) Epoch 1, batch 18050, loss[loss=0.157, simple_loss=0.2275, pruned_loss=0.04321, over 4901.00 frames.], tot_loss[loss=0.1851, simple_loss=0.2481, pruned_loss=0.06103, over 972497.42 frames.], batch size: 19, lr: 9.07e-04 2022-05-03 23:29:37,121 INFO [train.py:715] (0/8) Epoch 1, batch 18100, loss[loss=0.1789, simple_loss=0.247, pruned_loss=0.05539, over 4794.00 frames.], tot_loss[loss=0.1842, simple_loss=0.2474, pruned_loss=0.06052, over 972368.60 frames.], batch size: 17, lr: 9.06e-04 2022-05-03 23:30:16,934 INFO [train.py:715] (0/8) Epoch 1, batch 18150, loss[loss=0.1579, simple_loss=0.2282, pruned_loss=0.04386, over 4854.00 frames.], tot_loss[loss=0.1852, simple_loss=0.2484, pruned_loss=0.06103, over 973529.79 frames.], batch size: 20, lr: 9.06e-04 2022-05-03 23:30:55,304 INFO [train.py:715] (0/8) Epoch 1, batch 18200, loss[loss=0.1713, simple_loss=0.2382, pruned_loss=0.05225, over 4685.00 frames.], tot_loss[loss=0.1838, simple_loss=0.2472, pruned_loss=0.06024, over 972618.91 frames.], batch size: 15, lr: 9.05e-04 2022-05-03 23:31:34,990 INFO [train.py:715] (0/8) Epoch 1, batch 18250, loss[loss=0.1501, simple_loss=0.22, pruned_loss=0.04015, over 4963.00 frames.], tot_loss[loss=0.1836, simple_loss=0.2471, pruned_loss=0.06008, over 972838.37 frames.], batch size: 24, lr: 9.05e-04 2022-05-03 23:32:14,617 INFO [train.py:715] (0/8) Epoch 1, batch 18300, loss[loss=0.1839, simple_loss=0.2566, pruned_loss=0.05564, over 4899.00 frames.], tot_loss[loss=0.1839, simple_loss=0.247, pruned_loss=0.06036, over 972379.35 frames.], batch size: 29, lr: 9.05e-04 2022-05-03 23:32:53,398 INFO [train.py:715] (0/8) Epoch 1, batch 18350, loss[loss=0.1916, simple_loss=0.261, pruned_loss=0.06109, over 4844.00 frames.], tot_loss[loss=0.185, simple_loss=0.2483, pruned_loss=0.06085, over 973028.70 frames.], batch size: 32, lr: 9.04e-04 2022-05-03 23:33:33,136 INFO [train.py:715] (0/8) Epoch 1, batch 18400, loss[loss=0.151, simple_loss=0.2223, pruned_loss=0.03987, over 4773.00 frames.], tot_loss[loss=0.1841, simple_loss=0.2475, pruned_loss=0.0604, over 972476.19 frames.], batch size: 17, lr: 9.04e-04 2022-05-03 23:34:13,413 INFO [train.py:715] (0/8) Epoch 1, batch 18450, loss[loss=0.218, simple_loss=0.2789, pruned_loss=0.07857, over 4904.00 frames.], tot_loss[loss=0.1851, simple_loss=0.2485, pruned_loss=0.06081, over 972906.05 frames.], batch size: 17, lr: 9.03e-04 2022-05-03 23:34:52,243 INFO [train.py:715] (0/8) Epoch 1, batch 18500, loss[loss=0.1768, simple_loss=0.2392, pruned_loss=0.05716, over 4960.00 frames.], tot_loss[loss=0.1844, simple_loss=0.2477, pruned_loss=0.0605, over 972711.22 frames.], batch size: 21, lr: 9.03e-04 2022-05-03 23:35:31,275 INFO [train.py:715] (0/8) Epoch 1, batch 18550, loss[loss=0.1858, simple_loss=0.2429, pruned_loss=0.06434, over 4707.00 frames.], tot_loss[loss=0.185, simple_loss=0.2484, pruned_loss=0.06075, over 972178.89 frames.], batch size: 15, lr: 9.03e-04 2022-05-03 23:36:11,454 INFO [train.py:715] (0/8) Epoch 1, batch 18600, loss[loss=0.1517, simple_loss=0.231, pruned_loss=0.03616, over 4987.00 frames.], tot_loss[loss=0.1843, simple_loss=0.2478, pruned_loss=0.06038, over 972159.46 frames.], batch size: 20, lr: 9.02e-04 2022-05-03 23:36:50,772 INFO [train.py:715] (0/8) Epoch 1, batch 18650, loss[loss=0.1891, simple_loss=0.2571, pruned_loss=0.06055, over 4974.00 frames.], tot_loss[loss=0.1841, simple_loss=0.2476, pruned_loss=0.06027, over 972116.91 frames.], batch size: 24, lr: 9.02e-04 2022-05-03 23:37:29,515 INFO [train.py:715] 
(0/8) Epoch 1, batch 18700, loss[loss=0.176, simple_loss=0.2391, pruned_loss=0.05642, over 4976.00 frames.], tot_loss[loss=0.1838, simple_loss=0.2474, pruned_loss=0.06014, over 973226.74 frames.], batch size: 14, lr: 9.01e-04 2022-05-03 23:38:08,765 INFO [train.py:715] (0/8) Epoch 1, batch 18750, loss[loss=0.1654, simple_loss=0.2344, pruned_loss=0.04815, over 4786.00 frames.], tot_loss[loss=0.1842, simple_loss=0.2477, pruned_loss=0.06041, over 973093.73 frames.], batch size: 18, lr: 9.01e-04 2022-05-03 23:38:48,691 INFO [train.py:715] (0/8) Epoch 1, batch 18800, loss[loss=0.2023, simple_loss=0.2545, pruned_loss=0.0751, over 4820.00 frames.], tot_loss[loss=0.1837, simple_loss=0.2472, pruned_loss=0.06006, over 973098.06 frames.], batch size: 13, lr: 9.00e-04 2022-05-03 23:39:27,392 INFO [train.py:715] (0/8) Epoch 1, batch 18850, loss[loss=0.1883, simple_loss=0.2515, pruned_loss=0.06251, over 4933.00 frames.], tot_loss[loss=0.1832, simple_loss=0.247, pruned_loss=0.05966, over 973831.98 frames.], batch size: 21, lr: 9.00e-04 2022-05-03 23:40:06,877 INFO [train.py:715] (0/8) Epoch 1, batch 18900, loss[loss=0.1599, simple_loss=0.2195, pruned_loss=0.05009, over 4773.00 frames.], tot_loss[loss=0.1834, simple_loss=0.2469, pruned_loss=0.05994, over 973568.72 frames.], batch size: 12, lr: 9.00e-04 2022-05-03 23:40:46,610 INFO [train.py:715] (0/8) Epoch 1, batch 18950, loss[loss=0.211, simple_loss=0.2604, pruned_loss=0.08076, over 4913.00 frames.], tot_loss[loss=0.1834, simple_loss=0.2472, pruned_loss=0.05979, over 973675.48 frames.], batch size: 18, lr: 8.99e-04 2022-05-03 23:41:25,999 INFO [train.py:715] (0/8) Epoch 1, batch 19000, loss[loss=0.1936, simple_loss=0.2598, pruned_loss=0.06372, over 4907.00 frames.], tot_loss[loss=0.1832, simple_loss=0.2474, pruned_loss=0.05955, over 973456.98 frames.], batch size: 18, lr: 8.99e-04 2022-05-03 23:42:05,679 INFO [train.py:715] (0/8) Epoch 1, batch 19050, loss[loss=0.1759, simple_loss=0.2385, pruned_loss=0.05668, over 4820.00 frames.], tot_loss[loss=0.1831, simple_loss=0.2473, pruned_loss=0.05943, over 973545.95 frames.], batch size: 25, lr: 8.98e-04 2022-05-03 23:42:44,851 INFO [train.py:715] (0/8) Epoch 1, batch 19100, loss[loss=0.1789, simple_loss=0.2521, pruned_loss=0.05291, over 4816.00 frames.], tot_loss[loss=0.1829, simple_loss=0.247, pruned_loss=0.05937, over 973361.53 frames.], batch size: 25, lr: 8.98e-04 2022-05-03 23:43:24,776 INFO [train.py:715] (0/8) Epoch 1, batch 19150, loss[loss=0.1751, simple_loss=0.2413, pruned_loss=0.05446, over 4896.00 frames.], tot_loss[loss=0.1834, simple_loss=0.2472, pruned_loss=0.0598, over 973065.73 frames.], batch size: 19, lr: 8.98e-04 2022-05-03 23:44:03,412 INFO [train.py:715] (0/8) Epoch 1, batch 19200, loss[loss=0.1795, simple_loss=0.2536, pruned_loss=0.05273, over 4679.00 frames.], tot_loss[loss=0.1819, simple_loss=0.2462, pruned_loss=0.05879, over 972561.78 frames.], batch size: 15, lr: 8.97e-04 2022-05-03 23:44:42,700 INFO [train.py:715] (0/8) Epoch 1, batch 19250, loss[loss=0.2114, simple_loss=0.2769, pruned_loss=0.07295, over 4822.00 frames.], tot_loss[loss=0.1815, simple_loss=0.2457, pruned_loss=0.05867, over 972748.08 frames.], batch size: 25, lr: 8.97e-04 2022-05-03 23:45:23,330 INFO [train.py:715] (0/8) Epoch 1, batch 19300, loss[loss=0.1917, simple_loss=0.2551, pruned_loss=0.06415, over 4886.00 frames.], tot_loss[loss=0.1822, simple_loss=0.2458, pruned_loss=0.05933, over 972400.17 frames.], batch size: 16, lr: 8.96e-04 2022-05-03 23:46:02,789 INFO [train.py:715] (0/8) Epoch 1, batch 19350, 
loss[loss=0.1703, simple_loss=0.2346, pruned_loss=0.05298, over 4816.00 frames.], tot_loss[loss=0.1812, simple_loss=0.2446, pruned_loss=0.05885, over 973183.00 frames.], batch size: 27, lr: 8.96e-04 2022-05-03 23:46:41,173 INFO [train.py:715] (0/8) Epoch 1, batch 19400, loss[loss=0.1594, simple_loss=0.2265, pruned_loss=0.04612, over 4942.00 frames.], tot_loss[loss=0.1817, simple_loss=0.2456, pruned_loss=0.0589, over 972893.48 frames.], batch size: 21, lr: 8.95e-04 2022-05-03 23:47:20,597 INFO [train.py:715] (0/8) Epoch 1, batch 19450, loss[loss=0.1878, simple_loss=0.2483, pruned_loss=0.0637, over 4866.00 frames.], tot_loss[loss=0.1817, simple_loss=0.2457, pruned_loss=0.05882, over 972960.60 frames.], batch size: 20, lr: 8.95e-04 2022-05-03 23:48:00,488 INFO [train.py:715] (0/8) Epoch 1, batch 19500, loss[loss=0.1913, simple_loss=0.2521, pruned_loss=0.06521, over 4750.00 frames.], tot_loss[loss=0.1828, simple_loss=0.2464, pruned_loss=0.05962, over 973124.73 frames.], batch size: 16, lr: 8.95e-04 2022-05-03 23:48:39,204 INFO [train.py:715] (0/8) Epoch 1, batch 19550, loss[loss=0.1791, simple_loss=0.241, pruned_loss=0.05857, over 4980.00 frames.], tot_loss[loss=0.1834, simple_loss=0.247, pruned_loss=0.05997, over 973639.74 frames.], batch size: 31, lr: 8.94e-04 2022-05-03 23:49:18,329 INFO [train.py:715] (0/8) Epoch 1, batch 19600, loss[loss=0.1792, simple_loss=0.2423, pruned_loss=0.05801, over 4910.00 frames.], tot_loss[loss=0.1845, simple_loss=0.2478, pruned_loss=0.06062, over 973207.67 frames.], batch size: 19, lr: 8.94e-04 2022-05-03 23:49:58,544 INFO [train.py:715] (0/8) Epoch 1, batch 19650, loss[loss=0.188, simple_loss=0.2471, pruned_loss=0.06446, over 4750.00 frames.], tot_loss[loss=0.1842, simple_loss=0.2474, pruned_loss=0.06043, over 972705.53 frames.], batch size: 19, lr: 8.93e-04 2022-05-03 23:50:37,449 INFO [train.py:715] (0/8) Epoch 1, batch 19700, loss[loss=0.157, simple_loss=0.2283, pruned_loss=0.04282, over 4891.00 frames.], tot_loss[loss=0.1828, simple_loss=0.2462, pruned_loss=0.05968, over 971737.34 frames.], batch size: 22, lr: 8.93e-04 2022-05-03 23:51:16,600 INFO [train.py:715] (0/8) Epoch 1, batch 19750, loss[loss=0.199, simple_loss=0.2542, pruned_loss=0.07183, over 4876.00 frames.], tot_loss[loss=0.1835, simple_loss=0.2467, pruned_loss=0.06018, over 971583.52 frames.], batch size: 38, lr: 8.93e-04 2022-05-03 23:51:56,243 INFO [train.py:715] (0/8) Epoch 1, batch 19800, loss[loss=0.2028, simple_loss=0.266, pruned_loss=0.06983, over 4971.00 frames.], tot_loss[loss=0.1854, simple_loss=0.2483, pruned_loss=0.06127, over 972511.94 frames.], batch size: 24, lr: 8.92e-04 2022-05-03 23:52:36,510 INFO [train.py:715] (0/8) Epoch 1, batch 19850, loss[loss=0.1966, simple_loss=0.2643, pruned_loss=0.06446, over 4974.00 frames.], tot_loss[loss=0.1863, simple_loss=0.2493, pruned_loss=0.06165, over 972373.24 frames.], batch size: 24, lr: 8.92e-04 2022-05-03 23:53:15,894 INFO [train.py:715] (0/8) Epoch 1, batch 19900, loss[loss=0.1387, simple_loss=0.2141, pruned_loss=0.03165, over 4932.00 frames.], tot_loss[loss=0.1848, simple_loss=0.2483, pruned_loss=0.06067, over 973096.93 frames.], batch size: 18, lr: 8.91e-04 2022-05-03 23:53:54,991 INFO [train.py:715] (0/8) Epoch 1, batch 19950, loss[loss=0.1825, simple_loss=0.2344, pruned_loss=0.06534, over 4932.00 frames.], tot_loss[loss=0.1848, simple_loss=0.2482, pruned_loss=0.06069, over 973327.54 frames.], batch size: 23, lr: 8.91e-04 2022-05-03 23:54:35,254 INFO [train.py:715] (0/8) Epoch 1, batch 20000, loss[loss=0.1844, simple_loss=0.2494, 
pruned_loss=0.05968, over 4853.00 frames.], tot_loss[loss=0.1849, simple_loss=0.2489, pruned_loss=0.06041, over 973448.40 frames.], batch size: 13, lr: 8.91e-04 2022-05-03 23:55:14,864 INFO [train.py:715] (0/8) Epoch 1, batch 20050, loss[loss=0.1822, simple_loss=0.2398, pruned_loss=0.06236, over 4879.00 frames.], tot_loss[loss=0.1843, simple_loss=0.2482, pruned_loss=0.06023, over 973027.33 frames.], batch size: 20, lr: 8.90e-04 2022-05-03 23:55:54,267 INFO [train.py:715] (0/8) Epoch 1, batch 20100, loss[loss=0.1563, simple_loss=0.2275, pruned_loss=0.04259, over 4942.00 frames.], tot_loss[loss=0.1839, simple_loss=0.2475, pruned_loss=0.06016, over 972718.48 frames.], batch size: 21, lr: 8.90e-04 2022-05-03 23:56:34,288 INFO [train.py:715] (0/8) Epoch 1, batch 20150, loss[loss=0.1823, simple_loss=0.2465, pruned_loss=0.05904, over 4954.00 frames.], tot_loss[loss=0.1837, simple_loss=0.2473, pruned_loss=0.06004, over 972922.37 frames.], batch size: 21, lr: 8.89e-04 2022-05-03 23:57:15,159 INFO [train.py:715] (0/8) Epoch 1, batch 20200, loss[loss=0.2151, simple_loss=0.2709, pruned_loss=0.07968, over 4978.00 frames.], tot_loss[loss=0.1834, simple_loss=0.247, pruned_loss=0.05988, over 973585.93 frames.], batch size: 15, lr: 8.89e-04 2022-05-03 23:57:53,974 INFO [train.py:715] (0/8) Epoch 1, batch 20250, loss[loss=0.1674, simple_loss=0.2379, pruned_loss=0.04845, over 4869.00 frames.], tot_loss[loss=0.1832, simple_loss=0.2476, pruned_loss=0.05939, over 973245.91 frames.], batch size: 16, lr: 8.89e-04 2022-05-03 23:58:33,274 INFO [train.py:715] (0/8) Epoch 1, batch 20300, loss[loss=0.1444, simple_loss=0.2176, pruned_loss=0.03556, over 4827.00 frames.], tot_loss[loss=0.1838, simple_loss=0.248, pruned_loss=0.05983, over 973138.49 frames.], batch size: 27, lr: 8.88e-04 2022-05-03 23:59:13,203 INFO [train.py:715] (0/8) Epoch 1, batch 20350, loss[loss=0.156, simple_loss=0.2224, pruned_loss=0.0448, over 4988.00 frames.], tot_loss[loss=0.1852, simple_loss=0.249, pruned_loss=0.06069, over 972815.16 frames.], batch size: 25, lr: 8.88e-04 2022-05-03 23:59:51,741 INFO [train.py:715] (0/8) Epoch 1, batch 20400, loss[loss=0.2061, simple_loss=0.2746, pruned_loss=0.06878, over 4887.00 frames.], tot_loss[loss=0.1855, simple_loss=0.249, pruned_loss=0.06095, over 972994.14 frames.], batch size: 22, lr: 8.87e-04 2022-05-04 00:00:31,300 INFO [train.py:715] (0/8) Epoch 1, batch 20450, loss[loss=0.185, simple_loss=0.2425, pruned_loss=0.06377, over 4962.00 frames.], tot_loss[loss=0.1851, simple_loss=0.249, pruned_loss=0.06061, over 973549.69 frames.], batch size: 24, lr: 8.87e-04 2022-05-04 00:01:10,343 INFO [train.py:715] (0/8) Epoch 1, batch 20500, loss[loss=0.1937, simple_loss=0.2598, pruned_loss=0.06379, over 4756.00 frames.], tot_loss[loss=0.1845, simple_loss=0.2484, pruned_loss=0.06029, over 973464.79 frames.], batch size: 12, lr: 8.87e-04 2022-05-04 00:01:50,044 INFO [train.py:715] (0/8) Epoch 1, batch 20550, loss[loss=0.1719, simple_loss=0.2436, pruned_loss=0.0501, over 4844.00 frames.], tot_loss[loss=0.184, simple_loss=0.2477, pruned_loss=0.0601, over 973347.30 frames.], batch size: 15, lr: 8.86e-04 2022-05-04 00:02:28,910 INFO [train.py:715] (0/8) Epoch 1, batch 20600, loss[loss=0.1708, simple_loss=0.2345, pruned_loss=0.05357, over 4843.00 frames.], tot_loss[loss=0.1836, simple_loss=0.2473, pruned_loss=0.06001, over 973370.32 frames.], batch size: 13, lr: 8.86e-04 2022-05-04 00:03:08,454 INFO [train.py:715] (0/8) Epoch 1, batch 20650, loss[loss=0.1815, simple_loss=0.2511, pruned_loss=0.05591, over 4939.00 
frames.], tot_loss[loss=0.1844, simple_loss=0.248, pruned_loss=0.06039, over 971914.58 frames.], batch size: 35, lr: 8.85e-04 2022-05-04 00:03:48,939 INFO [train.py:715] (0/8) Epoch 1, batch 20700, loss[loss=0.183, simple_loss=0.2599, pruned_loss=0.05303, over 4941.00 frames.], tot_loss[loss=0.185, simple_loss=0.2483, pruned_loss=0.06085, over 972659.71 frames.], batch size: 23, lr: 8.85e-04 2022-05-04 00:04:28,581 INFO [train.py:715] (0/8) Epoch 1, batch 20750, loss[loss=0.1594, simple_loss=0.221, pruned_loss=0.04893, over 4911.00 frames.], tot_loss[loss=0.1838, simple_loss=0.2477, pruned_loss=0.05993, over 972769.48 frames.], batch size: 18, lr: 8.85e-04 2022-05-04 00:05:07,882 INFO [train.py:715] (0/8) Epoch 1, batch 20800, loss[loss=0.2222, simple_loss=0.2731, pruned_loss=0.08562, over 4934.00 frames.], tot_loss[loss=0.1828, simple_loss=0.2467, pruned_loss=0.05944, over 972997.32 frames.], batch size: 39, lr: 8.84e-04 2022-05-04 00:05:47,750 INFO [train.py:715] (0/8) Epoch 1, batch 20850, loss[loss=0.1528, simple_loss=0.2156, pruned_loss=0.045, over 4727.00 frames.], tot_loss[loss=0.1828, simple_loss=0.2466, pruned_loss=0.05945, over 972243.72 frames.], batch size: 12, lr: 8.84e-04 2022-05-04 00:06:27,487 INFO [train.py:715] (0/8) Epoch 1, batch 20900, loss[loss=0.1869, simple_loss=0.2364, pruned_loss=0.06872, over 4863.00 frames.], tot_loss[loss=0.182, simple_loss=0.2459, pruned_loss=0.05912, over 971986.25 frames.], batch size: 16, lr: 8.83e-04 2022-05-04 00:07:06,279 INFO [train.py:715] (0/8) Epoch 1, batch 20950, loss[loss=0.2267, simple_loss=0.2858, pruned_loss=0.0838, over 4781.00 frames.], tot_loss[loss=0.1813, simple_loss=0.2452, pruned_loss=0.05867, over 971678.85 frames.], batch size: 17, lr: 8.83e-04 2022-05-04 00:07:45,660 INFO [train.py:715] (0/8) Epoch 1, batch 21000, loss[loss=0.1835, simple_loss=0.2547, pruned_loss=0.05613, over 4748.00 frames.], tot_loss[loss=0.1827, simple_loss=0.2466, pruned_loss=0.05942, over 972879.33 frames.], batch size: 16, lr: 8.83e-04 2022-05-04 00:07:45,661 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 00:08:00,762 INFO [train.py:742] (0/8) Epoch 1, validation: loss=0.1226, simple_loss=0.2094, pruned_loss=0.01784, over 914524.00 frames. 
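A note on reading these entries: across this section the headline loss tracks a weighted sum of the two components printed beside it, loss ~= 0.5 * simple_loss + pruned_loss, where 0.5 is the simple-loss scale this run was configured with. The short check below just replays that arithmetic on two entries from the lines above; the helper name and the 0.5 default are written down here for illustration and are not code from train.py.

    # Hedged sketch: verify that the logged loss decomposes as
    #   loss ~= simple_loss_scale * simple_loss + pruned_loss
    # using values copied from the entries above (simple_loss_scale assumed 0.5).
    def combined(simple_loss, pruned_loss, simple_loss_scale=0.5):
        return simple_loss_scale * simple_loss + pruned_loss

    # Epoch 1, batch 21000 (training entry above)
    print(round(combined(0.2547, 0.05613), 4))  # 0.1835, matches the logged loss=0.1835
    # Epoch 1, first validation entry above
    print(round(combined(0.2094, 0.01784), 4))  # 0.1225, vs logged 0.1226 (inputs are already rounded)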
2022-05-04 00:08:40,112 INFO [train.py:715] (0/8) Epoch 1, batch 21050, loss[loss=0.1853, simple_loss=0.2634, pruned_loss=0.05357, over 4984.00 frames.], tot_loss[loss=0.1835, simple_loss=0.2472, pruned_loss=0.05984, over 973048.06 frames.], batch size: 28, lr: 8.82e-04 2022-05-04 00:09:19,952 INFO [train.py:715] (0/8) Epoch 1, batch 21100, loss[loss=0.179, simple_loss=0.2276, pruned_loss=0.0652, over 4877.00 frames.], tot_loss[loss=0.183, simple_loss=0.2466, pruned_loss=0.05967, over 973382.78 frames.], batch size: 32, lr: 8.82e-04 2022-05-04 00:09:58,325 INFO [train.py:715] (0/8) Epoch 1, batch 21150, loss[loss=0.1815, simple_loss=0.2463, pruned_loss=0.05836, over 4698.00 frames.], tot_loss[loss=0.1831, simple_loss=0.2468, pruned_loss=0.05973, over 973581.22 frames.], batch size: 15, lr: 8.81e-04 2022-05-04 00:10:29,612 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-56000.pt 2022-05-04 00:10:40,731 INFO [train.py:715] (0/8) Epoch 1, batch 21200, loss[loss=0.2167, simple_loss=0.2677, pruned_loss=0.08283, over 4834.00 frames.], tot_loss[loss=0.1828, simple_loss=0.2466, pruned_loss=0.05946, over 973893.70 frames.], batch size: 30, lr: 8.81e-04 2022-05-04 00:11:20,093 INFO [train.py:715] (0/8) Epoch 1, batch 21250, loss[loss=0.1878, simple_loss=0.2548, pruned_loss=0.06038, over 4865.00 frames.], tot_loss[loss=0.1827, simple_loss=0.2462, pruned_loss=0.05961, over 973957.59 frames.], batch size: 20, lr: 8.81e-04 2022-05-04 00:11:59,280 INFO [train.py:715] (0/8) Epoch 1, batch 21300, loss[loss=0.178, simple_loss=0.2272, pruned_loss=0.06435, over 4800.00 frames.], tot_loss[loss=0.1828, simple_loss=0.2461, pruned_loss=0.05977, over 973332.04 frames.], batch size: 12, lr: 8.80e-04 2022-05-04 00:12:38,151 INFO [train.py:715] (0/8) Epoch 1, batch 21350, loss[loss=0.1764, simple_loss=0.2446, pruned_loss=0.05408, over 4829.00 frames.], tot_loss[loss=0.1822, simple_loss=0.2453, pruned_loss=0.05954, over 973028.13 frames.], batch size: 15, lr: 8.80e-04 2022-05-04 00:13:17,806 INFO [train.py:715] (0/8) Epoch 1, batch 21400, loss[loss=0.1839, simple_loss=0.2465, pruned_loss=0.06066, over 4928.00 frames.], tot_loss[loss=0.1823, simple_loss=0.2459, pruned_loss=0.05938, over 972731.56 frames.], batch size: 29, lr: 8.80e-04 2022-05-04 00:13:57,971 INFO [train.py:715] (0/8) Epoch 1, batch 21450, loss[loss=0.1875, simple_loss=0.2575, pruned_loss=0.05877, over 4778.00 frames.], tot_loss[loss=0.1831, simple_loss=0.2464, pruned_loss=0.05988, over 973110.02 frames.], batch size: 18, lr: 8.79e-04 2022-05-04 00:14:36,218 INFO [train.py:715] (0/8) Epoch 1, batch 21500, loss[loss=0.192, simple_loss=0.2612, pruned_loss=0.06138, over 4898.00 frames.], tot_loss[loss=0.1827, simple_loss=0.2464, pruned_loss=0.05949, over 973441.48 frames.], batch size: 17, lr: 8.79e-04 2022-05-04 00:15:15,314 INFO [train.py:715] (0/8) Epoch 1, batch 21550, loss[loss=0.1547, simple_loss=0.2195, pruned_loss=0.0449, over 4781.00 frames.], tot_loss[loss=0.1826, simple_loss=0.2464, pruned_loss=0.05942, over 972690.08 frames.], batch size: 18, lr: 8.78e-04 2022-05-04 00:15:54,611 INFO [train.py:715] (0/8) Epoch 1, batch 21600, loss[loss=0.1741, simple_loss=0.2259, pruned_loss=0.06119, over 4841.00 frames.], tot_loss[loss=0.1823, simple_loss=0.2459, pruned_loss=0.05931, over 971853.25 frames.], batch size: 15, lr: 8.78e-04 2022-05-04 00:16:33,922 INFO [train.py:715] (0/8) Epoch 1, batch 21650, loss[loss=0.1734, simple_loss=0.2384, pruned_loss=0.05425, over 4960.00 frames.], tot_loss[loss=0.1827, 
simple_loss=0.2466, pruned_loss=0.0594, over 972420.02 frames.], batch size: 15, lr: 8.78e-04 2022-05-04 00:17:12,483 INFO [train.py:715] (0/8) Epoch 1, batch 21700, loss[loss=0.1954, simple_loss=0.2519, pruned_loss=0.0694, over 4987.00 frames.], tot_loss[loss=0.183, simple_loss=0.2466, pruned_loss=0.05976, over 972057.35 frames.], batch size: 25, lr: 8.77e-04 2022-05-04 00:17:52,134 INFO [train.py:715] (0/8) Epoch 1, batch 21750, loss[loss=0.1917, simple_loss=0.2391, pruned_loss=0.07218, over 4986.00 frames.], tot_loss[loss=0.1834, simple_loss=0.2465, pruned_loss=0.06014, over 971453.64 frames.], batch size: 31, lr: 8.77e-04 2022-05-04 00:18:31,692 INFO [train.py:715] (0/8) Epoch 1, batch 21800, loss[loss=0.2163, simple_loss=0.2749, pruned_loss=0.07887, over 4878.00 frames.], tot_loss[loss=0.1841, simple_loss=0.2466, pruned_loss=0.06078, over 972021.03 frames.], batch size: 16, lr: 8.76e-04 2022-05-04 00:19:10,442 INFO [train.py:715] (0/8) Epoch 1, batch 21850, loss[loss=0.1586, simple_loss=0.2264, pruned_loss=0.04542, over 4965.00 frames.], tot_loss[loss=0.1838, simple_loss=0.2467, pruned_loss=0.06043, over 972170.87 frames.], batch size: 21, lr: 8.76e-04 2022-05-04 00:19:50,601 INFO [train.py:715] (0/8) Epoch 1, batch 21900, loss[loss=0.1629, simple_loss=0.2238, pruned_loss=0.051, over 4850.00 frames.], tot_loss[loss=0.1825, simple_loss=0.2457, pruned_loss=0.05967, over 972011.83 frames.], batch size: 13, lr: 8.76e-04 2022-05-04 00:20:30,155 INFO [train.py:715] (0/8) Epoch 1, batch 21950, loss[loss=0.1573, simple_loss=0.2192, pruned_loss=0.04767, over 4829.00 frames.], tot_loss[loss=0.1819, simple_loss=0.245, pruned_loss=0.05938, over 971623.73 frames.], batch size: 13, lr: 8.75e-04 2022-05-04 00:21:09,941 INFO [train.py:715] (0/8) Epoch 1, batch 22000, loss[loss=0.1679, simple_loss=0.2355, pruned_loss=0.0501, over 4802.00 frames.], tot_loss[loss=0.1814, simple_loss=0.2449, pruned_loss=0.05893, over 971898.19 frames.], batch size: 21, lr: 8.75e-04 2022-05-04 00:21:48,907 INFO [train.py:715] (0/8) Epoch 1, batch 22050, loss[loss=0.1823, simple_loss=0.2285, pruned_loss=0.06799, over 4800.00 frames.], tot_loss[loss=0.1817, simple_loss=0.2453, pruned_loss=0.05904, over 972248.45 frames.], batch size: 14, lr: 8.75e-04 2022-05-04 00:22:28,896 INFO [train.py:715] (0/8) Epoch 1, batch 22100, loss[loss=0.2024, simple_loss=0.2638, pruned_loss=0.07051, over 4936.00 frames.], tot_loss[loss=0.1826, simple_loss=0.2461, pruned_loss=0.05958, over 972197.52 frames.], batch size: 23, lr: 8.74e-04 2022-05-04 00:23:08,227 INFO [train.py:715] (0/8) Epoch 1, batch 22150, loss[loss=0.1853, simple_loss=0.2493, pruned_loss=0.06068, over 4942.00 frames.], tot_loss[loss=0.1836, simple_loss=0.2473, pruned_loss=0.05997, over 972531.65 frames.], batch size: 21, lr: 8.74e-04 2022-05-04 00:23:46,652 INFO [train.py:715] (0/8) Epoch 1, batch 22200, loss[loss=0.1911, simple_loss=0.2621, pruned_loss=0.06005, over 4684.00 frames.], tot_loss[loss=0.1841, simple_loss=0.2478, pruned_loss=0.06022, over 971136.79 frames.], batch size: 15, lr: 8.73e-04 2022-05-04 00:24:25,884 INFO [train.py:715] (0/8) Epoch 1, batch 22250, loss[loss=0.1721, simple_loss=0.2396, pruned_loss=0.05225, over 4970.00 frames.], tot_loss[loss=0.183, simple_loss=0.2468, pruned_loss=0.05962, over 972204.22 frames.], batch size: 24, lr: 8.73e-04 2022-05-04 00:25:05,565 INFO [train.py:715] (0/8) Epoch 1, batch 22300, loss[loss=0.2059, simple_loss=0.2757, pruned_loss=0.06808, over 4793.00 frames.], tot_loss[loss=0.1833, simple_loss=0.2471, 
pruned_loss=0.05978, over 971646.10 frames.], batch size: 24, lr: 8.73e-04 2022-05-04 00:25:45,330 INFO [train.py:715] (0/8) Epoch 1, batch 22350, loss[loss=0.1988, simple_loss=0.2673, pruned_loss=0.06512, over 4920.00 frames.], tot_loss[loss=0.184, simple_loss=0.2478, pruned_loss=0.06006, over 970461.48 frames.], batch size: 18, lr: 8.72e-04 2022-05-04 00:26:24,291 INFO [train.py:715] (0/8) Epoch 1, batch 22400, loss[loss=0.1528, simple_loss=0.2284, pruned_loss=0.03857, over 4824.00 frames.], tot_loss[loss=0.184, simple_loss=0.248, pruned_loss=0.06003, over 970088.23 frames.], batch size: 26, lr: 8.72e-04 2022-05-04 00:27:04,016 INFO [train.py:715] (0/8) Epoch 1, batch 22450, loss[loss=0.1829, simple_loss=0.2522, pruned_loss=0.05684, over 4898.00 frames.], tot_loss[loss=0.1834, simple_loss=0.2476, pruned_loss=0.05962, over 970245.08 frames.], batch size: 19, lr: 8.72e-04 2022-05-04 00:27:43,653 INFO [train.py:715] (0/8) Epoch 1, batch 22500, loss[loss=0.164, simple_loss=0.238, pruned_loss=0.04496, over 4939.00 frames.], tot_loss[loss=0.1837, simple_loss=0.248, pruned_loss=0.05976, over 970668.99 frames.], batch size: 29, lr: 8.71e-04 2022-05-04 00:28:22,150 INFO [train.py:715] (0/8) Epoch 1, batch 22550, loss[loss=0.1702, simple_loss=0.2325, pruned_loss=0.05397, over 4933.00 frames.], tot_loss[loss=0.1842, simple_loss=0.2483, pruned_loss=0.0601, over 972401.47 frames.], batch size: 18, lr: 8.71e-04 2022-05-04 00:29:02,216 INFO [train.py:715] (0/8) Epoch 1, batch 22600, loss[loss=0.1695, simple_loss=0.2363, pruned_loss=0.05138, over 4981.00 frames.], tot_loss[loss=0.1844, simple_loss=0.2485, pruned_loss=0.06013, over 972389.35 frames.], batch size: 28, lr: 8.70e-04 2022-05-04 00:29:42,693 INFO [train.py:715] (0/8) Epoch 1, batch 22650, loss[loss=0.1968, simple_loss=0.2587, pruned_loss=0.06745, over 4931.00 frames.], tot_loss[loss=0.1846, simple_loss=0.2484, pruned_loss=0.06043, over 972052.13 frames.], batch size: 23, lr: 8.70e-04 2022-05-04 00:30:22,589 INFO [train.py:715] (0/8) Epoch 1, batch 22700, loss[loss=0.1937, simple_loss=0.2543, pruned_loss=0.06653, over 4952.00 frames.], tot_loss[loss=0.1841, simple_loss=0.2479, pruned_loss=0.06016, over 972228.57 frames.], batch size: 35, lr: 8.70e-04 2022-05-04 00:31:00,982 INFO [train.py:715] (0/8) Epoch 1, batch 22750, loss[loss=0.2076, simple_loss=0.2671, pruned_loss=0.07409, over 4801.00 frames.], tot_loss[loss=0.1829, simple_loss=0.2469, pruned_loss=0.0595, over 972687.13 frames.], batch size: 14, lr: 8.69e-04 2022-05-04 00:31:41,165 INFO [train.py:715] (0/8) Epoch 1, batch 22800, loss[loss=0.1446, simple_loss=0.2131, pruned_loss=0.03806, over 4845.00 frames.], tot_loss[loss=0.1831, simple_loss=0.2469, pruned_loss=0.05965, over 972572.10 frames.], batch size: 13, lr: 8.69e-04 2022-05-04 00:32:20,891 INFO [train.py:715] (0/8) Epoch 1, batch 22850, loss[loss=0.1924, simple_loss=0.2553, pruned_loss=0.06477, over 4930.00 frames.], tot_loss[loss=0.183, simple_loss=0.2471, pruned_loss=0.0595, over 972314.73 frames.], batch size: 29, lr: 8.68e-04 2022-05-04 00:32:59,723 INFO [train.py:715] (0/8) Epoch 1, batch 22900, loss[loss=0.1747, simple_loss=0.2403, pruned_loss=0.05456, over 4984.00 frames.], tot_loss[loss=0.1846, simple_loss=0.2481, pruned_loss=0.06056, over 972515.20 frames.], batch size: 28, lr: 8.68e-04 2022-05-04 00:33:39,277 INFO [train.py:715] (0/8) Epoch 1, batch 22950, loss[loss=0.1997, simple_loss=0.2509, pruned_loss=0.07429, over 4876.00 frames.], tot_loss[loss=0.1837, simple_loss=0.2472, pruned_loss=0.06007, over 972343.42 
frames.], batch size: 16, lr: 8.68e-04 2022-05-04 00:34:19,076 INFO [train.py:715] (0/8) Epoch 1, batch 23000, loss[loss=0.1803, simple_loss=0.2397, pruned_loss=0.06047, over 4809.00 frames.], tot_loss[loss=0.1837, simple_loss=0.2469, pruned_loss=0.06018, over 971200.10 frames.], batch size: 24, lr: 8.67e-04 2022-05-04 00:34:58,002 INFO [train.py:715] (0/8) Epoch 1, batch 23050, loss[loss=0.2032, simple_loss=0.2707, pruned_loss=0.06788, over 4910.00 frames.], tot_loss[loss=0.1831, simple_loss=0.2467, pruned_loss=0.05976, over 971726.24 frames.], batch size: 18, lr: 8.67e-04 2022-05-04 00:35:37,122 INFO [train.py:715] (0/8) Epoch 1, batch 23100, loss[loss=0.1921, simple_loss=0.2464, pruned_loss=0.06889, over 4983.00 frames.], tot_loss[loss=0.184, simple_loss=0.2474, pruned_loss=0.0603, over 972091.77 frames.], batch size: 31, lr: 8.67e-04 2022-05-04 00:36:16,859 INFO [train.py:715] (0/8) Epoch 1, batch 23150, loss[loss=0.1308, simple_loss=0.1996, pruned_loss=0.03099, over 4874.00 frames.], tot_loss[loss=0.1838, simple_loss=0.2474, pruned_loss=0.06008, over 972543.76 frames.], batch size: 16, lr: 8.66e-04 2022-05-04 00:36:56,385 INFO [train.py:715] (0/8) Epoch 1, batch 23200, loss[loss=0.1545, simple_loss=0.2179, pruned_loss=0.04556, over 4855.00 frames.], tot_loss[loss=0.1829, simple_loss=0.2466, pruned_loss=0.05958, over 972770.08 frames.], batch size: 30, lr: 8.66e-04 2022-05-04 00:37:34,616 INFO [train.py:715] (0/8) Epoch 1, batch 23250, loss[loss=0.1671, simple_loss=0.2355, pruned_loss=0.04938, over 4780.00 frames.], tot_loss[loss=0.1833, simple_loss=0.2473, pruned_loss=0.05966, over 973939.39 frames.], batch size: 17, lr: 8.66e-04 2022-05-04 00:38:14,200 INFO [train.py:715] (0/8) Epoch 1, batch 23300, loss[loss=0.1504, simple_loss=0.22, pruned_loss=0.04041, over 4808.00 frames.], tot_loss[loss=0.1834, simple_loss=0.2471, pruned_loss=0.05986, over 973628.60 frames.], batch size: 25, lr: 8.65e-04 2022-05-04 00:38:53,773 INFO [train.py:715] (0/8) Epoch 1, batch 23350, loss[loss=0.1327, simple_loss=0.2066, pruned_loss=0.02936, over 4794.00 frames.], tot_loss[loss=0.1825, simple_loss=0.2465, pruned_loss=0.05926, over 973209.40 frames.], batch size: 24, lr: 8.65e-04 2022-05-04 00:39:32,075 INFO [train.py:715] (0/8) Epoch 1, batch 23400, loss[loss=0.1742, simple_loss=0.2383, pruned_loss=0.05508, over 4986.00 frames.], tot_loss[loss=0.1823, simple_loss=0.2462, pruned_loss=0.05923, over 973522.74 frames.], batch size: 25, lr: 8.64e-04 2022-05-04 00:40:11,309 INFO [train.py:715] (0/8) Epoch 1, batch 23450, loss[loss=0.186, simple_loss=0.2358, pruned_loss=0.06814, over 4785.00 frames.], tot_loss[loss=0.1827, simple_loss=0.2466, pruned_loss=0.0594, over 972224.03 frames.], batch size: 12, lr: 8.64e-04 2022-05-04 00:40:50,696 INFO [train.py:715] (0/8) Epoch 1, batch 23500, loss[loss=0.1685, simple_loss=0.232, pruned_loss=0.05244, over 4848.00 frames.], tot_loss[loss=0.1832, simple_loss=0.247, pruned_loss=0.05968, over 972015.35 frames.], batch size: 30, lr: 8.64e-04 2022-05-04 00:41:29,529 INFO [train.py:715] (0/8) Epoch 1, batch 23550, loss[loss=0.1622, simple_loss=0.2255, pruned_loss=0.04943, over 4761.00 frames.], tot_loss[loss=0.1829, simple_loss=0.2465, pruned_loss=0.05966, over 970992.56 frames.], batch size: 12, lr: 8.63e-04 2022-05-04 00:42:07,731 INFO [train.py:715] (0/8) Epoch 1, batch 23600, loss[loss=0.2027, simple_loss=0.2677, pruned_loss=0.06881, over 4929.00 frames.], tot_loss[loss=0.1813, simple_loss=0.2452, pruned_loss=0.05872, over 970927.21 frames.], batch size: 18, lr: 8.63e-04 
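The two bracketed groups in each entry report different things: loss[...] is the current batch (a few thousand frames), while tot_loss[...] is a frame-weighted aggregate over a trailing window of recent batches; its frame total hovers around 972,000, roughly 200 batches of ~4,860 frames, so it behaves like a smoothed loss rather than a full-epoch average. A minimal sketch of that kind of aggregate follows; the class name, the decay constant and the exact update rule are illustrative assumptions, not the bookkeeping train.py actually uses.

    # Illustrative only: a frame-weighted running aggregate whose effective window
    # is ~1/(1 - decay) batches. With decay=0.995 and ~4,860 frames per batch the
    # steady-state frame total is ~972,000, matching the tot_loss entries above.
    class RunningLoss:
        def __init__(self, decay: float = 0.995):
            self.decay = decay
            self.weighted_loss = 0.0  # sum of loss * frames, exponentially decayed
            self.frames = 0.0         # decayed frame count

        def update(self, batch_loss: float, batch_frames: float) -> None:
            self.weighted_loss = self.decay * self.weighted_loss + batch_loss * batch_frames
            self.frames = self.decay * self.frames + batch_frames

        @property
        def value(self) -> float:
            return self.weighted_loss / max(self.frames, 1.0)

    # Example: tracker = RunningLoss(); tracker.update(0.1813, 4939.0); print(tracker.value)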
2022-05-04 00:42:47,239 INFO [train.py:715] (0/8) Epoch 1, batch 23650, loss[loss=0.1719, simple_loss=0.2438, pruned_loss=0.05004, over 4804.00 frames.], tot_loss[loss=0.1813, simple_loss=0.2449, pruned_loss=0.05879, over 971217.01 frames.], batch size: 21, lr: 8.63e-04 2022-05-04 00:43:26,754 INFO [train.py:715] (0/8) Epoch 1, batch 23700, loss[loss=0.2295, simple_loss=0.28, pruned_loss=0.08953, over 4973.00 frames.], tot_loss[loss=0.1814, simple_loss=0.2451, pruned_loss=0.0589, over 971849.41 frames.], batch size: 35, lr: 8.62e-04 2022-05-04 00:44:05,095 INFO [train.py:715] (0/8) Epoch 1, batch 23750, loss[loss=0.1754, simple_loss=0.2378, pruned_loss=0.05647, over 4788.00 frames.], tot_loss[loss=0.183, simple_loss=0.2466, pruned_loss=0.05966, over 972081.86 frames.], batch size: 14, lr: 8.62e-04 2022-05-04 00:44:44,149 INFO [train.py:715] (0/8) Epoch 1, batch 23800, loss[loss=0.2063, simple_loss=0.2645, pruned_loss=0.07409, over 4777.00 frames.], tot_loss[loss=0.1833, simple_loss=0.2468, pruned_loss=0.05992, over 972098.91 frames.], batch size: 17, lr: 8.61e-04 2022-05-04 00:45:24,229 INFO [train.py:715] (0/8) Epoch 1, batch 23850, loss[loss=0.1445, simple_loss=0.2136, pruned_loss=0.03773, over 4921.00 frames.], tot_loss[loss=0.1826, simple_loss=0.2463, pruned_loss=0.05944, over 973207.04 frames.], batch size: 18, lr: 8.61e-04 2022-05-04 00:46:03,794 INFO [train.py:715] (0/8) Epoch 1, batch 23900, loss[loss=0.2018, simple_loss=0.2614, pruned_loss=0.07112, over 4787.00 frames.], tot_loss[loss=0.1833, simple_loss=0.247, pruned_loss=0.05977, over 973320.31 frames.], batch size: 17, lr: 8.61e-04 2022-05-04 00:46:42,593 INFO [train.py:715] (0/8) Epoch 1, batch 23950, loss[loss=0.1581, simple_loss=0.2242, pruned_loss=0.04601, over 4882.00 frames.], tot_loss[loss=0.1829, simple_loss=0.2468, pruned_loss=0.05949, over 972520.22 frames.], batch size: 22, lr: 8.60e-04 2022-05-04 00:47:22,334 INFO [train.py:715] (0/8) Epoch 1, batch 24000, loss[loss=0.1944, simple_loss=0.2647, pruned_loss=0.06208, over 4899.00 frames.], tot_loss[loss=0.1825, simple_loss=0.2465, pruned_loss=0.05925, over 972290.92 frames.], batch size: 17, lr: 8.60e-04 2022-05-04 00:47:22,335 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 00:47:34,530 INFO [train.py:742] (0/8) Epoch 1, validation: loss=0.1217, simple_loss=0.2087, pruned_loss=0.01736, over 914524.00 frames. 
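The lr printed at the end of each entry decays smoothly from 9.07e-04 at the start of this section down to about 8.15e-04 at the end, which is consistent with an Eden-style schedule, lr = initial_lr * ((step^2 + lr_batches^2) / lr_batches^2)^(-0.25) * ((epoch^2 + lr_epochs^2) / lr_epochs^2)^(-0.25), evaluated with this run's initial_lr = 0.003, lr_batches = 5000 and lr_epochs = 4. The sketch below is a numerical cross-check under those assumptions, not a restatement of the scheduler code; treating "step" as the global batch counter is inferred from the checkpoint filenames (checkpoint-56000.pt was written at epoch 1, batch 21200 above).

    # Hedged cross-check of the logged lr against an Eden-style schedule.
    # initial_lr / lr_batches / lr_epochs are this run's configured values; reading
    # the global step off the checkpoint filenames is an assumption.
    def eden_lr(step: int, epoch: int, initial_lr: float = 0.003,
                lr_batches: int = 5000, lr_epochs: int = 4) -> float:
        batch_factor = ((step ** 2 + lr_batches ** 2) / lr_batches ** 2) ** -0.25
        epoch_factor = ((epoch ** 2 + lr_epochs ** 2) / lr_epochs ** 2) ** -0.25
        return initial_lr * batch_factor * epoch_factor

    print(f"{eden_lr(56000, 1):.2e}")  # ~8.81e-04, the lr logged at epoch 1, batch 21200
    print(f"{eden_lr(64000, 1):.2e}")  # ~8.25e-04, the lr logged at epoch 1, batch 29200 (later below)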
2022-05-04 00:48:14,359 INFO [train.py:715] (0/8) Epoch 1, batch 24050, loss[loss=0.1445, simple_loss=0.2135, pruned_loss=0.03781, over 4783.00 frames.], tot_loss[loss=0.183, simple_loss=0.2472, pruned_loss=0.05945, over 972745.42 frames.], batch size: 18, lr: 8.60e-04 2022-05-04 00:48:53,686 INFO [train.py:715] (0/8) Epoch 1, batch 24100, loss[loss=0.1722, simple_loss=0.2365, pruned_loss=0.05392, over 4884.00 frames.], tot_loss[loss=0.1821, simple_loss=0.246, pruned_loss=0.05904, over 973067.58 frames.], batch size: 16, lr: 8.59e-04 2022-05-04 00:49:32,281 INFO [train.py:715] (0/8) Epoch 1, batch 24150, loss[loss=0.1841, simple_loss=0.2498, pruned_loss=0.0592, over 4878.00 frames.], tot_loss[loss=0.1819, simple_loss=0.2456, pruned_loss=0.05913, over 973161.36 frames.], batch size: 22, lr: 8.59e-04 2022-05-04 00:50:11,577 INFO [train.py:715] (0/8) Epoch 1, batch 24200, loss[loss=0.1775, simple_loss=0.2376, pruned_loss=0.0587, over 4969.00 frames.], tot_loss[loss=0.1816, simple_loss=0.2456, pruned_loss=0.05883, over 973154.25 frames.], batch size: 31, lr: 8.59e-04 2022-05-04 00:50:52,251 INFO [train.py:715] (0/8) Epoch 1, batch 24250, loss[loss=0.198, simple_loss=0.245, pruned_loss=0.07547, over 4869.00 frames.], tot_loss[loss=0.1817, simple_loss=0.2455, pruned_loss=0.05894, over 972556.35 frames.], batch size: 16, lr: 8.58e-04 2022-05-04 00:51:31,683 INFO [train.py:715] (0/8) Epoch 1, batch 24300, loss[loss=0.1796, simple_loss=0.2509, pruned_loss=0.05416, over 4796.00 frames.], tot_loss[loss=0.1809, simple_loss=0.245, pruned_loss=0.05836, over 972066.75 frames.], batch size: 21, lr: 8.58e-04 2022-05-04 00:52:11,125 INFO [train.py:715] (0/8) Epoch 1, batch 24350, loss[loss=0.1644, simple_loss=0.2337, pruned_loss=0.0475, over 4766.00 frames.], tot_loss[loss=0.1805, simple_loss=0.2445, pruned_loss=0.05819, over 972276.69 frames.], batch size: 17, lr: 8.57e-04 2022-05-04 00:52:51,502 INFO [train.py:715] (0/8) Epoch 1, batch 24400, loss[loss=0.1501, simple_loss=0.2217, pruned_loss=0.03926, over 4954.00 frames.], tot_loss[loss=0.1807, simple_loss=0.2451, pruned_loss=0.05821, over 972526.27 frames.], batch size: 21, lr: 8.57e-04 2022-05-04 00:53:30,582 INFO [train.py:715] (0/8) Epoch 1, batch 24450, loss[loss=0.1955, simple_loss=0.2518, pruned_loss=0.06962, over 4771.00 frames.], tot_loss[loss=0.1811, simple_loss=0.245, pruned_loss=0.05862, over 971631.17 frames.], batch size: 19, lr: 8.57e-04 2022-05-04 00:54:09,304 INFO [train.py:715] (0/8) Epoch 1, batch 24500, loss[loss=0.1617, simple_loss=0.2271, pruned_loss=0.0481, over 4643.00 frames.], tot_loss[loss=0.1809, simple_loss=0.2448, pruned_loss=0.0585, over 970816.77 frames.], batch size: 13, lr: 8.56e-04 2022-05-04 00:54:48,967 INFO [train.py:715] (0/8) Epoch 1, batch 24550, loss[loss=0.1824, simple_loss=0.2398, pruned_loss=0.06247, over 4954.00 frames.], tot_loss[loss=0.183, simple_loss=0.2462, pruned_loss=0.05991, over 971642.77 frames.], batch size: 24, lr: 8.56e-04 2022-05-04 00:55:29,264 INFO [train.py:715] (0/8) Epoch 1, batch 24600, loss[loss=0.196, simple_loss=0.2478, pruned_loss=0.07211, over 4841.00 frames.], tot_loss[loss=0.1838, simple_loss=0.2468, pruned_loss=0.06045, over 971643.05 frames.], batch size: 32, lr: 8.56e-04 2022-05-04 00:56:08,138 INFO [train.py:715] (0/8) Epoch 1, batch 24650, loss[loss=0.2071, simple_loss=0.2518, pruned_loss=0.08121, over 4849.00 frames.], tot_loss[loss=0.1842, simple_loss=0.2473, pruned_loss=0.06059, over 971392.23 frames.], batch size: 32, lr: 8.55e-04 2022-05-04 00:56:47,166 INFO [train.py:715] 
(0/8) Epoch 1, batch 24700, loss[loss=0.1684, simple_loss=0.2304, pruned_loss=0.05324, over 4931.00 frames.], tot_loss[loss=0.1835, simple_loss=0.2467, pruned_loss=0.06016, over 972298.08 frames.], batch size: 39, lr: 8.55e-04 2022-05-04 00:57:27,344 INFO [train.py:715] (0/8) Epoch 1, batch 24750, loss[loss=0.1478, simple_loss=0.2183, pruned_loss=0.03862, over 4880.00 frames.], tot_loss[loss=0.1829, simple_loss=0.2461, pruned_loss=0.05988, over 972221.03 frames.], batch size: 16, lr: 8.55e-04 2022-05-04 00:58:06,483 INFO [train.py:715] (0/8) Epoch 1, batch 24800, loss[loss=0.2286, simple_loss=0.3003, pruned_loss=0.07846, over 4752.00 frames.], tot_loss[loss=0.1824, simple_loss=0.2458, pruned_loss=0.05952, over 971621.57 frames.], batch size: 16, lr: 8.54e-04 2022-05-04 00:58:45,108 INFO [train.py:715] (0/8) Epoch 1, batch 24850, loss[loss=0.1515, simple_loss=0.2279, pruned_loss=0.03759, over 4937.00 frames.], tot_loss[loss=0.1822, simple_loss=0.2455, pruned_loss=0.05942, over 972941.71 frames.], batch size: 21, lr: 8.54e-04 2022-05-04 00:59:25,593 INFO [train.py:715] (0/8) Epoch 1, batch 24900, loss[loss=0.1813, simple_loss=0.2521, pruned_loss=0.05523, over 4972.00 frames.], tot_loss[loss=0.1815, simple_loss=0.2457, pruned_loss=0.05867, over 973912.98 frames.], batch size: 24, lr: 8.54e-04 2022-05-04 01:00:05,523 INFO [train.py:715] (0/8) Epoch 1, batch 24950, loss[loss=0.1913, simple_loss=0.263, pruned_loss=0.05982, over 4911.00 frames.], tot_loss[loss=0.1826, simple_loss=0.2465, pruned_loss=0.0594, over 974798.15 frames.], batch size: 18, lr: 8.53e-04 2022-05-04 01:00:44,298 INFO [train.py:715] (0/8) Epoch 1, batch 25000, loss[loss=0.2208, simple_loss=0.2684, pruned_loss=0.08662, over 4908.00 frames.], tot_loss[loss=0.1828, simple_loss=0.2467, pruned_loss=0.05946, over 973859.14 frames.], batch size: 17, lr: 8.53e-04 2022-05-04 01:01:22,938 INFO [train.py:715] (0/8) Epoch 1, batch 25050, loss[loss=0.1955, simple_loss=0.2662, pruned_loss=0.0624, over 4826.00 frames.], tot_loss[loss=0.1821, simple_loss=0.2462, pruned_loss=0.05899, over 973306.11 frames.], batch size: 25, lr: 8.53e-04 2022-05-04 01:02:02,856 INFO [train.py:715] (0/8) Epoch 1, batch 25100, loss[loss=0.1505, simple_loss=0.2175, pruned_loss=0.04175, over 4801.00 frames.], tot_loss[loss=0.1833, simple_loss=0.2473, pruned_loss=0.05964, over 972932.20 frames.], batch size: 14, lr: 8.52e-04 2022-05-04 01:02:42,036 INFO [train.py:715] (0/8) Epoch 1, batch 25150, loss[loss=0.2106, simple_loss=0.259, pruned_loss=0.08115, over 4778.00 frames.], tot_loss[loss=0.1828, simple_loss=0.2467, pruned_loss=0.05946, over 973551.24 frames.], batch size: 17, lr: 8.52e-04 2022-05-04 01:03:20,877 INFO [train.py:715] (0/8) Epoch 1, batch 25200, loss[loss=0.1545, simple_loss=0.2179, pruned_loss=0.04559, over 4871.00 frames.], tot_loss[loss=0.1833, simple_loss=0.2472, pruned_loss=0.05966, over 973772.51 frames.], batch size: 22, lr: 8.51e-04 2022-05-04 01:04:00,101 INFO [train.py:715] (0/8) Epoch 1, batch 25250, loss[loss=0.1964, simple_loss=0.2587, pruned_loss=0.06706, over 4772.00 frames.], tot_loss[loss=0.1826, simple_loss=0.2469, pruned_loss=0.05918, over 972694.13 frames.], batch size: 14, lr: 8.51e-04 2022-05-04 01:04:40,223 INFO [train.py:715] (0/8) Epoch 1, batch 25300, loss[loss=0.1671, simple_loss=0.2299, pruned_loss=0.05211, over 4906.00 frames.], tot_loss[loss=0.1813, simple_loss=0.2457, pruned_loss=0.05844, over 971971.99 frames.], batch size: 18, lr: 8.51e-04 2022-05-04 01:05:18,878 INFO [train.py:715] (0/8) Epoch 1, batch 25350, 
loss[loss=0.171, simple_loss=0.2314, pruned_loss=0.05523, over 4865.00 frames.], tot_loss[loss=0.1812, simple_loss=0.2456, pruned_loss=0.05841, over 971107.64 frames.], batch size: 32, lr: 8.50e-04 2022-05-04 01:05:58,220 INFO [train.py:715] (0/8) Epoch 1, batch 25400, loss[loss=0.1441, simple_loss=0.2128, pruned_loss=0.03774, over 4723.00 frames.], tot_loss[loss=0.18, simple_loss=0.2443, pruned_loss=0.05791, over 970953.03 frames.], batch size: 12, lr: 8.50e-04 2022-05-04 01:06:38,483 INFO [train.py:715] (0/8) Epoch 1, batch 25450, loss[loss=0.2064, simple_loss=0.2625, pruned_loss=0.07518, over 4842.00 frames.], tot_loss[loss=0.1798, simple_loss=0.2441, pruned_loss=0.05777, over 971828.38 frames.], batch size: 15, lr: 8.50e-04 2022-05-04 01:07:18,425 INFO [train.py:715] (0/8) Epoch 1, batch 25500, loss[loss=0.1944, simple_loss=0.2596, pruned_loss=0.0646, over 4832.00 frames.], tot_loss[loss=0.1815, simple_loss=0.2452, pruned_loss=0.05887, over 971854.77 frames.], batch size: 26, lr: 8.49e-04 2022-05-04 01:07:56,849 INFO [train.py:715] (0/8) Epoch 1, batch 25550, loss[loss=0.172, simple_loss=0.232, pruned_loss=0.05596, over 4830.00 frames.], tot_loss[loss=0.1804, simple_loss=0.2443, pruned_loss=0.05823, over 971796.50 frames.], batch size: 12, lr: 8.49e-04 2022-05-04 01:08:36,999 INFO [train.py:715] (0/8) Epoch 1, batch 25600, loss[loss=0.1678, simple_loss=0.2276, pruned_loss=0.05406, over 4770.00 frames.], tot_loss[loss=0.1807, simple_loss=0.2446, pruned_loss=0.0584, over 972534.15 frames.], batch size: 12, lr: 8.49e-04 2022-05-04 01:09:17,518 INFO [train.py:715] (0/8) Epoch 1, batch 25650, loss[loss=0.1522, simple_loss=0.2287, pruned_loss=0.03791, over 4869.00 frames.], tot_loss[loss=0.1817, simple_loss=0.2456, pruned_loss=0.05891, over 972837.20 frames.], batch size: 22, lr: 8.48e-04 2022-05-04 01:09:57,008 INFO [train.py:715] (0/8) Epoch 1, batch 25700, loss[loss=0.1775, simple_loss=0.2428, pruned_loss=0.05609, over 4926.00 frames.], tot_loss[loss=0.1813, simple_loss=0.2453, pruned_loss=0.05867, over 972186.92 frames.], batch size: 18, lr: 8.48e-04 2022-05-04 01:10:36,905 INFO [train.py:715] (0/8) Epoch 1, batch 25750, loss[loss=0.1679, simple_loss=0.2346, pruned_loss=0.05063, over 4783.00 frames.], tot_loss[loss=0.181, simple_loss=0.2448, pruned_loss=0.05858, over 972347.69 frames.], batch size: 17, lr: 8.48e-04 2022-05-04 01:11:17,397 INFO [train.py:715] (0/8) Epoch 1, batch 25800, loss[loss=0.1484, simple_loss=0.2117, pruned_loss=0.04251, over 4778.00 frames.], tot_loss[loss=0.1822, simple_loss=0.2458, pruned_loss=0.05928, over 972459.22 frames.], batch size: 18, lr: 8.47e-04 2022-05-04 01:11:56,817 INFO [train.py:715] (0/8) Epoch 1, batch 25850, loss[loss=0.184, simple_loss=0.2475, pruned_loss=0.06028, over 4782.00 frames.], tot_loss[loss=0.1821, simple_loss=0.2458, pruned_loss=0.05917, over 972188.49 frames.], batch size: 17, lr: 8.47e-04 2022-05-04 01:12:35,653 INFO [train.py:715] (0/8) Epoch 1, batch 25900, loss[loss=0.1835, simple_loss=0.2363, pruned_loss=0.06542, over 4768.00 frames.], tot_loss[loss=0.1828, simple_loss=0.2469, pruned_loss=0.05931, over 971985.21 frames.], batch size: 12, lr: 8.47e-04 2022-05-04 01:13:15,323 INFO [train.py:715] (0/8) Epoch 1, batch 25950, loss[loss=0.1651, simple_loss=0.226, pruned_loss=0.05209, over 4691.00 frames.], tot_loss[loss=0.1817, simple_loss=0.2457, pruned_loss=0.05886, over 971880.06 frames.], batch size: 15, lr: 8.46e-04 2022-05-04 01:13:55,208 INFO [train.py:715] (0/8) Epoch 1, batch 26000, loss[loss=0.2292, simple_loss=0.275, 
pruned_loss=0.09167, over 4842.00 frames.], tot_loss[loss=0.1818, simple_loss=0.2461, pruned_loss=0.05873, over 972447.76 frames.], batch size: 30, lr: 8.46e-04 2022-05-04 01:14:34,094 INFO [train.py:715] (0/8) Epoch 1, batch 26050, loss[loss=0.1543, simple_loss=0.2322, pruned_loss=0.03822, over 4814.00 frames.], tot_loss[loss=0.1821, simple_loss=0.2461, pruned_loss=0.05901, over 972341.85 frames.], batch size: 15, lr: 8.46e-04 2022-05-04 01:15:13,493 INFO [train.py:715] (0/8) Epoch 1, batch 26100, loss[loss=0.2048, simple_loss=0.2583, pruned_loss=0.07561, over 4751.00 frames.], tot_loss[loss=0.1819, simple_loss=0.246, pruned_loss=0.05892, over 972028.84 frames.], batch size: 19, lr: 8.45e-04 2022-05-04 01:15:53,625 INFO [train.py:715] (0/8) Epoch 1, batch 26150, loss[loss=0.1707, simple_loss=0.2329, pruned_loss=0.05423, over 4926.00 frames.], tot_loss[loss=0.1805, simple_loss=0.2446, pruned_loss=0.05814, over 971785.50 frames.], batch size: 29, lr: 8.45e-04 2022-05-04 01:16:32,576 INFO [train.py:715] (0/8) Epoch 1, batch 26200, loss[loss=0.1871, simple_loss=0.2534, pruned_loss=0.06043, over 4881.00 frames.], tot_loss[loss=0.1803, simple_loss=0.2446, pruned_loss=0.05797, over 972227.93 frames.], batch size: 22, lr: 8.44e-04 2022-05-04 01:17:11,439 INFO [train.py:715] (0/8) Epoch 1, batch 26250, loss[loss=0.153, simple_loss=0.2179, pruned_loss=0.04399, over 4698.00 frames.], tot_loss[loss=0.1791, simple_loss=0.2437, pruned_loss=0.05721, over 971775.68 frames.], batch size: 15, lr: 8.44e-04 2022-05-04 01:17:51,343 INFO [train.py:715] (0/8) Epoch 1, batch 26300, loss[loss=0.1882, simple_loss=0.2485, pruned_loss=0.06392, over 4874.00 frames.], tot_loss[loss=0.18, simple_loss=0.2445, pruned_loss=0.05781, over 970931.31 frames.], batch size: 30, lr: 8.44e-04 2022-05-04 01:18:31,204 INFO [train.py:715] (0/8) Epoch 1, batch 26350, loss[loss=0.1825, simple_loss=0.2381, pruned_loss=0.06348, over 4750.00 frames.], tot_loss[loss=0.1797, simple_loss=0.2444, pruned_loss=0.05756, over 971225.79 frames.], batch size: 12, lr: 8.43e-04 2022-05-04 01:19:09,971 INFO [train.py:715] (0/8) Epoch 1, batch 26400, loss[loss=0.173, simple_loss=0.2286, pruned_loss=0.05868, over 4792.00 frames.], tot_loss[loss=0.1801, simple_loss=0.2444, pruned_loss=0.05792, over 971523.73 frames.], batch size: 14, lr: 8.43e-04 2022-05-04 01:19:49,172 INFO [train.py:715] (0/8) Epoch 1, batch 26450, loss[loss=0.1924, simple_loss=0.2587, pruned_loss=0.06302, over 4854.00 frames.], tot_loss[loss=0.1813, simple_loss=0.2448, pruned_loss=0.05885, over 970671.03 frames.], batch size: 20, lr: 8.43e-04 2022-05-04 01:20:28,910 INFO [train.py:715] (0/8) Epoch 1, batch 26500, loss[loss=0.1923, simple_loss=0.2508, pruned_loss=0.06694, over 4879.00 frames.], tot_loss[loss=0.181, simple_loss=0.2446, pruned_loss=0.05871, over 970234.88 frames.], batch size: 32, lr: 8.42e-04 2022-05-04 01:21:08,261 INFO [train.py:715] (0/8) Epoch 1, batch 26550, loss[loss=0.168, simple_loss=0.2329, pruned_loss=0.05153, over 4884.00 frames.], tot_loss[loss=0.1801, simple_loss=0.2441, pruned_loss=0.05804, over 970631.23 frames.], batch size: 19, lr: 8.42e-04 2022-05-04 01:21:47,621 INFO [train.py:715] (0/8) Epoch 1, batch 26600, loss[loss=0.186, simple_loss=0.2472, pruned_loss=0.0624, over 4699.00 frames.], tot_loss[loss=0.1805, simple_loss=0.2448, pruned_loss=0.05805, over 970950.41 frames.], batch size: 15, lr: 8.42e-04 2022-05-04 01:22:27,660 INFO [train.py:715] (0/8) Epoch 1, batch 26650, loss[loss=0.1888, simple_loss=0.2546, pruned_loss=0.06151, over 4803.00 
frames.], tot_loss[loss=0.1806, simple_loss=0.2447, pruned_loss=0.05827, over 971110.67 frames.], batch size: 14, lr: 8.41e-04 2022-05-04 01:23:07,618 INFO [train.py:715] (0/8) Epoch 1, batch 26700, loss[loss=0.231, simple_loss=0.2785, pruned_loss=0.09173, over 4897.00 frames.], tot_loss[loss=0.1808, simple_loss=0.2448, pruned_loss=0.05838, over 971290.92 frames.], batch size: 16, lr: 8.41e-04 2022-05-04 01:23:46,586 INFO [train.py:715] (0/8) Epoch 1, batch 26750, loss[loss=0.1666, simple_loss=0.2241, pruned_loss=0.0546, over 4841.00 frames.], tot_loss[loss=0.1792, simple_loss=0.2437, pruned_loss=0.0573, over 970924.76 frames.], batch size: 13, lr: 8.41e-04 2022-05-04 01:24:26,595 INFO [train.py:715] (0/8) Epoch 1, batch 26800, loss[loss=0.1664, simple_loss=0.2366, pruned_loss=0.04816, over 4871.00 frames.], tot_loss[loss=0.1791, simple_loss=0.2433, pruned_loss=0.05749, over 971413.65 frames.], batch size: 20, lr: 8.40e-04 2022-05-04 01:25:06,141 INFO [train.py:715] (0/8) Epoch 1, batch 26850, loss[loss=0.1833, simple_loss=0.2493, pruned_loss=0.0587, over 4873.00 frames.], tot_loss[loss=0.1802, simple_loss=0.2443, pruned_loss=0.058, over 970877.26 frames.], batch size: 22, lr: 8.40e-04 2022-05-04 01:25:45,421 INFO [train.py:715] (0/8) Epoch 1, batch 26900, loss[loss=0.1721, simple_loss=0.2301, pruned_loss=0.05704, over 4983.00 frames.], tot_loss[loss=0.1807, simple_loss=0.2452, pruned_loss=0.0581, over 971760.52 frames.], batch size: 15, lr: 8.40e-04 2022-05-04 01:26:24,114 INFO [train.py:715] (0/8) Epoch 1, batch 26950, loss[loss=0.1741, simple_loss=0.2491, pruned_loss=0.04952, over 4831.00 frames.], tot_loss[loss=0.18, simple_loss=0.2446, pruned_loss=0.05771, over 970797.05 frames.], batch size: 25, lr: 8.39e-04 2022-05-04 01:27:04,124 INFO [train.py:715] (0/8) Epoch 1, batch 27000, loss[loss=0.1965, simple_loss=0.2497, pruned_loss=0.07159, over 4973.00 frames.], tot_loss[loss=0.1812, simple_loss=0.2457, pruned_loss=0.0583, over 972272.60 frames.], batch size: 35, lr: 8.39e-04 2022-05-04 01:27:04,125 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 01:27:12,719 INFO [train.py:742] (0/8) Epoch 1, validation: loss=0.1212, simple_loss=0.2081, pruned_loss=0.01718, over 914524.00 frames. 
2022-05-04 01:27:53,060 INFO [train.py:715] (0/8) Epoch 1, batch 27050, loss[loss=0.1826, simple_loss=0.2394, pruned_loss=0.06291, over 4807.00 frames.], tot_loss[loss=0.1806, simple_loss=0.2451, pruned_loss=0.05804, over 970885.97 frames.], batch size: 13, lr: 8.39e-04 2022-05-04 01:28:33,376 INFO [train.py:715] (0/8) Epoch 1, batch 27100, loss[loss=0.1808, simple_loss=0.2432, pruned_loss=0.05917, over 4913.00 frames.], tot_loss[loss=0.1807, simple_loss=0.2453, pruned_loss=0.05806, over 971742.74 frames.], batch size: 29, lr: 8.38e-04 2022-05-04 01:29:11,781 INFO [train.py:715] (0/8) Epoch 1, batch 27150, loss[loss=0.1987, simple_loss=0.2594, pruned_loss=0.06898, over 4832.00 frames.], tot_loss[loss=0.1814, simple_loss=0.246, pruned_loss=0.05841, over 971601.43 frames.], batch size: 15, lr: 8.38e-04 2022-05-04 01:29:51,725 INFO [train.py:715] (0/8) Epoch 1, batch 27200, loss[loss=0.1459, simple_loss=0.2088, pruned_loss=0.04147, over 4904.00 frames.], tot_loss[loss=0.1812, simple_loss=0.246, pruned_loss=0.05821, over 972143.43 frames.], batch size: 17, lr: 8.38e-04 2022-05-04 01:30:32,016 INFO [train.py:715] (0/8) Epoch 1, batch 27250, loss[loss=0.1832, simple_loss=0.2446, pruned_loss=0.06086, over 4987.00 frames.], tot_loss[loss=0.1809, simple_loss=0.2455, pruned_loss=0.05813, over 972678.60 frames.], batch size: 31, lr: 8.37e-04 2022-05-04 01:31:11,133 INFO [train.py:715] (0/8) Epoch 1, batch 27300, loss[loss=0.216, simple_loss=0.2624, pruned_loss=0.08485, over 4969.00 frames.], tot_loss[loss=0.181, simple_loss=0.2454, pruned_loss=0.0583, over 972925.43 frames.], batch size: 35, lr: 8.37e-04 2022-05-04 01:31:49,676 INFO [train.py:715] (0/8) Epoch 1, batch 27350, loss[loss=0.1364, simple_loss=0.2029, pruned_loss=0.03493, over 4802.00 frames.], tot_loss[loss=0.181, simple_loss=0.2454, pruned_loss=0.05828, over 972196.12 frames.], batch size: 14, lr: 8.37e-04 2022-05-04 01:32:29,602 INFO [train.py:715] (0/8) Epoch 1, batch 27400, loss[loss=0.1805, simple_loss=0.2466, pruned_loss=0.0572, over 4916.00 frames.], tot_loss[loss=0.1808, simple_loss=0.2453, pruned_loss=0.05817, over 973005.87 frames.], batch size: 17, lr: 8.36e-04 2022-05-04 01:33:09,601 INFO [train.py:715] (0/8) Epoch 1, batch 27450, loss[loss=0.1664, simple_loss=0.2411, pruned_loss=0.04589, over 4805.00 frames.], tot_loss[loss=0.1805, simple_loss=0.2453, pruned_loss=0.05788, over 972614.04 frames.], batch size: 21, lr: 8.36e-04 2022-05-04 01:33:48,109 INFO [train.py:715] (0/8) Epoch 1, batch 27500, loss[loss=0.2188, simple_loss=0.2798, pruned_loss=0.07887, over 4897.00 frames.], tot_loss[loss=0.18, simple_loss=0.2446, pruned_loss=0.05766, over 972928.10 frames.], batch size: 17, lr: 8.36e-04 2022-05-04 01:34:27,765 INFO [train.py:715] (0/8) Epoch 1, batch 27550, loss[loss=0.184, simple_loss=0.2462, pruned_loss=0.06087, over 4766.00 frames.], tot_loss[loss=0.1808, simple_loss=0.2453, pruned_loss=0.05817, over 972392.03 frames.], batch size: 17, lr: 8.35e-04 2022-05-04 01:35:07,993 INFO [train.py:715] (0/8) Epoch 1, batch 27600, loss[loss=0.2006, simple_loss=0.2698, pruned_loss=0.06567, over 4828.00 frames.], tot_loss[loss=0.181, simple_loss=0.2455, pruned_loss=0.05824, over 972089.88 frames.], batch size: 26, lr: 8.35e-04 2022-05-04 01:35:47,296 INFO [train.py:715] (0/8) Epoch 1, batch 27650, loss[loss=0.217, simple_loss=0.2626, pruned_loss=0.0857, over 4884.00 frames.], tot_loss[loss=0.1819, simple_loss=0.246, pruned_loss=0.05893, over 971871.37 frames.], batch size: 32, lr: 8.35e-04 2022-05-04 01:36:26,739 INFO [train.py:715] 
(0/8) Epoch 1, batch 27700, loss[loss=0.229, simple_loss=0.2886, pruned_loss=0.08469, over 4931.00 frames.], tot_loss[loss=0.1807, simple_loss=0.245, pruned_loss=0.0582, over 972006.69 frames.], batch size: 18, lr: 8.34e-04 2022-05-04 01:37:07,283 INFO [train.py:715] (0/8) Epoch 1, batch 27750, loss[loss=0.1703, simple_loss=0.2355, pruned_loss=0.05255, over 4802.00 frames.], tot_loss[loss=0.1796, simple_loss=0.2439, pruned_loss=0.05762, over 972812.12 frames.], batch size: 21, lr: 8.34e-04 2022-05-04 01:37:47,074 INFO [train.py:715] (0/8) Epoch 1, batch 27800, loss[loss=0.1421, simple_loss=0.2203, pruned_loss=0.03196, over 4945.00 frames.], tot_loss[loss=0.1797, simple_loss=0.2443, pruned_loss=0.0576, over 973398.79 frames.], batch size: 29, lr: 8.34e-04 2022-05-04 01:38:26,361 INFO [train.py:715] (0/8) Epoch 1, batch 27850, loss[loss=0.2058, simple_loss=0.2708, pruned_loss=0.07036, over 4981.00 frames.], tot_loss[loss=0.1806, simple_loss=0.2449, pruned_loss=0.05817, over 973484.20 frames.], batch size: 39, lr: 8.33e-04 2022-05-04 01:39:06,467 INFO [train.py:715] (0/8) Epoch 1, batch 27900, loss[loss=0.1789, simple_loss=0.2474, pruned_loss=0.05517, over 4924.00 frames.], tot_loss[loss=0.1805, simple_loss=0.245, pruned_loss=0.05799, over 972798.84 frames.], batch size: 29, lr: 8.33e-04 2022-05-04 01:39:45,948 INFO [train.py:715] (0/8) Epoch 1, batch 27950, loss[loss=0.1777, simple_loss=0.2393, pruned_loss=0.05803, over 4937.00 frames.], tot_loss[loss=0.1796, simple_loss=0.2447, pruned_loss=0.05727, over 973220.49 frames.], batch size: 23, lr: 8.33e-04 2022-05-04 01:40:25,329 INFO [train.py:715] (0/8) Epoch 1, batch 28000, loss[loss=0.166, simple_loss=0.2333, pruned_loss=0.04939, over 4925.00 frames.], tot_loss[loss=0.1797, simple_loss=0.245, pruned_loss=0.05717, over 973261.56 frames.], batch size: 23, lr: 8.32e-04 2022-05-04 01:41:04,108 INFO [train.py:715] (0/8) Epoch 1, batch 28050, loss[loss=0.1875, simple_loss=0.2483, pruned_loss=0.0633, over 4815.00 frames.], tot_loss[loss=0.1808, simple_loss=0.2461, pruned_loss=0.05772, over 972465.88 frames.], batch size: 27, lr: 8.32e-04 2022-05-04 01:41:44,527 INFO [train.py:715] (0/8) Epoch 1, batch 28100, loss[loss=0.1847, simple_loss=0.2528, pruned_loss=0.05827, over 4746.00 frames.], tot_loss[loss=0.1809, simple_loss=0.2464, pruned_loss=0.05768, over 972679.40 frames.], batch size: 16, lr: 8.32e-04 2022-05-04 01:42:23,903 INFO [train.py:715] (0/8) Epoch 1, batch 28150, loss[loss=0.1653, simple_loss=0.2284, pruned_loss=0.05109, over 4881.00 frames.], tot_loss[loss=0.1815, simple_loss=0.2468, pruned_loss=0.0581, over 972610.37 frames.], batch size: 22, lr: 8.31e-04 2022-05-04 01:43:03,291 INFO [train.py:715] (0/8) Epoch 1, batch 28200, loss[loss=0.1951, simple_loss=0.2566, pruned_loss=0.06679, over 4930.00 frames.], tot_loss[loss=0.1808, simple_loss=0.2462, pruned_loss=0.05765, over 972704.38 frames.], batch size: 29, lr: 8.31e-04 2022-05-04 01:43:43,972 INFO [train.py:715] (0/8) Epoch 1, batch 28250, loss[loss=0.1792, simple_loss=0.2329, pruned_loss=0.06276, over 4940.00 frames.], tot_loss[loss=0.1814, simple_loss=0.2463, pruned_loss=0.0582, over 972525.08 frames.], batch size: 21, lr: 8.31e-04 2022-05-04 01:44:24,422 INFO [train.py:715] (0/8) Epoch 1, batch 28300, loss[loss=0.1872, simple_loss=0.2574, pruned_loss=0.05847, over 4977.00 frames.], tot_loss[loss=0.1804, simple_loss=0.2454, pruned_loss=0.05771, over 972936.37 frames.], batch size: 15, lr: 8.30e-04 2022-05-04 01:45:03,754 INFO [train.py:715] (0/8) Epoch 1, batch 28350, 
loss[loss=0.1661, simple_loss=0.2284, pruned_loss=0.0519, over 4803.00 frames.], tot_loss[loss=0.1802, simple_loss=0.245, pruned_loss=0.05765, over 973502.47 frames.], batch size: 12, lr: 8.30e-04 2022-05-04 01:45:42,705 INFO [train.py:715] (0/8) Epoch 1, batch 28400, loss[loss=0.1882, simple_loss=0.2535, pruned_loss=0.06146, over 4971.00 frames.], tot_loss[loss=0.1802, simple_loss=0.2451, pruned_loss=0.05766, over 971998.79 frames.], batch size: 15, lr: 8.30e-04 2022-05-04 01:46:23,133 INFO [train.py:715] (0/8) Epoch 1, batch 28450, loss[loss=0.1718, simple_loss=0.2389, pruned_loss=0.0524, over 4985.00 frames.], tot_loss[loss=0.1795, simple_loss=0.2444, pruned_loss=0.0573, over 971850.45 frames.], batch size: 28, lr: 8.29e-04 2022-05-04 01:47:02,718 INFO [train.py:715] (0/8) Epoch 1, batch 28500, loss[loss=0.2177, simple_loss=0.2817, pruned_loss=0.07686, over 4841.00 frames.], tot_loss[loss=0.1801, simple_loss=0.2449, pruned_loss=0.05761, over 972161.03 frames.], batch size: 15, lr: 8.29e-04 2022-05-04 01:47:41,720 INFO [train.py:715] (0/8) Epoch 1, batch 28550, loss[loss=0.186, simple_loss=0.2417, pruned_loss=0.06517, over 4943.00 frames.], tot_loss[loss=0.1788, simple_loss=0.2439, pruned_loss=0.05688, over 972116.41 frames.], batch size: 39, lr: 8.29e-04 2022-05-04 01:48:22,004 INFO [train.py:715] (0/8) Epoch 1, batch 28600, loss[loss=0.2116, simple_loss=0.2712, pruned_loss=0.07604, over 4991.00 frames.], tot_loss[loss=0.1794, simple_loss=0.2446, pruned_loss=0.05716, over 971506.20 frames.], batch size: 16, lr: 8.28e-04 2022-05-04 01:49:01,954 INFO [train.py:715] (0/8) Epoch 1, batch 28650, loss[loss=0.1991, simple_loss=0.2552, pruned_loss=0.07151, over 4887.00 frames.], tot_loss[loss=0.1801, simple_loss=0.245, pruned_loss=0.05762, over 972280.14 frames.], batch size: 16, lr: 8.28e-04 2022-05-04 01:49:41,106 INFO [train.py:715] (0/8) Epoch 1, batch 28700, loss[loss=0.1854, simple_loss=0.2463, pruned_loss=0.06226, over 4850.00 frames.], tot_loss[loss=0.1799, simple_loss=0.2447, pruned_loss=0.05758, over 972408.77 frames.], batch size: 30, lr: 8.28e-04 2022-05-04 01:50:20,246 INFO [train.py:715] (0/8) Epoch 1, batch 28750, loss[loss=0.1974, simple_loss=0.2648, pruned_loss=0.065, over 4864.00 frames.], tot_loss[loss=0.179, simple_loss=0.2441, pruned_loss=0.05698, over 972696.26 frames.], batch size: 20, lr: 8.27e-04 2022-05-04 01:51:00,840 INFO [train.py:715] (0/8) Epoch 1, batch 28800, loss[loss=0.2137, simple_loss=0.2676, pruned_loss=0.07986, over 4759.00 frames.], tot_loss[loss=0.1804, simple_loss=0.2451, pruned_loss=0.05791, over 972366.87 frames.], batch size: 16, lr: 8.27e-04 2022-05-04 01:51:40,149 INFO [train.py:715] (0/8) Epoch 1, batch 28850, loss[loss=0.1901, simple_loss=0.2446, pruned_loss=0.0678, over 4854.00 frames.], tot_loss[loss=0.1812, simple_loss=0.2453, pruned_loss=0.05852, over 972629.45 frames.], batch size: 30, lr: 8.27e-04 2022-05-04 01:52:19,928 INFO [train.py:715] (0/8) Epoch 1, batch 28900, loss[loss=0.1903, simple_loss=0.249, pruned_loss=0.06578, over 4701.00 frames.], tot_loss[loss=0.1822, simple_loss=0.246, pruned_loss=0.05923, over 972823.53 frames.], batch size: 15, lr: 8.27e-04 2022-05-04 01:53:00,612 INFO [train.py:715] (0/8) Epoch 1, batch 28950, loss[loss=0.1664, simple_loss=0.2316, pruned_loss=0.05056, over 4956.00 frames.], tot_loss[loss=0.1813, simple_loss=0.2451, pruned_loss=0.0587, over 972993.52 frames.], batch size: 21, lr: 8.26e-04 2022-05-04 01:53:40,740 INFO [train.py:715] (0/8) Epoch 1, batch 29000, loss[loss=0.196, simple_loss=0.2648, 
pruned_loss=0.06355, over 4945.00 frames.], tot_loss[loss=0.1815, simple_loss=0.2455, pruned_loss=0.05872, over 973019.21 frames.], batch size: 29, lr: 8.26e-04 2022-05-04 01:54:19,719 INFO [train.py:715] (0/8) Epoch 1, batch 29050, loss[loss=0.1569, simple_loss=0.2156, pruned_loss=0.0491, over 4800.00 frames.], tot_loss[loss=0.1803, simple_loss=0.2449, pruned_loss=0.05784, over 973275.82 frames.], batch size: 24, lr: 8.26e-04 2022-05-04 01:54:59,589 INFO [train.py:715] (0/8) Epoch 1, batch 29100, loss[loss=0.1767, simple_loss=0.245, pruned_loss=0.05414, over 4965.00 frames.], tot_loss[loss=0.1806, simple_loss=0.2453, pruned_loss=0.05795, over 973007.47 frames.], batch size: 14, lr: 8.25e-04 2022-05-04 01:55:40,265 INFO [train.py:715] (0/8) Epoch 1, batch 29150, loss[loss=0.2008, simple_loss=0.2543, pruned_loss=0.07368, over 4883.00 frames.], tot_loss[loss=0.1815, simple_loss=0.2457, pruned_loss=0.05859, over 972222.92 frames.], batch size: 16, lr: 8.25e-04 2022-05-04 01:56:10,490 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-64000.pt 2022-05-04 01:56:22,373 INFO [train.py:715] (0/8) Epoch 1, batch 29200, loss[loss=0.1936, simple_loss=0.2576, pruned_loss=0.06478, over 4718.00 frames.], tot_loss[loss=0.1818, simple_loss=0.2458, pruned_loss=0.05889, over 972490.16 frames.], batch size: 15, lr: 8.25e-04 2022-05-04 01:57:01,395 INFO [train.py:715] (0/8) Epoch 1, batch 29250, loss[loss=0.1918, simple_loss=0.2497, pruned_loss=0.067, over 4911.00 frames.], tot_loss[loss=0.1821, simple_loss=0.2463, pruned_loss=0.05893, over 972477.01 frames.], batch size: 19, lr: 8.24e-04 2022-05-04 01:57:41,947 INFO [train.py:715] (0/8) Epoch 1, batch 29300, loss[loss=0.1674, simple_loss=0.2377, pruned_loss=0.04851, over 4921.00 frames.], tot_loss[loss=0.1818, simple_loss=0.2459, pruned_loss=0.05885, over 972585.12 frames.], batch size: 18, lr: 8.24e-04 2022-05-04 01:58:22,153 INFO [train.py:715] (0/8) Epoch 1, batch 29350, loss[loss=0.1736, simple_loss=0.2364, pruned_loss=0.05537, over 4821.00 frames.], tot_loss[loss=0.1805, simple_loss=0.2446, pruned_loss=0.05819, over 972436.15 frames.], batch size: 26, lr: 8.24e-04 2022-05-04 01:59:00,692 INFO [train.py:715] (0/8) Epoch 1, batch 29400, loss[loss=0.1553, simple_loss=0.2256, pruned_loss=0.04249, over 4903.00 frames.], tot_loss[loss=0.1787, simple_loss=0.2432, pruned_loss=0.05716, over 972900.55 frames.], batch size: 17, lr: 8.23e-04 2022-05-04 01:59:40,309 INFO [train.py:715] (0/8) Epoch 1, batch 29450, loss[loss=0.1435, simple_loss=0.2213, pruned_loss=0.0329, over 4782.00 frames.], tot_loss[loss=0.179, simple_loss=0.2432, pruned_loss=0.05737, over 972866.79 frames.], batch size: 21, lr: 8.23e-04 2022-05-04 02:00:20,010 INFO [train.py:715] (0/8) Epoch 1, batch 29500, loss[loss=0.1465, simple_loss=0.215, pruned_loss=0.039, over 4810.00 frames.], tot_loss[loss=0.1781, simple_loss=0.2428, pruned_loss=0.05676, over 972708.49 frames.], batch size: 12, lr: 8.23e-04 2022-05-04 02:00:59,412 INFO [train.py:715] (0/8) Epoch 1, batch 29550, loss[loss=0.2253, simple_loss=0.2863, pruned_loss=0.08219, over 4973.00 frames.], tot_loss[loss=0.1783, simple_loss=0.2427, pruned_loss=0.05694, over 972922.10 frames.], batch size: 24, lr: 8.22e-04 2022-05-04 02:01:37,997 INFO [train.py:715] (0/8) Epoch 1, batch 29600, loss[loss=0.1648, simple_loss=0.2358, pruned_loss=0.04695, over 4925.00 frames.], tot_loss[loss=0.1785, simple_loss=0.2429, pruned_loss=0.05704, over 972859.75 frames.], batch size: 18, lr: 8.22e-04 2022-05-04 
02:02:18,243 INFO [train.py:715] (0/8) Epoch 1, batch 29650, loss[loss=0.2099, simple_loss=0.2688, pruned_loss=0.07553, over 4691.00 frames.], tot_loss[loss=0.1793, simple_loss=0.2437, pruned_loss=0.05748, over 972721.69 frames.], batch size: 15, lr: 8.22e-04 2022-05-04 02:02:58,335 INFO [train.py:715] (0/8) Epoch 1, batch 29700, loss[loss=0.2283, simple_loss=0.2816, pruned_loss=0.08754, over 4910.00 frames.], tot_loss[loss=0.1792, simple_loss=0.2437, pruned_loss=0.05734, over 973157.19 frames.], batch size: 39, lr: 8.21e-04 2022-05-04 02:03:36,329 INFO [train.py:715] (0/8) Epoch 1, batch 29750, loss[loss=0.1888, simple_loss=0.2471, pruned_loss=0.0653, over 4822.00 frames.], tot_loss[loss=0.1799, simple_loss=0.2442, pruned_loss=0.05777, over 973108.94 frames.], batch size: 26, lr: 8.21e-04 2022-05-04 02:04:15,641 INFO [train.py:715] (0/8) Epoch 1, batch 29800, loss[loss=0.1935, simple_loss=0.2581, pruned_loss=0.06449, over 4821.00 frames.], tot_loss[loss=0.1794, simple_loss=0.2442, pruned_loss=0.05735, over 972949.97 frames.], batch size: 26, lr: 8.21e-04 2022-05-04 02:04:55,054 INFO [train.py:715] (0/8) Epoch 1, batch 29850, loss[loss=0.2103, simple_loss=0.2765, pruned_loss=0.07207, over 4982.00 frames.], tot_loss[loss=0.1794, simple_loss=0.2442, pruned_loss=0.05728, over 972042.70 frames.], batch size: 25, lr: 8.20e-04 2022-05-04 02:05:34,432 INFO [train.py:715] (0/8) Epoch 1, batch 29900, loss[loss=0.1668, simple_loss=0.2269, pruned_loss=0.05333, over 4828.00 frames.], tot_loss[loss=0.1803, simple_loss=0.245, pruned_loss=0.05785, over 971877.00 frames.], batch size: 13, lr: 8.20e-04 2022-05-04 02:06:12,931 INFO [train.py:715] (0/8) Epoch 1, batch 29950, loss[loss=0.1251, simple_loss=0.1868, pruned_loss=0.03174, over 4819.00 frames.], tot_loss[loss=0.1792, simple_loss=0.2438, pruned_loss=0.05729, over 971596.42 frames.], batch size: 26, lr: 8.20e-04 2022-05-04 02:06:52,740 INFO [train.py:715] (0/8) Epoch 1, batch 30000, loss[loss=0.2374, simple_loss=0.2792, pruned_loss=0.09783, over 4972.00 frames.], tot_loss[loss=0.1793, simple_loss=0.2435, pruned_loss=0.05755, over 972260.89 frames.], batch size: 14, lr: 8.20e-04 2022-05-04 02:06:52,741 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 02:07:09,693 INFO [train.py:742] (0/8) Epoch 1, validation: loss=0.1207, simple_loss=0.2076, pruned_loss=0.01687, over 914524.00 frames. 
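A quick consistency check on the three loss columns: the logged totals are reproduced by weighting the simple transducer loss by 0.5 and adding the pruned loss, i.e. loss ≈ 0.5 * simple_loss + pruned_loss; for the validation entry just above, 0.5 * 0.2076 + 0.01687 ≈ 0.1207. The snippet below is a minimal sketch of that check using values copied from the entries above; the 0.5 weight is inferred from the logged numbers, not read out of the recipe code.

    # Sketch: check that the logged loss is ~0.5 * simple_loss + pruned_loss.
    # Values are copied from the log entries above (Epoch 1, batch 30000).
    entries = [
        ("train batch 30000", 0.1793, 0.2435, 0.05755),
        ("validation",        0.1207, 0.2076, 0.01687),
    ]
    for name, loss, simple_loss, pruned_loss in entries:
        reconstructed = 0.5 * simple_loss + pruned_loss
        print(f"{name}: logged={loss:.4f}  reconstructed={reconstructed:.4f}")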
2022-05-04 02:07:50,183 INFO [train.py:715] (0/8) Epoch 1, batch 30050, loss[loss=0.1554, simple_loss=0.2176, pruned_loss=0.04659, over 4871.00 frames.], tot_loss[loss=0.1793, simple_loss=0.2434, pruned_loss=0.05764, over 972144.38 frames.], batch size: 16, lr: 8.19e-04 2022-05-04 02:08:29,669 INFO [train.py:715] (0/8) Epoch 1, batch 30100, loss[loss=0.1697, simple_loss=0.2423, pruned_loss=0.04856, over 4972.00 frames.], tot_loss[loss=0.1805, simple_loss=0.2445, pruned_loss=0.05822, over 972152.30 frames.], batch size: 24, lr: 8.19e-04 2022-05-04 02:09:09,063 INFO [train.py:715] (0/8) Epoch 1, batch 30150, loss[loss=0.1629, simple_loss=0.217, pruned_loss=0.05442, over 4738.00 frames.], tot_loss[loss=0.1806, simple_loss=0.2442, pruned_loss=0.05849, over 971223.11 frames.], batch size: 16, lr: 8.19e-04 2022-05-04 02:09:48,373 INFO [train.py:715] (0/8) Epoch 1, batch 30200, loss[loss=0.2654, simple_loss=0.3326, pruned_loss=0.09914, over 4910.00 frames.], tot_loss[loss=0.1813, simple_loss=0.245, pruned_loss=0.05875, over 972089.74 frames.], batch size: 17, lr: 8.18e-04 2022-05-04 02:10:28,823 INFO [train.py:715] (0/8) Epoch 1, batch 30250, loss[loss=0.1839, simple_loss=0.2529, pruned_loss=0.0574, over 4917.00 frames.], tot_loss[loss=0.1807, simple_loss=0.2444, pruned_loss=0.05847, over 971954.01 frames.], batch size: 18, lr: 8.18e-04 2022-05-04 02:11:08,801 INFO [train.py:715] (0/8) Epoch 1, batch 30300, loss[loss=0.1724, simple_loss=0.2346, pruned_loss=0.05513, over 4986.00 frames.], tot_loss[loss=0.1801, simple_loss=0.2438, pruned_loss=0.05822, over 972676.89 frames.], batch size: 25, lr: 8.18e-04 2022-05-04 02:11:47,709 INFO [train.py:715] (0/8) Epoch 1, batch 30350, loss[loss=0.1859, simple_loss=0.2468, pruned_loss=0.06252, over 4986.00 frames.], tot_loss[loss=0.1813, simple_loss=0.245, pruned_loss=0.05884, over 971944.89 frames.], batch size: 33, lr: 8.17e-04 2022-05-04 02:12:27,777 INFO [train.py:715] (0/8) Epoch 1, batch 30400, loss[loss=0.1953, simple_loss=0.2318, pruned_loss=0.07947, over 4829.00 frames.], tot_loss[loss=0.1808, simple_loss=0.2449, pruned_loss=0.05834, over 972132.36 frames.], batch size: 30, lr: 8.17e-04 2022-05-04 02:13:07,264 INFO [train.py:715] (0/8) Epoch 1, batch 30450, loss[loss=0.1932, simple_loss=0.2575, pruned_loss=0.06441, over 4928.00 frames.], tot_loss[loss=0.1804, simple_loss=0.2447, pruned_loss=0.05799, over 972982.13 frames.], batch size: 23, lr: 8.17e-04 2022-05-04 02:13:46,445 INFO [train.py:715] (0/8) Epoch 1, batch 30500, loss[loss=0.178, simple_loss=0.24, pruned_loss=0.05799, over 4943.00 frames.], tot_loss[loss=0.1811, simple_loss=0.2455, pruned_loss=0.05836, over 972663.21 frames.], batch size: 35, lr: 8.16e-04 2022-05-04 02:14:25,544 INFO [train.py:715] (0/8) Epoch 1, batch 30550, loss[loss=0.1706, simple_loss=0.2366, pruned_loss=0.0523, over 4922.00 frames.], tot_loss[loss=0.1812, simple_loss=0.2455, pruned_loss=0.05847, over 972759.51 frames.], batch size: 23, lr: 8.16e-04 2022-05-04 02:15:05,345 INFO [train.py:715] (0/8) Epoch 1, batch 30600, loss[loss=0.1621, simple_loss=0.2278, pruned_loss=0.04822, over 4922.00 frames.], tot_loss[loss=0.1803, simple_loss=0.2449, pruned_loss=0.05789, over 972572.34 frames.], batch size: 29, lr: 8.16e-04 2022-05-04 02:15:44,808 INFO [train.py:715] (0/8) Epoch 1, batch 30650, loss[loss=0.1902, simple_loss=0.2594, pruned_loss=0.06053, over 4937.00 frames.], tot_loss[loss=0.18, simple_loss=0.2447, pruned_loss=0.05766, over 971659.20 frames.], batch size: 23, lr: 8.15e-04 2022-05-04 02:16:23,390 INFO 
[train.py:715] (0/8) Epoch 1, batch 30700, loss[loss=0.1988, simple_loss=0.2576, pruned_loss=0.07003, over 4965.00 frames.], tot_loss[loss=0.1795, simple_loss=0.2444, pruned_loss=0.05732, over 972264.69 frames.], batch size: 24, lr: 8.15e-04 2022-05-04 02:17:03,639 INFO [train.py:715] (0/8) Epoch 1, batch 30750, loss[loss=0.1823, simple_loss=0.2483, pruned_loss=0.05818, over 4940.00 frames.], tot_loss[loss=0.179, simple_loss=0.2441, pruned_loss=0.05697, over 972633.92 frames.], batch size: 39, lr: 8.15e-04 2022-05-04 02:17:43,211 INFO [train.py:715] (0/8) Epoch 1, batch 30800, loss[loss=0.1583, simple_loss=0.2304, pruned_loss=0.04311, over 4884.00 frames.], tot_loss[loss=0.1792, simple_loss=0.2438, pruned_loss=0.05732, over 972435.93 frames.], batch size: 22, lr: 8.15e-04 2022-05-04 02:18:22,134 INFO [train.py:715] (0/8) Epoch 1, batch 30850, loss[loss=0.161, simple_loss=0.2298, pruned_loss=0.04606, over 4766.00 frames.], tot_loss[loss=0.1791, simple_loss=0.2435, pruned_loss=0.05733, over 972125.71 frames.], batch size: 14, lr: 8.14e-04 2022-05-04 02:19:01,718 INFO [train.py:715] (0/8) Epoch 1, batch 30900, loss[loss=0.2031, simple_loss=0.258, pruned_loss=0.07412, over 4862.00 frames.], tot_loss[loss=0.1784, simple_loss=0.2429, pruned_loss=0.05699, over 972021.39 frames.], batch size: 16, lr: 8.14e-04 2022-05-04 02:19:41,347 INFO [train.py:715] (0/8) Epoch 1, batch 30950, loss[loss=0.2113, simple_loss=0.2608, pruned_loss=0.08084, over 4790.00 frames.], tot_loss[loss=0.1797, simple_loss=0.2438, pruned_loss=0.05775, over 971874.17 frames.], batch size: 14, lr: 8.14e-04 2022-05-04 02:20:20,859 INFO [train.py:715] (0/8) Epoch 1, batch 31000, loss[loss=0.1715, simple_loss=0.2306, pruned_loss=0.0562, over 4984.00 frames.], tot_loss[loss=0.1806, simple_loss=0.2449, pruned_loss=0.05811, over 972023.03 frames.], batch size: 15, lr: 8.13e-04 2022-05-04 02:21:00,356 INFO [train.py:715] (0/8) Epoch 1, batch 31050, loss[loss=0.1715, simple_loss=0.2283, pruned_loss=0.05737, over 4859.00 frames.], tot_loss[loss=0.1806, simple_loss=0.245, pruned_loss=0.0581, over 971902.25 frames.], batch size: 32, lr: 8.13e-04 2022-05-04 02:21:40,840 INFO [train.py:715] (0/8) Epoch 1, batch 31100, loss[loss=0.1677, simple_loss=0.2445, pruned_loss=0.04548, over 4957.00 frames.], tot_loss[loss=0.1804, simple_loss=0.2447, pruned_loss=0.05803, over 972464.81 frames.], batch size: 24, lr: 8.13e-04 2022-05-04 02:22:20,583 INFO [train.py:715] (0/8) Epoch 1, batch 31150, loss[loss=0.2181, simple_loss=0.2579, pruned_loss=0.0891, over 4813.00 frames.], tot_loss[loss=0.1807, simple_loss=0.2452, pruned_loss=0.05806, over 971704.63 frames.], batch size: 13, lr: 8.12e-04 2022-05-04 02:22:59,630 INFO [train.py:715] (0/8) Epoch 1, batch 31200, loss[loss=0.1564, simple_loss=0.2295, pruned_loss=0.04162, over 4787.00 frames.], tot_loss[loss=0.1804, simple_loss=0.2451, pruned_loss=0.05785, over 971322.54 frames.], batch size: 17, lr: 8.12e-04 2022-05-04 02:23:39,858 INFO [train.py:715] (0/8) Epoch 1, batch 31250, loss[loss=0.2357, simple_loss=0.2887, pruned_loss=0.09139, over 4890.00 frames.], tot_loss[loss=0.1812, simple_loss=0.2459, pruned_loss=0.05825, over 972522.94 frames.], batch size: 16, lr: 8.12e-04 2022-05-04 02:24:19,625 INFO [train.py:715] (0/8) Epoch 1, batch 31300, loss[loss=0.1857, simple_loss=0.2434, pruned_loss=0.06402, over 4826.00 frames.], tot_loss[loss=0.1814, simple_loss=0.2462, pruned_loss=0.05833, over 971145.64 frames.], batch size: 26, lr: 8.11e-04 2022-05-04 02:24:59,065 INFO [train.py:715] (0/8) Epoch 1, batch 
31350, loss[loss=0.1818, simple_loss=0.2394, pruned_loss=0.06212, over 4958.00 frames.], tot_loss[loss=0.1813, simple_loss=0.246, pruned_loss=0.05831, over 971849.32 frames.], batch size: 35, lr: 8.11e-04 2022-05-04 02:25:38,863 INFO [train.py:715] (0/8) Epoch 1, batch 31400, loss[loss=0.1831, simple_loss=0.2562, pruned_loss=0.055, over 4770.00 frames.], tot_loss[loss=0.1812, simple_loss=0.2459, pruned_loss=0.05829, over 972143.01 frames.], batch size: 17, lr: 8.11e-04 2022-05-04 02:26:18,868 INFO [train.py:715] (0/8) Epoch 1, batch 31450, loss[loss=0.1924, simple_loss=0.2519, pruned_loss=0.06645, over 4774.00 frames.], tot_loss[loss=0.1795, simple_loss=0.2445, pruned_loss=0.05722, over 973103.17 frames.], batch size: 18, lr: 8.11e-04 2022-05-04 02:26:58,729 INFO [train.py:715] (0/8) Epoch 1, batch 31500, loss[loss=0.1771, simple_loss=0.2492, pruned_loss=0.05255, over 4898.00 frames.], tot_loss[loss=0.1804, simple_loss=0.2456, pruned_loss=0.05765, over 973660.45 frames.], batch size: 19, lr: 8.10e-04 2022-05-04 02:27:37,233 INFO [train.py:715] (0/8) Epoch 1, batch 31550, loss[loss=0.1776, simple_loss=0.2567, pruned_loss=0.04926, over 4827.00 frames.], tot_loss[loss=0.1811, simple_loss=0.2461, pruned_loss=0.05803, over 973414.32 frames.], batch size: 15, lr: 8.10e-04 2022-05-04 02:28:17,418 INFO [train.py:715] (0/8) Epoch 1, batch 31600, loss[loss=0.1559, simple_loss=0.2182, pruned_loss=0.04678, over 4854.00 frames.], tot_loss[loss=0.1809, simple_loss=0.2456, pruned_loss=0.05811, over 973329.84 frames.], batch size: 13, lr: 8.10e-04 2022-05-04 02:28:57,089 INFO [train.py:715] (0/8) Epoch 1, batch 31650, loss[loss=0.1747, simple_loss=0.2366, pruned_loss=0.05639, over 4849.00 frames.], tot_loss[loss=0.1801, simple_loss=0.2449, pruned_loss=0.05763, over 973491.95 frames.], batch size: 15, lr: 8.09e-04 2022-05-04 02:29:37,006 INFO [train.py:715] (0/8) Epoch 1, batch 31700, loss[loss=0.2743, simple_loss=0.319, pruned_loss=0.1148, over 4754.00 frames.], tot_loss[loss=0.1811, simple_loss=0.2456, pruned_loss=0.05828, over 972399.87 frames.], batch size: 19, lr: 8.09e-04 2022-05-04 02:30:16,364 INFO [train.py:715] (0/8) Epoch 1, batch 31750, loss[loss=0.1753, simple_loss=0.2473, pruned_loss=0.05161, over 4952.00 frames.], tot_loss[loss=0.1807, simple_loss=0.2451, pruned_loss=0.05813, over 971793.50 frames.], batch size: 24, lr: 8.09e-04 2022-05-04 02:30:56,488 INFO [train.py:715] (0/8) Epoch 1, batch 31800, loss[loss=0.1628, simple_loss=0.2343, pruned_loss=0.04565, over 4906.00 frames.], tot_loss[loss=0.1803, simple_loss=0.2448, pruned_loss=0.0579, over 971483.74 frames.], batch size: 19, lr: 8.08e-04 2022-05-04 02:31:36,271 INFO [train.py:715] (0/8) Epoch 1, batch 31850, loss[loss=0.1657, simple_loss=0.2374, pruned_loss=0.04706, over 4820.00 frames.], tot_loss[loss=0.1807, simple_loss=0.2455, pruned_loss=0.05791, over 970926.25 frames.], batch size: 15, lr: 8.08e-04 2022-05-04 02:32:15,746 INFO [train.py:715] (0/8) Epoch 1, batch 31900, loss[loss=0.1614, simple_loss=0.2265, pruned_loss=0.0481, over 4779.00 frames.], tot_loss[loss=0.1799, simple_loss=0.2445, pruned_loss=0.05765, over 971017.76 frames.], batch size: 12, lr: 8.08e-04 2022-05-04 02:32:55,110 INFO [train.py:715] (0/8) Epoch 1, batch 31950, loss[loss=0.1636, simple_loss=0.2306, pruned_loss=0.04832, over 4847.00 frames.], tot_loss[loss=0.1793, simple_loss=0.244, pruned_loss=0.05726, over 970690.44 frames.], batch size: 30, lr: 8.08e-04 2022-05-04 02:33:34,641 INFO [train.py:715] (0/8) Epoch 1, batch 32000, loss[loss=0.1829, 
simple_loss=0.242, pruned_loss=0.0619, over 4979.00 frames.], tot_loss[loss=0.1793, simple_loss=0.2442, pruned_loss=0.05724, over 970621.92 frames.], batch size: 28, lr: 8.07e-04 2022-05-04 02:34:14,071 INFO [train.py:715] (0/8) Epoch 1, batch 32050, loss[loss=0.1476, simple_loss=0.2257, pruned_loss=0.03481, over 4880.00 frames.], tot_loss[loss=0.1789, simple_loss=0.2435, pruned_loss=0.05708, over 970528.56 frames.], batch size: 16, lr: 8.07e-04 2022-05-04 02:34:53,319 INFO [train.py:715] (0/8) Epoch 1, batch 32100, loss[loss=0.1761, simple_loss=0.2511, pruned_loss=0.05058, over 4885.00 frames.], tot_loss[loss=0.1792, simple_loss=0.2436, pruned_loss=0.05739, over 970811.89 frames.], batch size: 16, lr: 8.07e-04 2022-05-04 02:35:32,942 INFO [train.py:715] (0/8) Epoch 1, batch 32150, loss[loss=0.1775, simple_loss=0.2414, pruned_loss=0.05677, over 4762.00 frames.], tot_loss[loss=0.179, simple_loss=0.2435, pruned_loss=0.0572, over 971267.04 frames.], batch size: 19, lr: 8.06e-04 2022-05-04 02:36:12,941 INFO [train.py:715] (0/8) Epoch 1, batch 32200, loss[loss=0.1675, simple_loss=0.2347, pruned_loss=0.05016, over 4810.00 frames.], tot_loss[loss=0.1785, simple_loss=0.243, pruned_loss=0.05701, over 972124.22 frames.], batch size: 21, lr: 8.06e-04 2022-05-04 02:36:51,843 INFO [train.py:715] (0/8) Epoch 1, batch 32250, loss[loss=0.1785, simple_loss=0.2418, pruned_loss=0.05763, over 4777.00 frames.], tot_loss[loss=0.1788, simple_loss=0.2435, pruned_loss=0.05704, over 972223.56 frames.], batch size: 14, lr: 8.06e-04 2022-05-04 02:37:31,251 INFO [train.py:715] (0/8) Epoch 1, batch 32300, loss[loss=0.1363, simple_loss=0.1974, pruned_loss=0.03758, over 4959.00 frames.], tot_loss[loss=0.1796, simple_loss=0.2441, pruned_loss=0.05752, over 972123.77 frames.], batch size: 15, lr: 8.05e-04 2022-05-04 02:38:10,691 INFO [train.py:715] (0/8) Epoch 1, batch 32350, loss[loss=0.1582, simple_loss=0.2153, pruned_loss=0.05053, over 4770.00 frames.], tot_loss[loss=0.1786, simple_loss=0.2435, pruned_loss=0.05683, over 972203.03 frames.], batch size: 19, lr: 8.05e-04 2022-05-04 02:38:50,286 INFO [train.py:715] (0/8) Epoch 1, batch 32400, loss[loss=0.1798, simple_loss=0.2375, pruned_loss=0.06106, over 4752.00 frames.], tot_loss[loss=0.1786, simple_loss=0.2433, pruned_loss=0.05691, over 971890.57 frames.], batch size: 19, lr: 8.05e-04 2022-05-04 02:39:29,218 INFO [train.py:715] (0/8) Epoch 1, batch 32450, loss[loss=0.1754, simple_loss=0.2341, pruned_loss=0.05838, over 4909.00 frames.], tot_loss[loss=0.1794, simple_loss=0.2439, pruned_loss=0.05744, over 971583.47 frames.], batch size: 18, lr: 8.05e-04 2022-05-04 02:40:08,862 INFO [train.py:715] (0/8) Epoch 1, batch 32500, loss[loss=0.1732, simple_loss=0.2422, pruned_loss=0.05204, over 4899.00 frames.], tot_loss[loss=0.1798, simple_loss=0.2444, pruned_loss=0.05755, over 971573.01 frames.], batch size: 19, lr: 8.04e-04 2022-05-04 02:40:48,382 INFO [train.py:715] (0/8) Epoch 1, batch 32550, loss[loss=0.161, simple_loss=0.2298, pruned_loss=0.04613, over 4782.00 frames.], tot_loss[loss=0.1799, simple_loss=0.2441, pruned_loss=0.05785, over 971101.98 frames.], batch size: 18, lr: 8.04e-04 2022-05-04 02:41:27,300 INFO [train.py:715] (0/8) Epoch 1, batch 32600, loss[loss=0.162, simple_loss=0.2237, pruned_loss=0.05012, over 4823.00 frames.], tot_loss[loss=0.1787, simple_loss=0.243, pruned_loss=0.05724, over 969887.47 frames.], batch size: 26, lr: 8.04e-04 2022-05-04 02:42:06,688 INFO [train.py:715] (0/8) Epoch 1, batch 32650, loss[loss=0.1816, simple_loss=0.2445, 
pruned_loss=0.05935, over 4944.00 frames.], tot_loss[loss=0.1785, simple_loss=0.2431, pruned_loss=0.057, over 971607.97 frames.], batch size: 21, lr: 8.03e-04 2022-05-04 02:42:46,238 INFO [train.py:715] (0/8) Epoch 1, batch 32700, loss[loss=0.1831, simple_loss=0.2521, pruned_loss=0.05709, over 4809.00 frames.], tot_loss[loss=0.1781, simple_loss=0.2428, pruned_loss=0.05669, over 972285.82 frames.], batch size: 25, lr: 8.03e-04 2022-05-04 02:43:25,967 INFO [train.py:715] (0/8) Epoch 1, batch 32750, loss[loss=0.1641, simple_loss=0.2269, pruned_loss=0.05064, over 4991.00 frames.], tot_loss[loss=0.1795, simple_loss=0.2438, pruned_loss=0.05758, over 971223.24 frames.], batch size: 16, lr: 8.03e-04 2022-05-04 02:44:05,926 INFO [train.py:715] (0/8) Epoch 1, batch 32800, loss[loss=0.1491, simple_loss=0.2141, pruned_loss=0.04204, over 4849.00 frames.], tot_loss[loss=0.1786, simple_loss=0.243, pruned_loss=0.05712, over 971261.23 frames.], batch size: 20, lr: 8.02e-04 2022-05-04 02:44:45,561 INFO [train.py:715] (0/8) Epoch 1, batch 32850, loss[loss=0.2191, simple_loss=0.2846, pruned_loss=0.07682, over 4911.00 frames.], tot_loss[loss=0.1789, simple_loss=0.2432, pruned_loss=0.05732, over 971313.20 frames.], batch size: 17, lr: 8.02e-04 2022-05-04 02:45:24,934 INFO [train.py:715] (0/8) Epoch 1, batch 32900, loss[loss=0.1678, simple_loss=0.2335, pruned_loss=0.05106, over 4762.00 frames.], tot_loss[loss=0.1791, simple_loss=0.2435, pruned_loss=0.05738, over 971356.10 frames.], batch size: 17, lr: 8.02e-04 2022-05-04 02:46:04,183 INFO [train.py:715] (0/8) Epoch 1, batch 32950, loss[loss=0.1599, simple_loss=0.2317, pruned_loss=0.044, over 4936.00 frames.], tot_loss[loss=0.1786, simple_loss=0.2433, pruned_loss=0.057, over 972399.28 frames.], batch size: 29, lr: 8.02e-04 2022-05-04 02:46:43,642 INFO [train.py:715] (0/8) Epoch 1, batch 33000, loss[loss=0.1451, simple_loss=0.2092, pruned_loss=0.04054, over 4826.00 frames.], tot_loss[loss=0.1796, simple_loss=0.2438, pruned_loss=0.05774, over 972586.14 frames.], batch size: 13, lr: 8.01e-04 2022-05-04 02:46:43,643 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 02:46:52,425 INFO [train.py:742] (0/8) Epoch 1, validation: loss=0.1208, simple_loss=0.2074, pruned_loss=0.01714, over 914524.00 frames. 
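The validation pass above runs 3000 batches after the previous one (batch 33000 vs. 30000), each time over the same 914524.00 validation frames, and is essentially flat (0.1208 vs. 0.1207) while the training tot_loss stays around 0.18. To pull these series out of the raw log, a small parser along the following lines is enough; it is a sketch that assumes one log entry per line (as in the unwrapped log) and uses only the field names visible in the format above.

    import re

    # Sketch: extract (epoch, batch, tot_loss, lr) from training lines and
    # (epoch, loss) from validation lines of the format used in this log.
    TRAIN_RE = re.compile(
        r"Epoch (\d+), batch (\d+),.*?tot_loss\[loss=([\d.]+),.*?lr: ([\d.e+-]+)"
    )
    VALID_RE = re.compile(r"Epoch (\d+), validation: loss=([\d.]+)")

    def parse(lines):
        train, valid = [], []
        for line in lines:
            m = TRAIN_RE.search(line)
            if m:
                epoch, batch, tot, lr = m.groups()
                train.append((int(epoch), int(batch), float(tot), float(lr)))
                continue
            m = VALID_RE.search(line)
            if m:
                valid.append((int(m.group(1)), float(m.group(2))))
        return train, valid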
2022-05-04 02:47:32,109 INFO [train.py:715] (0/8) Epoch 1, batch 33050, loss[loss=0.1337, simple_loss=0.2062, pruned_loss=0.03062, over 4781.00 frames.], tot_loss[loss=0.1794, simple_loss=0.2436, pruned_loss=0.05766, over 972432.32 frames.], batch size: 18, lr: 8.01e-04 2022-05-04 02:48:12,131 INFO [train.py:715] (0/8) Epoch 1, batch 33100, loss[loss=0.1608, simple_loss=0.2188, pruned_loss=0.05134, over 4802.00 frames.], tot_loss[loss=0.1796, simple_loss=0.2441, pruned_loss=0.05757, over 972299.13 frames.], batch size: 12, lr: 8.01e-04 2022-05-04 02:48:52,003 INFO [train.py:715] (0/8) Epoch 1, batch 33150, loss[loss=0.1611, simple_loss=0.2262, pruned_loss=0.04793, over 4771.00 frames.], tot_loss[loss=0.181, simple_loss=0.2451, pruned_loss=0.05848, over 972317.03 frames.], batch size: 17, lr: 8.00e-04 2022-05-04 02:49:31,141 INFO [train.py:715] (0/8) Epoch 1, batch 33200, loss[loss=0.1768, simple_loss=0.2391, pruned_loss=0.05725, over 4839.00 frames.], tot_loss[loss=0.1812, simple_loss=0.2454, pruned_loss=0.05855, over 972028.59 frames.], batch size: 26, lr: 8.00e-04 2022-05-04 02:50:11,561 INFO [train.py:715] (0/8) Epoch 1, batch 33250, loss[loss=0.1695, simple_loss=0.2318, pruned_loss=0.05357, over 4954.00 frames.], tot_loss[loss=0.1809, simple_loss=0.2449, pruned_loss=0.05844, over 971960.00 frames.], batch size: 15, lr: 8.00e-04 2022-05-04 02:50:51,595 INFO [train.py:715] (0/8) Epoch 1, batch 33300, loss[loss=0.1967, simple_loss=0.2484, pruned_loss=0.07245, over 4750.00 frames.], tot_loss[loss=0.1806, simple_loss=0.2449, pruned_loss=0.05817, over 971571.38 frames.], batch size: 19, lr: 8.00e-04 2022-05-04 02:51:31,065 INFO [train.py:715] (0/8) Epoch 1, batch 33350, loss[loss=0.1804, simple_loss=0.2443, pruned_loss=0.05825, over 4977.00 frames.], tot_loss[loss=0.181, simple_loss=0.2452, pruned_loss=0.05836, over 972092.65 frames.], batch size: 15, lr: 7.99e-04 2022-05-04 02:52:11,450 INFO [train.py:715] (0/8) Epoch 1, batch 33400, loss[loss=0.1592, simple_loss=0.2332, pruned_loss=0.04262, over 4822.00 frames.], tot_loss[loss=0.1798, simple_loss=0.2441, pruned_loss=0.05777, over 972865.79 frames.], batch size: 26, lr: 7.99e-04 2022-05-04 02:52:51,305 INFO [train.py:715] (0/8) Epoch 1, batch 33450, loss[loss=0.1451, simple_loss=0.2187, pruned_loss=0.03574, over 4806.00 frames.], tot_loss[loss=0.1789, simple_loss=0.2433, pruned_loss=0.0572, over 972494.46 frames.], batch size: 25, lr: 7.99e-04 2022-05-04 02:53:30,409 INFO [train.py:715] (0/8) Epoch 1, batch 33500, loss[loss=0.1834, simple_loss=0.2536, pruned_loss=0.05658, over 4855.00 frames.], tot_loss[loss=0.1782, simple_loss=0.243, pruned_loss=0.05664, over 971433.14 frames.], batch size: 22, lr: 7.98e-04 2022-05-04 02:54:10,342 INFO [train.py:715] (0/8) Epoch 1, batch 33550, loss[loss=0.1964, simple_loss=0.2681, pruned_loss=0.06239, over 4920.00 frames.], tot_loss[loss=0.1785, simple_loss=0.2434, pruned_loss=0.05679, over 971597.02 frames.], batch size: 18, lr: 7.98e-04 2022-05-04 02:54:50,186 INFO [train.py:715] (0/8) Epoch 1, batch 33600, loss[loss=0.1784, simple_loss=0.2512, pruned_loss=0.05275, over 4763.00 frames.], tot_loss[loss=0.1776, simple_loss=0.2425, pruned_loss=0.05632, over 971737.14 frames.], batch size: 19, lr: 7.98e-04 2022-05-04 02:55:29,610 INFO [train.py:715] (0/8) Epoch 1, batch 33650, loss[loss=0.1759, simple_loss=0.2381, pruned_loss=0.05691, over 4752.00 frames.], tot_loss[loss=0.1769, simple_loss=0.2422, pruned_loss=0.05584, over 971418.65 frames.], batch size: 19, lr: 7.97e-04 2022-05-04 02:56:08,654 INFO 
[train.py:715] (0/8) Epoch 1, batch 33700, loss[loss=0.2104, simple_loss=0.2658, pruned_loss=0.07746, over 4688.00 frames.], tot_loss[loss=0.177, simple_loss=0.2423, pruned_loss=0.05588, over 971740.24 frames.], batch size: 15, lr: 7.97e-04 2022-05-04 02:56:47,810 INFO [train.py:715] (0/8) Epoch 1, batch 33750, loss[loss=0.1768, simple_loss=0.2463, pruned_loss=0.05362, over 4905.00 frames.], tot_loss[loss=0.1759, simple_loss=0.2416, pruned_loss=0.05513, over 971105.74 frames.], batch size: 19, lr: 7.97e-04 2022-05-04 02:57:27,455 INFO [train.py:715] (0/8) Epoch 1, batch 33800, loss[loss=0.1735, simple_loss=0.2443, pruned_loss=0.05136, over 4791.00 frames.], tot_loss[loss=0.1767, simple_loss=0.242, pruned_loss=0.05573, over 970699.07 frames.], batch size: 17, lr: 7.97e-04 2022-05-04 02:58:06,286 INFO [train.py:715] (0/8) Epoch 1, batch 33850, loss[loss=0.1734, simple_loss=0.246, pruned_loss=0.05037, over 4914.00 frames.], tot_loss[loss=0.1767, simple_loss=0.2422, pruned_loss=0.05562, over 971420.36 frames.], batch size: 18, lr: 7.96e-04 2022-05-04 02:58:45,808 INFO [train.py:715] (0/8) Epoch 1, batch 33900, loss[loss=0.1798, simple_loss=0.232, pruned_loss=0.06382, over 4978.00 frames.], tot_loss[loss=0.1777, simple_loss=0.2427, pruned_loss=0.05634, over 971373.25 frames.], batch size: 33, lr: 7.96e-04 2022-05-04 02:59:25,365 INFO [train.py:715] (0/8) Epoch 1, batch 33950, loss[loss=0.1734, simple_loss=0.2454, pruned_loss=0.05075, over 4959.00 frames.], tot_loss[loss=0.1783, simple_loss=0.243, pruned_loss=0.05674, over 971665.10 frames.], batch size: 15, lr: 7.96e-04 2022-05-04 03:00:05,096 INFO [train.py:715] (0/8) Epoch 1, batch 34000, loss[loss=0.1466, simple_loss=0.2158, pruned_loss=0.03868, over 4888.00 frames.], tot_loss[loss=0.1769, simple_loss=0.2425, pruned_loss=0.0557, over 972579.18 frames.], batch size: 19, lr: 7.95e-04 2022-05-04 03:00:44,412 INFO [train.py:715] (0/8) Epoch 1, batch 34050, loss[loss=0.2029, simple_loss=0.2637, pruned_loss=0.07103, over 4896.00 frames.], tot_loss[loss=0.1789, simple_loss=0.244, pruned_loss=0.05688, over 973127.46 frames.], batch size: 38, lr: 7.95e-04 2022-05-04 03:01:23,803 INFO [train.py:715] (0/8) Epoch 1, batch 34100, loss[loss=0.1805, simple_loss=0.2452, pruned_loss=0.05786, over 4876.00 frames.], tot_loss[loss=0.1795, simple_loss=0.2446, pruned_loss=0.05724, over 973148.55 frames.], batch size: 32, lr: 7.95e-04 2022-05-04 03:02:03,181 INFO [train.py:715] (0/8) Epoch 1, batch 34150, loss[loss=0.1777, simple_loss=0.2387, pruned_loss=0.05829, over 4942.00 frames.], tot_loss[loss=0.18, simple_loss=0.245, pruned_loss=0.05746, over 972944.99 frames.], batch size: 35, lr: 7.95e-04 2022-05-04 03:02:42,214 INFO [train.py:715] (0/8) Epoch 1, batch 34200, loss[loss=0.2074, simple_loss=0.259, pruned_loss=0.07792, over 4866.00 frames.], tot_loss[loss=0.1794, simple_loss=0.2445, pruned_loss=0.0572, over 972309.16 frames.], batch size: 34, lr: 7.94e-04 2022-05-04 03:03:21,756 INFO [train.py:715] (0/8) Epoch 1, batch 34250, loss[loss=0.2414, simple_loss=0.2943, pruned_loss=0.09423, over 4936.00 frames.], tot_loss[loss=0.1789, simple_loss=0.2441, pruned_loss=0.05688, over 972846.50 frames.], batch size: 39, lr: 7.94e-04 2022-05-04 03:04:01,442 INFO [train.py:715] (0/8) Epoch 1, batch 34300, loss[loss=0.1982, simple_loss=0.2586, pruned_loss=0.0689, over 4925.00 frames.], tot_loss[loss=0.1782, simple_loss=0.2431, pruned_loss=0.05667, over 971985.77 frames.], batch size: 29, lr: 7.94e-04 2022-05-04 03:04:40,851 INFO [train.py:715] (0/8) Epoch 1, batch 34350, 
loss[loss=0.1397, simple_loss=0.2213, pruned_loss=0.02903, over 4758.00 frames.], tot_loss[loss=0.1776, simple_loss=0.2428, pruned_loss=0.05619, over 972149.03 frames.], batch size: 19, lr: 7.93e-04 2022-05-04 03:05:19,755 INFO [train.py:715] (0/8) Epoch 1, batch 34400, loss[loss=0.1861, simple_loss=0.2506, pruned_loss=0.06076, over 4827.00 frames.], tot_loss[loss=0.1774, simple_loss=0.2427, pruned_loss=0.05608, over 972476.15 frames.], batch size: 30, lr: 7.93e-04 2022-05-04 03:05:59,262 INFO [train.py:715] (0/8) Epoch 1, batch 34450, loss[loss=0.2054, simple_loss=0.2693, pruned_loss=0.07074, over 4772.00 frames.], tot_loss[loss=0.1776, simple_loss=0.2427, pruned_loss=0.05623, over 972353.16 frames.], batch size: 18, lr: 7.93e-04 2022-05-04 03:06:38,483 INFO [train.py:715] (0/8) Epoch 1, batch 34500, loss[loss=0.1667, simple_loss=0.228, pruned_loss=0.0527, over 4769.00 frames.], tot_loss[loss=0.1777, simple_loss=0.243, pruned_loss=0.05621, over 971717.77 frames.], batch size: 14, lr: 7.93e-04 2022-05-04 03:07:17,767 INFO [train.py:715] (0/8) Epoch 1, batch 34550, loss[loss=0.1931, simple_loss=0.2572, pruned_loss=0.0645, over 4741.00 frames.], tot_loss[loss=0.1785, simple_loss=0.2438, pruned_loss=0.05663, over 971544.27 frames.], batch size: 16, lr: 7.92e-04 2022-05-04 03:07:57,340 INFO [train.py:715] (0/8) Epoch 1, batch 34600, loss[loss=0.2406, simple_loss=0.2893, pruned_loss=0.09596, over 4978.00 frames.], tot_loss[loss=0.1792, simple_loss=0.2442, pruned_loss=0.05708, over 971417.42 frames.], batch size: 35, lr: 7.92e-04 2022-05-04 03:08:37,231 INFO [train.py:715] (0/8) Epoch 1, batch 34650, loss[loss=0.1602, simple_loss=0.2256, pruned_loss=0.04743, over 4979.00 frames.], tot_loss[loss=0.1789, simple_loss=0.2438, pruned_loss=0.05702, over 971873.72 frames.], batch size: 28, lr: 7.92e-04 2022-05-04 03:09:17,429 INFO [train.py:715] (0/8) Epoch 1, batch 34700, loss[loss=0.1865, simple_loss=0.2643, pruned_loss=0.05433, over 4685.00 frames.], tot_loss[loss=0.178, simple_loss=0.2429, pruned_loss=0.05656, over 972383.52 frames.], batch size: 15, lr: 7.91e-04 2022-05-04 03:09:55,742 INFO [train.py:715] (0/8) Epoch 1, batch 34750, loss[loss=0.1631, simple_loss=0.2347, pruned_loss=0.04574, over 4932.00 frames.], tot_loss[loss=0.1792, simple_loss=0.2437, pruned_loss=0.05732, over 972950.78 frames.], batch size: 18, lr: 7.91e-04 2022-05-04 03:10:32,247 INFO [train.py:715] (0/8) Epoch 1, batch 34800, loss[loss=0.1968, simple_loss=0.2619, pruned_loss=0.06584, over 4955.00 frames.], tot_loss[loss=0.1786, simple_loss=0.2433, pruned_loss=0.05698, over 973505.85 frames.], batch size: 21, lr: 7.91e-04 2022-05-04 03:10:40,802 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/epoch-1.pt 2022-05-04 03:11:25,704 INFO [train.py:715] (0/8) Epoch 2, batch 0, loss[loss=0.1955, simple_loss=0.258, pruned_loss=0.06654, over 4984.00 frames.], tot_loss[loss=0.1955, simple_loss=0.258, pruned_loss=0.06654, over 4984.00 frames.], batch size: 14, lr: 7.59e-04 2022-05-04 03:12:05,765 INFO [train.py:715] (0/8) Epoch 2, batch 50, loss[loss=0.2023, simple_loss=0.2653, pruned_loss=0.06965, over 4823.00 frames.], tot_loss[loss=0.1795, simple_loss=0.2445, pruned_loss=0.05721, over 219220.08 frames.], batch size: 26, lr: 7.59e-04 2022-05-04 03:12:46,580 INFO [train.py:715] (0/8) Epoch 2, batch 100, loss[loss=0.1902, simple_loss=0.2417, pruned_loss=0.06938, over 4829.00 frames.], tot_loss[loss=0.1777, simple_loss=0.2433, pruned_loss=0.05607, over 386146.00 frames.], batch size: 25, lr: 7.59e-04 
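Across the epoch boundary above, the learning rate steps from 7.91e-04 at the end of epoch 1 to 7.59e-04 at Epoch 2, batch 0, on top of its slow per-batch decay. Both effects are consistent with an Eden-style rule lr = base_lr * ((b^2 + B^2)/B^2)^-0.25 * ((e^2 + E^2)/E^2)^-0.25, where b is the global batch counter implied by the checkpoint filenames (checkpoint-64000.pt was written near Epoch 1, batch 29200), e is the logged epoch number, and base_lr = 0.003, B = 5000, E = 4 are values that reproduce the logged rates. This is an inference fitted to the numbers, not a quote of the training code; a minimal sketch:

    # Sketch of an Eden-style schedule that reproduces the logged rates.
    # The constants and global batch counts are inferred from the log
    # (checkpoint-64000.pt ~ Epoch 1, batch 29200; epoch 1 ends ~ batch
    # 34800, i.e. global batch ~ 64000 + 34800 - 29200 = 69600).
    def eden_lr(global_batch, epoch, base_lr=0.003, lr_batches=5000, lr_epochs=4):
        batch_factor = ((global_batch**2 + lr_batches**2) / lr_batches**2) ** -0.25
        epoch_factor = ((epoch**2 + lr_epochs**2) / lr_epochs**2) ** -0.25
        return base_lr * batch_factor * epoch_factor

    print(eden_lr(64000, 1))  # ~8.25e-04, cf. Epoch 1 around batch 29200
    print(eden_lr(69600, 1))  # ~7.91e-04, cf. the last epoch-1 entries
    print(eden_lr(69600, 2))  # ~7.59e-04, cf. Epoch 2, batch 0

The step at the boundary comes entirely from the epoch factor: the global batch counter does not reset, so only e changes from 1 to 2.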
2022-05-04 03:13:27,194 INFO [train.py:715] (0/8) Epoch 2, batch 150, loss[loss=0.2304, simple_loss=0.2893, pruned_loss=0.08581, over 4924.00 frames.], tot_loss[loss=0.1769, simple_loss=0.2422, pruned_loss=0.0558, over 516277.80 frames.], batch size: 18, lr: 7.59e-04 2022-05-04 03:14:07,244 INFO [train.py:715] (0/8) Epoch 2, batch 200, loss[loss=0.1765, simple_loss=0.2372, pruned_loss=0.0579, over 4776.00 frames.], tot_loss[loss=0.1765, simple_loss=0.2423, pruned_loss=0.05535, over 616018.39 frames.], batch size: 17, lr: 7.58e-04 2022-05-04 03:14:48,000 INFO [train.py:715] (0/8) Epoch 2, batch 250, loss[loss=0.2065, simple_loss=0.2718, pruned_loss=0.0706, over 4919.00 frames.], tot_loss[loss=0.1754, simple_loss=0.2418, pruned_loss=0.0545, over 695489.15 frames.], batch size: 23, lr: 7.58e-04 2022-05-04 03:15:29,360 INFO [train.py:715] (0/8) Epoch 2, batch 300, loss[loss=0.159, simple_loss=0.2225, pruned_loss=0.0477, over 4930.00 frames.], tot_loss[loss=0.1752, simple_loss=0.2415, pruned_loss=0.05443, over 757584.58 frames.], batch size: 29, lr: 7.58e-04 2022-05-04 03:16:10,302 INFO [train.py:715] (0/8) Epoch 2, batch 350, loss[loss=0.1929, simple_loss=0.2579, pruned_loss=0.06394, over 4858.00 frames.], tot_loss[loss=0.1759, simple_loss=0.242, pruned_loss=0.05493, over 804769.30 frames.], batch size: 32, lr: 7.57e-04 2022-05-04 03:16:49,966 INFO [train.py:715] (0/8) Epoch 2, batch 400, loss[loss=0.18, simple_loss=0.2578, pruned_loss=0.05112, over 4818.00 frames.], tot_loss[loss=0.177, simple_loss=0.2433, pruned_loss=0.05537, over 841973.59 frames.], batch size: 25, lr: 7.57e-04 2022-05-04 03:17:30,478 INFO [train.py:715] (0/8) Epoch 2, batch 450, loss[loss=0.1922, simple_loss=0.2486, pruned_loss=0.06796, over 4979.00 frames.], tot_loss[loss=0.1782, simple_loss=0.244, pruned_loss=0.05621, over 870596.31 frames.], batch size: 31, lr: 7.57e-04 2022-05-04 03:18:11,618 INFO [train.py:715] (0/8) Epoch 2, batch 500, loss[loss=0.1973, simple_loss=0.2584, pruned_loss=0.06812, over 4873.00 frames.], tot_loss[loss=0.177, simple_loss=0.2428, pruned_loss=0.05556, over 893846.55 frames.], batch size: 16, lr: 7.57e-04 2022-05-04 03:18:51,551 INFO [train.py:715] (0/8) Epoch 2, batch 550, loss[loss=0.2043, simple_loss=0.2574, pruned_loss=0.07558, over 4956.00 frames.], tot_loss[loss=0.1766, simple_loss=0.2426, pruned_loss=0.05533, over 911445.02 frames.], batch size: 14, lr: 7.56e-04 2022-05-04 03:19:31,926 INFO [train.py:715] (0/8) Epoch 2, batch 600, loss[loss=0.1879, simple_loss=0.2524, pruned_loss=0.06173, over 4901.00 frames.], tot_loss[loss=0.178, simple_loss=0.2437, pruned_loss=0.05619, over 924849.98 frames.], batch size: 17, lr: 7.56e-04 2022-05-04 03:20:12,750 INFO [train.py:715] (0/8) Epoch 2, batch 650, loss[loss=0.184, simple_loss=0.2541, pruned_loss=0.05698, over 4852.00 frames.], tot_loss[loss=0.1788, simple_loss=0.244, pruned_loss=0.0568, over 935683.99 frames.], batch size: 30, lr: 7.56e-04 2022-05-04 03:20:53,346 INFO [train.py:715] (0/8) Epoch 2, batch 700, loss[loss=0.1823, simple_loss=0.2519, pruned_loss=0.05636, over 4978.00 frames.], tot_loss[loss=0.1784, simple_loss=0.2439, pruned_loss=0.05645, over 945360.39 frames.], batch size: 15, lr: 7.56e-04 2022-05-04 03:21:32,904 INFO [train.py:715] (0/8) Epoch 2, batch 750, loss[loss=0.1713, simple_loss=0.2396, pruned_loss=0.0515, over 4906.00 frames.], tot_loss[loss=0.1787, simple_loss=0.2444, pruned_loss=0.0565, over 951209.04 frames.], batch size: 18, lr: 7.55e-04 2022-05-04 03:22:13,344 INFO [train.py:715] (0/8) Epoch 2, batch 800, 
loss[loss=0.1928, simple_loss=0.256, pruned_loss=0.06479, over 4698.00 frames.], tot_loss[loss=0.1782, simple_loss=0.2439, pruned_loss=0.05622, over 955335.21 frames.], batch size: 15, lr: 7.55e-04 2022-05-04 03:22:53,995 INFO [train.py:715] (0/8) Epoch 2, batch 850, loss[loss=0.1945, simple_loss=0.2586, pruned_loss=0.06518, over 4757.00 frames.], tot_loss[loss=0.1767, simple_loss=0.243, pruned_loss=0.0552, over 959369.42 frames.], batch size: 19, lr: 7.55e-04 2022-05-04 03:23:34,288 INFO [train.py:715] (0/8) Epoch 2, batch 900, loss[loss=0.1509, simple_loss=0.2227, pruned_loss=0.03961, over 4782.00 frames.], tot_loss[loss=0.1757, simple_loss=0.2418, pruned_loss=0.05474, over 961894.43 frames.], batch size: 17, lr: 7.55e-04 2022-05-04 03:24:14,710 INFO [train.py:715] (0/8) Epoch 2, batch 950, loss[loss=0.166, simple_loss=0.2268, pruned_loss=0.0526, over 4899.00 frames.], tot_loss[loss=0.1757, simple_loss=0.2417, pruned_loss=0.05482, over 963717.93 frames.], batch size: 19, lr: 7.54e-04 2022-05-04 03:24:55,396 INFO [train.py:715] (0/8) Epoch 2, batch 1000, loss[loss=0.1635, simple_loss=0.2375, pruned_loss=0.04477, over 4974.00 frames.], tot_loss[loss=0.1762, simple_loss=0.2416, pruned_loss=0.05534, over 965790.29 frames.], batch size: 24, lr: 7.54e-04 2022-05-04 03:25:36,197 INFO [train.py:715] (0/8) Epoch 2, batch 1050, loss[loss=0.1687, simple_loss=0.232, pruned_loss=0.05267, over 4827.00 frames.], tot_loss[loss=0.177, simple_loss=0.2422, pruned_loss=0.05593, over 967122.48 frames.], batch size: 13, lr: 7.54e-04 2022-05-04 03:26:15,800 INFO [train.py:715] (0/8) Epoch 2, batch 1100, loss[loss=0.1716, simple_loss=0.2392, pruned_loss=0.052, over 4780.00 frames.], tot_loss[loss=0.1773, simple_loss=0.2424, pruned_loss=0.05611, over 968758.42 frames.], batch size: 18, lr: 7.53e-04 2022-05-04 03:26:56,299 INFO [train.py:715] (0/8) Epoch 2, batch 1150, loss[loss=0.1738, simple_loss=0.2322, pruned_loss=0.05768, over 4873.00 frames.], tot_loss[loss=0.1771, simple_loss=0.2421, pruned_loss=0.05609, over 968874.28 frames.], batch size: 16, lr: 7.53e-04 2022-05-04 03:27:37,634 INFO [train.py:715] (0/8) Epoch 2, batch 1200, loss[loss=0.1932, simple_loss=0.2597, pruned_loss=0.06339, over 4828.00 frames.], tot_loss[loss=0.1761, simple_loss=0.2412, pruned_loss=0.05546, over 969549.38 frames.], batch size: 15, lr: 7.53e-04 2022-05-04 03:28:18,253 INFO [train.py:715] (0/8) Epoch 2, batch 1250, loss[loss=0.1561, simple_loss=0.2182, pruned_loss=0.04699, over 4778.00 frames.], tot_loss[loss=0.1751, simple_loss=0.2408, pruned_loss=0.05472, over 968914.11 frames.], batch size: 14, lr: 7.53e-04 2022-05-04 03:28:57,933 INFO [train.py:715] (0/8) Epoch 2, batch 1300, loss[loss=0.1752, simple_loss=0.24, pruned_loss=0.05521, over 4981.00 frames.], tot_loss[loss=0.1755, simple_loss=0.2412, pruned_loss=0.05483, over 969610.72 frames.], batch size: 24, lr: 7.52e-04 2022-05-04 03:29:38,475 INFO [train.py:715] (0/8) Epoch 2, batch 1350, loss[loss=0.1539, simple_loss=0.2285, pruned_loss=0.03962, over 4827.00 frames.], tot_loss[loss=0.1755, simple_loss=0.2406, pruned_loss=0.05521, over 970847.05 frames.], batch size: 13, lr: 7.52e-04 2022-05-04 03:30:19,105 INFO [train.py:715] (0/8) Epoch 2, batch 1400, loss[loss=0.2299, simple_loss=0.2915, pruned_loss=0.08414, over 4869.00 frames.], tot_loss[loss=0.1757, simple_loss=0.2408, pruned_loss=0.05531, over 971196.05 frames.], batch size: 16, lr: 7.52e-04 2022-05-04 03:30:59,077 INFO [train.py:715] (0/8) Epoch 2, batch 1450, loss[loss=0.1493, simple_loss=0.2108, pruned_loss=0.04397, 
over 4846.00 frames.], tot_loss[loss=0.1772, simple_loss=0.2422, pruned_loss=0.05608, over 972129.63 frames.], batch size: 13, lr: 7.52e-04 2022-05-04 03:31:39,476 INFO [train.py:715] (0/8) Epoch 2, batch 1500, loss[loss=0.2333, simple_loss=0.2823, pruned_loss=0.09214, over 4841.00 frames.], tot_loss[loss=0.178, simple_loss=0.243, pruned_loss=0.05646, over 971940.29 frames.], batch size: 15, lr: 7.51e-04 2022-05-04 03:32:20,463 INFO [train.py:715] (0/8) Epoch 2, batch 1550, loss[loss=0.1816, simple_loss=0.2469, pruned_loss=0.05817, over 4896.00 frames.], tot_loss[loss=0.1776, simple_loss=0.2428, pruned_loss=0.05624, over 971668.09 frames.], batch size: 19, lr: 7.51e-04 2022-05-04 03:33:00,533 INFO [train.py:715] (0/8) Epoch 2, batch 1600, loss[loss=0.1684, simple_loss=0.2374, pruned_loss=0.04969, over 4857.00 frames.], tot_loss[loss=0.1769, simple_loss=0.2421, pruned_loss=0.05585, over 972135.93 frames.], batch size: 20, lr: 7.51e-04 2022-05-04 03:33:40,357 INFO [train.py:715] (0/8) Epoch 2, batch 1650, loss[loss=0.1647, simple_loss=0.2305, pruned_loss=0.04941, over 4925.00 frames.], tot_loss[loss=0.177, simple_loss=0.2424, pruned_loss=0.05581, over 972478.47 frames.], batch size: 23, lr: 7.51e-04 2022-05-04 03:34:21,224 INFO [train.py:715] (0/8) Epoch 2, batch 1700, loss[loss=0.1498, simple_loss=0.225, pruned_loss=0.03728, over 4693.00 frames.], tot_loss[loss=0.1759, simple_loss=0.2418, pruned_loss=0.055, over 972475.43 frames.], batch size: 15, lr: 7.50e-04 2022-05-04 03:35:02,272 INFO [train.py:715] (0/8) Epoch 2, batch 1750, loss[loss=0.1658, simple_loss=0.2333, pruned_loss=0.04915, over 4935.00 frames.], tot_loss[loss=0.1766, simple_loss=0.2422, pruned_loss=0.05546, over 972752.75 frames.], batch size: 23, lr: 7.50e-04 2022-05-04 03:35:42,176 INFO [train.py:715] (0/8) Epoch 2, batch 1800, loss[loss=0.1805, simple_loss=0.2446, pruned_loss=0.05824, over 4954.00 frames.], tot_loss[loss=0.1767, simple_loss=0.2422, pruned_loss=0.05565, over 973179.73 frames.], batch size: 24, lr: 7.50e-04 2022-05-04 03:36:22,542 INFO [train.py:715] (0/8) Epoch 2, batch 1850, loss[loss=0.1662, simple_loss=0.2443, pruned_loss=0.04407, over 4905.00 frames.], tot_loss[loss=0.1767, simple_loss=0.2426, pruned_loss=0.05539, over 972924.67 frames.], batch size: 19, lr: 7.50e-04 2022-05-04 03:37:03,520 INFO [train.py:715] (0/8) Epoch 2, batch 1900, loss[loss=0.181, simple_loss=0.2439, pruned_loss=0.05907, over 4853.00 frames.], tot_loss[loss=0.176, simple_loss=0.2417, pruned_loss=0.0551, over 973155.00 frames.], batch size: 26, lr: 7.49e-04 2022-05-04 03:37:44,295 INFO [train.py:715] (0/8) Epoch 2, batch 1950, loss[loss=0.1731, simple_loss=0.2461, pruned_loss=0.05002, over 4796.00 frames.], tot_loss[loss=0.1756, simple_loss=0.2416, pruned_loss=0.05477, over 973174.97 frames.], batch size: 24, lr: 7.49e-04 2022-05-04 03:38:24,096 INFO [train.py:715] (0/8) Epoch 2, batch 2000, loss[loss=0.1919, simple_loss=0.2598, pruned_loss=0.06202, over 4803.00 frames.], tot_loss[loss=0.1739, simple_loss=0.2403, pruned_loss=0.05377, over 972497.47 frames.], batch size: 24, lr: 7.49e-04 2022-05-04 03:39:04,258 INFO [train.py:715] (0/8) Epoch 2, batch 2050, loss[loss=0.1527, simple_loss=0.2297, pruned_loss=0.03783, over 4793.00 frames.], tot_loss[loss=0.1755, simple_loss=0.2413, pruned_loss=0.05485, over 971849.87 frames.], batch size: 21, lr: 7.48e-04 2022-05-04 03:39:45,385 INFO [train.py:715] (0/8) Epoch 2, batch 2100, loss[loss=0.2245, simple_loss=0.2688, pruned_loss=0.09008, over 4803.00 frames.], tot_loss[loss=0.1755, 
simple_loss=0.2411, pruned_loss=0.05496, over 971456.78 frames.], batch size: 25, lr: 7.48e-04 2022-05-04 03:40:25,359 INFO [train.py:715] (0/8) Epoch 2, batch 2150, loss[loss=0.1644, simple_loss=0.2181, pruned_loss=0.05533, over 4838.00 frames.], tot_loss[loss=0.1748, simple_loss=0.2409, pruned_loss=0.05436, over 971928.37 frames.], batch size: 13, lr: 7.48e-04 2022-05-04 03:41:04,887 INFO [train.py:715] (0/8) Epoch 2, batch 2200, loss[loss=0.1929, simple_loss=0.2517, pruned_loss=0.06705, over 4858.00 frames.], tot_loss[loss=0.1754, simple_loss=0.2412, pruned_loss=0.05481, over 973315.86 frames.], batch size: 20, lr: 7.48e-04 2022-05-04 03:41:45,610 INFO [train.py:715] (0/8) Epoch 2, batch 2250, loss[loss=0.1969, simple_loss=0.2508, pruned_loss=0.07149, over 4824.00 frames.], tot_loss[loss=0.1756, simple_loss=0.2412, pruned_loss=0.05497, over 972836.69 frames.], batch size: 26, lr: 7.47e-04 2022-05-04 03:42:26,411 INFO [train.py:715] (0/8) Epoch 2, batch 2300, loss[loss=0.1628, simple_loss=0.2243, pruned_loss=0.05067, over 4871.00 frames.], tot_loss[loss=0.1752, simple_loss=0.2408, pruned_loss=0.05477, over 973187.66 frames.], batch size: 22, lr: 7.47e-04 2022-05-04 03:43:05,613 INFO [train.py:715] (0/8) Epoch 2, batch 2350, loss[loss=0.1733, simple_loss=0.2348, pruned_loss=0.05595, over 4983.00 frames.], tot_loss[loss=0.1751, simple_loss=0.2403, pruned_loss=0.05488, over 972630.13 frames.], batch size: 15, lr: 7.47e-04 2022-05-04 03:43:30,572 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-72000.pt 2022-05-04 03:43:48,327 INFO [train.py:715] (0/8) Epoch 2, batch 2400, loss[loss=0.1365, simple_loss=0.2153, pruned_loss=0.02882, over 4893.00 frames.], tot_loss[loss=0.1737, simple_loss=0.2394, pruned_loss=0.05398, over 971376.64 frames.], batch size: 19, lr: 7.47e-04 2022-05-04 03:44:29,317 INFO [train.py:715] (0/8) Epoch 2, batch 2450, loss[loss=0.1495, simple_loss=0.219, pruned_loss=0.03996, over 4989.00 frames.], tot_loss[loss=0.1735, simple_loss=0.2392, pruned_loss=0.05385, over 971112.66 frames.], batch size: 14, lr: 7.46e-04 2022-05-04 03:45:09,453 INFO [train.py:715] (0/8) Epoch 2, batch 2500, loss[loss=0.16, simple_loss=0.247, pruned_loss=0.03652, over 4848.00 frames.], tot_loss[loss=0.1739, simple_loss=0.2397, pruned_loss=0.05402, over 971713.05 frames.], batch size: 15, lr: 7.46e-04 2022-05-04 03:45:49,049 INFO [train.py:715] (0/8) Epoch 2, batch 2550, loss[loss=0.169, simple_loss=0.244, pruned_loss=0.04697, over 4850.00 frames.], tot_loss[loss=0.1741, simple_loss=0.24, pruned_loss=0.05411, over 972038.49 frames.], batch size: 13, lr: 7.46e-04 2022-05-04 03:46:29,874 INFO [train.py:715] (0/8) Epoch 2, batch 2600, loss[loss=0.1844, simple_loss=0.2468, pruned_loss=0.06104, over 4904.00 frames.], tot_loss[loss=0.1742, simple_loss=0.2403, pruned_loss=0.05409, over 973125.32 frames.], batch size: 19, lr: 7.46e-04 2022-05-04 03:47:10,394 INFO [train.py:715] (0/8) Epoch 2, batch 2650, loss[loss=0.1587, simple_loss=0.2251, pruned_loss=0.04611, over 4787.00 frames.], tot_loss[loss=0.1746, simple_loss=0.2407, pruned_loss=0.05428, over 973403.90 frames.], batch size: 17, lr: 7.45e-04 2022-05-04 03:47:49,286 INFO [train.py:715] (0/8) Epoch 2, batch 2700, loss[loss=0.151, simple_loss=0.2217, pruned_loss=0.04019, over 4883.00 frames.], tot_loss[loss=0.1738, simple_loss=0.2401, pruned_loss=0.05377, over 973164.48 frames.], batch size: 16, lr: 7.45e-04 2022-05-04 03:48:29,308 INFO [train.py:715] (0/8) Epoch 2, batch 2750, loss[loss=0.2049, 
simple_loss=0.2661, pruned_loss=0.07182, over 4755.00 frames.], tot_loss[loss=0.175, simple_loss=0.2411, pruned_loss=0.0545, over 972983.96 frames.], batch size: 16, lr: 7.45e-04 2022-05-04 03:49:10,357 INFO [train.py:715] (0/8) Epoch 2, batch 2800, loss[loss=0.23, simple_loss=0.2887, pruned_loss=0.08568, over 4803.00 frames.], tot_loss[loss=0.1755, simple_loss=0.2411, pruned_loss=0.0549, over 972411.56 frames.], batch size: 14, lr: 7.45e-04 2022-05-04 03:49:50,284 INFO [train.py:715] (0/8) Epoch 2, batch 2850, loss[loss=0.2061, simple_loss=0.2784, pruned_loss=0.06685, over 4985.00 frames.], tot_loss[loss=0.1765, simple_loss=0.2418, pruned_loss=0.05555, over 973162.80 frames.], batch size: 28, lr: 7.44e-04 2022-05-04 03:50:29,542 INFO [train.py:715] (0/8) Epoch 2, batch 2900, loss[loss=0.161, simple_loss=0.2135, pruned_loss=0.05419, over 4772.00 frames.], tot_loss[loss=0.1764, simple_loss=0.2418, pruned_loss=0.05547, over 973843.53 frames.], batch size: 12, lr: 7.44e-04 2022-05-04 03:51:09,900 INFO [train.py:715] (0/8) Epoch 2, batch 2950, loss[loss=0.1469, simple_loss=0.2245, pruned_loss=0.03465, over 4979.00 frames.], tot_loss[loss=0.176, simple_loss=0.2414, pruned_loss=0.05529, over 973900.85 frames.], batch size: 26, lr: 7.44e-04 2022-05-04 03:51:50,599 INFO [train.py:715] (0/8) Epoch 2, batch 3000, loss[loss=0.1959, simple_loss=0.2591, pruned_loss=0.06639, over 4886.00 frames.], tot_loss[loss=0.1752, simple_loss=0.2405, pruned_loss=0.05496, over 973706.75 frames.], batch size: 16, lr: 7.44e-04 2022-05-04 03:51:50,600 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 03:52:00,003 INFO [train.py:742] (0/8) Epoch 2, validation: loss=0.1191, simple_loss=0.2058, pruned_loss=0.01615, over 914524.00 frames. 2022-05-04 03:52:40,627 INFO [train.py:715] (0/8) Epoch 2, batch 3050, loss[loss=0.1489, simple_loss=0.2226, pruned_loss=0.03756, over 4718.00 frames.], tot_loss[loss=0.1759, simple_loss=0.2412, pruned_loss=0.05534, over 973632.76 frames.], batch size: 15, lr: 7.43e-04 2022-05-04 03:53:19,878 INFO [train.py:715] (0/8) Epoch 2, batch 3100, loss[loss=0.1751, simple_loss=0.2254, pruned_loss=0.06236, over 4844.00 frames.], tot_loss[loss=0.1754, simple_loss=0.2409, pruned_loss=0.05498, over 973715.52 frames.], batch size: 32, lr: 7.43e-04 2022-05-04 03:53:59,885 INFO [train.py:715] (0/8) Epoch 2, batch 3150, loss[loss=0.1961, simple_loss=0.2497, pruned_loss=0.07126, over 4852.00 frames.], tot_loss[loss=0.1764, simple_loss=0.2415, pruned_loss=0.05565, over 973428.62 frames.], batch size: 30, lr: 7.43e-04 2022-05-04 03:54:40,150 INFO [train.py:715] (0/8) Epoch 2, batch 3200, loss[loss=0.1739, simple_loss=0.2381, pruned_loss=0.05488, over 4982.00 frames.], tot_loss[loss=0.1769, simple_loss=0.242, pruned_loss=0.05586, over 973581.23 frames.], batch size: 14, lr: 7.43e-04 2022-05-04 03:55:19,792 INFO [train.py:715] (0/8) Epoch 2, batch 3250, loss[loss=0.2174, simple_loss=0.2667, pruned_loss=0.084, over 4966.00 frames.], tot_loss[loss=0.1768, simple_loss=0.2421, pruned_loss=0.05573, over 973452.11 frames.], batch size: 15, lr: 7.42e-04 2022-05-04 03:55:59,353 INFO [train.py:715] (0/8) Epoch 2, batch 3300, loss[loss=0.1735, simple_loss=0.2382, pruned_loss=0.05442, over 4920.00 frames.], tot_loss[loss=0.1754, simple_loss=0.2408, pruned_loss=0.05494, over 974269.54 frames.], batch size: 18, lr: 7.42e-04 2022-05-04 03:56:39,596 INFO [train.py:715] (0/8) Epoch 2, batch 3350, loss[loss=0.1876, simple_loss=0.2492, pruned_loss=0.06305, over 4780.00 frames.], tot_loss[loss=0.174, 
simple_loss=0.2396, pruned_loss=0.05422, over 974652.88 frames.], batch size: 14, lr: 7.42e-04 2022-05-04 03:57:20,092 INFO [train.py:715] (0/8) Epoch 2, batch 3400, loss[loss=0.1385, simple_loss=0.2059, pruned_loss=0.03559, over 4927.00 frames.], tot_loss[loss=0.174, simple_loss=0.2397, pruned_loss=0.05413, over 974234.14 frames.], batch size: 23, lr: 7.42e-04 2022-05-04 03:57:58,925 INFO [train.py:715] (0/8) Epoch 2, batch 3450, loss[loss=0.2091, simple_loss=0.257, pruned_loss=0.0806, over 4845.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2387, pruned_loss=0.05347, over 974581.14 frames.], batch size: 15, lr: 7.41e-04 2022-05-04 03:58:38,960 INFO [train.py:715] (0/8) Epoch 2, batch 3500, loss[loss=0.1811, simple_loss=0.2472, pruned_loss=0.05752, over 4759.00 frames.], tot_loss[loss=0.1724, simple_loss=0.238, pruned_loss=0.05339, over 973731.76 frames.], batch size: 19, lr: 7.41e-04 2022-05-04 03:59:19,007 INFO [train.py:715] (0/8) Epoch 2, batch 3550, loss[loss=0.1628, simple_loss=0.2332, pruned_loss=0.04617, over 4919.00 frames.], tot_loss[loss=0.1729, simple_loss=0.2386, pruned_loss=0.05362, over 973293.95 frames.], batch size: 17, lr: 7.41e-04 2022-05-04 03:59:58,774 INFO [train.py:715] (0/8) Epoch 2, batch 3600, loss[loss=0.1911, simple_loss=0.2661, pruned_loss=0.05801, over 4858.00 frames.], tot_loss[loss=0.174, simple_loss=0.2396, pruned_loss=0.05419, over 973091.26 frames.], batch size: 20, lr: 7.41e-04 2022-05-04 04:00:37,774 INFO [train.py:715] (0/8) Epoch 2, batch 3650, loss[loss=0.1775, simple_loss=0.2369, pruned_loss=0.05907, over 4821.00 frames.], tot_loss[loss=0.1744, simple_loss=0.24, pruned_loss=0.05444, over 972460.30 frames.], batch size: 15, lr: 7.40e-04 2022-05-04 04:01:18,177 INFO [train.py:715] (0/8) Epoch 2, batch 3700, loss[loss=0.159, simple_loss=0.2103, pruned_loss=0.0538, over 4643.00 frames.], tot_loss[loss=0.1745, simple_loss=0.2402, pruned_loss=0.05436, over 974020.84 frames.], batch size: 13, lr: 7.40e-04 2022-05-04 04:01:58,347 INFO [train.py:715] (0/8) Epoch 2, batch 3750, loss[loss=0.1673, simple_loss=0.2321, pruned_loss=0.05126, over 4849.00 frames.], tot_loss[loss=0.1745, simple_loss=0.2401, pruned_loss=0.0545, over 973834.91 frames.], batch size: 32, lr: 7.40e-04 2022-05-04 04:02:37,083 INFO [train.py:715] (0/8) Epoch 2, batch 3800, loss[loss=0.2044, simple_loss=0.2659, pruned_loss=0.07142, over 4969.00 frames.], tot_loss[loss=0.1744, simple_loss=0.2397, pruned_loss=0.05451, over 974137.51 frames.], batch size: 15, lr: 7.40e-04 2022-05-04 04:03:17,275 INFO [train.py:715] (0/8) Epoch 2, batch 3850, loss[loss=0.2144, simple_loss=0.276, pruned_loss=0.07644, over 4806.00 frames.], tot_loss[loss=0.1749, simple_loss=0.24, pruned_loss=0.0549, over 974757.07 frames.], batch size: 21, lr: 7.39e-04 2022-05-04 04:03:57,605 INFO [train.py:715] (0/8) Epoch 2, batch 3900, loss[loss=0.1911, simple_loss=0.2545, pruned_loss=0.06384, over 4892.00 frames.], tot_loss[loss=0.1746, simple_loss=0.2397, pruned_loss=0.05476, over 974468.41 frames.], batch size: 19, lr: 7.39e-04 2022-05-04 04:04:36,838 INFO [train.py:715] (0/8) Epoch 2, batch 3950, loss[loss=0.1808, simple_loss=0.2444, pruned_loss=0.05866, over 4860.00 frames.], tot_loss[loss=0.1741, simple_loss=0.2394, pruned_loss=0.05446, over 973908.96 frames.], batch size: 20, lr: 7.39e-04 2022-05-04 04:05:16,461 INFO [train.py:715] (0/8) Epoch 2, batch 4000, loss[loss=0.1779, simple_loss=0.2355, pruned_loss=0.06011, over 4985.00 frames.], tot_loss[loss=0.1749, simple_loss=0.2401, pruned_loss=0.05484, over 974078.62 
frames.], batch size: 14, lr: 7.39e-04 2022-05-04 04:05:57,029 INFO [train.py:715] (0/8) Epoch 2, batch 4050, loss[loss=0.1783, simple_loss=0.2467, pruned_loss=0.05493, over 4769.00 frames.], tot_loss[loss=0.1749, simple_loss=0.2407, pruned_loss=0.05459, over 973583.91 frames.], batch size: 14, lr: 7.38e-04 2022-05-04 04:06:37,527 INFO [train.py:715] (0/8) Epoch 2, batch 4100, loss[loss=0.1609, simple_loss=0.222, pruned_loss=0.04987, over 4804.00 frames.], tot_loss[loss=0.1739, simple_loss=0.2399, pruned_loss=0.0539, over 973189.85 frames.], batch size: 12, lr: 7.38e-04 2022-05-04 04:07:16,028 INFO [train.py:715] (0/8) Epoch 2, batch 4150, loss[loss=0.1918, simple_loss=0.2608, pruned_loss=0.06138, over 4904.00 frames.], tot_loss[loss=0.1738, simple_loss=0.2398, pruned_loss=0.05392, over 972591.10 frames.], batch size: 39, lr: 7.38e-04 2022-05-04 04:07:55,390 INFO [train.py:715] (0/8) Epoch 2, batch 4200, loss[loss=0.2025, simple_loss=0.2622, pruned_loss=0.07145, over 4954.00 frames.], tot_loss[loss=0.1743, simple_loss=0.2399, pruned_loss=0.05432, over 972058.09 frames.], batch size: 21, lr: 7.38e-04 2022-05-04 04:08:35,851 INFO [train.py:715] (0/8) Epoch 2, batch 4250, loss[loss=0.2156, simple_loss=0.2645, pruned_loss=0.08337, over 4933.00 frames.], tot_loss[loss=0.1755, simple_loss=0.2406, pruned_loss=0.0552, over 971899.98 frames.], batch size: 23, lr: 7.37e-04 2022-05-04 04:09:15,090 INFO [train.py:715] (0/8) Epoch 2, batch 4300, loss[loss=0.1703, simple_loss=0.2361, pruned_loss=0.05223, over 4856.00 frames.], tot_loss[loss=0.1749, simple_loss=0.2399, pruned_loss=0.05491, over 971591.10 frames.], batch size: 32, lr: 7.37e-04 2022-05-04 04:09:54,871 INFO [train.py:715] (0/8) Epoch 2, batch 4350, loss[loss=0.1874, simple_loss=0.2558, pruned_loss=0.05946, over 4829.00 frames.], tot_loss[loss=0.1754, simple_loss=0.2405, pruned_loss=0.05516, over 971916.02 frames.], batch size: 15, lr: 7.37e-04 2022-05-04 04:10:34,718 INFO [train.py:715] (0/8) Epoch 2, batch 4400, loss[loss=0.1729, simple_loss=0.2459, pruned_loss=0.04997, over 4909.00 frames.], tot_loss[loss=0.175, simple_loss=0.2403, pruned_loss=0.05484, over 972562.90 frames.], batch size: 18, lr: 7.37e-04 2022-05-04 04:11:14,736 INFO [train.py:715] (0/8) Epoch 2, batch 4450, loss[loss=0.1694, simple_loss=0.2279, pruned_loss=0.05549, over 4864.00 frames.], tot_loss[loss=0.1758, simple_loss=0.241, pruned_loss=0.05529, over 972618.01 frames.], batch size: 32, lr: 7.36e-04 2022-05-04 04:11:53,880 INFO [train.py:715] (0/8) Epoch 2, batch 4500, loss[loss=0.1661, simple_loss=0.232, pruned_loss=0.05016, over 4816.00 frames.], tot_loss[loss=0.1755, simple_loss=0.2407, pruned_loss=0.05518, over 971713.05 frames.], batch size: 27, lr: 7.36e-04 2022-05-04 04:12:33,912 INFO [train.py:715] (0/8) Epoch 2, batch 4550, loss[loss=0.1959, simple_loss=0.2449, pruned_loss=0.0735, over 4689.00 frames.], tot_loss[loss=0.1751, simple_loss=0.2408, pruned_loss=0.05475, over 971268.17 frames.], batch size: 15, lr: 7.36e-04 2022-05-04 04:13:14,642 INFO [train.py:715] (0/8) Epoch 2, batch 4600, loss[loss=0.1419, simple_loss=0.2211, pruned_loss=0.03132, over 4884.00 frames.], tot_loss[loss=0.1743, simple_loss=0.2405, pruned_loss=0.05399, over 971658.79 frames.], batch size: 22, lr: 7.36e-04 2022-05-04 04:13:53,695 INFO [train.py:715] (0/8) Epoch 2, batch 4650, loss[loss=0.1918, simple_loss=0.2554, pruned_loss=0.06412, over 4775.00 frames.], tot_loss[loss=0.1758, simple_loss=0.2418, pruned_loss=0.05493, over 972128.83 frames.], batch size: 18, lr: 7.35e-04 2022-05-04 
04:14:33,004 INFO [train.py:715] (0/8) Epoch 2, batch 4700, loss[loss=0.2115, simple_loss=0.2776, pruned_loss=0.07271, over 4892.00 frames.], tot_loss[loss=0.1749, simple_loss=0.2409, pruned_loss=0.05449, over 972494.39 frames.], batch size: 17, lr: 7.35e-04 2022-05-04 04:15:13,201 INFO [train.py:715] (0/8) Epoch 2, batch 4750, loss[loss=0.1509, simple_loss=0.2218, pruned_loss=0.04001, over 4941.00 frames.], tot_loss[loss=0.1743, simple_loss=0.24, pruned_loss=0.05431, over 972377.28 frames.], batch size: 21, lr: 7.35e-04 2022-05-04 04:15:53,744 INFO [train.py:715] (0/8) Epoch 2, batch 4800, loss[loss=0.1793, simple_loss=0.2343, pruned_loss=0.06209, over 4766.00 frames.], tot_loss[loss=0.1743, simple_loss=0.2402, pruned_loss=0.05414, over 973596.22 frames.], batch size: 18, lr: 7.35e-04 2022-05-04 04:16:33,020 INFO [train.py:715] (0/8) Epoch 2, batch 4850, loss[loss=0.2233, simple_loss=0.2818, pruned_loss=0.08243, over 4947.00 frames.], tot_loss[loss=0.1748, simple_loss=0.2405, pruned_loss=0.05455, over 973898.16 frames.], batch size: 24, lr: 7.34e-04 2022-05-04 04:17:12,478 INFO [train.py:715] (0/8) Epoch 2, batch 4900, loss[loss=0.1684, simple_loss=0.222, pruned_loss=0.05741, over 4827.00 frames.], tot_loss[loss=0.1743, simple_loss=0.2399, pruned_loss=0.05434, over 972506.71 frames.], batch size: 13, lr: 7.34e-04 2022-05-04 04:17:52,934 INFO [train.py:715] (0/8) Epoch 2, batch 4950, loss[loss=0.1999, simple_loss=0.2625, pruned_loss=0.06862, over 4966.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2389, pruned_loss=0.05334, over 972559.53 frames.], batch size: 24, lr: 7.34e-04 2022-05-04 04:18:32,537 INFO [train.py:715] (0/8) Epoch 2, batch 5000, loss[loss=0.1649, simple_loss=0.2322, pruned_loss=0.04879, over 4709.00 frames.], tot_loss[loss=0.1737, simple_loss=0.2398, pruned_loss=0.05377, over 972186.64 frames.], batch size: 15, lr: 7.34e-04 2022-05-04 04:19:12,096 INFO [train.py:715] (0/8) Epoch 2, batch 5050, loss[loss=0.1731, simple_loss=0.2402, pruned_loss=0.05306, over 4819.00 frames.], tot_loss[loss=0.1743, simple_loss=0.2402, pruned_loss=0.05421, over 972182.04 frames.], batch size: 26, lr: 7.33e-04 2022-05-04 04:19:53,166 INFO [train.py:715] (0/8) Epoch 2, batch 5100, loss[loss=0.1918, simple_loss=0.2493, pruned_loss=0.06719, over 4899.00 frames.], tot_loss[loss=0.1755, simple_loss=0.2411, pruned_loss=0.05493, over 973023.74 frames.], batch size: 19, lr: 7.33e-04 2022-05-04 04:20:34,131 INFO [train.py:715] (0/8) Epoch 2, batch 5150, loss[loss=0.1904, simple_loss=0.243, pruned_loss=0.06885, over 4764.00 frames.], tot_loss[loss=0.1744, simple_loss=0.2405, pruned_loss=0.05417, over 973400.89 frames.], batch size: 12, lr: 7.33e-04 2022-05-04 04:21:13,073 INFO [train.py:715] (0/8) Epoch 2, batch 5200, loss[loss=0.1932, simple_loss=0.2475, pruned_loss=0.06941, over 4928.00 frames.], tot_loss[loss=0.1745, simple_loss=0.2406, pruned_loss=0.05418, over 973089.51 frames.], batch size: 18, lr: 7.33e-04 2022-05-04 04:21:52,853 INFO [train.py:715] (0/8) Epoch 2, batch 5250, loss[loss=0.1541, simple_loss=0.2337, pruned_loss=0.0373, over 4771.00 frames.], tot_loss[loss=0.1747, simple_loss=0.2407, pruned_loss=0.0544, over 973330.31 frames.], batch size: 18, lr: 7.32e-04 2022-05-04 04:22:33,066 INFO [train.py:715] (0/8) Epoch 2, batch 5300, loss[loss=0.1841, simple_loss=0.2522, pruned_loss=0.05801, over 4852.00 frames.], tot_loss[loss=0.1742, simple_loss=0.2402, pruned_loss=0.05411, over 972759.37 frames.], batch size: 15, lr: 7.32e-04 2022-05-04 04:23:12,243 INFO [train.py:715] (0/8) Epoch 2, 
batch 5350, loss[loss=0.1811, simple_loss=0.2425, pruned_loss=0.05984, over 4855.00 frames.], tot_loss[loss=0.1743, simple_loss=0.2403, pruned_loss=0.05412, over 972835.92 frames.], batch size: 20, lr: 7.32e-04 2022-05-04 04:23:51,606 INFO [train.py:715] (0/8) Epoch 2, batch 5400, loss[loss=0.2127, simple_loss=0.2756, pruned_loss=0.0749, over 4895.00 frames.], tot_loss[loss=0.1737, simple_loss=0.2395, pruned_loss=0.05389, over 972700.85 frames.], batch size: 19, lr: 7.32e-04 2022-05-04 04:24:32,283 INFO [train.py:715] (0/8) Epoch 2, batch 5450, loss[loss=0.1548, simple_loss=0.2317, pruned_loss=0.03895, over 4977.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2388, pruned_loss=0.0534, over 973423.90 frames.], batch size: 28, lr: 7.31e-04 2022-05-04 04:25:12,074 INFO [train.py:715] (0/8) Epoch 2, batch 5500, loss[loss=0.1554, simple_loss=0.2336, pruned_loss=0.0386, over 4951.00 frames.], tot_loss[loss=0.171, simple_loss=0.2372, pruned_loss=0.05234, over 973140.71 frames.], batch size: 24, lr: 7.31e-04 2022-05-04 04:25:51,713 INFO [train.py:715] (0/8) Epoch 2, batch 5550, loss[loss=0.2077, simple_loss=0.2636, pruned_loss=0.07585, over 4872.00 frames.], tot_loss[loss=0.1718, simple_loss=0.2381, pruned_loss=0.05277, over 973522.11 frames.], batch size: 16, lr: 7.31e-04 2022-05-04 04:26:32,205 INFO [train.py:715] (0/8) Epoch 2, batch 5600, loss[loss=0.1293, simple_loss=0.2067, pruned_loss=0.02599, over 4875.00 frames.], tot_loss[loss=0.1728, simple_loss=0.239, pruned_loss=0.05329, over 974345.47 frames.], batch size: 22, lr: 7.31e-04 2022-05-04 04:27:13,266 INFO [train.py:715] (0/8) Epoch 2, batch 5650, loss[loss=0.1581, simple_loss=0.2228, pruned_loss=0.04672, over 4755.00 frames.], tot_loss[loss=0.1731, simple_loss=0.239, pruned_loss=0.05358, over 974606.66 frames.], batch size: 16, lr: 7.30e-04 2022-05-04 04:27:53,176 INFO [train.py:715] (0/8) Epoch 2, batch 5700, loss[loss=0.193, simple_loss=0.2614, pruned_loss=0.06233, over 4750.00 frames.], tot_loss[loss=0.1727, simple_loss=0.2387, pruned_loss=0.05338, over 974190.60 frames.], batch size: 19, lr: 7.30e-04 2022-05-04 04:28:33,028 INFO [train.py:715] (0/8) Epoch 2, batch 5750, loss[loss=0.1774, simple_loss=0.2365, pruned_loss=0.0592, over 4943.00 frames.], tot_loss[loss=0.173, simple_loss=0.2386, pruned_loss=0.05367, over 973613.14 frames.], batch size: 21, lr: 7.30e-04 2022-05-04 04:29:13,954 INFO [train.py:715] (0/8) Epoch 2, batch 5800, loss[loss=0.2052, simple_loss=0.2701, pruned_loss=0.0702, over 4817.00 frames.], tot_loss[loss=0.1726, simple_loss=0.2381, pruned_loss=0.05351, over 973374.15 frames.], batch size: 15, lr: 7.30e-04 2022-05-04 04:29:55,097 INFO [train.py:715] (0/8) Epoch 2, batch 5850, loss[loss=0.1393, simple_loss=0.198, pruned_loss=0.04032, over 4835.00 frames.], tot_loss[loss=0.1723, simple_loss=0.238, pruned_loss=0.05333, over 973212.13 frames.], batch size: 13, lr: 7.29e-04 2022-05-04 04:30:34,550 INFO [train.py:715] (0/8) Epoch 2, batch 5900, loss[loss=0.163, simple_loss=0.2444, pruned_loss=0.04079, over 4888.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2388, pruned_loss=0.0534, over 972667.43 frames.], batch size: 16, lr: 7.29e-04 2022-05-04 04:31:15,145 INFO [train.py:715] (0/8) Epoch 2, batch 5950, loss[loss=0.1702, simple_loss=0.2335, pruned_loss=0.05342, over 4807.00 frames.], tot_loss[loss=0.1725, simple_loss=0.2383, pruned_loss=0.05338, over 972509.65 frames.], batch size: 21, lr: 7.29e-04 2022-05-04 04:31:56,154 INFO [train.py:715] (0/8) Epoch 2, batch 6000, loss[loss=0.1817, simple_loss=0.2261, 
pruned_loss=0.06866, over 4871.00 frames.], tot_loss[loss=0.1711, simple_loss=0.2367, pruned_loss=0.05277, over 972874.71 frames.], batch size: 32, lr: 7.29e-04 2022-05-04 04:31:56,155 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 04:32:04,807 INFO [train.py:742] (0/8) Epoch 2, validation: loss=0.1188, simple_loss=0.2054, pruned_loss=0.01614, over 914524.00 frames. 2022-05-04 04:32:46,141 INFO [train.py:715] (0/8) Epoch 2, batch 6050, loss[loss=0.1891, simple_loss=0.2645, pruned_loss=0.05685, over 4966.00 frames.], tot_loss[loss=0.1715, simple_loss=0.2368, pruned_loss=0.05311, over 973086.07 frames.], batch size: 15, lr: 7.29e-04 2022-05-04 04:33:25,851 INFO [train.py:715] (0/8) Epoch 2, batch 6100, loss[loss=0.1528, simple_loss=0.2256, pruned_loss=0.04004, over 4867.00 frames.], tot_loss[loss=0.171, simple_loss=0.2365, pruned_loss=0.05276, over 972483.94 frames.], batch size: 22, lr: 7.28e-04 2022-05-04 04:34:05,826 INFO [train.py:715] (0/8) Epoch 2, batch 6150, loss[loss=0.1894, simple_loss=0.2567, pruned_loss=0.06107, over 4952.00 frames.], tot_loss[loss=0.171, simple_loss=0.2369, pruned_loss=0.05261, over 972616.93 frames.], batch size: 39, lr: 7.28e-04 2022-05-04 04:34:46,185 INFO [train.py:715] (0/8) Epoch 2, batch 6200, loss[loss=0.2053, simple_loss=0.2626, pruned_loss=0.07404, over 4763.00 frames.], tot_loss[loss=0.1719, simple_loss=0.2379, pruned_loss=0.05292, over 971571.72 frames.], batch size: 19, lr: 7.28e-04 2022-05-04 04:35:26,608 INFO [train.py:715] (0/8) Epoch 2, batch 6250, loss[loss=0.2094, simple_loss=0.2612, pruned_loss=0.07879, over 4815.00 frames.], tot_loss[loss=0.1727, simple_loss=0.2385, pruned_loss=0.05339, over 971360.35 frames.], batch size: 15, lr: 7.28e-04 2022-05-04 04:36:05,781 INFO [train.py:715] (0/8) Epoch 2, batch 6300, loss[loss=0.157, simple_loss=0.2321, pruned_loss=0.04096, over 4760.00 frames.], tot_loss[loss=0.1733, simple_loss=0.2394, pruned_loss=0.0536, over 971038.06 frames.], batch size: 16, lr: 7.27e-04 2022-05-04 04:36:46,035 INFO [train.py:715] (0/8) Epoch 2, batch 6350, loss[loss=0.1886, simple_loss=0.2553, pruned_loss=0.06099, over 4789.00 frames.], tot_loss[loss=0.1727, simple_loss=0.2389, pruned_loss=0.05325, over 970308.11 frames.], batch size: 18, lr: 7.27e-04 2022-05-04 04:37:26,524 INFO [train.py:715] (0/8) Epoch 2, batch 6400, loss[loss=0.1742, simple_loss=0.2352, pruned_loss=0.05659, over 4785.00 frames.], tot_loss[loss=0.1725, simple_loss=0.2386, pruned_loss=0.05321, over 971927.48 frames.], batch size: 12, lr: 7.27e-04 2022-05-04 04:38:05,327 INFO [train.py:715] (0/8) Epoch 2, batch 6450, loss[loss=0.14, simple_loss=0.2087, pruned_loss=0.03568, over 4753.00 frames.], tot_loss[loss=0.1735, simple_loss=0.2391, pruned_loss=0.0539, over 972557.59 frames.], batch size: 19, lr: 7.27e-04 2022-05-04 04:38:44,596 INFO [train.py:715] (0/8) Epoch 2, batch 6500, loss[loss=0.1952, simple_loss=0.2506, pruned_loss=0.06992, over 4946.00 frames.], tot_loss[loss=0.1729, simple_loss=0.2389, pruned_loss=0.05349, over 971778.95 frames.], batch size: 21, lr: 7.26e-04 2022-05-04 04:39:24,832 INFO [train.py:715] (0/8) Epoch 2, batch 6550, loss[loss=0.1701, simple_loss=0.2318, pruned_loss=0.05417, over 4990.00 frames.], tot_loss[loss=0.173, simple_loss=0.2389, pruned_loss=0.05353, over 972648.86 frames.], batch size: 16, lr: 7.26e-04 2022-05-04 04:40:04,767 INFO [train.py:715] (0/8) Epoch 2, batch 6600, loss[loss=0.155, simple_loss=0.2209, pruned_loss=0.04458, over 4870.00 frames.], tot_loss[loss=0.1723, simple_loss=0.2381, 
pruned_loss=0.05321, over 973458.85 frames.], batch size: 16, lr: 7.26e-04 2022-05-04 04:40:43,856 INFO [train.py:715] (0/8) Epoch 2, batch 6650, loss[loss=0.1339, simple_loss=0.2032, pruned_loss=0.03224, over 4972.00 frames.], tot_loss[loss=0.1729, simple_loss=0.2388, pruned_loss=0.05344, over 974387.95 frames.], batch size: 14, lr: 7.26e-04 2022-05-04 04:41:23,373 INFO [train.py:715] (0/8) Epoch 2, batch 6700, loss[loss=0.1566, simple_loss=0.2239, pruned_loss=0.04463, over 4767.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2389, pruned_loss=0.05338, over 972909.87 frames.], batch size: 17, lr: 7.25e-04 2022-05-04 04:42:03,554 INFO [train.py:715] (0/8) Epoch 2, batch 6750, loss[loss=0.1689, simple_loss=0.2401, pruned_loss=0.04885, over 4813.00 frames.], tot_loss[loss=0.1735, simple_loss=0.2398, pruned_loss=0.05364, over 972903.73 frames.], batch size: 27, lr: 7.25e-04 2022-05-04 04:42:41,720 INFO [train.py:715] (0/8) Epoch 2, batch 6800, loss[loss=0.1594, simple_loss=0.2171, pruned_loss=0.05087, over 4788.00 frames.], tot_loss[loss=0.1726, simple_loss=0.2391, pruned_loss=0.05303, over 973128.59 frames.], batch size: 13, lr: 7.25e-04 2022-05-04 04:43:20,945 INFO [train.py:715] (0/8) Epoch 2, batch 6850, loss[loss=0.17, simple_loss=0.2454, pruned_loss=0.04729, over 4818.00 frames.], tot_loss[loss=0.1731, simple_loss=0.2397, pruned_loss=0.05329, over 973815.12 frames.], batch size: 27, lr: 7.25e-04 2022-05-04 04:44:01,040 INFO [train.py:715] (0/8) Epoch 2, batch 6900, loss[loss=0.1464, simple_loss=0.2198, pruned_loss=0.0365, over 4783.00 frames.], tot_loss[loss=0.1733, simple_loss=0.2393, pruned_loss=0.05364, over 973107.08 frames.], batch size: 18, lr: 7.24e-04 2022-05-04 04:44:41,210 INFO [train.py:715] (0/8) Epoch 2, batch 6950, loss[loss=0.1455, simple_loss=0.2092, pruned_loss=0.04088, over 4831.00 frames.], tot_loss[loss=0.1752, simple_loss=0.241, pruned_loss=0.05473, over 973101.40 frames.], batch size: 15, lr: 7.24e-04 2022-05-04 04:45:19,414 INFO [train.py:715] (0/8) Epoch 2, batch 7000, loss[loss=0.2086, simple_loss=0.2691, pruned_loss=0.07399, over 4946.00 frames.], tot_loss[loss=0.1748, simple_loss=0.2407, pruned_loss=0.05447, over 972411.10 frames.], batch size: 29, lr: 7.24e-04 2022-05-04 04:45:59,980 INFO [train.py:715] (0/8) Epoch 2, batch 7050, loss[loss=0.121, simple_loss=0.197, pruned_loss=0.02254, over 4784.00 frames.], tot_loss[loss=0.1727, simple_loss=0.2389, pruned_loss=0.05321, over 972207.32 frames.], batch size: 12, lr: 7.24e-04 2022-05-04 04:46:40,407 INFO [train.py:715] (0/8) Epoch 2, batch 7100, loss[loss=0.1714, simple_loss=0.2295, pruned_loss=0.05663, over 4893.00 frames.], tot_loss[loss=0.1733, simple_loss=0.2395, pruned_loss=0.05356, over 972417.28 frames.], batch size: 17, lr: 7.24e-04 2022-05-04 04:47:19,801 INFO [train.py:715] (0/8) Epoch 2, batch 7150, loss[loss=0.1687, simple_loss=0.2371, pruned_loss=0.0502, over 4756.00 frames.], tot_loss[loss=0.1735, simple_loss=0.2394, pruned_loss=0.05375, over 972511.72 frames.], batch size: 19, lr: 7.23e-04 2022-05-04 04:48:00,093 INFO [train.py:715] (0/8) Epoch 2, batch 7200, loss[loss=0.2424, simple_loss=0.3001, pruned_loss=0.09233, over 4781.00 frames.], tot_loss[loss=0.1746, simple_loss=0.2406, pruned_loss=0.05428, over 972444.96 frames.], batch size: 18, lr: 7.23e-04 2022-05-04 04:48:41,285 INFO [train.py:715] (0/8) Epoch 2, batch 7250, loss[loss=0.1919, simple_loss=0.2449, pruned_loss=0.06947, over 4978.00 frames.], tot_loss[loss=0.1736, simple_loss=0.2398, pruned_loss=0.05376, over 973373.58 frames.], batch 
size: 14, lr: 7.23e-04 2022-05-04 04:49:21,908 INFO [train.py:715] (0/8) Epoch 2, batch 7300, loss[loss=0.2215, simple_loss=0.2629, pruned_loss=0.09002, over 4839.00 frames.], tot_loss[loss=0.1738, simple_loss=0.2398, pruned_loss=0.05389, over 973477.89 frames.], batch size: 32, lr: 7.23e-04 2022-05-04 04:50:01,610 INFO [train.py:715] (0/8) Epoch 2, batch 7350, loss[loss=0.1647, simple_loss=0.2372, pruned_loss=0.04606, over 4786.00 frames.], tot_loss[loss=0.1747, simple_loss=0.2403, pruned_loss=0.05454, over 973096.10 frames.], batch size: 18, lr: 7.22e-04 2022-05-04 04:50:42,535 INFO [train.py:715] (0/8) Epoch 2, batch 7400, loss[loss=0.1865, simple_loss=0.2552, pruned_loss=0.05889, over 4775.00 frames.], tot_loss[loss=0.1759, simple_loss=0.2414, pruned_loss=0.05513, over 973070.65 frames.], batch size: 19, lr: 7.22e-04 2022-05-04 04:51:24,330 INFO [train.py:715] (0/8) Epoch 2, batch 7450, loss[loss=0.1733, simple_loss=0.2411, pruned_loss=0.05275, over 4972.00 frames.], tot_loss[loss=0.1757, simple_loss=0.2417, pruned_loss=0.05487, over 972638.13 frames.], batch size: 15, lr: 7.22e-04 2022-05-04 04:52:04,714 INFO [train.py:715] (0/8) Epoch 2, batch 7500, loss[loss=0.1317, simple_loss=0.2067, pruned_loss=0.02834, over 4920.00 frames.], tot_loss[loss=0.1743, simple_loss=0.2403, pruned_loss=0.05413, over 971949.52 frames.], batch size: 23, lr: 7.22e-04 2022-05-04 04:52:45,162 INFO [train.py:715] (0/8) Epoch 2, batch 7550, loss[loss=0.1413, simple_loss=0.2081, pruned_loss=0.03725, over 4774.00 frames.], tot_loss[loss=0.173, simple_loss=0.2389, pruned_loss=0.05354, over 972135.58 frames.], batch size: 14, lr: 7.21e-04 2022-05-04 04:53:26,945 INFO [train.py:715] (0/8) Epoch 2, batch 7600, loss[loss=0.1647, simple_loss=0.2369, pruned_loss=0.04626, over 4800.00 frames.], tot_loss[loss=0.1729, simple_loss=0.2393, pruned_loss=0.05329, over 972651.00 frames.], batch size: 25, lr: 7.21e-04 2022-05-04 04:54:08,314 INFO [train.py:715] (0/8) Epoch 2, batch 7650, loss[loss=0.1937, simple_loss=0.2397, pruned_loss=0.07387, over 4847.00 frames.], tot_loss[loss=0.1739, simple_loss=0.2398, pruned_loss=0.05403, over 972827.22 frames.], batch size: 13, lr: 7.21e-04 2022-05-04 04:54:48,377 INFO [train.py:715] (0/8) Epoch 2, batch 7700, loss[loss=0.186, simple_loss=0.2428, pruned_loss=0.06456, over 4923.00 frames.], tot_loss[loss=0.1733, simple_loss=0.2392, pruned_loss=0.05365, over 972655.25 frames.], batch size: 23, lr: 7.21e-04 2022-05-04 04:55:29,827 INFO [train.py:715] (0/8) Epoch 2, batch 7750, loss[loss=0.1424, simple_loss=0.2104, pruned_loss=0.03715, over 4868.00 frames.], tot_loss[loss=0.1739, simple_loss=0.2394, pruned_loss=0.05422, over 972825.48 frames.], batch size: 22, lr: 7.21e-04 2022-05-04 04:56:11,494 INFO [train.py:715] (0/8) Epoch 2, batch 7800, loss[loss=0.1966, simple_loss=0.2618, pruned_loss=0.06573, over 4790.00 frames.], tot_loss[loss=0.1746, simple_loss=0.24, pruned_loss=0.05458, over 973087.69 frames.], batch size: 14, lr: 7.20e-04 2022-05-04 04:56:52,000 INFO [train.py:715] (0/8) Epoch 2, batch 7850, loss[loss=0.1758, simple_loss=0.2384, pruned_loss=0.05659, over 4853.00 frames.], tot_loss[loss=0.1744, simple_loss=0.2403, pruned_loss=0.05422, over 973085.70 frames.], batch size: 32, lr: 7.20e-04 2022-05-04 04:57:33,357 INFO [train.py:715] (0/8) Epoch 2, batch 7900, loss[loss=0.1911, simple_loss=0.2558, pruned_loss=0.06317, over 4888.00 frames.], tot_loss[loss=0.1738, simple_loss=0.2399, pruned_loss=0.0538, over 973232.61 frames.], batch size: 17, lr: 7.20e-04 2022-05-04 04:58:15,546 
INFO [train.py:715] (0/8) Epoch 2, batch 7950, loss[loss=0.1998, simple_loss=0.2461, pruned_loss=0.07676, over 4844.00 frames.], tot_loss[loss=0.174, simple_loss=0.2399, pruned_loss=0.05403, over 973380.65 frames.], batch size: 32, lr: 7.20e-04 2022-05-04 04:58:57,040 INFO [train.py:715] (0/8) Epoch 2, batch 8000, loss[loss=0.1591, simple_loss=0.2418, pruned_loss=0.03815, over 4764.00 frames.], tot_loss[loss=0.1751, simple_loss=0.2412, pruned_loss=0.05453, over 973461.73 frames.], batch size: 16, lr: 7.19e-04 2022-05-04 04:59:37,240 INFO [train.py:715] (0/8) Epoch 2, batch 8050, loss[loss=0.1941, simple_loss=0.259, pruned_loss=0.06457, over 4817.00 frames.], tot_loss[loss=0.1744, simple_loss=0.2407, pruned_loss=0.05403, over 972823.73 frames.], batch size: 27, lr: 7.19e-04 2022-05-04 05:00:18,972 INFO [train.py:715] (0/8) Epoch 2, batch 8100, loss[loss=0.1783, simple_loss=0.2446, pruned_loss=0.056, over 4858.00 frames.], tot_loss[loss=0.1742, simple_loss=0.2406, pruned_loss=0.05388, over 972643.03 frames.], batch size: 13, lr: 7.19e-04 2022-05-04 05:01:00,831 INFO [train.py:715] (0/8) Epoch 2, batch 8150, loss[loss=0.1463, simple_loss=0.2162, pruned_loss=0.03817, over 4923.00 frames.], tot_loss[loss=0.1744, simple_loss=0.2406, pruned_loss=0.05404, over 973080.89 frames.], batch size: 29, lr: 7.19e-04 2022-05-04 05:01:41,270 INFO [train.py:715] (0/8) Epoch 2, batch 8200, loss[loss=0.2009, simple_loss=0.2521, pruned_loss=0.07482, over 4879.00 frames.], tot_loss[loss=0.1751, simple_loss=0.2414, pruned_loss=0.05446, over 973047.58 frames.], batch size: 16, lr: 7.18e-04 2022-05-04 05:02:22,243 INFO [train.py:715] (0/8) Epoch 2, batch 8250, loss[loss=0.1351, simple_loss=0.1975, pruned_loss=0.03633, over 4985.00 frames.], tot_loss[loss=0.1743, simple_loss=0.2408, pruned_loss=0.05393, over 973656.39 frames.], batch size: 14, lr: 7.18e-04 2022-05-04 05:03:04,357 INFO [train.py:715] (0/8) Epoch 2, batch 8300, loss[loss=0.1672, simple_loss=0.2333, pruned_loss=0.05051, over 4890.00 frames.], tot_loss[loss=0.1741, simple_loss=0.2402, pruned_loss=0.05405, over 973320.34 frames.], batch size: 16, lr: 7.18e-04 2022-05-04 05:03:46,074 INFO [train.py:715] (0/8) Epoch 2, batch 8350, loss[loss=0.1621, simple_loss=0.2364, pruned_loss=0.04385, over 4937.00 frames.], tot_loss[loss=0.1752, simple_loss=0.2409, pruned_loss=0.05474, over 973381.05 frames.], batch size: 21, lr: 7.18e-04 2022-05-04 05:04:26,319 INFO [train.py:715] (0/8) Epoch 2, batch 8400, loss[loss=0.1671, simple_loss=0.2443, pruned_loss=0.04496, over 4689.00 frames.], tot_loss[loss=0.1748, simple_loss=0.2408, pruned_loss=0.05441, over 972818.02 frames.], batch size: 15, lr: 7.18e-04 2022-05-04 05:05:07,468 INFO [train.py:715] (0/8) Epoch 2, batch 8450, loss[loss=0.2321, simple_loss=0.293, pruned_loss=0.08555, over 4923.00 frames.], tot_loss[loss=0.1749, simple_loss=0.2407, pruned_loss=0.05458, over 972958.71 frames.], batch size: 39, lr: 7.17e-04 2022-05-04 05:05:49,558 INFO [train.py:715] (0/8) Epoch 2, batch 8500, loss[loss=0.1843, simple_loss=0.2413, pruned_loss=0.06367, over 4776.00 frames.], tot_loss[loss=0.1744, simple_loss=0.2401, pruned_loss=0.05437, over 971707.30 frames.], batch size: 19, lr: 7.17e-04 2022-05-04 05:06:29,755 INFO [train.py:715] (0/8) Epoch 2, batch 8550, loss[loss=0.1526, simple_loss=0.2277, pruned_loss=0.0387, over 4930.00 frames.], tot_loss[loss=0.1748, simple_loss=0.2405, pruned_loss=0.05461, over 971866.67 frames.], batch size: 23, lr: 7.17e-04 2022-05-04 05:07:10,941 INFO [train.py:715] (0/8) Epoch 2, batch 8600, 
loss[loss=0.1564, simple_loss=0.239, pruned_loss=0.03693, over 4759.00 frames.], tot_loss[loss=0.1734, simple_loss=0.2393, pruned_loss=0.05373, over 971175.73 frames.], batch size: 19, lr: 7.17e-04 2022-05-04 05:07:52,990 INFO [train.py:715] (0/8) Epoch 2, batch 8650, loss[loss=0.1794, simple_loss=0.2438, pruned_loss=0.05748, over 4756.00 frames.], tot_loss[loss=0.1725, simple_loss=0.2389, pruned_loss=0.05307, over 970920.33 frames.], batch size: 19, lr: 7.16e-04 2022-05-04 05:08:34,283 INFO [train.py:715] (0/8) Epoch 2, batch 8700, loss[loss=0.2111, simple_loss=0.2698, pruned_loss=0.07618, over 4958.00 frames.], tot_loss[loss=0.1745, simple_loss=0.2402, pruned_loss=0.05442, over 971061.56 frames.], batch size: 35, lr: 7.16e-04 2022-05-04 05:09:14,824 INFO [train.py:715] (0/8) Epoch 2, batch 8750, loss[loss=0.1924, simple_loss=0.2576, pruned_loss=0.06364, over 4795.00 frames.], tot_loss[loss=0.1739, simple_loss=0.2396, pruned_loss=0.05411, over 971367.15 frames.], batch size: 24, lr: 7.16e-04 2022-05-04 05:09:56,630 INFO [train.py:715] (0/8) Epoch 2, batch 8800, loss[loss=0.1709, simple_loss=0.2498, pruned_loss=0.04602, over 4755.00 frames.], tot_loss[loss=0.1749, simple_loss=0.2406, pruned_loss=0.05464, over 971160.43 frames.], batch size: 19, lr: 7.16e-04 2022-05-04 05:10:38,728 INFO [train.py:715] (0/8) Epoch 2, batch 8850, loss[loss=0.2323, simple_loss=0.2863, pruned_loss=0.08912, over 4880.00 frames.], tot_loss[loss=0.1751, simple_loss=0.2408, pruned_loss=0.05468, over 972307.88 frames.], batch size: 16, lr: 7.15e-04 2022-05-04 05:11:18,688 INFO [train.py:715] (0/8) Epoch 2, batch 8900, loss[loss=0.1773, simple_loss=0.2547, pruned_loss=0.04997, over 4831.00 frames.], tot_loss[loss=0.1742, simple_loss=0.2399, pruned_loss=0.05428, over 971541.37 frames.], batch size: 15, lr: 7.15e-04 2022-05-04 05:12:00,189 INFO [train.py:715] (0/8) Epoch 2, batch 8950, loss[loss=0.1853, simple_loss=0.2463, pruned_loss=0.06212, over 4957.00 frames.], tot_loss[loss=0.1743, simple_loss=0.2402, pruned_loss=0.05421, over 971301.00 frames.], batch size: 39, lr: 7.15e-04 2022-05-04 05:12:42,400 INFO [train.py:715] (0/8) Epoch 2, batch 9000, loss[loss=0.1527, simple_loss=0.2339, pruned_loss=0.03571, over 4888.00 frames.], tot_loss[loss=0.1739, simple_loss=0.24, pruned_loss=0.05391, over 972437.31 frames.], batch size: 19, lr: 7.15e-04 2022-05-04 05:12:42,401 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 05:12:58,992 INFO [train.py:742] (0/8) Epoch 2, validation: loss=0.1181, simple_loss=0.2047, pruned_loss=0.01572, over 914524.00 frames. 
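A note on how the loss figures in the surrounding entries relate to each other, inferred purely from the printed numbers rather than from the training code: in the per-batch loss[...] and running tot_loss[...] triples, and in the validation line just above, the reported loss appears to equal pruned_loss plus half of simple_loss. The short Python check below uses triples copied verbatim from nearby entries; the 0.5 weight is an assumption read off these numbers, not a value stated in this log.

# Check the apparent relation loss ~= 0.5 * simple_loss + pruned_loss against
# values copied from the log entries above. The 0.5 weighting is an inference
# from the printed numbers, not a fact stated anywhere in this log.

SIMPLE_LOSS_WEIGHT = 0.5  # assumed weight on simple_loss

# (loss, simple_loss, pruned_loss) triples taken verbatim from entries above.
entries = [
    (0.1749, 0.2407, 0.05459),  # Epoch 2, batch 4050, tot_loss
    (0.1739, 0.24,   0.05391),  # Epoch 2, batch 9000, tot_loss
    (0.1181, 0.2047, 0.01572),  # Epoch 2, validation at batch 9000
]

for loss, simple_loss, pruned_loss in entries:
    reconstructed = SIMPLE_LOSS_WEIGHT * simple_loss + pruned_loss
    # Logged values are rounded to ~4 significant digits, so allow a small tolerance.
    assert abs(reconstructed - loss) < 5e-4, (loss, reconstructed)
    print(f"loss={loss:.4f}  0.5*simple_loss+pruned_loss={reconstructed:.4f}")

Each printed pair agrees to within the rounding of the logged values, so when comparing entries it is enough to track the combined loss column alongside the frame counts.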
2022-05-04 05:13:41,063 INFO [train.py:715] (0/8) Epoch 2, batch 9050, loss[loss=0.151, simple_loss=0.2127, pruned_loss=0.04463, over 4956.00 frames.], tot_loss[loss=0.1734, simple_loss=0.2396, pruned_loss=0.0536, over 972788.35 frames.], batch size: 21, lr: 7.15e-04 2022-05-04 05:14:21,239 INFO [train.py:715] (0/8) Epoch 2, batch 9100, loss[loss=0.1834, simple_loss=0.2515, pruned_loss=0.0577, over 4911.00 frames.], tot_loss[loss=0.1725, simple_loss=0.239, pruned_loss=0.05302, over 972356.93 frames.], batch size: 39, lr: 7.14e-04 2022-05-04 05:15:02,335 INFO [train.py:715] (0/8) Epoch 2, batch 9150, loss[loss=0.1444, simple_loss=0.2074, pruned_loss=0.04065, over 4780.00 frames.], tot_loss[loss=0.1723, simple_loss=0.2387, pruned_loss=0.05292, over 971625.01 frames.], batch size: 18, lr: 7.14e-04 2022-05-04 05:15:43,575 INFO [train.py:715] (0/8) Epoch 2, batch 9200, loss[loss=0.1525, simple_loss=0.2152, pruned_loss=0.04485, over 4908.00 frames.], tot_loss[loss=0.1719, simple_loss=0.2381, pruned_loss=0.0528, over 971479.98 frames.], batch size: 18, lr: 7.14e-04 2022-05-04 05:16:25,083 INFO [train.py:715] (0/8) Epoch 2, batch 9250, loss[loss=0.1783, simple_loss=0.241, pruned_loss=0.05775, over 4645.00 frames.], tot_loss[loss=0.1724, simple_loss=0.2387, pruned_loss=0.05306, over 971759.46 frames.], batch size: 13, lr: 7.14e-04 2022-05-04 05:17:05,069 INFO [train.py:715] (0/8) Epoch 2, batch 9300, loss[loss=0.1805, simple_loss=0.2443, pruned_loss=0.05836, over 4735.00 frames.], tot_loss[loss=0.173, simple_loss=0.2394, pruned_loss=0.05328, over 972315.41 frames.], batch size: 16, lr: 7.13e-04 2022-05-04 05:17:46,760 INFO [train.py:715] (0/8) Epoch 2, batch 9350, loss[loss=0.175, simple_loss=0.2468, pruned_loss=0.05159, over 4936.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2388, pruned_loss=0.0534, over 973024.78 frames.], batch size: 29, lr: 7.13e-04 2022-05-04 05:18:28,852 INFO [train.py:715] (0/8) Epoch 2, batch 9400, loss[loss=0.1558, simple_loss=0.2349, pruned_loss=0.0383, over 4886.00 frames.], tot_loss[loss=0.1716, simple_loss=0.2379, pruned_loss=0.05266, over 972692.95 frames.], batch size: 22, lr: 7.13e-04 2022-05-04 05:19:08,498 INFO [train.py:715] (0/8) Epoch 2, batch 9450, loss[loss=0.1632, simple_loss=0.2318, pruned_loss=0.04728, over 4926.00 frames.], tot_loss[loss=0.1736, simple_loss=0.2395, pruned_loss=0.05388, over 972047.58 frames.], batch size: 23, lr: 7.13e-04 2022-05-04 05:19:48,355 INFO [train.py:715] (0/8) Epoch 2, batch 9500, loss[loss=0.1652, simple_loss=0.2354, pruned_loss=0.04746, over 4941.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2387, pruned_loss=0.05346, over 971951.90 frames.], batch size: 21, lr: 7.13e-04 2022-05-04 05:20:28,627 INFO [train.py:715] (0/8) Epoch 2, batch 9550, loss[loss=0.1623, simple_loss=0.2336, pruned_loss=0.04547, over 4913.00 frames.], tot_loss[loss=0.1743, simple_loss=0.2399, pruned_loss=0.05436, over 972065.41 frames.], batch size: 18, lr: 7.12e-04 2022-05-04 05:21:08,634 INFO [train.py:715] (0/8) Epoch 2, batch 9600, loss[loss=0.1851, simple_loss=0.2533, pruned_loss=0.05839, over 4936.00 frames.], tot_loss[loss=0.1739, simple_loss=0.2394, pruned_loss=0.05415, over 972598.87 frames.], batch size: 21, lr: 7.12e-04 2022-05-04 05:21:47,531 INFO [train.py:715] (0/8) Epoch 2, batch 9650, loss[loss=0.1941, simple_loss=0.2667, pruned_loss=0.06073, over 4890.00 frames.], tot_loss[loss=0.173, simple_loss=0.2389, pruned_loss=0.05356, over 972268.55 frames.], batch size: 22, lr: 7.12e-04 2022-05-04 05:22:27,774 INFO [train.py:715] (0/8) Epoch 
2, batch 9700, loss[loss=0.1983, simple_loss=0.2593, pruned_loss=0.06863, over 4793.00 frames.], tot_loss[loss=0.1734, simple_loss=0.2389, pruned_loss=0.0539, over 971817.74 frames.], batch size: 14, lr: 7.12e-04 2022-05-04 05:23:08,404 INFO [train.py:715] (0/8) Epoch 2, batch 9750, loss[loss=0.2156, simple_loss=0.275, pruned_loss=0.07811, over 4817.00 frames.], tot_loss[loss=0.174, simple_loss=0.2393, pruned_loss=0.05434, over 971670.80 frames.], batch size: 26, lr: 7.11e-04 2022-05-04 05:23:47,695 INFO [train.py:715] (0/8) Epoch 2, batch 9800, loss[loss=0.1923, simple_loss=0.2627, pruned_loss=0.06094, over 4874.00 frames.], tot_loss[loss=0.1735, simple_loss=0.2393, pruned_loss=0.05379, over 971322.62 frames.], batch size: 16, lr: 7.11e-04 2022-05-04 05:24:26,792 INFO [train.py:715] (0/8) Epoch 2, batch 9850, loss[loss=0.1783, simple_loss=0.2535, pruned_loss=0.05154, over 4689.00 frames.], tot_loss[loss=0.1739, simple_loss=0.2397, pruned_loss=0.05407, over 971349.89 frames.], batch size: 15, lr: 7.11e-04 2022-05-04 05:25:06,813 INFO [train.py:715] (0/8) Epoch 2, batch 9900, loss[loss=0.1765, simple_loss=0.2267, pruned_loss=0.06316, over 4822.00 frames.], tot_loss[loss=0.1748, simple_loss=0.2406, pruned_loss=0.05445, over 971973.92 frames.], batch size: 12, lr: 7.11e-04 2022-05-04 05:25:46,404 INFO [train.py:715] (0/8) Epoch 2, batch 9950, loss[loss=0.1661, simple_loss=0.2428, pruned_loss=0.04473, over 4923.00 frames.], tot_loss[loss=0.1746, simple_loss=0.2407, pruned_loss=0.05421, over 972848.54 frames.], batch size: 18, lr: 7.11e-04 2022-05-04 05:26:25,425 INFO [train.py:715] (0/8) Epoch 2, batch 10000, loss[loss=0.1479, simple_loss=0.2221, pruned_loss=0.03681, over 4802.00 frames.], tot_loss[loss=0.1727, simple_loss=0.2393, pruned_loss=0.05309, over 971793.18 frames.], batch size: 12, lr: 7.10e-04 2022-05-04 05:27:06,106 INFO [train.py:715] (0/8) Epoch 2, batch 10050, loss[loss=0.1878, simple_loss=0.2503, pruned_loss=0.0626, over 4855.00 frames.], tot_loss[loss=0.173, simple_loss=0.2391, pruned_loss=0.05344, over 971840.76 frames.], batch size: 22, lr: 7.10e-04 2022-05-04 05:27:45,910 INFO [train.py:715] (0/8) Epoch 2, batch 10100, loss[loss=0.185, simple_loss=0.2549, pruned_loss=0.05751, over 4945.00 frames.], tot_loss[loss=0.1732, simple_loss=0.2398, pruned_loss=0.0533, over 972431.50 frames.], batch size: 21, lr: 7.10e-04 2022-05-04 05:28:25,920 INFO [train.py:715] (0/8) Epoch 2, batch 10150, loss[loss=0.1431, simple_loss=0.2177, pruned_loss=0.03431, over 4922.00 frames.], tot_loss[loss=0.1735, simple_loss=0.2399, pruned_loss=0.0535, over 973145.93 frames.], batch size: 29, lr: 7.10e-04 2022-05-04 05:29:06,178 INFO [train.py:715] (0/8) Epoch 2, batch 10200, loss[loss=0.1465, simple_loss=0.224, pruned_loss=0.03449, over 4808.00 frames.], tot_loss[loss=0.1735, simple_loss=0.24, pruned_loss=0.05347, over 972480.66 frames.], batch size: 25, lr: 7.09e-04 2022-05-04 05:29:47,604 INFO [train.py:715] (0/8) Epoch 2, batch 10250, loss[loss=0.2205, simple_loss=0.2743, pruned_loss=0.08338, over 4988.00 frames.], tot_loss[loss=0.1731, simple_loss=0.2393, pruned_loss=0.05344, over 972195.33 frames.], batch size: 14, lr: 7.09e-04 2022-05-04 05:30:27,418 INFO [train.py:715] (0/8) Epoch 2, batch 10300, loss[loss=0.1602, simple_loss=0.232, pruned_loss=0.04421, over 4989.00 frames.], tot_loss[loss=0.173, simple_loss=0.2392, pruned_loss=0.05339, over 972612.19 frames.], batch size: 25, lr: 7.09e-04 2022-05-04 05:31:07,040 INFO [train.py:715] (0/8) Epoch 2, batch 10350, loss[loss=0.1506, 
simple_loss=0.2176, pruned_loss=0.04173, over 4877.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2387, pruned_loss=0.05344, over 971563.90 frames.], batch size: 16, lr: 7.09e-04 2022-05-04 05:31:31,983 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-80000.pt 2022-05-04 05:31:49,855 INFO [train.py:715] (0/8) Epoch 2, batch 10400, loss[loss=0.1272, simple_loss=0.2037, pruned_loss=0.02535, over 4953.00 frames.], tot_loss[loss=0.1727, simple_loss=0.2387, pruned_loss=0.05338, over 972121.22 frames.], batch size: 14, lr: 7.09e-04 2022-05-04 05:32:31,021 INFO [train.py:715] (0/8) Epoch 2, batch 10450, loss[loss=0.1595, simple_loss=0.224, pruned_loss=0.04754, over 4784.00 frames.], tot_loss[loss=0.173, simple_loss=0.2387, pruned_loss=0.05361, over 971891.40 frames.], batch size: 18, lr: 7.08e-04 2022-05-04 05:33:11,274 INFO [train.py:715] (0/8) Epoch 2, batch 10500, loss[loss=0.1409, simple_loss=0.2064, pruned_loss=0.03769, over 4788.00 frames.], tot_loss[loss=0.1727, simple_loss=0.2387, pruned_loss=0.05336, over 971839.61 frames.], batch size: 12, lr: 7.08e-04 2022-05-04 05:33:50,618 INFO [train.py:715] (0/8) Epoch 2, batch 10550, loss[loss=0.1735, simple_loss=0.2288, pruned_loss=0.05908, over 4963.00 frames.], tot_loss[loss=0.1713, simple_loss=0.2375, pruned_loss=0.05251, over 972008.13 frames.], batch size: 15, lr: 7.08e-04 2022-05-04 05:34:31,848 INFO [train.py:715] (0/8) Epoch 2, batch 10600, loss[loss=0.1869, simple_loss=0.2526, pruned_loss=0.06063, over 4868.00 frames.], tot_loss[loss=0.172, simple_loss=0.2385, pruned_loss=0.05276, over 972318.01 frames.], batch size: 16, lr: 7.08e-04 2022-05-04 05:35:12,044 INFO [train.py:715] (0/8) Epoch 2, batch 10650, loss[loss=0.1471, simple_loss=0.2083, pruned_loss=0.04298, over 4778.00 frames.], tot_loss[loss=0.1724, simple_loss=0.2384, pruned_loss=0.05321, over 971936.97 frames.], batch size: 18, lr: 7.07e-04 2022-05-04 05:35:51,941 INFO [train.py:715] (0/8) Epoch 2, batch 10700, loss[loss=0.1922, simple_loss=0.2557, pruned_loss=0.06433, over 4959.00 frames.], tot_loss[loss=0.1739, simple_loss=0.2398, pruned_loss=0.05399, over 971791.13 frames.], batch size: 24, lr: 7.07e-04 2022-05-04 05:36:32,505 INFO [train.py:715] (0/8) Epoch 2, batch 10750, loss[loss=0.2114, simple_loss=0.27, pruned_loss=0.07644, over 4858.00 frames.], tot_loss[loss=0.1724, simple_loss=0.2387, pruned_loss=0.05301, over 971976.63 frames.], batch size: 32, lr: 7.07e-04 2022-05-04 05:37:13,628 INFO [train.py:715] (0/8) Epoch 2, batch 10800, loss[loss=0.1437, simple_loss=0.2212, pruned_loss=0.0331, over 4832.00 frames.], tot_loss[loss=0.1718, simple_loss=0.2388, pruned_loss=0.05245, over 972074.28 frames.], batch size: 25, lr: 7.07e-04 2022-05-04 05:37:53,815 INFO [train.py:715] (0/8) Epoch 2, batch 10850, loss[loss=0.1842, simple_loss=0.252, pruned_loss=0.05827, over 4779.00 frames.], tot_loss[loss=0.1711, simple_loss=0.2379, pruned_loss=0.05215, over 972203.82 frames.], batch size: 17, lr: 7.07e-04 2022-05-04 05:38:33,325 INFO [train.py:715] (0/8) Epoch 2, batch 10900, loss[loss=0.1708, simple_loss=0.2388, pruned_loss=0.05139, over 4984.00 frames.], tot_loss[loss=0.171, simple_loss=0.2374, pruned_loss=0.05233, over 973198.65 frames.], batch size: 25, lr: 7.06e-04 2022-05-04 05:39:14,355 INFO [train.py:715] (0/8) Epoch 2, batch 10950, loss[loss=0.1649, simple_loss=0.2346, pruned_loss=0.04765, over 4870.00 frames.], tot_loss[loss=0.172, simple_loss=0.238, pruned_loss=0.05301, over 974010.19 frames.], batch size: 20, lr: 7.06e-04 
2022-05-04 05:39:54,190 INFO [train.py:715] (0/8) Epoch 2, batch 11000, loss[loss=0.186, simple_loss=0.262, pruned_loss=0.05505, over 4980.00 frames.], tot_loss[loss=0.171, simple_loss=0.2373, pruned_loss=0.05236, over 973508.71 frames.], batch size: 25, lr: 7.06e-04 2022-05-04 05:40:33,762 INFO [train.py:715] (0/8) Epoch 2, batch 11050, loss[loss=0.1774, simple_loss=0.2441, pruned_loss=0.05534, over 4826.00 frames.], tot_loss[loss=0.1706, simple_loss=0.2371, pruned_loss=0.05202, over 973021.75 frames.], batch size: 26, lr: 7.06e-04 2022-05-04 05:41:14,435 INFO [train.py:715] (0/8) Epoch 2, batch 11100, loss[loss=0.1952, simple_loss=0.2456, pruned_loss=0.0724, over 4960.00 frames.], tot_loss[loss=0.1714, simple_loss=0.2376, pruned_loss=0.05263, over 972682.61 frames.], batch size: 15, lr: 7.05e-04 2022-05-04 05:41:54,865 INFO [train.py:715] (0/8) Epoch 2, batch 11150, loss[loss=0.175, simple_loss=0.2589, pruned_loss=0.0456, over 4861.00 frames.], tot_loss[loss=0.1719, simple_loss=0.2377, pruned_loss=0.05306, over 973196.79 frames.], batch size: 30, lr: 7.05e-04 2022-05-04 05:42:35,626 INFO [train.py:715] (0/8) Epoch 2, batch 11200, loss[loss=0.1726, simple_loss=0.2357, pruned_loss=0.0547, over 4856.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2389, pruned_loss=0.05337, over 972600.79 frames.], batch size: 32, lr: 7.05e-04 2022-05-04 05:43:15,656 INFO [train.py:715] (0/8) Epoch 2, batch 11250, loss[loss=0.179, simple_loss=0.2478, pruned_loss=0.05517, over 4855.00 frames.], tot_loss[loss=0.1731, simple_loss=0.2392, pruned_loss=0.05349, over 972556.44 frames.], batch size: 20, lr: 7.05e-04 2022-05-04 05:43:56,718 INFO [train.py:715] (0/8) Epoch 2, batch 11300, loss[loss=0.2154, simple_loss=0.2748, pruned_loss=0.078, over 4896.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2389, pruned_loss=0.05335, over 971792.65 frames.], batch size: 19, lr: 7.05e-04 2022-05-04 05:44:37,056 INFO [train.py:715] (0/8) Epoch 2, batch 11350, loss[loss=0.1593, simple_loss=0.2267, pruned_loss=0.04598, over 4889.00 frames.], tot_loss[loss=0.1726, simple_loss=0.2389, pruned_loss=0.05321, over 972117.30 frames.], batch size: 22, lr: 7.04e-04 2022-05-04 05:45:16,683 INFO [train.py:715] (0/8) Epoch 2, batch 11400, loss[loss=0.1114, simple_loss=0.1825, pruned_loss=0.02017, over 4789.00 frames.], tot_loss[loss=0.1737, simple_loss=0.2394, pruned_loss=0.05398, over 972655.31 frames.], batch size: 12, lr: 7.04e-04 2022-05-04 05:45:56,733 INFO [train.py:715] (0/8) Epoch 2, batch 11450, loss[loss=0.2205, simple_loss=0.2669, pruned_loss=0.08703, over 4874.00 frames.], tot_loss[loss=0.1739, simple_loss=0.2393, pruned_loss=0.05422, over 972538.81 frames.], batch size: 16, lr: 7.04e-04 2022-05-04 05:46:37,328 INFO [train.py:715] (0/8) Epoch 2, batch 11500, loss[loss=0.1413, simple_loss=0.2076, pruned_loss=0.0375, over 4848.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2386, pruned_loss=0.05349, over 973778.19 frames.], batch size: 13, lr: 7.04e-04 2022-05-04 05:47:18,055 INFO [train.py:715] (0/8) Epoch 2, batch 11550, loss[loss=0.2015, simple_loss=0.2599, pruned_loss=0.07161, over 4827.00 frames.], tot_loss[loss=0.1735, simple_loss=0.2392, pruned_loss=0.05386, over 973715.55 frames.], batch size: 30, lr: 7.04e-04 2022-05-04 05:47:58,023 INFO [train.py:715] (0/8) Epoch 2, batch 11600, loss[loss=0.1882, simple_loss=0.2449, pruned_loss=0.06577, over 4989.00 frames.], tot_loss[loss=0.1739, simple_loss=0.2396, pruned_loss=0.05417, over 973408.30 frames.], batch size: 16, lr: 7.03e-04 2022-05-04 05:48:39,179 INFO 
[train.py:715] (0/8) Epoch 2, batch 11650, loss[loss=0.1757, simple_loss=0.2382, pruned_loss=0.05664, over 4993.00 frames.], tot_loss[loss=0.1735, simple_loss=0.2396, pruned_loss=0.05372, over 972987.28 frames.], batch size: 14, lr: 7.03e-04 2022-05-04 05:49:19,425 INFO [train.py:715] (0/8) Epoch 2, batch 11700, loss[loss=0.1758, simple_loss=0.2433, pruned_loss=0.05416, over 4920.00 frames.], tot_loss[loss=0.173, simple_loss=0.2394, pruned_loss=0.05332, over 972488.74 frames.], batch size: 39, lr: 7.03e-04 2022-05-04 05:49:59,625 INFO [train.py:715] (0/8) Epoch 2, batch 11750, loss[loss=0.1719, simple_loss=0.2496, pruned_loss=0.04709, over 4777.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2388, pruned_loss=0.05335, over 971864.79 frames.], batch size: 12, lr: 7.03e-04 2022-05-04 05:50:40,399 INFO [train.py:715] (0/8) Epoch 2, batch 11800, loss[loss=0.1399, simple_loss=0.2105, pruned_loss=0.03469, over 4889.00 frames.], tot_loss[loss=0.1719, simple_loss=0.2383, pruned_loss=0.05278, over 972308.22 frames.], batch size: 22, lr: 7.02e-04 2022-05-04 05:51:20,985 INFO [train.py:715] (0/8) Epoch 2, batch 11850, loss[loss=0.2181, simple_loss=0.275, pruned_loss=0.0806, over 4865.00 frames.], tot_loss[loss=0.1729, simple_loss=0.239, pruned_loss=0.05343, over 972284.21 frames.], batch size: 32, lr: 7.02e-04 2022-05-04 05:52:00,405 INFO [train.py:715] (0/8) Epoch 2, batch 11900, loss[loss=0.1575, simple_loss=0.212, pruned_loss=0.05155, over 4794.00 frames.], tot_loss[loss=0.1717, simple_loss=0.2378, pruned_loss=0.0528, over 972063.78 frames.], batch size: 17, lr: 7.02e-04 2022-05-04 05:52:40,338 INFO [train.py:715] (0/8) Epoch 2, batch 11950, loss[loss=0.1393, simple_loss=0.208, pruned_loss=0.03528, over 4899.00 frames.], tot_loss[loss=0.1715, simple_loss=0.2379, pruned_loss=0.05259, over 972224.95 frames.], batch size: 18, lr: 7.02e-04 2022-05-04 05:53:21,657 INFO [train.py:715] (0/8) Epoch 2, batch 12000, loss[loss=0.2117, simple_loss=0.2781, pruned_loss=0.0726, over 4811.00 frames.], tot_loss[loss=0.1726, simple_loss=0.2382, pruned_loss=0.05348, over 972624.23 frames.], batch size: 21, lr: 7.02e-04 2022-05-04 05:53:21,659 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 05:53:45,624 INFO [train.py:742] (0/8) Epoch 2, validation: loss=0.1181, simple_loss=0.2049, pruned_loss=0.01568, over 914524.00 frames. 
2022-05-04 05:54:27,023 INFO [train.py:715] (0/8) Epoch 2, batch 12050, loss[loss=0.1771, simple_loss=0.2481, pruned_loss=0.05304, over 4790.00 frames.], tot_loss[loss=0.1726, simple_loss=0.2382, pruned_loss=0.05344, over 973032.78 frames.], batch size: 14, lr: 7.01e-04 2022-05-04 05:55:07,117 INFO [train.py:715] (0/8) Epoch 2, batch 12100, loss[loss=0.1719, simple_loss=0.2403, pruned_loss=0.05179, over 4979.00 frames.], tot_loss[loss=0.1734, simple_loss=0.239, pruned_loss=0.05386, over 972830.70 frames.], batch size: 25, lr: 7.01e-04 2022-05-04 05:55:47,111 INFO [train.py:715] (0/8) Epoch 2, batch 12150, loss[loss=0.2187, simple_loss=0.2743, pruned_loss=0.08152, over 4964.00 frames.], tot_loss[loss=0.1735, simple_loss=0.2393, pruned_loss=0.05382, over 972937.34 frames.], batch size: 24, lr: 7.01e-04 2022-05-04 05:56:27,810 INFO [train.py:715] (0/8) Epoch 2, batch 12200, loss[loss=0.1545, simple_loss=0.2123, pruned_loss=0.04838, over 4806.00 frames.], tot_loss[loss=0.1733, simple_loss=0.2392, pruned_loss=0.05374, over 973800.47 frames.], batch size: 12, lr: 7.01e-04 2022-05-04 05:57:07,985 INFO [train.py:715] (0/8) Epoch 2, batch 12250, loss[loss=0.1663, simple_loss=0.2493, pruned_loss=0.04167, over 4982.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2391, pruned_loss=0.05321, over 972856.97 frames.], batch size: 24, lr: 7.01e-04 2022-05-04 05:57:48,413 INFO [train.py:715] (0/8) Epoch 2, batch 12300, loss[loss=0.1672, simple_loss=0.2366, pruned_loss=0.04889, over 4915.00 frames.], tot_loss[loss=0.1734, simple_loss=0.2399, pruned_loss=0.05351, over 972879.54 frames.], batch size: 17, lr: 7.00e-04 2022-05-04 05:58:28,536 INFO [train.py:715] (0/8) Epoch 2, batch 12350, loss[loss=0.1573, simple_loss=0.2172, pruned_loss=0.04866, over 4772.00 frames.], tot_loss[loss=0.1732, simple_loss=0.2396, pruned_loss=0.05342, over 972992.59 frames.], batch size: 14, lr: 7.00e-04 2022-05-04 05:59:09,751 INFO [train.py:715] (0/8) Epoch 2, batch 12400, loss[loss=0.2027, simple_loss=0.264, pruned_loss=0.07071, over 4894.00 frames.], tot_loss[loss=0.1717, simple_loss=0.2385, pruned_loss=0.05245, over 973207.89 frames.], batch size: 16, lr: 7.00e-04 2022-05-04 05:59:50,016 INFO [train.py:715] (0/8) Epoch 2, batch 12450, loss[loss=0.1881, simple_loss=0.2514, pruned_loss=0.06247, over 4751.00 frames.], tot_loss[loss=0.1722, simple_loss=0.2389, pruned_loss=0.05278, over 972971.95 frames.], batch size: 19, lr: 7.00e-04 2022-05-04 06:00:29,871 INFO [train.py:715] (0/8) Epoch 2, batch 12500, loss[loss=0.1335, simple_loss=0.2069, pruned_loss=0.02999, over 4759.00 frames.], tot_loss[loss=0.1733, simple_loss=0.2396, pruned_loss=0.05353, over 973487.16 frames.], batch size: 19, lr: 6.99e-04 2022-05-04 06:01:10,536 INFO [train.py:715] (0/8) Epoch 2, batch 12550, loss[loss=0.1369, simple_loss=0.2172, pruned_loss=0.02832, over 4794.00 frames.], tot_loss[loss=0.1716, simple_loss=0.2382, pruned_loss=0.05247, over 973626.85 frames.], batch size: 12, lr: 6.99e-04 2022-05-04 06:01:50,875 INFO [train.py:715] (0/8) Epoch 2, batch 12600, loss[loss=0.1661, simple_loss=0.234, pruned_loss=0.0491, over 4883.00 frames.], tot_loss[loss=0.1725, simple_loss=0.2386, pruned_loss=0.05324, over 972984.46 frames.], batch size: 19, lr: 6.99e-04 2022-05-04 06:02:30,892 INFO [train.py:715] (0/8) Epoch 2, batch 12650, loss[loss=0.2064, simple_loss=0.2719, pruned_loss=0.07047, over 4899.00 frames.], tot_loss[loss=0.1733, simple_loss=0.2393, pruned_loss=0.05361, over 972720.68 frames.], batch size: 19, lr: 6.99e-04 2022-05-04 06:03:11,021 INFO 
[train.py:715] (0/8) Epoch 2, batch 12700, loss[loss=0.1608, simple_loss=0.2312, pruned_loss=0.04519, over 4991.00 frames.], tot_loss[loss=0.1719, simple_loss=0.2384, pruned_loss=0.05273, over 972257.32 frames.], batch size: 31, lr: 6.99e-04 2022-05-04 06:03:51,749 INFO [train.py:715] (0/8) Epoch 2, batch 12750, loss[loss=0.1603, simple_loss=0.2199, pruned_loss=0.05034, over 4984.00 frames.], tot_loss[loss=0.1722, simple_loss=0.2385, pruned_loss=0.05293, over 972703.77 frames.], batch size: 28, lr: 6.98e-04 2022-05-04 06:04:31,917 INFO [train.py:715] (0/8) Epoch 2, batch 12800, loss[loss=0.1465, simple_loss=0.219, pruned_loss=0.03701, over 4900.00 frames.], tot_loss[loss=0.1729, simple_loss=0.2391, pruned_loss=0.05332, over 971895.51 frames.], batch size: 17, lr: 6.98e-04 2022-05-04 06:05:11,606 INFO [train.py:715] (0/8) Epoch 2, batch 12850, loss[loss=0.1965, simple_loss=0.2524, pruned_loss=0.07027, over 4921.00 frames.], tot_loss[loss=0.1725, simple_loss=0.239, pruned_loss=0.05304, over 973069.25 frames.], batch size: 39, lr: 6.98e-04 2022-05-04 06:05:52,430 INFO [train.py:715] (0/8) Epoch 2, batch 12900, loss[loss=0.1382, simple_loss=0.2121, pruned_loss=0.03212, over 4818.00 frames.], tot_loss[loss=0.1718, simple_loss=0.2385, pruned_loss=0.05252, over 972924.51 frames.], batch size: 27, lr: 6.98e-04 2022-05-04 06:06:32,859 INFO [train.py:715] (0/8) Epoch 2, batch 12950, loss[loss=0.1708, simple_loss=0.2407, pruned_loss=0.05048, over 4852.00 frames.], tot_loss[loss=0.1718, simple_loss=0.2384, pruned_loss=0.0526, over 972801.98 frames.], batch size: 20, lr: 6.98e-04 2022-05-04 06:07:12,813 INFO [train.py:715] (0/8) Epoch 2, batch 13000, loss[loss=0.1639, simple_loss=0.2422, pruned_loss=0.04285, over 4829.00 frames.], tot_loss[loss=0.1739, simple_loss=0.2405, pruned_loss=0.05368, over 973440.64 frames.], batch size: 27, lr: 6.97e-04 2022-05-04 06:07:53,252 INFO [train.py:715] (0/8) Epoch 2, batch 13050, loss[loss=0.1662, simple_loss=0.2335, pruned_loss=0.04942, over 4757.00 frames.], tot_loss[loss=0.1738, simple_loss=0.24, pruned_loss=0.05376, over 972362.45 frames.], batch size: 19, lr: 6.97e-04 2022-05-04 06:08:34,485 INFO [train.py:715] (0/8) Epoch 2, batch 13100, loss[loss=0.1603, simple_loss=0.2384, pruned_loss=0.04109, over 4752.00 frames.], tot_loss[loss=0.1741, simple_loss=0.2398, pruned_loss=0.05414, over 971866.84 frames.], batch size: 19, lr: 6.97e-04 2022-05-04 06:09:14,696 INFO [train.py:715] (0/8) Epoch 2, batch 13150, loss[loss=0.1917, simple_loss=0.2539, pruned_loss=0.06472, over 4832.00 frames.], tot_loss[loss=0.174, simple_loss=0.24, pruned_loss=0.05402, over 971165.71 frames.], batch size: 13, lr: 6.97e-04 2022-05-04 06:09:54,437 INFO [train.py:715] (0/8) Epoch 2, batch 13200, loss[loss=0.2141, simple_loss=0.2716, pruned_loss=0.07826, over 4903.00 frames.], tot_loss[loss=0.1725, simple_loss=0.2385, pruned_loss=0.05327, over 971447.62 frames.], batch size: 19, lr: 6.96e-04 2022-05-04 06:10:35,332 INFO [train.py:715] (0/8) Epoch 2, batch 13250, loss[loss=0.1497, simple_loss=0.2192, pruned_loss=0.04012, over 4938.00 frames.], tot_loss[loss=0.1727, simple_loss=0.2389, pruned_loss=0.05323, over 971110.98 frames.], batch size: 18, lr: 6.96e-04 2022-05-04 06:11:15,867 INFO [train.py:715] (0/8) Epoch 2, batch 13300, loss[loss=0.1347, simple_loss=0.1993, pruned_loss=0.03502, over 4975.00 frames.], tot_loss[loss=0.1716, simple_loss=0.2379, pruned_loss=0.0527, over 971328.04 frames.], batch size: 28, lr: 6.96e-04 2022-05-04 06:11:55,898 INFO [train.py:715] (0/8) Epoch 2, batch 
13350, loss[loss=0.1314, simple_loss=0.2034, pruned_loss=0.0297, over 4697.00 frames.], tot_loss[loss=0.1718, simple_loss=0.2377, pruned_loss=0.05299, over 970972.88 frames.], batch size: 15, lr: 6.96e-04 2022-05-04 06:12:36,496 INFO [train.py:715] (0/8) Epoch 2, batch 13400, loss[loss=0.1326, simple_loss=0.2024, pruned_loss=0.03141, over 4937.00 frames.], tot_loss[loss=0.1721, simple_loss=0.2381, pruned_loss=0.05303, over 971308.50 frames.], batch size: 21, lr: 6.96e-04 2022-05-04 06:13:17,581 INFO [train.py:715] (0/8) Epoch 2, batch 13450, loss[loss=0.1611, simple_loss=0.2347, pruned_loss=0.04377, over 4793.00 frames.], tot_loss[loss=0.1713, simple_loss=0.2371, pruned_loss=0.05279, over 970511.41 frames.], batch size: 21, lr: 6.95e-04 2022-05-04 06:13:57,534 INFO [train.py:715] (0/8) Epoch 2, batch 13500, loss[loss=0.1796, simple_loss=0.2343, pruned_loss=0.06249, over 4850.00 frames.], tot_loss[loss=0.1718, simple_loss=0.2378, pruned_loss=0.05287, over 971328.98 frames.], batch size: 25, lr: 6.95e-04 2022-05-04 06:14:37,535 INFO [train.py:715] (0/8) Epoch 2, batch 13550, loss[loss=0.1588, simple_loss=0.2178, pruned_loss=0.04996, over 4825.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2391, pruned_loss=0.05323, over 970829.43 frames.], batch size: 12, lr: 6.95e-04 2022-05-04 06:15:18,686 INFO [train.py:715] (0/8) Epoch 2, batch 13600, loss[loss=0.1667, simple_loss=0.234, pruned_loss=0.04967, over 4637.00 frames.], tot_loss[loss=0.1718, simple_loss=0.238, pruned_loss=0.05282, over 970909.64 frames.], batch size: 13, lr: 6.95e-04 2022-05-04 06:15:59,135 INFO [train.py:715] (0/8) Epoch 2, batch 13650, loss[loss=0.1678, simple_loss=0.2342, pruned_loss=0.05071, over 4745.00 frames.], tot_loss[loss=0.1736, simple_loss=0.2397, pruned_loss=0.05373, over 971700.17 frames.], batch size: 19, lr: 6.95e-04 2022-05-04 06:16:38,695 INFO [train.py:715] (0/8) Epoch 2, batch 13700, loss[loss=0.1892, simple_loss=0.2377, pruned_loss=0.07038, over 4946.00 frames.], tot_loss[loss=0.173, simple_loss=0.2392, pruned_loss=0.05341, over 971087.24 frames.], batch size: 35, lr: 6.94e-04 2022-05-04 06:17:19,957 INFO [train.py:715] (0/8) Epoch 2, batch 13750, loss[loss=0.1515, simple_loss=0.225, pruned_loss=0.03901, over 4976.00 frames.], tot_loss[loss=0.1726, simple_loss=0.239, pruned_loss=0.0531, over 971381.79 frames.], batch size: 15, lr: 6.94e-04 2022-05-04 06:18:00,038 INFO [train.py:715] (0/8) Epoch 2, batch 13800, loss[loss=0.1862, simple_loss=0.2331, pruned_loss=0.06966, over 4792.00 frames.], tot_loss[loss=0.1727, simple_loss=0.2389, pruned_loss=0.05325, over 971565.17 frames.], batch size: 17, lr: 6.94e-04 2022-05-04 06:18:39,729 INFO [train.py:715] (0/8) Epoch 2, batch 13850, loss[loss=0.1811, simple_loss=0.2531, pruned_loss=0.05452, over 4854.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2391, pruned_loss=0.05322, over 970907.25 frames.], batch size: 20, lr: 6.94e-04 2022-05-04 06:19:19,317 INFO [train.py:715] (0/8) Epoch 2, batch 13900, loss[loss=0.1426, simple_loss=0.222, pruned_loss=0.0316, over 4866.00 frames.], tot_loss[loss=0.1717, simple_loss=0.2386, pruned_loss=0.0524, over 970313.34 frames.], batch size: 20, lr: 6.94e-04 2022-05-04 06:20:00,083 INFO [train.py:715] (0/8) Epoch 2, batch 13950, loss[loss=0.1713, simple_loss=0.2436, pruned_loss=0.04956, over 4888.00 frames.], tot_loss[loss=0.1717, simple_loss=0.2382, pruned_loss=0.05262, over 969781.32 frames.], batch size: 22, lr: 6.93e-04 2022-05-04 06:20:40,297 INFO [train.py:715] (0/8) Epoch 2, batch 14000, loss[loss=0.1617, 
simple_loss=0.2318, pruned_loss=0.0458, over 4795.00 frames.], tot_loss[loss=0.1726, simple_loss=0.239, pruned_loss=0.05307, over 970676.85 frames.], batch size: 21, lr: 6.93e-04 2022-05-04 06:21:19,543 INFO [train.py:715] (0/8) Epoch 2, batch 14050, loss[loss=0.14, simple_loss=0.2015, pruned_loss=0.03925, over 4825.00 frames.], tot_loss[loss=0.172, simple_loss=0.2384, pruned_loss=0.05282, over 970361.99 frames.], batch size: 13, lr: 6.93e-04 2022-05-04 06:22:01,052 INFO [train.py:715] (0/8) Epoch 2, batch 14100, loss[loss=0.1568, simple_loss=0.2255, pruned_loss=0.04406, over 4908.00 frames.], tot_loss[loss=0.1741, simple_loss=0.24, pruned_loss=0.05414, over 970974.18 frames.], batch size: 19, lr: 6.93e-04 2022-05-04 06:22:41,695 INFO [train.py:715] (0/8) Epoch 2, batch 14150, loss[loss=0.1749, simple_loss=0.2363, pruned_loss=0.05673, over 4697.00 frames.], tot_loss[loss=0.1729, simple_loss=0.2387, pruned_loss=0.05358, over 970847.94 frames.], batch size: 15, lr: 6.93e-04 2022-05-04 06:23:21,644 INFO [train.py:715] (0/8) Epoch 2, batch 14200, loss[loss=0.1559, simple_loss=0.2237, pruned_loss=0.04405, over 4867.00 frames.], tot_loss[loss=0.1743, simple_loss=0.2396, pruned_loss=0.05454, over 970609.00 frames.], batch size: 20, lr: 6.92e-04 2022-05-04 06:24:01,483 INFO [train.py:715] (0/8) Epoch 2, batch 14250, loss[loss=0.1699, simple_loss=0.2384, pruned_loss=0.05072, over 4930.00 frames.], tot_loss[loss=0.1744, simple_loss=0.2392, pruned_loss=0.05477, over 970110.41 frames.], batch size: 23, lr: 6.92e-04 2022-05-04 06:24:42,097 INFO [train.py:715] (0/8) Epoch 2, batch 14300, loss[loss=0.2232, simple_loss=0.2884, pruned_loss=0.07904, over 4938.00 frames.], tot_loss[loss=0.1732, simple_loss=0.2383, pruned_loss=0.05411, over 970783.20 frames.], batch size: 21, lr: 6.92e-04 2022-05-04 06:25:21,657 INFO [train.py:715] (0/8) Epoch 2, batch 14350, loss[loss=0.1944, simple_loss=0.2598, pruned_loss=0.06449, over 4985.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2379, pruned_loss=0.05383, over 971383.97 frames.], batch size: 31, lr: 6.92e-04 2022-05-04 06:26:01,523 INFO [train.py:715] (0/8) Epoch 2, batch 14400, loss[loss=0.1372, simple_loss=0.1988, pruned_loss=0.03781, over 4774.00 frames.], tot_loss[loss=0.1726, simple_loss=0.2378, pruned_loss=0.05369, over 971433.78 frames.], batch size: 12, lr: 6.92e-04 2022-05-04 06:26:41,861 INFO [train.py:715] (0/8) Epoch 2, batch 14450, loss[loss=0.2097, simple_loss=0.2723, pruned_loss=0.07355, over 4886.00 frames.], tot_loss[loss=0.1727, simple_loss=0.2383, pruned_loss=0.05353, over 972388.23 frames.], batch size: 38, lr: 6.91e-04 2022-05-04 06:27:22,095 INFO [train.py:715] (0/8) Epoch 2, batch 14500, loss[loss=0.1542, simple_loss=0.2225, pruned_loss=0.04289, over 4774.00 frames.], tot_loss[loss=0.1726, simple_loss=0.2387, pruned_loss=0.05325, over 973004.21 frames.], batch size: 17, lr: 6.91e-04 2022-05-04 06:28:01,698 INFO [train.py:715] (0/8) Epoch 2, batch 14550, loss[loss=0.1611, simple_loss=0.2377, pruned_loss=0.04228, over 4806.00 frames.], tot_loss[loss=0.1721, simple_loss=0.2386, pruned_loss=0.05276, over 972221.53 frames.], batch size: 24, lr: 6.91e-04 2022-05-04 06:28:42,170 INFO [train.py:715] (0/8) Epoch 2, batch 14600, loss[loss=0.1328, simple_loss=0.2072, pruned_loss=0.0292, over 4799.00 frames.], tot_loss[loss=0.1722, simple_loss=0.2384, pruned_loss=0.05299, over 971984.93 frames.], batch size: 24, lr: 6.91e-04 2022-05-04 06:29:22,663 INFO [train.py:715] (0/8) Epoch 2, batch 14650, loss[loss=0.2023, simple_loss=0.2592, 
pruned_loss=0.07273, over 4858.00 frames.], tot_loss[loss=0.173, simple_loss=0.239, pruned_loss=0.0535, over 972114.53 frames.], batch size: 13, lr: 6.90e-04 2022-05-04 06:30:01,959 INFO [train.py:715] (0/8) Epoch 2, batch 14700, loss[loss=0.1759, simple_loss=0.247, pruned_loss=0.05244, over 4977.00 frames.], tot_loss[loss=0.1738, simple_loss=0.2397, pruned_loss=0.05397, over 972211.34 frames.], batch size: 24, lr: 6.90e-04 2022-05-04 06:30:41,305 INFO [train.py:715] (0/8) Epoch 2, batch 14750, loss[loss=0.1699, simple_loss=0.2364, pruned_loss=0.05172, over 4928.00 frames.], tot_loss[loss=0.1733, simple_loss=0.2392, pruned_loss=0.0537, over 971881.24 frames.], batch size: 29, lr: 6.90e-04 2022-05-04 06:31:21,766 INFO [train.py:715] (0/8) Epoch 2, batch 14800, loss[loss=0.1477, simple_loss=0.2107, pruned_loss=0.04233, over 4843.00 frames.], tot_loss[loss=0.1739, simple_loss=0.2397, pruned_loss=0.05402, over 971233.64 frames.], batch size: 15, lr: 6.90e-04 2022-05-04 06:32:01,271 INFO [train.py:715] (0/8) Epoch 2, batch 14850, loss[loss=0.1967, simple_loss=0.2622, pruned_loss=0.06563, over 4802.00 frames.], tot_loss[loss=0.1744, simple_loss=0.2402, pruned_loss=0.05435, over 971351.01 frames.], batch size: 21, lr: 6.90e-04 2022-05-04 06:32:40,958 INFO [train.py:715] (0/8) Epoch 2, batch 14900, loss[loss=0.1729, simple_loss=0.2462, pruned_loss=0.04984, over 4916.00 frames.], tot_loss[loss=0.175, simple_loss=0.2407, pruned_loss=0.05462, over 971034.39 frames.], batch size: 39, lr: 6.89e-04 2022-05-04 06:33:21,115 INFO [train.py:715] (0/8) Epoch 2, batch 14950, loss[loss=0.1961, simple_loss=0.2633, pruned_loss=0.06448, over 4796.00 frames.], tot_loss[loss=0.1735, simple_loss=0.239, pruned_loss=0.05401, over 971401.73 frames.], batch size: 17, lr: 6.89e-04 2022-05-04 06:34:01,761 INFO [train.py:715] (0/8) Epoch 2, batch 15000, loss[loss=0.1448, simple_loss=0.2234, pruned_loss=0.03308, over 4950.00 frames.], tot_loss[loss=0.1736, simple_loss=0.2393, pruned_loss=0.05392, over 971522.09 frames.], batch size: 21, lr: 6.89e-04 2022-05-04 06:34:01,763 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 06:34:11,142 INFO [train.py:742] (0/8) Epoch 2, validation: loss=0.1176, simple_loss=0.2043, pruned_loss=0.01548, over 914524.00 frames. 
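The entry format used throughout this log is regular enough to turn into loss curves with a small parser. The sketch below is a hypothetical helper, not part of the training code: the regexes, the parse_log function, and the "train.log" path are all assumptions based only on the entry layout visible above. It pulls epoch, batch index, running tot_loss, and learning rate from training entries, and the periodic validation loss from validation entries.

import re

# Regexes matching the entry layout visible in this log; they are an assumption
# based on the lines above, not something shipped with the training scripts.
# re.DOTALL lets a match span entries that were wrapped across physical lines.
TRAIN_RE = re.compile(
    r"Epoch (?P<epoch>\d+), batch (?P<batch>\d+), .*?"
    r"tot_loss\[loss=(?P<tot_loss>[\d.]+),.*?\], "
    r"batch size: \d+, lr: (?P<lr>[\d.e-]+)",
    re.DOTALL,
)
VALID_RE = re.compile(r"Epoch (?P<epoch>\d+), validation: loss=(?P<loss>[\d.]+)")


def parse_log(text):
    """Yield ('train', epoch, batch, tot_loss, lr) and ('valid', epoch, loss) records."""
    for m in TRAIN_RE.finditer(text):
        yield ("train", int(m["epoch"]), int(m["batch"]), float(m["tot_loss"]), float(m["lr"]))
    for m in VALID_RE.finditer(text):
        yield ("valid", int(m["epoch"]), float(m["loss"]))


if __name__ == "__main__":
    # "train.log" is a placeholder path for a file containing this log's text.
    with open("train.log") as f:
        for record in parse_log(f.read()):
            print(record)

Feeding the yielded records into any plotting tool then gives tot_loss and validation loss against batch index, which is the main signal these entries carry.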
2022-05-04 06:34:52,062 INFO [train.py:715] (0/8) Epoch 2, batch 15050, loss[loss=0.1522, simple_loss=0.2243, pruned_loss=0.04, over 4985.00 frames.], tot_loss[loss=0.1722, simple_loss=0.2385, pruned_loss=0.05295, over 971684.29 frames.], batch size: 35, lr: 6.89e-04 2022-05-04 06:35:31,196 INFO [train.py:715] (0/8) Epoch 2, batch 15100, loss[loss=0.1821, simple_loss=0.2417, pruned_loss=0.06126, over 4855.00 frames.], tot_loss[loss=0.1717, simple_loss=0.2382, pruned_loss=0.05263, over 971321.14 frames.], batch size: 32, lr: 6.89e-04 2022-05-04 06:36:11,673 INFO [train.py:715] (0/8) Epoch 2, batch 15150, loss[loss=0.1955, simple_loss=0.2615, pruned_loss=0.06481, over 4975.00 frames.], tot_loss[loss=0.1731, simple_loss=0.2395, pruned_loss=0.05334, over 972166.45 frames.], batch size: 35, lr: 6.88e-04 2022-05-04 06:36:52,153 INFO [train.py:715] (0/8) Epoch 2, batch 15200, loss[loss=0.1578, simple_loss=0.2171, pruned_loss=0.04923, over 4839.00 frames.], tot_loss[loss=0.1718, simple_loss=0.2381, pruned_loss=0.05273, over 972745.15 frames.], batch size: 15, lr: 6.88e-04 2022-05-04 06:37:31,887 INFO [train.py:715] (0/8) Epoch 2, batch 15250, loss[loss=0.1864, simple_loss=0.2522, pruned_loss=0.06027, over 4931.00 frames.], tot_loss[loss=0.1714, simple_loss=0.2381, pruned_loss=0.0523, over 971688.38 frames.], batch size: 39, lr: 6.88e-04 2022-05-04 06:38:11,348 INFO [train.py:715] (0/8) Epoch 2, batch 15300, loss[loss=0.1399, simple_loss=0.2169, pruned_loss=0.03143, over 4968.00 frames.], tot_loss[loss=0.1711, simple_loss=0.2378, pruned_loss=0.05219, over 972196.52 frames.], batch size: 24, lr: 6.88e-04 2022-05-04 06:38:51,799 INFO [train.py:715] (0/8) Epoch 2, batch 15350, loss[loss=0.2042, simple_loss=0.2692, pruned_loss=0.06961, over 4932.00 frames.], tot_loss[loss=0.1715, simple_loss=0.2385, pruned_loss=0.05222, over 972744.91 frames.], batch size: 23, lr: 6.88e-04 2022-05-04 06:39:32,696 INFO [train.py:715] (0/8) Epoch 2, batch 15400, loss[loss=0.136, simple_loss=0.2065, pruned_loss=0.03275, over 4809.00 frames.], tot_loss[loss=0.172, simple_loss=0.2387, pruned_loss=0.05262, over 973312.19 frames.], batch size: 12, lr: 6.87e-04 2022-05-04 06:40:11,870 INFO [train.py:715] (0/8) Epoch 2, batch 15450, loss[loss=0.1748, simple_loss=0.2272, pruned_loss=0.06117, over 4866.00 frames.], tot_loss[loss=0.1716, simple_loss=0.2385, pruned_loss=0.0524, over 972603.51 frames.], batch size: 32, lr: 6.87e-04 2022-05-04 06:40:52,376 INFO [train.py:715] (0/8) Epoch 2, batch 15500, loss[loss=0.1777, simple_loss=0.2486, pruned_loss=0.05341, over 4794.00 frames.], tot_loss[loss=0.173, simple_loss=0.24, pruned_loss=0.05304, over 972818.80 frames.], batch size: 17, lr: 6.87e-04 2022-05-04 06:41:32,619 INFO [train.py:715] (0/8) Epoch 2, batch 15550, loss[loss=0.1695, simple_loss=0.2308, pruned_loss=0.05412, over 4818.00 frames.], tot_loss[loss=0.1724, simple_loss=0.2395, pruned_loss=0.05265, over 972694.58 frames.], batch size: 21, lr: 6.87e-04 2022-05-04 06:42:12,563 INFO [train.py:715] (0/8) Epoch 2, batch 15600, loss[loss=0.1635, simple_loss=0.2253, pruned_loss=0.05085, over 4943.00 frames.], tot_loss[loss=0.1722, simple_loss=0.239, pruned_loss=0.05266, over 972320.02 frames.], batch size: 21, lr: 6.87e-04 2022-05-04 06:42:52,375 INFO [train.py:715] (0/8) Epoch 2, batch 15650, loss[loss=0.1542, simple_loss=0.2293, pruned_loss=0.03957, over 4982.00 frames.], tot_loss[loss=0.1717, simple_loss=0.2387, pruned_loss=0.05233, over 972004.84 frames.], batch size: 28, lr: 6.86e-04 2022-05-04 06:43:33,100 INFO 
[train.py:715] (0/8) Epoch 2, batch 15700, loss[loss=0.1951, simple_loss=0.2631, pruned_loss=0.06359, over 4969.00 frames.], tot_loss[loss=0.1714, simple_loss=0.238, pruned_loss=0.0524, over 972454.54 frames.], batch size: 35, lr: 6.86e-04 2022-05-04 06:44:13,633 INFO [train.py:715] (0/8) Epoch 2, batch 15750, loss[loss=0.1699, simple_loss=0.2345, pruned_loss=0.05267, over 4750.00 frames.], tot_loss[loss=0.1721, simple_loss=0.2387, pruned_loss=0.05275, over 972322.29 frames.], batch size: 16, lr: 6.86e-04 2022-05-04 06:44:52,974 INFO [train.py:715] (0/8) Epoch 2, batch 15800, loss[loss=0.1439, simple_loss=0.2127, pruned_loss=0.03758, over 4829.00 frames.], tot_loss[loss=0.172, simple_loss=0.2383, pruned_loss=0.05281, over 972183.41 frames.], batch size: 15, lr: 6.86e-04 2022-05-04 06:45:33,633 INFO [train.py:715] (0/8) Epoch 2, batch 15850, loss[loss=0.1849, simple_loss=0.2492, pruned_loss=0.06032, over 4731.00 frames.], tot_loss[loss=0.1722, simple_loss=0.2388, pruned_loss=0.05282, over 971962.12 frames.], batch size: 16, lr: 6.86e-04 2022-05-04 06:46:14,113 INFO [train.py:715] (0/8) Epoch 2, batch 15900, loss[loss=0.1617, simple_loss=0.2393, pruned_loss=0.04199, over 4946.00 frames.], tot_loss[loss=0.1722, simple_loss=0.2386, pruned_loss=0.05292, over 971953.82 frames.], batch size: 24, lr: 6.85e-04 2022-05-04 06:46:53,880 INFO [train.py:715] (0/8) Epoch 2, batch 15950, loss[loss=0.1866, simple_loss=0.242, pruned_loss=0.06562, over 4858.00 frames.], tot_loss[loss=0.1727, simple_loss=0.239, pruned_loss=0.05318, over 972614.58 frames.], batch size: 30, lr: 6.85e-04 2022-05-04 06:47:34,111 INFO [train.py:715] (0/8) Epoch 2, batch 16000, loss[loss=0.2068, simple_loss=0.2623, pruned_loss=0.07569, over 4953.00 frames.], tot_loss[loss=0.1727, simple_loss=0.239, pruned_loss=0.05326, over 972043.21 frames.], batch size: 21, lr: 6.85e-04 2022-05-04 06:48:14,443 INFO [train.py:715] (0/8) Epoch 2, batch 16050, loss[loss=0.1564, simple_loss=0.2326, pruned_loss=0.04008, over 4975.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2395, pruned_loss=0.05304, over 972128.09 frames.], batch size: 28, lr: 6.85e-04 2022-05-04 06:48:54,890 INFO [train.py:715] (0/8) Epoch 2, batch 16100, loss[loss=0.1566, simple_loss=0.2193, pruned_loss=0.04696, over 4891.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2394, pruned_loss=0.05308, over 972380.30 frames.], batch size: 19, lr: 6.85e-04 2022-05-04 06:49:34,157 INFO [train.py:715] (0/8) Epoch 2, batch 16150, loss[loss=0.1633, simple_loss=0.2239, pruned_loss=0.05132, over 4872.00 frames.], tot_loss[loss=0.1714, simple_loss=0.2382, pruned_loss=0.05229, over 972323.66 frames.], batch size: 16, lr: 6.84e-04 2022-05-04 06:50:14,548 INFO [train.py:715] (0/8) Epoch 2, batch 16200, loss[loss=0.2107, simple_loss=0.2755, pruned_loss=0.07293, over 4864.00 frames.], tot_loss[loss=0.1712, simple_loss=0.2382, pruned_loss=0.05212, over 972926.13 frames.], batch size: 34, lr: 6.84e-04 2022-05-04 06:50:54,948 INFO [train.py:715] (0/8) Epoch 2, batch 16250, loss[loss=0.2049, simple_loss=0.2657, pruned_loss=0.07208, over 4771.00 frames.], tot_loss[loss=0.1714, simple_loss=0.238, pruned_loss=0.05238, over 972533.38 frames.], batch size: 17, lr: 6.84e-04 2022-05-04 06:51:34,795 INFO [train.py:715] (0/8) Epoch 2, batch 16300, loss[loss=0.1796, simple_loss=0.2516, pruned_loss=0.05386, over 4757.00 frames.], tot_loss[loss=0.1719, simple_loss=0.2384, pruned_loss=0.05268, over 972440.70 frames.], batch size: 16, lr: 6.84e-04 2022-05-04 06:52:14,671 INFO [train.py:715] (0/8) Epoch 2, batch 
16350, loss[loss=0.2049, simple_loss=0.2523, pruned_loss=0.07878, over 4785.00 frames.], tot_loss[loss=0.1724, simple_loss=0.2388, pruned_loss=0.05306, over 971830.38 frames.], batch size: 17, lr: 6.84e-04 2022-05-04 06:52:55,176 INFO [train.py:715] (0/8) Epoch 2, batch 16400, loss[loss=0.1653, simple_loss=0.2353, pruned_loss=0.04765, over 4871.00 frames.], tot_loss[loss=0.1721, simple_loss=0.2387, pruned_loss=0.05272, over 972477.10 frames.], batch size: 30, lr: 6.83e-04 2022-05-04 06:53:35,564 INFO [train.py:715] (0/8) Epoch 2, batch 16450, loss[loss=0.1665, simple_loss=0.2261, pruned_loss=0.05344, over 4872.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2393, pruned_loss=0.05317, over 972942.42 frames.], batch size: 32, lr: 6.83e-04 2022-05-04 06:54:15,147 INFO [train.py:715] (0/8) Epoch 2, batch 16500, loss[loss=0.2136, simple_loss=0.2703, pruned_loss=0.07847, over 4761.00 frames.], tot_loss[loss=0.1717, simple_loss=0.2386, pruned_loss=0.05238, over 972691.69 frames.], batch size: 16, lr: 6.83e-04 2022-05-04 06:54:56,128 INFO [train.py:715] (0/8) Epoch 2, batch 16550, loss[loss=0.1399, simple_loss=0.2124, pruned_loss=0.03371, over 4776.00 frames.], tot_loss[loss=0.1722, simple_loss=0.2388, pruned_loss=0.05279, over 972718.44 frames.], batch size: 12, lr: 6.83e-04 2022-05-04 06:55:36,866 INFO [train.py:715] (0/8) Epoch 2, batch 16600, loss[loss=0.1844, simple_loss=0.2671, pruned_loss=0.05086, over 4794.00 frames.], tot_loss[loss=0.1723, simple_loss=0.2389, pruned_loss=0.0529, over 972926.28 frames.], batch size: 24, lr: 6.83e-04 2022-05-04 06:56:16,716 INFO [train.py:715] (0/8) Epoch 2, batch 16650, loss[loss=0.2145, simple_loss=0.2629, pruned_loss=0.08309, over 4793.00 frames.], tot_loss[loss=0.1729, simple_loss=0.2391, pruned_loss=0.05331, over 973171.62 frames.], batch size: 24, lr: 6.82e-04 2022-05-04 06:56:57,164 INFO [train.py:715] (0/8) Epoch 2, batch 16700, loss[loss=0.1656, simple_loss=0.2334, pruned_loss=0.04886, over 4693.00 frames.], tot_loss[loss=0.172, simple_loss=0.2384, pruned_loss=0.05281, over 972046.85 frames.], batch size: 15, lr: 6.82e-04 2022-05-04 06:57:37,914 INFO [train.py:715] (0/8) Epoch 2, batch 16750, loss[loss=0.156, simple_loss=0.227, pruned_loss=0.0425, over 4925.00 frames.], tot_loss[loss=0.1712, simple_loss=0.2377, pruned_loss=0.05236, over 971812.22 frames.], batch size: 29, lr: 6.82e-04 2022-05-04 06:58:18,621 INFO [train.py:715] (0/8) Epoch 2, batch 16800, loss[loss=0.1682, simple_loss=0.2305, pruned_loss=0.05295, over 4985.00 frames.], tot_loss[loss=0.171, simple_loss=0.2375, pruned_loss=0.05224, over 972335.00 frames.], batch size: 33, lr: 6.82e-04 2022-05-04 06:58:58,049 INFO [train.py:715] (0/8) Epoch 2, batch 16850, loss[loss=0.1806, simple_loss=0.2454, pruned_loss=0.0579, over 4853.00 frames.], tot_loss[loss=0.1705, simple_loss=0.2369, pruned_loss=0.05209, over 972931.21 frames.], batch size: 20, lr: 6.82e-04 2022-05-04 06:59:39,311 INFO [train.py:715] (0/8) Epoch 2, batch 16900, loss[loss=0.1701, simple_loss=0.2479, pruned_loss=0.04615, over 4831.00 frames.], tot_loss[loss=0.1721, simple_loss=0.2382, pruned_loss=0.05295, over 973010.99 frames.], batch size: 26, lr: 6.81e-04 2022-05-04 07:00:20,138 INFO [train.py:715] (0/8) Epoch 2, batch 16950, loss[loss=0.15, simple_loss=0.204, pruned_loss=0.04796, over 4773.00 frames.], tot_loss[loss=0.1725, simple_loss=0.2383, pruned_loss=0.05332, over 973362.80 frames.], batch size: 17, lr: 6.81e-04 2022-05-04 07:00:59,946 INFO [train.py:715] (0/8) Epoch 2, batch 17000, loss[loss=0.1651, 
simple_loss=0.2264, pruned_loss=0.0519, over 4745.00 frames.], tot_loss[loss=0.1732, simple_loss=0.2392, pruned_loss=0.0536, over 973906.28 frames.], batch size: 19, lr: 6.81e-04 2022-05-04 07:01:40,377 INFO [train.py:715] (0/8) Epoch 2, batch 17050, loss[loss=0.1497, simple_loss=0.2271, pruned_loss=0.03609, over 4790.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2391, pruned_loss=0.05324, over 974017.15 frames.], batch size: 18, lr: 6.81e-04 2022-05-04 07:02:20,963 INFO [train.py:715] (0/8) Epoch 2, batch 17100, loss[loss=0.1737, simple_loss=0.2425, pruned_loss=0.05238, over 4819.00 frames.], tot_loss[loss=0.1723, simple_loss=0.2386, pruned_loss=0.05297, over 973387.85 frames.], batch size: 25, lr: 6.81e-04 2022-05-04 07:03:01,193 INFO [train.py:715] (0/8) Epoch 2, batch 17150, loss[loss=0.1655, simple_loss=0.2313, pruned_loss=0.04987, over 4968.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2391, pruned_loss=0.05326, over 973882.46 frames.], batch size: 14, lr: 6.81e-04 2022-05-04 07:03:40,483 INFO [train.py:715] (0/8) Epoch 2, batch 17200, loss[loss=0.1534, simple_loss=0.2302, pruned_loss=0.03829, over 4924.00 frames.], tot_loss[loss=0.1739, simple_loss=0.2398, pruned_loss=0.05398, over 973304.92 frames.], batch size: 17, lr: 6.80e-04 2022-05-04 07:04:20,887 INFO [train.py:715] (0/8) Epoch 2, batch 17250, loss[loss=0.1587, simple_loss=0.2335, pruned_loss=0.04194, over 4642.00 frames.], tot_loss[loss=0.1729, simple_loss=0.239, pruned_loss=0.05346, over 972758.23 frames.], batch size: 13, lr: 6.80e-04 2022-05-04 07:05:01,348 INFO [train.py:715] (0/8) Epoch 2, batch 17300, loss[loss=0.178, simple_loss=0.2462, pruned_loss=0.05489, over 4982.00 frames.], tot_loss[loss=0.1738, simple_loss=0.2394, pruned_loss=0.0541, over 972537.21 frames.], batch size: 26, lr: 6.80e-04 2022-05-04 07:05:40,925 INFO [train.py:715] (0/8) Epoch 2, batch 17350, loss[loss=0.1867, simple_loss=0.2535, pruned_loss=0.05993, over 4935.00 frames.], tot_loss[loss=0.1724, simple_loss=0.2387, pruned_loss=0.05305, over 972076.56 frames.], batch size: 21, lr: 6.80e-04 2022-05-04 07:06:20,386 INFO [train.py:715] (0/8) Epoch 2, batch 17400, loss[loss=0.1592, simple_loss=0.2262, pruned_loss=0.04613, over 4955.00 frames.], tot_loss[loss=0.1722, simple_loss=0.2385, pruned_loss=0.05289, over 972796.84 frames.], batch size: 29, lr: 6.80e-04 2022-05-04 07:07:00,340 INFO [train.py:715] (0/8) Epoch 2, batch 17450, loss[loss=0.154, simple_loss=0.228, pruned_loss=0.03996, over 4795.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2387, pruned_loss=0.05345, over 972640.21 frames.], batch size: 21, lr: 6.79e-04 2022-05-04 07:07:40,091 INFO [train.py:715] (0/8) Epoch 2, batch 17500, loss[loss=0.1686, simple_loss=0.244, pruned_loss=0.04657, over 4945.00 frames.], tot_loss[loss=0.1724, simple_loss=0.2383, pruned_loss=0.05322, over 972056.87 frames.], batch size: 29, lr: 6.79e-04 2022-05-04 07:08:18,852 INFO [train.py:715] (0/8) Epoch 2, batch 17550, loss[loss=0.17, simple_loss=0.2475, pruned_loss=0.04621, over 4794.00 frames.], tot_loss[loss=0.1708, simple_loss=0.2373, pruned_loss=0.05212, over 972120.37 frames.], batch size: 24, lr: 6.79e-04 2022-05-04 07:08:58,966 INFO [train.py:715] (0/8) Epoch 2, batch 17600, loss[loss=0.1935, simple_loss=0.2467, pruned_loss=0.07013, over 4935.00 frames.], tot_loss[loss=0.1714, simple_loss=0.2376, pruned_loss=0.0526, over 973173.34 frames.], batch size: 35, lr: 6.79e-04 2022-05-04 07:09:38,386 INFO [train.py:715] (0/8) Epoch 2, batch 17650, loss[loss=0.2153, simple_loss=0.2759, pruned_loss=0.07738, 
over 4923.00 frames.], tot_loss[loss=0.1705, simple_loss=0.2369, pruned_loss=0.05208, over 972049.70 frames.], batch size: 39, lr: 6.79e-04 2022-05-04 07:10:17,893 INFO [train.py:715] (0/8) Epoch 2, batch 17700, loss[loss=0.1999, simple_loss=0.2645, pruned_loss=0.06764, over 4841.00 frames.], tot_loss[loss=0.1708, simple_loss=0.2371, pruned_loss=0.05225, over 971960.84 frames.], batch size: 20, lr: 6.78e-04 2022-05-04 07:10:57,826 INFO [train.py:715] (0/8) Epoch 2, batch 17750, loss[loss=0.1787, simple_loss=0.2487, pruned_loss=0.05436, over 4722.00 frames.], tot_loss[loss=0.1712, simple_loss=0.2376, pruned_loss=0.05241, over 972416.32 frames.], batch size: 15, lr: 6.78e-04 2022-05-04 07:11:37,689 INFO [train.py:715] (0/8) Epoch 2, batch 17800, loss[loss=0.1676, simple_loss=0.2373, pruned_loss=0.04897, over 4908.00 frames.], tot_loss[loss=0.1709, simple_loss=0.2373, pruned_loss=0.05226, over 972121.09 frames.], batch size: 18, lr: 6.78e-04 2022-05-04 07:12:17,976 INFO [train.py:715] (0/8) Epoch 2, batch 17850, loss[loss=0.1882, simple_loss=0.2623, pruned_loss=0.05703, over 4929.00 frames.], tot_loss[loss=0.1715, simple_loss=0.2379, pruned_loss=0.05254, over 972254.38 frames.], batch size: 29, lr: 6.78e-04 2022-05-04 07:12:56,811 INFO [train.py:715] (0/8) Epoch 2, batch 17900, loss[loss=0.2098, simple_loss=0.2725, pruned_loss=0.07358, over 4969.00 frames.], tot_loss[loss=0.1718, simple_loss=0.2382, pruned_loss=0.05277, over 972135.32 frames.], batch size: 15, lr: 6.78e-04 2022-05-04 07:13:36,733 INFO [train.py:715] (0/8) Epoch 2, batch 17950, loss[loss=0.1456, simple_loss=0.215, pruned_loss=0.03814, over 4795.00 frames.], tot_loss[loss=0.1705, simple_loss=0.2373, pruned_loss=0.05189, over 972428.90 frames.], batch size: 25, lr: 6.77e-04 2022-05-04 07:14:16,910 INFO [train.py:715] (0/8) Epoch 2, batch 18000, loss[loss=0.1804, simple_loss=0.2366, pruned_loss=0.06209, over 4966.00 frames.], tot_loss[loss=0.1714, simple_loss=0.2374, pruned_loss=0.05272, over 971541.92 frames.], batch size: 14, lr: 6.77e-04 2022-05-04 07:14:16,911 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 07:14:26,628 INFO [train.py:742] (0/8) Epoch 2, validation: loss=0.1173, simple_loss=0.2039, pruned_loss=0.01538, over 914524.00 frames. 
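The per-batch records above share a fixed layout (instantaneous loss[...], running tot_loss[...], batch size, lr), and the periodic "validation:" records do as well, so both can be extracted from the raw log for plotting or comparison across runs. A minimal parsing sketch using only the Python standard library; the log path and the regular expressions are illustrative, written for the format shown here rather than taken from the recipe:

    import re

    LOG_PATH = "train-log.txt"  # hypothetical path; point this at the actual log file

    # Running training loss: "Epoch E, batch B, loss[...], tot_loss[loss=X, ...]"
    train_re = re.compile(r"Epoch (\d+), batch (\d+), .*?tot_loss\[loss=([\d.]+)")
    # Periodic validation records: "Epoch E, validation: loss=X, ..."
    valid_re = re.compile(r"Epoch (\d+), validation: loss=([\d.]+)")

    train_points, valid_points = [], []
    with open(LOG_PATH) as f:
        for line in f:
            # A single physical line may hold several records, so use finditer.
            for m in train_re.finditer(line):
                train_points.append((int(m.group(1)), int(m.group(2)), float(m.group(3))))
            for m in valid_re.finditer(line):
                valid_points.append((int(m.group(1)), float(m.group(2))))

    print(train_points[-1], valid_points[-1])

Over this stretch of epoch 2, the running tot_loss stays around 0.17 while the validation loss stays near 0.117.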
2022-05-04 07:15:07,354 INFO [train.py:715] (0/8) Epoch 2, batch 18050, loss[loss=0.1461, simple_loss=0.2047, pruned_loss=0.04374, over 4823.00 frames.], tot_loss[loss=0.1707, simple_loss=0.237, pruned_loss=0.05226, over 972386.77 frames.], batch size: 12, lr: 6.77e-04 2022-05-04 07:15:46,537 INFO [train.py:715] (0/8) Epoch 2, batch 18100, loss[loss=0.1678, simple_loss=0.2373, pruned_loss=0.04915, over 4917.00 frames.], tot_loss[loss=0.1716, simple_loss=0.238, pruned_loss=0.05263, over 972716.15 frames.], batch size: 23, lr: 6.77e-04 2022-05-04 07:16:27,431 INFO [train.py:715] (0/8) Epoch 2, batch 18150, loss[loss=0.2085, simple_loss=0.2804, pruned_loss=0.06827, over 4696.00 frames.], tot_loss[loss=0.1722, simple_loss=0.2386, pruned_loss=0.05292, over 972597.24 frames.], batch size: 15, lr: 6.77e-04 2022-05-04 07:17:08,381 INFO [train.py:715] (0/8) Epoch 2, batch 18200, loss[loss=0.1533, simple_loss=0.2177, pruned_loss=0.04451, over 4823.00 frames.], tot_loss[loss=0.1714, simple_loss=0.2379, pruned_loss=0.05247, over 971946.96 frames.], batch size: 15, lr: 6.76e-04 2022-05-04 07:17:49,839 INFO [train.py:715] (0/8) Epoch 2, batch 18250, loss[loss=0.153, simple_loss=0.2246, pruned_loss=0.04075, over 4908.00 frames.], tot_loss[loss=0.1716, simple_loss=0.238, pruned_loss=0.05265, over 972067.55 frames.], batch size: 17, lr: 6.76e-04 2022-05-04 07:18:30,283 INFO [train.py:715] (0/8) Epoch 2, batch 18300, loss[loss=0.1528, simple_loss=0.2243, pruned_loss=0.04069, over 4941.00 frames.], tot_loss[loss=0.1708, simple_loss=0.2372, pruned_loss=0.0522, over 971776.92 frames.], batch size: 29, lr: 6.76e-04 2022-05-04 07:19:12,142 INFO [train.py:715] (0/8) Epoch 2, batch 18350, loss[loss=0.1808, simple_loss=0.2661, pruned_loss=0.04769, over 4813.00 frames.], tot_loss[loss=0.1708, simple_loss=0.2374, pruned_loss=0.05209, over 971817.46 frames.], batch size: 15, lr: 6.76e-04 2022-05-04 07:19:36,695 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-88000.pt 2022-05-04 07:19:56,507 INFO [train.py:715] (0/8) Epoch 2, batch 18400, loss[loss=0.163, simple_loss=0.2402, pruned_loss=0.04296, over 4919.00 frames.], tot_loss[loss=0.1712, simple_loss=0.2376, pruned_loss=0.05239, over 972935.35 frames.], batch size: 18, lr: 6.76e-04 2022-05-04 07:20:36,600 INFO [train.py:715] (0/8) Epoch 2, batch 18450, loss[loss=0.1662, simple_loss=0.2419, pruned_loss=0.04528, over 4924.00 frames.], tot_loss[loss=0.1722, simple_loss=0.2387, pruned_loss=0.05286, over 973610.23 frames.], batch size: 29, lr: 6.75e-04 2022-05-04 07:21:18,111 INFO [train.py:715] (0/8) Epoch 2, batch 18500, loss[loss=0.1635, simple_loss=0.2351, pruned_loss=0.04593, over 4926.00 frames.], tot_loss[loss=0.1723, simple_loss=0.2386, pruned_loss=0.05303, over 972663.97 frames.], batch size: 17, lr: 6.75e-04 2022-05-04 07:21:59,813 INFO [train.py:715] (0/8) Epoch 2, batch 18550, loss[loss=0.1599, simple_loss=0.2303, pruned_loss=0.04476, over 4909.00 frames.], tot_loss[loss=0.1726, simple_loss=0.2388, pruned_loss=0.05322, over 972733.36 frames.], batch size: 19, lr: 6.75e-04 2022-05-04 07:22:41,518 INFO [train.py:715] (0/8) Epoch 2, batch 18600, loss[loss=0.1714, simple_loss=0.2352, pruned_loss=0.05378, over 4974.00 frames.], tot_loss[loss=0.1719, simple_loss=0.2381, pruned_loss=0.05284, over 972768.73 frames.], batch size: 15, lr: 6.75e-04 2022-05-04 07:23:21,835 INFO [train.py:715] (0/8) Epoch 2, batch 18650, loss[loss=0.1646, simple_loss=0.2378, pruned_loss=0.04565, over 4927.00 frames.], tot_loss[loss=0.1718, 
simple_loss=0.2383, pruned_loss=0.05269, over 972970.02 frames.], batch size: 39, lr: 6.75e-04 2022-05-04 07:24:03,490 INFO [train.py:715] (0/8) Epoch 2, batch 18700, loss[loss=0.1671, simple_loss=0.2405, pruned_loss=0.04683, over 4949.00 frames.], tot_loss[loss=0.1725, simple_loss=0.2393, pruned_loss=0.05283, over 973145.29 frames.], batch size: 21, lr: 6.75e-04 2022-05-04 07:24:45,174 INFO [train.py:715] (0/8) Epoch 2, batch 18750, loss[loss=0.2127, simple_loss=0.2686, pruned_loss=0.07837, over 4870.00 frames.], tot_loss[loss=0.1726, simple_loss=0.2392, pruned_loss=0.05301, over 973031.94 frames.], batch size: 32, lr: 6.74e-04 2022-05-04 07:25:25,746 INFO [train.py:715] (0/8) Epoch 2, batch 18800, loss[loss=0.184, simple_loss=0.2506, pruned_loss=0.05871, over 4985.00 frames.], tot_loss[loss=0.1711, simple_loss=0.2378, pruned_loss=0.05223, over 972056.79 frames.], batch size: 26, lr: 6.74e-04 2022-05-04 07:26:06,676 INFO [train.py:715] (0/8) Epoch 2, batch 18850, loss[loss=0.1775, simple_loss=0.2485, pruned_loss=0.05321, over 4793.00 frames.], tot_loss[loss=0.1706, simple_loss=0.2374, pruned_loss=0.05188, over 972617.98 frames.], batch size: 17, lr: 6.74e-04 2022-05-04 07:26:48,086 INFO [train.py:715] (0/8) Epoch 2, batch 18900, loss[loss=0.1506, simple_loss=0.2206, pruned_loss=0.04031, over 4737.00 frames.], tot_loss[loss=0.1721, simple_loss=0.2388, pruned_loss=0.0527, over 972417.25 frames.], batch size: 16, lr: 6.74e-04 2022-05-04 07:27:29,078 INFO [train.py:715] (0/8) Epoch 2, batch 18950, loss[loss=0.1735, simple_loss=0.2445, pruned_loss=0.05129, over 4976.00 frames.], tot_loss[loss=0.173, simple_loss=0.2392, pruned_loss=0.05339, over 971660.39 frames.], batch size: 24, lr: 6.74e-04 2022-05-04 07:28:09,471 INFO [train.py:715] (0/8) Epoch 2, batch 19000, loss[loss=0.1684, simple_loss=0.241, pruned_loss=0.04789, over 4954.00 frames.], tot_loss[loss=0.1737, simple_loss=0.2398, pruned_loss=0.05375, over 971730.20 frames.], batch size: 39, lr: 6.73e-04 2022-05-04 07:28:51,002 INFO [train.py:715] (0/8) Epoch 2, batch 19050, loss[loss=0.166, simple_loss=0.2303, pruned_loss=0.0508, over 4789.00 frames.], tot_loss[loss=0.1727, simple_loss=0.2387, pruned_loss=0.05331, over 971665.92 frames.], batch size: 18, lr: 6.73e-04 2022-05-04 07:29:32,579 INFO [train.py:715] (0/8) Epoch 2, batch 19100, loss[loss=0.1551, simple_loss=0.2192, pruned_loss=0.04551, over 4902.00 frames.], tot_loss[loss=0.172, simple_loss=0.238, pruned_loss=0.05302, over 971428.78 frames.], batch size: 19, lr: 6.73e-04 2022-05-04 07:30:13,201 INFO [train.py:715] (0/8) Epoch 2, batch 19150, loss[loss=0.1626, simple_loss=0.218, pruned_loss=0.05365, over 4929.00 frames.], tot_loss[loss=0.1727, simple_loss=0.2385, pruned_loss=0.05348, over 971390.27 frames.], batch size: 23, lr: 6.73e-04 2022-05-04 07:30:53,911 INFO [train.py:715] (0/8) Epoch 2, batch 19200, loss[loss=0.2169, simple_loss=0.2771, pruned_loss=0.07832, over 4860.00 frames.], tot_loss[loss=0.1729, simple_loss=0.2385, pruned_loss=0.05363, over 971597.41 frames.], batch size: 32, lr: 6.73e-04 2022-05-04 07:31:35,012 INFO [train.py:715] (0/8) Epoch 2, batch 19250, loss[loss=0.1763, simple_loss=0.2522, pruned_loss=0.05021, over 4779.00 frames.], tot_loss[loss=0.1718, simple_loss=0.238, pruned_loss=0.0528, over 971770.54 frames.], batch size: 18, lr: 6.72e-04 2022-05-04 07:32:15,461 INFO [train.py:715] (0/8) Epoch 2, batch 19300, loss[loss=0.1599, simple_loss=0.222, pruned_loss=0.04895, over 4744.00 frames.], tot_loss[loss=0.1718, simple_loss=0.238, pruned_loss=0.05282, 
over 972115.55 frames.], batch size: 16, lr: 6.72e-04 2022-05-04 07:32:55,619 INFO [train.py:715] (0/8) Epoch 2, batch 19350, loss[loss=0.1944, simple_loss=0.2586, pruned_loss=0.06513, over 4797.00 frames.], tot_loss[loss=0.1713, simple_loss=0.2378, pruned_loss=0.0524, over 972341.30 frames.], batch size: 21, lr: 6.72e-04 2022-05-04 07:33:36,561 INFO [train.py:715] (0/8) Epoch 2, batch 19400, loss[loss=0.2254, simple_loss=0.2756, pruned_loss=0.08765, over 4812.00 frames.], tot_loss[loss=0.1711, simple_loss=0.2378, pruned_loss=0.05225, over 972128.74 frames.], batch size: 25, lr: 6.72e-04 2022-05-04 07:34:18,485 INFO [train.py:715] (0/8) Epoch 2, batch 19450, loss[loss=0.1643, simple_loss=0.2485, pruned_loss=0.04008, over 4833.00 frames.], tot_loss[loss=0.1712, simple_loss=0.238, pruned_loss=0.05222, over 971626.31 frames.], batch size: 26, lr: 6.72e-04 2022-05-04 07:34:58,703 INFO [train.py:715] (0/8) Epoch 2, batch 19500, loss[loss=0.1662, simple_loss=0.2399, pruned_loss=0.04628, over 4874.00 frames.], tot_loss[loss=0.1708, simple_loss=0.2379, pruned_loss=0.05188, over 972187.15 frames.], batch size: 20, lr: 6.72e-04 2022-05-04 07:35:38,987 INFO [train.py:715] (0/8) Epoch 2, batch 19550, loss[loss=0.1656, simple_loss=0.2346, pruned_loss=0.04824, over 4947.00 frames.], tot_loss[loss=0.1705, simple_loss=0.2372, pruned_loss=0.05185, over 971919.18 frames.], batch size: 21, lr: 6.71e-04 2022-05-04 07:36:20,458 INFO [train.py:715] (0/8) Epoch 2, batch 19600, loss[loss=0.1985, simple_loss=0.2552, pruned_loss=0.07092, over 4990.00 frames.], tot_loss[loss=0.1702, simple_loss=0.237, pruned_loss=0.05166, over 971436.19 frames.], batch size: 25, lr: 6.71e-04 2022-05-04 07:37:01,116 INFO [train.py:715] (0/8) Epoch 2, batch 19650, loss[loss=0.1631, simple_loss=0.2361, pruned_loss=0.04507, over 4737.00 frames.], tot_loss[loss=0.1687, simple_loss=0.2355, pruned_loss=0.05098, over 971777.76 frames.], batch size: 16, lr: 6.71e-04 2022-05-04 07:37:40,955 INFO [train.py:715] (0/8) Epoch 2, batch 19700, loss[loss=0.1537, simple_loss=0.2293, pruned_loss=0.03908, over 4841.00 frames.], tot_loss[loss=0.1689, simple_loss=0.2357, pruned_loss=0.05108, over 972640.79 frames.], batch size: 15, lr: 6.71e-04 2022-05-04 07:38:21,849 INFO [train.py:715] (0/8) Epoch 2, batch 19750, loss[loss=0.1964, simple_loss=0.2693, pruned_loss=0.0617, over 4892.00 frames.], tot_loss[loss=0.1688, simple_loss=0.236, pruned_loss=0.05079, over 973316.30 frames.], batch size: 16, lr: 6.71e-04 2022-05-04 07:39:02,975 INFO [train.py:715] (0/8) Epoch 2, batch 19800, loss[loss=0.1945, simple_loss=0.2441, pruned_loss=0.07243, over 4983.00 frames.], tot_loss[loss=0.1692, simple_loss=0.2364, pruned_loss=0.05102, over 972824.44 frames.], batch size: 35, lr: 6.70e-04 2022-05-04 07:39:42,775 INFO [train.py:715] (0/8) Epoch 2, batch 19850, loss[loss=0.1616, simple_loss=0.2362, pruned_loss=0.04346, over 4925.00 frames.], tot_loss[loss=0.1686, simple_loss=0.2356, pruned_loss=0.05077, over 971489.92 frames.], batch size: 29, lr: 6.70e-04 2022-05-04 07:40:23,488 INFO [train.py:715] (0/8) Epoch 2, batch 19900, loss[loss=0.1863, simple_loss=0.2434, pruned_loss=0.06458, over 4863.00 frames.], tot_loss[loss=0.17, simple_loss=0.2366, pruned_loss=0.05168, over 970951.59 frames.], batch size: 32, lr: 6.70e-04 2022-05-04 07:41:04,462 INFO [train.py:715] (0/8) Epoch 2, batch 19950, loss[loss=0.1495, simple_loss=0.2136, pruned_loss=0.04275, over 4828.00 frames.], tot_loss[loss=0.1692, simple_loss=0.236, pruned_loss=0.05117, over 971295.79 frames.], batch size: 
13, lr: 6.70e-04 2022-05-04 07:41:44,814 INFO [train.py:715] (0/8) Epoch 2, batch 20000, loss[loss=0.1432, simple_loss=0.226, pruned_loss=0.03021, over 4779.00 frames.], tot_loss[loss=0.1695, simple_loss=0.2364, pruned_loss=0.05129, over 971405.57 frames.], batch size: 17, lr: 6.70e-04 2022-05-04 07:42:25,552 INFO [train.py:715] (0/8) Epoch 2, batch 20050, loss[loss=0.1628, simple_loss=0.2316, pruned_loss=0.04704, over 4805.00 frames.], tot_loss[loss=0.169, simple_loss=0.2361, pruned_loss=0.05091, over 971314.80 frames.], batch size: 25, lr: 6.69e-04 2022-05-04 07:43:06,876 INFO [train.py:715] (0/8) Epoch 2, batch 20100, loss[loss=0.2026, simple_loss=0.2727, pruned_loss=0.06625, over 4902.00 frames.], tot_loss[loss=0.17, simple_loss=0.2372, pruned_loss=0.05146, over 971686.14 frames.], batch size: 17, lr: 6.69e-04 2022-05-04 07:43:48,587 INFO [train.py:715] (0/8) Epoch 2, batch 20150, loss[loss=0.1307, simple_loss=0.2003, pruned_loss=0.03057, over 4991.00 frames.], tot_loss[loss=0.1705, simple_loss=0.2375, pruned_loss=0.05178, over 971587.39 frames.], batch size: 14, lr: 6.69e-04 2022-05-04 07:44:28,882 INFO [train.py:715] (0/8) Epoch 2, batch 20200, loss[loss=0.1485, simple_loss=0.2131, pruned_loss=0.04197, over 4742.00 frames.], tot_loss[loss=0.1715, simple_loss=0.2386, pruned_loss=0.05223, over 971889.54 frames.], batch size: 16, lr: 6.69e-04 2022-05-04 07:45:10,321 INFO [train.py:715] (0/8) Epoch 2, batch 20250, loss[loss=0.1486, simple_loss=0.2115, pruned_loss=0.0428, over 4945.00 frames.], tot_loss[loss=0.1712, simple_loss=0.2385, pruned_loss=0.05197, over 971765.46 frames.], batch size: 29, lr: 6.69e-04 2022-05-04 07:45:52,275 INFO [train.py:715] (0/8) Epoch 2, batch 20300, loss[loss=0.1587, simple_loss=0.2328, pruned_loss=0.0423, over 4862.00 frames.], tot_loss[loss=0.1724, simple_loss=0.2395, pruned_loss=0.05268, over 971762.57 frames.], batch size: 38, lr: 6.69e-04 2022-05-04 07:46:33,100 INFO [train.py:715] (0/8) Epoch 2, batch 20350, loss[loss=0.1672, simple_loss=0.2399, pruned_loss=0.04721, over 4957.00 frames.], tot_loss[loss=0.1714, simple_loss=0.2386, pruned_loss=0.05209, over 972056.34 frames.], batch size: 15, lr: 6.68e-04 2022-05-04 07:47:14,090 INFO [train.py:715] (0/8) Epoch 2, batch 20400, loss[loss=0.1657, simple_loss=0.2321, pruned_loss=0.04962, over 4806.00 frames.], tot_loss[loss=0.1712, simple_loss=0.2382, pruned_loss=0.05208, over 971671.81 frames.], batch size: 21, lr: 6.68e-04 2022-05-04 07:47:56,154 INFO [train.py:715] (0/8) Epoch 2, batch 20450, loss[loss=0.1697, simple_loss=0.2311, pruned_loss=0.05413, over 4966.00 frames.], tot_loss[loss=0.1709, simple_loss=0.2379, pruned_loss=0.05193, over 971685.49 frames.], batch size: 39, lr: 6.68e-04 2022-05-04 07:48:37,718 INFO [train.py:715] (0/8) Epoch 2, batch 20500, loss[loss=0.2123, simple_loss=0.2698, pruned_loss=0.07739, over 4828.00 frames.], tot_loss[loss=0.1699, simple_loss=0.2365, pruned_loss=0.0516, over 971294.66 frames.], batch size: 15, lr: 6.68e-04 2022-05-04 07:49:18,513 INFO [train.py:715] (0/8) Epoch 2, batch 20550, loss[loss=0.148, simple_loss=0.2247, pruned_loss=0.0357, over 4888.00 frames.], tot_loss[loss=0.1709, simple_loss=0.2378, pruned_loss=0.05198, over 971586.73 frames.], batch size: 19, lr: 6.68e-04 2022-05-04 07:49:59,711 INFO [train.py:715] (0/8) Epoch 2, batch 20600, loss[loss=0.2006, simple_loss=0.2521, pruned_loss=0.07451, over 4742.00 frames.], tot_loss[loss=0.1722, simple_loss=0.2389, pruned_loss=0.05274, over 971466.57 frames.], batch size: 16, lr: 6.67e-04 2022-05-04 07:50:41,276 
INFO [train.py:715] (0/8) Epoch 2, batch 20650, loss[loss=0.1495, simple_loss=0.2213, pruned_loss=0.03879, over 4766.00 frames.], tot_loss[loss=0.1725, simple_loss=0.2392, pruned_loss=0.0529, over 971533.80 frames.], batch size: 18, lr: 6.67e-04 2022-05-04 07:51:22,513 INFO [train.py:715] (0/8) Epoch 2, batch 20700, loss[loss=0.1857, simple_loss=0.2514, pruned_loss=0.06, over 4931.00 frames.], tot_loss[loss=0.1724, simple_loss=0.2391, pruned_loss=0.05286, over 972153.50 frames.], batch size: 23, lr: 6.67e-04 2022-05-04 07:52:03,039 INFO [train.py:715] (0/8) Epoch 2, batch 20750, loss[loss=0.1744, simple_loss=0.239, pruned_loss=0.05487, over 4844.00 frames.], tot_loss[loss=0.1728, simple_loss=0.2392, pruned_loss=0.05315, over 970987.66 frames.], batch size: 15, lr: 6.67e-04 2022-05-04 07:52:44,288 INFO [train.py:715] (0/8) Epoch 2, batch 20800, loss[loss=0.2178, simple_loss=0.2862, pruned_loss=0.07472, over 4818.00 frames.], tot_loss[loss=0.1724, simple_loss=0.2386, pruned_loss=0.05307, over 971936.20 frames.], batch size: 25, lr: 6.67e-04 2022-05-04 07:53:25,488 INFO [train.py:715] (0/8) Epoch 2, batch 20850, loss[loss=0.1898, simple_loss=0.2618, pruned_loss=0.05888, over 4953.00 frames.], tot_loss[loss=0.1708, simple_loss=0.2373, pruned_loss=0.05213, over 972011.15 frames.], batch size: 15, lr: 6.66e-04 2022-05-04 07:54:06,144 INFO [train.py:715] (0/8) Epoch 2, batch 20900, loss[loss=0.1511, simple_loss=0.2086, pruned_loss=0.04684, over 4857.00 frames.], tot_loss[loss=0.171, simple_loss=0.2377, pruned_loss=0.0522, over 972170.66 frames.], batch size: 34, lr: 6.66e-04 2022-05-04 07:54:47,198 INFO [train.py:715] (0/8) Epoch 2, batch 20950, loss[loss=0.1589, simple_loss=0.221, pruned_loss=0.04846, over 4955.00 frames.], tot_loss[loss=0.1702, simple_loss=0.2369, pruned_loss=0.0518, over 972123.63 frames.], batch size: 14, lr: 6.66e-04 2022-05-04 07:55:28,395 INFO [train.py:715] (0/8) Epoch 2, batch 21000, loss[loss=0.1775, simple_loss=0.2398, pruned_loss=0.05758, over 4980.00 frames.], tot_loss[loss=0.1703, simple_loss=0.2372, pruned_loss=0.05167, over 973181.21 frames.], batch size: 25, lr: 6.66e-04 2022-05-04 07:55:28,397 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 07:55:39,045 INFO [train.py:742] (0/8) Epoch 2, validation: loss=0.1174, simple_loss=0.2036, pruned_loss=0.01562, over 914524.00 frames. 
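The learning rate printed with each record decays slowly over this section, from 6.89e-04 near batch 15050 down to 6.66e-04 by batch 21000. One schedule consistent with both the decay and this run's initial_lr/lr_batches/lr_epochs settings is icefall's Eden scheduler; treating its formula as an assumption rather than a quote of the recipe's optim.py, the logged values can be reproduced from the global batch counter (about 88000 where checkpoint-88000.pt was written above):

    # Assumed Eden-style schedule:
    #   lr = initial_lr * ((batch^2 + lr_batches^2) / lr_batches^2) ** -0.25
    #                   * ((epoch^2 + lr_epochs^2) / lr_epochs^2) ** -0.25
    initial_lr, lr_batches, lr_epochs = 0.003, 5000, 4  # values from this run's configuration

    def eden_lr(batch: int, epoch: int) -> float:
        batch_factor = ((batch ** 2 + lr_batches ** 2) / lr_batches ** 2) ** -0.25
        epoch_factor = ((epoch ** 2 + lr_epochs ** 2) / lr_epochs ** 2) ** -0.25
        return initial_lr * batch_factor * epoch_factor

    # Around the checkpoint at global batch 88000 (epoch 2) the log prints lr: 6.76e-04.
    print(f"{eden_lr(88000, 2):.2e}")  # -> 6.76e-04

The close agreement suggests the printed lr is simply this closed-form schedule evaluated at the current global batch and epoch.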
2022-05-04 07:56:20,528 INFO [train.py:715] (0/8) Epoch 2, batch 21050, loss[loss=0.1683, simple_loss=0.2239, pruned_loss=0.05633, over 4982.00 frames.], tot_loss[loss=0.1688, simple_loss=0.2358, pruned_loss=0.05088, over 972418.92 frames.], batch size: 14, lr: 6.66e-04 2022-05-04 07:57:00,991 INFO [train.py:715] (0/8) Epoch 2, batch 21100, loss[loss=0.1592, simple_loss=0.2308, pruned_loss=0.04379, over 4824.00 frames.], tot_loss[loss=0.1685, simple_loss=0.2358, pruned_loss=0.05055, over 972567.99 frames.], batch size: 15, lr: 6.66e-04 2022-05-04 07:57:41,500 INFO [train.py:715] (0/8) Epoch 2, batch 21150, loss[loss=0.1644, simple_loss=0.2395, pruned_loss=0.04471, over 4939.00 frames.], tot_loss[loss=0.168, simple_loss=0.2351, pruned_loss=0.05048, over 973288.44 frames.], batch size: 29, lr: 6.65e-04 2022-05-04 07:58:22,043 INFO [train.py:715] (0/8) Epoch 2, batch 21200, loss[loss=0.1428, simple_loss=0.2148, pruned_loss=0.03541, over 4832.00 frames.], tot_loss[loss=0.1695, simple_loss=0.2365, pruned_loss=0.05127, over 973490.00 frames.], batch size: 30, lr: 6.65e-04 2022-05-04 07:59:02,140 INFO [train.py:715] (0/8) Epoch 2, batch 21250, loss[loss=0.1578, simple_loss=0.234, pruned_loss=0.04085, over 4911.00 frames.], tot_loss[loss=0.1704, simple_loss=0.2369, pruned_loss=0.05194, over 973236.66 frames.], batch size: 17, lr: 6.65e-04 2022-05-04 07:59:42,851 INFO [train.py:715] (0/8) Epoch 2, batch 21300, loss[loss=0.2062, simple_loss=0.2739, pruned_loss=0.06928, over 4758.00 frames.], tot_loss[loss=0.1709, simple_loss=0.2372, pruned_loss=0.05228, over 971803.50 frames.], batch size: 16, lr: 6.65e-04 2022-05-04 08:00:23,583 INFO [train.py:715] (0/8) Epoch 2, batch 21350, loss[loss=0.1545, simple_loss=0.2247, pruned_loss=0.04219, over 4956.00 frames.], tot_loss[loss=0.1701, simple_loss=0.2369, pruned_loss=0.05162, over 971601.50 frames.], batch size: 21, lr: 6.65e-04 2022-05-04 08:01:04,869 INFO [train.py:715] (0/8) Epoch 2, batch 21400, loss[loss=0.1857, simple_loss=0.2577, pruned_loss=0.05686, over 4826.00 frames.], tot_loss[loss=0.1706, simple_loss=0.2375, pruned_loss=0.05181, over 971812.70 frames.], batch size: 15, lr: 6.64e-04 2022-05-04 08:01:45,140 INFO [train.py:715] (0/8) Epoch 2, batch 21450, loss[loss=0.1484, simple_loss=0.2279, pruned_loss=0.03445, over 4803.00 frames.], tot_loss[loss=0.1699, simple_loss=0.2369, pruned_loss=0.05149, over 971957.48 frames.], batch size: 21, lr: 6.64e-04 2022-05-04 08:02:26,070 INFO [train.py:715] (0/8) Epoch 2, batch 21500, loss[loss=0.165, simple_loss=0.2348, pruned_loss=0.04762, over 4921.00 frames.], tot_loss[loss=0.1696, simple_loss=0.2366, pruned_loss=0.05126, over 972221.36 frames.], batch size: 23, lr: 6.64e-04 2022-05-04 08:03:07,360 INFO [train.py:715] (0/8) Epoch 2, batch 21550, loss[loss=0.1536, simple_loss=0.2265, pruned_loss=0.04035, over 4821.00 frames.], tot_loss[loss=0.171, simple_loss=0.2373, pruned_loss=0.05239, over 972218.98 frames.], batch size: 21, lr: 6.64e-04 2022-05-04 08:03:47,342 INFO [train.py:715] (0/8) Epoch 2, batch 21600, loss[loss=0.1714, simple_loss=0.2397, pruned_loss=0.05156, over 4820.00 frames.], tot_loss[loss=0.1715, simple_loss=0.2379, pruned_loss=0.05257, over 972925.30 frames.], batch size: 27, lr: 6.64e-04 2022-05-04 08:04:28,569 INFO [train.py:715] (0/8) Epoch 2, batch 21650, loss[loss=0.1724, simple_loss=0.2304, pruned_loss=0.05719, over 4864.00 frames.], tot_loss[loss=0.1709, simple_loss=0.2375, pruned_loss=0.05219, over 973193.42 frames.], batch size: 16, lr: 6.64e-04 2022-05-04 08:05:10,119 INFO 
[train.py:715] (0/8) Epoch 2, batch 21700, loss[loss=0.1411, simple_loss=0.2168, pruned_loss=0.03273, over 4770.00 frames.], tot_loss[loss=0.1718, simple_loss=0.2382, pruned_loss=0.05267, over 972951.76 frames.], batch size: 17, lr: 6.63e-04 2022-05-04 08:05:50,693 INFO [train.py:715] (0/8) Epoch 2, batch 21750, loss[loss=0.2005, simple_loss=0.254, pruned_loss=0.07354, over 4973.00 frames.], tot_loss[loss=0.171, simple_loss=0.2368, pruned_loss=0.05263, over 972996.04 frames.], batch size: 24, lr: 6.63e-04 2022-05-04 08:06:31,761 INFO [train.py:715] (0/8) Epoch 2, batch 21800, loss[loss=0.1699, simple_loss=0.238, pruned_loss=0.0509, over 4945.00 frames.], tot_loss[loss=0.1715, simple_loss=0.2372, pruned_loss=0.05287, over 971733.15 frames.], batch size: 21, lr: 6.63e-04 2022-05-04 08:07:12,181 INFO [train.py:715] (0/8) Epoch 2, batch 21850, loss[loss=0.1946, simple_loss=0.2544, pruned_loss=0.06739, over 4931.00 frames.], tot_loss[loss=0.1723, simple_loss=0.2377, pruned_loss=0.05339, over 970876.82 frames.], batch size: 21, lr: 6.63e-04 2022-05-04 08:07:53,271 INFO [train.py:715] (0/8) Epoch 2, batch 21900, loss[loss=0.1791, simple_loss=0.2509, pruned_loss=0.05365, over 4819.00 frames.], tot_loss[loss=0.1707, simple_loss=0.2365, pruned_loss=0.05247, over 971141.19 frames.], batch size: 26, lr: 6.63e-04 2022-05-04 08:08:33,962 INFO [train.py:715] (0/8) Epoch 2, batch 21950, loss[loss=0.1361, simple_loss=0.204, pruned_loss=0.03408, over 4773.00 frames.], tot_loss[loss=0.171, simple_loss=0.2364, pruned_loss=0.05281, over 971910.70 frames.], batch size: 18, lr: 6.62e-04 2022-05-04 08:09:15,713 INFO [train.py:715] (0/8) Epoch 2, batch 22000, loss[loss=0.2, simple_loss=0.2653, pruned_loss=0.06733, over 4754.00 frames.], tot_loss[loss=0.1695, simple_loss=0.2354, pruned_loss=0.05183, over 972439.50 frames.], batch size: 16, lr: 6.62e-04 2022-05-04 08:09:57,815 INFO [train.py:715] (0/8) Epoch 2, batch 22050, loss[loss=0.1389, simple_loss=0.2044, pruned_loss=0.03671, over 4904.00 frames.], tot_loss[loss=0.1685, simple_loss=0.2347, pruned_loss=0.05118, over 971914.01 frames.], batch size: 17, lr: 6.62e-04 2022-05-04 08:10:38,625 INFO [train.py:715] (0/8) Epoch 2, batch 22100, loss[loss=0.1866, simple_loss=0.2605, pruned_loss=0.0564, over 4928.00 frames.], tot_loss[loss=0.17, simple_loss=0.2362, pruned_loss=0.05193, over 971568.74 frames.], batch size: 39, lr: 6.62e-04 2022-05-04 08:11:20,109 INFO [train.py:715] (0/8) Epoch 2, batch 22150, loss[loss=0.1618, simple_loss=0.2281, pruned_loss=0.04772, over 4901.00 frames.], tot_loss[loss=0.1688, simple_loss=0.2351, pruned_loss=0.05122, over 972286.79 frames.], batch size: 17, lr: 6.62e-04 2022-05-04 08:12:01,872 INFO [train.py:715] (0/8) Epoch 2, batch 22200, loss[loss=0.1667, simple_loss=0.2365, pruned_loss=0.04844, over 4803.00 frames.], tot_loss[loss=0.1691, simple_loss=0.2357, pruned_loss=0.05127, over 972307.98 frames.], batch size: 25, lr: 6.62e-04 2022-05-04 08:12:43,325 INFO [train.py:715] (0/8) Epoch 2, batch 22250, loss[loss=0.1529, simple_loss=0.2257, pruned_loss=0.04009, over 4813.00 frames.], tot_loss[loss=0.1693, simple_loss=0.2362, pruned_loss=0.05119, over 973103.51 frames.], batch size: 25, lr: 6.61e-04 2022-05-04 08:13:24,130 INFO [train.py:715] (0/8) Epoch 2, batch 22300, loss[loss=0.2017, simple_loss=0.2619, pruned_loss=0.0707, over 4753.00 frames.], tot_loss[loss=0.1695, simple_loss=0.2364, pruned_loss=0.05133, over 972710.33 frames.], batch size: 19, lr: 6.61e-04 2022-05-04 08:14:05,209 INFO [train.py:715] (0/8) Epoch 2, batch 22350, 
loss[loss=0.1392, simple_loss=0.2179, pruned_loss=0.03023, over 4792.00 frames.], tot_loss[loss=0.1701, simple_loss=0.2369, pruned_loss=0.05164, over 972252.76 frames.], batch size: 17, lr: 6.61e-04 2022-05-04 08:14:46,093 INFO [train.py:715] (0/8) Epoch 2, batch 22400, loss[loss=0.1635, simple_loss=0.2338, pruned_loss=0.04663, over 4786.00 frames.], tot_loss[loss=0.17, simple_loss=0.2369, pruned_loss=0.05154, over 972795.56 frames.], batch size: 18, lr: 6.61e-04 2022-05-04 08:15:26,444 INFO [train.py:715] (0/8) Epoch 2, batch 22450, loss[loss=0.18, simple_loss=0.2528, pruned_loss=0.05365, over 4970.00 frames.], tot_loss[loss=0.1707, simple_loss=0.2371, pruned_loss=0.05218, over 973351.59 frames.], batch size: 15, lr: 6.61e-04 2022-05-04 08:16:07,658 INFO [train.py:715] (0/8) Epoch 2, batch 22500, loss[loss=0.1801, simple_loss=0.2459, pruned_loss=0.05709, over 4939.00 frames.], tot_loss[loss=0.1706, simple_loss=0.2369, pruned_loss=0.05213, over 972168.78 frames.], batch size: 21, lr: 6.61e-04 2022-05-04 08:16:48,519 INFO [train.py:715] (0/8) Epoch 2, batch 22550, loss[loss=0.1883, simple_loss=0.2575, pruned_loss=0.05955, over 4785.00 frames.], tot_loss[loss=0.171, simple_loss=0.2373, pruned_loss=0.05235, over 973234.30 frames.], batch size: 17, lr: 6.60e-04 2022-05-04 08:17:29,223 INFO [train.py:715] (0/8) Epoch 2, batch 22600, loss[loss=0.1578, simple_loss=0.2272, pruned_loss=0.04419, over 4791.00 frames.], tot_loss[loss=0.1708, simple_loss=0.2366, pruned_loss=0.05248, over 972495.32 frames.], batch size: 12, lr: 6.60e-04 2022-05-04 08:18:09,996 INFO [train.py:715] (0/8) Epoch 2, batch 22650, loss[loss=0.1743, simple_loss=0.2397, pruned_loss=0.0544, over 4947.00 frames.], tot_loss[loss=0.171, simple_loss=0.2375, pruned_loss=0.05224, over 971563.29 frames.], batch size: 21, lr: 6.60e-04 2022-05-04 08:18:50,671 INFO [train.py:715] (0/8) Epoch 2, batch 22700, loss[loss=0.1636, simple_loss=0.2267, pruned_loss=0.05022, over 4876.00 frames.], tot_loss[loss=0.1708, simple_loss=0.2375, pruned_loss=0.05203, over 972251.74 frames.], batch size: 32, lr: 6.60e-04 2022-05-04 08:19:31,394 INFO [train.py:715] (0/8) Epoch 2, batch 22750, loss[loss=0.1677, simple_loss=0.2402, pruned_loss=0.04763, over 4773.00 frames.], tot_loss[loss=0.1713, simple_loss=0.238, pruned_loss=0.05234, over 973205.03 frames.], batch size: 16, lr: 6.60e-04 2022-05-04 08:20:12,247 INFO [train.py:715] (0/8) Epoch 2, batch 22800, loss[loss=0.2022, simple_loss=0.2619, pruned_loss=0.07127, over 4815.00 frames.], tot_loss[loss=0.171, simple_loss=0.2376, pruned_loss=0.05222, over 973461.90 frames.], batch size: 27, lr: 6.59e-04 2022-05-04 08:20:53,295 INFO [train.py:715] (0/8) Epoch 2, batch 22850, loss[loss=0.175, simple_loss=0.2345, pruned_loss=0.05771, over 4788.00 frames.], tot_loss[loss=0.1707, simple_loss=0.2371, pruned_loss=0.05213, over 973705.93 frames.], batch size: 17, lr: 6.59e-04 2022-05-04 08:21:34,653 INFO [train.py:715] (0/8) Epoch 2, batch 22900, loss[loss=0.1791, simple_loss=0.2443, pruned_loss=0.05691, over 4941.00 frames.], tot_loss[loss=0.17, simple_loss=0.237, pruned_loss=0.05147, over 973281.35 frames.], batch size: 29, lr: 6.59e-04 2022-05-04 08:22:15,453 INFO [train.py:715] (0/8) Epoch 2, batch 22950, loss[loss=0.1606, simple_loss=0.2184, pruned_loss=0.05136, over 4874.00 frames.], tot_loss[loss=0.1704, simple_loss=0.2371, pruned_loss=0.05183, over 972856.03 frames.], batch size: 20, lr: 6.59e-04 2022-05-04 08:22:56,045 INFO [train.py:715] (0/8) Epoch 2, batch 23000, loss[loss=0.1911, simple_loss=0.2555, 
pruned_loss=0.06333, over 4864.00 frames.], tot_loss[loss=0.1706, simple_loss=0.2374, pruned_loss=0.05195, over 972661.98 frames.], batch size: 32, lr: 6.59e-04 2022-05-04 08:23:37,059 INFO [train.py:715] (0/8) Epoch 2, batch 23050, loss[loss=0.1744, simple_loss=0.2316, pruned_loss=0.05866, over 4920.00 frames.], tot_loss[loss=0.1708, simple_loss=0.2374, pruned_loss=0.05209, over 972859.95 frames.], batch size: 29, lr: 6.59e-04 2022-05-04 08:24:17,902 INFO [train.py:715] (0/8) Epoch 2, batch 23100, loss[loss=0.1356, simple_loss=0.2039, pruned_loss=0.03361, over 4811.00 frames.], tot_loss[loss=0.1692, simple_loss=0.2366, pruned_loss=0.05094, over 972959.80 frames.], batch size: 25, lr: 6.58e-04 2022-05-04 08:24:58,402 INFO [train.py:715] (0/8) Epoch 2, batch 23150, loss[loss=0.161, simple_loss=0.2402, pruned_loss=0.04093, over 4755.00 frames.], tot_loss[loss=0.1695, simple_loss=0.2368, pruned_loss=0.05113, over 972596.08 frames.], batch size: 19, lr: 6.58e-04 2022-05-04 08:25:39,725 INFO [train.py:715] (0/8) Epoch 2, batch 23200, loss[loss=0.1604, simple_loss=0.2321, pruned_loss=0.04438, over 4865.00 frames.], tot_loss[loss=0.1691, simple_loss=0.2364, pruned_loss=0.05087, over 972352.16 frames.], batch size: 20, lr: 6.58e-04 2022-05-04 08:26:20,395 INFO [train.py:715] (0/8) Epoch 2, batch 23250, loss[loss=0.1713, simple_loss=0.2502, pruned_loss=0.04623, over 4913.00 frames.], tot_loss[loss=0.169, simple_loss=0.2367, pruned_loss=0.05062, over 973495.93 frames.], batch size: 23, lr: 6.58e-04 2022-05-04 08:27:00,747 INFO [train.py:715] (0/8) Epoch 2, batch 23300, loss[loss=0.1507, simple_loss=0.2236, pruned_loss=0.03897, over 4804.00 frames.], tot_loss[loss=0.1682, simple_loss=0.236, pruned_loss=0.05015, over 973156.65 frames.], batch size: 21, lr: 6.58e-04 2022-05-04 08:27:41,447 INFO [train.py:715] (0/8) Epoch 2, batch 23350, loss[loss=0.1635, simple_loss=0.2395, pruned_loss=0.04377, over 4878.00 frames.], tot_loss[loss=0.1683, simple_loss=0.2362, pruned_loss=0.05016, over 972157.30 frames.], batch size: 22, lr: 6.57e-04 2022-05-04 08:28:22,391 INFO [train.py:715] (0/8) Epoch 2, batch 23400, loss[loss=0.1781, simple_loss=0.2308, pruned_loss=0.06268, over 4979.00 frames.], tot_loss[loss=0.1685, simple_loss=0.2358, pruned_loss=0.05059, over 971557.67 frames.], batch size: 14, lr: 6.57e-04 2022-05-04 08:29:03,323 INFO [train.py:715] (0/8) Epoch 2, batch 23450, loss[loss=0.1641, simple_loss=0.2433, pruned_loss=0.04242, over 4930.00 frames.], tot_loss[loss=0.1686, simple_loss=0.2357, pruned_loss=0.05073, over 972578.85 frames.], batch size: 18, lr: 6.57e-04 2022-05-04 08:29:43,618 INFO [train.py:715] (0/8) Epoch 2, batch 23500, loss[loss=0.1658, simple_loss=0.2372, pruned_loss=0.04716, over 4954.00 frames.], tot_loss[loss=0.1681, simple_loss=0.2354, pruned_loss=0.05045, over 972558.45 frames.], batch size: 14, lr: 6.57e-04 2022-05-04 08:30:24,812 INFO [train.py:715] (0/8) Epoch 2, batch 23550, loss[loss=0.1589, simple_loss=0.2342, pruned_loss=0.04178, over 4766.00 frames.], tot_loss[loss=0.168, simple_loss=0.2354, pruned_loss=0.05027, over 972471.98 frames.], batch size: 18, lr: 6.57e-04 2022-05-04 08:31:05,705 INFO [train.py:715] (0/8) Epoch 2, batch 23600, loss[loss=0.2009, simple_loss=0.2667, pruned_loss=0.06752, over 4689.00 frames.], tot_loss[loss=0.1686, simple_loss=0.2361, pruned_loss=0.0505, over 972028.20 frames.], batch size: 15, lr: 6.57e-04 2022-05-04 08:31:45,432 INFO [train.py:715] (0/8) Epoch 2, batch 23650, loss[loss=0.1787, simple_loss=0.231, pruned_loss=0.06322, over 4706.00 
frames.], tot_loss[loss=0.169, simple_loss=0.236, pruned_loss=0.05094, over 971428.22 frames.], batch size: 15, lr: 6.56e-04 2022-05-04 08:32:27,503 INFO [train.py:715] (0/8) Epoch 2, batch 23700, loss[loss=0.1441, simple_loss=0.2105, pruned_loss=0.03888, over 4969.00 frames.], tot_loss[loss=0.1688, simple_loss=0.2361, pruned_loss=0.05074, over 972670.58 frames.], batch size: 35, lr: 6.56e-04 2022-05-04 08:33:07,931 INFO [train.py:715] (0/8) Epoch 2, batch 23750, loss[loss=0.1405, simple_loss=0.2188, pruned_loss=0.03109, over 4931.00 frames.], tot_loss[loss=0.1682, simple_loss=0.2356, pruned_loss=0.05045, over 973246.91 frames.], batch size: 23, lr: 6.56e-04 2022-05-04 08:33:48,794 INFO [train.py:715] (0/8) Epoch 2, batch 23800, loss[loss=0.1794, simple_loss=0.2479, pruned_loss=0.05539, over 4978.00 frames.], tot_loss[loss=0.1676, simple_loss=0.235, pruned_loss=0.05013, over 973695.25 frames.], batch size: 39, lr: 6.56e-04 2022-05-04 08:34:29,259 INFO [train.py:715] (0/8) Epoch 2, batch 23850, loss[loss=0.1849, simple_loss=0.2564, pruned_loss=0.05668, over 4864.00 frames.], tot_loss[loss=0.1689, simple_loss=0.2358, pruned_loss=0.05095, over 973521.65 frames.], batch size: 32, lr: 6.56e-04 2022-05-04 08:35:10,696 INFO [train.py:715] (0/8) Epoch 2, batch 23900, loss[loss=0.151, simple_loss=0.226, pruned_loss=0.03801, over 4796.00 frames.], tot_loss[loss=0.1691, simple_loss=0.2362, pruned_loss=0.05105, over 973072.76 frames.], batch size: 24, lr: 6.56e-04 2022-05-04 08:35:51,718 INFO [train.py:715] (0/8) Epoch 2, batch 23950, loss[loss=0.152, simple_loss=0.2278, pruned_loss=0.03812, over 4916.00 frames.], tot_loss[loss=0.17, simple_loss=0.2372, pruned_loss=0.05143, over 972514.74 frames.], batch size: 23, lr: 6.55e-04 2022-05-04 08:36:31,646 INFO [train.py:715] (0/8) Epoch 2, batch 24000, loss[loss=0.1564, simple_loss=0.221, pruned_loss=0.04592, over 4965.00 frames.], tot_loss[loss=0.1704, simple_loss=0.2374, pruned_loss=0.05166, over 971958.87 frames.], batch size: 15, lr: 6.55e-04 2022-05-04 08:36:31,647 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 08:36:40,333 INFO [train.py:742] (0/8) Epoch 2, validation: loss=0.1168, simple_loss=0.2032, pruned_loss=0.01518, over 914524.00 frames. 
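The three numbers inside every loss[...] and tot_loss[...] bracket are related: the total is a weighted sum of the simple and pruned transducer loss terms. With a simple-loss weight of 0.5 and the pruned term at full weight, which is the weighting these numbers imply once warm-up is over (an inference from the logged values, not a quote from train.py), the validation record directly above recombines exactly:

    simple_loss_scale = 0.5   # weight on the simple transducer loss
    pruned_loss_scale = 1.0   # assumed full weight on the pruned loss after warm-up

    # Validation record above: loss=0.1168, simple_loss=0.2032, pruned_loss=0.01518
    recombined = simple_loss_scale * 0.2032 + pruned_loss_scale * 0.01518
    print(round(recombined, 4))  # -> 0.1168, matching the logged validation loss

The same identity holds for the per-batch tot_loss entries, e.g. 0.5 * 0.2374 + 0.05166 ≈ 0.1704 for batch 24000 above.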
2022-05-04 08:37:20,458 INFO [train.py:715] (0/8) Epoch 2, batch 24050, loss[loss=0.1721, simple_loss=0.255, pruned_loss=0.04456, over 4875.00 frames.], tot_loss[loss=0.1711, simple_loss=0.2381, pruned_loss=0.05209, over 972024.88 frames.], batch size: 20, lr: 6.55e-04 2022-05-04 08:38:01,994 INFO [train.py:715] (0/8) Epoch 2, batch 24100, loss[loss=0.1614, simple_loss=0.2239, pruned_loss=0.0494, over 4904.00 frames.], tot_loss[loss=0.1707, simple_loss=0.2376, pruned_loss=0.05192, over 972464.01 frames.], batch size: 19, lr: 6.55e-04 2022-05-04 08:38:42,990 INFO [train.py:715] (0/8) Epoch 2, batch 24150, loss[loss=0.2505, simple_loss=0.3075, pruned_loss=0.09672, over 4815.00 frames.], tot_loss[loss=0.1698, simple_loss=0.2368, pruned_loss=0.05145, over 972602.15 frames.], batch size: 27, lr: 6.55e-04 2022-05-04 08:39:24,308 INFO [train.py:715] (0/8) Epoch 2, batch 24200, loss[loss=0.1589, simple_loss=0.2303, pruned_loss=0.04374, over 4831.00 frames.], tot_loss[loss=0.1691, simple_loss=0.2362, pruned_loss=0.05095, over 972988.63 frames.], batch size: 26, lr: 6.55e-04 2022-05-04 08:40:05,191 INFO [train.py:715] (0/8) Epoch 2, batch 24250, loss[loss=0.1721, simple_loss=0.2414, pruned_loss=0.05145, over 4922.00 frames.], tot_loss[loss=0.1693, simple_loss=0.2366, pruned_loss=0.05105, over 972885.50 frames.], batch size: 23, lr: 6.54e-04 2022-05-04 08:40:46,093 INFO [train.py:715] (0/8) Epoch 2, batch 24300, loss[loss=0.1563, simple_loss=0.2219, pruned_loss=0.04537, over 4770.00 frames.], tot_loss[loss=0.1684, simple_loss=0.2357, pruned_loss=0.05055, over 972541.65 frames.], batch size: 18, lr: 6.54e-04 2022-05-04 08:41:26,661 INFO [train.py:715] (0/8) Epoch 2, batch 24350, loss[loss=0.1821, simple_loss=0.2499, pruned_loss=0.05711, over 4960.00 frames.], tot_loss[loss=0.169, simple_loss=0.2362, pruned_loss=0.05094, over 973049.97 frames.], batch size: 21, lr: 6.54e-04 2022-05-04 08:42:06,522 INFO [train.py:715] (0/8) Epoch 2, batch 24400, loss[loss=0.1747, simple_loss=0.2372, pruned_loss=0.05611, over 4794.00 frames.], tot_loss[loss=0.1695, simple_loss=0.2366, pruned_loss=0.05125, over 973105.82 frames.], batch size: 24, lr: 6.54e-04 2022-05-04 08:42:47,542 INFO [train.py:715] (0/8) Epoch 2, batch 24450, loss[loss=0.1874, simple_loss=0.2609, pruned_loss=0.05694, over 4770.00 frames.], tot_loss[loss=0.1697, simple_loss=0.2368, pruned_loss=0.05125, over 973138.06 frames.], batch size: 19, lr: 6.54e-04 2022-05-04 08:43:27,491 INFO [train.py:715] (0/8) Epoch 2, batch 24500, loss[loss=0.1612, simple_loss=0.2168, pruned_loss=0.05277, over 4838.00 frames.], tot_loss[loss=0.169, simple_loss=0.2363, pruned_loss=0.05087, over 972840.81 frames.], batch size: 12, lr: 6.53e-04 2022-05-04 08:44:07,369 INFO [train.py:715] (0/8) Epoch 2, batch 24550, loss[loss=0.1992, simple_loss=0.2599, pruned_loss=0.06925, over 4843.00 frames.], tot_loss[loss=0.1696, simple_loss=0.2366, pruned_loss=0.05131, over 972601.27 frames.], batch size: 30, lr: 6.53e-04 2022-05-04 08:44:46,870 INFO [train.py:715] (0/8) Epoch 2, batch 24600, loss[loss=0.1846, simple_loss=0.2419, pruned_loss=0.06361, over 4938.00 frames.], tot_loss[loss=0.1691, simple_loss=0.236, pruned_loss=0.05106, over 972445.22 frames.], batch size: 29, lr: 6.53e-04 2022-05-04 08:45:27,057 INFO [train.py:715] (0/8) Epoch 2, batch 24650, loss[loss=0.1534, simple_loss=0.224, pruned_loss=0.04136, over 4973.00 frames.], tot_loss[loss=0.1689, simple_loss=0.236, pruned_loss=0.0509, over 973501.58 frames.], batch size: 24, lr: 6.53e-04 2022-05-04 08:46:06,408 INFO 
[train.py:715] (0/8) Epoch 2, batch 24700, loss[loss=0.1394, simple_loss=0.2071, pruned_loss=0.03588, over 4836.00 frames.], tot_loss[loss=0.1691, simple_loss=0.236, pruned_loss=0.05113, over 972843.37 frames.], batch size: 15, lr: 6.53e-04 2022-05-04 08:46:45,152 INFO [train.py:715] (0/8) Epoch 2, batch 24750, loss[loss=0.1734, simple_loss=0.2469, pruned_loss=0.04991, over 4869.00 frames.], tot_loss[loss=0.1698, simple_loss=0.2371, pruned_loss=0.05127, over 972651.90 frames.], batch size: 22, lr: 6.53e-04 2022-05-04 08:47:24,972 INFO [train.py:715] (0/8) Epoch 2, batch 24800, loss[loss=0.1795, simple_loss=0.2319, pruned_loss=0.06352, over 4772.00 frames.], tot_loss[loss=0.1716, simple_loss=0.2386, pruned_loss=0.05232, over 972319.08 frames.], batch size: 18, lr: 6.52e-04 2022-05-04 08:48:04,571 INFO [train.py:715] (0/8) Epoch 2, batch 24850, loss[loss=0.1218, simple_loss=0.1932, pruned_loss=0.02519, over 4960.00 frames.], tot_loss[loss=0.1702, simple_loss=0.2373, pruned_loss=0.05153, over 972422.56 frames.], batch size: 24, lr: 6.52e-04 2022-05-04 08:48:43,455 INFO [train.py:715] (0/8) Epoch 2, batch 24900, loss[loss=0.1526, simple_loss=0.2181, pruned_loss=0.04354, over 4684.00 frames.], tot_loss[loss=0.1689, simple_loss=0.2363, pruned_loss=0.05076, over 972574.90 frames.], batch size: 15, lr: 6.52e-04 2022-05-04 08:49:22,920 INFO [train.py:715] (0/8) Epoch 2, batch 24950, loss[loss=0.1353, simple_loss=0.2075, pruned_loss=0.03152, over 4753.00 frames.], tot_loss[loss=0.1679, simple_loss=0.2355, pruned_loss=0.05012, over 972014.96 frames.], batch size: 12, lr: 6.52e-04 2022-05-04 08:50:02,456 INFO [train.py:715] (0/8) Epoch 2, batch 25000, loss[loss=0.1636, simple_loss=0.2297, pruned_loss=0.04872, over 4758.00 frames.], tot_loss[loss=0.1671, simple_loss=0.2345, pruned_loss=0.04986, over 972961.89 frames.], batch size: 18, lr: 6.52e-04 2022-05-04 08:50:41,254 INFO [train.py:715] (0/8) Epoch 2, batch 25050, loss[loss=0.2126, simple_loss=0.2758, pruned_loss=0.07475, over 4837.00 frames.], tot_loss[loss=0.1683, simple_loss=0.2359, pruned_loss=0.05035, over 973484.49 frames.], batch size: 30, lr: 6.52e-04 2022-05-04 08:51:19,782 INFO [train.py:715] (0/8) Epoch 2, batch 25100, loss[loss=0.159, simple_loss=0.2334, pruned_loss=0.04231, over 4966.00 frames.], tot_loss[loss=0.1686, simple_loss=0.2359, pruned_loss=0.05064, over 973674.63 frames.], batch size: 15, lr: 6.51e-04 2022-05-04 08:51:59,033 INFO [train.py:715] (0/8) Epoch 2, batch 25150, loss[loss=0.1613, simple_loss=0.2278, pruned_loss=0.04743, over 4699.00 frames.], tot_loss[loss=0.1676, simple_loss=0.2346, pruned_loss=0.05032, over 971640.74 frames.], batch size: 15, lr: 6.51e-04 2022-05-04 08:52:37,845 INFO [train.py:715] (0/8) Epoch 2, batch 25200, loss[loss=0.1327, simple_loss=0.2101, pruned_loss=0.0277, over 4981.00 frames.], tot_loss[loss=0.1681, simple_loss=0.2346, pruned_loss=0.05078, over 971920.29 frames.], batch size: 28, lr: 6.51e-04 2022-05-04 08:53:16,874 INFO [train.py:715] (0/8) Epoch 2, batch 25250, loss[loss=0.2247, simple_loss=0.2918, pruned_loss=0.07884, over 4816.00 frames.], tot_loss[loss=0.1699, simple_loss=0.236, pruned_loss=0.05191, over 972392.90 frames.], batch size: 24, lr: 6.51e-04 2022-05-04 08:53:55,852 INFO [train.py:715] (0/8) Epoch 2, batch 25300, loss[loss=0.1594, simple_loss=0.2368, pruned_loss=0.04103, over 4755.00 frames.], tot_loss[loss=0.1705, simple_loss=0.2369, pruned_loss=0.05204, over 972778.04 frames.], batch size: 16, lr: 6.51e-04 2022-05-04 08:54:35,070 INFO [train.py:715] (0/8) Epoch 2, batch 
25350, loss[loss=0.1994, simple_loss=0.2503, pruned_loss=0.07431, over 4907.00 frames.], tot_loss[loss=0.171, simple_loss=0.2376, pruned_loss=0.05221, over 972344.08 frames.], batch size: 17, lr: 6.51e-04 2022-05-04 08:55:14,139 INFO [train.py:715] (0/8) Epoch 2, batch 25400, loss[loss=0.1632, simple_loss=0.2335, pruned_loss=0.04641, over 4835.00 frames.], tot_loss[loss=0.1694, simple_loss=0.2361, pruned_loss=0.05138, over 971555.63 frames.], batch size: 32, lr: 6.50e-04 2022-05-04 08:55:52,994 INFO [train.py:715] (0/8) Epoch 2, batch 25450, loss[loss=0.1593, simple_loss=0.2246, pruned_loss=0.04699, over 4976.00 frames.], tot_loss[loss=0.1688, simple_loss=0.2355, pruned_loss=0.05107, over 971378.45 frames.], batch size: 25, lr: 6.50e-04 2022-05-04 08:56:32,016 INFO [train.py:715] (0/8) Epoch 2, batch 25500, loss[loss=0.1594, simple_loss=0.2342, pruned_loss=0.04227, over 4984.00 frames.], tot_loss[loss=0.1697, simple_loss=0.2363, pruned_loss=0.05154, over 971257.68 frames.], batch size: 28, lr: 6.50e-04 2022-05-04 08:57:11,298 INFO [train.py:715] (0/8) Epoch 2, batch 25550, loss[loss=0.1739, simple_loss=0.2294, pruned_loss=0.05922, over 4787.00 frames.], tot_loss[loss=0.1691, simple_loss=0.2362, pruned_loss=0.05099, over 972244.24 frames.], batch size: 14, lr: 6.50e-04 2022-05-04 08:57:50,302 INFO [train.py:715] (0/8) Epoch 2, batch 25600, loss[loss=0.123, simple_loss=0.1976, pruned_loss=0.02424, over 4912.00 frames.], tot_loss[loss=0.1692, simple_loss=0.2364, pruned_loss=0.05094, over 971329.89 frames.], batch size: 17, lr: 6.50e-04 2022-05-04 08:58:29,643 INFO [train.py:715] (0/8) Epoch 2, batch 25650, loss[loss=0.1583, simple_loss=0.2281, pruned_loss=0.04421, over 4981.00 frames.], tot_loss[loss=0.169, simple_loss=0.2363, pruned_loss=0.0509, over 971156.63 frames.], batch size: 31, lr: 6.50e-04 2022-05-04 08:59:09,551 INFO [train.py:715] (0/8) Epoch 2, batch 25700, loss[loss=0.1719, simple_loss=0.2333, pruned_loss=0.05521, over 4930.00 frames.], tot_loss[loss=0.169, simple_loss=0.2366, pruned_loss=0.05073, over 971145.05 frames.], batch size: 23, lr: 6.49e-04 2022-05-04 08:59:48,688 INFO [train.py:715] (0/8) Epoch 2, batch 25750, loss[loss=0.1712, simple_loss=0.24, pruned_loss=0.0512, over 4795.00 frames.], tot_loss[loss=0.169, simple_loss=0.2362, pruned_loss=0.05087, over 971325.69 frames.], batch size: 21, lr: 6.49e-04 2022-05-04 09:00:27,434 INFO [train.py:715] (0/8) Epoch 2, batch 25800, loss[loss=0.1581, simple_loss=0.2219, pruned_loss=0.04719, over 4983.00 frames.], tot_loss[loss=0.1703, simple_loss=0.2371, pruned_loss=0.05182, over 971686.85 frames.], batch size: 15, lr: 6.49e-04 2022-05-04 09:01:06,420 INFO [train.py:715] (0/8) Epoch 2, batch 25850, loss[loss=0.2137, simple_loss=0.2672, pruned_loss=0.08014, over 4784.00 frames.], tot_loss[loss=0.1699, simple_loss=0.2366, pruned_loss=0.05157, over 971005.76 frames.], batch size: 17, lr: 6.49e-04 2022-05-04 09:01:46,180 INFO [train.py:715] (0/8) Epoch 2, batch 25900, loss[loss=0.1765, simple_loss=0.2501, pruned_loss=0.05146, over 4767.00 frames.], tot_loss[loss=0.1691, simple_loss=0.2358, pruned_loss=0.05122, over 971013.05 frames.], batch size: 18, lr: 6.49e-04 2022-05-04 09:02:25,984 INFO [train.py:715] (0/8) Epoch 2, batch 25950, loss[loss=0.1672, simple_loss=0.2283, pruned_loss=0.05306, over 4835.00 frames.], tot_loss[loss=0.1678, simple_loss=0.2347, pruned_loss=0.05045, over 971788.49 frames.], batch size: 13, lr: 6.49e-04 2022-05-04 09:03:05,065 INFO [train.py:715] (0/8) Epoch 2, batch 26000, loss[loss=0.1505, 
simple_loss=0.2288, pruned_loss=0.03614, over 4847.00 frames.], tot_loss[loss=0.1666, simple_loss=0.2334, pruned_loss=0.04991, over 971857.51 frames.], batch size: 34, lr: 6.48e-04 2022-05-04 09:03:44,734 INFO [train.py:715] (0/8) Epoch 2, batch 26050, loss[loss=0.1827, simple_loss=0.2472, pruned_loss=0.05911, over 4759.00 frames.], tot_loss[loss=0.1673, simple_loss=0.2344, pruned_loss=0.05011, over 972133.69 frames.], batch size: 19, lr: 6.48e-04 2022-05-04 09:04:24,304 INFO [train.py:715] (0/8) Epoch 2, batch 26100, loss[loss=0.1485, simple_loss=0.222, pruned_loss=0.03748, over 4936.00 frames.], tot_loss[loss=0.1666, simple_loss=0.2343, pruned_loss=0.0495, over 972102.37 frames.], batch size: 29, lr: 6.48e-04 2022-05-04 09:05:03,481 INFO [train.py:715] (0/8) Epoch 2, batch 26150, loss[loss=0.1693, simple_loss=0.2333, pruned_loss=0.05266, over 4751.00 frames.], tot_loss[loss=0.1678, simple_loss=0.235, pruned_loss=0.05032, over 972332.57 frames.], batch size: 19, lr: 6.48e-04 2022-05-04 09:05:42,983 INFO [train.py:715] (0/8) Epoch 2, batch 26200, loss[loss=0.1419, simple_loss=0.2166, pruned_loss=0.03358, over 4780.00 frames.], tot_loss[loss=0.1688, simple_loss=0.2358, pruned_loss=0.05094, over 972260.85 frames.], batch size: 12, lr: 6.48e-04 2022-05-04 09:06:22,733 INFO [train.py:715] (0/8) Epoch 2, batch 26250, loss[loss=0.1847, simple_loss=0.2399, pruned_loss=0.06475, over 4888.00 frames.], tot_loss[loss=0.1685, simple_loss=0.2357, pruned_loss=0.05069, over 972713.57 frames.], batch size: 22, lr: 6.48e-04 2022-05-04 09:07:02,315 INFO [train.py:715] (0/8) Epoch 2, batch 26300, loss[loss=0.1765, simple_loss=0.2427, pruned_loss=0.05513, over 4752.00 frames.], tot_loss[loss=0.1676, simple_loss=0.235, pruned_loss=0.05013, over 972783.78 frames.], batch size: 16, lr: 6.47e-04 2022-05-04 09:07:40,824 INFO [train.py:715] (0/8) Epoch 2, batch 26350, loss[loss=0.1757, simple_loss=0.2453, pruned_loss=0.05299, over 4916.00 frames.], tot_loss[loss=0.1679, simple_loss=0.2353, pruned_loss=0.05025, over 973026.35 frames.], batch size: 17, lr: 6.47e-04 2022-05-04 09:08:05,517 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-96000.pt 2022-05-04 09:08:23,934 INFO [train.py:715] (0/8) Epoch 2, batch 26400, loss[loss=0.1702, simple_loss=0.2369, pruned_loss=0.05175, over 4973.00 frames.], tot_loss[loss=0.1676, simple_loss=0.2351, pruned_loss=0.05008, over 972398.70 frames.], batch size: 14, lr: 6.47e-04 2022-05-04 09:09:03,682 INFO [train.py:715] (0/8) Epoch 2, batch 26450, loss[loss=0.1707, simple_loss=0.242, pruned_loss=0.04965, over 4853.00 frames.], tot_loss[loss=0.1683, simple_loss=0.2357, pruned_loss=0.05045, over 972090.58 frames.], batch size: 30, lr: 6.47e-04 2022-05-04 09:09:42,576 INFO [train.py:715] (0/8) Epoch 2, batch 26500, loss[loss=0.1931, simple_loss=0.2569, pruned_loss=0.06463, over 4763.00 frames.], tot_loss[loss=0.1676, simple_loss=0.2353, pruned_loss=0.04996, over 972228.30 frames.], batch size: 19, lr: 6.47e-04 2022-05-04 09:10:22,391 INFO [train.py:715] (0/8) Epoch 2, batch 26550, loss[loss=0.1694, simple_loss=0.2335, pruned_loss=0.05263, over 4774.00 frames.], tot_loss[loss=0.1674, simple_loss=0.2349, pruned_loss=0.04991, over 971815.88 frames.], batch size: 14, lr: 6.46e-04 2022-05-04 09:11:02,375 INFO [train.py:715] (0/8) Epoch 2, batch 26600, loss[loss=0.1724, simple_loss=0.2417, pruned_loss=0.05157, over 4846.00 frames.], tot_loss[loss=0.168, simple_loss=0.2353, pruned_loss=0.05034, over 972828.77 frames.], batch size: 32, lr: 
6.46e-04 2022-05-04 09:11:41,993 INFO [train.py:715] (0/8) Epoch 2, batch 26650, loss[loss=0.1695, simple_loss=0.2323, pruned_loss=0.05331, over 4893.00 frames.], tot_loss[loss=0.1675, simple_loss=0.235, pruned_loss=0.05001, over 972416.70 frames.], batch size: 17, lr: 6.46e-04 2022-05-04 09:12:21,003 INFO [train.py:715] (0/8) Epoch 2, batch 26700, loss[loss=0.2072, simple_loss=0.2792, pruned_loss=0.06759, over 4824.00 frames.], tot_loss[loss=0.1686, simple_loss=0.2356, pruned_loss=0.05078, over 972568.49 frames.], batch size: 25, lr: 6.46e-04 2022-05-04 09:13:00,962 INFO [train.py:715] (0/8) Epoch 2, batch 26750, loss[loss=0.1656, simple_loss=0.2271, pruned_loss=0.0521, over 4816.00 frames.], tot_loss[loss=0.1704, simple_loss=0.237, pruned_loss=0.05188, over 973198.33 frames.], batch size: 13, lr: 6.46e-04 2022-05-04 09:13:40,193 INFO [train.py:715] (0/8) Epoch 2, batch 26800, loss[loss=0.1551, simple_loss=0.2361, pruned_loss=0.03704, over 4974.00 frames.], tot_loss[loss=0.1688, simple_loss=0.2362, pruned_loss=0.05068, over 973593.35 frames.], batch size: 24, lr: 6.46e-04 2022-05-04 09:14:19,186 INFO [train.py:715] (0/8) Epoch 2, batch 26850, loss[loss=0.1842, simple_loss=0.2583, pruned_loss=0.05508, over 4873.00 frames.], tot_loss[loss=0.1688, simple_loss=0.2365, pruned_loss=0.05054, over 974168.23 frames.], batch size: 32, lr: 6.45e-04 2022-05-04 09:14:58,117 INFO [train.py:715] (0/8) Epoch 2, batch 26900, loss[loss=0.1852, simple_loss=0.2461, pruned_loss=0.06218, over 4982.00 frames.], tot_loss[loss=0.1699, simple_loss=0.2373, pruned_loss=0.05127, over 974857.77 frames.], batch size: 28, lr: 6.45e-04 2022-05-04 09:15:37,580 INFO [train.py:715] (0/8) Epoch 2, batch 26950, loss[loss=0.1828, simple_loss=0.2515, pruned_loss=0.05707, over 4855.00 frames.], tot_loss[loss=0.1698, simple_loss=0.2371, pruned_loss=0.05132, over 973921.98 frames.], batch size: 20, lr: 6.45e-04 2022-05-04 09:16:16,466 INFO [train.py:715] (0/8) Epoch 2, batch 27000, loss[loss=0.1746, simple_loss=0.2345, pruned_loss=0.05732, over 4779.00 frames.], tot_loss[loss=0.1686, simple_loss=0.2357, pruned_loss=0.05073, over 973511.80 frames.], batch size: 14, lr: 6.45e-04 2022-05-04 09:16:16,467 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 09:16:25,254 INFO [train.py:742] (0/8) Epoch 2, validation: loss=0.1164, simple_loss=0.2027, pruned_loss=0.01502, over 914524.00 frames. 
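The checkpoint.py records above show full checkpoints being written every 8000 training batches (checkpoint-88000.pt earlier in this epoch, checkpoint-96000.pt here) into pruned_transducer_stateless2/exp/v2/. These files can be inspected offline; a minimal sketch, assuming they are ordinary torch.save dictionaries as written by icefall's checkpoint.py, with the exact keys depending on that helper rather than on anything shown in the log:

    import torch

    # Path taken from the "Saving checkpoint" record above; load on CPU, no GPU needed.
    ckpt = torch.load(
        "pruned_transducer_stateless2/exp/v2/checkpoint-96000.pt",
        map_location="cpu",
    )

    # The file is a plain dict; its keys show what was saved alongside the model weights.
    print(sorted(ckpt.keys()))
    if "model" in ckpt:  # "model" is assumed to hold the state_dict
        print(sum(t.numel() for t in ckpt["model"].values()), "parameters")

Loading with map_location="cpu" keeps the inspection independent of the 8-GPU DDP setup the (0/8) rank prefix refers to.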
2022-05-04 09:17:03,618 INFO [train.py:715] (0/8) Epoch 2, batch 27050, loss[loss=0.1694, simple_loss=0.2378, pruned_loss=0.05045, over 4948.00 frames.], tot_loss[loss=0.1683, simple_loss=0.2351, pruned_loss=0.05072, over 973329.38 frames.], batch size: 21, lr: 6.45e-04 2022-05-04 09:17:42,883 INFO [train.py:715] (0/8) Epoch 2, batch 27100, loss[loss=0.183, simple_loss=0.2517, pruned_loss=0.05714, over 4974.00 frames.], tot_loss[loss=0.1689, simple_loss=0.2358, pruned_loss=0.05103, over 973893.24 frames.], batch size: 39, lr: 6.45e-04 2022-05-04 09:18:22,880 INFO [train.py:715] (0/8) Epoch 2, batch 27150, loss[loss=0.1697, simple_loss=0.2367, pruned_loss=0.05135, over 4794.00 frames.], tot_loss[loss=0.1695, simple_loss=0.2362, pruned_loss=0.05137, over 972831.03 frames.], batch size: 17, lr: 6.44e-04 2022-05-04 09:19:02,265 INFO [train.py:715] (0/8) Epoch 2, batch 27200, loss[loss=0.1572, simple_loss=0.2131, pruned_loss=0.05067, over 4734.00 frames.], tot_loss[loss=0.1697, simple_loss=0.2367, pruned_loss=0.05136, over 972417.97 frames.], batch size: 12, lr: 6.44e-04 2022-05-04 09:19:41,120 INFO [train.py:715] (0/8) Epoch 2, batch 27250, loss[loss=0.147, simple_loss=0.218, pruned_loss=0.03795, over 4837.00 frames.], tot_loss[loss=0.1689, simple_loss=0.2361, pruned_loss=0.05082, over 971883.17 frames.], batch size: 13, lr: 6.44e-04 2022-05-04 09:20:20,689 INFO [train.py:715] (0/8) Epoch 2, batch 27300, loss[loss=0.1787, simple_loss=0.2443, pruned_loss=0.05656, over 4830.00 frames.], tot_loss[loss=0.1699, simple_loss=0.2365, pruned_loss=0.05164, over 971734.13 frames.], batch size: 13, lr: 6.44e-04 2022-05-04 09:20:59,723 INFO [train.py:715] (0/8) Epoch 2, batch 27350, loss[loss=0.1436, simple_loss=0.2156, pruned_loss=0.03583, over 4761.00 frames.], tot_loss[loss=0.1689, simple_loss=0.2361, pruned_loss=0.0508, over 971551.60 frames.], batch size: 14, lr: 6.44e-04 2022-05-04 09:21:38,802 INFO [train.py:715] (0/8) Epoch 2, batch 27400, loss[loss=0.1404, simple_loss=0.2111, pruned_loss=0.03482, over 4805.00 frames.], tot_loss[loss=0.1694, simple_loss=0.2366, pruned_loss=0.05113, over 972541.67 frames.], batch size: 21, lr: 6.44e-04 2022-05-04 09:22:17,476 INFO [train.py:715] (0/8) Epoch 2, batch 27450, loss[loss=0.1588, simple_loss=0.2186, pruned_loss=0.04952, over 4985.00 frames.], tot_loss[loss=0.1687, simple_loss=0.236, pruned_loss=0.05065, over 972709.47 frames.], batch size: 31, lr: 6.44e-04 2022-05-04 09:22:57,212 INFO [train.py:715] (0/8) Epoch 2, batch 27500, loss[loss=0.1658, simple_loss=0.2403, pruned_loss=0.04565, over 4880.00 frames.], tot_loss[loss=0.1686, simple_loss=0.2358, pruned_loss=0.05075, over 972896.91 frames.], batch size: 22, lr: 6.43e-04 2022-05-04 09:23:37,093 INFO [train.py:715] (0/8) Epoch 2, batch 27550, loss[loss=0.1607, simple_loss=0.2322, pruned_loss=0.04462, over 4986.00 frames.], tot_loss[loss=0.1682, simple_loss=0.2354, pruned_loss=0.05047, over 973388.97 frames.], batch size: 25, lr: 6.43e-04 2022-05-04 09:24:16,421 INFO [train.py:715] (0/8) Epoch 2, batch 27600, loss[loss=0.1518, simple_loss=0.2169, pruned_loss=0.04336, over 4900.00 frames.], tot_loss[loss=0.1679, simple_loss=0.2347, pruned_loss=0.05054, over 973606.00 frames.], batch size: 19, lr: 6.43e-04 2022-05-04 09:24:55,997 INFO [train.py:715] (0/8) Epoch 2, batch 27650, loss[loss=0.1476, simple_loss=0.2238, pruned_loss=0.03572, over 4787.00 frames.], tot_loss[loss=0.1684, simple_loss=0.2356, pruned_loss=0.05057, over 973912.74 frames.], batch size: 18, lr: 6.43e-04 2022-05-04 09:25:36,592 INFO 
[train.py:715] (0/8) Epoch 2, batch 27700, loss[loss=0.1595, simple_loss=0.2245, pruned_loss=0.04727, over 4746.00 frames.], tot_loss[loss=0.1689, simple_loss=0.2357, pruned_loss=0.05104, over 973923.62 frames.], batch size: 16, lr: 6.43e-04 2022-05-04 09:26:16,916 INFO [train.py:715] (0/8) Epoch 2, batch 27750, loss[loss=0.1516, simple_loss=0.2319, pruned_loss=0.03561, over 4818.00 frames.], tot_loss[loss=0.1677, simple_loss=0.235, pruned_loss=0.05023, over 973878.38 frames.], batch size: 26, lr: 6.43e-04 2022-05-04 09:26:56,303 INFO [train.py:715] (0/8) Epoch 2, batch 27800, loss[loss=0.1533, simple_loss=0.2234, pruned_loss=0.04157, over 4912.00 frames.], tot_loss[loss=0.1677, simple_loss=0.235, pruned_loss=0.05016, over 973293.10 frames.], batch size: 17, lr: 6.42e-04 2022-05-04 09:27:36,595 INFO [train.py:715] (0/8) Epoch 2, batch 27850, loss[loss=0.1655, simple_loss=0.2373, pruned_loss=0.04686, over 4980.00 frames.], tot_loss[loss=0.1679, simple_loss=0.2352, pruned_loss=0.05028, over 973440.68 frames.], batch size: 28, lr: 6.42e-04 2022-05-04 09:28:15,904 INFO [train.py:715] (0/8) Epoch 2, batch 27900, loss[loss=0.1406, simple_loss=0.2069, pruned_loss=0.03716, over 4761.00 frames.], tot_loss[loss=0.1679, simple_loss=0.235, pruned_loss=0.05039, over 973184.55 frames.], batch size: 19, lr: 6.42e-04 2022-05-04 09:28:55,084 INFO [train.py:715] (0/8) Epoch 2, batch 27950, loss[loss=0.1454, simple_loss=0.219, pruned_loss=0.03591, over 4989.00 frames.], tot_loss[loss=0.1688, simple_loss=0.2361, pruned_loss=0.05073, over 972911.38 frames.], batch size: 25, lr: 6.42e-04 2022-05-04 09:29:34,670 INFO [train.py:715] (0/8) Epoch 2, batch 28000, loss[loss=0.1401, simple_loss=0.214, pruned_loss=0.03305, over 4701.00 frames.], tot_loss[loss=0.1689, simple_loss=0.2363, pruned_loss=0.0507, over 972144.60 frames.], batch size: 15, lr: 6.42e-04 2022-05-04 09:30:15,044 INFO [train.py:715] (0/8) Epoch 2, batch 28050, loss[loss=0.1772, simple_loss=0.2534, pruned_loss=0.05051, over 4967.00 frames.], tot_loss[loss=0.1681, simple_loss=0.2361, pruned_loss=0.05003, over 972608.57 frames.], batch size: 14, lr: 6.42e-04 2022-05-04 09:30:54,016 INFO [train.py:715] (0/8) Epoch 2, batch 28100, loss[loss=0.1267, simple_loss=0.1903, pruned_loss=0.03158, over 4751.00 frames.], tot_loss[loss=0.1685, simple_loss=0.2365, pruned_loss=0.05023, over 973125.10 frames.], batch size: 12, lr: 6.41e-04 2022-05-04 09:31:33,559 INFO [train.py:715] (0/8) Epoch 2, batch 28150, loss[loss=0.1645, simple_loss=0.2357, pruned_loss=0.04667, over 4921.00 frames.], tot_loss[loss=0.169, simple_loss=0.2368, pruned_loss=0.05064, over 972767.11 frames.], batch size: 29, lr: 6.41e-04 2022-05-04 09:32:13,297 INFO [train.py:715] (0/8) Epoch 2, batch 28200, loss[loss=0.1687, simple_loss=0.2221, pruned_loss=0.05761, over 4854.00 frames.], tot_loss[loss=0.1687, simple_loss=0.2361, pruned_loss=0.0507, over 972739.03 frames.], batch size: 32, lr: 6.41e-04 2022-05-04 09:32:52,896 INFO [train.py:715] (0/8) Epoch 2, batch 28250, loss[loss=0.1514, simple_loss=0.2259, pruned_loss=0.03841, over 4948.00 frames.], tot_loss[loss=0.1685, simple_loss=0.2356, pruned_loss=0.05075, over 972673.40 frames.], batch size: 29, lr: 6.41e-04 2022-05-04 09:33:31,978 INFO [train.py:715] (0/8) Epoch 2, batch 28300, loss[loss=0.1711, simple_loss=0.2334, pruned_loss=0.05439, over 4871.00 frames.], tot_loss[loss=0.1691, simple_loss=0.236, pruned_loss=0.05109, over 972403.05 frames.], batch size: 32, lr: 6.41e-04 2022-05-04 09:34:11,318 INFO [train.py:715] (0/8) Epoch 2, batch 
28350, loss[loss=0.1584, simple_loss=0.2181, pruned_loss=0.04933, over 4908.00 frames.], tot_loss[loss=0.1687, simple_loss=0.2358, pruned_loss=0.05076, over 971853.59 frames.], batch size: 18, lr: 6.41e-04 2022-05-04 09:34:51,511 INFO [train.py:715] (0/8) Epoch 2, batch 28400, loss[loss=0.1678, simple_loss=0.251, pruned_loss=0.0423, over 4785.00 frames.], tot_loss[loss=0.168, simple_loss=0.2351, pruned_loss=0.05049, over 972209.49 frames.], batch size: 18, lr: 6.40e-04 2022-05-04 09:35:30,756 INFO [train.py:715] (0/8) Epoch 2, batch 28450, loss[loss=0.1732, simple_loss=0.2373, pruned_loss=0.0545, over 4816.00 frames.], tot_loss[loss=0.1683, simple_loss=0.2352, pruned_loss=0.05073, over 972124.75 frames.], batch size: 12, lr: 6.40e-04 2022-05-04 09:36:10,159 INFO [train.py:715] (0/8) Epoch 2, batch 28500, loss[loss=0.1829, simple_loss=0.2512, pruned_loss=0.05729, over 4933.00 frames.], tot_loss[loss=0.1685, simple_loss=0.2353, pruned_loss=0.05083, over 972687.84 frames.], batch size: 29, lr: 6.40e-04 2022-05-04 09:36:50,110 INFO [train.py:715] (0/8) Epoch 2, batch 28550, loss[loss=0.2331, simple_loss=0.3017, pruned_loss=0.08222, over 4851.00 frames.], tot_loss[loss=0.1687, simple_loss=0.2359, pruned_loss=0.0507, over 972308.58 frames.], batch size: 32, lr: 6.40e-04 2022-05-04 09:37:30,232 INFO [train.py:715] (0/8) Epoch 2, batch 28600, loss[loss=0.1718, simple_loss=0.2536, pruned_loss=0.04497, over 4804.00 frames.], tot_loss[loss=0.1689, simple_loss=0.2362, pruned_loss=0.05075, over 972222.71 frames.], batch size: 24, lr: 6.40e-04 2022-05-04 09:38:09,271 INFO [train.py:715] (0/8) Epoch 2, batch 28650, loss[loss=0.1906, simple_loss=0.248, pruned_loss=0.06654, over 4973.00 frames.], tot_loss[loss=0.17, simple_loss=0.2366, pruned_loss=0.05172, over 972887.00 frames.], batch size: 35, lr: 6.40e-04 2022-05-04 09:38:49,125 INFO [train.py:715] (0/8) Epoch 2, batch 28700, loss[loss=0.1497, simple_loss=0.2119, pruned_loss=0.04374, over 4871.00 frames.], tot_loss[loss=0.1703, simple_loss=0.2371, pruned_loss=0.05173, over 972882.35 frames.], batch size: 32, lr: 6.39e-04 2022-05-04 09:39:29,583 INFO [train.py:715] (0/8) Epoch 2, batch 28750, loss[loss=0.1864, simple_loss=0.2558, pruned_loss=0.0585, over 4876.00 frames.], tot_loss[loss=0.1698, simple_loss=0.2366, pruned_loss=0.05153, over 972387.05 frames.], batch size: 38, lr: 6.39e-04 2022-05-04 09:40:08,510 INFO [train.py:715] (0/8) Epoch 2, batch 28800, loss[loss=0.1721, simple_loss=0.2299, pruned_loss=0.0572, over 4895.00 frames.], tot_loss[loss=0.1694, simple_loss=0.2365, pruned_loss=0.05112, over 971748.41 frames.], batch size: 19, lr: 6.39e-04 2022-05-04 09:40:48,103 INFO [train.py:715] (0/8) Epoch 2, batch 28850, loss[loss=0.1872, simple_loss=0.2462, pruned_loss=0.06409, over 4771.00 frames.], tot_loss[loss=0.1693, simple_loss=0.2365, pruned_loss=0.0511, over 971590.90 frames.], batch size: 19, lr: 6.39e-04 2022-05-04 09:41:28,104 INFO [train.py:715] (0/8) Epoch 2, batch 28900, loss[loss=0.1793, simple_loss=0.2416, pruned_loss=0.05854, over 4853.00 frames.], tot_loss[loss=0.1697, simple_loss=0.2367, pruned_loss=0.05136, over 971905.11 frames.], batch size: 30, lr: 6.39e-04 2022-05-04 09:42:07,485 INFO [train.py:715] (0/8) Epoch 2, batch 28950, loss[loss=0.1752, simple_loss=0.2345, pruned_loss=0.05796, over 4870.00 frames.], tot_loss[loss=0.169, simple_loss=0.2362, pruned_loss=0.05094, over 972068.81 frames.], batch size: 22, lr: 6.39e-04 2022-05-04 09:42:46,857 INFO [train.py:715] (0/8) Epoch 2, batch 29000, loss[loss=0.1543, 
simple_loss=0.2203, pruned_loss=0.04412, over 4826.00 frames.], tot_loss[loss=0.1685, simple_loss=0.2358, pruned_loss=0.05062, over 972342.64 frames.], batch size: 30, lr: 6.38e-04 2022-05-04 09:43:26,613 INFO [train.py:715] (0/8) Epoch 2, batch 29050, loss[loss=0.1878, simple_loss=0.2507, pruned_loss=0.06243, over 4919.00 frames.], tot_loss[loss=0.1676, simple_loss=0.2351, pruned_loss=0.05002, over 972865.01 frames.], batch size: 18, lr: 6.38e-04 2022-05-04 09:44:06,288 INFO [train.py:715] (0/8) Epoch 2, batch 29100, loss[loss=0.1585, simple_loss=0.2237, pruned_loss=0.0467, over 4982.00 frames.], tot_loss[loss=0.1667, simple_loss=0.2346, pruned_loss=0.04939, over 972955.75 frames.], batch size: 14, lr: 6.38e-04 2022-05-04 09:44:45,463 INFO [train.py:715] (0/8) Epoch 2, batch 29150, loss[loss=0.2034, simple_loss=0.2674, pruned_loss=0.06963, over 4839.00 frames.], tot_loss[loss=0.1688, simple_loss=0.2364, pruned_loss=0.05061, over 972656.71 frames.], batch size: 15, lr: 6.38e-04 2022-05-04 09:45:24,941 INFO [train.py:715] (0/8) Epoch 2, batch 29200, loss[loss=0.1736, simple_loss=0.2402, pruned_loss=0.05344, over 4890.00 frames.], tot_loss[loss=0.1678, simple_loss=0.2353, pruned_loss=0.05014, over 972724.92 frames.], batch size: 16, lr: 6.38e-04 2022-05-04 09:46:05,375 INFO [train.py:715] (0/8) Epoch 2, batch 29250, loss[loss=0.1925, simple_loss=0.2578, pruned_loss=0.06357, over 4752.00 frames.], tot_loss[loss=0.169, simple_loss=0.2362, pruned_loss=0.0509, over 972520.23 frames.], batch size: 16, lr: 6.38e-04 2022-05-04 09:46:44,478 INFO [train.py:715] (0/8) Epoch 2, batch 29300, loss[loss=0.1464, simple_loss=0.2219, pruned_loss=0.03549, over 4812.00 frames.], tot_loss[loss=0.1688, simple_loss=0.2363, pruned_loss=0.05067, over 972996.26 frames.], batch size: 12, lr: 6.37e-04 2022-05-04 09:47:23,247 INFO [train.py:715] (0/8) Epoch 2, batch 29350, loss[loss=0.1422, simple_loss=0.2069, pruned_loss=0.03878, over 4909.00 frames.], tot_loss[loss=0.169, simple_loss=0.2362, pruned_loss=0.05093, over 973268.09 frames.], batch size: 17, lr: 6.37e-04 2022-05-04 09:48:02,461 INFO [train.py:715] (0/8) Epoch 2, batch 29400, loss[loss=0.1515, simple_loss=0.2163, pruned_loss=0.04341, over 4690.00 frames.], tot_loss[loss=0.1683, simple_loss=0.2356, pruned_loss=0.05053, over 972615.82 frames.], batch size: 15, lr: 6.37e-04 2022-05-04 09:48:41,886 INFO [train.py:715] (0/8) Epoch 2, batch 29450, loss[loss=0.1794, simple_loss=0.2398, pruned_loss=0.05947, over 4847.00 frames.], tot_loss[loss=0.1677, simple_loss=0.2352, pruned_loss=0.05013, over 972839.61 frames.], batch size: 30, lr: 6.37e-04 2022-05-04 09:49:20,757 INFO [train.py:715] (0/8) Epoch 2, batch 29500, loss[loss=0.1664, simple_loss=0.2284, pruned_loss=0.05216, over 4815.00 frames.], tot_loss[loss=0.1678, simple_loss=0.2353, pruned_loss=0.05011, over 973500.55 frames.], batch size: 27, lr: 6.37e-04 2022-05-04 09:49:59,763 INFO [train.py:715] (0/8) Epoch 2, batch 29550, loss[loss=0.1648, simple_loss=0.2375, pruned_loss=0.0461, over 4935.00 frames.], tot_loss[loss=0.1667, simple_loss=0.2339, pruned_loss=0.0498, over 972088.78 frames.], batch size: 23, lr: 6.37e-04 2022-05-04 09:50:39,178 INFO [train.py:715] (0/8) Epoch 2, batch 29600, loss[loss=0.209, simple_loss=0.2663, pruned_loss=0.07586, over 4790.00 frames.], tot_loss[loss=0.1671, simple_loss=0.234, pruned_loss=0.05007, over 972112.41 frames.], batch size: 24, lr: 6.37e-04 2022-05-04 09:51:18,368 INFO [train.py:715] (0/8) Epoch 2, batch 29650, loss[loss=0.1775, simple_loss=0.2449, 
pruned_loss=0.05503, over 4947.00 frames.], tot_loss[loss=0.1678, simple_loss=0.2349, pruned_loss=0.05033, over 972415.34 frames.], batch size: 29, lr: 6.36e-04 2022-05-04 09:51:57,127 INFO [train.py:715] (0/8) Epoch 2, batch 29700, loss[loss=0.167, simple_loss=0.2359, pruned_loss=0.0491, over 4793.00 frames.], tot_loss[loss=0.1667, simple_loss=0.2339, pruned_loss=0.0498, over 971829.36 frames.], batch size: 18, lr: 6.36e-04 2022-05-04 09:52:36,252 INFO [train.py:715] (0/8) Epoch 2, batch 29750, loss[loss=0.1395, simple_loss=0.2127, pruned_loss=0.03314, over 4919.00 frames.], tot_loss[loss=0.1685, simple_loss=0.2351, pruned_loss=0.05089, over 972409.79 frames.], batch size: 29, lr: 6.36e-04 2022-05-04 09:53:15,364 INFO [train.py:715] (0/8) Epoch 2, batch 29800, loss[loss=0.1666, simple_loss=0.2448, pruned_loss=0.04422, over 4962.00 frames.], tot_loss[loss=0.1684, simple_loss=0.235, pruned_loss=0.05093, over 972459.08 frames.], batch size: 28, lr: 6.36e-04 2022-05-04 09:53:53,999 INFO [train.py:715] (0/8) Epoch 2, batch 29850, loss[loss=0.1742, simple_loss=0.2323, pruned_loss=0.05812, over 4972.00 frames.], tot_loss[loss=0.1682, simple_loss=0.2348, pruned_loss=0.05075, over 971579.72 frames.], batch size: 14, lr: 6.36e-04 2022-05-04 09:54:33,008 INFO [train.py:715] (0/8) Epoch 2, batch 29900, loss[loss=0.1545, simple_loss=0.219, pruned_loss=0.04502, over 4685.00 frames.], tot_loss[loss=0.1672, simple_loss=0.2342, pruned_loss=0.05009, over 971470.83 frames.], batch size: 15, lr: 6.36e-04 2022-05-04 09:55:12,828 INFO [train.py:715] (0/8) Epoch 2, batch 29950, loss[loss=0.1677, simple_loss=0.2346, pruned_loss=0.05035, over 4706.00 frames.], tot_loss[loss=0.1684, simple_loss=0.235, pruned_loss=0.05084, over 971168.02 frames.], batch size: 15, lr: 6.35e-04 2022-05-04 09:55:51,631 INFO [train.py:715] (0/8) Epoch 2, batch 30000, loss[loss=0.1864, simple_loss=0.2493, pruned_loss=0.06181, over 4750.00 frames.], tot_loss[loss=0.1687, simple_loss=0.2351, pruned_loss=0.05111, over 971428.12 frames.], batch size: 16, lr: 6.35e-04 2022-05-04 09:55:51,632 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 09:56:00,454 INFO [train.py:742] (0/8) Epoch 2, validation: loss=0.1166, simple_loss=0.2028, pruned_loss=0.01515, over 914524.00 frames. 
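
As in the block ending just above, a validation pass is interleaved with training at a fixed spacing: in this excerpt the validation entries appear every 3000 batches (27000, 30000, and again at 33000 further down), each reporting a frame-weighted loss over the same 914524 validation frames. The sketch below shows only that control flow; the model, data, and optimizer are stand-ins, not the transducer or the project's train.py:

```python
# Sketch of periodic validation interleaved with training, mirroring the
# "Computing validation loss" entries above. Model and data are stand-ins;
# only the control flow is the point.
import torch
import torch.nn as nn

VALID_INTERVAL = 3000  # inferred from the spacing of the validation entries

model = nn.Linear(8, 1)  # stand-in model, not the transducer
optimizer = torch.optim.SGD(model.parameters(), lr=6.4e-4)

def compute_validation_loss(num_batches: int = 4) -> float:
    """Frame-weighted average loss over a (synthetic) validation set."""
    model.eval()
    total, frames = 0.0, 0
    with torch.no_grad():
        for _ in range(num_batches):
            x, y = torch.randn(16, 8), torch.randn(16, 1)
            total += nn.functional.mse_loss(model(x), y, reduction="sum").item()
            frames += x.shape[0]
    model.train()
    return total / frames

for batch_idx in range(1, 2 * VALID_INTERVAL + 1):
    x, y = torch.randn(16, 8), torch.randn(16, 1)
    loss = nn.functional.mse_loss(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    if batch_idx % VALID_INTERVAL == 0:
        print(f"batch {batch_idx}: validation loss={compute_validation_loss():.4f}")
```
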
2022-05-04 09:56:39,113 INFO [train.py:715] (0/8) Epoch 2, batch 30050, loss[loss=0.1584, simple_loss=0.2265, pruned_loss=0.04515, over 4785.00 frames.], tot_loss[loss=0.1679, simple_loss=0.2348, pruned_loss=0.05046, over 971170.72 frames.], batch size: 14, lr: 6.35e-04 2022-05-04 09:57:18,478 INFO [train.py:715] (0/8) Epoch 2, batch 30100, loss[loss=0.1905, simple_loss=0.2563, pruned_loss=0.06236, over 4940.00 frames.], tot_loss[loss=0.1692, simple_loss=0.2361, pruned_loss=0.05113, over 970649.15 frames.], batch size: 14, lr: 6.35e-04 2022-05-04 09:57:57,549 INFO [train.py:715] (0/8) Epoch 2, batch 30150, loss[loss=0.1744, simple_loss=0.2366, pruned_loss=0.0561, over 4984.00 frames.], tot_loss[loss=0.1693, simple_loss=0.2364, pruned_loss=0.05105, over 970832.14 frames.], batch size: 16, lr: 6.35e-04 2022-05-04 09:58:37,025 INFO [train.py:715] (0/8) Epoch 2, batch 30200, loss[loss=0.1671, simple_loss=0.2362, pruned_loss=0.04896, over 4819.00 frames.], tot_loss[loss=0.1694, simple_loss=0.2362, pruned_loss=0.05126, over 971582.84 frames.], batch size: 26, lr: 6.35e-04 2022-05-04 09:59:15,774 INFO [train.py:715] (0/8) Epoch 2, batch 30250, loss[loss=0.1459, simple_loss=0.2046, pruned_loss=0.04362, over 4839.00 frames.], tot_loss[loss=0.1696, simple_loss=0.2368, pruned_loss=0.05114, over 972947.43 frames.], batch size: 13, lr: 6.34e-04 2022-05-04 09:59:55,025 INFO [train.py:715] (0/8) Epoch 2, batch 30300, loss[loss=0.1407, simple_loss=0.2148, pruned_loss=0.03327, over 4700.00 frames.], tot_loss[loss=0.1688, simple_loss=0.2361, pruned_loss=0.05081, over 972515.78 frames.], batch size: 15, lr: 6.34e-04 2022-05-04 10:00:35,004 INFO [train.py:715] (0/8) Epoch 2, batch 30350, loss[loss=0.2017, simple_loss=0.26, pruned_loss=0.07173, over 4942.00 frames.], tot_loss[loss=0.1689, simple_loss=0.2359, pruned_loss=0.05099, over 973289.30 frames.], batch size: 39, lr: 6.34e-04 2022-05-04 10:01:14,087 INFO [train.py:715] (0/8) Epoch 2, batch 30400, loss[loss=0.1688, simple_loss=0.2378, pruned_loss=0.04991, over 4920.00 frames.], tot_loss[loss=0.1681, simple_loss=0.2354, pruned_loss=0.05037, over 972051.41 frames.], batch size: 18, lr: 6.34e-04 2022-05-04 10:01:53,191 INFO [train.py:715] (0/8) Epoch 2, batch 30450, loss[loss=0.1689, simple_loss=0.2341, pruned_loss=0.05186, over 4879.00 frames.], tot_loss[loss=0.1664, simple_loss=0.2339, pruned_loss=0.04947, over 971797.83 frames.], batch size: 16, lr: 6.34e-04 2022-05-04 10:02:32,961 INFO [train.py:715] (0/8) Epoch 2, batch 30500, loss[loss=0.1796, simple_loss=0.2485, pruned_loss=0.05531, over 4850.00 frames.], tot_loss[loss=0.1672, simple_loss=0.2346, pruned_loss=0.04993, over 971405.18 frames.], batch size: 32, lr: 6.34e-04 2022-05-04 10:03:12,627 INFO [train.py:715] (0/8) Epoch 2, batch 30550, loss[loss=0.2184, simple_loss=0.278, pruned_loss=0.07947, over 4790.00 frames.], tot_loss[loss=0.1679, simple_loss=0.2352, pruned_loss=0.05031, over 971925.25 frames.], batch size: 17, lr: 6.33e-04 2022-05-04 10:03:51,363 INFO [train.py:715] (0/8) Epoch 2, batch 30600, loss[loss=0.1627, simple_loss=0.2394, pruned_loss=0.04297, over 4975.00 frames.], tot_loss[loss=0.1677, simple_loss=0.2347, pruned_loss=0.05034, over 971936.87 frames.], batch size: 35, lr: 6.33e-04 2022-05-04 10:04:31,217 INFO [train.py:715] (0/8) Epoch 2, batch 30650, loss[loss=0.1435, simple_loss=0.2022, pruned_loss=0.04244, over 4850.00 frames.], tot_loss[loss=0.1658, simple_loss=0.233, pruned_loss=0.04929, over 972552.68 frames.], batch size: 12, lr: 6.33e-04 2022-05-04 10:05:11,278 INFO 
[train.py:715] (0/8) Epoch 2, batch 30700, loss[loss=0.1515, simple_loss=0.2134, pruned_loss=0.04484, over 4856.00 frames.], tot_loss[loss=0.1659, simple_loss=0.2331, pruned_loss=0.04936, over 972585.72 frames.], batch size: 30, lr: 6.33e-04 2022-05-04 10:05:51,105 INFO [train.py:715] (0/8) Epoch 2, batch 30750, loss[loss=0.1604, simple_loss=0.2331, pruned_loss=0.04387, over 4829.00 frames.], tot_loss[loss=0.1662, simple_loss=0.2335, pruned_loss=0.04948, over 972469.75 frames.], batch size: 15, lr: 6.33e-04 2022-05-04 10:06:30,171 INFO [train.py:715] (0/8) Epoch 2, batch 30800, loss[loss=0.1732, simple_loss=0.2471, pruned_loss=0.04967, over 4896.00 frames.], tot_loss[loss=0.1663, simple_loss=0.2339, pruned_loss=0.04936, over 972558.70 frames.], batch size: 17, lr: 6.33e-04 2022-05-04 10:07:09,678 INFO [train.py:715] (0/8) Epoch 2, batch 30850, loss[loss=0.1451, simple_loss=0.2105, pruned_loss=0.03986, over 4804.00 frames.], tot_loss[loss=0.1673, simple_loss=0.2345, pruned_loss=0.05002, over 971573.98 frames.], batch size: 13, lr: 6.33e-04 2022-05-04 10:07:49,327 INFO [train.py:715] (0/8) Epoch 2, batch 30900, loss[loss=0.1895, simple_loss=0.2508, pruned_loss=0.06413, over 4820.00 frames.], tot_loss[loss=0.1673, simple_loss=0.2348, pruned_loss=0.04988, over 972369.49 frames.], batch size: 26, lr: 6.32e-04 2022-05-04 10:08:27,846 INFO [train.py:715] (0/8) Epoch 2, batch 30950, loss[loss=0.1643, simple_loss=0.2381, pruned_loss=0.04527, over 4880.00 frames.], tot_loss[loss=0.1675, simple_loss=0.2353, pruned_loss=0.04987, over 972473.36 frames.], batch size: 16, lr: 6.32e-04 2022-05-04 10:09:07,769 INFO [train.py:715] (0/8) Epoch 2, batch 31000, loss[loss=0.1469, simple_loss=0.2116, pruned_loss=0.04112, over 4955.00 frames.], tot_loss[loss=0.1673, simple_loss=0.235, pruned_loss=0.04979, over 972374.53 frames.], batch size: 35, lr: 6.32e-04 2022-05-04 10:09:48,212 INFO [train.py:715] (0/8) Epoch 2, batch 31050, loss[loss=0.1755, simple_loss=0.2371, pruned_loss=0.05693, over 4800.00 frames.], tot_loss[loss=0.1687, simple_loss=0.2361, pruned_loss=0.05068, over 972327.57 frames.], batch size: 25, lr: 6.32e-04 2022-05-04 10:10:27,689 INFO [train.py:715] (0/8) Epoch 2, batch 31100, loss[loss=0.1535, simple_loss=0.2334, pruned_loss=0.03677, over 4928.00 frames.], tot_loss[loss=0.1687, simple_loss=0.236, pruned_loss=0.05071, over 972498.82 frames.], batch size: 29, lr: 6.32e-04 2022-05-04 10:11:07,480 INFO [train.py:715] (0/8) Epoch 2, batch 31150, loss[loss=0.1857, simple_loss=0.2395, pruned_loss=0.06594, over 4804.00 frames.], tot_loss[loss=0.1685, simple_loss=0.2362, pruned_loss=0.05043, over 972502.92 frames.], batch size: 24, lr: 6.32e-04 2022-05-04 10:11:47,656 INFO [train.py:715] (0/8) Epoch 2, batch 31200, loss[loss=0.1858, simple_loss=0.2578, pruned_loss=0.05687, over 4885.00 frames.], tot_loss[loss=0.1679, simple_loss=0.2357, pruned_loss=0.05011, over 972289.95 frames.], batch size: 39, lr: 6.31e-04 2022-05-04 10:12:27,442 INFO [train.py:715] (0/8) Epoch 2, batch 31250, loss[loss=0.1419, simple_loss=0.2056, pruned_loss=0.03907, over 4786.00 frames.], tot_loss[loss=0.1689, simple_loss=0.2364, pruned_loss=0.05071, over 971596.37 frames.], batch size: 18, lr: 6.31e-04 2022-05-04 10:13:06,642 INFO [train.py:715] (0/8) Epoch 2, batch 31300, loss[loss=0.1772, simple_loss=0.2308, pruned_loss=0.06176, over 4780.00 frames.], tot_loss[loss=0.1679, simple_loss=0.2353, pruned_loss=0.05023, over 971870.32 frames.], batch size: 12, lr: 6.31e-04 2022-05-04 10:13:46,586 INFO [train.py:715] (0/8) Epoch 2, 
batch 31350, loss[loss=0.1819, simple_loss=0.2648, pruned_loss=0.04949, over 4851.00 frames.], tot_loss[loss=0.168, simple_loss=0.2354, pruned_loss=0.05031, over 972773.69 frames.], batch size: 20, lr: 6.31e-04 2022-05-04 10:14:26,948 INFO [train.py:715] (0/8) Epoch 2, batch 31400, loss[loss=0.1587, simple_loss=0.2286, pruned_loss=0.04434, over 4741.00 frames.], tot_loss[loss=0.1681, simple_loss=0.2351, pruned_loss=0.0505, over 972744.07 frames.], batch size: 16, lr: 6.31e-04 2022-05-04 10:15:06,595 INFO [train.py:715] (0/8) Epoch 2, batch 31450, loss[loss=0.1201, simple_loss=0.184, pruned_loss=0.02808, over 4823.00 frames.], tot_loss[loss=0.1703, simple_loss=0.2369, pruned_loss=0.05184, over 972747.98 frames.], batch size: 13, lr: 6.31e-04 2022-05-04 10:15:46,238 INFO [train.py:715] (0/8) Epoch 2, batch 31500, loss[loss=0.1392, simple_loss=0.2025, pruned_loss=0.03801, over 4985.00 frames.], tot_loss[loss=0.1704, simple_loss=0.2373, pruned_loss=0.05172, over 973327.95 frames.], batch size: 25, lr: 6.31e-04 2022-05-04 10:16:26,028 INFO [train.py:715] (0/8) Epoch 2, batch 31550, loss[loss=0.1817, simple_loss=0.2494, pruned_loss=0.05695, over 4833.00 frames.], tot_loss[loss=0.1701, simple_loss=0.2368, pruned_loss=0.05171, over 973858.90 frames.], batch size: 30, lr: 6.30e-04 2022-05-04 10:17:05,439 INFO [train.py:715] (0/8) Epoch 2, batch 31600, loss[loss=0.1272, simple_loss=0.2001, pruned_loss=0.02712, over 4910.00 frames.], tot_loss[loss=0.1699, simple_loss=0.2371, pruned_loss=0.05133, over 974452.15 frames.], batch size: 18, lr: 6.30e-04 2022-05-04 10:17:44,223 INFO [train.py:715] (0/8) Epoch 2, batch 31650, loss[loss=0.1912, simple_loss=0.2544, pruned_loss=0.06399, over 4851.00 frames.], tot_loss[loss=0.1694, simple_loss=0.2364, pruned_loss=0.05114, over 974437.92 frames.], batch size: 30, lr: 6.30e-04 2022-05-04 10:18:24,068 INFO [train.py:715] (0/8) Epoch 2, batch 31700, loss[loss=0.1572, simple_loss=0.2375, pruned_loss=0.03848, over 4810.00 frames.], tot_loss[loss=0.1687, simple_loss=0.2354, pruned_loss=0.05098, over 973978.56 frames.], batch size: 25, lr: 6.30e-04 2022-05-04 10:19:04,300 INFO [train.py:715] (0/8) Epoch 2, batch 31750, loss[loss=0.1618, simple_loss=0.221, pruned_loss=0.05133, over 4918.00 frames.], tot_loss[loss=0.1685, simple_loss=0.2355, pruned_loss=0.05078, over 973773.67 frames.], batch size: 18, lr: 6.30e-04 2022-05-04 10:19:44,139 INFO [train.py:715] (0/8) Epoch 2, batch 31800, loss[loss=0.1655, simple_loss=0.2399, pruned_loss=0.0455, over 4751.00 frames.], tot_loss[loss=0.1675, simple_loss=0.2344, pruned_loss=0.05034, over 973161.35 frames.], batch size: 16, lr: 6.30e-04 2022-05-04 10:20:23,463 INFO [train.py:715] (0/8) Epoch 2, batch 31850, loss[loss=0.1825, simple_loss=0.2412, pruned_loss=0.06193, over 4853.00 frames.], tot_loss[loss=0.1684, simple_loss=0.235, pruned_loss=0.05088, over 972907.96 frames.], batch size: 20, lr: 6.29e-04 2022-05-04 10:21:02,953 INFO [train.py:715] (0/8) Epoch 2, batch 31900, loss[loss=0.1609, simple_loss=0.2371, pruned_loss=0.0423, over 4982.00 frames.], tot_loss[loss=0.1676, simple_loss=0.2345, pruned_loss=0.05034, over 973459.64 frames.], batch size: 24, lr: 6.29e-04 2022-05-04 10:21:42,551 INFO [train.py:715] (0/8) Epoch 2, batch 31950, loss[loss=0.1516, simple_loss=0.2168, pruned_loss=0.04323, over 4772.00 frames.], tot_loss[loss=0.1671, simple_loss=0.2344, pruned_loss=0.04992, over 973554.16 frames.], batch size: 12, lr: 6.29e-04 2022-05-04 10:22:21,473 INFO [train.py:715] (0/8) Epoch 2, batch 32000, loss[loss=0.1484, 
simple_loss=0.2058, pruned_loss=0.04548, over 4758.00 frames.], tot_loss[loss=0.1669, simple_loss=0.2337, pruned_loss=0.05007, over 973360.11 frames.], batch size: 19, lr: 6.29e-04 2022-05-04 10:23:01,106 INFO [train.py:715] (0/8) Epoch 2, batch 32050, loss[loss=0.1801, simple_loss=0.2416, pruned_loss=0.05931, over 4907.00 frames.], tot_loss[loss=0.1676, simple_loss=0.2346, pruned_loss=0.05029, over 972739.22 frames.], batch size: 19, lr: 6.29e-04 2022-05-04 10:23:41,015 INFO [train.py:715] (0/8) Epoch 2, batch 32100, loss[loss=0.1327, simple_loss=0.2033, pruned_loss=0.03109, over 4993.00 frames.], tot_loss[loss=0.1676, simple_loss=0.235, pruned_loss=0.05008, over 972736.00 frames.], batch size: 14, lr: 6.29e-04 2022-05-04 10:24:20,303 INFO [train.py:715] (0/8) Epoch 2, batch 32150, loss[loss=0.1451, simple_loss=0.2177, pruned_loss=0.03626, over 4820.00 frames.], tot_loss[loss=0.1672, simple_loss=0.2347, pruned_loss=0.04986, over 973006.78 frames.], batch size: 26, lr: 6.29e-04 2022-05-04 10:24:59,277 INFO [train.py:715] (0/8) Epoch 2, batch 32200, loss[loss=0.1839, simple_loss=0.2432, pruned_loss=0.0623, over 4698.00 frames.], tot_loss[loss=0.1681, simple_loss=0.235, pruned_loss=0.05055, over 972967.01 frames.], batch size: 15, lr: 6.28e-04 2022-05-04 10:25:39,137 INFO [train.py:715] (0/8) Epoch 2, batch 32250, loss[loss=0.1825, simple_loss=0.2514, pruned_loss=0.05677, over 4875.00 frames.], tot_loss[loss=0.1678, simple_loss=0.235, pruned_loss=0.05033, over 973426.33 frames.], batch size: 39, lr: 6.28e-04 2022-05-04 10:26:18,493 INFO [train.py:715] (0/8) Epoch 2, batch 32300, loss[loss=0.1628, simple_loss=0.2355, pruned_loss=0.04503, over 4789.00 frames.], tot_loss[loss=0.1671, simple_loss=0.2345, pruned_loss=0.04988, over 973699.63 frames.], batch size: 17, lr: 6.28e-04 2022-05-04 10:26:57,489 INFO [train.py:715] (0/8) Epoch 2, batch 32350, loss[loss=0.15, simple_loss=0.2185, pruned_loss=0.04081, over 4929.00 frames.], tot_loss[loss=0.1676, simple_loss=0.2348, pruned_loss=0.05022, over 973985.00 frames.], batch size: 18, lr: 6.28e-04 2022-05-04 10:27:37,324 INFO [train.py:715] (0/8) Epoch 2, batch 32400, loss[loss=0.1297, simple_loss=0.1934, pruned_loss=0.03297, over 4783.00 frames.], tot_loss[loss=0.1671, simple_loss=0.2343, pruned_loss=0.0499, over 974014.20 frames.], batch size: 12, lr: 6.28e-04 2022-05-04 10:28:17,092 INFO [train.py:715] (0/8) Epoch 2, batch 32450, loss[loss=0.1395, simple_loss=0.2126, pruned_loss=0.03315, over 4968.00 frames.], tot_loss[loss=0.1674, simple_loss=0.2346, pruned_loss=0.0501, over 973744.11 frames.], batch size: 15, lr: 6.28e-04 2022-05-04 10:28:56,073 INFO [train.py:715] (0/8) Epoch 2, batch 32500, loss[loss=0.1672, simple_loss=0.241, pruned_loss=0.04668, over 4762.00 frames.], tot_loss[loss=0.1664, simple_loss=0.2339, pruned_loss=0.04945, over 972960.03 frames.], batch size: 19, lr: 6.27e-04 2022-05-04 10:29:35,588 INFO [train.py:715] (0/8) Epoch 2, batch 32550, loss[loss=0.1584, simple_loss=0.2299, pruned_loss=0.04342, over 4841.00 frames.], tot_loss[loss=0.1662, simple_loss=0.2338, pruned_loss=0.04924, over 973623.78 frames.], batch size: 20, lr: 6.27e-04 2022-05-04 10:30:15,647 INFO [train.py:715] (0/8) Epoch 2, batch 32600, loss[loss=0.1586, simple_loss=0.2265, pruned_loss=0.04535, over 4970.00 frames.], tot_loss[loss=0.1671, simple_loss=0.235, pruned_loss=0.04967, over 974370.41 frames.], batch size: 15, lr: 6.27e-04 2022-05-04 10:30:54,905 INFO [train.py:715] (0/8) Epoch 2, batch 32650, loss[loss=0.1587, simple_loss=0.2277, pruned_loss=0.04491, 
over 4963.00 frames.], tot_loss[loss=0.1661, simple_loss=0.2338, pruned_loss=0.04923, over 973601.51 frames.], batch size: 24, lr: 6.27e-04 2022-05-04 10:31:33,747 INFO [train.py:715] (0/8) Epoch 2, batch 32700, loss[loss=0.1883, simple_loss=0.2536, pruned_loss=0.06145, over 4818.00 frames.], tot_loss[loss=0.1669, simple_loss=0.2342, pruned_loss=0.04984, over 973282.97 frames.], batch size: 27, lr: 6.27e-04 2022-05-04 10:32:13,534 INFO [train.py:715] (0/8) Epoch 2, batch 32750, loss[loss=0.1703, simple_loss=0.2449, pruned_loss=0.04782, over 4851.00 frames.], tot_loss[loss=0.1672, simple_loss=0.2347, pruned_loss=0.04981, over 972637.69 frames.], batch size: 20, lr: 6.27e-04 2022-05-04 10:32:53,518 INFO [train.py:715] (0/8) Epoch 2, batch 32800, loss[loss=0.177, simple_loss=0.2388, pruned_loss=0.05763, over 4815.00 frames.], tot_loss[loss=0.1674, simple_loss=0.2348, pruned_loss=0.04998, over 972790.65 frames.], batch size: 13, lr: 6.27e-04 2022-05-04 10:33:32,250 INFO [train.py:715] (0/8) Epoch 2, batch 32850, loss[loss=0.1681, simple_loss=0.2345, pruned_loss=0.05083, over 4809.00 frames.], tot_loss[loss=0.1692, simple_loss=0.2364, pruned_loss=0.05103, over 971583.90 frames.], batch size: 26, lr: 6.26e-04 2022-05-04 10:34:11,595 INFO [train.py:715] (0/8) Epoch 2, batch 32900, loss[loss=0.1351, simple_loss=0.2134, pruned_loss=0.02839, over 4828.00 frames.], tot_loss[loss=0.1693, simple_loss=0.2363, pruned_loss=0.05116, over 972028.77 frames.], batch size: 26, lr: 6.26e-04 2022-05-04 10:34:51,515 INFO [train.py:715] (0/8) Epoch 2, batch 32950, loss[loss=0.1544, simple_loss=0.2314, pruned_loss=0.03868, over 4801.00 frames.], tot_loss[loss=0.1679, simple_loss=0.2354, pruned_loss=0.05015, over 972101.58 frames.], batch size: 24, lr: 6.26e-04 2022-05-04 10:35:30,094 INFO [train.py:715] (0/8) Epoch 2, batch 33000, loss[loss=0.1678, simple_loss=0.2391, pruned_loss=0.04828, over 4963.00 frames.], tot_loss[loss=0.1674, simple_loss=0.2348, pruned_loss=0.05006, over 972462.97 frames.], batch size: 15, lr: 6.26e-04 2022-05-04 10:35:30,095 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 10:35:38,853 INFO [train.py:742] (0/8) Epoch 2, validation: loss=0.1163, simple_loss=0.2025, pruned_loss=0.01504, over 914524.00 frames. 
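
The learning rate printed with each entry drifts down slowly within the epoch (6.48e-04 near batch 26000 at the top of this excerpt, 6.26e-04 by batch 33000 here) and steps down again when epoch 3 starts further down (5.87e-04). Both behaviours are consistent with an Eden-style schedule that decays in the global batch counter and in the epoch number; the sketch below reproduces the logged rates, but its constants are assumptions fitted to those rates, not values read from this excerpt, and batch_idx is the global counter (about 96000 around epoch 2, batch 26350, per the checkpoint-96000.pt save above).

```python
# Hedged illustration: a schedule of this shape reproduces the logged rates.
# base_lr, lr_batches and lr_epochs are fitted assumptions, not values taken
# from this log excerpt.
def lr_at(batch_idx: int, epoch: int,
          base_lr: float = 3e-3, lr_batches: float = 5000.0, lr_epochs: float = 4.0) -> float:
    batch_factor = ((batch_idx ** 2 + lr_batches ** 2) / lr_batches ** 2) ** -0.25
    epoch_factor = ((epoch ** 2 + lr_epochs ** 2) / lr_epochs ** 2) ** -0.25
    return base_lr * batch_factor * epoch_factor

print(f"{lr_at(95_650, epoch=2):.2e}")   # ~6.48e-04, cf. the entries near batch 26000 above
print(f"{lr_at(104_500, epoch=3):.2e}")  # ~5.87e-04, cf. the first epoch-3 entries below
```

Under these assumed constants, the slow within-epoch drift comes from the batch factor, while the step-down at each epoch boundary comes from the epoch factor.
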
2022-05-04 10:36:17,839 INFO [train.py:715] (0/8) Epoch 2, batch 33050, loss[loss=0.1514, simple_loss=0.2252, pruned_loss=0.03881, over 4880.00 frames.], tot_loss[loss=0.1666, simple_loss=0.2342, pruned_loss=0.04947, over 971748.23 frames.], batch size: 19, lr: 6.26e-04 2022-05-04 10:36:57,381 INFO [train.py:715] (0/8) Epoch 2, batch 33100, loss[loss=0.1574, simple_loss=0.2253, pruned_loss=0.04476, over 4964.00 frames.], tot_loss[loss=0.1667, simple_loss=0.2342, pruned_loss=0.0496, over 972465.41 frames.], batch size: 24, lr: 6.26e-04 2022-05-04 10:37:37,176 INFO [train.py:715] (0/8) Epoch 2, batch 33150, loss[loss=0.1722, simple_loss=0.2403, pruned_loss=0.05204, over 4682.00 frames.], tot_loss[loss=0.1664, simple_loss=0.2342, pruned_loss=0.04927, over 972245.13 frames.], batch size: 15, lr: 6.25e-04 2022-05-04 10:38:16,781 INFO [train.py:715] (0/8) Epoch 2, batch 33200, loss[loss=0.1478, simple_loss=0.2173, pruned_loss=0.03913, over 4909.00 frames.], tot_loss[loss=0.166, simple_loss=0.2335, pruned_loss=0.04927, over 971941.33 frames.], batch size: 18, lr: 6.25e-04 2022-05-04 10:38:56,317 INFO [train.py:715] (0/8) Epoch 2, batch 33250, loss[loss=0.1516, simple_loss=0.2253, pruned_loss=0.03893, over 4839.00 frames.], tot_loss[loss=0.1663, simple_loss=0.234, pruned_loss=0.04933, over 971803.54 frames.], batch size: 26, lr: 6.25e-04 2022-05-04 10:39:35,517 INFO [train.py:715] (0/8) Epoch 2, batch 33300, loss[loss=0.1915, simple_loss=0.2566, pruned_loss=0.0632, over 4777.00 frames.], tot_loss[loss=0.1671, simple_loss=0.2345, pruned_loss=0.04985, over 971390.72 frames.], batch size: 14, lr: 6.25e-04 2022-05-04 10:40:14,693 INFO [train.py:715] (0/8) Epoch 2, batch 33350, loss[loss=0.2021, simple_loss=0.2709, pruned_loss=0.06663, over 4953.00 frames.], tot_loss[loss=0.1674, simple_loss=0.2346, pruned_loss=0.05013, over 971461.53 frames.], batch size: 39, lr: 6.25e-04 2022-05-04 10:40:53,960 INFO [train.py:715] (0/8) Epoch 2, batch 33400, loss[loss=0.18, simple_loss=0.2432, pruned_loss=0.0584, over 4759.00 frames.], tot_loss[loss=0.1674, simple_loss=0.2346, pruned_loss=0.05006, over 970821.42 frames.], batch size: 16, lr: 6.25e-04 2022-05-04 10:41:33,182 INFO [train.py:715] (0/8) Epoch 2, batch 33450, loss[loss=0.1937, simple_loss=0.2525, pruned_loss=0.06744, over 4901.00 frames.], tot_loss[loss=0.1689, simple_loss=0.2359, pruned_loss=0.05092, over 970817.01 frames.], batch size: 17, lr: 6.25e-04 2022-05-04 10:42:13,247 INFO [train.py:715] (0/8) Epoch 2, batch 33500, loss[loss=0.1505, simple_loss=0.2259, pruned_loss=0.0375, over 4949.00 frames.], tot_loss[loss=0.1695, simple_loss=0.2366, pruned_loss=0.05126, over 971312.83 frames.], batch size: 21, lr: 6.24e-04 2022-05-04 10:42:52,006 INFO [train.py:715] (0/8) Epoch 2, batch 33550, loss[loss=0.1695, simple_loss=0.2343, pruned_loss=0.05236, over 4972.00 frames.], tot_loss[loss=0.1684, simple_loss=0.2355, pruned_loss=0.05071, over 972475.62 frames.], batch size: 28, lr: 6.24e-04 2022-05-04 10:43:31,503 INFO [train.py:715] (0/8) Epoch 2, batch 33600, loss[loss=0.1659, simple_loss=0.234, pruned_loss=0.04885, over 4907.00 frames.], tot_loss[loss=0.1693, simple_loss=0.2361, pruned_loss=0.05124, over 971861.44 frames.], batch size: 18, lr: 6.24e-04 2022-05-04 10:44:11,052 INFO [train.py:715] (0/8) Epoch 2, batch 33650, loss[loss=0.1582, simple_loss=0.2327, pruned_loss=0.04182, over 4957.00 frames.], tot_loss[loss=0.1683, simple_loss=0.2354, pruned_loss=0.05058, over 972191.53 frames.], batch size: 24, lr: 6.24e-04 2022-05-04 10:44:50,487 INFO 
[train.py:715] (0/8) Epoch 2, batch 33700, loss[loss=0.184, simple_loss=0.2623, pruned_loss=0.05281, over 4812.00 frames.], tot_loss[loss=0.1687, simple_loss=0.2358, pruned_loss=0.05079, over 973032.25 frames.], batch size: 27, lr: 6.24e-04 2022-05-04 10:45:29,904 INFO [train.py:715] (0/8) Epoch 2, batch 33750, loss[loss=0.1474, simple_loss=0.2164, pruned_loss=0.03926, over 4958.00 frames.], tot_loss[loss=0.169, simple_loss=0.2363, pruned_loss=0.05084, over 973529.78 frames.], batch size: 29, lr: 6.24e-04 2022-05-04 10:46:09,309 INFO [train.py:715] (0/8) Epoch 2, batch 33800, loss[loss=0.1507, simple_loss=0.2107, pruned_loss=0.04533, over 4799.00 frames.], tot_loss[loss=0.1683, simple_loss=0.2356, pruned_loss=0.05048, over 973635.07 frames.], batch size: 12, lr: 6.23e-04 2022-05-04 10:46:49,488 INFO [train.py:715] (0/8) Epoch 2, batch 33850, loss[loss=0.1337, simple_loss=0.196, pruned_loss=0.03568, over 4858.00 frames.], tot_loss[loss=0.1674, simple_loss=0.2347, pruned_loss=0.05001, over 973540.19 frames.], batch size: 13, lr: 6.23e-04 2022-05-04 10:47:28,885 INFO [train.py:715] (0/8) Epoch 2, batch 33900, loss[loss=0.1746, simple_loss=0.2306, pruned_loss=0.05929, over 4788.00 frames.], tot_loss[loss=0.1672, simple_loss=0.2343, pruned_loss=0.05004, over 974224.57 frames.], batch size: 14, lr: 6.23e-04 2022-05-04 10:48:08,027 INFO [train.py:715] (0/8) Epoch 2, batch 33950, loss[loss=0.153, simple_loss=0.22, pruned_loss=0.04298, over 4639.00 frames.], tot_loss[loss=0.1671, simple_loss=0.2343, pruned_loss=0.04993, over 972995.89 frames.], batch size: 13, lr: 6.23e-04 2022-05-04 10:48:47,948 INFO [train.py:715] (0/8) Epoch 2, batch 34000, loss[loss=0.1846, simple_loss=0.2458, pruned_loss=0.06164, over 4707.00 frames.], tot_loss[loss=0.1672, simple_loss=0.2344, pruned_loss=0.05001, over 973077.95 frames.], batch size: 15, lr: 6.23e-04 2022-05-04 10:49:27,581 INFO [train.py:715] (0/8) Epoch 2, batch 34050, loss[loss=0.1406, simple_loss=0.2224, pruned_loss=0.02937, over 4957.00 frames.], tot_loss[loss=0.1679, simple_loss=0.2351, pruned_loss=0.05035, over 973523.35 frames.], batch size: 21, lr: 6.23e-04 2022-05-04 10:50:07,049 INFO [train.py:715] (0/8) Epoch 2, batch 34100, loss[loss=0.1429, simple_loss=0.2157, pruned_loss=0.035, over 4979.00 frames.], tot_loss[loss=0.1671, simple_loss=0.2346, pruned_loss=0.04976, over 972586.95 frames.], batch size: 35, lr: 6.23e-04 2022-05-04 10:50:46,460 INFO [train.py:715] (0/8) Epoch 2, batch 34150, loss[loss=0.1522, simple_loss=0.2129, pruned_loss=0.0457, over 4792.00 frames.], tot_loss[loss=0.1667, simple_loss=0.2342, pruned_loss=0.04959, over 972674.97 frames.], batch size: 14, lr: 6.22e-04 2022-05-04 10:51:26,748 INFO [train.py:715] (0/8) Epoch 2, batch 34200, loss[loss=0.1678, simple_loss=0.2442, pruned_loss=0.04573, over 4843.00 frames.], tot_loss[loss=0.1676, simple_loss=0.235, pruned_loss=0.05016, over 973190.39 frames.], batch size: 15, lr: 6.22e-04 2022-05-04 10:52:06,318 INFO [train.py:715] (0/8) Epoch 2, batch 34250, loss[loss=0.2011, simple_loss=0.2586, pruned_loss=0.07182, over 4730.00 frames.], tot_loss[loss=0.1692, simple_loss=0.2365, pruned_loss=0.05091, over 973069.29 frames.], batch size: 16, lr: 6.22e-04 2022-05-04 10:52:45,481 INFO [train.py:715] (0/8) Epoch 2, batch 34300, loss[loss=0.1336, simple_loss=0.2062, pruned_loss=0.03046, over 4969.00 frames.], tot_loss[loss=0.1682, simple_loss=0.2354, pruned_loss=0.05045, over 973449.51 frames.], batch size: 24, lr: 6.22e-04 2022-05-04 10:53:25,367 INFO [train.py:715] (0/8) Epoch 2, batch 
34350, loss[loss=0.1746, simple_loss=0.2416, pruned_loss=0.05383, over 4856.00 frames.], tot_loss[loss=0.1683, simple_loss=0.2354, pruned_loss=0.0506, over 972577.68 frames.], batch size: 20, lr: 6.22e-04 2022-05-04 10:53:48,989 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-104000.pt 2022-05-04 10:54:07,388 INFO [train.py:715] (0/8) Epoch 2, batch 34400, loss[loss=0.1729, simple_loss=0.2382, pruned_loss=0.05384, over 4888.00 frames.], tot_loss[loss=0.1687, simple_loss=0.2359, pruned_loss=0.05079, over 971805.81 frames.], batch size: 16, lr: 6.22e-04 2022-05-04 10:54:46,514 INFO [train.py:715] (0/8) Epoch 2, batch 34450, loss[loss=0.1453, simple_loss=0.2166, pruned_loss=0.03699, over 4793.00 frames.], tot_loss[loss=0.1694, simple_loss=0.2367, pruned_loss=0.05104, over 972266.18 frames.], batch size: 13, lr: 6.22e-04 2022-05-04 10:55:25,439 INFO [train.py:715] (0/8) Epoch 2, batch 34500, loss[loss=0.2359, simple_loss=0.285, pruned_loss=0.09341, over 4900.00 frames.], tot_loss[loss=0.1687, simple_loss=0.2363, pruned_loss=0.05061, over 971747.82 frames.], batch size: 19, lr: 6.21e-04 2022-05-04 10:56:05,350 INFO [train.py:715] (0/8) Epoch 2, batch 34550, loss[loss=0.1574, simple_loss=0.2333, pruned_loss=0.04077, over 4811.00 frames.], tot_loss[loss=0.1671, simple_loss=0.2348, pruned_loss=0.04968, over 971563.43 frames.], batch size: 27, lr: 6.21e-04 2022-05-04 10:56:44,140 INFO [train.py:715] (0/8) Epoch 2, batch 34600, loss[loss=0.1837, simple_loss=0.248, pruned_loss=0.0597, over 4876.00 frames.], tot_loss[loss=0.1668, simple_loss=0.2346, pruned_loss=0.04956, over 970796.56 frames.], batch size: 16, lr: 6.21e-04 2022-05-04 10:57:23,175 INFO [train.py:715] (0/8) Epoch 2, batch 34650, loss[loss=0.2005, simple_loss=0.2509, pruned_loss=0.07506, over 4964.00 frames.], tot_loss[loss=0.1677, simple_loss=0.2354, pruned_loss=0.05002, over 971399.43 frames.], batch size: 15, lr: 6.21e-04 2022-05-04 10:58:02,534 INFO [train.py:715] (0/8) Epoch 2, batch 34700, loss[loss=0.151, simple_loss=0.222, pruned_loss=0.04002, over 4838.00 frames.], tot_loss[loss=0.1687, simple_loss=0.2361, pruned_loss=0.05063, over 972290.43 frames.], batch size: 15, lr: 6.21e-04 2022-05-04 10:58:40,566 INFO [train.py:715] (0/8) Epoch 2, batch 34750, loss[loss=0.1805, simple_loss=0.2513, pruned_loss=0.0548, over 4683.00 frames.], tot_loss[loss=0.1688, simple_loss=0.2362, pruned_loss=0.05073, over 972118.44 frames.], batch size: 15, lr: 6.21e-04 2022-05-04 10:59:17,105 INFO [train.py:715] (0/8) Epoch 2, batch 34800, loss[loss=0.1499, simple_loss=0.2017, pruned_loss=0.04904, over 4824.00 frames.], tot_loss[loss=0.1675, simple_loss=0.2349, pruned_loss=0.05003, over 972278.20 frames.], batch size: 12, lr: 6.20e-04 2022-05-04 10:59:26,778 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/epoch-2.pt 2022-05-04 11:00:07,065 INFO [train.py:715] (0/8) Epoch 3, batch 0, loss[loss=0.1642, simple_loss=0.235, pruned_loss=0.04668, over 4819.00 frames.], tot_loss[loss=0.1642, simple_loss=0.235, pruned_loss=0.04668, over 4819.00 frames.], batch size: 25, lr: 5.87e-04 2022-05-04 11:00:45,749 INFO [train.py:715] (0/8) Epoch 3, batch 50, loss[loss=0.1662, simple_loss=0.2414, pruned_loss=0.04545, over 4897.00 frames.], tot_loss[loss=0.1663, simple_loss=0.2347, pruned_loss=0.04896, over 219745.68 frames.], batch size: 19, lr: 5.87e-04 2022-05-04 11:01:25,678 INFO [train.py:715] (0/8) Epoch 3, batch 100, loss[loss=0.2137, simple_loss=0.2676, pruned_loss=0.0799, 
over 4893.00 frames.], tot_loss[loss=0.1701, simple_loss=0.2377, pruned_loss=0.05129, over 387677.85 frames.], batch size: 17, lr: 5.87e-04 2022-05-04 11:02:05,233 INFO [train.py:715] (0/8) Epoch 3, batch 150, loss[loss=0.1729, simple_loss=0.2436, pruned_loss=0.05113, over 4956.00 frames.], tot_loss[loss=0.1684, simple_loss=0.236, pruned_loss=0.05038, over 517766.04 frames.], batch size: 21, lr: 5.86e-04 2022-05-04 11:02:44,387 INFO [train.py:715] (0/8) Epoch 3, batch 200, loss[loss=0.1734, simple_loss=0.2391, pruned_loss=0.05382, over 4820.00 frames.], tot_loss[loss=0.1678, simple_loss=0.2347, pruned_loss=0.05045, over 619127.74 frames.], batch size: 27, lr: 5.86e-04 2022-05-04 11:03:23,625 INFO [train.py:715] (0/8) Epoch 3, batch 250, loss[loss=0.1401, simple_loss=0.2049, pruned_loss=0.03768, over 4872.00 frames.], tot_loss[loss=0.1678, simple_loss=0.2349, pruned_loss=0.05033, over 697837.29 frames.], batch size: 22, lr: 5.86e-04 2022-05-04 11:04:03,629 INFO [train.py:715] (0/8) Epoch 3, batch 300, loss[loss=0.151, simple_loss=0.225, pruned_loss=0.03854, over 4831.00 frames.], tot_loss[loss=0.1662, simple_loss=0.2339, pruned_loss=0.04929, over 758606.57 frames.], batch size: 27, lr: 5.86e-04 2022-05-04 11:04:42,644 INFO [train.py:715] (0/8) Epoch 3, batch 350, loss[loss=0.1915, simple_loss=0.2585, pruned_loss=0.0623, over 4788.00 frames.], tot_loss[loss=0.1676, simple_loss=0.2354, pruned_loss=0.04987, over 806195.95 frames.], batch size: 17, lr: 5.86e-04 2022-05-04 11:05:21,845 INFO [train.py:715] (0/8) Epoch 3, batch 400, loss[loss=0.1898, simple_loss=0.2502, pruned_loss=0.0647, over 4869.00 frames.], tot_loss[loss=0.1667, simple_loss=0.2342, pruned_loss=0.04961, over 842980.05 frames.], batch size: 16, lr: 5.86e-04 2022-05-04 11:06:01,609 INFO [train.py:715] (0/8) Epoch 3, batch 450, loss[loss=0.1433, simple_loss=0.2221, pruned_loss=0.03231, over 4984.00 frames.], tot_loss[loss=0.1656, simple_loss=0.2335, pruned_loss=0.04887, over 871965.98 frames.], batch size: 28, lr: 5.86e-04 2022-05-04 11:06:41,123 INFO [train.py:715] (0/8) Epoch 3, batch 500, loss[loss=0.1462, simple_loss=0.2204, pruned_loss=0.03606, over 4820.00 frames.], tot_loss[loss=0.1656, simple_loss=0.2335, pruned_loss=0.04884, over 893507.14 frames.], batch size: 13, lr: 5.85e-04 2022-05-04 11:07:20,467 INFO [train.py:715] (0/8) Epoch 3, batch 550, loss[loss=0.199, simple_loss=0.2625, pruned_loss=0.06775, over 4929.00 frames.], tot_loss[loss=0.166, simple_loss=0.2337, pruned_loss=0.04913, over 910917.86 frames.], batch size: 23, lr: 5.85e-04 2022-05-04 11:07:59,338 INFO [train.py:715] (0/8) Epoch 3, batch 600, loss[loss=0.1644, simple_loss=0.2371, pruned_loss=0.04583, over 4985.00 frames.], tot_loss[loss=0.1655, simple_loss=0.2332, pruned_loss=0.04886, over 925161.34 frames.], batch size: 25, lr: 5.85e-04 2022-05-04 11:08:39,291 INFO [train.py:715] (0/8) Epoch 3, batch 650, loss[loss=0.1593, simple_loss=0.2173, pruned_loss=0.05071, over 4970.00 frames.], tot_loss[loss=0.1677, simple_loss=0.2346, pruned_loss=0.05039, over 936222.90 frames.], batch size: 15, lr: 5.85e-04 2022-05-04 11:09:18,638 INFO [train.py:715] (0/8) Epoch 3, batch 700, loss[loss=0.1664, simple_loss=0.2256, pruned_loss=0.05359, over 4990.00 frames.], tot_loss[loss=0.1681, simple_loss=0.2345, pruned_loss=0.05082, over 943863.37 frames.], batch size: 28, lr: 5.85e-04 2022-05-04 11:09:57,738 INFO [train.py:715] (0/8) Epoch 3, batch 750, loss[loss=0.1546, simple_loss=0.2209, pruned_loss=0.04414, over 4969.00 frames.], tot_loss[loss=0.1673, simple_loss=0.2341, 
pruned_loss=0.05018, over 950296.54 frames.], batch size: 15, lr: 5.85e-04 2022-05-04 11:10:37,300 INFO [train.py:715] (0/8) Epoch 3, batch 800, loss[loss=0.1246, simple_loss=0.1999, pruned_loss=0.0247, over 4779.00 frames.], tot_loss[loss=0.1671, simple_loss=0.234, pruned_loss=0.05003, over 955702.29 frames.], batch size: 14, lr: 5.85e-04 2022-05-04 11:11:17,441 INFO [train.py:715] (0/8) Epoch 3, batch 850, loss[loss=0.1504, simple_loss=0.2201, pruned_loss=0.04035, over 4763.00 frames.], tot_loss[loss=0.1673, simple_loss=0.2343, pruned_loss=0.05014, over 958694.84 frames.], batch size: 19, lr: 5.84e-04 2022-05-04 11:11:56,829 INFO [train.py:715] (0/8) Epoch 3, batch 900, loss[loss=0.1577, simple_loss=0.227, pruned_loss=0.0442, over 4754.00 frames.], tot_loss[loss=0.1659, simple_loss=0.2336, pruned_loss=0.04916, over 961407.42 frames.], batch size: 16, lr: 5.84e-04 2022-05-04 11:12:35,441 INFO [train.py:715] (0/8) Epoch 3, batch 950, loss[loss=0.1397, simple_loss=0.2068, pruned_loss=0.03625, over 4762.00 frames.], tot_loss[loss=0.1652, simple_loss=0.2328, pruned_loss=0.04875, over 964293.41 frames.], batch size: 18, lr: 5.84e-04 2022-05-04 11:13:15,426 INFO [train.py:715] (0/8) Epoch 3, batch 1000, loss[loss=0.1658, simple_loss=0.2393, pruned_loss=0.04616, over 4916.00 frames.], tot_loss[loss=0.1655, simple_loss=0.2331, pruned_loss=0.04895, over 967078.27 frames.], batch size: 18, lr: 5.84e-04 2022-05-04 11:13:55,095 INFO [train.py:715] (0/8) Epoch 3, batch 1050, loss[loss=0.1831, simple_loss=0.2535, pruned_loss=0.05636, over 4911.00 frames.], tot_loss[loss=0.1662, simple_loss=0.2341, pruned_loss=0.04921, over 969008.65 frames.], batch size: 17, lr: 5.84e-04 2022-05-04 11:14:34,004 INFO [train.py:715] (0/8) Epoch 3, batch 1100, loss[loss=0.1795, simple_loss=0.2362, pruned_loss=0.06133, over 4987.00 frames.], tot_loss[loss=0.1653, simple_loss=0.233, pruned_loss=0.04882, over 969868.37 frames.], batch size: 25, lr: 5.84e-04 2022-05-04 11:15:12,872 INFO [train.py:715] (0/8) Epoch 3, batch 1150, loss[loss=0.1559, simple_loss=0.2223, pruned_loss=0.0447, over 4925.00 frames.], tot_loss[loss=0.1666, simple_loss=0.2344, pruned_loss=0.04943, over 971027.62 frames.], batch size: 18, lr: 5.84e-04 2022-05-04 11:15:52,683 INFO [train.py:715] (0/8) Epoch 3, batch 1200, loss[loss=0.1416, simple_loss=0.2065, pruned_loss=0.03831, over 4929.00 frames.], tot_loss[loss=0.1669, simple_loss=0.2342, pruned_loss=0.04979, over 972420.83 frames.], batch size: 23, lr: 5.83e-04 2022-05-04 11:16:31,657 INFO [train.py:715] (0/8) Epoch 3, batch 1250, loss[loss=0.2182, simple_loss=0.289, pruned_loss=0.07369, over 4878.00 frames.], tot_loss[loss=0.1669, simple_loss=0.234, pruned_loss=0.04986, over 971858.81 frames.], batch size: 16, lr: 5.83e-04 2022-05-04 11:17:10,160 INFO [train.py:715] (0/8) Epoch 3, batch 1300, loss[loss=0.1547, simple_loss=0.227, pruned_loss=0.04123, over 4870.00 frames.], tot_loss[loss=0.1656, simple_loss=0.2335, pruned_loss=0.04886, over 971961.75 frames.], batch size: 16, lr: 5.83e-04 2022-05-04 11:17:49,723 INFO [train.py:715] (0/8) Epoch 3, batch 1350, loss[loss=0.1774, simple_loss=0.2388, pruned_loss=0.05803, over 4937.00 frames.], tot_loss[loss=0.1657, simple_loss=0.2331, pruned_loss=0.04916, over 972082.41 frames.], batch size: 21, lr: 5.83e-04 2022-05-04 11:18:28,999 INFO [train.py:715] (0/8) Epoch 3, batch 1400, loss[loss=0.1424, simple_loss=0.2089, pruned_loss=0.03798, over 4970.00 frames.], tot_loss[loss=0.1657, simple_loss=0.2329, pruned_loss=0.04929, over 972648.80 frames.], batch size: 
28, lr: 5.83e-04 2022-05-04 11:19:07,865 INFO [train.py:715] (0/8) Epoch 3, batch 1450, loss[loss=0.1974, simple_loss=0.2542, pruned_loss=0.07031, over 4854.00 frames.], tot_loss[loss=0.166, simple_loss=0.2331, pruned_loss=0.04946, over 972969.21 frames.], batch size: 30, lr: 5.83e-04 2022-05-04 11:19:46,422 INFO [train.py:715] (0/8) Epoch 3, batch 1500, loss[loss=0.1681, simple_loss=0.2201, pruned_loss=0.05802, over 4883.00 frames.], tot_loss[loss=0.165, simple_loss=0.2323, pruned_loss=0.04882, over 973244.07 frames.], batch size: 32, lr: 5.83e-04 2022-05-04 11:20:26,149 INFO [train.py:715] (0/8) Epoch 3, batch 1550, loss[loss=0.1882, simple_loss=0.2505, pruned_loss=0.0629, over 4889.00 frames.], tot_loss[loss=0.1657, simple_loss=0.2334, pruned_loss=0.049, over 973360.55 frames.], batch size: 22, lr: 5.83e-04 2022-05-04 11:21:05,416 INFO [train.py:715] (0/8) Epoch 3, batch 1600, loss[loss=0.1667, simple_loss=0.2334, pruned_loss=0.04995, over 4868.00 frames.], tot_loss[loss=0.1646, simple_loss=0.2323, pruned_loss=0.04849, over 973456.44 frames.], batch size: 22, lr: 5.82e-04 2022-05-04 11:21:43,527 INFO [train.py:715] (0/8) Epoch 3, batch 1650, loss[loss=0.1338, simple_loss=0.1973, pruned_loss=0.03519, over 4766.00 frames.], tot_loss[loss=0.1653, simple_loss=0.2332, pruned_loss=0.04873, over 972827.00 frames.], batch size: 14, lr: 5.82e-04 2022-05-04 11:22:22,781 INFO [train.py:715] (0/8) Epoch 3, batch 1700, loss[loss=0.161, simple_loss=0.2427, pruned_loss=0.03965, over 4871.00 frames.], tot_loss[loss=0.1652, simple_loss=0.233, pruned_loss=0.04872, over 972991.03 frames.], batch size: 22, lr: 5.82e-04 2022-05-04 11:23:02,319 INFO [train.py:715] (0/8) Epoch 3, batch 1750, loss[loss=0.1795, simple_loss=0.2505, pruned_loss=0.05427, over 4921.00 frames.], tot_loss[loss=0.1656, simple_loss=0.233, pruned_loss=0.04904, over 973363.61 frames.], batch size: 23, lr: 5.82e-04 2022-05-04 11:23:41,617 INFO [train.py:715] (0/8) Epoch 3, batch 1800, loss[loss=0.1476, simple_loss=0.2256, pruned_loss=0.03483, over 4693.00 frames.], tot_loss[loss=0.1648, simple_loss=0.2326, pruned_loss=0.04847, over 973259.38 frames.], batch size: 15, lr: 5.82e-04 2022-05-04 11:24:20,320 INFO [train.py:715] (0/8) Epoch 3, batch 1850, loss[loss=0.1492, simple_loss=0.2238, pruned_loss=0.0373, over 4848.00 frames.], tot_loss[loss=0.1647, simple_loss=0.2323, pruned_loss=0.04854, over 974134.32 frames.], batch size: 30, lr: 5.82e-04 2022-05-04 11:25:00,295 INFO [train.py:715] (0/8) Epoch 3, batch 1900, loss[loss=0.1915, simple_loss=0.2563, pruned_loss=0.0634, over 4926.00 frames.], tot_loss[loss=0.1638, simple_loss=0.2316, pruned_loss=0.04803, over 973782.71 frames.], batch size: 39, lr: 5.82e-04 2022-05-04 11:25:39,887 INFO [train.py:715] (0/8) Epoch 3, batch 1950, loss[loss=0.1579, simple_loss=0.2248, pruned_loss=0.04554, over 4982.00 frames.], tot_loss[loss=0.1639, simple_loss=0.2316, pruned_loss=0.04813, over 973812.14 frames.], batch size: 31, lr: 5.81e-04 2022-05-04 11:26:18,805 INFO [train.py:715] (0/8) Epoch 3, batch 2000, loss[loss=0.1932, simple_loss=0.2683, pruned_loss=0.05899, over 4826.00 frames.], tot_loss[loss=0.164, simple_loss=0.2318, pruned_loss=0.04812, over 972977.36 frames.], batch size: 27, lr: 5.81e-04 2022-05-04 11:26:58,007 INFO [train.py:715] (0/8) Epoch 3, batch 2050, loss[loss=0.2407, simple_loss=0.2931, pruned_loss=0.09414, over 4909.00 frames.], tot_loss[loss=0.1648, simple_loss=0.2329, pruned_loss=0.04833, over 972593.79 frames.], batch size: 17, lr: 5.81e-04 2022-05-04 11:27:37,797 INFO 
[train.py:715] (0/8) Epoch 3, batch 2100, loss[loss=0.1894, simple_loss=0.256, pruned_loss=0.06137, over 4641.00 frames.], tot_loss[loss=0.1648, simple_loss=0.2329, pruned_loss=0.04834, over 972550.22 frames.], batch size: 13, lr: 5.81e-04 2022-05-04 11:28:17,050 INFO [train.py:715] (0/8) Epoch 3, batch 2150, loss[loss=0.1953, simple_loss=0.2641, pruned_loss=0.06324, over 4930.00 frames.], tot_loss[loss=0.1647, simple_loss=0.2327, pruned_loss=0.04834, over 973170.81 frames.], batch size: 39, lr: 5.81e-04 2022-05-04 11:28:55,717 INFO [train.py:715] (0/8) Epoch 3, batch 2200, loss[loss=0.1769, simple_loss=0.2394, pruned_loss=0.05724, over 4763.00 frames.], tot_loss[loss=0.1656, simple_loss=0.2334, pruned_loss=0.04889, over 973049.65 frames.], batch size: 19, lr: 5.81e-04 2022-05-04 11:29:35,103 INFO [train.py:715] (0/8) Epoch 3, batch 2250, loss[loss=0.1476, simple_loss=0.2219, pruned_loss=0.03664, over 4977.00 frames.], tot_loss[loss=0.1646, simple_loss=0.2323, pruned_loss=0.04844, over 973144.17 frames.], batch size: 25, lr: 5.81e-04 2022-05-04 11:30:14,522 INFO [train.py:715] (0/8) Epoch 3, batch 2300, loss[loss=0.1935, simple_loss=0.2573, pruned_loss=0.06484, over 4911.00 frames.], tot_loss[loss=0.1664, simple_loss=0.2336, pruned_loss=0.04957, over 972459.98 frames.], batch size: 17, lr: 5.80e-04 2022-05-04 11:30:53,580 INFO [train.py:715] (0/8) Epoch 3, batch 2350, loss[loss=0.1691, simple_loss=0.2296, pruned_loss=0.05436, over 4685.00 frames.], tot_loss[loss=0.1657, simple_loss=0.233, pruned_loss=0.04922, over 972290.82 frames.], batch size: 15, lr: 5.80e-04 2022-05-04 11:31:32,373 INFO [train.py:715] (0/8) Epoch 3, batch 2400, loss[loss=0.1962, simple_loss=0.2619, pruned_loss=0.06525, over 4752.00 frames.], tot_loss[loss=0.1663, simple_loss=0.2336, pruned_loss=0.04953, over 971957.82 frames.], batch size: 16, lr: 5.80e-04 2022-05-04 11:32:12,612 INFO [train.py:715] (0/8) Epoch 3, batch 2450, loss[loss=0.1824, simple_loss=0.2557, pruned_loss=0.05458, over 4900.00 frames.], tot_loss[loss=0.1654, simple_loss=0.2329, pruned_loss=0.04901, over 971106.38 frames.], batch size: 39, lr: 5.80e-04 2022-05-04 11:32:51,965 INFO [train.py:715] (0/8) Epoch 3, batch 2500, loss[loss=0.1853, simple_loss=0.2445, pruned_loss=0.06301, over 4770.00 frames.], tot_loss[loss=0.1659, simple_loss=0.2333, pruned_loss=0.04927, over 970918.77 frames.], batch size: 17, lr: 5.80e-04 2022-05-04 11:33:30,789 INFO [train.py:715] (0/8) Epoch 3, batch 2550, loss[loss=0.1536, simple_loss=0.2248, pruned_loss=0.04123, over 4945.00 frames.], tot_loss[loss=0.166, simple_loss=0.2339, pruned_loss=0.04906, over 970957.86 frames.], batch size: 35, lr: 5.80e-04 2022-05-04 11:34:11,447 INFO [train.py:715] (0/8) Epoch 3, batch 2600, loss[loss=0.1757, simple_loss=0.2413, pruned_loss=0.05505, over 4889.00 frames.], tot_loss[loss=0.165, simple_loss=0.233, pruned_loss=0.04845, over 970690.41 frames.], batch size: 17, lr: 5.80e-04 2022-05-04 11:34:51,561 INFO [train.py:715] (0/8) Epoch 3, batch 2650, loss[loss=0.1535, simple_loss=0.2263, pruned_loss=0.04031, over 4861.00 frames.], tot_loss[loss=0.166, simple_loss=0.2337, pruned_loss=0.04911, over 970700.39 frames.], batch size: 20, lr: 5.80e-04 2022-05-04 11:35:30,755 INFO [train.py:715] (0/8) Epoch 3, batch 2700, loss[loss=0.1733, simple_loss=0.2394, pruned_loss=0.05354, over 4756.00 frames.], tot_loss[loss=0.1663, simple_loss=0.234, pruned_loss=0.04931, over 971969.69 frames.], batch size: 19, lr: 5.79e-04 2022-05-04 11:36:10,257 INFO [train.py:715] (0/8) Epoch 3, batch 2750, 
loss[loss=0.1652, simple_loss=0.2291, pruned_loss=0.0506, over 4841.00 frames.], tot_loss[loss=0.1654, simple_loss=0.2332, pruned_loss=0.04878, over 972715.03 frames.], batch size: 13, lr: 5.79e-04 2022-05-04 11:36:50,508 INFO [train.py:715] (0/8) Epoch 3, batch 2800, loss[loss=0.1368, simple_loss=0.2037, pruned_loss=0.03494, over 4908.00 frames.], tot_loss[loss=0.1656, simple_loss=0.2333, pruned_loss=0.04897, over 973133.87 frames.], batch size: 18, lr: 5.79e-04 2022-05-04 11:37:29,793 INFO [train.py:715] (0/8) Epoch 3, batch 2850, loss[loss=0.1651, simple_loss=0.2341, pruned_loss=0.048, over 4774.00 frames.], tot_loss[loss=0.1657, simple_loss=0.2333, pruned_loss=0.04904, over 972572.89 frames.], batch size: 12, lr: 5.79e-04 2022-05-04 11:38:08,465 INFO [train.py:715] (0/8) Epoch 3, batch 2900, loss[loss=0.1732, simple_loss=0.2298, pruned_loss=0.05824, over 4893.00 frames.], tot_loss[loss=0.1644, simple_loss=0.2322, pruned_loss=0.04829, over 972709.11 frames.], batch size: 16, lr: 5.79e-04 2022-05-04 11:38:48,425 INFO [train.py:715] (0/8) Epoch 3, batch 2950, loss[loss=0.1506, simple_loss=0.2188, pruned_loss=0.04123, over 4769.00 frames.], tot_loss[loss=0.1657, simple_loss=0.2333, pruned_loss=0.04903, over 972191.71 frames.], batch size: 18, lr: 5.79e-04 2022-05-04 11:39:28,056 INFO [train.py:715] (0/8) Epoch 3, batch 3000, loss[loss=0.1725, simple_loss=0.2385, pruned_loss=0.05324, over 4974.00 frames.], tot_loss[loss=0.1656, simple_loss=0.2328, pruned_loss=0.0492, over 972775.46 frames.], batch size: 24, lr: 5.79e-04 2022-05-04 11:39:28,057 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 11:39:36,790 INFO [train.py:742] (0/8) Epoch 3, validation: loss=0.1153, simple_loss=0.2015, pruned_loss=0.0146, over 914524.00 frames. 2022-05-04 11:40:16,886 INFO [train.py:715] (0/8) Epoch 3, batch 3050, loss[loss=0.2036, simple_loss=0.2685, pruned_loss=0.06935, over 4925.00 frames.], tot_loss[loss=0.1655, simple_loss=0.2328, pruned_loss=0.04911, over 972716.49 frames.], batch size: 23, lr: 5.78e-04 2022-05-04 11:40:55,668 INFO [train.py:715] (0/8) Epoch 3, batch 3100, loss[loss=0.152, simple_loss=0.2199, pruned_loss=0.04205, over 4887.00 frames.], tot_loss[loss=0.1657, simple_loss=0.2326, pruned_loss=0.04944, over 972855.54 frames.], batch size: 32, lr: 5.78e-04 2022-05-04 11:41:35,057 INFO [train.py:715] (0/8) Epoch 3, batch 3150, loss[loss=0.1546, simple_loss=0.2339, pruned_loss=0.03764, over 4811.00 frames.], tot_loss[loss=0.1656, simple_loss=0.2326, pruned_loss=0.04931, over 972869.89 frames.], batch size: 25, lr: 5.78e-04 2022-05-04 11:42:14,857 INFO [train.py:715] (0/8) Epoch 3, batch 3200, loss[loss=0.2181, simple_loss=0.2838, pruned_loss=0.0762, over 4832.00 frames.], tot_loss[loss=0.1663, simple_loss=0.2333, pruned_loss=0.04967, over 972616.55 frames.], batch size: 15, lr: 5.78e-04 2022-05-04 11:42:54,657 INFO [train.py:715] (0/8) Epoch 3, batch 3250, loss[loss=0.1645, simple_loss=0.2407, pruned_loss=0.04418, over 4784.00 frames.], tot_loss[loss=0.1651, simple_loss=0.2326, pruned_loss=0.04881, over 972986.30 frames.], batch size: 12, lr: 5.78e-04 2022-05-04 11:43:33,196 INFO [train.py:715] (0/8) Epoch 3, batch 3300, loss[loss=0.1901, simple_loss=0.2603, pruned_loss=0.05995, over 4831.00 frames.], tot_loss[loss=0.1654, simple_loss=0.2328, pruned_loss=0.04905, over 972075.64 frames.], batch size: 26, lr: 5.78e-04 2022-05-04 11:44:13,010 INFO [train.py:715] (0/8) Epoch 3, batch 3350, loss[loss=0.1793, simple_loss=0.2431, pruned_loss=0.05777, over 4819.00 frames.], 
tot_loss[loss=0.1654, simple_loss=0.2331, pruned_loss=0.04883, over 972180.31 frames.], batch size: 26, lr: 5.78e-04 2022-05-04 11:44:52,485 INFO [train.py:715] (0/8) Epoch 3, batch 3400, loss[loss=0.1479, simple_loss=0.2176, pruned_loss=0.03915, over 4919.00 frames.], tot_loss[loss=0.1661, simple_loss=0.2336, pruned_loss=0.04931, over 971936.05 frames.], batch size: 18, lr: 5.77e-04 2022-05-04 11:45:31,172 INFO [train.py:715] (0/8) Epoch 3, batch 3450, loss[loss=0.1546, simple_loss=0.2265, pruned_loss=0.04134, over 4801.00 frames.], tot_loss[loss=0.1653, simple_loss=0.233, pruned_loss=0.04881, over 972033.48 frames.], batch size: 21, lr: 5.77e-04 2022-05-04 11:46:10,505 INFO [train.py:715] (0/8) Epoch 3, batch 3500, loss[loss=0.1774, simple_loss=0.2438, pruned_loss=0.05548, over 4984.00 frames.], tot_loss[loss=0.1668, simple_loss=0.2344, pruned_loss=0.04957, over 971839.87 frames.], batch size: 35, lr: 5.77e-04 2022-05-04 11:46:50,811 INFO [train.py:715] (0/8) Epoch 3, batch 3550, loss[loss=0.1998, simple_loss=0.2637, pruned_loss=0.06793, over 4911.00 frames.], tot_loss[loss=0.1668, simple_loss=0.2343, pruned_loss=0.0497, over 971903.98 frames.], batch size: 35, lr: 5.77e-04 2022-05-04 11:47:30,668 INFO [train.py:715] (0/8) Epoch 3, batch 3600, loss[loss=0.1806, simple_loss=0.2518, pruned_loss=0.05465, over 4842.00 frames.], tot_loss[loss=0.1659, simple_loss=0.2336, pruned_loss=0.04911, over 972556.02 frames.], batch size: 15, lr: 5.77e-04 2022-05-04 11:48:09,900 INFO [train.py:715] (0/8) Epoch 3, batch 3650, loss[loss=0.1736, simple_loss=0.23, pruned_loss=0.05858, over 4818.00 frames.], tot_loss[loss=0.1662, simple_loss=0.2337, pruned_loss=0.0493, over 972918.21 frames.], batch size: 25, lr: 5.77e-04 2022-05-04 11:48:49,624 INFO [train.py:715] (0/8) Epoch 3, batch 3700, loss[loss=0.192, simple_loss=0.2519, pruned_loss=0.06606, over 4845.00 frames.], tot_loss[loss=0.166, simple_loss=0.2337, pruned_loss=0.04916, over 972126.01 frames.], batch size: 32, lr: 5.77e-04 2022-05-04 11:49:29,640 INFO [train.py:715] (0/8) Epoch 3, batch 3750, loss[loss=0.1743, simple_loss=0.2501, pruned_loss=0.04923, over 4804.00 frames.], tot_loss[loss=0.1654, simple_loss=0.2334, pruned_loss=0.04876, over 972238.89 frames.], batch size: 25, lr: 5.77e-04 2022-05-04 11:50:09,334 INFO [train.py:715] (0/8) Epoch 3, batch 3800, loss[loss=0.143, simple_loss=0.2175, pruned_loss=0.03427, over 4954.00 frames.], tot_loss[loss=0.1646, simple_loss=0.2321, pruned_loss=0.04854, over 972453.87 frames.], batch size: 15, lr: 5.76e-04 2022-05-04 11:50:48,722 INFO [train.py:715] (0/8) Epoch 3, batch 3850, loss[loss=0.196, simple_loss=0.2644, pruned_loss=0.06384, over 4846.00 frames.], tot_loss[loss=0.1644, simple_loss=0.2322, pruned_loss=0.04835, over 972247.67 frames.], batch size: 32, lr: 5.76e-04 2022-05-04 11:51:28,559 INFO [train.py:715] (0/8) Epoch 3, batch 3900, loss[loss=0.1586, simple_loss=0.234, pruned_loss=0.04165, over 4856.00 frames.], tot_loss[loss=0.166, simple_loss=0.233, pruned_loss=0.04947, over 972896.91 frames.], batch size: 32, lr: 5.76e-04 2022-05-04 11:52:08,061 INFO [train.py:715] (0/8) Epoch 3, batch 3950, loss[loss=0.1695, simple_loss=0.2321, pruned_loss=0.05344, over 4779.00 frames.], tot_loss[loss=0.1653, simple_loss=0.2323, pruned_loss=0.04913, over 972524.73 frames.], batch size: 17, lr: 5.76e-04 2022-05-04 11:52:47,081 INFO [train.py:715] (0/8) Epoch 3, batch 4000, loss[loss=0.21, simple_loss=0.2864, pruned_loss=0.06682, over 4947.00 frames.], tot_loss[loss=0.1655, simple_loss=0.2326, 
pruned_loss=0.04915, over 972996.62 frames.], batch size: 24, lr: 5.76e-04 2022-05-04 11:53:26,523 INFO [train.py:715] (0/8) Epoch 3, batch 4050, loss[loss=0.1222, simple_loss=0.1934, pruned_loss=0.02551, over 4865.00 frames.], tot_loss[loss=0.1663, simple_loss=0.2333, pruned_loss=0.04964, over 971881.43 frames.], batch size: 13, lr: 5.76e-04 2022-05-04 11:54:06,703 INFO [train.py:715] (0/8) Epoch 3, batch 4100, loss[loss=0.201, simple_loss=0.26, pruned_loss=0.07106, over 4791.00 frames.], tot_loss[loss=0.1668, simple_loss=0.234, pruned_loss=0.04977, over 972504.89 frames.], batch size: 17, lr: 5.76e-04 2022-05-04 11:54:45,652 INFO [train.py:715] (0/8) Epoch 3, batch 4150, loss[loss=0.1668, simple_loss=0.2297, pruned_loss=0.05193, over 4894.00 frames.], tot_loss[loss=0.1667, simple_loss=0.2339, pruned_loss=0.04977, over 972333.46 frames.], batch size: 17, lr: 5.76e-04 2022-05-04 11:55:24,493 INFO [train.py:715] (0/8) Epoch 3, batch 4200, loss[loss=0.1569, simple_loss=0.2158, pruned_loss=0.049, over 4919.00 frames.], tot_loss[loss=0.1669, simple_loss=0.2342, pruned_loss=0.04978, over 972634.55 frames.], batch size: 39, lr: 5.75e-04 2022-05-04 11:56:04,946 INFO [train.py:715] (0/8) Epoch 3, batch 4250, loss[loss=0.1651, simple_loss=0.2363, pruned_loss=0.04698, over 4783.00 frames.], tot_loss[loss=0.1665, simple_loss=0.2336, pruned_loss=0.04972, over 972882.34 frames.], batch size: 17, lr: 5.75e-04 2022-05-04 11:56:44,318 INFO [train.py:715] (0/8) Epoch 3, batch 4300, loss[loss=0.1788, simple_loss=0.2421, pruned_loss=0.05772, over 4797.00 frames.], tot_loss[loss=0.1667, simple_loss=0.2339, pruned_loss=0.04973, over 973082.22 frames.], batch size: 13, lr: 5.75e-04 2022-05-04 11:57:23,799 INFO [train.py:715] (0/8) Epoch 3, batch 4350, loss[loss=0.1523, simple_loss=0.2257, pruned_loss=0.03949, over 4807.00 frames.], tot_loss[loss=0.1669, simple_loss=0.2343, pruned_loss=0.04975, over 973035.06 frames.], batch size: 25, lr: 5.75e-04 2022-05-04 11:58:03,483 INFO [train.py:715] (0/8) Epoch 3, batch 4400, loss[loss=0.1626, simple_loss=0.2246, pruned_loss=0.05025, over 4782.00 frames.], tot_loss[loss=0.1675, simple_loss=0.2349, pruned_loss=0.0501, over 972317.46 frames.], batch size: 18, lr: 5.75e-04 2022-05-04 11:58:43,516 INFO [train.py:715] (0/8) Epoch 3, batch 4450, loss[loss=0.1691, simple_loss=0.2362, pruned_loss=0.05101, over 4938.00 frames.], tot_loss[loss=0.1672, simple_loss=0.2346, pruned_loss=0.04986, over 971818.17 frames.], batch size: 21, lr: 5.75e-04 2022-05-04 11:59:22,568 INFO [train.py:715] (0/8) Epoch 3, batch 4500, loss[loss=0.151, simple_loss=0.2206, pruned_loss=0.04071, over 4829.00 frames.], tot_loss[loss=0.1663, simple_loss=0.2338, pruned_loss=0.0494, over 971398.48 frames.], batch size: 26, lr: 5.75e-04 2022-05-04 12:00:01,993 INFO [train.py:715] (0/8) Epoch 3, batch 4550, loss[loss=0.1422, simple_loss=0.2012, pruned_loss=0.0416, over 4747.00 frames.], tot_loss[loss=0.1651, simple_loss=0.2332, pruned_loss=0.04846, over 971939.71 frames.], batch size: 12, lr: 5.74e-04 2022-05-04 12:00:41,742 INFO [train.py:715] (0/8) Epoch 3, batch 4600, loss[loss=0.1701, simple_loss=0.2484, pruned_loss=0.04585, over 4753.00 frames.], tot_loss[loss=0.1641, simple_loss=0.2325, pruned_loss=0.04786, over 971570.34 frames.], batch size: 19, lr: 5.74e-04 2022-05-04 12:01:21,002 INFO [train.py:715] (0/8) Epoch 3, batch 4650, loss[loss=0.1726, simple_loss=0.2431, pruned_loss=0.05104, over 4936.00 frames.], tot_loss[loss=0.165, simple_loss=0.2331, pruned_loss=0.04844, over 971576.34 frames.], batch 
size: 23, lr: 5.74e-04 2022-05-04 12:01:59,933 INFO [train.py:715] (0/8) Epoch 3, batch 4700, loss[loss=0.1589, simple_loss=0.221, pruned_loss=0.04836, over 4972.00 frames.], tot_loss[loss=0.1656, simple_loss=0.2335, pruned_loss=0.04881, over 971140.25 frames.], batch size: 35, lr: 5.74e-04 2022-05-04 12:02:39,136 INFO [train.py:715] (0/8) Epoch 3, batch 4750, loss[loss=0.1827, simple_loss=0.2522, pruned_loss=0.0566, over 4893.00 frames.], tot_loss[loss=0.1637, simple_loss=0.2318, pruned_loss=0.04781, over 971795.48 frames.], batch size: 19, lr: 5.74e-04 2022-05-04 12:03:18,735 INFO [train.py:715] (0/8) Epoch 3, batch 4800, loss[loss=0.1278, simple_loss=0.2067, pruned_loss=0.02445, over 4872.00 frames.], tot_loss[loss=0.1642, simple_loss=0.2318, pruned_loss=0.04824, over 971521.77 frames.], batch size: 22, lr: 5.74e-04 2022-05-04 12:03:58,125 INFO [train.py:715] (0/8) Epoch 3, batch 4850, loss[loss=0.1652, simple_loss=0.2364, pruned_loss=0.04706, over 4912.00 frames.], tot_loss[loss=0.164, simple_loss=0.2317, pruned_loss=0.04811, over 972552.04 frames.], batch size: 17, lr: 5.74e-04 2022-05-04 12:04:36,951 INFO [train.py:715] (0/8) Epoch 3, batch 4900, loss[loss=0.1683, simple_loss=0.2402, pruned_loss=0.04824, over 4763.00 frames.], tot_loss[loss=0.1638, simple_loss=0.2321, pruned_loss=0.04776, over 973468.75 frames.], batch size: 18, lr: 5.74e-04 2022-05-04 12:05:16,867 INFO [train.py:715] (0/8) Epoch 3, batch 4950, loss[loss=0.1776, simple_loss=0.255, pruned_loss=0.05014, over 4791.00 frames.], tot_loss[loss=0.1643, simple_loss=0.2325, pruned_loss=0.048, over 972904.48 frames.], batch size: 18, lr: 5.73e-04 2022-05-04 12:05:56,318 INFO [train.py:715] (0/8) Epoch 3, batch 5000, loss[loss=0.152, simple_loss=0.2194, pruned_loss=0.04232, over 4961.00 frames.], tot_loss[loss=0.1634, simple_loss=0.2316, pruned_loss=0.04763, over 973011.27 frames.], batch size: 35, lr: 5.73e-04 2022-05-04 12:06:35,120 INFO [train.py:715] (0/8) Epoch 3, batch 5050, loss[loss=0.1499, simple_loss=0.2134, pruned_loss=0.04318, over 4815.00 frames.], tot_loss[loss=0.1631, simple_loss=0.2316, pruned_loss=0.04727, over 973791.60 frames.], batch size: 13, lr: 5.73e-04 2022-05-04 12:07:14,484 INFO [train.py:715] (0/8) Epoch 3, batch 5100, loss[loss=0.1569, simple_loss=0.2289, pruned_loss=0.0424, over 4944.00 frames.], tot_loss[loss=0.1646, simple_loss=0.2325, pruned_loss=0.04837, over 973520.66 frames.], batch size: 29, lr: 5.73e-04 2022-05-04 12:07:54,242 INFO [train.py:715] (0/8) Epoch 3, batch 5150, loss[loss=0.147, simple_loss=0.2183, pruned_loss=0.03785, over 4982.00 frames.], tot_loss[loss=0.163, simple_loss=0.2311, pruned_loss=0.04741, over 973471.05 frames.], batch size: 28, lr: 5.73e-04 2022-05-04 12:08:32,991 INFO [train.py:715] (0/8) Epoch 3, batch 5200, loss[loss=0.1626, simple_loss=0.236, pruned_loss=0.04464, over 4849.00 frames.], tot_loss[loss=0.1644, simple_loss=0.2323, pruned_loss=0.04824, over 973135.13 frames.], batch size: 32, lr: 5.73e-04 2022-05-04 12:09:12,108 INFO [train.py:715] (0/8) Epoch 3, batch 5250, loss[loss=0.205, simple_loss=0.2685, pruned_loss=0.07074, over 4961.00 frames.], tot_loss[loss=0.1651, simple_loss=0.2328, pruned_loss=0.04873, over 973828.52 frames.], batch size: 15, lr: 5.73e-04 2022-05-04 12:09:52,192 INFO [train.py:715] (0/8) Epoch 3, batch 5300, loss[loss=0.1173, simple_loss=0.1935, pruned_loss=0.02052, over 4914.00 frames.], tot_loss[loss=0.1642, simple_loss=0.232, pruned_loss=0.0482, over 972589.99 frames.], batch size: 18, lr: 5.72e-04 2022-05-04 12:10:31,372 INFO 
[train.py:715] (0/8) Epoch 3, batch 5350, loss[loss=0.1404, simple_loss=0.2079, pruned_loss=0.03644, over 4789.00 frames.], tot_loss[loss=0.1645, simple_loss=0.2321, pruned_loss=0.04844, over 972611.78 frames.], batch size: 17, lr: 5.72e-04 2022-05-04 12:11:10,304 INFO [train.py:715] (0/8) Epoch 3, batch 5400, loss[loss=0.1739, simple_loss=0.2425, pruned_loss=0.05267, over 4929.00 frames.], tot_loss[loss=0.1638, simple_loss=0.2315, pruned_loss=0.04804, over 973159.45 frames.], batch size: 23, lr: 5.72e-04 2022-05-04 12:11:49,947 INFO [train.py:715] (0/8) Epoch 3, batch 5450, loss[loss=0.1337, simple_loss=0.2078, pruned_loss=0.0298, over 4939.00 frames.], tot_loss[loss=0.1644, simple_loss=0.232, pruned_loss=0.04842, over 973473.67 frames.], batch size: 29, lr: 5.72e-04 2022-05-04 12:12:30,206 INFO [train.py:715] (0/8) Epoch 3, batch 5500, loss[loss=0.1307, simple_loss=0.2077, pruned_loss=0.02684, over 4865.00 frames.], tot_loss[loss=0.1645, simple_loss=0.2325, pruned_loss=0.04824, over 973943.89 frames.], batch size: 20, lr: 5.72e-04 2022-05-04 12:13:09,478 INFO [train.py:715] (0/8) Epoch 3, batch 5550, loss[loss=0.1985, simple_loss=0.2597, pruned_loss=0.06872, over 4973.00 frames.], tot_loss[loss=0.1649, simple_loss=0.233, pruned_loss=0.04843, over 973633.51 frames.], batch size: 24, lr: 5.72e-04 2022-05-04 12:13:49,878 INFO [train.py:715] (0/8) Epoch 3, batch 5600, loss[loss=0.181, simple_loss=0.2561, pruned_loss=0.05294, over 4965.00 frames.], tot_loss[loss=0.165, simple_loss=0.2332, pruned_loss=0.0484, over 974019.33 frames.], batch size: 28, lr: 5.72e-04 2022-05-04 12:14:29,643 INFO [train.py:715] (0/8) Epoch 3, batch 5650, loss[loss=0.169, simple_loss=0.2522, pruned_loss=0.04292, over 4817.00 frames.], tot_loss[loss=0.1632, simple_loss=0.2316, pruned_loss=0.04741, over 974020.28 frames.], batch size: 27, lr: 5.72e-04 2022-05-04 12:15:08,735 INFO [train.py:715] (0/8) Epoch 3, batch 5700, loss[loss=0.1537, simple_loss=0.2252, pruned_loss=0.04107, over 4910.00 frames.], tot_loss[loss=0.1626, simple_loss=0.2311, pruned_loss=0.04702, over 973852.42 frames.], batch size: 17, lr: 5.71e-04 2022-05-04 12:15:48,067 INFO [train.py:715] (0/8) Epoch 3, batch 5750, loss[loss=0.1595, simple_loss=0.2275, pruned_loss=0.04572, over 4864.00 frames.], tot_loss[loss=0.1636, simple_loss=0.2317, pruned_loss=0.04773, over 974034.36 frames.], batch size: 20, lr: 5.71e-04 2022-05-04 12:16:27,888 INFO [train.py:715] (0/8) Epoch 3, batch 5800, loss[loss=0.1246, simple_loss=0.1933, pruned_loss=0.02797, over 4774.00 frames.], tot_loss[loss=0.1647, simple_loss=0.2327, pruned_loss=0.04835, over 974208.26 frames.], batch size: 19, lr: 5.71e-04 2022-05-04 12:17:07,630 INFO [train.py:715] (0/8) Epoch 3, batch 5850, loss[loss=0.1555, simple_loss=0.2423, pruned_loss=0.03436, over 4968.00 frames.], tot_loss[loss=0.1636, simple_loss=0.232, pruned_loss=0.04758, over 974497.74 frames.], batch size: 24, lr: 5.71e-04 2022-05-04 12:17:46,988 INFO [train.py:715] (0/8) Epoch 3, batch 5900, loss[loss=0.1848, simple_loss=0.2491, pruned_loss=0.06026, over 4785.00 frames.], tot_loss[loss=0.1651, simple_loss=0.2333, pruned_loss=0.04843, over 974122.51 frames.], batch size: 18, lr: 5.71e-04 2022-05-04 12:18:26,961 INFO [train.py:715] (0/8) Epoch 3, batch 5950, loss[loss=0.1778, simple_loss=0.2352, pruned_loss=0.06018, over 4862.00 frames.], tot_loss[loss=0.1649, simple_loss=0.2327, pruned_loss=0.04859, over 974824.22 frames.], batch size: 20, lr: 5.71e-04 2022-05-04 12:19:06,645 INFO [train.py:715] (0/8) Epoch 3, batch 6000, 
loss[loss=0.1699, simple_loss=0.2223, pruned_loss=0.05875, over 4973.00 frames.], tot_loss[loss=0.1643, simple_loss=0.232, pruned_loss=0.04831, over 974578.10 frames.], batch size: 31, lr: 5.71e-04 2022-05-04 12:19:06,646 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 12:19:15,398 INFO [train.py:742] (0/8) Epoch 3, validation: loss=0.1149, simple_loss=0.2013, pruned_loss=0.01424, over 914524.00 frames. 2022-05-04 12:19:55,211 INFO [train.py:715] (0/8) Epoch 3, batch 6050, loss[loss=0.1633, simple_loss=0.2225, pruned_loss=0.05207, over 4983.00 frames.], tot_loss[loss=0.1657, simple_loss=0.2332, pruned_loss=0.04909, over 974655.14 frames.], batch size: 15, lr: 5.71e-04 2022-05-04 12:20:34,637 INFO [train.py:715] (0/8) Epoch 3, batch 6100, loss[loss=0.1702, simple_loss=0.2381, pruned_loss=0.05119, over 4907.00 frames.], tot_loss[loss=0.1658, simple_loss=0.2333, pruned_loss=0.04921, over 974798.07 frames.], batch size: 19, lr: 5.70e-04 2022-05-04 12:21:13,557 INFO [train.py:715] (0/8) Epoch 3, batch 6150, loss[loss=0.2201, simple_loss=0.2788, pruned_loss=0.08064, over 4883.00 frames.], tot_loss[loss=0.1652, simple_loss=0.2332, pruned_loss=0.04864, over 974299.51 frames.], batch size: 39, lr: 5.70e-04 2022-05-04 12:21:53,163 INFO [train.py:715] (0/8) Epoch 3, batch 6200, loss[loss=0.1529, simple_loss=0.2186, pruned_loss=0.04357, over 4832.00 frames.], tot_loss[loss=0.1654, simple_loss=0.2333, pruned_loss=0.04873, over 974065.71 frames.], batch size: 15, lr: 5.70e-04 2022-05-04 12:22:33,156 INFO [train.py:715] (0/8) Epoch 3, batch 6250, loss[loss=0.1453, simple_loss=0.2174, pruned_loss=0.03656, over 4879.00 frames.], tot_loss[loss=0.1648, simple_loss=0.2328, pruned_loss=0.04839, over 974043.81 frames.], batch size: 16, lr: 5.70e-04 2022-05-04 12:23:12,501 INFO [train.py:715] (0/8) Epoch 3, batch 6300, loss[loss=0.1638, simple_loss=0.2317, pruned_loss=0.04798, over 4928.00 frames.], tot_loss[loss=0.1652, simple_loss=0.2331, pruned_loss=0.04866, over 974030.10 frames.], batch size: 23, lr: 5.70e-04 2022-05-04 12:23:51,741 INFO [train.py:715] (0/8) Epoch 3, batch 6350, loss[loss=0.1609, simple_loss=0.2274, pruned_loss=0.04718, over 4792.00 frames.], tot_loss[loss=0.1654, simple_loss=0.2335, pruned_loss=0.04865, over 974089.87 frames.], batch size: 14, lr: 5.70e-04 2022-05-04 12:24:31,951 INFO [train.py:715] (0/8) Epoch 3, batch 6400, loss[loss=0.1723, simple_loss=0.2398, pruned_loss=0.05239, over 4949.00 frames.], tot_loss[loss=0.1662, simple_loss=0.2338, pruned_loss=0.04927, over 973988.22 frames.], batch size: 39, lr: 5.70e-04 2022-05-04 12:25:11,498 INFO [train.py:715] (0/8) Epoch 3, batch 6450, loss[loss=0.1585, simple_loss=0.2302, pruned_loss=0.04344, over 4959.00 frames.], tot_loss[loss=0.1657, simple_loss=0.2334, pruned_loss=0.04899, over 973840.02 frames.], batch size: 14, lr: 5.70e-04 2022-05-04 12:25:50,483 INFO [train.py:715] (0/8) Epoch 3, batch 6500, loss[loss=0.1504, simple_loss=0.2288, pruned_loss=0.03605, over 4761.00 frames.], tot_loss[loss=0.165, simple_loss=0.2329, pruned_loss=0.04861, over 972966.31 frames.], batch size: 18, lr: 5.69e-04 2022-05-04 12:26:30,130 INFO [train.py:715] (0/8) Epoch 3, batch 6550, loss[loss=0.1642, simple_loss=0.2223, pruned_loss=0.05303, over 4707.00 frames.], tot_loss[loss=0.1643, simple_loss=0.2321, pruned_loss=0.04829, over 973386.87 frames.], batch size: 15, lr: 5.69e-04 2022-05-04 12:27:09,930 INFO [train.py:715] (0/8) Epoch 3, batch 6600, loss[loss=0.1766, simple_loss=0.2405, pruned_loss=0.05632, over 4842.00 frames.], 
tot_loss[loss=0.1643, simple_loss=0.2324, pruned_loss=0.04804, over 973028.10 frames.], batch size: 20, lr: 5.69e-04 2022-05-04 12:27:49,185 INFO [train.py:715] (0/8) Epoch 3, batch 6650, loss[loss=0.1857, simple_loss=0.2534, pruned_loss=0.05898, over 4790.00 frames.], tot_loss[loss=0.1646, simple_loss=0.2327, pruned_loss=0.04822, over 972366.80 frames.], batch size: 17, lr: 5.69e-04 2022-05-04 12:28:28,361 INFO [train.py:715] (0/8) Epoch 3, batch 6700, loss[loss=0.1361, simple_loss=0.2089, pruned_loss=0.03164, over 4839.00 frames.], tot_loss[loss=0.1639, simple_loss=0.2321, pruned_loss=0.04782, over 972500.48 frames.], batch size: 13, lr: 5.69e-04 2022-05-04 12:29:08,699 INFO [train.py:715] (0/8) Epoch 3, batch 6750, loss[loss=0.1508, simple_loss=0.2272, pruned_loss=0.0372, over 4905.00 frames.], tot_loss[loss=0.1632, simple_loss=0.2317, pruned_loss=0.04736, over 972208.61 frames.], batch size: 22, lr: 5.69e-04 2022-05-04 12:29:47,742 INFO [train.py:715] (0/8) Epoch 3, batch 6800, loss[loss=0.1718, simple_loss=0.2323, pruned_loss=0.05565, over 4928.00 frames.], tot_loss[loss=0.1629, simple_loss=0.2313, pruned_loss=0.04722, over 973404.72 frames.], batch size: 29, lr: 5.69e-04 2022-05-04 12:30:27,119 INFO [train.py:715] (0/8) Epoch 3, batch 6850, loss[loss=0.1516, simple_loss=0.2273, pruned_loss=0.03792, over 4894.00 frames.], tot_loss[loss=0.1625, simple_loss=0.2315, pruned_loss=0.04673, over 973925.75 frames.], batch size: 19, lr: 5.68e-04 2022-05-04 12:31:06,817 INFO [train.py:715] (0/8) Epoch 3, batch 6900, loss[loss=0.1754, simple_loss=0.2329, pruned_loss=0.05894, over 4906.00 frames.], tot_loss[loss=0.1629, simple_loss=0.2316, pruned_loss=0.04713, over 973604.06 frames.], batch size: 17, lr: 5.68e-04 2022-05-04 12:31:46,651 INFO [train.py:715] (0/8) Epoch 3, batch 6950, loss[loss=0.1604, simple_loss=0.2285, pruned_loss=0.04617, over 4751.00 frames.], tot_loss[loss=0.1628, simple_loss=0.2315, pruned_loss=0.04705, over 974383.11 frames.], batch size: 16, lr: 5.68e-04 2022-05-04 12:32:25,807 INFO [train.py:715] (0/8) Epoch 3, batch 7000, loss[loss=0.1737, simple_loss=0.2409, pruned_loss=0.05325, over 4984.00 frames.], tot_loss[loss=0.1649, simple_loss=0.2332, pruned_loss=0.04834, over 973969.97 frames.], batch size: 26, lr: 5.68e-04 2022-05-04 12:33:05,831 INFO [train.py:715] (0/8) Epoch 3, batch 7050, loss[loss=0.1422, simple_loss=0.2156, pruned_loss=0.03441, over 4901.00 frames.], tot_loss[loss=0.1646, simple_loss=0.2327, pruned_loss=0.04829, over 973774.80 frames.], batch size: 17, lr: 5.68e-04 2022-05-04 12:33:45,718 INFO [train.py:715] (0/8) Epoch 3, batch 7100, loss[loss=0.1572, simple_loss=0.2289, pruned_loss=0.04279, over 4919.00 frames.], tot_loss[loss=0.1651, simple_loss=0.233, pruned_loss=0.04859, over 974289.61 frames.], batch size: 17, lr: 5.68e-04 2022-05-04 12:34:24,809 INFO [train.py:715] (0/8) Epoch 3, batch 7150, loss[loss=0.1586, simple_loss=0.2325, pruned_loss=0.04238, over 4892.00 frames.], tot_loss[loss=0.1648, simple_loss=0.2324, pruned_loss=0.04859, over 974011.25 frames.], batch size: 19, lr: 5.68e-04 2022-05-04 12:35:04,378 INFO [train.py:715] (0/8) Epoch 3, batch 7200, loss[loss=0.1376, simple_loss=0.211, pruned_loss=0.03213, over 4932.00 frames.], tot_loss[loss=0.165, simple_loss=0.2324, pruned_loss=0.04878, over 973248.71 frames.], batch size: 18, lr: 5.68e-04 2022-05-04 12:35:44,150 INFO [train.py:715] (0/8) Epoch 3, batch 7250, loss[loss=0.1593, simple_loss=0.2359, pruned_loss=0.04137, over 4915.00 frames.], tot_loss[loss=0.1651, simple_loss=0.2325, 
pruned_loss=0.04889, over 973513.31 frames.], batch size: 17, lr: 5.67e-04 2022-05-04 12:36:23,546 INFO [train.py:715] (0/8) Epoch 3, batch 7300, loss[loss=0.1457, simple_loss=0.2242, pruned_loss=0.03359, over 4704.00 frames.], tot_loss[loss=0.1653, simple_loss=0.2326, pruned_loss=0.049, over 973690.70 frames.], batch size: 15, lr: 5.67e-04 2022-05-04 12:37:03,010 INFO [train.py:715] (0/8) Epoch 3, batch 7350, loss[loss=0.1482, simple_loss=0.2264, pruned_loss=0.03503, over 4913.00 frames.], tot_loss[loss=0.1646, simple_loss=0.2325, pruned_loss=0.04838, over 972783.31 frames.], batch size: 18, lr: 5.67e-04 2022-05-04 12:37:42,376 INFO [train.py:715] (0/8) Epoch 3, batch 7400, loss[loss=0.191, simple_loss=0.2582, pruned_loss=0.06197, over 4941.00 frames.], tot_loss[loss=0.1643, simple_loss=0.2323, pruned_loss=0.04819, over 972775.28 frames.], batch size: 35, lr: 5.67e-04 2022-05-04 12:38:22,633 INFO [train.py:715] (0/8) Epoch 3, batch 7450, loss[loss=0.1421, simple_loss=0.2199, pruned_loss=0.03216, over 4918.00 frames.], tot_loss[loss=0.1638, simple_loss=0.2323, pruned_loss=0.04763, over 972982.23 frames.], batch size: 17, lr: 5.67e-04 2022-05-04 12:39:01,777 INFO [train.py:715] (0/8) Epoch 3, batch 7500, loss[loss=0.1666, simple_loss=0.2335, pruned_loss=0.04983, over 4872.00 frames.], tot_loss[loss=0.164, simple_loss=0.2322, pruned_loss=0.04787, over 973025.05 frames.], batch size: 16, lr: 5.67e-04 2022-05-04 12:39:41,043 INFO [train.py:715] (0/8) Epoch 3, batch 7550, loss[loss=0.1511, simple_loss=0.2247, pruned_loss=0.03876, over 4801.00 frames.], tot_loss[loss=0.165, simple_loss=0.233, pruned_loss=0.0485, over 972441.96 frames.], batch size: 21, lr: 5.67e-04 2022-05-04 12:39:57,124 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-112000.pt 2022-05-04 12:40:22,796 INFO [train.py:715] (0/8) Epoch 3, batch 7600, loss[loss=0.1619, simple_loss=0.2358, pruned_loss=0.044, over 4909.00 frames.], tot_loss[loss=0.165, simple_loss=0.2331, pruned_loss=0.04848, over 973659.73 frames.], batch size: 18, lr: 5.67e-04 2022-05-04 12:41:02,150 INFO [train.py:715] (0/8) Epoch 3, batch 7650, loss[loss=0.1605, simple_loss=0.2293, pruned_loss=0.04582, over 4891.00 frames.], tot_loss[loss=0.1641, simple_loss=0.2324, pruned_loss=0.04793, over 973757.78 frames.], batch size: 19, lr: 5.66e-04 2022-05-04 12:41:41,413 INFO [train.py:715] (0/8) Epoch 3, batch 7700, loss[loss=0.1227, simple_loss=0.1978, pruned_loss=0.02386, over 4979.00 frames.], tot_loss[loss=0.1643, simple_loss=0.2324, pruned_loss=0.04813, over 974531.92 frames.], batch size: 28, lr: 5.66e-04 2022-05-04 12:42:20,883 INFO [train.py:715] (0/8) Epoch 3, batch 7750, loss[loss=0.1317, simple_loss=0.1959, pruned_loss=0.03375, over 4834.00 frames.], tot_loss[loss=0.1641, simple_loss=0.2324, pruned_loss=0.04793, over 974156.91 frames.], batch size: 13, lr: 5.66e-04 2022-05-04 12:43:00,215 INFO [train.py:715] (0/8) Epoch 3, batch 7800, loss[loss=0.1981, simple_loss=0.2584, pruned_loss=0.0689, over 4980.00 frames.], tot_loss[loss=0.1644, simple_loss=0.2321, pruned_loss=0.04828, over 973900.49 frames.], batch size: 28, lr: 5.66e-04 2022-05-04 12:43:38,784 INFO [train.py:715] (0/8) Epoch 3, batch 7850, loss[loss=0.2024, simple_loss=0.2625, pruned_loss=0.07116, over 4757.00 frames.], tot_loss[loss=0.1643, simple_loss=0.2321, pruned_loss=0.04822, over 973979.44 frames.], batch size: 16, lr: 5.66e-04 2022-05-04 12:44:18,374 INFO [train.py:715] (0/8) Epoch 3, batch 7900, loss[loss=0.1839, simple_loss=0.2382, 
pruned_loss=0.06482, over 4849.00 frames.], tot_loss[loss=0.1652, simple_loss=0.233, pruned_loss=0.04875, over 974085.76 frames.], batch size: 13, lr: 5.66e-04 2022-05-04 12:44:58,143 INFO [train.py:715] (0/8) Epoch 3, batch 7950, loss[loss=0.174, simple_loss=0.2438, pruned_loss=0.05213, over 4889.00 frames.], tot_loss[loss=0.166, simple_loss=0.2337, pruned_loss=0.04916, over 973354.66 frames.], batch size: 19, lr: 5.66e-04 2022-05-04 12:45:36,729 INFO [train.py:715] (0/8) Epoch 3, batch 8000, loss[loss=0.1678, simple_loss=0.2273, pruned_loss=0.05408, over 4955.00 frames.], tot_loss[loss=0.1666, simple_loss=0.2339, pruned_loss=0.04964, over 973833.18 frames.], batch size: 21, lr: 5.66e-04 2022-05-04 12:46:14,905 INFO [train.py:715] (0/8) Epoch 3, batch 8050, loss[loss=0.1667, simple_loss=0.2406, pruned_loss=0.04634, over 4749.00 frames.], tot_loss[loss=0.1671, simple_loss=0.2344, pruned_loss=0.0499, over 973671.45 frames.], batch size: 16, lr: 5.65e-04 2022-05-04 12:46:53,634 INFO [train.py:715] (0/8) Epoch 3, batch 8100, loss[loss=0.1528, simple_loss=0.2228, pruned_loss=0.04143, over 4815.00 frames.], tot_loss[loss=0.1661, simple_loss=0.2335, pruned_loss=0.04937, over 972794.63 frames.], batch size: 25, lr: 5.65e-04 2022-05-04 12:47:31,940 INFO [train.py:715] (0/8) Epoch 3, batch 8150, loss[loss=0.1678, simple_loss=0.2284, pruned_loss=0.05362, over 4890.00 frames.], tot_loss[loss=0.1655, simple_loss=0.2328, pruned_loss=0.04908, over 972272.45 frames.], batch size: 16, lr: 5.65e-04 2022-05-04 12:48:10,087 INFO [train.py:715] (0/8) Epoch 3, batch 8200, loss[loss=0.1729, simple_loss=0.2313, pruned_loss=0.05726, over 4820.00 frames.], tot_loss[loss=0.166, simple_loss=0.2332, pruned_loss=0.04942, over 972296.95 frames.], batch size: 15, lr: 5.65e-04 2022-05-04 12:48:49,889 INFO [train.py:715] (0/8) Epoch 3, batch 8250, loss[loss=0.1461, simple_loss=0.2279, pruned_loss=0.03221, over 4947.00 frames.], tot_loss[loss=0.1651, simple_loss=0.2324, pruned_loss=0.0489, over 973416.54 frames.], batch size: 21, lr: 5.65e-04 2022-05-04 12:49:30,620 INFO [train.py:715] (0/8) Epoch 3, batch 8300, loss[loss=0.1934, simple_loss=0.2645, pruned_loss=0.06112, over 4697.00 frames.], tot_loss[loss=0.165, simple_loss=0.2323, pruned_loss=0.04886, over 972298.59 frames.], batch size: 15, lr: 5.65e-04 2022-05-04 12:50:10,668 INFO [train.py:715] (0/8) Epoch 3, batch 8350, loss[loss=0.1476, simple_loss=0.2179, pruned_loss=0.03866, over 4923.00 frames.], tot_loss[loss=0.1658, simple_loss=0.2327, pruned_loss=0.04944, over 972369.04 frames.], batch size: 18, lr: 5.65e-04 2022-05-04 12:50:50,667 INFO [train.py:715] (0/8) Epoch 3, batch 8400, loss[loss=0.1684, simple_loss=0.2355, pruned_loss=0.05066, over 4963.00 frames.], tot_loss[loss=0.1644, simple_loss=0.2316, pruned_loss=0.04859, over 972593.19 frames.], batch size: 35, lr: 5.65e-04 2022-05-04 12:51:30,650 INFO [train.py:715] (0/8) Epoch 3, batch 8450, loss[loss=0.1559, simple_loss=0.229, pruned_loss=0.04144, over 4762.00 frames.], tot_loss[loss=0.164, simple_loss=0.2315, pruned_loss=0.0482, over 972311.84 frames.], batch size: 19, lr: 5.64e-04 2022-05-04 12:52:10,871 INFO [train.py:715] (0/8) Epoch 3, batch 8500, loss[loss=0.1848, simple_loss=0.2431, pruned_loss=0.06327, over 4879.00 frames.], tot_loss[loss=0.1638, simple_loss=0.2316, pruned_loss=0.04805, over 972473.38 frames.], batch size: 16, lr: 5.64e-04 2022-05-04 12:52:49,928 INFO [train.py:715] (0/8) Epoch 3, batch 8550, loss[loss=0.1366, simple_loss=0.2005, pruned_loss=0.03634, over 4985.00 frames.], 
tot_loss[loss=0.1639, simple_loss=0.2316, pruned_loss=0.04805, over 972355.73 frames.], batch size: 15, lr: 5.64e-04 2022-05-04 12:53:31,546 INFO [train.py:715] (0/8) Epoch 3, batch 8600, loss[loss=0.151, simple_loss=0.222, pruned_loss=0.03994, over 4944.00 frames.], tot_loss[loss=0.1636, simple_loss=0.2316, pruned_loss=0.04783, over 972398.94 frames.], batch size: 29, lr: 5.64e-04 2022-05-04 12:54:13,125 INFO [train.py:715] (0/8) Epoch 3, batch 8650, loss[loss=0.1409, simple_loss=0.2167, pruned_loss=0.03258, over 4857.00 frames.], tot_loss[loss=0.1628, simple_loss=0.2313, pruned_loss=0.04718, over 972263.15 frames.], batch size: 20, lr: 5.64e-04 2022-05-04 12:54:53,245 INFO [train.py:715] (0/8) Epoch 3, batch 8700, loss[loss=0.1702, simple_loss=0.2357, pruned_loss=0.05241, over 4788.00 frames.], tot_loss[loss=0.1643, simple_loss=0.2323, pruned_loss=0.04813, over 972218.14 frames.], batch size: 14, lr: 5.64e-04 2022-05-04 12:55:34,483 INFO [train.py:715] (0/8) Epoch 3, batch 8750, loss[loss=0.16, simple_loss=0.225, pruned_loss=0.04749, over 4901.00 frames.], tot_loss[loss=0.1639, simple_loss=0.2315, pruned_loss=0.04813, over 973181.64 frames.], batch size: 19, lr: 5.64e-04 2022-05-04 12:56:14,903 INFO [train.py:715] (0/8) Epoch 3, batch 8800, loss[loss=0.1739, simple_loss=0.2552, pruned_loss=0.04628, over 4969.00 frames.], tot_loss[loss=0.1636, simple_loss=0.2314, pruned_loss=0.04789, over 973176.63 frames.], batch size: 28, lr: 5.64e-04 2022-05-04 12:56:55,634 INFO [train.py:715] (0/8) Epoch 3, batch 8850, loss[loss=0.1729, simple_loss=0.234, pruned_loss=0.05594, over 4773.00 frames.], tot_loss[loss=0.1644, simple_loss=0.2319, pruned_loss=0.04849, over 973244.86 frames.], batch size: 12, lr: 5.63e-04 2022-05-04 12:57:35,613 INFO [train.py:715] (0/8) Epoch 3, batch 8900, loss[loss=0.1546, simple_loss=0.2335, pruned_loss=0.03782, over 4960.00 frames.], tot_loss[loss=0.1644, simple_loss=0.2321, pruned_loss=0.04836, over 973545.24 frames.], batch size: 24, lr: 5.63e-04 2022-05-04 12:58:17,387 INFO [train.py:715] (0/8) Epoch 3, batch 8950, loss[loss=0.1523, simple_loss=0.2175, pruned_loss=0.04353, over 4747.00 frames.], tot_loss[loss=0.1644, simple_loss=0.2322, pruned_loss=0.04833, over 973008.87 frames.], batch size: 16, lr: 5.63e-04 2022-05-04 12:58:59,318 INFO [train.py:715] (0/8) Epoch 3, batch 9000, loss[loss=0.1628, simple_loss=0.2375, pruned_loss=0.04408, over 4924.00 frames.], tot_loss[loss=0.1639, simple_loss=0.2318, pruned_loss=0.04799, over 972179.14 frames.], batch size: 35, lr: 5.63e-04 2022-05-04 12:58:59,319 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 12:59:08,109 INFO [train.py:742] (0/8) Epoch 3, validation: loss=0.1147, simple_loss=0.2006, pruned_loss=0.01442, over 914524.00 frames. 
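The "Computing validation loss" entries above report a frame-weighted average over the dev set (the frame count, 914524.00, is the same at every validation point). The sketch below shows how such a periodic pass is typically implemented; the helper `compute_loss(model, batch)` (returning the loss summed over frames plus the frame count) and the `dev_dl` dataloader are illustrative assumptions, not the recipe's actual API.

```python
import torch

def validate(model, dev_dl, compute_loss):
    """Minimal sketch of a periodic validation pass (illustrative, not the recipe's code).

    Assumes compute_loss(model, batch) returns (loss_sum, num_frames) for one
    batch, where loss_sum is the loss summed over all frames in the batch.
    """
    model.eval()
    tot_loss = 0.0
    tot_frames = 0.0
    with torch.no_grad():  # no gradients needed during validation
        for batch in dev_dl:
            loss_sum, num_frames = compute_loss(model, batch)
            tot_loss += float(loss_sum)
            tot_frames += num_frames
    model.train()
    # Frame-weighted average, i.e. the "validation: loss=..." figure in the log.
    return tot_loss / max(tot_frames, 1.0)
```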
2022-05-04 12:59:49,674 INFO [train.py:715] (0/8) Epoch 3, batch 9050, loss[loss=0.1617, simple_loss=0.2363, pruned_loss=0.0435, over 4825.00 frames.], tot_loss[loss=0.164, simple_loss=0.2318, pruned_loss=0.04811, over 972605.63 frames.], batch size: 26, lr: 5.63e-04 2022-05-04 13:00:30,624 INFO [train.py:715] (0/8) Epoch 3, batch 9100, loss[loss=0.16, simple_loss=0.2387, pruned_loss=0.0406, over 4868.00 frames.], tot_loss[loss=0.1641, simple_loss=0.232, pruned_loss=0.04804, over 973065.36 frames.], batch size: 16, lr: 5.63e-04 2022-05-04 13:01:11,925 INFO [train.py:715] (0/8) Epoch 3, batch 9150, loss[loss=0.1517, simple_loss=0.2267, pruned_loss=0.0383, over 4784.00 frames.], tot_loss[loss=0.1631, simple_loss=0.2316, pruned_loss=0.04725, over 973063.36 frames.], batch size: 18, lr: 5.63e-04 2022-05-04 13:01:53,287 INFO [train.py:715] (0/8) Epoch 3, batch 9200, loss[loss=0.1592, simple_loss=0.231, pruned_loss=0.04367, over 4922.00 frames.], tot_loss[loss=0.1639, simple_loss=0.2322, pruned_loss=0.04783, over 972577.12 frames.], batch size: 18, lr: 5.63e-04 2022-05-04 13:02:34,668 INFO [train.py:715] (0/8) Epoch 3, batch 9250, loss[loss=0.1691, simple_loss=0.2282, pruned_loss=0.05503, over 4884.00 frames.], tot_loss[loss=0.1645, simple_loss=0.233, pruned_loss=0.04795, over 972776.99 frames.], batch size: 22, lr: 5.62e-04 2022-05-04 13:03:15,396 INFO [train.py:715] (0/8) Epoch 3, batch 9300, loss[loss=0.1908, simple_loss=0.2593, pruned_loss=0.06112, over 4923.00 frames.], tot_loss[loss=0.1638, simple_loss=0.2325, pruned_loss=0.04751, over 973719.54 frames.], batch size: 23, lr: 5.62e-04 2022-05-04 13:03:56,626 INFO [train.py:715] (0/8) Epoch 3, batch 9350, loss[loss=0.1441, simple_loss=0.2243, pruned_loss=0.03197, over 4771.00 frames.], tot_loss[loss=0.1647, simple_loss=0.2331, pruned_loss=0.04813, over 972750.75 frames.], batch size: 17, lr: 5.62e-04 2022-05-04 13:04:38,913 INFO [train.py:715] (0/8) Epoch 3, batch 9400, loss[loss=0.1424, simple_loss=0.2304, pruned_loss=0.0272, over 4985.00 frames.], tot_loss[loss=0.1649, simple_loss=0.2333, pruned_loss=0.04821, over 973113.03 frames.], batch size: 26, lr: 5.62e-04 2022-05-04 13:05:19,299 INFO [train.py:715] (0/8) Epoch 3, batch 9450, loss[loss=0.141, simple_loss=0.2146, pruned_loss=0.03366, over 4950.00 frames.], tot_loss[loss=0.164, simple_loss=0.2327, pruned_loss=0.04762, over 971816.92 frames.], batch size: 29, lr: 5.62e-04 2022-05-04 13:06:00,823 INFO [train.py:715] (0/8) Epoch 3, batch 9500, loss[loss=0.1603, simple_loss=0.2413, pruned_loss=0.03961, over 4820.00 frames.], tot_loss[loss=0.1637, simple_loss=0.2322, pruned_loss=0.04759, over 973109.17 frames.], batch size: 26, lr: 5.62e-04 2022-05-04 13:06:42,709 INFO [train.py:715] (0/8) Epoch 3, batch 9550, loss[loss=0.1546, simple_loss=0.2034, pruned_loss=0.05292, over 4762.00 frames.], tot_loss[loss=0.1647, simple_loss=0.2328, pruned_loss=0.04833, over 971978.10 frames.], batch size: 16, lr: 5.62e-04 2022-05-04 13:07:24,287 INFO [train.py:715] (0/8) Epoch 3, batch 9600, loss[loss=0.1432, simple_loss=0.2166, pruned_loss=0.03491, over 4871.00 frames.], tot_loss[loss=0.1641, simple_loss=0.2322, pruned_loss=0.04799, over 971448.58 frames.], batch size: 16, lr: 5.62e-04 2022-05-04 13:08:05,438 INFO [train.py:715] (0/8) Epoch 3, batch 9650, loss[loss=0.1247, simple_loss=0.1935, pruned_loss=0.0279, over 4991.00 frames.], tot_loss[loss=0.1635, simple_loss=0.2313, pruned_loss=0.04788, over 970389.80 frames.], batch size: 24, lr: 5.61e-04 2022-05-04 13:08:46,947 INFO [train.py:715] (0/8) Epoch 
3, batch 9700, loss[loss=0.1313, simple_loss=0.2054, pruned_loss=0.02857, over 4983.00 frames.], tot_loss[loss=0.1641, simple_loss=0.2321, pruned_loss=0.04807, over 971370.70 frames.], batch size: 15, lr: 5.61e-04 2022-05-04 13:09:27,937 INFO [train.py:715] (0/8) Epoch 3, batch 9750, loss[loss=0.2038, simple_loss=0.2574, pruned_loss=0.07509, over 4777.00 frames.], tot_loss[loss=0.1643, simple_loss=0.2321, pruned_loss=0.04827, over 971272.68 frames.], batch size: 14, lr: 5.61e-04 2022-05-04 13:10:08,813 INFO [train.py:715] (0/8) Epoch 3, batch 9800, loss[loss=0.159, simple_loss=0.23, pruned_loss=0.04402, over 4814.00 frames.], tot_loss[loss=0.164, simple_loss=0.2322, pruned_loss=0.04787, over 972220.27 frames.], batch size: 25, lr: 5.61e-04 2022-05-04 13:10:50,545 INFO [train.py:715] (0/8) Epoch 3, batch 9850, loss[loss=0.1701, simple_loss=0.2436, pruned_loss=0.0483, over 4757.00 frames.], tot_loss[loss=0.1641, simple_loss=0.2321, pruned_loss=0.04798, over 971810.66 frames.], batch size: 19, lr: 5.61e-04 2022-05-04 13:11:32,495 INFO [train.py:715] (0/8) Epoch 3, batch 9900, loss[loss=0.1777, simple_loss=0.2419, pruned_loss=0.05673, over 4836.00 frames.], tot_loss[loss=0.164, simple_loss=0.2322, pruned_loss=0.04785, over 972021.33 frames.], batch size: 15, lr: 5.61e-04 2022-05-04 13:12:12,991 INFO [train.py:715] (0/8) Epoch 3, batch 9950, loss[loss=0.1725, simple_loss=0.2366, pruned_loss=0.05425, over 4780.00 frames.], tot_loss[loss=0.1631, simple_loss=0.2316, pruned_loss=0.04733, over 972458.46 frames.], batch size: 18, lr: 5.61e-04 2022-05-04 13:12:54,729 INFO [train.py:715] (0/8) Epoch 3, batch 10000, loss[loss=0.1411, simple_loss=0.2089, pruned_loss=0.03665, over 4844.00 frames.], tot_loss[loss=0.1634, simple_loss=0.2315, pruned_loss=0.04769, over 972093.03 frames.], batch size: 13, lr: 5.61e-04 2022-05-04 13:13:36,178 INFO [train.py:715] (0/8) Epoch 3, batch 10050, loss[loss=0.1856, simple_loss=0.2548, pruned_loss=0.05819, over 4972.00 frames.], tot_loss[loss=0.1633, simple_loss=0.2311, pruned_loss=0.04779, over 972283.97 frames.], batch size: 28, lr: 5.61e-04 2022-05-04 13:14:17,625 INFO [train.py:715] (0/8) Epoch 3, batch 10100, loss[loss=0.1325, simple_loss=0.195, pruned_loss=0.03502, over 4809.00 frames.], tot_loss[loss=0.1628, simple_loss=0.2304, pruned_loss=0.04758, over 971563.98 frames.], batch size: 13, lr: 5.60e-04 2022-05-04 13:14:58,623 INFO [train.py:715] (0/8) Epoch 3, batch 10150, loss[loss=0.1887, simple_loss=0.2551, pruned_loss=0.06113, over 4872.00 frames.], tot_loss[loss=0.1644, simple_loss=0.232, pruned_loss=0.04843, over 971754.90 frames.], batch size: 20, lr: 5.60e-04 2022-05-04 13:15:40,216 INFO [train.py:715] (0/8) Epoch 3, batch 10200, loss[loss=0.1873, simple_loss=0.2477, pruned_loss=0.06345, over 4867.00 frames.], tot_loss[loss=0.1641, simple_loss=0.232, pruned_loss=0.04813, over 972712.80 frames.], batch size: 20, lr: 5.60e-04 2022-05-04 13:16:21,938 INFO [train.py:715] (0/8) Epoch 3, batch 10250, loss[loss=0.1665, simple_loss=0.2332, pruned_loss=0.04987, over 4927.00 frames.], tot_loss[loss=0.1632, simple_loss=0.2313, pruned_loss=0.04754, over 972084.67 frames.], batch size: 29, lr: 5.60e-04 2022-05-04 13:17:01,803 INFO [train.py:715] (0/8) Epoch 3, batch 10300, loss[loss=0.1841, simple_loss=0.2524, pruned_loss=0.05794, over 4887.00 frames.], tot_loss[loss=0.1633, simple_loss=0.2314, pruned_loss=0.04758, over 972221.53 frames.], batch size: 22, lr: 5.60e-04 2022-05-04 13:17:42,041 INFO [train.py:715] (0/8) Epoch 3, batch 10350, loss[loss=0.1766, 
simple_loss=0.2339, pruned_loss=0.05963, over 4751.00 frames.], tot_loss[loss=0.1636, simple_loss=0.2316, pruned_loss=0.04785, over 972095.99 frames.], batch size: 16, lr: 5.60e-04 2022-05-04 13:18:22,569 INFO [train.py:715] (0/8) Epoch 3, batch 10400, loss[loss=0.1563, simple_loss=0.2309, pruned_loss=0.04088, over 4787.00 frames.], tot_loss[loss=0.1643, simple_loss=0.2322, pruned_loss=0.04821, over 972798.23 frames.], batch size: 19, lr: 5.60e-04 2022-05-04 13:19:03,192 INFO [train.py:715] (0/8) Epoch 3, batch 10450, loss[loss=0.2343, simple_loss=0.2738, pruned_loss=0.09743, over 4931.00 frames.], tot_loss[loss=0.1627, simple_loss=0.2305, pruned_loss=0.04743, over 972177.44 frames.], batch size: 18, lr: 5.60e-04 2022-05-04 13:19:43,613 INFO [train.py:715] (0/8) Epoch 3, batch 10500, loss[loss=0.1522, simple_loss=0.2211, pruned_loss=0.04161, over 4810.00 frames.], tot_loss[loss=0.1626, simple_loss=0.2305, pruned_loss=0.04737, over 972200.33 frames.], batch size: 24, lr: 5.59e-04 2022-05-04 13:20:24,620 INFO [train.py:715] (0/8) Epoch 3, batch 10550, loss[loss=0.2143, simple_loss=0.2764, pruned_loss=0.07611, over 4801.00 frames.], tot_loss[loss=0.1623, simple_loss=0.2302, pruned_loss=0.04724, over 971929.99 frames.], batch size: 21, lr: 5.59e-04 2022-05-04 13:21:07,130 INFO [train.py:715] (0/8) Epoch 3, batch 10600, loss[loss=0.185, simple_loss=0.248, pruned_loss=0.06103, over 4837.00 frames.], tot_loss[loss=0.1623, simple_loss=0.2306, pruned_loss=0.04703, over 972483.89 frames.], batch size: 30, lr: 5.59e-04 2022-05-04 13:21:48,621 INFO [train.py:715] (0/8) Epoch 3, batch 10650, loss[loss=0.1584, simple_loss=0.2319, pruned_loss=0.04245, over 4952.00 frames.], tot_loss[loss=0.1614, simple_loss=0.2297, pruned_loss=0.04655, over 972833.26 frames.], batch size: 24, lr: 5.59e-04 2022-05-04 13:22:30,744 INFO [train.py:715] (0/8) Epoch 3, batch 10700, loss[loss=0.1521, simple_loss=0.2335, pruned_loss=0.03536, over 4933.00 frames.], tot_loss[loss=0.1618, simple_loss=0.2306, pruned_loss=0.04654, over 972698.52 frames.], batch size: 23, lr: 5.59e-04 2022-05-04 13:23:13,517 INFO [train.py:715] (0/8) Epoch 3, batch 10750, loss[loss=0.1956, simple_loss=0.2608, pruned_loss=0.06514, over 4912.00 frames.], tot_loss[loss=0.1631, simple_loss=0.2316, pruned_loss=0.04731, over 972934.84 frames.], batch size: 39, lr: 5.59e-04 2022-05-04 13:23:56,754 INFO [train.py:715] (0/8) Epoch 3, batch 10800, loss[loss=0.1617, simple_loss=0.2391, pruned_loss=0.04215, over 4944.00 frames.], tot_loss[loss=0.1621, simple_loss=0.2305, pruned_loss=0.04678, over 973797.38 frames.], batch size: 39, lr: 5.59e-04 2022-05-04 13:24:38,548 INFO [train.py:715] (0/8) Epoch 3, batch 10850, loss[loss=0.18, simple_loss=0.2475, pruned_loss=0.05627, over 4987.00 frames.], tot_loss[loss=0.162, simple_loss=0.2308, pruned_loss=0.04659, over 972975.29 frames.], batch size: 14, lr: 5.59e-04 2022-05-04 13:25:21,317 INFO [train.py:715] (0/8) Epoch 3, batch 10900, loss[loss=0.1442, simple_loss=0.208, pruned_loss=0.04017, over 4796.00 frames.], tot_loss[loss=0.1633, simple_loss=0.2321, pruned_loss=0.04723, over 972715.33 frames.], batch size: 24, lr: 5.58e-04 2022-05-04 13:26:04,557 INFO [train.py:715] (0/8) Epoch 3, batch 10950, loss[loss=0.1517, simple_loss=0.2225, pruned_loss=0.04042, over 4988.00 frames.], tot_loss[loss=0.1634, simple_loss=0.2321, pruned_loss=0.04734, over 972422.31 frames.], batch size: 15, lr: 5.58e-04 2022-05-04 13:26:46,514 INFO [train.py:715] (0/8) Epoch 3, batch 11000, loss[loss=0.154, simple_loss=0.2304, 
pruned_loss=0.03876, over 4812.00 frames.], tot_loss[loss=0.1631, simple_loss=0.232, pruned_loss=0.04714, over 972535.90 frames.], batch size: 25, lr: 5.58e-04 2022-05-04 13:27:28,081 INFO [train.py:715] (0/8) Epoch 3, batch 11050, loss[loss=0.165, simple_loss=0.2199, pruned_loss=0.05509, over 4916.00 frames.], tot_loss[loss=0.1633, simple_loss=0.232, pruned_loss=0.04732, over 972422.34 frames.], batch size: 23, lr: 5.58e-04 2022-05-04 13:28:11,595 INFO [train.py:715] (0/8) Epoch 3, batch 11100, loss[loss=0.1493, simple_loss=0.2075, pruned_loss=0.04552, over 4920.00 frames.], tot_loss[loss=0.1629, simple_loss=0.2313, pruned_loss=0.04723, over 973785.13 frames.], batch size: 17, lr: 5.58e-04 2022-05-04 13:28:53,676 INFO [train.py:715] (0/8) Epoch 3, batch 11150, loss[loss=0.1621, simple_loss=0.22, pruned_loss=0.05207, over 4973.00 frames.], tot_loss[loss=0.1635, simple_loss=0.2317, pruned_loss=0.04769, over 973295.82 frames.], batch size: 14, lr: 5.58e-04 2022-05-04 13:29:35,735 INFO [train.py:715] (0/8) Epoch 3, batch 11200, loss[loss=0.1459, simple_loss=0.2212, pruned_loss=0.03531, over 4811.00 frames.], tot_loss[loss=0.1642, simple_loss=0.2318, pruned_loss=0.04835, over 973093.64 frames.], batch size: 26, lr: 5.58e-04 2022-05-04 13:30:18,272 INFO [train.py:715] (0/8) Epoch 3, batch 11250, loss[loss=0.1828, simple_loss=0.2534, pruned_loss=0.05613, over 4918.00 frames.], tot_loss[loss=0.1642, simple_loss=0.232, pruned_loss=0.04825, over 973955.56 frames.], batch size: 18, lr: 5.58e-04 2022-05-04 13:31:01,499 INFO [train.py:715] (0/8) Epoch 3, batch 11300, loss[loss=0.154, simple_loss=0.2243, pruned_loss=0.04187, over 4778.00 frames.], tot_loss[loss=0.1644, simple_loss=0.2317, pruned_loss=0.04859, over 973273.82 frames.], batch size: 18, lr: 5.57e-04 2022-05-04 13:31:42,774 INFO [train.py:715] (0/8) Epoch 3, batch 11350, loss[loss=0.1929, simple_loss=0.2431, pruned_loss=0.07132, over 4842.00 frames.], tot_loss[loss=0.1634, simple_loss=0.2308, pruned_loss=0.04798, over 973748.53 frames.], batch size: 32, lr: 5.57e-04 2022-05-04 13:32:25,107 INFO [train.py:715] (0/8) Epoch 3, batch 11400, loss[loss=0.1818, simple_loss=0.2475, pruned_loss=0.05805, over 4778.00 frames.], tot_loss[loss=0.1625, simple_loss=0.2303, pruned_loss=0.04738, over 972099.08 frames.], batch size: 17, lr: 5.57e-04 2022-05-04 13:33:08,054 INFO [train.py:715] (0/8) Epoch 3, batch 11450, loss[loss=0.183, simple_loss=0.2438, pruned_loss=0.06106, over 4824.00 frames.], tot_loss[loss=0.1624, simple_loss=0.2303, pruned_loss=0.04722, over 971984.60 frames.], batch size: 25, lr: 5.57e-04 2022-05-04 13:33:50,200 INFO [train.py:715] (0/8) Epoch 3, batch 11500, loss[loss=0.1615, simple_loss=0.2301, pruned_loss=0.04643, over 4978.00 frames.], tot_loss[loss=0.1623, simple_loss=0.2307, pruned_loss=0.04692, over 971780.93 frames.], batch size: 15, lr: 5.57e-04 2022-05-04 13:34:32,225 INFO [train.py:715] (0/8) Epoch 3, batch 11550, loss[loss=0.1687, simple_loss=0.2353, pruned_loss=0.05108, over 4837.00 frames.], tot_loss[loss=0.1628, simple_loss=0.2308, pruned_loss=0.04734, over 972381.82 frames.], batch size: 15, lr: 5.57e-04 2022-05-04 13:35:14,414 INFO [train.py:715] (0/8) Epoch 3, batch 11600, loss[loss=0.1756, simple_loss=0.2364, pruned_loss=0.05744, over 4808.00 frames.], tot_loss[loss=0.1612, simple_loss=0.2294, pruned_loss=0.04655, over 972256.07 frames.], batch size: 24, lr: 5.57e-04 2022-05-04 13:35:57,171 INFO [train.py:715] (0/8) Epoch 3, batch 11650, loss[loss=0.1476, simple_loss=0.214, pruned_loss=0.04062, over 4755.00 
frames.], tot_loss[loss=0.1613, simple_loss=0.2295, pruned_loss=0.04653, over 971665.78 frames.], batch size: 16, lr: 5.57e-04 2022-05-04 13:36:39,260 INFO [train.py:715] (0/8) Epoch 3, batch 11700, loss[loss=0.1662, simple_loss=0.2362, pruned_loss=0.0481, over 4959.00 frames.], tot_loss[loss=0.1622, simple_loss=0.2305, pruned_loss=0.04698, over 972251.02 frames.], batch size: 21, lr: 5.57e-04 2022-05-04 13:37:21,481 INFO [train.py:715] (0/8) Epoch 3, batch 11750, loss[loss=0.1416, simple_loss=0.2118, pruned_loss=0.03574, over 4917.00 frames.], tot_loss[loss=0.1624, simple_loss=0.2309, pruned_loss=0.04696, over 972588.61 frames.], batch size: 18, lr: 5.56e-04 2022-05-04 13:38:05,285 INFO [train.py:715] (0/8) Epoch 3, batch 11800, loss[loss=0.1916, simple_loss=0.2556, pruned_loss=0.06379, over 4793.00 frames.], tot_loss[loss=0.1631, simple_loss=0.2311, pruned_loss=0.04756, over 972052.06 frames.], batch size: 24, lr: 5.56e-04 2022-05-04 13:38:47,455 INFO [train.py:715] (0/8) Epoch 3, batch 11850, loss[loss=0.1611, simple_loss=0.2315, pruned_loss=0.04537, over 4815.00 frames.], tot_loss[loss=0.1621, simple_loss=0.2303, pruned_loss=0.04697, over 972484.40 frames.], batch size: 12, lr: 5.56e-04 2022-05-04 13:39:29,601 INFO [train.py:715] (0/8) Epoch 3, batch 11900, loss[loss=0.1579, simple_loss=0.2252, pruned_loss=0.04527, over 4822.00 frames.], tot_loss[loss=0.1626, simple_loss=0.2308, pruned_loss=0.04724, over 972600.42 frames.], batch size: 15, lr: 5.56e-04 2022-05-04 13:40:11,699 INFO [train.py:715] (0/8) Epoch 3, batch 11950, loss[loss=0.1573, simple_loss=0.2144, pruned_loss=0.0501, over 4896.00 frames.], tot_loss[loss=0.1633, simple_loss=0.2313, pruned_loss=0.0476, over 972804.73 frames.], batch size: 19, lr: 5.56e-04 2022-05-04 13:40:54,202 INFO [train.py:715] (0/8) Epoch 3, batch 12000, loss[loss=0.1647, simple_loss=0.2292, pruned_loss=0.05008, over 4882.00 frames.], tot_loss[loss=0.1628, simple_loss=0.2313, pruned_loss=0.04717, over 971962.80 frames.], batch size: 19, lr: 5.56e-04 2022-05-04 13:40:54,203 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 13:41:02,574 INFO [train.py:742] (0/8) Epoch 3, validation: loss=0.1142, simple_loss=0.2003, pruned_loss=0.01401, over 914524.00 frames. 
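Each per-batch entry in this log follows a fixed pattern (`Epoch E, batch B, loss[...], tot_loss[loss=..., ...], batch size: S, lr: L`), so the smoothed loss and learning-rate trajectories can be recovered from the raw text with a small parser. A minimal sketch, assuming one log entry per line as written by the training script and a file named `train.log` (both assumptions):

```python
import re

# Matches per-batch entries of the form seen in this log, e.g.
#   Epoch 3, batch 12000, loss[...], tot_loss[loss=0.1628, ...], batch size: 19, lr: 5.56e-04
ENTRY = re.compile(
    r"Epoch (\d+), batch (\d+), .*?"
    r"tot_loss\[loss=([\d.]+).*?\].*?lr: ([\d.e-]+)"
)

def parse_log(path="train.log"):  # file name is an assumption
    """Yield (epoch, batch, tot_loss, lr) tuples from an icefall-style training log."""
    with open(path) as f:
        for line in f:
            m = ENTRY.search(line)
            if m:
                yield (int(m.group(1)), int(m.group(2)),
                       float(m.group(3)), float(m.group(4)))

if __name__ == "__main__":
    # Print the smoothed (tot_loss) curve every 1000 batches of epoch 3.
    for epoch, batch, tot_loss, lr in parse_log():
        if epoch == 3 and batch % 1000 == 0:
            print(f"epoch {epoch} batch {batch}: tot_loss={tot_loss:.4f} lr={lr:.2e}")
```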
2022-05-04 13:41:44,680 INFO [train.py:715] (0/8) Epoch 3, batch 12050, loss[loss=0.1575, simple_loss=0.2326, pruned_loss=0.04123, over 4793.00 frames.], tot_loss[loss=0.1628, simple_loss=0.2312, pruned_loss=0.04719, over 970997.39 frames.], batch size: 24, lr: 5.56e-04 2022-05-04 13:42:26,373 INFO [train.py:715] (0/8) Epoch 3, batch 12100, loss[loss=0.1713, simple_loss=0.2364, pruned_loss=0.05308, over 4815.00 frames.], tot_loss[loss=0.1625, simple_loss=0.2308, pruned_loss=0.04714, over 972116.48 frames.], batch size: 21, lr: 5.56e-04 2022-05-04 13:43:08,777 INFO [train.py:715] (0/8) Epoch 3, batch 12150, loss[loss=0.1429, simple_loss=0.2318, pruned_loss=0.02698, over 4814.00 frames.], tot_loss[loss=0.1626, simple_loss=0.2311, pruned_loss=0.04709, over 972317.03 frames.], batch size: 25, lr: 5.55e-04 2022-05-04 13:43:52,022 INFO [train.py:715] (0/8) Epoch 3, batch 12200, loss[loss=0.1367, simple_loss=0.2115, pruned_loss=0.03092, over 4850.00 frames.], tot_loss[loss=0.1628, simple_loss=0.2314, pruned_loss=0.04704, over 971564.99 frames.], batch size: 15, lr: 5.55e-04 2022-05-04 13:44:33,690 INFO [train.py:715] (0/8) Epoch 3, batch 12250, loss[loss=0.1522, simple_loss=0.2265, pruned_loss=0.03897, over 4841.00 frames.], tot_loss[loss=0.1636, simple_loss=0.2321, pruned_loss=0.04751, over 971544.76 frames.], batch size: 32, lr: 5.55e-04 2022-05-04 13:45:15,591 INFO [train.py:715] (0/8) Epoch 3, batch 12300, loss[loss=0.2024, simple_loss=0.2813, pruned_loss=0.06175, over 4968.00 frames.], tot_loss[loss=0.1643, simple_loss=0.2327, pruned_loss=0.0479, over 971602.18 frames.], batch size: 24, lr: 5.55e-04 2022-05-04 13:45:58,049 INFO [train.py:715] (0/8) Epoch 3, batch 12350, loss[loss=0.1394, simple_loss=0.2056, pruned_loss=0.03662, over 4782.00 frames.], tot_loss[loss=0.1641, simple_loss=0.2325, pruned_loss=0.04788, over 971364.57 frames.], batch size: 18, lr: 5.55e-04 2022-05-04 13:46:41,413 INFO [train.py:715] (0/8) Epoch 3, batch 12400, loss[loss=0.1615, simple_loss=0.2334, pruned_loss=0.04474, over 4808.00 frames.], tot_loss[loss=0.1644, simple_loss=0.2327, pruned_loss=0.04809, over 971073.45 frames.], batch size: 13, lr: 5.55e-04 2022-05-04 13:47:23,073 INFO [train.py:715] (0/8) Epoch 3, batch 12450, loss[loss=0.2759, simple_loss=0.3286, pruned_loss=0.1116, over 4848.00 frames.], tot_loss[loss=0.1651, simple_loss=0.2331, pruned_loss=0.0485, over 971113.68 frames.], batch size: 13, lr: 5.55e-04 2022-05-04 13:48:04,574 INFO [train.py:715] (0/8) Epoch 3, batch 12500, loss[loss=0.1527, simple_loss=0.211, pruned_loss=0.0472, over 4870.00 frames.], tot_loss[loss=0.1643, simple_loss=0.2323, pruned_loss=0.04813, over 971269.80 frames.], batch size: 22, lr: 5.55e-04 2022-05-04 13:48:47,314 INFO [train.py:715] (0/8) Epoch 3, batch 12550, loss[loss=0.1772, simple_loss=0.2324, pruned_loss=0.06099, over 4869.00 frames.], tot_loss[loss=0.1652, simple_loss=0.233, pruned_loss=0.04869, over 971088.59 frames.], batch size: 32, lr: 5.54e-04 2022-05-04 13:49:29,598 INFO [train.py:715] (0/8) Epoch 3, batch 12600, loss[loss=0.1499, simple_loss=0.2176, pruned_loss=0.04117, over 4905.00 frames.], tot_loss[loss=0.1643, simple_loss=0.2325, pruned_loss=0.04804, over 971469.30 frames.], batch size: 19, lr: 5.54e-04 2022-05-04 13:50:11,359 INFO [train.py:715] (0/8) Epoch 3, batch 12650, loss[loss=0.1459, simple_loss=0.2097, pruned_loss=0.04102, over 4886.00 frames.], tot_loss[loss=0.1647, simple_loss=0.2331, pruned_loss=0.04816, over 972079.82 frames.], batch size: 19, lr: 5.54e-04 2022-05-04 13:50:53,058 INFO 
[train.py:715] (0/8) Epoch 3, batch 12700, loss[loss=0.1673, simple_loss=0.2303, pruned_loss=0.05216, over 4839.00 frames.], tot_loss[loss=0.1647, simple_loss=0.2332, pruned_loss=0.04816, over 972390.45 frames.], batch size: 15, lr: 5.54e-04 2022-05-04 13:51:35,171 INFO [train.py:715] (0/8) Epoch 3, batch 12750, loss[loss=0.1621, simple_loss=0.2338, pruned_loss=0.04523, over 4748.00 frames.], tot_loss[loss=0.163, simple_loss=0.2315, pruned_loss=0.04721, over 971225.63 frames.], batch size: 19, lr: 5.54e-04 2022-05-04 13:52:17,429 INFO [train.py:715] (0/8) Epoch 3, batch 12800, loss[loss=0.1695, simple_loss=0.2344, pruned_loss=0.05226, over 4854.00 frames.], tot_loss[loss=0.1634, simple_loss=0.2317, pruned_loss=0.0476, over 971669.33 frames.], batch size: 30, lr: 5.54e-04 2022-05-04 13:52:58,258 INFO [train.py:715] (0/8) Epoch 3, batch 12850, loss[loss=0.1573, simple_loss=0.2322, pruned_loss=0.04125, over 4871.00 frames.], tot_loss[loss=0.163, simple_loss=0.2314, pruned_loss=0.04728, over 972028.20 frames.], batch size: 20, lr: 5.54e-04 2022-05-04 13:53:40,952 INFO [train.py:715] (0/8) Epoch 3, batch 12900, loss[loss=0.1586, simple_loss=0.2245, pruned_loss=0.04637, over 4749.00 frames.], tot_loss[loss=0.1627, simple_loss=0.2313, pruned_loss=0.04711, over 972361.33 frames.], batch size: 19, lr: 5.54e-04 2022-05-04 13:54:23,559 INFO [train.py:715] (0/8) Epoch 3, batch 12950, loss[loss=0.2227, simple_loss=0.2802, pruned_loss=0.0826, over 4854.00 frames.], tot_loss[loss=0.1647, simple_loss=0.2329, pruned_loss=0.04821, over 971585.34 frames.], batch size: 20, lr: 5.54e-04 2022-05-04 13:55:04,925 INFO [train.py:715] (0/8) Epoch 3, batch 13000, loss[loss=0.1572, simple_loss=0.2221, pruned_loss=0.04616, over 4751.00 frames.], tot_loss[loss=0.165, simple_loss=0.2332, pruned_loss=0.04836, over 971573.90 frames.], batch size: 16, lr: 5.53e-04 2022-05-04 13:55:46,797 INFO [train.py:715] (0/8) Epoch 3, batch 13050, loss[loss=0.1657, simple_loss=0.2337, pruned_loss=0.04889, over 4770.00 frames.], tot_loss[loss=0.1643, simple_loss=0.2327, pruned_loss=0.04797, over 971485.68 frames.], batch size: 17, lr: 5.53e-04 2022-05-04 13:56:28,788 INFO [train.py:715] (0/8) Epoch 3, batch 13100, loss[loss=0.173, simple_loss=0.2369, pruned_loss=0.05453, over 4989.00 frames.], tot_loss[loss=0.1637, simple_loss=0.2318, pruned_loss=0.04784, over 970568.51 frames.], batch size: 25, lr: 5.53e-04 2022-05-04 13:57:10,549 INFO [train.py:715] (0/8) Epoch 3, batch 13150, loss[loss=0.1831, simple_loss=0.2606, pruned_loss=0.05283, over 4982.00 frames.], tot_loss[loss=0.1627, simple_loss=0.2317, pruned_loss=0.04689, over 970423.10 frames.], batch size: 25, lr: 5.53e-04 2022-05-04 13:57:52,117 INFO [train.py:715] (0/8) Epoch 3, batch 13200, loss[loss=0.1374, simple_loss=0.2118, pruned_loss=0.03149, over 4831.00 frames.], tot_loss[loss=0.1624, simple_loss=0.2312, pruned_loss=0.04684, over 970471.21 frames.], batch size: 26, lr: 5.53e-04 2022-05-04 13:58:34,747 INFO [train.py:715] (0/8) Epoch 3, batch 13250, loss[loss=0.1431, simple_loss=0.2124, pruned_loss=0.03693, over 4762.00 frames.], tot_loss[loss=0.1632, simple_loss=0.2321, pruned_loss=0.0472, over 971295.64 frames.], batch size: 19, lr: 5.53e-04 2022-05-04 13:59:17,161 INFO [train.py:715] (0/8) Epoch 3, batch 13300, loss[loss=0.1897, simple_loss=0.256, pruned_loss=0.06174, over 4783.00 frames.], tot_loss[loss=0.1631, simple_loss=0.2317, pruned_loss=0.04724, over 971685.75 frames.], batch size: 18, lr: 5.53e-04 2022-05-04 13:59:58,632 INFO [train.py:715] (0/8) Epoch 3, batch 
13350, loss[loss=0.1783, simple_loss=0.2418, pruned_loss=0.05738, over 4938.00 frames.], tot_loss[loss=0.1633, simple_loss=0.232, pruned_loss=0.04732, over 971966.51 frames.], batch size: 29, lr: 5.53e-04 2022-05-04 14:00:40,463 INFO [train.py:715] (0/8) Epoch 3, batch 13400, loss[loss=0.139, simple_loss=0.2052, pruned_loss=0.03636, over 4813.00 frames.], tot_loss[loss=0.1627, simple_loss=0.2311, pruned_loss=0.04718, over 971796.27 frames.], batch size: 27, lr: 5.52e-04 2022-05-04 14:01:23,057 INFO [train.py:715] (0/8) Epoch 3, batch 13450, loss[loss=0.1361, simple_loss=0.2059, pruned_loss=0.03316, over 4944.00 frames.], tot_loss[loss=0.1628, simple_loss=0.2309, pruned_loss=0.04736, over 971728.92 frames.], batch size: 21, lr: 5.52e-04 2022-05-04 14:02:04,523 INFO [train.py:715] (0/8) Epoch 3, batch 13500, loss[loss=0.1973, simple_loss=0.2725, pruned_loss=0.06107, over 4813.00 frames.], tot_loss[loss=0.1635, simple_loss=0.2315, pruned_loss=0.04773, over 972206.61 frames.], batch size: 25, lr: 5.52e-04 2022-05-04 14:02:46,050 INFO [train.py:715] (0/8) Epoch 3, batch 13550, loss[loss=0.1058, simple_loss=0.1787, pruned_loss=0.01643, over 4809.00 frames.], tot_loss[loss=0.1632, simple_loss=0.2313, pruned_loss=0.04754, over 971910.26 frames.], batch size: 12, lr: 5.52e-04 2022-05-04 14:03:28,378 INFO [train.py:715] (0/8) Epoch 3, batch 13600, loss[loss=0.1391, simple_loss=0.2164, pruned_loss=0.03088, over 4750.00 frames.], tot_loss[loss=0.1641, simple_loss=0.2321, pruned_loss=0.04801, over 972105.82 frames.], batch size: 16, lr: 5.52e-04 2022-05-04 14:04:10,279 INFO [train.py:715] (0/8) Epoch 3, batch 13650, loss[loss=0.1786, simple_loss=0.2504, pruned_loss=0.05336, over 4824.00 frames.], tot_loss[loss=0.1628, simple_loss=0.2313, pruned_loss=0.04715, over 972697.03 frames.], batch size: 26, lr: 5.52e-04 2022-05-04 14:04:51,700 INFO [train.py:715] (0/8) Epoch 3, batch 13700, loss[loss=0.1587, simple_loss=0.228, pruned_loss=0.04473, over 4936.00 frames.], tot_loss[loss=0.1638, simple_loss=0.2322, pruned_loss=0.04774, over 972592.92 frames.], batch size: 23, lr: 5.52e-04 2022-05-04 14:05:34,459 INFO [train.py:715] (0/8) Epoch 3, batch 13750, loss[loss=0.1418, simple_loss=0.2166, pruned_loss=0.0335, over 4826.00 frames.], tot_loss[loss=0.1636, simple_loss=0.2318, pruned_loss=0.04767, over 972207.09 frames.], batch size: 26, lr: 5.52e-04 2022-05-04 14:06:16,562 INFO [train.py:715] (0/8) Epoch 3, batch 13800, loss[loss=0.1844, simple_loss=0.2404, pruned_loss=0.06421, over 4876.00 frames.], tot_loss[loss=0.1624, simple_loss=0.2305, pruned_loss=0.04715, over 971215.60 frames.], batch size: 22, lr: 5.52e-04 2022-05-04 14:06:58,025 INFO [train.py:715] (0/8) Epoch 3, batch 13850, loss[loss=0.1549, simple_loss=0.2194, pruned_loss=0.04517, over 4952.00 frames.], tot_loss[loss=0.1622, simple_loss=0.2304, pruned_loss=0.04699, over 972189.01 frames.], batch size: 14, lr: 5.51e-04 2022-05-04 14:07:39,261 INFO [train.py:715] (0/8) Epoch 3, batch 13900, loss[loss=0.159, simple_loss=0.2331, pruned_loss=0.04242, over 4917.00 frames.], tot_loss[loss=0.1625, simple_loss=0.2308, pruned_loss=0.04709, over 971697.63 frames.], batch size: 18, lr: 5.51e-04 2022-05-04 14:08:21,702 INFO [train.py:715] (0/8) Epoch 3, batch 13950, loss[loss=0.1333, simple_loss=0.2046, pruned_loss=0.03098, over 4866.00 frames.], tot_loss[loss=0.1621, simple_loss=0.2303, pruned_loss=0.0469, over 971643.45 frames.], batch size: 20, lr: 5.51e-04 2022-05-04 14:09:04,158 INFO [train.py:715] (0/8) Epoch 3, batch 14000, loss[loss=0.1462, 
simple_loss=0.2072, pruned_loss=0.04262, over 4986.00 frames.], tot_loss[loss=0.1615, simple_loss=0.2303, pruned_loss=0.04636, over 972247.07 frames.], batch size: 14, lr: 5.51e-04 2022-05-04 14:09:45,587 INFO [train.py:715] (0/8) Epoch 3, batch 14050, loss[loss=0.1404, simple_loss=0.2045, pruned_loss=0.03813, over 4886.00 frames.], tot_loss[loss=0.162, simple_loss=0.2309, pruned_loss=0.0466, over 972295.33 frames.], batch size: 22, lr: 5.51e-04 2022-05-04 14:10:28,387 INFO [train.py:715] (0/8) Epoch 3, batch 14100, loss[loss=0.1777, simple_loss=0.2316, pruned_loss=0.06192, over 4882.00 frames.], tot_loss[loss=0.1622, simple_loss=0.231, pruned_loss=0.04668, over 972065.41 frames.], batch size: 32, lr: 5.51e-04 2022-05-04 14:11:10,224 INFO [train.py:715] (0/8) Epoch 3, batch 14150, loss[loss=0.1182, simple_loss=0.1878, pruned_loss=0.02423, over 4844.00 frames.], tot_loss[loss=0.1615, simple_loss=0.2305, pruned_loss=0.04621, over 972991.15 frames.], batch size: 13, lr: 5.51e-04 2022-05-04 14:11:51,368 INFO [train.py:715] (0/8) Epoch 3, batch 14200, loss[loss=0.1534, simple_loss=0.2255, pruned_loss=0.04069, over 4874.00 frames.], tot_loss[loss=0.1604, simple_loss=0.2293, pruned_loss=0.04578, over 972386.11 frames.], batch size: 16, lr: 5.51e-04 2022-05-04 14:12:33,503 INFO [train.py:715] (0/8) Epoch 3, batch 14250, loss[loss=0.1553, simple_loss=0.2258, pruned_loss=0.04239, over 4822.00 frames.], tot_loss[loss=0.162, simple_loss=0.2308, pruned_loss=0.04662, over 972383.43 frames.], batch size: 12, lr: 5.51e-04 2022-05-04 14:13:15,869 INFO [train.py:715] (0/8) Epoch 3, batch 14300, loss[loss=0.1517, simple_loss=0.2294, pruned_loss=0.03702, over 4754.00 frames.], tot_loss[loss=0.1625, simple_loss=0.2312, pruned_loss=0.04689, over 972445.25 frames.], batch size: 16, lr: 5.50e-04 2022-05-04 14:13:58,169 INFO [train.py:715] (0/8) Epoch 3, batch 14350, loss[loss=0.1646, simple_loss=0.241, pruned_loss=0.04411, over 4960.00 frames.], tot_loss[loss=0.1628, simple_loss=0.2315, pruned_loss=0.04705, over 972276.77 frames.], batch size: 24, lr: 5.50e-04 2022-05-04 14:14:38,947 INFO [train.py:715] (0/8) Epoch 3, batch 14400, loss[loss=0.1792, simple_loss=0.2508, pruned_loss=0.05375, over 4980.00 frames.], tot_loss[loss=0.1622, simple_loss=0.231, pruned_loss=0.04671, over 972697.27 frames.], batch size: 28, lr: 5.50e-04 2022-05-04 14:15:21,403 INFO [train.py:715] (0/8) Epoch 3, batch 14450, loss[loss=0.2232, simple_loss=0.2891, pruned_loss=0.07861, over 4808.00 frames.], tot_loss[loss=0.1623, simple_loss=0.231, pruned_loss=0.0468, over 972900.08 frames.], batch size: 21, lr: 5.50e-04 2022-05-04 14:16:03,349 INFO [train.py:715] (0/8) Epoch 3, batch 14500, loss[loss=0.1468, simple_loss=0.2246, pruned_loss=0.03454, over 4816.00 frames.], tot_loss[loss=0.1625, simple_loss=0.2312, pruned_loss=0.04688, over 972959.38 frames.], batch size: 27, lr: 5.50e-04 2022-05-04 14:16:44,525 INFO [train.py:715] (0/8) Epoch 3, batch 14550, loss[loss=0.1744, simple_loss=0.2385, pruned_loss=0.05515, over 4941.00 frames.], tot_loss[loss=0.1629, simple_loss=0.2314, pruned_loss=0.04726, over 972293.56 frames.], batch size: 29, lr: 5.50e-04 2022-05-04 14:17:26,977 INFO [train.py:715] (0/8) Epoch 3, batch 14600, loss[loss=0.1502, simple_loss=0.2169, pruned_loss=0.04181, over 4944.00 frames.], tot_loss[loss=0.1624, simple_loss=0.2308, pruned_loss=0.047, over 972328.74 frames.], batch size: 21, lr: 5.50e-04 2022-05-04 14:18:08,863 INFO [train.py:715] (0/8) Epoch 3, batch 14650, loss[loss=0.1376, simple_loss=0.2052, pruned_loss=0.03507, 
over 4764.00 frames.], tot_loss[loss=0.1628, simple_loss=0.2313, pruned_loss=0.04719, over 971923.98 frames.], batch size: 19, lr: 5.50e-04 2022-05-04 14:18:50,921 INFO [train.py:715] (0/8) Epoch 3, batch 14700, loss[loss=0.1673, simple_loss=0.2319, pruned_loss=0.05132, over 4854.00 frames.], tot_loss[loss=0.1633, simple_loss=0.2318, pruned_loss=0.0474, over 971779.24 frames.], batch size: 20, lr: 5.49e-04 2022-05-04 14:19:32,216 INFO [train.py:715] (0/8) Epoch 3, batch 14750, loss[loss=0.1902, simple_loss=0.2591, pruned_loss=0.06064, over 4965.00 frames.], tot_loss[loss=0.1636, simple_loss=0.2321, pruned_loss=0.04753, over 971298.69 frames.], batch size: 24, lr: 5.49e-04 2022-05-04 14:20:14,630 INFO [train.py:715] (0/8) Epoch 3, batch 14800, loss[loss=0.174, simple_loss=0.2374, pruned_loss=0.0553, over 4978.00 frames.], tot_loss[loss=0.1622, simple_loss=0.2306, pruned_loss=0.04689, over 972603.53 frames.], batch size: 15, lr: 5.49e-04 2022-05-04 14:20:56,940 INFO [train.py:715] (0/8) Epoch 3, batch 14850, loss[loss=0.1629, simple_loss=0.2396, pruned_loss=0.04307, over 4943.00 frames.], tot_loss[loss=0.1624, simple_loss=0.2305, pruned_loss=0.04721, over 972503.68 frames.], batch size: 21, lr: 5.49e-04 2022-05-04 14:21:37,853 INFO [train.py:715] (0/8) Epoch 3, batch 14900, loss[loss=0.1895, simple_loss=0.2576, pruned_loss=0.06072, over 4832.00 frames.], tot_loss[loss=0.1626, simple_loss=0.2306, pruned_loss=0.04726, over 972534.82 frames.], batch size: 15, lr: 5.49e-04 2022-05-04 14:22:20,812 INFO [train.py:715] (0/8) Epoch 3, batch 14950, loss[loss=0.1648, simple_loss=0.229, pruned_loss=0.05031, over 4829.00 frames.], tot_loss[loss=0.1621, simple_loss=0.2302, pruned_loss=0.04699, over 972299.38 frames.], batch size: 27, lr: 5.49e-04 2022-05-04 14:23:02,210 INFO [train.py:715] (0/8) Epoch 3, batch 15000, loss[loss=0.1254, simple_loss=0.1999, pruned_loss=0.02545, over 4849.00 frames.], tot_loss[loss=0.1626, simple_loss=0.2306, pruned_loss=0.04736, over 972357.86 frames.], batch size: 13, lr: 5.49e-04 2022-05-04 14:23:02,211 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 14:23:10,876 INFO [train.py:742] (0/8) Epoch 3, validation: loss=0.1142, simple_loss=0.2003, pruned_loss=0.01402, over 914524.00 frames. 
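A note on reading these entries: throughout this stretch the reported loss is consistent with 0.5 * simple_loss + pruned_loss, i.e. the simple_loss_scale of 0.5 from the startup configuration applied to the simple term, with the pruned term apparently at full weight this far past the model_warm_step of 3000. For the validation entry just above, 0.5 * 0.2003 + 0.01402 ~= 0.1142. The snippet below is only a reader's check over those logged numbers (combined_loss is a helper defined here, not code from train.py):

import math

# Reader's check (not code from train.py): the logged "loss" values here are
# consistent with
#     loss = simple_loss_scale * simple_loss + pruned_loss
# using the simple_loss_scale = 0.5 shown in the run configuration at startup.
SIMPLE_LOSS_SCALE = 0.5

def combined_loss(simple_loss: float, pruned_loss: float) -> float:
    return SIMPLE_LOSS_SCALE * simple_loss + pruned_loss

# Validation entry at batch 15000 above: loss=0.1142, simple_loss=0.2003, pruned_loss=0.01402
assert math.isclose(combined_loss(0.2003, 0.01402), 0.1142, abs_tol=5e-4)
# Training entry at batch 14950 above: tot_loss 0.1621 vs 0.5*0.2302 + 0.04699
assert math.isclose(combined_loss(0.2302, 0.04699), 0.1621, abs_tol=5e-4)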
2022-05-04 14:23:52,704 INFO [train.py:715] (0/8) Epoch 3, batch 15050, loss[loss=0.1533, simple_loss=0.2198, pruned_loss=0.04341, over 4781.00 frames.], tot_loss[loss=0.1619, simple_loss=0.2301, pruned_loss=0.04689, over 971920.85 frames.], batch size: 14, lr: 5.49e-04 2022-05-04 14:24:34,025 INFO [train.py:715] (0/8) Epoch 3, batch 15100, loss[loss=0.1334, simple_loss=0.2043, pruned_loss=0.03127, over 4754.00 frames.], tot_loss[loss=0.1628, simple_loss=0.2306, pruned_loss=0.04749, over 971831.33 frames.], batch size: 12, lr: 5.49e-04 2022-05-04 14:25:16,181 INFO [train.py:715] (0/8) Epoch 3, batch 15150, loss[loss=0.1371, simple_loss=0.2028, pruned_loss=0.03566, over 4750.00 frames.], tot_loss[loss=0.1627, simple_loss=0.2306, pruned_loss=0.04736, over 971313.60 frames.], batch size: 19, lr: 5.48e-04 2022-05-04 14:25:57,808 INFO [train.py:715] (0/8) Epoch 3, batch 15200, loss[loss=0.1461, simple_loss=0.2192, pruned_loss=0.03648, over 4692.00 frames.], tot_loss[loss=0.1635, simple_loss=0.2313, pruned_loss=0.04784, over 970734.12 frames.], batch size: 15, lr: 5.48e-04 2022-05-04 14:26:39,370 INFO [train.py:715] (0/8) Epoch 3, batch 15250, loss[loss=0.1303, simple_loss=0.2062, pruned_loss=0.02716, over 4969.00 frames.], tot_loss[loss=0.1633, simple_loss=0.2312, pruned_loss=0.0477, over 970996.90 frames.], batch size: 24, lr: 5.48e-04 2022-05-04 14:27:20,706 INFO [train.py:715] (0/8) Epoch 3, batch 15300, loss[loss=0.1803, simple_loss=0.251, pruned_loss=0.05482, over 4798.00 frames.], tot_loss[loss=0.1633, simple_loss=0.2313, pruned_loss=0.04763, over 971217.60 frames.], batch size: 21, lr: 5.48e-04 2022-05-04 14:28:02,522 INFO [train.py:715] (0/8) Epoch 3, batch 15350, loss[loss=0.1773, simple_loss=0.2402, pruned_loss=0.05718, over 4977.00 frames.], tot_loss[loss=0.1629, simple_loss=0.2313, pruned_loss=0.04725, over 971819.90 frames.], batch size: 14, lr: 5.48e-04 2022-05-04 14:28:44,643 INFO [train.py:715] (0/8) Epoch 3, batch 15400, loss[loss=0.1397, simple_loss=0.2191, pruned_loss=0.03009, over 4792.00 frames.], tot_loss[loss=0.1623, simple_loss=0.2308, pruned_loss=0.04694, over 971771.41 frames.], batch size: 17, lr: 5.48e-04 2022-05-04 14:29:25,738 INFO [train.py:715] (0/8) Epoch 3, batch 15450, loss[loss=0.1791, simple_loss=0.2324, pruned_loss=0.06292, over 4756.00 frames.], tot_loss[loss=0.1626, simple_loss=0.2309, pruned_loss=0.0471, over 971820.66 frames.], batch size: 14, lr: 5.48e-04 2022-05-04 14:30:08,679 INFO [train.py:715] (0/8) Epoch 3, batch 15500, loss[loss=0.1684, simple_loss=0.2416, pruned_loss=0.04763, over 4800.00 frames.], tot_loss[loss=0.163, simple_loss=0.2313, pruned_loss=0.0473, over 972461.11 frames.], batch size: 21, lr: 5.48e-04 2022-05-04 14:30:50,506 INFO [train.py:715] (0/8) Epoch 3, batch 15550, loss[loss=0.154, simple_loss=0.2254, pruned_loss=0.04127, over 4835.00 frames.], tot_loss[loss=0.1641, simple_loss=0.2326, pruned_loss=0.04785, over 972494.46 frames.], batch size: 15, lr: 5.48e-04 2022-05-04 14:31:06,844 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-120000.pt 2022-05-04 14:31:35,091 INFO [train.py:715] (0/8) Epoch 3, batch 15600, loss[loss=0.1714, simple_loss=0.2446, pruned_loss=0.04905, over 4806.00 frames.], tot_loss[loss=0.1637, simple_loss=0.232, pruned_loss=0.04773, over 972096.25 frames.], batch size: 21, lr: 5.47e-04 2022-05-04 14:32:16,097 INFO [train.py:715] (0/8) Epoch 3, batch 15650, loss[loss=0.1977, simple_loss=0.2499, pruned_loss=0.07271, over 4697.00 frames.], tot_loss[loss=0.1636, 
simple_loss=0.2316, pruned_loss=0.04786, over 973007.04 frames.], batch size: 15, lr: 5.47e-04 2022-05-04 14:32:57,687 INFO [train.py:715] (0/8) Epoch 3, batch 15700, loss[loss=0.1362, simple_loss=0.2168, pruned_loss=0.02776, over 4889.00 frames.], tot_loss[loss=0.1621, simple_loss=0.2303, pruned_loss=0.04698, over 972590.16 frames.], batch size: 19, lr: 5.47e-04 2022-05-04 14:33:40,522 INFO [train.py:715] (0/8) Epoch 3, batch 15750, loss[loss=0.1785, simple_loss=0.2403, pruned_loss=0.05835, over 4868.00 frames.], tot_loss[loss=0.1616, simple_loss=0.2299, pruned_loss=0.04666, over 971950.93 frames.], batch size: 30, lr: 5.47e-04 2022-05-04 14:34:22,329 INFO [train.py:715] (0/8) Epoch 3, batch 15800, loss[loss=0.1753, simple_loss=0.2279, pruned_loss=0.06136, over 4790.00 frames.], tot_loss[loss=0.1617, simple_loss=0.2299, pruned_loss=0.04675, over 971753.49 frames.], batch size: 13, lr: 5.47e-04 2022-05-04 14:35:03,580 INFO [train.py:715] (0/8) Epoch 3, batch 15850, loss[loss=0.1523, simple_loss=0.2253, pruned_loss=0.0396, over 4898.00 frames.], tot_loss[loss=0.1614, simple_loss=0.2299, pruned_loss=0.04647, over 972384.61 frames.], batch size: 19, lr: 5.47e-04 2022-05-04 14:35:45,953 INFO [train.py:715] (0/8) Epoch 3, batch 15900, loss[loss=0.2051, simple_loss=0.2763, pruned_loss=0.06694, over 4775.00 frames.], tot_loss[loss=0.1623, simple_loss=0.2305, pruned_loss=0.04702, over 972533.63 frames.], batch size: 17, lr: 5.47e-04 2022-05-04 14:36:28,596 INFO [train.py:715] (0/8) Epoch 3, batch 15950, loss[loss=0.1472, simple_loss=0.2174, pruned_loss=0.03845, over 4896.00 frames.], tot_loss[loss=0.163, simple_loss=0.2311, pruned_loss=0.04746, over 972113.11 frames.], batch size: 22, lr: 5.47e-04 2022-05-04 14:37:09,197 INFO [train.py:715] (0/8) Epoch 3, batch 16000, loss[loss=0.1526, simple_loss=0.2229, pruned_loss=0.04112, over 4699.00 frames.], tot_loss[loss=0.1636, simple_loss=0.2318, pruned_loss=0.04767, over 972794.70 frames.], batch size: 15, lr: 5.47e-04 2022-05-04 14:37:50,843 INFO [train.py:715] (0/8) Epoch 3, batch 16050, loss[loss=0.1619, simple_loss=0.2365, pruned_loss=0.04369, over 4851.00 frames.], tot_loss[loss=0.1643, simple_loss=0.2326, pruned_loss=0.04794, over 972702.57 frames.], batch size: 16, lr: 5.46e-04 2022-05-04 14:38:33,473 INFO [train.py:715] (0/8) Epoch 3, batch 16100, loss[loss=0.1632, simple_loss=0.2395, pruned_loss=0.04341, over 4828.00 frames.], tot_loss[loss=0.1634, simple_loss=0.2321, pruned_loss=0.04739, over 973011.48 frames.], batch size: 13, lr: 5.46e-04 2022-05-04 14:39:15,444 INFO [train.py:715] (0/8) Epoch 3, batch 16150, loss[loss=0.1827, simple_loss=0.244, pruned_loss=0.06074, over 4769.00 frames.], tot_loss[loss=0.1634, simple_loss=0.2317, pruned_loss=0.04759, over 971961.29 frames.], batch size: 14, lr: 5.46e-04 2022-05-04 14:39:56,184 INFO [train.py:715] (0/8) Epoch 3, batch 16200, loss[loss=0.13, simple_loss=0.1987, pruned_loss=0.03068, over 4807.00 frames.], tot_loss[loss=0.1632, simple_loss=0.2316, pruned_loss=0.04741, over 971733.28 frames.], batch size: 12, lr: 5.46e-04 2022-05-04 14:40:38,473 INFO [train.py:715] (0/8) Epoch 3, batch 16250, loss[loss=0.1778, simple_loss=0.2465, pruned_loss=0.05458, over 4902.00 frames.], tot_loss[loss=0.1635, simple_loss=0.232, pruned_loss=0.04744, over 972070.30 frames.], batch size: 19, lr: 5.46e-04 2022-05-04 14:41:20,553 INFO [train.py:715] (0/8) Epoch 3, batch 16300, loss[loss=0.158, simple_loss=0.2221, pruned_loss=0.04692, over 4948.00 frames.], tot_loss[loss=0.1633, simple_loss=0.2319, 
pruned_loss=0.04739, over 972210.54 frames.], batch size: 14, lr: 5.46e-04 2022-05-04 14:42:01,216 INFO [train.py:715] (0/8) Epoch 3, batch 16350, loss[loss=0.1519, simple_loss=0.2289, pruned_loss=0.03744, over 4770.00 frames.], tot_loss[loss=0.163, simple_loss=0.2314, pruned_loss=0.04727, over 972093.16 frames.], batch size: 18, lr: 5.46e-04 2022-05-04 14:42:43,183 INFO [train.py:715] (0/8) Epoch 3, batch 16400, loss[loss=0.1484, simple_loss=0.2197, pruned_loss=0.03856, over 4926.00 frames.], tot_loss[loss=0.1624, simple_loss=0.2305, pruned_loss=0.04712, over 971417.16 frames.], batch size: 23, lr: 5.46e-04 2022-05-04 14:43:25,719 INFO [train.py:715] (0/8) Epoch 3, batch 16450, loss[loss=0.2836, simple_loss=0.3092, pruned_loss=0.129, over 4788.00 frames.], tot_loss[loss=0.1629, simple_loss=0.2309, pruned_loss=0.04747, over 971962.59 frames.], batch size: 18, lr: 5.45e-04 2022-05-04 14:44:08,335 INFO [train.py:715] (0/8) Epoch 3, batch 16500, loss[loss=0.1504, simple_loss=0.2142, pruned_loss=0.04326, over 4831.00 frames.], tot_loss[loss=0.1635, simple_loss=0.2317, pruned_loss=0.04762, over 972216.12 frames.], batch size: 15, lr: 5.45e-04 2022-05-04 14:44:49,049 INFO [train.py:715] (0/8) Epoch 3, batch 16550, loss[loss=0.1943, simple_loss=0.2618, pruned_loss=0.06338, over 4844.00 frames.], tot_loss[loss=0.164, simple_loss=0.2318, pruned_loss=0.04814, over 971863.57 frames.], batch size: 30, lr: 5.45e-04 2022-05-04 14:45:31,911 INFO [train.py:715] (0/8) Epoch 3, batch 16600, loss[loss=0.1837, simple_loss=0.2461, pruned_loss=0.06062, over 4835.00 frames.], tot_loss[loss=0.1638, simple_loss=0.2313, pruned_loss=0.04816, over 971855.67 frames.], batch size: 30, lr: 5.45e-04 2022-05-04 14:46:14,679 INFO [train.py:715] (0/8) Epoch 3, batch 16650, loss[loss=0.1533, simple_loss=0.2214, pruned_loss=0.04262, over 4857.00 frames.], tot_loss[loss=0.1633, simple_loss=0.2307, pruned_loss=0.04802, over 973016.83 frames.], batch size: 20, lr: 5.45e-04 2022-05-04 14:46:55,374 INFO [train.py:715] (0/8) Epoch 3, batch 16700, loss[loss=0.1469, simple_loss=0.2205, pruned_loss=0.03666, over 4980.00 frames.], tot_loss[loss=0.1623, simple_loss=0.2301, pruned_loss=0.04722, over 972899.83 frames.], batch size: 28, lr: 5.45e-04 2022-05-04 14:47:37,406 INFO [train.py:715] (0/8) Epoch 3, batch 16750, loss[loss=0.1492, simple_loss=0.2271, pruned_loss=0.03565, over 4883.00 frames.], tot_loss[loss=0.1628, simple_loss=0.2306, pruned_loss=0.04745, over 972871.08 frames.], batch size: 16, lr: 5.45e-04 2022-05-04 14:48:19,847 INFO [train.py:715] (0/8) Epoch 3, batch 16800, loss[loss=0.116, simple_loss=0.1818, pruned_loss=0.02511, over 4794.00 frames.], tot_loss[loss=0.1615, simple_loss=0.2294, pruned_loss=0.04676, over 973252.42 frames.], batch size: 12, lr: 5.45e-04 2022-05-04 14:49:01,330 INFO [train.py:715] (0/8) Epoch 3, batch 16850, loss[loss=0.193, simple_loss=0.2509, pruned_loss=0.06758, over 4774.00 frames.], tot_loss[loss=0.1613, simple_loss=0.2293, pruned_loss=0.04659, over 972692.41 frames.], batch size: 14, lr: 5.45e-04 2022-05-04 14:49:42,729 INFO [train.py:715] (0/8) Epoch 3, batch 16900, loss[loss=0.1645, simple_loss=0.2298, pruned_loss=0.04954, over 4875.00 frames.], tot_loss[loss=0.1615, simple_loss=0.2292, pruned_loss=0.04686, over 972797.90 frames.], batch size: 22, lr: 5.44e-04 2022-05-04 14:50:24,685 INFO [train.py:715] (0/8) Epoch 3, batch 16950, loss[loss=0.1458, simple_loss=0.2224, pruned_loss=0.03458, over 4966.00 frames.], tot_loss[loss=0.1611, simple_loss=0.2292, pruned_loss=0.04648, over 972869.80 
frames.], batch size: 28, lr: 5.44e-04 2022-05-04 14:51:07,245 INFO [train.py:715] (0/8) Epoch 3, batch 17000, loss[loss=0.1261, simple_loss=0.1987, pruned_loss=0.02679, over 4652.00 frames.], tot_loss[loss=0.1609, simple_loss=0.2293, pruned_loss=0.04628, over 971985.32 frames.], batch size: 13, lr: 5.44e-04 2022-05-04 14:51:47,561 INFO [train.py:715] (0/8) Epoch 3, batch 17050, loss[loss=0.1846, simple_loss=0.2343, pruned_loss=0.06748, over 4900.00 frames.], tot_loss[loss=0.1609, simple_loss=0.2293, pruned_loss=0.04627, over 972233.92 frames.], batch size: 17, lr: 5.44e-04 2022-05-04 14:52:29,480 INFO [train.py:715] (0/8) Epoch 3, batch 17100, loss[loss=0.1458, simple_loss=0.2243, pruned_loss=0.03363, over 4779.00 frames.], tot_loss[loss=0.161, simple_loss=0.2291, pruned_loss=0.04646, over 972362.57 frames.], batch size: 19, lr: 5.44e-04 2022-05-04 14:53:11,180 INFO [train.py:715] (0/8) Epoch 3, batch 17150, loss[loss=0.1566, simple_loss=0.2258, pruned_loss=0.04374, over 4934.00 frames.], tot_loss[loss=0.1607, simple_loss=0.2288, pruned_loss=0.04629, over 972480.56 frames.], batch size: 29, lr: 5.44e-04 2022-05-04 14:53:52,357 INFO [train.py:715] (0/8) Epoch 3, batch 17200, loss[loss=0.145, simple_loss=0.212, pruned_loss=0.03895, over 4908.00 frames.], tot_loss[loss=0.1601, simple_loss=0.2284, pruned_loss=0.04588, over 972804.52 frames.], batch size: 17, lr: 5.44e-04 2022-05-04 14:54:33,051 INFO [train.py:715] (0/8) Epoch 3, batch 17250, loss[loss=0.1547, simple_loss=0.2289, pruned_loss=0.04025, over 4845.00 frames.], tot_loss[loss=0.1597, simple_loss=0.2283, pruned_loss=0.04553, over 973247.91 frames.], batch size: 26, lr: 5.44e-04 2022-05-04 14:55:14,505 INFO [train.py:715] (0/8) Epoch 3, batch 17300, loss[loss=0.125, simple_loss=0.2017, pruned_loss=0.02418, over 4911.00 frames.], tot_loss[loss=0.1601, simple_loss=0.229, pruned_loss=0.04565, over 973004.03 frames.], batch size: 19, lr: 5.44e-04 2022-05-04 14:55:56,120 INFO [train.py:715] (0/8) Epoch 3, batch 17350, loss[loss=0.14, simple_loss=0.2091, pruned_loss=0.03547, over 4842.00 frames.], tot_loss[loss=0.1614, simple_loss=0.2299, pruned_loss=0.0464, over 972694.84 frames.], batch size: 30, lr: 5.43e-04 2022-05-04 14:56:36,184 INFO [train.py:715] (0/8) Epoch 3, batch 17400, loss[loss=0.1501, simple_loss=0.2328, pruned_loss=0.03368, over 4874.00 frames.], tot_loss[loss=0.1614, simple_loss=0.2299, pruned_loss=0.04651, over 972713.64 frames.], batch size: 22, lr: 5.43e-04 2022-05-04 14:57:18,251 INFO [train.py:715] (0/8) Epoch 3, batch 17450, loss[loss=0.1794, simple_loss=0.25, pruned_loss=0.05436, over 4882.00 frames.], tot_loss[loss=0.1613, simple_loss=0.2297, pruned_loss=0.04638, over 973066.07 frames.], batch size: 22, lr: 5.43e-04 2022-05-04 14:58:00,474 INFO [train.py:715] (0/8) Epoch 3, batch 17500, loss[loss=0.2233, simple_loss=0.2808, pruned_loss=0.08294, over 4908.00 frames.], tot_loss[loss=0.1635, simple_loss=0.232, pruned_loss=0.0475, over 972317.79 frames.], batch size: 17, lr: 5.43e-04 2022-05-04 14:58:41,509 INFO [train.py:715] (0/8) Epoch 3, batch 17550, loss[loss=0.132, simple_loss=0.2054, pruned_loss=0.02931, over 4805.00 frames.], tot_loss[loss=0.163, simple_loss=0.2314, pruned_loss=0.04736, over 971691.15 frames.], batch size: 21, lr: 5.43e-04 2022-05-04 14:59:22,855 INFO [train.py:715] (0/8) Epoch 3, batch 17600, loss[loss=0.1953, simple_loss=0.2546, pruned_loss=0.06802, over 4981.00 frames.], tot_loss[loss=0.1624, simple_loss=0.2307, pruned_loss=0.04704, over 971802.97 frames.], batch size: 33, lr: 5.43e-04 
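For anyone mining these lines offline (e.g. to plot tot_loss or the learning-rate decay), every per-batch entry follows the same "Epoch E, batch B, loss[...], tot_loss[loss=..., ...], batch size: S, lr: L" shape, so a single regular expression recovers the useful fields. The helper below is a reader-side sketch, not part of icefall; it is applied to the batch 17600 entry just above:

import re

# Reader-side helper (not part of icefall): extract epoch, batch, tot_loss and lr
# from one "Epoch E, batch B, ... tot_loss[loss=..., ...], ... lr: L" log entry.
ENTRY = re.compile(
    r"Epoch (\d+), batch (\d+),.*?tot_loss\[loss=([\d.]+),.*?lr: ([\d.e+-]+)"
)

line = ("2022-05-04 14:59:22,855 INFO [train.py:715] (0/8) Epoch 3, batch 17600, "
        "loss[loss=0.1953, simple_loss=0.2546, pruned_loss=0.06802, over 4981.00 frames.], "
        "tot_loss[loss=0.1624, simple_loss=0.2307, pruned_loss=0.04704, over 971802.97 frames.], "
        "batch size: 33, lr: 5.43e-04")

m = ENTRY.search(line)
assert m is not None
epoch, batch = int(m.group(1)), int(m.group(2))
tot_loss, lr = float(m.group(3)), float(m.group(4))
print(epoch, batch, tot_loss, lr)  # -> 3 17600 0.1624 0.000543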
2022-05-04 15:00:04,535 INFO [train.py:715] (0/8) Epoch 3, batch 17650, loss[loss=0.1312, simple_loss=0.2024, pruned_loss=0.03006, over 4926.00 frames.], tot_loss[loss=0.1619, simple_loss=0.2303, pruned_loss=0.04673, over 971997.16 frames.], batch size: 29, lr: 5.43e-04 2022-05-04 15:00:46,082 INFO [train.py:715] (0/8) Epoch 3, batch 17700, loss[loss=0.1521, simple_loss=0.2115, pruned_loss=0.04635, over 4792.00 frames.], tot_loss[loss=0.162, simple_loss=0.2302, pruned_loss=0.04685, over 971747.18 frames.], batch size: 12, lr: 5.43e-04 2022-05-04 15:01:26,897 INFO [train.py:715] (0/8) Epoch 3, batch 17750, loss[loss=0.198, simple_loss=0.2719, pruned_loss=0.06212, over 4770.00 frames.], tot_loss[loss=0.1623, simple_loss=0.2306, pruned_loss=0.047, over 971445.19 frames.], batch size: 17, lr: 5.43e-04 2022-05-04 15:02:08,927 INFO [train.py:715] (0/8) Epoch 3, batch 17800, loss[loss=0.1754, simple_loss=0.2472, pruned_loss=0.05179, over 4820.00 frames.], tot_loss[loss=0.1633, simple_loss=0.2313, pruned_loss=0.04766, over 971781.26 frames.], batch size: 24, lr: 5.42e-04 2022-05-04 15:02:50,347 INFO [train.py:715] (0/8) Epoch 3, batch 17850, loss[loss=0.1781, simple_loss=0.2479, pruned_loss=0.05412, over 4782.00 frames.], tot_loss[loss=0.1636, simple_loss=0.2315, pruned_loss=0.04784, over 971058.95 frames.], batch size: 14, lr: 5.42e-04 2022-05-04 15:03:30,305 INFO [train.py:715] (0/8) Epoch 3, batch 17900, loss[loss=0.1845, simple_loss=0.2552, pruned_loss=0.05692, over 4893.00 frames.], tot_loss[loss=0.1635, simple_loss=0.2315, pruned_loss=0.04772, over 971645.04 frames.], batch size: 39, lr: 5.42e-04 2022-05-04 15:04:12,145 INFO [train.py:715] (0/8) Epoch 3, batch 17950, loss[loss=0.1543, simple_loss=0.2201, pruned_loss=0.04426, over 4795.00 frames.], tot_loss[loss=0.1634, simple_loss=0.2316, pruned_loss=0.04755, over 971753.12 frames.], batch size: 25, lr: 5.42e-04 2022-05-04 15:04:53,401 INFO [train.py:715] (0/8) Epoch 3, batch 18000, loss[loss=0.216, simple_loss=0.2883, pruned_loss=0.07181, over 4839.00 frames.], tot_loss[loss=0.1646, simple_loss=0.2328, pruned_loss=0.04821, over 971154.49 frames.], batch size: 15, lr: 5.42e-04 2022-05-04 15:04:53,403 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 15:05:02,070 INFO [train.py:742] (0/8) Epoch 3, validation: loss=0.1143, simple_loss=0.2002, pruned_loss=0.01414, over 914524.00 frames. 
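The "Computing validation loss" / "validation:" pairs recur every 3000 training batches in this section (batches 15000, 18000, 21000, 24000), matching the valid_interval of 3000 in the startup configuration. The loop below is only a minimal sketch of that pattern; train_one_batch and compute_validation_loss are placeholders, not the actual train.py loop:

import logging

logging.basicConfig(level=logging.INFO, format="%(asctime)s INFO %(message)s")

VALID_INTERVAL = 3000  # from the run configuration logged at startup

def train_one_batch(batch_idx: int) -> None:
    """Placeholder for one optimizer step on a training batch."""

def compute_validation_loss() -> float:
    """Placeholder for a full pass over the dev dataloader."""
    return 0.1143  # e.g. the value logged at batch 18000 above

for batch_idx in range(18001):
    train_one_batch(batch_idx)
    if batch_idx > 0 and batch_idx % VALID_INTERVAL == 0:
        logging.info("Computing validation loss")
        logging.info("Epoch 3, validation: loss=%.4f", compute_validation_loss())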
2022-05-04 15:05:43,866 INFO [train.py:715] (0/8) Epoch 3, batch 18050, loss[loss=0.1542, simple_loss=0.2155, pruned_loss=0.04649, over 4829.00 frames.], tot_loss[loss=0.165, simple_loss=0.2327, pruned_loss=0.0487, over 970889.28 frames.], batch size: 30, lr: 5.42e-04 2022-05-04 15:06:25,504 INFO [train.py:715] (0/8) Epoch 3, batch 18100, loss[loss=0.1619, simple_loss=0.2337, pruned_loss=0.04506, over 4953.00 frames.], tot_loss[loss=0.1637, simple_loss=0.2317, pruned_loss=0.04789, over 971379.22 frames.], batch size: 39, lr: 5.42e-04 2022-05-04 15:07:06,174 INFO [train.py:715] (0/8) Epoch 3, batch 18150, loss[loss=0.1421, simple_loss=0.2204, pruned_loss=0.03189, over 4832.00 frames.], tot_loss[loss=0.1631, simple_loss=0.2311, pruned_loss=0.0475, over 972459.07 frames.], batch size: 26, lr: 5.42e-04 2022-05-04 15:07:47,676 INFO [train.py:715] (0/8) Epoch 3, batch 18200, loss[loss=0.1728, simple_loss=0.2479, pruned_loss=0.04889, over 4854.00 frames.], tot_loss[loss=0.1635, simple_loss=0.2317, pruned_loss=0.04762, over 972549.45 frames.], batch size: 20, lr: 5.42e-04 2022-05-04 15:08:29,469 INFO [train.py:715] (0/8) Epoch 3, batch 18250, loss[loss=0.1929, simple_loss=0.2565, pruned_loss=0.06464, over 4892.00 frames.], tot_loss[loss=0.1632, simple_loss=0.2314, pruned_loss=0.04747, over 972081.36 frames.], batch size: 19, lr: 5.41e-04 2022-05-04 15:09:10,294 INFO [train.py:715] (0/8) Epoch 3, batch 18300, loss[loss=0.1543, simple_loss=0.2312, pruned_loss=0.03868, over 4761.00 frames.], tot_loss[loss=0.1628, simple_loss=0.2313, pruned_loss=0.04712, over 972591.58 frames.], batch size: 14, lr: 5.41e-04 2022-05-04 15:09:51,597 INFO [train.py:715] (0/8) Epoch 3, batch 18350, loss[loss=0.1812, simple_loss=0.2432, pruned_loss=0.05961, over 4988.00 frames.], tot_loss[loss=0.1627, simple_loss=0.2309, pruned_loss=0.04721, over 972902.56 frames.], batch size: 24, lr: 5.41e-04 2022-05-04 15:10:33,029 INFO [train.py:715] (0/8) Epoch 3, batch 18400, loss[loss=0.1792, simple_loss=0.245, pruned_loss=0.05668, over 4944.00 frames.], tot_loss[loss=0.1627, simple_loss=0.2309, pruned_loss=0.0472, over 973017.83 frames.], batch size: 21, lr: 5.41e-04 2022-05-04 15:11:13,983 INFO [train.py:715] (0/8) Epoch 3, batch 18450, loss[loss=0.1728, simple_loss=0.2399, pruned_loss=0.05288, over 4901.00 frames.], tot_loss[loss=0.1636, simple_loss=0.2319, pruned_loss=0.04768, over 972366.34 frames.], batch size: 22, lr: 5.41e-04 2022-05-04 15:11:55,020 INFO [train.py:715] (0/8) Epoch 3, batch 18500, loss[loss=0.1582, simple_loss=0.2292, pruned_loss=0.04355, over 4827.00 frames.], tot_loss[loss=0.1623, simple_loss=0.231, pruned_loss=0.04683, over 972212.75 frames.], batch size: 30, lr: 5.41e-04 2022-05-04 15:12:36,398 INFO [train.py:715] (0/8) Epoch 3, batch 18550, loss[loss=0.1781, simple_loss=0.251, pruned_loss=0.05265, over 4682.00 frames.], tot_loss[loss=0.1633, simple_loss=0.2322, pruned_loss=0.04726, over 971586.02 frames.], batch size: 15, lr: 5.41e-04 2022-05-04 15:13:18,632 INFO [train.py:715] (0/8) Epoch 3, batch 18600, loss[loss=0.1655, simple_loss=0.2449, pruned_loss=0.04305, over 4803.00 frames.], tot_loss[loss=0.1628, simple_loss=0.2312, pruned_loss=0.04715, over 970643.17 frames.], batch size: 24, lr: 5.41e-04 2022-05-04 15:13:58,615 INFO [train.py:715] (0/8) Epoch 3, batch 18650, loss[loss=0.1322, simple_loss=0.2017, pruned_loss=0.03134, over 4808.00 frames.], tot_loss[loss=0.1618, simple_loss=0.2303, pruned_loss=0.04669, over 970786.86 frames.], batch size: 21, lr: 5.41e-04 2022-05-04 15:14:39,319 INFO 
[train.py:715] (0/8) Epoch 3, batch 18700, loss[loss=0.1637, simple_loss=0.2179, pruned_loss=0.05477, over 4763.00 frames.], tot_loss[loss=0.1627, simple_loss=0.2312, pruned_loss=0.04713, over 971158.27 frames.], batch size: 12, lr: 5.40e-04 2022-05-04 15:15:20,416 INFO [train.py:715] (0/8) Epoch 3, batch 18750, loss[loss=0.1495, simple_loss=0.2149, pruned_loss=0.04208, over 4989.00 frames.], tot_loss[loss=0.1608, simple_loss=0.2294, pruned_loss=0.04612, over 970764.07 frames.], batch size: 25, lr: 5.40e-04 2022-05-04 15:16:00,278 INFO [train.py:715] (0/8) Epoch 3, batch 18800, loss[loss=0.1714, simple_loss=0.2409, pruned_loss=0.05097, over 4929.00 frames.], tot_loss[loss=0.1614, simple_loss=0.2302, pruned_loss=0.04636, over 971963.05 frames.], batch size: 39, lr: 5.40e-04 2022-05-04 15:16:41,103 INFO [train.py:715] (0/8) Epoch 3, batch 18850, loss[loss=0.1861, simple_loss=0.2577, pruned_loss=0.05724, over 4783.00 frames.], tot_loss[loss=0.1631, simple_loss=0.2319, pruned_loss=0.04717, over 972261.37 frames.], batch size: 18, lr: 5.40e-04 2022-05-04 15:17:21,053 INFO [train.py:715] (0/8) Epoch 3, batch 18900, loss[loss=0.1727, simple_loss=0.241, pruned_loss=0.0522, over 4803.00 frames.], tot_loss[loss=0.1633, simple_loss=0.2323, pruned_loss=0.04714, over 972942.74 frames.], batch size: 21, lr: 5.40e-04 2022-05-04 15:18:01,539 INFO [train.py:715] (0/8) Epoch 3, batch 18950, loss[loss=0.1732, simple_loss=0.2372, pruned_loss=0.05457, over 4944.00 frames.], tot_loss[loss=0.1632, simple_loss=0.2322, pruned_loss=0.04716, over 973500.82 frames.], batch size: 39, lr: 5.40e-04 2022-05-04 15:18:40,941 INFO [train.py:715] (0/8) Epoch 3, batch 19000, loss[loss=0.157, simple_loss=0.2289, pruned_loss=0.04256, over 4920.00 frames.], tot_loss[loss=0.163, simple_loss=0.2321, pruned_loss=0.04698, over 972759.15 frames.], batch size: 29, lr: 5.40e-04 2022-05-04 15:19:20,768 INFO [train.py:715] (0/8) Epoch 3, batch 19050, loss[loss=0.1655, simple_loss=0.2382, pruned_loss=0.04643, over 4923.00 frames.], tot_loss[loss=0.1617, simple_loss=0.231, pruned_loss=0.04617, over 973282.09 frames.], batch size: 39, lr: 5.40e-04 2022-05-04 15:20:01,071 INFO [train.py:715] (0/8) Epoch 3, batch 19100, loss[loss=0.1334, simple_loss=0.2005, pruned_loss=0.03322, over 4729.00 frames.], tot_loss[loss=0.1609, simple_loss=0.2304, pruned_loss=0.04571, over 972398.33 frames.], batch size: 12, lr: 5.40e-04 2022-05-04 15:20:40,497 INFO [train.py:715] (0/8) Epoch 3, batch 19150, loss[loss=0.1656, simple_loss=0.2314, pruned_loss=0.04996, over 4921.00 frames.], tot_loss[loss=0.1612, simple_loss=0.2304, pruned_loss=0.04605, over 971984.33 frames.], batch size: 18, lr: 5.40e-04 2022-05-04 15:21:20,175 INFO [train.py:715] (0/8) Epoch 3, batch 19200, loss[loss=0.1516, simple_loss=0.2202, pruned_loss=0.04152, over 4766.00 frames.], tot_loss[loss=0.1618, simple_loss=0.2305, pruned_loss=0.04652, over 971646.77 frames.], batch size: 18, lr: 5.39e-04 2022-05-04 15:21:59,824 INFO [train.py:715] (0/8) Epoch 3, batch 19250, loss[loss=0.1537, simple_loss=0.2153, pruned_loss=0.04602, over 4886.00 frames.], tot_loss[loss=0.1619, simple_loss=0.2306, pruned_loss=0.04661, over 971660.64 frames.], batch size: 22, lr: 5.39e-04 2022-05-04 15:22:40,130 INFO [train.py:715] (0/8) Epoch 3, batch 19300, loss[loss=0.1267, simple_loss=0.209, pruned_loss=0.02219, over 4836.00 frames.], tot_loss[loss=0.1619, simple_loss=0.2307, pruned_loss=0.04656, over 971115.46 frames.], batch size: 13, lr: 5.39e-04 2022-05-04 15:23:19,471 INFO [train.py:715] (0/8) Epoch 3, batch 
19350, loss[loss=0.2242, simple_loss=0.2812, pruned_loss=0.08354, over 4765.00 frames.], tot_loss[loss=0.1611, simple_loss=0.2299, pruned_loss=0.04612, over 971216.35 frames.], batch size: 18, lr: 5.39e-04 2022-05-04 15:23:59,198 INFO [train.py:715] (0/8) Epoch 3, batch 19400, loss[loss=0.1216, simple_loss=0.1879, pruned_loss=0.02765, over 4988.00 frames.], tot_loss[loss=0.1622, simple_loss=0.2307, pruned_loss=0.04684, over 971680.51 frames.], batch size: 14, lr: 5.39e-04 2022-05-04 15:24:39,294 INFO [train.py:715] (0/8) Epoch 3, batch 19450, loss[loss=0.1705, simple_loss=0.2473, pruned_loss=0.04691, over 4927.00 frames.], tot_loss[loss=0.1629, simple_loss=0.2316, pruned_loss=0.04713, over 972037.77 frames.], batch size: 18, lr: 5.39e-04 2022-05-04 15:25:18,371 INFO [train.py:715] (0/8) Epoch 3, batch 19500, loss[loss=0.1977, simple_loss=0.2496, pruned_loss=0.07289, over 4781.00 frames.], tot_loss[loss=0.1632, simple_loss=0.232, pruned_loss=0.04717, over 972032.15 frames.], batch size: 14, lr: 5.39e-04 2022-05-04 15:25:58,125 INFO [train.py:715] (0/8) Epoch 3, batch 19550, loss[loss=0.1819, simple_loss=0.2348, pruned_loss=0.06445, over 4796.00 frames.], tot_loss[loss=0.1632, simple_loss=0.2319, pruned_loss=0.0473, over 971699.65 frames.], batch size: 12, lr: 5.39e-04 2022-05-04 15:26:37,669 INFO [train.py:715] (0/8) Epoch 3, batch 19600, loss[loss=0.1971, simple_loss=0.2467, pruned_loss=0.07378, over 4910.00 frames.], tot_loss[loss=0.1633, simple_loss=0.2319, pruned_loss=0.0474, over 971626.36 frames.], batch size: 17, lr: 5.39e-04 2022-05-04 15:27:17,570 INFO [train.py:715] (0/8) Epoch 3, batch 19650, loss[loss=0.1554, simple_loss=0.221, pruned_loss=0.04494, over 4653.00 frames.], tot_loss[loss=0.1623, simple_loss=0.2309, pruned_loss=0.04684, over 971139.21 frames.], batch size: 13, lr: 5.38e-04 2022-05-04 15:27:56,471 INFO [train.py:715] (0/8) Epoch 3, batch 19700, loss[loss=0.2086, simple_loss=0.2747, pruned_loss=0.07124, over 4782.00 frames.], tot_loss[loss=0.1611, simple_loss=0.2297, pruned_loss=0.04627, over 971212.66 frames.], batch size: 17, lr: 5.38e-04 2022-05-04 15:28:36,067 INFO [train.py:715] (0/8) Epoch 3, batch 19750, loss[loss=0.1885, simple_loss=0.2438, pruned_loss=0.06665, over 4949.00 frames.], tot_loss[loss=0.1612, simple_loss=0.2297, pruned_loss=0.0463, over 971664.38 frames.], batch size: 29, lr: 5.38e-04 2022-05-04 15:29:15,537 INFO [train.py:715] (0/8) Epoch 3, batch 19800, loss[loss=0.1334, simple_loss=0.2059, pruned_loss=0.0305, over 4948.00 frames.], tot_loss[loss=0.1622, simple_loss=0.2308, pruned_loss=0.04675, over 972278.84 frames.], batch size: 21, lr: 5.38e-04 2022-05-04 15:29:55,118 INFO [train.py:715] (0/8) Epoch 3, batch 19850, loss[loss=0.138, simple_loss=0.2111, pruned_loss=0.03245, over 4934.00 frames.], tot_loss[loss=0.1614, simple_loss=0.2303, pruned_loss=0.0462, over 971977.52 frames.], batch size: 23, lr: 5.38e-04 2022-05-04 15:30:34,813 INFO [train.py:715] (0/8) Epoch 3, batch 19900, loss[loss=0.1356, simple_loss=0.2083, pruned_loss=0.03142, over 4830.00 frames.], tot_loss[loss=0.1612, simple_loss=0.2301, pruned_loss=0.04617, over 972279.01 frames.], batch size: 13, lr: 5.38e-04 2022-05-04 15:31:15,110 INFO [train.py:715] (0/8) Epoch 3, batch 19950, loss[loss=0.1811, simple_loss=0.2538, pruned_loss=0.05417, over 4901.00 frames.], tot_loss[loss=0.1613, simple_loss=0.2303, pruned_loss=0.04614, over 971880.03 frames.], batch size: 39, lr: 5.38e-04 2022-05-04 15:31:54,896 INFO [train.py:715] (0/8) Epoch 3, batch 20000, loss[loss=0.1729, 
simple_loss=0.2399, pruned_loss=0.05297, over 4967.00 frames.], tot_loss[loss=0.1624, simple_loss=0.2314, pruned_loss=0.04671, over 971515.94 frames.], batch size: 15, lr: 5.38e-04 2022-05-04 15:32:34,161 INFO [train.py:715] (0/8) Epoch 3, batch 20050, loss[loss=0.1665, simple_loss=0.2369, pruned_loss=0.0481, over 4901.00 frames.], tot_loss[loss=0.1625, simple_loss=0.2314, pruned_loss=0.04679, over 973145.73 frames.], batch size: 19, lr: 5.38e-04 2022-05-04 15:33:14,399 INFO [train.py:715] (0/8) Epoch 3, batch 20100, loss[loss=0.1728, simple_loss=0.2397, pruned_loss=0.05296, over 4781.00 frames.], tot_loss[loss=0.1628, simple_loss=0.2315, pruned_loss=0.0471, over 973908.20 frames.], batch size: 18, lr: 5.37e-04 2022-05-04 15:33:54,295 INFO [train.py:715] (0/8) Epoch 3, batch 20150, loss[loss=0.1838, simple_loss=0.2538, pruned_loss=0.05689, over 4925.00 frames.], tot_loss[loss=0.1625, simple_loss=0.2309, pruned_loss=0.04707, over 973922.04 frames.], batch size: 18, lr: 5.37e-04 2022-05-04 15:34:33,628 INFO [train.py:715] (0/8) Epoch 3, batch 20200, loss[loss=0.1746, simple_loss=0.2437, pruned_loss=0.05269, over 4863.00 frames.], tot_loss[loss=0.1628, simple_loss=0.2311, pruned_loss=0.04726, over 973982.33 frames.], batch size: 20, lr: 5.37e-04 2022-05-04 15:35:13,291 INFO [train.py:715] (0/8) Epoch 3, batch 20250, loss[loss=0.1394, simple_loss=0.2105, pruned_loss=0.03416, over 4984.00 frames.], tot_loss[loss=0.1617, simple_loss=0.2305, pruned_loss=0.0465, over 974277.27 frames.], batch size: 14, lr: 5.37e-04 2022-05-04 15:35:53,127 INFO [train.py:715] (0/8) Epoch 3, batch 20300, loss[loss=0.1723, simple_loss=0.2383, pruned_loss=0.05315, over 4826.00 frames.], tot_loss[loss=0.1615, simple_loss=0.2302, pruned_loss=0.04637, over 973892.51 frames.], batch size: 15, lr: 5.37e-04 2022-05-04 15:36:33,508 INFO [train.py:715] (0/8) Epoch 3, batch 20350, loss[loss=0.136, simple_loss=0.2098, pruned_loss=0.03108, over 4778.00 frames.], tot_loss[loss=0.1625, simple_loss=0.2309, pruned_loss=0.047, over 972821.79 frames.], batch size: 18, lr: 5.37e-04 2022-05-04 15:37:12,083 INFO [train.py:715] (0/8) Epoch 3, batch 20400, loss[loss=0.1353, simple_loss=0.2137, pruned_loss=0.02849, over 4950.00 frames.], tot_loss[loss=0.163, simple_loss=0.2315, pruned_loss=0.04728, over 973905.68 frames.], batch size: 24, lr: 5.37e-04 2022-05-04 15:37:51,791 INFO [train.py:715] (0/8) Epoch 3, batch 20450, loss[loss=0.1731, simple_loss=0.2428, pruned_loss=0.05168, over 4986.00 frames.], tot_loss[loss=0.1634, simple_loss=0.2317, pruned_loss=0.04756, over 973723.22 frames.], batch size: 25, lr: 5.37e-04 2022-05-04 15:38:31,857 INFO [train.py:715] (0/8) Epoch 3, batch 20500, loss[loss=0.145, simple_loss=0.223, pruned_loss=0.03349, over 4972.00 frames.], tot_loss[loss=0.1625, simple_loss=0.2313, pruned_loss=0.04689, over 973294.48 frames.], batch size: 24, lr: 5.37e-04 2022-05-04 15:39:10,982 INFO [train.py:715] (0/8) Epoch 3, batch 20550, loss[loss=0.1167, simple_loss=0.1812, pruned_loss=0.02608, over 4820.00 frames.], tot_loss[loss=0.1627, simple_loss=0.2315, pruned_loss=0.04694, over 972587.34 frames.], batch size: 26, lr: 5.36e-04 2022-05-04 15:39:50,431 INFO [train.py:715] (0/8) Epoch 3, batch 20600, loss[loss=0.224, simple_loss=0.2822, pruned_loss=0.0829, over 4780.00 frames.], tot_loss[loss=0.1625, simple_loss=0.2313, pruned_loss=0.04681, over 972793.78 frames.], batch size: 14, lr: 5.36e-04 2022-05-04 15:40:30,878 INFO [train.py:715] (0/8) Epoch 3, batch 20650, loss[loss=0.1177, simple_loss=0.1905, pruned_loss=0.02239, 
over 4824.00 frames.], tot_loss[loss=0.1628, simple_loss=0.2321, pruned_loss=0.04677, over 972813.95 frames.], batch size: 25, lr: 5.36e-04 2022-05-04 15:41:10,732 INFO [train.py:715] (0/8) Epoch 3, batch 20700, loss[loss=0.1863, simple_loss=0.2425, pruned_loss=0.06503, over 4901.00 frames.], tot_loss[loss=0.1623, simple_loss=0.2313, pruned_loss=0.04665, over 973297.83 frames.], batch size: 19, lr: 5.36e-04 2022-05-04 15:41:50,204 INFO [train.py:715] (0/8) Epoch 3, batch 20750, loss[loss=0.1681, simple_loss=0.2317, pruned_loss=0.05225, over 4987.00 frames.], tot_loss[loss=0.1628, simple_loss=0.232, pruned_loss=0.04681, over 973049.45 frames.], batch size: 14, lr: 5.36e-04 2022-05-04 15:42:30,285 INFO [train.py:715] (0/8) Epoch 3, batch 20800, loss[loss=0.1539, simple_loss=0.2171, pruned_loss=0.04538, over 4863.00 frames.], tot_loss[loss=0.1618, simple_loss=0.2309, pruned_loss=0.04634, over 973188.76 frames.], batch size: 30, lr: 5.36e-04 2022-05-04 15:43:11,033 INFO [train.py:715] (0/8) Epoch 3, batch 20850, loss[loss=0.1576, simple_loss=0.2081, pruned_loss=0.05352, over 4902.00 frames.], tot_loss[loss=0.1624, simple_loss=0.2314, pruned_loss=0.04677, over 972513.92 frames.], batch size: 19, lr: 5.36e-04 2022-05-04 15:43:50,801 INFO [train.py:715] (0/8) Epoch 3, batch 20900, loss[loss=0.2054, simple_loss=0.2661, pruned_loss=0.07233, over 4895.00 frames.], tot_loss[loss=0.1634, simple_loss=0.232, pruned_loss=0.04742, over 972992.70 frames.], batch size: 18, lr: 5.36e-04 2022-05-04 15:44:31,208 INFO [train.py:715] (0/8) Epoch 3, batch 20950, loss[loss=0.1343, simple_loss=0.2097, pruned_loss=0.0294, over 4859.00 frames.], tot_loss[loss=0.1632, simple_loss=0.2319, pruned_loss=0.04729, over 973419.65 frames.], batch size: 20, lr: 5.36e-04 2022-05-04 15:45:11,738 INFO [train.py:715] (0/8) Epoch 3, batch 21000, loss[loss=0.1491, simple_loss=0.2162, pruned_loss=0.04103, over 4965.00 frames.], tot_loss[loss=0.1632, simple_loss=0.2316, pruned_loss=0.04745, over 973570.75 frames.], batch size: 15, lr: 5.36e-04 2022-05-04 15:45:11,739 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 15:45:24,194 INFO [train.py:742] (0/8) Epoch 3, validation: loss=0.1137, simple_loss=0.1999, pruned_loss=0.01377, over 914524.00 frames. 
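The "Saving checkpoint to .../checkpoint-120000.pt" entry earlier in this section is one of the periodic saves controlled by save_every_n=8000 in the startup configuration, with the filename carrying the global batch count; the next one is therefore due at global batch 128000 and does appear further down, just before local batch 23600. A minimal sketch of that pattern, with save_checkpoint as a hypothetical placeholder rather than the icefall checkpoint API:

SAVE_EVERY_N = 8000  # from the run configuration logged at startup
EXP_DIR = "pruned_transducer_stateless2/exp/v2"

def save_checkpoint(path: str) -> None:
    """Placeholder for serializing model/optimizer state to `path`."""
    print(f"Saving checkpoint to {path}")

# Walking the global batch counter across this stretch reproduces exactly the
# two saves seen in this section: checkpoint-120000.pt and checkpoint-128000.pt.
for batch_idx_train in range(119_000, 128_001):
    if batch_idx_train % SAVE_EVERY_N == 0:
        save_checkpoint(f"{EXP_DIR}/checkpoint-{batch_idx_train}.pt")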
2022-05-04 15:46:04,595 INFO [train.py:715] (0/8) Epoch 3, batch 21050, loss[loss=0.1738, simple_loss=0.2406, pruned_loss=0.05347, over 4920.00 frames.], tot_loss[loss=0.1621, simple_loss=0.2306, pruned_loss=0.04682, over 973411.80 frames.], batch size: 39, lr: 5.35e-04 2022-05-04 15:46:45,378 INFO [train.py:715] (0/8) Epoch 3, batch 21100, loss[loss=0.1955, simple_loss=0.2551, pruned_loss=0.06796, over 4945.00 frames.], tot_loss[loss=0.1621, simple_loss=0.2308, pruned_loss=0.04675, over 973048.94 frames.], batch size: 21, lr: 5.35e-04 2022-05-04 15:47:25,774 INFO [train.py:715] (0/8) Epoch 3, batch 21150, loss[loss=0.1853, simple_loss=0.2644, pruned_loss=0.05313, over 4771.00 frames.], tot_loss[loss=0.1615, simple_loss=0.2302, pruned_loss=0.04643, over 972993.42 frames.], batch size: 17, lr: 5.35e-04 2022-05-04 15:48:08,589 INFO [train.py:715] (0/8) Epoch 3, batch 21200, loss[loss=0.1408, simple_loss=0.2174, pruned_loss=0.03207, over 4905.00 frames.], tot_loss[loss=0.1616, simple_loss=0.2303, pruned_loss=0.04646, over 972544.35 frames.], batch size: 17, lr: 5.35e-04 2022-05-04 15:48:49,618 INFO [train.py:715] (0/8) Epoch 3, batch 21250, loss[loss=0.1928, simple_loss=0.2396, pruned_loss=0.073, over 4751.00 frames.], tot_loss[loss=0.1621, simple_loss=0.2306, pruned_loss=0.04683, over 971570.33 frames.], batch size: 16, lr: 5.35e-04 2022-05-04 15:49:28,342 INFO [train.py:715] (0/8) Epoch 3, batch 21300, loss[loss=0.1317, simple_loss=0.2028, pruned_loss=0.03034, over 4902.00 frames.], tot_loss[loss=0.1621, simple_loss=0.2306, pruned_loss=0.04683, over 971570.17 frames.], batch size: 18, lr: 5.35e-04 2022-05-04 15:50:10,542 INFO [train.py:715] (0/8) Epoch 3, batch 21350, loss[loss=0.1352, simple_loss=0.2112, pruned_loss=0.02963, over 4832.00 frames.], tot_loss[loss=0.1622, simple_loss=0.2306, pruned_loss=0.04687, over 971725.06 frames.], batch size: 26, lr: 5.35e-04 2022-05-04 15:50:51,357 INFO [train.py:715] (0/8) Epoch 3, batch 21400, loss[loss=0.159, simple_loss=0.2272, pruned_loss=0.04539, over 4889.00 frames.], tot_loss[loss=0.162, simple_loss=0.2304, pruned_loss=0.04676, over 972048.85 frames.], batch size: 19, lr: 5.35e-04 2022-05-04 15:51:30,338 INFO [train.py:715] (0/8) Epoch 3, batch 21450, loss[loss=0.1652, simple_loss=0.2246, pruned_loss=0.05288, over 4782.00 frames.], tot_loss[loss=0.1615, simple_loss=0.2301, pruned_loss=0.04642, over 971794.39 frames.], batch size: 14, lr: 5.35e-04 2022-05-04 15:52:08,618 INFO [train.py:715] (0/8) Epoch 3, batch 21500, loss[loss=0.1782, simple_loss=0.2333, pruned_loss=0.06159, over 4938.00 frames.], tot_loss[loss=0.1614, simple_loss=0.2299, pruned_loss=0.04643, over 972345.92 frames.], batch size: 35, lr: 5.34e-04 2022-05-04 15:52:47,660 INFO [train.py:715] (0/8) Epoch 3, batch 21550, loss[loss=0.1477, simple_loss=0.2141, pruned_loss=0.04062, over 4879.00 frames.], tot_loss[loss=0.1616, simple_loss=0.2302, pruned_loss=0.04645, over 972648.57 frames.], batch size: 16, lr: 5.34e-04 2022-05-04 15:53:27,196 INFO [train.py:715] (0/8) Epoch 3, batch 21600, loss[loss=0.1541, simple_loss=0.2175, pruned_loss=0.04529, over 4776.00 frames.], tot_loss[loss=0.163, simple_loss=0.2315, pruned_loss=0.04721, over 971506.91 frames.], batch size: 14, lr: 5.34e-04 2022-05-04 15:54:06,105 INFO [train.py:715] (0/8) Epoch 3, batch 21650, loss[loss=0.1232, simple_loss=0.1874, pruned_loss=0.02954, over 4801.00 frames.], tot_loss[loss=0.1637, simple_loss=0.232, pruned_loss=0.04774, over 971636.67 frames.], batch size: 14, lr: 5.34e-04 2022-05-04 15:54:46,388 INFO 
[train.py:715] (0/8) Epoch 3, batch 21700, loss[loss=0.1556, simple_loss=0.2224, pruned_loss=0.0444, over 4986.00 frames.], tot_loss[loss=0.1636, simple_loss=0.2318, pruned_loss=0.04769, over 973106.94 frames.], batch size: 14, lr: 5.34e-04 2022-05-04 15:55:26,902 INFO [train.py:715] (0/8) Epoch 3, batch 21750, loss[loss=0.1112, simple_loss=0.1793, pruned_loss=0.02153, over 4751.00 frames.], tot_loss[loss=0.1631, simple_loss=0.2314, pruned_loss=0.04739, over 973167.77 frames.], batch size: 12, lr: 5.34e-04 2022-05-04 15:56:06,023 INFO [train.py:715] (0/8) Epoch 3, batch 21800, loss[loss=0.1593, simple_loss=0.2376, pruned_loss=0.0405, over 4957.00 frames.], tot_loss[loss=0.1624, simple_loss=0.2309, pruned_loss=0.04692, over 973382.90 frames.], batch size: 24, lr: 5.34e-04 2022-05-04 15:56:44,177 INFO [train.py:715] (0/8) Epoch 3, batch 21850, loss[loss=0.1438, simple_loss=0.2177, pruned_loss=0.03498, over 4892.00 frames.], tot_loss[loss=0.1619, simple_loss=0.2306, pruned_loss=0.04662, over 973394.20 frames.], batch size: 19, lr: 5.34e-04 2022-05-04 15:57:22,929 INFO [train.py:715] (0/8) Epoch 3, batch 21900, loss[loss=0.1302, simple_loss=0.2026, pruned_loss=0.02886, over 4823.00 frames.], tot_loss[loss=0.1618, simple_loss=0.2305, pruned_loss=0.04654, over 973153.22 frames.], batch size: 12, lr: 5.34e-04 2022-05-04 15:58:03,616 INFO [train.py:715] (0/8) Epoch 3, batch 21950, loss[loss=0.1476, simple_loss=0.2211, pruned_loss=0.03703, over 4867.00 frames.], tot_loss[loss=0.1616, simple_loss=0.2306, pruned_loss=0.04634, over 972710.94 frames.], batch size: 16, lr: 5.34e-04 2022-05-04 15:58:43,253 INFO [train.py:715] (0/8) Epoch 3, batch 22000, loss[loss=0.1824, simple_loss=0.2467, pruned_loss=0.05901, over 4775.00 frames.], tot_loss[loss=0.1618, simple_loss=0.231, pruned_loss=0.04633, over 972496.76 frames.], batch size: 14, lr: 5.33e-04 2022-05-04 15:59:23,567 INFO [train.py:715] (0/8) Epoch 3, batch 22050, loss[loss=0.1296, simple_loss=0.2001, pruned_loss=0.02953, over 4851.00 frames.], tot_loss[loss=0.161, simple_loss=0.2299, pruned_loss=0.04606, over 972412.64 frames.], batch size: 13, lr: 5.33e-04 2022-05-04 16:00:04,302 INFO [train.py:715] (0/8) Epoch 3, batch 22100, loss[loss=0.1649, simple_loss=0.2399, pruned_loss=0.04491, over 4972.00 frames.], tot_loss[loss=0.1608, simple_loss=0.2299, pruned_loss=0.04586, over 972165.16 frames.], batch size: 15, lr: 5.33e-04 2022-05-04 16:00:44,821 INFO [train.py:715] (0/8) Epoch 3, batch 22150, loss[loss=0.177, simple_loss=0.2452, pruned_loss=0.05441, over 4986.00 frames.], tot_loss[loss=0.161, simple_loss=0.23, pruned_loss=0.04599, over 971862.82 frames.], batch size: 25, lr: 5.33e-04 2022-05-04 16:01:24,040 INFO [train.py:715] (0/8) Epoch 3, batch 22200, loss[loss=0.1469, simple_loss=0.2253, pruned_loss=0.03425, over 4783.00 frames.], tot_loss[loss=0.1608, simple_loss=0.23, pruned_loss=0.04578, over 970736.66 frames.], batch size: 18, lr: 5.33e-04 2022-05-04 16:02:04,288 INFO [train.py:715] (0/8) Epoch 3, batch 22250, loss[loss=0.1559, simple_loss=0.2278, pruned_loss=0.04201, over 4928.00 frames.], tot_loss[loss=0.1609, simple_loss=0.2297, pruned_loss=0.04608, over 970491.85 frames.], batch size: 18, lr: 5.33e-04 2022-05-04 16:02:45,548 INFO [train.py:715] (0/8) Epoch 3, batch 22300, loss[loss=0.1563, simple_loss=0.23, pruned_loss=0.04135, over 4894.00 frames.], tot_loss[loss=0.1604, simple_loss=0.2297, pruned_loss=0.04556, over 971093.22 frames.], batch size: 17, lr: 5.33e-04 2022-05-04 16:03:24,532 INFO [train.py:715] (0/8) Epoch 3, batch 22350, 
loss[loss=0.1524, simple_loss=0.2199, pruned_loss=0.04245, over 4945.00 frames.], tot_loss[loss=0.161, simple_loss=0.2301, pruned_loss=0.04589, over 971902.70 frames.], batch size: 23, lr: 5.33e-04 2022-05-04 16:04:04,609 INFO [train.py:715] (0/8) Epoch 3, batch 22400, loss[loss=0.1597, simple_loss=0.2368, pruned_loss=0.04129, over 4957.00 frames.], tot_loss[loss=0.1614, simple_loss=0.2302, pruned_loss=0.04629, over 971795.91 frames.], batch size: 21, lr: 5.33e-04 2022-05-04 16:04:45,515 INFO [train.py:715] (0/8) Epoch 3, batch 22450, loss[loss=0.1577, simple_loss=0.2104, pruned_loss=0.05251, over 4978.00 frames.], tot_loss[loss=0.1613, simple_loss=0.2302, pruned_loss=0.04624, over 973314.83 frames.], batch size: 35, lr: 5.32e-04 2022-05-04 16:05:25,966 INFO [train.py:715] (0/8) Epoch 3, batch 22500, loss[loss=0.2028, simple_loss=0.27, pruned_loss=0.0678, over 4941.00 frames.], tot_loss[loss=0.1616, simple_loss=0.2303, pruned_loss=0.04642, over 973151.84 frames.], batch size: 21, lr: 5.32e-04 2022-05-04 16:06:05,379 INFO [train.py:715] (0/8) Epoch 3, batch 22550, loss[loss=0.216, simple_loss=0.2685, pruned_loss=0.08171, over 4994.00 frames.], tot_loss[loss=0.1617, simple_loss=0.2303, pruned_loss=0.04655, over 973773.92 frames.], batch size: 16, lr: 5.32e-04 2022-05-04 16:06:45,621 INFO [train.py:715] (0/8) Epoch 3, batch 22600, loss[loss=0.17, simple_loss=0.2405, pruned_loss=0.04979, over 4818.00 frames.], tot_loss[loss=0.1609, simple_loss=0.2302, pruned_loss=0.04583, over 972942.17 frames.], batch size: 27, lr: 5.32e-04 2022-05-04 16:07:26,483 INFO [train.py:715] (0/8) Epoch 3, batch 22650, loss[loss=0.1222, simple_loss=0.1928, pruned_loss=0.02579, over 4759.00 frames.], tot_loss[loss=0.1605, simple_loss=0.23, pruned_loss=0.04552, over 972358.62 frames.], batch size: 16, lr: 5.32e-04 2022-05-04 16:08:06,296 INFO [train.py:715] (0/8) Epoch 3, batch 22700, loss[loss=0.1768, simple_loss=0.242, pruned_loss=0.05578, over 4798.00 frames.], tot_loss[loss=0.1624, simple_loss=0.2312, pruned_loss=0.04686, over 972691.98 frames.], batch size: 13, lr: 5.32e-04 2022-05-04 16:08:46,700 INFO [train.py:715] (0/8) Epoch 3, batch 22750, loss[loss=0.1635, simple_loss=0.2263, pruned_loss=0.05034, over 4819.00 frames.], tot_loss[loss=0.1632, simple_loss=0.2317, pruned_loss=0.04737, over 972746.36 frames.], batch size: 13, lr: 5.32e-04 2022-05-04 16:09:27,102 INFO [train.py:715] (0/8) Epoch 3, batch 22800, loss[loss=0.1592, simple_loss=0.2323, pruned_loss=0.04307, over 4925.00 frames.], tot_loss[loss=0.1635, simple_loss=0.2319, pruned_loss=0.04748, over 971935.18 frames.], batch size: 23, lr: 5.32e-04 2022-05-04 16:10:07,175 INFO [train.py:715] (0/8) Epoch 3, batch 22850, loss[loss=0.1831, simple_loss=0.2426, pruned_loss=0.06176, over 4840.00 frames.], tot_loss[loss=0.164, simple_loss=0.2325, pruned_loss=0.04777, over 971313.86 frames.], batch size: 15, lr: 5.32e-04 2022-05-04 16:10:46,893 INFO [train.py:715] (0/8) Epoch 3, batch 22900, loss[loss=0.173, simple_loss=0.2494, pruned_loss=0.04832, over 4912.00 frames.], tot_loss[loss=0.1639, simple_loss=0.2325, pruned_loss=0.04771, over 971868.68 frames.], batch size: 18, lr: 5.32e-04 2022-05-04 16:11:27,322 INFO [train.py:715] (0/8) Epoch 3, batch 22950, loss[loss=0.1704, simple_loss=0.2294, pruned_loss=0.05576, over 4836.00 frames.], tot_loss[loss=0.1649, simple_loss=0.2328, pruned_loss=0.0485, over 971870.58 frames.], batch size: 30, lr: 5.31e-04 2022-05-04 16:12:08,435 INFO [train.py:715] (0/8) Epoch 3, batch 23000, loss[loss=0.1606, simple_loss=0.2253, 
pruned_loss=0.04793, over 4774.00 frames.], tot_loss[loss=0.165, simple_loss=0.2327, pruned_loss=0.04867, over 971909.84 frames.], batch size: 18, lr: 5.31e-04 2022-05-04 16:12:48,293 INFO [train.py:715] (0/8) Epoch 3, batch 23050, loss[loss=0.1982, simple_loss=0.2629, pruned_loss=0.06678, over 4778.00 frames.], tot_loss[loss=0.1648, simple_loss=0.2328, pruned_loss=0.04841, over 972928.05 frames.], batch size: 18, lr: 5.31e-04 2022-05-04 16:13:28,628 INFO [train.py:715] (0/8) Epoch 3, batch 23100, loss[loss=0.1731, simple_loss=0.2441, pruned_loss=0.05108, over 4865.00 frames.], tot_loss[loss=0.1646, simple_loss=0.2327, pruned_loss=0.04825, over 973290.39 frames.], batch size: 20, lr: 5.31e-04 2022-05-04 16:14:09,393 INFO [train.py:715] (0/8) Epoch 3, batch 23150, loss[loss=0.1528, simple_loss=0.2183, pruned_loss=0.04372, over 4891.00 frames.], tot_loss[loss=0.1627, simple_loss=0.2313, pruned_loss=0.04707, over 973422.70 frames.], batch size: 19, lr: 5.31e-04 2022-05-04 16:14:49,962 INFO [train.py:715] (0/8) Epoch 3, batch 23200, loss[loss=0.1682, simple_loss=0.2382, pruned_loss=0.04909, over 4984.00 frames.], tot_loss[loss=0.1625, simple_loss=0.2314, pruned_loss=0.04675, over 973085.49 frames.], batch size: 35, lr: 5.31e-04 2022-05-04 16:15:29,486 INFO [train.py:715] (0/8) Epoch 3, batch 23250, loss[loss=0.1628, simple_loss=0.2277, pruned_loss=0.04889, over 4763.00 frames.], tot_loss[loss=0.1624, simple_loss=0.2314, pruned_loss=0.0467, over 973165.56 frames.], batch size: 12, lr: 5.31e-04 2022-05-04 16:16:10,260 INFO [train.py:715] (0/8) Epoch 3, batch 23300, loss[loss=0.1563, simple_loss=0.2221, pruned_loss=0.04524, over 4855.00 frames.], tot_loss[loss=0.1628, simple_loss=0.2318, pruned_loss=0.04691, over 972432.13 frames.], batch size: 12, lr: 5.31e-04 2022-05-04 16:16:49,868 INFO [train.py:715] (0/8) Epoch 3, batch 23350, loss[loss=0.1707, simple_loss=0.2433, pruned_loss=0.04903, over 4755.00 frames.], tot_loss[loss=0.1625, simple_loss=0.2314, pruned_loss=0.04685, over 972080.21 frames.], batch size: 19, lr: 5.31e-04 2022-05-04 16:17:27,674 INFO [train.py:715] (0/8) Epoch 3, batch 23400, loss[loss=0.1696, simple_loss=0.2334, pruned_loss=0.0529, over 4866.00 frames.], tot_loss[loss=0.1625, simple_loss=0.2313, pruned_loss=0.04684, over 973145.59 frames.], batch size: 30, lr: 5.30e-04 2022-05-04 16:18:06,220 INFO [train.py:715] (0/8) Epoch 3, batch 23450, loss[loss=0.1881, simple_loss=0.2539, pruned_loss=0.06117, over 4955.00 frames.], tot_loss[loss=0.1634, simple_loss=0.232, pruned_loss=0.04742, over 972441.48 frames.], batch size: 35, lr: 5.30e-04 2022-05-04 16:18:44,908 INFO [train.py:715] (0/8) Epoch 3, batch 23500, loss[loss=0.1662, simple_loss=0.2337, pruned_loss=0.04932, over 4826.00 frames.], tot_loss[loss=0.1631, simple_loss=0.2319, pruned_loss=0.04718, over 972096.86 frames.], batch size: 13, lr: 5.30e-04 2022-05-04 16:19:24,105 INFO [train.py:715] (0/8) Epoch 3, batch 23550, loss[loss=0.1629, simple_loss=0.2427, pruned_loss=0.04159, over 4987.00 frames.], tot_loss[loss=0.163, simple_loss=0.2318, pruned_loss=0.04706, over 972667.54 frames.], batch size: 28, lr: 5.30e-04 2022-05-04 16:19:39,842 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-128000.pt 2022-05-04 16:20:05,338 INFO [train.py:715] (0/8) Epoch 3, batch 23600, loss[loss=0.1451, simple_loss=0.1987, pruned_loss=0.04576, over 4779.00 frames.], tot_loss[loss=0.163, simple_loss=0.2319, pruned_loss=0.04705, over 972150.43 frames.], batch size: 13, lr: 5.30e-04 2022-05-04 
16:20:44,861 INFO [train.py:715] (0/8) Epoch 3, batch 23650, loss[loss=0.1323, simple_loss=0.2089, pruned_loss=0.02785, over 4847.00 frames.], tot_loss[loss=0.1613, simple_loss=0.2302, pruned_loss=0.04621, over 972077.37 frames.], batch size: 20, lr: 5.30e-04 2022-05-04 16:21:24,817 INFO [train.py:715] (0/8) Epoch 3, batch 23700, loss[loss=0.1429, simple_loss=0.213, pruned_loss=0.03647, over 4791.00 frames.], tot_loss[loss=0.1618, simple_loss=0.2307, pruned_loss=0.04648, over 971451.06 frames.], batch size: 24, lr: 5.30e-04 2022-05-04 16:22:03,569 INFO [train.py:715] (0/8) Epoch 3, batch 23750, loss[loss=0.1414, simple_loss=0.2196, pruned_loss=0.03163, over 4730.00 frames.], tot_loss[loss=0.1625, simple_loss=0.2314, pruned_loss=0.04679, over 971534.43 frames.], batch size: 16, lr: 5.30e-04 2022-05-04 16:22:43,181 INFO [train.py:715] (0/8) Epoch 3, batch 23800, loss[loss=0.172, simple_loss=0.2407, pruned_loss=0.05165, over 4931.00 frames.], tot_loss[loss=0.1632, simple_loss=0.2317, pruned_loss=0.04734, over 971274.90 frames.], batch size: 39, lr: 5.30e-04 2022-05-04 16:23:22,778 INFO [train.py:715] (0/8) Epoch 3, batch 23850, loss[loss=0.1607, simple_loss=0.2303, pruned_loss=0.04557, over 4988.00 frames.], tot_loss[loss=0.1623, simple_loss=0.2312, pruned_loss=0.04666, over 971771.72 frames.], batch size: 16, lr: 5.30e-04 2022-05-04 16:24:02,496 INFO [train.py:715] (0/8) Epoch 3, batch 23900, loss[loss=0.1759, simple_loss=0.23, pruned_loss=0.06089, over 4699.00 frames.], tot_loss[loss=0.1627, simple_loss=0.2317, pruned_loss=0.04686, over 971046.98 frames.], batch size: 15, lr: 5.29e-04 2022-05-04 16:24:41,549 INFO [train.py:715] (0/8) Epoch 3, batch 23950, loss[loss=0.1575, simple_loss=0.2213, pruned_loss=0.04685, over 4958.00 frames.], tot_loss[loss=0.1609, simple_loss=0.23, pruned_loss=0.04593, over 970891.01 frames.], batch size: 21, lr: 5.29e-04 2022-05-04 16:25:20,395 INFO [train.py:715] (0/8) Epoch 3, batch 24000, loss[loss=0.1832, simple_loss=0.2412, pruned_loss=0.0626, over 4844.00 frames.], tot_loss[loss=0.1615, simple_loss=0.2306, pruned_loss=0.04614, over 970886.84 frames.], batch size: 30, lr: 5.29e-04 2022-05-04 16:25:20,396 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 16:25:32,862 INFO [train.py:742] (0/8) Epoch 3, validation: loss=0.1132, simple_loss=0.1992, pruned_loss=0.0136, over 914524.00 frames. 
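The learning rate drifts down only slowly across this section, from 5.56e-04 at batch 12050 to about 5.29e-04 by batch 24000. Those values are consistent with an Eden-style schedule driven by the configured initial_lr=0.003, lr_batches=5000 and lr_epochs=4, evaluated at the global batch count and the epoch index: at global batch 128000 (the checkpoint saved a few entries above) and epoch 3 it gives roughly 5.30e-04, which is what the surrounding entries report. The snippet below is a reader's check under that assumption; eden_lr is defined here and is not the recipe's scheduler code:

# Reader's check (assumes an Eden-style schedule; not the recipe's scheduler code):
#     lr = initial_lr
#          * ((batch^2 + lr_batches^2) / lr_batches^2) ** -0.25
#          * ((epoch^2 + lr_epochs^2)  / lr_epochs^2)  ** -0.25
INITIAL_LR, LR_BATCHES, LR_EPOCHS = 0.003, 5000, 4

def eden_lr(batch: int, epoch: int) -> float:
    batch_factor = ((batch ** 2 + LR_BATCHES ** 2) / LR_BATCHES ** 2) ** -0.25
    epoch_factor = ((epoch ** 2 + LR_EPOCHS ** 2) / LR_EPOCHS ** 2) ** -0.25
    return INITIAL_LR * batch_factor * epoch_factor

# Global batch 128000 (checkpoint-128000.pt above), epoch 3 -> ~5.30e-04 as logged.
assert abs(eden_lr(128_000, 3) - 5.30e-4) < 1e-6
# Global batch 120000 (checkpoint-120000.pt earlier), epoch 3 -> ~5.47e-04 as logged.
assert abs(eden_lr(120_000, 3) - 5.47e-4) < 1e-6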
2022-05-04 16:26:12,208 INFO [train.py:715] (0/8) Epoch 3, batch 24050, loss[loss=0.1762, simple_loss=0.2343, pruned_loss=0.05901, over 4769.00 frames.], tot_loss[loss=0.1621, simple_loss=0.2309, pruned_loss=0.04668, over 971293.37 frames.], batch size: 19, lr: 5.29e-04 2022-05-04 16:26:52,061 INFO [train.py:715] (0/8) Epoch 3, batch 24100, loss[loss=0.1237, simple_loss=0.1937, pruned_loss=0.02691, over 4758.00 frames.], tot_loss[loss=0.1621, simple_loss=0.2307, pruned_loss=0.04674, over 971317.04 frames.], batch size: 12, lr: 5.29e-04 2022-05-04 16:27:30,861 INFO [train.py:715] (0/8) Epoch 3, batch 24150, loss[loss=0.1699, simple_loss=0.2379, pruned_loss=0.05092, over 4923.00 frames.], tot_loss[loss=0.1629, simple_loss=0.2316, pruned_loss=0.04711, over 972622.00 frames.], batch size: 18, lr: 5.29e-04 2022-05-04 16:28:10,107 INFO [train.py:715] (0/8) Epoch 3, batch 24200, loss[loss=0.1473, simple_loss=0.216, pruned_loss=0.03935, over 4807.00 frames.], tot_loss[loss=0.1622, simple_loss=0.2308, pruned_loss=0.0468, over 972337.22 frames.], batch size: 13, lr: 5.29e-04 2022-05-04 16:28:50,505 INFO [train.py:715] (0/8) Epoch 3, batch 24250, loss[loss=0.1746, simple_loss=0.2254, pruned_loss=0.06192, over 4748.00 frames.], tot_loss[loss=0.1615, simple_loss=0.2299, pruned_loss=0.04655, over 972524.57 frames.], batch size: 16, lr: 5.29e-04 2022-05-04 16:29:30,770 INFO [train.py:715] (0/8) Epoch 3, batch 24300, loss[loss=0.155, simple_loss=0.2282, pruned_loss=0.04087, over 4785.00 frames.], tot_loss[loss=0.1617, simple_loss=0.2296, pruned_loss=0.04688, over 972406.31 frames.], batch size: 14, lr: 5.29e-04 2022-05-04 16:30:10,085 INFO [train.py:715] (0/8) Epoch 3, batch 24350, loss[loss=0.1419, simple_loss=0.2116, pruned_loss=0.0361, over 4766.00 frames.], tot_loss[loss=0.1628, simple_loss=0.2308, pruned_loss=0.04738, over 972597.49 frames.], batch size: 17, lr: 5.29e-04 2022-05-04 16:30:49,734 INFO [train.py:715] (0/8) Epoch 3, batch 24400, loss[loss=0.1508, simple_loss=0.2116, pruned_loss=0.04497, over 4923.00 frames.], tot_loss[loss=0.1622, simple_loss=0.2302, pruned_loss=0.04711, over 973264.13 frames.], batch size: 18, lr: 5.28e-04 2022-05-04 16:31:29,802 INFO [train.py:715] (0/8) Epoch 3, batch 24450, loss[loss=0.1537, simple_loss=0.2288, pruned_loss=0.03928, over 4902.00 frames.], tot_loss[loss=0.161, simple_loss=0.2293, pruned_loss=0.04636, over 973032.44 frames.], batch size: 19, lr: 5.28e-04 2022-05-04 16:32:09,117 INFO [train.py:715] (0/8) Epoch 3, batch 24500, loss[loss=0.1782, simple_loss=0.241, pruned_loss=0.05777, over 4783.00 frames.], tot_loss[loss=0.162, simple_loss=0.2298, pruned_loss=0.04709, over 972840.93 frames.], batch size: 14, lr: 5.28e-04 2022-05-04 16:32:48,512 INFO [train.py:715] (0/8) Epoch 3, batch 24550, loss[loss=0.1475, simple_loss=0.2153, pruned_loss=0.03983, over 4860.00 frames.], tot_loss[loss=0.1627, simple_loss=0.2305, pruned_loss=0.04742, over 972583.97 frames.], batch size: 20, lr: 5.28e-04 2022-05-04 16:33:28,753 INFO [train.py:715] (0/8) Epoch 3, batch 24600, loss[loss=0.1706, simple_loss=0.2423, pruned_loss=0.04949, over 4989.00 frames.], tot_loss[loss=0.1619, simple_loss=0.2303, pruned_loss=0.04673, over 972115.01 frames.], batch size: 25, lr: 5.28e-04 2022-05-04 16:34:08,288 INFO [train.py:715] (0/8) Epoch 3, batch 24650, loss[loss=0.1593, simple_loss=0.227, pruned_loss=0.04584, over 4816.00 frames.], tot_loss[loss=0.1619, simple_loss=0.2306, pruned_loss=0.04655, over 971530.35 frames.], batch size: 13, lr: 5.28e-04 2022-05-04 16:34:47,791 INFO 
[train.py:715] (0/8) Epoch 3, batch 24700, loss[loss=0.1334, simple_loss=0.2059, pruned_loss=0.03051, over 4836.00 frames.], tot_loss[loss=0.1625, simple_loss=0.2311, pruned_loss=0.04691, over 971283.49 frames.], batch size: 30, lr: 5.28e-04 2022-05-04 16:35:26,410 INFO [train.py:715] (0/8) Epoch 3, batch 24750, loss[loss=0.1583, simple_loss=0.2291, pruned_loss=0.0437, over 4757.00 frames.], tot_loss[loss=0.1633, simple_loss=0.2318, pruned_loss=0.04739, over 971560.05 frames.], batch size: 19, lr: 5.28e-04 2022-05-04 16:36:07,061 INFO [train.py:715] (0/8) Epoch 3, batch 24800, loss[loss=0.1412, simple_loss=0.2078, pruned_loss=0.03733, over 4830.00 frames.], tot_loss[loss=0.1624, simple_loss=0.231, pruned_loss=0.04686, over 972232.36 frames.], batch size: 30, lr: 5.28e-04 2022-05-04 16:36:46,780 INFO [train.py:715] (0/8) Epoch 3, batch 24850, loss[loss=0.1852, simple_loss=0.2522, pruned_loss=0.05912, over 4834.00 frames.], tot_loss[loss=0.1629, simple_loss=0.2312, pruned_loss=0.0473, over 972088.49 frames.], batch size: 15, lr: 5.28e-04 2022-05-04 16:37:25,561 INFO [train.py:715] (0/8) Epoch 3, batch 24900, loss[loss=0.1749, simple_loss=0.2396, pruned_loss=0.0551, over 4815.00 frames.], tot_loss[loss=0.162, simple_loss=0.2303, pruned_loss=0.04687, over 972628.99 frames.], batch size: 26, lr: 5.27e-04 2022-05-04 16:38:05,474 INFO [train.py:715] (0/8) Epoch 3, batch 24950, loss[loss=0.1141, simple_loss=0.1894, pruned_loss=0.01939, over 4868.00 frames.], tot_loss[loss=0.1621, simple_loss=0.2303, pruned_loss=0.04694, over 971573.34 frames.], batch size: 12, lr: 5.27e-04 2022-05-04 16:38:45,653 INFO [train.py:715] (0/8) Epoch 3, batch 25000, loss[loss=0.1611, simple_loss=0.2243, pruned_loss=0.049, over 4866.00 frames.], tot_loss[loss=0.1622, simple_loss=0.2306, pruned_loss=0.04693, over 971816.55 frames.], batch size: 30, lr: 5.27e-04 2022-05-04 16:39:25,200 INFO [train.py:715] (0/8) Epoch 3, batch 25050, loss[loss=0.2069, simple_loss=0.2812, pruned_loss=0.06631, over 4773.00 frames.], tot_loss[loss=0.1615, simple_loss=0.2301, pruned_loss=0.04642, over 972103.70 frames.], batch size: 19, lr: 5.27e-04 2022-05-04 16:40:04,367 INFO [train.py:715] (0/8) Epoch 3, batch 25100, loss[loss=0.1692, simple_loss=0.2301, pruned_loss=0.05409, over 4926.00 frames.], tot_loss[loss=0.1616, simple_loss=0.23, pruned_loss=0.04653, over 972064.02 frames.], batch size: 18, lr: 5.27e-04 2022-05-04 16:40:44,396 INFO [train.py:715] (0/8) Epoch 3, batch 25150, loss[loss=0.1666, simple_loss=0.2355, pruned_loss=0.04884, over 4953.00 frames.], tot_loss[loss=0.1611, simple_loss=0.2302, pruned_loss=0.04603, over 971740.57 frames.], batch size: 24, lr: 5.27e-04 2022-05-04 16:41:23,892 INFO [train.py:715] (0/8) Epoch 3, batch 25200, loss[loss=0.1743, simple_loss=0.2435, pruned_loss=0.05258, over 4791.00 frames.], tot_loss[loss=0.1597, simple_loss=0.2285, pruned_loss=0.0454, over 971123.29 frames.], batch size: 14, lr: 5.27e-04 2022-05-04 16:42:03,020 INFO [train.py:715] (0/8) Epoch 3, batch 25250, loss[loss=0.1425, simple_loss=0.2138, pruned_loss=0.03567, over 4932.00 frames.], tot_loss[loss=0.1589, simple_loss=0.2279, pruned_loss=0.04494, over 971133.60 frames.], batch size: 23, lr: 5.27e-04 2022-05-04 16:42:43,126 INFO [train.py:715] (0/8) Epoch 3, batch 25300, loss[loss=0.1538, simple_loss=0.208, pruned_loss=0.04975, over 4690.00 frames.], tot_loss[loss=0.1595, simple_loss=0.2284, pruned_loss=0.04526, over 970863.89 frames.], batch size: 15, lr: 5.27e-04 2022-05-04 16:43:22,954 INFO [train.py:715] (0/8) Epoch 3, batch 25350, 
loss[loss=0.1901, simple_loss=0.2582, pruned_loss=0.061, over 4962.00 frames.], tot_loss[loss=0.1599, simple_loss=0.2287, pruned_loss=0.04555, over 971520.59 frames.], batch size: 31, lr: 5.26e-04 2022-05-04 16:44:02,968 INFO [train.py:715] (0/8) Epoch 3, batch 25400, loss[loss=0.1469, simple_loss=0.2151, pruned_loss=0.03937, over 4956.00 frames.], tot_loss[loss=0.1596, simple_loss=0.2285, pruned_loss=0.04541, over 971538.61 frames.], batch size: 35, lr: 5.26e-04 2022-05-04 16:44:42,162 INFO [train.py:715] (0/8) Epoch 3, batch 25450, loss[loss=0.1558, simple_loss=0.2309, pruned_loss=0.04039, over 4808.00 frames.], tot_loss[loss=0.1597, simple_loss=0.2284, pruned_loss=0.0455, over 972044.50 frames.], batch size: 21, lr: 5.26e-04 2022-05-04 16:45:22,336 INFO [train.py:715] (0/8) Epoch 3, batch 25500, loss[loss=0.1628, simple_loss=0.2253, pruned_loss=0.05014, over 4863.00 frames.], tot_loss[loss=0.16, simple_loss=0.229, pruned_loss=0.04549, over 971854.29 frames.], batch size: 20, lr: 5.26e-04 2022-05-04 16:46:02,166 INFO [train.py:715] (0/8) Epoch 3, batch 25550, loss[loss=0.1293, simple_loss=0.2145, pruned_loss=0.02206, over 4984.00 frames.], tot_loss[loss=0.1601, simple_loss=0.2295, pruned_loss=0.04535, over 972073.55 frames.], batch size: 28, lr: 5.26e-04 2022-05-04 16:46:41,623 INFO [train.py:715] (0/8) Epoch 3, batch 25600, loss[loss=0.1439, simple_loss=0.2228, pruned_loss=0.03251, over 4831.00 frames.], tot_loss[loss=0.1584, simple_loss=0.2281, pruned_loss=0.04436, over 971943.00 frames.], batch size: 13, lr: 5.26e-04 2022-05-04 16:47:22,005 INFO [train.py:715] (0/8) Epoch 3, batch 25650, loss[loss=0.123, simple_loss=0.1904, pruned_loss=0.02779, over 4826.00 frames.], tot_loss[loss=0.1594, simple_loss=0.2288, pruned_loss=0.045, over 972167.00 frames.], batch size: 13, lr: 5.26e-04 2022-05-04 16:48:02,206 INFO [train.py:715] (0/8) Epoch 3, batch 25700, loss[loss=0.1675, simple_loss=0.2336, pruned_loss=0.05075, over 4784.00 frames.], tot_loss[loss=0.1598, simple_loss=0.2288, pruned_loss=0.04538, over 971651.33 frames.], batch size: 17, lr: 5.26e-04 2022-05-04 16:48:41,539 INFO [train.py:715] (0/8) Epoch 3, batch 25750, loss[loss=0.1755, simple_loss=0.2559, pruned_loss=0.04753, over 4969.00 frames.], tot_loss[loss=0.1604, simple_loss=0.2292, pruned_loss=0.0458, over 972013.68 frames.], batch size: 24, lr: 5.26e-04 2022-05-04 16:49:21,104 INFO [train.py:715] (0/8) Epoch 3, batch 25800, loss[loss=0.1398, simple_loss=0.2013, pruned_loss=0.0392, over 4792.00 frames.], tot_loss[loss=0.1617, simple_loss=0.2305, pruned_loss=0.04642, over 972226.26 frames.], batch size: 21, lr: 5.26e-04 2022-05-04 16:50:01,082 INFO [train.py:715] (0/8) Epoch 3, batch 25850, loss[loss=0.1498, simple_loss=0.219, pruned_loss=0.04028, over 4990.00 frames.], tot_loss[loss=0.1621, simple_loss=0.2308, pruned_loss=0.04668, over 972026.57 frames.], batch size: 14, lr: 5.25e-04 2022-05-04 16:50:39,397 INFO [train.py:715] (0/8) Epoch 3, batch 25900, loss[loss=0.2043, simple_loss=0.269, pruned_loss=0.06976, over 4938.00 frames.], tot_loss[loss=0.1619, simple_loss=0.2305, pruned_loss=0.04669, over 972531.78 frames.], batch size: 39, lr: 5.25e-04 2022-05-04 16:51:18,324 INFO [train.py:715] (0/8) Epoch 3, batch 25950, loss[loss=0.1429, simple_loss=0.2243, pruned_loss=0.03069, over 4820.00 frames.], tot_loss[loss=0.161, simple_loss=0.2298, pruned_loss=0.04609, over 972473.88 frames.], batch size: 26, lr: 5.25e-04 2022-05-04 16:51:58,434 INFO [train.py:715] (0/8) Epoch 3, batch 26000, loss[loss=0.1436, simple_loss=0.2289, 
pruned_loss=0.02909, over 4749.00 frames.], tot_loss[loss=0.1615, simple_loss=0.2302, pruned_loss=0.04637, over 972143.01 frames.], batch size: 19, lr: 5.25e-04 2022-05-04 16:52:37,678 INFO [train.py:715] (0/8) Epoch 3, batch 26050, loss[loss=0.2168, simple_loss=0.2652, pruned_loss=0.08418, over 4774.00 frames.], tot_loss[loss=0.1615, simple_loss=0.2303, pruned_loss=0.04631, over 972238.65 frames.], batch size: 14, lr: 5.25e-04 2022-05-04 16:53:16,013 INFO [train.py:715] (0/8) Epoch 3, batch 26100, loss[loss=0.1717, simple_loss=0.2294, pruned_loss=0.05699, over 4861.00 frames.], tot_loss[loss=0.1613, simple_loss=0.2299, pruned_loss=0.0463, over 972565.01 frames.], batch size: 32, lr: 5.25e-04 2022-05-04 16:53:55,503 INFO [train.py:715] (0/8) Epoch 3, batch 26150, loss[loss=0.187, simple_loss=0.2553, pruned_loss=0.05932, over 4744.00 frames.], tot_loss[loss=0.1604, simple_loss=0.229, pruned_loss=0.04588, over 972733.48 frames.], batch size: 19, lr: 5.25e-04 2022-05-04 16:54:35,543 INFO [train.py:715] (0/8) Epoch 3, batch 26200, loss[loss=0.1977, simple_loss=0.2708, pruned_loss=0.06233, over 4817.00 frames.], tot_loss[loss=0.1606, simple_loss=0.2292, pruned_loss=0.046, over 972780.25 frames.], batch size: 27, lr: 5.25e-04 2022-05-04 16:55:13,648 INFO [train.py:715] (0/8) Epoch 3, batch 26250, loss[loss=0.1784, simple_loss=0.2486, pruned_loss=0.05409, over 4892.00 frames.], tot_loss[loss=0.1616, simple_loss=0.2302, pruned_loss=0.04648, over 972862.89 frames.], batch size: 19, lr: 5.25e-04 2022-05-04 16:55:52,856 INFO [train.py:715] (0/8) Epoch 3, batch 26300, loss[loss=0.1739, simple_loss=0.2436, pruned_loss=0.05216, over 4799.00 frames.], tot_loss[loss=0.1617, simple_loss=0.2303, pruned_loss=0.04656, over 973327.43 frames.], batch size: 24, lr: 5.25e-04 2022-05-04 16:56:32,819 INFO [train.py:715] (0/8) Epoch 3, batch 26350, loss[loss=0.1699, simple_loss=0.2424, pruned_loss=0.04866, over 4924.00 frames.], tot_loss[loss=0.1616, simple_loss=0.2301, pruned_loss=0.04656, over 972829.81 frames.], batch size: 18, lr: 5.24e-04 2022-05-04 16:57:12,182 INFO [train.py:715] (0/8) Epoch 3, batch 26400, loss[loss=0.1676, simple_loss=0.2422, pruned_loss=0.04652, over 4858.00 frames.], tot_loss[loss=0.162, simple_loss=0.2304, pruned_loss=0.04678, over 972663.30 frames.], batch size: 32, lr: 5.24e-04 2022-05-04 16:57:51,174 INFO [train.py:715] (0/8) Epoch 3, batch 26450, loss[loss=0.1457, simple_loss=0.224, pruned_loss=0.03368, over 4962.00 frames.], tot_loss[loss=0.1611, simple_loss=0.2298, pruned_loss=0.04623, over 972381.51 frames.], batch size: 24, lr: 5.24e-04 2022-05-04 16:58:30,426 INFO [train.py:715] (0/8) Epoch 3, batch 26500, loss[loss=0.185, simple_loss=0.2515, pruned_loss=0.05926, over 4929.00 frames.], tot_loss[loss=0.1604, simple_loss=0.2293, pruned_loss=0.04573, over 971950.71 frames.], batch size: 39, lr: 5.24e-04 2022-05-04 16:59:09,908 INFO [train.py:715] (0/8) Epoch 3, batch 26550, loss[loss=0.1612, simple_loss=0.2355, pruned_loss=0.04345, over 4944.00 frames.], tot_loss[loss=0.1596, simple_loss=0.2285, pruned_loss=0.04536, over 971917.02 frames.], batch size: 29, lr: 5.24e-04 2022-05-04 16:59:48,110 INFO [train.py:715] (0/8) Epoch 3, batch 26600, loss[loss=0.1548, simple_loss=0.2272, pruned_loss=0.04124, over 4792.00 frames.], tot_loss[loss=0.1615, simple_loss=0.23, pruned_loss=0.04647, over 972283.25 frames.], batch size: 24, lr: 5.24e-04 2022-05-04 17:00:27,327 INFO [train.py:715] (0/8) Epoch 3, batch 26650, loss[loss=0.1511, simple_loss=0.2228, pruned_loss=0.03974, over 4921.00 
frames.], tot_loss[loss=0.1602, simple_loss=0.2291, pruned_loss=0.04565, over 971575.10 frames.], batch size: 29, lr: 5.24e-04 2022-05-04 17:01:07,869 INFO [train.py:715] (0/8) Epoch 3, batch 26700, loss[loss=0.1773, simple_loss=0.2353, pruned_loss=0.05967, over 4835.00 frames.], tot_loss[loss=0.1604, simple_loss=0.2292, pruned_loss=0.04577, over 971855.24 frames.], batch size: 13, lr: 5.24e-04 2022-05-04 17:01:47,348 INFO [train.py:715] (0/8) Epoch 3, batch 26750, loss[loss=0.133, simple_loss=0.2075, pruned_loss=0.02926, over 4941.00 frames.], tot_loss[loss=0.161, simple_loss=0.23, pruned_loss=0.046, over 972010.65 frames.], batch size: 21, lr: 5.24e-04 2022-05-04 17:02:26,597 INFO [train.py:715] (0/8) Epoch 3, batch 26800, loss[loss=0.1917, simple_loss=0.261, pruned_loss=0.06117, over 4941.00 frames.], tot_loss[loss=0.1623, simple_loss=0.2315, pruned_loss=0.04651, over 972533.21 frames.], batch size: 21, lr: 5.24e-04 2022-05-04 17:03:06,718 INFO [train.py:715] (0/8) Epoch 3, batch 26850, loss[loss=0.1683, simple_loss=0.2326, pruned_loss=0.05197, over 4897.00 frames.], tot_loss[loss=0.163, simple_loss=0.2315, pruned_loss=0.04727, over 971579.62 frames.], batch size: 22, lr: 5.23e-04 2022-05-04 17:03:47,102 INFO [train.py:715] (0/8) Epoch 3, batch 26900, loss[loss=0.1553, simple_loss=0.214, pruned_loss=0.04825, over 4833.00 frames.], tot_loss[loss=0.1628, simple_loss=0.2314, pruned_loss=0.04707, over 972639.10 frames.], batch size: 26, lr: 5.23e-04 2022-05-04 17:04:26,660 INFO [train.py:715] (0/8) Epoch 3, batch 26950, loss[loss=0.1724, simple_loss=0.2476, pruned_loss=0.04862, over 4785.00 frames.], tot_loss[loss=0.1627, simple_loss=0.2319, pruned_loss=0.04672, over 973169.40 frames.], batch size: 17, lr: 5.23e-04 2022-05-04 17:05:05,428 INFO [train.py:715] (0/8) Epoch 3, batch 27000, loss[loss=0.1311, simple_loss=0.2062, pruned_loss=0.02799, over 4975.00 frames.], tot_loss[loss=0.1623, simple_loss=0.2314, pruned_loss=0.04662, over 973187.75 frames.], batch size: 28, lr: 5.23e-04 2022-05-04 17:05:05,429 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 17:05:14,909 INFO [train.py:742] (0/8) Epoch 3, validation: loss=0.1134, simple_loss=0.1995, pruned_loss=0.01366, over 914524.00 frames. 
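The lr column decays smoothly with the global batch count and steps down again at each epoch boundary (the drop to 4.78e-04 is visible at the start of epoch 4 further down). A minimal sketch that reproduces these values follows; the functional form is modeled on icefall's Eden scheduler, and the three constants plus the epoch/batch bookkeeping are assumptions checked only against the numbers in this log.

    # Hedged sketch of the rule behind the "lr:" column. The constants
    # initial_lr, lr_batches and lr_epochs are assumptions chosen because
    # they reproduce the logged values; they are not read from this log.
    def eden_lr(global_batch: int, epoch: int,
                initial_lr: float = 0.003,
                lr_batches: float = 5000.0,
                lr_epochs: float = 4.0) -> float:
        batch_factor = ((global_batch ** 2 + lr_batches ** 2) / lr_batches ** 2) ** -0.25
        epoch_factor = ((epoch ** 2 + lr_epochs ** 2) / lr_epochs ** 2) ** -0.25
        return initial_lr * batch_factor * epoch_factor

    # Around epoch 3, batch 27000 the global batch index is roughly 131k
    # (checkpoint-128000.pt was written near batch 23550 above).
    print(f"{eden_lr(131_000, 3):.2e}")   # 5.24e-04, matching the entries here
    print(f"{eden_lr(139_000, 4):.2e}")   # 4.78e-04, matching the start of epoch 4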
2022-05-04 17:05:54,550 INFO [train.py:715] (0/8) Epoch 3, batch 27050, loss[loss=0.1415, simple_loss=0.2207, pruned_loss=0.03115, over 4893.00 frames.], tot_loss[loss=0.1612, simple_loss=0.2301, pruned_loss=0.04617, over 973052.17 frames.], batch size: 22, lr: 5.23e-04 2022-05-04 17:06:34,877 INFO [train.py:715] (0/8) Epoch 3, batch 27100, loss[loss=0.1662, simple_loss=0.2292, pruned_loss=0.05158, over 4830.00 frames.], tot_loss[loss=0.1615, simple_loss=0.2304, pruned_loss=0.04636, over 972157.16 frames.], batch size: 32, lr: 5.23e-04 2022-05-04 17:07:14,165 INFO [train.py:715] (0/8) Epoch 3, batch 27150, loss[loss=0.1719, simple_loss=0.2503, pruned_loss=0.04672, over 4813.00 frames.], tot_loss[loss=0.1622, simple_loss=0.2308, pruned_loss=0.04678, over 972410.82 frames.], batch size: 26, lr: 5.23e-04 2022-05-04 17:07:52,931 INFO [train.py:715] (0/8) Epoch 3, batch 27200, loss[loss=0.1337, simple_loss=0.2092, pruned_loss=0.02914, over 4791.00 frames.], tot_loss[loss=0.1613, simple_loss=0.2298, pruned_loss=0.04642, over 971819.40 frames.], batch size: 17, lr: 5.23e-04 2022-05-04 17:08:32,669 INFO [train.py:715] (0/8) Epoch 3, batch 27250, loss[loss=0.193, simple_loss=0.2543, pruned_loss=0.06588, over 4810.00 frames.], tot_loss[loss=0.1627, simple_loss=0.2309, pruned_loss=0.04731, over 972176.15 frames.], batch size: 14, lr: 5.23e-04 2022-05-04 17:09:12,363 INFO [train.py:715] (0/8) Epoch 3, batch 27300, loss[loss=0.1545, simple_loss=0.222, pruned_loss=0.04351, over 4797.00 frames.], tot_loss[loss=0.1636, simple_loss=0.2319, pruned_loss=0.04771, over 972311.44 frames.], batch size: 24, lr: 5.23e-04 2022-05-04 17:09:51,019 INFO [train.py:715] (0/8) Epoch 3, batch 27350, loss[loss=0.1841, simple_loss=0.2535, pruned_loss=0.05741, over 4682.00 frames.], tot_loss[loss=0.1642, simple_loss=0.2321, pruned_loss=0.04812, over 972268.27 frames.], batch size: 15, lr: 5.22e-04 2022-05-04 17:10:30,265 INFO [train.py:715] (0/8) Epoch 3, batch 27400, loss[loss=0.1672, simple_loss=0.234, pruned_loss=0.05026, over 4975.00 frames.], tot_loss[loss=0.1639, simple_loss=0.2318, pruned_loss=0.048, over 971618.37 frames.], batch size: 35, lr: 5.22e-04 2022-05-04 17:11:10,412 INFO [train.py:715] (0/8) Epoch 3, batch 27450, loss[loss=0.1588, simple_loss=0.2371, pruned_loss=0.0403, over 4818.00 frames.], tot_loss[loss=0.1629, simple_loss=0.2308, pruned_loss=0.04745, over 971798.27 frames.], batch size: 21, lr: 5.22e-04 2022-05-04 17:11:49,739 INFO [train.py:715] (0/8) Epoch 3, batch 27500, loss[loss=0.193, simple_loss=0.2493, pruned_loss=0.06832, over 4955.00 frames.], tot_loss[loss=0.1623, simple_loss=0.2302, pruned_loss=0.04714, over 971873.73 frames.], batch size: 14, lr: 5.22e-04 2022-05-04 17:12:28,640 INFO [train.py:715] (0/8) Epoch 3, batch 27550, loss[loss=0.1408, simple_loss=0.2093, pruned_loss=0.03612, over 4869.00 frames.], tot_loss[loss=0.1618, simple_loss=0.2301, pruned_loss=0.04671, over 971538.39 frames.], batch size: 22, lr: 5.22e-04 2022-05-04 17:13:08,354 INFO [train.py:715] (0/8) Epoch 3, batch 27600, loss[loss=0.1581, simple_loss=0.2298, pruned_loss=0.0432, over 4842.00 frames.], tot_loss[loss=0.1612, simple_loss=0.2297, pruned_loss=0.04628, over 971728.41 frames.], batch size: 30, lr: 5.22e-04 2022-05-04 17:13:48,000 INFO [train.py:715] (0/8) Epoch 3, batch 27650, loss[loss=0.1285, simple_loss=0.2029, pruned_loss=0.02701, over 4702.00 frames.], tot_loss[loss=0.1605, simple_loss=0.2294, pruned_loss=0.0458, over 971534.40 frames.], batch size: 15, lr: 5.22e-04 2022-05-04 17:14:26,624 INFO 
[train.py:715] (0/8) Epoch 3, batch 27700, loss[loss=0.1591, simple_loss=0.2192, pruned_loss=0.0495, over 4893.00 frames.], tot_loss[loss=0.1615, simple_loss=0.2302, pruned_loss=0.04637, over 971284.65 frames.], batch size: 17, lr: 5.22e-04 2022-05-04 17:15:06,395 INFO [train.py:715] (0/8) Epoch 3, batch 27750, loss[loss=0.1622, simple_loss=0.2261, pruned_loss=0.04912, over 4896.00 frames.], tot_loss[loss=0.1609, simple_loss=0.2296, pruned_loss=0.04615, over 972422.37 frames.], batch size: 17, lr: 5.22e-04 2022-05-04 17:15:46,347 INFO [train.py:715] (0/8) Epoch 3, batch 27800, loss[loss=0.1767, simple_loss=0.2457, pruned_loss=0.05383, over 4792.00 frames.], tot_loss[loss=0.1618, simple_loss=0.2301, pruned_loss=0.04678, over 972421.76 frames.], batch size: 18, lr: 5.22e-04 2022-05-04 17:16:25,740 INFO [train.py:715] (0/8) Epoch 3, batch 27850, loss[loss=0.1606, simple_loss=0.2329, pruned_loss=0.04411, over 4799.00 frames.], tot_loss[loss=0.1613, simple_loss=0.2304, pruned_loss=0.04606, over 973270.59 frames.], batch size: 13, lr: 5.21e-04 2022-05-04 17:17:04,210 INFO [train.py:715] (0/8) Epoch 3, batch 27900, loss[loss=0.1518, simple_loss=0.2197, pruned_loss=0.042, over 4767.00 frames.], tot_loss[loss=0.1614, simple_loss=0.2302, pruned_loss=0.0463, over 972306.67 frames.], batch size: 17, lr: 5.21e-04 2022-05-04 17:17:43,815 INFO [train.py:715] (0/8) Epoch 3, batch 27950, loss[loss=0.1288, simple_loss=0.2138, pruned_loss=0.02196, over 4927.00 frames.], tot_loss[loss=0.1616, simple_loss=0.2303, pruned_loss=0.04646, over 971864.47 frames.], batch size: 23, lr: 5.21e-04 2022-05-04 17:18:23,716 INFO [train.py:715] (0/8) Epoch 3, batch 28000, loss[loss=0.1676, simple_loss=0.2339, pruned_loss=0.05067, over 4919.00 frames.], tot_loss[loss=0.1614, simple_loss=0.2304, pruned_loss=0.04625, over 971936.82 frames.], batch size: 17, lr: 5.21e-04 2022-05-04 17:19:02,277 INFO [train.py:715] (0/8) Epoch 3, batch 28050, loss[loss=0.1431, simple_loss=0.206, pruned_loss=0.04007, over 4991.00 frames.], tot_loss[loss=0.1606, simple_loss=0.2297, pruned_loss=0.0457, over 971877.46 frames.], batch size: 14, lr: 5.21e-04 2022-05-04 17:19:41,710 INFO [train.py:715] (0/8) Epoch 3, batch 28100, loss[loss=0.2026, simple_loss=0.2615, pruned_loss=0.07187, over 4813.00 frames.], tot_loss[loss=0.1604, simple_loss=0.2297, pruned_loss=0.0456, over 972894.71 frames.], batch size: 21, lr: 5.21e-04 2022-05-04 17:20:21,588 INFO [train.py:715] (0/8) Epoch 3, batch 28150, loss[loss=0.1571, simple_loss=0.2251, pruned_loss=0.04457, over 4794.00 frames.], tot_loss[loss=0.1599, simple_loss=0.2292, pruned_loss=0.04533, over 973036.20 frames.], batch size: 21, lr: 5.21e-04 2022-05-04 17:21:00,810 INFO [train.py:715] (0/8) Epoch 3, batch 28200, loss[loss=0.1494, simple_loss=0.2277, pruned_loss=0.0355, over 4968.00 frames.], tot_loss[loss=0.1601, simple_loss=0.2294, pruned_loss=0.04546, over 973046.79 frames.], batch size: 24, lr: 5.21e-04 2022-05-04 17:21:39,659 INFO [train.py:715] (0/8) Epoch 3, batch 28250, loss[loss=0.1781, simple_loss=0.2508, pruned_loss=0.05269, over 4908.00 frames.], tot_loss[loss=0.161, simple_loss=0.2301, pruned_loss=0.04594, over 972563.39 frames.], batch size: 23, lr: 5.21e-04 2022-05-04 17:22:18,999 INFO [train.py:715] (0/8) Epoch 3, batch 28300, loss[loss=0.135, simple_loss=0.2014, pruned_loss=0.03426, over 4870.00 frames.], tot_loss[loss=0.1615, simple_loss=0.2305, pruned_loss=0.04627, over 972368.26 frames.], batch size: 16, lr: 5.21e-04 2022-05-04 17:22:58,003 INFO [train.py:715] (0/8) Epoch 3, batch 
28350, loss[loss=0.1585, simple_loss=0.2208, pruned_loss=0.04804, over 4779.00 frames.], tot_loss[loss=0.1628, simple_loss=0.2309, pruned_loss=0.04737, over 972882.28 frames.], batch size: 17, lr: 5.21e-04 2022-05-04 17:23:37,193 INFO [train.py:715] (0/8) Epoch 3, batch 28400, loss[loss=0.1974, simple_loss=0.2673, pruned_loss=0.06371, over 4826.00 frames.], tot_loss[loss=0.1638, simple_loss=0.2317, pruned_loss=0.04794, over 972677.76 frames.], batch size: 15, lr: 5.20e-04 2022-05-04 17:24:15,827 INFO [train.py:715] (0/8) Epoch 3, batch 28450, loss[loss=0.1482, simple_loss=0.2111, pruned_loss=0.04271, over 4809.00 frames.], tot_loss[loss=0.163, simple_loss=0.2311, pruned_loss=0.04745, over 972158.18 frames.], batch size: 12, lr: 5.20e-04 2022-05-04 17:24:55,565 INFO [train.py:715] (0/8) Epoch 3, batch 28500, loss[loss=0.1755, simple_loss=0.2426, pruned_loss=0.05422, over 4933.00 frames.], tot_loss[loss=0.1624, simple_loss=0.2307, pruned_loss=0.04708, over 972780.89 frames.], batch size: 29, lr: 5.20e-04 2022-05-04 17:25:34,508 INFO [train.py:715] (0/8) Epoch 3, batch 28550, loss[loss=0.1639, simple_loss=0.238, pruned_loss=0.0449, over 4902.00 frames.], tot_loss[loss=0.1614, simple_loss=0.23, pruned_loss=0.04636, over 972320.95 frames.], batch size: 17, lr: 5.20e-04 2022-05-04 17:26:13,423 INFO [train.py:715] (0/8) Epoch 3, batch 28600, loss[loss=0.1648, simple_loss=0.2311, pruned_loss=0.0492, over 4868.00 frames.], tot_loss[loss=0.1615, simple_loss=0.2302, pruned_loss=0.04636, over 972455.44 frames.], batch size: 32, lr: 5.20e-04 2022-05-04 17:26:53,129 INFO [train.py:715] (0/8) Epoch 3, batch 28650, loss[loss=0.1557, simple_loss=0.2213, pruned_loss=0.04508, over 4948.00 frames.], tot_loss[loss=0.1614, simple_loss=0.2301, pruned_loss=0.04632, over 972440.96 frames.], batch size: 23, lr: 5.20e-04 2022-05-04 17:27:33,006 INFO [train.py:715] (0/8) Epoch 3, batch 28700, loss[loss=0.1555, simple_loss=0.2306, pruned_loss=0.04025, over 4901.00 frames.], tot_loss[loss=0.1613, simple_loss=0.2304, pruned_loss=0.04612, over 971262.11 frames.], batch size: 16, lr: 5.20e-04 2022-05-04 17:28:12,165 INFO [train.py:715] (0/8) Epoch 3, batch 28750, loss[loss=0.1653, simple_loss=0.2239, pruned_loss=0.05332, over 4713.00 frames.], tot_loss[loss=0.1606, simple_loss=0.2299, pruned_loss=0.04566, over 972096.46 frames.], batch size: 15, lr: 5.20e-04 2022-05-04 17:28:52,001 INFO [train.py:715] (0/8) Epoch 3, batch 28800, loss[loss=0.1232, simple_loss=0.1972, pruned_loss=0.0246, over 4752.00 frames.], tot_loss[loss=0.1613, simple_loss=0.2305, pruned_loss=0.04608, over 972008.74 frames.], batch size: 16, lr: 5.20e-04 2022-05-04 17:29:32,016 INFO [train.py:715] (0/8) Epoch 3, batch 28850, loss[loss=0.1282, simple_loss=0.2062, pruned_loss=0.02504, over 4779.00 frames.], tot_loss[loss=0.1629, simple_loss=0.2319, pruned_loss=0.04696, over 971201.72 frames.], batch size: 18, lr: 5.20e-04 2022-05-04 17:30:11,196 INFO [train.py:715] (0/8) Epoch 3, batch 28900, loss[loss=0.1316, simple_loss=0.2094, pruned_loss=0.02691, over 4954.00 frames.], tot_loss[loss=0.1617, simple_loss=0.2308, pruned_loss=0.04636, over 971114.81 frames.], batch size: 29, lr: 5.19e-04 2022-05-04 17:30:50,076 INFO [train.py:715] (0/8) Epoch 3, batch 28950, loss[loss=0.164, simple_loss=0.2386, pruned_loss=0.04472, over 4791.00 frames.], tot_loss[loss=0.1621, simple_loss=0.231, pruned_loss=0.04655, over 970469.03 frames.], batch size: 14, lr: 5.19e-04 2022-05-04 17:31:29,812 INFO [train.py:715] (0/8) Epoch 3, batch 29000, loss[loss=0.1696, 
simple_loss=0.2402, pruned_loss=0.04944, over 4951.00 frames.], tot_loss[loss=0.1624, simple_loss=0.2317, pruned_loss=0.0466, over 971894.95 frames.], batch size: 39, lr: 5.19e-04 2022-05-04 17:32:10,053 INFO [train.py:715] (0/8) Epoch 3, batch 29050, loss[loss=0.1569, simple_loss=0.2248, pruned_loss=0.04451, over 4757.00 frames.], tot_loss[loss=0.1622, simple_loss=0.2312, pruned_loss=0.04656, over 971028.52 frames.], batch size: 18, lr: 5.19e-04 2022-05-04 17:32:48,615 INFO [train.py:715] (0/8) Epoch 3, batch 29100, loss[loss=0.163, simple_loss=0.2305, pruned_loss=0.04774, over 4732.00 frames.], tot_loss[loss=0.1621, simple_loss=0.2313, pruned_loss=0.04644, over 971857.58 frames.], batch size: 16, lr: 5.19e-04 2022-05-04 17:33:28,200 INFO [train.py:715] (0/8) Epoch 3, batch 29150, loss[loss=0.1687, simple_loss=0.2427, pruned_loss=0.04732, over 4884.00 frames.], tot_loss[loss=0.1609, simple_loss=0.2299, pruned_loss=0.04591, over 971531.26 frames.], batch size: 16, lr: 5.19e-04 2022-05-04 17:34:08,091 INFO [train.py:715] (0/8) Epoch 3, batch 29200, loss[loss=0.156, simple_loss=0.215, pruned_loss=0.04853, over 4847.00 frames.], tot_loss[loss=0.161, simple_loss=0.2301, pruned_loss=0.04593, over 971373.74 frames.], batch size: 15, lr: 5.19e-04 2022-05-04 17:34:47,190 INFO [train.py:715] (0/8) Epoch 3, batch 29250, loss[loss=0.2151, simple_loss=0.2692, pruned_loss=0.08045, over 4919.00 frames.], tot_loss[loss=0.1615, simple_loss=0.2303, pruned_loss=0.04642, over 972568.86 frames.], batch size: 39, lr: 5.19e-04 2022-05-04 17:35:26,070 INFO [train.py:715] (0/8) Epoch 3, batch 29300, loss[loss=0.1635, simple_loss=0.2197, pruned_loss=0.0537, over 4971.00 frames.], tot_loss[loss=0.161, simple_loss=0.2295, pruned_loss=0.04623, over 972128.09 frames.], batch size: 15, lr: 5.19e-04 2022-05-04 17:36:06,257 INFO [train.py:715] (0/8) Epoch 3, batch 29350, loss[loss=0.1603, simple_loss=0.2216, pruned_loss=0.04948, over 4826.00 frames.], tot_loss[loss=0.1605, simple_loss=0.2288, pruned_loss=0.04611, over 971597.91 frames.], batch size: 15, lr: 5.19e-04 2022-05-04 17:36:45,936 INFO [train.py:715] (0/8) Epoch 3, batch 29400, loss[loss=0.1436, simple_loss=0.2114, pruned_loss=0.03792, over 4873.00 frames.], tot_loss[loss=0.1621, simple_loss=0.2306, pruned_loss=0.04677, over 972584.23 frames.], batch size: 22, lr: 5.18e-04 2022-05-04 17:37:24,690 INFO [train.py:715] (0/8) Epoch 3, batch 29450, loss[loss=0.1706, simple_loss=0.2229, pruned_loss=0.05915, over 4847.00 frames.], tot_loss[loss=0.1619, simple_loss=0.2304, pruned_loss=0.04666, over 972578.10 frames.], batch size: 30, lr: 5.18e-04 2022-05-04 17:38:03,871 INFO [train.py:715] (0/8) Epoch 3, batch 29500, loss[loss=0.1559, simple_loss=0.2274, pruned_loss=0.04223, over 4963.00 frames.], tot_loss[loss=0.1613, simple_loss=0.23, pruned_loss=0.04634, over 973135.03 frames.], batch size: 24, lr: 5.18e-04 2022-05-04 17:38:43,452 INFO [train.py:715] (0/8) Epoch 3, batch 29550, loss[loss=0.1844, simple_loss=0.2482, pruned_loss=0.0603, over 4906.00 frames.], tot_loss[loss=0.1616, simple_loss=0.2305, pruned_loss=0.04636, over 973693.18 frames.], batch size: 17, lr: 5.18e-04 2022-05-04 17:39:22,772 INFO [train.py:715] (0/8) Epoch 3, batch 29600, loss[loss=0.1855, simple_loss=0.2604, pruned_loss=0.05533, over 4774.00 frames.], tot_loss[loss=0.1629, simple_loss=0.2316, pruned_loss=0.04708, over 973295.58 frames.], batch size: 17, lr: 5.18e-04 2022-05-04 17:40:01,837 INFO [train.py:715] (0/8) Epoch 3, batch 29650, loss[loss=0.1398, simple_loss=0.2015, pruned_loss=0.03903, 
over 4830.00 frames.], tot_loss[loss=0.1627, simple_loss=0.2312, pruned_loss=0.04704, over 973526.73 frames.], batch size: 13, lr: 5.18e-04 2022-05-04 17:40:41,990 INFO [train.py:715] (0/8) Epoch 3, batch 29700, loss[loss=0.1597, simple_loss=0.2328, pruned_loss=0.0433, over 4802.00 frames.], tot_loss[loss=0.1618, simple_loss=0.2308, pruned_loss=0.04646, over 972751.79 frames.], batch size: 21, lr: 5.18e-04 2022-05-04 17:41:22,016 INFO [train.py:715] (0/8) Epoch 3, batch 29750, loss[loss=0.1223, simple_loss=0.1975, pruned_loss=0.02361, over 4895.00 frames.], tot_loss[loss=0.1618, simple_loss=0.2306, pruned_loss=0.04649, over 972265.93 frames.], batch size: 22, lr: 5.18e-04 2022-05-04 17:42:00,530 INFO [train.py:715] (0/8) Epoch 3, batch 29800, loss[loss=0.1718, simple_loss=0.235, pruned_loss=0.05431, over 4779.00 frames.], tot_loss[loss=0.1607, simple_loss=0.2294, pruned_loss=0.04598, over 972599.31 frames.], batch size: 18, lr: 5.18e-04 2022-05-04 17:42:40,517 INFO [train.py:715] (0/8) Epoch 3, batch 29850, loss[loss=0.1451, simple_loss=0.2238, pruned_loss=0.03314, over 4812.00 frames.], tot_loss[loss=0.1608, simple_loss=0.2298, pruned_loss=0.04594, over 972775.66 frames.], batch size: 25, lr: 5.18e-04 2022-05-04 17:43:20,047 INFO [train.py:715] (0/8) Epoch 3, batch 29900, loss[loss=0.193, simple_loss=0.2597, pruned_loss=0.06313, over 4782.00 frames.], tot_loss[loss=0.1614, simple_loss=0.2299, pruned_loss=0.04649, over 971699.67 frames.], batch size: 18, lr: 5.18e-04 2022-05-04 17:43:58,720 INFO [train.py:715] (0/8) Epoch 3, batch 29950, loss[loss=0.1326, simple_loss=0.1948, pruned_loss=0.03524, over 4830.00 frames.], tot_loss[loss=0.162, simple_loss=0.2305, pruned_loss=0.04675, over 971988.31 frames.], batch size: 12, lr: 5.17e-04 2022-05-04 17:44:37,448 INFO [train.py:715] (0/8) Epoch 3, batch 30000, loss[loss=0.1226, simple_loss=0.1888, pruned_loss=0.02821, over 4835.00 frames.], tot_loss[loss=0.1607, simple_loss=0.2292, pruned_loss=0.04615, over 972443.61 frames.], batch size: 12, lr: 5.17e-04 2022-05-04 17:44:37,449 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 17:44:47,857 INFO [train.py:742] (0/8) Epoch 3, validation: loss=0.1135, simple_loss=0.1993, pruned_loss=0.01381, over 914524.00 frames. 
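Entries like the ones above are regular enough that the loss and lr curves can be recovered with a small parser. The helper below is hypothetical (it is not part of the recipe); its regular expression simply mirrors the [train.py:715] per-batch line format used throughout this log.

    import re

    # Hypothetical helper, not part of the icefall recipe: pull epoch, batch,
    # smoothed total loss and lr out of per-batch lines like the ones above.
    TRAIN_RE = re.compile(
        r"Epoch (?P<epoch>\d+), batch (?P<batch>\d+), "
        r"loss\[.*?\], tot_loss\[loss=(?P<tot_loss>[\d.]+),.*?\], "
        r"batch size: \d+, lr: (?P<lr>[\d.e-]+)"
    )

    def parse_line(line: str):
        m = TRAIN_RE.search(line)
        if m is None:
            return None
        return int(m["epoch"]), int(m["batch"]), float(m["tot_loss"]), float(m["lr"])

    line = ("2022-05-04 17:44:37,448 INFO [train.py:715] (0/8) Epoch 3, batch 30000, "
            "loss[loss=0.1226, simple_loss=0.1888, pruned_loss=0.02821, over 4835.00 frames.], "
            "tot_loss[loss=0.1607, simple_loss=0.2292, pruned_loss=0.04615, over 972443.61 frames.], "
            "batch size: 12, lr: 5.17e-04")
    print(parse_line(line))   # (3, 30000, 0.1607, 0.000517)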
2022-05-04 17:45:26,661 INFO [train.py:715] (0/8) Epoch 3, batch 30050, loss[loss=0.1647, simple_loss=0.2441, pruned_loss=0.04266, over 4801.00 frames.], tot_loss[loss=0.1598, simple_loss=0.229, pruned_loss=0.04537, over 972362.09 frames.], batch size: 21, lr: 5.17e-04 2022-05-04 17:46:06,301 INFO [train.py:715] (0/8) Epoch 3, batch 30100, loss[loss=0.1792, simple_loss=0.2483, pruned_loss=0.05506, over 4786.00 frames.], tot_loss[loss=0.1603, simple_loss=0.2291, pruned_loss=0.04574, over 971791.15 frames.], batch size: 17, lr: 5.17e-04 2022-05-04 17:46:46,370 INFO [train.py:715] (0/8) Epoch 3, batch 30150, loss[loss=0.192, simple_loss=0.2738, pruned_loss=0.05508, over 4769.00 frames.], tot_loss[loss=0.16, simple_loss=0.2288, pruned_loss=0.04562, over 972633.78 frames.], batch size: 17, lr: 5.17e-04 2022-05-04 17:47:24,502 INFO [train.py:715] (0/8) Epoch 3, batch 30200, loss[loss=0.1385, simple_loss=0.2076, pruned_loss=0.03467, over 4783.00 frames.], tot_loss[loss=0.1593, simple_loss=0.2285, pruned_loss=0.04507, over 972144.75 frames.], batch size: 12, lr: 5.17e-04 2022-05-04 17:48:04,130 INFO [train.py:715] (0/8) Epoch 3, batch 30250, loss[loss=0.1922, simple_loss=0.2421, pruned_loss=0.07114, over 4912.00 frames.], tot_loss[loss=0.1601, simple_loss=0.2291, pruned_loss=0.04552, over 972380.05 frames.], batch size: 32, lr: 5.17e-04 2022-05-04 17:48:44,311 INFO [train.py:715] (0/8) Epoch 3, batch 30300, loss[loss=0.1869, simple_loss=0.2545, pruned_loss=0.05965, over 4928.00 frames.], tot_loss[loss=0.1609, simple_loss=0.2302, pruned_loss=0.04586, over 972052.13 frames.], batch size: 39, lr: 5.17e-04 2022-05-04 17:49:23,080 INFO [train.py:715] (0/8) Epoch 3, batch 30350, loss[loss=0.1606, simple_loss=0.2262, pruned_loss=0.04748, over 4754.00 frames.], tot_loss[loss=0.1606, simple_loss=0.2297, pruned_loss=0.04581, over 972406.36 frames.], batch size: 14, lr: 5.17e-04 2022-05-04 17:50:02,735 INFO [train.py:715] (0/8) Epoch 3, batch 30400, loss[loss=0.1669, simple_loss=0.2314, pruned_loss=0.05121, over 4990.00 frames.], tot_loss[loss=0.1612, simple_loss=0.2301, pruned_loss=0.04618, over 972795.28 frames.], batch size: 25, lr: 5.17e-04 2022-05-04 17:50:42,521 INFO [train.py:715] (0/8) Epoch 3, batch 30450, loss[loss=0.145, simple_loss=0.2149, pruned_loss=0.03756, over 4830.00 frames.], tot_loss[loss=0.1607, simple_loss=0.2298, pruned_loss=0.04584, over 973120.58 frames.], batch size: 15, lr: 5.16e-04 2022-05-04 17:51:22,933 INFO [train.py:715] (0/8) Epoch 3, batch 30500, loss[loss=0.1868, simple_loss=0.2515, pruned_loss=0.06112, over 4979.00 frames.], tot_loss[loss=0.1629, simple_loss=0.2317, pruned_loss=0.04708, over 973392.40 frames.], batch size: 39, lr: 5.16e-04 2022-05-04 17:52:02,151 INFO [train.py:715] (0/8) Epoch 3, batch 30550, loss[loss=0.1608, simple_loss=0.2396, pruned_loss=0.04101, over 4767.00 frames.], tot_loss[loss=0.1626, simple_loss=0.2312, pruned_loss=0.04697, over 972690.97 frames.], batch size: 17, lr: 5.16e-04 2022-05-04 17:52:41,686 INFO [train.py:715] (0/8) Epoch 3, batch 30600, loss[loss=0.1624, simple_loss=0.2203, pruned_loss=0.05228, over 4844.00 frames.], tot_loss[loss=0.1616, simple_loss=0.2304, pruned_loss=0.04641, over 973471.43 frames.], batch size: 32, lr: 5.16e-04 2022-05-04 17:53:21,642 INFO [train.py:715] (0/8) Epoch 3, batch 30650, loss[loss=0.1853, simple_loss=0.2475, pruned_loss=0.06158, over 4936.00 frames.], tot_loss[loss=0.1616, simple_loss=0.23, pruned_loss=0.04657, over 973259.22 frames.], batch size: 18, lr: 5.16e-04 2022-05-04 17:54:00,306 INFO 
[train.py:715] (0/8) Epoch 3, batch 30700, loss[loss=0.1309, simple_loss=0.2, pruned_loss=0.03095, over 4775.00 frames.], tot_loss[loss=0.1609, simple_loss=0.2296, pruned_loss=0.04613, over 972211.40 frames.], batch size: 12, lr: 5.16e-04 2022-05-04 17:54:39,863 INFO [train.py:715] (0/8) Epoch 3, batch 30750, loss[loss=0.139, simple_loss=0.2174, pruned_loss=0.03026, over 4956.00 frames.], tot_loss[loss=0.1608, simple_loss=0.2297, pruned_loss=0.04593, over 973471.96 frames.], batch size: 29, lr: 5.16e-04 2022-05-04 17:55:19,270 INFO [train.py:715] (0/8) Epoch 3, batch 30800, loss[loss=0.1394, simple_loss=0.2161, pruned_loss=0.03132, over 4960.00 frames.], tot_loss[loss=0.1609, simple_loss=0.2297, pruned_loss=0.04609, over 973502.82 frames.], batch size: 24, lr: 5.16e-04 2022-05-04 17:55:59,089 INFO [train.py:715] (0/8) Epoch 3, batch 30850, loss[loss=0.1595, simple_loss=0.2319, pruned_loss=0.04352, over 4976.00 frames.], tot_loss[loss=0.1608, simple_loss=0.2295, pruned_loss=0.04606, over 973413.06 frames.], batch size: 15, lr: 5.16e-04 2022-05-04 17:56:37,366 INFO [train.py:715] (0/8) Epoch 3, batch 30900, loss[loss=0.1845, simple_loss=0.246, pruned_loss=0.06151, over 4885.00 frames.], tot_loss[loss=0.1617, simple_loss=0.2305, pruned_loss=0.04646, over 972930.96 frames.], batch size: 22, lr: 5.16e-04 2022-05-04 17:57:16,438 INFO [train.py:715] (0/8) Epoch 3, batch 30950, loss[loss=0.1711, simple_loss=0.2248, pruned_loss=0.05871, over 4911.00 frames.], tot_loss[loss=0.1616, simple_loss=0.2303, pruned_loss=0.04645, over 972585.04 frames.], batch size: 19, lr: 5.15e-04 2022-05-04 17:57:55,759 INFO [train.py:715] (0/8) Epoch 3, batch 31000, loss[loss=0.1553, simple_loss=0.2212, pruned_loss=0.04473, over 4803.00 frames.], tot_loss[loss=0.1623, simple_loss=0.2309, pruned_loss=0.04687, over 972499.17 frames.], batch size: 21, lr: 5.15e-04 2022-05-04 17:58:35,033 INFO [train.py:715] (0/8) Epoch 3, batch 31050, loss[loss=0.1432, simple_loss=0.2162, pruned_loss=0.03509, over 4979.00 frames.], tot_loss[loss=0.1614, simple_loss=0.23, pruned_loss=0.0464, over 972971.25 frames.], batch size: 28, lr: 5.15e-04 2022-05-04 17:59:13,608 INFO [train.py:715] (0/8) Epoch 3, batch 31100, loss[loss=0.1872, simple_loss=0.2534, pruned_loss=0.06046, over 4776.00 frames.], tot_loss[loss=0.1616, simple_loss=0.2302, pruned_loss=0.04649, over 972592.69 frames.], batch size: 17, lr: 5.15e-04 2022-05-04 17:59:53,184 INFO [train.py:715] (0/8) Epoch 3, batch 31150, loss[loss=0.1528, simple_loss=0.2263, pruned_loss=0.03969, over 4841.00 frames.], tot_loss[loss=0.1618, simple_loss=0.2309, pruned_loss=0.04636, over 972894.30 frames.], batch size: 15, lr: 5.15e-04 2022-05-04 18:00:32,419 INFO [train.py:715] (0/8) Epoch 3, batch 31200, loss[loss=0.185, simple_loss=0.2518, pruned_loss=0.05907, over 4901.00 frames.], tot_loss[loss=0.1609, simple_loss=0.23, pruned_loss=0.04597, over 972005.50 frames.], batch size: 19, lr: 5.15e-04 2022-05-04 18:01:11,063 INFO [train.py:715] (0/8) Epoch 3, batch 31250, loss[loss=0.1281, simple_loss=0.1908, pruned_loss=0.03271, over 4918.00 frames.], tot_loss[loss=0.1602, simple_loss=0.2295, pruned_loss=0.04539, over 972582.79 frames.], batch size: 17, lr: 5.15e-04 2022-05-04 18:01:50,134 INFO [train.py:715] (0/8) Epoch 3, batch 31300, loss[loss=0.1161, simple_loss=0.1911, pruned_loss=0.02054, over 4911.00 frames.], tot_loss[loss=0.1601, simple_loss=0.2294, pruned_loss=0.04536, over 973378.09 frames.], batch size: 17, lr: 5.15e-04 2022-05-04 18:02:29,483 INFO [train.py:715] (0/8) Epoch 3, batch 31350, 
loss[loss=0.1591, simple_loss=0.223, pruned_loss=0.04756, over 4932.00 frames.], tot_loss[loss=0.1612, simple_loss=0.2305, pruned_loss=0.04596, over 973242.94 frames.], batch size: 29, lr: 5.15e-04 2022-05-04 18:03:08,645 INFO [train.py:715] (0/8) Epoch 3, batch 31400, loss[loss=0.1707, simple_loss=0.2373, pruned_loss=0.05208, over 4760.00 frames.], tot_loss[loss=0.1609, simple_loss=0.2303, pruned_loss=0.04574, over 972717.94 frames.], batch size: 19, lr: 5.15e-04 2022-05-04 18:03:47,230 INFO [train.py:715] (0/8) Epoch 3, batch 31450, loss[loss=0.1975, simple_loss=0.2421, pruned_loss=0.0765, over 4789.00 frames.], tot_loss[loss=0.1603, simple_loss=0.2296, pruned_loss=0.04553, over 972595.71 frames.], batch size: 17, lr: 5.15e-04 2022-05-04 18:04:26,976 INFO [train.py:715] (0/8) Epoch 3, batch 31500, loss[loss=0.1381, simple_loss=0.2115, pruned_loss=0.03233, over 4851.00 frames.], tot_loss[loss=0.1609, simple_loss=0.2303, pruned_loss=0.04571, over 972374.78 frames.], batch size: 13, lr: 5.14e-04 2022-05-04 18:05:06,853 INFO [train.py:715] (0/8) Epoch 3, batch 31550, loss[loss=0.158, simple_loss=0.2294, pruned_loss=0.04329, over 4953.00 frames.], tot_loss[loss=0.1615, simple_loss=0.2305, pruned_loss=0.04629, over 972634.60 frames.], batch size: 21, lr: 5.14e-04 2022-05-04 18:05:21,910 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-136000.pt 2022-05-04 18:05:47,988 INFO [train.py:715] (0/8) Epoch 3, batch 31600, loss[loss=0.1546, simple_loss=0.2232, pruned_loss=0.04305, over 4864.00 frames.], tot_loss[loss=0.1624, simple_loss=0.2311, pruned_loss=0.04684, over 972946.90 frames.], batch size: 20, lr: 5.14e-04 2022-05-04 18:06:26,991 INFO [train.py:715] (0/8) Epoch 3, batch 31650, loss[loss=0.1446, simple_loss=0.2156, pruned_loss=0.03685, over 4797.00 frames.], tot_loss[loss=0.1619, simple_loss=0.2303, pruned_loss=0.0468, over 972383.66 frames.], batch size: 21, lr: 5.14e-04 2022-05-04 18:07:07,173 INFO [train.py:715] (0/8) Epoch 3, batch 31700, loss[loss=0.1736, simple_loss=0.2409, pruned_loss=0.05318, over 4916.00 frames.], tot_loss[loss=0.1612, simple_loss=0.2297, pruned_loss=0.04634, over 972998.87 frames.], batch size: 17, lr: 5.14e-04 2022-05-04 18:07:46,353 INFO [train.py:715] (0/8) Epoch 3, batch 31750, loss[loss=0.176, simple_loss=0.251, pruned_loss=0.05045, over 4918.00 frames.], tot_loss[loss=0.1602, simple_loss=0.2285, pruned_loss=0.04588, over 973088.11 frames.], batch size: 17, lr: 5.14e-04 2022-05-04 18:08:24,498 INFO [train.py:715] (0/8) Epoch 3, batch 31800, loss[loss=0.1335, simple_loss=0.2013, pruned_loss=0.03281, over 4988.00 frames.], tot_loss[loss=0.1598, simple_loss=0.2285, pruned_loss=0.0455, over 972481.32 frames.], batch size: 14, lr: 5.14e-04 2022-05-04 18:09:04,267 INFO [train.py:715] (0/8) Epoch 3, batch 31850, loss[loss=0.1451, simple_loss=0.2217, pruned_loss=0.03419, over 4966.00 frames.], tot_loss[loss=0.1594, simple_loss=0.2283, pruned_loss=0.04522, over 973101.07 frames.], batch size: 24, lr: 5.14e-04 2022-05-04 18:09:43,773 INFO [train.py:715] (0/8) Epoch 3, batch 31900, loss[loss=0.1485, simple_loss=0.2244, pruned_loss=0.03633, over 4752.00 frames.], tot_loss[loss=0.1593, simple_loss=0.2283, pruned_loss=0.04519, over 972963.33 frames.], batch size: 16, lr: 5.14e-04 2022-05-04 18:10:22,480 INFO [train.py:715] (0/8) Epoch 3, batch 31950, loss[loss=0.1415, simple_loss=0.2195, pruned_loss=0.03176, over 4893.00 frames.], tot_loss[loss=0.1594, simple_loss=0.2284, pruned_loss=0.04526, over 972693.92 frames.], batch 
size: 19, lr: 5.14e-04 2022-05-04 18:11:01,412 INFO [train.py:715] (0/8) Epoch 3, batch 32000, loss[loss=0.1646, simple_loss=0.2396, pruned_loss=0.04484, over 4942.00 frames.], tot_loss[loss=0.1593, simple_loss=0.2285, pruned_loss=0.04508, over 973042.78 frames.], batch size: 23, lr: 5.14e-04 2022-05-04 18:11:41,009 INFO [train.py:715] (0/8) Epoch 3, batch 32050, loss[loss=0.1555, simple_loss=0.2438, pruned_loss=0.03366, over 4694.00 frames.], tot_loss[loss=0.1594, simple_loss=0.2288, pruned_loss=0.04503, over 972699.04 frames.], batch size: 15, lr: 5.13e-04 2022-05-04 18:12:19,206 INFO [train.py:715] (0/8) Epoch 3, batch 32100, loss[loss=0.1526, simple_loss=0.2171, pruned_loss=0.04407, over 4805.00 frames.], tot_loss[loss=0.1598, simple_loss=0.2292, pruned_loss=0.04521, over 972776.21 frames.], batch size: 26, lr: 5.13e-04 2022-05-04 18:12:58,307 INFO [train.py:715] (0/8) Epoch 3, batch 32150, loss[loss=0.173, simple_loss=0.2329, pruned_loss=0.05654, over 4957.00 frames.], tot_loss[loss=0.1591, simple_loss=0.2283, pruned_loss=0.0449, over 971502.33 frames.], batch size: 24, lr: 5.13e-04 2022-05-04 18:13:37,852 INFO [train.py:715] (0/8) Epoch 3, batch 32200, loss[loss=0.2231, simple_loss=0.2785, pruned_loss=0.08391, over 4942.00 frames.], tot_loss[loss=0.16, simple_loss=0.2293, pruned_loss=0.04535, over 972509.52 frames.], batch size: 23, lr: 5.13e-04 2022-05-04 18:14:16,673 INFO [train.py:715] (0/8) Epoch 3, batch 32250, loss[loss=0.1668, simple_loss=0.2204, pruned_loss=0.05663, over 4765.00 frames.], tot_loss[loss=0.1594, simple_loss=0.2286, pruned_loss=0.04512, over 972331.03 frames.], batch size: 14, lr: 5.13e-04 2022-05-04 18:14:55,237 INFO [train.py:715] (0/8) Epoch 3, batch 32300, loss[loss=0.1805, simple_loss=0.2445, pruned_loss=0.0582, over 4919.00 frames.], tot_loss[loss=0.1604, simple_loss=0.2292, pruned_loss=0.04575, over 972419.30 frames.], batch size: 19, lr: 5.13e-04 2022-05-04 18:15:34,896 INFO [train.py:715] (0/8) Epoch 3, batch 32350, loss[loss=0.1446, simple_loss=0.2226, pruned_loss=0.03325, over 4828.00 frames.], tot_loss[loss=0.1604, simple_loss=0.2297, pruned_loss=0.04559, over 971665.75 frames.], batch size: 15, lr: 5.13e-04 2022-05-04 18:16:14,619 INFO [train.py:715] (0/8) Epoch 3, batch 32400, loss[loss=0.1415, simple_loss=0.219, pruned_loss=0.03203, over 4946.00 frames.], tot_loss[loss=0.16, simple_loss=0.2296, pruned_loss=0.04523, over 972387.02 frames.], batch size: 21, lr: 5.13e-04 2022-05-04 18:16:52,600 INFO [train.py:715] (0/8) Epoch 3, batch 32450, loss[loss=0.1531, simple_loss=0.2299, pruned_loss=0.03814, over 4950.00 frames.], tot_loss[loss=0.16, simple_loss=0.2298, pruned_loss=0.04508, over 973303.75 frames.], batch size: 21, lr: 5.13e-04 2022-05-04 18:17:32,078 INFO [train.py:715] (0/8) Epoch 3, batch 32500, loss[loss=0.153, simple_loss=0.2329, pruned_loss=0.03656, over 4841.00 frames.], tot_loss[loss=0.1602, simple_loss=0.2298, pruned_loss=0.04533, over 973114.80 frames.], batch size: 15, lr: 5.13e-04 2022-05-04 18:18:11,714 INFO [train.py:715] (0/8) Epoch 3, batch 32550, loss[loss=0.1384, simple_loss=0.2053, pruned_loss=0.03581, over 4870.00 frames.], tot_loss[loss=0.1591, simple_loss=0.2288, pruned_loss=0.0447, over 972437.46 frames.], batch size: 16, lr: 5.12e-04 2022-05-04 18:18:50,231 INFO [train.py:715] (0/8) Epoch 3, batch 32600, loss[loss=0.1406, simple_loss=0.2237, pruned_loss=0.02874, over 4894.00 frames.], tot_loss[loss=0.1608, simple_loss=0.2295, pruned_loss=0.046, over 972971.60 frames.], batch size: 19, lr: 5.12e-04 2022-05-04 
18:19:29,062 INFO [train.py:715] (0/8) Epoch 3, batch 32650, loss[loss=0.1418, simple_loss=0.2141, pruned_loss=0.03474, over 4820.00 frames.], tot_loss[loss=0.1615, simple_loss=0.2302, pruned_loss=0.04643, over 972828.32 frames.], batch size: 27, lr: 5.12e-04 2022-05-04 18:20:08,687 INFO [train.py:715] (0/8) Epoch 3, batch 32700, loss[loss=0.1608, simple_loss=0.2328, pruned_loss=0.04435, over 4744.00 frames.], tot_loss[loss=0.1616, simple_loss=0.2304, pruned_loss=0.04643, over 972803.67 frames.], batch size: 12, lr: 5.12e-04 2022-05-04 18:20:47,704 INFO [train.py:715] (0/8) Epoch 3, batch 32750, loss[loss=0.138, simple_loss=0.218, pruned_loss=0.02895, over 4805.00 frames.], tot_loss[loss=0.1607, simple_loss=0.2293, pruned_loss=0.04608, over 972909.46 frames.], batch size: 24, lr: 5.12e-04 2022-05-04 18:21:26,284 INFO [train.py:715] (0/8) Epoch 3, batch 32800, loss[loss=0.1865, simple_loss=0.2552, pruned_loss=0.05889, over 4771.00 frames.], tot_loss[loss=0.1602, simple_loss=0.2287, pruned_loss=0.0459, over 972351.19 frames.], batch size: 18, lr: 5.12e-04 2022-05-04 18:22:05,409 INFO [train.py:715] (0/8) Epoch 3, batch 32850, loss[loss=0.1515, simple_loss=0.2275, pruned_loss=0.03769, over 4928.00 frames.], tot_loss[loss=0.1603, simple_loss=0.2286, pruned_loss=0.04595, over 972634.59 frames.], batch size: 23, lr: 5.12e-04 2022-05-04 18:22:44,588 INFO [train.py:715] (0/8) Epoch 3, batch 32900, loss[loss=0.1404, simple_loss=0.2066, pruned_loss=0.03707, over 4957.00 frames.], tot_loss[loss=0.1601, simple_loss=0.2286, pruned_loss=0.04585, over 972617.24 frames.], batch size: 21, lr: 5.12e-04 2022-05-04 18:23:23,656 INFO [train.py:715] (0/8) Epoch 3, batch 32950, loss[loss=0.1648, simple_loss=0.2306, pruned_loss=0.04954, over 4733.00 frames.], tot_loss[loss=0.1603, simple_loss=0.2291, pruned_loss=0.04581, over 972929.94 frames.], batch size: 16, lr: 5.12e-04 2022-05-04 18:24:02,384 INFO [train.py:715] (0/8) Epoch 3, batch 33000, loss[loss=0.1432, simple_loss=0.2161, pruned_loss=0.03518, over 4753.00 frames.], tot_loss[loss=0.1601, simple_loss=0.2287, pruned_loss=0.04573, over 972479.18 frames.], batch size: 19, lr: 5.12e-04 2022-05-04 18:24:02,385 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 18:24:11,705 INFO [train.py:742] (0/8) Epoch 3, validation: loss=0.1131, simple_loss=0.199, pruned_loss=0.01363, over 914524.00 frames. 
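Besides the periodic checkpoint-128000.pt and checkpoint-136000.pt files (one every 8000 global batches, per the two checkpoint.py entries in this section), an epoch-3.pt is written once the epoch finishes further down. A hedged way to inspect one of them is sketched below; torch.load is ordinary PyTorch, but the dictionary keys found inside the file are an assumption about the recipe's checkpoint layout, not something this log shows.

    import torch

    # Hedged sketch: open one of the checkpoint files named in this log.
    # The path is copied from the checkpoint.py entry above; the keys that
    # come back (e.g. "model", "optimizer") are an assumption, not confirmed
    # by the log itself.
    ckpt = torch.load(
        "pruned_transducer_stateless2/exp/v2/checkpoint-136000.pt",
        map_location="cpu",
    )
    print(sorted(ckpt.keys()))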
2022-05-04 18:24:50,800 INFO [train.py:715] (0/8) Epoch 3, batch 33050, loss[loss=0.1342, simple_loss=0.2, pruned_loss=0.03417, over 4948.00 frames.], tot_loss[loss=0.1598, simple_loss=0.2282, pruned_loss=0.04575, over 972442.21 frames.], batch size: 14, lr: 5.12e-04 2022-05-04 18:25:30,711 INFO [train.py:715] (0/8) Epoch 3, batch 33100, loss[loss=0.1173, simple_loss=0.191, pruned_loss=0.02184, over 4772.00 frames.], tot_loss[loss=0.1601, simple_loss=0.2284, pruned_loss=0.0459, over 971637.67 frames.], batch size: 19, lr: 5.11e-04 2022-05-04 18:26:09,588 INFO [train.py:715] (0/8) Epoch 3, batch 33150, loss[loss=0.1601, simple_loss=0.2254, pruned_loss=0.04739, over 4803.00 frames.], tot_loss[loss=0.1587, simple_loss=0.2274, pruned_loss=0.04494, over 971830.13 frames.], batch size: 24, lr: 5.11e-04 2022-05-04 18:26:48,266 INFO [train.py:715] (0/8) Epoch 3, batch 33200, loss[loss=0.1424, simple_loss=0.2095, pruned_loss=0.03764, over 4832.00 frames.], tot_loss[loss=0.1581, simple_loss=0.2273, pruned_loss=0.04441, over 971695.46 frames.], batch size: 26, lr: 5.11e-04 2022-05-04 18:27:28,162 INFO [train.py:715] (0/8) Epoch 3, batch 33250, loss[loss=0.1306, simple_loss=0.1973, pruned_loss=0.03196, over 4943.00 frames.], tot_loss[loss=0.1583, simple_loss=0.2274, pruned_loss=0.04461, over 972408.34 frames.], batch size: 21, lr: 5.11e-04 2022-05-04 18:28:07,721 INFO [train.py:715] (0/8) Epoch 3, batch 33300, loss[loss=0.1489, simple_loss=0.2295, pruned_loss=0.0341, over 4783.00 frames.], tot_loss[loss=0.1592, simple_loss=0.2281, pruned_loss=0.04512, over 972818.08 frames.], batch size: 14, lr: 5.11e-04 2022-05-04 18:28:46,235 INFO [train.py:715] (0/8) Epoch 3, batch 33350, loss[loss=0.1455, simple_loss=0.2127, pruned_loss=0.03914, over 4955.00 frames.], tot_loss[loss=0.159, simple_loss=0.2279, pruned_loss=0.04506, over 972714.36 frames.], batch size: 15, lr: 5.11e-04 2022-05-04 18:29:25,534 INFO [train.py:715] (0/8) Epoch 3, batch 33400, loss[loss=0.1512, simple_loss=0.2283, pruned_loss=0.03699, over 4699.00 frames.], tot_loss[loss=0.1592, simple_loss=0.2281, pruned_loss=0.04511, over 972624.00 frames.], batch size: 15, lr: 5.11e-04 2022-05-04 18:30:05,185 INFO [train.py:715] (0/8) Epoch 3, batch 33450, loss[loss=0.1445, simple_loss=0.2168, pruned_loss=0.03612, over 4819.00 frames.], tot_loss[loss=0.1592, simple_loss=0.2287, pruned_loss=0.04489, over 972472.07 frames.], batch size: 12, lr: 5.11e-04 2022-05-04 18:30:44,207 INFO [train.py:715] (0/8) Epoch 3, batch 33500, loss[loss=0.194, simple_loss=0.2651, pruned_loss=0.0614, over 4783.00 frames.], tot_loss[loss=0.1591, simple_loss=0.2284, pruned_loss=0.04494, over 972071.34 frames.], batch size: 18, lr: 5.11e-04 2022-05-04 18:31:23,290 INFO [train.py:715] (0/8) Epoch 3, batch 33550, loss[loss=0.1625, simple_loss=0.2318, pruned_loss=0.04662, over 4881.00 frames.], tot_loss[loss=0.1586, simple_loss=0.2278, pruned_loss=0.0447, over 972029.40 frames.], batch size: 22, lr: 5.11e-04 2022-05-04 18:32:03,650 INFO [train.py:715] (0/8) Epoch 3, batch 33600, loss[loss=0.1477, simple_loss=0.211, pruned_loss=0.04222, over 4717.00 frames.], tot_loss[loss=0.1585, simple_loss=0.2277, pruned_loss=0.04467, over 972908.06 frames.], batch size: 12, lr: 5.11e-04 2022-05-04 18:32:43,011 INFO [train.py:715] (0/8) Epoch 3, batch 33650, loss[loss=0.1427, simple_loss=0.2133, pruned_loss=0.03606, over 4891.00 frames.], tot_loss[loss=0.1583, simple_loss=0.2272, pruned_loss=0.04469, over 971344.84 frames.], batch size: 19, lr: 5.10e-04 2022-05-04 18:33:21,657 INFO 
[train.py:715] (0/8) Epoch 3, batch 33700, loss[loss=0.1721, simple_loss=0.2427, pruned_loss=0.05069, over 4922.00 frames.], tot_loss[loss=0.1589, simple_loss=0.2276, pruned_loss=0.04513, over 972622.18 frames.], batch size: 18, lr: 5.10e-04 2022-05-04 18:34:01,451 INFO [train.py:715] (0/8) Epoch 3, batch 33750, loss[loss=0.164, simple_loss=0.2221, pruned_loss=0.05292, over 4794.00 frames.], tot_loss[loss=0.1595, simple_loss=0.2285, pruned_loss=0.04524, over 972576.57 frames.], batch size: 18, lr: 5.10e-04 2022-05-04 18:34:40,934 INFO [train.py:715] (0/8) Epoch 3, batch 33800, loss[loss=0.1842, simple_loss=0.2624, pruned_loss=0.05299, over 4934.00 frames.], tot_loss[loss=0.1601, simple_loss=0.229, pruned_loss=0.04559, over 972880.39 frames.], batch size: 29, lr: 5.10e-04 2022-05-04 18:35:19,310 INFO [train.py:715] (0/8) Epoch 3, batch 33850, loss[loss=0.1716, simple_loss=0.2481, pruned_loss=0.04759, over 4815.00 frames.], tot_loss[loss=0.1601, simple_loss=0.229, pruned_loss=0.04562, over 972593.98 frames.], batch size: 25, lr: 5.10e-04 2022-05-04 18:35:58,138 INFO [train.py:715] (0/8) Epoch 3, batch 33900, loss[loss=0.156, simple_loss=0.2289, pruned_loss=0.04156, over 4858.00 frames.], tot_loss[loss=0.1612, simple_loss=0.2304, pruned_loss=0.04597, over 971884.65 frames.], batch size: 15, lr: 5.10e-04 2022-05-04 18:36:38,301 INFO [train.py:715] (0/8) Epoch 3, batch 33950, loss[loss=0.1571, simple_loss=0.2195, pruned_loss=0.04736, over 4923.00 frames.], tot_loss[loss=0.1617, simple_loss=0.2311, pruned_loss=0.04621, over 972121.49 frames.], batch size: 35, lr: 5.10e-04 2022-05-04 18:37:17,239 INFO [train.py:715] (0/8) Epoch 3, batch 34000, loss[loss=0.1503, simple_loss=0.2082, pruned_loss=0.04621, over 4960.00 frames.], tot_loss[loss=0.1612, simple_loss=0.2306, pruned_loss=0.04594, over 972861.22 frames.], batch size: 14, lr: 5.10e-04 2022-05-04 18:37:55,982 INFO [train.py:715] (0/8) Epoch 3, batch 34050, loss[loss=0.1943, simple_loss=0.2476, pruned_loss=0.07056, over 4759.00 frames.], tot_loss[loss=0.1611, simple_loss=0.2302, pruned_loss=0.04599, over 972509.23 frames.], batch size: 19, lr: 5.10e-04 2022-05-04 18:38:35,313 INFO [train.py:715] (0/8) Epoch 3, batch 34100, loss[loss=0.1795, simple_loss=0.2424, pruned_loss=0.05829, over 4939.00 frames.], tot_loss[loss=0.161, simple_loss=0.2302, pruned_loss=0.04593, over 972180.75 frames.], batch size: 35, lr: 5.10e-04 2022-05-04 18:39:15,276 INFO [train.py:715] (0/8) Epoch 3, batch 34150, loss[loss=0.1481, simple_loss=0.2133, pruned_loss=0.04141, over 4739.00 frames.], tot_loss[loss=0.1604, simple_loss=0.2294, pruned_loss=0.04573, over 971413.82 frames.], batch size: 16, lr: 5.10e-04 2022-05-04 18:39:53,557 INFO [train.py:715] (0/8) Epoch 3, batch 34200, loss[loss=0.1753, simple_loss=0.237, pruned_loss=0.05675, over 4933.00 frames.], tot_loss[loss=0.1606, simple_loss=0.2292, pruned_loss=0.04597, over 971813.44 frames.], batch size: 23, lr: 5.09e-04 2022-05-04 18:40:33,002 INFO [train.py:715] (0/8) Epoch 3, batch 34250, loss[loss=0.2059, simple_loss=0.2649, pruned_loss=0.07345, over 4906.00 frames.], tot_loss[loss=0.1601, simple_loss=0.2288, pruned_loss=0.04572, over 972407.97 frames.], batch size: 18, lr: 5.09e-04 2022-05-04 18:41:13,063 INFO [train.py:715] (0/8) Epoch 3, batch 34300, loss[loss=0.2289, simple_loss=0.2805, pruned_loss=0.08866, over 4874.00 frames.], tot_loss[loss=0.16, simple_loss=0.2286, pruned_loss=0.04572, over 971059.16 frames.], batch size: 32, lr: 5.09e-04 2022-05-04 18:41:52,486 INFO [train.py:715] (0/8) Epoch 3, batch 
34350, loss[loss=0.1538, simple_loss=0.2268, pruned_loss=0.04037, over 4957.00 frames.], tot_loss[loss=0.1597, simple_loss=0.2283, pruned_loss=0.04552, over 971284.51 frames.], batch size: 29, lr: 5.09e-04 2022-05-04 18:42:31,603 INFO [train.py:715] (0/8) Epoch 3, batch 34400, loss[loss=0.1823, simple_loss=0.261, pruned_loss=0.05179, over 4867.00 frames.], tot_loss[loss=0.1609, simple_loss=0.2293, pruned_loss=0.04624, over 971423.23 frames.], batch size: 20, lr: 5.09e-04 2022-05-04 18:43:11,183 INFO [train.py:715] (0/8) Epoch 3, batch 34450, loss[loss=0.1426, simple_loss=0.2169, pruned_loss=0.03417, over 4764.00 frames.], tot_loss[loss=0.1604, simple_loss=0.229, pruned_loss=0.04585, over 971495.10 frames.], batch size: 19, lr: 5.09e-04 2022-05-04 18:43:51,335 INFO [train.py:715] (0/8) Epoch 3, batch 34500, loss[loss=0.1817, simple_loss=0.2351, pruned_loss=0.06413, over 4754.00 frames.], tot_loss[loss=0.162, simple_loss=0.2303, pruned_loss=0.04683, over 971611.07 frames.], batch size: 14, lr: 5.09e-04 2022-05-04 18:44:29,765 INFO [train.py:715] (0/8) Epoch 3, batch 34550, loss[loss=0.1457, simple_loss=0.2169, pruned_loss=0.03723, over 4927.00 frames.], tot_loss[loss=0.1613, simple_loss=0.2302, pruned_loss=0.0462, over 971315.12 frames.], batch size: 29, lr: 5.09e-04 2022-05-04 18:45:08,811 INFO [train.py:715] (0/8) Epoch 3, batch 34600, loss[loss=0.1797, simple_loss=0.2459, pruned_loss=0.05672, over 4924.00 frames.], tot_loss[loss=0.1626, simple_loss=0.2315, pruned_loss=0.04692, over 972128.81 frames.], batch size: 18, lr: 5.09e-04 2022-05-04 18:45:49,190 INFO [train.py:715] (0/8) Epoch 3, batch 34650, loss[loss=0.1481, simple_loss=0.2058, pruned_loss=0.04524, over 4989.00 frames.], tot_loss[loss=0.1626, simple_loss=0.2309, pruned_loss=0.04711, over 972233.63 frames.], batch size: 14, lr: 5.09e-04 2022-05-04 18:46:28,772 INFO [train.py:715] (0/8) Epoch 3, batch 34700, loss[loss=0.1375, simple_loss=0.2023, pruned_loss=0.03637, over 4909.00 frames.], tot_loss[loss=0.1622, simple_loss=0.2305, pruned_loss=0.04694, over 971730.27 frames.], batch size: 18, lr: 5.09e-04 2022-05-04 18:47:07,055 INFO [train.py:715] (0/8) Epoch 3, batch 34750, loss[loss=0.141, simple_loss=0.2086, pruned_loss=0.03668, over 4815.00 frames.], tot_loss[loss=0.1615, simple_loss=0.2301, pruned_loss=0.04645, over 971831.51 frames.], batch size: 13, lr: 5.08e-04 2022-05-04 18:47:44,744 INFO [train.py:715] (0/8) Epoch 3, batch 34800, loss[loss=0.1306, simple_loss=0.2096, pruned_loss=0.02579, over 4919.00 frames.], tot_loss[loss=0.1595, simple_loss=0.2286, pruned_loss=0.04522, over 971356.07 frames.], batch size: 23, lr: 5.08e-04 2022-05-04 18:47:53,722 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/epoch-3.pt 2022-05-04 18:48:35,135 INFO [train.py:715] (0/8) Epoch 4, batch 0, loss[loss=0.1635, simple_loss=0.2403, pruned_loss=0.04334, over 4819.00 frames.], tot_loss[loss=0.1635, simple_loss=0.2403, pruned_loss=0.04334, over 4819.00 frames.], batch size: 25, lr: 4.78e-04 2022-05-04 18:49:16,500 INFO [train.py:715] (0/8) Epoch 4, batch 50, loss[loss=0.2199, simple_loss=0.2617, pruned_loss=0.08903, over 4844.00 frames.], tot_loss[loss=0.1624, simple_loss=0.2318, pruned_loss=0.04649, over 218997.95 frames.], batch size: 30, lr: 4.78e-04 2022-05-04 18:49:57,174 INFO [train.py:715] (0/8) Epoch 4, batch 100, loss[loss=0.1633, simple_loss=0.2182, pruned_loss=0.05422, over 4871.00 frames.], tot_loss[loss=0.1578, simple_loss=0.2268, pruned_loss=0.04443, over 385613.47 frames.], batch size: 16, lr: 
4.78e-04 2022-05-04 18:50:37,979 INFO [train.py:715] (0/8) Epoch 4, batch 150, loss[loss=0.2042, simple_loss=0.2745, pruned_loss=0.06693, over 4890.00 frames.], tot_loss[loss=0.1607, simple_loss=0.2284, pruned_loss=0.04644, over 515489.35 frames.], batch size: 39, lr: 4.78e-04 2022-05-04 18:51:19,045 INFO [train.py:715] (0/8) Epoch 4, batch 200, loss[loss=0.1527, simple_loss=0.222, pruned_loss=0.04172, over 4800.00 frames.], tot_loss[loss=0.1588, simple_loss=0.2269, pruned_loss=0.04536, over 617995.02 frames.], batch size: 24, lr: 4.78e-04 2022-05-04 18:52:00,237 INFO [train.py:715] (0/8) Epoch 4, batch 250, loss[loss=0.1421, simple_loss=0.2119, pruned_loss=0.03612, over 4823.00 frames.], tot_loss[loss=0.16, simple_loss=0.2282, pruned_loss=0.04589, over 696666.69 frames.], batch size: 26, lr: 4.77e-04 2022-05-04 18:52:41,172 INFO [train.py:715] (0/8) Epoch 4, batch 300, loss[loss=0.133, simple_loss=0.2127, pruned_loss=0.02663, over 4777.00 frames.], tot_loss[loss=0.1604, simple_loss=0.2288, pruned_loss=0.04605, over 757488.83 frames.], batch size: 14, lr: 4.77e-04 2022-05-04 18:53:22,424 INFO [train.py:715] (0/8) Epoch 4, batch 350, loss[loss=0.1594, simple_loss=0.2191, pruned_loss=0.04988, over 4963.00 frames.], tot_loss[loss=0.1598, simple_loss=0.2286, pruned_loss=0.04556, over 805080.28 frames.], batch size: 35, lr: 4.77e-04 2022-05-04 18:54:04,552 INFO [train.py:715] (0/8) Epoch 4, batch 400, loss[loss=0.1533, simple_loss=0.2208, pruned_loss=0.0429, over 4972.00 frames.], tot_loss[loss=0.16, simple_loss=0.229, pruned_loss=0.04544, over 842260.98 frames.], batch size: 14, lr: 4.77e-04 2022-05-04 18:54:45,176 INFO [train.py:715] (0/8) Epoch 4, batch 450, loss[loss=0.1481, simple_loss=0.226, pruned_loss=0.03507, over 4800.00 frames.], tot_loss[loss=0.1599, simple_loss=0.2292, pruned_loss=0.04534, over 871132.74 frames.], batch size: 21, lr: 4.77e-04 2022-05-04 18:55:26,263 INFO [train.py:715] (0/8) Epoch 4, batch 500, loss[loss=0.166, simple_loss=0.2316, pruned_loss=0.05024, over 4943.00 frames.], tot_loss[loss=0.1601, simple_loss=0.2291, pruned_loss=0.04554, over 892896.90 frames.], batch size: 39, lr: 4.77e-04 2022-05-04 18:56:07,504 INFO [train.py:715] (0/8) Epoch 4, batch 550, loss[loss=0.1496, simple_loss=0.2213, pruned_loss=0.03892, over 4875.00 frames.], tot_loss[loss=0.1588, simple_loss=0.2277, pruned_loss=0.04491, over 910550.69 frames.], batch size: 30, lr: 4.77e-04 2022-05-04 18:56:48,413 INFO [train.py:715] (0/8) Epoch 4, batch 600, loss[loss=0.1633, simple_loss=0.2297, pruned_loss=0.04848, over 4904.00 frames.], tot_loss[loss=0.159, simple_loss=0.2278, pruned_loss=0.04509, over 924409.13 frames.], batch size: 29, lr: 4.77e-04 2022-05-04 18:57:28,915 INFO [train.py:715] (0/8) Epoch 4, batch 650, loss[loss=0.1607, simple_loss=0.2336, pruned_loss=0.04391, over 4757.00 frames.], tot_loss[loss=0.1598, simple_loss=0.2285, pruned_loss=0.04555, over 935252.43 frames.], batch size: 16, lr: 4.77e-04 2022-05-04 18:58:09,995 INFO [train.py:715] (0/8) Epoch 4, batch 700, loss[loss=0.2048, simple_loss=0.2618, pruned_loss=0.07385, over 4931.00 frames.], tot_loss[loss=0.1597, simple_loss=0.2283, pruned_loss=0.04559, over 944200.14 frames.], batch size: 39, lr: 4.77e-04 2022-05-04 18:58:51,939 INFO [train.py:715] (0/8) Epoch 4, batch 750, loss[loss=0.149, simple_loss=0.2238, pruned_loss=0.03706, over 4700.00 frames.], tot_loss[loss=0.1599, simple_loss=0.2289, pruned_loss=0.0454, over 949079.05 frames.], batch size: 15, lr: 4.77e-04 2022-05-04 18:59:33,004 INFO [train.py:715] (0/8) Epoch 4, 
batch 800, loss[loss=0.1851, simple_loss=0.2524, pruned_loss=0.05891, over 4859.00 frames.], tot_loss[loss=0.1605, simple_loss=0.2295, pruned_loss=0.04572, over 955260.26 frames.], batch size: 20, lr: 4.77e-04 2022-05-04 19:00:13,429 INFO [train.py:715] (0/8) Epoch 4, batch 850, loss[loss=0.1843, simple_loss=0.2628, pruned_loss=0.0529, over 4931.00 frames.], tot_loss[loss=0.1604, simple_loss=0.2295, pruned_loss=0.04566, over 959558.81 frames.], batch size: 29, lr: 4.76e-04 2022-05-04 19:00:54,494 INFO [train.py:715] (0/8) Epoch 4, batch 900, loss[loss=0.1253, simple_loss=0.2042, pruned_loss=0.02322, over 4827.00 frames.], tot_loss[loss=0.1584, simple_loss=0.2275, pruned_loss=0.0447, over 962745.23 frames.], batch size: 26, lr: 4.76e-04 2022-05-04 19:01:35,343 INFO [train.py:715] (0/8) Epoch 4, batch 950, loss[loss=0.1694, simple_loss=0.2454, pruned_loss=0.04668, over 4939.00 frames.], tot_loss[loss=0.1586, simple_loss=0.2275, pruned_loss=0.04488, over 964967.99 frames.], batch size: 21, lr: 4.76e-04 2022-05-04 19:02:16,226 INFO [train.py:715] (0/8) Epoch 4, batch 1000, loss[loss=0.1768, simple_loss=0.2348, pruned_loss=0.05942, over 4989.00 frames.], tot_loss[loss=0.1582, simple_loss=0.2268, pruned_loss=0.0448, over 967487.36 frames.], batch size: 26, lr: 4.76e-04 2022-05-04 19:02:56,934 INFO [train.py:715] (0/8) Epoch 4, batch 1050, loss[loss=0.1869, simple_loss=0.2505, pruned_loss=0.06168, over 4959.00 frames.], tot_loss[loss=0.1587, simple_loss=0.2276, pruned_loss=0.04489, over 969144.75 frames.], batch size: 35, lr: 4.76e-04 2022-05-04 19:03:38,127 INFO [train.py:715] (0/8) Epoch 4, batch 1100, loss[loss=0.1344, simple_loss=0.1978, pruned_loss=0.03549, over 4829.00 frames.], tot_loss[loss=0.159, simple_loss=0.2278, pruned_loss=0.04506, over 969234.66 frames.], batch size: 15, lr: 4.76e-04 2022-05-04 19:04:18,536 INFO [train.py:715] (0/8) Epoch 4, batch 1150, loss[loss=0.1352, simple_loss=0.2072, pruned_loss=0.03156, over 4963.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2269, pruned_loss=0.04447, over 969818.11 frames.], batch size: 24, lr: 4.76e-04 2022-05-04 19:04:58,023 INFO [train.py:715] (0/8) Epoch 4, batch 1200, loss[loss=0.175, simple_loss=0.2514, pruned_loss=0.04929, over 4963.00 frames.], tot_loss[loss=0.1584, simple_loss=0.2273, pruned_loss=0.04472, over 970492.11 frames.], batch size: 40, lr: 4.76e-04 2022-05-04 19:05:38,583 INFO [train.py:715] (0/8) Epoch 4, batch 1250, loss[loss=0.1991, simple_loss=0.261, pruned_loss=0.06865, over 4867.00 frames.], tot_loss[loss=0.1586, simple_loss=0.2276, pruned_loss=0.04479, over 970721.87 frames.], batch size: 32, lr: 4.76e-04 2022-05-04 19:06:19,650 INFO [train.py:715] (0/8) Epoch 4, batch 1300, loss[loss=0.1936, simple_loss=0.2655, pruned_loss=0.06087, over 4821.00 frames.], tot_loss[loss=0.1591, simple_loss=0.2279, pruned_loss=0.04517, over 970161.37 frames.], batch size: 15, lr: 4.76e-04 2022-05-04 19:06:59,656 INFO [train.py:715] (0/8) Epoch 4, batch 1350, loss[loss=0.1775, simple_loss=0.2442, pruned_loss=0.05541, over 4912.00 frames.], tot_loss[loss=0.16, simple_loss=0.2286, pruned_loss=0.04572, over 970249.60 frames.], batch size: 18, lr: 4.76e-04 2022-05-04 19:07:40,377 INFO [train.py:715] (0/8) Epoch 4, batch 1400, loss[loss=0.1518, simple_loss=0.2301, pruned_loss=0.0367, over 4814.00 frames.], tot_loss[loss=0.1605, simple_loss=0.2294, pruned_loss=0.0458, over 970692.51 frames.], batch size: 13, lr: 4.76e-04 2022-05-04 19:08:21,347 INFO [train.py:715] (0/8) Epoch 4, batch 1450, loss[loss=0.1637, simple_loss=0.2354, 
pruned_loss=0.04599, over 4796.00 frames.], tot_loss[loss=0.1604, simple_loss=0.2294, pruned_loss=0.04574, over 970721.39 frames.], batch size: 17, lr: 4.75e-04 2022-05-04 19:09:02,420 INFO [train.py:715] (0/8) Epoch 4, batch 1500, loss[loss=0.1775, simple_loss=0.2552, pruned_loss=0.04993, over 4799.00 frames.], tot_loss[loss=0.1605, simple_loss=0.2295, pruned_loss=0.04572, over 971596.23 frames.], batch size: 21, lr: 4.75e-04 2022-05-04 19:09:42,042 INFO [train.py:715] (0/8) Epoch 4, batch 1550, loss[loss=0.1909, simple_loss=0.2563, pruned_loss=0.06279, over 4792.00 frames.], tot_loss[loss=0.16, simple_loss=0.2292, pruned_loss=0.0454, over 971239.92 frames.], batch size: 18, lr: 4.75e-04 2022-05-04 19:10:23,010 INFO [train.py:715] (0/8) Epoch 4, batch 1600, loss[loss=0.1577, simple_loss=0.225, pruned_loss=0.04516, over 4762.00 frames.], tot_loss[loss=0.1593, simple_loss=0.2282, pruned_loss=0.04516, over 971387.73 frames.], batch size: 18, lr: 4.75e-04 2022-05-04 19:11:04,738 INFO [train.py:715] (0/8) Epoch 4, batch 1650, loss[loss=0.1476, simple_loss=0.2142, pruned_loss=0.04053, over 4852.00 frames.], tot_loss[loss=0.1591, simple_loss=0.228, pruned_loss=0.04512, over 971463.44 frames.], batch size: 13, lr: 4.75e-04 2022-05-04 19:11:45,096 INFO [train.py:715] (0/8) Epoch 4, batch 1700, loss[loss=0.1749, simple_loss=0.2365, pruned_loss=0.05662, over 4835.00 frames.], tot_loss[loss=0.1597, simple_loss=0.2286, pruned_loss=0.04537, over 971110.17 frames.], batch size: 30, lr: 4.75e-04 2022-05-04 19:12:25,108 INFO [train.py:715] (0/8) Epoch 4, batch 1750, loss[loss=0.1638, simple_loss=0.2444, pruned_loss=0.04156, over 4691.00 frames.], tot_loss[loss=0.1592, simple_loss=0.2282, pruned_loss=0.04513, over 970924.77 frames.], batch size: 15, lr: 4.75e-04 2022-05-04 19:13:06,309 INFO [train.py:715] (0/8) Epoch 4, batch 1800, loss[loss=0.1384, simple_loss=0.2081, pruned_loss=0.03434, over 4776.00 frames.], tot_loss[loss=0.1593, simple_loss=0.2283, pruned_loss=0.04511, over 971572.53 frames.], batch size: 18, lr: 4.75e-04 2022-05-04 19:13:47,663 INFO [train.py:715] (0/8) Epoch 4, batch 1850, loss[loss=0.1586, simple_loss=0.2323, pruned_loss=0.04248, over 4743.00 frames.], tot_loss[loss=0.159, simple_loss=0.2282, pruned_loss=0.04491, over 971983.61 frames.], batch size: 16, lr: 4.75e-04 2022-05-04 19:14:27,700 INFO [train.py:715] (0/8) Epoch 4, batch 1900, loss[loss=0.1564, simple_loss=0.2182, pruned_loss=0.04725, over 4883.00 frames.], tot_loss[loss=0.1584, simple_loss=0.2276, pruned_loss=0.04462, over 971994.54 frames.], batch size: 22, lr: 4.75e-04 2022-05-04 19:15:08,460 INFO [train.py:715] (0/8) Epoch 4, batch 1950, loss[loss=0.1413, simple_loss=0.2065, pruned_loss=0.03803, over 4778.00 frames.], tot_loss[loss=0.1578, simple_loss=0.2268, pruned_loss=0.04438, over 973079.78 frames.], batch size: 12, lr: 4.75e-04 2022-05-04 19:15:48,963 INFO [train.py:715] (0/8) Epoch 4, batch 2000, loss[loss=0.1463, simple_loss=0.2212, pruned_loss=0.03564, over 4930.00 frames.], tot_loss[loss=0.1588, simple_loss=0.2278, pruned_loss=0.04492, over 973312.51 frames.], batch size: 29, lr: 4.74e-04 2022-05-04 19:16:28,966 INFO [train.py:715] (0/8) Epoch 4, batch 2050, loss[loss=0.1761, simple_loss=0.2497, pruned_loss=0.0513, over 4793.00 frames.], tot_loss[loss=0.1589, simple_loss=0.2277, pruned_loss=0.04507, over 972972.77 frames.], batch size: 18, lr: 4.74e-04 2022-05-04 19:17:08,514 INFO [train.py:715] (0/8) Epoch 4, batch 2100, loss[loss=0.1801, simple_loss=0.2588, pruned_loss=0.05068, over 4813.00 frames.], 
tot_loss[loss=0.1595, simple_loss=0.2281, pruned_loss=0.04544, over 973443.62 frames.], batch size: 25, lr: 4.74e-04 2022-05-04 19:17:48,265 INFO [train.py:715] (0/8) Epoch 4, batch 2150, loss[loss=0.1766, simple_loss=0.249, pruned_loss=0.05211, over 4807.00 frames.], tot_loss[loss=0.1593, simple_loss=0.228, pruned_loss=0.04535, over 973203.39 frames.], batch size: 21, lr: 4.74e-04 2022-05-04 19:18:29,062 INFO [train.py:715] (0/8) Epoch 4, batch 2200, loss[loss=0.1716, simple_loss=0.2373, pruned_loss=0.05291, over 4928.00 frames.], tot_loss[loss=0.1597, simple_loss=0.2282, pruned_loss=0.04562, over 973537.70 frames.], batch size: 18, lr: 4.74e-04 2022-05-04 19:19:09,440 INFO [train.py:715] (0/8) Epoch 4, batch 2250, loss[loss=0.1772, simple_loss=0.2448, pruned_loss=0.05478, over 4845.00 frames.], tot_loss[loss=0.1595, simple_loss=0.2286, pruned_loss=0.04524, over 972921.09 frames.], batch size: 13, lr: 4.74e-04 2022-05-04 19:19:48,815 INFO [train.py:715] (0/8) Epoch 4, batch 2300, loss[loss=0.1474, simple_loss=0.2193, pruned_loss=0.03777, over 4735.00 frames.], tot_loss[loss=0.1584, simple_loss=0.2274, pruned_loss=0.04468, over 973465.61 frames.], batch size: 16, lr: 4.74e-04 2022-05-04 19:20:28,741 INFO [train.py:715] (0/8) Epoch 4, batch 2350, loss[loss=0.1799, simple_loss=0.2528, pruned_loss=0.05354, over 4989.00 frames.], tot_loss[loss=0.1583, simple_loss=0.2271, pruned_loss=0.04472, over 972467.84 frames.], batch size: 25, lr: 4.74e-04 2022-05-04 19:21:08,833 INFO [train.py:715] (0/8) Epoch 4, batch 2400, loss[loss=0.1387, simple_loss=0.2184, pruned_loss=0.02949, over 4873.00 frames.], tot_loss[loss=0.1581, simple_loss=0.2267, pruned_loss=0.04479, over 971346.70 frames.], batch size: 32, lr: 4.74e-04 2022-05-04 19:21:48,323 INFO [train.py:715] (0/8) Epoch 4, batch 2450, loss[loss=0.1315, simple_loss=0.2044, pruned_loss=0.0293, over 4782.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2266, pruned_loss=0.04459, over 970742.61 frames.], batch size: 18, lr: 4.74e-04 2022-05-04 19:22:28,660 INFO [train.py:715] (0/8) Epoch 4, batch 2500, loss[loss=0.1898, simple_loss=0.2466, pruned_loss=0.06648, over 4791.00 frames.], tot_loss[loss=0.1574, simple_loss=0.2263, pruned_loss=0.04426, over 969452.36 frames.], batch size: 17, lr: 4.74e-04 2022-05-04 19:23:09,573 INFO [train.py:715] (0/8) Epoch 4, batch 2550, loss[loss=0.1839, simple_loss=0.2587, pruned_loss=0.05455, over 4808.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2271, pruned_loss=0.04433, over 970465.25 frames.], batch size: 24, lr: 4.74e-04 2022-05-04 19:23:49,883 INFO [train.py:715] (0/8) Epoch 4, batch 2600, loss[loss=0.136, simple_loss=0.2073, pruned_loss=0.03232, over 4814.00 frames.], tot_loss[loss=0.1588, simple_loss=0.2283, pruned_loss=0.04469, over 970573.22 frames.], batch size: 13, lr: 4.73e-04 2022-05-04 19:24:29,134 INFO [train.py:715] (0/8) Epoch 4, batch 2650, loss[loss=0.1854, simple_loss=0.2581, pruned_loss=0.05633, over 4835.00 frames.], tot_loss[loss=0.1596, simple_loss=0.2288, pruned_loss=0.04515, over 970569.12 frames.], batch size: 26, lr: 4.73e-04 2022-05-04 19:25:09,495 INFO [train.py:715] (0/8) Epoch 4, batch 2700, loss[loss=0.1586, simple_loss=0.2308, pruned_loss=0.04322, over 4872.00 frames.], tot_loss[loss=0.1586, simple_loss=0.2281, pruned_loss=0.04452, over 970880.21 frames.], batch size: 16, lr: 4.73e-04 2022-05-04 19:25:49,763 INFO [train.py:715] (0/8) Epoch 4, batch 2750, loss[loss=0.1414, simple_loss=0.22, pruned_loss=0.03147, over 4846.00 frames.], tot_loss[loss=0.1585, simple_loss=0.2282, 
pruned_loss=0.04438, over 971727.34 frames.], batch size: 13, lr: 4.73e-04 2022-05-04 19:26:29,540 INFO [train.py:715] (0/8) Epoch 4, batch 2800, loss[loss=0.1601, simple_loss=0.2318, pruned_loss=0.04419, over 4861.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2278, pruned_loss=0.04398, over 971046.87 frames.], batch size: 22, lr: 4.73e-04 2022-05-04 19:27:08,929 INFO [train.py:715] (0/8) Epoch 4, batch 2850, loss[loss=0.1553, simple_loss=0.2228, pruned_loss=0.04388, over 4869.00 frames.], tot_loss[loss=0.1577, simple_loss=0.2274, pruned_loss=0.04397, over 971883.54 frames.], batch size: 20, lr: 4.73e-04 2022-05-04 19:27:49,244 INFO [train.py:715] (0/8) Epoch 4, batch 2900, loss[loss=0.1932, simple_loss=0.2517, pruned_loss=0.06731, over 4971.00 frames.], tot_loss[loss=0.1581, simple_loss=0.2278, pruned_loss=0.04425, over 973177.68 frames.], batch size: 15, lr: 4.73e-04 2022-05-04 19:28:29,132 INFO [train.py:715] (0/8) Epoch 4, batch 2950, loss[loss=0.1532, simple_loss=0.2214, pruned_loss=0.04252, over 4811.00 frames.], tot_loss[loss=0.1584, simple_loss=0.2274, pruned_loss=0.04467, over 972791.89 frames.], batch size: 21, lr: 4.73e-04 2022-05-04 19:29:08,448 INFO [train.py:715] (0/8) Epoch 4, batch 3000, loss[loss=0.1479, simple_loss=0.2276, pruned_loss=0.03406, over 4826.00 frames.], tot_loss[loss=0.158, simple_loss=0.2269, pruned_loss=0.04456, over 971910.58 frames.], batch size: 26, lr: 4.73e-04 2022-05-04 19:29:08,449 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 19:29:17,945 INFO [train.py:742] (0/8) Epoch 4, validation: loss=0.1127, simple_loss=0.1984, pruned_loss=0.01346, over 914524.00 frames. 2022-05-04 19:29:57,090 INFO [train.py:715] (0/8) Epoch 4, batch 3050, loss[loss=0.1724, simple_loss=0.2383, pruned_loss=0.05326, over 4794.00 frames.], tot_loss[loss=0.1572, simple_loss=0.2262, pruned_loss=0.04411, over 972682.22 frames.], batch size: 17, lr: 4.73e-04 2022-05-04 19:30:37,134 INFO [train.py:715] (0/8) Epoch 4, batch 3100, loss[loss=0.1579, simple_loss=0.2331, pruned_loss=0.04134, over 4923.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2268, pruned_loss=0.04449, over 972495.00 frames.], batch size: 23, lr: 4.73e-04 2022-05-04 19:31:17,411 INFO [train.py:715] (0/8) Epoch 4, batch 3150, loss[loss=0.1585, simple_loss=0.2185, pruned_loss=0.04924, over 4809.00 frames.], tot_loss[loss=0.158, simple_loss=0.227, pruned_loss=0.04454, over 972775.51 frames.], batch size: 13, lr: 4.73e-04 2022-05-04 19:31:57,023 INFO [train.py:715] (0/8) Epoch 4, batch 3200, loss[loss=0.124, simple_loss=0.2048, pruned_loss=0.02162, over 4853.00 frames.], tot_loss[loss=0.1587, simple_loss=0.2277, pruned_loss=0.04485, over 972356.38 frames.], batch size: 13, lr: 4.72e-04 2022-05-04 19:32:36,970 INFO [train.py:715] (0/8) Epoch 4, batch 3250, loss[loss=0.1829, simple_loss=0.256, pruned_loss=0.05487, over 4870.00 frames.], tot_loss[loss=0.1591, simple_loss=0.2279, pruned_loss=0.04519, over 972561.33 frames.], batch size: 22, lr: 4.72e-04 2022-05-04 19:33:16,912 INFO [train.py:715] (0/8) Epoch 4, batch 3300, loss[loss=0.1477, simple_loss=0.2189, pruned_loss=0.03828, over 4968.00 frames.], tot_loss[loss=0.1582, simple_loss=0.227, pruned_loss=0.04472, over 973246.48 frames.], batch size: 31, lr: 4.72e-04 2022-05-04 19:33:56,289 INFO [train.py:715] (0/8) Epoch 4, batch 3350, loss[loss=0.2056, simple_loss=0.2588, pruned_loss=0.07617, over 4749.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2269, pruned_loss=0.04442, over 973515.80 frames.], batch size: 19, lr: 4.72e-04 2022-05-04 
19:34:35,328 INFO [train.py:715] (0/8) Epoch 4, batch 3400, loss[loss=0.1797, simple_loss=0.235, pruned_loss=0.06223, over 4941.00 frames.], tot_loss[loss=0.1585, simple_loss=0.2276, pruned_loss=0.0447, over 973130.97 frames.], batch size: 23, lr: 4.72e-04 2022-05-04 19:35:15,776 INFO [train.py:715] (0/8) Epoch 4, batch 3450, loss[loss=0.1696, simple_loss=0.2428, pruned_loss=0.04817, over 4945.00 frames.], tot_loss[loss=0.1576, simple_loss=0.2271, pruned_loss=0.04409, over 973257.77 frames.], batch size: 21, lr: 4.72e-04 2022-05-04 19:35:55,189 INFO [train.py:715] (0/8) Epoch 4, batch 3500, loss[loss=0.1412, simple_loss=0.2127, pruned_loss=0.0348, over 4826.00 frames.], tot_loss[loss=0.1572, simple_loss=0.2267, pruned_loss=0.04387, over 972385.08 frames.], batch size: 26, lr: 4.72e-04 2022-05-04 19:36:34,855 INFO [train.py:715] (0/8) Epoch 4, batch 3550, loss[loss=0.1788, simple_loss=0.2574, pruned_loss=0.05009, over 4809.00 frames.], tot_loss[loss=0.1585, simple_loss=0.2277, pruned_loss=0.04471, over 973463.50 frames.], batch size: 24, lr: 4.72e-04 2022-05-04 19:37:14,697 INFO [train.py:715] (0/8) Epoch 4, batch 3600, loss[loss=0.1462, simple_loss=0.2159, pruned_loss=0.03824, over 4837.00 frames.], tot_loss[loss=0.1574, simple_loss=0.2265, pruned_loss=0.04409, over 972709.43 frames.], batch size: 15, lr: 4.72e-04 2022-05-04 19:37:54,697 INFO [train.py:715] (0/8) Epoch 4, batch 3650, loss[loss=0.1466, simple_loss=0.2147, pruned_loss=0.03924, over 4933.00 frames.], tot_loss[loss=0.1576, simple_loss=0.2269, pruned_loss=0.04415, over 972928.18 frames.], batch size: 23, lr: 4.72e-04 2022-05-04 19:38:34,068 INFO [train.py:715] (0/8) Epoch 4, batch 3700, loss[loss=0.2019, simple_loss=0.2652, pruned_loss=0.0693, over 4961.00 frames.], tot_loss[loss=0.158, simple_loss=0.2272, pruned_loss=0.04435, over 972425.68 frames.], batch size: 35, lr: 4.72e-04 2022-05-04 19:39:13,349 INFO [train.py:715] (0/8) Epoch 4, batch 3750, loss[loss=0.1943, simple_loss=0.2751, pruned_loss=0.0568, over 4885.00 frames.], tot_loss[loss=0.1577, simple_loss=0.2272, pruned_loss=0.04409, over 972305.00 frames.], batch size: 39, lr: 4.72e-04 2022-05-04 19:39:53,213 INFO [train.py:715] (0/8) Epoch 4, batch 3800, loss[loss=0.1885, simple_loss=0.2558, pruned_loss=0.06062, over 4883.00 frames.], tot_loss[loss=0.1589, simple_loss=0.2281, pruned_loss=0.04486, over 971948.59 frames.], batch size: 22, lr: 4.72e-04 2022-05-04 19:40:32,933 INFO [train.py:715] (0/8) Epoch 4, batch 3850, loss[loss=0.1298, simple_loss=0.1993, pruned_loss=0.03015, over 4696.00 frames.], tot_loss[loss=0.1587, simple_loss=0.2277, pruned_loss=0.04488, over 971582.05 frames.], batch size: 15, lr: 4.71e-04 2022-05-04 19:41:13,118 INFO [train.py:715] (0/8) Epoch 4, batch 3900, loss[loss=0.1305, simple_loss=0.2025, pruned_loss=0.02922, over 4778.00 frames.], tot_loss[loss=0.1582, simple_loss=0.2271, pruned_loss=0.04461, over 970913.47 frames.], batch size: 17, lr: 4.71e-04 2022-05-04 19:41:53,256 INFO [train.py:715] (0/8) Epoch 4, batch 3950, loss[loss=0.1817, simple_loss=0.2505, pruned_loss=0.05642, over 4786.00 frames.], tot_loss[loss=0.159, simple_loss=0.228, pruned_loss=0.04501, over 971257.52 frames.], batch size: 18, lr: 4.71e-04 2022-05-04 19:42:33,627 INFO [train.py:715] (0/8) Epoch 4, batch 4000, loss[loss=0.1417, simple_loss=0.2152, pruned_loss=0.03409, over 4760.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2271, pruned_loss=0.04439, over 970992.15 frames.], batch size: 16, lr: 4.71e-04 2022-05-04 19:43:13,664 INFO [train.py:715] (0/8) Epoch 4, 
batch 4050, loss[loss=0.1413, simple_loss=0.2048, pruned_loss=0.03885, over 4807.00 frames.], tot_loss[loss=0.1574, simple_loss=0.2265, pruned_loss=0.04418, over 971913.59 frames.], batch size: 12, lr: 4.71e-04 2022-05-04 19:43:53,236 INFO [train.py:715] (0/8) Epoch 4, batch 4100, loss[loss=0.155, simple_loss=0.2345, pruned_loss=0.03773, over 4753.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2271, pruned_loss=0.04437, over 970915.39 frames.], batch size: 19, lr: 4.71e-04 2022-05-04 19:44:33,947 INFO [train.py:715] (0/8) Epoch 4, batch 4150, loss[loss=0.1572, simple_loss=0.2192, pruned_loss=0.04761, over 4928.00 frames.], tot_loss[loss=0.1589, simple_loss=0.2278, pruned_loss=0.04499, over 971371.73 frames.], batch size: 23, lr: 4.71e-04 2022-05-04 19:45:13,433 INFO [train.py:715] (0/8) Epoch 4, batch 4200, loss[loss=0.1178, simple_loss=0.1895, pruned_loss=0.0231, over 4928.00 frames.], tot_loss[loss=0.1586, simple_loss=0.2272, pruned_loss=0.04496, over 972151.27 frames.], batch size: 21, lr: 4.71e-04 2022-05-04 19:45:52,908 INFO [train.py:715] (0/8) Epoch 4, batch 4250, loss[loss=0.1507, simple_loss=0.2138, pruned_loss=0.04384, over 4840.00 frames.], tot_loss[loss=0.1576, simple_loss=0.2266, pruned_loss=0.04426, over 972302.65 frames.], batch size: 30, lr: 4.71e-04 2022-05-04 19:46:33,008 INFO [train.py:715] (0/8) Epoch 4, batch 4300, loss[loss=0.1666, simple_loss=0.2294, pruned_loss=0.05191, over 4991.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2267, pruned_loss=0.04456, over 972795.49 frames.], batch size: 16, lr: 4.71e-04 2022-05-04 19:47:13,033 INFO [train.py:715] (0/8) Epoch 4, batch 4350, loss[loss=0.1564, simple_loss=0.2221, pruned_loss=0.04538, over 4771.00 frames.], tot_loss[loss=0.1583, simple_loss=0.2267, pruned_loss=0.04497, over 973120.47 frames.], batch size: 18, lr: 4.71e-04 2022-05-04 19:47:52,116 INFO [train.py:715] (0/8) Epoch 4, batch 4400, loss[loss=0.1443, simple_loss=0.2221, pruned_loss=0.03322, over 4871.00 frames.], tot_loss[loss=0.1583, simple_loss=0.2272, pruned_loss=0.04471, over 971971.76 frames.], batch size: 16, lr: 4.71e-04 2022-05-04 19:48:31,828 INFO [train.py:715] (0/8) Epoch 4, batch 4450, loss[loss=0.1446, simple_loss=0.2295, pruned_loss=0.02986, over 4955.00 frames.], tot_loss[loss=0.1568, simple_loss=0.2256, pruned_loss=0.04397, over 971185.01 frames.], batch size: 21, lr: 4.70e-04 2022-05-04 19:49:12,001 INFO [train.py:715] (0/8) Epoch 4, batch 4500, loss[loss=0.1271, simple_loss=0.2093, pruned_loss=0.02243, over 4782.00 frames.], tot_loss[loss=0.1566, simple_loss=0.2257, pruned_loss=0.04371, over 971710.67 frames.], batch size: 18, lr: 4.70e-04 2022-05-04 19:49:51,270 INFO [train.py:715] (0/8) Epoch 4, batch 4550, loss[loss=0.1509, simple_loss=0.217, pruned_loss=0.04236, over 4799.00 frames.], tot_loss[loss=0.158, simple_loss=0.2266, pruned_loss=0.04465, over 972187.14 frames.], batch size: 13, lr: 4.70e-04 2022-05-04 19:50:30,674 INFO [train.py:715] (0/8) Epoch 4, batch 4600, loss[loss=0.1399, simple_loss=0.219, pruned_loss=0.03038, over 4810.00 frames.], tot_loss[loss=0.1582, simple_loss=0.2268, pruned_loss=0.04479, over 972636.85 frames.], batch size: 26, lr: 4.70e-04 2022-05-04 19:51:10,988 INFO [train.py:715] (0/8) Epoch 4, batch 4650, loss[loss=0.1543, simple_loss=0.2308, pruned_loss=0.03886, over 4796.00 frames.], tot_loss[loss=0.1577, simple_loss=0.2263, pruned_loss=0.04455, over 972011.84 frames.], batch size: 21, lr: 4.70e-04 2022-05-04 19:51:51,337 INFO [train.py:715] (0/8) Epoch 4, batch 4700, loss[loss=0.1575, simple_loss=0.2167, 
pruned_loss=0.04916, over 4945.00 frames.], tot_loss[loss=0.1584, simple_loss=0.2271, pruned_loss=0.04489, over 972218.60 frames.], batch size: 35, lr: 4.70e-04 2022-05-04 19:52:31,251 INFO [train.py:715] (0/8) Epoch 4, batch 4750, loss[loss=0.1526, simple_loss=0.2272, pruned_loss=0.03903, over 4816.00 frames.], tot_loss[loss=0.1587, simple_loss=0.2276, pruned_loss=0.04495, over 972390.26 frames.], batch size: 27, lr: 4.70e-04 2022-05-04 19:52:39,577 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-144000.pt 2022-05-04 19:53:13,035 INFO [train.py:715] (0/8) Epoch 4, batch 4800, loss[loss=0.1399, simple_loss=0.22, pruned_loss=0.02989, over 4975.00 frames.], tot_loss[loss=0.1595, simple_loss=0.2283, pruned_loss=0.04539, over 972763.12 frames.], batch size: 24, lr: 4.70e-04 2022-05-04 19:53:53,552 INFO [train.py:715] (0/8) Epoch 4, batch 4850, loss[loss=0.1656, simple_loss=0.235, pruned_loss=0.04809, over 4783.00 frames.], tot_loss[loss=0.1586, simple_loss=0.2276, pruned_loss=0.04479, over 972649.07 frames.], batch size: 17, lr: 4.70e-04 2022-05-04 19:54:32,955 INFO [train.py:715] (0/8) Epoch 4, batch 4900, loss[loss=0.158, simple_loss=0.2271, pruned_loss=0.04445, over 4977.00 frames.], tot_loss[loss=0.1591, simple_loss=0.228, pruned_loss=0.04507, over 972496.38 frames.], batch size: 14, lr: 4.70e-04 2022-05-04 19:55:12,348 INFO [train.py:715] (0/8) Epoch 4, batch 4950, loss[loss=0.172, simple_loss=0.2449, pruned_loss=0.04954, over 4842.00 frames.], tot_loss[loss=0.1589, simple_loss=0.2282, pruned_loss=0.04477, over 972679.72 frames.], batch size: 15, lr: 4.70e-04 2022-05-04 19:55:52,410 INFO [train.py:715] (0/8) Epoch 4, batch 5000, loss[loss=0.1702, simple_loss=0.2285, pruned_loss=0.05597, over 4916.00 frames.], tot_loss[loss=0.1589, simple_loss=0.2282, pruned_loss=0.04479, over 972765.81 frames.], batch size: 18, lr: 4.70e-04 2022-05-04 19:56:32,440 INFO [train.py:715] (0/8) Epoch 4, batch 5050, loss[loss=0.1649, simple_loss=0.2269, pruned_loss=0.05151, over 4914.00 frames.], tot_loss[loss=0.1581, simple_loss=0.2274, pruned_loss=0.04436, over 973225.86 frames.], batch size: 18, lr: 4.69e-04 2022-05-04 19:57:12,341 INFO [train.py:715] (0/8) Epoch 4, batch 5100, loss[loss=0.1524, simple_loss=0.2151, pruned_loss=0.04491, over 4727.00 frames.], tot_loss[loss=0.1585, simple_loss=0.2275, pruned_loss=0.04476, over 972617.28 frames.], batch size: 16, lr: 4.69e-04 2022-05-04 19:57:51,519 INFO [train.py:715] (0/8) Epoch 4, batch 5150, loss[loss=0.1808, simple_loss=0.2465, pruned_loss=0.05757, over 4811.00 frames.], tot_loss[loss=0.158, simple_loss=0.2272, pruned_loss=0.04443, over 973047.36 frames.], batch size: 25, lr: 4.69e-04 2022-05-04 19:58:31,723 INFO [train.py:715] (0/8) Epoch 4, batch 5200, loss[loss=0.1727, simple_loss=0.2399, pruned_loss=0.0527, over 4980.00 frames.], tot_loss[loss=0.1587, simple_loss=0.2277, pruned_loss=0.0449, over 972987.25 frames.], batch size: 33, lr: 4.69e-04 2022-05-04 19:59:11,083 INFO [train.py:715] (0/8) Epoch 4, batch 5250, loss[loss=0.1693, simple_loss=0.2493, pruned_loss=0.04465, over 4923.00 frames.], tot_loss[loss=0.1594, simple_loss=0.2286, pruned_loss=0.04511, over 971885.11 frames.], batch size: 18, lr: 4.69e-04 2022-05-04 19:59:50,710 INFO [train.py:715] (0/8) Epoch 4, batch 5300, loss[loss=0.1638, simple_loss=0.2335, pruned_loss=0.04708, over 4890.00 frames.], tot_loss[loss=0.1589, simple_loss=0.228, pruned_loss=0.0449, over 972003.14 frames.], batch size: 22, lr: 4.69e-04 2022-05-04 20:00:30,973 INFO 
[train.py:715] (0/8) Epoch 4, batch 5350, loss[loss=0.154, simple_loss=0.2188, pruned_loss=0.04465, over 4750.00 frames.], tot_loss[loss=0.1594, simple_loss=0.2284, pruned_loss=0.04515, over 972616.39 frames.], batch size: 19, lr: 4.69e-04 2022-05-04 20:01:11,129 INFO [train.py:715] (0/8) Epoch 4, batch 5400, loss[loss=0.1588, simple_loss=0.2314, pruned_loss=0.04309, over 4865.00 frames.], tot_loss[loss=0.1594, simple_loss=0.2282, pruned_loss=0.04535, over 972554.90 frames.], batch size: 32, lr: 4.69e-04 2022-05-04 20:01:51,433 INFO [train.py:715] (0/8) Epoch 4, batch 5450, loss[loss=0.1379, simple_loss=0.2005, pruned_loss=0.03763, over 4839.00 frames.], tot_loss[loss=0.1592, simple_loss=0.228, pruned_loss=0.04515, over 973254.98 frames.], batch size: 32, lr: 4.69e-04 2022-05-04 20:02:30,836 INFO [train.py:715] (0/8) Epoch 4, batch 5500, loss[loss=0.1821, simple_loss=0.2382, pruned_loss=0.06301, over 4883.00 frames.], tot_loss[loss=0.1584, simple_loss=0.2273, pruned_loss=0.04478, over 972965.09 frames.], batch size: 16, lr: 4.69e-04 2022-05-04 20:03:11,384 INFO [train.py:715] (0/8) Epoch 4, batch 5550, loss[loss=0.1674, simple_loss=0.2417, pruned_loss=0.04655, over 4844.00 frames.], tot_loss[loss=0.1585, simple_loss=0.2274, pruned_loss=0.04482, over 973400.77 frames.], batch size: 20, lr: 4.69e-04 2022-05-04 20:03:51,122 INFO [train.py:715] (0/8) Epoch 4, batch 5600, loss[loss=0.1358, simple_loss=0.2124, pruned_loss=0.02958, over 4933.00 frames.], tot_loss[loss=0.1586, simple_loss=0.2274, pruned_loss=0.04484, over 972831.35 frames.], batch size: 21, lr: 4.69e-04 2022-05-04 20:04:31,007 INFO [train.py:715] (0/8) Epoch 4, batch 5650, loss[loss=0.1441, simple_loss=0.2158, pruned_loss=0.03618, over 4854.00 frames.], tot_loss[loss=0.1604, simple_loss=0.2293, pruned_loss=0.04574, over 973741.16 frames.], batch size: 32, lr: 4.68e-04 2022-05-04 20:05:10,989 INFO [train.py:715] (0/8) Epoch 4, batch 5700, loss[loss=0.1657, simple_loss=0.2384, pruned_loss=0.04653, over 4913.00 frames.], tot_loss[loss=0.1604, simple_loss=0.2292, pruned_loss=0.04574, over 972926.10 frames.], batch size: 18, lr: 4.68e-04 2022-05-04 20:05:51,205 INFO [train.py:715] (0/8) Epoch 4, batch 5750, loss[loss=0.1257, simple_loss=0.1903, pruned_loss=0.03055, over 4952.00 frames.], tot_loss[loss=0.1612, simple_loss=0.2298, pruned_loss=0.0463, over 973424.17 frames.], batch size: 14, lr: 4.68e-04 2022-05-04 20:06:31,308 INFO [train.py:715] (0/8) Epoch 4, batch 5800, loss[loss=0.1375, simple_loss=0.2141, pruned_loss=0.03041, over 4885.00 frames.], tot_loss[loss=0.1606, simple_loss=0.2292, pruned_loss=0.04603, over 973072.95 frames.], batch size: 22, lr: 4.68e-04 2022-05-04 20:07:10,956 INFO [train.py:715] (0/8) Epoch 4, batch 5850, loss[loss=0.1257, simple_loss=0.1978, pruned_loss=0.02682, over 4817.00 frames.], tot_loss[loss=0.16, simple_loss=0.2289, pruned_loss=0.04557, over 971914.68 frames.], batch size: 12, lr: 4.68e-04 2022-05-04 20:07:51,261 INFO [train.py:715] (0/8) Epoch 4, batch 5900, loss[loss=0.1722, simple_loss=0.2431, pruned_loss=0.05058, over 4990.00 frames.], tot_loss[loss=0.1597, simple_loss=0.2285, pruned_loss=0.04549, over 971595.56 frames.], batch size: 26, lr: 4.68e-04 2022-05-04 20:08:30,933 INFO [train.py:715] (0/8) Epoch 4, batch 5950, loss[loss=0.1302, simple_loss=0.1849, pruned_loss=0.0377, over 4798.00 frames.], tot_loss[loss=0.16, simple_loss=0.2287, pruned_loss=0.04566, over 971571.26 frames.], batch size: 12, lr: 4.68e-04 2022-05-04 20:09:10,571 INFO [train.py:715] (0/8) Epoch 4, batch 6000, 
loss[loss=0.1181, simple_loss=0.1959, pruned_loss=0.02018, over 4774.00 frames.], tot_loss[loss=0.1602, simple_loss=0.2288, pruned_loss=0.04577, over 971171.72 frames.], batch size: 18, lr: 4.68e-04 2022-05-04 20:09:10,572 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 20:09:20,452 INFO [train.py:742] (0/8) Epoch 4, validation: loss=0.1124, simple_loss=0.1981, pruned_loss=0.01337, over 914524.00 frames. 2022-05-04 20:10:00,567 INFO [train.py:715] (0/8) Epoch 4, batch 6050, loss[loss=0.1593, simple_loss=0.2388, pruned_loss=0.03988, over 4790.00 frames.], tot_loss[loss=0.1597, simple_loss=0.2284, pruned_loss=0.04548, over 971386.39 frames.], batch size: 18, lr: 4.68e-04 2022-05-04 20:10:40,766 INFO [train.py:715] (0/8) Epoch 4, batch 6100, loss[loss=0.1773, simple_loss=0.2521, pruned_loss=0.05119, over 4977.00 frames.], tot_loss[loss=0.1589, simple_loss=0.228, pruned_loss=0.04495, over 971987.56 frames.], batch size: 14, lr: 4.68e-04 2022-05-04 20:11:21,159 INFO [train.py:715] (0/8) Epoch 4, batch 6150, loss[loss=0.1268, simple_loss=0.2088, pruned_loss=0.02239, over 4957.00 frames.], tot_loss[loss=0.1584, simple_loss=0.2276, pruned_loss=0.04462, over 971638.60 frames.], batch size: 24, lr: 4.68e-04 2022-05-04 20:12:01,192 INFO [train.py:715] (0/8) Epoch 4, batch 6200, loss[loss=0.1498, simple_loss=0.2253, pruned_loss=0.03713, over 4908.00 frames.], tot_loss[loss=0.1588, simple_loss=0.2279, pruned_loss=0.04481, over 971634.71 frames.], batch size: 19, lr: 4.68e-04 2022-05-04 20:12:40,824 INFO [train.py:715] (0/8) Epoch 4, batch 6250, loss[loss=0.188, simple_loss=0.2521, pruned_loss=0.06192, over 4809.00 frames.], tot_loss[loss=0.1582, simple_loss=0.2279, pruned_loss=0.04425, over 972998.32 frames.], batch size: 21, lr: 4.68e-04 2022-05-04 20:13:21,461 INFO [train.py:715] (0/8) Epoch 4, batch 6300, loss[loss=0.144, simple_loss=0.2226, pruned_loss=0.03265, over 4932.00 frames.], tot_loss[loss=0.1575, simple_loss=0.2277, pruned_loss=0.04367, over 973042.85 frames.], batch size: 23, lr: 4.67e-04 2022-05-04 20:14:00,898 INFO [train.py:715] (0/8) Epoch 4, batch 6350, loss[loss=0.1809, simple_loss=0.2518, pruned_loss=0.05499, over 4793.00 frames.], tot_loss[loss=0.1591, simple_loss=0.2294, pruned_loss=0.04442, over 973196.26 frames.], batch size: 17, lr: 4.67e-04 2022-05-04 20:14:41,817 INFO [train.py:715] (0/8) Epoch 4, batch 6400, loss[loss=0.1382, simple_loss=0.2082, pruned_loss=0.03409, over 4694.00 frames.], tot_loss[loss=0.1596, simple_loss=0.2296, pruned_loss=0.04478, over 972423.31 frames.], batch size: 15, lr: 4.67e-04 2022-05-04 20:15:21,559 INFO [train.py:715] (0/8) Epoch 4, batch 6450, loss[loss=0.1399, simple_loss=0.217, pruned_loss=0.03144, over 4932.00 frames.], tot_loss[loss=0.1595, simple_loss=0.2292, pruned_loss=0.04485, over 973197.48 frames.], batch size: 23, lr: 4.67e-04 2022-05-04 20:16:01,663 INFO [train.py:715] (0/8) Epoch 4, batch 6500, loss[loss=0.1659, simple_loss=0.2302, pruned_loss=0.05083, over 4873.00 frames.], tot_loss[loss=0.1598, simple_loss=0.2293, pruned_loss=0.04516, over 973834.03 frames.], batch size: 32, lr: 4.67e-04 2022-05-04 20:16:41,332 INFO [train.py:715] (0/8) Epoch 4, batch 6550, loss[loss=0.138, simple_loss=0.208, pruned_loss=0.03403, over 4746.00 frames.], tot_loss[loss=0.159, simple_loss=0.2283, pruned_loss=0.04486, over 973565.16 frames.], batch size: 16, lr: 4.67e-04 2022-05-04 20:17:20,645 INFO [train.py:715] (0/8) Epoch 4, batch 6600, loss[loss=0.1616, simple_loss=0.2166, pruned_loss=0.0533, over 4917.00 frames.], 
tot_loss[loss=0.1596, simple_loss=0.2288, pruned_loss=0.04521, over 972935.66 frames.], batch size: 18, lr: 4.67e-04 2022-05-04 20:18:01,333 INFO [train.py:715] (0/8) Epoch 4, batch 6650, loss[loss=0.1668, simple_loss=0.2349, pruned_loss=0.04928, over 4803.00 frames.], tot_loss[loss=0.1593, simple_loss=0.2285, pruned_loss=0.04502, over 972516.10 frames.], batch size: 21, lr: 4.67e-04 2022-05-04 20:18:40,883 INFO [train.py:715] (0/8) Epoch 4, batch 6700, loss[loss=0.124, simple_loss=0.1926, pruned_loss=0.02767, over 4964.00 frames.], tot_loss[loss=0.1588, simple_loss=0.228, pruned_loss=0.0448, over 973003.29 frames.], batch size: 14, lr: 4.67e-04 2022-05-04 20:19:21,004 INFO [train.py:715] (0/8) Epoch 4, batch 6750, loss[loss=0.1517, simple_loss=0.2094, pruned_loss=0.04701, over 4724.00 frames.], tot_loss[loss=0.1594, simple_loss=0.2285, pruned_loss=0.04515, over 973274.69 frames.], batch size: 16, lr: 4.67e-04 2022-05-04 20:20:00,764 INFO [train.py:715] (0/8) Epoch 4, batch 6800, loss[loss=0.163, simple_loss=0.224, pruned_loss=0.05099, over 4784.00 frames.], tot_loss[loss=0.1582, simple_loss=0.2276, pruned_loss=0.04437, over 972906.92 frames.], batch size: 17, lr: 4.67e-04 2022-05-04 20:20:40,791 INFO [train.py:715] (0/8) Epoch 4, batch 6850, loss[loss=0.1516, simple_loss=0.2213, pruned_loss=0.04093, over 4982.00 frames.], tot_loss[loss=0.158, simple_loss=0.2273, pruned_loss=0.04435, over 972720.30 frames.], batch size: 35, lr: 4.67e-04 2022-05-04 20:21:20,096 INFO [train.py:715] (0/8) Epoch 4, batch 6900, loss[loss=0.1242, simple_loss=0.1817, pruned_loss=0.03337, over 4762.00 frames.], tot_loss[loss=0.1577, simple_loss=0.2273, pruned_loss=0.04406, over 972477.53 frames.], batch size: 12, lr: 4.66e-04 2022-05-04 20:21:59,579 INFO [train.py:715] (0/8) Epoch 4, batch 6950, loss[loss=0.1567, simple_loss=0.2218, pruned_loss=0.0458, over 4940.00 frames.], tot_loss[loss=0.1575, simple_loss=0.2271, pruned_loss=0.04391, over 972103.16 frames.], batch size: 39, lr: 4.66e-04 2022-05-04 20:22:39,319 INFO [train.py:715] (0/8) Epoch 4, batch 7000, loss[loss=0.137, simple_loss=0.2159, pruned_loss=0.0291, over 4802.00 frames.], tot_loss[loss=0.1576, simple_loss=0.2272, pruned_loss=0.04398, over 971821.96 frames.], batch size: 25, lr: 4.66e-04 2022-05-04 20:23:19,196 INFO [train.py:715] (0/8) Epoch 4, batch 7050, loss[loss=0.1546, simple_loss=0.2197, pruned_loss=0.04482, over 4908.00 frames.], tot_loss[loss=0.1584, simple_loss=0.2276, pruned_loss=0.04459, over 972353.90 frames.], batch size: 17, lr: 4.66e-04 2022-05-04 20:23:58,920 INFO [train.py:715] (0/8) Epoch 4, batch 7100, loss[loss=0.2078, simple_loss=0.275, pruned_loss=0.07034, over 4804.00 frames.], tot_loss[loss=0.1581, simple_loss=0.2274, pruned_loss=0.04444, over 972625.54 frames.], batch size: 21, lr: 4.66e-04 2022-05-04 20:24:39,016 INFO [train.py:715] (0/8) Epoch 4, batch 7150, loss[loss=0.1478, simple_loss=0.2129, pruned_loss=0.04135, over 4915.00 frames.], tot_loss[loss=0.1588, simple_loss=0.2282, pruned_loss=0.04469, over 971375.01 frames.], batch size: 39, lr: 4.66e-04 2022-05-04 20:25:18,943 INFO [train.py:715] (0/8) Epoch 4, batch 7200, loss[loss=0.1588, simple_loss=0.2277, pruned_loss=0.04493, over 4774.00 frames.], tot_loss[loss=0.1578, simple_loss=0.2271, pruned_loss=0.04425, over 971256.68 frames.], batch size: 14, lr: 4.66e-04 2022-05-04 20:25:59,098 INFO [train.py:715] (0/8) Epoch 4, batch 7250, loss[loss=0.1711, simple_loss=0.2416, pruned_loss=0.0503, over 4707.00 frames.], tot_loss[loss=0.1582, simple_loss=0.2275, 
pruned_loss=0.04448, over 970998.32 frames.], batch size: 15, lr: 4.66e-04 2022-05-04 20:26:38,420 INFO [train.py:715] (0/8) Epoch 4, batch 7300, loss[loss=0.1447, simple_loss=0.2152, pruned_loss=0.03715, over 4763.00 frames.], tot_loss[loss=0.1584, simple_loss=0.2279, pruned_loss=0.04443, over 971726.23 frames.], batch size: 14, lr: 4.66e-04 2022-05-04 20:27:18,105 INFO [train.py:715] (0/8) Epoch 4, batch 7350, loss[loss=0.1494, simple_loss=0.2204, pruned_loss=0.0392, over 4854.00 frames.], tot_loss[loss=0.1585, simple_loss=0.2278, pruned_loss=0.04462, over 971836.29 frames.], batch size: 20, lr: 4.66e-04 2022-05-04 20:27:58,070 INFO [train.py:715] (0/8) Epoch 4, batch 7400, loss[loss=0.151, simple_loss=0.2191, pruned_loss=0.04141, over 4844.00 frames.], tot_loss[loss=0.1574, simple_loss=0.2267, pruned_loss=0.04403, over 972358.07 frames.], batch size: 20, lr: 4.66e-04 2022-05-04 20:28:38,813 INFO [train.py:715] (0/8) Epoch 4, batch 7450, loss[loss=0.186, simple_loss=0.2398, pruned_loss=0.06613, over 4778.00 frames.], tot_loss[loss=0.1571, simple_loss=0.226, pruned_loss=0.04406, over 972353.76 frames.], batch size: 14, lr: 4.66e-04 2022-05-04 20:29:18,217 INFO [train.py:715] (0/8) Epoch 4, batch 7500, loss[loss=0.1539, simple_loss=0.2245, pruned_loss=0.04165, over 4886.00 frames.], tot_loss[loss=0.1573, simple_loss=0.2264, pruned_loss=0.04412, over 972937.82 frames.], batch size: 22, lr: 4.66e-04 2022-05-04 20:29:58,241 INFO [train.py:715] (0/8) Epoch 4, batch 7550, loss[loss=0.1338, simple_loss=0.22, pruned_loss=0.02381, over 4813.00 frames.], tot_loss[loss=0.1572, simple_loss=0.2266, pruned_loss=0.0439, over 972571.23 frames.], batch size: 25, lr: 4.65e-04 2022-05-04 20:30:38,893 INFO [train.py:715] (0/8) Epoch 4, batch 7600, loss[loss=0.1537, simple_loss=0.2112, pruned_loss=0.04813, over 4691.00 frames.], tot_loss[loss=0.1577, simple_loss=0.2271, pruned_loss=0.04415, over 972840.87 frames.], batch size: 15, lr: 4.65e-04 2022-05-04 20:31:18,409 INFO [train.py:715] (0/8) Epoch 4, batch 7650, loss[loss=0.1954, simple_loss=0.2543, pruned_loss=0.06832, over 4650.00 frames.], tot_loss[loss=0.1577, simple_loss=0.2273, pruned_loss=0.04403, over 972610.17 frames.], batch size: 13, lr: 4.65e-04 2022-05-04 20:31:58,069 INFO [train.py:715] (0/8) Epoch 4, batch 7700, loss[loss=0.1627, simple_loss=0.2303, pruned_loss=0.04758, over 4764.00 frames.], tot_loss[loss=0.1562, simple_loss=0.226, pruned_loss=0.04321, over 971844.29 frames.], batch size: 19, lr: 4.65e-04 2022-05-04 20:32:38,168 INFO [train.py:715] (0/8) Epoch 4, batch 7750, loss[loss=0.1604, simple_loss=0.2398, pruned_loss=0.04049, over 4899.00 frames.], tot_loss[loss=0.1571, simple_loss=0.2271, pruned_loss=0.0436, over 972341.78 frames.], batch size: 22, lr: 4.65e-04 2022-05-04 20:33:18,306 INFO [train.py:715] (0/8) Epoch 4, batch 7800, loss[loss=0.1603, simple_loss=0.2221, pruned_loss=0.04931, over 4828.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2275, pruned_loss=0.04417, over 971436.34 frames.], batch size: 15, lr: 4.65e-04 2022-05-04 20:33:57,308 INFO [train.py:715] (0/8) Epoch 4, batch 7850, loss[loss=0.2081, simple_loss=0.2805, pruned_loss=0.06782, over 4956.00 frames.], tot_loss[loss=0.1581, simple_loss=0.2277, pruned_loss=0.04426, over 971077.70 frames.], batch size: 39, lr: 4.65e-04 2022-05-04 20:34:36,902 INFO [train.py:715] (0/8) Epoch 4, batch 7900, loss[loss=0.1751, simple_loss=0.236, pruned_loss=0.05713, over 4841.00 frames.], tot_loss[loss=0.157, simple_loss=0.2266, pruned_loss=0.04372, over 971419.77 frames.], batch 
size: 30, lr: 4.65e-04 2022-05-04 20:35:16,766 INFO [train.py:715] (0/8) Epoch 4, batch 7950, loss[loss=0.1501, simple_loss=0.2236, pruned_loss=0.03835, over 4888.00 frames.], tot_loss[loss=0.1577, simple_loss=0.2272, pruned_loss=0.04407, over 970908.14 frames.], batch size: 19, lr: 4.65e-04 2022-05-04 20:35:56,343 INFO [train.py:715] (0/8) Epoch 4, batch 8000, loss[loss=0.1497, simple_loss=0.2228, pruned_loss=0.0383, over 4804.00 frames.], tot_loss[loss=0.1599, simple_loss=0.229, pruned_loss=0.04536, over 970938.87 frames.], batch size: 21, lr: 4.65e-04 2022-05-04 20:36:36,312 INFO [train.py:715] (0/8) Epoch 4, batch 8050, loss[loss=0.1496, simple_loss=0.2199, pruned_loss=0.03962, over 4963.00 frames.], tot_loss[loss=0.1603, simple_loss=0.2295, pruned_loss=0.04554, over 971736.41 frames.], batch size: 23, lr: 4.65e-04 2022-05-04 20:37:16,264 INFO [train.py:715] (0/8) Epoch 4, batch 8100, loss[loss=0.2059, simple_loss=0.2566, pruned_loss=0.07762, over 4779.00 frames.], tot_loss[loss=0.1609, simple_loss=0.23, pruned_loss=0.04585, over 971517.01 frames.], batch size: 17, lr: 4.65e-04 2022-05-04 20:37:56,508 INFO [train.py:715] (0/8) Epoch 4, batch 8150, loss[loss=0.1928, simple_loss=0.241, pruned_loss=0.07234, over 4760.00 frames.], tot_loss[loss=0.16, simple_loss=0.2292, pruned_loss=0.0454, over 971053.21 frames.], batch size: 19, lr: 4.65e-04 2022-05-04 20:38:35,992 INFO [train.py:715] (0/8) Epoch 4, batch 8200, loss[loss=0.1629, simple_loss=0.2232, pruned_loss=0.05126, over 4871.00 frames.], tot_loss[loss=0.1589, simple_loss=0.2283, pruned_loss=0.04475, over 971205.08 frames.], batch size: 32, lr: 4.64e-04 2022-05-04 20:39:15,726 INFO [train.py:715] (0/8) Epoch 4, batch 8250, loss[loss=0.1448, simple_loss=0.2242, pruned_loss=0.0327, over 4816.00 frames.], tot_loss[loss=0.1593, simple_loss=0.2288, pruned_loss=0.04491, over 972066.40 frames.], batch size: 25, lr: 4.64e-04 2022-05-04 20:39:55,878 INFO [train.py:715] (0/8) Epoch 4, batch 8300, loss[loss=0.1436, simple_loss=0.2115, pruned_loss=0.03779, over 4945.00 frames.], tot_loss[loss=0.1596, simple_loss=0.2292, pruned_loss=0.04499, over 972454.69 frames.], batch size: 14, lr: 4.64e-04 2022-05-04 20:40:35,315 INFO [train.py:715] (0/8) Epoch 4, batch 8350, loss[loss=0.1327, simple_loss=0.2087, pruned_loss=0.02837, over 4790.00 frames.], tot_loss[loss=0.1594, simple_loss=0.2291, pruned_loss=0.04484, over 972983.88 frames.], batch size: 17, lr: 4.64e-04 2022-05-04 20:41:15,397 INFO [train.py:715] (0/8) Epoch 4, batch 8400, loss[loss=0.1461, simple_loss=0.2133, pruned_loss=0.03947, over 4914.00 frames.], tot_loss[loss=0.1583, simple_loss=0.2277, pruned_loss=0.04439, over 973011.72 frames.], batch size: 18, lr: 4.64e-04 2022-05-04 20:41:55,745 INFO [train.py:715] (0/8) Epoch 4, batch 8450, loss[loss=0.1695, simple_loss=0.2358, pruned_loss=0.05161, over 4825.00 frames.], tot_loss[loss=0.1578, simple_loss=0.227, pruned_loss=0.0443, over 973342.39 frames.], batch size: 15, lr: 4.64e-04 2022-05-04 20:42:35,848 INFO [train.py:715] (0/8) Epoch 4, batch 8500, loss[loss=0.1545, simple_loss=0.2212, pruned_loss=0.04391, over 4955.00 frames.], tot_loss[loss=0.1575, simple_loss=0.2264, pruned_loss=0.04427, over 972698.99 frames.], batch size: 14, lr: 4.64e-04 2022-05-04 20:43:15,263 INFO [train.py:715] (0/8) Epoch 4, batch 8550, loss[loss=0.1433, simple_loss=0.2049, pruned_loss=0.04089, over 4855.00 frames.], tot_loss[loss=0.158, simple_loss=0.2272, pruned_loss=0.04441, over 972686.98 frames.], batch size: 13, lr: 4.64e-04 2022-05-04 20:43:55,077 INFO 
[train.py:715] (0/8) Epoch 4, batch 8600, loss[loss=0.1307, simple_loss=0.2071, pruned_loss=0.02713, over 4913.00 frames.], tot_loss[loss=0.1578, simple_loss=0.2271, pruned_loss=0.04426, over 972462.46 frames.], batch size: 18, lr: 4.64e-04 2022-05-04 20:44:35,239 INFO [train.py:715] (0/8) Epoch 4, batch 8650, loss[loss=0.1696, simple_loss=0.2315, pruned_loss=0.05385, over 4758.00 frames.], tot_loss[loss=0.1578, simple_loss=0.2268, pruned_loss=0.04436, over 971940.11 frames.], batch size: 19, lr: 4.64e-04 2022-05-04 20:45:14,866 INFO [train.py:715] (0/8) Epoch 4, batch 8700, loss[loss=0.1431, simple_loss=0.2123, pruned_loss=0.03702, over 4789.00 frames.], tot_loss[loss=0.1584, simple_loss=0.2275, pruned_loss=0.04462, over 971941.39 frames.], batch size: 18, lr: 4.64e-04 2022-05-04 20:45:55,169 INFO [train.py:715] (0/8) Epoch 4, batch 8750, loss[loss=0.1662, simple_loss=0.2348, pruned_loss=0.04877, over 4985.00 frames.], tot_loss[loss=0.1576, simple_loss=0.227, pruned_loss=0.04409, over 972631.67 frames.], batch size: 28, lr: 4.64e-04 2022-05-04 20:46:35,392 INFO [train.py:715] (0/8) Epoch 4, batch 8800, loss[loss=0.1611, simple_loss=0.2205, pruned_loss=0.05079, over 4949.00 frames.], tot_loss[loss=0.1568, simple_loss=0.2259, pruned_loss=0.04384, over 973004.32 frames.], batch size: 15, lr: 4.63e-04 2022-05-04 20:47:15,428 INFO [train.py:715] (0/8) Epoch 4, batch 8850, loss[loss=0.1861, simple_loss=0.2372, pruned_loss=0.06753, over 4970.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2269, pruned_loss=0.04445, over 972611.64 frames.], batch size: 35, lr: 4.63e-04 2022-05-04 20:47:55,126 INFO [train.py:715] (0/8) Epoch 4, batch 8900, loss[loss=0.1674, simple_loss=0.2407, pruned_loss=0.04704, over 4869.00 frames.], tot_loss[loss=0.157, simple_loss=0.2262, pruned_loss=0.04394, over 972062.91 frames.], batch size: 38, lr: 4.63e-04 2022-05-04 20:48:34,759 INFO [train.py:715] (0/8) Epoch 4, batch 8950, loss[loss=0.1201, simple_loss=0.1894, pruned_loss=0.02542, over 4846.00 frames.], tot_loss[loss=0.1561, simple_loss=0.2253, pruned_loss=0.04348, over 971303.21 frames.], batch size: 13, lr: 4.63e-04 2022-05-04 20:49:15,024 INFO [train.py:715] (0/8) Epoch 4, batch 9000, loss[loss=0.1622, simple_loss=0.2313, pruned_loss=0.04652, over 4928.00 frames.], tot_loss[loss=0.1563, simple_loss=0.2257, pruned_loss=0.04342, over 971109.64 frames.], batch size: 23, lr: 4.63e-04 2022-05-04 20:49:15,025 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 20:49:24,977 INFO [train.py:742] (0/8) Epoch 4, validation: loss=0.1123, simple_loss=0.1979, pruned_loss=0.01336, over 914524.00 frames. 
2022-05-04 20:50:05,299 INFO [train.py:715] (0/8) Epoch 4, batch 9050, loss[loss=0.1344, simple_loss=0.2106, pruned_loss=0.02909, over 4889.00 frames.], tot_loss[loss=0.1567, simple_loss=0.226, pruned_loss=0.04364, over 971101.14 frames.], batch size: 17, lr: 4.63e-04 2022-05-04 20:50:45,312 INFO [train.py:715] (0/8) Epoch 4, batch 9100, loss[loss=0.1537, simple_loss=0.2211, pruned_loss=0.04313, over 4775.00 frames.], tot_loss[loss=0.1566, simple_loss=0.2259, pruned_loss=0.04361, over 971548.46 frames.], batch size: 17, lr: 4.63e-04 2022-05-04 20:51:24,709 INFO [train.py:715] (0/8) Epoch 4, batch 9150, loss[loss=0.191, simple_loss=0.2561, pruned_loss=0.06291, over 4980.00 frames.], tot_loss[loss=0.1571, simple_loss=0.2264, pruned_loss=0.04387, over 971834.96 frames.], batch size: 15, lr: 4.63e-04 2022-05-04 20:52:04,886 INFO [train.py:715] (0/8) Epoch 4, batch 9200, loss[loss=0.1651, simple_loss=0.2347, pruned_loss=0.04775, over 4875.00 frames.], tot_loss[loss=0.1577, simple_loss=0.227, pruned_loss=0.04415, over 971339.89 frames.], batch size: 16, lr: 4.63e-04 2022-05-04 20:52:45,291 INFO [train.py:715] (0/8) Epoch 4, batch 9250, loss[loss=0.1682, simple_loss=0.2418, pruned_loss=0.04732, over 4802.00 frames.], tot_loss[loss=0.1577, simple_loss=0.2269, pruned_loss=0.04422, over 971274.96 frames.], batch size: 21, lr: 4.63e-04 2022-05-04 20:53:24,538 INFO [train.py:715] (0/8) Epoch 4, batch 9300, loss[loss=0.1529, simple_loss=0.2226, pruned_loss=0.04158, over 4758.00 frames.], tot_loss[loss=0.1582, simple_loss=0.2278, pruned_loss=0.04427, over 970489.95 frames.], batch size: 19, lr: 4.63e-04 2022-05-04 20:54:04,523 INFO [train.py:715] (0/8) Epoch 4, batch 9350, loss[loss=0.1427, simple_loss=0.2098, pruned_loss=0.03777, over 4776.00 frames.], tot_loss[loss=0.1582, simple_loss=0.2279, pruned_loss=0.04426, over 971452.10 frames.], batch size: 14, lr: 4.63e-04 2022-05-04 20:54:44,467 INFO [train.py:715] (0/8) Epoch 4, batch 9400, loss[loss=0.1455, simple_loss=0.2174, pruned_loss=0.03684, over 4843.00 frames.], tot_loss[loss=0.1582, simple_loss=0.2278, pruned_loss=0.0443, over 972020.30 frames.], batch size: 20, lr: 4.63e-04 2022-05-04 20:55:23,998 INFO [train.py:715] (0/8) Epoch 4, batch 9450, loss[loss=0.1591, simple_loss=0.2234, pruned_loss=0.04739, over 4836.00 frames.], tot_loss[loss=0.158, simple_loss=0.2281, pruned_loss=0.044, over 972430.34 frames.], batch size: 30, lr: 4.62e-04 2022-05-04 20:56:04,092 INFO [train.py:715] (0/8) Epoch 4, batch 9500, loss[loss=0.1754, simple_loss=0.2311, pruned_loss=0.05991, over 4795.00 frames.], tot_loss[loss=0.1592, simple_loss=0.2285, pruned_loss=0.04489, over 972748.10 frames.], batch size: 21, lr: 4.62e-04 2022-05-04 20:56:44,149 INFO [train.py:715] (0/8) Epoch 4, batch 9550, loss[loss=0.1689, simple_loss=0.2483, pruned_loss=0.04471, over 4911.00 frames.], tot_loss[loss=0.1587, simple_loss=0.2281, pruned_loss=0.04466, over 972615.46 frames.], batch size: 18, lr: 4.62e-04 2022-05-04 20:57:24,664 INFO [train.py:715] (0/8) Epoch 4, batch 9600, loss[loss=0.1538, simple_loss=0.2224, pruned_loss=0.04263, over 4908.00 frames.], tot_loss[loss=0.1585, simple_loss=0.2276, pruned_loss=0.04472, over 972824.84 frames.], batch size: 17, lr: 4.62e-04 2022-05-04 20:58:04,093 INFO [train.py:715] (0/8) Epoch 4, batch 9650, loss[loss=0.1229, simple_loss=0.1988, pruned_loss=0.02346, over 4915.00 frames.], tot_loss[loss=0.1582, simple_loss=0.2274, pruned_loss=0.04449, over 972996.28 frames.], batch size: 18, lr: 4.62e-04 2022-05-04 20:58:44,657 INFO [train.py:715] (0/8) 
Epoch 4, batch 9700, loss[loss=0.1675, simple_loss=0.2218, pruned_loss=0.05662, over 4972.00 frames.], tot_loss[loss=0.1576, simple_loss=0.2269, pruned_loss=0.04419, over 971723.84 frames.], batch size: 35, lr: 4.62e-04 2022-05-04 20:59:25,196 INFO [train.py:715] (0/8) Epoch 4, batch 9750, loss[loss=0.1807, simple_loss=0.2392, pruned_loss=0.06109, over 4988.00 frames.], tot_loss[loss=0.1563, simple_loss=0.2258, pruned_loss=0.04346, over 972090.93 frames.], batch size: 31, lr: 4.62e-04 2022-05-04 21:00:04,723 INFO [train.py:715] (0/8) Epoch 4, batch 9800, loss[loss=0.1575, simple_loss=0.2228, pruned_loss=0.04612, over 4835.00 frames.], tot_loss[loss=0.1569, simple_loss=0.2262, pruned_loss=0.04376, over 972372.51 frames.], batch size: 30, lr: 4.62e-04 2022-05-04 21:00:43,863 INFO [train.py:715] (0/8) Epoch 4, batch 9850, loss[loss=0.1209, simple_loss=0.2012, pruned_loss=0.02034, over 4905.00 frames.], tot_loss[loss=0.1571, simple_loss=0.2265, pruned_loss=0.04383, over 971503.30 frames.], batch size: 29, lr: 4.62e-04 2022-05-04 21:01:23,901 INFO [train.py:715] (0/8) Epoch 4, batch 9900, loss[loss=0.1521, simple_loss=0.2302, pruned_loss=0.03699, over 4809.00 frames.], tot_loss[loss=0.1571, simple_loss=0.2267, pruned_loss=0.04378, over 971600.86 frames.], batch size: 14, lr: 4.62e-04 2022-05-04 21:02:03,377 INFO [train.py:715] (0/8) Epoch 4, batch 9950, loss[loss=0.1722, simple_loss=0.2481, pruned_loss=0.04816, over 4770.00 frames.], tot_loss[loss=0.1582, simple_loss=0.2278, pruned_loss=0.04428, over 972201.29 frames.], batch size: 18, lr: 4.62e-04 2022-05-04 21:02:42,755 INFO [train.py:715] (0/8) Epoch 4, batch 10000, loss[loss=0.1439, simple_loss=0.2102, pruned_loss=0.03882, over 4964.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2274, pruned_loss=0.0442, over 971878.03 frames.], batch size: 24, lr: 4.62e-04 2022-05-04 21:03:22,514 INFO [train.py:715] (0/8) Epoch 4, batch 10050, loss[loss=0.2406, simple_loss=0.3127, pruned_loss=0.08426, over 4769.00 frames.], tot_loss[loss=0.158, simple_loss=0.2274, pruned_loss=0.04425, over 971524.36 frames.], batch size: 19, lr: 4.62e-04 2022-05-04 21:04:02,310 INFO [train.py:715] (0/8) Epoch 4, batch 10100, loss[loss=0.1318, simple_loss=0.2119, pruned_loss=0.02587, over 4985.00 frames.], tot_loss[loss=0.1572, simple_loss=0.2268, pruned_loss=0.04383, over 970965.85 frames.], batch size: 25, lr: 4.61e-04 2022-05-04 21:04:41,556 INFO [train.py:715] (0/8) Epoch 4, batch 10150, loss[loss=0.1857, simple_loss=0.2623, pruned_loss=0.05453, over 4814.00 frames.], tot_loss[loss=0.1574, simple_loss=0.2266, pruned_loss=0.04406, over 970791.56 frames.], batch size: 25, lr: 4.61e-04 2022-05-04 21:05:21,478 INFO [train.py:715] (0/8) Epoch 4, batch 10200, loss[loss=0.1344, simple_loss=0.2091, pruned_loss=0.02989, over 4819.00 frames.], tot_loss[loss=0.1568, simple_loss=0.2264, pruned_loss=0.04365, over 970925.87 frames.], batch size: 26, lr: 4.61e-04 2022-05-04 21:06:02,063 INFO [train.py:715] (0/8) Epoch 4, batch 10250, loss[loss=0.1549, simple_loss=0.2203, pruned_loss=0.0448, over 4860.00 frames.], tot_loss[loss=0.1568, simple_loss=0.2263, pruned_loss=0.04364, over 970714.09 frames.], batch size: 20, lr: 4.61e-04 2022-05-04 21:06:41,847 INFO [train.py:715] (0/8) Epoch 4, batch 10300, loss[loss=0.1498, simple_loss=0.2208, pruned_loss=0.03946, over 4812.00 frames.], tot_loss[loss=0.1574, simple_loss=0.2266, pruned_loss=0.04411, over 971758.64 frames.], batch size: 27, lr: 4.61e-04 2022-05-04 21:07:21,506 INFO [train.py:715] (0/8) Epoch 4, batch 10350, loss[loss=0.1652, 
simple_loss=0.228, pruned_loss=0.05119, over 4880.00 frames.], tot_loss[loss=0.1583, simple_loss=0.2273, pruned_loss=0.04461, over 971485.14 frames.], batch size: 19, lr: 4.61e-04 2022-05-04 21:08:01,700 INFO [train.py:715] (0/8) Epoch 4, batch 10400, loss[loss=0.1418, simple_loss=0.213, pruned_loss=0.03525, over 4930.00 frames.], tot_loss[loss=0.1585, simple_loss=0.2273, pruned_loss=0.04487, over 971785.84 frames.], batch size: 23, lr: 4.61e-04 2022-05-04 21:08:42,276 INFO [train.py:715] (0/8) Epoch 4, batch 10450, loss[loss=0.1476, simple_loss=0.2182, pruned_loss=0.03849, over 4758.00 frames.], tot_loss[loss=0.158, simple_loss=0.2271, pruned_loss=0.04447, over 972315.45 frames.], batch size: 18, lr: 4.61e-04 2022-05-04 21:09:21,886 INFO [train.py:715] (0/8) Epoch 4, batch 10500, loss[loss=0.1165, simple_loss=0.1934, pruned_loss=0.01984, over 4904.00 frames.], tot_loss[loss=0.157, simple_loss=0.2264, pruned_loss=0.04375, over 971605.32 frames.], batch size: 23, lr: 4.61e-04 2022-05-04 21:10:02,146 INFO [train.py:715] (0/8) Epoch 4, batch 10550, loss[loss=0.1463, simple_loss=0.2098, pruned_loss=0.04143, over 4837.00 frames.], tot_loss[loss=0.1574, simple_loss=0.2267, pruned_loss=0.04404, over 971748.68 frames.], batch size: 15, lr: 4.61e-04 2022-05-04 21:10:42,492 INFO [train.py:715] (0/8) Epoch 4, batch 10600, loss[loss=0.1581, simple_loss=0.2279, pruned_loss=0.04413, over 4906.00 frames.], tot_loss[loss=0.1573, simple_loss=0.2269, pruned_loss=0.04391, over 972750.58 frames.], batch size: 17, lr: 4.61e-04 2022-05-04 21:11:22,295 INFO [train.py:715] (0/8) Epoch 4, batch 10650, loss[loss=0.1667, simple_loss=0.2363, pruned_loss=0.04855, over 4833.00 frames.], tot_loss[loss=0.1571, simple_loss=0.2265, pruned_loss=0.04382, over 972541.84 frames.], batch size: 30, lr: 4.61e-04 2022-05-04 21:12:02,341 INFO [train.py:715] (0/8) Epoch 4, batch 10700, loss[loss=0.153, simple_loss=0.2203, pruned_loss=0.04291, over 4849.00 frames.], tot_loss[loss=0.1576, simple_loss=0.2267, pruned_loss=0.04421, over 972099.82 frames.], batch size: 13, lr: 4.61e-04 2022-05-04 21:12:42,038 INFO [train.py:715] (0/8) Epoch 4, batch 10750, loss[loss=0.1486, simple_loss=0.2249, pruned_loss=0.03613, over 4899.00 frames.], tot_loss[loss=0.1569, simple_loss=0.2261, pruned_loss=0.04385, over 971548.02 frames.], batch size: 19, lr: 4.60e-04 2022-05-04 21:13:22,455 INFO [train.py:715] (0/8) Epoch 4, batch 10800, loss[loss=0.1391, simple_loss=0.206, pruned_loss=0.0361, over 4754.00 frames.], tot_loss[loss=0.1563, simple_loss=0.2255, pruned_loss=0.04356, over 971400.41 frames.], batch size: 12, lr: 4.60e-04 2022-05-04 21:14:01,772 INFO [train.py:715] (0/8) Epoch 4, batch 10850, loss[loss=0.1722, simple_loss=0.2487, pruned_loss=0.04781, over 4937.00 frames.], tot_loss[loss=0.1562, simple_loss=0.2254, pruned_loss=0.0435, over 971443.92 frames.], batch size: 29, lr: 4.60e-04 2022-05-04 21:14:41,705 INFO [train.py:715] (0/8) Epoch 4, batch 10900, loss[loss=0.176, simple_loss=0.2401, pruned_loss=0.05596, over 4893.00 frames.], tot_loss[loss=0.1569, simple_loss=0.2259, pruned_loss=0.04393, over 971486.20 frames.], batch size: 19, lr: 4.60e-04 2022-05-04 21:15:22,020 INFO [train.py:715] (0/8) Epoch 4, batch 10950, loss[loss=0.1612, simple_loss=0.217, pruned_loss=0.05273, over 4732.00 frames.], tot_loss[loss=0.1579, simple_loss=0.227, pruned_loss=0.04439, over 971664.44 frames.], batch size: 16, lr: 4.60e-04 2022-05-04 21:16:01,656 INFO [train.py:715] (0/8) Epoch 4, batch 11000, loss[loss=0.1363, simple_loss=0.2011, pruned_loss=0.03577, 
over 4982.00 frames.], tot_loss[loss=0.1581, simple_loss=0.2273, pruned_loss=0.04445, over 972148.33 frames.], batch size: 14, lr: 4.60e-04 2022-05-04 21:16:44,077 INFO [train.py:715] (0/8) Epoch 4, batch 11050, loss[loss=0.1506, simple_loss=0.2211, pruned_loss=0.04002, over 4701.00 frames.], tot_loss[loss=0.1594, simple_loss=0.2285, pruned_loss=0.04511, over 972404.21 frames.], batch size: 15, lr: 4.60e-04 2022-05-04 21:17:24,579 INFO [train.py:715] (0/8) Epoch 4, batch 11100, loss[loss=0.1329, simple_loss=0.198, pruned_loss=0.03391, over 4712.00 frames.], tot_loss[loss=0.1577, simple_loss=0.2271, pruned_loss=0.04414, over 972423.84 frames.], batch size: 15, lr: 4.60e-04 2022-05-04 21:18:07,366 INFO [train.py:715] (0/8) Epoch 4, batch 11150, loss[loss=0.1396, simple_loss=0.2065, pruned_loss=0.03631, over 4772.00 frames.], tot_loss[loss=0.1569, simple_loss=0.2266, pruned_loss=0.04361, over 972280.64 frames.], batch size: 12, lr: 4.60e-04 2022-05-04 21:18:49,594 INFO [train.py:715] (0/8) Epoch 4, batch 11200, loss[loss=0.1403, simple_loss=0.21, pruned_loss=0.03527, over 4752.00 frames.], tot_loss[loss=0.1564, simple_loss=0.2258, pruned_loss=0.04347, over 972567.57 frames.], batch size: 19, lr: 4.60e-04 2022-05-04 21:19:30,003 INFO [train.py:715] (0/8) Epoch 4, batch 11250, loss[loss=0.1542, simple_loss=0.2229, pruned_loss=0.04273, over 4940.00 frames.], tot_loss[loss=0.1567, simple_loss=0.2259, pruned_loss=0.04368, over 973077.96 frames.], batch size: 24, lr: 4.60e-04 2022-05-04 21:20:12,910 INFO [train.py:715] (0/8) Epoch 4, batch 11300, loss[loss=0.1316, simple_loss=0.1978, pruned_loss=0.0327, over 4796.00 frames.], tot_loss[loss=0.1569, simple_loss=0.2262, pruned_loss=0.04381, over 972878.31 frames.], batch size: 17, lr: 4.60e-04 2022-05-04 21:20:52,364 INFO [train.py:715] (0/8) Epoch 4, batch 11350, loss[loss=0.1582, simple_loss=0.2309, pruned_loss=0.0428, over 4960.00 frames.], tot_loss[loss=0.1575, simple_loss=0.2268, pruned_loss=0.04408, over 972750.15 frames.], batch size: 15, lr: 4.60e-04 2022-05-04 21:21:31,882 INFO [train.py:715] (0/8) Epoch 4, batch 11400, loss[loss=0.1436, simple_loss=0.2143, pruned_loss=0.03645, over 4785.00 frames.], tot_loss[loss=0.1571, simple_loss=0.2263, pruned_loss=0.04395, over 973211.69 frames.], batch size: 14, lr: 4.59e-04 2022-05-04 21:22:11,703 INFO [train.py:715] (0/8) Epoch 4, batch 11450, loss[loss=0.2087, simple_loss=0.2715, pruned_loss=0.07297, over 4927.00 frames.], tot_loss[loss=0.1571, simple_loss=0.2261, pruned_loss=0.04408, over 973851.42 frames.], batch size: 35, lr: 4.59e-04 2022-05-04 21:22:51,365 INFO [train.py:715] (0/8) Epoch 4, batch 11500, loss[loss=0.1468, simple_loss=0.209, pruned_loss=0.04224, over 4821.00 frames.], tot_loss[loss=0.1555, simple_loss=0.2247, pruned_loss=0.04313, over 973819.29 frames.], batch size: 15, lr: 4.59e-04 2022-05-04 21:23:30,613 INFO [train.py:715] (0/8) Epoch 4, batch 11550, loss[loss=0.1548, simple_loss=0.2262, pruned_loss=0.04169, over 4891.00 frames.], tot_loss[loss=0.1569, simple_loss=0.226, pruned_loss=0.0439, over 973657.12 frames.], batch size: 22, lr: 4.59e-04 2022-05-04 21:24:09,873 INFO [train.py:715] (0/8) Epoch 4, batch 11600, loss[loss=0.1729, simple_loss=0.2403, pruned_loss=0.05275, over 4902.00 frames.], tot_loss[loss=0.1571, simple_loss=0.2264, pruned_loss=0.04389, over 973500.10 frames.], batch size: 19, lr: 4.59e-04 2022-05-04 21:24:50,386 INFO [train.py:715] (0/8) Epoch 4, batch 11650, loss[loss=0.1657, simple_loss=0.2235, pruned_loss=0.05394, over 4968.00 frames.], 
tot_loss[loss=0.1566, simple_loss=0.226, pruned_loss=0.04365, over 973484.00 frames.], batch size: 28, lr: 4.59e-04 2022-05-04 21:25:30,284 INFO [train.py:715] (0/8) Epoch 4, batch 11700, loss[loss=0.1625, simple_loss=0.2375, pruned_loss=0.04382, over 4885.00 frames.], tot_loss[loss=0.1575, simple_loss=0.2266, pruned_loss=0.0442, over 973378.18 frames.], batch size: 16, lr: 4.59e-04 2022-05-04 21:26:10,258 INFO [train.py:715] (0/8) Epoch 4, batch 11750, loss[loss=0.1843, simple_loss=0.2562, pruned_loss=0.05621, over 4988.00 frames.], tot_loss[loss=0.1574, simple_loss=0.2267, pruned_loss=0.04405, over 972760.70 frames.], batch size: 28, lr: 4.59e-04 2022-05-04 21:26:50,008 INFO [train.py:715] (0/8) Epoch 4, batch 11800, loss[loss=0.1564, simple_loss=0.2274, pruned_loss=0.04269, over 4952.00 frames.], tot_loss[loss=0.1573, simple_loss=0.2264, pruned_loss=0.04406, over 972730.11 frames.], batch size: 21, lr: 4.59e-04 2022-05-04 21:27:30,274 INFO [train.py:715] (0/8) Epoch 4, batch 11850, loss[loss=0.1587, simple_loss=0.2346, pruned_loss=0.04137, over 4807.00 frames.], tot_loss[loss=0.1563, simple_loss=0.2257, pruned_loss=0.04346, over 973302.32 frames.], batch size: 25, lr: 4.59e-04 2022-05-04 21:28:09,531 INFO [train.py:715] (0/8) Epoch 4, batch 11900, loss[loss=0.145, simple_loss=0.218, pruned_loss=0.03607, over 4790.00 frames.], tot_loss[loss=0.1572, simple_loss=0.2267, pruned_loss=0.04385, over 972918.98 frames.], batch size: 14, lr: 4.59e-04 2022-05-04 21:28:49,309 INFO [train.py:715] (0/8) Epoch 4, batch 11950, loss[loss=0.1443, simple_loss=0.2141, pruned_loss=0.0372, over 4969.00 frames.], tot_loss[loss=0.1563, simple_loss=0.226, pruned_loss=0.04335, over 972058.45 frames.], batch size: 14, lr: 4.59e-04 2022-05-04 21:29:29,763 INFO [train.py:715] (0/8) Epoch 4, batch 12000, loss[loss=0.1654, simple_loss=0.2469, pruned_loss=0.04194, over 4811.00 frames.], tot_loss[loss=0.1566, simple_loss=0.2264, pruned_loss=0.04338, over 971928.00 frames.], batch size: 21, lr: 4.59e-04 2022-05-04 21:29:29,764 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 21:29:49,526 INFO [train.py:742] (0/8) Epoch 4, validation: loss=0.1122, simple_loss=0.198, pruned_loss=0.01324, over 914524.00 frames. 
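For anyone tracking these curves offline: below is a minimal parsing sketch, not part of icefall, that relies only on the entry format visible above (the "Epoch E, batch B, ... tot_loss[loss=...]" training summaries and the "Epoch E, validation: loss=..." summaries). The path "train.log" is a placeholder for wherever a copy of this log is saved.

import re
from pathlib import Path

# Patterns matched against the entries shown above, e.g.
#   "Epoch 4, batch 12000, loss[...], tot_loss[loss=0.1566, ...], batch size: 21, lr: 4.59e-04"
#   "Epoch 4, validation: loss=0.1122, simple_loss=0.198, pruned_loss=0.01324, over 914524.00 frames."
# re.DOTALL lets a match continue across a line break, since entries in this
# capture are not always one per physical line.
TRAIN_RE = re.compile(r"Epoch (\d+), batch (\d+), .*?tot_loss\[loss=([0-9.]+)", re.DOTALL)
VALID_RE = re.compile(r"Epoch (\d+), validation: loss=([0-9.]+)")

def parse_log(path: str):
    """Return ([(epoch, batch, tot_loss), ...], [(epoch, valid_loss), ...])."""
    text = Path(path).read_text()
    train = [(int(e), int(b), float(l)) for e, b, l in TRAIN_RE.findall(text)]
    valid = [(int(e), float(l)) for e, l in VALID_RE.findall(text)]
    return train, valid

if __name__ == "__main__":
    train, valid = parse_log("train.log")  # placeholder path
    print(f"{len(train)} training points, {len(valid)} validation points")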
2022-05-04 21:30:30,067 INFO [train.py:715] (0/8) Epoch 4, batch 12050, loss[loss=0.145, simple_loss=0.2153, pruned_loss=0.03733, over 4831.00 frames.], tot_loss[loss=0.1569, simple_loss=0.2267, pruned_loss=0.04356, over 972029.05 frames.], batch size: 30, lr: 4.58e-04 2022-05-04 21:31:09,880 INFO [train.py:715] (0/8) Epoch 4, batch 12100, loss[loss=0.1694, simple_loss=0.236, pruned_loss=0.0514, over 4815.00 frames.], tot_loss[loss=0.1569, simple_loss=0.2269, pruned_loss=0.04344, over 972847.60 frames.], batch size: 21, lr: 4.58e-04 2022-05-04 21:31:50,056 INFO [train.py:715] (0/8) Epoch 4, batch 12150, loss[loss=0.1455, simple_loss=0.2203, pruned_loss=0.03533, over 4904.00 frames.], tot_loss[loss=0.1565, simple_loss=0.2262, pruned_loss=0.04334, over 972409.39 frames.], batch size: 22, lr: 4.58e-04 2022-05-04 21:32:30,103 INFO [train.py:715] (0/8) Epoch 4, batch 12200, loss[loss=0.1915, simple_loss=0.263, pruned_loss=0.05999, over 4864.00 frames.], tot_loss[loss=0.1568, simple_loss=0.2264, pruned_loss=0.04353, over 972528.03 frames.], batch size: 38, lr: 4.58e-04 2022-05-04 21:33:10,433 INFO [train.py:715] (0/8) Epoch 4, batch 12250, loss[loss=0.1428, simple_loss=0.2117, pruned_loss=0.03689, over 4966.00 frames.], tot_loss[loss=0.1571, simple_loss=0.2262, pruned_loss=0.04399, over 972151.07 frames.], batch size: 35, lr: 4.58e-04 2022-05-04 21:33:49,416 INFO [train.py:715] (0/8) Epoch 4, batch 12300, loss[loss=0.1253, simple_loss=0.2031, pruned_loss=0.02371, over 4746.00 frames.], tot_loss[loss=0.1572, simple_loss=0.2265, pruned_loss=0.044, over 972449.53 frames.], batch size: 19, lr: 4.58e-04 2022-05-04 21:34:29,434 INFO [train.py:715] (0/8) Epoch 4, batch 12350, loss[loss=0.1568, simple_loss=0.2191, pruned_loss=0.04728, over 4858.00 frames.], tot_loss[loss=0.1586, simple_loss=0.2281, pruned_loss=0.04458, over 972485.47 frames.], batch size: 32, lr: 4.58e-04 2022-05-04 21:35:10,023 INFO [train.py:715] (0/8) Epoch 4, batch 12400, loss[loss=0.1359, simple_loss=0.2072, pruned_loss=0.03227, over 4934.00 frames.], tot_loss[loss=0.1574, simple_loss=0.2272, pruned_loss=0.04379, over 972774.47 frames.], batch size: 29, lr: 4.58e-04 2022-05-04 21:35:49,237 INFO [train.py:715] (0/8) Epoch 4, batch 12450, loss[loss=0.1318, simple_loss=0.2147, pruned_loss=0.02448, over 4747.00 frames.], tot_loss[loss=0.1566, simple_loss=0.2268, pruned_loss=0.04325, over 972891.68 frames.], batch size: 19, lr: 4.58e-04 2022-05-04 21:36:29,200 INFO [train.py:715] (0/8) Epoch 4, batch 12500, loss[loss=0.1404, simple_loss=0.2119, pruned_loss=0.03449, over 4760.00 frames.], tot_loss[loss=0.1567, simple_loss=0.2267, pruned_loss=0.04331, over 972631.55 frames.], batch size: 19, lr: 4.58e-04 2022-05-04 21:37:08,757 INFO [train.py:715] (0/8) Epoch 4, batch 12550, loss[loss=0.1359, simple_loss=0.2031, pruned_loss=0.03435, over 4824.00 frames.], tot_loss[loss=0.1572, simple_loss=0.2272, pruned_loss=0.0436, over 973225.59 frames.], batch size: 26, lr: 4.58e-04 2022-05-04 21:37:48,545 INFO [train.py:715] (0/8) Epoch 4, batch 12600, loss[loss=0.1638, simple_loss=0.238, pruned_loss=0.04483, over 4897.00 frames.], tot_loss[loss=0.1575, simple_loss=0.2276, pruned_loss=0.04369, over 972528.33 frames.], batch size: 17, lr: 4.58e-04 2022-05-04 21:38:27,433 INFO [train.py:715] (0/8) Epoch 4, batch 12650, loss[loss=0.1446, simple_loss=0.2093, pruned_loss=0.04001, over 4925.00 frames.], tot_loss[loss=0.1578, simple_loss=0.2275, pruned_loss=0.044, over 971792.52 frames.], batch size: 17, lr: 4.58e-04 2022-05-04 21:39:07,274 INFO 
[train.py:715] (0/8) Epoch 4, batch 12700, loss[loss=0.1642, simple_loss=0.237, pruned_loss=0.04569, over 4757.00 frames.], tot_loss[loss=0.1584, simple_loss=0.2281, pruned_loss=0.04439, over 971842.83 frames.], batch size: 17, lr: 4.58e-04 2022-05-04 21:39:47,345 INFO [train.py:715] (0/8) Epoch 4, batch 12750, loss[loss=0.1649, simple_loss=0.2354, pruned_loss=0.04723, over 4704.00 frames.], tot_loss[loss=0.1593, simple_loss=0.2289, pruned_loss=0.04485, over 971760.03 frames.], batch size: 15, lr: 4.57e-04 2022-05-04 21:39:56,163 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-152000.pt 2022-05-04 21:40:29,598 INFO [train.py:715] (0/8) Epoch 4, batch 12800, loss[loss=0.1736, simple_loss=0.2275, pruned_loss=0.05987, over 4921.00 frames.], tot_loss[loss=0.1586, simple_loss=0.2282, pruned_loss=0.04452, over 972223.18 frames.], batch size: 17, lr: 4.57e-04 2022-05-04 21:41:08,993 INFO [train.py:715] (0/8) Epoch 4, batch 12850, loss[loss=0.1468, simple_loss=0.2112, pruned_loss=0.04119, over 4962.00 frames.], tot_loss[loss=0.1582, simple_loss=0.2276, pruned_loss=0.04437, over 971791.72 frames.], batch size: 24, lr: 4.57e-04 2022-05-04 21:41:49,122 INFO [train.py:715] (0/8) Epoch 4, batch 12900, loss[loss=0.164, simple_loss=0.2345, pruned_loss=0.0467, over 4965.00 frames.], tot_loss[loss=0.1571, simple_loss=0.2268, pruned_loss=0.04369, over 972163.52 frames.], batch size: 35, lr: 4.57e-04 2022-05-04 21:42:29,047 INFO [train.py:715] (0/8) Epoch 4, batch 12950, loss[loss=0.1507, simple_loss=0.2251, pruned_loss=0.03816, over 4959.00 frames.], tot_loss[loss=0.1583, simple_loss=0.2278, pruned_loss=0.04442, over 971859.12 frames.], batch size: 24, lr: 4.57e-04 2022-05-04 21:43:07,916 INFO [train.py:715] (0/8) Epoch 4, batch 13000, loss[loss=0.1562, simple_loss=0.2162, pruned_loss=0.04808, over 4765.00 frames.], tot_loss[loss=0.1583, simple_loss=0.2275, pruned_loss=0.04454, over 971838.07 frames.], batch size: 14, lr: 4.57e-04 2022-05-04 21:43:47,503 INFO [train.py:715] (0/8) Epoch 4, batch 13050, loss[loss=0.1668, simple_loss=0.2324, pruned_loss=0.05059, over 4841.00 frames.], tot_loss[loss=0.1581, simple_loss=0.2274, pruned_loss=0.04441, over 972217.11 frames.], batch size: 32, lr: 4.57e-04 2022-05-04 21:44:27,465 INFO [train.py:715] (0/8) Epoch 4, batch 13100, loss[loss=0.1309, simple_loss=0.1997, pruned_loss=0.03106, over 4798.00 frames.], tot_loss[loss=0.1582, simple_loss=0.2278, pruned_loss=0.04433, over 972066.92 frames.], batch size: 24, lr: 4.57e-04 2022-05-04 21:45:06,503 INFO [train.py:715] (0/8) Epoch 4, batch 13150, loss[loss=0.1829, simple_loss=0.2428, pruned_loss=0.06153, over 4829.00 frames.], tot_loss[loss=0.159, simple_loss=0.2282, pruned_loss=0.04491, over 972397.43 frames.], batch size: 15, lr: 4.57e-04 2022-05-04 21:45:46,243 INFO [train.py:715] (0/8) Epoch 4, batch 13200, loss[loss=0.1604, simple_loss=0.2245, pruned_loss=0.04817, over 4879.00 frames.], tot_loss[loss=0.1591, simple_loss=0.2279, pruned_loss=0.04514, over 972305.54 frames.], batch size: 22, lr: 4.57e-04 2022-05-04 21:46:26,564 INFO [train.py:715] (0/8) Epoch 4, batch 13250, loss[loss=0.1293, simple_loss=0.2097, pruned_loss=0.02446, over 4823.00 frames.], tot_loss[loss=0.1588, simple_loss=0.2281, pruned_loss=0.04481, over 972332.11 frames.], batch size: 25, lr: 4.57e-04 2022-05-04 21:47:06,171 INFO [train.py:715] (0/8) Epoch 4, batch 13300, loss[loss=0.1576, simple_loss=0.2292, pruned_loss=0.04297, over 4783.00 frames.], tot_loss[loss=0.1577, simple_loss=0.2274, 
pruned_loss=0.04403, over 971900.93 frames.], batch size: 14, lr: 4.57e-04 2022-05-04 21:47:45,756 INFO [train.py:715] (0/8) Epoch 4, batch 13350, loss[loss=0.1606, simple_loss=0.2245, pruned_loss=0.04836, over 4806.00 frames.], tot_loss[loss=0.1577, simple_loss=0.2276, pruned_loss=0.04393, over 971704.56 frames.], batch size: 13, lr: 4.57e-04 2022-05-04 21:48:25,398 INFO [train.py:715] (0/8) Epoch 4, batch 13400, loss[loss=0.1092, simple_loss=0.1824, pruned_loss=0.01803, over 4743.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2277, pruned_loss=0.04401, over 972225.77 frames.], batch size: 12, lr: 4.56e-04 2022-05-04 21:49:05,428 INFO [train.py:715] (0/8) Epoch 4, batch 13450, loss[loss=0.1471, simple_loss=0.2227, pruned_loss=0.03575, over 4979.00 frames.], tot_loss[loss=0.158, simple_loss=0.228, pruned_loss=0.04396, over 971975.84 frames.], batch size: 28, lr: 4.56e-04 2022-05-04 21:49:45,241 INFO [train.py:715] (0/8) Epoch 4, batch 13500, loss[loss=0.1395, simple_loss=0.2152, pruned_loss=0.0319, over 4952.00 frames.], tot_loss[loss=0.1574, simple_loss=0.2276, pruned_loss=0.0436, over 972032.10 frames.], batch size: 35, lr: 4.56e-04 2022-05-04 21:50:27,085 INFO [train.py:715] (0/8) Epoch 4, batch 13550, loss[loss=0.1508, simple_loss=0.2242, pruned_loss=0.03868, over 4780.00 frames.], tot_loss[loss=0.157, simple_loss=0.227, pruned_loss=0.04344, over 971480.80 frames.], batch size: 17, lr: 4.56e-04 2022-05-04 21:51:07,661 INFO [train.py:715] (0/8) Epoch 4, batch 13600, loss[loss=0.1507, simple_loss=0.2244, pruned_loss=0.03848, over 4858.00 frames.], tot_loss[loss=0.1572, simple_loss=0.2273, pruned_loss=0.04351, over 971343.22 frames.], batch size: 20, lr: 4.56e-04 2022-05-04 21:51:47,212 INFO [train.py:715] (0/8) Epoch 4, batch 13650, loss[loss=0.1719, simple_loss=0.243, pruned_loss=0.05045, over 4973.00 frames.], tot_loss[loss=0.1572, simple_loss=0.2272, pruned_loss=0.04359, over 971254.19 frames.], batch size: 28, lr: 4.56e-04 2022-05-04 21:52:26,526 INFO [train.py:715] (0/8) Epoch 4, batch 13700, loss[loss=0.1382, simple_loss=0.2094, pruned_loss=0.03354, over 4875.00 frames.], tot_loss[loss=0.1569, simple_loss=0.227, pruned_loss=0.04346, over 971810.67 frames.], batch size: 22, lr: 4.56e-04 2022-05-04 21:53:06,452 INFO [train.py:715] (0/8) Epoch 4, batch 13750, loss[loss=0.1648, simple_loss=0.2352, pruned_loss=0.04715, over 4888.00 frames.], tot_loss[loss=0.1574, simple_loss=0.2273, pruned_loss=0.04374, over 971749.08 frames.], batch size: 22, lr: 4.56e-04 2022-05-04 21:53:48,113 INFO [train.py:715] (0/8) Epoch 4, batch 13800, loss[loss=0.1518, simple_loss=0.2211, pruned_loss=0.04126, over 4947.00 frames.], tot_loss[loss=0.1572, simple_loss=0.2271, pruned_loss=0.04363, over 972136.46 frames.], batch size: 21, lr: 4.56e-04 2022-05-04 21:54:29,025 INFO [train.py:715] (0/8) Epoch 4, batch 13850, loss[loss=0.1403, simple_loss=0.2127, pruned_loss=0.03395, over 4799.00 frames.], tot_loss[loss=0.1564, simple_loss=0.2261, pruned_loss=0.0433, over 972852.11 frames.], batch size: 24, lr: 4.56e-04 2022-05-04 21:55:10,918 INFO [train.py:715] (0/8) Epoch 4, batch 13900, loss[loss=0.1812, simple_loss=0.2477, pruned_loss=0.05735, over 4782.00 frames.], tot_loss[loss=0.1563, simple_loss=0.2259, pruned_loss=0.04339, over 972793.70 frames.], batch size: 18, lr: 4.56e-04 2022-05-04 21:55:52,336 INFO [train.py:715] (0/8) Epoch 4, batch 13950, loss[loss=0.179, simple_loss=0.2536, pruned_loss=0.05216, over 4895.00 frames.], tot_loss[loss=0.1554, simple_loss=0.2254, pruned_loss=0.04274, over 972263.16 
frames.], batch size: 19, lr: 4.56e-04 2022-05-04 21:56:31,845 INFO [train.py:715] (0/8) Epoch 4, batch 14000, loss[loss=0.1511, simple_loss=0.2196, pruned_loss=0.04136, over 4908.00 frames.], tot_loss[loss=0.1557, simple_loss=0.2256, pruned_loss=0.04295, over 972298.53 frames.], batch size: 19, lr: 4.56e-04 2022-05-04 21:57:12,904 INFO [train.py:715] (0/8) Epoch 4, batch 14050, loss[loss=0.1542, simple_loss=0.2303, pruned_loss=0.03909, over 4706.00 frames.], tot_loss[loss=0.1566, simple_loss=0.2264, pruned_loss=0.04338, over 972333.36 frames.], batch size: 15, lr: 4.55e-04 2022-05-04 21:57:52,566 INFO [train.py:715] (0/8) Epoch 4, batch 14100, loss[loss=0.1732, simple_loss=0.2396, pruned_loss=0.05343, over 4690.00 frames.], tot_loss[loss=0.1573, simple_loss=0.2267, pruned_loss=0.04391, over 971875.78 frames.], batch size: 15, lr: 4.55e-04 2022-05-04 21:58:32,929 INFO [train.py:715] (0/8) Epoch 4, batch 14150, loss[loss=0.147, simple_loss=0.2225, pruned_loss=0.03577, over 4980.00 frames.], tot_loss[loss=0.1564, simple_loss=0.2258, pruned_loss=0.0435, over 972717.26 frames.], batch size: 27, lr: 4.55e-04 2022-05-04 21:59:12,287 INFO [train.py:715] (0/8) Epoch 4, batch 14200, loss[loss=0.1345, simple_loss=0.2206, pruned_loss=0.02417, over 4900.00 frames.], tot_loss[loss=0.1566, simple_loss=0.2263, pruned_loss=0.04342, over 972491.65 frames.], batch size: 17, lr: 4.55e-04 2022-05-04 21:59:51,977 INFO [train.py:715] (0/8) Epoch 4, batch 14250, loss[loss=0.1204, simple_loss=0.1917, pruned_loss=0.02452, over 4857.00 frames.], tot_loss[loss=0.1565, simple_loss=0.2259, pruned_loss=0.04359, over 973334.10 frames.], batch size: 20, lr: 4.55e-04 2022-05-04 22:00:32,132 INFO [train.py:715] (0/8) Epoch 4, batch 14300, loss[loss=0.1359, simple_loss=0.2033, pruned_loss=0.03423, over 4837.00 frames.], tot_loss[loss=0.1572, simple_loss=0.2262, pruned_loss=0.04408, over 973955.74 frames.], batch size: 13, lr: 4.55e-04 2022-05-04 22:01:10,604 INFO [train.py:715] (0/8) Epoch 4, batch 14350, loss[loss=0.1653, simple_loss=0.2316, pruned_loss=0.04951, over 4871.00 frames.], tot_loss[loss=0.1563, simple_loss=0.2258, pruned_loss=0.04346, over 974046.02 frames.], batch size: 20, lr: 4.55e-04 2022-05-04 22:01:50,872 INFO [train.py:715] (0/8) Epoch 4, batch 14400, loss[loss=0.207, simple_loss=0.2567, pruned_loss=0.0787, over 4977.00 frames.], tot_loss[loss=0.1562, simple_loss=0.2258, pruned_loss=0.04328, over 973031.94 frames.], batch size: 15, lr: 4.55e-04 2022-05-04 22:02:30,296 INFO [train.py:715] (0/8) Epoch 4, batch 14450, loss[loss=0.1522, simple_loss=0.2283, pruned_loss=0.03803, over 4901.00 frames.], tot_loss[loss=0.1566, simple_loss=0.2261, pruned_loss=0.04354, over 972502.64 frames.], batch size: 29, lr: 4.55e-04 2022-05-04 22:03:09,287 INFO [train.py:715] (0/8) Epoch 4, batch 14500, loss[loss=0.1669, simple_loss=0.2389, pruned_loss=0.04744, over 4907.00 frames.], tot_loss[loss=0.1561, simple_loss=0.2259, pruned_loss=0.04314, over 972545.25 frames.], batch size: 39, lr: 4.55e-04 2022-05-04 22:03:48,140 INFO [train.py:715] (0/8) Epoch 4, batch 14550, loss[loss=0.1584, simple_loss=0.2349, pruned_loss=0.04093, over 4942.00 frames.], tot_loss[loss=0.1567, simple_loss=0.2265, pruned_loss=0.04347, over 973463.00 frames.], batch size: 23, lr: 4.55e-04 2022-05-04 22:04:27,641 INFO [train.py:715] (0/8) Epoch 4, batch 14600, loss[loss=0.1812, simple_loss=0.2422, pruned_loss=0.06012, over 4783.00 frames.], tot_loss[loss=0.1571, simple_loss=0.2265, pruned_loss=0.04382, over 972386.80 frames.], batch size: 18, lr: 
4.55e-04 2022-05-04 22:05:07,542 INFO [train.py:715] (0/8) Epoch 4, batch 14650, loss[loss=0.123, simple_loss=0.2007, pruned_loss=0.02261, over 4779.00 frames.], tot_loss[loss=0.1572, simple_loss=0.2263, pruned_loss=0.0441, over 972136.50 frames.], batch size: 14, lr: 4.55e-04 2022-05-04 22:05:46,289 INFO [train.py:715] (0/8) Epoch 4, batch 14700, loss[loss=0.1348, simple_loss=0.198, pruned_loss=0.03582, over 4857.00 frames.], tot_loss[loss=0.1561, simple_loss=0.2258, pruned_loss=0.04315, over 972753.71 frames.], batch size: 32, lr: 4.55e-04 2022-05-04 22:06:26,141 INFO [train.py:715] (0/8) Epoch 4, batch 14750, loss[loss=0.1417, simple_loss=0.2109, pruned_loss=0.03621, over 4953.00 frames.], tot_loss[loss=0.1556, simple_loss=0.2254, pruned_loss=0.04283, over 971611.97 frames.], batch size: 21, lr: 4.54e-04 2022-05-04 22:07:06,149 INFO [train.py:715] (0/8) Epoch 4, batch 14800, loss[loss=0.1521, simple_loss=0.2226, pruned_loss=0.04084, over 4756.00 frames.], tot_loss[loss=0.1552, simple_loss=0.2254, pruned_loss=0.04254, over 972227.68 frames.], batch size: 19, lr: 4.54e-04 2022-05-04 22:07:51,024 INFO [train.py:715] (0/8) Epoch 4, batch 14850, loss[loss=0.146, simple_loss=0.2191, pruned_loss=0.0365, over 4833.00 frames.], tot_loss[loss=0.1557, simple_loss=0.2258, pruned_loss=0.04277, over 972075.41 frames.], batch size: 30, lr: 4.54e-04 2022-05-04 22:08:31,249 INFO [train.py:715] (0/8) Epoch 4, batch 14900, loss[loss=0.1432, simple_loss=0.2189, pruned_loss=0.03378, over 4797.00 frames.], tot_loss[loss=0.157, simple_loss=0.2268, pruned_loss=0.0436, over 972452.65 frames.], batch size: 21, lr: 4.54e-04 2022-05-04 22:09:11,332 INFO [train.py:715] (0/8) Epoch 4, batch 14950, loss[loss=0.1472, simple_loss=0.2166, pruned_loss=0.03896, over 4793.00 frames.], tot_loss[loss=0.1567, simple_loss=0.2266, pruned_loss=0.04339, over 972572.59 frames.], batch size: 24, lr: 4.54e-04 2022-05-04 22:09:51,664 INFO [train.py:715] (0/8) Epoch 4, batch 15000, loss[loss=0.1747, simple_loss=0.2416, pruned_loss=0.05391, over 4924.00 frames.], tot_loss[loss=0.1572, simple_loss=0.2271, pruned_loss=0.04365, over 971752.45 frames.], batch size: 18, lr: 4.54e-04 2022-05-04 22:09:51,665 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 22:10:32,004 INFO [train.py:742] (0/8) Epoch 4, validation: loss=0.1122, simple_loss=0.1978, pruned_loss=0.01336, over 914524.00 frames. 
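A pattern worth noting when reading these summaries: to display precision, each reported total is consistent with loss ≈ 0.5 × simple_loss + pruned_loss, for the per-batch losses, the running training averages, and the validation summaries alike. For example, the batch 15000 entry just above gives 0.5 × 0.2271 + 0.04365 ≈ 0.1572 for tot_loss, and the validation summary that follows it gives 0.5 × 0.1978 + 0.01336 ≈ 0.1122. This is an observation about the printed numbers at this stage of training, not a claim about how the loss is weighted elsewhere in the run.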
2022-05-04 22:11:12,729 INFO [train.py:715] (0/8) Epoch 4, batch 15050, loss[loss=0.1534, simple_loss=0.229, pruned_loss=0.03891, over 4916.00 frames.], tot_loss[loss=0.1573, simple_loss=0.2269, pruned_loss=0.04383, over 971550.35 frames.], batch size: 29, lr: 4.54e-04 2022-05-04 22:11:52,174 INFO [train.py:715] (0/8) Epoch 4, batch 15100, loss[loss=0.1344, simple_loss=0.2114, pruned_loss=0.02873, over 4896.00 frames.], tot_loss[loss=0.1573, simple_loss=0.2268, pruned_loss=0.0439, over 972012.04 frames.], batch size: 19, lr: 4.54e-04 2022-05-04 22:12:32,068 INFO [train.py:715] (0/8) Epoch 4, batch 15150, loss[loss=0.1882, simple_loss=0.2489, pruned_loss=0.06374, over 4700.00 frames.], tot_loss[loss=0.157, simple_loss=0.2269, pruned_loss=0.04356, over 971825.23 frames.], batch size: 15, lr: 4.54e-04 2022-05-04 22:13:12,022 INFO [train.py:715] (0/8) Epoch 4, batch 15200, loss[loss=0.1795, simple_loss=0.251, pruned_loss=0.05398, over 4778.00 frames.], tot_loss[loss=0.1578, simple_loss=0.2273, pruned_loss=0.0441, over 971434.84 frames.], batch size: 18, lr: 4.54e-04 2022-05-04 22:13:51,742 INFO [train.py:715] (0/8) Epoch 4, batch 15250, loss[loss=0.1736, simple_loss=0.2416, pruned_loss=0.05286, over 4988.00 frames.], tot_loss[loss=0.1575, simple_loss=0.2267, pruned_loss=0.04413, over 972017.88 frames.], batch size: 25, lr: 4.54e-04 2022-05-04 22:14:31,955 INFO [train.py:715] (0/8) Epoch 4, batch 15300, loss[loss=0.1834, simple_loss=0.2414, pruned_loss=0.06273, over 4912.00 frames.], tot_loss[loss=0.1583, simple_loss=0.2276, pruned_loss=0.04445, over 971755.35 frames.], batch size: 39, lr: 4.54e-04 2022-05-04 22:15:12,418 INFO [train.py:715] (0/8) Epoch 4, batch 15350, loss[loss=0.1194, simple_loss=0.1817, pruned_loss=0.02853, over 4803.00 frames.], tot_loss[loss=0.1565, simple_loss=0.2261, pruned_loss=0.04344, over 972365.94 frames.], batch size: 13, lr: 4.54e-04 2022-05-04 22:15:52,254 INFO [train.py:715] (0/8) Epoch 4, batch 15400, loss[loss=0.1556, simple_loss=0.2284, pruned_loss=0.0414, over 4837.00 frames.], tot_loss[loss=0.1566, simple_loss=0.2259, pruned_loss=0.04361, over 972316.73 frames.], batch size: 13, lr: 4.53e-04 2022-05-04 22:16:32,477 INFO [train.py:715] (0/8) Epoch 4, batch 15450, loss[loss=0.1404, simple_loss=0.2184, pruned_loss=0.03116, over 4781.00 frames.], tot_loss[loss=0.1572, simple_loss=0.2269, pruned_loss=0.0437, over 972953.53 frames.], batch size: 18, lr: 4.53e-04 2022-05-04 22:17:12,930 INFO [train.py:715] (0/8) Epoch 4, batch 15500, loss[loss=0.173, simple_loss=0.2488, pruned_loss=0.04863, over 4772.00 frames.], tot_loss[loss=0.1569, simple_loss=0.2266, pruned_loss=0.04366, over 972839.76 frames.], batch size: 18, lr: 4.53e-04 2022-05-04 22:17:53,282 INFO [train.py:715] (0/8) Epoch 4, batch 15550, loss[loss=0.193, simple_loss=0.2584, pruned_loss=0.06378, over 4947.00 frames.], tot_loss[loss=0.1565, simple_loss=0.2258, pruned_loss=0.04362, over 972559.50 frames.], batch size: 15, lr: 4.53e-04 2022-05-04 22:18:32,665 INFO [train.py:715] (0/8) Epoch 4, batch 15600, loss[loss=0.1591, simple_loss=0.2109, pruned_loss=0.05368, over 4934.00 frames.], tot_loss[loss=0.1571, simple_loss=0.2261, pruned_loss=0.04401, over 971576.66 frames.], batch size: 21, lr: 4.53e-04 2022-05-04 22:19:13,494 INFO [train.py:715] (0/8) Epoch 4, batch 15650, loss[loss=0.1446, simple_loss=0.2224, pruned_loss=0.03343, over 4870.00 frames.], tot_loss[loss=0.1578, simple_loss=0.2266, pruned_loss=0.04452, over 972004.13 frames.], batch size: 22, lr: 4.53e-04 2022-05-04 22:19:53,084 INFO 
[train.py:715] (0/8) Epoch 4, batch 15700, loss[loss=0.1786, simple_loss=0.2307, pruned_loss=0.06324, over 4977.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2267, pruned_loss=0.04459, over 971977.80 frames.], batch size: 14, lr: 4.53e-04 2022-05-04 22:20:33,261 INFO [train.py:715] (0/8) Epoch 4, batch 15750, loss[loss=0.1343, simple_loss=0.2091, pruned_loss=0.02972, over 4929.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2268, pruned_loss=0.0445, over 971975.96 frames.], batch size: 23, lr: 4.53e-04 2022-05-04 22:21:12,807 INFO [train.py:715] (0/8) Epoch 4, batch 15800, loss[loss=0.1634, simple_loss=0.2296, pruned_loss=0.04857, over 4958.00 frames.], tot_loss[loss=0.1569, simple_loss=0.2263, pruned_loss=0.04374, over 972183.47 frames.], batch size: 35, lr: 4.53e-04 2022-05-04 22:21:53,775 INFO [train.py:715] (0/8) Epoch 4, batch 15850, loss[loss=0.1654, simple_loss=0.2399, pruned_loss=0.04541, over 4753.00 frames.], tot_loss[loss=0.1569, simple_loss=0.2259, pruned_loss=0.04392, over 972679.83 frames.], batch size: 19, lr: 4.53e-04 2022-05-04 22:22:34,969 INFO [train.py:715] (0/8) Epoch 4, batch 15900, loss[loss=0.1629, simple_loss=0.2363, pruned_loss=0.04474, over 4799.00 frames.], tot_loss[loss=0.1578, simple_loss=0.227, pruned_loss=0.04427, over 972597.86 frames.], batch size: 24, lr: 4.53e-04 2022-05-04 22:23:14,294 INFO [train.py:715] (0/8) Epoch 4, batch 15950, loss[loss=0.1388, simple_loss=0.2121, pruned_loss=0.03278, over 4959.00 frames.], tot_loss[loss=0.1574, simple_loss=0.2265, pruned_loss=0.04417, over 971581.19 frames.], batch size: 24, lr: 4.53e-04 2022-05-04 22:23:54,443 INFO [train.py:715] (0/8) Epoch 4, batch 16000, loss[loss=0.1496, simple_loss=0.2274, pruned_loss=0.03591, over 4984.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2268, pruned_loss=0.04452, over 971412.88 frames.], batch size: 25, lr: 4.53e-04 2022-05-04 22:24:34,927 INFO [train.py:715] (0/8) Epoch 4, batch 16050, loss[loss=0.2041, simple_loss=0.2775, pruned_loss=0.06533, over 4940.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2271, pruned_loss=0.04437, over 971833.08 frames.], batch size: 21, lr: 4.53e-04 2022-05-04 22:25:14,766 INFO [train.py:715] (0/8) Epoch 4, batch 16100, loss[loss=0.1474, simple_loss=0.2315, pruned_loss=0.03164, over 4800.00 frames.], tot_loss[loss=0.1586, simple_loss=0.2278, pruned_loss=0.04468, over 971677.30 frames.], batch size: 24, lr: 4.52e-04 2022-05-04 22:25:54,141 INFO [train.py:715] (0/8) Epoch 4, batch 16150, loss[loss=0.2051, simple_loss=0.2643, pruned_loss=0.07296, over 4864.00 frames.], tot_loss[loss=0.1583, simple_loss=0.2275, pruned_loss=0.04455, over 972473.72 frames.], batch size: 16, lr: 4.52e-04 2022-05-04 22:26:34,754 INFO [train.py:715] (0/8) Epoch 4, batch 16200, loss[loss=0.159, simple_loss=0.2373, pruned_loss=0.04035, over 4928.00 frames.], tot_loss[loss=0.1578, simple_loss=0.2271, pruned_loss=0.04425, over 972498.64 frames.], batch size: 29, lr: 4.52e-04 2022-05-04 22:27:15,072 INFO [train.py:715] (0/8) Epoch 4, batch 16250, loss[loss=0.1503, simple_loss=0.2278, pruned_loss=0.0364, over 4954.00 frames.], tot_loss[loss=0.1573, simple_loss=0.2268, pruned_loss=0.04392, over 973237.79 frames.], batch size: 21, lr: 4.52e-04 2022-05-04 22:27:54,408 INFO [train.py:715] (0/8) Epoch 4, batch 16300, loss[loss=0.1837, simple_loss=0.2545, pruned_loss=0.05647, over 4974.00 frames.], tot_loss[loss=0.1576, simple_loss=0.2275, pruned_loss=0.0438, over 973356.88 frames.], batch size: 39, lr: 4.52e-04 2022-05-04 22:28:34,986 INFO [train.py:715] (0/8) Epoch 4, batch 
16350, loss[loss=0.1636, simple_loss=0.2257, pruned_loss=0.05076, over 4871.00 frames.], tot_loss[loss=0.1587, simple_loss=0.2285, pruned_loss=0.04446, over 973899.74 frames.], batch size: 32, lr: 4.52e-04 2022-05-04 22:29:15,660 INFO [train.py:715] (0/8) Epoch 4, batch 16400, loss[loss=0.1451, simple_loss=0.2117, pruned_loss=0.03923, over 4806.00 frames.], tot_loss[loss=0.159, simple_loss=0.2286, pruned_loss=0.04468, over 973287.37 frames.], batch size: 21, lr: 4.52e-04 2022-05-04 22:29:56,018 INFO [train.py:715] (0/8) Epoch 4, batch 16450, loss[loss=0.1377, simple_loss=0.2126, pruned_loss=0.0314, over 4906.00 frames.], tot_loss[loss=0.1578, simple_loss=0.2276, pruned_loss=0.04402, over 973136.77 frames.], batch size: 17, lr: 4.52e-04 2022-05-04 22:30:35,454 INFO [train.py:715] (0/8) Epoch 4, batch 16500, loss[loss=0.16, simple_loss=0.227, pruned_loss=0.0465, over 4794.00 frames.], tot_loss[loss=0.1575, simple_loss=0.2269, pruned_loss=0.04408, over 973214.88 frames.], batch size: 14, lr: 4.52e-04 2022-05-04 22:31:15,345 INFO [train.py:715] (0/8) Epoch 4, batch 16550, loss[loss=0.1243, simple_loss=0.2056, pruned_loss=0.02152, over 4780.00 frames.], tot_loss[loss=0.157, simple_loss=0.2268, pruned_loss=0.0436, over 973597.10 frames.], batch size: 18, lr: 4.52e-04 2022-05-04 22:31:55,172 INFO [train.py:715] (0/8) Epoch 4, batch 16600, loss[loss=0.1577, simple_loss=0.2278, pruned_loss=0.04384, over 4772.00 frames.], tot_loss[loss=0.1567, simple_loss=0.2264, pruned_loss=0.04351, over 973267.61 frames.], batch size: 18, lr: 4.52e-04 2022-05-04 22:32:33,988 INFO [train.py:715] (0/8) Epoch 4, batch 16650, loss[loss=0.1521, simple_loss=0.2166, pruned_loss=0.04376, over 4802.00 frames.], tot_loss[loss=0.1562, simple_loss=0.226, pruned_loss=0.04324, over 973039.46 frames.], batch size: 21, lr: 4.52e-04 2022-05-04 22:33:12,846 INFO [train.py:715] (0/8) Epoch 4, batch 16700, loss[loss=0.1498, simple_loss=0.2387, pruned_loss=0.03049, over 4960.00 frames.], tot_loss[loss=0.1562, simple_loss=0.2262, pruned_loss=0.04311, over 972787.64 frames.], batch size: 24, lr: 4.52e-04 2022-05-04 22:33:52,190 INFO [train.py:715] (0/8) Epoch 4, batch 16750, loss[loss=0.1421, simple_loss=0.222, pruned_loss=0.03109, over 4835.00 frames.], tot_loss[loss=0.1564, simple_loss=0.2263, pruned_loss=0.04323, over 973027.38 frames.], batch size: 15, lr: 4.52e-04 2022-05-04 22:34:31,636 INFO [train.py:715] (0/8) Epoch 4, batch 16800, loss[loss=0.1237, simple_loss=0.2013, pruned_loss=0.02302, over 4690.00 frames.], tot_loss[loss=0.1559, simple_loss=0.2258, pruned_loss=0.04303, over 972326.06 frames.], batch size: 15, lr: 4.51e-04 2022-05-04 22:35:10,402 INFO [train.py:715] (0/8) Epoch 4, batch 16850, loss[loss=0.1473, simple_loss=0.2104, pruned_loss=0.04214, over 4786.00 frames.], tot_loss[loss=0.1568, simple_loss=0.2265, pruned_loss=0.04356, over 971529.98 frames.], batch size: 14, lr: 4.51e-04 2022-05-04 22:35:50,739 INFO [train.py:715] (0/8) Epoch 4, batch 16900, loss[loss=0.1606, simple_loss=0.2268, pruned_loss=0.04718, over 4900.00 frames.], tot_loss[loss=0.1578, simple_loss=0.2273, pruned_loss=0.04412, over 972233.93 frames.], batch size: 17, lr: 4.51e-04 2022-05-04 22:36:31,090 INFO [train.py:715] (0/8) Epoch 4, batch 16950, loss[loss=0.1383, simple_loss=0.209, pruned_loss=0.03376, over 4746.00 frames.], tot_loss[loss=0.1586, simple_loss=0.2281, pruned_loss=0.04459, over 972562.92 frames.], batch size: 16, lr: 4.51e-04 2022-05-04 22:37:10,626 INFO [train.py:715] (0/8) Epoch 4, batch 17000, loss[loss=0.1694, 
simple_loss=0.2433, pruned_loss=0.04781, over 4970.00 frames.], tot_loss[loss=0.1575, simple_loss=0.2273, pruned_loss=0.0439, over 972597.17 frames.], batch size: 35, lr: 4.51e-04 2022-05-04 22:37:50,454 INFO [train.py:715] (0/8) Epoch 4, batch 17050, loss[loss=0.1494, simple_loss=0.2204, pruned_loss=0.03921, over 4797.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2276, pruned_loss=0.04407, over 971951.09 frames.], batch size: 24, lr: 4.51e-04 2022-05-04 22:38:30,858 INFO [train.py:715] (0/8) Epoch 4, batch 17100, loss[loss=0.1616, simple_loss=0.2191, pruned_loss=0.05208, over 4783.00 frames.], tot_loss[loss=0.1589, simple_loss=0.2283, pruned_loss=0.04476, over 972919.98 frames.], batch size: 12, lr: 4.51e-04 2022-05-04 22:39:10,965 INFO [train.py:715] (0/8) Epoch 4, batch 17150, loss[loss=0.1601, simple_loss=0.2258, pruned_loss=0.04719, over 4794.00 frames.], tot_loss[loss=0.1588, simple_loss=0.2282, pruned_loss=0.04466, over 972677.83 frames.], batch size: 12, lr: 4.51e-04 2022-05-04 22:39:50,106 INFO [train.py:715] (0/8) Epoch 4, batch 17200, loss[loss=0.1219, simple_loss=0.2004, pruned_loss=0.02165, over 4703.00 frames.], tot_loss[loss=0.1587, simple_loss=0.2281, pruned_loss=0.0447, over 972574.58 frames.], batch size: 12, lr: 4.51e-04 2022-05-04 22:40:30,247 INFO [train.py:715] (0/8) Epoch 4, batch 17250, loss[loss=0.1248, simple_loss=0.1966, pruned_loss=0.02648, over 4817.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2275, pruned_loss=0.04422, over 972903.43 frames.], batch size: 25, lr: 4.51e-04 2022-05-04 22:41:10,206 INFO [train.py:715] (0/8) Epoch 4, batch 17300, loss[loss=0.1433, simple_loss=0.2166, pruned_loss=0.03497, over 4986.00 frames.], tot_loss[loss=0.1581, simple_loss=0.2274, pruned_loss=0.04442, over 972702.61 frames.], batch size: 14, lr: 4.51e-04 2022-05-04 22:41:49,928 INFO [train.py:715] (0/8) Epoch 4, batch 17350, loss[loss=0.141, simple_loss=0.2088, pruned_loss=0.03658, over 4916.00 frames.], tot_loss[loss=0.1571, simple_loss=0.2262, pruned_loss=0.04404, over 972603.61 frames.], batch size: 23, lr: 4.51e-04 2022-05-04 22:42:29,446 INFO [train.py:715] (0/8) Epoch 4, batch 17400, loss[loss=0.1744, simple_loss=0.2541, pruned_loss=0.04738, over 4989.00 frames.], tot_loss[loss=0.1572, simple_loss=0.2265, pruned_loss=0.04391, over 972573.62 frames.], batch size: 14, lr: 4.51e-04 2022-05-04 22:43:09,753 INFO [train.py:715] (0/8) Epoch 4, batch 17450, loss[loss=0.1466, simple_loss=0.2202, pruned_loss=0.03647, over 4726.00 frames.], tot_loss[loss=0.1565, simple_loss=0.2261, pruned_loss=0.0435, over 972707.15 frames.], batch size: 12, lr: 4.51e-04 2022-05-04 22:43:50,026 INFO [train.py:715] (0/8) Epoch 4, batch 17500, loss[loss=0.1467, simple_loss=0.214, pruned_loss=0.03965, over 4882.00 frames.], tot_loss[loss=0.1574, simple_loss=0.2267, pruned_loss=0.04401, over 972919.18 frames.], batch size: 22, lr: 4.50e-04 2022-05-04 22:44:29,247 INFO [train.py:715] (0/8) Epoch 4, batch 17550, loss[loss=0.1111, simple_loss=0.181, pruned_loss=0.02056, over 4791.00 frames.], tot_loss[loss=0.1555, simple_loss=0.2253, pruned_loss=0.04283, over 972517.56 frames.], batch size: 12, lr: 4.50e-04 2022-05-04 22:45:09,116 INFO [train.py:715] (0/8) Epoch 4, batch 17600, loss[loss=0.1704, simple_loss=0.2433, pruned_loss=0.04878, over 4788.00 frames.], tot_loss[loss=0.1562, simple_loss=0.2254, pruned_loss=0.04347, over 972584.98 frames.], batch size: 24, lr: 4.50e-04 2022-05-04 22:45:49,514 INFO [train.py:715] (0/8) Epoch 4, batch 17650, loss[loss=0.1334, simple_loss=0.2035, 
pruned_loss=0.03161, over 4960.00 frames.], tot_loss[loss=0.1558, simple_loss=0.225, pruned_loss=0.04324, over 972018.27 frames.], batch size: 15, lr: 4.50e-04 2022-05-04 22:46:29,570 INFO [train.py:715] (0/8) Epoch 4, batch 17700, loss[loss=0.1384, simple_loss=0.2047, pruned_loss=0.03599, over 4867.00 frames.], tot_loss[loss=0.1557, simple_loss=0.2252, pruned_loss=0.04316, over 971848.24 frames.], batch size: 38, lr: 4.50e-04 2022-05-04 22:47:09,160 INFO [train.py:715] (0/8) Epoch 4, batch 17750, loss[loss=0.1548, simple_loss=0.2261, pruned_loss=0.04181, over 4946.00 frames.], tot_loss[loss=0.1561, simple_loss=0.2253, pruned_loss=0.04345, over 972806.88 frames.], batch size: 15, lr: 4.50e-04 2022-05-04 22:47:49,259 INFO [train.py:715] (0/8) Epoch 4, batch 17800, loss[loss=0.1558, simple_loss=0.2319, pruned_loss=0.03983, over 4921.00 frames.], tot_loss[loss=0.1571, simple_loss=0.2261, pruned_loss=0.04402, over 972862.99 frames.], batch size: 18, lr: 4.50e-04 2022-05-04 22:48:29,925 INFO [train.py:715] (0/8) Epoch 4, batch 17850, loss[loss=0.1733, simple_loss=0.2406, pruned_loss=0.053, over 4749.00 frames.], tot_loss[loss=0.1576, simple_loss=0.2264, pruned_loss=0.04444, over 973347.82 frames.], batch size: 19, lr: 4.50e-04 2022-05-04 22:49:09,025 INFO [train.py:715] (0/8) Epoch 4, batch 17900, loss[loss=0.1409, simple_loss=0.2251, pruned_loss=0.02833, over 4851.00 frames.], tot_loss[loss=0.1577, simple_loss=0.2266, pruned_loss=0.04444, over 972924.06 frames.], batch size: 20, lr: 4.50e-04 2022-05-04 22:49:49,021 INFO [train.py:715] (0/8) Epoch 4, batch 17950, loss[loss=0.1985, simple_loss=0.2465, pruned_loss=0.07519, over 4914.00 frames.], tot_loss[loss=0.1578, simple_loss=0.2265, pruned_loss=0.04457, over 972833.45 frames.], batch size: 17, lr: 4.50e-04 2022-05-04 22:50:29,181 INFO [train.py:715] (0/8) Epoch 4, batch 18000, loss[loss=0.1464, simple_loss=0.2188, pruned_loss=0.03696, over 4932.00 frames.], tot_loss[loss=0.1588, simple_loss=0.2276, pruned_loss=0.04502, over 972883.15 frames.], batch size: 21, lr: 4.50e-04 2022-05-04 22:50:29,182 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 22:50:38,825 INFO [train.py:742] (0/8) Epoch 4, validation: loss=0.1119, simple_loss=0.1976, pruned_loss=0.01313, over 914524.00 frames. 
2022-05-04 22:51:19,283 INFO [train.py:715] (0/8) Epoch 4, batch 18050, loss[loss=0.1463, simple_loss=0.2191, pruned_loss=0.03669, over 4848.00 frames.], tot_loss[loss=0.1585, simple_loss=0.2276, pruned_loss=0.04472, over 972595.46 frames.], batch size: 32, lr: 4.50e-04 2022-05-04 22:51:59,523 INFO [train.py:715] (0/8) Epoch 4, batch 18100, loss[loss=0.1825, simple_loss=0.2429, pruned_loss=0.061, over 4760.00 frames.], tot_loss[loss=0.1586, simple_loss=0.2276, pruned_loss=0.0448, over 972993.09 frames.], batch size: 16, lr: 4.50e-04 2022-05-04 22:52:39,099 INFO [train.py:715] (0/8) Epoch 4, batch 18150, loss[loss=0.1587, simple_loss=0.2218, pruned_loss=0.04781, over 4746.00 frames.], tot_loss[loss=0.1585, simple_loss=0.2278, pruned_loss=0.04466, over 972381.27 frames.], batch size: 16, lr: 4.50e-04 2022-05-04 22:53:19,407 INFO [train.py:715] (0/8) Epoch 4, batch 18200, loss[loss=0.1322, simple_loss=0.2058, pruned_loss=0.02926, over 4751.00 frames.], tot_loss[loss=0.1587, simple_loss=0.228, pruned_loss=0.04473, over 972222.24 frames.], batch size: 19, lr: 4.49e-04 2022-05-04 22:53:59,872 INFO [train.py:715] (0/8) Epoch 4, batch 18250, loss[loss=0.1659, simple_loss=0.2289, pruned_loss=0.05139, over 4920.00 frames.], tot_loss[loss=0.158, simple_loss=0.2273, pruned_loss=0.04434, over 972575.78 frames.], batch size: 18, lr: 4.49e-04 2022-05-04 22:54:39,573 INFO [train.py:715] (0/8) Epoch 4, batch 18300, loss[loss=0.1585, simple_loss=0.2375, pruned_loss=0.03973, over 4753.00 frames.], tot_loss[loss=0.1569, simple_loss=0.2265, pruned_loss=0.04369, over 972175.44 frames.], batch size: 19, lr: 4.49e-04 2022-05-04 22:55:19,282 INFO [train.py:715] (0/8) Epoch 4, batch 18350, loss[loss=0.1583, simple_loss=0.2301, pruned_loss=0.04321, over 4981.00 frames.], tot_loss[loss=0.1571, simple_loss=0.2268, pruned_loss=0.04371, over 971589.75 frames.], batch size: 14, lr: 4.49e-04 2022-05-04 22:56:00,396 INFO [train.py:715] (0/8) Epoch 4, batch 18400, loss[loss=0.1336, simple_loss=0.2066, pruned_loss=0.03028, over 4777.00 frames.], tot_loss[loss=0.1571, simple_loss=0.2268, pruned_loss=0.04365, over 972352.12 frames.], batch size: 18, lr: 4.49e-04 2022-05-04 22:56:40,808 INFO [train.py:715] (0/8) Epoch 4, batch 18450, loss[loss=0.169, simple_loss=0.2423, pruned_loss=0.0478, over 4802.00 frames.], tot_loss[loss=0.157, simple_loss=0.2263, pruned_loss=0.04389, over 971587.75 frames.], batch size: 24, lr: 4.49e-04 2022-05-04 22:57:20,897 INFO [train.py:715] (0/8) Epoch 4, batch 18500, loss[loss=0.1777, simple_loss=0.2361, pruned_loss=0.05967, over 4821.00 frames.], tot_loss[loss=0.1561, simple_loss=0.2258, pruned_loss=0.04326, over 971415.28 frames.], batch size: 26, lr: 4.49e-04 2022-05-04 22:58:01,189 INFO [train.py:715] (0/8) Epoch 4, batch 18550, loss[loss=0.179, simple_loss=0.2476, pruned_loss=0.05517, over 4956.00 frames.], tot_loss[loss=0.1575, simple_loss=0.2269, pruned_loss=0.04409, over 970806.08 frames.], batch size: 24, lr: 4.49e-04 2022-05-04 22:58:41,890 INFO [train.py:715] (0/8) Epoch 4, batch 18600, loss[loss=0.1401, simple_loss=0.2185, pruned_loss=0.03084, over 4814.00 frames.], tot_loss[loss=0.1569, simple_loss=0.2265, pruned_loss=0.04367, over 971744.71 frames.], batch size: 27, lr: 4.49e-04 2022-05-04 22:59:21,457 INFO [train.py:715] (0/8) Epoch 4, batch 18650, loss[loss=0.148, simple_loss=0.2121, pruned_loss=0.04194, over 4972.00 frames.], tot_loss[loss=0.1563, simple_loss=0.226, pruned_loss=0.04326, over 973283.86 frames.], batch size: 35, lr: 4.49e-04 2022-05-04 23:00:01,607 INFO 
[train.py:715] (0/8) Epoch 4, batch 18700, loss[loss=0.1591, simple_loss=0.2303, pruned_loss=0.04397, over 4938.00 frames.], tot_loss[loss=0.1566, simple_loss=0.2266, pruned_loss=0.04335, over 972641.47 frames.], batch size: 21, lr: 4.49e-04 2022-05-04 23:00:42,468 INFO [train.py:715] (0/8) Epoch 4, batch 18750, loss[loss=0.1421, simple_loss=0.21, pruned_loss=0.03712, over 4807.00 frames.], tot_loss[loss=0.1561, simple_loss=0.2258, pruned_loss=0.04319, over 973110.97 frames.], batch size: 21, lr: 4.49e-04 2022-05-04 23:01:21,944 INFO [train.py:715] (0/8) Epoch 4, batch 18800, loss[loss=0.1404, simple_loss=0.2183, pruned_loss=0.03122, over 4869.00 frames.], tot_loss[loss=0.1552, simple_loss=0.225, pruned_loss=0.04268, over 973139.37 frames.], batch size: 20, lr: 4.49e-04 2022-05-04 23:02:02,027 INFO [train.py:715] (0/8) Epoch 4, batch 18850, loss[loss=0.162, simple_loss=0.229, pruned_loss=0.04752, over 4949.00 frames.], tot_loss[loss=0.1552, simple_loss=0.2249, pruned_loss=0.04271, over 972684.38 frames.], batch size: 21, lr: 4.49e-04 2022-05-04 23:02:42,431 INFO [train.py:715] (0/8) Epoch 4, batch 18900, loss[loss=0.1615, simple_loss=0.2375, pruned_loss=0.04279, over 4849.00 frames.], tot_loss[loss=0.1565, simple_loss=0.226, pruned_loss=0.04348, over 973380.79 frames.], batch size: 13, lr: 4.48e-04 2022-05-04 23:03:22,745 INFO [train.py:715] (0/8) Epoch 4, batch 18950, loss[loss=0.161, simple_loss=0.2299, pruned_loss=0.04603, over 4797.00 frames.], tot_loss[loss=0.1576, simple_loss=0.2268, pruned_loss=0.04425, over 972838.40 frames.], batch size: 21, lr: 4.48e-04 2022-05-04 23:04:01,998 INFO [train.py:715] (0/8) Epoch 4, batch 19000, loss[loss=0.1235, simple_loss=0.1971, pruned_loss=0.02497, over 4977.00 frames.], tot_loss[loss=0.1568, simple_loss=0.2256, pruned_loss=0.044, over 972705.42 frames.], batch size: 14, lr: 4.48e-04 2022-05-04 23:04:42,518 INFO [train.py:715] (0/8) Epoch 4, batch 19050, loss[loss=0.1323, simple_loss=0.2059, pruned_loss=0.02929, over 4979.00 frames.], tot_loss[loss=0.1563, simple_loss=0.2252, pruned_loss=0.04373, over 972653.66 frames.], batch size: 15, lr: 4.48e-04 2022-05-04 23:05:23,218 INFO [train.py:715] (0/8) Epoch 4, batch 19100, loss[loss=0.161, simple_loss=0.2355, pruned_loss=0.04327, over 4798.00 frames.], tot_loss[loss=0.1557, simple_loss=0.2251, pruned_loss=0.04317, over 972015.61 frames.], batch size: 14, lr: 4.48e-04 2022-05-04 23:06:03,169 INFO [train.py:715] (0/8) Epoch 4, batch 19150, loss[loss=0.1624, simple_loss=0.228, pruned_loss=0.04844, over 4862.00 frames.], tot_loss[loss=0.1558, simple_loss=0.2254, pruned_loss=0.04306, over 971612.10 frames.], batch size: 13, lr: 4.48e-04 2022-05-04 23:06:43,530 INFO [train.py:715] (0/8) Epoch 4, batch 19200, loss[loss=0.1263, simple_loss=0.2045, pruned_loss=0.02402, over 4844.00 frames.], tot_loss[loss=0.1565, simple_loss=0.2258, pruned_loss=0.04355, over 972031.72 frames.], batch size: 20, lr: 4.48e-04 2022-05-04 23:07:24,305 INFO [train.py:715] (0/8) Epoch 4, batch 19250, loss[loss=0.1491, simple_loss=0.2223, pruned_loss=0.03792, over 4957.00 frames.], tot_loss[loss=0.1564, simple_loss=0.2258, pruned_loss=0.04347, over 971523.33 frames.], batch size: 39, lr: 4.48e-04 2022-05-04 23:08:04,907 INFO [train.py:715] (0/8) Epoch 4, batch 19300, loss[loss=0.153, simple_loss=0.2123, pruned_loss=0.04683, over 4878.00 frames.], tot_loss[loss=0.1558, simple_loss=0.2253, pruned_loss=0.04316, over 971166.94 frames.], batch size: 32, lr: 4.48e-04 2022-05-04 23:08:44,082 INFO [train.py:715] (0/8) Epoch 4, batch 19350, 
loss[loss=0.1253, simple_loss=0.1936, pruned_loss=0.02849, over 4947.00 frames.], tot_loss[loss=0.1552, simple_loss=0.2247, pruned_loss=0.0428, over 971342.72 frames.], batch size: 29, lr: 4.48e-04 2022-05-04 23:09:24,776 INFO [train.py:715] (0/8) Epoch 4, batch 19400, loss[loss=0.1238, simple_loss=0.1995, pruned_loss=0.02403, over 4764.00 frames.], tot_loss[loss=0.1536, simple_loss=0.2235, pruned_loss=0.0418, over 971257.96 frames.], batch size: 12, lr: 4.48e-04 2022-05-04 23:10:06,277 INFO [train.py:715] (0/8) Epoch 4, batch 19450, loss[loss=0.1589, simple_loss=0.2129, pruned_loss=0.05247, over 4887.00 frames.], tot_loss[loss=0.1551, simple_loss=0.2249, pruned_loss=0.04263, over 972021.34 frames.], batch size: 32, lr: 4.48e-04 2022-05-04 23:10:47,441 INFO [train.py:715] (0/8) Epoch 4, batch 19500, loss[loss=0.15, simple_loss=0.2182, pruned_loss=0.04089, over 4782.00 frames.], tot_loss[loss=0.1565, simple_loss=0.226, pruned_loss=0.04352, over 972112.57 frames.], batch size: 17, lr: 4.48e-04 2022-05-04 23:11:27,082 INFO [train.py:715] (0/8) Epoch 4, batch 19550, loss[loss=0.1487, simple_loss=0.2197, pruned_loss=0.0389, over 4842.00 frames.], tot_loss[loss=0.1581, simple_loss=0.2272, pruned_loss=0.04446, over 972401.91 frames.], batch size: 13, lr: 4.48e-04 2022-05-04 23:12:07,479 INFO [train.py:715] (0/8) Epoch 4, batch 19600, loss[loss=0.1647, simple_loss=0.2307, pruned_loss=0.04939, over 4781.00 frames.], tot_loss[loss=0.1576, simple_loss=0.2266, pruned_loss=0.04428, over 972606.79 frames.], batch size: 14, lr: 4.47e-04 2022-05-04 23:12:47,701 INFO [train.py:715] (0/8) Epoch 4, batch 19650, loss[loss=0.1778, simple_loss=0.2429, pruned_loss=0.05636, over 4747.00 frames.], tot_loss[loss=0.1577, simple_loss=0.227, pruned_loss=0.04423, over 972845.50 frames.], batch size: 16, lr: 4.47e-04 2022-05-04 23:13:26,471 INFO [train.py:715] (0/8) Epoch 4, batch 19700, loss[loss=0.1498, simple_loss=0.2167, pruned_loss=0.04147, over 4958.00 frames.], tot_loss[loss=0.1575, simple_loss=0.2268, pruned_loss=0.04406, over 973293.90 frames.], batch size: 15, lr: 4.47e-04 2022-05-04 23:14:07,152 INFO [train.py:715] (0/8) Epoch 4, batch 19750, loss[loss=0.1471, simple_loss=0.2152, pruned_loss=0.03951, over 4827.00 frames.], tot_loss[loss=0.1575, simple_loss=0.2273, pruned_loss=0.04384, over 972518.98 frames.], batch size: 27, lr: 4.47e-04 2022-05-04 23:14:47,972 INFO [train.py:715] (0/8) Epoch 4, batch 19800, loss[loss=0.15, simple_loss=0.2317, pruned_loss=0.03414, over 4887.00 frames.], tot_loss[loss=0.1568, simple_loss=0.2268, pruned_loss=0.04341, over 973753.98 frames.], batch size: 22, lr: 4.47e-04 2022-05-04 23:15:27,714 INFO [train.py:715] (0/8) Epoch 4, batch 19850, loss[loss=0.2114, simple_loss=0.248, pruned_loss=0.0874, over 4790.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2276, pruned_loss=0.04412, over 973381.09 frames.], batch size: 12, lr: 4.47e-04 2022-05-04 23:16:07,787 INFO [train.py:715] (0/8) Epoch 4, batch 19900, loss[loss=0.1581, simple_loss=0.22, pruned_loss=0.04806, over 4960.00 frames.], tot_loss[loss=0.158, simple_loss=0.2274, pruned_loss=0.04434, over 973419.17 frames.], batch size: 35, lr: 4.47e-04 2022-05-04 23:16:47,901 INFO [train.py:715] (0/8) Epoch 4, batch 19950, loss[loss=0.1324, simple_loss=0.2063, pruned_loss=0.0293, over 4940.00 frames.], tot_loss[loss=0.1578, simple_loss=0.2273, pruned_loss=0.04414, over 972685.89 frames.], batch size: 21, lr: 4.47e-04 2022-05-04 23:17:28,065 INFO [train.py:715] (0/8) Epoch 4, batch 20000, loss[loss=0.1544, simple_loss=0.2177, 
pruned_loss=0.0455, over 4812.00 frames.], tot_loss[loss=0.1577, simple_loss=0.2273, pruned_loss=0.04407, over 973211.47 frames.], batch size: 21, lr: 4.47e-04 2022-05-04 23:18:06,766 INFO [train.py:715] (0/8) Epoch 4, batch 20050, loss[loss=0.2072, simple_loss=0.2761, pruned_loss=0.06914, over 4871.00 frames.], tot_loss[loss=0.1583, simple_loss=0.2278, pruned_loss=0.04441, over 972929.39 frames.], batch size: 38, lr: 4.47e-04 2022-05-04 23:18:46,562 INFO [train.py:715] (0/8) Epoch 4, batch 20100, loss[loss=0.1821, simple_loss=0.2417, pruned_loss=0.06123, over 4779.00 frames.], tot_loss[loss=0.1595, simple_loss=0.2287, pruned_loss=0.04515, over 972442.78 frames.], batch size: 14, lr: 4.47e-04 2022-05-04 23:19:26,629 INFO [train.py:715] (0/8) Epoch 4, batch 20150, loss[loss=0.1285, simple_loss=0.2016, pruned_loss=0.0277, over 4967.00 frames.], tot_loss[loss=0.1583, simple_loss=0.2275, pruned_loss=0.04455, over 972018.79 frames.], batch size: 28, lr: 4.47e-04 2022-05-04 23:20:06,055 INFO [train.py:715] (0/8) Epoch 4, batch 20200, loss[loss=0.1671, simple_loss=0.2411, pruned_loss=0.0465, over 4949.00 frames.], tot_loss[loss=0.1581, simple_loss=0.2279, pruned_loss=0.04418, over 972753.12 frames.], batch size: 21, lr: 4.47e-04 2022-05-04 23:20:45,799 INFO [train.py:715] (0/8) Epoch 4, batch 20250, loss[loss=0.1949, simple_loss=0.2413, pruned_loss=0.07427, over 4676.00 frames.], tot_loss[loss=0.1587, simple_loss=0.228, pruned_loss=0.04473, over 972907.85 frames.], batch size: 13, lr: 4.47e-04 2022-05-04 23:21:26,114 INFO [train.py:715] (0/8) Epoch 4, batch 20300, loss[loss=0.1645, simple_loss=0.2289, pruned_loss=0.05004, over 4764.00 frames.], tot_loss[loss=0.1584, simple_loss=0.2279, pruned_loss=0.04443, over 972186.25 frames.], batch size: 18, lr: 4.46e-04 2022-05-04 23:22:06,212 INFO [train.py:715] (0/8) Epoch 4, batch 20350, loss[loss=0.1416, simple_loss=0.213, pruned_loss=0.03514, over 4905.00 frames.], tot_loss[loss=0.1577, simple_loss=0.2275, pruned_loss=0.04397, over 971882.09 frames.], batch size: 17, lr: 4.46e-04 2022-05-04 23:22:45,049 INFO [train.py:715] (0/8) Epoch 4, batch 20400, loss[loss=0.1658, simple_loss=0.2447, pruned_loss=0.04349, over 4925.00 frames.], tot_loss[loss=0.1569, simple_loss=0.2268, pruned_loss=0.04344, over 972239.20 frames.], batch size: 39, lr: 4.46e-04 2022-05-04 23:23:25,033 INFO [train.py:715] (0/8) Epoch 4, batch 20450, loss[loss=0.1791, simple_loss=0.2476, pruned_loss=0.05528, over 4697.00 frames.], tot_loss[loss=0.1573, simple_loss=0.2273, pruned_loss=0.04364, over 971929.95 frames.], batch size: 15, lr: 4.46e-04 2022-05-04 23:24:04,973 INFO [train.py:715] (0/8) Epoch 4, batch 20500, loss[loss=0.1729, simple_loss=0.2372, pruned_loss=0.05431, over 4862.00 frames.], tot_loss[loss=0.1578, simple_loss=0.2274, pruned_loss=0.0441, over 972151.17 frames.], batch size: 32, lr: 4.46e-04 2022-05-04 23:24:44,750 INFO [train.py:715] (0/8) Epoch 4, batch 20550, loss[loss=0.1746, simple_loss=0.2472, pruned_loss=0.05096, over 4829.00 frames.], tot_loss[loss=0.1577, simple_loss=0.2273, pruned_loss=0.04407, over 972171.41 frames.], batch size: 15, lr: 4.46e-04 2022-05-04 23:25:23,722 INFO [train.py:715] (0/8) Epoch 4, batch 20600, loss[loss=0.117, simple_loss=0.1937, pruned_loss=0.0201, over 4807.00 frames.], tot_loss[loss=0.1593, simple_loss=0.2289, pruned_loss=0.0448, over 972514.99 frames.], batch size: 13, lr: 4.46e-04 2022-05-04 23:26:03,666 INFO [train.py:715] (0/8) Epoch 4, batch 20650, loss[loss=0.1268, simple_loss=0.1925, pruned_loss=0.03058, over 4787.00 
frames.], tot_loss[loss=0.1577, simple_loss=0.2274, pruned_loss=0.044, over 972962.82 frames.], batch size: 17, lr: 4.46e-04 2022-05-04 23:26:44,155 INFO [train.py:715] (0/8) Epoch 4, batch 20700, loss[loss=0.1694, simple_loss=0.2517, pruned_loss=0.04352, over 4919.00 frames.], tot_loss[loss=0.1562, simple_loss=0.2258, pruned_loss=0.04329, over 972538.47 frames.], batch size: 39, lr: 4.46e-04 2022-05-04 23:27:22,806 INFO [train.py:715] (0/8) Epoch 4, batch 20750, loss[loss=0.1254, simple_loss=0.1958, pruned_loss=0.02754, over 4851.00 frames.], tot_loss[loss=0.157, simple_loss=0.2267, pruned_loss=0.04371, over 972635.20 frames.], batch size: 15, lr: 4.46e-04 2022-05-04 23:27:32,127 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-160000.pt 2022-05-04 23:28:04,813 INFO [train.py:715] (0/8) Epoch 4, batch 20800, loss[loss=0.1536, simple_loss=0.2174, pruned_loss=0.0449, over 4983.00 frames.], tot_loss[loss=0.1568, simple_loss=0.2261, pruned_loss=0.04369, over 972717.47 frames.], batch size: 15, lr: 4.46e-04 2022-05-04 23:28:44,598 INFO [train.py:715] (0/8) Epoch 4, batch 20850, loss[loss=0.1628, simple_loss=0.2286, pruned_loss=0.04853, over 4762.00 frames.], tot_loss[loss=0.1567, simple_loss=0.2259, pruned_loss=0.04377, over 972043.65 frames.], batch size: 19, lr: 4.46e-04 2022-05-04 23:29:24,434 INFO [train.py:715] (0/8) Epoch 4, batch 20900, loss[loss=0.1402, simple_loss=0.2067, pruned_loss=0.03681, over 4949.00 frames.], tot_loss[loss=0.1568, simple_loss=0.2263, pruned_loss=0.0437, over 972758.20 frames.], batch size: 21, lr: 4.46e-04 2022-05-04 23:30:03,470 INFO [train.py:715] (0/8) Epoch 4, batch 20950, loss[loss=0.152, simple_loss=0.2121, pruned_loss=0.04598, over 4837.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2276, pruned_loss=0.0441, over 973284.63 frames.], batch size: 30, lr: 4.46e-04 2022-05-04 23:30:43,446 INFO [train.py:715] (0/8) Epoch 4, batch 21000, loss[loss=0.1399, simple_loss=0.2153, pruned_loss=0.03223, over 4965.00 frames.], tot_loss[loss=0.1563, simple_loss=0.2258, pruned_loss=0.04339, over 972919.40 frames.], batch size: 24, lr: 4.46e-04 2022-05-04 23:30:43,447 INFO [train.py:733] (0/8) Computing validation loss 2022-05-04 23:30:52,895 INFO [train.py:742] (0/8) Epoch 4, validation: loss=0.1116, simple_loss=0.1973, pruned_loss=0.01293, over 914524.00 frames. 
2022-05-04 23:31:33,187 INFO [train.py:715] (0/8) Epoch 4, batch 21050, loss[loss=0.1818, simple_loss=0.2451, pruned_loss=0.0592, over 4913.00 frames.], tot_loss[loss=0.1563, simple_loss=0.2257, pruned_loss=0.04344, over 972033.82 frames.], batch size: 17, lr: 4.45e-04 2022-05-04 23:32:12,980 INFO [train.py:715] (0/8) Epoch 4, batch 21100, loss[loss=0.1375, simple_loss=0.21, pruned_loss=0.03246, over 4788.00 frames.], tot_loss[loss=0.1564, simple_loss=0.2259, pruned_loss=0.04348, over 972492.73 frames.], batch size: 18, lr: 4.45e-04 2022-05-04 23:32:52,570 INFO [train.py:715] (0/8) Epoch 4, batch 21150, loss[loss=0.1487, simple_loss=0.2229, pruned_loss=0.03725, over 4706.00 frames.], tot_loss[loss=0.1571, simple_loss=0.2266, pruned_loss=0.0438, over 972157.10 frames.], batch size: 15, lr: 4.45e-04 2022-05-04 23:33:32,148 INFO [train.py:715] (0/8) Epoch 4, batch 21200, loss[loss=0.1413, simple_loss=0.2008, pruned_loss=0.04094, over 4817.00 frames.], tot_loss[loss=0.1582, simple_loss=0.2273, pruned_loss=0.04451, over 971478.29 frames.], batch size: 26, lr: 4.45e-04 2022-05-04 23:34:12,366 INFO [train.py:715] (0/8) Epoch 4, batch 21250, loss[loss=0.151, simple_loss=0.2225, pruned_loss=0.03976, over 4848.00 frames.], tot_loss[loss=0.1575, simple_loss=0.2264, pruned_loss=0.04432, over 971996.04 frames.], batch size: 20, lr: 4.45e-04 2022-05-04 23:34:51,186 INFO [train.py:715] (0/8) Epoch 4, batch 21300, loss[loss=0.1467, simple_loss=0.2205, pruned_loss=0.03648, over 4893.00 frames.], tot_loss[loss=0.1568, simple_loss=0.2258, pruned_loss=0.04389, over 971257.09 frames.], batch size: 16, lr: 4.45e-04 2022-05-04 23:35:30,244 INFO [train.py:715] (0/8) Epoch 4, batch 21350, loss[loss=0.1849, simple_loss=0.2488, pruned_loss=0.0605, over 4893.00 frames.], tot_loss[loss=0.1563, simple_loss=0.2254, pruned_loss=0.0436, over 970903.33 frames.], batch size: 17, lr: 4.45e-04 2022-05-04 23:36:09,894 INFO [train.py:715] (0/8) Epoch 4, batch 21400, loss[loss=0.1575, simple_loss=0.2352, pruned_loss=0.03985, over 4866.00 frames.], tot_loss[loss=0.1559, simple_loss=0.225, pruned_loss=0.04336, over 971362.89 frames.], batch size: 20, lr: 4.45e-04 2022-05-04 23:36:49,456 INFO [train.py:715] (0/8) Epoch 4, batch 21450, loss[loss=0.1754, simple_loss=0.2563, pruned_loss=0.04727, over 4790.00 frames.], tot_loss[loss=0.1553, simple_loss=0.2247, pruned_loss=0.04296, over 971793.11 frames.], batch size: 21, lr: 4.45e-04 2022-05-04 23:37:28,648 INFO [train.py:715] (0/8) Epoch 4, batch 21500, loss[loss=0.143, simple_loss=0.2126, pruned_loss=0.03671, over 4785.00 frames.], tot_loss[loss=0.1552, simple_loss=0.2247, pruned_loss=0.0429, over 972164.22 frames.], batch size: 14, lr: 4.45e-04 2022-05-04 23:38:08,475 INFO [train.py:715] (0/8) Epoch 4, batch 21550, loss[loss=0.2045, simple_loss=0.2636, pruned_loss=0.07268, over 4812.00 frames.], tot_loss[loss=0.1565, simple_loss=0.2255, pruned_loss=0.04376, over 972047.57 frames.], batch size: 25, lr: 4.45e-04 2022-05-04 23:38:48,843 INFO [train.py:715] (0/8) Epoch 4, batch 21600, loss[loss=0.1735, simple_loss=0.2602, pruned_loss=0.04338, over 4806.00 frames.], tot_loss[loss=0.1565, simple_loss=0.2261, pruned_loss=0.04341, over 972403.93 frames.], batch size: 25, lr: 4.45e-04 2022-05-04 23:39:28,099 INFO [train.py:715] (0/8) Epoch 4, batch 21650, loss[loss=0.1476, simple_loss=0.2158, pruned_loss=0.03974, over 4840.00 frames.], tot_loss[loss=0.1568, simple_loss=0.2268, pruned_loss=0.04345, over 971810.68 frames.], batch size: 12, lr: 4.45e-04 2022-05-04 23:40:08,351 INFO 
[train.py:715] (0/8) Epoch 4, batch 21700, loss[loss=0.1381, simple_loss=0.2082, pruned_loss=0.03399, over 4780.00 frames.], tot_loss[loss=0.1566, simple_loss=0.2265, pruned_loss=0.04331, over 971993.13 frames.], batch size: 18, lr: 4.45e-04 2022-05-04 23:40:49,397 INFO [train.py:715] (0/8) Epoch 4, batch 21750, loss[loss=0.1781, simple_loss=0.2428, pruned_loss=0.05674, over 4751.00 frames.], tot_loss[loss=0.1562, simple_loss=0.2259, pruned_loss=0.04323, over 971501.33 frames.], batch size: 19, lr: 4.44e-04 2022-05-04 23:41:29,016 INFO [train.py:715] (0/8) Epoch 4, batch 21800, loss[loss=0.1303, simple_loss=0.207, pruned_loss=0.02682, over 4783.00 frames.], tot_loss[loss=0.1569, simple_loss=0.2263, pruned_loss=0.04369, over 972070.12 frames.], batch size: 18, lr: 4.44e-04 2022-05-04 23:42:08,601 INFO [train.py:715] (0/8) Epoch 4, batch 21850, loss[loss=0.1559, simple_loss=0.2168, pruned_loss=0.04748, over 4942.00 frames.], tot_loss[loss=0.1571, simple_loss=0.2267, pruned_loss=0.04379, over 972742.73 frames.], batch size: 29, lr: 4.44e-04 2022-05-04 23:42:48,653 INFO [train.py:715] (0/8) Epoch 4, batch 21900, loss[loss=0.2207, simple_loss=0.2879, pruned_loss=0.07675, over 4831.00 frames.], tot_loss[loss=0.1568, simple_loss=0.2263, pruned_loss=0.04363, over 972601.32 frames.], batch size: 15, lr: 4.44e-04 2022-05-04 23:43:29,096 INFO [train.py:715] (0/8) Epoch 4, batch 21950, loss[loss=0.1569, simple_loss=0.2301, pruned_loss=0.04181, over 4903.00 frames.], tot_loss[loss=0.1561, simple_loss=0.2254, pruned_loss=0.04337, over 972815.31 frames.], batch size: 19, lr: 4.44e-04 2022-05-04 23:44:08,307 INFO [train.py:715] (0/8) Epoch 4, batch 22000, loss[loss=0.1985, simple_loss=0.248, pruned_loss=0.07457, over 4739.00 frames.], tot_loss[loss=0.156, simple_loss=0.2256, pruned_loss=0.04321, over 972742.35 frames.], batch size: 16, lr: 4.44e-04 2022-05-04 23:44:48,078 INFO [train.py:715] (0/8) Epoch 4, batch 22050, loss[loss=0.182, simple_loss=0.2431, pruned_loss=0.06048, over 4921.00 frames.], tot_loss[loss=0.1571, simple_loss=0.2264, pruned_loss=0.0439, over 973270.95 frames.], batch size: 17, lr: 4.44e-04 2022-05-04 23:45:28,539 INFO [train.py:715] (0/8) Epoch 4, batch 22100, loss[loss=0.1359, simple_loss=0.2123, pruned_loss=0.02976, over 4874.00 frames.], tot_loss[loss=0.1565, simple_loss=0.2262, pruned_loss=0.04338, over 972915.41 frames.], batch size: 16, lr: 4.44e-04 2022-05-04 23:46:08,390 INFO [train.py:715] (0/8) Epoch 4, batch 22150, loss[loss=0.1541, simple_loss=0.2112, pruned_loss=0.04851, over 4777.00 frames.], tot_loss[loss=0.1547, simple_loss=0.2245, pruned_loss=0.0424, over 972029.56 frames.], batch size: 14, lr: 4.44e-04 2022-05-04 23:46:47,297 INFO [train.py:715] (0/8) Epoch 4, batch 22200, loss[loss=0.1603, simple_loss=0.2258, pruned_loss=0.04745, over 4864.00 frames.], tot_loss[loss=0.155, simple_loss=0.2251, pruned_loss=0.04246, over 972727.35 frames.], batch size: 20, lr: 4.44e-04 2022-05-04 23:47:27,364 INFO [train.py:715] (0/8) Epoch 4, batch 22250, loss[loss=0.1765, simple_loss=0.2417, pruned_loss=0.05563, over 4972.00 frames.], tot_loss[loss=0.1549, simple_loss=0.2244, pruned_loss=0.04269, over 972179.70 frames.], batch size: 39, lr: 4.44e-04 2022-05-04 23:48:07,755 INFO [train.py:715] (0/8) Epoch 4, batch 22300, loss[loss=0.2137, simple_loss=0.2798, pruned_loss=0.0738, over 4734.00 frames.], tot_loss[loss=0.1554, simple_loss=0.2252, pruned_loss=0.04281, over 972430.39 frames.], batch size: 16, lr: 4.44e-04 2022-05-04 23:48:46,512 INFO [train.py:715] (0/8) Epoch 4, batch 
22350, loss[loss=0.1324, simple_loss=0.2065, pruned_loss=0.02918, over 4851.00 frames.], tot_loss[loss=0.1547, simple_loss=0.2243, pruned_loss=0.04258, over 972046.24 frames.], batch size: 13, lr: 4.44e-04 2022-05-04 23:49:25,542 INFO [train.py:715] (0/8) Epoch 4, batch 22400, loss[loss=0.2029, simple_loss=0.2617, pruned_loss=0.07204, over 4871.00 frames.], tot_loss[loss=0.1549, simple_loss=0.2246, pruned_loss=0.04257, over 972023.05 frames.], batch size: 16, lr: 4.44e-04 2022-05-04 23:50:06,126 INFO [train.py:715] (0/8) Epoch 4, batch 22450, loss[loss=0.1567, simple_loss=0.2162, pruned_loss=0.04856, over 4827.00 frames.], tot_loss[loss=0.1561, simple_loss=0.226, pruned_loss=0.04316, over 972323.65 frames.], batch size: 13, lr: 4.44e-04 2022-05-04 23:50:45,319 INFO [train.py:715] (0/8) Epoch 4, batch 22500, loss[loss=0.1447, simple_loss=0.2202, pruned_loss=0.03462, over 4921.00 frames.], tot_loss[loss=0.1575, simple_loss=0.2271, pruned_loss=0.04392, over 972440.54 frames.], batch size: 29, lr: 4.43e-04 2022-05-04 23:51:24,257 INFO [train.py:715] (0/8) Epoch 4, batch 22550, loss[loss=0.1476, simple_loss=0.2231, pruned_loss=0.03606, over 4898.00 frames.], tot_loss[loss=0.1574, simple_loss=0.2269, pruned_loss=0.04399, over 973002.45 frames.], batch size: 19, lr: 4.43e-04 2022-05-04 23:52:04,193 INFO [train.py:715] (0/8) Epoch 4, batch 22600, loss[loss=0.1685, simple_loss=0.2536, pruned_loss=0.04168, over 4983.00 frames.], tot_loss[loss=0.1584, simple_loss=0.228, pruned_loss=0.04437, over 972235.81 frames.], batch size: 24, lr: 4.43e-04 2022-05-04 23:52:44,022 INFO [train.py:715] (0/8) Epoch 4, batch 22650, loss[loss=0.2103, simple_loss=0.2791, pruned_loss=0.07073, over 4918.00 frames.], tot_loss[loss=0.1584, simple_loss=0.2278, pruned_loss=0.04455, over 972389.62 frames.], batch size: 18, lr: 4.43e-04 2022-05-04 23:53:22,950 INFO [train.py:715] (0/8) Epoch 4, batch 22700, loss[loss=0.1749, simple_loss=0.2355, pruned_loss=0.05711, over 4831.00 frames.], tot_loss[loss=0.1584, simple_loss=0.2276, pruned_loss=0.04456, over 971671.66 frames.], batch size: 25, lr: 4.43e-04 2022-05-04 23:54:02,343 INFO [train.py:715] (0/8) Epoch 4, batch 22750, loss[loss=0.1361, simple_loss=0.2138, pruned_loss=0.02919, over 4940.00 frames.], tot_loss[loss=0.158, simple_loss=0.2276, pruned_loss=0.04419, over 972415.45 frames.], batch size: 29, lr: 4.43e-04 2022-05-04 23:54:42,049 INFO [train.py:715] (0/8) Epoch 4, batch 22800, loss[loss=0.1638, simple_loss=0.2411, pruned_loss=0.04327, over 4979.00 frames.], tot_loss[loss=0.1572, simple_loss=0.2269, pruned_loss=0.04376, over 972440.91 frames.], batch size: 39, lr: 4.43e-04 2022-05-04 23:55:21,170 INFO [train.py:715] (0/8) Epoch 4, batch 22850, loss[loss=0.1614, simple_loss=0.2363, pruned_loss=0.0433, over 4789.00 frames.], tot_loss[loss=0.1572, simple_loss=0.2271, pruned_loss=0.04363, over 972204.62 frames.], batch size: 24, lr: 4.43e-04 2022-05-04 23:55:59,903 INFO [train.py:715] (0/8) Epoch 4, batch 22900, loss[loss=0.135, simple_loss=0.2089, pruned_loss=0.03055, over 4805.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2277, pruned_loss=0.04404, over 971620.49 frames.], batch size: 13, lr: 4.43e-04 2022-05-04 23:56:39,568 INFO [train.py:715] (0/8) Epoch 4, batch 22950, loss[loss=0.2194, simple_loss=0.2731, pruned_loss=0.08284, over 4925.00 frames.], tot_loss[loss=0.1578, simple_loss=0.2273, pruned_loss=0.0441, over 971058.20 frames.], batch size: 29, lr: 4.43e-04 2022-05-04 23:57:19,679 INFO [train.py:715] (0/8) Epoch 4, batch 23000, loss[loss=0.1409, 
simple_loss=0.2099, pruned_loss=0.03598, over 4793.00 frames.], tot_loss[loss=0.1566, simple_loss=0.2261, pruned_loss=0.04358, over 971127.18 frames.], batch size: 18, lr: 4.43e-04 2022-05-04 23:57:58,018 INFO [train.py:715] (0/8) Epoch 4, batch 23050, loss[loss=0.175, simple_loss=0.2241, pruned_loss=0.06299, over 4880.00 frames.], tot_loss[loss=0.1557, simple_loss=0.2256, pruned_loss=0.04291, over 971851.42 frames.], batch size: 32, lr: 4.43e-04 2022-05-04 23:58:37,634 INFO [train.py:715] (0/8) Epoch 4, batch 23100, loss[loss=0.1471, simple_loss=0.2248, pruned_loss=0.03473, over 4964.00 frames.], tot_loss[loss=0.1557, simple_loss=0.2258, pruned_loss=0.04277, over 971709.64 frames.], batch size: 28, lr: 4.43e-04 2022-05-04 23:59:18,007 INFO [train.py:715] (0/8) Epoch 4, batch 23150, loss[loss=0.1615, simple_loss=0.2306, pruned_loss=0.04617, over 4913.00 frames.], tot_loss[loss=0.1558, simple_loss=0.2258, pruned_loss=0.04295, over 971763.30 frames.], batch size: 19, lr: 4.43e-04 2022-05-04 23:59:57,818 INFO [train.py:715] (0/8) Epoch 4, batch 23200, loss[loss=0.1509, simple_loss=0.2154, pruned_loss=0.04317, over 4889.00 frames.], tot_loss[loss=0.1567, simple_loss=0.2266, pruned_loss=0.04336, over 972094.10 frames.], batch size: 19, lr: 4.42e-04 2022-05-05 00:00:36,527 INFO [train.py:715] (0/8) Epoch 4, batch 23250, loss[loss=0.1404, simple_loss=0.2143, pruned_loss=0.03326, over 4796.00 frames.], tot_loss[loss=0.1569, simple_loss=0.2265, pruned_loss=0.04366, over 971956.65 frames.], batch size: 24, lr: 4.42e-04 2022-05-05 00:01:16,400 INFO [train.py:715] (0/8) Epoch 4, batch 23300, loss[loss=0.1593, simple_loss=0.2252, pruned_loss=0.04667, over 4759.00 frames.], tot_loss[loss=0.1573, simple_loss=0.2264, pruned_loss=0.04411, over 971470.08 frames.], batch size: 19, lr: 4.42e-04 2022-05-05 00:01:56,691 INFO [train.py:715] (0/8) Epoch 4, batch 23350, loss[loss=0.1906, simple_loss=0.2343, pruned_loss=0.07344, over 4834.00 frames.], tot_loss[loss=0.1577, simple_loss=0.2267, pruned_loss=0.04436, over 972139.68 frames.], batch size: 13, lr: 4.42e-04 2022-05-05 00:02:35,080 INFO [train.py:715] (0/8) Epoch 4, batch 23400, loss[loss=0.1241, simple_loss=0.2048, pruned_loss=0.02168, over 4817.00 frames.], tot_loss[loss=0.1575, simple_loss=0.2266, pruned_loss=0.04419, over 972535.45 frames.], batch size: 25, lr: 4.42e-04 2022-05-05 00:03:14,414 INFO [train.py:715] (0/8) Epoch 4, batch 23450, loss[loss=0.1761, simple_loss=0.2425, pruned_loss=0.05484, over 4951.00 frames.], tot_loss[loss=0.1574, simple_loss=0.2264, pruned_loss=0.0442, over 972192.56 frames.], batch size: 23, lr: 4.42e-04 2022-05-05 00:03:54,998 INFO [train.py:715] (0/8) Epoch 4, batch 23500, loss[loss=0.184, simple_loss=0.2621, pruned_loss=0.05298, over 4843.00 frames.], tot_loss[loss=0.1565, simple_loss=0.2257, pruned_loss=0.04359, over 971929.42 frames.], batch size: 34, lr: 4.42e-04 2022-05-05 00:04:33,370 INFO [train.py:715] (0/8) Epoch 4, batch 23550, loss[loss=0.1427, simple_loss=0.218, pruned_loss=0.03375, over 4792.00 frames.], tot_loss[loss=0.1556, simple_loss=0.2251, pruned_loss=0.04308, over 971933.35 frames.], batch size: 21, lr: 4.42e-04 2022-05-05 00:05:12,658 INFO [train.py:715] (0/8) Epoch 4, batch 23600, loss[loss=0.1601, simple_loss=0.222, pruned_loss=0.04909, over 4872.00 frames.], tot_loss[loss=0.1553, simple_loss=0.2247, pruned_loss=0.04294, over 972573.74 frames.], batch size: 20, lr: 4.42e-04 2022-05-05 00:05:53,467 INFO [train.py:715] (0/8) Epoch 4, batch 23650, loss[loss=0.1881, simple_loss=0.2542, 
pruned_loss=0.06098, over 4956.00 frames.], tot_loss[loss=0.1565, simple_loss=0.226, pruned_loss=0.04354, over 973933.22 frames.], batch size: 15, lr: 4.42e-04 2022-05-05 00:06:34,860 INFO [train.py:715] (0/8) Epoch 4, batch 23700, loss[loss=0.1574, simple_loss=0.22, pruned_loss=0.04738, over 4878.00 frames.], tot_loss[loss=0.1572, simple_loss=0.2268, pruned_loss=0.04385, over 973679.41 frames.], batch size: 32, lr: 4.42e-04 2022-05-05 00:07:14,370 INFO [train.py:715] (0/8) Epoch 4, batch 23750, loss[loss=0.1708, simple_loss=0.2426, pruned_loss=0.04949, over 4796.00 frames.], tot_loss[loss=0.1576, simple_loss=0.2271, pruned_loss=0.04408, over 973563.40 frames.], batch size: 13, lr: 4.42e-04 2022-05-05 00:07:53,774 INFO [train.py:715] (0/8) Epoch 4, batch 23800, loss[loss=0.1961, simple_loss=0.269, pruned_loss=0.06158, over 4919.00 frames.], tot_loss[loss=0.1577, simple_loss=0.2268, pruned_loss=0.04427, over 973642.86 frames.], batch size: 18, lr: 4.42e-04 2022-05-05 00:08:34,377 INFO [train.py:715] (0/8) Epoch 4, batch 23850, loss[loss=0.1661, simple_loss=0.2289, pruned_loss=0.05161, over 4924.00 frames.], tot_loss[loss=0.1576, simple_loss=0.2268, pruned_loss=0.04422, over 973692.14 frames.], batch size: 39, lr: 4.42e-04 2022-05-05 00:09:13,907 INFO [train.py:715] (0/8) Epoch 4, batch 23900, loss[loss=0.1503, simple_loss=0.2225, pruned_loss=0.03905, over 4920.00 frames.], tot_loss[loss=0.158, simple_loss=0.2272, pruned_loss=0.04441, over 973054.75 frames.], batch size: 17, lr: 4.42e-04 2022-05-05 00:09:53,720 INFO [train.py:715] (0/8) Epoch 4, batch 23950, loss[loss=0.1514, simple_loss=0.2337, pruned_loss=0.03456, over 4817.00 frames.], tot_loss[loss=0.1569, simple_loss=0.226, pruned_loss=0.04393, over 972777.00 frames.], batch size: 25, lr: 4.41e-04 2022-05-05 00:10:34,504 INFO [train.py:715] (0/8) Epoch 4, batch 24000, loss[loss=0.1428, simple_loss=0.2119, pruned_loss=0.03689, over 4778.00 frames.], tot_loss[loss=0.156, simple_loss=0.2258, pruned_loss=0.04313, over 972715.57 frames.], batch size: 17, lr: 4.41e-04 2022-05-05 00:10:34,505 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 00:10:44,333 INFO [train.py:742] (0/8) Epoch 4, validation: loss=0.1115, simple_loss=0.1974, pruned_loss=0.01276, over 914524.00 frames. 
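The validation records (train.py:742) recur every 3000 batches in this section, always over the same 914524-frame dev set, and are the easiest signal to track across a log this long. A minimal parsing sketch, assuming a copy of this log is saved as a plain text file at the hypothetical path spgispeech_train.log; the regular expression simply mirrors the validation line format shown above.

```python
import re

# Hypothetical path to a saved copy of this training log.
LOG_PATH = "spgispeech_train.log"

# Mirrors the records emitted at train.py:742, e.g.
#   "Epoch 4, validation: loss=0.1115, simple_loss=0.1974, pruned_loss=0.01276, over 914524.00 frames."
VAL_RE = re.compile(
    r"Epoch (\d+), validation: loss=([\d.]+), simple_loss=([\d.]+), pruned_loss=([\d.]+)"
)

with open(LOG_PATH) as f:
    for m in VAL_RE.finditer(f.read()):
        epoch, loss, simple_loss, pruned_loss = m.groups()
        print(f"epoch {epoch}: loss={loss} simple={simple_loss} pruned={pruned_loss}")
```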
2022-05-05 00:11:25,473 INFO [train.py:715] (0/8) Epoch 4, batch 24050, loss[loss=0.1507, simple_loss=0.2174, pruned_loss=0.04204, over 4940.00 frames.], tot_loss[loss=0.1559, simple_loss=0.2257, pruned_loss=0.04304, over 971945.12 frames.], batch size: 21, lr: 4.41e-04 2022-05-05 00:12:06,057 INFO [train.py:715] (0/8) Epoch 4, batch 24100, loss[loss=0.1486, simple_loss=0.2192, pruned_loss=0.039, over 4803.00 frames.], tot_loss[loss=0.1556, simple_loss=0.2253, pruned_loss=0.04298, over 972721.22 frames.], batch size: 21, lr: 4.41e-04 2022-05-05 00:12:45,925 INFO [train.py:715] (0/8) Epoch 4, batch 24150, loss[loss=0.1285, simple_loss=0.1986, pruned_loss=0.02917, over 4978.00 frames.], tot_loss[loss=0.1556, simple_loss=0.225, pruned_loss=0.04315, over 972383.01 frames.], batch size: 16, lr: 4.41e-04 2022-05-05 00:13:25,909 INFO [train.py:715] (0/8) Epoch 4, batch 24200, loss[loss=0.1268, simple_loss=0.2098, pruned_loss=0.02192, over 4983.00 frames.], tot_loss[loss=0.1541, simple_loss=0.2239, pruned_loss=0.04221, over 972018.28 frames.], batch size: 28, lr: 4.41e-04 2022-05-05 00:14:07,339 INFO [train.py:715] (0/8) Epoch 4, batch 24250, loss[loss=0.1714, simple_loss=0.234, pruned_loss=0.05433, over 4698.00 frames.], tot_loss[loss=0.1544, simple_loss=0.2243, pruned_loss=0.04228, over 971924.84 frames.], batch size: 15, lr: 4.41e-04 2022-05-05 00:14:46,253 INFO [train.py:715] (0/8) Epoch 4, batch 24300, loss[loss=0.1419, simple_loss=0.2029, pruned_loss=0.0405, over 4986.00 frames.], tot_loss[loss=0.1564, simple_loss=0.2261, pruned_loss=0.04333, over 972539.31 frames.], batch size: 14, lr: 4.41e-04 2022-05-05 00:15:26,722 INFO [train.py:715] (0/8) Epoch 4, batch 24350, loss[loss=0.1705, simple_loss=0.2438, pruned_loss=0.04856, over 4913.00 frames.], tot_loss[loss=0.1564, simple_loss=0.2259, pruned_loss=0.04338, over 972006.55 frames.], batch size: 17, lr: 4.41e-04 2022-05-05 00:16:07,659 INFO [train.py:715] (0/8) Epoch 4, batch 24400, loss[loss=0.1343, simple_loss=0.2077, pruned_loss=0.03047, over 4871.00 frames.], tot_loss[loss=0.1564, simple_loss=0.2258, pruned_loss=0.04348, over 972206.39 frames.], batch size: 16, lr: 4.41e-04 2022-05-05 00:16:47,247 INFO [train.py:715] (0/8) Epoch 4, batch 24450, loss[loss=0.1714, simple_loss=0.2517, pruned_loss=0.04554, over 4815.00 frames.], tot_loss[loss=0.1558, simple_loss=0.2254, pruned_loss=0.04309, over 971677.78 frames.], batch size: 27, lr: 4.41e-04 2022-05-05 00:17:27,021 INFO [train.py:715] (0/8) Epoch 4, batch 24500, loss[loss=0.1613, simple_loss=0.2324, pruned_loss=0.04514, over 4804.00 frames.], tot_loss[loss=0.1565, simple_loss=0.2263, pruned_loss=0.04338, over 972236.87 frames.], batch size: 24, lr: 4.41e-04 2022-05-05 00:18:06,872 INFO [train.py:715] (0/8) Epoch 4, batch 24550, loss[loss=0.1658, simple_loss=0.2363, pruned_loss=0.04759, over 4788.00 frames.], tot_loss[loss=0.1559, simple_loss=0.2259, pruned_loss=0.04293, over 972538.96 frames.], batch size: 18, lr: 4.41e-04 2022-05-05 00:18:48,104 INFO [train.py:715] (0/8) Epoch 4, batch 24600, loss[loss=0.1695, simple_loss=0.2246, pruned_loss=0.05722, over 4782.00 frames.], tot_loss[loss=0.1565, simple_loss=0.2263, pruned_loss=0.04337, over 971912.46 frames.], batch size: 17, lr: 4.41e-04 2022-05-05 00:19:27,464 INFO [train.py:715] (0/8) Epoch 4, batch 24650, loss[loss=0.1395, simple_loss=0.2184, pruned_loss=0.03034, over 4777.00 frames.], tot_loss[loss=0.1565, simple_loss=0.2264, pruned_loss=0.04327, over 971008.28 frames.], batch size: 12, lr: 4.41e-04 2022-05-05 00:20:08,210 INFO 
[train.py:715] (0/8) Epoch 4, batch 24700, loss[loss=0.1276, simple_loss=0.1999, pruned_loss=0.02765, over 4959.00 frames.], tot_loss[loss=0.1568, simple_loss=0.2268, pruned_loss=0.04336, over 971669.69 frames.], batch size: 14, lr: 4.40e-04 2022-05-05 00:20:49,275 INFO [train.py:715] (0/8) Epoch 4, batch 24750, loss[loss=0.1585, simple_loss=0.2288, pruned_loss=0.04408, over 4740.00 frames.], tot_loss[loss=0.1571, simple_loss=0.2271, pruned_loss=0.04349, over 971237.76 frames.], batch size: 12, lr: 4.40e-04 2022-05-05 00:21:28,788 INFO [train.py:715] (0/8) Epoch 4, batch 24800, loss[loss=0.1282, simple_loss=0.1975, pruned_loss=0.0294, over 4785.00 frames.], tot_loss[loss=0.1563, simple_loss=0.2261, pruned_loss=0.04325, over 971289.83 frames.], batch size: 17, lr: 4.40e-04 2022-05-05 00:22:08,797 INFO [train.py:715] (0/8) Epoch 4, batch 24850, loss[loss=0.1454, simple_loss=0.224, pruned_loss=0.03345, over 4959.00 frames.], tot_loss[loss=0.1562, simple_loss=0.2261, pruned_loss=0.04316, over 971758.02 frames.], batch size: 24, lr: 4.40e-04 2022-05-05 00:22:49,036 INFO [train.py:715] (0/8) Epoch 4, batch 24900, loss[loss=0.1479, simple_loss=0.2269, pruned_loss=0.03443, over 4921.00 frames.], tot_loss[loss=0.1561, simple_loss=0.2264, pruned_loss=0.04288, over 972601.66 frames.], batch size: 29, lr: 4.40e-04 2022-05-05 00:23:30,186 INFO [train.py:715] (0/8) Epoch 4, batch 24950, loss[loss=0.1486, simple_loss=0.2232, pruned_loss=0.03699, over 4980.00 frames.], tot_loss[loss=0.156, simple_loss=0.2265, pruned_loss=0.04273, over 973218.44 frames.], batch size: 15, lr: 4.40e-04 2022-05-05 00:24:09,089 INFO [train.py:715] (0/8) Epoch 4, batch 25000, loss[loss=0.1731, simple_loss=0.238, pruned_loss=0.05413, over 4779.00 frames.], tot_loss[loss=0.1561, simple_loss=0.2264, pruned_loss=0.04286, over 972747.74 frames.], batch size: 17, lr: 4.40e-04 2022-05-05 00:24:49,331 INFO [train.py:715] (0/8) Epoch 4, batch 25050, loss[loss=0.1491, simple_loss=0.2252, pruned_loss=0.03648, over 4871.00 frames.], tot_loss[loss=0.1565, simple_loss=0.2268, pruned_loss=0.04312, over 972807.65 frames.], batch size: 22, lr: 4.40e-04 2022-05-05 00:25:30,452 INFO [train.py:715] (0/8) Epoch 4, batch 25100, loss[loss=0.1263, simple_loss=0.2044, pruned_loss=0.02412, over 4909.00 frames.], tot_loss[loss=0.1546, simple_loss=0.2252, pruned_loss=0.042, over 972863.77 frames.], batch size: 17, lr: 4.40e-04 2022-05-05 00:26:10,371 INFO [train.py:715] (0/8) Epoch 4, batch 25150, loss[loss=0.1766, simple_loss=0.2373, pruned_loss=0.05798, over 4836.00 frames.], tot_loss[loss=0.1559, simple_loss=0.2264, pruned_loss=0.04266, over 972850.56 frames.], batch size: 15, lr: 4.40e-04 2022-05-05 00:26:49,791 INFO [train.py:715] (0/8) Epoch 4, batch 25200, loss[loss=0.2054, simple_loss=0.2712, pruned_loss=0.06982, over 4955.00 frames.], tot_loss[loss=0.1563, simple_loss=0.2268, pruned_loss=0.0429, over 973858.43 frames.], batch size: 24, lr: 4.40e-04 2022-05-05 00:27:30,064 INFO [train.py:715] (0/8) Epoch 4, batch 25250, loss[loss=0.132, simple_loss=0.1999, pruned_loss=0.03209, over 4841.00 frames.], tot_loss[loss=0.1564, simple_loss=0.2266, pruned_loss=0.04309, over 973368.15 frames.], batch size: 12, lr: 4.40e-04 2022-05-05 00:28:10,079 INFO [train.py:715] (0/8) Epoch 4, batch 25300, loss[loss=0.14, simple_loss=0.2123, pruned_loss=0.03385, over 4818.00 frames.], tot_loss[loss=0.1564, simple_loss=0.2264, pruned_loss=0.04324, over 973368.87 frames.], batch size: 26, lr: 4.40e-04 2022-05-05 00:28:47,881 INFO [train.py:715] (0/8) Epoch 4, batch 
25350, loss[loss=0.1213, simple_loss=0.1941, pruned_loss=0.02424, over 4948.00 frames.], tot_loss[loss=0.1563, simple_loss=0.2266, pruned_loss=0.04305, over 974425.88 frames.], batch size: 29, lr: 4.40e-04 2022-05-05 00:29:26,728 INFO [train.py:715] (0/8) Epoch 4, batch 25400, loss[loss=0.1624, simple_loss=0.2412, pruned_loss=0.04179, over 4752.00 frames.], tot_loss[loss=0.1571, simple_loss=0.2269, pruned_loss=0.0437, over 974078.68 frames.], batch size: 19, lr: 4.40e-04 2022-05-05 00:30:06,393 INFO [train.py:715] (0/8) Epoch 4, batch 25450, loss[loss=0.1451, simple_loss=0.2263, pruned_loss=0.03193, over 4746.00 frames.], tot_loss[loss=0.1572, simple_loss=0.2272, pruned_loss=0.04358, over 974311.42 frames.], batch size: 16, lr: 4.39e-04 2022-05-05 00:30:45,451 INFO [train.py:715] (0/8) Epoch 4, batch 25500, loss[loss=0.1545, simple_loss=0.2108, pruned_loss=0.04914, over 4790.00 frames.], tot_loss[loss=0.157, simple_loss=0.2271, pruned_loss=0.04348, over 974231.00 frames.], batch size: 13, lr: 4.39e-04 2022-05-05 00:31:25,314 INFO [train.py:715] (0/8) Epoch 4, batch 25550, loss[loss=0.1442, simple_loss=0.2101, pruned_loss=0.03913, over 4865.00 frames.], tot_loss[loss=0.1577, simple_loss=0.2275, pruned_loss=0.0439, over 973437.62 frames.], batch size: 32, lr: 4.39e-04 2022-05-05 00:32:05,295 INFO [train.py:715] (0/8) Epoch 4, batch 25600, loss[loss=0.1504, simple_loss=0.2192, pruned_loss=0.04083, over 4641.00 frames.], tot_loss[loss=0.1574, simple_loss=0.2273, pruned_loss=0.04378, over 972271.53 frames.], batch size: 13, lr: 4.39e-04 2022-05-05 00:32:45,567 INFO [train.py:715] (0/8) Epoch 4, batch 25650, loss[loss=0.1817, simple_loss=0.253, pruned_loss=0.05523, over 4989.00 frames.], tot_loss[loss=0.1567, simple_loss=0.2268, pruned_loss=0.04334, over 972121.72 frames.], batch size: 16, lr: 4.39e-04 2022-05-05 00:33:24,670 INFO [train.py:715] (0/8) Epoch 4, batch 25700, loss[loss=0.1774, simple_loss=0.2389, pruned_loss=0.05793, over 4852.00 frames.], tot_loss[loss=0.156, simple_loss=0.2262, pruned_loss=0.04288, over 972021.74 frames.], batch size: 20, lr: 4.39e-04 2022-05-05 00:34:04,657 INFO [train.py:715] (0/8) Epoch 4, batch 25750, loss[loss=0.1641, simple_loss=0.2425, pruned_loss=0.04284, over 4865.00 frames.], tot_loss[loss=0.1562, simple_loss=0.2262, pruned_loss=0.04311, over 972073.30 frames.], batch size: 20, lr: 4.39e-04 2022-05-05 00:34:45,098 INFO [train.py:715] (0/8) Epoch 4, batch 25800, loss[loss=0.1412, simple_loss=0.2166, pruned_loss=0.03293, over 4985.00 frames.], tot_loss[loss=0.157, simple_loss=0.2269, pruned_loss=0.04352, over 972019.43 frames.], batch size: 25, lr: 4.39e-04 2022-05-05 00:35:24,461 INFO [train.py:715] (0/8) Epoch 4, batch 25850, loss[loss=0.1429, simple_loss=0.2124, pruned_loss=0.03667, over 4824.00 frames.], tot_loss[loss=0.1563, simple_loss=0.2266, pruned_loss=0.04299, over 972235.55 frames.], batch size: 21, lr: 4.39e-04 2022-05-05 00:36:03,600 INFO [train.py:715] (0/8) Epoch 4, batch 25900, loss[loss=0.138, simple_loss=0.2103, pruned_loss=0.03288, over 4764.00 frames.], tot_loss[loss=0.1567, simple_loss=0.2267, pruned_loss=0.04334, over 972347.65 frames.], batch size: 18, lr: 4.39e-04 2022-05-05 00:36:43,851 INFO [train.py:715] (0/8) Epoch 4, batch 25950, loss[loss=0.1568, simple_loss=0.2336, pruned_loss=0.03998, over 4863.00 frames.], tot_loss[loss=0.157, simple_loss=0.2271, pruned_loss=0.04348, over 972685.31 frames.], batch size: 30, lr: 4.39e-04 2022-05-05 00:37:24,112 INFO [train.py:715] (0/8) Epoch 4, batch 26000, loss[loss=0.1832, 
simple_loss=0.2469, pruned_loss=0.05974, over 4854.00 frames.], tot_loss[loss=0.1574, simple_loss=0.2276, pruned_loss=0.04366, over 971638.56 frames.], batch size: 30, lr: 4.39e-04 2022-05-05 00:38:02,819 INFO [train.py:715] (0/8) Epoch 4, batch 26050, loss[loss=0.1561, simple_loss=0.2286, pruned_loss=0.04177, over 4772.00 frames.], tot_loss[loss=0.1562, simple_loss=0.2265, pruned_loss=0.04289, over 970718.14 frames.], batch size: 12, lr: 4.39e-04 2022-05-05 00:38:42,225 INFO [train.py:715] (0/8) Epoch 4, batch 26100, loss[loss=0.1341, simple_loss=0.212, pruned_loss=0.02814, over 4788.00 frames.], tot_loss[loss=0.1568, simple_loss=0.227, pruned_loss=0.04328, over 970745.36 frames.], batch size: 14, lr: 4.39e-04 2022-05-05 00:39:22,685 INFO [train.py:715] (0/8) Epoch 4, batch 26150, loss[loss=0.1645, simple_loss=0.2325, pruned_loss=0.04827, over 4844.00 frames.], tot_loss[loss=0.1567, simple_loss=0.2269, pruned_loss=0.04325, over 971474.65 frames.], batch size: 20, lr: 4.39e-04 2022-05-05 00:40:01,758 INFO [train.py:715] (0/8) Epoch 4, batch 26200, loss[loss=0.1211, simple_loss=0.1934, pruned_loss=0.02442, over 4646.00 frames.], tot_loss[loss=0.1569, simple_loss=0.2264, pruned_loss=0.04373, over 971194.89 frames.], batch size: 13, lr: 4.38e-04 2022-05-05 00:40:41,529 INFO [train.py:715] (0/8) Epoch 4, batch 26250, loss[loss=0.1707, simple_loss=0.2404, pruned_loss=0.05056, over 4812.00 frames.], tot_loss[loss=0.156, simple_loss=0.2258, pruned_loss=0.04312, over 971372.92 frames.], batch size: 25, lr: 4.38e-04 2022-05-05 00:41:21,392 INFO [train.py:715] (0/8) Epoch 4, batch 26300, loss[loss=0.1445, simple_loss=0.212, pruned_loss=0.0385, over 4739.00 frames.], tot_loss[loss=0.1559, simple_loss=0.2255, pruned_loss=0.04319, over 970919.97 frames.], batch size: 16, lr: 4.38e-04 2022-05-05 00:42:01,536 INFO [train.py:715] (0/8) Epoch 4, batch 26350, loss[loss=0.1683, simple_loss=0.2369, pruned_loss=0.04987, over 4968.00 frames.], tot_loss[loss=0.1564, simple_loss=0.226, pruned_loss=0.04337, over 971680.93 frames.], batch size: 14, lr: 4.38e-04 2022-05-05 00:42:40,878 INFO [train.py:715] (0/8) Epoch 4, batch 26400, loss[loss=0.1865, simple_loss=0.2406, pruned_loss=0.06623, over 4930.00 frames.], tot_loss[loss=0.1573, simple_loss=0.2264, pruned_loss=0.04408, over 970931.66 frames.], batch size: 35, lr: 4.38e-04 2022-05-05 00:43:20,963 INFO [train.py:715] (0/8) Epoch 4, batch 26450, loss[loss=0.1436, simple_loss=0.2061, pruned_loss=0.04054, over 4749.00 frames.], tot_loss[loss=0.1561, simple_loss=0.2253, pruned_loss=0.04351, over 970578.41 frames.], batch size: 16, lr: 4.38e-04 2022-05-05 00:44:01,486 INFO [train.py:715] (0/8) Epoch 4, batch 26500, loss[loss=0.1414, simple_loss=0.2086, pruned_loss=0.0371, over 4889.00 frames.], tot_loss[loss=0.155, simple_loss=0.2243, pruned_loss=0.04286, over 970430.06 frames.], batch size: 22, lr: 4.38e-04 2022-05-05 00:44:40,388 INFO [train.py:715] (0/8) Epoch 4, batch 26550, loss[loss=0.1617, simple_loss=0.2249, pruned_loss=0.04922, over 4859.00 frames.], tot_loss[loss=0.1545, simple_loss=0.2239, pruned_loss=0.04254, over 970988.19 frames.], batch size: 20, lr: 4.38e-04 2022-05-05 00:45:20,037 INFO [train.py:715] (0/8) Epoch 4, batch 26600, loss[loss=0.1396, simple_loss=0.2132, pruned_loss=0.03299, over 4944.00 frames.], tot_loss[loss=0.1542, simple_loss=0.2238, pruned_loss=0.04231, over 972484.59 frames.], batch size: 29, lr: 4.38e-04 2022-05-05 00:46:00,424 INFO [train.py:715] (0/8) Epoch 4, batch 26650, loss[loss=0.1411, simple_loss=0.2218, 
pruned_loss=0.03018, over 4694.00 frames.], tot_loss[loss=0.1551, simple_loss=0.2245, pruned_loss=0.04287, over 971922.56 frames.], batch size: 15, lr: 4.38e-04 2022-05-05 00:46:41,227 INFO [train.py:715] (0/8) Epoch 4, batch 26700, loss[loss=0.1837, simple_loss=0.2543, pruned_loss=0.05658, over 4962.00 frames.], tot_loss[loss=0.1549, simple_loss=0.2246, pruned_loss=0.04257, over 972057.11 frames.], batch size: 15, lr: 4.38e-04 2022-05-05 00:47:20,021 INFO [train.py:715] (0/8) Epoch 4, batch 26750, loss[loss=0.1665, simple_loss=0.2436, pruned_loss=0.04473, over 4975.00 frames.], tot_loss[loss=0.1556, simple_loss=0.2251, pruned_loss=0.04307, over 972140.46 frames.], batch size: 15, lr: 4.38e-04 2022-05-05 00:47:59,596 INFO [train.py:715] (0/8) Epoch 4, batch 26800, loss[loss=0.1465, simple_loss=0.2099, pruned_loss=0.0416, over 4793.00 frames.], tot_loss[loss=0.1554, simple_loss=0.2251, pruned_loss=0.04282, over 972671.98 frames.], batch size: 14, lr: 4.38e-04 2022-05-05 00:48:39,809 INFO [train.py:715] (0/8) Epoch 4, batch 26850, loss[loss=0.1509, simple_loss=0.2211, pruned_loss=0.04037, over 4983.00 frames.], tot_loss[loss=0.1545, simple_loss=0.2243, pruned_loss=0.04232, over 972730.69 frames.], batch size: 15, lr: 4.38e-04 2022-05-05 00:49:18,748 INFO [train.py:715] (0/8) Epoch 4, batch 26900, loss[loss=0.174, simple_loss=0.2443, pruned_loss=0.05187, over 4827.00 frames.], tot_loss[loss=0.1539, simple_loss=0.2239, pruned_loss=0.04195, over 972882.86 frames.], batch size: 25, lr: 4.38e-04 2022-05-05 00:49:58,560 INFO [train.py:715] (0/8) Epoch 4, batch 26950, loss[loss=0.1661, simple_loss=0.2417, pruned_loss=0.04527, over 4971.00 frames.], tot_loss[loss=0.1552, simple_loss=0.2248, pruned_loss=0.04278, over 972541.96 frames.], batch size: 15, lr: 4.37e-04 2022-05-05 00:50:38,533 INFO [train.py:715] (0/8) Epoch 4, batch 27000, loss[loss=0.1413, simple_loss=0.2094, pruned_loss=0.03663, over 4829.00 frames.], tot_loss[loss=0.1561, simple_loss=0.2254, pruned_loss=0.04343, over 972714.38 frames.], batch size: 26, lr: 4.37e-04 2022-05-05 00:50:38,534 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 00:50:48,692 INFO [train.py:742] (0/8) Epoch 4, validation: loss=0.1114, simple_loss=0.197, pruned_loss=0.01284, over 914524.00 frames. 
2022-05-05 00:51:28,854 INFO [train.py:715] (0/8) Epoch 4, batch 27050, loss[loss=0.172, simple_loss=0.2275, pruned_loss=0.05825, over 4792.00 frames.], tot_loss[loss=0.1558, simple_loss=0.2249, pruned_loss=0.0433, over 972659.04 frames.], batch size: 12, lr: 4.37e-04 2022-05-05 00:52:08,420 INFO [train.py:715] (0/8) Epoch 4, batch 27100, loss[loss=0.1239, simple_loss=0.184, pruned_loss=0.0319, over 4981.00 frames.], tot_loss[loss=0.1553, simple_loss=0.2244, pruned_loss=0.04313, over 972553.10 frames.], batch size: 14, lr: 4.37e-04 2022-05-05 00:52:47,727 INFO [train.py:715] (0/8) Epoch 4, batch 27150, loss[loss=0.1827, simple_loss=0.2628, pruned_loss=0.05128, over 4632.00 frames.], tot_loss[loss=0.1554, simple_loss=0.2248, pruned_loss=0.04297, over 972773.77 frames.], batch size: 13, lr: 4.37e-04 2022-05-05 00:53:27,407 INFO [train.py:715] (0/8) Epoch 4, batch 27200, loss[loss=0.127, simple_loss=0.2043, pruned_loss=0.02487, over 4799.00 frames.], tot_loss[loss=0.1556, simple_loss=0.2249, pruned_loss=0.04311, over 972817.46 frames.], batch size: 17, lr: 4.37e-04 2022-05-05 00:54:07,877 INFO [train.py:715] (0/8) Epoch 4, batch 27250, loss[loss=0.1493, simple_loss=0.2199, pruned_loss=0.03935, over 4986.00 frames.], tot_loss[loss=0.1556, simple_loss=0.225, pruned_loss=0.04309, over 972316.21 frames.], batch size: 25, lr: 4.37e-04 2022-05-05 00:54:46,638 INFO [train.py:715] (0/8) Epoch 4, batch 27300, loss[loss=0.1249, simple_loss=0.2048, pruned_loss=0.02253, over 4792.00 frames.], tot_loss[loss=0.1556, simple_loss=0.2249, pruned_loss=0.0432, over 972114.20 frames.], batch size: 24, lr: 4.37e-04 2022-05-05 00:55:26,635 INFO [train.py:715] (0/8) Epoch 4, batch 27350, loss[loss=0.144, simple_loss=0.2106, pruned_loss=0.03872, over 4759.00 frames.], tot_loss[loss=0.1555, simple_loss=0.2248, pruned_loss=0.04304, over 972383.75 frames.], batch size: 19, lr: 4.37e-04 2022-05-05 00:56:06,584 INFO [train.py:715] (0/8) Epoch 4, batch 27400, loss[loss=0.1528, simple_loss=0.2364, pruned_loss=0.03463, over 4793.00 frames.], tot_loss[loss=0.1564, simple_loss=0.2263, pruned_loss=0.04329, over 972521.31 frames.], batch size: 24, lr: 4.37e-04 2022-05-05 00:56:45,007 INFO [train.py:715] (0/8) Epoch 4, batch 27450, loss[loss=0.1628, simple_loss=0.2243, pruned_loss=0.05063, over 4835.00 frames.], tot_loss[loss=0.156, simple_loss=0.2259, pruned_loss=0.04305, over 972160.53 frames.], batch size: 15, lr: 4.37e-04 2022-05-05 00:57:24,956 INFO [train.py:715] (0/8) Epoch 4, batch 27500, loss[loss=0.1451, simple_loss=0.2165, pruned_loss=0.03688, over 4884.00 frames.], tot_loss[loss=0.1559, simple_loss=0.2257, pruned_loss=0.04309, over 972812.65 frames.], batch size: 32, lr: 4.37e-04 2022-05-05 00:58:03,976 INFO [train.py:715] (0/8) Epoch 4, batch 27550, loss[loss=0.1528, simple_loss=0.2338, pruned_loss=0.03588, over 4877.00 frames.], tot_loss[loss=0.1561, simple_loss=0.2261, pruned_loss=0.04301, over 972920.91 frames.], batch size: 22, lr: 4.37e-04 2022-05-05 00:58:43,892 INFO [train.py:715] (0/8) Epoch 4, batch 27600, loss[loss=0.1526, simple_loss=0.2205, pruned_loss=0.04231, over 4770.00 frames.], tot_loss[loss=0.1558, simple_loss=0.2259, pruned_loss=0.04285, over 972415.50 frames.], batch size: 17, lr: 4.37e-04 2022-05-05 00:59:22,455 INFO [train.py:715] (0/8) Epoch 4, batch 27650, loss[loss=0.1789, simple_loss=0.2407, pruned_loss=0.05853, over 4893.00 frames.], tot_loss[loss=0.1555, simple_loss=0.2256, pruned_loss=0.04274, over 972595.08 frames.], batch size: 22, lr: 4.37e-04 2022-05-05 01:00:01,782 INFO 
[train.py:715] (0/8) Epoch 4, batch 27700, loss[loss=0.1677, simple_loss=0.2299, pruned_loss=0.05275, over 4821.00 frames.], tot_loss[loss=0.1557, simple_loss=0.2258, pruned_loss=0.04282, over 972345.21 frames.], batch size: 15, lr: 4.36e-04 2022-05-05 01:00:41,407 INFO [train.py:715] (0/8) Epoch 4, batch 27750, loss[loss=0.174, simple_loss=0.2381, pruned_loss=0.05495, over 4908.00 frames.], tot_loss[loss=0.155, simple_loss=0.2252, pruned_loss=0.04237, over 972375.97 frames.], batch size: 18, lr: 4.36e-04 2022-05-05 01:01:20,724 INFO [train.py:715] (0/8) Epoch 4, batch 27800, loss[loss=0.1583, simple_loss=0.2379, pruned_loss=0.03929, over 4859.00 frames.], tot_loss[loss=0.1552, simple_loss=0.2255, pruned_loss=0.04243, over 972208.46 frames.], batch size: 16, lr: 4.36e-04 2022-05-05 01:01:59,778 INFO [train.py:715] (0/8) Epoch 4, batch 27850, loss[loss=0.1539, simple_loss=0.2265, pruned_loss=0.04068, over 4753.00 frames.], tot_loss[loss=0.155, simple_loss=0.225, pruned_loss=0.04247, over 972671.09 frames.], batch size: 19, lr: 4.36e-04 2022-05-05 01:02:38,864 INFO [train.py:715] (0/8) Epoch 4, batch 27900, loss[loss=0.132, simple_loss=0.205, pruned_loss=0.02947, over 4969.00 frames.], tot_loss[loss=0.1543, simple_loss=0.2241, pruned_loss=0.04223, over 972854.73 frames.], batch size: 15, lr: 4.36e-04 2022-05-05 01:03:18,311 INFO [train.py:715] (0/8) Epoch 4, batch 27950, loss[loss=0.145, simple_loss=0.2218, pruned_loss=0.03403, over 4957.00 frames.], tot_loss[loss=0.1544, simple_loss=0.2244, pruned_loss=0.04216, over 972505.34 frames.], batch size: 21, lr: 4.36e-04 2022-05-05 01:03:57,877 INFO [train.py:715] (0/8) Epoch 4, batch 28000, loss[loss=0.1643, simple_loss=0.2343, pruned_loss=0.04715, over 4775.00 frames.], tot_loss[loss=0.155, simple_loss=0.2248, pruned_loss=0.04255, over 972658.98 frames.], batch size: 19, lr: 4.36e-04 2022-05-05 01:04:37,838 INFO [train.py:715] (0/8) Epoch 4, batch 28050, loss[loss=0.1801, simple_loss=0.2594, pruned_loss=0.05042, over 4870.00 frames.], tot_loss[loss=0.1559, simple_loss=0.2256, pruned_loss=0.04308, over 972839.99 frames.], batch size: 22, lr: 4.36e-04 2022-05-05 01:05:17,717 INFO [train.py:715] (0/8) Epoch 4, batch 28100, loss[loss=0.1496, simple_loss=0.2103, pruned_loss=0.04442, over 4776.00 frames.], tot_loss[loss=0.1555, simple_loss=0.2254, pruned_loss=0.04283, over 971876.27 frames.], batch size: 17, lr: 4.36e-04 2022-05-05 01:05:57,314 INFO [train.py:715] (0/8) Epoch 4, batch 28150, loss[loss=0.1495, simple_loss=0.2171, pruned_loss=0.04097, over 4918.00 frames.], tot_loss[loss=0.1549, simple_loss=0.2249, pruned_loss=0.0425, over 972445.58 frames.], batch size: 18, lr: 4.36e-04 2022-05-05 01:06:36,804 INFO [train.py:715] (0/8) Epoch 4, batch 28200, loss[loss=0.1449, simple_loss=0.209, pruned_loss=0.04039, over 4911.00 frames.], tot_loss[loss=0.1543, simple_loss=0.2244, pruned_loss=0.04204, over 972601.40 frames.], batch size: 18, lr: 4.36e-04 2022-05-05 01:07:15,871 INFO [train.py:715] (0/8) Epoch 4, batch 28250, loss[loss=0.1703, simple_loss=0.2382, pruned_loss=0.05124, over 4763.00 frames.], tot_loss[loss=0.154, simple_loss=0.2246, pruned_loss=0.04167, over 972469.03 frames.], batch size: 16, lr: 4.36e-04 2022-05-05 01:07:55,430 INFO [train.py:715] (0/8) Epoch 4, batch 28300, loss[loss=0.1384, simple_loss=0.2156, pruned_loss=0.0306, over 4812.00 frames.], tot_loss[loss=0.1541, simple_loss=0.2247, pruned_loss=0.0418, over 972030.68 frames.], batch size: 25, lr: 4.36e-04 2022-05-05 01:08:34,752 INFO [train.py:715] (0/8) Epoch 4, batch 28350, 
loss[loss=0.1339, simple_loss=0.1992, pruned_loss=0.03431, over 4802.00 frames.], tot_loss[loss=0.1542, simple_loss=0.2247, pruned_loss=0.04182, over 972797.15 frames.], batch size: 12, lr: 4.36e-04 2022-05-05 01:09:14,653 INFO [train.py:715] (0/8) Epoch 4, batch 28400, loss[loss=0.1464, simple_loss=0.201, pruned_loss=0.04588, over 4834.00 frames.], tot_loss[loss=0.1547, simple_loss=0.2246, pruned_loss=0.04236, over 971679.97 frames.], batch size: 30, lr: 4.36e-04 2022-05-05 01:09:53,871 INFO [train.py:715] (0/8) Epoch 4, batch 28450, loss[loss=0.1293, simple_loss=0.204, pruned_loss=0.02729, over 4761.00 frames.], tot_loss[loss=0.1549, simple_loss=0.2247, pruned_loss=0.04257, over 972334.88 frames.], batch size: 19, lr: 4.36e-04 2022-05-05 01:10:32,528 INFO [train.py:715] (0/8) Epoch 4, batch 28500, loss[loss=0.141, simple_loss=0.2267, pruned_loss=0.02761, over 4919.00 frames.], tot_loss[loss=0.1554, simple_loss=0.2253, pruned_loss=0.0427, over 972579.34 frames.], batch size: 18, lr: 4.35e-04 2022-05-05 01:11:12,040 INFO [train.py:715] (0/8) Epoch 4, batch 28550, loss[loss=0.1399, simple_loss=0.2169, pruned_loss=0.03149, over 4980.00 frames.], tot_loss[loss=0.1557, simple_loss=0.2257, pruned_loss=0.04288, over 972857.82 frames.], batch size: 26, lr: 4.35e-04 2022-05-05 01:11:51,204 INFO [train.py:715] (0/8) Epoch 4, batch 28600, loss[loss=0.1368, simple_loss=0.2042, pruned_loss=0.03468, over 4775.00 frames.], tot_loss[loss=0.1558, simple_loss=0.2261, pruned_loss=0.04272, over 972638.19 frames.], batch size: 14, lr: 4.35e-04 2022-05-05 01:12:30,864 INFO [train.py:715] (0/8) Epoch 4, batch 28650, loss[loss=0.1602, simple_loss=0.2431, pruned_loss=0.03864, over 4796.00 frames.], tot_loss[loss=0.1559, simple_loss=0.2263, pruned_loss=0.04273, over 972899.13 frames.], batch size: 17, lr: 4.35e-04 2022-05-05 01:13:10,039 INFO [train.py:715] (0/8) Epoch 4, batch 28700, loss[loss=0.1565, simple_loss=0.2122, pruned_loss=0.05037, over 4884.00 frames.], tot_loss[loss=0.156, simple_loss=0.2262, pruned_loss=0.04293, over 971986.94 frames.], batch size: 16, lr: 4.35e-04 2022-05-05 01:13:49,553 INFO [train.py:715] (0/8) Epoch 4, batch 28750, loss[loss=0.1279, simple_loss=0.1812, pruned_loss=0.03731, over 4780.00 frames.], tot_loss[loss=0.1552, simple_loss=0.2253, pruned_loss=0.04254, over 972209.11 frames.], batch size: 12, lr: 4.35e-04 2022-05-05 01:13:58,134 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-168000.pt 2022-05-05 01:14:31,714 INFO [train.py:715] (0/8) Epoch 4, batch 28800, loss[loss=0.1583, simple_loss=0.2469, pruned_loss=0.0348, over 4824.00 frames.], tot_loss[loss=0.1562, simple_loss=0.2266, pruned_loss=0.04287, over 972058.03 frames.], batch size: 25, lr: 4.35e-04 2022-05-05 01:15:10,516 INFO [train.py:715] (0/8) Epoch 4, batch 28850, loss[loss=0.1276, simple_loss=0.1922, pruned_loss=0.03149, over 4828.00 frames.], tot_loss[loss=0.156, simple_loss=0.2263, pruned_loss=0.04285, over 972152.15 frames.], batch size: 13, lr: 4.35e-04 2022-05-05 01:15:50,213 INFO [train.py:715] (0/8) Epoch 4, batch 28900, loss[loss=0.1859, simple_loss=0.2487, pruned_loss=0.06156, over 4837.00 frames.], tot_loss[loss=0.1563, simple_loss=0.2265, pruned_loss=0.04306, over 972609.72 frames.], batch size: 30, lr: 4.35e-04 2022-05-05 01:16:29,313 INFO [train.py:715] (0/8) Epoch 4, batch 28950, loss[loss=0.1668, simple_loss=0.2323, pruned_loss=0.05069, over 4806.00 frames.], tot_loss[loss=0.1563, simple_loss=0.2263, pruned_loss=0.04317, over 972356.97 frames.], batch 
size: 21, lr: 4.35e-04 2022-05-05 01:17:08,543 INFO [train.py:715] (0/8) Epoch 4, batch 29000, loss[loss=0.1527, simple_loss=0.2226, pruned_loss=0.04142, over 4985.00 frames.], tot_loss[loss=0.1564, simple_loss=0.2264, pruned_loss=0.04313, over 973058.77 frames.], batch size: 15, lr: 4.35e-04 2022-05-05 01:17:48,160 INFO [train.py:715] (0/8) Epoch 4, batch 29050, loss[loss=0.1577, simple_loss=0.2338, pruned_loss=0.04075, over 4964.00 frames.], tot_loss[loss=0.1562, simple_loss=0.2267, pruned_loss=0.04285, over 971933.94 frames.], batch size: 24, lr: 4.35e-04 2022-05-05 01:18:28,178 INFO [train.py:715] (0/8) Epoch 4, batch 29100, loss[loss=0.1313, simple_loss=0.2025, pruned_loss=0.03, over 4822.00 frames.], tot_loss[loss=0.1558, simple_loss=0.2262, pruned_loss=0.04272, over 971228.03 frames.], batch size: 12, lr: 4.35e-04 2022-05-05 01:19:07,856 INFO [train.py:715] (0/8) Epoch 4, batch 29150, loss[loss=0.1576, simple_loss=0.2252, pruned_loss=0.04503, over 4826.00 frames.], tot_loss[loss=0.1562, simple_loss=0.2261, pruned_loss=0.04311, over 972050.15 frames.], batch size: 26, lr: 4.35e-04 2022-05-05 01:19:46,737 INFO [train.py:715] (0/8) Epoch 4, batch 29200, loss[loss=0.1773, simple_loss=0.2332, pruned_loss=0.06075, over 4785.00 frames.], tot_loss[loss=0.1557, simple_loss=0.2254, pruned_loss=0.04302, over 971955.77 frames.], batch size: 17, lr: 4.35e-04 2022-05-05 01:20:26,108 INFO [train.py:715] (0/8) Epoch 4, batch 29250, loss[loss=0.1864, simple_loss=0.2326, pruned_loss=0.07012, over 4841.00 frames.], tot_loss[loss=0.1549, simple_loss=0.2247, pruned_loss=0.04253, over 972301.00 frames.], batch size: 13, lr: 4.34e-04 2022-05-05 01:21:05,005 INFO [train.py:715] (0/8) Epoch 4, batch 29300, loss[loss=0.1381, simple_loss=0.2181, pruned_loss=0.02907, over 4818.00 frames.], tot_loss[loss=0.1552, simple_loss=0.225, pruned_loss=0.04274, over 972357.18 frames.], batch size: 27, lr: 4.34e-04 2022-05-05 01:21:43,980 INFO [train.py:715] (0/8) Epoch 4, batch 29350, loss[loss=0.1545, simple_loss=0.2217, pruned_loss=0.04366, over 4940.00 frames.], tot_loss[loss=0.154, simple_loss=0.2239, pruned_loss=0.04203, over 972358.25 frames.], batch size: 40, lr: 4.34e-04 2022-05-05 01:22:22,965 INFO [train.py:715] (0/8) Epoch 4, batch 29400, loss[loss=0.1671, simple_loss=0.2262, pruned_loss=0.05396, over 4956.00 frames.], tot_loss[loss=0.1535, simple_loss=0.2233, pruned_loss=0.0419, over 971535.15 frames.], batch size: 21, lr: 4.34e-04 2022-05-05 01:23:02,045 INFO [train.py:715] (0/8) Epoch 4, batch 29450, loss[loss=0.1395, simple_loss=0.2086, pruned_loss=0.03521, over 4811.00 frames.], tot_loss[loss=0.1533, simple_loss=0.2229, pruned_loss=0.04187, over 972580.80 frames.], batch size: 13, lr: 4.34e-04 2022-05-05 01:23:41,624 INFO [train.py:715] (0/8) Epoch 4, batch 29500, loss[loss=0.1725, simple_loss=0.2278, pruned_loss=0.05855, over 4966.00 frames.], tot_loss[loss=0.1541, simple_loss=0.2237, pruned_loss=0.04221, over 973110.32 frames.], batch size: 35, lr: 4.34e-04 2022-05-05 01:24:20,880 INFO [train.py:715] (0/8) Epoch 4, batch 29550, loss[loss=0.1628, simple_loss=0.2331, pruned_loss=0.04625, over 4905.00 frames.], tot_loss[loss=0.1532, simple_loss=0.223, pruned_loss=0.04171, over 973376.21 frames.], batch size: 39, lr: 4.34e-04 2022-05-05 01:25:00,164 INFO [train.py:715] (0/8) Epoch 4, batch 29600, loss[loss=0.1317, simple_loss=0.2137, pruned_loss=0.02489, over 4954.00 frames.], tot_loss[loss=0.1537, simple_loss=0.2234, pruned_loss=0.04196, over 973258.72 frames.], batch size: 21, lr: 4.34e-04 2022-05-05 
01:25:39,284 INFO [train.py:715] (0/8) Epoch 4, batch 29650, loss[loss=0.1529, simple_loss=0.2311, pruned_loss=0.03734, over 4767.00 frames.], tot_loss[loss=0.1534, simple_loss=0.2229, pruned_loss=0.0419, over 973364.00 frames.], batch size: 17, lr: 4.34e-04 2022-05-05 01:26:18,056 INFO [train.py:715] (0/8) Epoch 4, batch 29700, loss[loss=0.1667, simple_loss=0.2321, pruned_loss=0.05065, over 4985.00 frames.], tot_loss[loss=0.1547, simple_loss=0.2241, pruned_loss=0.04265, over 973417.29 frames.], batch size: 33, lr: 4.34e-04 2022-05-05 01:26:57,625 INFO [train.py:715] (0/8) Epoch 4, batch 29750, loss[loss=0.1315, simple_loss=0.2001, pruned_loss=0.03147, over 4749.00 frames.], tot_loss[loss=0.1553, simple_loss=0.2248, pruned_loss=0.04287, over 973171.32 frames.], batch size: 16, lr: 4.34e-04 2022-05-05 01:27:36,802 INFO [train.py:715] (0/8) Epoch 4, batch 29800, loss[loss=0.1654, simple_loss=0.232, pruned_loss=0.04935, over 4838.00 frames.], tot_loss[loss=0.1549, simple_loss=0.2248, pruned_loss=0.04252, over 972217.67 frames.], batch size: 32, lr: 4.34e-04 2022-05-05 01:28:16,335 INFO [train.py:715] (0/8) Epoch 4, batch 29850, loss[loss=0.1805, simple_loss=0.2546, pruned_loss=0.0532, over 4907.00 frames.], tot_loss[loss=0.1552, simple_loss=0.2249, pruned_loss=0.04271, over 971489.75 frames.], batch size: 19, lr: 4.34e-04 2022-05-05 01:28:55,200 INFO [train.py:715] (0/8) Epoch 4, batch 29900, loss[loss=0.1346, simple_loss=0.2026, pruned_loss=0.03331, over 4909.00 frames.], tot_loss[loss=0.1556, simple_loss=0.2255, pruned_loss=0.04287, over 971620.97 frames.], batch size: 18, lr: 4.34e-04 2022-05-05 01:29:34,839 INFO [train.py:715] (0/8) Epoch 4, batch 29950, loss[loss=0.1646, simple_loss=0.2403, pruned_loss=0.04443, over 4816.00 frames.], tot_loss[loss=0.1564, simple_loss=0.226, pruned_loss=0.04343, over 970758.05 frames.], batch size: 15, lr: 4.34e-04 2022-05-05 01:30:13,991 INFO [train.py:715] (0/8) Epoch 4, batch 30000, loss[loss=0.1402, simple_loss=0.2118, pruned_loss=0.03431, over 4883.00 frames.], tot_loss[loss=0.1557, simple_loss=0.2251, pruned_loss=0.04308, over 971149.52 frames.], batch size: 22, lr: 4.34e-04 2022-05-05 01:30:13,994 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 01:30:23,829 INFO [train.py:742] (0/8) Epoch 4, validation: loss=0.1113, simple_loss=0.1968, pruned_loss=0.01286, over 914524.00 frames. 
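Earlier in this section a checkpoint was written to pruned_transducer_stateless2/exp/v2/checkpoint-168000.pt (checkpoint.py:70). A minimal sketch for inspecting such a file offline, assuming a local copy exists and that it is an ordinary torch.save'd dict; no particular key layout is assumed beyond checking whether a "model" entry is present.

```python
import torch

# Local copy of the checkpoint file named in the log above (path as logged).
CKPT = "pruned_transducer_stateless2/exp/v2/checkpoint-168000.pt"

# CPU-only load, just to look inside; no model code is required for this.
ckpt = torch.load(CKPT, map_location="cpu")
print(sorted(ckpt.keys()))

# If a "model" state dict is present, count its parameters (layout not assumed).
if "model" in ckpt:
    n_params = sum(v.numel() for v in ckpt["model"].values() if torch.is_tensor(v))
    print(f"parameters in saved state dict: {n_params}")
```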
2022-05-05 01:31:03,993 INFO [train.py:715] (0/8) Epoch 4, batch 30050, loss[loss=0.1548, simple_loss=0.2221, pruned_loss=0.04373, over 4829.00 frames.], tot_loss[loss=0.1542, simple_loss=0.2239, pruned_loss=0.0423, over 971515.08 frames.], batch size: 13, lr: 4.33e-04 2022-05-05 01:31:43,421 INFO [train.py:715] (0/8) Epoch 4, batch 30100, loss[loss=0.1401, simple_loss=0.2171, pruned_loss=0.03156, over 4807.00 frames.], tot_loss[loss=0.1538, simple_loss=0.2237, pruned_loss=0.04198, over 972165.67 frames.], batch size: 21, lr: 4.33e-04 2022-05-05 01:32:23,323 INFO [train.py:715] (0/8) Epoch 4, batch 30150, loss[loss=0.1279, simple_loss=0.2, pruned_loss=0.02792, over 4813.00 frames.], tot_loss[loss=0.153, simple_loss=0.223, pruned_loss=0.0415, over 971953.19 frames.], batch size: 21, lr: 4.33e-04 2022-05-05 01:33:02,791 INFO [train.py:715] (0/8) Epoch 4, batch 30200, loss[loss=0.147, simple_loss=0.2095, pruned_loss=0.04226, over 4695.00 frames.], tot_loss[loss=0.1534, simple_loss=0.2236, pruned_loss=0.04161, over 972346.42 frames.], batch size: 15, lr: 4.33e-04 2022-05-05 01:33:42,427 INFO [train.py:715] (0/8) Epoch 4, batch 30250, loss[loss=0.212, simple_loss=0.2733, pruned_loss=0.07529, over 4910.00 frames.], tot_loss[loss=0.1539, simple_loss=0.2242, pruned_loss=0.04176, over 972186.26 frames.], batch size: 17, lr: 4.33e-04 2022-05-05 01:34:21,596 INFO [train.py:715] (0/8) Epoch 4, batch 30300, loss[loss=0.1484, simple_loss=0.2128, pruned_loss=0.04203, over 4982.00 frames.], tot_loss[loss=0.1536, simple_loss=0.2237, pruned_loss=0.0417, over 972445.08 frames.], batch size: 35, lr: 4.33e-04 2022-05-05 01:35:01,078 INFO [train.py:715] (0/8) Epoch 4, batch 30350, loss[loss=0.1895, simple_loss=0.25, pruned_loss=0.06445, over 4834.00 frames.], tot_loss[loss=0.1539, simple_loss=0.2239, pruned_loss=0.04201, over 972667.40 frames.], batch size: 30, lr: 4.33e-04 2022-05-05 01:35:41,058 INFO [train.py:715] (0/8) Epoch 4, batch 30400, loss[loss=0.1363, simple_loss=0.2111, pruned_loss=0.03076, over 4769.00 frames.], tot_loss[loss=0.1542, simple_loss=0.2243, pruned_loss=0.04209, over 973016.86 frames.], batch size: 17, lr: 4.33e-04 2022-05-05 01:36:20,213 INFO [train.py:715] (0/8) Epoch 4, batch 30450, loss[loss=0.1398, simple_loss=0.2104, pruned_loss=0.0346, over 4759.00 frames.], tot_loss[loss=0.1553, simple_loss=0.2255, pruned_loss=0.04261, over 972977.70 frames.], batch size: 14, lr: 4.33e-04 2022-05-05 01:36:59,977 INFO [train.py:715] (0/8) Epoch 4, batch 30500, loss[loss=0.1589, simple_loss=0.2347, pruned_loss=0.04152, over 4762.00 frames.], tot_loss[loss=0.156, simple_loss=0.2261, pruned_loss=0.0429, over 972894.53 frames.], batch size: 19, lr: 4.33e-04 2022-05-05 01:37:40,027 INFO [train.py:715] (0/8) Epoch 4, batch 30550, loss[loss=0.1522, simple_loss=0.221, pruned_loss=0.04165, over 4862.00 frames.], tot_loss[loss=0.1551, simple_loss=0.225, pruned_loss=0.04257, over 973753.14 frames.], batch size: 20, lr: 4.33e-04 2022-05-05 01:38:19,334 INFO [train.py:715] (0/8) Epoch 4, batch 30600, loss[loss=0.1435, simple_loss=0.2136, pruned_loss=0.03665, over 4750.00 frames.], tot_loss[loss=0.1552, simple_loss=0.225, pruned_loss=0.04274, over 973640.85 frames.], batch size: 19, lr: 4.33e-04 2022-05-05 01:38:58,942 INFO [train.py:715] (0/8) Epoch 4, batch 30650, loss[loss=0.1832, simple_loss=0.2551, pruned_loss=0.05567, over 4931.00 frames.], tot_loss[loss=0.1558, simple_loss=0.2254, pruned_loss=0.04309, over 973695.22 frames.], batch size: 39, lr: 4.33e-04 2022-05-05 01:39:38,412 INFO [train.py:715] (0/8) 
Epoch 4, batch 30700, loss[loss=0.1202, simple_loss=0.1906, pruned_loss=0.0249, over 4969.00 frames.], tot_loss[loss=0.155, simple_loss=0.225, pruned_loss=0.04254, over 973968.05 frames.], batch size: 28, lr: 4.33e-04 2022-05-05 01:40:18,144 INFO [train.py:715] (0/8) Epoch 4, batch 30750, loss[loss=0.1846, simple_loss=0.257, pruned_loss=0.05609, over 4943.00 frames.], tot_loss[loss=0.155, simple_loss=0.2248, pruned_loss=0.04259, over 973922.81 frames.], batch size: 35, lr: 4.33e-04 2022-05-05 01:40:57,686 INFO [train.py:715] (0/8) Epoch 4, batch 30800, loss[loss=0.1576, simple_loss=0.2262, pruned_loss=0.04447, over 4903.00 frames.], tot_loss[loss=0.1554, simple_loss=0.2255, pruned_loss=0.04262, over 973100.68 frames.], batch size: 19, lr: 4.32e-04 2022-05-05 01:41:37,512 INFO [train.py:715] (0/8) Epoch 4, batch 30850, loss[loss=0.1414, simple_loss=0.2059, pruned_loss=0.03839, over 4856.00 frames.], tot_loss[loss=0.156, simple_loss=0.226, pruned_loss=0.04298, over 973800.42 frames.], batch size: 20, lr: 4.32e-04 2022-05-05 01:42:17,788 INFO [train.py:715] (0/8) Epoch 4, batch 30900, loss[loss=0.152, simple_loss=0.2189, pruned_loss=0.04252, over 4851.00 frames.], tot_loss[loss=0.1556, simple_loss=0.2256, pruned_loss=0.04282, over 974574.48 frames.], batch size: 12, lr: 4.32e-04 2022-05-05 01:42:57,264 INFO [train.py:715] (0/8) Epoch 4, batch 30950, loss[loss=0.1874, simple_loss=0.2601, pruned_loss=0.05741, over 4899.00 frames.], tot_loss[loss=0.1561, simple_loss=0.2259, pruned_loss=0.04314, over 974114.57 frames.], batch size: 23, lr: 4.32e-04 2022-05-05 01:43:36,634 INFO [train.py:715] (0/8) Epoch 4, batch 31000, loss[loss=0.1891, simple_loss=0.2443, pruned_loss=0.06691, over 4784.00 frames.], tot_loss[loss=0.1561, simple_loss=0.226, pruned_loss=0.04313, over 973952.46 frames.], batch size: 17, lr: 4.32e-04 2022-05-05 01:44:16,109 INFO [train.py:715] (0/8) Epoch 4, batch 31050, loss[loss=0.1806, simple_loss=0.2565, pruned_loss=0.05231, over 4877.00 frames.], tot_loss[loss=0.1566, simple_loss=0.2264, pruned_loss=0.04336, over 973868.32 frames.], batch size: 32, lr: 4.32e-04 2022-05-05 01:44:55,520 INFO [train.py:715] (0/8) Epoch 4, batch 31100, loss[loss=0.1408, simple_loss=0.212, pruned_loss=0.0348, over 4974.00 frames.], tot_loss[loss=0.156, simple_loss=0.2256, pruned_loss=0.04315, over 974360.36 frames.], batch size: 35, lr: 4.32e-04 2022-05-05 01:45:35,031 INFO [train.py:715] (0/8) Epoch 4, batch 31150, loss[loss=0.186, simple_loss=0.2613, pruned_loss=0.05539, over 4834.00 frames.], tot_loss[loss=0.1559, simple_loss=0.2258, pruned_loss=0.04302, over 973888.82 frames.], batch size: 26, lr: 4.32e-04 2022-05-05 01:46:13,902 INFO [train.py:715] (0/8) Epoch 4, batch 31200, loss[loss=0.1577, simple_loss=0.2323, pruned_loss=0.04154, over 4838.00 frames.], tot_loss[loss=0.1558, simple_loss=0.2258, pruned_loss=0.04293, over 974176.55 frames.], batch size: 32, lr: 4.32e-04 2022-05-05 01:46:53,972 INFO [train.py:715] (0/8) Epoch 4, batch 31250, loss[loss=0.1562, simple_loss=0.2298, pruned_loss=0.04132, over 4949.00 frames.], tot_loss[loss=0.1565, simple_loss=0.2265, pruned_loss=0.04331, over 974089.48 frames.], batch size: 21, lr: 4.32e-04 2022-05-05 01:47:33,182 INFO [train.py:715] (0/8) Epoch 4, batch 31300, loss[loss=0.1668, simple_loss=0.2428, pruned_loss=0.04542, over 4754.00 frames.], tot_loss[loss=0.1565, simple_loss=0.2264, pruned_loss=0.04328, over 973457.03 frames.], batch size: 14, lr: 4.32e-04 2022-05-05 01:48:12,189 INFO [train.py:715] (0/8) Epoch 4, batch 31350, loss[loss=0.149, 
simple_loss=0.2298, pruned_loss=0.0341, over 4828.00 frames.], tot_loss[loss=0.157, simple_loss=0.2267, pruned_loss=0.04364, over 974027.96 frames.], batch size: 25, lr: 4.32e-04 2022-05-05 01:48:52,071 INFO [train.py:715] (0/8) Epoch 4, batch 31400, loss[loss=0.1508, simple_loss=0.2222, pruned_loss=0.03971, over 4821.00 frames.], tot_loss[loss=0.1569, simple_loss=0.2268, pruned_loss=0.04351, over 973342.25 frames.], batch size: 26, lr: 4.32e-04 2022-05-05 01:49:31,802 INFO [train.py:715] (0/8) Epoch 4, batch 31450, loss[loss=0.1934, simple_loss=0.253, pruned_loss=0.06694, over 4734.00 frames.], tot_loss[loss=0.156, simple_loss=0.2258, pruned_loss=0.04306, over 973214.70 frames.], batch size: 16, lr: 4.32e-04 2022-05-05 01:50:11,368 INFO [train.py:715] (0/8) Epoch 4, batch 31500, loss[loss=0.1714, simple_loss=0.2456, pruned_loss=0.0486, over 4881.00 frames.], tot_loss[loss=0.157, simple_loss=0.2266, pruned_loss=0.04372, over 972431.61 frames.], batch size: 16, lr: 4.32e-04 2022-05-05 01:50:51,745 INFO [train.py:715] (0/8) Epoch 4, batch 31550, loss[loss=0.1572, simple_loss=0.2191, pruned_loss=0.04766, over 4770.00 frames.], tot_loss[loss=0.1577, simple_loss=0.2271, pruned_loss=0.0441, over 972421.76 frames.], batch size: 17, lr: 4.32e-04 2022-05-05 01:51:32,269 INFO [train.py:715] (0/8) Epoch 4, batch 31600, loss[loss=0.1283, simple_loss=0.1923, pruned_loss=0.03209, over 4781.00 frames.], tot_loss[loss=0.1579, simple_loss=0.2272, pruned_loss=0.04431, over 973767.31 frames.], batch size: 14, lr: 4.31e-04 2022-05-05 01:52:11,914 INFO [train.py:715] (0/8) Epoch 4, batch 31650, loss[loss=0.133, simple_loss=0.2044, pruned_loss=0.03082, over 4916.00 frames.], tot_loss[loss=0.157, simple_loss=0.2265, pruned_loss=0.0437, over 974091.78 frames.], batch size: 18, lr: 4.31e-04 2022-05-05 01:52:51,497 INFO [train.py:715] (0/8) Epoch 4, batch 31700, loss[loss=0.1705, simple_loss=0.2494, pruned_loss=0.04575, over 4780.00 frames.], tot_loss[loss=0.1575, simple_loss=0.2272, pruned_loss=0.04386, over 973990.02 frames.], batch size: 19, lr: 4.31e-04 2022-05-05 01:53:31,556 INFO [train.py:715] (0/8) Epoch 4, batch 31750, loss[loss=0.1446, simple_loss=0.2113, pruned_loss=0.03895, over 4822.00 frames.], tot_loss[loss=0.1569, simple_loss=0.2269, pruned_loss=0.04342, over 973398.36 frames.], batch size: 13, lr: 4.31e-04 2022-05-05 01:54:11,601 INFO [train.py:715] (0/8) Epoch 4, batch 31800, loss[loss=0.1586, simple_loss=0.2413, pruned_loss=0.03789, over 4772.00 frames.], tot_loss[loss=0.1571, simple_loss=0.2277, pruned_loss=0.04331, over 972662.83 frames.], batch size: 17, lr: 4.31e-04 2022-05-05 01:54:51,199 INFO [train.py:715] (0/8) Epoch 4, batch 31850, loss[loss=0.1505, simple_loss=0.2245, pruned_loss=0.03827, over 4911.00 frames.], tot_loss[loss=0.1573, simple_loss=0.2277, pruned_loss=0.04339, over 972390.25 frames.], batch size: 17, lr: 4.31e-04 2022-05-05 01:55:30,803 INFO [train.py:715] (0/8) Epoch 4, batch 31900, loss[loss=0.1413, simple_loss=0.2026, pruned_loss=0.04004, over 4784.00 frames.], tot_loss[loss=0.1569, simple_loss=0.227, pruned_loss=0.04338, over 972544.65 frames.], batch size: 18, lr: 4.31e-04 2022-05-05 01:56:11,028 INFO [train.py:715] (0/8) Epoch 4, batch 31950, loss[loss=0.1834, simple_loss=0.2537, pruned_loss=0.05658, over 4825.00 frames.], tot_loss[loss=0.1564, simple_loss=0.2265, pruned_loss=0.04318, over 972484.21 frames.], batch size: 27, lr: 4.31e-04 2022-05-05 01:56:50,986 INFO [train.py:715] (0/8) Epoch 4, batch 32000, loss[loss=0.1385, simple_loss=0.211, pruned_loss=0.03295, 
over 4959.00 frames.], tot_loss[loss=0.1563, simple_loss=0.2266, pruned_loss=0.04294, over 972016.05 frames.], batch size: 15, lr: 4.31e-04 2022-05-05 01:57:30,373 INFO [train.py:715] (0/8) Epoch 4, batch 32050, loss[loss=0.1335, simple_loss=0.2127, pruned_loss=0.02714, over 4927.00 frames.], tot_loss[loss=0.155, simple_loss=0.2253, pruned_loss=0.04232, over 971419.60 frames.], batch size: 18, lr: 4.31e-04 2022-05-05 01:58:10,940 INFO [train.py:715] (0/8) Epoch 4, batch 32100, loss[loss=0.151, simple_loss=0.2211, pruned_loss=0.0405, over 4978.00 frames.], tot_loss[loss=0.1534, simple_loss=0.2237, pruned_loss=0.04151, over 971956.46 frames.], batch size: 14, lr: 4.31e-04 2022-05-05 01:58:50,867 INFO [train.py:715] (0/8) Epoch 4, batch 32150, loss[loss=0.1377, simple_loss=0.2004, pruned_loss=0.03756, over 4964.00 frames.], tot_loss[loss=0.1545, simple_loss=0.2247, pruned_loss=0.04221, over 973058.81 frames.], batch size: 15, lr: 4.31e-04 2022-05-05 01:59:30,406 INFO [train.py:715] (0/8) Epoch 4, batch 32200, loss[loss=0.1553, simple_loss=0.2305, pruned_loss=0.04005, over 4839.00 frames.], tot_loss[loss=0.155, simple_loss=0.2249, pruned_loss=0.04253, over 972506.65 frames.], batch size: 32, lr: 4.31e-04 2022-05-05 02:00:10,363 INFO [train.py:715] (0/8) Epoch 4, batch 32250, loss[loss=0.1399, simple_loss=0.2105, pruned_loss=0.03469, over 4776.00 frames.], tot_loss[loss=0.1544, simple_loss=0.2244, pruned_loss=0.04223, over 971881.18 frames.], batch size: 18, lr: 4.31e-04 2022-05-05 02:00:51,157 INFO [train.py:715] (0/8) Epoch 4, batch 32300, loss[loss=0.1488, simple_loss=0.2179, pruned_loss=0.03988, over 4767.00 frames.], tot_loss[loss=0.1548, simple_loss=0.225, pruned_loss=0.04229, over 971467.32 frames.], batch size: 18, lr: 4.31e-04 2022-05-05 02:01:31,942 INFO [train.py:715] (0/8) Epoch 4, batch 32350, loss[loss=0.1408, simple_loss=0.2069, pruned_loss=0.03728, over 4981.00 frames.], tot_loss[loss=0.1544, simple_loss=0.2246, pruned_loss=0.04207, over 972011.26 frames.], batch size: 14, lr: 4.31e-04 2022-05-05 02:02:12,275 INFO [train.py:715] (0/8) Epoch 4, batch 32400, loss[loss=0.1638, simple_loss=0.2327, pruned_loss=0.04744, over 4960.00 frames.], tot_loss[loss=0.1553, simple_loss=0.2254, pruned_loss=0.04254, over 971642.71 frames.], batch size: 24, lr: 4.30e-04 2022-05-05 02:02:52,626 INFO [train.py:715] (0/8) Epoch 4, batch 32450, loss[loss=0.1514, simple_loss=0.2233, pruned_loss=0.03976, over 4832.00 frames.], tot_loss[loss=0.1555, simple_loss=0.2257, pruned_loss=0.04267, over 970420.85 frames.], batch size: 15, lr: 4.30e-04 2022-05-05 02:03:31,863 INFO [train.py:715] (0/8) Epoch 4, batch 32500, loss[loss=0.172, simple_loss=0.2366, pruned_loss=0.05371, over 4939.00 frames.], tot_loss[loss=0.1551, simple_loss=0.2254, pruned_loss=0.04246, over 970951.54 frames.], batch size: 23, lr: 4.30e-04 2022-05-05 02:04:11,771 INFO [train.py:715] (0/8) Epoch 4, batch 32550, loss[loss=0.1677, simple_loss=0.2321, pruned_loss=0.05161, over 4991.00 frames.], tot_loss[loss=0.1543, simple_loss=0.2244, pruned_loss=0.0421, over 971498.68 frames.], batch size: 15, lr: 4.30e-04 2022-05-05 02:04:50,739 INFO [train.py:715] (0/8) Epoch 4, batch 32600, loss[loss=0.1659, simple_loss=0.2269, pruned_loss=0.0524, over 4857.00 frames.], tot_loss[loss=0.1541, simple_loss=0.2239, pruned_loss=0.04221, over 972089.53 frames.], batch size: 32, lr: 4.30e-04 2022-05-05 02:05:30,803 INFO [train.py:715] (0/8) Epoch 4, batch 32650, loss[loss=0.1717, simple_loss=0.2367, pruned_loss=0.05331, over 4705.00 frames.], 
tot_loss[loss=0.1551, simple_loss=0.2249, pruned_loss=0.04265, over 972941.91 frames.], batch size: 15, lr: 4.30e-04 2022-05-05 02:06:09,914 INFO [train.py:715] (0/8) Epoch 4, batch 32700, loss[loss=0.1344, simple_loss=0.2091, pruned_loss=0.02983, over 4944.00 frames.], tot_loss[loss=0.1547, simple_loss=0.2245, pruned_loss=0.04248, over 972422.83 frames.], batch size: 29, lr: 4.30e-04 2022-05-05 02:06:49,546 INFO [train.py:715] (0/8) Epoch 4, batch 32750, loss[loss=0.1489, simple_loss=0.2191, pruned_loss=0.03939, over 4740.00 frames.], tot_loss[loss=0.1545, simple_loss=0.2243, pruned_loss=0.04231, over 972305.76 frames.], batch size: 19, lr: 4.30e-04 2022-05-05 02:07:29,260 INFO [train.py:715] (0/8) Epoch 4, batch 32800, loss[loss=0.1522, simple_loss=0.2258, pruned_loss=0.03933, over 4842.00 frames.], tot_loss[loss=0.1559, simple_loss=0.2255, pruned_loss=0.04311, over 971901.98 frames.], batch size: 15, lr: 4.30e-04 2022-05-05 02:08:09,335 INFO [train.py:715] (0/8) Epoch 4, batch 32850, loss[loss=0.1416, simple_loss=0.2228, pruned_loss=0.03013, over 4971.00 frames.], tot_loss[loss=0.1552, simple_loss=0.2248, pruned_loss=0.04277, over 971891.63 frames.], batch size: 28, lr: 4.30e-04 2022-05-05 02:08:49,842 INFO [train.py:715] (0/8) Epoch 4, batch 32900, loss[loss=0.1498, simple_loss=0.2216, pruned_loss=0.03895, over 4855.00 frames.], tot_loss[loss=0.1543, simple_loss=0.2242, pruned_loss=0.04221, over 972154.80 frames.], batch size: 20, lr: 4.30e-04 2022-05-05 02:09:30,072 INFO [train.py:715] (0/8) Epoch 4, batch 32950, loss[loss=0.1562, simple_loss=0.2263, pruned_loss=0.043, over 4970.00 frames.], tot_loss[loss=0.1537, simple_loss=0.2238, pruned_loss=0.04183, over 971344.83 frames.], batch size: 35, lr: 4.30e-04 2022-05-05 02:10:10,303 INFO [train.py:715] (0/8) Epoch 4, batch 33000, loss[loss=0.1789, simple_loss=0.2367, pruned_loss=0.0605, over 4953.00 frames.], tot_loss[loss=0.1544, simple_loss=0.2248, pruned_loss=0.04202, over 971265.18 frames.], batch size: 39, lr: 4.30e-04 2022-05-05 02:10:10,304 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 02:10:20,092 INFO [train.py:742] (0/8) Epoch 4, validation: loss=0.1115, simple_loss=0.197, pruned_loss=0.01298, over 914524.00 frames. 
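The per-batch records above all share one shape: a loss[...] block for the current batch, a tot_loss[...] block aggregated over a much larger number of frames, the batch size, and the current learning rate, while a separate validation record is emitted every 3,000 batches in this section. Below is a minimal, self-contained Python sketch for pulling those numbers out of a log like this one; the regular expressions and field names are inferred only from the record format visible here and are not part of train.py.

import re

# Matches the aggregate training statistics printed at each logging step, e.g.
#   Epoch 4, batch 32000, loss[...], tot_loss[loss=0.1563, simple_loss=0.2266,
#   pruned_loss=0.04294, over 972016.05 frames.], batch size: 15, lr: 4.31e-04
TRAIN_RE = re.compile(
    r"Epoch (?P<epoch>\d+), batch (?P<batch>\d+),.*?"
    r"tot_loss\[loss=(?P<loss>[\d.]+), simple_loss=(?P<simple>[\d.]+), "
    r"pruned_loss=(?P<pruned>[\d.]+), over (?P<frames>[\d.]+) frames\.\], "
    r"batch size: \d+, lr: (?P<lr>[\d.e-]+)",
    re.DOTALL,  # records may be wrapped across physical lines
)

# Matches the periodic validation summaries, e.g.
#   Epoch 4, validation: loss=0.1115, simple_loss=0.197, pruned_loss=0.01298, ...
VALID_RE = re.compile(
    r"Epoch (?P<epoch>\d+), validation: loss=(?P<loss>[\d.]+), "
    r"simple_loss=(?P<simple>[\d.]+), pruned_loss=(?P<pruned>[\d.]+)"
)

def parse_log(path):
    """Return (train_records, valid_records) as lists of dicts of floats."""
    with open(path) as f:
        text = f.read()
    train = [
        {k: float(v) for k, v in m.groupdict().items()}
        for m in TRAIN_RE.finditer(text)
    ]
    valid = [
        {k: float(v) for k, v in m.groupdict().items()}
        for m in VALID_RE.finditer(text)
    ]
    return train, valid

Applied to this section, such a parser would show, for example, the validation loss easing from 0.1115 at epoch 4, batch 33000 to 0.1108, 0.1106 and 0.1105 at epoch 5, batches 3000, 6000 and 9000.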
2022-05-05 02:11:00,296 INFO [train.py:715] (0/8) Epoch 4, batch 33050, loss[loss=0.1812, simple_loss=0.2406, pruned_loss=0.06091, over 4910.00 frames.], tot_loss[loss=0.1555, simple_loss=0.2254, pruned_loss=0.04276, over 971362.18 frames.], batch size: 18, lr: 4.30e-04 2022-05-05 02:11:40,006 INFO [train.py:715] (0/8) Epoch 4, batch 33100, loss[loss=0.1708, simple_loss=0.2399, pruned_loss=0.05087, over 4796.00 frames.], tot_loss[loss=0.1554, simple_loss=0.2254, pruned_loss=0.04272, over 971850.42 frames.], batch size: 24, lr: 4.30e-04 2022-05-05 02:12:20,027 INFO [train.py:715] (0/8) Epoch 4, batch 33150, loss[loss=0.1646, simple_loss=0.2306, pruned_loss=0.04928, over 4816.00 frames.], tot_loss[loss=0.1552, simple_loss=0.2253, pruned_loss=0.04258, over 971540.67 frames.], batch size: 27, lr: 4.30e-04 2022-05-05 02:13:00,226 INFO [train.py:715] (0/8) Epoch 4, batch 33200, loss[loss=0.1585, simple_loss=0.2371, pruned_loss=0.03993, over 4788.00 frames.], tot_loss[loss=0.1551, simple_loss=0.225, pruned_loss=0.0426, over 971766.94 frames.], batch size: 14, lr: 4.29e-04 2022-05-05 02:13:40,204 INFO [train.py:715] (0/8) Epoch 4, batch 33250, loss[loss=0.1396, simple_loss=0.2104, pruned_loss=0.03435, over 4941.00 frames.], tot_loss[loss=0.1547, simple_loss=0.2246, pruned_loss=0.04236, over 972211.10 frames.], batch size: 21, lr: 4.29e-04 2022-05-05 02:14:20,213 INFO [train.py:715] (0/8) Epoch 4, batch 33300, loss[loss=0.1359, simple_loss=0.2129, pruned_loss=0.0294, over 4958.00 frames.], tot_loss[loss=0.1549, simple_loss=0.2251, pruned_loss=0.04238, over 973151.18 frames.], batch size: 21, lr: 4.29e-04 2022-05-05 02:14:59,209 INFO [train.py:715] (0/8) Epoch 4, batch 33350, loss[loss=0.1392, simple_loss=0.2089, pruned_loss=0.03476, over 4987.00 frames.], tot_loss[loss=0.1554, simple_loss=0.2256, pruned_loss=0.04263, over 972070.49 frames.], batch size: 35, lr: 4.29e-04 2022-05-05 02:15:38,983 INFO [train.py:715] (0/8) Epoch 4, batch 33400, loss[loss=0.191, simple_loss=0.2525, pruned_loss=0.06478, over 4849.00 frames.], tot_loss[loss=0.1555, simple_loss=0.2257, pruned_loss=0.04268, over 972537.20 frames.], batch size: 30, lr: 4.29e-04 2022-05-05 02:16:18,844 INFO [train.py:715] (0/8) Epoch 4, batch 33450, loss[loss=0.1722, simple_loss=0.244, pruned_loss=0.05018, over 4791.00 frames.], tot_loss[loss=0.1552, simple_loss=0.2254, pruned_loss=0.04254, over 971689.78 frames.], batch size: 17, lr: 4.29e-04 2022-05-05 02:16:58,396 INFO [train.py:715] (0/8) Epoch 4, batch 33500, loss[loss=0.1736, simple_loss=0.2333, pruned_loss=0.05697, over 4932.00 frames.], tot_loss[loss=0.1554, simple_loss=0.2255, pruned_loss=0.04269, over 971895.04 frames.], batch size: 39, lr: 4.29e-04 2022-05-05 02:17:38,200 INFO [train.py:715] (0/8) Epoch 4, batch 33550, loss[loss=0.1476, simple_loss=0.2155, pruned_loss=0.03984, over 4849.00 frames.], tot_loss[loss=0.1553, simple_loss=0.2255, pruned_loss=0.04249, over 972402.93 frames.], batch size: 32, lr: 4.29e-04 2022-05-05 02:18:17,700 INFO [train.py:715] (0/8) Epoch 4, batch 33600, loss[loss=0.149, simple_loss=0.2119, pruned_loss=0.0431, over 4931.00 frames.], tot_loss[loss=0.156, simple_loss=0.2261, pruned_loss=0.04292, over 971983.97 frames.], batch size: 21, lr: 4.29e-04 2022-05-05 02:18:57,440 INFO [train.py:715] (0/8) Epoch 4, batch 33650, loss[loss=0.1317, simple_loss=0.2085, pruned_loss=0.0274, over 4957.00 frames.], tot_loss[loss=0.1548, simple_loss=0.2249, pruned_loss=0.04236, over 971811.17 frames.], batch size: 15, lr: 4.29e-04 2022-05-05 02:19:36,826 INFO 
[train.py:715] (0/8) Epoch 4, batch 33700, loss[loss=0.1753, simple_loss=0.2484, pruned_loss=0.05111, over 4911.00 frames.], tot_loss[loss=0.154, simple_loss=0.2244, pruned_loss=0.04183, over 971510.49 frames.], batch size: 19, lr: 4.29e-04 2022-05-05 02:20:16,625 INFO [train.py:715] (0/8) Epoch 4, batch 33750, loss[loss=0.1772, simple_loss=0.2292, pruned_loss=0.06258, over 4732.00 frames.], tot_loss[loss=0.1546, simple_loss=0.2248, pruned_loss=0.04216, over 971841.10 frames.], batch size: 12, lr: 4.29e-04 2022-05-05 02:20:56,486 INFO [train.py:715] (0/8) Epoch 4, batch 33800, loss[loss=0.1385, simple_loss=0.2022, pruned_loss=0.03743, over 4940.00 frames.], tot_loss[loss=0.1542, simple_loss=0.2242, pruned_loss=0.04204, over 971659.57 frames.], batch size: 21, lr: 4.29e-04 2022-05-05 02:21:35,972 INFO [train.py:715] (0/8) Epoch 4, batch 33850, loss[loss=0.1364, simple_loss=0.2046, pruned_loss=0.03413, over 4824.00 frames.], tot_loss[loss=0.1541, simple_loss=0.2239, pruned_loss=0.04215, over 972016.20 frames.], batch size: 25, lr: 4.29e-04 2022-05-05 02:22:15,605 INFO [train.py:715] (0/8) Epoch 4, batch 33900, loss[loss=0.1528, simple_loss=0.2165, pruned_loss=0.04456, over 4851.00 frames.], tot_loss[loss=0.1549, simple_loss=0.2248, pruned_loss=0.04257, over 971930.77 frames.], batch size: 15, lr: 4.29e-04 2022-05-05 02:22:55,354 INFO [train.py:715] (0/8) Epoch 4, batch 33950, loss[loss=0.1621, simple_loss=0.231, pruned_loss=0.04662, over 4818.00 frames.], tot_loss[loss=0.1552, simple_loss=0.2248, pruned_loss=0.04279, over 972031.21 frames.], batch size: 27, lr: 4.29e-04 2022-05-05 02:23:35,327 INFO [train.py:715] (0/8) Epoch 4, batch 34000, loss[loss=0.1467, simple_loss=0.2135, pruned_loss=0.03991, over 4692.00 frames.], tot_loss[loss=0.1556, simple_loss=0.2249, pruned_loss=0.0431, over 971958.67 frames.], batch size: 15, lr: 4.28e-04 2022-05-05 02:24:14,851 INFO [train.py:715] (0/8) Epoch 4, batch 34050, loss[loss=0.1309, simple_loss=0.2005, pruned_loss=0.03068, over 4911.00 frames.], tot_loss[loss=0.1559, simple_loss=0.2255, pruned_loss=0.0431, over 972163.99 frames.], batch size: 17, lr: 4.28e-04 2022-05-05 02:24:54,569 INFO [train.py:715] (0/8) Epoch 4, batch 34100, loss[loss=0.1844, simple_loss=0.2495, pruned_loss=0.05965, over 4782.00 frames.], tot_loss[loss=0.156, simple_loss=0.2259, pruned_loss=0.04304, over 972072.76 frames.], batch size: 17, lr: 4.28e-04 2022-05-05 02:25:34,628 INFO [train.py:715] (0/8) Epoch 4, batch 34150, loss[loss=0.1593, simple_loss=0.2311, pruned_loss=0.04374, over 4956.00 frames.], tot_loss[loss=0.1552, simple_loss=0.2255, pruned_loss=0.04249, over 972493.69 frames.], batch size: 21, lr: 4.28e-04 2022-05-05 02:26:13,486 INFO [train.py:715] (0/8) Epoch 4, batch 34200, loss[loss=0.1287, simple_loss=0.2038, pruned_loss=0.02682, over 4817.00 frames.], tot_loss[loss=0.1548, simple_loss=0.2248, pruned_loss=0.04238, over 973389.37 frames.], batch size: 26, lr: 4.28e-04 2022-05-05 02:26:54,316 INFO [train.py:715] (0/8) Epoch 4, batch 34250, loss[loss=0.1283, simple_loss=0.194, pruned_loss=0.03132, over 4829.00 frames.], tot_loss[loss=0.1552, simple_loss=0.2246, pruned_loss=0.04295, over 973325.80 frames.], batch size: 26, lr: 4.28e-04 2022-05-05 02:27:34,190 INFO [train.py:715] (0/8) Epoch 4, batch 34300, loss[loss=0.1611, simple_loss=0.236, pruned_loss=0.04306, over 4845.00 frames.], tot_loss[loss=0.1549, simple_loss=0.2242, pruned_loss=0.04283, over 972967.38 frames.], batch size: 15, lr: 4.28e-04 2022-05-05 02:28:13,942 INFO [train.py:715] (0/8) Epoch 4, batch 
34350, loss[loss=0.1464, simple_loss=0.216, pruned_loss=0.03837, over 4881.00 frames.], tot_loss[loss=0.1541, simple_loss=0.2235, pruned_loss=0.04236, over 972420.61 frames.], batch size: 16, lr: 4.28e-04 2022-05-05 02:28:53,975 INFO [train.py:715] (0/8) Epoch 4, batch 34400, loss[loss=0.1465, simple_loss=0.2201, pruned_loss=0.03645, over 4931.00 frames.], tot_loss[loss=0.1553, simple_loss=0.2247, pruned_loss=0.04295, over 973491.48 frames.], batch size: 21, lr: 4.28e-04 2022-05-05 02:29:33,806 INFO [train.py:715] (0/8) Epoch 4, batch 34450, loss[loss=0.1224, simple_loss=0.1935, pruned_loss=0.02567, over 4979.00 frames.], tot_loss[loss=0.1557, simple_loss=0.2251, pruned_loss=0.04318, over 972433.48 frames.], batch size: 24, lr: 4.28e-04 2022-05-05 02:30:14,467 INFO [train.py:715] (0/8) Epoch 4, batch 34500, loss[loss=0.1404, simple_loss=0.2165, pruned_loss=0.03213, over 4978.00 frames.], tot_loss[loss=0.1556, simple_loss=0.2251, pruned_loss=0.04299, over 972017.44 frames.], batch size: 39, lr: 4.28e-04 2022-05-05 02:30:53,313 INFO [train.py:715] (0/8) Epoch 4, batch 34550, loss[loss=0.2212, simple_loss=0.2787, pruned_loss=0.08188, over 4936.00 frames.], tot_loss[loss=0.1553, simple_loss=0.2251, pruned_loss=0.04274, over 971874.03 frames.], batch size: 21, lr: 4.28e-04 2022-05-05 02:31:33,258 INFO [train.py:715] (0/8) Epoch 4, batch 34600, loss[loss=0.1323, simple_loss=0.2074, pruned_loss=0.02865, over 4846.00 frames.], tot_loss[loss=0.1559, simple_loss=0.2254, pruned_loss=0.04314, over 971555.34 frames.], batch size: 13, lr: 4.28e-04 2022-05-05 02:32:13,236 INFO [train.py:715] (0/8) Epoch 4, batch 34650, loss[loss=0.1524, simple_loss=0.2272, pruned_loss=0.03879, over 4833.00 frames.], tot_loss[loss=0.1557, simple_loss=0.2252, pruned_loss=0.04312, over 971773.13 frames.], batch size: 15, lr: 4.28e-04 2022-05-05 02:32:52,589 INFO [train.py:715] (0/8) Epoch 4, batch 34700, loss[loss=0.1543, simple_loss=0.2216, pruned_loss=0.04354, over 4873.00 frames.], tot_loss[loss=0.1551, simple_loss=0.2248, pruned_loss=0.04276, over 972482.58 frames.], batch size: 16, lr: 4.28e-04 2022-05-05 02:33:30,871 INFO [train.py:715] (0/8) Epoch 4, batch 34750, loss[loss=0.1423, simple_loss=0.2176, pruned_loss=0.03356, over 4928.00 frames.], tot_loss[loss=0.1548, simple_loss=0.2246, pruned_loss=0.04247, over 972524.45 frames.], batch size: 18, lr: 4.28e-04 2022-05-05 02:34:07,931 INFO [train.py:715] (0/8) Epoch 4, batch 34800, loss[loss=0.1902, simple_loss=0.2658, pruned_loss=0.05732, over 4885.00 frames.], tot_loss[loss=0.1561, simple_loss=0.2261, pruned_loss=0.04306, over 973029.01 frames.], batch size: 19, lr: 4.27e-04 2022-05-05 02:34:16,613 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/epoch-4.pt 2022-05-05 02:34:57,763 INFO [train.py:715] (0/8) Epoch 5, batch 0, loss[loss=0.1623, simple_loss=0.2273, pruned_loss=0.04866, over 4772.00 frames.], tot_loss[loss=0.1623, simple_loss=0.2273, pruned_loss=0.04866, over 4772.00 frames.], batch size: 17, lr: 4.02e-04 2022-05-05 02:35:38,098 INFO [train.py:715] (0/8) Epoch 5, batch 50, loss[loss=0.1381, simple_loss=0.2089, pruned_loss=0.03363, over 4979.00 frames.], tot_loss[loss=0.1499, simple_loss=0.2214, pruned_loss=0.03917, over 219743.34 frames.], batch size: 25, lr: 4.02e-04 2022-05-05 02:36:17,797 INFO [train.py:715] (0/8) Epoch 5, batch 100, loss[loss=0.1622, simple_loss=0.2255, pruned_loss=0.04943, over 4865.00 frames.], tot_loss[loss=0.153, simple_loss=0.2236, pruned_loss=0.04123, over 386266.16 frames.], batch size: 30, 
lr: 4.02e-04 2022-05-05 02:36:57,765 INFO [train.py:715] (0/8) Epoch 5, batch 150, loss[loss=0.1529, simple_loss=0.2303, pruned_loss=0.03777, over 4760.00 frames.], tot_loss[loss=0.1547, simple_loss=0.2252, pruned_loss=0.04215, over 515675.58 frames.], batch size: 14, lr: 4.02e-04 2022-05-05 02:37:38,286 INFO [train.py:715] (0/8) Epoch 5, batch 200, loss[loss=0.1736, simple_loss=0.2361, pruned_loss=0.05558, over 4747.00 frames.], tot_loss[loss=0.1555, simple_loss=0.2255, pruned_loss=0.04273, over 616920.34 frames.], batch size: 16, lr: 4.02e-04 2022-05-05 02:38:17,734 INFO [train.py:715] (0/8) Epoch 5, batch 250, loss[loss=0.147, simple_loss=0.2071, pruned_loss=0.04347, over 4895.00 frames.], tot_loss[loss=0.1548, simple_loss=0.2244, pruned_loss=0.04259, over 695661.56 frames.], batch size: 17, lr: 4.02e-04 2022-05-05 02:38:57,154 INFO [train.py:715] (0/8) Epoch 5, batch 300, loss[loss=0.1498, simple_loss=0.2171, pruned_loss=0.04122, over 4783.00 frames.], tot_loss[loss=0.1543, simple_loss=0.2243, pruned_loss=0.04221, over 756790.59 frames.], batch size: 14, lr: 4.01e-04 2022-05-05 02:39:36,892 INFO [train.py:715] (0/8) Epoch 5, batch 350, loss[loss=0.163, simple_loss=0.2334, pruned_loss=0.04626, over 4851.00 frames.], tot_loss[loss=0.1545, simple_loss=0.2247, pruned_loss=0.04214, over 804412.74 frames.], batch size: 32, lr: 4.01e-04 2022-05-05 02:40:16,659 INFO [train.py:715] (0/8) Epoch 5, batch 400, loss[loss=0.1693, simple_loss=0.23, pruned_loss=0.05427, over 4855.00 frames.], tot_loss[loss=0.1538, simple_loss=0.2239, pruned_loss=0.04188, over 841878.29 frames.], batch size: 30, lr: 4.01e-04 2022-05-05 02:40:56,049 INFO [train.py:715] (0/8) Epoch 5, batch 450, loss[loss=0.1515, simple_loss=0.2154, pruned_loss=0.04379, over 4821.00 frames.], tot_loss[loss=0.1527, simple_loss=0.223, pruned_loss=0.04117, over 871155.95 frames.], batch size: 15, lr: 4.01e-04 2022-05-05 02:41:35,798 INFO [train.py:715] (0/8) Epoch 5, batch 500, loss[loss=0.181, simple_loss=0.2594, pruned_loss=0.05128, over 4773.00 frames.], tot_loss[loss=0.1537, simple_loss=0.2239, pruned_loss=0.04173, over 893168.80 frames.], batch size: 18, lr: 4.01e-04 2022-05-05 02:42:15,651 INFO [train.py:715] (0/8) Epoch 5, batch 550, loss[loss=0.1625, simple_loss=0.2327, pruned_loss=0.04614, over 4968.00 frames.], tot_loss[loss=0.1534, simple_loss=0.2232, pruned_loss=0.04181, over 910318.59 frames.], batch size: 21, lr: 4.01e-04 2022-05-05 02:42:54,755 INFO [train.py:715] (0/8) Epoch 5, batch 600, loss[loss=0.1508, simple_loss=0.2187, pruned_loss=0.04142, over 4810.00 frames.], tot_loss[loss=0.1533, simple_loss=0.2234, pruned_loss=0.04158, over 923311.97 frames.], batch size: 26, lr: 4.01e-04 2022-05-05 02:43:34,142 INFO [train.py:715] (0/8) Epoch 5, batch 650, loss[loss=0.1286, simple_loss=0.1933, pruned_loss=0.03199, over 4704.00 frames.], tot_loss[loss=0.1531, simple_loss=0.2233, pruned_loss=0.04139, over 934848.77 frames.], batch size: 15, lr: 4.01e-04 2022-05-05 02:44:13,848 INFO [train.py:715] (0/8) Epoch 5, batch 700, loss[loss=0.1435, simple_loss=0.2098, pruned_loss=0.0386, over 4684.00 frames.], tot_loss[loss=0.1543, simple_loss=0.224, pruned_loss=0.04228, over 942581.69 frames.], batch size: 15, lr: 4.01e-04 2022-05-05 02:44:53,910 INFO [train.py:715] (0/8) Epoch 5, batch 750, loss[loss=0.1823, simple_loss=0.2473, pruned_loss=0.05863, over 4896.00 frames.], tot_loss[loss=0.153, simple_loss=0.223, pruned_loss=0.04147, over 949094.74 frames.], batch size: 39, lr: 4.01e-04 2022-05-05 02:45:33,281 INFO [train.py:715] (0/8) 
Epoch 5, batch 800, loss[loss=0.141, simple_loss=0.2121, pruned_loss=0.03491, over 4764.00 frames.], tot_loss[loss=0.1526, simple_loss=0.2226, pruned_loss=0.04133, over 954428.22 frames.], batch size: 12, lr: 4.01e-04 2022-05-05 02:46:12,785 INFO [train.py:715] (0/8) Epoch 5, batch 850, loss[loss=0.1543, simple_loss=0.2347, pruned_loss=0.03693, over 4894.00 frames.], tot_loss[loss=0.1533, simple_loss=0.223, pruned_loss=0.0418, over 958546.92 frames.], batch size: 19, lr: 4.01e-04 2022-05-05 02:46:52,363 INFO [train.py:715] (0/8) Epoch 5, batch 900, loss[loss=0.1535, simple_loss=0.2261, pruned_loss=0.04047, over 4799.00 frames.], tot_loss[loss=0.1528, simple_loss=0.2228, pruned_loss=0.04142, over 961602.98 frames.], batch size: 21, lr: 4.01e-04 2022-05-05 02:47:31,843 INFO [train.py:715] (0/8) Epoch 5, batch 950, loss[loss=0.1495, simple_loss=0.208, pruned_loss=0.04554, over 4804.00 frames.], tot_loss[loss=0.1534, simple_loss=0.2232, pruned_loss=0.04181, over 964508.21 frames.], batch size: 13, lr: 4.01e-04 2022-05-05 02:48:11,354 INFO [train.py:715] (0/8) Epoch 5, batch 1000, loss[loss=0.1162, simple_loss=0.1887, pruned_loss=0.02184, over 4978.00 frames.], tot_loss[loss=0.1531, simple_loss=0.2229, pruned_loss=0.04166, over 965690.76 frames.], batch size: 14, lr: 4.01e-04 2022-05-05 02:48:50,616 INFO [train.py:715] (0/8) Epoch 5, batch 1050, loss[loss=0.1598, simple_loss=0.2337, pruned_loss=0.04291, over 4976.00 frames.], tot_loss[loss=0.1528, simple_loss=0.2228, pruned_loss=0.0414, over 966884.19 frames.], batch size: 35, lr: 4.01e-04 2022-05-05 02:49:30,322 INFO [train.py:715] (0/8) Epoch 5, batch 1100, loss[loss=0.1777, simple_loss=0.2391, pruned_loss=0.05814, over 4751.00 frames.], tot_loss[loss=0.1534, simple_loss=0.2232, pruned_loss=0.04181, over 968165.03 frames.], batch size: 16, lr: 4.01e-04 2022-05-05 02:50:09,331 INFO [train.py:715] (0/8) Epoch 5, batch 1150, loss[loss=0.1645, simple_loss=0.2437, pruned_loss=0.04267, over 4787.00 frames.], tot_loss[loss=0.1537, simple_loss=0.2236, pruned_loss=0.04186, over 969492.00 frames.], batch size: 18, lr: 4.00e-04 2022-05-05 02:50:49,093 INFO [train.py:715] (0/8) Epoch 5, batch 1200, loss[loss=0.1721, simple_loss=0.2431, pruned_loss=0.05055, over 4946.00 frames.], tot_loss[loss=0.1534, simple_loss=0.2234, pruned_loss=0.04172, over 970799.77 frames.], batch size: 29, lr: 4.00e-04 2022-05-05 02:51:29,240 INFO [train.py:715] (0/8) Epoch 5, batch 1250, loss[loss=0.1502, simple_loss=0.228, pruned_loss=0.03622, over 4950.00 frames.], tot_loss[loss=0.1538, simple_loss=0.2242, pruned_loss=0.04174, over 971815.71 frames.], batch size: 35, lr: 4.00e-04 2022-05-05 02:52:08,411 INFO [train.py:715] (0/8) Epoch 5, batch 1300, loss[loss=0.1862, simple_loss=0.2605, pruned_loss=0.05594, over 4821.00 frames.], tot_loss[loss=0.1543, simple_loss=0.2246, pruned_loss=0.04195, over 972094.36 frames.], batch size: 15, lr: 4.00e-04 2022-05-05 02:52:48,189 INFO [train.py:715] (0/8) Epoch 5, batch 1350, loss[loss=0.1707, simple_loss=0.2362, pruned_loss=0.05258, over 4710.00 frames.], tot_loss[loss=0.1543, simple_loss=0.2246, pruned_loss=0.04203, over 972636.39 frames.], batch size: 15, lr: 4.00e-04 2022-05-05 02:53:27,483 INFO [train.py:715] (0/8) Epoch 5, batch 1400, loss[loss=0.1335, simple_loss=0.2082, pruned_loss=0.02941, over 4814.00 frames.], tot_loss[loss=0.1554, simple_loss=0.2252, pruned_loss=0.04281, over 972683.07 frames.], batch size: 27, lr: 4.00e-04 2022-05-05 02:54:07,304 INFO [train.py:715] (0/8) Epoch 5, batch 1450, loss[loss=0.188, 
simple_loss=0.2534, pruned_loss=0.06134, over 4777.00 frames.], tot_loss[loss=0.1557, simple_loss=0.2259, pruned_loss=0.04276, over 972804.29 frames.], batch size: 18, lr: 4.00e-04 2022-05-05 02:54:46,731 INFO [train.py:715] (0/8) Epoch 5, batch 1500, loss[loss=0.1304, simple_loss=0.1926, pruned_loss=0.03411, over 4831.00 frames.], tot_loss[loss=0.1561, simple_loss=0.2264, pruned_loss=0.04287, over 973177.55 frames.], batch size: 13, lr: 4.00e-04 2022-05-05 02:55:25,726 INFO [train.py:715] (0/8) Epoch 5, batch 1550, loss[loss=0.1401, simple_loss=0.2195, pruned_loss=0.03039, over 4783.00 frames.], tot_loss[loss=0.1557, simple_loss=0.226, pruned_loss=0.04275, over 973376.35 frames.], batch size: 17, lr: 4.00e-04 2022-05-05 02:56:05,367 INFO [train.py:715] (0/8) Epoch 5, batch 1600, loss[loss=0.1287, simple_loss=0.2025, pruned_loss=0.02751, over 4775.00 frames.], tot_loss[loss=0.1545, simple_loss=0.2247, pruned_loss=0.04214, over 973693.17 frames.], batch size: 17, lr: 4.00e-04 2022-05-05 02:56:45,703 INFO [train.py:715] (0/8) Epoch 5, batch 1650, loss[loss=0.1352, simple_loss=0.2145, pruned_loss=0.02797, over 4824.00 frames.], tot_loss[loss=0.1538, simple_loss=0.2242, pruned_loss=0.04168, over 973178.68 frames.], batch size: 26, lr: 4.00e-04 2022-05-05 02:57:24,644 INFO [train.py:715] (0/8) Epoch 5, batch 1700, loss[loss=0.1429, simple_loss=0.2235, pruned_loss=0.03122, over 4983.00 frames.], tot_loss[loss=0.1533, simple_loss=0.2238, pruned_loss=0.04136, over 973371.59 frames.], batch size: 25, lr: 4.00e-04 2022-05-05 02:58:05,301 INFO [train.py:715] (0/8) Epoch 5, batch 1750, loss[loss=0.1326, simple_loss=0.205, pruned_loss=0.03006, over 4946.00 frames.], tot_loss[loss=0.1543, simple_loss=0.2246, pruned_loss=0.042, over 972284.38 frames.], batch size: 29, lr: 4.00e-04 2022-05-05 02:58:45,444 INFO [train.py:715] (0/8) Epoch 5, batch 1800, loss[loss=0.1858, simple_loss=0.2477, pruned_loss=0.0619, over 4761.00 frames.], tot_loss[loss=0.1549, simple_loss=0.2247, pruned_loss=0.04251, over 972399.76 frames.], batch size: 19, lr: 4.00e-04 2022-05-05 02:59:25,893 INFO [train.py:715] (0/8) Epoch 5, batch 1850, loss[loss=0.1344, simple_loss=0.223, pruned_loss=0.02288, over 4869.00 frames.], tot_loss[loss=0.1543, simple_loss=0.2245, pruned_loss=0.042, over 972510.12 frames.], batch size: 20, lr: 4.00e-04 2022-05-05 03:00:06,292 INFO [train.py:715] (0/8) Epoch 5, batch 1900, loss[loss=0.1467, simple_loss=0.2244, pruned_loss=0.03448, over 4951.00 frames.], tot_loss[loss=0.1543, simple_loss=0.2244, pruned_loss=0.04206, over 972219.13 frames.], batch size: 24, lr: 4.00e-04 2022-05-05 03:00:46,050 INFO [train.py:715] (0/8) Epoch 5, batch 1950, loss[loss=0.1669, simple_loss=0.2284, pruned_loss=0.05273, over 4918.00 frames.], tot_loss[loss=0.154, simple_loss=0.224, pruned_loss=0.042, over 971370.96 frames.], batch size: 23, lr: 4.00e-04 2022-05-05 03:00:47,948 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-176000.pt 2022-05-05 03:01:29,144 INFO [train.py:715] (0/8) Epoch 5, batch 2000, loss[loss=0.1499, simple_loss=0.2222, pruned_loss=0.03882, over 4961.00 frames.], tot_loss[loss=0.1536, simple_loss=0.2238, pruned_loss=0.04167, over 971430.81 frames.], batch size: 29, lr: 4.00e-04 2022-05-05 03:02:09,157 INFO [train.py:715] (0/8) Epoch 5, batch 2050, loss[loss=0.1545, simple_loss=0.2213, pruned_loss=0.04388, over 4802.00 frames.], tot_loss[loss=0.1535, simple_loss=0.2238, pruned_loss=0.04164, over 972287.25 frames.], batch size: 24, lr: 3.99e-04 2022-05-05 
03:02:49,516 INFO [train.py:715] (0/8) Epoch 5, batch 2100, loss[loss=0.1639, simple_loss=0.238, pruned_loss=0.04492, over 4986.00 frames.], tot_loss[loss=0.1534, simple_loss=0.224, pruned_loss=0.04144, over 972994.34 frames.], batch size: 16, lr: 3.99e-04 2022-05-05 03:03:30,099 INFO [train.py:715] (0/8) Epoch 5, batch 2150, loss[loss=0.1787, simple_loss=0.2378, pruned_loss=0.0598, over 4956.00 frames.], tot_loss[loss=0.1539, simple_loss=0.224, pruned_loss=0.04189, over 973588.57 frames.], batch size: 15, lr: 3.99e-04 2022-05-05 03:04:09,664 INFO [train.py:715] (0/8) Epoch 5, batch 2200, loss[loss=0.1489, simple_loss=0.2144, pruned_loss=0.04168, over 4968.00 frames.], tot_loss[loss=0.1538, simple_loss=0.2238, pruned_loss=0.04188, over 972791.45 frames.], batch size: 15, lr: 3.99e-04 2022-05-05 03:04:50,062 INFO [train.py:715] (0/8) Epoch 5, batch 2250, loss[loss=0.1717, simple_loss=0.2312, pruned_loss=0.05611, over 4946.00 frames.], tot_loss[loss=0.1546, simple_loss=0.2247, pruned_loss=0.0423, over 972695.38 frames.], batch size: 35, lr: 3.99e-04 2022-05-05 03:05:30,782 INFO [train.py:715] (0/8) Epoch 5, batch 2300, loss[loss=0.1734, simple_loss=0.2345, pruned_loss=0.05618, over 4772.00 frames.], tot_loss[loss=0.1537, simple_loss=0.2237, pruned_loss=0.04183, over 973202.97 frames.], batch size: 17, lr: 3.99e-04 2022-05-05 03:06:10,988 INFO [train.py:715] (0/8) Epoch 5, batch 2350, loss[loss=0.1321, simple_loss=0.2086, pruned_loss=0.02785, over 4771.00 frames.], tot_loss[loss=0.1531, simple_loss=0.223, pruned_loss=0.04158, over 972515.39 frames.], batch size: 17, lr: 3.99e-04 2022-05-05 03:06:51,190 INFO [train.py:715] (0/8) Epoch 5, batch 2400, loss[loss=0.144, simple_loss=0.2227, pruned_loss=0.03264, over 4822.00 frames.], tot_loss[loss=0.1524, simple_loss=0.2226, pruned_loss=0.04104, over 973097.03 frames.], batch size: 27, lr: 3.99e-04 2022-05-05 03:07:31,712 INFO [train.py:715] (0/8) Epoch 5, batch 2450, loss[loss=0.1589, simple_loss=0.2274, pruned_loss=0.04521, over 4860.00 frames.], tot_loss[loss=0.1533, simple_loss=0.2233, pruned_loss=0.04164, over 973241.73 frames.], batch size: 32, lr: 3.99e-04 2022-05-05 03:08:12,415 INFO [train.py:715] (0/8) Epoch 5, batch 2500, loss[loss=0.1991, simple_loss=0.263, pruned_loss=0.06766, over 4788.00 frames.], tot_loss[loss=0.1536, simple_loss=0.2235, pruned_loss=0.04186, over 972792.65 frames.], batch size: 14, lr: 3.99e-04 2022-05-05 03:08:52,446 INFO [train.py:715] (0/8) Epoch 5, batch 2550, loss[loss=0.1337, simple_loss=0.2025, pruned_loss=0.03249, over 4798.00 frames.], tot_loss[loss=0.1533, simple_loss=0.2237, pruned_loss=0.04146, over 972833.38 frames.], batch size: 12, lr: 3.99e-04 2022-05-05 03:09:33,370 INFO [train.py:715] (0/8) Epoch 5, batch 2600, loss[loss=0.1218, simple_loss=0.1876, pruned_loss=0.02803, over 4990.00 frames.], tot_loss[loss=0.1534, simple_loss=0.2241, pruned_loss=0.04136, over 972576.53 frames.], batch size: 15, lr: 3.99e-04 2022-05-05 03:10:13,555 INFO [train.py:715] (0/8) Epoch 5, batch 2650, loss[loss=0.1602, simple_loss=0.237, pruned_loss=0.04171, over 4952.00 frames.], tot_loss[loss=0.1531, simple_loss=0.2237, pruned_loss=0.04129, over 972935.88 frames.], batch size: 24, lr: 3.99e-04 2022-05-05 03:10:54,121 INFO [train.py:715] (0/8) Epoch 5, batch 2700, loss[loss=0.1435, simple_loss=0.2166, pruned_loss=0.03518, over 4927.00 frames.], tot_loss[loss=0.1526, simple_loss=0.2233, pruned_loss=0.04093, over 973041.58 frames.], batch size: 29, lr: 3.99e-04 2022-05-05 03:11:34,315 INFO [train.py:715] (0/8) Epoch 5, batch 
2750, loss[loss=0.1486, simple_loss=0.2136, pruned_loss=0.04183, over 4789.00 frames.], tot_loss[loss=0.1527, simple_loss=0.223, pruned_loss=0.04124, over 972718.24 frames.], batch size: 21, lr: 3.99e-04 2022-05-05 03:12:14,287 INFO [train.py:715] (0/8) Epoch 5, batch 2800, loss[loss=0.1639, simple_loss=0.2294, pruned_loss=0.0492, over 4883.00 frames.], tot_loss[loss=0.1529, simple_loss=0.2235, pruned_loss=0.04114, over 972336.47 frames.], batch size: 16, lr: 3.99e-04 2022-05-05 03:12:54,883 INFO [train.py:715] (0/8) Epoch 5, batch 2850, loss[loss=0.1213, simple_loss=0.1852, pruned_loss=0.02866, over 4783.00 frames.], tot_loss[loss=0.1524, simple_loss=0.2228, pruned_loss=0.04094, over 973143.31 frames.], batch size: 17, lr: 3.99e-04 2022-05-05 03:13:35,008 INFO [train.py:715] (0/8) Epoch 5, batch 2900, loss[loss=0.1439, simple_loss=0.2148, pruned_loss=0.03651, over 4970.00 frames.], tot_loss[loss=0.1531, simple_loss=0.2233, pruned_loss=0.04142, over 972499.12 frames.], batch size: 14, lr: 3.99e-04 2022-05-05 03:14:15,396 INFO [train.py:715] (0/8) Epoch 5, batch 2950, loss[loss=0.1574, simple_loss=0.2309, pruned_loss=0.04197, over 4830.00 frames.], tot_loss[loss=0.1523, simple_loss=0.2227, pruned_loss=0.04096, over 971649.78 frames.], batch size: 26, lr: 3.98e-04 2022-05-05 03:14:54,475 INFO [train.py:715] (0/8) Epoch 5, batch 3000, loss[loss=0.154, simple_loss=0.2264, pruned_loss=0.04081, over 4863.00 frames.], tot_loss[loss=0.1524, simple_loss=0.2229, pruned_loss=0.04094, over 971043.76 frames.], batch size: 20, lr: 3.98e-04 2022-05-05 03:14:54,476 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 03:15:03,921 INFO [train.py:742] (0/8) Epoch 5, validation: loss=0.1108, simple_loss=0.1962, pruned_loss=0.01274, over 914524.00 frames. 2022-05-05 03:15:42,392 INFO [train.py:715] (0/8) Epoch 5, batch 3050, loss[loss=0.1339, simple_loss=0.2071, pruned_loss=0.03036, over 4910.00 frames.], tot_loss[loss=0.1531, simple_loss=0.2232, pruned_loss=0.04153, over 970567.06 frames.], batch size: 17, lr: 3.98e-04 2022-05-05 03:16:21,556 INFO [train.py:715] (0/8) Epoch 5, batch 3100, loss[loss=0.1189, simple_loss=0.1911, pruned_loss=0.0234, over 4824.00 frames.], tot_loss[loss=0.1533, simple_loss=0.2236, pruned_loss=0.04151, over 971197.42 frames.], batch size: 26, lr: 3.98e-04 2022-05-05 03:17:00,517 INFO [train.py:715] (0/8) Epoch 5, batch 3150, loss[loss=0.1567, simple_loss=0.2188, pruned_loss=0.04731, over 4808.00 frames.], tot_loss[loss=0.1529, simple_loss=0.2233, pruned_loss=0.04118, over 971573.11 frames.], batch size: 21, lr: 3.98e-04 2022-05-05 03:17:40,036 INFO [train.py:715] (0/8) Epoch 5, batch 3200, loss[loss=0.1515, simple_loss=0.2317, pruned_loss=0.03561, over 4807.00 frames.], tot_loss[loss=0.1526, simple_loss=0.2232, pruned_loss=0.04102, over 971463.84 frames.], batch size: 25, lr: 3.98e-04 2022-05-05 03:18:19,740 INFO [train.py:715] (0/8) Epoch 5, batch 3250, loss[loss=0.1207, simple_loss=0.1947, pruned_loss=0.02338, over 4921.00 frames.], tot_loss[loss=0.1527, simple_loss=0.2233, pruned_loss=0.04102, over 971718.88 frames.], batch size: 17, lr: 3.98e-04 2022-05-05 03:18:58,957 INFO [train.py:715] (0/8) Epoch 5, batch 3300, loss[loss=0.1837, simple_loss=0.2589, pruned_loss=0.0542, over 4849.00 frames.], tot_loss[loss=0.1536, simple_loss=0.2241, pruned_loss=0.04155, over 972124.80 frames.], batch size: 30, lr: 3.98e-04 2022-05-05 03:19:38,235 INFO [train.py:715] (0/8) Epoch 5, batch 3350, loss[loss=0.1769, simple_loss=0.2451, pruned_loss=0.05438, over 4767.00 frames.], 
tot_loss[loss=0.1528, simple_loss=0.2239, pruned_loss=0.0409, over 972146.39 frames.], batch size: 19, lr: 3.98e-04 2022-05-05 03:20:17,970 INFO [train.py:715] (0/8) Epoch 5, batch 3400, loss[loss=0.1374, simple_loss=0.2078, pruned_loss=0.03355, over 4992.00 frames.], tot_loss[loss=0.1531, simple_loss=0.2241, pruned_loss=0.04105, over 973072.13 frames.], batch size: 16, lr: 3.98e-04 2022-05-05 03:20:57,513 INFO [train.py:715] (0/8) Epoch 5, batch 3450, loss[loss=0.152, simple_loss=0.2277, pruned_loss=0.03816, over 4771.00 frames.], tot_loss[loss=0.1539, simple_loss=0.2247, pruned_loss=0.04151, over 973083.85 frames.], batch size: 18, lr: 3.98e-04 2022-05-05 03:21:36,807 INFO [train.py:715] (0/8) Epoch 5, batch 3500, loss[loss=0.143, simple_loss=0.2209, pruned_loss=0.03254, over 4707.00 frames.], tot_loss[loss=0.1532, simple_loss=0.224, pruned_loss=0.04116, over 972642.68 frames.], batch size: 15, lr: 3.98e-04 2022-05-05 03:22:16,030 INFO [train.py:715] (0/8) Epoch 5, batch 3550, loss[loss=0.1878, simple_loss=0.2727, pruned_loss=0.05145, over 4901.00 frames.], tot_loss[loss=0.1536, simple_loss=0.2242, pruned_loss=0.04148, over 971872.07 frames.], batch size: 19, lr: 3.98e-04 2022-05-05 03:22:55,529 INFO [train.py:715] (0/8) Epoch 5, batch 3600, loss[loss=0.1379, simple_loss=0.223, pruned_loss=0.02638, over 4995.00 frames.], tot_loss[loss=0.1535, simple_loss=0.2239, pruned_loss=0.04155, over 972327.51 frames.], batch size: 14, lr: 3.98e-04 2022-05-05 03:23:34,517 INFO [train.py:715] (0/8) Epoch 5, batch 3650, loss[loss=0.156, simple_loss=0.2231, pruned_loss=0.04443, over 4800.00 frames.], tot_loss[loss=0.1527, simple_loss=0.2228, pruned_loss=0.04133, over 972804.62 frames.], batch size: 25, lr: 3.98e-04 2022-05-05 03:24:13,760 INFO [train.py:715] (0/8) Epoch 5, batch 3700, loss[loss=0.1247, simple_loss=0.1914, pruned_loss=0.02901, over 4851.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2225, pruned_loss=0.04129, over 972093.80 frames.], batch size: 38, lr: 3.98e-04 2022-05-05 03:24:53,921 INFO [train.py:715] (0/8) Epoch 5, batch 3750, loss[loss=0.1522, simple_loss=0.2098, pruned_loss=0.04736, over 4938.00 frames.], tot_loss[loss=0.153, simple_loss=0.2229, pruned_loss=0.04158, over 972102.14 frames.], batch size: 35, lr: 3.98e-04 2022-05-05 03:25:33,694 INFO [train.py:715] (0/8) Epoch 5, batch 3800, loss[loss=0.173, simple_loss=0.2413, pruned_loss=0.05235, over 4831.00 frames.], tot_loss[loss=0.1522, simple_loss=0.2222, pruned_loss=0.04108, over 972141.31 frames.], batch size: 15, lr: 3.97e-04 2022-05-05 03:26:13,091 INFO [train.py:715] (0/8) Epoch 5, batch 3850, loss[loss=0.1646, simple_loss=0.2245, pruned_loss=0.05235, over 4848.00 frames.], tot_loss[loss=0.1526, simple_loss=0.2229, pruned_loss=0.04114, over 971463.22 frames.], batch size: 34, lr: 3.97e-04 2022-05-05 03:26:52,958 INFO [train.py:715] (0/8) Epoch 5, batch 3900, loss[loss=0.1516, simple_loss=0.2198, pruned_loss=0.0417, over 4916.00 frames.], tot_loss[loss=0.1542, simple_loss=0.2241, pruned_loss=0.04212, over 971349.55 frames.], batch size: 23, lr: 3.97e-04 2022-05-05 03:27:32,993 INFO [train.py:715] (0/8) Epoch 5, batch 3950, loss[loss=0.1626, simple_loss=0.2277, pruned_loss=0.04877, over 4961.00 frames.], tot_loss[loss=0.1542, simple_loss=0.2242, pruned_loss=0.04207, over 971669.62 frames.], batch size: 35, lr: 3.97e-04 2022-05-05 03:28:13,082 INFO [train.py:715] (0/8) Epoch 5, batch 4000, loss[loss=0.1266, simple_loss=0.198, pruned_loss=0.02759, over 4988.00 frames.], tot_loss[loss=0.1536, simple_loss=0.2238, 
pruned_loss=0.0417, over 972183.53 frames.], batch size: 28, lr: 3.97e-04 2022-05-05 03:28:53,738 INFO [train.py:715] (0/8) Epoch 5, batch 4050, loss[loss=0.1846, simple_loss=0.2651, pruned_loss=0.05203, over 4808.00 frames.], tot_loss[loss=0.1539, simple_loss=0.2242, pruned_loss=0.04187, over 971306.73 frames.], batch size: 27, lr: 3.97e-04 2022-05-05 03:29:33,842 INFO [train.py:715] (0/8) Epoch 5, batch 4100, loss[loss=0.1509, simple_loss=0.2255, pruned_loss=0.03814, over 4783.00 frames.], tot_loss[loss=0.1533, simple_loss=0.2238, pruned_loss=0.04141, over 971519.62 frames.], batch size: 18, lr: 3.97e-04 2022-05-05 03:30:14,064 INFO [train.py:715] (0/8) Epoch 5, batch 4150, loss[loss=0.1388, simple_loss=0.2084, pruned_loss=0.03458, over 4818.00 frames.], tot_loss[loss=0.1538, simple_loss=0.224, pruned_loss=0.0418, over 971899.55 frames.], batch size: 26, lr: 3.97e-04 2022-05-05 03:30:53,445 INFO [train.py:715] (0/8) Epoch 5, batch 4200, loss[loss=0.121, simple_loss=0.1962, pruned_loss=0.02286, over 4923.00 frames.], tot_loss[loss=0.1536, simple_loss=0.2238, pruned_loss=0.04175, over 972141.69 frames.], batch size: 29, lr: 3.97e-04 2022-05-05 03:31:32,792 INFO [train.py:715] (0/8) Epoch 5, batch 4250, loss[loss=0.1422, simple_loss=0.2221, pruned_loss=0.0312, over 4810.00 frames.], tot_loss[loss=0.1535, simple_loss=0.224, pruned_loss=0.04154, over 973171.35 frames.], batch size: 25, lr: 3.97e-04 2022-05-05 03:32:12,486 INFO [train.py:715] (0/8) Epoch 5, batch 4300, loss[loss=0.15, simple_loss=0.2216, pruned_loss=0.0392, over 4886.00 frames.], tot_loss[loss=0.1538, simple_loss=0.2239, pruned_loss=0.04186, over 973816.57 frames.], batch size: 22, lr: 3.97e-04 2022-05-05 03:32:52,098 INFO [train.py:715] (0/8) Epoch 5, batch 4350, loss[loss=0.1591, simple_loss=0.2272, pruned_loss=0.04548, over 4975.00 frames.], tot_loss[loss=0.1543, simple_loss=0.2245, pruned_loss=0.04201, over 973056.56 frames.], batch size: 25, lr: 3.97e-04 2022-05-05 03:33:32,065 INFO [train.py:715] (0/8) Epoch 5, batch 4400, loss[loss=0.1249, simple_loss=0.2048, pruned_loss=0.02252, over 4745.00 frames.], tot_loss[loss=0.1538, simple_loss=0.224, pruned_loss=0.0418, over 972891.32 frames.], batch size: 19, lr: 3.97e-04 2022-05-05 03:34:10,946 INFO [train.py:715] (0/8) Epoch 5, batch 4450, loss[loss=0.1264, simple_loss=0.2011, pruned_loss=0.02587, over 4795.00 frames.], tot_loss[loss=0.1536, simple_loss=0.2236, pruned_loss=0.04182, over 972875.15 frames.], batch size: 24, lr: 3.97e-04 2022-05-05 03:34:50,791 INFO [train.py:715] (0/8) Epoch 5, batch 4500, loss[loss=0.1355, simple_loss=0.2191, pruned_loss=0.02592, over 4792.00 frames.], tot_loss[loss=0.1535, simple_loss=0.2235, pruned_loss=0.04179, over 972230.69 frames.], batch size: 14, lr: 3.97e-04 2022-05-05 03:35:30,123 INFO [train.py:715] (0/8) Epoch 5, batch 4550, loss[loss=0.1792, simple_loss=0.2576, pruned_loss=0.05039, over 4857.00 frames.], tot_loss[loss=0.1532, simple_loss=0.2232, pruned_loss=0.0416, over 972557.00 frames.], batch size: 20, lr: 3.97e-04 2022-05-05 03:36:09,738 INFO [train.py:715] (0/8) Epoch 5, batch 4600, loss[loss=0.1639, simple_loss=0.2412, pruned_loss=0.04334, over 4916.00 frames.], tot_loss[loss=0.1533, simple_loss=0.2234, pruned_loss=0.04159, over 973338.39 frames.], batch size: 17, lr: 3.97e-04 2022-05-05 03:36:50,095 INFO [train.py:715] (0/8) Epoch 5, batch 4650, loss[loss=0.1706, simple_loss=0.2526, pruned_loss=0.0443, over 4844.00 frames.], tot_loss[loss=0.153, simple_loss=0.2233, pruned_loss=0.0414, over 972790.64 frames.], batch size: 
15, lr: 3.97e-04 2022-05-05 03:37:30,432 INFO [train.py:715] (0/8) Epoch 5, batch 4700, loss[loss=0.1215, simple_loss=0.1947, pruned_loss=0.02414, over 4822.00 frames.], tot_loss[loss=0.1532, simple_loss=0.2235, pruned_loss=0.04148, over 972046.30 frames.], batch size: 27, lr: 3.96e-04 2022-05-05 03:38:10,931 INFO [train.py:715] (0/8) Epoch 5, batch 4750, loss[loss=0.2034, simple_loss=0.2806, pruned_loss=0.06316, over 4709.00 frames.], tot_loss[loss=0.1533, simple_loss=0.2236, pruned_loss=0.04149, over 971652.12 frames.], batch size: 15, lr: 3.96e-04 2022-05-05 03:38:50,694 INFO [train.py:715] (0/8) Epoch 5, batch 4800, loss[loss=0.1747, simple_loss=0.2477, pruned_loss=0.05081, over 4943.00 frames.], tot_loss[loss=0.1531, simple_loss=0.2237, pruned_loss=0.04129, over 971476.58 frames.], batch size: 23, lr: 3.96e-04 2022-05-05 03:39:31,183 INFO [train.py:715] (0/8) Epoch 5, batch 4850, loss[loss=0.1879, simple_loss=0.2569, pruned_loss=0.05948, over 4797.00 frames.], tot_loss[loss=0.1531, simple_loss=0.2234, pruned_loss=0.04142, over 971481.07 frames.], batch size: 17, lr: 3.96e-04 2022-05-05 03:40:11,784 INFO [train.py:715] (0/8) Epoch 5, batch 4900, loss[loss=0.1762, simple_loss=0.244, pruned_loss=0.05425, over 4840.00 frames.], tot_loss[loss=0.153, simple_loss=0.2231, pruned_loss=0.04142, over 971994.91 frames.], batch size: 32, lr: 3.96e-04 2022-05-05 03:40:51,916 INFO [train.py:715] (0/8) Epoch 5, batch 4950, loss[loss=0.1314, simple_loss=0.2067, pruned_loss=0.02803, over 4897.00 frames.], tot_loss[loss=0.1522, simple_loss=0.2225, pruned_loss=0.04091, over 971883.20 frames.], batch size: 19, lr: 3.96e-04 2022-05-05 03:41:32,222 INFO [train.py:715] (0/8) Epoch 5, batch 5000, loss[loss=0.1723, simple_loss=0.2557, pruned_loss=0.04449, over 4962.00 frames.], tot_loss[loss=0.1521, simple_loss=0.2227, pruned_loss=0.04079, over 972133.70 frames.], batch size: 24, lr: 3.96e-04 2022-05-05 03:42:13,228 INFO [train.py:715] (0/8) Epoch 5, batch 5050, loss[loss=0.1473, simple_loss=0.2289, pruned_loss=0.03284, over 4799.00 frames.], tot_loss[loss=0.1521, simple_loss=0.2226, pruned_loss=0.04077, over 971985.28 frames.], batch size: 17, lr: 3.96e-04 2022-05-05 03:42:52,852 INFO [train.py:715] (0/8) Epoch 5, batch 5100, loss[loss=0.1362, simple_loss=0.2057, pruned_loss=0.03338, over 4808.00 frames.], tot_loss[loss=0.1524, simple_loss=0.2232, pruned_loss=0.04085, over 972371.03 frames.], batch size: 21, lr: 3.96e-04 2022-05-05 03:43:32,131 INFO [train.py:715] (0/8) Epoch 5, batch 5150, loss[loss=0.1598, simple_loss=0.2353, pruned_loss=0.04214, over 4769.00 frames.], tot_loss[loss=0.1524, simple_loss=0.2231, pruned_loss=0.04085, over 972427.10 frames.], batch size: 18, lr: 3.96e-04 2022-05-05 03:44:11,853 INFO [train.py:715] (0/8) Epoch 5, batch 5200, loss[loss=0.1403, simple_loss=0.2177, pruned_loss=0.03152, over 4849.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2235, pruned_loss=0.04076, over 972787.59 frames.], batch size: 20, lr: 3.96e-04 2022-05-05 03:44:51,642 INFO [train.py:715] (0/8) Epoch 5, batch 5250, loss[loss=0.124, simple_loss=0.1949, pruned_loss=0.02658, over 4924.00 frames.], tot_loss[loss=0.1528, simple_loss=0.2236, pruned_loss=0.04103, over 972830.36 frames.], batch size: 29, lr: 3.96e-04 2022-05-05 03:45:32,215 INFO [train.py:715] (0/8) Epoch 5, batch 5300, loss[loss=0.1405, simple_loss=0.2141, pruned_loss=0.0335, over 4899.00 frames.], tot_loss[loss=0.1537, simple_loss=0.2243, pruned_loss=0.04155, over 972417.76 frames.], batch size: 19, lr: 3.96e-04 2022-05-05 03:46:12,525 INFO 
[train.py:715] (0/8) Epoch 5, batch 5350, loss[loss=0.141, simple_loss=0.2187, pruned_loss=0.03166, over 4929.00 frames.], tot_loss[loss=0.1534, simple_loss=0.2241, pruned_loss=0.04134, over 972587.25 frames.], batch size: 23, lr: 3.96e-04 2022-05-05 03:46:52,866 INFO [train.py:715] (0/8) Epoch 5, batch 5400, loss[loss=0.167, simple_loss=0.23, pruned_loss=0.05197, over 4968.00 frames.], tot_loss[loss=0.1535, simple_loss=0.2239, pruned_loss=0.04153, over 973184.20 frames.], batch size: 35, lr: 3.96e-04 2022-05-05 03:47:32,580 INFO [train.py:715] (0/8) Epoch 5, batch 5450, loss[loss=0.1405, simple_loss=0.221, pruned_loss=0.03002, over 4820.00 frames.], tot_loss[loss=0.1535, simple_loss=0.2237, pruned_loss=0.04164, over 972875.39 frames.], batch size: 27, lr: 3.96e-04 2022-05-05 03:48:12,694 INFO [train.py:715] (0/8) Epoch 5, batch 5500, loss[loss=0.1599, simple_loss=0.2229, pruned_loss=0.04841, over 4970.00 frames.], tot_loss[loss=0.1538, simple_loss=0.2239, pruned_loss=0.04183, over 972549.02 frames.], batch size: 28, lr: 3.96e-04 2022-05-05 03:48:53,030 INFO [train.py:715] (0/8) Epoch 5, batch 5550, loss[loss=0.1546, simple_loss=0.2248, pruned_loss=0.04217, over 4805.00 frames.], tot_loss[loss=0.1555, simple_loss=0.2255, pruned_loss=0.04275, over 972812.71 frames.], batch size: 25, lr: 3.96e-04 2022-05-05 03:49:33,410 INFO [train.py:715] (0/8) Epoch 5, batch 5600, loss[loss=0.1512, simple_loss=0.2317, pruned_loss=0.03539, over 4951.00 frames.], tot_loss[loss=0.1538, simple_loss=0.2241, pruned_loss=0.04176, over 972859.52 frames.], batch size: 21, lr: 3.95e-04 2022-05-05 03:50:13,541 INFO [train.py:715] (0/8) Epoch 5, batch 5650, loss[loss=0.1554, simple_loss=0.234, pruned_loss=0.0384, over 4899.00 frames.], tot_loss[loss=0.1534, simple_loss=0.224, pruned_loss=0.04138, over 972978.96 frames.], batch size: 19, lr: 3.95e-04 2022-05-05 03:50:52,895 INFO [train.py:715] (0/8) Epoch 5, batch 5700, loss[loss=0.1299, simple_loss=0.1966, pruned_loss=0.03155, over 4977.00 frames.], tot_loss[loss=0.1536, simple_loss=0.2238, pruned_loss=0.04171, over 973564.23 frames.], batch size: 14, lr: 3.95e-04 2022-05-05 03:51:33,318 INFO [train.py:715] (0/8) Epoch 5, batch 5750, loss[loss=0.1513, simple_loss=0.2284, pruned_loss=0.03712, over 4758.00 frames.], tot_loss[loss=0.154, simple_loss=0.2243, pruned_loss=0.04182, over 973195.34 frames.], batch size: 19, lr: 3.95e-04 2022-05-05 03:52:13,227 INFO [train.py:715] (0/8) Epoch 5, batch 5800, loss[loss=0.153, simple_loss=0.2194, pruned_loss=0.04335, over 4829.00 frames.], tot_loss[loss=0.1533, simple_loss=0.2239, pruned_loss=0.04133, over 972599.89 frames.], batch size: 30, lr: 3.95e-04 2022-05-05 03:52:53,765 INFO [train.py:715] (0/8) Epoch 5, batch 5850, loss[loss=0.154, simple_loss=0.2187, pruned_loss=0.04466, over 4878.00 frames.], tot_loss[loss=0.1541, simple_loss=0.2246, pruned_loss=0.04185, over 972294.14 frames.], batch size: 22, lr: 3.95e-04 2022-05-05 03:53:33,392 INFO [train.py:715] (0/8) Epoch 5, batch 5900, loss[loss=0.1501, simple_loss=0.2098, pruned_loss=0.04526, over 4842.00 frames.], tot_loss[loss=0.1534, simple_loss=0.2242, pruned_loss=0.04132, over 973147.24 frames.], batch size: 32, lr: 3.95e-04 2022-05-05 03:54:13,786 INFO [train.py:715] (0/8) Epoch 5, batch 5950, loss[loss=0.1502, simple_loss=0.2306, pruned_loss=0.03486, over 4953.00 frames.], tot_loss[loss=0.1539, simple_loss=0.2247, pruned_loss=0.04152, over 972697.03 frames.], batch size: 21, lr: 3.95e-04 2022-05-05 03:54:53,621 INFO [train.py:715] (0/8) Epoch 5, batch 6000, 
loss[loss=0.1451, simple_loss=0.2188, pruned_loss=0.03572, over 4775.00 frames.], tot_loss[loss=0.1528, simple_loss=0.2236, pruned_loss=0.04097, over 972887.84 frames.], batch size: 14, lr: 3.95e-04 2022-05-05 03:54:53,622 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 03:55:03,072 INFO [train.py:742] (0/8) Epoch 5, validation: loss=0.1106, simple_loss=0.1959, pruned_loss=0.01263, over 914524.00 frames. 2022-05-05 03:55:42,933 INFO [train.py:715] (0/8) Epoch 5, batch 6050, loss[loss=0.1455, simple_loss=0.2269, pruned_loss=0.03202, over 4770.00 frames.], tot_loss[loss=0.1529, simple_loss=0.2238, pruned_loss=0.04106, over 972947.63 frames.], batch size: 14, lr: 3.95e-04 2022-05-05 03:56:22,017 INFO [train.py:715] (0/8) Epoch 5, batch 6100, loss[loss=0.1391, simple_loss=0.2116, pruned_loss=0.03332, over 4967.00 frames.], tot_loss[loss=0.1525, simple_loss=0.223, pruned_loss=0.04095, over 973290.18 frames.], batch size: 21, lr: 3.95e-04 2022-05-05 03:57:01,849 INFO [train.py:715] (0/8) Epoch 5, batch 6150, loss[loss=0.1652, simple_loss=0.2361, pruned_loss=0.04721, over 4924.00 frames.], tot_loss[loss=0.1528, simple_loss=0.2233, pruned_loss=0.04118, over 973517.49 frames.], batch size: 18, lr: 3.95e-04 2022-05-05 03:57:40,840 INFO [train.py:715] (0/8) Epoch 5, batch 6200, loss[loss=0.1652, simple_loss=0.237, pruned_loss=0.04674, over 4937.00 frames.], tot_loss[loss=0.1536, simple_loss=0.2238, pruned_loss=0.0417, over 972925.02 frames.], batch size: 29, lr: 3.95e-04 2022-05-05 03:58:21,085 INFO [train.py:715] (0/8) Epoch 5, batch 6250, loss[loss=0.1557, simple_loss=0.2262, pruned_loss=0.04256, over 4932.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2227, pruned_loss=0.04116, over 972733.61 frames.], batch size: 23, lr: 3.95e-04 2022-05-05 03:58:59,731 INFO [train.py:715] (0/8) Epoch 5, batch 6300, loss[loss=0.143, simple_loss=0.2218, pruned_loss=0.03205, over 4891.00 frames.], tot_loss[loss=0.152, simple_loss=0.2223, pruned_loss=0.04088, over 972976.22 frames.], batch size: 22, lr: 3.95e-04 2022-05-05 03:59:39,534 INFO [train.py:715] (0/8) Epoch 5, batch 6350, loss[loss=0.1577, simple_loss=0.236, pruned_loss=0.03966, over 4753.00 frames.], tot_loss[loss=0.1535, simple_loss=0.2236, pruned_loss=0.04172, over 972855.36 frames.], batch size: 19, lr: 3.95e-04 2022-05-05 04:00:18,904 INFO [train.py:715] (0/8) Epoch 5, batch 6400, loss[loss=0.1679, simple_loss=0.2318, pruned_loss=0.05197, over 4881.00 frames.], tot_loss[loss=0.1531, simple_loss=0.2231, pruned_loss=0.04155, over 972559.70 frames.], batch size: 19, lr: 3.95e-04 2022-05-05 04:00:57,768 INFO [train.py:715] (0/8) Epoch 5, batch 6450, loss[loss=0.1772, simple_loss=0.262, pruned_loss=0.04624, over 4823.00 frames.], tot_loss[loss=0.1539, simple_loss=0.2234, pruned_loss=0.04216, over 972810.52 frames.], batch size: 26, lr: 3.95e-04 2022-05-05 04:01:37,234 INFO [train.py:715] (0/8) Epoch 5, batch 6500, loss[loss=0.1258, simple_loss=0.1984, pruned_loss=0.02658, over 4838.00 frames.], tot_loss[loss=0.1539, simple_loss=0.2236, pruned_loss=0.04207, over 973402.70 frames.], batch size: 32, lr: 3.95e-04 2022-05-05 04:02:16,586 INFO [train.py:715] (0/8) Epoch 5, batch 6550, loss[loss=0.1279, simple_loss=0.2041, pruned_loss=0.02584, over 4738.00 frames.], tot_loss[loss=0.1539, simple_loss=0.2241, pruned_loss=0.04191, over 973176.08 frames.], batch size: 16, lr: 3.94e-04 2022-05-05 04:02:55,727 INFO [train.py:715] (0/8) Epoch 5, batch 6600, loss[loss=0.1695, simple_loss=0.2519, pruned_loss=0.04359, over 4909.00 frames.], 
tot_loss[loss=0.1546, simple_loss=0.225, pruned_loss=0.0421, over 973316.22 frames.], batch size: 17, lr: 3.94e-04 2022-05-05 04:03:35,248 INFO [train.py:715] (0/8) Epoch 5, batch 6650, loss[loss=0.1268, simple_loss=0.2091, pruned_loss=0.02228, over 4949.00 frames.], tot_loss[loss=0.1544, simple_loss=0.2246, pruned_loss=0.04207, over 972990.71 frames.], batch size: 21, lr: 3.94e-04 2022-05-05 04:04:15,787 INFO [train.py:715] (0/8) Epoch 5, batch 6700, loss[loss=0.1673, simple_loss=0.2397, pruned_loss=0.04748, over 4984.00 frames.], tot_loss[loss=0.1542, simple_loss=0.2246, pruned_loss=0.04197, over 972629.32 frames.], batch size: 28, lr: 3.94e-04 2022-05-05 04:04:56,110 INFO [train.py:715] (0/8) Epoch 5, batch 6750, loss[loss=0.1471, simple_loss=0.2208, pruned_loss=0.03672, over 4750.00 frames.], tot_loss[loss=0.1548, simple_loss=0.2244, pruned_loss=0.04257, over 972154.07 frames.], batch size: 19, lr: 3.94e-04 2022-05-05 04:05:36,106 INFO [train.py:715] (0/8) Epoch 5, batch 6800, loss[loss=0.1673, simple_loss=0.2299, pruned_loss=0.05238, over 4927.00 frames.], tot_loss[loss=0.1548, simple_loss=0.2247, pruned_loss=0.0424, over 972588.15 frames.], batch size: 23, lr: 3.94e-04 2022-05-05 04:06:16,592 INFO [train.py:715] (0/8) Epoch 5, batch 6850, loss[loss=0.138, simple_loss=0.222, pruned_loss=0.02703, over 4991.00 frames.], tot_loss[loss=0.1548, simple_loss=0.2251, pruned_loss=0.04229, over 972714.67 frames.], batch size: 16, lr: 3.94e-04 2022-05-05 04:06:56,548 INFO [train.py:715] (0/8) Epoch 5, batch 6900, loss[loss=0.1551, simple_loss=0.2266, pruned_loss=0.04174, over 4797.00 frames.], tot_loss[loss=0.1545, simple_loss=0.2246, pruned_loss=0.04224, over 972715.38 frames.], batch size: 24, lr: 3.94e-04 2022-05-05 04:07:37,126 INFO [train.py:715] (0/8) Epoch 5, batch 6950, loss[loss=0.1387, simple_loss=0.2099, pruned_loss=0.03371, over 4989.00 frames.], tot_loss[loss=0.1543, simple_loss=0.2241, pruned_loss=0.04221, over 972618.07 frames.], batch size: 24, lr: 3.94e-04 2022-05-05 04:08:16,566 INFO [train.py:715] (0/8) Epoch 5, batch 7000, loss[loss=0.139, simple_loss=0.2084, pruned_loss=0.03486, over 4868.00 frames.], tot_loss[loss=0.1544, simple_loss=0.2243, pruned_loss=0.04224, over 972869.24 frames.], batch size: 20, lr: 3.94e-04 2022-05-05 04:08:56,463 INFO [train.py:715] (0/8) Epoch 5, batch 7050, loss[loss=0.1814, simple_loss=0.2476, pruned_loss=0.05754, over 4827.00 frames.], tot_loss[loss=0.1532, simple_loss=0.2237, pruned_loss=0.04137, over 972167.56 frames.], batch size: 13, lr: 3.94e-04 2022-05-05 04:09:36,250 INFO [train.py:715] (0/8) Epoch 5, batch 7100, loss[loss=0.1382, simple_loss=0.2075, pruned_loss=0.03443, over 4818.00 frames.], tot_loss[loss=0.1521, simple_loss=0.2227, pruned_loss=0.04072, over 971191.44 frames.], batch size: 26, lr: 3.94e-04 2022-05-05 04:10:15,687 INFO [train.py:715] (0/8) Epoch 5, batch 7150, loss[loss=0.1768, simple_loss=0.229, pruned_loss=0.06229, over 4985.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2229, pruned_loss=0.04105, over 971491.07 frames.], batch size: 15, lr: 3.94e-04 2022-05-05 04:10:55,643 INFO [train.py:715] (0/8) Epoch 5, batch 7200, loss[loss=0.1785, simple_loss=0.2505, pruned_loss=0.05328, over 4959.00 frames.], tot_loss[loss=0.1522, simple_loss=0.2228, pruned_loss=0.04081, over 972203.06 frames.], batch size: 15, lr: 3.94e-04 2022-05-05 04:11:35,239 INFO [train.py:715] (0/8) Epoch 5, batch 7250, loss[loss=0.1502, simple_loss=0.2258, pruned_loss=0.03736, over 4778.00 frames.], tot_loss[loss=0.1516, simple_loss=0.2225, 
pruned_loss=0.04034, over 971927.93 frames.], batch size: 18, lr: 3.94e-04 2022-05-05 04:12:15,755 INFO [train.py:715] (0/8) Epoch 5, batch 7300, loss[loss=0.1471, simple_loss=0.2175, pruned_loss=0.03836, over 4899.00 frames.], tot_loss[loss=0.152, simple_loss=0.2228, pruned_loss=0.04056, over 971693.36 frames.], batch size: 19, lr: 3.94e-04 2022-05-05 04:12:55,313 INFO [train.py:715] (0/8) Epoch 5, batch 7350, loss[loss=0.1498, simple_loss=0.2153, pruned_loss=0.04211, over 4850.00 frames.], tot_loss[loss=0.153, simple_loss=0.2238, pruned_loss=0.04107, over 972046.35 frames.], batch size: 32, lr: 3.94e-04 2022-05-05 04:13:34,916 INFO [train.py:715] (0/8) Epoch 5, batch 7400, loss[loss=0.1717, simple_loss=0.2457, pruned_loss=0.0489, over 4754.00 frames.], tot_loss[loss=0.1526, simple_loss=0.2236, pruned_loss=0.04084, over 971385.43 frames.], batch size: 19, lr: 3.94e-04 2022-05-05 04:14:14,462 INFO [train.py:715] (0/8) Epoch 5, batch 7450, loss[loss=0.1624, simple_loss=0.2334, pruned_loss=0.04573, over 4685.00 frames.], tot_loss[loss=0.1526, simple_loss=0.2235, pruned_loss=0.04084, over 971443.04 frames.], batch size: 15, lr: 3.93e-04 2022-05-05 04:14:53,549 INFO [train.py:715] (0/8) Epoch 5, batch 7500, loss[loss=0.166, simple_loss=0.2438, pruned_loss=0.04409, over 4952.00 frames.], tot_loss[loss=0.1529, simple_loss=0.2236, pruned_loss=0.04106, over 971860.47 frames.], batch size: 24, lr: 3.93e-04 2022-05-05 04:15:33,686 INFO [train.py:715] (0/8) Epoch 5, batch 7550, loss[loss=0.1777, simple_loss=0.2417, pruned_loss=0.05689, over 4905.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2232, pruned_loss=0.04084, over 971111.12 frames.], batch size: 19, lr: 3.93e-04 2022-05-05 04:16:13,352 INFO [train.py:715] (0/8) Epoch 5, batch 7600, loss[loss=0.1528, simple_loss=0.2186, pruned_loss=0.04354, over 4967.00 frames.], tot_loss[loss=0.152, simple_loss=0.223, pruned_loss=0.04056, over 971707.96 frames.], batch size: 15, lr: 3.93e-04 2022-05-05 04:16:53,607 INFO [train.py:715] (0/8) Epoch 5, batch 7650, loss[loss=0.1677, simple_loss=0.234, pruned_loss=0.05071, over 4939.00 frames.], tot_loss[loss=0.1522, simple_loss=0.2229, pruned_loss=0.04081, over 971860.02 frames.], batch size: 21, lr: 3.93e-04 2022-05-05 04:17:33,264 INFO [train.py:715] (0/8) Epoch 5, batch 7700, loss[loss=0.1498, simple_loss=0.2178, pruned_loss=0.04088, over 4790.00 frames.], tot_loss[loss=0.1529, simple_loss=0.2238, pruned_loss=0.04094, over 972054.68 frames.], batch size: 18, lr: 3.93e-04 2022-05-05 04:18:12,778 INFO [train.py:715] (0/8) Epoch 5, batch 7750, loss[loss=0.1612, simple_loss=0.2311, pruned_loss=0.04567, over 4856.00 frames.], tot_loss[loss=0.1532, simple_loss=0.2239, pruned_loss=0.04123, over 972643.89 frames.], batch size: 13, lr: 3.93e-04 2022-05-05 04:18:52,927 INFO [train.py:715] (0/8) Epoch 5, batch 7800, loss[loss=0.1706, simple_loss=0.2274, pruned_loss=0.05691, over 4851.00 frames.], tot_loss[loss=0.1524, simple_loss=0.223, pruned_loss=0.04089, over 972889.19 frames.], batch size: 30, lr: 3.93e-04 2022-05-05 04:19:32,130 INFO [train.py:715] (0/8) Epoch 5, batch 7850, loss[loss=0.1596, simple_loss=0.2336, pruned_loss=0.04281, over 4928.00 frames.], tot_loss[loss=0.1524, simple_loss=0.223, pruned_loss=0.04088, over 972084.91 frames.], batch size: 29, lr: 3.93e-04 2022-05-05 04:20:12,357 INFO [train.py:715] (0/8) Epoch 5, batch 7900, loss[loss=0.1787, simple_loss=0.2433, pruned_loss=0.05702, over 4928.00 frames.], tot_loss[loss=0.1522, simple_loss=0.2226, pruned_loss=0.04091, over 972036.29 frames.], batch 
size: 18, lr: 3.93e-04 2022-05-05 04:20:51,908 INFO [train.py:715] (0/8) Epoch 5, batch 7950, loss[loss=0.1402, simple_loss=0.216, pruned_loss=0.03219, over 4951.00 frames.], tot_loss[loss=0.1527, simple_loss=0.2229, pruned_loss=0.04128, over 972332.91 frames.], batch size: 23, lr: 3.93e-04 2022-05-05 04:21:32,117 INFO [train.py:715] (0/8) Epoch 5, batch 8000, loss[loss=0.1472, simple_loss=0.2169, pruned_loss=0.03878, over 4846.00 frames.], tot_loss[loss=0.1522, simple_loss=0.2221, pruned_loss=0.04113, over 973096.00 frames.], batch size: 20, lr: 3.93e-04 2022-05-05 04:22:11,570 INFO [train.py:715] (0/8) Epoch 5, batch 8050, loss[loss=0.1633, simple_loss=0.2373, pruned_loss=0.04465, over 4878.00 frames.], tot_loss[loss=0.1529, simple_loss=0.2226, pruned_loss=0.04162, over 972951.12 frames.], batch size: 32, lr: 3.93e-04 2022-05-05 04:22:51,022 INFO [train.py:715] (0/8) Epoch 5, batch 8100, loss[loss=0.1688, simple_loss=0.2445, pruned_loss=0.04655, over 4842.00 frames.], tot_loss[loss=0.1531, simple_loss=0.223, pruned_loss=0.04159, over 973118.48 frames.], batch size: 15, lr: 3.93e-04 2022-05-05 04:23:30,809 INFO [train.py:715] (0/8) Epoch 5, batch 8150, loss[loss=0.1535, simple_loss=0.2214, pruned_loss=0.04279, over 4876.00 frames.], tot_loss[loss=0.1535, simple_loss=0.2234, pruned_loss=0.04183, over 972495.24 frames.], batch size: 38, lr: 3.93e-04 2022-05-05 04:24:09,995 INFO [train.py:715] (0/8) Epoch 5, batch 8200, loss[loss=0.1906, simple_loss=0.2494, pruned_loss=0.06586, over 4884.00 frames.], tot_loss[loss=0.1533, simple_loss=0.2234, pruned_loss=0.04162, over 972664.25 frames.], batch size: 13, lr: 3.93e-04 2022-05-05 04:24:50,013 INFO [train.py:715] (0/8) Epoch 5, batch 8250, loss[loss=0.1308, simple_loss=0.1997, pruned_loss=0.03097, over 4833.00 frames.], tot_loss[loss=0.1532, simple_loss=0.2231, pruned_loss=0.04164, over 971803.20 frames.], batch size: 26, lr: 3.93e-04 2022-05-05 04:25:29,483 INFO [train.py:715] (0/8) Epoch 5, batch 8300, loss[loss=0.2011, simple_loss=0.2562, pruned_loss=0.07298, over 4640.00 frames.], tot_loss[loss=0.153, simple_loss=0.2234, pruned_loss=0.04126, over 972419.29 frames.], batch size: 13, lr: 3.93e-04 2022-05-05 04:26:09,426 INFO [train.py:715] (0/8) Epoch 5, batch 8350, loss[loss=0.1372, simple_loss=0.2154, pruned_loss=0.02951, over 4960.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2231, pruned_loss=0.04096, over 973294.90 frames.], batch size: 24, lr: 3.93e-04 2022-05-05 04:26:48,501 INFO [train.py:715] (0/8) Epoch 5, batch 8400, loss[loss=0.1778, simple_loss=0.2511, pruned_loss=0.05224, over 4946.00 frames.], tot_loss[loss=0.1534, simple_loss=0.2235, pruned_loss=0.04162, over 973734.42 frames.], batch size: 24, lr: 3.92e-04 2022-05-05 04:27:27,553 INFO [train.py:715] (0/8) Epoch 5, batch 8450, loss[loss=0.1557, simple_loss=0.2198, pruned_loss=0.0458, over 4817.00 frames.], tot_loss[loss=0.1532, simple_loss=0.223, pruned_loss=0.04168, over 973017.86 frames.], batch size: 27, lr: 3.92e-04 2022-05-05 04:28:06,817 INFO [train.py:715] (0/8) Epoch 5, batch 8500, loss[loss=0.1706, simple_loss=0.2303, pruned_loss=0.05548, over 4793.00 frames.], tot_loss[loss=0.1532, simple_loss=0.223, pruned_loss=0.04171, over 972609.28 frames.], batch size: 14, lr: 3.92e-04 2022-05-05 04:28:45,802 INFO [train.py:715] (0/8) Epoch 5, batch 8550, loss[loss=0.1505, simple_loss=0.2195, pruned_loss=0.04077, over 4955.00 frames.], tot_loss[loss=0.1531, simple_loss=0.2228, pruned_loss=0.04165, over 973086.98 frames.], batch size: 24, lr: 3.92e-04 2022-05-05 04:29:25,246 
INFO [train.py:715] (0/8) Epoch 5, batch 8600, loss[loss=0.1471, simple_loss=0.2202, pruned_loss=0.03696, over 4938.00 frames.], tot_loss[loss=0.1531, simple_loss=0.2232, pruned_loss=0.04146, over 973125.30 frames.], batch size: 39, lr: 3.92e-04 2022-05-05 04:30:04,415 INFO [train.py:715] (0/8) Epoch 5, batch 8650, loss[loss=0.1576, simple_loss=0.2152, pruned_loss=0.04997, over 4864.00 frames.], tot_loss[loss=0.1527, simple_loss=0.223, pruned_loss=0.0412, over 973716.38 frames.], batch size: 20, lr: 3.92e-04 2022-05-05 04:30:43,886 INFO [train.py:715] (0/8) Epoch 5, batch 8700, loss[loss=0.1865, simple_loss=0.2518, pruned_loss=0.06062, over 4969.00 frames.], tot_loss[loss=0.1514, simple_loss=0.2216, pruned_loss=0.04059, over 973685.04 frames.], batch size: 35, lr: 3.92e-04 2022-05-05 04:31:23,271 INFO [train.py:715] (0/8) Epoch 5, batch 8750, loss[loss=0.1803, simple_loss=0.2484, pruned_loss=0.05612, over 4706.00 frames.], tot_loss[loss=0.1518, simple_loss=0.2219, pruned_loss=0.04082, over 972746.83 frames.], batch size: 15, lr: 3.92e-04 2022-05-05 04:32:02,280 INFO [train.py:715] (0/8) Epoch 5, batch 8800, loss[loss=0.1579, simple_loss=0.2259, pruned_loss=0.04498, over 4917.00 frames.], tot_loss[loss=0.1531, simple_loss=0.2232, pruned_loss=0.04148, over 972858.11 frames.], batch size: 17, lr: 3.92e-04 2022-05-05 04:32:42,162 INFO [train.py:715] (0/8) Epoch 5, batch 8850, loss[loss=0.1525, simple_loss=0.2336, pruned_loss=0.03568, over 4701.00 frames.], tot_loss[loss=0.153, simple_loss=0.223, pruned_loss=0.04147, over 972727.86 frames.], batch size: 15, lr: 3.92e-04 2022-05-05 04:33:20,885 INFO [train.py:715] (0/8) Epoch 5, batch 8900, loss[loss=0.1513, simple_loss=0.2236, pruned_loss=0.03956, over 4790.00 frames.], tot_loss[loss=0.1531, simple_loss=0.223, pruned_loss=0.04157, over 972319.96 frames.], batch size: 18, lr: 3.92e-04 2022-05-05 04:33:59,743 INFO [train.py:715] (0/8) Epoch 5, batch 8950, loss[loss=0.1538, simple_loss=0.2232, pruned_loss=0.04219, over 4816.00 frames.], tot_loss[loss=0.1538, simple_loss=0.2236, pruned_loss=0.04201, over 971368.47 frames.], batch size: 13, lr: 3.92e-04 2022-05-05 04:34:39,035 INFO [train.py:715] (0/8) Epoch 5, batch 9000, loss[loss=0.1513, simple_loss=0.2274, pruned_loss=0.03756, over 4972.00 frames.], tot_loss[loss=0.1547, simple_loss=0.2244, pruned_loss=0.04251, over 971821.31 frames.], batch size: 15, lr: 3.92e-04 2022-05-05 04:34:39,036 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 04:34:48,553 INFO [train.py:742] (0/8) Epoch 5, validation: loss=0.1105, simple_loss=0.196, pruned_loss=0.01252, over 914524.00 frames. 
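
A note on reading the loss columns in these records: after the warm-up phase the reported `loss` is, to within rounding, a weighted sum of the two components, with the simple (trivial-joiner) loss scaled by 0.5 and the pruned loss at full weight. The snippet below is a minimal illustrative check against the validation record just above; it is not part of train.py.

```python
# Minimal sanity check, assuming loss = 0.5 * simple_loss + 1.0 * pruned_loss
# after warm-up; the validation record above (epoch 5, batch 9000) bears this out.
val = {"loss": 0.1105, "simple_loss": 0.196, "pruned_loss": 0.01252}
reconstructed = 0.5 * val["simple_loss"] + 1.0 * val["pruned_loss"]
assert abs(reconstructed - val["loss"]) < 1e-3
print(f"0.5 * {val['simple_loss']} + {val['pruned_loss']} = {reconstructed:.4f}")  # 0.1105
```
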
2022-05-05 04:35:28,195 INFO [train.py:715] (0/8) Epoch 5, batch 9050, loss[loss=0.1668, simple_loss=0.2316, pruned_loss=0.05102, over 4753.00 frames.], tot_loss[loss=0.1555, simple_loss=0.2254, pruned_loss=0.0428, over 971001.12 frames.], batch size: 19, lr: 3.92e-04 2022-05-05 04:36:07,672 INFO [train.py:715] (0/8) Epoch 5, batch 9100, loss[loss=0.1313, simple_loss=0.201, pruned_loss=0.03078, over 4972.00 frames.], tot_loss[loss=0.1554, simple_loss=0.225, pruned_loss=0.04291, over 971543.51 frames.], batch size: 24, lr: 3.92e-04 2022-05-05 04:36:46,715 INFO [train.py:715] (0/8) Epoch 5, batch 9150, loss[loss=0.1428, simple_loss=0.2115, pruned_loss=0.037, over 4904.00 frames.], tot_loss[loss=0.1535, simple_loss=0.2232, pruned_loss=0.04195, over 972126.01 frames.], batch size: 18, lr: 3.92e-04 2022-05-05 04:37:26,207 INFO [train.py:715] (0/8) Epoch 5, batch 9200, loss[loss=0.1613, simple_loss=0.2099, pruned_loss=0.05632, over 4967.00 frames.], tot_loss[loss=0.1541, simple_loss=0.2239, pruned_loss=0.04214, over 972459.65 frames.], batch size: 35, lr: 3.92e-04 2022-05-05 04:38:06,420 INFO [train.py:715] (0/8) Epoch 5, batch 9250, loss[loss=0.1559, simple_loss=0.2218, pruned_loss=0.04499, over 4867.00 frames.], tot_loss[loss=0.1545, simple_loss=0.2248, pruned_loss=0.04204, over 972713.55 frames.], batch size: 32, lr: 3.92e-04 2022-05-05 04:38:45,295 INFO [train.py:715] (0/8) Epoch 5, batch 9300, loss[loss=0.1174, simple_loss=0.1983, pruned_loss=0.01821, over 4949.00 frames.], tot_loss[loss=0.1543, simple_loss=0.2249, pruned_loss=0.04187, over 972694.77 frames.], batch size: 21, lr: 3.91e-04 2022-05-05 04:39:24,929 INFO [train.py:715] (0/8) Epoch 5, batch 9350, loss[loss=0.1644, simple_loss=0.2349, pruned_loss=0.04698, over 4912.00 frames.], tot_loss[loss=0.154, simple_loss=0.2241, pruned_loss=0.04199, over 972767.77 frames.], batch size: 17, lr: 3.91e-04 2022-05-05 04:40:04,422 INFO [train.py:715] (0/8) Epoch 5, batch 9400, loss[loss=0.1463, simple_loss=0.2176, pruned_loss=0.0375, over 4882.00 frames.], tot_loss[loss=0.1536, simple_loss=0.2238, pruned_loss=0.04171, over 972026.07 frames.], batch size: 16, lr: 3.91e-04 2022-05-05 04:40:43,712 INFO [train.py:715] (0/8) Epoch 5, batch 9450, loss[loss=0.1762, simple_loss=0.2448, pruned_loss=0.05377, over 4827.00 frames.], tot_loss[loss=0.154, simple_loss=0.2238, pruned_loss=0.04214, over 972238.57 frames.], batch size: 15, lr: 3.91e-04 2022-05-05 04:41:22,594 INFO [train.py:715] (0/8) Epoch 5, batch 9500, loss[loss=0.1689, simple_loss=0.2384, pruned_loss=0.04973, over 4927.00 frames.], tot_loss[loss=0.1528, simple_loss=0.2229, pruned_loss=0.0414, over 972846.97 frames.], batch size: 18, lr: 3.91e-04 2022-05-05 04:42:02,155 INFO [train.py:715] (0/8) Epoch 5, batch 9550, loss[loss=0.1555, simple_loss=0.2163, pruned_loss=0.04738, over 4799.00 frames.], tot_loss[loss=0.154, simple_loss=0.2242, pruned_loss=0.04191, over 973154.64 frames.], batch size: 12, lr: 3.91e-04 2022-05-05 04:42:41,920 INFO [train.py:715] (0/8) Epoch 5, batch 9600, loss[loss=0.1451, simple_loss=0.2202, pruned_loss=0.035, over 4933.00 frames.], tot_loss[loss=0.1534, simple_loss=0.2235, pruned_loss=0.04166, over 972547.00 frames.], batch size: 39, lr: 3.91e-04 2022-05-05 04:43:21,156 INFO [train.py:715] (0/8) Epoch 5, batch 9650, loss[loss=0.1296, simple_loss=0.2011, pruned_loss=0.02908, over 4966.00 frames.], tot_loss[loss=0.1531, simple_loss=0.2235, pruned_loss=0.04137, over 972000.99 frames.], batch size: 14, lr: 3.91e-04 2022-05-05 04:44:00,815 INFO [train.py:715] (0/8) Epoch 
5, batch 9700, loss[loss=0.1488, simple_loss=0.2127, pruned_loss=0.04248, over 4975.00 frames.], tot_loss[loss=0.1527, simple_loss=0.2232, pruned_loss=0.04112, over 972386.41 frames.], batch size: 14, lr: 3.91e-04 2022-05-05 04:44:40,234 INFO [train.py:715] (0/8) Epoch 5, batch 9750, loss[loss=0.1391, simple_loss=0.2091, pruned_loss=0.03454, over 4905.00 frames.], tot_loss[loss=0.152, simple_loss=0.2227, pruned_loss=0.04066, over 972220.95 frames.], batch size: 19, lr: 3.91e-04 2022-05-05 04:45:19,136 INFO [train.py:715] (0/8) Epoch 5, batch 9800, loss[loss=0.1459, simple_loss=0.2327, pruned_loss=0.0296, over 4990.00 frames.], tot_loss[loss=0.1519, simple_loss=0.2228, pruned_loss=0.04048, over 973074.94 frames.], batch size: 25, lr: 3.91e-04 2022-05-05 04:45:58,978 INFO [train.py:715] (0/8) Epoch 5, batch 9850, loss[loss=0.1531, simple_loss=0.2286, pruned_loss=0.03885, over 4852.00 frames.], tot_loss[loss=0.1526, simple_loss=0.2231, pruned_loss=0.04108, over 972514.31 frames.], batch size: 20, lr: 3.91e-04 2022-05-05 04:46:38,172 INFO [train.py:715] (0/8) Epoch 5, batch 9900, loss[loss=0.1183, simple_loss=0.1721, pruned_loss=0.03226, over 4788.00 frames.], tot_loss[loss=0.1527, simple_loss=0.2231, pruned_loss=0.04114, over 972803.29 frames.], batch size: 12, lr: 3.91e-04 2022-05-05 04:47:17,939 INFO [train.py:715] (0/8) Epoch 5, batch 9950, loss[loss=0.1694, simple_loss=0.2368, pruned_loss=0.05104, over 4771.00 frames.], tot_loss[loss=0.1533, simple_loss=0.2238, pruned_loss=0.0414, over 972675.43 frames.], batch size: 17, lr: 3.91e-04 2022-05-05 04:47:19,455 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-184000.pt 2022-05-05 04:47:59,851 INFO [train.py:715] (0/8) Epoch 5, batch 10000, loss[loss=0.1454, simple_loss=0.2221, pruned_loss=0.03431, over 4925.00 frames.], tot_loss[loss=0.1534, simple_loss=0.2234, pruned_loss=0.04167, over 972454.91 frames.], batch size: 18, lr: 3.91e-04 2022-05-05 04:48:39,804 INFO [train.py:715] (0/8) Epoch 5, batch 10050, loss[loss=0.1384, simple_loss=0.214, pruned_loss=0.03138, over 4821.00 frames.], tot_loss[loss=0.1539, simple_loss=0.2235, pruned_loss=0.04216, over 972489.50 frames.], batch size: 26, lr: 3.91e-04 2022-05-05 04:49:19,416 INFO [train.py:715] (0/8) Epoch 5, batch 10100, loss[loss=0.1326, simple_loss=0.2073, pruned_loss=0.02893, over 4973.00 frames.], tot_loss[loss=0.1531, simple_loss=0.2229, pruned_loss=0.04169, over 973149.30 frames.], batch size: 24, lr: 3.91e-04 2022-05-05 04:49:58,582 INFO [train.py:715] (0/8) Epoch 5, batch 10150, loss[loss=0.1872, simple_loss=0.2506, pruned_loss=0.06186, over 4907.00 frames.], tot_loss[loss=0.1533, simple_loss=0.2231, pruned_loss=0.04175, over 972174.85 frames.], batch size: 17, lr: 3.91e-04 2022-05-05 04:50:38,453 INFO [train.py:715] (0/8) Epoch 5, batch 10200, loss[loss=0.1455, simple_loss=0.2019, pruned_loss=0.04457, over 4753.00 frames.], tot_loss[loss=0.1538, simple_loss=0.2234, pruned_loss=0.04213, over 972466.64 frames.], batch size: 12, lr: 3.91e-04 2022-05-05 04:51:17,792 INFO [train.py:715] (0/8) Epoch 5, batch 10250, loss[loss=0.1629, simple_loss=0.2177, pruned_loss=0.054, over 4734.00 frames.], tot_loss[loss=0.1541, simple_loss=0.2235, pruned_loss=0.04238, over 971612.39 frames.], batch size: 16, lr: 3.90e-04 2022-05-05 04:51:56,802 INFO [train.py:715] (0/8) Epoch 5, batch 10300, loss[loss=0.125, simple_loss=0.1984, pruned_loss=0.0258, over 4747.00 frames.], tot_loss[loss=0.1526, simple_loss=0.222, pruned_loss=0.04155, over 970265.61 frames.], 
batch size: 19, lr: 3.90e-04 2022-05-05 04:52:36,626 INFO [train.py:715] (0/8) Epoch 5, batch 10350, loss[loss=0.1377, simple_loss=0.209, pruned_loss=0.03322, over 4968.00 frames.], tot_loss[loss=0.1533, simple_loss=0.2224, pruned_loss=0.04209, over 971181.30 frames.], batch size: 24, lr: 3.90e-04 2022-05-05 04:53:15,664 INFO [train.py:715] (0/8) Epoch 5, batch 10400, loss[loss=0.1613, simple_loss=0.2237, pruned_loss=0.04946, over 4898.00 frames.], tot_loss[loss=0.1526, simple_loss=0.2219, pruned_loss=0.04166, over 971559.57 frames.], batch size: 19, lr: 3.90e-04 2022-05-05 04:53:55,616 INFO [train.py:715] (0/8) Epoch 5, batch 10450, loss[loss=0.1601, simple_loss=0.2407, pruned_loss=0.03978, over 4832.00 frames.], tot_loss[loss=0.1531, simple_loss=0.2224, pruned_loss=0.04193, over 971120.93 frames.], batch size: 15, lr: 3.90e-04 2022-05-05 04:54:35,514 INFO [train.py:715] (0/8) Epoch 5, batch 10500, loss[loss=0.1286, simple_loss=0.2053, pruned_loss=0.02595, over 4907.00 frames.], tot_loss[loss=0.153, simple_loss=0.222, pruned_loss=0.04203, over 972191.20 frames.], batch size: 17, lr: 3.90e-04 2022-05-05 04:55:15,978 INFO [train.py:715] (0/8) Epoch 5, batch 10550, loss[loss=0.1516, simple_loss=0.2315, pruned_loss=0.03585, over 4776.00 frames.], tot_loss[loss=0.1528, simple_loss=0.2224, pruned_loss=0.04158, over 972898.15 frames.], batch size: 14, lr: 3.90e-04 2022-05-05 04:55:55,072 INFO [train.py:715] (0/8) Epoch 5, batch 10600, loss[loss=0.1474, simple_loss=0.2207, pruned_loss=0.03705, over 4940.00 frames.], tot_loss[loss=0.1529, simple_loss=0.2229, pruned_loss=0.04145, over 972969.57 frames.], batch size: 29, lr: 3.90e-04 2022-05-05 04:56:34,540 INFO [train.py:715] (0/8) Epoch 5, batch 10650, loss[loss=0.1543, simple_loss=0.2429, pruned_loss=0.03285, over 4859.00 frames.], tot_loss[loss=0.1539, simple_loss=0.2239, pruned_loss=0.0419, over 972989.21 frames.], batch size: 20, lr: 3.90e-04 2022-05-05 04:57:14,067 INFO [train.py:715] (0/8) Epoch 5, batch 10700, loss[loss=0.1495, simple_loss=0.2218, pruned_loss=0.03865, over 4865.00 frames.], tot_loss[loss=0.1549, simple_loss=0.2251, pruned_loss=0.04231, over 972064.43 frames.], batch size: 22, lr: 3.90e-04 2022-05-05 04:57:53,028 INFO [train.py:715] (0/8) Epoch 5, batch 10750, loss[loss=0.1462, simple_loss=0.2262, pruned_loss=0.03314, over 4768.00 frames.], tot_loss[loss=0.1534, simple_loss=0.2241, pruned_loss=0.04135, over 971959.18 frames.], batch size: 17, lr: 3.90e-04 2022-05-05 04:58:32,276 INFO [train.py:715] (0/8) Epoch 5, batch 10800, loss[loss=0.1789, simple_loss=0.2474, pruned_loss=0.05522, over 4830.00 frames.], tot_loss[loss=0.1538, simple_loss=0.2244, pruned_loss=0.04165, over 971918.47 frames.], batch size: 15, lr: 3.90e-04 2022-05-05 04:59:11,505 INFO [train.py:715] (0/8) Epoch 5, batch 10850, loss[loss=0.1489, simple_loss=0.2251, pruned_loss=0.03638, over 4956.00 frames.], tot_loss[loss=0.1521, simple_loss=0.2227, pruned_loss=0.04076, over 971267.28 frames.], batch size: 21, lr: 3.90e-04 2022-05-05 04:59:51,498 INFO [train.py:715] (0/8) Epoch 5, batch 10900, loss[loss=0.1836, simple_loss=0.2575, pruned_loss=0.05483, over 4777.00 frames.], tot_loss[loss=0.153, simple_loss=0.2235, pruned_loss=0.04123, over 970634.26 frames.], batch size: 14, lr: 3.90e-04 2022-05-05 05:00:30,694 INFO [train.py:715] (0/8) Epoch 5, batch 10950, loss[loss=0.1525, simple_loss=0.2253, pruned_loss=0.03982, over 4922.00 frames.], tot_loss[loss=0.152, simple_loss=0.2226, pruned_loss=0.04065, over 970603.29 frames.], batch size: 18, lr: 3.90e-04 
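
The learning rate printed at the end of each record drifts down very slowly here (3.94e-04 at the top of this stretch, 3.90e-04 by this point). That behaviour is consistent with an Eden-style schedule that combines a batch-count factor and an epoch-count factor; the sketch below is a reconstruction under that assumption, using the lr_batches=5000 / lr_epochs=4 / initial_lr=0.003 settings configured for this run, and it reproduces the 3.91e-04 printed around the checkpoint-184000 save above.

```python
# Sketch of an Eden-style LR schedule; an assumption about the scheduler used
# for this run, but it reproduces the values printed in this log.
def eden_lr(initial_lr: float, batch: int, epoch: int,
            lr_batches: float = 5000.0, lr_epochs: float = 4.0) -> float:
    batch_factor = ((batch ** 2 + lr_batches ** 2) / lr_batches ** 2) ** -0.25
    epoch_factor = ((epoch ** 2 + lr_epochs ** 2) / lr_epochs ** 2) ** -0.25
    return initial_lr * batch_factor * epoch_factor

# Around the checkpoint-184000 save (epoch 5, global batch ~184000):
print(f"{eden_lr(0.003, 184000, 5):.2e}")  # -> 3.91e-04, matching the log
```
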
2022-05-05 05:01:10,469 INFO [train.py:715] (0/8) Epoch 5, batch 11000, loss[loss=0.1668, simple_loss=0.2364, pruned_loss=0.04857, over 4874.00 frames.], tot_loss[loss=0.152, simple_loss=0.2231, pruned_loss=0.04045, over 970908.70 frames.], batch size: 16, lr: 3.90e-04 2022-05-05 05:01:49,961 INFO [train.py:715] (0/8) Epoch 5, batch 11050, loss[loss=0.1239, simple_loss=0.1947, pruned_loss=0.02654, over 4799.00 frames.], tot_loss[loss=0.1515, simple_loss=0.2223, pruned_loss=0.04032, over 970360.71 frames.], batch size: 12, lr: 3.90e-04 2022-05-05 05:02:29,387 INFO [train.py:715] (0/8) Epoch 5, batch 11100, loss[loss=0.1521, simple_loss=0.22, pruned_loss=0.04207, over 4753.00 frames.], tot_loss[loss=0.1519, simple_loss=0.2224, pruned_loss=0.04071, over 971362.37 frames.], batch size: 19, lr: 3.90e-04 2022-05-05 05:03:08,928 INFO [train.py:715] (0/8) Epoch 5, batch 11150, loss[loss=0.1434, simple_loss=0.2107, pruned_loss=0.03805, over 4977.00 frames.], tot_loss[loss=0.1515, simple_loss=0.2224, pruned_loss=0.04031, over 971654.91 frames.], batch size: 25, lr: 3.90e-04 2022-05-05 05:03:48,018 INFO [train.py:715] (0/8) Epoch 5, batch 11200, loss[loss=0.1421, simple_loss=0.2176, pruned_loss=0.0333, over 4878.00 frames.], tot_loss[loss=0.1512, simple_loss=0.2223, pruned_loss=0.04005, over 971506.28 frames.], batch size: 32, lr: 3.89e-04 2022-05-05 05:04:27,935 INFO [train.py:715] (0/8) Epoch 5, batch 11250, loss[loss=0.1412, simple_loss=0.2098, pruned_loss=0.03629, over 4821.00 frames.], tot_loss[loss=0.1519, simple_loss=0.2225, pruned_loss=0.04066, over 971556.88 frames.], batch size: 27, lr: 3.89e-04 2022-05-05 05:05:07,255 INFO [train.py:715] (0/8) Epoch 5, batch 11300, loss[loss=0.1687, simple_loss=0.2319, pruned_loss=0.05272, over 4875.00 frames.], tot_loss[loss=0.1519, simple_loss=0.2223, pruned_loss=0.04074, over 971609.34 frames.], batch size: 32, lr: 3.89e-04 2022-05-05 05:05:46,393 INFO [train.py:715] (0/8) Epoch 5, batch 11350, loss[loss=0.1505, simple_loss=0.2149, pruned_loss=0.043, over 4906.00 frames.], tot_loss[loss=0.1523, simple_loss=0.2225, pruned_loss=0.04102, over 971724.45 frames.], batch size: 39, lr: 3.89e-04 2022-05-05 05:06:27,198 INFO [train.py:715] (0/8) Epoch 5, batch 11400, loss[loss=0.1703, simple_loss=0.2366, pruned_loss=0.05196, over 4885.00 frames.], tot_loss[loss=0.1535, simple_loss=0.2233, pruned_loss=0.04184, over 971823.65 frames.], batch size: 32, lr: 3.89e-04 2022-05-05 05:07:07,355 INFO [train.py:715] (0/8) Epoch 5, batch 11450, loss[loss=0.1392, simple_loss=0.2185, pruned_loss=0.02993, over 4930.00 frames.], tot_loss[loss=0.1538, simple_loss=0.2238, pruned_loss=0.04185, over 972113.38 frames.], batch size: 39, lr: 3.89e-04 2022-05-05 05:07:47,391 INFO [train.py:715] (0/8) Epoch 5, batch 11500, loss[loss=0.1581, simple_loss=0.232, pruned_loss=0.04216, over 4819.00 frames.], tot_loss[loss=0.1535, simple_loss=0.2235, pruned_loss=0.04174, over 971989.40 frames.], batch size: 13, lr: 3.89e-04 2022-05-05 05:08:27,418 INFO [train.py:715] (0/8) Epoch 5, batch 11550, loss[loss=0.125, simple_loss=0.1952, pruned_loss=0.02739, over 4982.00 frames.], tot_loss[loss=0.1529, simple_loss=0.2231, pruned_loss=0.04141, over 971805.71 frames.], batch size: 25, lr: 3.89e-04 2022-05-05 05:09:07,605 INFO [train.py:715] (0/8) Epoch 5, batch 11600, loss[loss=0.1631, simple_loss=0.2351, pruned_loss=0.04554, over 4851.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2229, pruned_loss=0.04108, over 972244.79 frames.], batch size: 30, lr: 3.89e-04 2022-05-05 05:09:48,309 INFO 
[train.py:715] (0/8) Epoch 5, batch 11650, loss[loss=0.1528, simple_loss=0.2243, pruned_loss=0.04063, over 4960.00 frames.], tot_loss[loss=0.1528, simple_loss=0.2233, pruned_loss=0.04111, over 972468.38 frames.], batch size: 35, lr: 3.89e-04 2022-05-05 05:10:28,061 INFO [train.py:715] (0/8) Epoch 5, batch 11700, loss[loss=0.1634, simple_loss=0.234, pruned_loss=0.04643, over 4863.00 frames.], tot_loss[loss=0.1533, simple_loss=0.2238, pruned_loss=0.04136, over 972905.76 frames.], batch size: 20, lr: 3.89e-04 2022-05-05 05:11:08,774 INFO [train.py:715] (0/8) Epoch 5, batch 11750, loss[loss=0.1394, simple_loss=0.2059, pruned_loss=0.03644, over 4796.00 frames.], tot_loss[loss=0.1534, simple_loss=0.2238, pruned_loss=0.04152, over 972537.95 frames.], batch size: 14, lr: 3.89e-04 2022-05-05 05:11:48,921 INFO [train.py:715] (0/8) Epoch 5, batch 11800, loss[loss=0.138, simple_loss=0.2116, pruned_loss=0.03225, over 4928.00 frames.], tot_loss[loss=0.1526, simple_loss=0.223, pruned_loss=0.0411, over 972403.85 frames.], batch size: 23, lr: 3.89e-04 2022-05-05 05:12:29,041 INFO [train.py:715] (0/8) Epoch 5, batch 11850, loss[loss=0.137, simple_loss=0.2115, pruned_loss=0.03123, over 4805.00 frames.], tot_loss[loss=0.1524, simple_loss=0.2227, pruned_loss=0.04101, over 972420.76 frames.], batch size: 21, lr: 3.89e-04 2022-05-05 05:13:08,180 INFO [train.py:715] (0/8) Epoch 5, batch 11900, loss[loss=0.1745, simple_loss=0.2462, pruned_loss=0.05141, over 4809.00 frames.], tot_loss[loss=0.1519, simple_loss=0.2226, pruned_loss=0.04064, over 972224.94 frames.], batch size: 25, lr: 3.89e-04 2022-05-05 05:13:47,505 INFO [train.py:715] (0/8) Epoch 5, batch 11950, loss[loss=0.1523, simple_loss=0.2247, pruned_loss=0.03997, over 4704.00 frames.], tot_loss[loss=0.1515, simple_loss=0.2224, pruned_loss=0.04032, over 971944.28 frames.], batch size: 15, lr: 3.89e-04 2022-05-05 05:14:27,514 INFO [train.py:715] (0/8) Epoch 5, batch 12000, loss[loss=0.1471, simple_loss=0.2171, pruned_loss=0.03851, over 4804.00 frames.], tot_loss[loss=0.1509, simple_loss=0.2216, pruned_loss=0.04008, over 971764.41 frames.], batch size: 25, lr: 3.89e-04 2022-05-05 05:14:27,515 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 05:14:37,327 INFO [train.py:742] (0/8) Epoch 5, validation: loss=0.1103, simple_loss=0.1957, pruned_loss=0.01243, over 914524.00 frames. 
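
Because several records are packed onto each physical line, pulling the loss and learning-rate curves back out of this text is easiest with a small regex pass. The helper below is purely illustrative (the function name is made up, it is not part of icefall) and extracts (epoch, batch, tot_loss, lr) tuples from text in exactly this format.

```python
import re

# Illustrative parser: extract per-batch tot_loss and lr from log text
# formatted like the records above.
RECORD = re.compile(
    r"Epoch (?P<epoch>\d+), batch (?P<batch>\d+), .*?"
    r"tot_loss\[loss=(?P<tot_loss>[\d.]+).*?\], batch size: \d+, "
    r"lr: (?P<lr>[\d.e-]+)"
)

def parse_training_log(text: str):
    """Yield (epoch, batch, tot_loss, lr) tuples found in raw log text."""
    for m in RECORD.finditer(text):
        yield int(m["epoch"]), int(m["batch"]), float(m["tot_loss"]), float(m["lr"])

sample = ("2022-05-05 05:14:27,514 INFO [train.py:715] (0/8) Epoch 5, batch 12000, "
          "loss[loss=0.1471, simple_loss=0.2171, pruned_loss=0.03851, over 4804.00 frames.], "
          "tot_loss[loss=0.1509, simple_loss=0.2216, pruned_loss=0.04008, over 971764.41 frames.], "
          "batch size: 25, lr: 3.89e-04")
print(list(parse_training_log(sample)))  # [(5, 12000, 0.1509, 0.000389)]
```
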
2022-05-05 05:15:17,597 INFO [train.py:715] (0/8) Epoch 5, batch 12050, loss[loss=0.1509, simple_loss=0.2109, pruned_loss=0.04541, over 4898.00 frames.], tot_loss[loss=0.1515, simple_loss=0.2222, pruned_loss=0.04042, over 972306.30 frames.], batch size: 17, lr: 3.89e-04 2022-05-05 05:15:57,243 INFO [train.py:715] (0/8) Epoch 5, batch 12100, loss[loss=0.1396, simple_loss=0.2172, pruned_loss=0.03098, over 4815.00 frames.], tot_loss[loss=0.1517, simple_loss=0.2229, pruned_loss=0.04029, over 972010.70 frames.], batch size: 21, lr: 3.89e-04 2022-05-05 05:16:36,754 INFO [train.py:715] (0/8) Epoch 5, batch 12150, loss[loss=0.1558, simple_loss=0.2293, pruned_loss=0.04114, over 4881.00 frames.], tot_loss[loss=0.1516, simple_loss=0.2224, pruned_loss=0.04034, over 971770.18 frames.], batch size: 22, lr: 3.88e-04 2022-05-05 05:17:16,020 INFO [train.py:715] (0/8) Epoch 5, batch 12200, loss[loss=0.1944, simple_loss=0.2485, pruned_loss=0.07017, over 4835.00 frames.], tot_loss[loss=0.1519, simple_loss=0.2226, pruned_loss=0.04062, over 971706.04 frames.], batch size: 13, lr: 3.88e-04 2022-05-05 05:17:56,094 INFO [train.py:715] (0/8) Epoch 5, batch 12250, loss[loss=0.1454, simple_loss=0.2202, pruned_loss=0.0353, over 4754.00 frames.], tot_loss[loss=0.1515, simple_loss=0.2223, pruned_loss=0.04032, over 971337.15 frames.], batch size: 19, lr: 3.88e-04 2022-05-05 05:18:35,379 INFO [train.py:715] (0/8) Epoch 5, batch 12300, loss[loss=0.1569, simple_loss=0.2242, pruned_loss=0.0448, over 4763.00 frames.], tot_loss[loss=0.1526, simple_loss=0.223, pruned_loss=0.04108, over 971933.88 frames.], batch size: 14, lr: 3.88e-04 2022-05-05 05:19:14,278 INFO [train.py:715] (0/8) Epoch 5, batch 12350, loss[loss=0.1309, simple_loss=0.2007, pruned_loss=0.03057, over 4906.00 frames.], tot_loss[loss=0.1523, simple_loss=0.2231, pruned_loss=0.04074, over 972215.21 frames.], batch size: 22, lr: 3.88e-04 2022-05-05 05:19:53,842 INFO [train.py:715] (0/8) Epoch 5, batch 12400, loss[loss=0.1371, simple_loss=0.2046, pruned_loss=0.03478, over 4979.00 frames.], tot_loss[loss=0.1518, simple_loss=0.2225, pruned_loss=0.04061, over 972761.96 frames.], batch size: 25, lr: 3.88e-04 2022-05-05 05:20:33,430 INFO [train.py:715] (0/8) Epoch 5, batch 12450, loss[loss=0.1484, simple_loss=0.2172, pruned_loss=0.03983, over 4701.00 frames.], tot_loss[loss=0.152, simple_loss=0.2228, pruned_loss=0.04064, over 973178.36 frames.], batch size: 15, lr: 3.88e-04 2022-05-05 05:21:12,662 INFO [train.py:715] (0/8) Epoch 5, batch 12500, loss[loss=0.1483, simple_loss=0.2168, pruned_loss=0.03993, over 4982.00 frames.], tot_loss[loss=0.1528, simple_loss=0.2236, pruned_loss=0.04105, over 972607.51 frames.], batch size: 15, lr: 3.88e-04 2022-05-05 05:21:51,876 INFO [train.py:715] (0/8) Epoch 5, batch 12550, loss[loss=0.1337, simple_loss=0.2122, pruned_loss=0.02762, over 4891.00 frames.], tot_loss[loss=0.1529, simple_loss=0.2238, pruned_loss=0.04104, over 973099.44 frames.], batch size: 22, lr: 3.88e-04 2022-05-05 05:22:30,625 INFO [train.py:715] (0/8) Epoch 5, batch 12600, loss[loss=0.1474, simple_loss=0.2175, pruned_loss=0.03867, over 4963.00 frames.], tot_loss[loss=0.1522, simple_loss=0.2232, pruned_loss=0.04054, over 973337.84 frames.], batch size: 15, lr: 3.88e-04 2022-05-05 05:23:08,925 INFO [train.py:715] (0/8) Epoch 5, batch 12650, loss[loss=0.1483, simple_loss=0.2106, pruned_loss=0.04302, over 4825.00 frames.], tot_loss[loss=0.1528, simple_loss=0.2233, pruned_loss=0.0411, over 973357.95 frames.], batch size: 30, lr: 3.88e-04 2022-05-05 05:23:47,145 INFO 
[train.py:715] (0/8) Epoch 5, batch 12700, loss[loss=0.135, simple_loss=0.2056, pruned_loss=0.03224, over 4885.00 frames.], tot_loss[loss=0.1526, simple_loss=0.2228, pruned_loss=0.04121, over 972500.57 frames.], batch size: 32, lr: 3.88e-04 2022-05-05 05:24:27,016 INFO [train.py:715] (0/8) Epoch 5, batch 12750, loss[loss=0.1335, simple_loss=0.2093, pruned_loss=0.02883, over 4929.00 frames.], tot_loss[loss=0.153, simple_loss=0.2236, pruned_loss=0.04124, over 973164.69 frames.], batch size: 23, lr: 3.88e-04 2022-05-05 05:25:06,589 INFO [train.py:715] (0/8) Epoch 5, batch 12800, loss[loss=0.1691, simple_loss=0.2315, pruned_loss=0.05334, over 4844.00 frames.], tot_loss[loss=0.1534, simple_loss=0.2239, pruned_loss=0.04141, over 973129.14 frames.], batch size: 34, lr: 3.88e-04 2022-05-05 05:25:46,755 INFO [train.py:715] (0/8) Epoch 5, batch 12850, loss[loss=0.1811, simple_loss=0.2633, pruned_loss=0.04945, over 4738.00 frames.], tot_loss[loss=0.1539, simple_loss=0.2247, pruned_loss=0.04153, over 972777.20 frames.], batch size: 16, lr: 3.88e-04 2022-05-05 05:26:26,305 INFO [train.py:715] (0/8) Epoch 5, batch 12900, loss[loss=0.1525, simple_loss=0.234, pruned_loss=0.03549, over 4688.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2235, pruned_loss=0.04074, over 972303.17 frames.], batch size: 15, lr: 3.88e-04 2022-05-05 05:27:06,308 INFO [train.py:715] (0/8) Epoch 5, batch 12950, loss[loss=0.1707, simple_loss=0.25, pruned_loss=0.0457, over 4820.00 frames.], tot_loss[loss=0.1532, simple_loss=0.2243, pruned_loss=0.0411, over 972551.63 frames.], batch size: 25, lr: 3.88e-04 2022-05-05 05:27:45,734 INFO [train.py:715] (0/8) Epoch 5, batch 13000, loss[loss=0.1651, simple_loss=0.2395, pruned_loss=0.04531, over 4853.00 frames.], tot_loss[loss=0.1532, simple_loss=0.224, pruned_loss=0.04119, over 972577.40 frames.], batch size: 20, lr: 3.88e-04 2022-05-05 05:28:25,606 INFO [train.py:715] (0/8) Epoch 5, batch 13050, loss[loss=0.1377, simple_loss=0.2058, pruned_loss=0.03478, over 4935.00 frames.], tot_loss[loss=0.1527, simple_loss=0.2235, pruned_loss=0.0409, over 972892.41 frames.], batch size: 23, lr: 3.88e-04 2022-05-05 05:29:03,804 INFO [train.py:715] (0/8) Epoch 5, batch 13100, loss[loss=0.1636, simple_loss=0.241, pruned_loss=0.04312, over 4788.00 frames.], tot_loss[loss=0.1526, simple_loss=0.2234, pruned_loss=0.04094, over 972913.68 frames.], batch size: 17, lr: 3.87e-04 2022-05-05 05:29:42,386 INFO [train.py:715] (0/8) Epoch 5, batch 13150, loss[loss=0.1456, simple_loss=0.2222, pruned_loss=0.03451, over 4913.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2236, pruned_loss=0.04077, over 974048.93 frames.], batch size: 18, lr: 3.87e-04 2022-05-05 05:30:20,476 INFO [train.py:715] (0/8) Epoch 5, batch 13200, loss[loss=0.1473, simple_loss=0.2243, pruned_loss=0.0352, over 4819.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2237, pruned_loss=0.0407, over 974147.18 frames.], batch size: 12, lr: 3.87e-04 2022-05-05 05:30:58,487 INFO [train.py:715] (0/8) Epoch 5, batch 13250, loss[loss=0.1405, simple_loss=0.2087, pruned_loss=0.0361, over 4948.00 frames.], tot_loss[loss=0.1529, simple_loss=0.2239, pruned_loss=0.04096, over 974239.25 frames.], batch size: 21, lr: 3.87e-04 2022-05-05 05:31:37,090 INFO [train.py:715] (0/8) Epoch 5, batch 13300, loss[loss=0.1825, simple_loss=0.2392, pruned_loss=0.06294, over 4883.00 frames.], tot_loss[loss=0.1533, simple_loss=0.2241, pruned_loss=0.04128, over 973543.28 frames.], batch size: 16, lr: 3.87e-04 2022-05-05 05:32:14,950 INFO [train.py:715] (0/8) Epoch 5, batch 13350, 
loss[loss=0.1862, simple_loss=0.2491, pruned_loss=0.06166, over 4872.00 frames.], tot_loss[loss=0.1542, simple_loss=0.2245, pruned_loss=0.04198, over 973675.14 frames.], batch size: 32, lr: 3.87e-04 2022-05-05 05:32:53,083 INFO [train.py:715] (0/8) Epoch 5, batch 13400, loss[loss=0.1468, simple_loss=0.2049, pruned_loss=0.04434, over 4966.00 frames.], tot_loss[loss=0.154, simple_loss=0.2243, pruned_loss=0.04185, over 973012.83 frames.], batch size: 14, lr: 3.87e-04 2022-05-05 05:33:30,827 INFO [train.py:715] (0/8) Epoch 5, batch 13450, loss[loss=0.1527, simple_loss=0.2256, pruned_loss=0.03989, over 4812.00 frames.], tot_loss[loss=0.1536, simple_loss=0.2236, pruned_loss=0.04181, over 973135.76 frames.], batch size: 13, lr: 3.87e-04 2022-05-05 05:34:09,165 INFO [train.py:715] (0/8) Epoch 5, batch 13500, loss[loss=0.151, simple_loss=0.2248, pruned_loss=0.03857, over 4927.00 frames.], tot_loss[loss=0.1535, simple_loss=0.2237, pruned_loss=0.04159, over 972171.64 frames.], batch size: 29, lr: 3.87e-04 2022-05-05 05:34:47,069 INFO [train.py:715] (0/8) Epoch 5, batch 13550, loss[loss=0.1657, simple_loss=0.2322, pruned_loss=0.04958, over 4956.00 frames.], tot_loss[loss=0.152, simple_loss=0.2224, pruned_loss=0.0408, over 972272.37 frames.], batch size: 24, lr: 3.87e-04 2022-05-05 05:35:24,565 INFO [train.py:715] (0/8) Epoch 5, batch 13600, loss[loss=0.1762, simple_loss=0.2365, pruned_loss=0.05791, over 4840.00 frames.], tot_loss[loss=0.1527, simple_loss=0.223, pruned_loss=0.04118, over 972333.44 frames.], batch size: 30, lr: 3.87e-04 2022-05-05 05:36:03,220 INFO [train.py:715] (0/8) Epoch 5, batch 13650, loss[loss=0.187, simple_loss=0.2585, pruned_loss=0.05773, over 4884.00 frames.], tot_loss[loss=0.1534, simple_loss=0.2236, pruned_loss=0.04163, over 972571.29 frames.], batch size: 16, lr: 3.87e-04 2022-05-05 05:36:41,015 INFO [train.py:715] (0/8) Epoch 5, batch 13700, loss[loss=0.1538, simple_loss=0.2287, pruned_loss=0.03945, over 4928.00 frames.], tot_loss[loss=0.1531, simple_loss=0.2233, pruned_loss=0.04146, over 972289.17 frames.], batch size: 23, lr: 3.87e-04 2022-05-05 05:37:19,072 INFO [train.py:715] (0/8) Epoch 5, batch 13750, loss[loss=0.1588, simple_loss=0.2261, pruned_loss=0.04572, over 4912.00 frames.], tot_loss[loss=0.1533, simple_loss=0.2235, pruned_loss=0.04151, over 972655.44 frames.], batch size: 19, lr: 3.87e-04 2022-05-05 05:37:56,879 INFO [train.py:715] (0/8) Epoch 5, batch 13800, loss[loss=0.1515, simple_loss=0.2289, pruned_loss=0.03701, over 4746.00 frames.], tot_loss[loss=0.154, simple_loss=0.2239, pruned_loss=0.04204, over 973390.10 frames.], batch size: 16, lr: 3.87e-04 2022-05-05 05:38:35,343 INFO [train.py:715] (0/8) Epoch 5, batch 13850, loss[loss=0.1655, simple_loss=0.228, pruned_loss=0.05149, over 4902.00 frames.], tot_loss[loss=0.1537, simple_loss=0.2237, pruned_loss=0.04184, over 972292.31 frames.], batch size: 17, lr: 3.87e-04 2022-05-05 05:39:13,569 INFO [train.py:715] (0/8) Epoch 5, batch 13900, loss[loss=0.1394, simple_loss=0.2118, pruned_loss=0.0335, over 4879.00 frames.], tot_loss[loss=0.1522, simple_loss=0.2222, pruned_loss=0.04106, over 972137.22 frames.], batch size: 32, lr: 3.87e-04 2022-05-05 05:39:51,054 INFO [train.py:715] (0/8) Epoch 5, batch 13950, loss[loss=0.1425, simple_loss=0.2179, pruned_loss=0.03351, over 4798.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2227, pruned_loss=0.04112, over 971944.51 frames.], batch size: 24, lr: 3.87e-04 2022-05-05 05:40:29,783 INFO [train.py:715] (0/8) Epoch 5, batch 14000, loss[loss=0.1513, simple_loss=0.2328, 
pruned_loss=0.03491, over 4852.00 frames.], tot_loss[loss=0.1526, simple_loss=0.2229, pruned_loss=0.04113, over 972414.79 frames.], batch size: 20, lr: 3.87e-04 2022-05-05 05:41:07,811 INFO [train.py:715] (0/8) Epoch 5, batch 14050, loss[loss=0.1611, simple_loss=0.2307, pruned_loss=0.04574, over 4751.00 frames.], tot_loss[loss=0.1518, simple_loss=0.2221, pruned_loss=0.04074, over 972141.68 frames.], batch size: 12, lr: 3.87e-04 2022-05-05 05:41:45,575 INFO [train.py:715] (0/8) Epoch 5, batch 14100, loss[loss=0.1267, simple_loss=0.2012, pruned_loss=0.02604, over 4814.00 frames.], tot_loss[loss=0.1514, simple_loss=0.2221, pruned_loss=0.04031, over 971861.47 frames.], batch size: 25, lr: 3.86e-04 2022-05-05 05:42:23,451 INFO [train.py:715] (0/8) Epoch 5, batch 14150, loss[loss=0.181, simple_loss=0.2391, pruned_loss=0.06141, over 4919.00 frames.], tot_loss[loss=0.152, simple_loss=0.2225, pruned_loss=0.04077, over 972047.90 frames.], batch size: 18, lr: 3.86e-04 2022-05-05 05:43:01,795 INFO [train.py:715] (0/8) Epoch 5, batch 14200, loss[loss=0.138, simple_loss=0.2027, pruned_loss=0.03666, over 4831.00 frames.], tot_loss[loss=0.152, simple_loss=0.2223, pruned_loss=0.04088, over 971328.47 frames.], batch size: 30, lr: 3.86e-04 2022-05-05 05:43:40,046 INFO [train.py:715] (0/8) Epoch 5, batch 14250, loss[loss=0.168, simple_loss=0.24, pruned_loss=0.04804, over 4943.00 frames.], tot_loss[loss=0.1524, simple_loss=0.2225, pruned_loss=0.0411, over 970701.54 frames.], batch size: 35, lr: 3.86e-04 2022-05-05 05:44:18,048 INFO [train.py:715] (0/8) Epoch 5, batch 14300, loss[loss=0.1671, simple_loss=0.2451, pruned_loss=0.04452, over 4928.00 frames.], tot_loss[loss=0.1518, simple_loss=0.2219, pruned_loss=0.04087, over 970725.82 frames.], batch size: 29, lr: 3.86e-04 2022-05-05 05:44:56,431 INFO [train.py:715] (0/8) Epoch 5, batch 14350, loss[loss=0.1558, simple_loss=0.2324, pruned_loss=0.03962, over 4966.00 frames.], tot_loss[loss=0.1524, simple_loss=0.223, pruned_loss=0.04091, over 971914.97 frames.], batch size: 24, lr: 3.86e-04 2022-05-05 05:45:34,228 INFO [train.py:715] (0/8) Epoch 5, batch 14400, loss[loss=0.1741, simple_loss=0.2415, pruned_loss=0.05333, over 4802.00 frames.], tot_loss[loss=0.1519, simple_loss=0.2225, pruned_loss=0.04065, over 971803.10 frames.], batch size: 24, lr: 3.86e-04 2022-05-05 05:46:11,859 INFO [train.py:715] (0/8) Epoch 5, batch 14450, loss[loss=0.2031, simple_loss=0.2791, pruned_loss=0.06351, over 4958.00 frames.], tot_loss[loss=0.152, simple_loss=0.2227, pruned_loss=0.04063, over 972606.43 frames.], batch size: 15, lr: 3.86e-04 2022-05-05 05:46:49,662 INFO [train.py:715] (0/8) Epoch 5, batch 14500, loss[loss=0.137, simple_loss=0.2128, pruned_loss=0.03059, over 4975.00 frames.], tot_loss[loss=0.1521, simple_loss=0.2232, pruned_loss=0.04048, over 972738.88 frames.], batch size: 24, lr: 3.86e-04 2022-05-05 05:47:27,992 INFO [train.py:715] (0/8) Epoch 5, batch 14550, loss[loss=0.1451, simple_loss=0.2221, pruned_loss=0.03409, over 4776.00 frames.], tot_loss[loss=0.1519, simple_loss=0.223, pruned_loss=0.04043, over 971912.98 frames.], batch size: 12, lr: 3.86e-04 2022-05-05 05:48:06,091 INFO [train.py:715] (0/8) Epoch 5, batch 14600, loss[loss=0.2005, simple_loss=0.2663, pruned_loss=0.0673, over 4753.00 frames.], tot_loss[loss=0.153, simple_loss=0.2241, pruned_loss=0.04098, over 972083.80 frames.], batch size: 19, lr: 3.86e-04 2022-05-05 05:48:44,024 INFO [train.py:715] (0/8) Epoch 5, batch 14650, loss[loss=0.1418, simple_loss=0.2091, pruned_loss=0.03721, over 4697.00 frames.], 
tot_loss[loss=0.1531, simple_loss=0.224, pruned_loss=0.04106, over 971298.15 frames.], batch size: 15, lr: 3.86e-04 2022-05-05 05:49:22,271 INFO [train.py:715] (0/8) Epoch 5, batch 14700, loss[loss=0.1479, simple_loss=0.2176, pruned_loss=0.03909, over 4699.00 frames.], tot_loss[loss=0.1517, simple_loss=0.2222, pruned_loss=0.04063, over 970546.78 frames.], batch size: 15, lr: 3.86e-04 2022-05-05 05:49:59,642 INFO [train.py:715] (0/8) Epoch 5, batch 14750, loss[loss=0.1334, simple_loss=0.2064, pruned_loss=0.0302, over 4772.00 frames.], tot_loss[loss=0.1507, simple_loss=0.2214, pruned_loss=0.04001, over 970735.50 frames.], batch size: 18, lr: 3.86e-04 2022-05-05 05:50:37,673 INFO [train.py:715] (0/8) Epoch 5, batch 14800, loss[loss=0.1395, simple_loss=0.2083, pruned_loss=0.03529, over 4857.00 frames.], tot_loss[loss=0.1509, simple_loss=0.2216, pruned_loss=0.04016, over 970375.32 frames.], batch size: 22, lr: 3.86e-04 2022-05-05 05:51:15,489 INFO [train.py:715] (0/8) Epoch 5, batch 14850, loss[loss=0.1738, simple_loss=0.2402, pruned_loss=0.05372, over 4910.00 frames.], tot_loss[loss=0.1504, simple_loss=0.2209, pruned_loss=0.03997, over 970197.84 frames.], batch size: 17, lr: 3.86e-04 2022-05-05 05:51:54,086 INFO [train.py:715] (0/8) Epoch 5, batch 14900, loss[loss=0.1517, simple_loss=0.2186, pruned_loss=0.04239, over 4805.00 frames.], tot_loss[loss=0.1517, simple_loss=0.2225, pruned_loss=0.04043, over 970207.29 frames.], batch size: 26, lr: 3.86e-04 2022-05-05 05:52:32,746 INFO [train.py:715] (0/8) Epoch 5, batch 14950, loss[loss=0.1566, simple_loss=0.2299, pruned_loss=0.0417, over 4831.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2231, pruned_loss=0.0409, over 970915.13 frames.], batch size: 26, lr: 3.86e-04 2022-05-05 05:53:10,806 INFO [train.py:715] (0/8) Epoch 5, batch 15000, loss[loss=0.1307, simple_loss=0.1988, pruned_loss=0.0313, over 4872.00 frames.], tot_loss[loss=0.1518, simple_loss=0.2222, pruned_loss=0.04068, over 971674.48 frames.], batch size: 13, lr: 3.86e-04 2022-05-05 05:53:10,807 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 05:53:21,083 INFO [train.py:742] (0/8) Epoch 5, validation: loss=0.1105, simple_loss=0.1958, pruned_loss=0.01261, over 914524.00 frames. 
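
A rough throughput figure can be read off the timestamps: each 50-batch interval in this stretch takes about 39 seconds of wall time on rank 0. The arithmetic below is illustrative only and assumes the per-batch frame counts are post-subsampling (factor 4, 10 ms frames), so ~4800 frames correspond to ~192 seconds of audio per batch.

```python
from datetime import datetime

# Illustrative arithmetic; assumes the logged frame counts are after 4x
# subsampling of 10 ms frames (i.e. 25 frames per second of audio).
t0 = datetime.strptime("2022-05-05 05:51:54,086", "%Y-%m-%d %H:%M:%S,%f")  # batch 14900
t1 = datetime.strptime("2022-05-05 05:52:32,746", "%Y-%m-%d %H:%M:%S,%f")  # batch 14950
wall = (t1 - t0).total_seconds()          # ~38.7 s for 50 batches on rank 0

frames_per_batch = 4800                   # typical "over ~4800 frames" per batch here
audio_sec_per_batch = frames_per_batch * 4 / 100
speed_per_gpu = 50 * audio_sec_per_batch / wall
print(f"{wall:.1f} s wall; ~{speed_per_gpu:.0f} s of audio processed per second per GPU")
# roughly 250x real time on each of the 8 GPUs
```
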
2022-05-05 05:53:58,554 INFO [train.py:715] (0/8) Epoch 5, batch 15050, loss[loss=0.1629, simple_loss=0.2322, pruned_loss=0.04677, over 4870.00 frames.], tot_loss[loss=0.1518, simple_loss=0.2224, pruned_loss=0.04063, over 971186.27 frames.], batch size: 20, lr: 3.85e-04 2022-05-05 05:54:37,213 INFO [train.py:715] (0/8) Epoch 5, batch 15100, loss[loss=0.126, simple_loss=0.2062, pruned_loss=0.02293, over 4758.00 frames.], tot_loss[loss=0.1519, simple_loss=0.2225, pruned_loss=0.04065, over 971636.40 frames.], batch size: 19, lr: 3.85e-04 2022-05-05 05:55:15,132 INFO [train.py:715] (0/8) Epoch 5, batch 15150, loss[loss=0.1625, simple_loss=0.2266, pruned_loss=0.04919, over 4948.00 frames.], tot_loss[loss=0.1516, simple_loss=0.2225, pruned_loss=0.04038, over 972263.84 frames.], batch size: 21, lr: 3.85e-04 2022-05-05 05:55:53,268 INFO [train.py:715] (0/8) Epoch 5, batch 15200, loss[loss=0.1381, simple_loss=0.218, pruned_loss=0.02912, over 4819.00 frames.], tot_loss[loss=0.1508, simple_loss=0.2215, pruned_loss=0.04003, over 972752.00 frames.], batch size: 25, lr: 3.85e-04 2022-05-05 05:56:32,185 INFO [train.py:715] (0/8) Epoch 5, batch 15250, loss[loss=0.1241, simple_loss=0.1981, pruned_loss=0.02509, over 4762.00 frames.], tot_loss[loss=0.1513, simple_loss=0.2226, pruned_loss=0.03996, over 972116.63 frames.], batch size: 18, lr: 3.85e-04 2022-05-05 05:57:10,895 INFO [train.py:715] (0/8) Epoch 5, batch 15300, loss[loss=0.1547, simple_loss=0.2351, pruned_loss=0.0372, over 4815.00 frames.], tot_loss[loss=0.1512, simple_loss=0.2222, pruned_loss=0.0401, over 972976.42 frames.], batch size: 25, lr: 3.85e-04 2022-05-05 05:57:50,135 INFO [train.py:715] (0/8) Epoch 5, batch 15350, loss[loss=0.1727, simple_loss=0.2472, pruned_loss=0.04908, over 4867.00 frames.], tot_loss[loss=0.1507, simple_loss=0.2221, pruned_loss=0.03961, over 972515.34 frames.], batch size: 20, lr: 3.85e-04 2022-05-05 05:58:28,472 INFO [train.py:715] (0/8) Epoch 5, batch 15400, loss[loss=0.1264, simple_loss=0.2001, pruned_loss=0.02636, over 4827.00 frames.], tot_loss[loss=0.1512, simple_loss=0.2224, pruned_loss=0.04001, over 973383.51 frames.], batch size: 26, lr: 3.85e-04 2022-05-05 05:59:07,517 INFO [train.py:715] (0/8) Epoch 5, batch 15450, loss[loss=0.1482, simple_loss=0.2296, pruned_loss=0.03337, over 4912.00 frames.], tot_loss[loss=0.1521, simple_loss=0.223, pruned_loss=0.04061, over 973434.26 frames.], batch size: 18, lr: 3.85e-04 2022-05-05 05:59:46,048 INFO [train.py:715] (0/8) Epoch 5, batch 15500, loss[loss=0.1658, simple_loss=0.2363, pruned_loss=0.04761, over 4793.00 frames.], tot_loss[loss=0.1513, simple_loss=0.2228, pruned_loss=0.03987, over 973484.94 frames.], batch size: 24, lr: 3.85e-04 2022-05-05 06:00:25,315 INFO [train.py:715] (0/8) Epoch 5, batch 15550, loss[loss=0.1235, simple_loss=0.2008, pruned_loss=0.02306, over 4826.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2236, pruned_loss=0.04068, over 973135.09 frames.], batch size: 13, lr: 3.85e-04 2022-05-05 06:01:03,323 INFO [train.py:715] (0/8) Epoch 5, batch 15600, loss[loss=0.1791, simple_loss=0.2555, pruned_loss=0.05137, over 4967.00 frames.], tot_loss[loss=0.1523, simple_loss=0.2234, pruned_loss=0.04058, over 972899.51 frames.], batch size: 24, lr: 3.85e-04 2022-05-05 06:01:40,923 INFO [train.py:715] (0/8) Epoch 5, batch 15650, loss[loss=0.1588, simple_loss=0.2334, pruned_loss=0.04215, over 4893.00 frames.], tot_loss[loss=0.152, simple_loss=0.2232, pruned_loss=0.04045, over 972443.39 frames.], batch size: 19, lr: 3.85e-04 2022-05-05 06:02:18,443 INFO 
[train.py:715] (0/8) Epoch 5, batch 15700, loss[loss=0.1561, simple_loss=0.2318, pruned_loss=0.04023, over 4894.00 frames.], tot_loss[loss=0.1529, simple_loss=0.2235, pruned_loss=0.0411, over 972242.64 frames.], batch size: 22, lr: 3.85e-04 2022-05-05 06:02:56,462 INFO [train.py:715] (0/8) Epoch 5, batch 15750, loss[loss=0.1661, simple_loss=0.2312, pruned_loss=0.05053, over 4835.00 frames.], tot_loss[loss=0.1527, simple_loss=0.2232, pruned_loss=0.04111, over 971990.76 frames.], batch size: 15, lr: 3.85e-04 2022-05-05 06:03:34,886 INFO [train.py:715] (0/8) Epoch 5, batch 15800, loss[loss=0.2022, simple_loss=0.2524, pruned_loss=0.07597, over 4875.00 frames.], tot_loss[loss=0.1515, simple_loss=0.2222, pruned_loss=0.04036, over 972275.46 frames.], batch size: 32, lr: 3.85e-04 2022-05-05 06:04:12,952 INFO [train.py:715] (0/8) Epoch 5, batch 15850, loss[loss=0.1531, simple_loss=0.2196, pruned_loss=0.04323, over 4750.00 frames.], tot_loss[loss=0.153, simple_loss=0.2238, pruned_loss=0.04106, over 972441.10 frames.], batch size: 16, lr: 3.85e-04 2022-05-05 06:04:50,527 INFO [train.py:715] (0/8) Epoch 5, batch 15900, loss[loss=0.1215, simple_loss=0.1972, pruned_loss=0.02288, over 4777.00 frames.], tot_loss[loss=0.1531, simple_loss=0.2239, pruned_loss=0.04113, over 972215.43 frames.], batch size: 14, lr: 3.85e-04 2022-05-05 06:05:28,341 INFO [train.py:715] (0/8) Epoch 5, batch 15950, loss[loss=0.1413, simple_loss=0.2184, pruned_loss=0.03216, over 4769.00 frames.], tot_loss[loss=0.1524, simple_loss=0.2232, pruned_loss=0.04082, over 971960.03 frames.], batch size: 17, lr: 3.85e-04 2022-05-05 06:06:05,810 INFO [train.py:715] (0/8) Epoch 5, batch 16000, loss[loss=0.1587, simple_loss=0.2321, pruned_loss=0.04266, over 4768.00 frames.], tot_loss[loss=0.1518, simple_loss=0.2225, pruned_loss=0.04054, over 971738.37 frames.], batch size: 19, lr: 3.85e-04 2022-05-05 06:06:43,533 INFO [train.py:715] (0/8) Epoch 5, batch 16050, loss[loss=0.1469, simple_loss=0.2148, pruned_loss=0.03949, over 4916.00 frames.], tot_loss[loss=0.1514, simple_loss=0.2223, pruned_loss=0.04027, over 972343.01 frames.], batch size: 17, lr: 3.84e-04 2022-05-05 06:07:21,598 INFO [train.py:715] (0/8) Epoch 5, batch 16100, loss[loss=0.1491, simple_loss=0.229, pruned_loss=0.0346, over 4797.00 frames.], tot_loss[loss=0.1517, simple_loss=0.2223, pruned_loss=0.04051, over 971473.59 frames.], batch size: 14, lr: 3.84e-04 2022-05-05 06:08:00,776 INFO [train.py:715] (0/8) Epoch 5, batch 16150, loss[loss=0.1918, simple_loss=0.2476, pruned_loss=0.06797, over 4843.00 frames.], tot_loss[loss=0.1521, simple_loss=0.2224, pruned_loss=0.04094, over 972095.04 frames.], batch size: 15, lr: 3.84e-04 2022-05-05 06:08:39,726 INFO [train.py:715] (0/8) Epoch 5, batch 16200, loss[loss=0.1353, simple_loss=0.2145, pruned_loss=0.02802, over 4975.00 frames.], tot_loss[loss=0.1504, simple_loss=0.2208, pruned_loss=0.03999, over 972419.57 frames.], batch size: 25, lr: 3.84e-04 2022-05-05 06:09:18,287 INFO [train.py:715] (0/8) Epoch 5, batch 16250, loss[loss=0.1329, simple_loss=0.2038, pruned_loss=0.03101, over 4861.00 frames.], tot_loss[loss=0.1518, simple_loss=0.2216, pruned_loss=0.04098, over 972860.17 frames.], batch size: 32, lr: 3.84e-04 2022-05-05 06:09:56,095 INFO [train.py:715] (0/8) Epoch 5, batch 16300, loss[loss=0.1407, simple_loss=0.2113, pruned_loss=0.03501, over 4755.00 frames.], tot_loss[loss=0.1527, simple_loss=0.2229, pruned_loss=0.04131, over 972528.39 frames.], batch size: 19, lr: 3.84e-04 2022-05-05 06:10:34,108 INFO [train.py:715] (0/8) Epoch 5, batch 
16350, loss[loss=0.1929, simple_loss=0.2484, pruned_loss=0.06868, over 4933.00 frames.], tot_loss[loss=0.1516, simple_loss=0.2221, pruned_loss=0.04052, over 971909.00 frames.], batch size: 29, lr: 3.84e-04 2022-05-05 06:11:12,492 INFO [train.py:715] (0/8) Epoch 5, batch 16400, loss[loss=0.143, simple_loss=0.2132, pruned_loss=0.0364, over 4949.00 frames.], tot_loss[loss=0.1521, simple_loss=0.2227, pruned_loss=0.04077, over 971405.63 frames.], batch size: 21, lr: 3.84e-04 2022-05-05 06:11:50,948 INFO [train.py:715] (0/8) Epoch 5, batch 16450, loss[loss=0.1394, simple_loss=0.2111, pruned_loss=0.0339, over 4906.00 frames.], tot_loss[loss=0.152, simple_loss=0.2227, pruned_loss=0.04064, over 971824.93 frames.], batch size: 17, lr: 3.84e-04 2022-05-05 06:12:30,298 INFO [train.py:715] (0/8) Epoch 5, batch 16500, loss[loss=0.1223, simple_loss=0.1898, pruned_loss=0.0274, over 4937.00 frames.], tot_loss[loss=0.1517, simple_loss=0.2222, pruned_loss=0.04057, over 971617.85 frames.], batch size: 29, lr: 3.84e-04 2022-05-05 06:13:08,218 INFO [train.py:715] (0/8) Epoch 5, batch 16550, loss[loss=0.1561, simple_loss=0.2247, pruned_loss=0.04377, over 4688.00 frames.], tot_loss[loss=0.1517, simple_loss=0.222, pruned_loss=0.04072, over 971548.21 frames.], batch size: 15, lr: 3.84e-04 2022-05-05 06:13:46,903 INFO [train.py:715] (0/8) Epoch 5, batch 16600, loss[loss=0.1452, simple_loss=0.2209, pruned_loss=0.03475, over 4874.00 frames.], tot_loss[loss=0.1516, simple_loss=0.2223, pruned_loss=0.04048, over 971493.36 frames.], batch size: 30, lr: 3.84e-04 2022-05-05 06:14:25,618 INFO [train.py:715] (0/8) Epoch 5, batch 16650, loss[loss=0.1704, simple_loss=0.2494, pruned_loss=0.04568, over 4921.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2232, pruned_loss=0.04092, over 972253.37 frames.], batch size: 29, lr: 3.84e-04 2022-05-05 06:15:04,293 INFO [train.py:715] (0/8) Epoch 5, batch 16700, loss[loss=0.1764, simple_loss=0.2502, pruned_loss=0.05127, over 4901.00 frames.], tot_loss[loss=0.1522, simple_loss=0.223, pruned_loss=0.04074, over 972951.81 frames.], batch size: 19, lr: 3.84e-04 2022-05-05 06:15:42,482 INFO [train.py:715] (0/8) Epoch 5, batch 16750, loss[loss=0.13, simple_loss=0.2059, pruned_loss=0.02702, over 4782.00 frames.], tot_loss[loss=0.1529, simple_loss=0.2237, pruned_loss=0.04109, over 972846.70 frames.], batch size: 18, lr: 3.84e-04 2022-05-05 06:16:20,934 INFO [train.py:715] (0/8) Epoch 5, batch 16800, loss[loss=0.1436, simple_loss=0.215, pruned_loss=0.03613, over 4803.00 frames.], tot_loss[loss=0.1528, simple_loss=0.2235, pruned_loss=0.04105, over 972468.90 frames.], batch size: 21, lr: 3.84e-04 2022-05-05 06:17:00,068 INFO [train.py:715] (0/8) Epoch 5, batch 16850, loss[loss=0.1363, simple_loss=0.2116, pruned_loss=0.03048, over 4833.00 frames.], tot_loss[loss=0.1527, simple_loss=0.2234, pruned_loss=0.041, over 972887.82 frames.], batch size: 30, lr: 3.84e-04 2022-05-05 06:17:37,928 INFO [train.py:715] (0/8) Epoch 5, batch 16900, loss[loss=0.1557, simple_loss=0.2301, pruned_loss=0.04065, over 4829.00 frames.], tot_loss[loss=0.1529, simple_loss=0.2234, pruned_loss=0.04117, over 973438.09 frames.], batch size: 27, lr: 3.84e-04 2022-05-05 06:18:16,754 INFO [train.py:715] (0/8) Epoch 5, batch 16950, loss[loss=0.1189, simple_loss=0.1953, pruned_loss=0.02125, over 4696.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2231, pruned_loss=0.041, over 972212.57 frames.], batch size: 15, lr: 3.84e-04 2022-05-05 06:18:55,159 INFO [train.py:715] (0/8) Epoch 5, batch 17000, loss[loss=0.1464, simple_loss=0.2104, 
pruned_loss=0.04121, over 4846.00 frames.], tot_loss[loss=0.1533, simple_loss=0.2235, pruned_loss=0.04151, over 972552.64 frames.], batch size: 32, lr: 3.84e-04 2022-05-05 06:19:33,547 INFO [train.py:715] (0/8) Epoch 5, batch 17050, loss[loss=0.1704, simple_loss=0.2365, pruned_loss=0.05219, over 4963.00 frames.], tot_loss[loss=0.153, simple_loss=0.223, pruned_loss=0.04144, over 972463.05 frames.], batch size: 15, lr: 3.83e-04 2022-05-05 06:20:11,939 INFO [train.py:715] (0/8) Epoch 5, batch 17100, loss[loss=0.1507, simple_loss=0.2182, pruned_loss=0.04163, over 4726.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2227, pruned_loss=0.04113, over 972169.10 frames.], batch size: 12, lr: 3.83e-04 2022-05-05 06:20:49,748 INFO [train.py:715] (0/8) Epoch 5, batch 17150, loss[loss=0.1479, simple_loss=0.215, pruned_loss=0.04044, over 4846.00 frames.], tot_loss[loss=0.1527, simple_loss=0.2229, pruned_loss=0.04127, over 972597.36 frames.], batch size: 32, lr: 3.83e-04 2022-05-05 06:21:27,627 INFO [train.py:715] (0/8) Epoch 5, batch 17200, loss[loss=0.1528, simple_loss=0.2224, pruned_loss=0.04163, over 4836.00 frames.], tot_loss[loss=0.1527, simple_loss=0.2229, pruned_loss=0.04119, over 972233.41 frames.], batch size: 15, lr: 3.83e-04 2022-05-05 06:22:04,733 INFO [train.py:715] (0/8) Epoch 5, batch 17250, loss[loss=0.144, simple_loss=0.2224, pruned_loss=0.03281, over 4912.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2231, pruned_loss=0.04101, over 972316.36 frames.], batch size: 18, lr: 3.83e-04 2022-05-05 06:22:42,968 INFO [train.py:715] (0/8) Epoch 5, batch 17300, loss[loss=0.171, simple_loss=0.236, pruned_loss=0.053, over 4854.00 frames.], tot_loss[loss=0.1528, simple_loss=0.2234, pruned_loss=0.04113, over 972114.81 frames.], batch size: 32, lr: 3.83e-04 2022-05-05 06:23:22,495 INFO [train.py:715] (0/8) Epoch 5, batch 17350, loss[loss=0.1315, simple_loss=0.197, pruned_loss=0.03298, over 4656.00 frames.], tot_loss[loss=0.1535, simple_loss=0.2238, pruned_loss=0.04161, over 972092.18 frames.], batch size: 13, lr: 3.83e-04 2022-05-05 06:24:00,867 INFO [train.py:715] (0/8) Epoch 5, batch 17400, loss[loss=0.1278, simple_loss=0.2029, pruned_loss=0.02635, over 4834.00 frames.], tot_loss[loss=0.1539, simple_loss=0.2239, pruned_loss=0.04191, over 972628.72 frames.], batch size: 13, lr: 3.83e-04 2022-05-05 06:24:39,479 INFO [train.py:715] (0/8) Epoch 5, batch 17450, loss[loss=0.1215, simple_loss=0.195, pruned_loss=0.024, over 4639.00 frames.], tot_loss[loss=0.1528, simple_loss=0.223, pruned_loss=0.04125, over 973112.26 frames.], batch size: 13, lr: 3.83e-04 2022-05-05 06:25:17,952 INFO [train.py:715] (0/8) Epoch 5, batch 17500, loss[loss=0.1547, simple_loss=0.2345, pruned_loss=0.03744, over 4813.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2227, pruned_loss=0.04109, over 972544.59 frames.], batch size: 25, lr: 3.83e-04 2022-05-05 06:25:56,805 INFO [train.py:715] (0/8) Epoch 5, batch 17550, loss[loss=0.1504, simple_loss=0.2334, pruned_loss=0.03365, over 4963.00 frames.], tot_loss[loss=0.152, simple_loss=0.2224, pruned_loss=0.04079, over 972281.88 frames.], batch size: 24, lr: 3.83e-04 2022-05-05 06:26:35,439 INFO [train.py:715] (0/8) Epoch 5, batch 17600, loss[loss=0.1749, simple_loss=0.2495, pruned_loss=0.05017, over 4744.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2226, pruned_loss=0.04119, over 971209.31 frames.], batch size: 16, lr: 3.83e-04 2022-05-05 06:27:14,152 INFO [train.py:715] (0/8) Epoch 5, batch 17650, loss[loss=0.1471, simple_loss=0.22, pruned_loss=0.03715, over 4762.00 frames.], 
tot_loss[loss=0.1511, simple_loss=0.2215, pruned_loss=0.04035, over 971317.77 frames.], batch size: 17, lr: 3.83e-04 2022-05-05 06:27:52,806 INFO [train.py:715] (0/8) Epoch 5, batch 17700, loss[loss=0.1393, simple_loss=0.2203, pruned_loss=0.02921, over 4984.00 frames.], tot_loss[loss=0.1521, simple_loss=0.2222, pruned_loss=0.04099, over 971715.93 frames.], batch size: 27, lr: 3.83e-04 2022-05-05 06:28:31,725 INFO [train.py:715] (0/8) Epoch 5, batch 17750, loss[loss=0.1499, simple_loss=0.2156, pruned_loss=0.04207, over 4866.00 frames.], tot_loss[loss=0.1516, simple_loss=0.2215, pruned_loss=0.04082, over 972822.19 frames.], batch size: 34, lr: 3.83e-04 2022-05-05 06:29:09,750 INFO [train.py:715] (0/8) Epoch 5, batch 17800, loss[loss=0.1697, simple_loss=0.2197, pruned_loss=0.05984, over 4855.00 frames.], tot_loss[loss=0.152, simple_loss=0.222, pruned_loss=0.041, over 973490.73 frames.], batch size: 12, lr: 3.83e-04 2022-05-05 06:29:48,581 INFO [train.py:715] (0/8) Epoch 5, batch 17850, loss[loss=0.1356, simple_loss=0.2193, pruned_loss=0.026, over 4889.00 frames.], tot_loss[loss=0.1526, simple_loss=0.223, pruned_loss=0.04114, over 972675.42 frames.], batch size: 22, lr: 3.83e-04 2022-05-05 06:30:27,676 INFO [train.py:715] (0/8) Epoch 5, batch 17900, loss[loss=0.2046, simple_loss=0.281, pruned_loss=0.06415, over 4769.00 frames.], tot_loss[loss=0.1528, simple_loss=0.2233, pruned_loss=0.0411, over 972615.33 frames.], batch size: 17, lr: 3.83e-04 2022-05-05 06:31:06,330 INFO [train.py:715] (0/8) Epoch 5, batch 17950, loss[loss=0.1216, simple_loss=0.2011, pruned_loss=0.02101, over 4826.00 frames.], tot_loss[loss=0.1527, simple_loss=0.2235, pruned_loss=0.04092, over 972785.76 frames.], batch size: 26, lr: 3.83e-04 2022-05-05 06:31:07,854 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-192000.pt 2022-05-05 06:31:47,052 INFO [train.py:715] (0/8) Epoch 5, batch 18000, loss[loss=0.1304, simple_loss=0.2045, pruned_loss=0.02812, over 4916.00 frames.], tot_loss[loss=0.1517, simple_loss=0.2224, pruned_loss=0.04053, over 972926.05 frames.], batch size: 18, lr: 3.83e-04 2022-05-05 06:31:47,053 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 06:31:59,754 INFO [train.py:742] (0/8) Epoch 5, validation: loss=0.1102, simple_loss=0.1955, pruned_loss=0.01245, over 914524.00 frames. 
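
The checkpoint-192000.pt save in the record above sits exactly 8000 batches after checkpoint-184000.pt, i.e. one save interval apart. For poking at one of these batch-level checkpoints offline, a plain torch.load is enough; the key names mentioned in the comment are the ones such checkpoints usually carry, but treat them as assumptions rather than a guaranteed schema.

```python
import torch

# Hypothetical offline inspection of a batch-level checkpoint saved above.
ckpt = torch.load(
    "pruned_transducer_stateless2/exp/v2/checkpoint-192000.pt",
    map_location="cpu",
)
# Typically includes entries such as 'model', 'optimizer', 'scheduler',
# 'grad_scaler', 'sampler' and bookkeeping fields like 'batch_idx_train';
# verify against your own checkpoint.
print(sorted(ckpt.keys()))
```
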
2022-05-05 06:32:38,353 INFO [train.py:715] (0/8) Epoch 5, batch 18050, loss[loss=0.1441, simple_loss=0.2135, pruned_loss=0.03735, over 4805.00 frames.], tot_loss[loss=0.1526, simple_loss=0.2231, pruned_loss=0.0411, over 973058.45 frames.], batch size: 25, lr: 3.82e-04 2022-05-05 06:33:17,593 INFO [train.py:715] (0/8) Epoch 5, batch 18100, loss[loss=0.1588, simple_loss=0.2311, pruned_loss=0.04328, over 4970.00 frames.], tot_loss[loss=0.1534, simple_loss=0.2235, pruned_loss=0.04162, over 973422.53 frames.], batch size: 28, lr: 3.82e-04 2022-05-05 06:33:56,334 INFO [train.py:715] (0/8) Epoch 5, batch 18150, loss[loss=0.14, simple_loss=0.2103, pruned_loss=0.0349, over 4875.00 frames.], tot_loss[loss=0.1524, simple_loss=0.2226, pruned_loss=0.04108, over 972897.70 frames.], batch size: 32, lr: 3.82e-04 2022-05-05 06:34:34,854 INFO [train.py:715] (0/8) Epoch 5, batch 18200, loss[loss=0.1383, simple_loss=0.2093, pruned_loss=0.03368, over 4813.00 frames.], tot_loss[loss=0.1514, simple_loss=0.222, pruned_loss=0.04036, over 971154.95 frames.], batch size: 25, lr: 3.82e-04 2022-05-05 06:35:14,238 INFO [train.py:715] (0/8) Epoch 5, batch 18250, loss[loss=0.1945, simple_loss=0.2653, pruned_loss=0.06191, over 4828.00 frames.], tot_loss[loss=0.1505, simple_loss=0.2212, pruned_loss=0.03988, over 971535.27 frames.], batch size: 15, lr: 3.82e-04 2022-05-05 06:35:53,135 INFO [train.py:715] (0/8) Epoch 5, batch 18300, loss[loss=0.1475, simple_loss=0.2247, pruned_loss=0.03518, over 4820.00 frames.], tot_loss[loss=0.1513, simple_loss=0.2223, pruned_loss=0.04015, over 972192.85 frames.], batch size: 13, lr: 3.82e-04 2022-05-05 06:36:31,707 INFO [train.py:715] (0/8) Epoch 5, batch 18350, loss[loss=0.1545, simple_loss=0.2298, pruned_loss=0.03964, over 4878.00 frames.], tot_loss[loss=0.1504, simple_loss=0.2214, pruned_loss=0.03968, over 971595.75 frames.], batch size: 22, lr: 3.82e-04 2022-05-05 06:37:09,997 INFO [train.py:715] (0/8) Epoch 5, batch 18400, loss[loss=0.1644, simple_loss=0.2415, pruned_loss=0.04366, over 4944.00 frames.], tot_loss[loss=0.1507, simple_loss=0.2218, pruned_loss=0.03979, over 971605.65 frames.], batch size: 39, lr: 3.82e-04 2022-05-05 06:37:49,155 INFO [train.py:715] (0/8) Epoch 5, batch 18450, loss[loss=0.1681, simple_loss=0.2365, pruned_loss=0.04987, over 4893.00 frames.], tot_loss[loss=0.1499, simple_loss=0.2207, pruned_loss=0.03952, over 971498.77 frames.], batch size: 39, lr: 3.82e-04 2022-05-05 06:38:27,813 INFO [train.py:715] (0/8) Epoch 5, batch 18500, loss[loss=0.1462, simple_loss=0.2255, pruned_loss=0.03341, over 4884.00 frames.], tot_loss[loss=0.1509, simple_loss=0.2216, pruned_loss=0.04012, over 970779.21 frames.], batch size: 16, lr: 3.82e-04 2022-05-05 06:39:06,122 INFO [train.py:715] (0/8) Epoch 5, batch 18550, loss[loss=0.1676, simple_loss=0.2305, pruned_loss=0.05234, over 4792.00 frames.], tot_loss[loss=0.1513, simple_loss=0.2215, pruned_loss=0.04049, over 971559.18 frames.], batch size: 17, lr: 3.82e-04 2022-05-05 06:39:45,168 INFO [train.py:715] (0/8) Epoch 5, batch 18600, loss[loss=0.1292, simple_loss=0.2093, pruned_loss=0.02458, over 4816.00 frames.], tot_loss[loss=0.1505, simple_loss=0.2209, pruned_loss=0.04008, over 971444.95 frames.], batch size: 27, lr: 3.82e-04 2022-05-05 06:40:23,777 INFO [train.py:715] (0/8) Epoch 5, batch 18650, loss[loss=0.1304, simple_loss=0.2018, pruned_loss=0.02946, over 4890.00 frames.], tot_loss[loss=0.1507, simple_loss=0.2212, pruned_loss=0.04012, over 971600.71 frames.], batch size: 22, lr: 3.82e-04 2022-05-05 06:41:01,934 INFO 
[train.py:715] (0/8) Epoch 5, batch 18700, loss[loss=0.1604, simple_loss=0.229, pruned_loss=0.04594, over 4779.00 frames.], tot_loss[loss=0.1513, simple_loss=0.2217, pruned_loss=0.04041, over 971449.68 frames.], batch size: 18, lr: 3.82e-04 2022-05-05 06:41:40,674 INFO [train.py:715] (0/8) Epoch 5, batch 18750, loss[loss=0.1578, simple_loss=0.2254, pruned_loss=0.04508, over 4969.00 frames.], tot_loss[loss=0.1524, simple_loss=0.2227, pruned_loss=0.04099, over 971765.81 frames.], batch size: 15, lr: 3.82e-04 2022-05-05 06:42:19,952 INFO [train.py:715] (0/8) Epoch 5, batch 18800, loss[loss=0.1443, simple_loss=0.2209, pruned_loss=0.03389, over 4763.00 frames.], tot_loss[loss=0.1529, simple_loss=0.2234, pruned_loss=0.04124, over 972166.81 frames.], batch size: 18, lr: 3.82e-04 2022-05-05 06:42:59,656 INFO [train.py:715] (0/8) Epoch 5, batch 18850, loss[loss=0.1222, simple_loss=0.184, pruned_loss=0.0302, over 4980.00 frames.], tot_loss[loss=0.1539, simple_loss=0.2243, pruned_loss=0.04173, over 971394.16 frames.], batch size: 14, lr: 3.82e-04 2022-05-05 06:43:38,445 INFO [train.py:715] (0/8) Epoch 5, batch 18900, loss[loss=0.1272, simple_loss=0.2021, pruned_loss=0.0261, over 4776.00 frames.], tot_loss[loss=0.1538, simple_loss=0.2245, pruned_loss=0.04157, over 971784.66 frames.], batch size: 14, lr: 3.82e-04 2022-05-05 06:44:16,641 INFO [train.py:715] (0/8) Epoch 5, batch 18950, loss[loss=0.14, simple_loss=0.2174, pruned_loss=0.03132, over 4849.00 frames.], tot_loss[loss=0.1536, simple_loss=0.2242, pruned_loss=0.04152, over 971594.23 frames.], batch size: 30, lr: 3.82e-04 2022-05-05 06:44:56,112 INFO [train.py:715] (0/8) Epoch 5, batch 19000, loss[loss=0.1309, simple_loss=0.2118, pruned_loss=0.02504, over 4759.00 frames.], tot_loss[loss=0.1529, simple_loss=0.2236, pruned_loss=0.04108, over 971584.29 frames.], batch size: 19, lr: 3.82e-04 2022-05-05 06:45:34,087 INFO [train.py:715] (0/8) Epoch 5, batch 19050, loss[loss=0.1296, simple_loss=0.2076, pruned_loss=0.02581, over 4818.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2234, pruned_loss=0.04079, over 971665.91 frames.], batch size: 13, lr: 3.81e-04 2022-05-05 06:46:13,036 INFO [train.py:715] (0/8) Epoch 5, batch 19100, loss[loss=0.1404, simple_loss=0.2187, pruned_loss=0.03106, over 4891.00 frames.], tot_loss[loss=0.1522, simple_loss=0.2229, pruned_loss=0.04072, over 972212.33 frames.], batch size: 20, lr: 3.81e-04 2022-05-05 06:46:52,733 INFO [train.py:715] (0/8) Epoch 5, batch 19150, loss[loss=0.1577, simple_loss=0.2196, pruned_loss=0.0479, over 4887.00 frames.], tot_loss[loss=0.1508, simple_loss=0.222, pruned_loss=0.03982, over 971478.83 frames.], batch size: 32, lr: 3.81e-04 2022-05-05 06:47:31,312 INFO [train.py:715] (0/8) Epoch 5, batch 19200, loss[loss=0.1264, simple_loss=0.203, pruned_loss=0.02486, over 4860.00 frames.], tot_loss[loss=0.1503, simple_loss=0.2214, pruned_loss=0.03959, over 971903.09 frames.], batch size: 20, lr: 3.81e-04 2022-05-05 06:48:10,844 INFO [train.py:715] (0/8) Epoch 5, batch 19250, loss[loss=0.1592, simple_loss=0.2356, pruned_loss=0.04141, over 4968.00 frames.], tot_loss[loss=0.1511, simple_loss=0.2218, pruned_loss=0.04013, over 972819.28 frames.], batch size: 29, lr: 3.81e-04 2022-05-05 06:48:48,906 INFO [train.py:715] (0/8) Epoch 5, batch 19300, loss[loss=0.1901, simple_loss=0.276, pruned_loss=0.05212, over 4979.00 frames.], tot_loss[loss=0.1516, simple_loss=0.2225, pruned_loss=0.0404, over 973654.37 frames.], batch size: 28, lr: 3.81e-04 2022-05-05 06:49:27,997 INFO [train.py:715] (0/8) Epoch 5, batch 19350, 
loss[loss=0.1239, simple_loss=0.1992, pruned_loss=0.02435, over 4817.00 frames.], tot_loss[loss=0.1511, simple_loss=0.2221, pruned_loss=0.04001, over 972692.34 frames.], batch size: 25, lr: 3.81e-04 2022-05-05 06:50:06,756 INFO [train.py:715] (0/8) Epoch 5, batch 19400, loss[loss=0.1496, simple_loss=0.2188, pruned_loss=0.04015, over 4982.00 frames.], tot_loss[loss=0.1511, simple_loss=0.2218, pruned_loss=0.0402, over 973577.68 frames.], batch size: 25, lr: 3.81e-04 2022-05-05 06:50:45,412 INFO [train.py:715] (0/8) Epoch 5, batch 19450, loss[loss=0.1594, simple_loss=0.2287, pruned_loss=0.04508, over 4934.00 frames.], tot_loss[loss=0.151, simple_loss=0.2216, pruned_loss=0.04019, over 973461.17 frames.], batch size: 23, lr: 3.81e-04 2022-05-05 06:51:25,049 INFO [train.py:715] (0/8) Epoch 5, batch 19500, loss[loss=0.1687, simple_loss=0.2392, pruned_loss=0.04916, over 4818.00 frames.], tot_loss[loss=0.1514, simple_loss=0.2218, pruned_loss=0.0405, over 973326.40 frames.], batch size: 26, lr: 3.81e-04 2022-05-05 06:52:03,847 INFO [train.py:715] (0/8) Epoch 5, batch 19550, loss[loss=0.158, simple_loss=0.2309, pruned_loss=0.04253, over 4888.00 frames.], tot_loss[loss=0.1514, simple_loss=0.2217, pruned_loss=0.04057, over 973628.97 frames.], batch size: 19, lr: 3.81e-04 2022-05-05 06:52:42,734 INFO [train.py:715] (0/8) Epoch 5, batch 19600, loss[loss=0.1452, simple_loss=0.218, pruned_loss=0.03616, over 4973.00 frames.], tot_loss[loss=0.1523, simple_loss=0.2224, pruned_loss=0.0411, over 973770.50 frames.], batch size: 14, lr: 3.81e-04 2022-05-05 06:53:21,188 INFO [train.py:715] (0/8) Epoch 5, batch 19650, loss[loss=0.1516, simple_loss=0.2172, pruned_loss=0.04305, over 4813.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2227, pruned_loss=0.04112, over 973265.46 frames.], batch size: 25, lr: 3.81e-04 2022-05-05 06:54:00,672 INFO [train.py:715] (0/8) Epoch 5, batch 19700, loss[loss=0.1645, simple_loss=0.227, pruned_loss=0.05102, over 4797.00 frames.], tot_loss[loss=0.1516, simple_loss=0.2222, pruned_loss=0.04049, over 973259.84 frames.], batch size: 12, lr: 3.81e-04 2022-05-05 06:54:39,902 INFO [train.py:715] (0/8) Epoch 5, batch 19750, loss[loss=0.1379, simple_loss=0.1963, pruned_loss=0.03972, over 4708.00 frames.], tot_loss[loss=0.1522, simple_loss=0.2225, pruned_loss=0.04092, over 972669.83 frames.], batch size: 12, lr: 3.81e-04 2022-05-05 06:55:17,842 INFO [train.py:715] (0/8) Epoch 5, batch 19800, loss[loss=0.1622, simple_loss=0.2235, pruned_loss=0.05052, over 4776.00 frames.], tot_loss[loss=0.1528, simple_loss=0.2229, pruned_loss=0.04134, over 973103.08 frames.], batch size: 14, lr: 3.81e-04 2022-05-05 06:55:56,844 INFO [train.py:715] (0/8) Epoch 5, batch 19850, loss[loss=0.1534, simple_loss=0.2232, pruned_loss=0.04178, over 4816.00 frames.], tot_loss[loss=0.152, simple_loss=0.2222, pruned_loss=0.04095, over 972565.81 frames.], batch size: 26, lr: 3.81e-04 2022-05-05 06:56:35,745 INFO [train.py:715] (0/8) Epoch 5, batch 19900, loss[loss=0.1288, simple_loss=0.2095, pruned_loss=0.02399, over 4815.00 frames.], tot_loss[loss=0.1521, simple_loss=0.2223, pruned_loss=0.041, over 972272.66 frames.], batch size: 26, lr: 3.81e-04 2022-05-05 06:57:14,678 INFO [train.py:715] (0/8) Epoch 5, batch 19950, loss[loss=0.1348, simple_loss=0.2077, pruned_loss=0.03096, over 4769.00 frames.], tot_loss[loss=0.1523, simple_loss=0.2224, pruned_loss=0.04113, over 972500.51 frames.], batch size: 16, lr: 3.81e-04 2022-05-05 06:57:53,092 INFO [train.py:715] (0/8) Epoch 5, batch 20000, loss[loss=0.1243, simple_loss=0.1986, 
pruned_loss=0.02499, over 4753.00 frames.], tot_loss[loss=0.1504, simple_loss=0.2208, pruned_loss=0.04004, over 972068.43 frames.], batch size: 19, lr: 3.81e-04 2022-05-05 06:58:32,596 INFO [train.py:715] (0/8) Epoch 5, batch 20050, loss[loss=0.1698, simple_loss=0.2332, pruned_loss=0.05321, over 4826.00 frames.], tot_loss[loss=0.1499, simple_loss=0.2203, pruned_loss=0.03978, over 972372.94 frames.], batch size: 15, lr: 3.81e-04 2022-05-05 06:59:12,129 INFO [train.py:715] (0/8) Epoch 5, batch 20100, loss[loss=0.1611, simple_loss=0.2333, pruned_loss=0.04447, over 4811.00 frames.], tot_loss[loss=0.1503, simple_loss=0.2208, pruned_loss=0.03987, over 972473.48 frames.], batch size: 27, lr: 3.80e-04 2022-05-05 06:59:50,435 INFO [train.py:715] (0/8) Epoch 5, batch 20150, loss[loss=0.1398, simple_loss=0.2109, pruned_loss=0.0344, over 4970.00 frames.], tot_loss[loss=0.1503, simple_loss=0.2207, pruned_loss=0.03991, over 973196.31 frames.], batch size: 14, lr: 3.80e-04 2022-05-05 07:00:30,255 INFO [train.py:715] (0/8) Epoch 5, batch 20200, loss[loss=0.1659, simple_loss=0.225, pruned_loss=0.05344, over 4839.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2202, pruned_loss=0.03944, over 972934.69 frames.], batch size: 30, lr: 3.80e-04 2022-05-05 07:01:09,274 INFO [train.py:715] (0/8) Epoch 5, batch 20250, loss[loss=0.1522, simple_loss=0.2205, pruned_loss=0.04198, over 4901.00 frames.], tot_loss[loss=0.1498, simple_loss=0.2202, pruned_loss=0.03964, over 972962.51 frames.], batch size: 17, lr: 3.80e-04 2022-05-05 07:01:47,787 INFO [train.py:715] (0/8) Epoch 5, batch 20300, loss[loss=0.1505, simple_loss=0.2282, pruned_loss=0.0364, over 4969.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2199, pruned_loss=0.03963, over 972953.19 frames.], batch size: 15, lr: 3.80e-04 2022-05-05 07:02:25,746 INFO [train.py:715] (0/8) Epoch 5, batch 20350, loss[loss=0.1405, simple_loss=0.2074, pruned_loss=0.03685, over 4922.00 frames.], tot_loss[loss=0.1503, simple_loss=0.2205, pruned_loss=0.04006, over 972476.15 frames.], batch size: 17, lr: 3.80e-04 2022-05-05 07:03:04,303 INFO [train.py:715] (0/8) Epoch 5, batch 20400, loss[loss=0.1387, simple_loss=0.2131, pruned_loss=0.03212, over 4835.00 frames.], tot_loss[loss=0.1508, simple_loss=0.2215, pruned_loss=0.04002, over 972838.42 frames.], batch size: 27, lr: 3.80e-04 2022-05-05 07:03:43,171 INFO [train.py:715] (0/8) Epoch 5, batch 20450, loss[loss=0.1464, simple_loss=0.2265, pruned_loss=0.03315, over 4920.00 frames.], tot_loss[loss=0.1522, simple_loss=0.2228, pruned_loss=0.04078, over 972789.27 frames.], batch size: 23, lr: 3.80e-04 2022-05-05 07:04:21,311 INFO [train.py:715] (0/8) Epoch 5, batch 20500, loss[loss=0.1363, simple_loss=0.2115, pruned_loss=0.03051, over 4958.00 frames.], tot_loss[loss=0.1521, simple_loss=0.2225, pruned_loss=0.0408, over 972376.67 frames.], batch size: 24, lr: 3.80e-04 2022-05-05 07:05:00,715 INFO [train.py:715] (0/8) Epoch 5, batch 20550, loss[loss=0.1623, simple_loss=0.2194, pruned_loss=0.05259, over 4943.00 frames.], tot_loss[loss=0.1522, simple_loss=0.2226, pruned_loss=0.04087, over 972500.12 frames.], batch size: 21, lr: 3.80e-04 2022-05-05 07:05:39,982 INFO [train.py:715] (0/8) Epoch 5, batch 20600, loss[loss=0.1235, simple_loss=0.2014, pruned_loss=0.02284, over 4925.00 frames.], tot_loss[loss=0.1522, simple_loss=0.2229, pruned_loss=0.04071, over 972386.41 frames.], batch size: 18, lr: 3.80e-04 2022-05-05 07:06:18,973 INFO [train.py:715] (0/8) Epoch 5, batch 20650, loss[loss=0.1699, simple_loss=0.2388, pruned_loss=0.05048, over 4930.00 
frames.], tot_loss[loss=0.1526, simple_loss=0.2232, pruned_loss=0.04104, over 973143.94 frames.], batch size: 29, lr: 3.80e-04 2022-05-05 07:06:58,190 INFO [train.py:715] (0/8) Epoch 5, batch 20700, loss[loss=0.1951, simple_loss=0.2478, pruned_loss=0.0712, over 4841.00 frames.], tot_loss[loss=0.152, simple_loss=0.2228, pruned_loss=0.04058, over 972713.74 frames.], batch size: 13, lr: 3.80e-04 2022-05-05 07:07:36,954 INFO [train.py:715] (0/8) Epoch 5, batch 20750, loss[loss=0.1516, simple_loss=0.2112, pruned_loss=0.04596, over 4652.00 frames.], tot_loss[loss=0.1512, simple_loss=0.2222, pruned_loss=0.04005, over 973412.60 frames.], batch size: 13, lr: 3.80e-04 2022-05-05 07:08:16,381 INFO [train.py:715] (0/8) Epoch 5, batch 20800, loss[loss=0.1701, simple_loss=0.2379, pruned_loss=0.05116, over 4874.00 frames.], tot_loss[loss=0.1513, simple_loss=0.2219, pruned_loss=0.0404, over 972646.34 frames.], batch size: 16, lr: 3.80e-04 2022-05-05 07:08:55,021 INFO [train.py:715] (0/8) Epoch 5, batch 20850, loss[loss=0.1754, simple_loss=0.2503, pruned_loss=0.05021, over 4763.00 frames.], tot_loss[loss=0.1514, simple_loss=0.222, pruned_loss=0.04041, over 971672.45 frames.], batch size: 18, lr: 3.80e-04 2022-05-05 07:09:34,329 INFO [train.py:715] (0/8) Epoch 5, batch 20900, loss[loss=0.137, simple_loss=0.2083, pruned_loss=0.03292, over 4969.00 frames.], tot_loss[loss=0.151, simple_loss=0.2217, pruned_loss=0.04014, over 971933.78 frames.], batch size: 15, lr: 3.80e-04 2022-05-05 07:10:12,900 INFO [train.py:715] (0/8) Epoch 5, batch 20950, loss[loss=0.1454, simple_loss=0.215, pruned_loss=0.03787, over 4802.00 frames.], tot_loss[loss=0.1511, simple_loss=0.2216, pruned_loss=0.04025, over 971433.13 frames.], batch size: 21, lr: 3.80e-04 2022-05-05 07:10:51,484 INFO [train.py:715] (0/8) Epoch 5, batch 21000, loss[loss=0.1432, simple_loss=0.2125, pruned_loss=0.03689, over 4870.00 frames.], tot_loss[loss=0.1519, simple_loss=0.2224, pruned_loss=0.04075, over 971215.95 frames.], batch size: 30, lr: 3.80e-04 2022-05-05 07:10:51,485 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 07:11:01,470 INFO [train.py:742] (0/8) Epoch 5, validation: loss=0.1101, simple_loss=0.1954, pruned_loss=0.01242, over 914524.00 frames. 
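Note on the running averages above: the tot_loss window stays near 970k-975k frames throughout instead of growing with every batch, so the running statistics are evidently kept over a bounded, continuously refreshed window rather than as a cumulative mean since the start of the epoch. Below is a minimal Python sketch of one way to keep such frame-weighted running averages; the decaying-accumulator form, its decay constant, and the class name are assumptions for illustration, not the project's actual tracker.

    # Hypothetical frame-weighted running average with a decay term; illustrative
    # only, not the actual metrics tracker used to produce the log above.
    class RunningLoss:
        def __init__(self, decay: float = 0.995):
            # `decay` is an assumed constant that sets the effective window size.
            self.decay = decay
            self.frames = 0.0
            self.sums = {"loss": 0.0, "simple_loss": 0.0, "pruned_loss": 0.0}

        def update(self, per_frame_losses: dict, num_frames: float) -> None:
            # Shrink the existing window, then add this batch weighted by its frame count.
            self.frames = self.frames * self.decay + num_frames
            for name, value in per_frame_losses.items():
                self.sums[name] = self.sums[name] * self.decay + value * num_frames

        def averages(self) -> dict:
            return {name: s / self.frames for name, s in self.sums.items()}

    # One update using the batch-21000 entry logged above (4870 frames).
    tracker = RunningLoss()
    tracker.update({"loss": 0.1432, "simple_loss": 0.2125, "pruned_loss": 0.03689}, 4870.0)
    print(tracker.averages())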
2022-05-05 07:11:40,512 INFO [train.py:715] (0/8) Epoch 5, batch 21050, loss[loss=0.1592, simple_loss=0.218, pruned_loss=0.05018, over 4854.00 frames.], tot_loss[loss=0.1536, simple_loss=0.2241, pruned_loss=0.04149, over 971675.95 frames.], batch size: 15, lr: 3.80e-04 2022-05-05 07:12:19,696 INFO [train.py:715] (0/8) Epoch 5, batch 21100, loss[loss=0.1587, simple_loss=0.2341, pruned_loss=0.04158, over 4932.00 frames.], tot_loss[loss=0.153, simple_loss=0.2236, pruned_loss=0.04115, over 971475.47 frames.], batch size: 35, lr: 3.79e-04 2022-05-05 07:12:58,339 INFO [train.py:715] (0/8) Epoch 5, batch 21150, loss[loss=0.1649, simple_loss=0.2279, pruned_loss=0.05097, over 4650.00 frames.], tot_loss[loss=0.1524, simple_loss=0.2232, pruned_loss=0.04074, over 970975.21 frames.], batch size: 13, lr: 3.79e-04 2022-05-05 07:13:37,164 INFO [train.py:715] (0/8) Epoch 5, batch 21200, loss[loss=0.1764, simple_loss=0.2513, pruned_loss=0.05078, over 4891.00 frames.], tot_loss[loss=0.1519, simple_loss=0.2228, pruned_loss=0.04046, over 971065.48 frames.], batch size: 22, lr: 3.79e-04 2022-05-05 07:14:15,840 INFO [train.py:715] (0/8) Epoch 5, batch 21250, loss[loss=0.1628, simple_loss=0.2227, pruned_loss=0.05144, over 4800.00 frames.], tot_loss[loss=0.1515, simple_loss=0.2226, pruned_loss=0.04019, over 971638.10 frames.], batch size: 21, lr: 3.79e-04 2022-05-05 07:14:54,659 INFO [train.py:715] (0/8) Epoch 5, batch 21300, loss[loss=0.1593, simple_loss=0.233, pruned_loss=0.04277, over 4783.00 frames.], tot_loss[loss=0.1518, simple_loss=0.2232, pruned_loss=0.04023, over 971703.53 frames.], batch size: 18, lr: 3.79e-04 2022-05-05 07:15:33,332 INFO [train.py:715] (0/8) Epoch 5, batch 21350, loss[loss=0.1424, simple_loss=0.2202, pruned_loss=0.03229, over 4935.00 frames.], tot_loss[loss=0.151, simple_loss=0.222, pruned_loss=0.03996, over 971874.78 frames.], batch size: 21, lr: 3.79e-04 2022-05-05 07:16:11,910 INFO [train.py:715] (0/8) Epoch 5, batch 21400, loss[loss=0.1531, simple_loss=0.2363, pruned_loss=0.03494, over 4880.00 frames.], tot_loss[loss=0.1513, simple_loss=0.2224, pruned_loss=0.04009, over 973031.90 frames.], batch size: 22, lr: 3.79e-04 2022-05-05 07:16:50,971 INFO [train.py:715] (0/8) Epoch 5, batch 21450, loss[loss=0.1308, simple_loss=0.1998, pruned_loss=0.03094, over 4989.00 frames.], tot_loss[loss=0.1512, simple_loss=0.222, pruned_loss=0.04013, over 973171.39 frames.], batch size: 28, lr: 3.79e-04 2022-05-05 07:17:29,097 INFO [train.py:715] (0/8) Epoch 5, batch 21500, loss[loss=0.1446, simple_loss=0.2171, pruned_loss=0.03611, over 4972.00 frames.], tot_loss[loss=0.1507, simple_loss=0.2215, pruned_loss=0.03993, over 973774.85 frames.], batch size: 24, lr: 3.79e-04 2022-05-05 07:18:08,220 INFO [train.py:715] (0/8) Epoch 5, batch 21550, loss[loss=0.1601, simple_loss=0.2263, pruned_loss=0.04691, over 4920.00 frames.], tot_loss[loss=0.1512, simple_loss=0.2218, pruned_loss=0.04036, over 974094.19 frames.], batch size: 18, lr: 3.79e-04 2022-05-05 07:18:46,741 INFO [train.py:715] (0/8) Epoch 5, batch 21600, loss[loss=0.1842, simple_loss=0.2492, pruned_loss=0.05965, over 4848.00 frames.], tot_loss[loss=0.1504, simple_loss=0.2211, pruned_loss=0.03989, over 973956.36 frames.], batch size: 34, lr: 3.79e-04 2022-05-05 07:19:25,820 INFO [train.py:715] (0/8) Epoch 5, batch 21650, loss[loss=0.1455, simple_loss=0.2216, pruned_loss=0.03473, over 4898.00 frames.], tot_loss[loss=0.1505, simple_loss=0.2209, pruned_loss=0.03998, over 973964.33 frames.], batch size: 19, lr: 3.79e-04 2022-05-05 07:20:04,068 INFO 
[train.py:715] (0/8) Epoch 5, batch 21700, loss[loss=0.1503, simple_loss=0.2248, pruned_loss=0.03783, over 4965.00 frames.], tot_loss[loss=0.151, simple_loss=0.2215, pruned_loss=0.04023, over 973553.69 frames.], batch size: 15, lr: 3.79e-04 2022-05-05 07:20:42,462 INFO [train.py:715] (0/8) Epoch 5, batch 21750, loss[loss=0.1502, simple_loss=0.2258, pruned_loss=0.03725, over 4788.00 frames.], tot_loss[loss=0.1516, simple_loss=0.2219, pruned_loss=0.04063, over 973972.07 frames.], batch size: 17, lr: 3.79e-04 2022-05-05 07:21:20,814 INFO [train.py:715] (0/8) Epoch 5, batch 21800, loss[loss=0.1259, simple_loss=0.1917, pruned_loss=0.03006, over 4899.00 frames.], tot_loss[loss=0.1511, simple_loss=0.2215, pruned_loss=0.04037, over 973405.34 frames.], batch size: 18, lr: 3.79e-04 2022-05-05 07:22:00,028 INFO [train.py:715] (0/8) Epoch 5, batch 21850, loss[loss=0.1573, simple_loss=0.2351, pruned_loss=0.03977, over 4815.00 frames.], tot_loss[loss=0.1521, simple_loss=0.2227, pruned_loss=0.04077, over 973521.87 frames.], batch size: 25, lr: 3.79e-04 2022-05-05 07:22:38,254 INFO [train.py:715] (0/8) Epoch 5, batch 21900, loss[loss=0.1297, simple_loss=0.2017, pruned_loss=0.02891, over 4803.00 frames.], tot_loss[loss=0.1529, simple_loss=0.2232, pruned_loss=0.0413, over 974118.38 frames.], batch size: 24, lr: 3.79e-04 2022-05-05 07:23:16,808 INFO [train.py:715] (0/8) Epoch 5, batch 21950, loss[loss=0.1353, simple_loss=0.21, pruned_loss=0.03025, over 4796.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2227, pruned_loss=0.04115, over 973390.46 frames.], batch size: 24, lr: 3.79e-04 2022-05-05 07:23:55,215 INFO [train.py:715] (0/8) Epoch 5, batch 22000, loss[loss=0.1653, simple_loss=0.2254, pruned_loss=0.05262, over 4835.00 frames.], tot_loss[loss=0.1529, simple_loss=0.2234, pruned_loss=0.04123, over 974108.81 frames.], batch size: 30, lr: 3.79e-04 2022-05-05 07:24:34,721 INFO [train.py:715] (0/8) Epoch 5, batch 22050, loss[loss=0.1381, simple_loss=0.2151, pruned_loss=0.03059, over 4929.00 frames.], tot_loss[loss=0.1518, simple_loss=0.222, pruned_loss=0.04075, over 974142.06 frames.], batch size: 23, lr: 3.79e-04 2022-05-05 07:25:13,184 INFO [train.py:715] (0/8) Epoch 5, batch 22100, loss[loss=0.1903, simple_loss=0.27, pruned_loss=0.05533, over 4980.00 frames.], tot_loss[loss=0.1522, simple_loss=0.2226, pruned_loss=0.0409, over 973591.22 frames.], batch size: 15, lr: 3.79e-04 2022-05-05 07:25:52,413 INFO [train.py:715] (0/8) Epoch 5, batch 22150, loss[loss=0.1493, simple_loss=0.2212, pruned_loss=0.03872, over 4803.00 frames.], tot_loss[loss=0.1535, simple_loss=0.2235, pruned_loss=0.04177, over 973587.69 frames.], batch size: 21, lr: 3.78e-04 2022-05-05 07:26:31,442 INFO [train.py:715] (0/8) Epoch 5, batch 22200, loss[loss=0.1563, simple_loss=0.2273, pruned_loss=0.04263, over 4986.00 frames.], tot_loss[loss=0.1536, simple_loss=0.2233, pruned_loss=0.04195, over 973537.24 frames.], batch size: 28, lr: 3.78e-04 2022-05-05 07:27:11,163 INFO [train.py:715] (0/8) Epoch 5, batch 22250, loss[loss=0.1637, simple_loss=0.2325, pruned_loss=0.04742, over 4940.00 frames.], tot_loss[loss=0.1532, simple_loss=0.2234, pruned_loss=0.04155, over 973303.06 frames.], batch size: 21, lr: 3.78e-04 2022-05-05 07:27:50,339 INFO [train.py:715] (0/8) Epoch 5, batch 22300, loss[loss=0.1404, simple_loss=0.2095, pruned_loss=0.03567, over 4693.00 frames.], tot_loss[loss=0.154, simple_loss=0.224, pruned_loss=0.04194, over 971940.09 frames.], batch size: 15, lr: 3.78e-04 2022-05-05 07:28:28,457 INFO [train.py:715] (0/8) Epoch 5, batch 
22350, loss[loss=0.1359, simple_loss=0.2108, pruned_loss=0.03048, over 4930.00 frames.], tot_loss[loss=0.1528, simple_loss=0.2229, pruned_loss=0.04137, over 972657.33 frames.], batch size: 23, lr: 3.78e-04 2022-05-05 07:29:06,831 INFO [train.py:715] (0/8) Epoch 5, batch 22400, loss[loss=0.1741, simple_loss=0.2479, pruned_loss=0.05011, over 4883.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2233, pruned_loss=0.04086, over 972418.89 frames.], batch size: 19, lr: 3.78e-04 2022-05-05 07:29:45,741 INFO [train.py:715] (0/8) Epoch 5, batch 22450, loss[loss=0.1357, simple_loss=0.2049, pruned_loss=0.03328, over 4890.00 frames.], tot_loss[loss=0.1524, simple_loss=0.2232, pruned_loss=0.04084, over 973330.12 frames.], batch size: 19, lr: 3.78e-04 2022-05-05 07:30:25,208 INFO [train.py:715] (0/8) Epoch 5, batch 22500, loss[loss=0.1256, simple_loss=0.2059, pruned_loss=0.02266, over 4846.00 frames.], tot_loss[loss=0.1522, simple_loss=0.2231, pruned_loss=0.04067, over 973045.14 frames.], batch size: 15, lr: 3.78e-04 2022-05-05 07:31:03,485 INFO [train.py:715] (0/8) Epoch 5, batch 22550, loss[loss=0.16, simple_loss=0.2307, pruned_loss=0.04463, over 4844.00 frames.], tot_loss[loss=0.1513, simple_loss=0.2221, pruned_loss=0.04027, over 972526.31 frames.], batch size: 15, lr: 3.78e-04 2022-05-05 07:31:42,555 INFO [train.py:715] (0/8) Epoch 5, batch 22600, loss[loss=0.1097, simple_loss=0.1814, pruned_loss=0.01898, over 4771.00 frames.], tot_loss[loss=0.1519, simple_loss=0.2228, pruned_loss=0.04047, over 973173.98 frames.], batch size: 14, lr: 3.78e-04 2022-05-05 07:32:21,685 INFO [train.py:715] (0/8) Epoch 5, batch 22650, loss[loss=0.1355, simple_loss=0.2162, pruned_loss=0.02743, over 4924.00 frames.], tot_loss[loss=0.1505, simple_loss=0.2216, pruned_loss=0.03976, over 973688.70 frames.], batch size: 18, lr: 3.78e-04 2022-05-05 07:33:00,842 INFO [train.py:715] (0/8) Epoch 5, batch 22700, loss[loss=0.1666, simple_loss=0.2259, pruned_loss=0.05363, over 4881.00 frames.], tot_loss[loss=0.1514, simple_loss=0.2222, pruned_loss=0.04031, over 974012.88 frames.], batch size: 16, lr: 3.78e-04 2022-05-05 07:33:39,167 INFO [train.py:715] (0/8) Epoch 5, batch 22750, loss[loss=0.1464, simple_loss=0.2128, pruned_loss=0.03998, over 4760.00 frames.], tot_loss[loss=0.1512, simple_loss=0.222, pruned_loss=0.0402, over 973140.13 frames.], batch size: 19, lr: 3.78e-04 2022-05-05 07:34:18,364 INFO [train.py:715] (0/8) Epoch 5, batch 22800, loss[loss=0.1772, simple_loss=0.2426, pruned_loss=0.05595, over 4888.00 frames.], tot_loss[loss=0.1521, simple_loss=0.2225, pruned_loss=0.04081, over 973874.27 frames.], batch size: 16, lr: 3.78e-04 2022-05-05 07:34:57,940 INFO [train.py:715] (0/8) Epoch 5, batch 22850, loss[loss=0.1496, simple_loss=0.2223, pruned_loss=0.0385, over 4881.00 frames.], tot_loss[loss=0.1545, simple_loss=0.2243, pruned_loss=0.04231, over 973349.45 frames.], batch size: 19, lr: 3.78e-04 2022-05-05 07:35:36,330 INFO [train.py:715] (0/8) Epoch 5, batch 22900, loss[loss=0.1408, simple_loss=0.2107, pruned_loss=0.03546, over 4864.00 frames.], tot_loss[loss=0.1542, simple_loss=0.2242, pruned_loss=0.04213, over 973346.35 frames.], batch size: 32, lr: 3.78e-04 2022-05-05 07:36:15,065 INFO [train.py:715] (0/8) Epoch 5, batch 22950, loss[loss=0.1569, simple_loss=0.2269, pruned_loss=0.04345, over 4978.00 frames.], tot_loss[loss=0.1549, simple_loss=0.2248, pruned_loss=0.04248, over 972322.60 frames.], batch size: 39, lr: 3.78e-04 2022-05-05 07:36:54,406 INFO [train.py:715] (0/8) Epoch 5, batch 23000, loss[loss=0.1621, 
simple_loss=0.2397, pruned_loss=0.04223, over 4922.00 frames.], tot_loss[loss=0.1541, simple_loss=0.224, pruned_loss=0.04209, over 972420.00 frames.], batch size: 23, lr: 3.78e-04 2022-05-05 07:37:33,571 INFO [train.py:715] (0/8) Epoch 5, batch 23050, loss[loss=0.1701, simple_loss=0.2413, pruned_loss=0.04946, over 4814.00 frames.], tot_loss[loss=0.1542, simple_loss=0.224, pruned_loss=0.04222, over 972626.48 frames.], batch size: 26, lr: 3.78e-04 2022-05-05 07:38:12,014 INFO [train.py:715] (0/8) Epoch 5, batch 23100, loss[loss=0.1451, simple_loss=0.2213, pruned_loss=0.03445, over 4854.00 frames.], tot_loss[loss=0.1539, simple_loss=0.2243, pruned_loss=0.04179, over 973199.40 frames.], batch size: 30, lr: 3.78e-04 2022-05-05 07:38:51,175 INFO [train.py:715] (0/8) Epoch 5, batch 23150, loss[loss=0.1483, simple_loss=0.2182, pruned_loss=0.03917, over 4794.00 frames.], tot_loss[loss=0.1535, simple_loss=0.2239, pruned_loss=0.04156, over 973568.21 frames.], batch size: 14, lr: 3.78e-04 2022-05-05 07:39:30,783 INFO [train.py:715] (0/8) Epoch 5, batch 23200, loss[loss=0.1402, simple_loss=0.2121, pruned_loss=0.03416, over 4936.00 frames.], tot_loss[loss=0.1518, simple_loss=0.2226, pruned_loss=0.04053, over 973467.24 frames.], batch size: 21, lr: 3.77e-04 2022-05-05 07:40:09,158 INFO [train.py:715] (0/8) Epoch 5, batch 23250, loss[loss=0.1603, simple_loss=0.2216, pruned_loss=0.04951, over 4846.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2229, pruned_loss=0.04106, over 973254.90 frames.], batch size: 32, lr: 3.77e-04 2022-05-05 07:40:47,779 INFO [train.py:715] (0/8) Epoch 5, batch 23300, loss[loss=0.1406, simple_loss=0.207, pruned_loss=0.03707, over 4792.00 frames.], tot_loss[loss=0.1517, simple_loss=0.2221, pruned_loss=0.04062, over 973022.17 frames.], batch size: 21, lr: 3.77e-04 2022-05-05 07:41:27,164 INFO [train.py:715] (0/8) Epoch 5, batch 23350, loss[loss=0.1358, simple_loss=0.2091, pruned_loss=0.03131, over 4956.00 frames.], tot_loss[loss=0.1507, simple_loss=0.2215, pruned_loss=0.03998, over 973206.95 frames.], batch size: 29, lr: 3.77e-04 2022-05-05 07:42:05,799 INFO [train.py:715] (0/8) Epoch 5, batch 23400, loss[loss=0.1462, simple_loss=0.2234, pruned_loss=0.03445, over 4802.00 frames.], tot_loss[loss=0.1504, simple_loss=0.2211, pruned_loss=0.03983, over 972887.46 frames.], batch size: 21, lr: 3.77e-04 2022-05-05 07:42:44,249 INFO [train.py:715] (0/8) Epoch 5, batch 23450, loss[loss=0.1915, simple_loss=0.2621, pruned_loss=0.06042, over 4954.00 frames.], tot_loss[loss=0.1515, simple_loss=0.2221, pruned_loss=0.04043, over 974034.11 frames.], batch size: 21, lr: 3.77e-04 2022-05-05 07:43:22,949 INFO [train.py:715] (0/8) Epoch 5, batch 23500, loss[loss=0.1294, simple_loss=0.2087, pruned_loss=0.02504, over 4917.00 frames.], tot_loss[loss=0.1514, simple_loss=0.2222, pruned_loss=0.04032, over 974122.89 frames.], batch size: 23, lr: 3.77e-04 2022-05-05 07:44:02,008 INFO [train.py:715] (0/8) Epoch 5, batch 23550, loss[loss=0.1282, simple_loss=0.2006, pruned_loss=0.02786, over 4942.00 frames.], tot_loss[loss=0.1512, simple_loss=0.2221, pruned_loss=0.04014, over 973943.33 frames.], batch size: 21, lr: 3.77e-04 2022-05-05 07:44:40,885 INFO [train.py:715] (0/8) Epoch 5, batch 23600, loss[loss=0.152, simple_loss=0.2202, pruned_loss=0.04189, over 4982.00 frames.], tot_loss[loss=0.1511, simple_loss=0.2216, pruned_loss=0.04034, over 973496.49 frames.], batch size: 28, lr: 3.77e-04 2022-05-05 07:45:19,391 INFO [train.py:715] (0/8) Epoch 5, batch 23650, loss[loss=0.1264, simple_loss=0.1988, 
pruned_loss=0.02704, over 4803.00 frames.], tot_loss[loss=0.1521, simple_loss=0.2225, pruned_loss=0.04085, over 971911.24 frames.], batch size: 24, lr: 3.77e-04 2022-05-05 07:45:58,895 INFO [train.py:715] (0/8) Epoch 5, batch 23700, loss[loss=0.1727, simple_loss=0.2472, pruned_loss=0.04913, over 4923.00 frames.], tot_loss[loss=0.1537, simple_loss=0.2238, pruned_loss=0.04184, over 972592.39 frames.], batch size: 23, lr: 3.77e-04 2022-05-05 07:46:37,472 INFO [train.py:715] (0/8) Epoch 5, batch 23750, loss[loss=0.1812, simple_loss=0.2477, pruned_loss=0.05737, over 4838.00 frames.], tot_loss[loss=0.1533, simple_loss=0.2235, pruned_loss=0.0415, over 972791.10 frames.], batch size: 15, lr: 3.77e-04 2022-05-05 07:47:16,503 INFO [train.py:715] (0/8) Epoch 5, batch 23800, loss[loss=0.1658, simple_loss=0.238, pruned_loss=0.04675, over 4902.00 frames.], tot_loss[loss=0.1528, simple_loss=0.2235, pruned_loss=0.04111, over 972800.97 frames.], batch size: 17, lr: 3.77e-04 2022-05-05 07:47:55,207 INFO [train.py:715] (0/8) Epoch 5, batch 23850, loss[loss=0.1437, simple_loss=0.206, pruned_loss=0.0407, over 4715.00 frames.], tot_loss[loss=0.1535, simple_loss=0.2243, pruned_loss=0.04137, over 972148.98 frames.], batch size: 15, lr: 3.77e-04 2022-05-05 07:48:34,415 INFO [train.py:715] (0/8) Epoch 5, batch 23900, loss[loss=0.1535, simple_loss=0.2362, pruned_loss=0.03541, over 4817.00 frames.], tot_loss[loss=0.1527, simple_loss=0.2238, pruned_loss=0.04078, over 972222.06 frames.], batch size: 21, lr: 3.77e-04 2022-05-05 07:49:13,369 INFO [train.py:715] (0/8) Epoch 5, batch 23950, loss[loss=0.1401, simple_loss=0.2063, pruned_loss=0.03701, over 4829.00 frames.], tot_loss[loss=0.1535, simple_loss=0.2246, pruned_loss=0.04124, over 971939.88 frames.], batch size: 15, lr: 3.77e-04 2022-05-05 07:49:51,760 INFO [train.py:715] (0/8) Epoch 5, batch 24000, loss[loss=0.1712, simple_loss=0.2437, pruned_loss=0.04937, over 4829.00 frames.], tot_loss[loss=0.1532, simple_loss=0.2243, pruned_loss=0.04107, over 971704.35 frames.], batch size: 13, lr: 3.77e-04 2022-05-05 07:49:51,761 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 07:50:02,184 INFO [train.py:742] (0/8) Epoch 5, validation: loss=0.11, simple_loss=0.1955, pruned_loss=0.0123, over 914524.00 frames. 
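Note on the loss components: in these records the headline loss closely tracks 0.5 * simple_loss + pruned_loss; for the validation entry just above, 0.5 * 0.1955 + 0.0123 = 0.110, matching the logged 0.11. A short arithmetic check follows; the 0.5 weighting is inferred from the logged numbers, not read from the training code.

    # Quick arithmetic check against the entries above (the 0.5 weighting is an
    # inference from the logged values, not a constant taken from the source).
    def combined(simple_loss: float, pruned_loss: float, simple_scale: float = 0.5) -> float:
        return simple_scale * simple_loss + pruned_loss

    # Validation at batch 24000: logged loss=0.11
    print(combined(0.1955, 0.0123))    # ~0.110
    # Running training average at batch 24000: logged tot_loss=0.1532
    print(combined(0.2243, 0.04107))   # ~0.153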
2022-05-05 07:50:40,722 INFO [train.py:715] (0/8) Epoch 5, batch 24050, loss[loss=0.1625, simple_loss=0.224, pruned_loss=0.05047, over 4957.00 frames.], tot_loss[loss=0.1537, simple_loss=0.2246, pruned_loss=0.04145, over 971298.47 frames.], batch size: 35, lr: 3.77e-04 2022-05-05 07:51:20,432 INFO [train.py:715] (0/8) Epoch 5, batch 24100, loss[loss=0.195, simple_loss=0.2532, pruned_loss=0.06844, over 4912.00 frames.], tot_loss[loss=0.1539, simple_loss=0.2247, pruned_loss=0.04154, over 971341.86 frames.], batch size: 39, lr: 3.77e-04 2022-05-05 07:51:59,180 INFO [train.py:715] (0/8) Epoch 5, batch 24150, loss[loss=0.1775, simple_loss=0.2582, pruned_loss=0.04835, over 4798.00 frames.], tot_loss[loss=0.1531, simple_loss=0.2236, pruned_loss=0.04131, over 972117.43 frames.], batch size: 24, lr: 3.77e-04 2022-05-05 07:52:37,492 INFO [train.py:715] (0/8) Epoch 5, batch 24200, loss[loss=0.1615, simple_loss=0.2335, pruned_loss=0.04472, over 4956.00 frames.], tot_loss[loss=0.1528, simple_loss=0.2233, pruned_loss=0.0411, over 972915.03 frames.], batch size: 15, lr: 3.77e-04 2022-05-05 07:53:16,809 INFO [train.py:715] (0/8) Epoch 5, batch 24250, loss[loss=0.1486, simple_loss=0.2228, pruned_loss=0.03715, over 4827.00 frames.], tot_loss[loss=0.1522, simple_loss=0.2229, pruned_loss=0.04077, over 972300.25 frames.], batch size: 12, lr: 3.76e-04 2022-05-05 07:53:55,921 INFO [train.py:715] (0/8) Epoch 5, batch 24300, loss[loss=0.1585, simple_loss=0.2291, pruned_loss=0.0439, over 4864.00 frames.], tot_loss[loss=0.1533, simple_loss=0.2236, pruned_loss=0.04149, over 971737.30 frames.], batch size: 13, lr: 3.76e-04 2022-05-05 07:54:34,803 INFO [train.py:715] (0/8) Epoch 5, batch 24350, loss[loss=0.1338, simple_loss=0.2044, pruned_loss=0.03155, over 4822.00 frames.], tot_loss[loss=0.1532, simple_loss=0.2236, pruned_loss=0.0414, over 971222.04 frames.], batch size: 25, lr: 3.76e-04 2022-05-05 07:55:13,054 INFO [train.py:715] (0/8) Epoch 5, batch 24400, loss[loss=0.1256, simple_loss=0.1913, pruned_loss=0.02993, over 4836.00 frames.], tot_loss[loss=0.1527, simple_loss=0.2231, pruned_loss=0.04119, over 971775.61 frames.], batch size: 30, lr: 3.76e-04 2022-05-05 07:55:52,737 INFO [train.py:715] (0/8) Epoch 5, batch 24450, loss[loss=0.1629, simple_loss=0.2338, pruned_loss=0.04603, over 4794.00 frames.], tot_loss[loss=0.1517, simple_loss=0.2222, pruned_loss=0.04055, over 971428.20 frames.], batch size: 17, lr: 3.76e-04 2022-05-05 07:56:30,709 INFO [train.py:715] (0/8) Epoch 5, batch 24500, loss[loss=0.1412, simple_loss=0.2178, pruned_loss=0.03225, over 4926.00 frames.], tot_loss[loss=0.1511, simple_loss=0.2214, pruned_loss=0.04041, over 971870.10 frames.], batch size: 23, lr: 3.76e-04 2022-05-05 07:57:09,363 INFO [train.py:715] (0/8) Epoch 5, batch 24550, loss[loss=0.1352, simple_loss=0.2106, pruned_loss=0.02996, over 4806.00 frames.], tot_loss[loss=0.1514, simple_loss=0.2217, pruned_loss=0.04059, over 971838.35 frames.], batch size: 21, lr: 3.76e-04 2022-05-05 07:57:48,727 INFO [train.py:715] (0/8) Epoch 5, batch 24600, loss[loss=0.1269, simple_loss=0.1944, pruned_loss=0.02968, over 4646.00 frames.], tot_loss[loss=0.1521, simple_loss=0.2222, pruned_loss=0.04099, over 971465.02 frames.], batch size: 13, lr: 3.76e-04 2022-05-05 07:58:27,789 INFO [train.py:715] (0/8) Epoch 5, batch 24650, loss[loss=0.1599, simple_loss=0.2274, pruned_loss=0.04618, over 4973.00 frames.], tot_loss[loss=0.1523, simple_loss=0.2227, pruned_loss=0.04101, over 971405.47 frames.], batch size: 24, lr: 3.76e-04 2022-05-05 07:59:06,980 INFO 
[train.py:715] (0/8) Epoch 5, batch 24700, loss[loss=0.1362, simple_loss=0.2041, pruned_loss=0.03409, over 4916.00 frames.], tot_loss[loss=0.153, simple_loss=0.2232, pruned_loss=0.04137, over 972945.33 frames.], batch size: 18, lr: 3.76e-04 2022-05-05 07:59:45,114 INFO [train.py:715] (0/8) Epoch 5, batch 24750, loss[loss=0.1702, simple_loss=0.2467, pruned_loss=0.04689, over 4934.00 frames.], tot_loss[loss=0.1528, simple_loss=0.2229, pruned_loss=0.04132, over 973317.02 frames.], batch size: 29, lr: 3.76e-04 2022-05-05 08:00:24,680 INFO [train.py:715] (0/8) Epoch 5, batch 24800, loss[loss=0.1599, simple_loss=0.2416, pruned_loss=0.03906, over 4905.00 frames.], tot_loss[loss=0.1535, simple_loss=0.2235, pruned_loss=0.04171, over 973988.60 frames.], batch size: 18, lr: 3.76e-04 2022-05-05 08:01:03,110 INFO [train.py:715] (0/8) Epoch 5, batch 24850, loss[loss=0.1541, simple_loss=0.2428, pruned_loss=0.03268, over 4952.00 frames.], tot_loss[loss=0.152, simple_loss=0.2222, pruned_loss=0.04087, over 973616.01 frames.], batch size: 21, lr: 3.76e-04 2022-05-05 08:01:41,875 INFO [train.py:715] (0/8) Epoch 5, batch 24900, loss[loss=0.1964, simple_loss=0.2633, pruned_loss=0.06473, over 4687.00 frames.], tot_loss[loss=0.1516, simple_loss=0.2221, pruned_loss=0.0405, over 972276.16 frames.], batch size: 15, lr: 3.76e-04 2022-05-05 08:02:21,422 INFO [train.py:715] (0/8) Epoch 5, batch 24950, loss[loss=0.1621, simple_loss=0.2328, pruned_loss=0.04568, over 4990.00 frames.], tot_loss[loss=0.1519, simple_loss=0.2223, pruned_loss=0.04076, over 972681.43 frames.], batch size: 25, lr: 3.76e-04 2022-05-05 08:03:00,473 INFO [train.py:715] (0/8) Epoch 5, batch 25000, loss[loss=0.1451, simple_loss=0.2127, pruned_loss=0.0387, over 4986.00 frames.], tot_loss[loss=0.1514, simple_loss=0.2223, pruned_loss=0.0403, over 973564.83 frames.], batch size: 15, lr: 3.76e-04 2022-05-05 08:03:39,036 INFO [train.py:715] (0/8) Epoch 5, batch 25050, loss[loss=0.1251, simple_loss=0.1985, pruned_loss=0.02586, over 4981.00 frames.], tot_loss[loss=0.1504, simple_loss=0.2213, pruned_loss=0.03974, over 973223.04 frames.], batch size: 35, lr: 3.76e-04 2022-05-05 08:04:17,281 INFO [train.py:715] (0/8) Epoch 5, batch 25100, loss[loss=0.12, simple_loss=0.1892, pruned_loss=0.02543, over 4808.00 frames.], tot_loss[loss=0.1512, simple_loss=0.2216, pruned_loss=0.04042, over 972675.10 frames.], batch size: 21, lr: 3.76e-04 2022-05-05 08:04:57,540 INFO [train.py:715] (0/8) Epoch 5, batch 25150, loss[loss=0.1646, simple_loss=0.2299, pruned_loss=0.04965, over 4820.00 frames.], tot_loss[loss=0.1509, simple_loss=0.2214, pruned_loss=0.04021, over 972047.26 frames.], batch size: 15, lr: 3.76e-04 2022-05-05 08:05:35,724 INFO [train.py:715] (0/8) Epoch 5, batch 25200, loss[loss=0.1576, simple_loss=0.2408, pruned_loss=0.03722, over 4820.00 frames.], tot_loss[loss=0.1512, simple_loss=0.2217, pruned_loss=0.0404, over 972212.36 frames.], batch size: 25, lr: 3.76e-04 2022-05-05 08:06:14,575 INFO [train.py:715] (0/8) Epoch 5, batch 25250, loss[loss=0.1616, simple_loss=0.2212, pruned_loss=0.05101, over 4846.00 frames.], tot_loss[loss=0.1509, simple_loss=0.2213, pruned_loss=0.04024, over 972184.64 frames.], batch size: 30, lr: 3.76e-04 2022-05-05 08:06:53,401 INFO [train.py:715] (0/8) Epoch 5, batch 25300, loss[loss=0.199, simple_loss=0.2806, pruned_loss=0.05866, over 4966.00 frames.], tot_loss[loss=0.1515, simple_loss=0.222, pruned_loss=0.04055, over 971935.83 frames.], batch size: 21, lr: 3.75e-04 2022-05-05 08:07:31,745 INFO [train.py:715] (0/8) Epoch 5, batch 
25350, loss[loss=0.1276, simple_loss=0.1968, pruned_loss=0.02921, over 4829.00 frames.], tot_loss[loss=0.1516, simple_loss=0.2219, pruned_loss=0.04061, over 971440.21 frames.], batch size: 26, lr: 3.75e-04 2022-05-05 08:08:10,246 INFO [train.py:715] (0/8) Epoch 5, batch 25400, loss[loss=0.1168, simple_loss=0.1817, pruned_loss=0.02592, over 4765.00 frames.], tot_loss[loss=0.1504, simple_loss=0.2206, pruned_loss=0.04009, over 970796.46 frames.], batch size: 19, lr: 3.75e-04 2022-05-05 08:08:49,162 INFO [train.py:715] (0/8) Epoch 5, batch 25450, loss[loss=0.1625, simple_loss=0.2252, pruned_loss=0.04992, over 4871.00 frames.], tot_loss[loss=0.1513, simple_loss=0.2215, pruned_loss=0.04051, over 971366.44 frames.], batch size: 38, lr: 3.75e-04 2022-05-05 08:09:28,359 INFO [train.py:715] (0/8) Epoch 5, batch 25500, loss[loss=0.1722, simple_loss=0.2346, pruned_loss=0.05485, over 4954.00 frames.], tot_loss[loss=0.154, simple_loss=0.2236, pruned_loss=0.04217, over 971283.60 frames.], batch size: 39, lr: 3.75e-04 2022-05-05 08:10:07,144 INFO [train.py:715] (0/8) Epoch 5, batch 25550, loss[loss=0.1316, simple_loss=0.2049, pruned_loss=0.02917, over 4690.00 frames.], tot_loss[loss=0.1545, simple_loss=0.2243, pruned_loss=0.04234, over 971542.89 frames.], batch size: 15, lr: 3.75e-04 2022-05-05 08:10:45,631 INFO [train.py:715] (0/8) Epoch 5, batch 25600, loss[loss=0.1336, simple_loss=0.2114, pruned_loss=0.02791, over 4807.00 frames.], tot_loss[loss=0.1536, simple_loss=0.2235, pruned_loss=0.04184, over 970725.13 frames.], batch size: 24, lr: 3.75e-04 2022-05-05 08:11:24,701 INFO [train.py:715] (0/8) Epoch 5, batch 25650, loss[loss=0.1541, simple_loss=0.2273, pruned_loss=0.0404, over 4886.00 frames.], tot_loss[loss=0.1537, simple_loss=0.2238, pruned_loss=0.0418, over 971263.83 frames.], batch size: 22, lr: 3.75e-04 2022-05-05 08:12:03,092 INFO [train.py:715] (0/8) Epoch 5, batch 25700, loss[loss=0.1534, simple_loss=0.2169, pruned_loss=0.045, over 4817.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2228, pruned_loss=0.04109, over 971394.15 frames.], batch size: 27, lr: 3.75e-04 2022-05-05 08:12:41,261 INFO [train.py:715] (0/8) Epoch 5, batch 25750, loss[loss=0.1688, simple_loss=0.239, pruned_loss=0.04926, over 4779.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2228, pruned_loss=0.04113, over 970958.04 frames.], batch size: 18, lr: 3.75e-04 2022-05-05 08:13:20,735 INFO [train.py:715] (0/8) Epoch 5, batch 25800, loss[loss=0.1656, simple_loss=0.2344, pruned_loss=0.04838, over 4860.00 frames.], tot_loss[loss=0.1512, simple_loss=0.2214, pruned_loss=0.04046, over 970801.06 frames.], batch size: 38, lr: 3.75e-04 2022-05-05 08:13:59,830 INFO [train.py:715] (0/8) Epoch 5, batch 25850, loss[loss=0.1492, simple_loss=0.2289, pruned_loss=0.03469, over 4865.00 frames.], tot_loss[loss=0.1509, simple_loss=0.2209, pruned_loss=0.04047, over 971362.44 frames.], batch size: 20, lr: 3.75e-04 2022-05-05 08:14:38,586 INFO [train.py:715] (0/8) Epoch 5, batch 25900, loss[loss=0.1452, simple_loss=0.2254, pruned_loss=0.03253, over 4882.00 frames.], tot_loss[loss=0.1506, simple_loss=0.221, pruned_loss=0.04013, over 972519.44 frames.], batch size: 22, lr: 3.75e-04 2022-05-05 08:15:17,121 INFO [train.py:715] (0/8) Epoch 5, batch 25950, loss[loss=0.1597, simple_loss=0.2298, pruned_loss=0.0448, over 4978.00 frames.], tot_loss[loss=0.1507, simple_loss=0.2214, pruned_loss=0.04004, over 972950.44 frames.], batch size: 33, lr: 3.75e-04 2022-05-05 08:15:18,583 INFO [checkpoint.py:70] (0/8) Saving checkpoint to 
pruned_transducer_stateless2/exp/v2/checkpoint-200000.pt 2022-05-05 08:15:58,599 INFO [train.py:715] (0/8) Epoch 5, batch 26000, loss[loss=0.1451, simple_loss=0.2198, pruned_loss=0.03515, over 4776.00 frames.], tot_loss[loss=0.1494, simple_loss=0.2204, pruned_loss=0.03918, over 973063.48 frames.], batch size: 18, lr: 3.75e-04 2022-05-05 08:16:37,291 INFO [train.py:715] (0/8) Epoch 5, batch 26050, loss[loss=0.1486, simple_loss=0.2147, pruned_loss=0.04124, over 4954.00 frames.], tot_loss[loss=0.1492, simple_loss=0.2201, pruned_loss=0.03913, over 972595.59 frames.], batch size: 35, lr: 3.75e-04 2022-05-05 08:17:15,757 INFO [train.py:715] (0/8) Epoch 5, batch 26100, loss[loss=0.1224, simple_loss=0.1914, pruned_loss=0.02666, over 4700.00 frames.], tot_loss[loss=0.1495, simple_loss=0.2199, pruned_loss=0.0395, over 972460.50 frames.], batch size: 15, lr: 3.75e-04 2022-05-05 08:17:54,715 INFO [train.py:715] (0/8) Epoch 5, batch 26150, loss[loss=0.1414, simple_loss=0.2142, pruned_loss=0.03432, over 4944.00 frames.], tot_loss[loss=0.1502, simple_loss=0.2208, pruned_loss=0.03987, over 972880.79 frames.], batch size: 35, lr: 3.75e-04 2022-05-05 08:18:33,045 INFO [train.py:715] (0/8) Epoch 5, batch 26200, loss[loss=0.1408, simple_loss=0.2116, pruned_loss=0.03502, over 4903.00 frames.], tot_loss[loss=0.15, simple_loss=0.2204, pruned_loss=0.03978, over 972789.20 frames.], batch size: 17, lr: 3.75e-04 2022-05-05 08:19:12,103 INFO [train.py:715] (0/8) Epoch 5, batch 26250, loss[loss=0.1196, simple_loss=0.1877, pruned_loss=0.02578, over 4963.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2201, pruned_loss=0.0397, over 972685.39 frames.], batch size: 35, lr: 3.75e-04 2022-05-05 08:19:51,343 INFO [train.py:715] (0/8) Epoch 5, batch 26300, loss[loss=0.186, simple_loss=0.2446, pruned_loss=0.06372, over 4786.00 frames.], tot_loss[loss=0.1501, simple_loss=0.2204, pruned_loss=0.0399, over 971889.15 frames.], batch size: 18, lr: 3.75e-04 2022-05-05 08:20:30,622 INFO [train.py:715] (0/8) Epoch 5, batch 26350, loss[loss=0.1391, simple_loss=0.2146, pruned_loss=0.03184, over 4847.00 frames.], tot_loss[loss=0.1504, simple_loss=0.2206, pruned_loss=0.04009, over 971026.31 frames.], batch size: 15, lr: 3.74e-04 2022-05-05 08:21:09,425 INFO [train.py:715] (0/8) Epoch 5, batch 26400, loss[loss=0.1866, simple_loss=0.2665, pruned_loss=0.0534, over 4773.00 frames.], tot_loss[loss=0.1505, simple_loss=0.2212, pruned_loss=0.03993, over 971476.86 frames.], batch size: 17, lr: 3.74e-04 2022-05-05 08:21:48,030 INFO [train.py:715] (0/8) Epoch 5, batch 26450, loss[loss=0.1291, simple_loss=0.21, pruned_loss=0.02409, over 4822.00 frames.], tot_loss[loss=0.15, simple_loss=0.2208, pruned_loss=0.03962, over 971187.22 frames.], batch size: 15, lr: 3.74e-04 2022-05-05 08:22:26,953 INFO [train.py:715] (0/8) Epoch 5, batch 26500, loss[loss=0.1427, simple_loss=0.2173, pruned_loss=0.03405, over 4821.00 frames.], tot_loss[loss=0.1511, simple_loss=0.2221, pruned_loss=0.04001, over 971563.66 frames.], batch size: 26, lr: 3.74e-04 2022-05-05 08:23:06,038 INFO [train.py:715] (0/8) Epoch 5, batch 26550, loss[loss=0.1543, simple_loss=0.2224, pruned_loss=0.04313, over 4797.00 frames.], tot_loss[loss=0.1513, simple_loss=0.2226, pruned_loss=0.04, over 971990.13 frames.], batch size: 24, lr: 3.74e-04 2022-05-05 08:23:44,736 INFO [train.py:715] (0/8) Epoch 5, batch 26600, loss[loss=0.1307, simple_loss=0.2054, pruned_loss=0.02794, over 4916.00 frames.], tot_loss[loss=0.1506, simple_loss=0.2218, pruned_loss=0.03969, over 972665.42 frames.], batch size: 23, 
lr: 3.74e-04 2022-05-05 08:24:24,180 INFO [train.py:715] (0/8) Epoch 5, batch 26650, loss[loss=0.1484, simple_loss=0.2218, pruned_loss=0.03755, over 4947.00 frames.], tot_loss[loss=0.1508, simple_loss=0.2215, pruned_loss=0.04003, over 972337.29 frames.], batch size: 29, lr: 3.74e-04 2022-05-05 08:25:02,982 INFO [train.py:715] (0/8) Epoch 5, batch 26700, loss[loss=0.1587, simple_loss=0.2169, pruned_loss=0.05026, over 4805.00 frames.], tot_loss[loss=0.1512, simple_loss=0.222, pruned_loss=0.0402, over 972319.47 frames.], batch size: 12, lr: 3.74e-04 2022-05-05 08:25:41,832 INFO [train.py:715] (0/8) Epoch 5, batch 26750, loss[loss=0.1867, simple_loss=0.2494, pruned_loss=0.06201, over 4922.00 frames.], tot_loss[loss=0.1516, simple_loss=0.2223, pruned_loss=0.04042, over 972596.28 frames.], batch size: 18, lr: 3.74e-04 2022-05-05 08:26:20,197 INFO [train.py:715] (0/8) Epoch 5, batch 26800, loss[loss=0.1678, simple_loss=0.234, pruned_loss=0.05081, over 4758.00 frames.], tot_loss[loss=0.1524, simple_loss=0.2227, pruned_loss=0.04104, over 972311.21 frames.], batch size: 18, lr: 3.74e-04 2022-05-05 08:26:59,357 INFO [train.py:715] (0/8) Epoch 5, batch 26850, loss[loss=0.1509, simple_loss=0.2243, pruned_loss=0.03878, over 4944.00 frames.], tot_loss[loss=0.1522, simple_loss=0.2226, pruned_loss=0.04094, over 972417.34 frames.], batch size: 21, lr: 3.74e-04 2022-05-05 08:27:38,333 INFO [train.py:715] (0/8) Epoch 5, batch 26900, loss[loss=0.1512, simple_loss=0.2209, pruned_loss=0.0408, over 4911.00 frames.], tot_loss[loss=0.1528, simple_loss=0.223, pruned_loss=0.04124, over 972575.11 frames.], batch size: 17, lr: 3.74e-04 2022-05-05 08:28:17,267 INFO [train.py:715] (0/8) Epoch 5, batch 26950, loss[loss=0.1262, simple_loss=0.1919, pruned_loss=0.03018, over 4817.00 frames.], tot_loss[loss=0.152, simple_loss=0.2222, pruned_loss=0.0409, over 971822.96 frames.], batch size: 13, lr: 3.74e-04 2022-05-05 08:28:55,971 INFO [train.py:715] (0/8) Epoch 5, batch 27000, loss[loss=0.1598, simple_loss=0.2258, pruned_loss=0.04686, over 4955.00 frames.], tot_loss[loss=0.1524, simple_loss=0.2227, pruned_loss=0.04109, over 972530.09 frames.], batch size: 24, lr: 3.74e-04 2022-05-05 08:28:55,973 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 08:29:05,776 INFO [train.py:742] (0/8) Epoch 5, validation: loss=0.1098, simple_loss=0.195, pruned_loss=0.01232, over 914524.00 frames. 
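Note on checkpointing: a few records above, a checkpoint is written to pruned_transducer_stateless2/exp/v2/checkpoint-200000.pt, keyed by the global training-batch counter rather than by epoch. The sketch below shows batch-count-triggered saving in that spirit; the helper name, the 8000-batch interval, and the checkpoint contents are assumptions for illustration, not the project's actual API.

    # Illustrative batch-count-triggered checkpointing; names and interval are assumed.
    from pathlib import Path
    import torch

    def maybe_save_checkpoint(model, optimizer, batch_idx_train: int,
                              exp_dir: Path, save_every_n: int = 8000) -> None:
        # Save only when the global batch counter hits a multiple of the interval,
        # producing files such as exp_dir / "checkpoint-200000.pt".
        if batch_idx_train == 0 or batch_idx_train % save_every_n != 0:
            return
        ckpt = {
            "model": model.state_dict(),
            "optimizer": optimizer.state_dict(),
            "batch_idx_train": batch_idx_train,
        }
        torch.save(ckpt, exp_dir / f"checkpoint-{batch_idx_train}.pt")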
2022-05-05 08:29:45,275 INFO [train.py:715] (0/8) Epoch 5, batch 27050, loss[loss=0.1363, simple_loss=0.2054, pruned_loss=0.0336, over 4943.00 frames.], tot_loss[loss=0.1511, simple_loss=0.2211, pruned_loss=0.04051, over 973213.96 frames.], batch size: 24, lr: 3.74e-04 2022-05-05 08:30:24,752 INFO [train.py:715] (0/8) Epoch 5, batch 27100, loss[loss=0.1353, simple_loss=0.2073, pruned_loss=0.03167, over 4851.00 frames.], tot_loss[loss=0.1508, simple_loss=0.2211, pruned_loss=0.04025, over 973724.94 frames.], batch size: 13, lr: 3.74e-04 2022-05-05 08:31:04,146 INFO [train.py:715] (0/8) Epoch 5, batch 27150, loss[loss=0.1554, simple_loss=0.215, pruned_loss=0.0479, over 4990.00 frames.], tot_loss[loss=0.1509, simple_loss=0.221, pruned_loss=0.0404, over 973885.06 frames.], batch size: 14, lr: 3.74e-04 2022-05-05 08:31:42,960 INFO [train.py:715] (0/8) Epoch 5, batch 27200, loss[loss=0.1795, simple_loss=0.253, pruned_loss=0.05302, over 4818.00 frames.], tot_loss[loss=0.1507, simple_loss=0.2209, pruned_loss=0.04022, over 973867.87 frames.], batch size: 13, lr: 3.74e-04 2022-05-05 08:32:22,582 INFO [train.py:715] (0/8) Epoch 5, batch 27250, loss[loss=0.1462, simple_loss=0.218, pruned_loss=0.03718, over 4961.00 frames.], tot_loss[loss=0.151, simple_loss=0.2212, pruned_loss=0.04039, over 973883.46 frames.], batch size: 15, lr: 3.74e-04 2022-05-05 08:33:01,560 INFO [train.py:715] (0/8) Epoch 5, batch 27300, loss[loss=0.1648, simple_loss=0.2274, pruned_loss=0.05112, over 4797.00 frames.], tot_loss[loss=0.1516, simple_loss=0.2219, pruned_loss=0.04063, over 973576.72 frames.], batch size: 24, lr: 3.74e-04 2022-05-05 08:33:40,118 INFO [train.py:715] (0/8) Epoch 5, batch 27350, loss[loss=0.1828, simple_loss=0.2485, pruned_loss=0.05853, over 4975.00 frames.], tot_loss[loss=0.1517, simple_loss=0.2222, pruned_loss=0.0406, over 974101.94 frames.], batch size: 39, lr: 3.74e-04 2022-05-05 08:34:18,985 INFO [train.py:715] (0/8) Epoch 5, batch 27400, loss[loss=0.1662, simple_loss=0.2267, pruned_loss=0.0529, over 4837.00 frames.], tot_loss[loss=0.1518, simple_loss=0.2226, pruned_loss=0.04054, over 975083.08 frames.], batch size: 15, lr: 3.74e-04 2022-05-05 08:34:58,261 INFO [train.py:715] (0/8) Epoch 5, batch 27450, loss[loss=0.1618, simple_loss=0.2294, pruned_loss=0.04715, over 4792.00 frames.], tot_loss[loss=0.152, simple_loss=0.2227, pruned_loss=0.04069, over 975157.98 frames.], batch size: 21, lr: 3.73e-04 2022-05-05 08:35:38,042 INFO [train.py:715] (0/8) Epoch 5, batch 27500, loss[loss=0.1644, simple_loss=0.2461, pruned_loss=0.04138, over 4698.00 frames.], tot_loss[loss=0.1531, simple_loss=0.2236, pruned_loss=0.04131, over 974870.59 frames.], batch size: 15, lr: 3.73e-04 2022-05-05 08:36:16,524 INFO [train.py:715] (0/8) Epoch 5, batch 27550, loss[loss=0.1776, simple_loss=0.2376, pruned_loss=0.05884, over 4730.00 frames.], tot_loss[loss=0.1526, simple_loss=0.2233, pruned_loss=0.04096, over 974584.93 frames.], batch size: 16, lr: 3.73e-04 2022-05-05 08:36:55,898 INFO [train.py:715] (0/8) Epoch 5, batch 27600, loss[loss=0.1361, simple_loss=0.2047, pruned_loss=0.03374, over 4773.00 frames.], tot_loss[loss=0.1526, simple_loss=0.223, pruned_loss=0.04105, over 973250.54 frames.], batch size: 19, lr: 3.73e-04 2022-05-05 08:37:34,975 INFO [train.py:715] (0/8) Epoch 5, batch 27650, loss[loss=0.1593, simple_loss=0.2386, pruned_loss=0.04002, over 4852.00 frames.], tot_loss[loss=0.1519, simple_loss=0.2227, pruned_loss=0.04053, over 973804.51 frames.], batch size: 20, lr: 3.73e-04 2022-05-05 08:38:13,248 INFO [train.py:715] 
(0/8) Epoch 5, batch 27700, loss[loss=0.1418, simple_loss=0.206, pruned_loss=0.03877, over 4769.00 frames.], tot_loss[loss=0.1517, simple_loss=0.2226, pruned_loss=0.04039, over 973637.91 frames.], batch size: 12, lr: 3.73e-04 2022-05-05 08:38:52,827 INFO [train.py:715] (0/8) Epoch 5, batch 27750, loss[loss=0.2091, simple_loss=0.2586, pruned_loss=0.07984, over 4780.00 frames.], tot_loss[loss=0.1512, simple_loss=0.2217, pruned_loss=0.04041, over 973203.75 frames.], batch size: 18, lr: 3.73e-04 2022-05-05 08:39:32,589 INFO [train.py:715] (0/8) Epoch 5, batch 27800, loss[loss=0.1629, simple_loss=0.2314, pruned_loss=0.04716, over 4776.00 frames.], tot_loss[loss=0.1518, simple_loss=0.2224, pruned_loss=0.0406, over 973008.29 frames.], batch size: 17, lr: 3.73e-04 2022-05-05 08:40:11,941 INFO [train.py:715] (0/8) Epoch 5, batch 27850, loss[loss=0.1344, simple_loss=0.2121, pruned_loss=0.02834, over 4972.00 frames.], tot_loss[loss=0.1521, simple_loss=0.2225, pruned_loss=0.04086, over 973467.54 frames.], batch size: 15, lr: 3.73e-04 2022-05-05 08:40:50,650 INFO [train.py:715] (0/8) Epoch 5, batch 27900, loss[loss=0.148, simple_loss=0.2115, pruned_loss=0.04224, over 4789.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2228, pruned_loss=0.04113, over 971918.78 frames.], batch size: 18, lr: 3.73e-04 2022-05-05 08:41:29,597 INFO [train.py:715] (0/8) Epoch 5, batch 27950, loss[loss=0.1765, simple_loss=0.2449, pruned_loss=0.05404, over 4747.00 frames.], tot_loss[loss=0.1514, simple_loss=0.222, pruned_loss=0.04042, over 971871.93 frames.], batch size: 16, lr: 3.73e-04 2022-05-05 08:42:09,039 INFO [train.py:715] (0/8) Epoch 5, batch 28000, loss[loss=0.1647, simple_loss=0.2234, pruned_loss=0.053, over 4944.00 frames.], tot_loss[loss=0.1516, simple_loss=0.2224, pruned_loss=0.04037, over 972496.90 frames.], batch size: 21, lr: 3.73e-04 2022-05-05 08:42:47,125 INFO [train.py:715] (0/8) Epoch 5, batch 28050, loss[loss=0.1461, simple_loss=0.2233, pruned_loss=0.03445, over 4843.00 frames.], tot_loss[loss=0.1518, simple_loss=0.2228, pruned_loss=0.04037, over 972167.49 frames.], batch size: 13, lr: 3.73e-04 2022-05-05 08:43:25,853 INFO [train.py:715] (0/8) Epoch 5, batch 28100, loss[loss=0.1693, simple_loss=0.2257, pruned_loss=0.05644, over 4685.00 frames.], tot_loss[loss=0.1515, simple_loss=0.2224, pruned_loss=0.04034, over 972256.83 frames.], batch size: 15, lr: 3.73e-04 2022-05-05 08:44:04,994 INFO [train.py:715] (0/8) Epoch 5, batch 28150, loss[loss=0.1417, simple_loss=0.2108, pruned_loss=0.03632, over 4893.00 frames.], tot_loss[loss=0.1511, simple_loss=0.2222, pruned_loss=0.03997, over 971408.93 frames.], batch size: 19, lr: 3.73e-04 2022-05-05 08:44:43,934 INFO [train.py:715] (0/8) Epoch 5, batch 28200, loss[loss=0.1562, simple_loss=0.2145, pruned_loss=0.04893, over 4821.00 frames.], tot_loss[loss=0.1513, simple_loss=0.2226, pruned_loss=0.04004, over 972275.18 frames.], batch size: 15, lr: 3.73e-04 2022-05-05 08:45:22,619 INFO [train.py:715] (0/8) Epoch 5, batch 28250, loss[loss=0.1592, simple_loss=0.2198, pruned_loss=0.0493, over 4910.00 frames.], tot_loss[loss=0.1514, simple_loss=0.2222, pruned_loss=0.04029, over 972970.32 frames.], batch size: 17, lr: 3.73e-04 2022-05-05 08:46:01,485 INFO [train.py:715] (0/8) Epoch 5, batch 28300, loss[loss=0.1698, simple_loss=0.2346, pruned_loss=0.05251, over 4916.00 frames.], tot_loss[loss=0.1513, simple_loss=0.2223, pruned_loss=0.0402, over 972886.14 frames.], batch size: 29, lr: 3.73e-04 2022-05-05 08:46:39,903 INFO [train.py:715] (0/8) Epoch 5, batch 28350, 
loss[loss=0.1343, simple_loss=0.222, pruned_loss=0.02333, over 4889.00 frames.], tot_loss[loss=0.1508, simple_loss=0.222, pruned_loss=0.03981, over 972647.37 frames.], batch size: 22, lr: 3.73e-04 2022-05-05 08:47:18,552 INFO [train.py:715] (0/8) Epoch 5, batch 28400, loss[loss=0.1452, simple_loss=0.2139, pruned_loss=0.03826, over 4851.00 frames.], tot_loss[loss=0.1511, simple_loss=0.2224, pruned_loss=0.03993, over 972679.32 frames.], batch size: 13, lr: 3.73e-04 2022-05-05 08:47:57,676 INFO [train.py:715] (0/8) Epoch 5, batch 28450, loss[loss=0.1485, simple_loss=0.2122, pruned_loss=0.04237, over 4872.00 frames.], tot_loss[loss=0.1525, simple_loss=0.2233, pruned_loss=0.04085, over 972506.47 frames.], batch size: 20, lr: 3.73e-04 2022-05-05 08:48:36,723 INFO [train.py:715] (0/8) Epoch 5, batch 28500, loss[loss=0.1274, simple_loss=0.201, pruned_loss=0.02694, over 4912.00 frames.], tot_loss[loss=0.152, simple_loss=0.2229, pruned_loss=0.04055, over 971555.07 frames.], batch size: 23, lr: 3.72e-04 2022-05-05 08:49:15,934 INFO [train.py:715] (0/8) Epoch 5, batch 28550, loss[loss=0.1798, simple_loss=0.2525, pruned_loss=0.05351, over 4816.00 frames.], tot_loss[loss=0.1529, simple_loss=0.2238, pruned_loss=0.04104, over 972620.51 frames.], batch size: 26, lr: 3.72e-04 2022-05-05 08:49:54,625 INFO [train.py:715] (0/8) Epoch 5, batch 28600, loss[loss=0.1407, simple_loss=0.2109, pruned_loss=0.03529, over 4812.00 frames.], tot_loss[loss=0.1522, simple_loss=0.2231, pruned_loss=0.04067, over 972901.55 frames.], batch size: 25, lr: 3.72e-04 2022-05-05 08:50:34,058 INFO [train.py:715] (0/8) Epoch 5, batch 28650, loss[loss=0.1384, simple_loss=0.1997, pruned_loss=0.03851, over 4793.00 frames.], tot_loss[loss=0.1521, simple_loss=0.2228, pruned_loss=0.04069, over 972527.24 frames.], batch size: 14, lr: 3.72e-04 2022-05-05 08:51:12,499 INFO [train.py:715] (0/8) Epoch 5, batch 28700, loss[loss=0.1302, simple_loss=0.2035, pruned_loss=0.02849, over 4834.00 frames.], tot_loss[loss=0.1524, simple_loss=0.2227, pruned_loss=0.04106, over 972901.55 frames.], batch size: 27, lr: 3.72e-04 2022-05-05 08:51:51,350 INFO [train.py:715] (0/8) Epoch 5, batch 28750, loss[loss=0.1607, simple_loss=0.2349, pruned_loss=0.04331, over 4979.00 frames.], tot_loss[loss=0.1532, simple_loss=0.2233, pruned_loss=0.04158, over 973505.44 frames.], batch size: 35, lr: 3.72e-04 2022-05-05 08:52:30,117 INFO [train.py:715] (0/8) Epoch 5, batch 28800, loss[loss=0.1622, simple_loss=0.2384, pruned_loss=0.04297, over 4762.00 frames.], tot_loss[loss=0.1526, simple_loss=0.2229, pruned_loss=0.04116, over 973035.65 frames.], batch size: 19, lr: 3.72e-04 2022-05-05 08:53:09,036 INFO [train.py:715] (0/8) Epoch 5, batch 28850, loss[loss=0.1751, simple_loss=0.256, pruned_loss=0.04713, over 4792.00 frames.], tot_loss[loss=0.1517, simple_loss=0.2227, pruned_loss=0.04041, over 973776.09 frames.], batch size: 24, lr: 3.72e-04 2022-05-05 08:53:47,813 INFO [train.py:715] (0/8) Epoch 5, batch 28900, loss[loss=0.1146, simple_loss=0.1812, pruned_loss=0.02395, over 4964.00 frames.], tot_loss[loss=0.1518, simple_loss=0.2226, pruned_loss=0.04054, over 973777.61 frames.], batch size: 14, lr: 3.72e-04 2022-05-05 08:54:26,495 INFO [train.py:715] (0/8) Epoch 5, batch 28950, loss[loss=0.148, simple_loss=0.2243, pruned_loss=0.03585, over 4867.00 frames.], tot_loss[loss=0.1518, simple_loss=0.2227, pruned_loss=0.04048, over 973679.71 frames.], batch size: 20, lr: 3.72e-04 2022-05-05 08:55:05,608 INFO [train.py:715] (0/8) Epoch 5, batch 29000, loss[loss=0.1457, simple_loss=0.2123, 
pruned_loss=0.03951, over 4784.00 frames.], tot_loss[loss=0.152, simple_loss=0.223, pruned_loss=0.0405, over 974526.19 frames.], batch size: 14, lr: 3.72e-04 2022-05-05 08:55:43,851 INFO [train.py:715] (0/8) Epoch 5, batch 29050, loss[loss=0.1317, simple_loss=0.2016, pruned_loss=0.03094, over 4772.00 frames.], tot_loss[loss=0.152, simple_loss=0.223, pruned_loss=0.04052, over 973495.95 frames.], batch size: 14, lr: 3.72e-04 2022-05-05 08:56:22,920 INFO [train.py:715] (0/8) Epoch 5, batch 29100, loss[loss=0.1158, simple_loss=0.1846, pruned_loss=0.02348, over 4916.00 frames.], tot_loss[loss=0.1519, simple_loss=0.2226, pruned_loss=0.04055, over 973522.88 frames.], batch size: 29, lr: 3.72e-04 2022-05-05 08:57:01,737 INFO [train.py:715] (0/8) Epoch 5, batch 29150, loss[loss=0.153, simple_loss=0.2314, pruned_loss=0.03728, over 4911.00 frames.], tot_loss[loss=0.1519, simple_loss=0.2225, pruned_loss=0.0406, over 974075.06 frames.], batch size: 17, lr: 3.72e-04 2022-05-05 08:57:40,485 INFO [train.py:715] (0/8) Epoch 5, batch 29200, loss[loss=0.1001, simple_loss=0.1749, pruned_loss=0.01261, over 4946.00 frames.], tot_loss[loss=0.1518, simple_loss=0.2224, pruned_loss=0.0406, over 973904.48 frames.], batch size: 23, lr: 3.72e-04 2022-05-05 08:58:19,230 INFO [train.py:715] (0/8) Epoch 5, batch 29250, loss[loss=0.1796, simple_loss=0.2437, pruned_loss=0.0577, over 4855.00 frames.], tot_loss[loss=0.1518, simple_loss=0.2225, pruned_loss=0.04057, over 973293.68 frames.], batch size: 20, lr: 3.72e-04 2022-05-05 08:58:57,796 INFO [train.py:715] (0/8) Epoch 5, batch 29300, loss[loss=0.1483, simple_loss=0.2241, pruned_loss=0.03631, over 4785.00 frames.], tot_loss[loss=0.1504, simple_loss=0.2212, pruned_loss=0.03977, over 972552.97 frames.], batch size: 17, lr: 3.72e-04 2022-05-05 08:59:37,055 INFO [train.py:715] (0/8) Epoch 5, batch 29350, loss[loss=0.1264, simple_loss=0.2029, pruned_loss=0.02494, over 4874.00 frames.], tot_loss[loss=0.1509, simple_loss=0.2217, pruned_loss=0.03999, over 973122.65 frames.], batch size: 16, lr: 3.72e-04 2022-05-05 09:00:15,736 INFO [train.py:715] (0/8) Epoch 5, batch 29400, loss[loss=0.2095, simple_loss=0.2821, pruned_loss=0.0685, over 4970.00 frames.], tot_loss[loss=0.152, simple_loss=0.2226, pruned_loss=0.04072, over 972892.28 frames.], batch size: 39, lr: 3.72e-04 2022-05-05 09:00:54,487 INFO [train.py:715] (0/8) Epoch 5, batch 29450, loss[loss=0.1606, simple_loss=0.2248, pruned_loss=0.04817, over 4773.00 frames.], tot_loss[loss=0.1523, simple_loss=0.2228, pruned_loss=0.04089, over 972019.70 frames.], batch size: 18, lr: 3.72e-04 2022-05-05 09:01:34,119 INFO [train.py:715] (0/8) Epoch 5, batch 29500, loss[loss=0.124, simple_loss=0.2006, pruned_loss=0.02369, over 4868.00 frames.], tot_loss[loss=0.152, simple_loss=0.2222, pruned_loss=0.04094, over 972379.68 frames.], batch size: 20, lr: 3.72e-04 2022-05-05 09:02:13,201 INFO [train.py:715] (0/8) Epoch 5, batch 29550, loss[loss=0.1592, simple_loss=0.2271, pruned_loss=0.04567, over 4795.00 frames.], tot_loss[loss=0.1517, simple_loss=0.2219, pruned_loss=0.04073, over 971904.53 frames.], batch size: 12, lr: 3.72e-04 2022-05-05 09:02:52,387 INFO [train.py:715] (0/8) Epoch 5, batch 29600, loss[loss=0.1521, simple_loss=0.2295, pruned_loss=0.03735, over 4829.00 frames.], tot_loss[loss=0.1528, simple_loss=0.2228, pruned_loss=0.04137, over 972245.83 frames.], batch size: 15, lr: 3.71e-04 2022-05-05 09:03:31,057 INFO [train.py:715] (0/8) Epoch 5, batch 29650, loss[loss=0.1412, simple_loss=0.2234, pruned_loss=0.02947, over 4808.00 frames.], 
tot_loss[loss=0.1523, simple_loss=0.2227, pruned_loss=0.04097, over 972808.56 frames.], batch size: 25, lr: 3.71e-04 2022-05-05 09:04:09,886 INFO [train.py:715] (0/8) Epoch 5, batch 29700, loss[loss=0.129, simple_loss=0.2055, pruned_loss=0.0262, over 4851.00 frames.], tot_loss[loss=0.151, simple_loss=0.2216, pruned_loss=0.04022, over 972308.30 frames.], batch size: 20, lr: 3.71e-04 2022-05-05 09:04:48,810 INFO [train.py:715] (0/8) Epoch 5, batch 29750, loss[loss=0.1609, simple_loss=0.2385, pruned_loss=0.04168, over 4789.00 frames.], tot_loss[loss=0.1522, simple_loss=0.2228, pruned_loss=0.0408, over 972739.89 frames.], batch size: 18, lr: 3.71e-04 2022-05-05 09:05:27,380 INFO [train.py:715] (0/8) Epoch 5, batch 29800, loss[loss=0.1436, simple_loss=0.2082, pruned_loss=0.03949, over 4689.00 frames.], tot_loss[loss=0.1528, simple_loss=0.2232, pruned_loss=0.04119, over 971898.42 frames.], batch size: 15, lr: 3.71e-04 2022-05-05 09:06:05,618 INFO [train.py:715] (0/8) Epoch 5, batch 29850, loss[loss=0.1562, simple_loss=0.2273, pruned_loss=0.04248, over 4689.00 frames.], tot_loss[loss=0.1514, simple_loss=0.2216, pruned_loss=0.04059, over 971724.07 frames.], batch size: 15, lr: 3.71e-04 2022-05-05 09:06:44,666 INFO [train.py:715] (0/8) Epoch 5, batch 29900, loss[loss=0.174, simple_loss=0.2423, pruned_loss=0.05289, over 4907.00 frames.], tot_loss[loss=0.1503, simple_loss=0.2207, pruned_loss=0.03995, over 972241.52 frames.], batch size: 19, lr: 3.71e-04 2022-05-05 09:07:24,013 INFO [train.py:715] (0/8) Epoch 5, batch 29950, loss[loss=0.13, simple_loss=0.1928, pruned_loss=0.03361, over 4740.00 frames.], tot_loss[loss=0.1494, simple_loss=0.2201, pruned_loss=0.0394, over 972263.28 frames.], batch size: 12, lr: 3.71e-04 2022-05-05 09:08:02,565 INFO [train.py:715] (0/8) Epoch 5, batch 30000, loss[loss=0.1724, simple_loss=0.2245, pruned_loss=0.06013, over 4895.00 frames.], tot_loss[loss=0.1494, simple_loss=0.2201, pruned_loss=0.03938, over 972778.73 frames.], batch size: 16, lr: 3.71e-04 2022-05-05 09:08:02,567 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 09:08:12,297 INFO [train.py:742] (0/8) Epoch 5, validation: loss=0.11, simple_loss=0.1953, pruned_loss=0.01241, over 914524.00 frames. 
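Note on the learning rate: across this section the lr drifts only slowly, from 3.82e-04 near batch 18700 to 3.70e-04 by batch 31300. The scheduler itself is not shown in this log; the sketch below is one inverse-power decay in both global batch count and epoch that yields a drift of this size, with every constant an assumption chosen for illustration.

    # Illustrative only: an inverse fourth-root decay in batch count and epoch.
    # All constants are assumptions picked to roughly reproduce the logged values.
    def scheduled_lr(global_batch: int, epoch: int,
                     base_lr: float = 0.003,
                     lr_batches: float = 5000.0,
                     lr_epochs: float = 4.0) -> float:
        batch_factor = ((global_batch ** 2 + lr_batches ** 2) / lr_batches ** 2) ** -0.25
        epoch_factor = ((epoch ** 2 + lr_epochs ** 2) / lr_epochs ** 2) ** -0.25
        return base_lr * batch_factor * epoch_factor

    # With a global batch counter near 200000 in epoch 5 this returns a value in
    # the 3.7e-04 to 3.8e-04 range seen in the records above.
    print(scheduled_lr(200000, 5))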
2022-05-05 09:08:51,326 INFO [train.py:715] (0/8) Epoch 5, batch 30050, loss[loss=0.1755, simple_loss=0.2494, pruned_loss=0.0508, over 4885.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2204, pruned_loss=0.03953, over 972521.21 frames.], batch size: 16, lr: 3.71e-04 2022-05-05 09:09:31,490 INFO [train.py:715] (0/8) Epoch 5, batch 30100, loss[loss=0.1476, simple_loss=0.2106, pruned_loss=0.04232, over 4659.00 frames.], tot_loss[loss=0.1514, simple_loss=0.2215, pruned_loss=0.04071, over 972327.16 frames.], batch size: 13, lr: 3.71e-04 2022-05-05 09:10:10,288 INFO [train.py:715] (0/8) Epoch 5, batch 30150, loss[loss=0.1546, simple_loss=0.2211, pruned_loss=0.04403, over 4838.00 frames.], tot_loss[loss=0.1515, simple_loss=0.2215, pruned_loss=0.0407, over 972375.44 frames.], batch size: 26, lr: 3.71e-04 2022-05-05 09:10:48,818 INFO [train.py:715] (0/8) Epoch 5, batch 30200, loss[loss=0.1195, simple_loss=0.191, pruned_loss=0.02403, over 4971.00 frames.], tot_loss[loss=0.1501, simple_loss=0.2204, pruned_loss=0.03986, over 972618.86 frames.], batch size: 25, lr: 3.71e-04 2022-05-05 09:11:27,802 INFO [train.py:715] (0/8) Epoch 5, batch 30250, loss[loss=0.164, simple_loss=0.2353, pruned_loss=0.04638, over 4899.00 frames.], tot_loss[loss=0.15, simple_loss=0.2203, pruned_loss=0.03984, over 972773.80 frames.], batch size: 39, lr: 3.71e-04 2022-05-05 09:12:06,779 INFO [train.py:715] (0/8) Epoch 5, batch 30300, loss[loss=0.1645, simple_loss=0.2327, pruned_loss=0.04815, over 4799.00 frames.], tot_loss[loss=0.1505, simple_loss=0.2209, pruned_loss=0.04009, over 972625.16 frames.], batch size: 21, lr: 3.71e-04 2022-05-05 09:12:45,781 INFO [train.py:715] (0/8) Epoch 5, batch 30350, loss[loss=0.1648, simple_loss=0.2443, pruned_loss=0.04268, over 4921.00 frames.], tot_loss[loss=0.1508, simple_loss=0.2212, pruned_loss=0.04024, over 973519.28 frames.], batch size: 18, lr: 3.71e-04 2022-05-05 09:13:24,284 INFO [train.py:715] (0/8) Epoch 5, batch 30400, loss[loss=0.1483, simple_loss=0.2179, pruned_loss=0.03931, over 4751.00 frames.], tot_loss[loss=0.1508, simple_loss=0.2212, pruned_loss=0.04023, over 973427.44 frames.], batch size: 19, lr: 3.71e-04 2022-05-05 09:14:03,371 INFO [train.py:715] (0/8) Epoch 5, batch 30450, loss[loss=0.142, simple_loss=0.2155, pruned_loss=0.03419, over 4757.00 frames.], tot_loss[loss=0.1505, simple_loss=0.221, pruned_loss=0.04005, over 973537.39 frames.], batch size: 19, lr: 3.71e-04 2022-05-05 09:14:42,244 INFO [train.py:715] (0/8) Epoch 5, batch 30500, loss[loss=0.143, simple_loss=0.2196, pruned_loss=0.03318, over 4968.00 frames.], tot_loss[loss=0.1502, simple_loss=0.221, pruned_loss=0.03972, over 973396.52 frames.], batch size: 35, lr: 3.71e-04 2022-05-05 09:15:20,921 INFO [train.py:715] (0/8) Epoch 5, batch 30550, loss[loss=0.1835, simple_loss=0.2503, pruned_loss=0.05839, over 4936.00 frames.], tot_loss[loss=0.1503, simple_loss=0.2208, pruned_loss=0.03989, over 973423.99 frames.], batch size: 39, lr: 3.71e-04 2022-05-05 09:15:58,925 INFO [train.py:715] (0/8) Epoch 5, batch 30600, loss[loss=0.1492, simple_loss=0.2209, pruned_loss=0.03873, over 4971.00 frames.], tot_loss[loss=0.1501, simple_loss=0.2205, pruned_loss=0.03991, over 973225.86 frames.], batch size: 15, lr: 3.71e-04 2022-05-05 09:16:37,777 INFO [train.py:715] (0/8) Epoch 5, batch 30650, loss[loss=0.1287, simple_loss=0.2068, pruned_loss=0.02529, over 4861.00 frames.], tot_loss[loss=0.1502, simple_loss=0.2207, pruned_loss=0.03983, over 973695.46 frames.], batch size: 16, lr: 3.71e-04 2022-05-05 09:17:16,912 INFO 
[train.py:715] (0/8) Epoch 5, batch 30700, loss[loss=0.1491, simple_loss=0.2213, pruned_loss=0.03845, over 4792.00 frames.], tot_loss[loss=0.1503, simple_loss=0.221, pruned_loss=0.03978, over 973132.61 frames.], batch size: 18, lr: 3.70e-04 2022-05-05 09:17:55,172 INFO [train.py:715] (0/8) Epoch 5, batch 30750, loss[loss=0.1168, simple_loss=0.1932, pruned_loss=0.0202, over 4894.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2207, pruned_loss=0.03933, over 972840.20 frames.], batch size: 19, lr: 3.70e-04 2022-05-05 09:18:33,966 INFO [train.py:715] (0/8) Epoch 5, batch 30800, loss[loss=0.1389, simple_loss=0.2211, pruned_loss=0.02837, over 4800.00 frames.], tot_loss[loss=0.1499, simple_loss=0.2208, pruned_loss=0.03951, over 973036.77 frames.], batch size: 21, lr: 3.70e-04 2022-05-05 09:19:12,983 INFO [train.py:715] (0/8) Epoch 5, batch 30850, loss[loss=0.1583, simple_loss=0.2353, pruned_loss=0.04061, over 4763.00 frames.], tot_loss[loss=0.1504, simple_loss=0.2213, pruned_loss=0.03972, over 972637.25 frames.], batch size: 19, lr: 3.70e-04 2022-05-05 09:19:51,003 INFO [train.py:715] (0/8) Epoch 5, batch 30900, loss[loss=0.1485, simple_loss=0.2126, pruned_loss=0.04217, over 4939.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2206, pruned_loss=0.03942, over 973811.03 frames.], batch size: 21, lr: 3.70e-04 2022-05-05 09:20:29,870 INFO [train.py:715] (0/8) Epoch 5, batch 30950, loss[loss=0.1554, simple_loss=0.2221, pruned_loss=0.04429, over 4804.00 frames.], tot_loss[loss=0.1501, simple_loss=0.2208, pruned_loss=0.03968, over 973242.67 frames.], batch size: 25, lr: 3.70e-04 2022-05-05 09:21:09,512 INFO [train.py:715] (0/8) Epoch 5, batch 31000, loss[loss=0.1298, simple_loss=0.1987, pruned_loss=0.03049, over 4952.00 frames.], tot_loss[loss=0.1508, simple_loss=0.2216, pruned_loss=0.04005, over 973775.23 frames.], batch size: 21, lr: 3.70e-04 2022-05-05 09:21:48,975 INFO [train.py:715] (0/8) Epoch 5, batch 31050, loss[loss=0.1598, simple_loss=0.2265, pruned_loss=0.04657, over 4907.00 frames.], tot_loss[loss=0.15, simple_loss=0.2205, pruned_loss=0.03979, over 973889.25 frames.], batch size: 17, lr: 3.70e-04 2022-05-05 09:22:27,590 INFO [train.py:715] (0/8) Epoch 5, batch 31100, loss[loss=0.1582, simple_loss=0.2406, pruned_loss=0.03788, over 4660.00 frames.], tot_loss[loss=0.1514, simple_loss=0.2219, pruned_loss=0.04042, over 973880.43 frames.], batch size: 13, lr: 3.70e-04 2022-05-05 09:23:06,675 INFO [train.py:715] (0/8) Epoch 5, batch 31150, loss[loss=0.1337, simple_loss=0.2063, pruned_loss=0.03058, over 4987.00 frames.], tot_loss[loss=0.1517, simple_loss=0.2221, pruned_loss=0.04064, over 972844.20 frames.], batch size: 25, lr: 3.70e-04 2022-05-05 09:23:45,593 INFO [train.py:715] (0/8) Epoch 5, batch 31200, loss[loss=0.1513, simple_loss=0.2025, pruned_loss=0.05005, over 4881.00 frames.], tot_loss[loss=0.1512, simple_loss=0.2218, pruned_loss=0.04032, over 973047.88 frames.], batch size: 16, lr: 3.70e-04 2022-05-05 09:24:24,055 INFO [train.py:715] (0/8) Epoch 5, batch 31250, loss[loss=0.1654, simple_loss=0.2409, pruned_loss=0.04496, over 4898.00 frames.], tot_loss[loss=0.1513, simple_loss=0.2219, pruned_loss=0.04038, over 973228.82 frames.], batch size: 19, lr: 3.70e-04 2022-05-05 09:25:02,645 INFO [train.py:715] (0/8) Epoch 5, batch 31300, loss[loss=0.1449, simple_loss=0.2134, pruned_loss=0.03819, over 4913.00 frames.], tot_loss[loss=0.1515, simple_loss=0.222, pruned_loss=0.04051, over 973669.38 frames.], batch size: 18, lr: 3.70e-04 2022-05-05 09:25:41,532 INFO [train.py:715] (0/8) Epoch 5, batch 
31350, loss[loss=0.1467, simple_loss=0.2222, pruned_loss=0.03557, over 4897.00 frames.], tot_loss[loss=0.1515, simple_loss=0.2223, pruned_loss=0.04042, over 973526.07 frames.], batch size: 19, lr: 3.70e-04 2022-05-05 09:26:20,315 INFO [train.py:715] (0/8) Epoch 5, batch 31400, loss[loss=0.1347, simple_loss=0.2036, pruned_loss=0.0329, over 4993.00 frames.], tot_loss[loss=0.1508, simple_loss=0.2213, pruned_loss=0.04013, over 974373.81 frames.], batch size: 14, lr: 3.70e-04 2022-05-05 09:26:59,039 INFO [train.py:715] (0/8) Epoch 5, batch 31450, loss[loss=0.148, simple_loss=0.2173, pruned_loss=0.03934, over 4781.00 frames.], tot_loss[loss=0.1516, simple_loss=0.2222, pruned_loss=0.04044, over 974275.63 frames.], batch size: 17, lr: 3.70e-04 2022-05-05 09:27:37,863 INFO [train.py:715] (0/8) Epoch 5, batch 31500, loss[loss=0.1464, simple_loss=0.2186, pruned_loss=0.03712, over 4967.00 frames.], tot_loss[loss=0.1521, simple_loss=0.2224, pruned_loss=0.04092, over 973984.78 frames.], batch size: 15, lr: 3.70e-04 2022-05-05 09:28:16,783 INFO [train.py:715] (0/8) Epoch 5, batch 31550, loss[loss=0.1312, simple_loss=0.2062, pruned_loss=0.02806, over 4782.00 frames.], tot_loss[loss=0.1511, simple_loss=0.2214, pruned_loss=0.04039, over 972595.93 frames.], batch size: 17, lr: 3.70e-04 2022-05-05 09:28:55,558 INFO [train.py:715] (0/8) Epoch 5, batch 31600, loss[loss=0.1417, simple_loss=0.2145, pruned_loss=0.03441, over 4769.00 frames.], tot_loss[loss=0.1509, simple_loss=0.2215, pruned_loss=0.04014, over 972694.82 frames.], batch size: 18, lr: 3.70e-04 2022-05-05 09:29:34,421 INFO [train.py:715] (0/8) Epoch 5, batch 31650, loss[loss=0.133, simple_loss=0.2048, pruned_loss=0.03057, over 4948.00 frames.], tot_loss[loss=0.1518, simple_loss=0.2224, pruned_loss=0.04063, over 973163.36 frames.], batch size: 21, lr: 3.70e-04 2022-05-05 09:30:13,321 INFO [train.py:715] (0/8) Epoch 5, batch 31700, loss[loss=0.1437, simple_loss=0.2106, pruned_loss=0.03841, over 4914.00 frames.], tot_loss[loss=0.1517, simple_loss=0.2223, pruned_loss=0.04055, over 973627.35 frames.], batch size: 17, lr: 3.70e-04 2022-05-05 09:30:52,056 INFO [train.py:715] (0/8) Epoch 5, batch 31750, loss[loss=0.1613, simple_loss=0.2371, pruned_loss=0.04279, over 4861.00 frames.], tot_loss[loss=0.1523, simple_loss=0.2233, pruned_loss=0.04063, over 972905.00 frames.], batch size: 20, lr: 3.70e-04 2022-05-05 09:31:31,169 INFO [train.py:715] (0/8) Epoch 5, batch 31800, loss[loss=0.1879, simple_loss=0.2551, pruned_loss=0.06035, over 4733.00 frames.], tot_loss[loss=0.152, simple_loss=0.2224, pruned_loss=0.04078, over 972141.84 frames.], batch size: 16, lr: 3.69e-04 2022-05-05 09:32:09,900 INFO [train.py:715] (0/8) Epoch 5, batch 31850, loss[loss=0.1426, simple_loss=0.2143, pruned_loss=0.03539, over 4789.00 frames.], tot_loss[loss=0.1523, simple_loss=0.2226, pruned_loss=0.041, over 971952.30 frames.], batch size: 17, lr: 3.69e-04 2022-05-05 09:32:49,445 INFO [train.py:715] (0/8) Epoch 5, batch 31900, loss[loss=0.1471, simple_loss=0.2225, pruned_loss=0.03591, over 4892.00 frames.], tot_loss[loss=0.1509, simple_loss=0.2214, pruned_loss=0.04019, over 971736.55 frames.], batch size: 22, lr: 3.69e-04 2022-05-05 09:33:28,129 INFO [train.py:715] (0/8) Epoch 5, batch 31950, loss[loss=0.1257, simple_loss=0.2009, pruned_loss=0.02522, over 4875.00 frames.], tot_loss[loss=0.1507, simple_loss=0.2213, pruned_loss=0.04003, over 972085.50 frames.], batch size: 16, lr: 3.69e-04 2022-05-05 09:34:06,671 INFO [train.py:715] (0/8) Epoch 5, batch 32000, loss[loss=0.2045, 
simple_loss=0.2706, pruned_loss=0.0692, over 4704.00 frames.], tot_loss[loss=0.1526, simple_loss=0.223, pruned_loss=0.04112, over 971577.20 frames.], batch size: 15, lr: 3.69e-04 2022-05-05 09:34:45,045 INFO [train.py:715] (0/8) Epoch 5, batch 32050, loss[loss=0.1585, simple_loss=0.2315, pruned_loss=0.04276, over 4976.00 frames.], tot_loss[loss=0.1531, simple_loss=0.2233, pruned_loss=0.04139, over 971531.66 frames.], batch size: 28, lr: 3.69e-04 2022-05-05 09:35:24,092 INFO [train.py:715] (0/8) Epoch 5, batch 32100, loss[loss=0.1486, simple_loss=0.228, pruned_loss=0.0346, over 4948.00 frames.], tot_loss[loss=0.1533, simple_loss=0.2239, pruned_loss=0.04139, over 970993.78 frames.], batch size: 23, lr: 3.69e-04 2022-05-05 09:36:02,962 INFO [train.py:715] (0/8) Epoch 5, batch 32150, loss[loss=0.1796, simple_loss=0.2478, pruned_loss=0.05564, over 4978.00 frames.], tot_loss[loss=0.1526, simple_loss=0.2233, pruned_loss=0.04094, over 969860.68 frames.], batch size: 24, lr: 3.69e-04 2022-05-05 09:36:41,520 INFO [train.py:715] (0/8) Epoch 5, batch 32200, loss[loss=0.1356, simple_loss=0.2126, pruned_loss=0.02928, over 4767.00 frames.], tot_loss[loss=0.1516, simple_loss=0.2225, pruned_loss=0.04034, over 970488.89 frames.], batch size: 14, lr: 3.69e-04 2022-05-05 09:37:20,072 INFO [train.py:715] (0/8) Epoch 5, batch 32250, loss[loss=0.1427, simple_loss=0.2137, pruned_loss=0.03584, over 4947.00 frames.], tot_loss[loss=0.1507, simple_loss=0.2218, pruned_loss=0.03978, over 971487.04 frames.], batch size: 29, lr: 3.69e-04 2022-05-05 09:37:59,208 INFO [train.py:715] (0/8) Epoch 5, batch 32300, loss[loss=0.1654, simple_loss=0.2451, pruned_loss=0.04284, over 4902.00 frames.], tot_loss[loss=0.1514, simple_loss=0.2223, pruned_loss=0.04027, over 971930.84 frames.], batch size: 17, lr: 3.69e-04 2022-05-05 09:38:37,800 INFO [train.py:715] (0/8) Epoch 5, batch 32350, loss[loss=0.1578, simple_loss=0.2241, pruned_loss=0.04582, over 4758.00 frames.], tot_loss[loss=0.1511, simple_loss=0.2221, pruned_loss=0.04001, over 972092.69 frames.], batch size: 16, lr: 3.69e-04 2022-05-05 09:39:16,503 INFO [train.py:715] (0/8) Epoch 5, batch 32400, loss[loss=0.1764, simple_loss=0.2354, pruned_loss=0.05872, over 4950.00 frames.], tot_loss[loss=0.1521, simple_loss=0.2226, pruned_loss=0.04083, over 972543.47 frames.], batch size: 35, lr: 3.69e-04 2022-05-05 09:39:55,114 INFO [train.py:715] (0/8) Epoch 5, batch 32450, loss[loss=0.1386, simple_loss=0.2097, pruned_loss=0.03369, over 4896.00 frames.], tot_loss[loss=0.1523, simple_loss=0.2229, pruned_loss=0.04087, over 972738.79 frames.], batch size: 19, lr: 3.69e-04 2022-05-05 09:40:33,911 INFO [train.py:715] (0/8) Epoch 5, batch 32500, loss[loss=0.1296, simple_loss=0.2162, pruned_loss=0.02152, over 4752.00 frames.], tot_loss[loss=0.1519, simple_loss=0.2224, pruned_loss=0.04067, over 972282.72 frames.], batch size: 19, lr: 3.69e-04 2022-05-05 09:41:13,464 INFO [train.py:715] (0/8) Epoch 5, batch 32550, loss[loss=0.1547, simple_loss=0.231, pruned_loss=0.03914, over 4671.00 frames.], tot_loss[loss=0.1523, simple_loss=0.2225, pruned_loss=0.04101, over 971318.50 frames.], batch size: 14, lr: 3.69e-04 2022-05-05 09:41:51,929 INFO [train.py:715] (0/8) Epoch 5, batch 32600, loss[loss=0.1421, simple_loss=0.2093, pruned_loss=0.03742, over 4803.00 frames.], tot_loss[loss=0.1515, simple_loss=0.2221, pruned_loss=0.04042, over 970806.99 frames.], batch size: 12, lr: 3.69e-04 2022-05-05 09:42:30,724 INFO [train.py:715] (0/8) Epoch 5, batch 32650, loss[loss=0.1521, simple_loss=0.2176, 
pruned_loss=0.04333, over 4766.00 frames.], tot_loss[loss=0.1513, simple_loss=0.2217, pruned_loss=0.04049, over 971721.52 frames.], batch size: 18, lr: 3.69e-04 2022-05-05 09:43:09,270 INFO [train.py:715] (0/8) Epoch 5, batch 32700, loss[loss=0.1362, simple_loss=0.2089, pruned_loss=0.03175, over 4903.00 frames.], tot_loss[loss=0.1511, simple_loss=0.2213, pruned_loss=0.04044, over 971454.28 frames.], batch size: 19, lr: 3.69e-04 2022-05-05 09:43:47,568 INFO [train.py:715] (0/8) Epoch 5, batch 32750, loss[loss=0.1537, simple_loss=0.2186, pruned_loss=0.04434, over 4842.00 frames.], tot_loss[loss=0.151, simple_loss=0.2217, pruned_loss=0.04016, over 972131.45 frames.], batch size: 32, lr: 3.69e-04 2022-05-05 09:44:26,284 INFO [train.py:715] (0/8) Epoch 5, batch 32800, loss[loss=0.163, simple_loss=0.2262, pruned_loss=0.0499, over 4865.00 frames.], tot_loss[loss=0.1517, simple_loss=0.2223, pruned_loss=0.04061, over 972663.55 frames.], batch size: 32, lr: 3.69e-04 2022-05-05 09:45:05,104 INFO [train.py:715] (0/8) Epoch 5, batch 32850, loss[loss=0.1255, simple_loss=0.1932, pruned_loss=0.02889, over 4824.00 frames.], tot_loss[loss=0.1521, simple_loss=0.2224, pruned_loss=0.04091, over 972980.36 frames.], batch size: 12, lr: 3.69e-04 2022-05-05 09:45:44,048 INFO [train.py:715] (0/8) Epoch 5, batch 32900, loss[loss=0.202, simple_loss=0.2797, pruned_loss=0.06211, over 4882.00 frames.], tot_loss[loss=0.1518, simple_loss=0.2221, pruned_loss=0.04073, over 973237.40 frames.], batch size: 16, lr: 3.69e-04 2022-05-05 09:46:22,919 INFO [train.py:715] (0/8) Epoch 5, batch 32950, loss[loss=0.153, simple_loss=0.2245, pruned_loss=0.04076, over 4988.00 frames.], tot_loss[loss=0.1516, simple_loss=0.222, pruned_loss=0.04061, over 972969.38 frames.], batch size: 25, lr: 3.68e-04 2022-05-05 09:47:01,970 INFO [train.py:715] (0/8) Epoch 5, batch 33000, loss[loss=0.1308, simple_loss=0.2013, pruned_loss=0.03016, over 4944.00 frames.], tot_loss[loss=0.1506, simple_loss=0.2212, pruned_loss=0.04007, over 973188.83 frames.], batch size: 29, lr: 3.68e-04 2022-05-05 09:47:01,971 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 09:47:11,685 INFO [train.py:742] (0/8) Epoch 5, validation: loss=0.1099, simple_loss=0.1951, pruned_loss=0.01236, over 914524.00 frames. 
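The validation summary just above and the surrounding per-batch records follow a regular enough layout that loss and learning-rate curves can be scraped directly from this log. Below is a minimal, self-contained parsing sketch; the regular expression is written against the field layout visible here and is an assumption about the rest of the file, not part of icefall's tooling, and the log-file path passed on the command line is hypothetical.

```python
# Hypothetical helper (not part of icefall): scrape epoch, batch, running
# tot_loss and learning rate from records formatted like the ones above.
import re
import sys

RECORD = re.compile(
    r"Epoch (\d+), batch (\d+), "
    r"loss\[.*?\], "
    r"tot_loss\[loss=([0-9.]+), simple_loss=([0-9.]+), pruned_loss=([0-9.]+), "
    r"over [0-9.]+ frames\.\], batch size: (\d+), lr: ([0-9.e-]+)"
)

def parse(path):
    # Records are sometimes wrapped across physical lines, so join the whole
    # file into one string before matching.
    with open(path) as f:
        text = f.read().replace("\n", " ")
    rows = []
    for m in RECORD.finditer(text):
        epoch, batch, loss, simple, pruned, bsz, lr = m.groups()
        rows.append({
            "epoch": int(epoch),
            "batch": int(batch),
            "tot_loss": float(loss),
            "simple_loss": float(simple),
            "pruned_loss": float(pruned),
            "batch_size": int(bsz),
            "lr": float(lr),
        })
    return rows

if __name__ == "__main__":
    # Print the last few parsed records as a quick smoke test.
    for row in parse(sys.argv[1])[-5:]:
        print(row)
```

Validation and checkpoint records are deliberately skipped by the pattern; only the per-batch training records are collected.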
2022-05-05 09:47:50,702 INFO [train.py:715] (0/8) Epoch 5, batch 33050, loss[loss=0.1521, simple_loss=0.2373, pruned_loss=0.03349, over 4927.00 frames.], tot_loss[loss=0.1513, simple_loss=0.222, pruned_loss=0.04025, over 973281.06 frames.], batch size: 29, lr: 3.68e-04 2022-05-05 09:48:29,613 INFO [train.py:715] (0/8) Epoch 5, batch 33100, loss[loss=0.1505, simple_loss=0.2149, pruned_loss=0.04311, over 4783.00 frames.], tot_loss[loss=0.151, simple_loss=0.222, pruned_loss=0.04002, over 972774.84 frames.], batch size: 17, lr: 3.68e-04 2022-05-05 09:49:07,620 INFO [train.py:715] (0/8) Epoch 5, batch 33150, loss[loss=0.1618, simple_loss=0.2364, pruned_loss=0.0436, over 4949.00 frames.], tot_loss[loss=0.1501, simple_loss=0.2211, pruned_loss=0.03957, over 972866.35 frames.], batch size: 21, lr: 3.68e-04 2022-05-05 09:49:46,214 INFO [train.py:715] (0/8) Epoch 5, batch 33200, loss[loss=0.1511, simple_loss=0.2213, pruned_loss=0.0404, over 4885.00 frames.], tot_loss[loss=0.1506, simple_loss=0.2217, pruned_loss=0.03975, over 972954.93 frames.], batch size: 19, lr: 3.68e-04 2022-05-05 09:50:25,076 INFO [train.py:715] (0/8) Epoch 5, batch 33250, loss[loss=0.1324, simple_loss=0.2118, pruned_loss=0.02647, over 4810.00 frames.], tot_loss[loss=0.1511, simple_loss=0.222, pruned_loss=0.04008, over 973070.44 frames.], batch size: 21, lr: 3.68e-04 2022-05-05 09:51:03,567 INFO [train.py:715] (0/8) Epoch 5, batch 33300, loss[loss=0.1388, simple_loss=0.212, pruned_loss=0.0328, over 4924.00 frames.], tot_loss[loss=0.1511, simple_loss=0.2222, pruned_loss=0.04, over 973254.69 frames.], batch size: 18, lr: 3.68e-04 2022-05-05 09:51:41,935 INFO [train.py:715] (0/8) Epoch 5, batch 33350, loss[loss=0.1657, simple_loss=0.2249, pruned_loss=0.05318, over 4849.00 frames.], tot_loss[loss=0.1507, simple_loss=0.2218, pruned_loss=0.03974, over 972218.39 frames.], batch size: 30, lr: 3.68e-04 2022-05-05 09:52:21,206 INFO [train.py:715] (0/8) Epoch 5, batch 33400, loss[loss=0.141, simple_loss=0.2147, pruned_loss=0.03362, over 4810.00 frames.], tot_loss[loss=0.1503, simple_loss=0.2218, pruned_loss=0.03943, over 972543.55 frames.], batch size: 25, lr: 3.68e-04 2022-05-05 09:52:59,898 INFO [train.py:715] (0/8) Epoch 5, batch 33450, loss[loss=0.1381, simple_loss=0.2133, pruned_loss=0.03147, over 4960.00 frames.], tot_loss[loss=0.1503, simple_loss=0.2219, pruned_loss=0.03938, over 973394.83 frames.], batch size: 35, lr: 3.68e-04 2022-05-05 09:53:38,248 INFO [train.py:715] (0/8) Epoch 5, batch 33500, loss[loss=0.1984, simple_loss=0.261, pruned_loss=0.06786, over 4912.00 frames.], tot_loss[loss=0.1516, simple_loss=0.2229, pruned_loss=0.0401, over 973032.72 frames.], batch size: 39, lr: 3.68e-04 2022-05-05 09:54:16,983 INFO [train.py:715] (0/8) Epoch 5, batch 33550, loss[loss=0.1462, simple_loss=0.2187, pruned_loss=0.0368, over 4806.00 frames.], tot_loss[loss=0.151, simple_loss=0.2217, pruned_loss=0.04016, over 973553.08 frames.], batch size: 15, lr: 3.68e-04 2022-05-05 09:54:55,687 INFO [train.py:715] (0/8) Epoch 5, batch 33600, loss[loss=0.1405, simple_loss=0.21, pruned_loss=0.03553, over 4865.00 frames.], tot_loss[loss=0.1509, simple_loss=0.222, pruned_loss=0.03994, over 973857.85 frames.], batch size: 32, lr: 3.68e-04 2022-05-05 09:55:34,351 INFO [train.py:715] (0/8) Epoch 5, batch 33650, loss[loss=0.1846, simple_loss=0.2596, pruned_loss=0.05485, over 4890.00 frames.], tot_loss[loss=0.1505, simple_loss=0.2217, pruned_loss=0.0397, over 973239.74 frames.], batch size: 16, lr: 3.68e-04 2022-05-05 09:56:12,629 INFO [train.py:715] (0/8) 
Epoch 5, batch 33700, loss[loss=0.1634, simple_loss=0.2271, pruned_loss=0.04983, over 4899.00 frames.], tot_loss[loss=0.1502, simple_loss=0.2213, pruned_loss=0.03953, over 973122.54 frames.], batch size: 19, lr: 3.68e-04 2022-05-05 09:56:51,509 INFO [train.py:715] (0/8) Epoch 5, batch 33750, loss[loss=0.1544, simple_loss=0.2156, pruned_loss=0.04664, over 4794.00 frames.], tot_loss[loss=0.1508, simple_loss=0.2218, pruned_loss=0.03996, over 972963.09 frames.], batch size: 18, lr: 3.68e-04 2022-05-05 09:57:30,152 INFO [train.py:715] (0/8) Epoch 5, batch 33800, loss[loss=0.1578, simple_loss=0.2353, pruned_loss=0.04012, over 4808.00 frames.], tot_loss[loss=0.1507, simple_loss=0.2219, pruned_loss=0.03978, over 972641.04 frames.], batch size: 24, lr: 3.68e-04 2022-05-05 09:58:09,138 INFO [train.py:715] (0/8) Epoch 5, batch 33850, loss[loss=0.1396, simple_loss=0.2208, pruned_loss=0.02925, over 4754.00 frames.], tot_loss[loss=0.1517, simple_loss=0.2227, pruned_loss=0.04032, over 972178.19 frames.], batch size: 19, lr: 3.68e-04 2022-05-05 09:58:47,621 INFO [train.py:715] (0/8) Epoch 5, batch 33900, loss[loss=0.1492, simple_loss=0.224, pruned_loss=0.0372, over 4867.00 frames.], tot_loss[loss=0.1519, simple_loss=0.2224, pruned_loss=0.04069, over 972116.30 frames.], batch size: 39, lr: 3.68e-04 2022-05-05 09:59:25,958 INFO [train.py:715] (0/8) Epoch 5, batch 33950, loss[loss=0.1421, simple_loss=0.2188, pruned_loss=0.03267, over 4760.00 frames.], tot_loss[loss=0.1535, simple_loss=0.2238, pruned_loss=0.0416, over 972239.23 frames.], batch size: 16, lr: 3.68e-04 2022-05-05 09:59:27,773 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-208000.pt 2022-05-05 10:00:06,987 INFO [train.py:715] (0/8) Epoch 5, batch 34000, loss[loss=0.1611, simple_loss=0.228, pruned_loss=0.04709, over 4910.00 frames.], tot_loss[loss=0.1542, simple_loss=0.2245, pruned_loss=0.04198, over 972345.76 frames.], batch size: 18, lr: 3.68e-04 2022-05-05 10:00:45,228 INFO [train.py:715] (0/8) Epoch 5, batch 34050, loss[loss=0.1529, simple_loss=0.2155, pruned_loss=0.04515, over 4785.00 frames.], tot_loss[loss=0.1535, simple_loss=0.2238, pruned_loss=0.04154, over 972717.30 frames.], batch size: 14, lr: 3.67e-04 2022-05-05 10:01:23,918 INFO [train.py:715] (0/8) Epoch 5, batch 34100, loss[loss=0.1579, simple_loss=0.2263, pruned_loss=0.04475, over 4637.00 frames.], tot_loss[loss=0.1528, simple_loss=0.2232, pruned_loss=0.04117, over 972621.89 frames.], batch size: 13, lr: 3.67e-04 2022-05-05 10:02:02,743 INFO [train.py:715] (0/8) Epoch 5, batch 34150, loss[loss=0.139, simple_loss=0.2115, pruned_loss=0.03323, over 4983.00 frames.], tot_loss[loss=0.1511, simple_loss=0.2215, pruned_loss=0.04037, over 971981.37 frames.], batch size: 39, lr: 3.67e-04 2022-05-05 10:02:41,105 INFO [train.py:715] (0/8) Epoch 5, batch 34200, loss[loss=0.1632, simple_loss=0.2426, pruned_loss=0.04192, over 4987.00 frames.], tot_loss[loss=0.1506, simple_loss=0.221, pruned_loss=0.0401, over 971520.91 frames.], batch size: 25, lr: 3.67e-04 2022-05-05 10:03:20,093 INFO [train.py:715] (0/8) Epoch 5, batch 34250, loss[loss=0.1397, simple_loss=0.2044, pruned_loss=0.03752, over 4813.00 frames.], tot_loss[loss=0.1503, simple_loss=0.2207, pruned_loss=0.03997, over 972044.72 frames.], batch size: 13, lr: 3.67e-04 2022-05-05 10:03:58,244 INFO [train.py:715] (0/8) Epoch 5, batch 34300, loss[loss=0.145, simple_loss=0.2201, pruned_loss=0.03501, over 4914.00 frames.], tot_loss[loss=0.1512, simple_loss=0.2218, pruned_loss=0.04033, over 
971578.92 frames.], batch size: 18, lr: 3.67e-04 2022-05-05 10:04:36,906 INFO [train.py:715] (0/8) Epoch 5, batch 34350, loss[loss=0.1426, simple_loss=0.2189, pruned_loss=0.0332, over 4850.00 frames.], tot_loss[loss=0.1509, simple_loss=0.2212, pruned_loss=0.04031, over 972186.04 frames.], batch size: 34, lr: 3.67e-04 2022-05-05 10:05:14,793 INFO [train.py:715] (0/8) Epoch 5, batch 34400, loss[loss=0.1369, simple_loss=0.2056, pruned_loss=0.03412, over 4833.00 frames.], tot_loss[loss=0.1506, simple_loss=0.2209, pruned_loss=0.04017, over 972792.35 frames.], batch size: 26, lr: 3.67e-04 2022-05-05 10:05:53,760 INFO [train.py:715] (0/8) Epoch 5, batch 34450, loss[loss=0.175, simple_loss=0.2511, pruned_loss=0.04949, over 4829.00 frames.], tot_loss[loss=0.1509, simple_loss=0.2216, pruned_loss=0.0401, over 973005.54 frames.], batch size: 15, lr: 3.67e-04 2022-05-05 10:06:32,731 INFO [train.py:715] (0/8) Epoch 5, batch 34500, loss[loss=0.1432, simple_loss=0.2122, pruned_loss=0.03706, over 4712.00 frames.], tot_loss[loss=0.1502, simple_loss=0.2211, pruned_loss=0.03966, over 972121.61 frames.], batch size: 15, lr: 3.67e-04 2022-05-05 10:07:11,199 INFO [train.py:715] (0/8) Epoch 5, batch 34550, loss[loss=0.1389, simple_loss=0.2087, pruned_loss=0.03452, over 4969.00 frames.], tot_loss[loss=0.1498, simple_loss=0.2209, pruned_loss=0.03936, over 971571.43 frames.], batch size: 35, lr: 3.67e-04 2022-05-05 10:07:49,948 INFO [train.py:715] (0/8) Epoch 5, batch 34600, loss[loss=0.1641, simple_loss=0.2307, pruned_loss=0.04874, over 4849.00 frames.], tot_loss[loss=0.1508, simple_loss=0.2214, pruned_loss=0.04011, over 971896.52 frames.], batch size: 32, lr: 3.67e-04 2022-05-05 10:08:28,651 INFO [train.py:715] (0/8) Epoch 5, batch 34650, loss[loss=0.1405, simple_loss=0.2097, pruned_loss=0.03566, over 4786.00 frames.], tot_loss[loss=0.1501, simple_loss=0.2209, pruned_loss=0.03964, over 971909.80 frames.], batch size: 14, lr: 3.67e-04 2022-05-05 10:09:07,576 INFO [train.py:715] (0/8) Epoch 5, batch 34700, loss[loss=0.1401, simple_loss=0.2102, pruned_loss=0.03498, over 4881.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2205, pruned_loss=0.03947, over 972089.49 frames.], batch size: 22, lr: 3.67e-04 2022-05-05 10:09:44,908 INFO [train.py:715] (0/8) Epoch 5, batch 34750, loss[loss=0.1475, simple_loss=0.2203, pruned_loss=0.03736, over 4759.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2203, pruned_loss=0.03942, over 971209.08 frames.], batch size: 16, lr: 3.67e-04 2022-05-05 10:10:21,602 INFO [train.py:715] (0/8) Epoch 5, batch 34800, loss[loss=0.1318, simple_loss=0.2006, pruned_loss=0.03152, over 4776.00 frames.], tot_loss[loss=0.1498, simple_loss=0.2202, pruned_loss=0.03968, over 969583.91 frames.], batch size: 12, lr: 3.67e-04 2022-05-05 10:10:30,200 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/epoch-5.pt 2022-05-05 10:11:11,223 INFO [train.py:715] (0/8) Epoch 6, batch 0, loss[loss=0.1218, simple_loss=0.1863, pruned_loss=0.02865, over 4964.00 frames.], tot_loss[loss=0.1218, simple_loss=0.1863, pruned_loss=0.02865, over 4964.00 frames.], batch size: 15, lr: 3.46e-04 2022-05-05 10:11:50,188 INFO [train.py:715] (0/8) Epoch 6, batch 50, loss[loss=0.1236, simple_loss=0.2036, pruned_loss=0.02179, over 4693.00 frames.], tot_loss[loss=0.1493, simple_loss=0.2212, pruned_loss=0.03873, over 219150.54 frames.], batch size: 15, lr: 3.46e-04 2022-05-05 10:12:29,108 INFO [train.py:715] (0/8) Epoch 6, batch 100, loss[loss=0.1512, simple_loss=0.2153, pruned_loss=0.04355, over 4757.00 
frames.], tot_loss[loss=0.1505, simple_loss=0.2219, pruned_loss=0.03954, over 385326.74 frames.], batch size: 18, lr: 3.46e-04 2022-05-05 10:13:08,347 INFO [train.py:715] (0/8) Epoch 6, batch 150, loss[loss=0.1458, simple_loss=0.2183, pruned_loss=0.03667, over 4990.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2211, pruned_loss=0.03914, over 516131.90 frames.], batch size: 25, lr: 3.46e-04 2022-05-05 10:13:47,631 INFO [train.py:715] (0/8) Epoch 6, batch 200, loss[loss=0.1157, simple_loss=0.1813, pruned_loss=0.02507, over 4802.00 frames.], tot_loss[loss=0.1507, simple_loss=0.2217, pruned_loss=0.03991, over 617622.92 frames.], batch size: 13, lr: 3.45e-04 2022-05-05 10:14:26,641 INFO [train.py:715] (0/8) Epoch 6, batch 250, loss[loss=0.1677, simple_loss=0.2277, pruned_loss=0.05386, over 4955.00 frames.], tot_loss[loss=0.151, simple_loss=0.2215, pruned_loss=0.0402, over 695894.13 frames.], batch size: 35, lr: 3.45e-04 2022-05-05 10:15:05,461 INFO [train.py:715] (0/8) Epoch 6, batch 300, loss[loss=0.1702, simple_loss=0.2422, pruned_loss=0.04913, over 4983.00 frames.], tot_loss[loss=0.1501, simple_loss=0.221, pruned_loss=0.03964, over 756576.73 frames.], batch size: 39, lr: 3.45e-04 2022-05-05 10:15:44,469 INFO [train.py:715] (0/8) Epoch 6, batch 350, loss[loss=0.1556, simple_loss=0.2317, pruned_loss=0.03981, over 4807.00 frames.], tot_loss[loss=0.151, simple_loss=0.2219, pruned_loss=0.04006, over 804189.30 frames.], batch size: 17, lr: 3.45e-04 2022-05-05 10:16:23,651 INFO [train.py:715] (0/8) Epoch 6, batch 400, loss[loss=0.1301, simple_loss=0.2053, pruned_loss=0.0275, over 4804.00 frames.], tot_loss[loss=0.1518, simple_loss=0.2228, pruned_loss=0.04045, over 841872.33 frames.], batch size: 24, lr: 3.45e-04 2022-05-05 10:17:02,408 INFO [train.py:715] (0/8) Epoch 6, batch 450, loss[loss=0.1683, simple_loss=0.2381, pruned_loss=0.04925, over 4942.00 frames.], tot_loss[loss=0.1517, simple_loss=0.2224, pruned_loss=0.04049, over 871217.82 frames.], batch size: 39, lr: 3.45e-04 2022-05-05 10:17:41,006 INFO [train.py:715] (0/8) Epoch 6, batch 500, loss[loss=0.1563, simple_loss=0.2191, pruned_loss=0.04673, over 4756.00 frames.], tot_loss[loss=0.1511, simple_loss=0.2221, pruned_loss=0.04004, over 893093.53 frames.], batch size: 16, lr: 3.45e-04 2022-05-05 10:18:20,496 INFO [train.py:715] (0/8) Epoch 6, batch 550, loss[loss=0.1335, simple_loss=0.199, pruned_loss=0.03402, over 4968.00 frames.], tot_loss[loss=0.1503, simple_loss=0.221, pruned_loss=0.03978, over 910530.38 frames.], batch size: 35, lr: 3.45e-04 2022-05-05 10:18:59,384 INFO [train.py:715] (0/8) Epoch 6, batch 600, loss[loss=0.1422, simple_loss=0.2262, pruned_loss=0.02909, over 4940.00 frames.], tot_loss[loss=0.1517, simple_loss=0.2222, pruned_loss=0.04058, over 924302.18 frames.], batch size: 23, lr: 3.45e-04 2022-05-05 10:19:38,400 INFO [train.py:715] (0/8) Epoch 6, batch 650, loss[loss=0.1562, simple_loss=0.2242, pruned_loss=0.04408, over 4937.00 frames.], tot_loss[loss=0.1508, simple_loss=0.2217, pruned_loss=0.03997, over 934245.50 frames.], batch size: 23, lr: 3.45e-04 2022-05-05 10:20:17,487 INFO [train.py:715] (0/8) Epoch 6, batch 700, loss[loss=0.1764, simple_loss=0.2515, pruned_loss=0.05067, over 4785.00 frames.], tot_loss[loss=0.1508, simple_loss=0.2214, pruned_loss=0.04008, over 942396.28 frames.], batch size: 21, lr: 3.45e-04 2022-05-05 10:20:57,079 INFO [train.py:715] (0/8) Epoch 6, batch 750, loss[loss=0.1216, simple_loss=0.1986, pruned_loss=0.02227, over 4969.00 frames.], tot_loss[loss=0.1512, simple_loss=0.2219, 
pruned_loss=0.04023, over 949583.94 frames.], batch size: 15, lr: 3.45e-04 2022-05-05 10:21:35,853 INFO [train.py:715] (0/8) Epoch 6, batch 800, loss[loss=0.1302, simple_loss=0.2093, pruned_loss=0.02559, over 4978.00 frames.], tot_loss[loss=0.1501, simple_loss=0.2209, pruned_loss=0.0396, over 954177.81 frames.], batch size: 28, lr: 3.45e-04 2022-05-05 10:22:14,568 INFO [train.py:715] (0/8) Epoch 6, batch 850, loss[loss=0.1812, simple_loss=0.2485, pruned_loss=0.05692, over 4967.00 frames.], tot_loss[loss=0.1509, simple_loss=0.2214, pruned_loss=0.04016, over 958586.77 frames.], batch size: 24, lr: 3.45e-04 2022-05-05 10:22:54,102 INFO [train.py:715] (0/8) Epoch 6, batch 900, loss[loss=0.1292, simple_loss=0.1971, pruned_loss=0.03067, over 4931.00 frames.], tot_loss[loss=0.1502, simple_loss=0.221, pruned_loss=0.03968, over 961221.66 frames.], batch size: 21, lr: 3.45e-04 2022-05-05 10:23:33,397 INFO [train.py:715] (0/8) Epoch 6, batch 950, loss[loss=0.1618, simple_loss=0.2326, pruned_loss=0.04554, over 4943.00 frames.], tot_loss[loss=0.15, simple_loss=0.2209, pruned_loss=0.03959, over 963196.16 frames.], batch size: 35, lr: 3.45e-04 2022-05-05 10:24:12,119 INFO [train.py:715] (0/8) Epoch 6, batch 1000, loss[loss=0.1243, simple_loss=0.2004, pruned_loss=0.02413, over 4968.00 frames.], tot_loss[loss=0.1495, simple_loss=0.2201, pruned_loss=0.03948, over 965302.79 frames.], batch size: 25, lr: 3.45e-04 2022-05-05 10:24:51,182 INFO [train.py:715] (0/8) Epoch 6, batch 1050, loss[loss=0.1365, simple_loss=0.2124, pruned_loss=0.03024, over 4828.00 frames.], tot_loss[loss=0.1503, simple_loss=0.2209, pruned_loss=0.03984, over 967273.65 frames.], batch size: 27, lr: 3.45e-04 2022-05-05 10:25:30,702 INFO [train.py:715] (0/8) Epoch 6, batch 1100, loss[loss=0.1628, simple_loss=0.2324, pruned_loss=0.04657, over 4845.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2205, pruned_loss=0.03946, over 968299.46 frames.], batch size: 30, lr: 3.45e-04 2022-05-05 10:26:09,923 INFO [train.py:715] (0/8) Epoch 6, batch 1150, loss[loss=0.1874, simple_loss=0.2381, pruned_loss=0.06835, over 4804.00 frames.], tot_loss[loss=0.1495, simple_loss=0.2203, pruned_loss=0.03937, over 968864.60 frames.], batch size: 14, lr: 3.45e-04 2022-05-05 10:26:48,492 INFO [train.py:715] (0/8) Epoch 6, batch 1200, loss[loss=0.1247, simple_loss=0.1977, pruned_loss=0.02582, over 4974.00 frames.], tot_loss[loss=0.1494, simple_loss=0.2205, pruned_loss=0.0392, over 969633.85 frames.], batch size: 25, lr: 3.45e-04 2022-05-05 10:27:28,193 INFO [train.py:715] (0/8) Epoch 6, batch 1250, loss[loss=0.1244, simple_loss=0.1969, pruned_loss=0.02598, over 4857.00 frames.], tot_loss[loss=0.1491, simple_loss=0.2198, pruned_loss=0.03916, over 970398.43 frames.], batch size: 32, lr: 3.45e-04 2022-05-05 10:28:07,469 INFO [train.py:715] (0/8) Epoch 6, batch 1300, loss[loss=0.1513, simple_loss=0.2204, pruned_loss=0.04107, over 4889.00 frames.], tot_loss[loss=0.15, simple_loss=0.2206, pruned_loss=0.03971, over 970655.85 frames.], batch size: 22, lr: 3.45e-04 2022-05-05 10:28:46,061 INFO [train.py:715] (0/8) Epoch 6, batch 1350, loss[loss=0.1643, simple_loss=0.2339, pruned_loss=0.04737, over 4855.00 frames.], tot_loss[loss=0.1499, simple_loss=0.2205, pruned_loss=0.03966, over 971253.48 frames.], batch size: 20, lr: 3.45e-04 2022-05-05 10:29:24,988 INFO [train.py:715] (0/8) Epoch 6, batch 1400, loss[loss=0.1792, simple_loss=0.2538, pruned_loss=0.05233, over 4815.00 frames.], tot_loss[loss=0.15, simple_loss=0.2205, pruned_loss=0.03973, over 971612.96 frames.], batch size: 
15, lr: 3.45e-04 2022-05-05 10:30:04,136 INFO [train.py:715] (0/8) Epoch 6, batch 1450, loss[loss=0.182, simple_loss=0.242, pruned_loss=0.06098, over 4697.00 frames.], tot_loss[loss=0.1505, simple_loss=0.2211, pruned_loss=0.03993, over 971018.55 frames.], batch size: 15, lr: 3.44e-04 2022-05-05 10:30:42,810 INFO [train.py:715] (0/8) Epoch 6, batch 1500, loss[loss=0.1517, simple_loss=0.22, pruned_loss=0.04169, over 4796.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2204, pruned_loss=0.03946, over 971973.49 frames.], batch size: 24, lr: 3.44e-04 2022-05-05 10:31:21,214 INFO [train.py:715] (0/8) Epoch 6, batch 1550, loss[loss=0.1419, simple_loss=0.2216, pruned_loss=0.03107, over 4813.00 frames.], tot_loss[loss=0.1501, simple_loss=0.2211, pruned_loss=0.03959, over 970829.18 frames.], batch size: 26, lr: 3.44e-04 2022-05-05 10:32:00,470 INFO [train.py:715] (0/8) Epoch 6, batch 1600, loss[loss=0.1906, simple_loss=0.2584, pruned_loss=0.06139, over 4881.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2205, pruned_loss=0.03937, over 971287.73 frames.], batch size: 32, lr: 3.44e-04 2022-05-05 10:32:40,005 INFO [train.py:715] (0/8) Epoch 6, batch 1650, loss[loss=0.1506, simple_loss=0.2314, pruned_loss=0.03493, over 4755.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2209, pruned_loss=0.03922, over 971270.64 frames.], batch size: 16, lr: 3.44e-04 2022-05-05 10:33:18,412 INFO [train.py:715] (0/8) Epoch 6, batch 1700, loss[loss=0.1498, simple_loss=0.2181, pruned_loss=0.04078, over 4946.00 frames.], tot_loss[loss=0.1501, simple_loss=0.2209, pruned_loss=0.03962, over 972566.79 frames.], batch size: 35, lr: 3.44e-04 2022-05-05 10:33:57,726 INFO [train.py:715] (0/8) Epoch 6, batch 1750, loss[loss=0.1278, simple_loss=0.1995, pruned_loss=0.02806, over 4991.00 frames.], tot_loss[loss=0.1503, simple_loss=0.2207, pruned_loss=0.03993, over 972358.30 frames.], batch size: 14, lr: 3.44e-04 2022-05-05 10:34:37,320 INFO [train.py:715] (0/8) Epoch 6, batch 1800, loss[loss=0.1353, simple_loss=0.2095, pruned_loss=0.03058, over 4826.00 frames.], tot_loss[loss=0.1509, simple_loss=0.221, pruned_loss=0.04037, over 972354.52 frames.], batch size: 26, lr: 3.44e-04 2022-05-05 10:35:16,401 INFO [train.py:715] (0/8) Epoch 6, batch 1850, loss[loss=0.1496, simple_loss=0.2247, pruned_loss=0.03725, over 4912.00 frames.], tot_loss[loss=0.1509, simple_loss=0.221, pruned_loss=0.04045, over 971517.47 frames.], batch size: 17, lr: 3.44e-04 2022-05-05 10:35:54,728 INFO [train.py:715] (0/8) Epoch 6, batch 1900, loss[loss=0.1862, simple_loss=0.2529, pruned_loss=0.05977, over 4749.00 frames.], tot_loss[loss=0.1498, simple_loss=0.2201, pruned_loss=0.03978, over 971300.10 frames.], batch size: 19, lr: 3.44e-04 2022-05-05 10:36:34,277 INFO [train.py:715] (0/8) Epoch 6, batch 1950, loss[loss=0.1356, simple_loss=0.2121, pruned_loss=0.02953, over 4781.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2201, pruned_loss=0.03965, over 971099.94 frames.], batch size: 18, lr: 3.44e-04 2022-05-05 10:37:13,031 INFO [train.py:715] (0/8) Epoch 6, batch 2000, loss[loss=0.1482, simple_loss=0.236, pruned_loss=0.03019, over 4955.00 frames.], tot_loss[loss=0.1503, simple_loss=0.2209, pruned_loss=0.03984, over 970975.35 frames.], batch size: 21, lr: 3.44e-04 2022-05-05 10:37:52,080 INFO [train.py:715] (0/8) Epoch 6, batch 2050, loss[loss=0.1585, simple_loss=0.2269, pruned_loss=0.04507, over 4863.00 frames.], tot_loss[loss=0.1498, simple_loss=0.2209, pruned_loss=0.03938, over 972174.31 frames.], batch size: 32, lr: 3.44e-04 2022-05-05 10:38:30,925 INFO 
[train.py:715] (0/8) Epoch 6, batch 2100, loss[loss=0.1422, simple_loss=0.2116, pruned_loss=0.03635, over 4989.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2206, pruned_loss=0.03936, over 971805.15 frames.], batch size: 25, lr: 3.44e-04 2022-05-05 10:39:10,109 INFO [train.py:715] (0/8) Epoch 6, batch 2150, loss[loss=0.1534, simple_loss=0.2164, pruned_loss=0.04522, over 4960.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2206, pruned_loss=0.03932, over 972483.67 frames.], batch size: 14, lr: 3.44e-04 2022-05-05 10:39:49,070 INFO [train.py:715] (0/8) Epoch 6, batch 2200, loss[loss=0.1545, simple_loss=0.2242, pruned_loss=0.04236, over 4845.00 frames.], tot_loss[loss=0.1498, simple_loss=0.2208, pruned_loss=0.03937, over 971229.96 frames.], batch size: 32, lr: 3.44e-04 2022-05-05 10:40:27,525 INFO [train.py:715] (0/8) Epoch 6, batch 2250, loss[loss=0.1446, simple_loss=0.2253, pruned_loss=0.03199, over 4804.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2205, pruned_loss=0.03947, over 970851.39 frames.], batch size: 25, lr: 3.44e-04 2022-05-05 10:41:06,873 INFO [train.py:715] (0/8) Epoch 6, batch 2300, loss[loss=0.1588, simple_loss=0.2298, pruned_loss=0.04388, over 4820.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2202, pruned_loss=0.03958, over 971587.30 frames.], batch size: 25, lr: 3.44e-04 2022-05-05 10:41:45,977 INFO [train.py:715] (0/8) Epoch 6, batch 2350, loss[loss=0.1398, simple_loss=0.2186, pruned_loss=0.03051, over 4917.00 frames.], tot_loss[loss=0.1491, simple_loss=0.2197, pruned_loss=0.03921, over 971485.51 frames.], batch size: 18, lr: 3.44e-04 2022-05-05 10:42:24,701 INFO [train.py:715] (0/8) Epoch 6, batch 2400, loss[loss=0.1496, simple_loss=0.2185, pruned_loss=0.04032, over 4755.00 frames.], tot_loss[loss=0.1493, simple_loss=0.2201, pruned_loss=0.03926, over 971633.25 frames.], batch size: 19, lr: 3.44e-04 2022-05-05 10:43:03,442 INFO [train.py:715] (0/8) Epoch 6, batch 2450, loss[loss=0.1423, simple_loss=0.2094, pruned_loss=0.03765, over 4774.00 frames.], tot_loss[loss=0.1497, simple_loss=0.22, pruned_loss=0.03972, over 970508.74 frames.], batch size: 17, lr: 3.44e-04 2022-05-05 10:43:42,674 INFO [train.py:715] (0/8) Epoch 6, batch 2500, loss[loss=0.1377, simple_loss=0.2144, pruned_loss=0.03048, over 4805.00 frames.], tot_loss[loss=0.1494, simple_loss=0.2197, pruned_loss=0.03957, over 971003.43 frames.], batch size: 24, lr: 3.44e-04 2022-05-05 10:44:21,855 INFO [train.py:715] (0/8) Epoch 6, batch 2550, loss[loss=0.1398, simple_loss=0.2042, pruned_loss=0.03764, over 4787.00 frames.], tot_loss[loss=0.1494, simple_loss=0.2198, pruned_loss=0.03949, over 971038.26 frames.], batch size: 14, lr: 3.44e-04 2022-05-05 10:45:00,763 INFO [train.py:715] (0/8) Epoch 6, batch 2600, loss[loss=0.1472, simple_loss=0.215, pruned_loss=0.03974, over 4848.00 frames.], tot_loss[loss=0.1503, simple_loss=0.2208, pruned_loss=0.03986, over 971727.78 frames.], batch size: 30, lr: 3.44e-04 2022-05-05 10:45:40,393 INFO [train.py:715] (0/8) Epoch 6, batch 2650, loss[loss=0.177, simple_loss=0.2417, pruned_loss=0.05618, over 4880.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2204, pruned_loss=0.03943, over 972102.72 frames.], batch size: 16, lr: 3.43e-04 2022-05-05 10:46:19,958 INFO [train.py:715] (0/8) Epoch 6, batch 2700, loss[loss=0.1409, simple_loss=0.2172, pruned_loss=0.03226, over 4975.00 frames.], tot_loss[loss=0.1502, simple_loss=0.221, pruned_loss=0.03974, over 971865.27 frames.], batch size: 24, lr: 3.43e-04 2022-05-05 10:46:58,100 INFO [train.py:715] (0/8) Epoch 6, batch 2750, 
loss[loss=0.1443, simple_loss=0.2079, pruned_loss=0.04033, over 4644.00 frames.], tot_loss[loss=0.1506, simple_loss=0.2214, pruned_loss=0.03989, over 971703.09 frames.], batch size: 13, lr: 3.43e-04 2022-05-05 10:47:37,111 INFO [train.py:715] (0/8) Epoch 6, batch 2800, loss[loss=0.1202, simple_loss=0.1892, pruned_loss=0.02561, over 4748.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2207, pruned_loss=0.03932, over 971922.22 frames.], batch size: 16, lr: 3.43e-04 2022-05-05 10:48:16,470 INFO [train.py:715] (0/8) Epoch 6, batch 2850, loss[loss=0.1467, simple_loss=0.2153, pruned_loss=0.03905, over 4944.00 frames.], tot_loss[loss=0.1498, simple_loss=0.2207, pruned_loss=0.03941, over 972483.49 frames.], batch size: 39, lr: 3.43e-04 2022-05-05 10:48:55,295 INFO [train.py:715] (0/8) Epoch 6, batch 2900, loss[loss=0.1169, simple_loss=0.193, pruned_loss=0.02042, over 4786.00 frames.], tot_loss[loss=0.1504, simple_loss=0.2214, pruned_loss=0.03975, over 973146.68 frames.], batch size: 13, lr: 3.43e-04 2022-05-05 10:49:33,635 INFO [train.py:715] (0/8) Epoch 6, batch 2950, loss[loss=0.1357, simple_loss=0.2085, pruned_loss=0.03142, over 4811.00 frames.], tot_loss[loss=0.1507, simple_loss=0.2218, pruned_loss=0.03979, over 973657.18 frames.], batch size: 21, lr: 3.43e-04 2022-05-05 10:50:12,861 INFO [train.py:715] (0/8) Epoch 6, batch 3000, loss[loss=0.1499, simple_loss=0.2208, pruned_loss=0.03944, over 4882.00 frames.], tot_loss[loss=0.1504, simple_loss=0.2214, pruned_loss=0.03967, over 973675.05 frames.], batch size: 20, lr: 3.43e-04 2022-05-05 10:50:12,862 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 10:50:22,539 INFO [train.py:742] (0/8) Epoch 6, validation: loss=0.1095, simple_loss=0.1945, pruned_loss=0.01223, over 914524.00 frames. 2022-05-05 10:51:02,170 INFO [train.py:715] (0/8) Epoch 6, batch 3050, loss[loss=0.1704, simple_loss=0.2436, pruned_loss=0.0486, over 4888.00 frames.], tot_loss[loss=0.1508, simple_loss=0.2218, pruned_loss=0.03989, over 972895.69 frames.], batch size: 19, lr: 3.43e-04 2022-05-05 10:51:41,563 INFO [train.py:715] (0/8) Epoch 6, batch 3100, loss[loss=0.1727, simple_loss=0.2375, pruned_loss=0.05391, over 4846.00 frames.], tot_loss[loss=0.15, simple_loss=0.2212, pruned_loss=0.03942, over 972249.87 frames.], batch size: 30, lr: 3.43e-04 2022-05-05 10:52:20,128 INFO [train.py:715] (0/8) Epoch 6, batch 3150, loss[loss=0.1712, simple_loss=0.2471, pruned_loss=0.04767, over 4830.00 frames.], tot_loss[loss=0.1508, simple_loss=0.222, pruned_loss=0.03984, over 972009.58 frames.], batch size: 12, lr: 3.43e-04 2022-05-05 10:52:58,777 INFO [train.py:715] (0/8) Epoch 6, batch 3200, loss[loss=0.1454, simple_loss=0.2151, pruned_loss=0.03791, over 4912.00 frames.], tot_loss[loss=0.1509, simple_loss=0.2222, pruned_loss=0.03983, over 971489.35 frames.], batch size: 18, lr: 3.43e-04 2022-05-05 10:53:38,604 INFO [train.py:715] (0/8) Epoch 6, batch 3250, loss[loss=0.134, simple_loss=0.2025, pruned_loss=0.03278, over 4799.00 frames.], tot_loss[loss=0.1519, simple_loss=0.2229, pruned_loss=0.04045, over 971269.40 frames.], batch size: 21, lr: 3.43e-04 2022-05-05 10:54:17,331 INFO [train.py:715] (0/8) Epoch 6, batch 3300, loss[loss=0.1373, simple_loss=0.2138, pruned_loss=0.03037, over 4869.00 frames.], tot_loss[loss=0.1521, simple_loss=0.2233, pruned_loss=0.04042, over 971733.81 frames.], batch size: 20, lr: 3.43e-04 2022-05-05 10:54:55,860 INFO [train.py:715] (0/8) Epoch 6, batch 3350, loss[loss=0.1653, simple_loss=0.2383, pruned_loss=0.04613, over 4731.00 frames.], 
tot_loss[loss=0.1519, simple_loss=0.2231, pruned_loss=0.04041, over 972179.08 frames.], batch size: 12, lr: 3.43e-04 2022-05-05 10:55:35,255 INFO [train.py:715] (0/8) Epoch 6, batch 3400, loss[loss=0.1655, simple_loss=0.2461, pruned_loss=0.04247, over 4906.00 frames.], tot_loss[loss=0.1527, simple_loss=0.2237, pruned_loss=0.04091, over 973007.22 frames.], batch size: 19, lr: 3.43e-04 2022-05-05 10:56:14,430 INFO [train.py:715] (0/8) Epoch 6, batch 3450, loss[loss=0.1468, simple_loss=0.2236, pruned_loss=0.03499, over 4842.00 frames.], tot_loss[loss=0.1531, simple_loss=0.2239, pruned_loss=0.04115, over 973358.73 frames.], batch size: 26, lr: 3.43e-04 2022-05-05 10:56:52,538 INFO [train.py:715] (0/8) Epoch 6, batch 3500, loss[loss=0.1398, simple_loss=0.2089, pruned_loss=0.03529, over 4696.00 frames.], tot_loss[loss=0.1523, simple_loss=0.2229, pruned_loss=0.04086, over 973264.91 frames.], batch size: 15, lr: 3.43e-04 2022-05-05 10:57:31,366 INFO [train.py:715] (0/8) Epoch 6, batch 3550, loss[loss=0.1484, simple_loss=0.219, pruned_loss=0.03888, over 4920.00 frames.], tot_loss[loss=0.1533, simple_loss=0.224, pruned_loss=0.04135, over 974056.33 frames.], batch size: 23, lr: 3.43e-04 2022-05-05 10:58:10,827 INFO [train.py:715] (0/8) Epoch 6, batch 3600, loss[loss=0.1264, simple_loss=0.1912, pruned_loss=0.03077, over 4843.00 frames.], tot_loss[loss=0.1523, simple_loss=0.223, pruned_loss=0.04082, over 974001.74 frames.], batch size: 12, lr: 3.43e-04 2022-05-05 10:58:49,769 INFO [train.py:715] (0/8) Epoch 6, batch 3650, loss[loss=0.1418, simple_loss=0.213, pruned_loss=0.03534, over 4982.00 frames.], tot_loss[loss=0.1523, simple_loss=0.2231, pruned_loss=0.04073, over 973381.19 frames.], batch size: 28, lr: 3.43e-04 2022-05-05 10:59:27,962 INFO [train.py:715] (0/8) Epoch 6, batch 3700, loss[loss=0.1536, simple_loss=0.2201, pruned_loss=0.04359, over 4839.00 frames.], tot_loss[loss=0.1518, simple_loss=0.2225, pruned_loss=0.04056, over 972945.06 frames.], batch size: 15, lr: 3.43e-04 2022-05-05 11:00:07,224 INFO [train.py:715] (0/8) Epoch 6, batch 3750, loss[loss=0.1245, simple_loss=0.2042, pruned_loss=0.0224, over 4800.00 frames.], tot_loss[loss=0.1516, simple_loss=0.2221, pruned_loss=0.04057, over 971769.91 frames.], batch size: 21, lr: 3.43e-04 2022-05-05 11:00:46,314 INFO [train.py:715] (0/8) Epoch 6, batch 3800, loss[loss=0.143, simple_loss=0.2088, pruned_loss=0.03864, over 4780.00 frames.], tot_loss[loss=0.1509, simple_loss=0.2211, pruned_loss=0.04033, over 973021.56 frames.], batch size: 17, lr: 3.43e-04 2022-05-05 11:01:24,434 INFO [train.py:715] (0/8) Epoch 6, batch 3850, loss[loss=0.1256, simple_loss=0.2056, pruned_loss=0.02276, over 4846.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2199, pruned_loss=0.03965, over 972989.62 frames.], batch size: 26, lr: 3.43e-04 2022-05-05 11:02:03,348 INFO [train.py:715] (0/8) Epoch 6, batch 3900, loss[loss=0.1634, simple_loss=0.2244, pruned_loss=0.05117, over 4775.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2194, pruned_loss=0.03904, over 972094.93 frames.], batch size: 19, lr: 3.42e-04 2022-05-05 11:02:42,647 INFO [train.py:715] (0/8) Epoch 6, batch 3950, loss[loss=0.1642, simple_loss=0.2229, pruned_loss=0.05279, over 4894.00 frames.], tot_loss[loss=0.1486, simple_loss=0.2193, pruned_loss=0.039, over 972317.53 frames.], batch size: 38, lr: 3.42e-04 2022-05-05 11:03:21,704 INFO [train.py:715] (0/8) Epoch 6, batch 4000, loss[loss=0.1336, simple_loss=0.2106, pruned_loss=0.02835, over 4902.00 frames.], tot_loss[loss=0.149, simple_loss=0.2192, 
pruned_loss=0.03941, over 973391.30 frames.], batch size: 19, lr: 3.42e-04 2022-05-05 11:04:00,009 INFO [train.py:715] (0/8) Epoch 6, batch 4050, loss[loss=0.14, simple_loss=0.2252, pruned_loss=0.02745, over 4937.00 frames.], tot_loss[loss=0.1484, simple_loss=0.2189, pruned_loss=0.03901, over 973499.04 frames.], batch size: 23, lr: 3.42e-04 2022-05-05 11:04:39,115 INFO [train.py:715] (0/8) Epoch 6, batch 4100, loss[loss=0.1401, simple_loss=0.2163, pruned_loss=0.03192, over 4756.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2191, pruned_loss=0.03911, over 972686.48 frames.], batch size: 16, lr: 3.42e-04 2022-05-05 11:05:17,848 INFO [train.py:715] (0/8) Epoch 6, batch 4150, loss[loss=0.1401, simple_loss=0.2074, pruned_loss=0.03644, over 4778.00 frames.], tot_loss[loss=0.148, simple_loss=0.2186, pruned_loss=0.03867, over 972663.32 frames.], batch size: 14, lr: 3.42e-04 2022-05-05 11:05:56,004 INFO [train.py:715] (0/8) Epoch 6, batch 4200, loss[loss=0.1577, simple_loss=0.2323, pruned_loss=0.04152, over 4843.00 frames.], tot_loss[loss=0.1481, simple_loss=0.219, pruned_loss=0.03861, over 972012.99 frames.], batch size: 26, lr: 3.42e-04 2022-05-05 11:06:34,722 INFO [train.py:715] (0/8) Epoch 6, batch 4250, loss[loss=0.1299, simple_loss=0.2057, pruned_loss=0.02704, over 4934.00 frames.], tot_loss[loss=0.1483, simple_loss=0.2192, pruned_loss=0.03866, over 972248.80 frames.], batch size: 21, lr: 3.42e-04 2022-05-05 11:07:13,784 INFO [train.py:715] (0/8) Epoch 6, batch 4300, loss[loss=0.1443, simple_loss=0.2048, pruned_loss=0.04188, over 4858.00 frames.], tot_loss[loss=0.1499, simple_loss=0.2207, pruned_loss=0.03953, over 972585.10 frames.], batch size: 32, lr: 3.42e-04 2022-05-05 11:07:52,582 INFO [train.py:715] (0/8) Epoch 6, batch 4350, loss[loss=0.1429, simple_loss=0.2104, pruned_loss=0.03766, over 4907.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2203, pruned_loss=0.03949, over 972357.70 frames.], batch size: 18, lr: 3.42e-04 2022-05-05 11:08:30,485 INFO [train.py:715] (0/8) Epoch 6, batch 4400, loss[loss=0.1608, simple_loss=0.2343, pruned_loss=0.04367, over 4949.00 frames.], tot_loss[loss=0.1491, simple_loss=0.2198, pruned_loss=0.03922, over 972654.89 frames.], batch size: 24, lr: 3.42e-04 2022-05-05 11:09:08,940 INFO [train.py:715] (0/8) Epoch 6, batch 4450, loss[loss=0.1342, simple_loss=0.2106, pruned_loss=0.02892, over 4889.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2201, pruned_loss=0.03968, over 972164.08 frames.], batch size: 19, lr: 3.42e-04 2022-05-05 11:09:48,070 INFO [train.py:715] (0/8) Epoch 6, batch 4500, loss[loss=0.1658, simple_loss=0.2489, pruned_loss=0.04138, over 4907.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2202, pruned_loss=0.03957, over 972250.33 frames.], batch size: 39, lr: 3.42e-04 2022-05-05 11:10:26,351 INFO [train.py:715] (0/8) Epoch 6, batch 4550, loss[loss=0.1682, simple_loss=0.2308, pruned_loss=0.05277, over 4746.00 frames.], tot_loss[loss=0.1508, simple_loss=0.2212, pruned_loss=0.04019, over 972362.99 frames.], batch size: 16, lr: 3.42e-04 2022-05-05 11:11:04,817 INFO [train.py:715] (0/8) Epoch 6, batch 4600, loss[loss=0.1342, simple_loss=0.2164, pruned_loss=0.02596, over 4931.00 frames.], tot_loss[loss=0.1518, simple_loss=0.2219, pruned_loss=0.04087, over 972796.13 frames.], batch size: 21, lr: 3.42e-04 2022-05-05 11:11:44,222 INFO [train.py:715] (0/8) Epoch 6, batch 4650, loss[loss=0.1481, simple_loss=0.2188, pruned_loss=0.03869, over 4712.00 frames.], tot_loss[loss=0.1512, simple_loss=0.2212, pruned_loss=0.04062, over 972655.52 frames.], 
batch size: 12, lr: 3.42e-04 2022-05-05 11:12:23,346 INFO [train.py:715] (0/8) Epoch 6, batch 4700, loss[loss=0.1703, simple_loss=0.2418, pruned_loss=0.04945, over 4732.00 frames.], tot_loss[loss=0.1507, simple_loss=0.221, pruned_loss=0.04017, over 972806.16 frames.], batch size: 16, lr: 3.42e-04 2022-05-05 11:13:01,629 INFO [train.py:715] (0/8) Epoch 6, batch 4750, loss[loss=0.1693, simple_loss=0.232, pruned_loss=0.05325, over 4909.00 frames.], tot_loss[loss=0.1498, simple_loss=0.2203, pruned_loss=0.03969, over 972411.24 frames.], batch size: 19, lr: 3.42e-04 2022-05-05 11:13:40,643 INFO [train.py:715] (0/8) Epoch 6, batch 4800, loss[loss=0.1348, simple_loss=0.2085, pruned_loss=0.03053, over 4863.00 frames.], tot_loss[loss=0.149, simple_loss=0.2193, pruned_loss=0.0393, over 972591.45 frames.], batch size: 32, lr: 3.42e-04 2022-05-05 11:14:19,738 INFO [train.py:715] (0/8) Epoch 6, batch 4850, loss[loss=0.1402, simple_loss=0.2104, pruned_loss=0.03497, over 4965.00 frames.], tot_loss[loss=0.1488, simple_loss=0.2192, pruned_loss=0.03925, over 973007.18 frames.], batch size: 15, lr: 3.42e-04 2022-05-05 11:14:58,280 INFO [train.py:715] (0/8) Epoch 6, batch 4900, loss[loss=0.1503, simple_loss=0.2293, pruned_loss=0.0356, over 4880.00 frames.], tot_loss[loss=0.1492, simple_loss=0.2198, pruned_loss=0.03934, over 972614.46 frames.], batch size: 19, lr: 3.42e-04 2022-05-05 11:15:37,161 INFO [train.py:715] (0/8) Epoch 6, batch 4950, loss[loss=0.1314, simple_loss=0.2028, pruned_loss=0.03002, over 4919.00 frames.], tot_loss[loss=0.1484, simple_loss=0.2191, pruned_loss=0.03883, over 972487.78 frames.], batch size: 23, lr: 3.42e-04 2022-05-05 11:16:16,917 INFO [train.py:715] (0/8) Epoch 6, batch 5000, loss[loss=0.1852, simple_loss=0.2464, pruned_loss=0.062, over 4971.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2192, pruned_loss=0.03906, over 972813.32 frames.], batch size: 35, lr: 3.42e-04 2022-05-05 11:16:55,988 INFO [train.py:715] (0/8) Epoch 6, batch 5050, loss[loss=0.1276, simple_loss=0.2103, pruned_loss=0.02246, over 4987.00 frames.], tot_loss[loss=0.1484, simple_loss=0.2187, pruned_loss=0.03899, over 972995.07 frames.], batch size: 25, lr: 3.42e-04 2022-05-05 11:17:34,326 INFO [train.py:715] (0/8) Epoch 6, batch 5100, loss[loss=0.1477, simple_loss=0.2318, pruned_loss=0.03176, over 4936.00 frames.], tot_loss[loss=0.1479, simple_loss=0.2184, pruned_loss=0.03877, over 973777.91 frames.], batch size: 29, lr: 3.42e-04 2022-05-05 11:18:13,251 INFO [train.py:715] (0/8) Epoch 6, batch 5150, loss[loss=0.1578, simple_loss=0.2277, pruned_loss=0.04399, over 4763.00 frames.], tot_loss[loss=0.1484, simple_loss=0.2186, pruned_loss=0.03908, over 972695.09 frames.], batch size: 19, lr: 3.41e-04 2022-05-05 11:18:52,358 INFO [train.py:715] (0/8) Epoch 6, batch 5200, loss[loss=0.1576, simple_loss=0.2307, pruned_loss=0.04231, over 4939.00 frames.], tot_loss[loss=0.1493, simple_loss=0.2196, pruned_loss=0.03947, over 972015.90 frames.], batch size: 21, lr: 3.41e-04 2022-05-05 11:19:30,488 INFO [train.py:715] (0/8) Epoch 6, batch 5250, loss[loss=0.1355, simple_loss=0.2005, pruned_loss=0.03522, over 4925.00 frames.], tot_loss[loss=0.1492, simple_loss=0.2194, pruned_loss=0.03943, over 972021.65 frames.], batch size: 23, lr: 3.41e-04 2022-05-05 11:20:09,573 INFO [train.py:715] (0/8) Epoch 6, batch 5300, loss[loss=0.1432, simple_loss=0.2209, pruned_loss=0.03276, over 4959.00 frames.], tot_loss[loss=0.1486, simple_loss=0.2192, pruned_loss=0.03899, over 972189.27 frames.], batch size: 39, lr: 3.41e-04 2022-05-05 
11:20:48,891 INFO [train.py:715] (0/8) Epoch 6, batch 5350, loss[loss=0.1476, simple_loss=0.225, pruned_loss=0.03509, over 4789.00 frames.], tot_loss[loss=0.1488, simple_loss=0.2192, pruned_loss=0.03922, over 971809.70 frames.], batch size: 17, lr: 3.41e-04 2022-05-05 11:21:27,963 INFO [train.py:715] (0/8) Epoch 6, batch 5400, loss[loss=0.1335, simple_loss=0.2013, pruned_loss=0.03285, over 4877.00 frames.], tot_loss[loss=0.1492, simple_loss=0.22, pruned_loss=0.03918, over 972072.59 frames.], batch size: 22, lr: 3.41e-04 2022-05-05 11:22:06,516 INFO [train.py:715] (0/8) Epoch 6, batch 5450, loss[loss=0.1308, simple_loss=0.2115, pruned_loss=0.02509, over 4839.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2205, pruned_loss=0.03938, over 972423.71 frames.], batch size: 30, lr: 3.41e-04 2022-05-05 11:22:45,314 INFO [train.py:715] (0/8) Epoch 6, batch 5500, loss[loss=0.1805, simple_loss=0.2423, pruned_loss=0.05941, over 4720.00 frames.], tot_loss[loss=0.1502, simple_loss=0.2209, pruned_loss=0.03969, over 971623.98 frames.], batch size: 15, lr: 3.41e-04 2022-05-05 11:23:24,189 INFO [train.py:715] (0/8) Epoch 6, batch 5550, loss[loss=0.1484, simple_loss=0.2289, pruned_loss=0.03399, over 4810.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2205, pruned_loss=0.03933, over 970912.83 frames.], batch size: 25, lr: 3.41e-04 2022-05-05 11:24:02,779 INFO [train.py:715] (0/8) Epoch 6, batch 5600, loss[loss=0.1788, simple_loss=0.2339, pruned_loss=0.06187, over 4972.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2204, pruned_loss=0.03941, over 970633.15 frames.], batch size: 14, lr: 3.41e-04 2022-05-05 11:24:42,271 INFO [train.py:715] (0/8) Epoch 6, batch 5650, loss[loss=0.1215, simple_loss=0.1775, pruned_loss=0.03276, over 4771.00 frames.], tot_loss[loss=0.1498, simple_loss=0.2207, pruned_loss=0.03941, over 970265.19 frames.], batch size: 12, lr: 3.41e-04 2022-05-05 11:25:21,623 INFO [train.py:715] (0/8) Epoch 6, batch 5700, loss[loss=0.1619, simple_loss=0.2349, pruned_loss=0.0444, over 4753.00 frames.], tot_loss[loss=0.15, simple_loss=0.2209, pruned_loss=0.03956, over 971018.97 frames.], batch size: 16, lr: 3.41e-04 2022-05-05 11:26:00,235 INFO [train.py:715] (0/8) Epoch 6, batch 5750, loss[loss=0.1254, simple_loss=0.1993, pruned_loss=0.02574, over 4818.00 frames.], tot_loss[loss=0.1493, simple_loss=0.2204, pruned_loss=0.03905, over 971189.16 frames.], batch size: 25, lr: 3.41e-04 2022-05-05 11:26:38,641 INFO [train.py:715] (0/8) Epoch 6, batch 5800, loss[loss=0.1421, simple_loss=0.2217, pruned_loss=0.03127, over 4792.00 frames.], tot_loss[loss=0.1486, simple_loss=0.2201, pruned_loss=0.03853, over 971739.87 frames.], batch size: 18, lr: 3.41e-04 2022-05-05 11:27:17,532 INFO [train.py:715] (0/8) Epoch 6, batch 5850, loss[loss=0.1619, simple_loss=0.2471, pruned_loss=0.03834, over 4804.00 frames.], tot_loss[loss=0.149, simple_loss=0.2202, pruned_loss=0.03891, over 972535.78 frames.], batch size: 18, lr: 3.41e-04 2022-05-05 11:27:56,991 INFO [train.py:715] (0/8) Epoch 6, batch 5900, loss[loss=0.1188, simple_loss=0.1949, pruned_loss=0.02132, over 4764.00 frames.], tot_loss[loss=0.1495, simple_loss=0.2207, pruned_loss=0.0391, over 972516.03 frames.], batch size: 16, lr: 3.41e-04 2022-05-05 11:28:34,906 INFO [train.py:715] (0/8) Epoch 6, batch 5950, loss[loss=0.1554, simple_loss=0.2318, pruned_loss=0.03949, over 4823.00 frames.], tot_loss[loss=0.1499, simple_loss=0.2214, pruned_loss=0.03917, over 972382.94 frames.], batch size: 25, lr: 3.41e-04 2022-05-05 11:29:14,283 INFO [train.py:715] (0/8) Epoch 6, 
batch 6000, loss[loss=0.148, simple_loss=0.2157, pruned_loss=0.04012, over 4773.00 frames.], tot_loss[loss=0.1499, simple_loss=0.2212, pruned_loss=0.03926, over 971364.04 frames.], batch size: 14, lr: 3.41e-04 2022-05-05 11:29:14,285 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 11:29:24,855 INFO [train.py:742] (0/8) Epoch 6, validation: loss=0.1095, simple_loss=0.1945, pruned_loss=0.01229, over 914524.00 frames. 2022-05-05 11:30:04,466 INFO [train.py:715] (0/8) Epoch 6, batch 6050, loss[loss=0.2102, simple_loss=0.2766, pruned_loss=0.07186, over 4743.00 frames.], tot_loss[loss=0.1492, simple_loss=0.2204, pruned_loss=0.03904, over 971588.60 frames.], batch size: 16, lr: 3.41e-04 2022-05-05 11:30:43,724 INFO [train.py:715] (0/8) Epoch 6, batch 6100, loss[loss=0.1609, simple_loss=0.2306, pruned_loss=0.04561, over 4758.00 frames.], tot_loss[loss=0.1496, simple_loss=0.221, pruned_loss=0.03915, over 972002.77 frames.], batch size: 19, lr: 3.41e-04 2022-05-05 11:31:23,120 INFO [train.py:715] (0/8) Epoch 6, batch 6150, loss[loss=0.1354, simple_loss=0.2085, pruned_loss=0.03118, over 4696.00 frames.], tot_loss[loss=0.1499, simple_loss=0.2215, pruned_loss=0.03915, over 973024.77 frames.], batch size: 15, lr: 3.41e-04 2022-05-05 11:32:01,615 INFO [train.py:715] (0/8) Epoch 6, batch 6200, loss[loss=0.1679, simple_loss=0.2418, pruned_loss=0.04696, over 4891.00 frames.], tot_loss[loss=0.1489, simple_loss=0.2204, pruned_loss=0.03863, over 973638.50 frames.], batch size: 22, lr: 3.41e-04 2022-05-05 11:32:40,932 INFO [train.py:715] (0/8) Epoch 6, batch 6250, loss[loss=0.1641, simple_loss=0.2458, pruned_loss=0.04121, over 4796.00 frames.], tot_loss[loss=0.1487, simple_loss=0.22, pruned_loss=0.03865, over 974262.30 frames.], batch size: 18, lr: 3.41e-04 2022-05-05 11:33:20,230 INFO [train.py:715] (0/8) Epoch 6, batch 6300, loss[loss=0.1545, simple_loss=0.2187, pruned_loss=0.04514, over 4921.00 frames.], tot_loss[loss=0.1483, simple_loss=0.2199, pruned_loss=0.03836, over 973565.26 frames.], batch size: 18, lr: 3.41e-04 2022-05-05 11:33:58,704 INFO [train.py:715] (0/8) Epoch 6, batch 6350, loss[loss=0.135, simple_loss=0.207, pruned_loss=0.03148, over 4818.00 frames.], tot_loss[loss=0.1495, simple_loss=0.2207, pruned_loss=0.03918, over 973099.30 frames.], batch size: 27, lr: 3.41e-04 2022-05-05 11:34:37,339 INFO [train.py:715] (0/8) Epoch 6, batch 6400, loss[loss=0.1357, simple_loss=0.2087, pruned_loss=0.03133, over 4832.00 frames.], tot_loss[loss=0.149, simple_loss=0.2204, pruned_loss=0.03886, over 973116.15 frames.], batch size: 27, lr: 3.40e-04 2022-05-05 11:35:16,565 INFO [train.py:715] (0/8) Epoch 6, batch 6450, loss[loss=0.1358, simple_loss=0.2142, pruned_loss=0.02869, over 4975.00 frames.], tot_loss[loss=0.1505, simple_loss=0.2219, pruned_loss=0.0395, over 973258.03 frames.], batch size: 35, lr: 3.40e-04 2022-05-05 11:35:55,384 INFO [train.py:715] (0/8) Epoch 6, batch 6500, loss[loss=0.1447, simple_loss=0.2129, pruned_loss=0.03827, over 4930.00 frames.], tot_loss[loss=0.1504, simple_loss=0.2213, pruned_loss=0.0397, over 972994.73 frames.], batch size: 23, lr: 3.40e-04 2022-05-05 11:36:33,965 INFO [train.py:715] (0/8) Epoch 6, batch 6550, loss[loss=0.1742, simple_loss=0.2398, pruned_loss=0.05424, over 4870.00 frames.], tot_loss[loss=0.1502, simple_loss=0.2208, pruned_loss=0.0398, over 973788.80 frames.], batch size: 32, lr: 3.40e-04 2022-05-05 11:37:12,774 INFO [train.py:715] (0/8) Epoch 6, batch 6600, loss[loss=0.1397, simple_loss=0.2096, pruned_loss=0.03488, over 4802.00 frames.], 
tot_loss[loss=0.1511, simple_loss=0.2214, pruned_loss=0.0404, over 973723.12 frames.], batch size: 12, lr: 3.40e-04 2022-05-05 11:37:52,971 INFO [train.py:715] (0/8) Epoch 6, batch 6650, loss[loss=0.1611, simple_loss=0.2334, pruned_loss=0.04436, over 4968.00 frames.], tot_loss[loss=0.1509, simple_loss=0.2211, pruned_loss=0.04032, over 973375.00 frames.], batch size: 15, lr: 3.40e-04 2022-05-05 11:38:31,782 INFO [train.py:715] (0/8) Epoch 6, batch 6700, loss[loss=0.1245, simple_loss=0.2034, pruned_loss=0.02277, over 4903.00 frames.], tot_loss[loss=0.1515, simple_loss=0.2222, pruned_loss=0.04041, over 972914.82 frames.], batch size: 23, lr: 3.40e-04 2022-05-05 11:39:10,517 INFO [train.py:715] (0/8) Epoch 6, batch 6750, loss[loss=0.1326, simple_loss=0.2002, pruned_loss=0.03252, over 4904.00 frames.], tot_loss[loss=0.1509, simple_loss=0.2219, pruned_loss=0.03993, over 972638.59 frames.], batch size: 17, lr: 3.40e-04 2022-05-05 11:39:49,803 INFO [train.py:715] (0/8) Epoch 6, batch 6800, loss[loss=0.1337, simple_loss=0.2137, pruned_loss=0.02684, over 4871.00 frames.], tot_loss[loss=0.1506, simple_loss=0.2215, pruned_loss=0.03986, over 972630.47 frames.], batch size: 20, lr: 3.40e-04 2022-05-05 11:40:28,788 INFO [train.py:715] (0/8) Epoch 6, batch 6850, loss[loss=0.1692, simple_loss=0.2405, pruned_loss=0.04893, over 4940.00 frames.], tot_loss[loss=0.1499, simple_loss=0.221, pruned_loss=0.03933, over 972925.94 frames.], batch size: 39, lr: 3.40e-04 2022-05-05 11:41:06,838 INFO [train.py:715] (0/8) Epoch 6, batch 6900, loss[loss=0.1549, simple_loss=0.2338, pruned_loss=0.03799, over 4976.00 frames.], tot_loss[loss=0.1486, simple_loss=0.22, pruned_loss=0.03865, over 973271.37 frames.], batch size: 35, lr: 3.40e-04 2022-05-05 11:41:45,910 INFO [train.py:715] (0/8) Epoch 6, batch 6950, loss[loss=0.1449, simple_loss=0.2207, pruned_loss=0.0346, over 4655.00 frames.], tot_loss[loss=0.1489, simple_loss=0.2202, pruned_loss=0.03874, over 972884.53 frames.], batch size: 13, lr: 3.40e-04 2022-05-05 11:42:25,619 INFO [train.py:715] (0/8) Epoch 6, batch 7000, loss[loss=0.1329, simple_loss=0.2086, pruned_loss=0.02863, over 4863.00 frames.], tot_loss[loss=0.1502, simple_loss=0.2216, pruned_loss=0.0394, over 972889.76 frames.], batch size: 20, lr: 3.40e-04 2022-05-05 11:43:04,215 INFO [train.py:715] (0/8) Epoch 6, batch 7050, loss[loss=0.135, simple_loss=0.2052, pruned_loss=0.03238, over 4690.00 frames.], tot_loss[loss=0.1507, simple_loss=0.2218, pruned_loss=0.03982, over 972982.79 frames.], batch size: 15, lr: 3.40e-04 2022-05-05 11:43:42,730 INFO [train.py:715] (0/8) Epoch 6, batch 7100, loss[loss=0.1429, simple_loss=0.2022, pruned_loss=0.0418, over 4993.00 frames.], tot_loss[loss=0.1499, simple_loss=0.2212, pruned_loss=0.03931, over 972904.86 frames.], batch size: 14, lr: 3.40e-04 2022-05-05 11:44:16,967 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-216000.pt 2022-05-05 11:44:25,531 INFO [train.py:715] (0/8) Epoch 6, batch 7150, loss[loss=0.1269, simple_loss=0.1924, pruned_loss=0.03071, over 4984.00 frames.], tot_loss[loss=0.1495, simple_loss=0.2212, pruned_loss=0.03891, over 972903.42 frames.], batch size: 28, lr: 3.40e-04 2022-05-05 11:45:04,231 INFO [train.py:715] (0/8) Epoch 6, batch 7200, loss[loss=0.1447, simple_loss=0.222, pruned_loss=0.03371, over 4839.00 frames.], tot_loss[loss=0.1498, simple_loss=0.2215, pruned_loss=0.03907, over 972758.52 frames.], batch size: 26, lr: 3.40e-04 2022-05-05 11:45:42,692 INFO [train.py:715] (0/8) Epoch 6, batch 7250, 
loss[loss=0.1628, simple_loss=0.2382, pruned_loss=0.0437, over 4851.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2211, pruned_loss=0.03912, over 973165.07 frames.], batch size: 30, lr: 3.40e-04 2022-05-05 11:46:21,446 INFO [train.py:715] (0/8) Epoch 6, batch 7300, loss[loss=0.1396, simple_loss=0.2164, pruned_loss=0.03141, over 4869.00 frames.], tot_loss[loss=0.1499, simple_loss=0.2212, pruned_loss=0.03931, over 973574.33 frames.], batch size: 16, lr: 3.40e-04 2022-05-05 11:47:01,049 INFO [train.py:715] (0/8) Epoch 6, batch 7350, loss[loss=0.206, simple_loss=0.2484, pruned_loss=0.08181, over 4759.00 frames.], tot_loss[loss=0.1499, simple_loss=0.221, pruned_loss=0.03936, over 972693.39 frames.], batch size: 12, lr: 3.40e-04 2022-05-05 11:47:38,856 INFO [train.py:715] (0/8) Epoch 6, batch 7400, loss[loss=0.1332, simple_loss=0.2092, pruned_loss=0.02862, over 4801.00 frames.], tot_loss[loss=0.1494, simple_loss=0.2201, pruned_loss=0.0393, over 972637.64 frames.], batch size: 21, lr: 3.40e-04 2022-05-05 11:48:18,375 INFO [train.py:715] (0/8) Epoch 6, batch 7450, loss[loss=0.1494, simple_loss=0.2172, pruned_loss=0.04085, over 4979.00 frames.], tot_loss[loss=0.1502, simple_loss=0.2206, pruned_loss=0.03992, over 972279.64 frames.], batch size: 33, lr: 3.40e-04 2022-05-05 11:48:56,989 INFO [train.py:715] (0/8) Epoch 6, batch 7500, loss[loss=0.1708, simple_loss=0.2347, pruned_loss=0.05346, over 4866.00 frames.], tot_loss[loss=0.1505, simple_loss=0.2208, pruned_loss=0.04013, over 971920.36 frames.], batch size: 30, lr: 3.40e-04 2022-05-05 11:49:35,690 INFO [train.py:715] (0/8) Epoch 6, batch 7550, loss[loss=0.1417, simple_loss=0.2117, pruned_loss=0.03588, over 4899.00 frames.], tot_loss[loss=0.1505, simple_loss=0.2213, pruned_loss=0.03982, over 972014.93 frames.], batch size: 19, lr: 3.40e-04 2022-05-05 11:50:14,631 INFO [train.py:715] (0/8) Epoch 6, batch 7600, loss[loss=0.1383, simple_loss=0.2106, pruned_loss=0.03294, over 4733.00 frames.], tot_loss[loss=0.1503, simple_loss=0.221, pruned_loss=0.03978, over 972029.44 frames.], batch size: 16, lr: 3.40e-04 2022-05-05 11:50:53,759 INFO [train.py:715] (0/8) Epoch 6, batch 7650, loss[loss=0.1512, simple_loss=0.2195, pruned_loss=0.04141, over 4894.00 frames.], tot_loss[loss=0.1494, simple_loss=0.2205, pruned_loss=0.03918, over 972316.69 frames.], batch size: 19, lr: 3.40e-04 2022-05-05 11:51:33,380 INFO [train.py:715] (0/8) Epoch 6, batch 7700, loss[loss=0.1309, simple_loss=0.2059, pruned_loss=0.02797, over 4885.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2199, pruned_loss=0.03871, over 972168.25 frames.], batch size: 22, lr: 3.39e-04 2022-05-05 11:52:11,587 INFO [train.py:715] (0/8) Epoch 6, batch 7750, loss[loss=0.1563, simple_loss=0.2137, pruned_loss=0.04948, over 4825.00 frames.], tot_loss[loss=0.1492, simple_loss=0.2203, pruned_loss=0.03899, over 971608.59 frames.], batch size: 26, lr: 3.39e-04 2022-05-05 11:52:51,082 INFO [train.py:715] (0/8) Epoch 6, batch 7800, loss[loss=0.1623, simple_loss=0.237, pruned_loss=0.04379, over 4980.00 frames.], tot_loss[loss=0.1493, simple_loss=0.2204, pruned_loss=0.03911, over 970830.78 frames.], batch size: 15, lr: 3.39e-04 2022-05-05 11:53:30,016 INFO [train.py:715] (0/8) Epoch 6, batch 7850, loss[loss=0.1426, simple_loss=0.2184, pruned_loss=0.03347, over 4756.00 frames.], tot_loss[loss=0.1498, simple_loss=0.2212, pruned_loss=0.0392, over 971981.69 frames.], batch size: 19, lr: 3.39e-04 2022-05-05 11:54:08,582 INFO [train.py:715] (0/8) Epoch 6, batch 7900, loss[loss=0.1639, simple_loss=0.2384, 
pruned_loss=0.04466, over 4885.00 frames.], tot_loss[loss=0.1501, simple_loss=0.2214, pruned_loss=0.03939, over 971802.43 frames.], batch size: 19, lr: 3.39e-04 2022-05-05 11:54:47,340 INFO [train.py:715] (0/8) Epoch 6, batch 7950, loss[loss=0.1645, simple_loss=0.2259, pruned_loss=0.05149, over 4974.00 frames.], tot_loss[loss=0.1506, simple_loss=0.2217, pruned_loss=0.0398, over 972669.75 frames.], batch size: 24, lr: 3.39e-04 2022-05-05 11:55:26,522 INFO [train.py:715] (0/8) Epoch 6, batch 8000, loss[loss=0.1499, simple_loss=0.2238, pruned_loss=0.03798, over 4912.00 frames.], tot_loss[loss=0.1504, simple_loss=0.2211, pruned_loss=0.03989, over 972835.27 frames.], batch size: 23, lr: 3.39e-04 2022-05-05 11:56:05,893 INFO [train.py:715] (0/8) Epoch 6, batch 8050, loss[loss=0.1643, simple_loss=0.2344, pruned_loss=0.04707, over 4890.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2207, pruned_loss=0.03934, over 973296.51 frames.], batch size: 22, lr: 3.39e-04 2022-05-05 11:56:43,891 INFO [train.py:715] (0/8) Epoch 6, batch 8100, loss[loss=0.1709, simple_loss=0.2447, pruned_loss=0.04849, over 4795.00 frames.], tot_loss[loss=0.1489, simple_loss=0.2201, pruned_loss=0.03889, over 973424.49 frames.], batch size: 21, lr: 3.39e-04 2022-05-05 11:57:22,885 INFO [train.py:715] (0/8) Epoch 6, batch 8150, loss[loss=0.1576, simple_loss=0.2173, pruned_loss=0.04894, over 4879.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2208, pruned_loss=0.0392, over 973346.52 frames.], batch size: 32, lr: 3.39e-04 2022-05-05 11:58:01,955 INFO [train.py:715] (0/8) Epoch 6, batch 8200, loss[loss=0.1471, simple_loss=0.2249, pruned_loss=0.03466, over 4850.00 frames.], tot_loss[loss=0.1499, simple_loss=0.2209, pruned_loss=0.03948, over 973078.26 frames.], batch size: 20, lr: 3.39e-04 2022-05-05 11:58:41,275 INFO [train.py:715] (0/8) Epoch 6, batch 8250, loss[loss=0.1307, simple_loss=0.1998, pruned_loss=0.03086, over 4740.00 frames.], tot_loss[loss=0.1502, simple_loss=0.2211, pruned_loss=0.03965, over 973142.05 frames.], batch size: 16, lr: 3.39e-04 2022-05-05 11:59:19,576 INFO [train.py:715] (0/8) Epoch 6, batch 8300, loss[loss=0.1619, simple_loss=0.2288, pruned_loss=0.04751, over 4863.00 frames.], tot_loss[loss=0.1501, simple_loss=0.2209, pruned_loss=0.03961, over 972793.05 frames.], batch size: 13, lr: 3.39e-04 2022-05-05 11:59:58,757 INFO [train.py:715] (0/8) Epoch 6, batch 8350, loss[loss=0.1389, simple_loss=0.2018, pruned_loss=0.03801, over 4862.00 frames.], tot_loss[loss=0.1499, simple_loss=0.2206, pruned_loss=0.03958, over 972867.35 frames.], batch size: 32, lr: 3.39e-04 2022-05-05 12:00:37,617 INFO [train.py:715] (0/8) Epoch 6, batch 8400, loss[loss=0.1611, simple_loss=0.2226, pruned_loss=0.04983, over 4764.00 frames.], tot_loss[loss=0.1501, simple_loss=0.2208, pruned_loss=0.03973, over 973203.90 frames.], batch size: 17, lr: 3.39e-04 2022-05-05 12:01:15,839 INFO [train.py:715] (0/8) Epoch 6, batch 8450, loss[loss=0.1576, simple_loss=0.2231, pruned_loss=0.04607, over 4753.00 frames.], tot_loss[loss=0.1492, simple_loss=0.22, pruned_loss=0.0392, over 972297.65 frames.], batch size: 19, lr: 3.39e-04 2022-05-05 12:01:54,982 INFO [train.py:715] (0/8) Epoch 6, batch 8500, loss[loss=0.1481, simple_loss=0.221, pruned_loss=0.03759, over 4942.00 frames.], tot_loss[loss=0.1488, simple_loss=0.2195, pruned_loss=0.03898, over 972468.20 frames.], batch size: 21, lr: 3.39e-04 2022-05-05 12:02:33,544 INFO [train.py:715] (0/8) Epoch 6, batch 8550, loss[loss=0.1904, simple_loss=0.2711, pruned_loss=0.05489, over 4804.00 frames.], 
tot_loss[loss=0.149, simple_loss=0.2198, pruned_loss=0.0391, over 971970.60 frames.], batch size: 26, lr: 3.39e-04 2022-05-05 12:03:12,437 INFO [train.py:715] (0/8) Epoch 6, batch 8600, loss[loss=0.134, simple_loss=0.2081, pruned_loss=0.02998, over 4884.00 frames.], tot_loss[loss=0.1501, simple_loss=0.2205, pruned_loss=0.03984, over 972901.80 frames.], batch size: 22, lr: 3.39e-04 2022-05-05 12:03:50,306 INFO [train.py:715] (0/8) Epoch 6, batch 8650, loss[loss=0.1851, simple_loss=0.241, pruned_loss=0.06462, over 4885.00 frames.], tot_loss[loss=0.1498, simple_loss=0.2204, pruned_loss=0.03958, over 972716.17 frames.], batch size: 39, lr: 3.39e-04 2022-05-05 12:04:29,732 INFO [train.py:715] (0/8) Epoch 6, batch 8700, loss[loss=0.1361, simple_loss=0.213, pruned_loss=0.02959, over 4987.00 frames.], tot_loss[loss=0.1499, simple_loss=0.2207, pruned_loss=0.03958, over 973377.96 frames.], batch size: 28, lr: 3.39e-04 2022-05-05 12:05:08,430 INFO [train.py:715] (0/8) Epoch 6, batch 8750, loss[loss=0.1727, simple_loss=0.2327, pruned_loss=0.05636, over 4917.00 frames.], tot_loss[loss=0.149, simple_loss=0.2196, pruned_loss=0.03923, over 973026.89 frames.], batch size: 18, lr: 3.39e-04 2022-05-05 12:05:46,859 INFO [train.py:715] (0/8) Epoch 6, batch 8800, loss[loss=0.1324, simple_loss=0.2113, pruned_loss=0.02673, over 4869.00 frames.], tot_loss[loss=0.1494, simple_loss=0.2201, pruned_loss=0.03932, over 973419.48 frames.], batch size: 20, lr: 3.39e-04 2022-05-05 12:06:25,681 INFO [train.py:715] (0/8) Epoch 6, batch 8850, loss[loss=0.1686, simple_loss=0.2358, pruned_loss=0.05064, over 4787.00 frames.], tot_loss[loss=0.1502, simple_loss=0.2207, pruned_loss=0.03992, over 972988.06 frames.], batch size: 17, lr: 3.39e-04 2022-05-05 12:07:04,775 INFO [train.py:715] (0/8) Epoch 6, batch 8900, loss[loss=0.1323, simple_loss=0.2113, pruned_loss=0.02668, over 4956.00 frames.], tot_loss[loss=0.1506, simple_loss=0.2209, pruned_loss=0.04015, over 973140.49 frames.], batch size: 21, lr: 3.39e-04 2022-05-05 12:07:43,993 INFO [train.py:715] (0/8) Epoch 6, batch 8950, loss[loss=0.1358, simple_loss=0.21, pruned_loss=0.03084, over 4923.00 frames.], tot_loss[loss=0.1516, simple_loss=0.2222, pruned_loss=0.04048, over 972633.95 frames.], batch size: 18, lr: 3.38e-04 2022-05-05 12:08:22,488 INFO [train.py:715] (0/8) Epoch 6, batch 9000, loss[loss=0.1609, simple_loss=0.2247, pruned_loss=0.04858, over 4907.00 frames.], tot_loss[loss=0.1505, simple_loss=0.2211, pruned_loss=0.03996, over 972662.66 frames.], batch size: 17, lr: 3.38e-04 2022-05-05 12:08:22,489 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 12:08:35,890 INFO [train.py:742] (0/8) Epoch 6, validation: loss=0.1094, simple_loss=0.1946, pruned_loss=0.01213, over 914524.00 frames. 
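
Note on the logged fields: the per-batch and validation entries above are numerically consistent with the reported loss being a weighted sum of the two transducer losses, loss ≈ 0.5 * simple_loss + pruned_loss. The 0.5 weight is inferred from the numbers in this section, not stated in the log itself; a minimal sketch that checks a few values copied from the entries above:

# Check that logged "loss" fields are consistent with
# loss ≈ 0.5 * simple_loss + pruned_loss (the 0.5 weight is inferred, not logged).
entries = [
    # (loss, simple_loss, pruned_loss) copied from entries above
    (0.1609, 0.2247, 0.04858),  # Epoch 6, batch 9000 (per-batch loss)
    (0.1504, 0.2211, 0.03989),  # Epoch 6, batch 8000 (tot_loss)
    (0.1094, 0.1946, 0.01213),  # Epoch 6, validation at batch 9000
]

for loss, simple, pruned in entries:
    reconstructed = 0.5 * simple + pruned
    assert abs(reconstructed - loss) < 5e-4, (loss, reconstructed)
    print(f"logged={loss:.4f}  0.5*simple+pruned={reconstructed:.4f}")
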
2022-05-05 12:09:14,896 INFO [train.py:715] (0/8) Epoch 6, batch 9050, loss[loss=0.155, simple_loss=0.2177, pruned_loss=0.04621, over 4777.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2206, pruned_loss=0.03936, over 972316.16 frames.], batch size: 12, lr: 3.38e-04 2022-05-05 12:09:53,931 INFO [train.py:715] (0/8) Epoch 6, batch 9100, loss[loss=0.1339, simple_loss=0.2106, pruned_loss=0.02859, over 4775.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2206, pruned_loss=0.03931, over 971515.11 frames.], batch size: 19, lr: 3.38e-04 2022-05-05 12:10:33,368 INFO [train.py:715] (0/8) Epoch 6, batch 9150, loss[loss=0.1343, simple_loss=0.2028, pruned_loss=0.03288, over 4879.00 frames.], tot_loss[loss=0.1493, simple_loss=0.2204, pruned_loss=0.03913, over 971148.30 frames.], batch size: 16, lr: 3.38e-04 2022-05-05 12:11:11,392 INFO [train.py:715] (0/8) Epoch 6, batch 9200, loss[loss=0.16, simple_loss=0.2275, pruned_loss=0.04624, over 4917.00 frames.], tot_loss[loss=0.1492, simple_loss=0.2204, pruned_loss=0.03902, over 971302.30 frames.], batch size: 23, lr: 3.38e-04 2022-05-05 12:11:50,806 INFO [train.py:715] (0/8) Epoch 6, batch 9250, loss[loss=0.1157, simple_loss=0.1947, pruned_loss=0.01834, over 4799.00 frames.], tot_loss[loss=0.1494, simple_loss=0.2207, pruned_loss=0.03908, over 971427.03 frames.], batch size: 24, lr: 3.38e-04 2022-05-05 12:12:29,882 INFO [train.py:715] (0/8) Epoch 6, batch 9300, loss[loss=0.1435, simple_loss=0.2092, pruned_loss=0.03887, over 4990.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2208, pruned_loss=0.03923, over 971763.24 frames.], batch size: 28, lr: 3.38e-04 2022-05-05 12:13:08,397 INFO [train.py:715] (0/8) Epoch 6, batch 9350, loss[loss=0.1373, simple_loss=0.2131, pruned_loss=0.03072, over 4978.00 frames.], tot_loss[loss=0.1495, simple_loss=0.2208, pruned_loss=0.03908, over 971171.42 frames.], batch size: 25, lr: 3.38e-04 2022-05-05 12:13:47,630 INFO [train.py:715] (0/8) Epoch 6, batch 9400, loss[loss=0.1319, simple_loss=0.2101, pruned_loss=0.02686, over 4864.00 frames.], tot_loss[loss=0.1498, simple_loss=0.2208, pruned_loss=0.03935, over 971786.60 frames.], batch size: 20, lr: 3.38e-04 2022-05-05 12:14:26,439 INFO [train.py:715] (0/8) Epoch 6, batch 9450, loss[loss=0.1354, simple_loss=0.2134, pruned_loss=0.02872, over 4784.00 frames.], tot_loss[loss=0.15, simple_loss=0.221, pruned_loss=0.03949, over 971958.19 frames.], batch size: 18, lr: 3.38e-04 2022-05-05 12:15:05,762 INFO [train.py:715] (0/8) Epoch 6, batch 9500, loss[loss=0.1387, simple_loss=0.2067, pruned_loss=0.03532, over 4932.00 frames.], tot_loss[loss=0.1498, simple_loss=0.221, pruned_loss=0.03933, over 972272.14 frames.], batch size: 21, lr: 3.38e-04 2022-05-05 12:15:44,432 INFO [train.py:715] (0/8) Epoch 6, batch 9550, loss[loss=0.1641, simple_loss=0.2484, pruned_loss=0.03991, over 4938.00 frames.], tot_loss[loss=0.1502, simple_loss=0.221, pruned_loss=0.03964, over 972511.79 frames.], batch size: 21, lr: 3.38e-04 2022-05-05 12:16:23,400 INFO [train.py:715] (0/8) Epoch 6, batch 9600, loss[loss=0.1838, simple_loss=0.2454, pruned_loss=0.06109, over 4967.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2206, pruned_loss=0.03925, over 972825.69 frames.], batch size: 39, lr: 3.38e-04 2022-05-05 12:17:02,126 INFO [train.py:715] (0/8) Epoch 6, batch 9650, loss[loss=0.1542, simple_loss=0.2297, pruned_loss=0.0393, over 4985.00 frames.], tot_loss[loss=0.1493, simple_loss=0.2202, pruned_loss=0.03919, over 972842.59 frames.], batch size: 35, lr: 3.38e-04 2022-05-05 12:17:40,453 INFO [train.py:715] (0/8) 
Epoch 6, batch 9700, loss[loss=0.1457, simple_loss=0.2219, pruned_loss=0.03475, over 4818.00 frames.], tot_loss[loss=0.1488, simple_loss=0.2197, pruned_loss=0.03891, over 973336.06 frames.], batch size: 25, lr: 3.38e-04 2022-05-05 12:18:19,754 INFO [train.py:715] (0/8) Epoch 6, batch 9750, loss[loss=0.1391, simple_loss=0.2215, pruned_loss=0.02836, over 4822.00 frames.], tot_loss[loss=0.1479, simple_loss=0.2192, pruned_loss=0.03827, over 972402.76 frames.], batch size: 25, lr: 3.38e-04 2022-05-05 12:18:59,477 INFO [train.py:715] (0/8) Epoch 6, batch 9800, loss[loss=0.1572, simple_loss=0.2297, pruned_loss=0.04235, over 4796.00 frames.], tot_loss[loss=0.1478, simple_loss=0.2192, pruned_loss=0.03822, over 972269.44 frames.], batch size: 18, lr: 3.38e-04 2022-05-05 12:19:39,847 INFO [train.py:715] (0/8) Epoch 6, batch 9850, loss[loss=0.1625, simple_loss=0.2403, pruned_loss=0.04237, over 4913.00 frames.], tot_loss[loss=0.1477, simple_loss=0.2192, pruned_loss=0.03814, over 972247.54 frames.], batch size: 19, lr: 3.38e-04 2022-05-05 12:20:18,996 INFO [train.py:715] (0/8) Epoch 6, batch 9900, loss[loss=0.1488, simple_loss=0.2185, pruned_loss=0.0395, over 4770.00 frames.], tot_loss[loss=0.1489, simple_loss=0.2199, pruned_loss=0.03888, over 972226.96 frames.], batch size: 18, lr: 3.38e-04 2022-05-05 12:20:59,135 INFO [train.py:715] (0/8) Epoch 6, batch 9950, loss[loss=0.1618, simple_loss=0.2304, pruned_loss=0.04662, over 4813.00 frames.], tot_loss[loss=0.1499, simple_loss=0.2209, pruned_loss=0.03944, over 971526.98 frames.], batch size: 27, lr: 3.38e-04 2022-05-05 12:21:39,156 INFO [train.py:715] (0/8) Epoch 6, batch 10000, loss[loss=0.1404, simple_loss=0.213, pruned_loss=0.03389, over 4777.00 frames.], tot_loss[loss=0.1492, simple_loss=0.22, pruned_loss=0.0392, over 971368.56 frames.], batch size: 17, lr: 3.38e-04 2022-05-05 12:22:17,400 INFO [train.py:715] (0/8) Epoch 6, batch 10050, loss[loss=0.1296, simple_loss=0.1964, pruned_loss=0.03144, over 4892.00 frames.], tot_loss[loss=0.1502, simple_loss=0.2209, pruned_loss=0.03976, over 971742.64 frames.], batch size: 16, lr: 3.38e-04 2022-05-05 12:22:56,771 INFO [train.py:715] (0/8) Epoch 6, batch 10100, loss[loss=0.1491, simple_loss=0.2289, pruned_loss=0.0346, over 4946.00 frames.], tot_loss[loss=0.1506, simple_loss=0.2213, pruned_loss=0.04, over 970973.14 frames.], batch size: 21, lr: 3.38e-04 2022-05-05 12:23:34,989 INFO [train.py:715] (0/8) Epoch 6, batch 10150, loss[loss=0.1388, simple_loss=0.2216, pruned_loss=0.02797, over 4923.00 frames.], tot_loss[loss=0.1493, simple_loss=0.2203, pruned_loss=0.03913, over 971364.84 frames.], batch size: 18, lr: 3.38e-04 2022-05-05 12:24:14,024 INFO [train.py:715] (0/8) Epoch 6, batch 10200, loss[loss=0.1872, simple_loss=0.2524, pruned_loss=0.06099, over 4844.00 frames.], tot_loss[loss=0.1486, simple_loss=0.2198, pruned_loss=0.03866, over 972897.15 frames.], batch size: 32, lr: 3.38e-04 2022-05-05 12:24:52,550 INFO [train.py:715] (0/8) Epoch 6, batch 10250, loss[loss=0.1393, simple_loss=0.2102, pruned_loss=0.03424, over 4833.00 frames.], tot_loss[loss=0.1485, simple_loss=0.2197, pruned_loss=0.03858, over 973481.52 frames.], batch size: 13, lr: 3.37e-04 2022-05-05 12:25:31,640 INFO [train.py:715] (0/8) Epoch 6, batch 10300, loss[loss=0.1568, simple_loss=0.228, pruned_loss=0.04283, over 4788.00 frames.], tot_loss[loss=0.1491, simple_loss=0.2204, pruned_loss=0.03889, over 972334.43 frames.], batch size: 18, lr: 3.37e-04 2022-05-05 12:26:10,139 INFO [train.py:715] (0/8) Epoch 6, batch 10350, loss[loss=0.1766, 
simple_loss=0.2501, pruned_loss=0.05159, over 4776.00 frames.], tot_loss[loss=0.149, simple_loss=0.2207, pruned_loss=0.03867, over 972088.18 frames.], batch size: 12, lr: 3.37e-04 2022-05-05 12:26:49,275 INFO [train.py:715] (0/8) Epoch 6, batch 10400, loss[loss=0.1506, simple_loss=0.2209, pruned_loss=0.04019, over 4944.00 frames.], tot_loss[loss=0.1494, simple_loss=0.2211, pruned_loss=0.03881, over 972947.21 frames.], batch size: 23, lr: 3.37e-04 2022-05-05 12:27:27,706 INFO [train.py:715] (0/8) Epoch 6, batch 10450, loss[loss=0.1419, simple_loss=0.21, pruned_loss=0.03689, over 4861.00 frames.], tot_loss[loss=0.1492, simple_loss=0.2211, pruned_loss=0.03869, over 973000.11 frames.], batch size: 32, lr: 3.37e-04 2022-05-05 12:28:06,359 INFO [train.py:715] (0/8) Epoch 6, batch 10500, loss[loss=0.1534, simple_loss=0.2242, pruned_loss=0.04131, over 4955.00 frames.], tot_loss[loss=0.1482, simple_loss=0.2204, pruned_loss=0.03805, over 972734.35 frames.], batch size: 35, lr: 3.37e-04 2022-05-05 12:28:45,426 INFO [train.py:715] (0/8) Epoch 6, batch 10550, loss[loss=0.1558, simple_loss=0.2214, pruned_loss=0.04515, over 4927.00 frames.], tot_loss[loss=0.1491, simple_loss=0.2209, pruned_loss=0.03866, over 972967.49 frames.], batch size: 35, lr: 3.37e-04 2022-05-05 12:29:23,697 INFO [train.py:715] (0/8) Epoch 6, batch 10600, loss[loss=0.1573, simple_loss=0.2347, pruned_loss=0.03993, over 4699.00 frames.], tot_loss[loss=0.149, simple_loss=0.2208, pruned_loss=0.03866, over 973553.26 frames.], batch size: 15, lr: 3.37e-04 2022-05-05 12:30:02,907 INFO [train.py:715] (0/8) Epoch 6, batch 10650, loss[loss=0.1639, simple_loss=0.2278, pruned_loss=0.05001, over 4836.00 frames.], tot_loss[loss=0.1488, simple_loss=0.2207, pruned_loss=0.03845, over 973531.42 frames.], batch size: 32, lr: 3.37e-04 2022-05-05 12:30:41,617 INFO [train.py:715] (0/8) Epoch 6, batch 10700, loss[loss=0.1585, simple_loss=0.2204, pruned_loss=0.04837, over 4834.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2215, pruned_loss=0.03894, over 973138.00 frames.], batch size: 15, lr: 3.37e-04 2022-05-05 12:31:20,567 INFO [train.py:715] (0/8) Epoch 6, batch 10750, loss[loss=0.1724, simple_loss=0.2568, pruned_loss=0.04402, over 4873.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2216, pruned_loss=0.03892, over 973216.87 frames.], batch size: 16, lr: 3.37e-04 2022-05-05 12:31:59,028 INFO [train.py:715] (0/8) Epoch 6, batch 10800, loss[loss=0.1531, simple_loss=0.2147, pruned_loss=0.04575, over 4988.00 frames.], tot_loss[loss=0.149, simple_loss=0.2205, pruned_loss=0.03874, over 973192.55 frames.], batch size: 24, lr: 3.37e-04 2022-05-05 12:32:37,565 INFO [train.py:715] (0/8) Epoch 6, batch 10850, loss[loss=0.1813, simple_loss=0.2419, pruned_loss=0.06037, over 4930.00 frames.], tot_loss[loss=0.149, simple_loss=0.2206, pruned_loss=0.03873, over 974225.32 frames.], batch size: 23, lr: 3.37e-04 2022-05-05 12:33:15,991 INFO [train.py:715] (0/8) Epoch 6, batch 10900, loss[loss=0.1545, simple_loss=0.229, pruned_loss=0.04003, over 4758.00 frames.], tot_loss[loss=0.1492, simple_loss=0.2209, pruned_loss=0.03877, over 974604.75 frames.], batch size: 18, lr: 3.37e-04 2022-05-05 12:33:54,113 INFO [train.py:715] (0/8) Epoch 6, batch 10950, loss[loss=0.1699, simple_loss=0.2377, pruned_loss=0.051, over 4935.00 frames.], tot_loss[loss=0.1502, simple_loss=0.2218, pruned_loss=0.03924, over 974961.74 frames.], batch size: 23, lr: 3.37e-04 2022-05-05 12:34:33,262 INFO [train.py:715] (0/8) Epoch 6, batch 11000, loss[loss=0.1182, simple_loss=0.1935, pruned_loss=0.02141, 
over 4932.00 frames.], tot_loss[loss=0.1493, simple_loss=0.2208, pruned_loss=0.03887, over 975052.75 frames.], batch size: 23, lr: 3.37e-04 2022-05-05 12:35:11,624 INFO [train.py:715] (0/8) Epoch 6, batch 11050, loss[loss=0.1435, simple_loss=0.2272, pruned_loss=0.02993, over 4907.00 frames.], tot_loss[loss=0.1495, simple_loss=0.221, pruned_loss=0.03898, over 974528.38 frames.], batch size: 17, lr: 3.37e-04 2022-05-05 12:35:50,636 INFO [train.py:715] (0/8) Epoch 6, batch 11100, loss[loss=0.1585, simple_loss=0.2425, pruned_loss=0.03719, over 4712.00 frames.], tot_loss[loss=0.1503, simple_loss=0.2217, pruned_loss=0.03939, over 973182.86 frames.], batch size: 15, lr: 3.37e-04 2022-05-05 12:36:29,025 INFO [train.py:715] (0/8) Epoch 6, batch 11150, loss[loss=0.1386, simple_loss=0.2057, pruned_loss=0.03581, over 4963.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2213, pruned_loss=0.03893, over 973196.97 frames.], batch size: 29, lr: 3.37e-04 2022-05-05 12:37:07,404 INFO [train.py:715] (0/8) Epoch 6, batch 11200, loss[loss=0.1546, simple_loss=0.2267, pruned_loss=0.0412, over 4941.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2203, pruned_loss=0.03858, over 973541.72 frames.], batch size: 23, lr: 3.37e-04 2022-05-05 12:37:45,840 INFO [train.py:715] (0/8) Epoch 6, batch 11250, loss[loss=0.1642, simple_loss=0.2221, pruned_loss=0.05313, over 4798.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2202, pruned_loss=0.03865, over 972816.43 frames.], batch size: 14, lr: 3.37e-04 2022-05-05 12:38:24,401 INFO [train.py:715] (0/8) Epoch 6, batch 11300, loss[loss=0.1543, simple_loss=0.23, pruned_loss=0.03931, over 4963.00 frames.], tot_loss[loss=0.1482, simple_loss=0.22, pruned_loss=0.03825, over 972480.38 frames.], batch size: 24, lr: 3.37e-04 2022-05-05 12:39:03,681 INFO [train.py:715] (0/8) Epoch 6, batch 11350, loss[loss=0.2047, simple_loss=0.271, pruned_loss=0.06924, over 4979.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2212, pruned_loss=0.03899, over 971851.00 frames.], batch size: 40, lr: 3.37e-04 2022-05-05 12:39:42,618 INFO [train.py:715] (0/8) Epoch 6, batch 11400, loss[loss=0.1748, simple_loss=0.2375, pruned_loss=0.0561, over 4921.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2209, pruned_loss=0.03911, over 972289.30 frames.], batch size: 39, lr: 3.37e-04 2022-05-05 12:40:21,677 INFO [train.py:715] (0/8) Epoch 6, batch 11450, loss[loss=0.1427, simple_loss=0.2225, pruned_loss=0.0315, over 4817.00 frames.], tot_loss[loss=0.1496, simple_loss=0.221, pruned_loss=0.03909, over 972030.61 frames.], batch size: 25, lr: 3.37e-04 2022-05-05 12:40:59,947 INFO [train.py:715] (0/8) Epoch 6, batch 11500, loss[loss=0.1189, simple_loss=0.1966, pruned_loss=0.0206, over 4816.00 frames.], tot_loss[loss=0.1488, simple_loss=0.2202, pruned_loss=0.03872, over 971797.23 frames.], batch size: 21, lr: 3.37e-04 2022-05-05 12:41:38,297 INFO [train.py:715] (0/8) Epoch 6, batch 11550, loss[loss=0.158, simple_loss=0.2349, pruned_loss=0.04056, over 4790.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2189, pruned_loss=0.03809, over 971755.55 frames.], batch size: 18, lr: 3.36e-04 2022-05-05 12:42:17,674 INFO [train.py:715] (0/8) Epoch 6, batch 11600, loss[loss=0.158, simple_loss=0.2415, pruned_loss=0.03721, over 4850.00 frames.], tot_loss[loss=0.1473, simple_loss=0.2188, pruned_loss=0.03788, over 971824.49 frames.], batch size: 20, lr: 3.36e-04 2022-05-05 12:42:56,128 INFO [train.py:715] (0/8) Epoch 6, batch 11650, loss[loss=0.133, simple_loss=0.2064, pruned_loss=0.02979, over 4686.00 frames.], tot_loss[loss=0.1469, 
simple_loss=0.2182, pruned_loss=0.03783, over 970684.64 frames.], batch size: 15, lr: 3.36e-04 2022-05-05 12:43:34,993 INFO [train.py:715] (0/8) Epoch 6, batch 11700, loss[loss=0.1399, simple_loss=0.2251, pruned_loss=0.02732, over 4837.00 frames.], tot_loss[loss=0.1472, simple_loss=0.2185, pruned_loss=0.03795, over 971662.07 frames.], batch size: 15, lr: 3.36e-04 2022-05-05 12:44:13,932 INFO [train.py:715] (0/8) Epoch 6, batch 11750, loss[loss=0.1366, simple_loss=0.2095, pruned_loss=0.0319, over 4771.00 frames.], tot_loss[loss=0.1476, simple_loss=0.2188, pruned_loss=0.03816, over 971736.23 frames.], batch size: 17, lr: 3.36e-04 2022-05-05 12:44:53,164 INFO [train.py:715] (0/8) Epoch 6, batch 11800, loss[loss=0.1341, simple_loss=0.204, pruned_loss=0.03206, over 4753.00 frames.], tot_loss[loss=0.1474, simple_loss=0.2187, pruned_loss=0.0381, over 971771.91 frames.], batch size: 16, lr: 3.36e-04 2022-05-05 12:45:31,849 INFO [train.py:715] (0/8) Epoch 6, batch 11850, loss[loss=0.1124, simple_loss=0.1828, pruned_loss=0.02102, over 4938.00 frames.], tot_loss[loss=0.147, simple_loss=0.2185, pruned_loss=0.03776, over 971620.69 frames.], batch size: 29, lr: 3.36e-04 2022-05-05 12:46:10,414 INFO [train.py:715] (0/8) Epoch 6, batch 11900, loss[loss=0.1427, simple_loss=0.2145, pruned_loss=0.03541, over 4868.00 frames.], tot_loss[loss=0.148, simple_loss=0.2194, pruned_loss=0.03831, over 971868.58 frames.], batch size: 32, lr: 3.36e-04 2022-05-05 12:46:49,722 INFO [train.py:715] (0/8) Epoch 6, batch 11950, loss[loss=0.184, simple_loss=0.2611, pruned_loss=0.05348, over 4886.00 frames.], tot_loss[loss=0.1486, simple_loss=0.2198, pruned_loss=0.0387, over 972522.41 frames.], batch size: 16, lr: 3.36e-04 2022-05-05 12:47:28,218 INFO [train.py:715] (0/8) Epoch 6, batch 12000, loss[loss=0.1502, simple_loss=0.224, pruned_loss=0.03822, over 4854.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2201, pruned_loss=0.03867, over 971182.01 frames.], batch size: 34, lr: 3.36e-04 2022-05-05 12:47:28,219 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 12:47:37,945 INFO [train.py:742] (0/8) Epoch 6, validation: loss=0.1091, simple_loss=0.1942, pruned_loss=0.01199, over 914524.00 frames. 
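
To track how tot_loss and the learning rate evolve over a run of this length, the summary entries can be scraped straight from the log text. A small parsing sketch, assuming each entry sits on its own line in the real log file and keeps the exact field layout shown above; the file name training.log and the function name parse_log are illustrative:

import re

# Matches per-batch summary entries such as:
#   Epoch 6, batch 12000, loss[...], tot_loss[loss=0.1487, ...], batch size: 34, lr: 3.36e-04
PATTERN = re.compile(
    r"Epoch (?P<epoch>\d+), batch (?P<batch>\d+), .*?"
    r"tot_loss\[loss=(?P<tot_loss>[\d.]+), .*?"
    r"lr: (?P<lr>[\d.e-]+)"
)

def parse_log(path):
    """Yield (epoch, batch, tot_loss, lr) tuples from a training log in this format."""
    with open(path) as f:
        text = f.read()
    for m in PATTERN.finditer(text):
        yield (int(m["epoch"]), int(m["batch"]),
               float(m["tot_loss"]), float(m["lr"]))

if __name__ == "__main__":
    for epoch, batch, tot_loss, lr in parse_log("training.log"):
        print(epoch, batch, tot_loss, lr)

Validation entries ("Epoch 6, validation: ...") carry no batch index and are skipped by this pattern; they would need a second, simpler regex if wanted.
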
2022-05-05 12:48:16,694 INFO [train.py:715] (0/8) Epoch 6, batch 12050, loss[loss=0.1651, simple_loss=0.2345, pruned_loss=0.04789, over 4922.00 frames.], tot_loss[loss=0.1482, simple_loss=0.2196, pruned_loss=0.03844, over 970706.98 frames.], batch size: 29, lr: 3.36e-04 2022-05-05 12:48:56,373 INFO [train.py:715] (0/8) Epoch 6, batch 12100, loss[loss=0.1427, simple_loss=0.2154, pruned_loss=0.03495, over 4917.00 frames.], tot_loss[loss=0.1494, simple_loss=0.2208, pruned_loss=0.03902, over 970889.38 frames.], batch size: 17, lr: 3.36e-04 2022-05-05 12:49:35,319 INFO [train.py:715] (0/8) Epoch 6, batch 12150, loss[loss=0.1443, simple_loss=0.2052, pruned_loss=0.04168, over 4653.00 frames.], tot_loss[loss=0.1493, simple_loss=0.221, pruned_loss=0.03877, over 971210.38 frames.], batch size: 13, lr: 3.36e-04 2022-05-05 12:50:14,104 INFO [train.py:715] (0/8) Epoch 6, batch 12200, loss[loss=0.1502, simple_loss=0.2313, pruned_loss=0.03454, over 4778.00 frames.], tot_loss[loss=0.1498, simple_loss=0.2213, pruned_loss=0.03915, over 971126.43 frames.], batch size: 17, lr: 3.36e-04 2022-05-05 12:50:53,313 INFO [train.py:715] (0/8) Epoch 6, batch 12250, loss[loss=0.156, simple_loss=0.2215, pruned_loss=0.04523, over 4753.00 frames.], tot_loss[loss=0.1515, simple_loss=0.2232, pruned_loss=0.03991, over 972160.18 frames.], batch size: 16, lr: 3.36e-04 2022-05-05 12:51:32,107 INFO [train.py:715] (0/8) Epoch 6, batch 12300, loss[loss=0.1343, simple_loss=0.2067, pruned_loss=0.03092, over 4874.00 frames.], tot_loss[loss=0.1501, simple_loss=0.2219, pruned_loss=0.03919, over 971568.89 frames.], batch size: 22, lr: 3.36e-04 2022-05-05 12:52:11,884 INFO [train.py:715] (0/8) Epoch 6, batch 12350, loss[loss=0.1464, simple_loss=0.2212, pruned_loss=0.03578, over 4748.00 frames.], tot_loss[loss=0.1504, simple_loss=0.2218, pruned_loss=0.03949, over 971508.08 frames.], batch size: 16, lr: 3.36e-04 2022-05-05 12:52:50,508 INFO [train.py:715] (0/8) Epoch 6, batch 12400, loss[loss=0.1306, simple_loss=0.1964, pruned_loss=0.03238, over 4981.00 frames.], tot_loss[loss=0.1491, simple_loss=0.2202, pruned_loss=0.03899, over 971596.37 frames.], batch size: 14, lr: 3.36e-04 2022-05-05 12:53:29,625 INFO [train.py:715] (0/8) Epoch 6, batch 12450, loss[loss=0.144, simple_loss=0.216, pruned_loss=0.03598, over 4976.00 frames.], tot_loss[loss=0.1485, simple_loss=0.2197, pruned_loss=0.03865, over 971661.30 frames.], batch size: 15, lr: 3.36e-04 2022-05-05 12:54:08,742 INFO [train.py:715] (0/8) Epoch 6, batch 12500, loss[loss=0.1417, simple_loss=0.221, pruned_loss=0.03122, over 4859.00 frames.], tot_loss[loss=0.149, simple_loss=0.2204, pruned_loss=0.03875, over 971505.15 frames.], batch size: 20, lr: 3.36e-04 2022-05-05 12:54:47,048 INFO [train.py:715] (0/8) Epoch 6, batch 12550, loss[loss=0.1776, simple_loss=0.2589, pruned_loss=0.04811, over 4909.00 frames.], tot_loss[loss=0.1495, simple_loss=0.221, pruned_loss=0.03903, over 971600.69 frames.], batch size: 19, lr: 3.36e-04 2022-05-05 12:55:26,405 INFO [train.py:715] (0/8) Epoch 6, batch 12600, loss[loss=0.1577, simple_loss=0.227, pruned_loss=0.04419, over 4936.00 frames.], tot_loss[loss=0.1491, simple_loss=0.2207, pruned_loss=0.03881, over 971832.51 frames.], batch size: 15, lr: 3.36e-04 2022-05-05 12:56:05,090 INFO [train.py:715] (0/8) Epoch 6, batch 12650, loss[loss=0.1222, simple_loss=0.1971, pruned_loss=0.02359, over 4800.00 frames.], tot_loss[loss=0.1486, simple_loss=0.2202, pruned_loss=0.03856, over 972026.71 frames.], batch size: 21, lr: 3.36e-04 2022-05-05 12:56:43,908 INFO 
[train.py:715] (0/8) Epoch 6, batch 12700, loss[loss=0.149, simple_loss=0.2182, pruned_loss=0.03985, over 4890.00 frames.], tot_loss[loss=0.1484, simple_loss=0.2199, pruned_loss=0.0385, over 972375.81 frames.], batch size: 16, lr: 3.36e-04 2022-05-05 12:57:22,044 INFO [train.py:715] (0/8) Epoch 6, batch 12750, loss[loss=0.1274, simple_loss=0.1891, pruned_loss=0.03287, over 4842.00 frames.], tot_loss[loss=0.1484, simple_loss=0.2197, pruned_loss=0.03853, over 972925.38 frames.], batch size: 12, lr: 3.36e-04 2022-05-05 12:58:01,005 INFO [train.py:715] (0/8) Epoch 6, batch 12800, loss[loss=0.1545, simple_loss=0.2283, pruned_loss=0.04031, over 4947.00 frames.], tot_loss[loss=0.1486, simple_loss=0.22, pruned_loss=0.03859, over 973190.70 frames.], batch size: 39, lr: 3.36e-04 2022-05-05 12:58:39,729 INFO [train.py:715] (0/8) Epoch 6, batch 12850, loss[loss=0.1705, simple_loss=0.2442, pruned_loss=0.04835, over 4901.00 frames.], tot_loss[loss=0.1485, simple_loss=0.2202, pruned_loss=0.03836, over 972545.36 frames.], batch size: 17, lr: 3.35e-04 2022-05-05 12:59:18,382 INFO [train.py:715] (0/8) Epoch 6, batch 12900, loss[loss=0.1314, simple_loss=0.2083, pruned_loss=0.02726, over 4811.00 frames.], tot_loss[loss=0.1485, simple_loss=0.2199, pruned_loss=0.03851, over 972467.34 frames.], batch size: 21, lr: 3.35e-04 2022-05-05 12:59:58,335 INFO [train.py:715] (0/8) Epoch 6, batch 12950, loss[loss=0.1361, simple_loss=0.2019, pruned_loss=0.0352, over 4705.00 frames.], tot_loss[loss=0.1483, simple_loss=0.2194, pruned_loss=0.03861, over 972215.71 frames.], batch size: 12, lr: 3.35e-04 2022-05-05 13:00:37,480 INFO [train.py:715] (0/8) Epoch 6, batch 13000, loss[loss=0.126, simple_loss=0.1909, pruned_loss=0.03052, over 4938.00 frames.], tot_loss[loss=0.1485, simple_loss=0.2197, pruned_loss=0.03861, over 972059.93 frames.], batch size: 21, lr: 3.35e-04 2022-05-05 13:01:16,470 INFO [train.py:715] (0/8) Epoch 6, batch 13050, loss[loss=0.134, simple_loss=0.202, pruned_loss=0.03297, over 4836.00 frames.], tot_loss[loss=0.1488, simple_loss=0.2202, pruned_loss=0.03877, over 971923.09 frames.], batch size: 26, lr: 3.35e-04 2022-05-05 13:01:54,763 INFO [train.py:715] (0/8) Epoch 6, batch 13100, loss[loss=0.1676, simple_loss=0.242, pruned_loss=0.04656, over 4733.00 frames.], tot_loss[loss=0.1481, simple_loss=0.2195, pruned_loss=0.03836, over 970696.68 frames.], batch size: 16, lr: 3.35e-04 2022-05-05 13:02:34,344 INFO [train.py:715] (0/8) Epoch 6, batch 13150, loss[loss=0.1247, simple_loss=0.2058, pruned_loss=0.0218, over 4976.00 frames.], tot_loss[loss=0.1479, simple_loss=0.2194, pruned_loss=0.03823, over 971389.80 frames.], batch size: 24, lr: 3.35e-04 2022-05-05 13:03:12,921 INFO [train.py:715] (0/8) Epoch 6, batch 13200, loss[loss=0.1347, simple_loss=0.2019, pruned_loss=0.03376, over 4654.00 frames.], tot_loss[loss=0.1486, simple_loss=0.22, pruned_loss=0.03863, over 971883.38 frames.], batch size: 13, lr: 3.35e-04 2022-05-05 13:03:51,763 INFO [train.py:715] (0/8) Epoch 6, batch 13250, loss[loss=0.168, simple_loss=0.2409, pruned_loss=0.04752, over 4925.00 frames.], tot_loss[loss=0.148, simple_loss=0.2196, pruned_loss=0.03824, over 972280.45 frames.], batch size: 29, lr: 3.35e-04 2022-05-05 13:04:30,641 INFO [train.py:715] (0/8) Epoch 6, batch 13300, loss[loss=0.147, simple_loss=0.2107, pruned_loss=0.04167, over 4963.00 frames.], tot_loss[loss=0.1476, simple_loss=0.219, pruned_loss=0.03813, over 972836.12 frames.], batch size: 21, lr: 3.35e-04 2022-05-05 13:05:09,757 INFO [train.py:715] (0/8) Epoch 6, batch 13350, 
loss[loss=0.159, simple_loss=0.233, pruned_loss=0.04247, over 4653.00 frames.], tot_loss[loss=0.148, simple_loss=0.2192, pruned_loss=0.03847, over 973675.84 frames.], batch size: 13, lr: 3.35e-04 2022-05-05 13:05:48,883 INFO [train.py:715] (0/8) Epoch 6, batch 13400, loss[loss=0.1597, simple_loss=0.2247, pruned_loss=0.04732, over 4953.00 frames.], tot_loss[loss=0.1484, simple_loss=0.2195, pruned_loss=0.03865, over 973605.90 frames.], batch size: 24, lr: 3.35e-04 2022-05-05 13:06:27,481 INFO [train.py:715] (0/8) Epoch 6, batch 13450, loss[loss=0.1408, simple_loss=0.2094, pruned_loss=0.03612, over 4859.00 frames.], tot_loss[loss=0.1486, simple_loss=0.2199, pruned_loss=0.03869, over 973986.20 frames.], batch size: 20, lr: 3.35e-04 2022-05-05 13:07:07,009 INFO [train.py:715] (0/8) Epoch 6, batch 13500, loss[loss=0.1501, simple_loss=0.2211, pruned_loss=0.03956, over 4787.00 frames.], tot_loss[loss=0.1484, simple_loss=0.2198, pruned_loss=0.03853, over 973743.97 frames.], batch size: 17, lr: 3.35e-04 2022-05-05 13:07:45,020 INFO [train.py:715] (0/8) Epoch 6, batch 13550, loss[loss=0.1328, simple_loss=0.2132, pruned_loss=0.02622, over 4800.00 frames.], tot_loss[loss=0.1485, simple_loss=0.2198, pruned_loss=0.0386, over 973563.33 frames.], batch size: 21, lr: 3.35e-04 2022-05-05 13:08:23,969 INFO [train.py:715] (0/8) Epoch 6, batch 13600, loss[loss=0.1549, simple_loss=0.224, pruned_loss=0.04293, over 4871.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2199, pruned_loss=0.03877, over 972736.88 frames.], batch size: 32, lr: 3.35e-04 2022-05-05 13:09:03,109 INFO [train.py:715] (0/8) Epoch 6, batch 13650, loss[loss=0.161, simple_loss=0.2344, pruned_loss=0.0438, over 4924.00 frames.], tot_loss[loss=0.1492, simple_loss=0.2203, pruned_loss=0.03908, over 971408.32 frames.], batch size: 17, lr: 3.35e-04 2022-05-05 13:09:42,435 INFO [train.py:715] (0/8) Epoch 6, batch 13700, loss[loss=0.1527, simple_loss=0.2264, pruned_loss=0.03947, over 4792.00 frames.], tot_loss[loss=0.1491, simple_loss=0.22, pruned_loss=0.03903, over 971534.48 frames.], batch size: 14, lr: 3.35e-04 2022-05-05 13:10:21,543 INFO [train.py:715] (0/8) Epoch 6, batch 13750, loss[loss=0.1331, simple_loss=0.2059, pruned_loss=0.03015, over 4771.00 frames.], tot_loss[loss=0.1492, simple_loss=0.2201, pruned_loss=0.03917, over 971580.00 frames.], batch size: 17, lr: 3.35e-04 2022-05-05 13:11:00,143 INFO [train.py:715] (0/8) Epoch 6, batch 13800, loss[loss=0.1487, simple_loss=0.2284, pruned_loss=0.03448, over 4846.00 frames.], tot_loss[loss=0.149, simple_loss=0.2203, pruned_loss=0.03884, over 972072.54 frames.], batch size: 32, lr: 3.35e-04 2022-05-05 13:11:40,113 INFO [train.py:715] (0/8) Epoch 6, batch 13850, loss[loss=0.1552, simple_loss=0.2116, pruned_loss=0.04945, over 4837.00 frames.], tot_loss[loss=0.149, simple_loss=0.22, pruned_loss=0.03901, over 971107.71 frames.], batch size: 15, lr: 3.35e-04 2022-05-05 13:12:18,446 INFO [train.py:715] (0/8) Epoch 6, batch 13900, loss[loss=0.1297, simple_loss=0.1989, pruned_loss=0.03027, over 4723.00 frames.], tot_loss[loss=0.1485, simple_loss=0.2192, pruned_loss=0.03892, over 970857.94 frames.], batch size: 12, lr: 3.35e-04 2022-05-05 13:12:57,458 INFO [train.py:715] (0/8) Epoch 6, batch 13950, loss[loss=0.1263, simple_loss=0.1989, pruned_loss=0.02679, over 4889.00 frames.], tot_loss[loss=0.1472, simple_loss=0.2182, pruned_loss=0.03813, over 971692.95 frames.], batch size: 22, lr: 3.35e-04 2022-05-05 13:13:36,060 INFO [train.py:715] (0/8) Epoch 6, batch 14000, loss[loss=0.1254, simple_loss=0.1982, 
pruned_loss=0.02629, over 4849.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2186, pruned_loss=0.03824, over 971674.46 frames.], batch size: 13, lr: 3.35e-04 2022-05-05 13:14:15,108 INFO [train.py:715] (0/8) Epoch 6, batch 14050, loss[loss=0.1646, simple_loss=0.2322, pruned_loss=0.04845, over 4685.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2184, pruned_loss=0.0383, over 972140.53 frames.], batch size: 15, lr: 3.35e-04 2022-05-05 13:14:53,528 INFO [train.py:715] (0/8) Epoch 6, batch 14100, loss[loss=0.1328, simple_loss=0.2082, pruned_loss=0.02874, over 4966.00 frames.], tot_loss[loss=0.1481, simple_loss=0.2191, pruned_loss=0.03856, over 973093.85 frames.], batch size: 21, lr: 3.35e-04 2022-05-05 13:15:32,015 INFO [train.py:715] (0/8) Epoch 6, batch 14150, loss[loss=0.1425, simple_loss=0.2233, pruned_loss=0.03086, over 4980.00 frames.], tot_loss[loss=0.1483, simple_loss=0.2199, pruned_loss=0.03841, over 972788.93 frames.], batch size: 24, lr: 3.35e-04 2022-05-05 13:16:11,445 INFO [train.py:715] (0/8) Epoch 6, batch 14200, loss[loss=0.1655, simple_loss=0.2417, pruned_loss=0.04463, over 4690.00 frames.], tot_loss[loss=0.1484, simple_loss=0.2201, pruned_loss=0.03832, over 973043.72 frames.], batch size: 15, lr: 3.34e-04 2022-05-05 13:16:50,084 INFO [train.py:715] (0/8) Epoch 6, batch 14250, loss[loss=0.1272, simple_loss=0.1962, pruned_loss=0.02914, over 4986.00 frames.], tot_loss[loss=0.1489, simple_loss=0.22, pruned_loss=0.03883, over 972379.76 frames.], batch size: 16, lr: 3.34e-04 2022-05-05 13:17:29,123 INFO [train.py:715] (0/8) Epoch 6, batch 14300, loss[loss=0.1581, simple_loss=0.2254, pruned_loss=0.04539, over 4853.00 frames.], tot_loss[loss=0.1491, simple_loss=0.22, pruned_loss=0.03912, over 972890.01 frames.], batch size: 20, lr: 3.34e-04 2022-05-05 13:18:07,578 INFO [train.py:715] (0/8) Epoch 6, batch 14350, loss[loss=0.1649, simple_loss=0.2325, pruned_loss=0.04867, over 4754.00 frames.], tot_loss[loss=0.1502, simple_loss=0.2211, pruned_loss=0.0397, over 973091.47 frames.], batch size: 19, lr: 3.34e-04 2022-05-05 13:18:47,505 INFO [train.py:715] (0/8) Epoch 6, batch 14400, loss[loss=0.1574, simple_loss=0.2293, pruned_loss=0.04279, over 4803.00 frames.], tot_loss[loss=0.1502, simple_loss=0.2211, pruned_loss=0.0396, over 972517.84 frames.], batch size: 25, lr: 3.34e-04 2022-05-05 13:19:25,853 INFO [train.py:715] (0/8) Epoch 6, batch 14450, loss[loss=0.1381, simple_loss=0.2103, pruned_loss=0.03295, over 4878.00 frames.], tot_loss[loss=0.1506, simple_loss=0.2215, pruned_loss=0.03987, over 972311.25 frames.], batch size: 16, lr: 3.34e-04 2022-05-05 13:20:04,244 INFO [train.py:715] (0/8) Epoch 6, batch 14500, loss[loss=0.1716, simple_loss=0.2468, pruned_loss=0.04813, over 4988.00 frames.], tot_loss[loss=0.15, simple_loss=0.2211, pruned_loss=0.03941, over 972430.40 frames.], batch size: 26, lr: 3.34e-04 2022-05-05 13:20:43,932 INFO [train.py:715] (0/8) Epoch 6, batch 14550, loss[loss=0.1396, simple_loss=0.2127, pruned_loss=0.03328, over 4725.00 frames.], tot_loss[loss=0.1493, simple_loss=0.22, pruned_loss=0.03927, over 972499.42 frames.], batch size: 16, lr: 3.34e-04 2022-05-05 13:21:22,649 INFO [train.py:715] (0/8) Epoch 6, batch 14600, loss[loss=0.09866, simple_loss=0.1784, pruned_loss=0.009439, over 4795.00 frames.], tot_loss[loss=0.149, simple_loss=0.2199, pruned_loss=0.03905, over 971916.96 frames.], batch size: 24, lr: 3.34e-04 2022-05-05 13:22:01,120 INFO [train.py:715] (0/8) Epoch 6, batch 14650, loss[loss=0.1415, simple_loss=0.2116, pruned_loss=0.03574, over 4910.00 
frames.], tot_loss[loss=0.1496, simple_loss=0.2205, pruned_loss=0.03932, over 972826.67 frames.], batch size: 19, lr: 3.34e-04 2022-05-05 13:22:40,128 INFO [train.py:715] (0/8) Epoch 6, batch 14700, loss[loss=0.1921, simple_loss=0.2518, pruned_loss=0.06621, over 4905.00 frames.], tot_loss[loss=0.1499, simple_loss=0.2207, pruned_loss=0.03952, over 973221.28 frames.], batch size: 17, lr: 3.34e-04 2022-05-05 13:23:19,674 INFO [train.py:715] (0/8) Epoch 6, batch 14750, loss[loss=0.1737, simple_loss=0.2521, pruned_loss=0.04766, over 4811.00 frames.], tot_loss[loss=0.1503, simple_loss=0.221, pruned_loss=0.03977, over 973263.49 frames.], batch size: 25, lr: 3.34e-04 2022-05-05 13:23:57,828 INFO [train.py:715] (0/8) Epoch 6, batch 14800, loss[loss=0.1452, simple_loss=0.2195, pruned_loss=0.03549, over 4927.00 frames.], tot_loss[loss=0.1494, simple_loss=0.2201, pruned_loss=0.03933, over 973192.27 frames.], batch size: 39, lr: 3.34e-04 2022-05-05 13:24:35,993 INFO [train.py:715] (0/8) Epoch 6, batch 14850, loss[loss=0.1308, simple_loss=0.1986, pruned_loss=0.03145, over 4839.00 frames.], tot_loss[loss=0.1501, simple_loss=0.2206, pruned_loss=0.03975, over 973523.14 frames.], batch size: 12, lr: 3.34e-04 2022-05-05 13:25:15,102 INFO [train.py:715] (0/8) Epoch 6, batch 14900, loss[loss=0.1661, simple_loss=0.2352, pruned_loss=0.04847, over 4770.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2204, pruned_loss=0.03954, over 972392.52 frames.], batch size: 17, lr: 3.34e-04 2022-05-05 13:25:53,356 INFO [train.py:715] (0/8) Epoch 6, batch 14950, loss[loss=0.1386, simple_loss=0.203, pruned_loss=0.03714, over 4948.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2202, pruned_loss=0.0395, over 972286.64 frames.], batch size: 21, lr: 3.34e-04 2022-05-05 13:26:32,022 INFO [train.py:715] (0/8) Epoch 6, batch 15000, loss[loss=0.1667, simple_loss=0.2414, pruned_loss=0.04603, over 4760.00 frames.], tot_loss[loss=0.1489, simple_loss=0.2195, pruned_loss=0.03914, over 972640.93 frames.], batch size: 17, lr: 3.34e-04 2022-05-05 13:26:32,023 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 13:26:41,820 INFO [train.py:742] (0/8) Epoch 6, validation: loss=0.1091, simple_loss=0.1941, pruned_loss=0.01202, over 914524.00 frames. 
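
The tot_loss figures above are reported "over" a roughly constant ~970k frames even though each batch contributes only a few thousand frames, which suggests an exponentially decayed running aggregate rather than a plain per-epoch average. A minimal sketch of such an aggregate; the 0.995 decay factor, the 4870-frame batch, and the 0.15 per-frame loss are illustrative assumptions, not values read from this log:

# Exponentially decayed aggregate that reproduces the pattern seen above:
# per-batch frame counts of a few thousand, tot_loss reported "over" ~970k frames.
DECAY = 0.995  # assumed per-batch decay factor

def update(running, batch_loss_sum, batch_frames):
    """running = (decayed loss sum, decayed frame count)."""
    loss_sum, frames = running
    return (loss_sum * DECAY + batch_loss_sum,
            frames * DECAY + batch_frames)

running = (0.0, 0.0)
for step in range(2000):
    batch_frames = 4870.0                  # typical per-batch frame count above
    batch_loss_sum = 0.15 * batch_frames   # pretend constant per-frame loss
    running = update(running, batch_loss_sum, batch_frames)

loss_sum, frames = running
print(f"tot_loss={loss_sum / frames:.4f} over {frames:.2f} frames")
# Steady state: frames -> 4870 / (1 - 0.995) = 974000, close to the ~97xk totals above.
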
2022-05-05 13:27:20,598 INFO [train.py:715] (0/8) Epoch 6, batch 15050, loss[loss=0.1736, simple_loss=0.2551, pruned_loss=0.04604, over 4893.00 frames.], tot_loss[loss=0.1491, simple_loss=0.2199, pruned_loss=0.03917, over 972469.05 frames.], batch size: 22, lr: 3.34e-04 2022-05-05 13:27:59,348 INFO [train.py:715] (0/8) Epoch 6, batch 15100, loss[loss=0.1546, simple_loss=0.2142, pruned_loss=0.04751, over 4828.00 frames.], tot_loss[loss=0.1501, simple_loss=0.2207, pruned_loss=0.03974, over 972972.83 frames.], batch size: 15, lr: 3.34e-04 2022-05-05 13:28:32,833 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-224000.pt 2022-05-05 13:28:41,259 INFO [train.py:715] (0/8) Epoch 6, batch 15150, loss[loss=0.1717, simple_loss=0.2329, pruned_loss=0.05521, over 4784.00 frames.], tot_loss[loss=0.1498, simple_loss=0.2202, pruned_loss=0.03975, over 972979.02 frames.], batch size: 18, lr: 3.34e-04 2022-05-05 13:29:19,828 INFO [train.py:715] (0/8) Epoch 6, batch 15200, loss[loss=0.1349, simple_loss=0.2045, pruned_loss=0.03269, over 4923.00 frames.], tot_loss[loss=0.1499, simple_loss=0.22, pruned_loss=0.03993, over 972959.61 frames.], batch size: 23, lr: 3.34e-04 2022-05-05 13:29:58,371 INFO [train.py:715] (0/8) Epoch 6, batch 15250, loss[loss=0.1645, simple_loss=0.2388, pruned_loss=0.04513, over 4780.00 frames.], tot_loss[loss=0.1503, simple_loss=0.2208, pruned_loss=0.03991, over 972795.06 frames.], batch size: 17, lr: 3.34e-04 2022-05-05 13:30:37,904 INFO [train.py:715] (0/8) Epoch 6, batch 15300, loss[loss=0.158, simple_loss=0.2329, pruned_loss=0.04149, over 4875.00 frames.], tot_loss[loss=0.1499, simple_loss=0.2207, pruned_loss=0.03959, over 972237.41 frames.], batch size: 16, lr: 3.34e-04 2022-05-05 13:31:15,930 INFO [train.py:715] (0/8) Epoch 6, batch 15350, loss[loss=0.1483, simple_loss=0.2233, pruned_loss=0.03663, over 4820.00 frames.], tot_loss[loss=0.1501, simple_loss=0.2213, pruned_loss=0.03951, over 972320.74 frames.], batch size: 13, lr: 3.34e-04 2022-05-05 13:31:54,940 INFO [train.py:715] (0/8) Epoch 6, batch 15400, loss[loss=0.1297, simple_loss=0.2003, pruned_loss=0.02951, over 4750.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2206, pruned_loss=0.03929, over 972596.59 frames.], batch size: 19, lr: 3.34e-04 2022-05-05 13:32:33,861 INFO [train.py:715] (0/8) Epoch 6, batch 15450, loss[loss=0.1574, simple_loss=0.2338, pruned_loss=0.04046, over 4944.00 frames.], tot_loss[loss=0.1494, simple_loss=0.2201, pruned_loss=0.03932, over 972370.23 frames.], batch size: 35, lr: 3.34e-04 2022-05-05 13:33:13,324 INFO [train.py:715] (0/8) Epoch 6, batch 15500, loss[loss=0.1604, simple_loss=0.2249, pruned_loss=0.04795, over 4684.00 frames.], tot_loss[loss=0.149, simple_loss=0.2198, pruned_loss=0.03907, over 972395.84 frames.], batch size: 15, lr: 3.34e-04 2022-05-05 13:33:51,500 INFO [train.py:715] (0/8) Epoch 6, batch 15550, loss[loss=0.1372, simple_loss=0.1969, pruned_loss=0.03875, over 4747.00 frames.], tot_loss[loss=0.1489, simple_loss=0.2198, pruned_loss=0.03899, over 971739.92 frames.], batch size: 19, lr: 3.33e-04 2022-05-05 13:34:30,395 INFO [train.py:715] (0/8) Epoch 6, batch 15600, loss[loss=0.148, simple_loss=0.2141, pruned_loss=0.04096, over 4971.00 frames.], tot_loss[loss=0.1493, simple_loss=0.2199, pruned_loss=0.03936, over 971945.73 frames.], batch size: 15, lr: 3.33e-04 2022-05-05 13:35:09,324 INFO [train.py:715] (0/8) Epoch 6, batch 15650, loss[loss=0.134, simple_loss=0.2011, pruned_loss=0.03342, over 4793.00 frames.], tot_loss[loss=0.1499, 
simple_loss=0.2205, pruned_loss=0.03962, over 972332.60 frames.], batch size: 25, lr: 3.33e-04 2022-05-05 13:35:47,366 INFO [train.py:715] (0/8) Epoch 6, batch 15700, loss[loss=0.1105, simple_loss=0.182, pruned_loss=0.01948, over 4756.00 frames.], tot_loss[loss=0.1491, simple_loss=0.2196, pruned_loss=0.03932, over 972366.78 frames.], batch size: 19, lr: 3.33e-04 2022-05-05 13:36:26,050 INFO [train.py:715] (0/8) Epoch 6, batch 15750, loss[loss=0.1467, simple_loss=0.2209, pruned_loss=0.03625, over 4930.00 frames.], tot_loss[loss=0.1491, simple_loss=0.2198, pruned_loss=0.03917, over 972877.16 frames.], batch size: 18, lr: 3.33e-04 2022-05-05 13:37:04,789 INFO [train.py:715] (0/8) Epoch 6, batch 15800, loss[loss=0.1266, simple_loss=0.1988, pruned_loss=0.02714, over 4746.00 frames.], tot_loss[loss=0.1479, simple_loss=0.2189, pruned_loss=0.03844, over 973102.51 frames.], batch size: 19, lr: 3.33e-04 2022-05-05 13:37:43,835 INFO [train.py:715] (0/8) Epoch 6, batch 15850, loss[loss=0.1599, simple_loss=0.2353, pruned_loss=0.04226, over 4890.00 frames.], tot_loss[loss=0.1484, simple_loss=0.2194, pruned_loss=0.03871, over 973072.97 frames.], batch size: 17, lr: 3.33e-04 2022-05-05 13:38:22,279 INFO [train.py:715] (0/8) Epoch 6, batch 15900, loss[loss=0.1498, simple_loss=0.2245, pruned_loss=0.03752, over 4980.00 frames.], tot_loss[loss=0.1492, simple_loss=0.2203, pruned_loss=0.03902, over 973566.47 frames.], batch size: 24, lr: 3.33e-04 2022-05-05 13:39:00,645 INFO [train.py:715] (0/8) Epoch 6, batch 15950, loss[loss=0.1408, simple_loss=0.214, pruned_loss=0.03376, over 4924.00 frames.], tot_loss[loss=0.1484, simple_loss=0.2197, pruned_loss=0.03853, over 974091.63 frames.], batch size: 18, lr: 3.33e-04 2022-05-05 13:39:39,971 INFO [train.py:715] (0/8) Epoch 6, batch 16000, loss[loss=0.1281, simple_loss=0.1977, pruned_loss=0.0292, over 4691.00 frames.], tot_loss[loss=0.1481, simple_loss=0.2193, pruned_loss=0.0384, over 972896.44 frames.], batch size: 15, lr: 3.33e-04 2022-05-05 13:40:18,427 INFO [train.py:715] (0/8) Epoch 6, batch 16050, loss[loss=0.1148, simple_loss=0.187, pruned_loss=0.02127, over 4742.00 frames.], tot_loss[loss=0.1482, simple_loss=0.2195, pruned_loss=0.03844, over 972607.83 frames.], batch size: 16, lr: 3.33e-04 2022-05-05 13:40:56,903 INFO [train.py:715] (0/8) Epoch 6, batch 16100, loss[loss=0.1332, simple_loss=0.211, pruned_loss=0.02773, over 4782.00 frames.], tot_loss[loss=0.1483, simple_loss=0.2195, pruned_loss=0.03854, over 973461.42 frames.], batch size: 18, lr: 3.33e-04 2022-05-05 13:41:35,293 INFO [train.py:715] (0/8) Epoch 6, batch 16150, loss[loss=0.1234, simple_loss=0.1933, pruned_loss=0.02674, over 4982.00 frames.], tot_loss[loss=0.1486, simple_loss=0.22, pruned_loss=0.03864, over 973495.74 frames.], batch size: 25, lr: 3.33e-04 2022-05-05 13:42:14,794 INFO [train.py:715] (0/8) Epoch 6, batch 16200, loss[loss=0.1207, simple_loss=0.1938, pruned_loss=0.0238, over 4839.00 frames.], tot_loss[loss=0.1483, simple_loss=0.2198, pruned_loss=0.03837, over 972574.40 frames.], batch size: 26, lr: 3.33e-04 2022-05-05 13:42:53,106 INFO [train.py:715] (0/8) Epoch 6, batch 16250, loss[loss=0.2014, simple_loss=0.2801, pruned_loss=0.0614, over 4886.00 frames.], tot_loss[loss=0.1492, simple_loss=0.2207, pruned_loss=0.03881, over 971977.86 frames.], batch size: 19, lr: 3.33e-04 2022-05-05 13:43:31,725 INFO [train.py:715] (0/8) Epoch 6, batch 16300, loss[loss=0.1339, simple_loss=0.2086, pruned_loss=0.02961, over 4881.00 frames.], tot_loss[loss=0.1488, simple_loss=0.2205, pruned_loss=0.03849, 
over 971846.73 frames.], batch size: 20, lr: 3.33e-04 2022-05-05 13:44:11,200 INFO [train.py:715] (0/8) Epoch 6, batch 16350, loss[loss=0.155, simple_loss=0.2302, pruned_loss=0.03996, over 4984.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2204, pruned_loss=0.03852, over 972108.46 frames.], batch size: 35, lr: 3.33e-04 2022-05-05 13:44:49,505 INFO [train.py:715] (0/8) Epoch 6, batch 16400, loss[loss=0.1456, simple_loss=0.2156, pruned_loss=0.03785, over 4866.00 frames.], tot_loss[loss=0.149, simple_loss=0.2205, pruned_loss=0.03873, over 972079.79 frames.], batch size: 20, lr: 3.33e-04 2022-05-05 13:45:28,821 INFO [train.py:715] (0/8) Epoch 6, batch 16450, loss[loss=0.1502, simple_loss=0.2256, pruned_loss=0.03735, over 4799.00 frames.], tot_loss[loss=0.148, simple_loss=0.2199, pruned_loss=0.03804, over 971785.43 frames.], batch size: 14, lr: 3.33e-04 2022-05-05 13:46:07,625 INFO [train.py:715] (0/8) Epoch 6, batch 16500, loss[loss=0.1302, simple_loss=0.1947, pruned_loss=0.03288, over 4896.00 frames.], tot_loss[loss=0.1485, simple_loss=0.2197, pruned_loss=0.03859, over 972239.98 frames.], batch size: 18, lr: 3.33e-04 2022-05-05 13:46:46,574 INFO [train.py:715] (0/8) Epoch 6, batch 16550, loss[loss=0.1644, simple_loss=0.2247, pruned_loss=0.0521, over 4874.00 frames.], tot_loss[loss=0.1474, simple_loss=0.2185, pruned_loss=0.03813, over 971479.22 frames.], batch size: 16, lr: 3.33e-04 2022-05-05 13:47:24,410 INFO [train.py:715] (0/8) Epoch 6, batch 16600, loss[loss=0.1361, simple_loss=0.2275, pruned_loss=0.02229, over 4770.00 frames.], tot_loss[loss=0.1477, simple_loss=0.2191, pruned_loss=0.03815, over 971936.16 frames.], batch size: 18, lr: 3.33e-04 2022-05-05 13:48:03,150 INFO [train.py:715] (0/8) Epoch 6, batch 16650, loss[loss=0.1799, simple_loss=0.2519, pruned_loss=0.05401, over 4913.00 frames.], tot_loss[loss=0.1486, simple_loss=0.22, pruned_loss=0.03857, over 972268.37 frames.], batch size: 39, lr: 3.33e-04 2022-05-05 13:48:42,810 INFO [train.py:715] (0/8) Epoch 6, batch 16700, loss[loss=0.168, simple_loss=0.2484, pruned_loss=0.04378, over 4910.00 frames.], tot_loss[loss=0.1476, simple_loss=0.219, pruned_loss=0.03811, over 972388.27 frames.], batch size: 17, lr: 3.33e-04 2022-05-05 13:49:21,218 INFO [train.py:715] (0/8) Epoch 6, batch 16750, loss[loss=0.1563, simple_loss=0.2277, pruned_loss=0.04249, over 4688.00 frames.], tot_loss[loss=0.1486, simple_loss=0.2195, pruned_loss=0.03887, over 972947.95 frames.], batch size: 15, lr: 3.33e-04 2022-05-05 13:50:00,115 INFO [train.py:715] (0/8) Epoch 6, batch 16800, loss[loss=0.1355, simple_loss=0.2084, pruned_loss=0.0313, over 4887.00 frames.], tot_loss[loss=0.148, simple_loss=0.219, pruned_loss=0.0385, over 972445.60 frames.], batch size: 32, lr: 3.33e-04 2022-05-05 13:50:39,322 INFO [train.py:715] (0/8) Epoch 6, batch 16850, loss[loss=0.1431, simple_loss=0.2147, pruned_loss=0.03577, over 4923.00 frames.], tot_loss[loss=0.1481, simple_loss=0.219, pruned_loss=0.03859, over 973364.21 frames.], batch size: 18, lr: 3.33e-04 2022-05-05 13:51:19,118 INFO [train.py:715] (0/8) Epoch 6, batch 16900, loss[loss=0.1484, simple_loss=0.2322, pruned_loss=0.03229, over 4820.00 frames.], tot_loss[loss=0.1481, simple_loss=0.2191, pruned_loss=0.03855, over 972785.64 frames.], batch size: 25, lr: 3.32e-04 2022-05-05 13:51:57,171 INFO [train.py:715] (0/8) Epoch 6, batch 16950, loss[loss=0.1421, simple_loss=0.2022, pruned_loss=0.04106, over 4775.00 frames.], tot_loss[loss=0.1477, simple_loss=0.2187, pruned_loss=0.03836, over 972377.04 frames.], batch size: 17, lr: 
3.32e-04 2022-05-05 13:52:36,228 INFO [train.py:715] (0/8) Epoch 6, batch 17000, loss[loss=0.1595, simple_loss=0.2402, pruned_loss=0.03947, over 4790.00 frames.], tot_loss[loss=0.1485, simple_loss=0.2195, pruned_loss=0.03874, over 972624.18 frames.], batch size: 14, lr: 3.32e-04 2022-05-05 13:53:15,745 INFO [train.py:715] (0/8) Epoch 6, batch 17050, loss[loss=0.1414, simple_loss=0.2017, pruned_loss=0.04052, over 4890.00 frames.], tot_loss[loss=0.1479, simple_loss=0.2189, pruned_loss=0.03845, over 972026.51 frames.], batch size: 32, lr: 3.32e-04 2022-05-05 13:53:53,899 INFO [train.py:715] (0/8) Epoch 6, batch 17100, loss[loss=0.1537, simple_loss=0.226, pruned_loss=0.04075, over 4958.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2195, pruned_loss=0.03894, over 971587.84 frames.], batch size: 24, lr: 3.32e-04 2022-05-05 13:54:32,772 INFO [train.py:715] (0/8) Epoch 6, batch 17150, loss[loss=0.161, simple_loss=0.2286, pruned_loss=0.04673, over 4849.00 frames.], tot_loss[loss=0.1485, simple_loss=0.2192, pruned_loss=0.03893, over 970922.17 frames.], batch size: 15, lr: 3.32e-04 2022-05-05 13:55:11,747 INFO [train.py:715] (0/8) Epoch 6, batch 17200, loss[loss=0.1733, simple_loss=0.2467, pruned_loss=0.04995, over 4945.00 frames.], tot_loss[loss=0.149, simple_loss=0.2197, pruned_loss=0.03913, over 971501.65 frames.], batch size: 23, lr: 3.32e-04 2022-05-05 13:55:51,109 INFO [train.py:715] (0/8) Epoch 6, batch 17250, loss[loss=0.1673, simple_loss=0.2438, pruned_loss=0.04537, over 4877.00 frames.], tot_loss[loss=0.1486, simple_loss=0.2192, pruned_loss=0.03901, over 972432.18 frames.], batch size: 16, lr: 3.32e-04 2022-05-05 13:56:29,073 INFO [train.py:715] (0/8) Epoch 6, batch 17300, loss[loss=0.1488, simple_loss=0.2258, pruned_loss=0.03588, over 4745.00 frames.], tot_loss[loss=0.1488, simple_loss=0.2197, pruned_loss=0.03895, over 972646.49 frames.], batch size: 19, lr: 3.32e-04 2022-05-05 13:57:07,889 INFO [train.py:715] (0/8) Epoch 6, batch 17350, loss[loss=0.1525, simple_loss=0.2236, pruned_loss=0.04065, over 4928.00 frames.], tot_loss[loss=0.1479, simple_loss=0.2192, pruned_loss=0.03827, over 973168.88 frames.], batch size: 21, lr: 3.32e-04 2022-05-05 13:57:47,276 INFO [train.py:715] (0/8) Epoch 6, batch 17400, loss[loss=0.1252, simple_loss=0.1921, pruned_loss=0.02916, over 4773.00 frames.], tot_loss[loss=0.1485, simple_loss=0.2195, pruned_loss=0.03876, over 973404.77 frames.], batch size: 18, lr: 3.32e-04 2022-05-05 13:58:26,210 INFO [train.py:715] (0/8) Epoch 6, batch 17450, loss[loss=0.1418, simple_loss=0.2073, pruned_loss=0.03817, over 4872.00 frames.], tot_loss[loss=0.1485, simple_loss=0.2199, pruned_loss=0.03861, over 972927.53 frames.], batch size: 32, lr: 3.32e-04 2022-05-05 13:59:04,828 INFO [train.py:715] (0/8) Epoch 6, batch 17500, loss[loss=0.1409, simple_loss=0.2207, pruned_loss=0.03056, over 4777.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2198, pruned_loss=0.03883, over 972580.93 frames.], batch size: 17, lr: 3.32e-04 2022-05-05 13:59:43,980 INFO [train.py:715] (0/8) Epoch 6, batch 17550, loss[loss=0.1466, simple_loss=0.2214, pruned_loss=0.03591, over 4802.00 frames.], tot_loss[loss=0.1479, simple_loss=0.219, pruned_loss=0.03842, over 972201.37 frames.], batch size: 24, lr: 3.32e-04 2022-05-05 14:00:23,860 INFO [train.py:715] (0/8) Epoch 6, batch 17600, loss[loss=0.1292, simple_loss=0.1999, pruned_loss=0.02929, over 4807.00 frames.], tot_loss[loss=0.1482, simple_loss=0.2192, pruned_loss=0.03863, over 972737.20 frames.], batch size: 13, lr: 3.32e-04 2022-05-05 14:01:01,422 
INFO [train.py:715] (0/8) Epoch 6, batch 17650, loss[loss=0.1159, simple_loss=0.1963, pruned_loss=0.01778, over 4780.00 frames.], tot_loss[loss=0.1471, simple_loss=0.2183, pruned_loss=0.03791, over 972315.79 frames.], batch size: 17, lr: 3.32e-04 2022-05-05 14:01:40,862 INFO [train.py:715] (0/8) Epoch 6, batch 17700, loss[loss=0.1481, simple_loss=0.2258, pruned_loss=0.03523, over 4905.00 frames.], tot_loss[loss=0.1481, simple_loss=0.219, pruned_loss=0.03859, over 972244.95 frames.], batch size: 19, lr: 3.32e-04 2022-05-05 14:02:20,248 INFO [train.py:715] (0/8) Epoch 6, batch 17750, loss[loss=0.17, simple_loss=0.2469, pruned_loss=0.04655, over 4797.00 frames.], tot_loss[loss=0.1491, simple_loss=0.2198, pruned_loss=0.03917, over 971291.58 frames.], batch size: 21, lr: 3.32e-04 2022-05-05 14:02:58,603 INFO [train.py:715] (0/8) Epoch 6, batch 17800, loss[loss=0.1992, simple_loss=0.2607, pruned_loss=0.06886, over 4821.00 frames.], tot_loss[loss=0.1493, simple_loss=0.2198, pruned_loss=0.03939, over 970952.68 frames.], batch size: 26, lr: 3.32e-04 2022-05-05 14:03:37,537 INFO [train.py:715] (0/8) Epoch 6, batch 17850, loss[loss=0.1279, simple_loss=0.2059, pruned_loss=0.02499, over 4762.00 frames.], tot_loss[loss=0.1489, simple_loss=0.22, pruned_loss=0.03894, over 971322.58 frames.], batch size: 16, lr: 3.32e-04 2022-05-05 14:04:16,743 INFO [train.py:715] (0/8) Epoch 6, batch 17900, loss[loss=0.1834, simple_loss=0.2497, pruned_loss=0.05861, over 4914.00 frames.], tot_loss[loss=0.1482, simple_loss=0.2196, pruned_loss=0.03844, over 971514.77 frames.], batch size: 18, lr: 3.32e-04 2022-05-05 14:04:56,307 INFO [train.py:715] (0/8) Epoch 6, batch 17950, loss[loss=0.1171, simple_loss=0.1923, pruned_loss=0.02092, over 4738.00 frames.], tot_loss[loss=0.149, simple_loss=0.2202, pruned_loss=0.03889, over 971338.99 frames.], batch size: 12, lr: 3.32e-04 2022-05-05 14:05:34,135 INFO [train.py:715] (0/8) Epoch 6, batch 18000, loss[loss=0.1533, simple_loss=0.2211, pruned_loss=0.04269, over 4983.00 frames.], tot_loss[loss=0.148, simple_loss=0.2189, pruned_loss=0.03853, over 971463.86 frames.], batch size: 15, lr: 3.32e-04 2022-05-05 14:05:34,137 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 14:05:43,885 INFO [train.py:742] (0/8) Epoch 6, validation: loss=0.1087, simple_loss=0.1939, pruned_loss=0.0118, over 914524.00 frames. 
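The training entries above all share one shape: timestamp, epoch and batch index, the loss on the current batch (loss[...], over roughly 4,600-5,000 frames here), a running average (tot_loss[...], over roughly 970k-974k frames in this stretch), the batch size, and the current learning rate. That regularity makes the log easy to post-process; below is a minimal parsing sketch, not part of train.py, that assumes exactly the entry format shown above (the path "train.log" is a placeholder).

```python
import re

# Minimal sketch (not part of train.py): pull epoch, batch index, the running
# tot_loss fields, batch size and learning rate out of entries in exactly the
# format shown above. re.DOTALL lets an entry span a wrapped line.
ENTRY = re.compile(
    r"Epoch (\d+), batch (\d+),.*?"
    r"tot_loss\[loss=([\d.]+), simple_loss=([\d.]+), pruned_loss=([\d.]+),.*?\], "
    r"batch size: (\d+), lr: ([\d.e+-]+)",
    re.DOTALL,
)

def parse_log(path="train.log"):
    """Yield (epoch, batch, tot_loss, simple_loss, pruned_loss, batch_size, lr)."""
    with open(path) as f:
        text = f.read()
    for m in ENTRY.finditer(text):
        ep, b, tot, simple, pruned, bsz, lr = m.groups()
        yield (int(ep), int(b), float(tot), float(simple),
               float(pruned), int(bsz), float(lr))
```

The validation reports ("Epoch 6, validation: loss=...") deliberately do not match this pattern, so they can be collected separately if needed.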
2022-05-05 14:06:22,336 INFO [train.py:715] (0/8) Epoch 6, batch 18050, loss[loss=0.1599, simple_loss=0.2259, pruned_loss=0.04699, over 4976.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2195, pruned_loss=0.03889, over 971747.15 frames.], batch size: 35, lr: 3.32e-04 2022-05-05 14:07:01,816 INFO [train.py:715] (0/8) Epoch 6, batch 18100, loss[loss=0.1326, simple_loss=0.2136, pruned_loss=0.02582, over 4804.00 frames.], tot_loss[loss=0.1489, simple_loss=0.2197, pruned_loss=0.03899, over 972381.92 frames.], batch size: 25, lr: 3.32e-04 2022-05-05 14:07:41,266 INFO [train.py:715] (0/8) Epoch 6, batch 18150, loss[loss=0.1427, simple_loss=0.2063, pruned_loss=0.0396, over 4960.00 frames.], tot_loss[loss=0.1498, simple_loss=0.2207, pruned_loss=0.03948, over 972807.16 frames.], batch size: 24, lr: 3.32e-04 2022-05-05 14:08:19,360 INFO [train.py:715] (0/8) Epoch 6, batch 18200, loss[loss=0.1438, simple_loss=0.2217, pruned_loss=0.033, over 4987.00 frames.], tot_loss[loss=0.149, simple_loss=0.2198, pruned_loss=0.0391, over 972517.56 frames.], batch size: 25, lr: 3.32e-04 2022-05-05 14:08:58,865 INFO [train.py:715] (0/8) Epoch 6, batch 18250, loss[loss=0.1577, simple_loss=0.2229, pruned_loss=0.04624, over 4712.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2194, pruned_loss=0.03895, over 971175.97 frames.], batch size: 12, lr: 3.31e-04 2022-05-05 14:09:38,212 INFO [train.py:715] (0/8) Epoch 6, batch 18300, loss[loss=0.1718, simple_loss=0.2415, pruned_loss=0.05101, over 4944.00 frames.], tot_loss[loss=0.1492, simple_loss=0.2207, pruned_loss=0.03883, over 971496.12 frames.], batch size: 35, lr: 3.31e-04 2022-05-05 14:10:17,259 INFO [train.py:715] (0/8) Epoch 6, batch 18350, loss[loss=0.1732, simple_loss=0.2416, pruned_loss=0.05238, over 4991.00 frames.], tot_loss[loss=0.15, simple_loss=0.2211, pruned_loss=0.0394, over 970904.24 frames.], batch size: 14, lr: 3.31e-04 2022-05-05 14:10:55,589 INFO [train.py:715] (0/8) Epoch 6, batch 18400, loss[loss=0.1405, simple_loss=0.224, pruned_loss=0.02853, over 4978.00 frames.], tot_loss[loss=0.1488, simple_loss=0.2201, pruned_loss=0.03877, over 971766.27 frames.], batch size: 25, lr: 3.31e-04 2022-05-05 14:11:34,888 INFO [train.py:715] (0/8) Epoch 6, batch 18450, loss[loss=0.1604, simple_loss=0.2267, pruned_loss=0.04707, over 4865.00 frames.], tot_loss[loss=0.1491, simple_loss=0.2202, pruned_loss=0.03899, over 971845.72 frames.], batch size: 34, lr: 3.31e-04 2022-05-05 14:12:14,314 INFO [train.py:715] (0/8) Epoch 6, batch 18500, loss[loss=0.152, simple_loss=0.2219, pruned_loss=0.04102, over 4916.00 frames.], tot_loss[loss=0.1486, simple_loss=0.2201, pruned_loss=0.0386, over 971647.47 frames.], batch size: 18, lr: 3.31e-04 2022-05-05 14:12:52,314 INFO [train.py:715] (0/8) Epoch 6, batch 18550, loss[loss=0.2065, simple_loss=0.2607, pruned_loss=0.07615, over 4748.00 frames.], tot_loss[loss=0.1504, simple_loss=0.2211, pruned_loss=0.03982, over 972463.99 frames.], batch size: 16, lr: 3.31e-04 2022-05-05 14:13:31,748 INFO [train.py:715] (0/8) Epoch 6, batch 18600, loss[loss=0.1385, simple_loss=0.2024, pruned_loss=0.03724, over 4962.00 frames.], tot_loss[loss=0.1498, simple_loss=0.2211, pruned_loss=0.03931, over 972055.71 frames.], batch size: 21, lr: 3.31e-04 2022-05-05 14:14:10,845 INFO [train.py:715] (0/8) Epoch 6, batch 18650, loss[loss=0.1513, simple_loss=0.218, pruned_loss=0.04232, over 4914.00 frames.], tot_loss[loss=0.1494, simple_loss=0.2205, pruned_loss=0.03909, over 971747.07 frames.], batch size: 19, lr: 3.31e-04 2022-05-05 14:14:50,388 INFO [train.py:715] 
(0/8) Epoch 6, batch 18700, loss[loss=0.155, simple_loss=0.2276, pruned_loss=0.04116, over 4744.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2211, pruned_loss=0.03917, over 971838.11 frames.], batch size: 16, lr: 3.31e-04 2022-05-05 14:15:28,527 INFO [train.py:715] (0/8) Epoch 6, batch 18750, loss[loss=0.1681, simple_loss=0.2373, pruned_loss=0.04944, over 4933.00 frames.], tot_loss[loss=0.1503, simple_loss=0.2215, pruned_loss=0.03955, over 972157.65 frames.], batch size: 29, lr: 3.31e-04 2022-05-05 14:16:07,699 INFO [train.py:715] (0/8) Epoch 6, batch 18800, loss[loss=0.2074, simple_loss=0.2835, pruned_loss=0.06562, over 4840.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2208, pruned_loss=0.03923, over 971805.61 frames.], batch size: 15, lr: 3.31e-04 2022-05-05 14:16:47,203 INFO [train.py:715] (0/8) Epoch 6, batch 18850, loss[loss=0.1792, simple_loss=0.2474, pruned_loss=0.05546, over 4788.00 frames.], tot_loss[loss=0.1503, simple_loss=0.2213, pruned_loss=0.03967, over 971690.03 frames.], batch size: 18, lr: 3.31e-04 2022-05-05 14:17:25,254 INFO [train.py:715] (0/8) Epoch 6, batch 18900, loss[loss=0.113, simple_loss=0.1741, pruned_loss=0.02599, over 4831.00 frames.], tot_loss[loss=0.1494, simple_loss=0.2206, pruned_loss=0.03912, over 971663.30 frames.], batch size: 13, lr: 3.31e-04 2022-05-05 14:18:04,839 INFO [train.py:715] (0/8) Epoch 6, batch 18950, loss[loss=0.1616, simple_loss=0.233, pruned_loss=0.04508, over 4966.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2205, pruned_loss=0.03933, over 971197.30 frames.], batch size: 40, lr: 3.31e-04 2022-05-05 14:18:43,968 INFO [train.py:715] (0/8) Epoch 6, batch 19000, loss[loss=0.121, simple_loss=0.1972, pruned_loss=0.02235, over 4820.00 frames.], tot_loss[loss=0.149, simple_loss=0.2203, pruned_loss=0.03889, over 970556.88 frames.], batch size: 27, lr: 3.31e-04 2022-05-05 14:19:23,154 INFO [train.py:715] (0/8) Epoch 6, batch 19050, loss[loss=0.155, simple_loss=0.2149, pruned_loss=0.04751, over 4823.00 frames.], tot_loss[loss=0.1481, simple_loss=0.2196, pruned_loss=0.0383, over 970972.87 frames.], batch size: 15, lr: 3.31e-04 2022-05-05 14:20:01,542 INFO [train.py:715] (0/8) Epoch 6, batch 19100, loss[loss=0.1396, simple_loss=0.202, pruned_loss=0.0386, over 4858.00 frames.], tot_loss[loss=0.1489, simple_loss=0.2203, pruned_loss=0.03879, over 971169.92 frames.], batch size: 32, lr: 3.31e-04 2022-05-05 14:20:40,516 INFO [train.py:715] (0/8) Epoch 6, batch 19150, loss[loss=0.1612, simple_loss=0.2344, pruned_loss=0.04404, over 4961.00 frames.], tot_loss[loss=0.1486, simple_loss=0.2201, pruned_loss=0.03858, over 971615.95 frames.], batch size: 29, lr: 3.31e-04 2022-05-05 14:21:20,169 INFO [train.py:715] (0/8) Epoch 6, batch 19200, loss[loss=0.1538, simple_loss=0.2271, pruned_loss=0.04028, over 4816.00 frames.], tot_loss[loss=0.1475, simple_loss=0.219, pruned_loss=0.03805, over 971199.86 frames.], batch size: 27, lr: 3.31e-04 2022-05-05 14:21:58,236 INFO [train.py:715] (0/8) Epoch 6, batch 19250, loss[loss=0.1401, simple_loss=0.2117, pruned_loss=0.03428, over 4784.00 frames.], tot_loss[loss=0.1472, simple_loss=0.2185, pruned_loss=0.03797, over 971717.79 frames.], batch size: 14, lr: 3.31e-04 2022-05-05 14:22:37,139 INFO [train.py:715] (0/8) Epoch 6, batch 19300, loss[loss=0.1517, simple_loss=0.2222, pruned_loss=0.0406, over 4920.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2184, pruned_loss=0.03751, over 971621.09 frames.], batch size: 17, lr: 3.31e-04 2022-05-05 14:23:16,400 INFO [train.py:715] (0/8) Epoch 6, batch 19350, 
loss[loss=0.1416, simple_loss=0.2121, pruned_loss=0.03551, over 4975.00 frames.], tot_loss[loss=0.1478, simple_loss=0.2195, pruned_loss=0.03799, over 971117.09 frames.], batch size: 14, lr: 3.31e-04 2022-05-05 14:23:54,990 INFO [train.py:715] (0/8) Epoch 6, batch 19400, loss[loss=0.1522, simple_loss=0.2236, pruned_loss=0.04035, over 4945.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2186, pruned_loss=0.03743, over 970644.99 frames.], batch size: 21, lr: 3.31e-04 2022-05-05 14:24:33,667 INFO [train.py:715] (0/8) Epoch 6, batch 19450, loss[loss=0.136, simple_loss=0.2026, pruned_loss=0.03473, over 4691.00 frames.], tot_loss[loss=0.148, simple_loss=0.2192, pruned_loss=0.03836, over 971104.18 frames.], batch size: 15, lr: 3.31e-04 2022-05-05 14:25:13,065 INFO [train.py:715] (0/8) Epoch 6, batch 19500, loss[loss=0.1416, simple_loss=0.2168, pruned_loss=0.03314, over 4960.00 frames.], tot_loss[loss=0.148, simple_loss=0.2195, pruned_loss=0.03826, over 971619.47 frames.], batch size: 24, lr: 3.31e-04 2022-05-05 14:25:51,971 INFO [train.py:715] (0/8) Epoch 6, batch 19550, loss[loss=0.1194, simple_loss=0.197, pruned_loss=0.02092, over 4829.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2182, pruned_loss=0.03778, over 971633.15 frames.], batch size: 13, lr: 3.31e-04 2022-05-05 14:26:30,331 INFO [train.py:715] (0/8) Epoch 6, batch 19600, loss[loss=0.1379, simple_loss=0.2195, pruned_loss=0.02812, over 4794.00 frames.], tot_loss[loss=0.1484, simple_loss=0.2196, pruned_loss=0.03858, over 971977.28 frames.], batch size: 24, lr: 3.31e-04 2022-05-05 14:27:09,228 INFO [train.py:715] (0/8) Epoch 6, batch 19650, loss[loss=0.1208, simple_loss=0.1851, pruned_loss=0.02824, over 4969.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2183, pruned_loss=0.03828, over 971686.93 frames.], batch size: 14, lr: 3.30e-04 2022-05-05 14:27:48,351 INFO [train.py:715] (0/8) Epoch 6, batch 19700, loss[loss=0.1181, simple_loss=0.1934, pruned_loss=0.02143, over 4965.00 frames.], tot_loss[loss=0.1476, simple_loss=0.2188, pruned_loss=0.03823, over 971369.80 frames.], batch size: 24, lr: 3.30e-04 2022-05-05 14:28:27,131 INFO [train.py:715] (0/8) Epoch 6, batch 19750, loss[loss=0.1601, simple_loss=0.2319, pruned_loss=0.04416, over 4904.00 frames.], tot_loss[loss=0.1481, simple_loss=0.2193, pruned_loss=0.03848, over 970869.11 frames.], batch size: 17, lr: 3.30e-04 2022-05-05 14:29:05,243 INFO [train.py:715] (0/8) Epoch 6, batch 19800, loss[loss=0.1476, simple_loss=0.2203, pruned_loss=0.03743, over 4758.00 frames.], tot_loss[loss=0.1486, simple_loss=0.2196, pruned_loss=0.03876, over 971566.69 frames.], batch size: 19, lr: 3.30e-04 2022-05-05 14:29:44,597 INFO [train.py:715] (0/8) Epoch 6, batch 19850, loss[loss=0.1305, simple_loss=0.2112, pruned_loss=0.02493, over 4992.00 frames.], tot_loss[loss=0.1495, simple_loss=0.2209, pruned_loss=0.03906, over 971179.79 frames.], batch size: 14, lr: 3.30e-04 2022-05-05 14:30:24,340 INFO [train.py:715] (0/8) Epoch 6, batch 19900, loss[loss=0.1583, simple_loss=0.2386, pruned_loss=0.03905, over 4984.00 frames.], tot_loss[loss=0.1491, simple_loss=0.2207, pruned_loss=0.03871, over 971033.34 frames.], batch size: 39, lr: 3.30e-04 2022-05-05 14:31:02,422 INFO [train.py:715] (0/8) Epoch 6, batch 19950, loss[loss=0.1434, simple_loss=0.2084, pruned_loss=0.03921, over 4946.00 frames.], tot_loss[loss=0.1489, simple_loss=0.2209, pruned_loss=0.03848, over 971731.28 frames.], batch size: 39, lr: 3.30e-04 2022-05-05 14:31:41,546 INFO [train.py:715] (0/8) Epoch 6, batch 20000, loss[loss=0.1276, 
simple_loss=0.2019, pruned_loss=0.02662, over 4819.00 frames.], tot_loss[loss=0.1488, simple_loss=0.2209, pruned_loss=0.03836, over 972188.85 frames.], batch size: 15, lr: 3.30e-04 2022-05-05 14:32:21,020 INFO [train.py:715] (0/8) Epoch 6, batch 20050, loss[loss=0.1587, simple_loss=0.2294, pruned_loss=0.04402, over 4956.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2207, pruned_loss=0.03836, over 972799.59 frames.], batch size: 35, lr: 3.30e-04 2022-05-05 14:32:59,450 INFO [train.py:715] (0/8) Epoch 6, batch 20100, loss[loss=0.1799, simple_loss=0.2636, pruned_loss=0.04805, over 4958.00 frames.], tot_loss[loss=0.1489, simple_loss=0.2207, pruned_loss=0.03854, over 973043.90 frames.], batch size: 24, lr: 3.30e-04 2022-05-05 14:33:38,523 INFO [train.py:715] (0/8) Epoch 6, batch 20150, loss[loss=0.1341, simple_loss=0.2069, pruned_loss=0.03067, over 4737.00 frames.], tot_loss[loss=0.1491, simple_loss=0.2206, pruned_loss=0.03879, over 972996.23 frames.], batch size: 16, lr: 3.30e-04 2022-05-05 14:34:17,806 INFO [train.py:715] (0/8) Epoch 6, batch 20200, loss[loss=0.1577, simple_loss=0.2221, pruned_loss=0.04667, over 4946.00 frames.], tot_loss[loss=0.1488, simple_loss=0.2205, pruned_loss=0.03857, over 971655.44 frames.], batch size: 35, lr: 3.30e-04 2022-05-05 14:34:56,733 INFO [train.py:715] (0/8) Epoch 6, batch 20250, loss[loss=0.149, simple_loss=0.222, pruned_loss=0.03797, over 4808.00 frames.], tot_loss[loss=0.1483, simple_loss=0.2197, pruned_loss=0.0384, over 972362.72 frames.], batch size: 21, lr: 3.30e-04 2022-05-05 14:35:35,495 INFO [train.py:715] (0/8) Epoch 6, batch 20300, loss[loss=0.1205, simple_loss=0.1922, pruned_loss=0.02437, over 4984.00 frames.], tot_loss[loss=0.1492, simple_loss=0.2203, pruned_loss=0.03911, over 972060.29 frames.], batch size: 14, lr: 3.30e-04 2022-05-05 14:36:14,858 INFO [train.py:715] (0/8) Epoch 6, batch 20350, loss[loss=0.1459, simple_loss=0.2109, pruned_loss=0.0405, over 4780.00 frames.], tot_loss[loss=0.1483, simple_loss=0.2192, pruned_loss=0.03867, over 972397.42 frames.], batch size: 18, lr: 3.30e-04 2022-05-05 14:36:54,305 INFO [train.py:715] (0/8) Epoch 6, batch 20400, loss[loss=0.1451, simple_loss=0.2181, pruned_loss=0.03607, over 4778.00 frames.], tot_loss[loss=0.1485, simple_loss=0.2196, pruned_loss=0.03872, over 972977.77 frames.], batch size: 14, lr: 3.30e-04 2022-05-05 14:37:32,662 INFO [train.py:715] (0/8) Epoch 6, batch 20450, loss[loss=0.1719, simple_loss=0.2393, pruned_loss=0.05227, over 4935.00 frames.], tot_loss[loss=0.1482, simple_loss=0.2192, pruned_loss=0.03858, over 972573.71 frames.], batch size: 29, lr: 3.30e-04 2022-05-05 14:38:11,466 INFO [train.py:715] (0/8) Epoch 6, batch 20500, loss[loss=0.1265, simple_loss=0.1966, pruned_loss=0.02819, over 4847.00 frames.], tot_loss[loss=0.1492, simple_loss=0.2208, pruned_loss=0.03882, over 972935.77 frames.], batch size: 26, lr: 3.30e-04 2022-05-05 14:38:50,530 INFO [train.py:715] (0/8) Epoch 6, batch 20550, loss[loss=0.1585, simple_loss=0.2342, pruned_loss=0.0414, over 4689.00 frames.], tot_loss[loss=0.1485, simple_loss=0.2203, pruned_loss=0.03831, over 971943.29 frames.], batch size: 15, lr: 3.30e-04 2022-05-05 14:39:29,692 INFO [train.py:715] (0/8) Epoch 6, batch 20600, loss[loss=0.1533, simple_loss=0.2293, pruned_loss=0.03861, over 4923.00 frames.], tot_loss[loss=0.1491, simple_loss=0.2207, pruned_loss=0.0387, over 972466.23 frames.], batch size: 23, lr: 3.30e-04 2022-05-05 14:40:07,957 INFO [train.py:715] (0/8) Epoch 6, batch 20650, loss[loss=0.1741, simple_loss=0.2459, 
pruned_loss=0.05117, over 4924.00 frames.], tot_loss[loss=0.1485, simple_loss=0.2203, pruned_loss=0.03839, over 972974.87 frames.], batch size: 39, lr: 3.30e-04 2022-05-05 14:40:46,650 INFO [train.py:715] (0/8) Epoch 6, batch 20700, loss[loss=0.132, simple_loss=0.2117, pruned_loss=0.02616, over 4988.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2205, pruned_loss=0.03849, over 973612.13 frames.], batch size: 28, lr: 3.30e-04 2022-05-05 14:41:26,012 INFO [train.py:715] (0/8) Epoch 6, batch 20750, loss[loss=0.1332, simple_loss=0.2107, pruned_loss=0.02783, over 4953.00 frames.], tot_loss[loss=0.1484, simple_loss=0.2203, pruned_loss=0.03828, over 973500.47 frames.], batch size: 35, lr: 3.30e-04 2022-05-05 14:42:04,390 INFO [train.py:715] (0/8) Epoch 6, batch 20800, loss[loss=0.1412, simple_loss=0.2133, pruned_loss=0.03449, over 4977.00 frames.], tot_loss[loss=0.1479, simple_loss=0.2199, pruned_loss=0.03792, over 973074.28 frames.], batch size: 25, lr: 3.30e-04 2022-05-05 14:42:43,605 INFO [train.py:715] (0/8) Epoch 6, batch 20850, loss[loss=0.1435, simple_loss=0.2063, pruned_loss=0.04038, over 4799.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2188, pruned_loss=0.03804, over 972658.92 frames.], batch size: 24, lr: 3.30e-04 2022-05-05 14:43:22,879 INFO [train.py:715] (0/8) Epoch 6, batch 20900, loss[loss=0.143, simple_loss=0.2077, pruned_loss=0.03915, over 4793.00 frames.], tot_loss[loss=0.1482, simple_loss=0.2192, pruned_loss=0.0386, over 972238.20 frames.], batch size: 18, lr: 3.30e-04 2022-05-05 14:44:02,107 INFO [train.py:715] (0/8) Epoch 6, batch 20950, loss[loss=0.1205, simple_loss=0.1878, pruned_loss=0.02664, over 4849.00 frames.], tot_loss[loss=0.1493, simple_loss=0.2202, pruned_loss=0.0392, over 973223.17 frames.], batch size: 13, lr: 3.30e-04 2022-05-05 14:44:40,091 INFO [train.py:715] (0/8) Epoch 6, batch 21000, loss[loss=0.1816, simple_loss=0.2389, pruned_loss=0.06216, over 4956.00 frames.], tot_loss[loss=0.149, simple_loss=0.2199, pruned_loss=0.03904, over 972894.25 frames.], batch size: 35, lr: 3.29e-04 2022-05-05 14:44:40,093 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 14:44:51,877 INFO [train.py:742] (0/8) Epoch 6, validation: loss=0.1089, simple_loss=0.1939, pruned_loss=0.01192, over 914524.00 frames. 
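The periodic validation reports (0.1087 at batch 18000 and 0.1089 at batch 21000, both reported over the same 914524.00 frames) are directly comparable to each other, while tot_loss is a running training average; overlaying the two gives a quick picture of whether training and dev loss are still moving together. A sketch using tuples like those produced by the parser above; matplotlib is an added dependency and both input lists are assumptions, not anything the log itself provides.

```python
import matplotlib.pyplot as plt

# Sketch: overlay the running training average (tot_loss) with the periodic
# validation losses. train_points is [(batch, tot_loss), ...]; val_points is
# [(batch, val_loss), ...] taken from the "validation: loss=..." reports.
def plot_losses(train_points, val_points, out="loss.png"):
    tb, tl = zip(*train_points)
    vb, vl = zip(*val_points)
    plt.plot(tb, tl, label="train tot_loss")
    plt.plot(vb, vl, "o", label="validation loss")
    plt.xlabel("batch index within epoch")
    plt.ylabel("loss")
    plt.legend()
    plt.savefig(out)
```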
2022-05-05 14:45:30,114 INFO [train.py:715] (0/8) Epoch 6, batch 21050, loss[loss=0.1627, simple_loss=0.2371, pruned_loss=0.04417, over 4914.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2202, pruned_loss=0.03866, over 973140.24 frames.], batch size: 17, lr: 3.29e-04 2022-05-05 14:46:09,483 INFO [train.py:715] (0/8) Epoch 6, batch 21100, loss[loss=0.1424, simple_loss=0.205, pruned_loss=0.03991, over 4771.00 frames.], tot_loss[loss=0.1488, simple_loss=0.2205, pruned_loss=0.0386, over 972513.85 frames.], batch size: 12, lr: 3.29e-04 2022-05-05 14:46:48,884 INFO [train.py:715] (0/8) Epoch 6, batch 21150, loss[loss=0.1663, simple_loss=0.2299, pruned_loss=0.05133, over 4912.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2207, pruned_loss=0.03921, over 973134.76 frames.], batch size: 17, lr: 3.29e-04 2022-05-05 14:47:27,341 INFO [train.py:715] (0/8) Epoch 6, batch 21200, loss[loss=0.1441, simple_loss=0.2131, pruned_loss=0.03752, over 4893.00 frames.], tot_loss[loss=0.15, simple_loss=0.2211, pruned_loss=0.03943, over 973739.62 frames.], batch size: 16, lr: 3.29e-04 2022-05-05 14:48:06,350 INFO [train.py:715] (0/8) Epoch 6, batch 21250, loss[loss=0.1574, simple_loss=0.2285, pruned_loss=0.0431, over 4815.00 frames.], tot_loss[loss=0.1491, simple_loss=0.2206, pruned_loss=0.03882, over 973786.97 frames.], batch size: 26, lr: 3.29e-04 2022-05-05 14:48:45,975 INFO [train.py:715] (0/8) Epoch 6, batch 21300, loss[loss=0.2013, simple_loss=0.2533, pruned_loss=0.07466, over 4842.00 frames.], tot_loss[loss=0.1482, simple_loss=0.2198, pruned_loss=0.03833, over 973685.87 frames.], batch size: 13, lr: 3.29e-04 2022-05-05 14:49:24,954 INFO [train.py:715] (0/8) Epoch 6, batch 21350, loss[loss=0.1485, simple_loss=0.2282, pruned_loss=0.0344, over 4766.00 frames.], tot_loss[loss=0.1476, simple_loss=0.2191, pruned_loss=0.03804, over 974005.53 frames.], batch size: 19, lr: 3.29e-04 2022-05-05 14:50:03,786 INFO [train.py:715] (0/8) Epoch 6, batch 21400, loss[loss=0.1843, simple_loss=0.2472, pruned_loss=0.06072, over 4849.00 frames.], tot_loss[loss=0.1473, simple_loss=0.2189, pruned_loss=0.03782, over 972341.84 frames.], batch size: 32, lr: 3.29e-04 2022-05-05 14:50:42,548 INFO [train.py:715] (0/8) Epoch 6, batch 21450, loss[loss=0.1394, simple_loss=0.204, pruned_loss=0.03741, over 4908.00 frames.], tot_loss[loss=0.1477, simple_loss=0.2191, pruned_loss=0.03816, over 972262.82 frames.], batch size: 23, lr: 3.29e-04 2022-05-05 14:51:21,819 INFO [train.py:715] (0/8) Epoch 6, batch 21500, loss[loss=0.1182, simple_loss=0.1843, pruned_loss=0.02605, over 4877.00 frames.], tot_loss[loss=0.1485, simple_loss=0.2197, pruned_loss=0.03867, over 972617.74 frames.], batch size: 13, lr: 3.29e-04 2022-05-05 14:52:00,290 INFO [train.py:715] (0/8) Epoch 6, batch 21550, loss[loss=0.1434, simple_loss=0.2229, pruned_loss=0.0319, over 4812.00 frames.], tot_loss[loss=0.1475, simple_loss=0.219, pruned_loss=0.03801, over 972778.93 frames.], batch size: 25, lr: 3.29e-04 2022-05-05 14:52:39,311 INFO [train.py:715] (0/8) Epoch 6, batch 21600, loss[loss=0.1345, simple_loss=0.2064, pruned_loss=0.03131, over 4907.00 frames.], tot_loss[loss=0.147, simple_loss=0.2185, pruned_loss=0.03772, over 972754.17 frames.], batch size: 22, lr: 3.29e-04 2022-05-05 14:53:18,463 INFO [train.py:715] (0/8) Epoch 6, batch 21650, loss[loss=0.1672, simple_loss=0.2212, pruned_loss=0.05662, over 4869.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2183, pruned_loss=0.0375, over 972747.33 frames.], batch size: 32, lr: 3.29e-04 2022-05-05 14:53:57,742 INFO 
[train.py:715] (0/8) Epoch 6, batch 21700, loss[loss=0.1254, simple_loss=0.1863, pruned_loss=0.03222, over 4812.00 frames.], tot_loss[loss=0.1466, simple_loss=0.2182, pruned_loss=0.0375, over 972869.30 frames.], batch size: 13, lr: 3.29e-04 2022-05-05 14:54:36,452 INFO [train.py:715] (0/8) Epoch 6, batch 21750, loss[loss=0.1304, simple_loss=0.2128, pruned_loss=0.02399, over 4960.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2177, pruned_loss=0.03707, over 972854.83 frames.], batch size: 24, lr: 3.29e-04 2022-05-05 14:55:15,311 INFO [train.py:715] (0/8) Epoch 6, batch 21800, loss[loss=0.1521, simple_loss=0.2207, pruned_loss=0.0418, over 4647.00 frames.], tot_loss[loss=0.145, simple_loss=0.2171, pruned_loss=0.03646, over 971286.50 frames.], batch size: 13, lr: 3.29e-04 2022-05-05 14:55:54,104 INFO [train.py:715] (0/8) Epoch 6, batch 21850, loss[loss=0.1238, simple_loss=0.2018, pruned_loss=0.02289, over 4885.00 frames.], tot_loss[loss=0.145, simple_loss=0.2172, pruned_loss=0.03642, over 971749.69 frames.], batch size: 32, lr: 3.29e-04 2022-05-05 14:56:32,644 INFO [train.py:715] (0/8) Epoch 6, batch 21900, loss[loss=0.1733, simple_loss=0.2432, pruned_loss=0.05173, over 4834.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2177, pruned_loss=0.03696, over 971378.12 frames.], batch size: 15, lr: 3.29e-04 2022-05-05 14:57:11,517 INFO [train.py:715] (0/8) Epoch 6, batch 21950, loss[loss=0.1407, simple_loss=0.2083, pruned_loss=0.03655, over 4850.00 frames.], tot_loss[loss=0.146, simple_loss=0.218, pruned_loss=0.03695, over 971599.09 frames.], batch size: 20, lr: 3.29e-04 2022-05-05 14:57:50,232 INFO [train.py:715] (0/8) Epoch 6, batch 22000, loss[loss=0.153, simple_loss=0.2301, pruned_loss=0.03799, over 4795.00 frames.], tot_loss[loss=0.147, simple_loss=0.2189, pruned_loss=0.03756, over 972441.18 frames.], batch size: 24, lr: 3.29e-04 2022-05-05 14:58:29,942 INFO [train.py:715] (0/8) Epoch 6, batch 22050, loss[loss=0.1375, simple_loss=0.2103, pruned_loss=0.03241, over 4769.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2186, pruned_loss=0.03724, over 971955.81 frames.], batch size: 18, lr: 3.29e-04 2022-05-05 14:59:08,259 INFO [train.py:715] (0/8) Epoch 6, batch 22100, loss[loss=0.1307, simple_loss=0.2009, pruned_loss=0.03021, over 4738.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2185, pruned_loss=0.03704, over 972028.51 frames.], batch size: 16, lr: 3.29e-04 2022-05-05 14:59:47,061 INFO [train.py:715] (0/8) Epoch 6, batch 22150, loss[loss=0.1902, simple_loss=0.2425, pruned_loss=0.06894, over 4764.00 frames.], tot_loss[loss=0.1474, simple_loss=0.2193, pruned_loss=0.03775, over 972702.70 frames.], batch size: 12, lr: 3.29e-04 2022-05-05 15:00:26,253 INFO [train.py:715] (0/8) Epoch 6, batch 22200, loss[loss=0.1677, simple_loss=0.2438, pruned_loss=0.0458, over 4779.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2194, pruned_loss=0.03783, over 972940.25 frames.], batch size: 18, lr: 3.29e-04 2022-05-05 15:01:04,914 INFO [train.py:715] (0/8) Epoch 6, batch 22250, loss[loss=0.1298, simple_loss=0.1999, pruned_loss=0.02985, over 4962.00 frames.], tot_loss[loss=0.1483, simple_loss=0.2203, pruned_loss=0.03816, over 973075.34 frames.], batch size: 35, lr: 3.29e-04 2022-05-05 15:01:43,596 INFO [train.py:715] (0/8) Epoch 6, batch 22300, loss[loss=0.1015, simple_loss=0.1702, pruned_loss=0.01637, over 4917.00 frames.], tot_loss[loss=0.147, simple_loss=0.2189, pruned_loss=0.03758, over 972258.94 frames.], batch size: 17, lr: 3.29e-04 2022-05-05 15:02:22,654 INFO [train.py:715] (0/8) Epoch 6, batch 
22350, loss[loss=0.1543, simple_loss=0.2141, pruned_loss=0.04727, over 4838.00 frames.], tot_loss[loss=0.1478, simple_loss=0.2191, pruned_loss=0.03823, over 972179.15 frames.], batch size: 30, lr: 3.29e-04 2022-05-05 15:03:02,002 INFO [train.py:715] (0/8) Epoch 6, batch 22400, loss[loss=0.18, simple_loss=0.2436, pruned_loss=0.05817, over 4903.00 frames.], tot_loss[loss=0.1484, simple_loss=0.2199, pruned_loss=0.03842, over 971876.80 frames.], batch size: 17, lr: 3.29e-04 2022-05-05 15:03:40,488 INFO [train.py:715] (0/8) Epoch 6, batch 22450, loss[loss=0.1187, simple_loss=0.1886, pruned_loss=0.02441, over 4812.00 frames.], tot_loss[loss=0.1479, simple_loss=0.2192, pruned_loss=0.03832, over 971881.86 frames.], batch size: 25, lr: 3.28e-04 2022-05-05 15:04:19,438 INFO [train.py:715] (0/8) Epoch 6, batch 22500, loss[loss=0.1625, simple_loss=0.2399, pruned_loss=0.04254, over 4805.00 frames.], tot_loss[loss=0.1472, simple_loss=0.2189, pruned_loss=0.03776, over 971535.96 frames.], batch size: 25, lr: 3.28e-04 2022-05-05 15:04:58,756 INFO [train.py:715] (0/8) Epoch 6, batch 22550, loss[loss=0.1398, simple_loss=0.2125, pruned_loss=0.0336, over 4799.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2176, pruned_loss=0.03694, over 971557.91 frames.], batch size: 17, lr: 3.28e-04 2022-05-05 15:05:37,176 INFO [train.py:715] (0/8) Epoch 6, batch 22600, loss[loss=0.1524, simple_loss=0.2354, pruned_loss=0.03474, over 4697.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2177, pruned_loss=0.03722, over 972692.85 frames.], batch size: 15, lr: 3.28e-04 2022-05-05 15:06:16,004 INFO [train.py:715] (0/8) Epoch 6, batch 22650, loss[loss=0.1698, simple_loss=0.2389, pruned_loss=0.05031, over 4859.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2185, pruned_loss=0.03744, over 972169.44 frames.], batch size: 20, lr: 3.28e-04 2022-05-05 15:06:54,602 INFO [train.py:715] (0/8) Epoch 6, batch 22700, loss[loss=0.1515, simple_loss=0.221, pruned_loss=0.04097, over 4927.00 frames.], tot_loss[loss=0.1474, simple_loss=0.2186, pruned_loss=0.03806, over 972262.05 frames.], batch size: 17, lr: 3.28e-04 2022-05-05 15:07:33,402 INFO [train.py:715] (0/8) Epoch 6, batch 22750, loss[loss=0.1579, simple_loss=0.227, pruned_loss=0.04439, over 4903.00 frames.], tot_loss[loss=0.1486, simple_loss=0.22, pruned_loss=0.03861, over 973113.50 frames.], batch size: 19, lr: 3.28e-04 2022-05-05 15:08:11,864 INFO [train.py:715] (0/8) Epoch 6, batch 22800, loss[loss=0.1722, simple_loss=0.2435, pruned_loss=0.05041, over 4907.00 frames.], tot_loss[loss=0.1493, simple_loss=0.2207, pruned_loss=0.03891, over 972782.04 frames.], batch size: 39, lr: 3.28e-04 2022-05-05 15:08:50,373 INFO [train.py:715] (0/8) Epoch 6, batch 22850, loss[loss=0.1485, simple_loss=0.2191, pruned_loss=0.0389, over 4891.00 frames.], tot_loss[loss=0.1496, simple_loss=0.2206, pruned_loss=0.03926, over 972606.33 frames.], batch size: 22, lr: 3.28e-04 2022-05-05 15:09:29,025 INFO [train.py:715] (0/8) Epoch 6, batch 22900, loss[loss=0.1358, simple_loss=0.2091, pruned_loss=0.03125, over 4797.00 frames.], tot_loss[loss=0.1499, simple_loss=0.2208, pruned_loss=0.03945, over 972908.95 frames.], batch size: 21, lr: 3.28e-04 2022-05-05 15:10:08,154 INFO [train.py:715] (0/8) Epoch 6, batch 22950, loss[loss=0.1386, simple_loss=0.2108, pruned_loss=0.03325, over 4918.00 frames.], tot_loss[loss=0.1499, simple_loss=0.2208, pruned_loss=0.03947, over 973312.97 frames.], batch size: 23, lr: 3.28e-04 2022-05-05 15:10:46,569 INFO [train.py:715] (0/8) Epoch 6, batch 23000, loss[loss=0.1264, 
simple_loss=0.1895, pruned_loss=0.03158, over 4961.00 frames.], tot_loss[loss=0.1501, simple_loss=0.2208, pruned_loss=0.03972, over 973624.26 frames.], batch size: 14, lr: 3.28e-04 2022-05-05 15:11:25,824 INFO [train.py:715] (0/8) Epoch 6, batch 23050, loss[loss=0.1548, simple_loss=0.2211, pruned_loss=0.04422, over 4792.00 frames.], tot_loss[loss=0.1494, simple_loss=0.2201, pruned_loss=0.03932, over 973030.00 frames.], batch size: 14, lr: 3.28e-04 2022-05-05 15:12:05,298 INFO [train.py:715] (0/8) Epoch 6, batch 23100, loss[loss=0.1305, simple_loss=0.2046, pruned_loss=0.02819, over 4883.00 frames.], tot_loss[loss=0.1476, simple_loss=0.2188, pruned_loss=0.03817, over 971729.20 frames.], batch size: 22, lr: 3.28e-04 2022-05-05 15:12:37,691 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-232000.pt 2022-05-05 15:12:46,116 INFO [train.py:715] (0/8) Epoch 6, batch 23150, loss[loss=0.1897, simple_loss=0.2562, pruned_loss=0.06164, over 4937.00 frames.], tot_loss[loss=0.1485, simple_loss=0.2197, pruned_loss=0.0386, over 972246.41 frames.], batch size: 29, lr: 3.28e-04 2022-05-05 15:13:25,464 INFO [train.py:715] (0/8) Epoch 6, batch 23200, loss[loss=0.1728, simple_loss=0.2406, pruned_loss=0.05245, over 4941.00 frames.], tot_loss[loss=0.1484, simple_loss=0.2194, pruned_loss=0.03866, over 972702.56 frames.], batch size: 24, lr: 3.28e-04 2022-05-05 15:14:04,865 INFO [train.py:715] (0/8) Epoch 6, batch 23250, loss[loss=0.1503, simple_loss=0.2218, pruned_loss=0.03938, over 4812.00 frames.], tot_loss[loss=0.1494, simple_loss=0.2203, pruned_loss=0.03921, over 972914.39 frames.], batch size: 13, lr: 3.28e-04 2022-05-05 15:14:43,521 INFO [train.py:715] (0/8) Epoch 6, batch 23300, loss[loss=0.15, simple_loss=0.2206, pruned_loss=0.03967, over 4793.00 frames.], tot_loss[loss=0.1493, simple_loss=0.2202, pruned_loss=0.03923, over 972592.64 frames.], batch size: 14, lr: 3.28e-04 2022-05-05 15:15:21,519 INFO [train.py:715] (0/8) Epoch 6, batch 23350, loss[loss=0.1337, simple_loss=0.2059, pruned_loss=0.03072, over 4958.00 frames.], tot_loss[loss=0.1491, simple_loss=0.22, pruned_loss=0.03908, over 972569.32 frames.], batch size: 15, lr: 3.28e-04 2022-05-05 15:16:00,565 INFO [train.py:715] (0/8) Epoch 6, batch 23400, loss[loss=0.1679, simple_loss=0.2373, pruned_loss=0.04924, over 4889.00 frames.], tot_loss[loss=0.1483, simple_loss=0.2191, pruned_loss=0.03873, over 972810.28 frames.], batch size: 39, lr: 3.28e-04 2022-05-05 15:16:40,144 INFO [train.py:715] (0/8) Epoch 6, batch 23450, loss[loss=0.1337, simple_loss=0.2117, pruned_loss=0.02787, over 4738.00 frames.], tot_loss[loss=0.1491, simple_loss=0.22, pruned_loss=0.03913, over 973108.54 frames.], batch size: 19, lr: 3.28e-04 2022-05-05 15:17:19,118 INFO [train.py:715] (0/8) Epoch 6, batch 23500, loss[loss=0.1258, simple_loss=0.2003, pruned_loss=0.02571, over 4903.00 frames.], tot_loss[loss=0.1491, simple_loss=0.2204, pruned_loss=0.03896, over 973057.29 frames.], batch size: 18, lr: 3.28e-04 2022-05-05 15:17:58,299 INFO [train.py:715] (0/8) Epoch 6, batch 23550, loss[loss=0.1248, simple_loss=0.2045, pruned_loss=0.0226, over 4941.00 frames.], tot_loss[loss=0.149, simple_loss=0.2203, pruned_loss=0.03882, over 972456.42 frames.], batch size: 23, lr: 3.28e-04 2022-05-05 15:18:37,512 INFO [train.py:715] (0/8) Epoch 6, batch 23600, loss[loss=0.1743, simple_loss=0.2399, pruned_loss=0.05432, over 4931.00 frames.], tot_loss[loss=0.1492, simple_loss=0.2206, pruned_loss=0.03892, over 972725.82 frames.], batch size: 39, lr: 
3.28e-04 2022-05-05 15:19:16,253 INFO [train.py:715] (0/8) Epoch 6, batch 23650, loss[loss=0.1403, simple_loss=0.214, pruned_loss=0.03325, over 4639.00 frames.], tot_loss[loss=0.1481, simple_loss=0.2196, pruned_loss=0.0383, over 972197.34 frames.], batch size: 13, lr: 3.28e-04 2022-05-05 15:19:54,387 INFO [train.py:715] (0/8) Epoch 6, batch 23700, loss[loss=0.145, simple_loss=0.2263, pruned_loss=0.03183, over 4837.00 frames.], tot_loss[loss=0.1486, simple_loss=0.2198, pruned_loss=0.0387, over 972063.52 frames.], batch size: 30, lr: 3.28e-04 2022-05-05 15:20:33,412 INFO [train.py:715] (0/8) Epoch 6, batch 23750, loss[loss=0.1308, simple_loss=0.2043, pruned_loss=0.02866, over 4965.00 frames.], tot_loss[loss=0.1494, simple_loss=0.2207, pruned_loss=0.03902, over 971803.85 frames.], batch size: 21, lr: 3.28e-04 2022-05-05 15:21:12,832 INFO [train.py:715] (0/8) Epoch 6, batch 23800, loss[loss=0.1514, simple_loss=0.2218, pruned_loss=0.04046, over 4853.00 frames.], tot_loss[loss=0.1488, simple_loss=0.22, pruned_loss=0.03877, over 971982.63 frames.], batch size: 20, lr: 3.28e-04 2022-05-05 15:21:51,199 INFO [train.py:715] (0/8) Epoch 6, batch 23850, loss[loss=0.1312, simple_loss=0.217, pruned_loss=0.02266, over 4768.00 frames.], tot_loss[loss=0.1483, simple_loss=0.2194, pruned_loss=0.03861, over 971240.18 frames.], batch size: 18, lr: 3.27e-04 2022-05-05 15:22:29,812 INFO [train.py:715] (0/8) Epoch 6, batch 23900, loss[loss=0.1661, simple_loss=0.2275, pruned_loss=0.05235, over 4814.00 frames.], tot_loss[loss=0.1484, simple_loss=0.2199, pruned_loss=0.03849, over 970971.18 frames.], batch size: 21, lr: 3.27e-04 2022-05-05 15:23:08,541 INFO [train.py:715] (0/8) Epoch 6, batch 23950, loss[loss=0.1384, simple_loss=0.2091, pruned_loss=0.03378, over 4906.00 frames.], tot_loss[loss=0.1488, simple_loss=0.2199, pruned_loss=0.0388, over 971528.93 frames.], batch size: 29, lr: 3.27e-04 2022-05-05 15:23:47,214 INFO [train.py:715] (0/8) Epoch 6, batch 24000, loss[loss=0.1557, simple_loss=0.2274, pruned_loss=0.04198, over 4677.00 frames.], tot_loss[loss=0.1486, simple_loss=0.2196, pruned_loss=0.03877, over 971048.20 frames.], batch size: 15, lr: 3.27e-04 2022-05-05 15:23:47,216 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 15:23:58,203 INFO [train.py:742] (0/8) Epoch 6, validation: loss=0.1089, simple_loss=0.1939, pruned_loss=0.01195, over 914524.00 frames. 
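The per-batch loss values swing widely (for example 0.1248 at batch 23550 versus 0.1743 at batch 23600 just above), since each is measured over a single batch of a few thousand frames, while tot_loss moves by only a few thousandths. When reading such logs, a simple exponential moving average over the per-batch losses is one way to get a smoother but more responsive view than the reported running average; a sketch, with alpha a free choice rather than anything train.py uses:

```python
def ema(per_batch_losses, alpha=0.02):
    """Exponential moving average of the per-batch loss values."""
    out, avg = [], None
    for v in per_batch_losses:
        # First value seeds the average; afterwards blend in each new loss.
        avg = v if avg is None else (1 - alpha) * avg + alpha * v
        out.append(avg)
    return out
```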
2022-05-05 15:24:36,965 INFO [train.py:715] (0/8) Epoch 6, batch 24050, loss[loss=0.1577, simple_loss=0.2199, pruned_loss=0.04772, over 4938.00 frames.], tot_loss[loss=0.1481, simple_loss=0.219, pruned_loss=0.03861, over 970211.95 frames.], batch size: 23, lr: 3.27e-04 2022-05-05 15:25:15,029 INFO [train.py:715] (0/8) Epoch 6, batch 24100, loss[loss=0.1622, simple_loss=0.2306, pruned_loss=0.04687, over 4895.00 frames.], tot_loss[loss=0.1471, simple_loss=0.2183, pruned_loss=0.03799, over 970359.68 frames.], batch size: 19, lr: 3.27e-04 2022-05-05 15:25:53,704 INFO [train.py:715] (0/8) Epoch 6, batch 24150, loss[loss=0.1239, simple_loss=0.1936, pruned_loss=0.02708, over 4988.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2176, pruned_loss=0.03762, over 970510.08 frames.], batch size: 31, lr: 3.27e-04 2022-05-05 15:26:32,795 INFO [train.py:715] (0/8) Epoch 6, batch 24200, loss[loss=0.158, simple_loss=0.2188, pruned_loss=0.04863, over 4866.00 frames.], tot_loss[loss=0.1467, simple_loss=0.218, pruned_loss=0.03772, over 970989.63 frames.], batch size: 32, lr: 3.27e-04 2022-05-05 15:27:10,717 INFO [train.py:715] (0/8) Epoch 6, batch 24250, loss[loss=0.1563, simple_loss=0.2201, pruned_loss=0.04623, over 4833.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2175, pruned_loss=0.03737, over 971445.96 frames.], batch size: 15, lr: 3.27e-04 2022-05-05 15:27:49,112 INFO [train.py:715] (0/8) Epoch 6, batch 24300, loss[loss=0.1218, simple_loss=0.1902, pruned_loss=0.02672, over 4810.00 frames.], tot_loss[loss=0.1455, simple_loss=0.217, pruned_loss=0.037, over 971838.38 frames.], batch size: 24, lr: 3.27e-04 2022-05-05 15:28:28,040 INFO [train.py:715] (0/8) Epoch 6, batch 24350, loss[loss=0.131, simple_loss=0.2051, pruned_loss=0.02843, over 4891.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2182, pruned_loss=0.0376, over 972868.40 frames.], batch size: 19, lr: 3.27e-04 2022-05-05 15:29:07,157 INFO [train.py:715] (0/8) Epoch 6, batch 24400, loss[loss=0.1459, simple_loss=0.219, pruned_loss=0.03643, over 4698.00 frames.], tot_loss[loss=0.146, simple_loss=0.2175, pruned_loss=0.03729, over 972022.30 frames.], batch size: 15, lr: 3.27e-04 2022-05-05 15:29:45,506 INFO [train.py:715] (0/8) Epoch 6, batch 24450, loss[loss=0.1396, simple_loss=0.2122, pruned_loss=0.03355, over 4932.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2181, pruned_loss=0.03736, over 971799.90 frames.], batch size: 18, lr: 3.27e-04 2022-05-05 15:30:24,116 INFO [train.py:715] (0/8) Epoch 6, batch 24500, loss[loss=0.1221, simple_loss=0.1915, pruned_loss=0.02635, over 4817.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2184, pruned_loss=0.03768, over 971667.80 frames.], batch size: 26, lr: 3.27e-04 2022-05-05 15:31:03,933 INFO [train.py:715] (0/8) Epoch 6, batch 24550, loss[loss=0.1356, simple_loss=0.2109, pruned_loss=0.0302, over 4812.00 frames.], tot_loss[loss=0.1476, simple_loss=0.2191, pruned_loss=0.03808, over 972190.11 frames.], batch size: 26, lr: 3.27e-04 2022-05-05 15:31:42,158 INFO [train.py:715] (0/8) Epoch 6, batch 24600, loss[loss=0.1598, simple_loss=0.223, pruned_loss=0.04828, over 4981.00 frames.], tot_loss[loss=0.1476, simple_loss=0.2188, pruned_loss=0.03821, over 972312.88 frames.], batch size: 15, lr: 3.27e-04 2022-05-05 15:32:21,357 INFO [train.py:715] (0/8) Epoch 6, batch 24650, loss[loss=0.1525, simple_loss=0.2213, pruned_loss=0.04186, over 4786.00 frames.], tot_loss[loss=0.1477, simple_loss=0.219, pruned_loss=0.03823, over 971989.20 frames.], batch size: 18, lr: 3.27e-04 2022-05-05 15:33:00,610 INFO [train.py:715] 
(0/8) Epoch 6, batch 24700, loss[loss=0.1584, simple_loss=0.2298, pruned_loss=0.04353, over 4866.00 frames.], tot_loss[loss=0.1479, simple_loss=0.2191, pruned_loss=0.03832, over 971432.75 frames.], batch size: 32, lr: 3.27e-04 2022-05-05 15:33:39,469 INFO [train.py:715] (0/8) Epoch 6, batch 24750, loss[loss=0.1579, simple_loss=0.2278, pruned_loss=0.04399, over 4812.00 frames.], tot_loss[loss=0.149, simple_loss=0.2203, pruned_loss=0.03885, over 971661.07 frames.], batch size: 25, lr: 3.27e-04 2022-05-05 15:34:17,832 INFO [train.py:715] (0/8) Epoch 6, batch 24800, loss[loss=0.1154, simple_loss=0.1927, pruned_loss=0.01906, over 4880.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2202, pruned_loss=0.03858, over 971607.44 frames.], batch size: 22, lr: 3.27e-04 2022-05-05 15:34:56,837 INFO [train.py:715] (0/8) Epoch 6, batch 24850, loss[loss=0.1515, simple_loss=0.2227, pruned_loss=0.04011, over 4981.00 frames.], tot_loss[loss=0.1484, simple_loss=0.2198, pruned_loss=0.03853, over 971758.11 frames.], batch size: 31, lr: 3.27e-04 2022-05-05 15:35:36,644 INFO [train.py:715] (0/8) Epoch 6, batch 24900, loss[loss=0.151, simple_loss=0.2075, pruned_loss=0.04724, over 4857.00 frames.], tot_loss[loss=0.149, simple_loss=0.2203, pruned_loss=0.03888, over 971528.70 frames.], batch size: 32, lr: 3.27e-04 2022-05-05 15:36:14,916 INFO [train.py:715] (0/8) Epoch 6, batch 24950, loss[loss=0.1405, simple_loss=0.2073, pruned_loss=0.03685, over 4849.00 frames.], tot_loss[loss=0.1482, simple_loss=0.2197, pruned_loss=0.0383, over 970588.51 frames.], batch size: 30, lr: 3.27e-04 2022-05-05 15:36:53,552 INFO [train.py:715] (0/8) Epoch 6, batch 25000, loss[loss=0.1321, simple_loss=0.2022, pruned_loss=0.03101, over 4903.00 frames.], tot_loss[loss=0.148, simple_loss=0.2195, pruned_loss=0.03828, over 971619.73 frames.], batch size: 17, lr: 3.27e-04 2022-05-05 15:37:32,633 INFO [train.py:715] (0/8) Epoch 6, batch 25050, loss[loss=0.1321, simple_loss=0.2147, pruned_loss=0.02472, over 4896.00 frames.], tot_loss[loss=0.1477, simple_loss=0.2191, pruned_loss=0.03814, over 972324.02 frames.], batch size: 22, lr: 3.27e-04 2022-05-05 15:38:11,567 INFO [train.py:715] (0/8) Epoch 6, batch 25100, loss[loss=0.1498, simple_loss=0.2199, pruned_loss=0.03987, over 4806.00 frames.], tot_loss[loss=0.1476, simple_loss=0.219, pruned_loss=0.03814, over 972405.64 frames.], batch size: 25, lr: 3.27e-04 2022-05-05 15:38:50,092 INFO [train.py:715] (0/8) Epoch 6, batch 25150, loss[loss=0.1582, simple_loss=0.232, pruned_loss=0.04217, over 4856.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2189, pruned_loss=0.03806, over 972190.23 frames.], batch size: 32, lr: 3.27e-04 2022-05-05 15:39:28,929 INFO [train.py:715] (0/8) Epoch 6, batch 25200, loss[loss=0.1402, simple_loss=0.1978, pruned_loss=0.04131, over 4825.00 frames.], tot_loss[loss=0.1484, simple_loss=0.2194, pruned_loss=0.03872, over 972591.07 frames.], batch size: 15, lr: 3.27e-04 2022-05-05 15:40:07,776 INFO [train.py:715] (0/8) Epoch 6, batch 25250, loss[loss=0.1455, simple_loss=0.226, pruned_loss=0.03254, over 4757.00 frames.], tot_loss[loss=0.1489, simple_loss=0.22, pruned_loss=0.03887, over 972915.32 frames.], batch size: 16, lr: 3.26e-04 2022-05-05 15:40:46,084 INFO [train.py:715] (0/8) Epoch 6, batch 25300, loss[loss=0.1341, simple_loss=0.2086, pruned_loss=0.02982, over 4889.00 frames.], tot_loss[loss=0.1481, simple_loss=0.219, pruned_loss=0.03855, over 973085.79 frames.], batch size: 19, lr: 3.26e-04 2022-05-05 15:41:24,365 INFO [train.py:715] (0/8) Epoch 6, batch 25350, 
loss[loss=0.1542, simple_loss=0.2319, pruned_loss=0.03822, over 4758.00 frames.], tot_loss[loss=0.1473, simple_loss=0.218, pruned_loss=0.03834, over 973037.18 frames.], batch size: 19, lr: 3.26e-04 2022-05-05 15:42:03,172 INFO [train.py:715] (0/8) Epoch 6, batch 25400, loss[loss=0.1405, simple_loss=0.213, pruned_loss=0.034, over 4921.00 frames.], tot_loss[loss=0.1482, simple_loss=0.219, pruned_loss=0.03875, over 972764.47 frames.], batch size: 29, lr: 3.26e-04 2022-05-05 15:42:41,991 INFO [train.py:715] (0/8) Epoch 6, batch 25450, loss[loss=0.1621, simple_loss=0.2206, pruned_loss=0.05177, over 4959.00 frames.], tot_loss[loss=0.1482, simple_loss=0.2192, pruned_loss=0.03863, over 973240.90 frames.], batch size: 35, lr: 3.26e-04 2022-05-05 15:43:20,088 INFO [train.py:715] (0/8) Epoch 6, batch 25500, loss[loss=0.1565, simple_loss=0.2238, pruned_loss=0.04457, over 4990.00 frames.], tot_loss[loss=0.1483, simple_loss=0.2196, pruned_loss=0.03848, over 973055.71 frames.], batch size: 20, lr: 3.26e-04 2022-05-05 15:43:58,581 INFO [train.py:715] (0/8) Epoch 6, batch 25550, loss[loss=0.1443, simple_loss=0.2161, pruned_loss=0.03627, over 4956.00 frames.], tot_loss[loss=0.1479, simple_loss=0.2195, pruned_loss=0.0382, over 973473.21 frames.], batch size: 24, lr: 3.26e-04 2022-05-05 15:44:37,701 INFO [train.py:715] (0/8) Epoch 6, batch 25600, loss[loss=0.1549, simple_loss=0.2299, pruned_loss=0.03993, over 4797.00 frames.], tot_loss[loss=0.1473, simple_loss=0.2189, pruned_loss=0.03782, over 973250.66 frames.], batch size: 24, lr: 3.26e-04 2022-05-05 15:45:15,936 INFO [train.py:715] (0/8) Epoch 6, batch 25650, loss[loss=0.1304, simple_loss=0.217, pruned_loss=0.02193, over 4939.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2194, pruned_loss=0.03779, over 973831.50 frames.], batch size: 23, lr: 3.26e-04 2022-05-05 15:45:54,737 INFO [train.py:715] (0/8) Epoch 6, batch 25700, loss[loss=0.1447, simple_loss=0.2113, pruned_loss=0.03905, over 4852.00 frames.], tot_loss[loss=0.1483, simple_loss=0.2202, pruned_loss=0.03819, over 973794.30 frames.], batch size: 32, lr: 3.26e-04 2022-05-05 15:46:34,041 INFO [train.py:715] (0/8) Epoch 6, batch 25750, loss[loss=0.1471, simple_loss=0.2164, pruned_loss=0.03894, over 4827.00 frames.], tot_loss[loss=0.1485, simple_loss=0.2204, pruned_loss=0.03826, over 973474.52 frames.], batch size: 13, lr: 3.26e-04 2022-05-05 15:47:12,317 INFO [train.py:715] (0/8) Epoch 6, batch 25800, loss[loss=0.1571, simple_loss=0.2246, pruned_loss=0.04479, over 4876.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2203, pruned_loss=0.03857, over 972818.97 frames.], batch size: 22, lr: 3.26e-04 2022-05-05 15:47:50,574 INFO [train.py:715] (0/8) Epoch 6, batch 25850, loss[loss=0.1546, simple_loss=0.2308, pruned_loss=0.03922, over 4946.00 frames.], tot_loss[loss=0.1485, simple_loss=0.2199, pruned_loss=0.03854, over 973049.64 frames.], batch size: 21, lr: 3.26e-04 2022-05-05 15:48:29,217 INFO [train.py:715] (0/8) Epoch 6, batch 25900, loss[loss=0.1495, simple_loss=0.2185, pruned_loss=0.04019, over 4907.00 frames.], tot_loss[loss=0.1477, simple_loss=0.2192, pruned_loss=0.03803, over 972592.42 frames.], batch size: 19, lr: 3.26e-04 2022-05-05 15:49:08,365 INFO [train.py:715] (0/8) Epoch 6, batch 25950, loss[loss=0.1621, simple_loss=0.2301, pruned_loss=0.04709, over 4931.00 frames.], tot_loss[loss=0.1472, simple_loss=0.219, pruned_loss=0.03772, over 972428.61 frames.], batch size: 23, lr: 3.26e-04 2022-05-05 15:49:46,037 INFO [train.py:715] (0/8) Epoch 6, batch 26000, loss[loss=0.1315, simple_loss=0.1983, 
pruned_loss=0.0324, over 4761.00 frames.], tot_loss[loss=0.1473, simple_loss=0.2187, pruned_loss=0.03796, over 972274.53 frames.], batch size: 14, lr: 3.26e-04 2022-05-05 15:50:24,226 INFO [train.py:715] (0/8) Epoch 6, batch 26050, loss[loss=0.1188, simple_loss=0.1873, pruned_loss=0.02514, over 4855.00 frames.], tot_loss[loss=0.1483, simple_loss=0.2197, pruned_loss=0.03842, over 971957.02 frames.], batch size: 20, lr: 3.26e-04 2022-05-05 15:51:03,216 INFO [train.py:715] (0/8) Epoch 6, batch 26100, loss[loss=0.1234, simple_loss=0.1993, pruned_loss=0.02377, over 4827.00 frames.], tot_loss[loss=0.1472, simple_loss=0.2184, pruned_loss=0.03796, over 971871.30 frames.], batch size: 25, lr: 3.26e-04 2022-05-05 15:51:41,620 INFO [train.py:715] (0/8) Epoch 6, batch 26150, loss[loss=0.1775, simple_loss=0.2336, pruned_loss=0.06075, over 4886.00 frames.], tot_loss[loss=0.1468, simple_loss=0.218, pruned_loss=0.03784, over 972557.34 frames.], batch size: 39, lr: 3.26e-04 2022-05-05 15:52:20,120 INFO [train.py:715] (0/8) Epoch 6, batch 26200, loss[loss=0.142, simple_loss=0.2145, pruned_loss=0.03475, over 4855.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2177, pruned_loss=0.03758, over 972665.02 frames.], batch size: 20, lr: 3.26e-04 2022-05-05 15:52:58,597 INFO [train.py:715] (0/8) Epoch 6, batch 26250, loss[loss=0.1324, simple_loss=0.2069, pruned_loss=0.02895, over 4968.00 frames.], tot_loss[loss=0.1465, simple_loss=0.218, pruned_loss=0.0375, over 972128.55 frames.], batch size: 15, lr: 3.26e-04 2022-05-05 15:53:37,248 INFO [train.py:715] (0/8) Epoch 6, batch 26300, loss[loss=0.1749, simple_loss=0.244, pruned_loss=0.05286, over 4844.00 frames.], tot_loss[loss=0.1481, simple_loss=0.2198, pruned_loss=0.03823, over 972197.03 frames.], batch size: 30, lr: 3.26e-04 2022-05-05 15:54:15,323 INFO [train.py:715] (0/8) Epoch 6, batch 26350, loss[loss=0.1243, simple_loss=0.1967, pruned_loss=0.02594, over 4842.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2185, pruned_loss=0.0376, over 972042.75 frames.], batch size: 13, lr: 3.26e-04 2022-05-05 15:54:53,791 INFO [train.py:715] (0/8) Epoch 6, batch 26400, loss[loss=0.1315, simple_loss=0.1971, pruned_loss=0.03292, over 4779.00 frames.], tot_loss[loss=0.147, simple_loss=0.2187, pruned_loss=0.0376, over 971269.22 frames.], batch size: 12, lr: 3.26e-04 2022-05-05 15:55:33,105 INFO [train.py:715] (0/8) Epoch 6, batch 26450, loss[loss=0.1315, simple_loss=0.1964, pruned_loss=0.03326, over 4782.00 frames.], tot_loss[loss=0.147, simple_loss=0.2188, pruned_loss=0.0376, over 970719.42 frames.], batch size: 17, lr: 3.26e-04 2022-05-05 15:56:11,691 INFO [train.py:715] (0/8) Epoch 6, batch 26500, loss[loss=0.1435, simple_loss=0.2149, pruned_loss=0.03611, over 4822.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2185, pruned_loss=0.03765, over 971030.87 frames.], batch size: 15, lr: 3.26e-04 2022-05-05 15:56:50,066 INFO [train.py:715] (0/8) Epoch 6, batch 26550, loss[loss=0.1301, simple_loss=0.199, pruned_loss=0.03059, over 4892.00 frames.], tot_loss[loss=0.1462, simple_loss=0.2179, pruned_loss=0.03729, over 970588.82 frames.], batch size: 19, lr: 3.26e-04 2022-05-05 15:57:28,897 INFO [train.py:715] (0/8) Epoch 6, batch 26600, loss[loss=0.1427, simple_loss=0.2188, pruned_loss=0.03327, over 4925.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2179, pruned_loss=0.03738, over 970753.31 frames.], batch size: 17, lr: 3.26e-04 2022-05-05 15:58:07,538 INFO [train.py:715] (0/8) Epoch 6, batch 26650, loss[loss=0.1943, simple_loss=0.2649, pruned_loss=0.06185, over 4835.00 frames.], 
tot_loss[loss=0.1466, simple_loss=0.2178, pruned_loss=0.03768, over 971424.07 frames.], batch size: 26, lr: 3.26e-04 2022-05-05 15:58:46,264 INFO [train.py:715] (0/8) Epoch 6, batch 26700, loss[loss=0.1467, simple_loss=0.2187, pruned_loss=0.03733, over 4982.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2177, pruned_loss=0.03739, over 971642.61 frames.], batch size: 28, lr: 3.25e-04 2022-05-05 15:59:24,552 INFO [train.py:715] (0/8) Epoch 6, batch 26750, loss[loss=0.1301, simple_loss=0.205, pruned_loss=0.02755, over 4737.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2176, pruned_loss=0.03735, over 971929.57 frames.], batch size: 16, lr: 3.25e-04 2022-05-05 16:00:03,780 INFO [train.py:715] (0/8) Epoch 6, batch 26800, loss[loss=0.1411, simple_loss=0.2102, pruned_loss=0.03598, over 4866.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2186, pruned_loss=0.03767, over 971531.97 frames.], batch size: 20, lr: 3.25e-04 2022-05-05 16:00:41,897 INFO [train.py:715] (0/8) Epoch 6, batch 26850, loss[loss=0.1555, simple_loss=0.2311, pruned_loss=0.03995, over 4922.00 frames.], tot_loss[loss=0.1472, simple_loss=0.219, pruned_loss=0.03767, over 971386.37 frames.], batch size: 17, lr: 3.25e-04 2022-05-05 16:01:20,550 INFO [train.py:715] (0/8) Epoch 6, batch 26900, loss[loss=0.1428, simple_loss=0.2098, pruned_loss=0.03789, over 4873.00 frames.], tot_loss[loss=0.1481, simple_loss=0.2198, pruned_loss=0.03821, over 971470.76 frames.], batch size: 22, lr: 3.25e-04 2022-05-05 16:01:59,791 INFO [train.py:715] (0/8) Epoch 6, batch 26950, loss[loss=0.1299, simple_loss=0.2065, pruned_loss=0.02668, over 4878.00 frames.], tot_loss[loss=0.1481, simple_loss=0.2199, pruned_loss=0.0382, over 971674.46 frames.], batch size: 22, lr: 3.25e-04 2022-05-05 16:02:39,036 INFO [train.py:715] (0/8) Epoch 6, batch 27000, loss[loss=0.1687, simple_loss=0.2359, pruned_loss=0.05075, over 4707.00 frames.], tot_loss[loss=0.1491, simple_loss=0.2205, pruned_loss=0.03881, over 972529.82 frames.], batch size: 15, lr: 3.25e-04 2022-05-05 16:02:39,037 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 16:02:48,797 INFO [train.py:742] (0/8) Epoch 6, validation: loss=0.1088, simple_loss=0.1938, pruned_loss=0.01188, over 914524.00 frames. 
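Across this stretch the four validation reports barely move: 0.1087 (batch 18000), 0.1089 (batch 21000), 0.1089 (batch 24000) and 0.1088 (batch 27000), all over the same 914524.00 frames. A cheap way to flag that kind of flattening when scanning a long run is a window check over the most recent validation losses; a sketch, with the window size and threshold as arbitrary choices:

```python
def plateaued(val_losses, k=4, tol=1e-3):
    """True if the last k validation losses span less than tol.

    For the four reports in this stretch (0.1087, 0.1089, 0.1089, 0.1088)
    this returns True with the defaults.
    """
    if len(val_losses) < k:
        return False
    window = val_losses[-k:]
    return max(window) - min(window) < tol
```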
2022-05-05 16:03:28,071 INFO [train.py:715] (0/8) Epoch 6, batch 27050, loss[loss=0.1488, simple_loss=0.2136, pruned_loss=0.04202, over 4919.00 frames.], tot_loss[loss=0.149, simple_loss=0.2204, pruned_loss=0.03886, over 972104.44 frames.], batch size: 23, lr: 3.25e-04 2022-05-05 16:04:06,803 INFO [train.py:715] (0/8) Epoch 6, batch 27100, loss[loss=0.1569, simple_loss=0.2279, pruned_loss=0.04298, over 4835.00 frames.], tot_loss[loss=0.1482, simple_loss=0.2198, pruned_loss=0.03835, over 971986.44 frames.], batch size: 15, lr: 3.25e-04 2022-05-05 16:04:45,437 INFO [train.py:715] (0/8) Epoch 6, batch 27150, loss[loss=0.1448, simple_loss=0.2254, pruned_loss=0.03213, over 4769.00 frames.], tot_loss[loss=0.1476, simple_loss=0.2191, pruned_loss=0.03804, over 971778.12 frames.], batch size: 17, lr: 3.25e-04 2022-05-05 16:05:25,172 INFO [train.py:715] (0/8) Epoch 6, batch 27200, loss[loss=0.12, simple_loss=0.1906, pruned_loss=0.02471, over 4958.00 frames.], tot_loss[loss=0.1482, simple_loss=0.2192, pruned_loss=0.03864, over 972037.36 frames.], batch size: 24, lr: 3.25e-04 2022-05-05 16:06:03,409 INFO [train.py:715] (0/8) Epoch 6, batch 27250, loss[loss=0.1425, simple_loss=0.208, pruned_loss=0.03848, over 4863.00 frames.], tot_loss[loss=0.1483, simple_loss=0.2193, pruned_loss=0.03866, over 971348.33 frames.], batch size: 32, lr: 3.25e-04 2022-05-05 16:06:43,059 INFO [train.py:715] (0/8) Epoch 6, batch 27300, loss[loss=0.1509, simple_loss=0.2226, pruned_loss=0.03955, over 4860.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2206, pruned_loss=0.03942, over 971934.88 frames.], batch size: 13, lr: 3.25e-04 2022-05-05 16:07:22,055 INFO [train.py:715] (0/8) Epoch 6, batch 27350, loss[loss=0.1157, simple_loss=0.1912, pruned_loss=0.02014, over 4932.00 frames.], tot_loss[loss=0.1498, simple_loss=0.2207, pruned_loss=0.03945, over 971657.59 frames.], batch size: 18, lr: 3.25e-04 2022-05-05 16:08:01,162 INFO [train.py:715] (0/8) Epoch 6, batch 27400, loss[loss=0.1356, simple_loss=0.2106, pruned_loss=0.03036, over 4798.00 frames.], tot_loss[loss=0.1502, simple_loss=0.2215, pruned_loss=0.03949, over 971108.47 frames.], batch size: 14, lr: 3.25e-04 2022-05-05 16:08:39,769 INFO [train.py:715] (0/8) Epoch 6, batch 27450, loss[loss=0.1534, simple_loss=0.2253, pruned_loss=0.04075, over 4960.00 frames.], tot_loss[loss=0.1504, simple_loss=0.2218, pruned_loss=0.03947, over 972244.41 frames.], batch size: 35, lr: 3.25e-04 2022-05-05 16:09:18,811 INFO [train.py:715] (0/8) Epoch 6, batch 27500, loss[loss=0.1602, simple_loss=0.2302, pruned_loss=0.04509, over 4845.00 frames.], tot_loss[loss=0.1489, simple_loss=0.2201, pruned_loss=0.03887, over 972154.92 frames.], batch size: 15, lr: 3.25e-04 2022-05-05 16:09:58,184 INFO [train.py:715] (0/8) Epoch 6, batch 27550, loss[loss=0.1398, simple_loss=0.2132, pruned_loss=0.03326, over 4870.00 frames.], tot_loss[loss=0.1497, simple_loss=0.2206, pruned_loss=0.03941, over 972480.33 frames.], batch size: 32, lr: 3.25e-04 2022-05-05 16:10:36,910 INFO [train.py:715] (0/8) Epoch 6, batch 27600, loss[loss=0.1447, simple_loss=0.2135, pruned_loss=0.0379, over 4861.00 frames.], tot_loss[loss=0.1499, simple_loss=0.2202, pruned_loss=0.03981, over 973216.38 frames.], batch size: 32, lr: 3.25e-04 2022-05-05 16:11:15,422 INFO [train.py:715] (0/8) Epoch 6, batch 27650, loss[loss=0.1404, simple_loss=0.2104, pruned_loss=0.03521, over 4778.00 frames.], tot_loss[loss=0.1492, simple_loss=0.2194, pruned_loss=0.03951, over 972701.36 frames.], batch size: 17, lr: 3.25e-04 2022-05-05 16:11:54,435 INFO 
[train.py:715] (0/8) Epoch 6, batch 27700, loss[loss=0.1376, simple_loss=0.211, pruned_loss=0.03206, over 4782.00 frames.], tot_loss[loss=0.1482, simple_loss=0.2188, pruned_loss=0.03879, over 971724.03 frames.], batch size: 18, lr: 3.25e-04 2022-05-05 16:12:32,976 INFO [train.py:715] (0/8) Epoch 6, batch 27750, loss[loss=0.1035, simple_loss=0.1758, pruned_loss=0.01557, over 4841.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2177, pruned_loss=0.03803, over 971856.64 frames.], batch size: 13, lr: 3.25e-04 2022-05-05 16:13:12,183 INFO [train.py:715] (0/8) Epoch 6, batch 27800, loss[loss=0.1463, simple_loss=0.2138, pruned_loss=0.03942, over 4762.00 frames.], tot_loss[loss=0.148, simple_loss=0.2187, pruned_loss=0.03865, over 971676.89 frames.], batch size: 18, lr: 3.25e-04 2022-05-05 16:13:51,237 INFO [train.py:715] (0/8) Epoch 6, batch 27850, loss[loss=0.1341, simple_loss=0.21, pruned_loss=0.02907, over 4800.00 frames.], tot_loss[loss=0.1486, simple_loss=0.2195, pruned_loss=0.03885, over 972424.72 frames.], batch size: 24, lr: 3.25e-04 2022-05-05 16:14:30,893 INFO [train.py:715] (0/8) Epoch 6, batch 27900, loss[loss=0.1255, simple_loss=0.2061, pruned_loss=0.02241, over 4955.00 frames.], tot_loss[loss=0.1482, simple_loss=0.2192, pruned_loss=0.03865, over 970928.66 frames.], batch size: 24, lr: 3.25e-04 2022-05-05 16:15:09,372 INFO [train.py:715] (0/8) Epoch 6, batch 27950, loss[loss=0.1509, simple_loss=0.2121, pruned_loss=0.04484, over 4806.00 frames.], tot_loss[loss=0.149, simple_loss=0.2198, pruned_loss=0.03912, over 971116.57 frames.], batch size: 14, lr: 3.25e-04 2022-05-05 16:15:48,250 INFO [train.py:715] (0/8) Epoch 6, batch 28000, loss[loss=0.1473, simple_loss=0.2306, pruned_loss=0.03197, over 4988.00 frames.], tot_loss[loss=0.1492, simple_loss=0.2202, pruned_loss=0.03905, over 971170.46 frames.], batch size: 26, lr: 3.25e-04 2022-05-05 16:16:27,391 INFO [train.py:715] (0/8) Epoch 6, batch 28050, loss[loss=0.1518, simple_loss=0.2165, pruned_loss=0.0435, over 4762.00 frames.], tot_loss[loss=0.1481, simple_loss=0.219, pruned_loss=0.03861, over 971099.30 frames.], batch size: 16, lr: 3.25e-04 2022-05-05 16:17:06,023 INFO [train.py:715] (0/8) Epoch 6, batch 28100, loss[loss=0.1779, simple_loss=0.2571, pruned_loss=0.04933, over 4857.00 frames.], tot_loss[loss=0.1485, simple_loss=0.2193, pruned_loss=0.0388, over 971413.65 frames.], batch size: 30, lr: 3.25e-04 2022-05-05 16:17:44,945 INFO [train.py:715] (0/8) Epoch 6, batch 28150, loss[loss=0.1772, simple_loss=0.2495, pruned_loss=0.05246, over 4859.00 frames.], tot_loss[loss=0.1483, simple_loss=0.219, pruned_loss=0.03878, over 970138.12 frames.], batch size: 32, lr: 3.24e-04 2022-05-05 16:18:24,085 INFO [train.py:715] (0/8) Epoch 6, batch 28200, loss[loss=0.1728, simple_loss=0.2417, pruned_loss=0.05198, over 4955.00 frames.], tot_loss[loss=0.1484, simple_loss=0.2193, pruned_loss=0.03879, over 970332.83 frames.], batch size: 35, lr: 3.24e-04 2022-05-05 16:19:03,408 INFO [train.py:715] (0/8) Epoch 6, batch 28250, loss[loss=0.1824, simple_loss=0.2463, pruned_loss=0.05919, over 4832.00 frames.], tot_loss[loss=0.1485, simple_loss=0.2195, pruned_loss=0.03876, over 971290.50 frames.], batch size: 27, lr: 3.24e-04 2022-05-05 16:19:41,788 INFO [train.py:715] (0/8) Epoch 6, batch 28300, loss[loss=0.1455, simple_loss=0.2147, pruned_loss=0.03816, over 4901.00 frames.], tot_loss[loss=0.1488, simple_loss=0.2199, pruned_loss=0.03888, over 971437.22 frames.], batch size: 17, lr: 3.24e-04 2022-05-05 16:20:20,025 INFO [train.py:715] (0/8) Epoch 6, batch 
28350, loss[loss=0.1861, simple_loss=0.2507, pruned_loss=0.06077, over 4920.00 frames.], tot_loss[loss=0.1476, simple_loss=0.2188, pruned_loss=0.03825, over 971316.54 frames.], batch size: 39, lr: 3.24e-04 2022-05-05 16:20:59,870 INFO [train.py:715] (0/8) Epoch 6, batch 28400, loss[loss=0.1363, simple_loss=0.2133, pruned_loss=0.02963, over 4970.00 frames.], tot_loss[loss=0.148, simple_loss=0.2191, pruned_loss=0.03846, over 971385.24 frames.], batch size: 24, lr: 3.24e-04 2022-05-05 16:21:38,666 INFO [train.py:715] (0/8) Epoch 6, batch 28450, loss[loss=0.1484, simple_loss=0.2165, pruned_loss=0.04019, over 4860.00 frames.], tot_loss[loss=0.1492, simple_loss=0.2202, pruned_loss=0.03913, over 972061.22 frames.], batch size: 15, lr: 3.24e-04 2022-05-05 16:22:17,504 INFO [train.py:715] (0/8) Epoch 6, batch 28500, loss[loss=0.1403, simple_loss=0.2109, pruned_loss=0.0348, over 4886.00 frames.], tot_loss[loss=0.1483, simple_loss=0.2195, pruned_loss=0.03857, over 972275.56 frames.], batch size: 16, lr: 3.24e-04 2022-05-05 16:22:56,647 INFO [train.py:715] (0/8) Epoch 6, batch 28550, loss[loss=0.1698, simple_loss=0.2335, pruned_loss=0.05302, over 4786.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2201, pruned_loss=0.03866, over 972746.72 frames.], batch size: 14, lr: 3.24e-04 2022-05-05 16:23:36,085 INFO [train.py:715] (0/8) Epoch 6, batch 28600, loss[loss=0.1838, simple_loss=0.2571, pruned_loss=0.05523, over 4991.00 frames.], tot_loss[loss=0.149, simple_loss=0.2205, pruned_loss=0.03881, over 972180.01 frames.], batch size: 28, lr: 3.24e-04 2022-05-05 16:24:14,188 INFO [train.py:715] (0/8) Epoch 6, batch 28650, loss[loss=0.1285, simple_loss=0.2127, pruned_loss=0.0222, over 4921.00 frames.], tot_loss[loss=0.1484, simple_loss=0.22, pruned_loss=0.03836, over 971647.59 frames.], batch size: 17, lr: 3.24e-04 2022-05-05 16:24:52,988 INFO [train.py:715] (0/8) Epoch 6, batch 28700, loss[loss=0.2103, simple_loss=0.2593, pruned_loss=0.0806, over 4701.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2202, pruned_loss=0.03855, over 971434.96 frames.], batch size: 15, lr: 3.24e-04 2022-05-05 16:25:32,171 INFO [train.py:715] (0/8) Epoch 6, batch 28750, loss[loss=0.12, simple_loss=0.1893, pruned_loss=0.02534, over 4832.00 frames.], tot_loss[loss=0.1475, simple_loss=0.219, pruned_loss=0.03803, over 971969.58 frames.], batch size: 15, lr: 3.24e-04 2022-05-05 16:26:10,896 INFO [train.py:715] (0/8) Epoch 6, batch 28800, loss[loss=0.1417, simple_loss=0.2181, pruned_loss=0.0326, over 4818.00 frames.], tot_loss[loss=0.1481, simple_loss=0.2196, pruned_loss=0.03829, over 971207.17 frames.], batch size: 26, lr: 3.24e-04 2022-05-05 16:26:49,764 INFO [train.py:715] (0/8) Epoch 6, batch 28850, loss[loss=0.1468, simple_loss=0.2232, pruned_loss=0.03519, over 4697.00 frames.], tot_loss[loss=0.148, simple_loss=0.2196, pruned_loss=0.03818, over 971012.94 frames.], batch size: 15, lr: 3.24e-04 2022-05-05 16:27:28,065 INFO [train.py:715] (0/8) Epoch 6, batch 28900, loss[loss=0.1735, simple_loss=0.2327, pruned_loss=0.05711, over 4898.00 frames.], tot_loss[loss=0.1476, simple_loss=0.2193, pruned_loss=0.03793, over 972164.68 frames.], batch size: 18, lr: 3.24e-04 2022-05-05 16:28:07,511 INFO [train.py:715] (0/8) Epoch 6, batch 28950, loss[loss=0.1735, simple_loss=0.2508, pruned_loss=0.04809, over 4971.00 frames.], tot_loss[loss=0.1478, simple_loss=0.2197, pruned_loss=0.03793, over 970923.46 frames.], batch size: 15, lr: 3.24e-04 2022-05-05 16:28:45,745 INFO [train.py:715] (0/8) Epoch 6, batch 29000, loss[loss=0.1445, simple_loss=0.223, 
pruned_loss=0.03302, over 4905.00 frames.], tot_loss[loss=0.1477, simple_loss=0.2195, pruned_loss=0.03792, over 970681.35 frames.], batch size: 19, lr: 3.24e-04 2022-05-05 16:29:23,900 INFO [train.py:715] (0/8) Epoch 6, batch 29050, loss[loss=0.135, simple_loss=0.2121, pruned_loss=0.02894, over 4942.00 frames.], tot_loss[loss=0.1485, simple_loss=0.22, pruned_loss=0.03844, over 971083.09 frames.], batch size: 21, lr: 3.24e-04 2022-05-05 16:30:02,949 INFO [train.py:715] (0/8) Epoch 6, batch 29100, loss[loss=0.175, simple_loss=0.2491, pruned_loss=0.05048, over 4782.00 frames.], tot_loss[loss=0.1484, simple_loss=0.22, pruned_loss=0.0384, over 971059.03 frames.], batch size: 18, lr: 3.24e-04 2022-05-05 16:30:41,834 INFO [train.py:715] (0/8) Epoch 6, batch 29150, loss[loss=0.138, simple_loss=0.2105, pruned_loss=0.03273, over 4785.00 frames.], tot_loss[loss=0.1477, simple_loss=0.2197, pruned_loss=0.03791, over 970830.33 frames.], batch size: 18, lr: 3.24e-04 2022-05-05 16:31:20,666 INFO [train.py:715] (0/8) Epoch 6, batch 29200, loss[loss=0.1922, simple_loss=0.2455, pruned_loss=0.06944, over 4860.00 frames.], tot_loss[loss=0.148, simple_loss=0.2198, pruned_loss=0.03816, over 970319.56 frames.], batch size: 20, lr: 3.24e-04 2022-05-05 16:31:59,883 INFO [train.py:715] (0/8) Epoch 6, batch 29250, loss[loss=0.137, simple_loss=0.2089, pruned_loss=0.03254, over 4797.00 frames.], tot_loss[loss=0.1482, simple_loss=0.22, pruned_loss=0.03817, over 971435.49 frames.], batch size: 24, lr: 3.24e-04 2022-05-05 16:32:39,919 INFO [train.py:715] (0/8) Epoch 6, batch 29300, loss[loss=0.1384, simple_loss=0.2116, pruned_loss=0.03265, over 4755.00 frames.], tot_loss[loss=0.1479, simple_loss=0.2191, pruned_loss=0.03829, over 971018.65 frames.], batch size: 19, lr: 3.24e-04 2022-05-05 16:33:18,203 INFO [train.py:715] (0/8) Epoch 6, batch 29350, loss[loss=0.1729, simple_loss=0.2356, pruned_loss=0.05509, over 4754.00 frames.], tot_loss[loss=0.1472, simple_loss=0.2185, pruned_loss=0.03796, over 971222.10 frames.], batch size: 19, lr: 3.24e-04 2022-05-05 16:33:57,194 INFO [train.py:715] (0/8) Epoch 6, batch 29400, loss[loss=0.1615, simple_loss=0.236, pruned_loss=0.04351, over 4796.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2178, pruned_loss=0.03756, over 971653.55 frames.], batch size: 21, lr: 3.24e-04 2022-05-05 16:34:36,593 INFO [train.py:715] (0/8) Epoch 6, batch 29450, loss[loss=0.1572, simple_loss=0.2297, pruned_loss=0.04238, over 4990.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2178, pruned_loss=0.03755, over 971390.69 frames.], batch size: 25, lr: 3.24e-04 2022-05-05 16:35:15,800 INFO [train.py:715] (0/8) Epoch 6, batch 29500, loss[loss=0.142, simple_loss=0.2105, pruned_loss=0.03679, over 4815.00 frames.], tot_loss[loss=0.1466, simple_loss=0.218, pruned_loss=0.03755, over 971941.47 frames.], batch size: 27, lr: 3.24e-04 2022-05-05 16:35:53,788 INFO [train.py:715] (0/8) Epoch 6, batch 29550, loss[loss=0.1323, simple_loss=0.2014, pruned_loss=0.03156, over 4857.00 frames.], tot_loss[loss=0.1466, simple_loss=0.2183, pruned_loss=0.03745, over 972774.40 frames.], batch size: 20, lr: 3.24e-04 2022-05-05 16:36:33,140 INFO [train.py:715] (0/8) Epoch 6, batch 29600, loss[loss=0.1545, simple_loss=0.2244, pruned_loss=0.04226, over 4987.00 frames.], tot_loss[loss=0.1463, simple_loss=0.218, pruned_loss=0.03735, over 973131.44 frames.], batch size: 25, lr: 3.24e-04 2022-05-05 16:37:12,529 INFO [train.py:715] (0/8) Epoch 6, batch 29650, loss[loss=0.1627, simple_loss=0.2289, pruned_loss=0.04826, over 4994.00 frames.], 
tot_loss[loss=0.1457, simple_loss=0.2171, pruned_loss=0.0372, over 972570.04 frames.], batch size: 16, lr: 3.23e-04 2022-05-05 16:37:51,060 INFO [train.py:715] (0/8) Epoch 6, batch 29700, loss[loss=0.1232, simple_loss=0.2037, pruned_loss=0.0214, over 4902.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2175, pruned_loss=0.03798, over 972927.47 frames.], batch size: 18, lr: 3.23e-04 2022-05-05 16:38:29,759 INFO [train.py:715] (0/8) Epoch 6, batch 29750, loss[loss=0.1824, simple_loss=0.2458, pruned_loss=0.05953, over 4877.00 frames.], tot_loss[loss=0.147, simple_loss=0.218, pruned_loss=0.03803, over 973521.66 frames.], batch size: 22, lr: 3.23e-04 2022-05-05 16:39:08,773 INFO [train.py:715] (0/8) Epoch 6, batch 29800, loss[loss=0.1237, simple_loss=0.1945, pruned_loss=0.02646, over 4938.00 frames.], tot_loss[loss=0.1477, simple_loss=0.2189, pruned_loss=0.03828, over 972949.57 frames.], batch size: 18, lr: 3.23e-04 2022-05-05 16:39:48,201 INFO [train.py:715] (0/8) Epoch 6, batch 29850, loss[loss=0.1376, simple_loss=0.2104, pruned_loss=0.03234, over 4805.00 frames.], tot_loss[loss=0.1471, simple_loss=0.2184, pruned_loss=0.0379, over 971814.16 frames.], batch size: 12, lr: 3.23e-04 2022-05-05 16:40:26,711 INFO [train.py:715] (0/8) Epoch 6, batch 29900, loss[loss=0.1694, simple_loss=0.2433, pruned_loss=0.04773, over 4984.00 frames.], tot_loss[loss=0.1473, simple_loss=0.2188, pruned_loss=0.03785, over 972195.93 frames.], batch size: 24, lr: 3.23e-04 2022-05-05 16:41:05,698 INFO [train.py:715] (0/8) Epoch 6, batch 29950, loss[loss=0.154, simple_loss=0.2416, pruned_loss=0.03322, over 4759.00 frames.], tot_loss[loss=0.1477, simple_loss=0.219, pruned_loss=0.03824, over 972531.08 frames.], batch size: 19, lr: 3.23e-04 2022-05-05 16:41:45,051 INFO [train.py:715] (0/8) Epoch 6, batch 30000, loss[loss=0.1424, simple_loss=0.2028, pruned_loss=0.04096, over 4828.00 frames.], tot_loss[loss=0.1473, simple_loss=0.2185, pruned_loss=0.03812, over 971878.47 frames.], batch size: 13, lr: 3.23e-04 2022-05-05 16:41:45,052 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 16:41:54,715 INFO [train.py:742] (0/8) Epoch 6, validation: loss=0.1088, simple_loss=0.1938, pruned_loss=0.0119, over 914524.00 frames. 
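In this stretch the "Computing validation loss" entries land at batches 27000, 30000 and, further down, 33000, each time over the same 914524.00 validation frames, which reads as a fixed cadence of one validation pass every 3000 training batches. A minimal sketch of that apparent schedule follows; the interval and the helper name are assumptions for illustration, not taken from train.py.

```python
# Minimal sketch of the validation cadence visible in this part of the log:
# one validation pass every 3000 training batches (27000, 30000, 33000, ...).
# The interval and the helper name are inferred for illustration only.
VALID_INTERVAL = 3000  # assumed from the observed cadence

def validation_points(start_batch: int, end_batch: int, interval: int = VALID_INTERVAL):
    """Yield the batch indices at which a validation pass would run."""
    for batch_idx in range(start_batch, end_batch + 1):
        if batch_idx % interval == 0:
            yield batch_idx

# Reproduces the validation points logged for epoch 6 in this section.
print(list(validation_points(26000, 34800)))   # -> [27000, 30000, 33000]
```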
2022-05-05 16:42:34,421 INFO [train.py:715] (0/8) Epoch 6, batch 30050, loss[loss=0.1712, simple_loss=0.2509, pruned_loss=0.04572, over 4907.00 frames.], tot_loss[loss=0.1478, simple_loss=0.2188, pruned_loss=0.03835, over 972598.07 frames.], batch size: 19, lr: 3.23e-04 2022-05-05 16:43:12,812 INFO [train.py:715] (0/8) Epoch 6, batch 30100, loss[loss=0.1191, simple_loss=0.1946, pruned_loss=0.02182, over 4736.00 frames.], tot_loss[loss=0.1483, simple_loss=0.2195, pruned_loss=0.03855, over 973218.12 frames.], batch size: 16, lr: 3.23e-04 2022-05-05 16:43:51,563 INFO [train.py:715] (0/8) Epoch 6, batch 30150, loss[loss=0.1393, simple_loss=0.2149, pruned_loss=0.03189, over 4938.00 frames.], tot_loss[loss=0.1474, simple_loss=0.2187, pruned_loss=0.03807, over 973106.71 frames.], batch size: 21, lr: 3.23e-04 2022-05-05 16:44:30,964 INFO [train.py:715] (0/8) Epoch 6, batch 30200, loss[loss=0.1346, simple_loss=0.2072, pruned_loss=0.03105, over 4801.00 frames.], tot_loss[loss=0.1476, simple_loss=0.2187, pruned_loss=0.03822, over 972470.96 frames.], batch size: 13, lr: 3.23e-04 2022-05-05 16:45:10,340 INFO [train.py:715] (0/8) Epoch 6, batch 30250, loss[loss=0.1264, simple_loss=0.2011, pruned_loss=0.02581, over 4972.00 frames.], tot_loss[loss=0.1481, simple_loss=0.2192, pruned_loss=0.03849, over 972593.79 frames.], batch size: 15, lr: 3.23e-04 2022-05-05 16:45:48,509 INFO [train.py:715] (0/8) Epoch 6, batch 30300, loss[loss=0.1751, simple_loss=0.2372, pruned_loss=0.05648, over 4791.00 frames.], tot_loss[loss=0.1489, simple_loss=0.22, pruned_loss=0.0389, over 972341.82 frames.], batch size: 24, lr: 3.23e-04 2022-05-05 16:46:27,513 INFO [train.py:715] (0/8) Epoch 6, batch 30350, loss[loss=0.1273, simple_loss=0.2, pruned_loss=0.02729, over 4803.00 frames.], tot_loss[loss=0.1491, simple_loss=0.22, pruned_loss=0.03914, over 972253.89 frames.], batch size: 24, lr: 3.23e-04 2022-05-05 16:47:06,583 INFO [train.py:715] (0/8) Epoch 6, batch 30400, loss[loss=0.2147, simple_loss=0.2605, pruned_loss=0.08445, over 4755.00 frames.], tot_loss[loss=0.1494, simple_loss=0.2207, pruned_loss=0.0391, over 971620.99 frames.], batch size: 16, lr: 3.23e-04 2022-05-05 16:47:45,259 INFO [train.py:715] (0/8) Epoch 6, batch 30450, loss[loss=0.1856, simple_loss=0.2508, pruned_loss=0.06016, over 4876.00 frames.], tot_loss[loss=0.1489, simple_loss=0.2203, pruned_loss=0.03871, over 972454.86 frames.], batch size: 16, lr: 3.23e-04 2022-05-05 16:48:23,945 INFO [train.py:715] (0/8) Epoch 6, batch 30500, loss[loss=0.1406, simple_loss=0.2064, pruned_loss=0.03737, over 4900.00 frames.], tot_loss[loss=0.1489, simple_loss=0.2201, pruned_loss=0.03889, over 973107.73 frames.], batch size: 18, lr: 3.23e-04 2022-05-05 16:49:02,692 INFO [train.py:715] (0/8) Epoch 6, batch 30550, loss[loss=0.1394, simple_loss=0.2072, pruned_loss=0.03578, over 4819.00 frames.], tot_loss[loss=0.1491, simple_loss=0.2203, pruned_loss=0.03898, over 972638.34 frames.], batch size: 21, lr: 3.23e-04 2022-05-05 16:49:41,850 INFO [train.py:715] (0/8) Epoch 6, batch 30600, loss[loss=0.1701, simple_loss=0.2377, pruned_loss=0.05118, over 4948.00 frames.], tot_loss[loss=0.1492, simple_loss=0.2203, pruned_loss=0.03899, over 972256.73 frames.], batch size: 15, lr: 3.23e-04 2022-05-05 16:50:20,374 INFO [train.py:715] (0/8) Epoch 6, batch 30650, loss[loss=0.1704, simple_loss=0.2555, pruned_loss=0.0427, over 4846.00 frames.], tot_loss[loss=0.1494, simple_loss=0.2211, pruned_loss=0.03882, over 972021.05 frames.], batch size: 20, lr: 3.23e-04 2022-05-05 16:50:59,230 INFO 
[train.py:715] (0/8) Epoch 6, batch 30700, loss[loss=0.1456, simple_loss=0.2167, pruned_loss=0.03721, over 4806.00 frames.], tot_loss[loss=0.1493, simple_loss=0.221, pruned_loss=0.03877, over 972399.31 frames.], batch size: 25, lr: 3.23e-04 2022-05-05 16:51:38,187 INFO [train.py:715] (0/8) Epoch 6, batch 30750, loss[loss=0.1519, simple_loss=0.2237, pruned_loss=0.04008, over 4948.00 frames.], tot_loss[loss=0.1486, simple_loss=0.2204, pruned_loss=0.03844, over 972688.84 frames.], batch size: 29, lr: 3.23e-04 2022-05-05 16:52:17,032 INFO [train.py:715] (0/8) Epoch 6, batch 30800, loss[loss=0.1412, simple_loss=0.2074, pruned_loss=0.03753, over 4842.00 frames.], tot_loss[loss=0.1479, simple_loss=0.2198, pruned_loss=0.03801, over 973402.95 frames.], batch size: 30, lr: 3.23e-04 2022-05-05 16:52:55,437 INFO [train.py:715] (0/8) Epoch 6, batch 30850, loss[loss=0.1425, simple_loss=0.2176, pruned_loss=0.03375, over 4931.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2202, pruned_loss=0.03859, over 973442.06 frames.], batch size: 21, lr: 3.23e-04 2022-05-05 16:53:34,165 INFO [train.py:715] (0/8) Epoch 6, batch 30900, loss[loss=0.136, simple_loss=0.2035, pruned_loss=0.03427, over 4799.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2204, pruned_loss=0.03849, over 972991.85 frames.], batch size: 21, lr: 3.23e-04 2022-05-05 16:54:13,770 INFO [train.py:715] (0/8) Epoch 6, batch 30950, loss[loss=0.1612, simple_loss=0.2368, pruned_loss=0.04279, over 4950.00 frames.], tot_loss[loss=0.148, simple_loss=0.22, pruned_loss=0.03799, over 973776.26 frames.], batch size: 21, lr: 3.23e-04 2022-05-05 16:54:51,904 INFO [train.py:715] (0/8) Epoch 6, batch 31000, loss[loss=0.145, simple_loss=0.2311, pruned_loss=0.02944, over 4952.00 frames.], tot_loss[loss=0.1478, simple_loss=0.2194, pruned_loss=0.03804, over 973028.39 frames.], batch size: 24, lr: 3.23e-04 2022-05-05 16:55:30,909 INFO [train.py:715] (0/8) Epoch 6, batch 31050, loss[loss=0.1758, simple_loss=0.2375, pruned_loss=0.05708, over 4784.00 frames.], tot_loss[loss=0.1485, simple_loss=0.22, pruned_loss=0.0385, over 973149.19 frames.], batch size: 18, lr: 3.23e-04 2022-05-05 16:56:10,162 INFO [train.py:715] (0/8) Epoch 6, batch 31100, loss[loss=0.1542, simple_loss=0.2174, pruned_loss=0.0455, over 4908.00 frames.], tot_loss[loss=0.1471, simple_loss=0.2186, pruned_loss=0.03775, over 972556.23 frames.], batch size: 17, lr: 3.22e-04 2022-05-05 16:56:42,939 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-240000.pt 2022-05-05 16:56:51,375 INFO [train.py:715] (0/8) Epoch 6, batch 31150, loss[loss=0.1643, simple_loss=0.2344, pruned_loss=0.04709, over 4785.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2193, pruned_loss=0.03786, over 972777.18 frames.], batch size: 17, lr: 3.22e-04 2022-05-05 16:57:30,154 INFO [train.py:715] (0/8) Epoch 6, batch 31200, loss[loss=0.1221, simple_loss=0.1904, pruned_loss=0.02694, over 4982.00 frames.], tot_loss[loss=0.1476, simple_loss=0.2189, pruned_loss=0.03819, over 972635.91 frames.], batch size: 25, lr: 3.22e-04 2022-05-05 16:58:09,409 INFO [train.py:715] (0/8) Epoch 6, batch 31250, loss[loss=0.1257, simple_loss=0.199, pruned_loss=0.0262, over 4767.00 frames.], tot_loss[loss=0.1479, simple_loss=0.219, pruned_loss=0.03835, over 972531.29 frames.], batch size: 18, lr: 3.22e-04 2022-05-05 16:58:48,242 INFO [train.py:715] (0/8) Epoch 6, batch 31300, loss[loss=0.1399, simple_loss=0.2086, pruned_loss=0.03561, over 4895.00 frames.], tot_loss[loss=0.148, simple_loss=0.219, 
pruned_loss=0.03849, over 972601.52 frames.], batch size: 19, lr: 3.22e-04 2022-05-05 16:59:27,118 INFO [train.py:715] (0/8) Epoch 6, batch 31350, loss[loss=0.1284, simple_loss=0.2049, pruned_loss=0.02595, over 4974.00 frames.], tot_loss[loss=0.148, simple_loss=0.2192, pruned_loss=0.03836, over 972785.43 frames.], batch size: 28, lr: 3.22e-04 2022-05-05 17:00:06,351 INFO [train.py:715] (0/8) Epoch 6, batch 31400, loss[loss=0.1596, simple_loss=0.2329, pruned_loss=0.0431, over 4817.00 frames.], tot_loss[loss=0.1477, simple_loss=0.219, pruned_loss=0.03817, over 971803.86 frames.], batch size: 27, lr: 3.22e-04 2022-05-05 17:00:45,699 INFO [train.py:715] (0/8) Epoch 6, batch 31450, loss[loss=0.167, simple_loss=0.2503, pruned_loss=0.04179, over 4688.00 frames.], tot_loss[loss=0.1477, simple_loss=0.2191, pruned_loss=0.03808, over 971593.13 frames.], batch size: 15, lr: 3.22e-04 2022-05-05 17:01:23,992 INFO [train.py:715] (0/8) Epoch 6, batch 31500, loss[loss=0.1451, simple_loss=0.2116, pruned_loss=0.03931, over 4980.00 frames.], tot_loss[loss=0.1476, simple_loss=0.2192, pruned_loss=0.03803, over 972247.70 frames.], batch size: 31, lr: 3.22e-04 2022-05-05 17:02:02,409 INFO [train.py:715] (0/8) Epoch 6, batch 31550, loss[loss=0.1229, simple_loss=0.19, pruned_loss=0.0279, over 4858.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2186, pruned_loss=0.03824, over 971258.45 frames.], batch size: 32, lr: 3.22e-04 2022-05-05 17:02:41,953 INFO [train.py:715] (0/8) Epoch 6, batch 31600, loss[loss=0.1316, simple_loss=0.2094, pruned_loss=0.02687, over 4771.00 frames.], tot_loss[loss=0.1472, simple_loss=0.2186, pruned_loss=0.03791, over 971370.64 frames.], batch size: 18, lr: 3.22e-04 2022-05-05 17:03:21,193 INFO [train.py:715] (0/8) Epoch 6, batch 31650, loss[loss=0.1422, simple_loss=0.2167, pruned_loss=0.03384, over 4978.00 frames.], tot_loss[loss=0.1476, simple_loss=0.2192, pruned_loss=0.03793, over 972101.23 frames.], batch size: 28, lr: 3.22e-04 2022-05-05 17:03:59,730 INFO [train.py:715] (0/8) Epoch 6, batch 31700, loss[loss=0.1318, simple_loss=0.2, pruned_loss=0.03179, over 4847.00 frames.], tot_loss[loss=0.1474, simple_loss=0.2187, pruned_loss=0.03806, over 971930.48 frames.], batch size: 30, lr: 3.22e-04 2022-05-05 17:04:38,252 INFO [train.py:715] (0/8) Epoch 6, batch 31750, loss[loss=0.1787, simple_loss=0.2387, pruned_loss=0.05937, over 4955.00 frames.], tot_loss[loss=0.1488, simple_loss=0.2201, pruned_loss=0.03876, over 971922.37 frames.], batch size: 15, lr: 3.22e-04 2022-05-05 17:05:17,755 INFO [train.py:715] (0/8) Epoch 6, batch 31800, loss[loss=0.1396, simple_loss=0.2084, pruned_loss=0.0354, over 4891.00 frames.], tot_loss[loss=0.1486, simple_loss=0.2201, pruned_loss=0.03854, over 973264.40 frames.], batch size: 17, lr: 3.22e-04 2022-05-05 17:05:56,238 INFO [train.py:715] (0/8) Epoch 6, batch 31850, loss[loss=0.1244, simple_loss=0.1993, pruned_loss=0.02475, over 4783.00 frames.], tot_loss[loss=0.1484, simple_loss=0.2202, pruned_loss=0.03829, over 972912.78 frames.], batch size: 17, lr: 3.22e-04 2022-05-05 17:06:34,774 INFO [train.py:715] (0/8) Epoch 6, batch 31900, loss[loss=0.1428, simple_loss=0.2163, pruned_loss=0.03465, over 4779.00 frames.], tot_loss[loss=0.1489, simple_loss=0.2203, pruned_loss=0.03873, over 972243.16 frames.], batch size: 17, lr: 3.22e-04 2022-05-05 17:07:13,867 INFO [train.py:715] (0/8) Epoch 6, batch 31950, loss[loss=0.1583, simple_loss=0.2322, pruned_loss=0.04224, over 4756.00 frames.], tot_loss[loss=0.1488, simple_loss=0.2204, pruned_loss=0.03862, over 973836.01 
frames.], batch size: 19, lr: 3.22e-04 2022-05-05 17:07:52,485 INFO [train.py:715] (0/8) Epoch 6, batch 32000, loss[loss=0.1663, simple_loss=0.2415, pruned_loss=0.04552, over 4868.00 frames.], tot_loss[loss=0.1482, simple_loss=0.2197, pruned_loss=0.03838, over 973854.60 frames.], batch size: 20, lr: 3.22e-04 2022-05-05 17:08:31,938 INFO [train.py:715] (0/8) Epoch 6, batch 32050, loss[loss=0.1274, simple_loss=0.1939, pruned_loss=0.03045, over 4848.00 frames.], tot_loss[loss=0.1478, simple_loss=0.219, pruned_loss=0.03829, over 973433.58 frames.], batch size: 12, lr: 3.22e-04 2022-05-05 17:09:11,460 INFO [train.py:715] (0/8) Epoch 6, batch 32100, loss[loss=0.1402, simple_loss=0.2213, pruned_loss=0.02957, over 4770.00 frames.], tot_loss[loss=0.1479, simple_loss=0.2194, pruned_loss=0.03817, over 973461.00 frames.], batch size: 18, lr: 3.22e-04 2022-05-05 17:09:50,459 INFO [train.py:715] (0/8) Epoch 6, batch 32150, loss[loss=0.1429, simple_loss=0.2248, pruned_loss=0.03053, over 4757.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2181, pruned_loss=0.03776, over 972814.53 frames.], batch size: 19, lr: 3.22e-04 2022-05-05 17:10:28,953 INFO [train.py:715] (0/8) Epoch 6, batch 32200, loss[loss=0.1749, simple_loss=0.2383, pruned_loss=0.05579, over 4692.00 frames.], tot_loss[loss=0.1483, simple_loss=0.2197, pruned_loss=0.03848, over 972641.83 frames.], batch size: 15, lr: 3.22e-04 2022-05-05 17:11:08,025 INFO [train.py:715] (0/8) Epoch 6, batch 32250, loss[loss=0.1558, simple_loss=0.2361, pruned_loss=0.03779, over 4795.00 frames.], tot_loss[loss=0.1482, simple_loss=0.2198, pruned_loss=0.03831, over 971863.12 frames.], batch size: 24, lr: 3.22e-04 2022-05-05 17:11:46,849 INFO [train.py:715] (0/8) Epoch 6, batch 32300, loss[loss=0.1421, simple_loss=0.2173, pruned_loss=0.03345, over 4771.00 frames.], tot_loss[loss=0.1474, simple_loss=0.2193, pruned_loss=0.03774, over 971421.06 frames.], batch size: 18, lr: 3.22e-04 2022-05-05 17:12:26,138 INFO [train.py:715] (0/8) Epoch 6, batch 32350, loss[loss=0.1552, simple_loss=0.2407, pruned_loss=0.03489, over 4916.00 frames.], tot_loss[loss=0.1473, simple_loss=0.2193, pruned_loss=0.03766, over 971600.94 frames.], batch size: 18, lr: 3.22e-04 2022-05-05 17:13:04,500 INFO [train.py:715] (0/8) Epoch 6, batch 32400, loss[loss=0.1614, simple_loss=0.2324, pruned_loss=0.04524, over 4753.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2189, pruned_loss=0.03735, over 971491.55 frames.], batch size: 19, lr: 3.22e-04 2022-05-05 17:13:43,918 INFO [train.py:715] (0/8) Epoch 6, batch 32450, loss[loss=0.1237, simple_loss=0.2047, pruned_loss=0.02136, over 4995.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2186, pruned_loss=0.03745, over 971680.98 frames.], batch size: 14, lr: 3.22e-04 2022-05-05 17:14:23,266 INFO [train.py:715] (0/8) Epoch 6, batch 32500, loss[loss=0.1287, simple_loss=0.2065, pruned_loss=0.02547, over 4815.00 frames.], tot_loss[loss=0.146, simple_loss=0.2179, pruned_loss=0.03706, over 971698.89 frames.], batch size: 21, lr: 3.22e-04 2022-05-05 17:15:01,983 INFO [train.py:715] (0/8) Epoch 6, batch 32550, loss[loss=0.1335, simple_loss=0.2063, pruned_loss=0.03031, over 4796.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2179, pruned_loss=0.03691, over 972205.50 frames.], batch size: 25, lr: 3.22e-04 2022-05-05 17:15:40,775 INFO [train.py:715] (0/8) Epoch 6, batch 32600, loss[loss=0.1467, simple_loss=0.2094, pruned_loss=0.04203, over 4818.00 frames.], tot_loss[loss=0.146, simple_loss=0.218, pruned_loss=0.03704, over 972189.49 frames.], batch size: 27, lr: 
3.21e-04 2022-05-05 17:16:19,205 INFO [train.py:715] (0/8) Epoch 6, batch 32650, loss[loss=0.1338, simple_loss=0.2071, pruned_loss=0.03024, over 4968.00 frames.], tot_loss[loss=0.1462, simple_loss=0.218, pruned_loss=0.03723, over 971966.74 frames.], batch size: 24, lr: 3.21e-04 2022-05-05 17:16:57,836 INFO [train.py:715] (0/8) Epoch 6, batch 32700, loss[loss=0.1268, simple_loss=0.2029, pruned_loss=0.02537, over 4762.00 frames.], tot_loss[loss=0.1464, simple_loss=0.218, pruned_loss=0.03742, over 972489.28 frames.], batch size: 18, lr: 3.21e-04 2022-05-05 17:17:35,884 INFO [train.py:715] (0/8) Epoch 6, batch 32750, loss[loss=0.1684, simple_loss=0.2382, pruned_loss=0.04928, over 4901.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2175, pruned_loss=0.03749, over 972178.88 frames.], batch size: 22, lr: 3.21e-04 2022-05-05 17:18:14,601 INFO [train.py:715] (0/8) Epoch 6, batch 32800, loss[loss=0.1641, simple_loss=0.2254, pruned_loss=0.05141, over 4965.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2174, pruned_loss=0.03767, over 971588.00 frames.], batch size: 24, lr: 3.21e-04 2022-05-05 17:18:53,195 INFO [train.py:715] (0/8) Epoch 6, batch 32850, loss[loss=0.1295, simple_loss=0.1973, pruned_loss=0.03083, over 4827.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2171, pruned_loss=0.03754, over 971810.13 frames.], batch size: 15, lr: 3.21e-04 2022-05-05 17:19:31,603 INFO [train.py:715] (0/8) Epoch 6, batch 32900, loss[loss=0.1476, simple_loss=0.2257, pruned_loss=0.03472, over 4933.00 frames.], tot_loss[loss=0.1462, simple_loss=0.2175, pruned_loss=0.03742, over 972233.97 frames.], batch size: 21, lr: 3.21e-04 2022-05-05 17:20:09,695 INFO [train.py:715] (0/8) Epoch 6, batch 32950, loss[loss=0.159, simple_loss=0.2384, pruned_loss=0.03978, over 4855.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2182, pruned_loss=0.03756, over 973148.81 frames.], batch size: 20, lr: 3.21e-04 2022-05-05 17:20:48,503 INFO [train.py:715] (0/8) Epoch 6, batch 33000, loss[loss=0.1603, simple_loss=0.2286, pruned_loss=0.04595, over 4778.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2174, pruned_loss=0.03696, over 972895.02 frames.], batch size: 18, lr: 3.21e-04 2022-05-05 17:20:48,505 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 17:20:58,111 INFO [train.py:742] (0/8) Epoch 6, validation: loss=0.1087, simple_loss=0.1938, pruned_loss=0.01183, over 914524.00 frames. 
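Two kinds of checkpoint writes appear around this point in the log: a periodic one a few entries back (Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-240000.pt, named after the running batch count) and an end-of-epoch one a little further on (epoch-6.pt, written just before Epoch 7, batch 0 starts). The sketch below illustrates that naming scheme with a hypothetical helper built on plain torch.save; it is not the code in checkpoint.py.

```python
# Hypothetical helper showing the two checkpoint naming patterns seen in the
# log: batch-indexed (checkpoint-240000.pt) and epoch-indexed (epoch-6.pt).
# This is a stand-in for illustration, not the code in checkpoint.py.
from pathlib import Path
from typing import Optional

import torch

EXP_DIR = Path("pruned_transducer_stateless2/exp/v2")  # directory seen in the log


def save_checkpoint(model: torch.nn.Module, *, batch_idx: Optional[int] = None,
                    epoch: Optional[int] = None) -> Path:
    """Save the model state under a batch-indexed or epoch-indexed filename."""
    name = f"checkpoint-{batch_idx}.pt" if batch_idx is not None else f"epoch-{epoch}.pt"
    path = EXP_DIR / name
    EXP_DIR.mkdir(parents=True, exist_ok=True)
    torch.save({"model": model.state_dict()}, path)
    return path

# save_checkpoint(model, batch_idx=240000)  ->  .../checkpoint-240000.pt
# save_checkpoint(model, epoch=6)           ->  .../epoch-6.pt
```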
2022-05-05 17:21:36,676 INFO [train.py:715] (0/8) Epoch 6, batch 33050, loss[loss=0.1562, simple_loss=0.2312, pruned_loss=0.04056, over 4759.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2182, pruned_loss=0.03737, over 972845.05 frames.], batch size: 19, lr: 3.21e-04 2022-05-05 17:22:15,260 INFO [train.py:715] (0/8) Epoch 6, batch 33100, loss[loss=0.1499, simple_loss=0.2203, pruned_loss=0.0398, over 4867.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2179, pruned_loss=0.03745, over 972841.35 frames.], batch size: 32, lr: 3.21e-04 2022-05-05 17:22:53,007 INFO [train.py:715] (0/8) Epoch 6, batch 33150, loss[loss=0.1857, simple_loss=0.2614, pruned_loss=0.05501, over 4925.00 frames.], tot_loss[loss=0.1466, simple_loss=0.2181, pruned_loss=0.03758, over 974020.23 frames.], batch size: 39, lr: 3.21e-04 2022-05-05 17:23:31,895 INFO [train.py:715] (0/8) Epoch 6, batch 33200, loss[loss=0.134, simple_loss=0.2083, pruned_loss=0.02986, over 4879.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2187, pruned_loss=0.03812, over 973810.04 frames.], batch size: 38, lr: 3.21e-04 2022-05-05 17:24:10,784 INFO [train.py:715] (0/8) Epoch 6, batch 33250, loss[loss=0.1589, simple_loss=0.2343, pruned_loss=0.04175, over 4816.00 frames.], tot_loss[loss=0.1474, simple_loss=0.2187, pruned_loss=0.03802, over 973784.65 frames.], batch size: 15, lr: 3.21e-04 2022-05-05 17:24:49,861 INFO [train.py:715] (0/8) Epoch 6, batch 33300, loss[loss=0.1489, simple_loss=0.2166, pruned_loss=0.04062, over 4860.00 frames.], tot_loss[loss=0.1472, simple_loss=0.2186, pruned_loss=0.03786, over 973003.86 frames.], batch size: 20, lr: 3.21e-04 2022-05-05 17:25:28,466 INFO [train.py:715] (0/8) Epoch 6, batch 33350, loss[loss=0.1299, simple_loss=0.1993, pruned_loss=0.03026, over 4980.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2179, pruned_loss=0.03784, over 972659.72 frames.], batch size: 14, lr: 3.21e-04 2022-05-05 17:26:07,932 INFO [train.py:715] (0/8) Epoch 6, batch 33400, loss[loss=0.171, simple_loss=0.2423, pruned_loss=0.0498, over 4645.00 frames.], tot_loss[loss=0.1473, simple_loss=0.2187, pruned_loss=0.03793, over 972920.93 frames.], batch size: 13, lr: 3.21e-04 2022-05-05 17:26:47,023 INFO [train.py:715] (0/8) Epoch 6, batch 33450, loss[loss=0.1578, simple_loss=0.218, pruned_loss=0.04876, over 4844.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2185, pruned_loss=0.03769, over 971828.58 frames.], batch size: 13, lr: 3.21e-04 2022-05-05 17:27:25,288 INFO [train.py:715] (0/8) Epoch 6, batch 33500, loss[loss=0.1647, simple_loss=0.2281, pruned_loss=0.05064, over 4954.00 frames.], tot_loss[loss=0.1472, simple_loss=0.219, pruned_loss=0.03768, over 972411.60 frames.], batch size: 15, lr: 3.21e-04 2022-05-05 17:28:04,311 INFO [train.py:715] (0/8) Epoch 6, batch 33550, loss[loss=0.1283, simple_loss=0.206, pruned_loss=0.02528, over 4849.00 frames.], tot_loss[loss=0.1473, simple_loss=0.2192, pruned_loss=0.03776, over 972848.60 frames.], batch size: 20, lr: 3.21e-04 2022-05-05 17:28:43,720 INFO [train.py:715] (0/8) Epoch 6, batch 33600, loss[loss=0.1537, simple_loss=0.2305, pruned_loss=0.03841, over 4939.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2188, pruned_loss=0.03735, over 972536.55 frames.], batch size: 39, lr: 3.21e-04 2022-05-05 17:29:22,675 INFO [train.py:715] (0/8) Epoch 6, batch 33650, loss[loss=0.1389, simple_loss=0.224, pruned_loss=0.02689, over 4797.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2183, pruned_loss=0.03723, over 973372.20 frames.], batch size: 17, lr: 3.21e-04 2022-05-05 17:30:01,272 INFO 
[train.py:715] (0/8) Epoch 6, batch 33700, loss[loss=0.155, simple_loss=0.2239, pruned_loss=0.04302, over 4694.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2173, pruned_loss=0.03683, over 972364.95 frames.], batch size: 15, lr: 3.21e-04 2022-05-05 17:30:39,881 INFO [train.py:715] (0/8) Epoch 6, batch 33750, loss[loss=0.1475, simple_loss=0.2155, pruned_loss=0.03973, over 4822.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2179, pruned_loss=0.0368, over 971507.45 frames.], batch size: 15, lr: 3.21e-04 2022-05-05 17:31:19,205 INFO [train.py:715] (0/8) Epoch 6, batch 33800, loss[loss=0.1327, simple_loss=0.2047, pruned_loss=0.03034, over 4791.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2178, pruned_loss=0.03678, over 971265.12 frames.], batch size: 12, lr: 3.21e-04 2022-05-05 17:31:58,015 INFO [train.py:715] (0/8) Epoch 6, batch 33850, loss[loss=0.1603, simple_loss=0.2403, pruned_loss=0.04009, over 4971.00 frames.], tot_loss[loss=0.1466, simple_loss=0.2185, pruned_loss=0.03739, over 971219.32 frames.], batch size: 28, lr: 3.21e-04 2022-05-05 17:32:36,702 INFO [train.py:715] (0/8) Epoch 6, batch 33900, loss[loss=0.1361, simple_loss=0.2141, pruned_loss=0.02905, over 4869.00 frames.], tot_loss[loss=0.1466, simple_loss=0.2186, pruned_loss=0.03729, over 970833.48 frames.], batch size: 20, lr: 3.21e-04 2022-05-05 17:33:16,045 INFO [train.py:715] (0/8) Epoch 6, batch 33950, loss[loss=0.1459, simple_loss=0.2236, pruned_loss=0.03407, over 4979.00 frames.], tot_loss[loss=0.1469, simple_loss=0.219, pruned_loss=0.03741, over 972190.46 frames.], batch size: 14, lr: 3.21e-04 2022-05-05 17:33:55,026 INFO [train.py:715] (0/8) Epoch 6, batch 34000, loss[loss=0.1571, simple_loss=0.2187, pruned_loss=0.04775, over 4806.00 frames.], tot_loss[loss=0.1476, simple_loss=0.2195, pruned_loss=0.03785, over 972300.04 frames.], batch size: 13, lr: 3.21e-04 2022-05-05 17:34:33,698 INFO [train.py:715] (0/8) Epoch 6, batch 34050, loss[loss=0.1298, simple_loss=0.2034, pruned_loss=0.02809, over 4741.00 frames.], tot_loss[loss=0.1481, simple_loss=0.2197, pruned_loss=0.03825, over 971309.28 frames.], batch size: 16, lr: 3.21e-04 2022-05-05 17:35:12,978 INFO [train.py:715] (0/8) Epoch 6, batch 34100, loss[loss=0.1542, simple_loss=0.234, pruned_loss=0.03715, over 4780.00 frames.], tot_loss[loss=0.1477, simple_loss=0.2192, pruned_loss=0.03814, over 972063.69 frames.], batch size: 17, lr: 3.20e-04 2022-05-05 17:35:51,934 INFO [train.py:715] (0/8) Epoch 6, batch 34150, loss[loss=0.1376, simple_loss=0.2055, pruned_loss=0.03481, over 4904.00 frames.], tot_loss[loss=0.1479, simple_loss=0.2192, pruned_loss=0.03831, over 972848.46 frames.], batch size: 18, lr: 3.20e-04 2022-05-05 17:36:30,533 INFO [train.py:715] (0/8) Epoch 6, batch 34200, loss[loss=0.1418, simple_loss=0.214, pruned_loss=0.03476, over 4792.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2188, pruned_loss=0.03813, over 972833.39 frames.], batch size: 18, lr: 3.20e-04 2022-05-05 17:37:09,176 INFO [train.py:715] (0/8) Epoch 6, batch 34250, loss[loss=0.1106, simple_loss=0.1846, pruned_loss=0.01831, over 4775.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2182, pruned_loss=0.0378, over 972368.75 frames.], batch size: 12, lr: 3.20e-04 2022-05-05 17:37:48,385 INFO [train.py:715] (0/8) Epoch 6, batch 34300, loss[loss=0.1618, simple_loss=0.235, pruned_loss=0.04424, over 4775.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2182, pruned_loss=0.03775, over 972033.55 frames.], batch size: 19, lr: 3.20e-04 2022-05-05 17:38:26,979 INFO [train.py:715] (0/8) Epoch 6, batch 
34350, loss[loss=0.1532, simple_loss=0.2287, pruned_loss=0.03885, over 4800.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2183, pruned_loss=0.03774, over 972025.76 frames.], batch size: 14, lr: 3.20e-04 2022-05-05 17:39:05,616 INFO [train.py:715] (0/8) Epoch 6, batch 34400, loss[loss=0.1429, simple_loss=0.2213, pruned_loss=0.03223, over 4923.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2187, pruned_loss=0.03814, over 971641.55 frames.], batch size: 23, lr: 3.20e-04 2022-05-05 17:39:45,296 INFO [train.py:715] (0/8) Epoch 6, batch 34450, loss[loss=0.166, simple_loss=0.2314, pruned_loss=0.05035, over 4888.00 frames.], tot_loss[loss=0.1474, simple_loss=0.2187, pruned_loss=0.03807, over 971800.27 frames.], batch size: 19, lr: 3.20e-04 2022-05-05 17:40:24,038 INFO [train.py:715] (0/8) Epoch 6, batch 34500, loss[loss=0.1414, simple_loss=0.2183, pruned_loss=0.03227, over 4885.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2185, pruned_loss=0.03752, over 971682.63 frames.], batch size: 16, lr: 3.20e-04 2022-05-05 17:41:02,888 INFO [train.py:715] (0/8) Epoch 6, batch 34550, loss[loss=0.1228, simple_loss=0.1948, pruned_loss=0.02545, over 4940.00 frames.], tot_loss[loss=0.1474, simple_loss=0.219, pruned_loss=0.03794, over 972614.30 frames.], batch size: 21, lr: 3.20e-04 2022-05-05 17:41:41,803 INFO [train.py:715] (0/8) Epoch 6, batch 34600, loss[loss=0.1383, simple_loss=0.2026, pruned_loss=0.03705, over 4846.00 frames.], tot_loss[loss=0.1488, simple_loss=0.2199, pruned_loss=0.03882, over 973066.57 frames.], batch size: 32, lr: 3.20e-04 2022-05-05 17:42:20,613 INFO [train.py:715] (0/8) Epoch 6, batch 34650, loss[loss=0.1535, simple_loss=0.2286, pruned_loss=0.03922, over 4953.00 frames.], tot_loss[loss=0.1492, simple_loss=0.2202, pruned_loss=0.03904, over 971829.10 frames.], batch size: 24, lr: 3.20e-04 2022-05-05 17:42:59,312 INFO [train.py:715] (0/8) Epoch 6, batch 34700, loss[loss=0.1191, simple_loss=0.1965, pruned_loss=0.02085, over 4825.00 frames.], tot_loss[loss=0.1478, simple_loss=0.2191, pruned_loss=0.03828, over 971589.99 frames.], batch size: 13, lr: 3.20e-04 2022-05-05 17:43:37,145 INFO [train.py:715] (0/8) Epoch 6, batch 34750, loss[loss=0.1524, simple_loss=0.2286, pruned_loss=0.03807, over 4756.00 frames.], tot_loss[loss=0.1486, simple_loss=0.2197, pruned_loss=0.0388, over 971093.54 frames.], batch size: 16, lr: 3.20e-04 2022-05-05 17:44:13,982 INFO [train.py:715] (0/8) Epoch 6, batch 34800, loss[loss=0.1496, simple_loss=0.2134, pruned_loss=0.04292, over 4756.00 frames.], tot_loss[loss=0.147, simple_loss=0.2179, pruned_loss=0.038, over 970189.40 frames.], batch size: 12, lr: 3.20e-04 2022-05-05 17:44:23,371 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/epoch-6.pt 2022-05-05 17:45:04,003 INFO [train.py:715] (0/8) Epoch 7, batch 0, loss[loss=0.1516, simple_loss=0.2244, pruned_loss=0.03944, over 4935.00 frames.], tot_loss[loss=0.1516, simple_loss=0.2244, pruned_loss=0.03944, over 4935.00 frames.], batch size: 23, lr: 3.03e-04 2022-05-05 17:45:42,572 INFO [train.py:715] (0/8) Epoch 7, batch 50, loss[loss=0.1593, simple_loss=0.2264, pruned_loss=0.0461, over 4938.00 frames.], tot_loss[loss=0.1482, simple_loss=0.2187, pruned_loss=0.03886, over 219834.46 frames.], batch size: 39, lr: 3.03e-04 2022-05-05 17:46:21,352 INFO [train.py:715] (0/8) Epoch 7, batch 100, loss[loss=0.1557, simple_loss=0.2287, pruned_loss=0.04128, over 4965.00 frames.], tot_loss[loss=0.1498, simple_loss=0.2202, pruned_loss=0.03966, over 387219.94 frames.], batch size: 35, lr: 
3.03e-04 2022-05-05 17:47:00,255 INFO [train.py:715] (0/8) Epoch 7, batch 150, loss[loss=0.1484, simple_loss=0.211, pruned_loss=0.04288, over 4852.00 frames.], tot_loss[loss=0.1498, simple_loss=0.2206, pruned_loss=0.03949, over 517445.38 frames.], batch size: 13, lr: 3.03e-04 2022-05-05 17:47:39,937 INFO [train.py:715] (0/8) Epoch 7, batch 200, loss[loss=0.1685, simple_loss=0.2338, pruned_loss=0.05164, over 4900.00 frames.], tot_loss[loss=0.1488, simple_loss=0.2195, pruned_loss=0.039, over 617742.30 frames.], batch size: 17, lr: 3.03e-04 2022-05-05 17:48:18,720 INFO [train.py:715] (0/8) Epoch 7, batch 250, loss[loss=0.1199, simple_loss=0.1964, pruned_loss=0.02168, over 4818.00 frames.], tot_loss[loss=0.1487, simple_loss=0.2193, pruned_loss=0.03907, over 696717.77 frames.], batch size: 27, lr: 3.03e-04 2022-05-05 17:48:58,167 INFO [train.py:715] (0/8) Epoch 7, batch 300, loss[loss=0.1836, simple_loss=0.2523, pruned_loss=0.05743, over 4779.00 frames.], tot_loss[loss=0.1489, simple_loss=0.2199, pruned_loss=0.03897, over 757652.21 frames.], batch size: 17, lr: 3.02e-04 2022-05-05 17:49:36,845 INFO [train.py:715] (0/8) Epoch 7, batch 350, loss[loss=0.1923, simple_loss=0.2649, pruned_loss=0.05986, over 4975.00 frames.], tot_loss[loss=0.1481, simple_loss=0.2191, pruned_loss=0.03852, over 805348.19 frames.], batch size: 15, lr: 3.02e-04 2022-05-05 17:50:16,221 INFO [train.py:715] (0/8) Epoch 7, batch 400, loss[loss=0.1146, simple_loss=0.1939, pruned_loss=0.01762, over 4804.00 frames.], tot_loss[loss=0.1478, simple_loss=0.2189, pruned_loss=0.03834, over 842005.56 frames.], batch size: 12, lr: 3.02e-04 2022-05-05 17:50:54,884 INFO [train.py:715] (0/8) Epoch 7, batch 450, loss[loss=0.1922, simple_loss=0.2407, pruned_loss=0.07183, over 4904.00 frames.], tot_loss[loss=0.1477, simple_loss=0.219, pruned_loss=0.0382, over 870917.25 frames.], batch size: 17, lr: 3.02e-04 2022-05-05 17:51:33,736 INFO [train.py:715] (0/8) Epoch 7, batch 500, loss[loss=0.1296, simple_loss=0.2094, pruned_loss=0.02493, over 4974.00 frames.], tot_loss[loss=0.1476, simple_loss=0.2189, pruned_loss=0.03812, over 893509.47 frames.], batch size: 28, lr: 3.02e-04 2022-05-05 17:52:12,469 INFO [train.py:715] (0/8) Epoch 7, batch 550, loss[loss=0.1977, simple_loss=0.2546, pruned_loss=0.0704, over 4815.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2178, pruned_loss=0.03757, over 910626.39 frames.], batch size: 27, lr: 3.02e-04 2022-05-05 17:52:51,634 INFO [train.py:715] (0/8) Epoch 7, batch 600, loss[loss=0.1461, simple_loss=0.2176, pruned_loss=0.03723, over 4898.00 frames.], tot_loss[loss=0.1477, simple_loss=0.219, pruned_loss=0.03826, over 924740.37 frames.], batch size: 17, lr: 3.02e-04 2022-05-05 17:53:29,945 INFO [train.py:715] (0/8) Epoch 7, batch 650, loss[loss=0.1293, simple_loss=0.1993, pruned_loss=0.02967, over 4953.00 frames.], tot_loss[loss=0.1481, simple_loss=0.2194, pruned_loss=0.03834, over 935143.66 frames.], batch size: 29, lr: 3.02e-04 2022-05-05 17:54:08,326 INFO [train.py:715] (0/8) Epoch 7, batch 700, loss[loss=0.1796, simple_loss=0.2498, pruned_loss=0.05473, over 4977.00 frames.], tot_loss[loss=0.1477, simple_loss=0.2191, pruned_loss=0.03812, over 943632.38 frames.], batch size: 40, lr: 3.02e-04 2022-05-05 17:54:47,590 INFO [train.py:715] (0/8) Epoch 7, batch 750, loss[loss=0.1494, simple_loss=0.2174, pruned_loss=0.0407, over 4841.00 frames.], tot_loss[loss=0.1464, simple_loss=0.218, pruned_loss=0.03738, over 949701.83 frames.], batch size: 30, lr: 3.02e-04 2022-05-05 17:55:26,295 INFO [train.py:715] (0/8) Epoch 
7, batch 800, loss[loss=0.1539, simple_loss=0.228, pruned_loss=0.03986, over 4977.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2173, pruned_loss=0.03719, over 954247.60 frames.], batch size: 24, lr: 3.02e-04 2022-05-05 17:56:04,982 INFO [train.py:715] (0/8) Epoch 7, batch 850, loss[loss=0.1647, simple_loss=0.2252, pruned_loss=0.05205, over 4789.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2172, pruned_loss=0.03723, over 956774.38 frames.], batch size: 21, lr: 3.02e-04 2022-05-05 17:56:44,238 INFO [train.py:715] (0/8) Epoch 7, batch 900, loss[loss=0.1452, simple_loss=0.2145, pruned_loss=0.03795, over 4984.00 frames.], tot_loss[loss=0.147, simple_loss=0.2182, pruned_loss=0.03786, over 959548.40 frames.], batch size: 28, lr: 3.02e-04 2022-05-05 17:57:23,219 INFO [train.py:715] (0/8) Epoch 7, batch 950, loss[loss=0.1365, simple_loss=0.2128, pruned_loss=0.0301, over 4958.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2178, pruned_loss=0.03783, over 962777.12 frames.], batch size: 35, lr: 3.02e-04 2022-05-05 17:58:01,719 INFO [train.py:715] (0/8) Epoch 7, batch 1000, loss[loss=0.1458, simple_loss=0.213, pruned_loss=0.03932, over 4821.00 frames.], tot_loss[loss=0.1473, simple_loss=0.2183, pruned_loss=0.03809, over 963910.49 frames.], batch size: 27, lr: 3.02e-04 2022-05-05 17:58:40,403 INFO [train.py:715] (0/8) Epoch 7, batch 1050, loss[loss=0.144, simple_loss=0.2177, pruned_loss=0.03512, over 4990.00 frames.], tot_loss[loss=0.1474, simple_loss=0.2186, pruned_loss=0.03807, over 966210.84 frames.], batch size: 15, lr: 3.02e-04 2022-05-05 17:59:19,622 INFO [train.py:715] (0/8) Epoch 7, batch 1100, loss[loss=0.1277, simple_loss=0.2039, pruned_loss=0.02576, over 4858.00 frames.], tot_loss[loss=0.1471, simple_loss=0.2184, pruned_loss=0.03785, over 967499.52 frames.], batch size: 20, lr: 3.02e-04 2022-05-05 17:59:57,780 INFO [train.py:715] (0/8) Epoch 7, batch 1150, loss[loss=0.1577, simple_loss=0.2292, pruned_loss=0.04312, over 4781.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2178, pruned_loss=0.03752, over 967799.70 frames.], batch size: 18, lr: 3.02e-04 2022-05-05 18:00:36,961 INFO [train.py:715] (0/8) Epoch 7, batch 1200, loss[loss=0.1459, simple_loss=0.2074, pruned_loss=0.04219, over 4918.00 frames.], tot_loss[loss=0.1466, simple_loss=0.2181, pruned_loss=0.03753, over 968695.21 frames.], batch size: 18, lr: 3.02e-04 2022-05-05 18:01:16,048 INFO [train.py:715] (0/8) Epoch 7, batch 1250, loss[loss=0.1202, simple_loss=0.1914, pruned_loss=0.02453, over 4820.00 frames.], tot_loss[loss=0.1462, simple_loss=0.2177, pruned_loss=0.0374, over 969025.58 frames.], batch size: 27, lr: 3.02e-04 2022-05-05 18:01:55,177 INFO [train.py:715] (0/8) Epoch 7, batch 1300, loss[loss=0.1461, simple_loss=0.224, pruned_loss=0.03416, over 4812.00 frames.], tot_loss[loss=0.1466, simple_loss=0.2181, pruned_loss=0.03759, over 970231.80 frames.], batch size: 25, lr: 3.02e-04 2022-05-05 18:02:33,761 INFO [train.py:715] (0/8) Epoch 7, batch 1350, loss[loss=0.15, simple_loss=0.2254, pruned_loss=0.03726, over 4833.00 frames.], tot_loss[loss=0.1478, simple_loss=0.2188, pruned_loss=0.03838, over 970258.36 frames.], batch size: 20, lr: 3.02e-04 2022-05-05 18:03:12,551 INFO [train.py:715] (0/8) Epoch 7, batch 1400, loss[loss=0.1206, simple_loss=0.2013, pruned_loss=0.01996, over 4769.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2189, pruned_loss=0.03804, over 971265.02 frames.], batch size: 19, lr: 3.02e-04 2022-05-05 18:03:51,639 INFO [train.py:715] (0/8) Epoch 7, batch 1450, loss[loss=0.1553, simple_loss=0.2121, 
pruned_loss=0.04928, over 4923.00 frames.], tot_loss[loss=0.1482, simple_loss=0.2192, pruned_loss=0.03862, over 970680.52 frames.], batch size: 35, lr: 3.02e-04 2022-05-05 18:04:29,768 INFO [train.py:715] (0/8) Epoch 7, batch 1500, loss[loss=0.2073, simple_loss=0.2711, pruned_loss=0.07171, over 4732.00 frames.], tot_loss[loss=0.1491, simple_loss=0.2199, pruned_loss=0.03914, over 970715.61 frames.], batch size: 16, lr: 3.02e-04 2022-05-05 18:05:08,978 INFO [train.py:715] (0/8) Epoch 7, batch 1550, loss[loss=0.1285, simple_loss=0.1975, pruned_loss=0.02973, over 4810.00 frames.], tot_loss[loss=0.1477, simple_loss=0.219, pruned_loss=0.03813, over 970699.89 frames.], batch size: 13, lr: 3.02e-04 2022-05-05 18:05:47,785 INFO [train.py:715] (0/8) Epoch 7, batch 1600, loss[loss=0.1886, simple_loss=0.2449, pruned_loss=0.06613, over 4644.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2187, pruned_loss=0.03815, over 970063.30 frames.], batch size: 13, lr: 3.02e-04 2022-05-05 18:06:26,678 INFO [train.py:715] (0/8) Epoch 7, batch 1650, loss[loss=0.1185, simple_loss=0.196, pruned_loss=0.02048, over 4947.00 frames.], tot_loss[loss=0.1472, simple_loss=0.2187, pruned_loss=0.03786, over 971047.23 frames.], batch size: 21, lr: 3.02e-04 2022-05-05 18:07:05,254 INFO [train.py:715] (0/8) Epoch 7, batch 1700, loss[loss=0.1564, simple_loss=0.223, pruned_loss=0.04495, over 4871.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2187, pruned_loss=0.03812, over 971873.52 frames.], batch size: 19, lr: 3.02e-04 2022-05-05 18:07:44,160 INFO [train.py:715] (0/8) Epoch 7, batch 1750, loss[loss=0.1271, simple_loss=0.2052, pruned_loss=0.02444, over 4821.00 frames.], tot_loss[loss=0.147, simple_loss=0.2184, pruned_loss=0.03777, over 972375.07 frames.], batch size: 21, lr: 3.02e-04 2022-05-05 18:08:24,135 INFO [train.py:715] (0/8) Epoch 7, batch 1800, loss[loss=0.1464, simple_loss=0.2207, pruned_loss=0.03611, over 4838.00 frames.], tot_loss[loss=0.1473, simple_loss=0.2192, pruned_loss=0.03765, over 972136.16 frames.], batch size: 15, lr: 3.02e-04 2022-05-05 18:09:03,069 INFO [train.py:715] (0/8) Epoch 7, batch 1850, loss[loss=0.1401, simple_loss=0.2212, pruned_loss=0.02952, over 4920.00 frames.], tot_loss[loss=0.1476, simple_loss=0.2197, pruned_loss=0.03769, over 971743.69 frames.], batch size: 23, lr: 3.02e-04 2022-05-05 18:09:41,923 INFO [train.py:715] (0/8) Epoch 7, batch 1900, loss[loss=0.1196, simple_loss=0.184, pruned_loss=0.02759, over 4754.00 frames.], tot_loss[loss=0.1478, simple_loss=0.2197, pruned_loss=0.03796, over 972246.44 frames.], batch size: 19, lr: 3.01e-04 2022-05-05 18:10:20,110 INFO [train.py:715] (0/8) Epoch 7, batch 1950, loss[loss=0.1611, simple_loss=0.2229, pruned_loss=0.04963, over 4743.00 frames.], tot_loss[loss=0.1471, simple_loss=0.2188, pruned_loss=0.03765, over 972175.38 frames.], batch size: 16, lr: 3.01e-04 2022-05-05 18:10:59,287 INFO [train.py:715] (0/8) Epoch 7, batch 2000, loss[loss=0.1526, simple_loss=0.2184, pruned_loss=0.04344, over 4958.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2183, pruned_loss=0.03758, over 972435.81 frames.], batch size: 35, lr: 3.01e-04 2022-05-05 18:11:37,484 INFO [train.py:715] (0/8) Epoch 7, batch 2050, loss[loss=0.1614, simple_loss=0.2446, pruned_loss=0.03909, over 4800.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2187, pruned_loss=0.03737, over 972329.75 frames.], batch size: 21, lr: 3.01e-04 2022-05-05 18:12:16,135 INFO [train.py:715] (0/8) Epoch 7, batch 2100, loss[loss=0.1577, simple_loss=0.2322, pruned_loss=0.04158, over 4975.00 frames.], 
tot_loss[loss=0.1468, simple_loss=0.2184, pruned_loss=0.03764, over 972385.82 frames.], batch size: 28, lr: 3.01e-04 2022-05-05 18:12:54,589 INFO [train.py:715] (0/8) Epoch 7, batch 2150, loss[loss=0.1265, simple_loss=0.1966, pruned_loss=0.0282, over 4800.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2181, pruned_loss=0.03731, over 972406.44 frames.], batch size: 21, lr: 3.01e-04 2022-05-05 18:13:32,796 INFO [train.py:715] (0/8) Epoch 7, batch 2200, loss[loss=0.1244, simple_loss=0.1994, pruned_loss=0.02464, over 4959.00 frames.], tot_loss[loss=0.1472, simple_loss=0.2189, pruned_loss=0.03772, over 973017.27 frames.], batch size: 15, lr: 3.01e-04 2022-05-05 18:14:11,046 INFO [train.py:715] (0/8) Epoch 7, batch 2250, loss[loss=0.1319, simple_loss=0.2103, pruned_loss=0.02677, over 4860.00 frames.], tot_loss[loss=0.1469, simple_loss=0.219, pruned_loss=0.03735, over 973019.96 frames.], batch size: 20, lr: 3.01e-04 2022-05-05 18:14:50,045 INFO [train.py:715] (0/8) Epoch 7, batch 2300, loss[loss=0.1429, simple_loss=0.2205, pruned_loss=0.03263, over 4777.00 frames.], tot_loss[loss=0.1472, simple_loss=0.2193, pruned_loss=0.03761, over 973095.47 frames.], batch size: 18, lr: 3.01e-04 2022-05-05 18:15:29,526 INFO [train.py:715] (0/8) Epoch 7, batch 2350, loss[loss=0.1314, simple_loss=0.2107, pruned_loss=0.02604, over 4839.00 frames.], tot_loss[loss=0.1477, simple_loss=0.2198, pruned_loss=0.03782, over 973085.39 frames.], batch size: 15, lr: 3.01e-04 2022-05-05 18:16:08,315 INFO [train.py:715] (0/8) Epoch 7, batch 2400, loss[loss=0.1786, simple_loss=0.2445, pruned_loss=0.05628, over 4744.00 frames.], tot_loss[loss=0.1482, simple_loss=0.2201, pruned_loss=0.03817, over 972979.36 frames.], batch size: 16, lr: 3.01e-04 2022-05-05 18:16:46,785 INFO [train.py:715] (0/8) Epoch 7, batch 2450, loss[loss=0.1205, simple_loss=0.2013, pruned_loss=0.01982, over 4973.00 frames.], tot_loss[loss=0.1483, simple_loss=0.2203, pruned_loss=0.03815, over 972857.14 frames.], batch size: 28, lr: 3.01e-04 2022-05-05 18:17:25,559 INFO [train.py:715] (0/8) Epoch 7, batch 2500, loss[loss=0.134, simple_loss=0.2072, pruned_loss=0.0304, over 4835.00 frames.], tot_loss[loss=0.1468, simple_loss=0.219, pruned_loss=0.03729, over 972756.97 frames.], batch size: 26, lr: 3.01e-04 2022-05-05 18:18:03,859 INFO [train.py:715] (0/8) Epoch 7, batch 2550, loss[loss=0.1193, simple_loss=0.1997, pruned_loss=0.01945, over 4845.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2173, pruned_loss=0.03642, over 972423.20 frames.], batch size: 26, lr: 3.01e-04 2022-05-05 18:18:42,382 INFO [train.py:715] (0/8) Epoch 7, batch 2600, loss[loss=0.1753, simple_loss=0.2415, pruned_loss=0.05457, over 4966.00 frames.], tot_loss[loss=0.1456, simple_loss=0.218, pruned_loss=0.0366, over 972461.51 frames.], batch size: 14, lr: 3.01e-04 2022-05-05 18:19:21,116 INFO [train.py:715] (0/8) Epoch 7, batch 2650, loss[loss=0.1457, simple_loss=0.2095, pruned_loss=0.04095, over 4817.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2174, pruned_loss=0.03641, over 972853.19 frames.], batch size: 12, lr: 3.01e-04 2022-05-05 18:19:59,707 INFO [train.py:715] (0/8) Epoch 7, batch 2700, loss[loss=0.116, simple_loss=0.1871, pruned_loss=0.02243, over 4841.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2171, pruned_loss=0.03622, over 972758.53 frames.], batch size: 13, lr: 3.01e-04 2022-05-05 18:20:37,582 INFO [train.py:715] (0/8) Epoch 7, batch 2750, loss[loss=0.1203, simple_loss=0.2011, pruned_loss=0.01972, over 4839.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2188, 
pruned_loss=0.0374, over 973647.55 frames.], batch size: 26, lr: 3.01e-04 2022-05-05 18:21:16,371 INFO [train.py:715] (0/8) Epoch 7, batch 2800, loss[loss=0.1402, simple_loss=0.2087, pruned_loss=0.03592, over 4788.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2187, pruned_loss=0.03742, over 973437.09 frames.], batch size: 14, lr: 3.01e-04 2022-05-05 18:21:55,731 INFO [train.py:715] (0/8) Epoch 7, batch 2850, loss[loss=0.1321, simple_loss=0.1995, pruned_loss=0.03236, over 4793.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2182, pruned_loss=0.0372, over 973802.91 frames.], batch size: 24, lr: 3.01e-04 2022-05-05 18:22:35,308 INFO [train.py:715] (0/8) Epoch 7, batch 2900, loss[loss=0.1599, simple_loss=0.2322, pruned_loss=0.04381, over 4909.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2187, pruned_loss=0.03712, over 972875.67 frames.], batch size: 17, lr: 3.01e-04 2022-05-05 18:23:14,209 INFO [train.py:715] (0/8) Epoch 7, batch 2950, loss[loss=0.1396, simple_loss=0.2078, pruned_loss=0.03575, over 4798.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2177, pruned_loss=0.03663, over 972009.67 frames.], batch size: 25, lr: 3.01e-04 2022-05-05 18:23:53,375 INFO [train.py:715] (0/8) Epoch 7, batch 3000, loss[loss=0.1542, simple_loss=0.2171, pruned_loss=0.04568, over 4882.00 frames.], tot_loss[loss=0.146, simple_loss=0.218, pruned_loss=0.03699, over 972365.46 frames.], batch size: 16, lr: 3.01e-04 2022-05-05 18:23:53,376 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 18:24:04,767 INFO [train.py:742] (0/8) Epoch 7, validation: loss=0.1084, simple_loss=0.1933, pruned_loss=0.01171, over 914524.00 frames. 2022-05-05 18:24:44,248 INFO [train.py:715] (0/8) Epoch 7, batch 3050, loss[loss=0.1299, simple_loss=0.2011, pruned_loss=0.0294, over 4738.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2186, pruned_loss=0.03739, over 973346.10 frames.], batch size: 16, lr: 3.01e-04 2022-05-05 18:25:23,057 INFO [train.py:715] (0/8) Epoch 7, batch 3100, loss[loss=0.1239, simple_loss=0.1989, pruned_loss=0.02448, over 4860.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2186, pruned_loss=0.03748, over 972970.28 frames.], batch size: 30, lr: 3.01e-04 2022-05-05 18:26:01,758 INFO [train.py:715] (0/8) Epoch 7, batch 3150, loss[loss=0.139, simple_loss=0.21, pruned_loss=0.03398, over 4845.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2185, pruned_loss=0.03761, over 972698.30 frames.], batch size: 20, lr: 3.01e-04 2022-05-05 18:26:39,659 INFO [train.py:715] (0/8) Epoch 7, batch 3200, loss[loss=0.1654, simple_loss=0.2197, pruned_loss=0.05556, over 4858.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2177, pruned_loss=0.03728, over 972442.33 frames.], batch size: 30, lr: 3.01e-04 2022-05-05 18:27:17,883 INFO [train.py:715] (0/8) Epoch 7, batch 3250, loss[loss=0.1289, simple_loss=0.1986, pruned_loss=0.02959, over 4798.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2181, pruned_loss=0.03772, over 972390.84 frames.], batch size: 24, lr: 3.01e-04 2022-05-05 18:27:56,433 INFO [train.py:715] (0/8) Epoch 7, batch 3300, loss[loss=0.1742, simple_loss=0.256, pruned_loss=0.0462, over 4648.00 frames.], tot_loss[loss=0.1472, simple_loss=0.2186, pruned_loss=0.03788, over 971279.86 frames.], batch size: 13, lr: 3.01e-04 2022-05-05 18:28:35,028 INFO [train.py:715] (0/8) Epoch 7, batch 3350, loss[loss=0.1115, simple_loss=0.1882, pruned_loss=0.01733, over 4797.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2174, pruned_loss=0.03737, over 971355.91 frames.], batch size: 24, lr: 3.01e-04 2022-05-05 18:29:13,820 
INFO [train.py:715] (0/8) Epoch 7, batch 3400, loss[loss=0.1194, simple_loss=0.1913, pruned_loss=0.02373, over 4945.00 frames.], tot_loss[loss=0.1456, simple_loss=0.217, pruned_loss=0.03704, over 972053.76 frames.], batch size: 21, lr: 3.01e-04 2022-05-05 18:29:52,247 INFO [train.py:715] (0/8) Epoch 7, batch 3450, loss[loss=0.1347, simple_loss=0.2079, pruned_loss=0.03076, over 4797.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2175, pruned_loss=0.03672, over 972397.77 frames.], batch size: 17, lr: 3.01e-04 2022-05-05 18:30:31,302 INFO [train.py:715] (0/8) Epoch 7, batch 3500, loss[loss=0.1195, simple_loss=0.1901, pruned_loss=0.02446, over 4963.00 frames.], tot_loss[loss=0.1462, simple_loss=0.2177, pruned_loss=0.03732, over 972460.12 frames.], batch size: 14, lr: 3.01e-04 2022-05-05 18:31:09,921 INFO [train.py:715] (0/8) Epoch 7, batch 3550, loss[loss=0.1691, simple_loss=0.2324, pruned_loss=0.0529, over 4776.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2179, pruned_loss=0.03746, over 971778.10 frames.], batch size: 18, lr: 3.00e-04 2022-05-05 18:31:48,692 INFO [train.py:715] (0/8) Epoch 7, batch 3600, loss[loss=0.1549, simple_loss=0.2237, pruned_loss=0.04308, over 4928.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2178, pruned_loss=0.03748, over 971970.55 frames.], batch size: 18, lr: 3.00e-04 2022-05-05 18:32:27,422 INFO [train.py:715] (0/8) Epoch 7, batch 3650, loss[loss=0.1639, simple_loss=0.2276, pruned_loss=0.05008, over 4819.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2174, pruned_loss=0.03733, over 972416.40 frames.], batch size: 25, lr: 3.00e-04 2022-05-05 18:33:06,460 INFO [train.py:715] (0/8) Epoch 7, batch 3700, loss[loss=0.1333, simple_loss=0.2077, pruned_loss=0.02947, over 4829.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2175, pruned_loss=0.03738, over 971746.31 frames.], batch size: 30, lr: 3.00e-04 2022-05-05 18:33:45,230 INFO [train.py:715] (0/8) Epoch 7, batch 3750, loss[loss=0.1539, simple_loss=0.2242, pruned_loss=0.04175, over 4966.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2169, pruned_loss=0.03694, over 972817.87 frames.], batch size: 24, lr: 3.00e-04 2022-05-05 18:34:23,488 INFO [train.py:715] (0/8) Epoch 7, batch 3800, loss[loss=0.1561, simple_loss=0.2241, pruned_loss=0.04405, over 4840.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2169, pruned_loss=0.03671, over 972193.23 frames.], batch size: 15, lr: 3.00e-04 2022-05-05 18:35:01,652 INFO [train.py:715] (0/8) Epoch 7, batch 3850, loss[loss=0.1261, simple_loss=0.1867, pruned_loss=0.03269, over 4870.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2164, pruned_loss=0.03655, over 972278.06 frames.], batch size: 16, lr: 3.00e-04 2022-05-05 18:35:39,924 INFO [train.py:715] (0/8) Epoch 7, batch 3900, loss[loss=0.1172, simple_loss=0.1943, pruned_loss=0.02005, over 4965.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2171, pruned_loss=0.03673, over 972153.81 frames.], batch size: 24, lr: 3.00e-04 2022-05-05 18:36:18,410 INFO [train.py:715] (0/8) Epoch 7, batch 3950, loss[loss=0.1514, simple_loss=0.2158, pruned_loss=0.04344, over 4746.00 frames.], tot_loss[loss=0.1452, simple_loss=0.217, pruned_loss=0.0367, over 972814.33 frames.], batch size: 14, lr: 3.00e-04 2022-05-05 18:36:57,037 INFO [train.py:715] (0/8) Epoch 7, batch 4000, loss[loss=0.1598, simple_loss=0.2317, pruned_loss=0.04398, over 4786.00 frames.], tot_loss[loss=0.147, simple_loss=0.2185, pruned_loss=0.03776, over 972701.39 frames.], batch size: 14, lr: 3.00e-04 2022-05-05 18:37:35,132 INFO [train.py:715] (0/8) Epoch 7, batch 4050, 
loss[loss=0.137, simple_loss=0.2106, pruned_loss=0.03169, over 4788.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2191, pruned_loss=0.03797, over 972771.79 frames.], batch size: 17, lr: 3.00e-04 2022-05-05 18:38:14,043 INFO [train.py:715] (0/8) Epoch 7, batch 4100, loss[loss=0.1221, simple_loss=0.1911, pruned_loss=0.02658, over 4837.00 frames.], tot_loss[loss=0.1466, simple_loss=0.2182, pruned_loss=0.03748, over 972494.16 frames.], batch size: 13, lr: 3.00e-04 2022-05-05 18:38:52,565 INFO [train.py:715] (0/8) Epoch 7, batch 4150, loss[loss=0.1387, simple_loss=0.2135, pruned_loss=0.03193, over 4946.00 frames.], tot_loss[loss=0.1474, simple_loss=0.219, pruned_loss=0.03784, over 972756.54 frames.], batch size: 23, lr: 3.00e-04 2022-05-05 18:39:31,256 INFO [train.py:715] (0/8) Epoch 7, batch 4200, loss[loss=0.1429, simple_loss=0.2234, pruned_loss=0.03125, over 4964.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2185, pruned_loss=0.03714, over 972235.60 frames.], batch size: 14, lr: 3.00e-04 2022-05-05 18:40:09,109 INFO [train.py:715] (0/8) Epoch 7, batch 4250, loss[loss=0.1532, simple_loss=0.2251, pruned_loss=0.04062, over 4966.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2185, pruned_loss=0.03704, over 972432.63 frames.], batch size: 35, lr: 3.00e-04 2022-05-05 18:40:47,948 INFO [train.py:715] (0/8) Epoch 7, batch 4300, loss[loss=0.1596, simple_loss=0.232, pruned_loss=0.04353, over 4954.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2183, pruned_loss=0.03714, over 972306.34 frames.], batch size: 21, lr: 3.00e-04 2022-05-05 18:41:13,244 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-248000.pt 2022-05-05 18:41:28,763 INFO [train.py:715] (0/8) Epoch 7, batch 4350, loss[loss=0.1442, simple_loss=0.2162, pruned_loss=0.03615, over 4991.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2184, pruned_loss=0.03725, over 972857.62 frames.], batch size: 16, lr: 3.00e-04 2022-05-05 18:42:07,268 INFO [train.py:715] (0/8) Epoch 7, batch 4400, loss[loss=0.1232, simple_loss=0.1941, pruned_loss=0.02613, over 4893.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2188, pruned_loss=0.03753, over 972758.57 frames.], batch size: 19, lr: 3.00e-04 2022-05-05 18:42:46,326 INFO [train.py:715] (0/8) Epoch 7, batch 4450, loss[loss=0.1686, simple_loss=0.2346, pruned_loss=0.05125, over 4741.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2185, pruned_loss=0.03727, over 972894.45 frames.], batch size: 16, lr: 3.00e-04 2022-05-05 18:43:25,191 INFO [train.py:715] (0/8) Epoch 7, batch 4500, loss[loss=0.1469, simple_loss=0.2179, pruned_loss=0.03792, over 4787.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2177, pruned_loss=0.03703, over 971812.62 frames.], batch size: 18, lr: 3.00e-04 2022-05-05 18:44:03,951 INFO [train.py:715] (0/8) Epoch 7, batch 4550, loss[loss=0.131, simple_loss=0.2109, pruned_loss=0.02553, over 4940.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2178, pruned_loss=0.03683, over 971947.58 frames.], batch size: 21, lr: 3.00e-04 2022-05-05 18:44:42,552 INFO [train.py:715] (0/8) Epoch 7, batch 4600, loss[loss=0.163, simple_loss=0.2419, pruned_loss=0.04206, over 4776.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2179, pruned_loss=0.03684, over 971689.01 frames.], batch size: 18, lr: 3.00e-04 2022-05-05 18:45:21,323 INFO [train.py:715] (0/8) Epoch 7, batch 4650, loss[loss=0.1653, simple_loss=0.232, pruned_loss=0.04929, over 4976.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2173, pruned_loss=0.03661, over 972200.94 frames.], batch size: 24, 
lr: 3.00e-04 2022-05-05 18:45:59,785 INFO [train.py:715] (0/8) Epoch 7, batch 4700, loss[loss=0.1601, simple_loss=0.2254, pruned_loss=0.04741, over 4836.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2168, pruned_loss=0.03622, over 972325.83 frames.], batch size: 13, lr: 3.00e-04 2022-05-05 18:46:37,970 INFO [train.py:715] (0/8) Epoch 7, batch 4750, loss[loss=0.166, simple_loss=0.2366, pruned_loss=0.04772, over 4877.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2169, pruned_loss=0.03632, over 972356.06 frames.], batch size: 22, lr: 3.00e-04 2022-05-05 18:47:17,152 INFO [train.py:715] (0/8) Epoch 7, batch 4800, loss[loss=0.1535, simple_loss=0.2153, pruned_loss=0.04588, over 4819.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2174, pruned_loss=0.03679, over 972982.56 frames.], batch size: 27, lr: 3.00e-04 2022-05-05 18:47:55,561 INFO [train.py:715] (0/8) Epoch 7, batch 4850, loss[loss=0.1351, simple_loss=0.205, pruned_loss=0.03254, over 4827.00 frames.], tot_loss[loss=0.1471, simple_loss=0.2188, pruned_loss=0.03771, over 973493.02 frames.], batch size: 30, lr: 3.00e-04 2022-05-05 18:48:34,300 INFO [train.py:715] (0/8) Epoch 7, batch 4900, loss[loss=0.1585, simple_loss=0.2333, pruned_loss=0.04182, over 4981.00 frames.], tot_loss[loss=0.1477, simple_loss=0.2192, pruned_loss=0.03809, over 972926.65 frames.], batch size: 24, lr: 3.00e-04 2022-05-05 18:49:12,732 INFO [train.py:715] (0/8) Epoch 7, batch 4950, loss[loss=0.1517, simple_loss=0.2085, pruned_loss=0.04748, over 4977.00 frames.], tot_loss[loss=0.1472, simple_loss=0.219, pruned_loss=0.03772, over 973623.78 frames.], batch size: 14, lr: 3.00e-04 2022-05-05 18:49:51,775 INFO [train.py:715] (0/8) Epoch 7, batch 5000, loss[loss=0.1536, simple_loss=0.2179, pruned_loss=0.04464, over 4933.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2177, pruned_loss=0.03694, over 974458.62 frames.], batch size: 29, lr: 3.00e-04 2022-05-05 18:50:30,770 INFO [train.py:715] (0/8) Epoch 7, batch 5050, loss[loss=0.1393, simple_loss=0.21, pruned_loss=0.03433, over 4758.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2173, pruned_loss=0.03686, over 974741.70 frames.], batch size: 14, lr: 3.00e-04 2022-05-05 18:51:09,366 INFO [train.py:715] (0/8) Epoch 7, batch 5100, loss[loss=0.1631, simple_loss=0.2271, pruned_loss=0.04953, over 4851.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2183, pruned_loss=0.03718, over 974964.29 frames.], batch size: 34, lr: 3.00e-04 2022-05-05 18:51:48,428 INFO [train.py:715] (0/8) Epoch 7, batch 5150, loss[loss=0.1382, simple_loss=0.2062, pruned_loss=0.03507, over 4896.00 frames.], tot_loss[loss=0.1477, simple_loss=0.2193, pruned_loss=0.03806, over 974544.60 frames.], batch size: 22, lr: 3.00e-04 2022-05-05 18:52:27,133 INFO [train.py:715] (0/8) Epoch 7, batch 5200, loss[loss=0.1525, simple_loss=0.2222, pruned_loss=0.04142, over 4759.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2183, pruned_loss=0.03781, over 974301.03 frames.], batch size: 19, lr: 2.99e-04 2022-05-05 18:53:06,161 INFO [train.py:715] (0/8) Epoch 7, batch 5250, loss[loss=0.1275, simple_loss=0.2135, pruned_loss=0.02074, over 4953.00 frames.], tot_loss[loss=0.1467, simple_loss=0.218, pruned_loss=0.03773, over 973681.37 frames.], batch size: 24, lr: 2.99e-04 2022-05-05 18:53:44,791 INFO [train.py:715] (0/8) Epoch 7, batch 5300, loss[loss=0.1436, simple_loss=0.2217, pruned_loss=0.03272, over 4899.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2176, pruned_loss=0.03756, over 973160.05 frames.], batch size: 22, lr: 2.99e-04 2022-05-05 18:54:24,155 INFO 
[train.py:715] (0/8) Epoch 7, batch 5350, loss[loss=0.1341, simple_loss=0.2083, pruned_loss=0.02999, over 4906.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2183, pruned_loss=0.03759, over 972421.59 frames.], batch size: 18, lr: 2.99e-04 2022-05-05 18:55:02,364 INFO [train.py:715] (0/8) Epoch 7, batch 5400, loss[loss=0.1799, simple_loss=0.256, pruned_loss=0.05188, over 4755.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2177, pruned_loss=0.03692, over 972347.66 frames.], batch size: 19, lr: 2.99e-04 2022-05-05 18:55:41,208 INFO [train.py:715] (0/8) Epoch 7, batch 5450, loss[loss=0.122, simple_loss=0.1952, pruned_loss=0.02441, over 4986.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2182, pruned_loss=0.03739, over 971824.76 frames.], batch size: 14, lr: 2.99e-04 2022-05-05 18:56:20,337 INFO [train.py:715] (0/8) Epoch 7, batch 5500, loss[loss=0.1226, simple_loss=0.2078, pruned_loss=0.01872, over 4985.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2186, pruned_loss=0.03757, over 972317.04 frames.], batch size: 28, lr: 2.99e-04 2022-05-05 18:56:59,121 INFO [train.py:715] (0/8) Epoch 7, batch 5550, loss[loss=0.1601, simple_loss=0.2349, pruned_loss=0.04264, over 4736.00 frames.], tot_loss[loss=0.147, simple_loss=0.2188, pruned_loss=0.03756, over 971453.48 frames.], batch size: 12, lr: 2.99e-04 2022-05-05 18:57:38,236 INFO [train.py:715] (0/8) Epoch 7, batch 5600, loss[loss=0.1751, simple_loss=0.2584, pruned_loss=0.04594, over 4826.00 frames.], tot_loss[loss=0.1474, simple_loss=0.2188, pruned_loss=0.03797, over 971972.57 frames.], batch size: 25, lr: 2.99e-04 2022-05-05 18:58:17,277 INFO [train.py:715] (0/8) Epoch 7, batch 5650, loss[loss=0.162, simple_loss=0.2328, pruned_loss=0.04558, over 4848.00 frames.], tot_loss[loss=0.1466, simple_loss=0.2179, pruned_loss=0.03771, over 972346.21 frames.], batch size: 32, lr: 2.99e-04 2022-05-05 18:58:56,364 INFO [train.py:715] (0/8) Epoch 7, batch 5700, loss[loss=0.1553, simple_loss=0.2262, pruned_loss=0.04225, over 4967.00 frames.], tot_loss[loss=0.1472, simple_loss=0.2188, pruned_loss=0.03777, over 973157.32 frames.], batch size: 15, lr: 2.99e-04 2022-05-05 18:59:34,746 INFO [train.py:715] (0/8) Epoch 7, batch 5750, loss[loss=0.172, simple_loss=0.2375, pruned_loss=0.05326, over 4874.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2175, pruned_loss=0.03717, over 973237.85 frames.], batch size: 16, lr: 2.99e-04 2022-05-05 19:00:12,897 INFO [train.py:715] (0/8) Epoch 7, batch 5800, loss[loss=0.1308, simple_loss=0.1986, pruned_loss=0.0315, over 4751.00 frames.], tot_loss[loss=0.146, simple_loss=0.2176, pruned_loss=0.03718, over 973657.98 frames.], batch size: 16, lr: 2.99e-04 2022-05-05 19:00:52,625 INFO [train.py:715] (0/8) Epoch 7, batch 5850, loss[loss=0.1315, simple_loss=0.2037, pruned_loss=0.02968, over 4852.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2175, pruned_loss=0.03702, over 972819.29 frames.], batch size: 20, lr: 2.99e-04 2022-05-05 19:01:30,925 INFO [train.py:715] (0/8) Epoch 7, batch 5900, loss[loss=0.1476, simple_loss=0.2136, pruned_loss=0.04082, over 4958.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2172, pruned_loss=0.03713, over 972903.20 frames.], batch size: 15, lr: 2.99e-04 2022-05-05 19:02:09,957 INFO [train.py:715] (0/8) Epoch 7, batch 5950, loss[loss=0.1322, simple_loss=0.2059, pruned_loss=0.0292, over 4970.00 frames.], tot_loss[loss=0.146, simple_loss=0.2175, pruned_loss=0.03726, over 973022.02 frames.], batch size: 35, lr: 2.99e-04 2022-05-05 19:02:48,381 INFO [train.py:715] (0/8) Epoch 7, batch 6000, 
loss[loss=0.148, simple_loss=0.227, pruned_loss=0.03448, over 4799.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2175, pruned_loss=0.03732, over 974258.74 frames.], batch size: 24, lr: 2.99e-04 2022-05-05 19:02:48,382 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 19:02:58,047 INFO [train.py:742] (0/8) Epoch 7, validation: loss=0.1085, simple_loss=0.1933, pruned_loss=0.0119, over 914524.00 frames. 2022-05-05 19:03:36,915 INFO [train.py:715] (0/8) Epoch 7, batch 6050, loss[loss=0.1444, simple_loss=0.2145, pruned_loss=0.03714, over 4837.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2175, pruned_loss=0.03768, over 974613.06 frames.], batch size: 30, lr: 2.99e-04 2022-05-05 19:04:16,078 INFO [train.py:715] (0/8) Epoch 7, batch 6100, loss[loss=0.1291, simple_loss=0.204, pruned_loss=0.02716, over 4827.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2177, pruned_loss=0.03758, over 974265.71 frames.], batch size: 26, lr: 2.99e-04 2022-05-05 19:04:55,376 INFO [train.py:715] (0/8) Epoch 7, batch 6150, loss[loss=0.1316, simple_loss=0.2088, pruned_loss=0.02725, over 4768.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2176, pruned_loss=0.03768, over 973579.94 frames.], batch size: 14, lr: 2.99e-04 2022-05-05 19:05:33,825 INFO [train.py:715] (0/8) Epoch 7, batch 6200, loss[loss=0.1292, simple_loss=0.203, pruned_loss=0.02771, over 4893.00 frames.], tot_loss[loss=0.1468, simple_loss=0.218, pruned_loss=0.03777, over 973957.89 frames.], batch size: 18, lr: 2.99e-04 2022-05-05 19:06:13,678 INFO [train.py:715] (0/8) Epoch 7, batch 6250, loss[loss=0.1287, simple_loss=0.2022, pruned_loss=0.02763, over 4737.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2172, pruned_loss=0.03714, over 973371.41 frames.], batch size: 16, lr: 2.99e-04 2022-05-05 19:06:52,571 INFO [train.py:715] (0/8) Epoch 7, batch 6300, loss[loss=0.1841, simple_loss=0.2456, pruned_loss=0.06127, over 4926.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2181, pruned_loss=0.03735, over 973710.66 frames.], batch size: 39, lr: 2.99e-04 2022-05-05 19:07:30,970 INFO [train.py:715] (0/8) Epoch 7, batch 6350, loss[loss=0.1395, simple_loss=0.2074, pruned_loss=0.03584, over 4774.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2186, pruned_loss=0.03762, over 973252.56 frames.], batch size: 18, lr: 2.99e-04 2022-05-05 19:08:10,028 INFO [train.py:715] (0/8) Epoch 7, batch 6400, loss[loss=0.1254, simple_loss=0.1974, pruned_loss=0.0267, over 4896.00 frames.], tot_loss[loss=0.146, simple_loss=0.2178, pruned_loss=0.03707, over 973551.76 frames.], batch size: 22, lr: 2.99e-04 2022-05-05 19:08:49,043 INFO [train.py:715] (0/8) Epoch 7, batch 6450, loss[loss=0.1755, simple_loss=0.2474, pruned_loss=0.05178, over 4923.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2172, pruned_loss=0.03691, over 972705.43 frames.], batch size: 21, lr: 2.99e-04 2022-05-05 19:09:27,582 INFO [train.py:715] (0/8) Epoch 7, batch 6500, loss[loss=0.1234, simple_loss=0.2023, pruned_loss=0.0223, over 4768.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2171, pruned_loss=0.03681, over 972777.68 frames.], batch size: 14, lr: 2.99e-04 2022-05-05 19:10:06,570 INFO [train.py:715] (0/8) Epoch 7, batch 6550, loss[loss=0.1385, simple_loss=0.2009, pruned_loss=0.03803, over 4835.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2176, pruned_loss=0.03725, over 973295.45 frames.], batch size: 12, lr: 2.99e-04 2022-05-05 19:10:46,390 INFO [train.py:715] (0/8) Epoch 7, batch 6600, loss[loss=0.1377, simple_loss=0.2052, pruned_loss=0.03515, over 4800.00 frames.], 
tot_loss[loss=0.1468, simple_loss=0.2183, pruned_loss=0.03766, over 972947.88 frames.], batch size: 24, lr: 2.99e-04 2022-05-05 19:11:25,238 INFO [train.py:715] (0/8) Epoch 7, batch 6650, loss[loss=0.158, simple_loss=0.2315, pruned_loss=0.04226, over 4916.00 frames.], tot_loss[loss=0.1474, simple_loss=0.2188, pruned_loss=0.03798, over 972578.75 frames.], batch size: 17, lr: 2.99e-04 2022-05-05 19:12:04,474 INFO [train.py:715] (0/8) Epoch 7, batch 6700, loss[loss=0.1501, simple_loss=0.2084, pruned_loss=0.04594, over 4980.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2188, pruned_loss=0.03811, over 972538.66 frames.], batch size: 14, lr: 2.99e-04 2022-05-05 19:12:43,220 INFO [train.py:715] (0/8) Epoch 7, batch 6750, loss[loss=0.1329, simple_loss=0.2011, pruned_loss=0.03235, over 4966.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2181, pruned_loss=0.03779, over 972805.33 frames.], batch size: 35, lr: 2.99e-04 2022-05-05 19:13:22,213 INFO [train.py:715] (0/8) Epoch 7, batch 6800, loss[loss=0.1384, simple_loss=0.2168, pruned_loss=0.03002, over 4839.00 frames.], tot_loss[loss=0.1466, simple_loss=0.2179, pruned_loss=0.03769, over 972530.45 frames.], batch size: 15, lr: 2.99e-04 2022-05-05 19:14:00,585 INFO [train.py:715] (0/8) Epoch 7, batch 6850, loss[loss=0.1346, simple_loss=0.207, pruned_loss=0.03113, over 4831.00 frames.], tot_loss[loss=0.147, simple_loss=0.2187, pruned_loss=0.03769, over 972742.14 frames.], batch size: 13, lr: 2.99e-04 2022-05-05 19:14:39,182 INFO [train.py:715] (0/8) Epoch 7, batch 6900, loss[loss=0.1511, simple_loss=0.218, pruned_loss=0.04211, over 4959.00 frames.], tot_loss[loss=0.1466, simple_loss=0.2184, pruned_loss=0.03741, over 971613.37 frames.], batch size: 35, lr: 2.98e-04 2022-05-05 19:15:18,695 INFO [train.py:715] (0/8) Epoch 7, batch 6950, loss[loss=0.1317, simple_loss=0.2131, pruned_loss=0.0252, over 4963.00 frames.], tot_loss[loss=0.147, simple_loss=0.2191, pruned_loss=0.03748, over 971968.09 frames.], batch size: 24, lr: 2.98e-04 2022-05-05 19:15:56,856 INFO [train.py:715] (0/8) Epoch 7, batch 7000, loss[loss=0.1498, simple_loss=0.2187, pruned_loss=0.04044, over 4693.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2185, pruned_loss=0.03706, over 971512.02 frames.], batch size: 15, lr: 2.98e-04 2022-05-05 19:16:35,555 INFO [train.py:715] (0/8) Epoch 7, batch 7050, loss[loss=0.1568, simple_loss=0.2204, pruned_loss=0.0466, over 4923.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2182, pruned_loss=0.03698, over 971780.94 frames.], batch size: 18, lr: 2.98e-04 2022-05-05 19:17:14,119 INFO [train.py:715] (0/8) Epoch 7, batch 7100, loss[loss=0.1093, simple_loss=0.1782, pruned_loss=0.02016, over 4818.00 frames.], tot_loss[loss=0.1471, simple_loss=0.2188, pruned_loss=0.0377, over 971292.70 frames.], batch size: 13, lr: 2.98e-04 2022-05-05 19:17:52,397 INFO [train.py:715] (0/8) Epoch 7, batch 7150, loss[loss=0.124, simple_loss=0.1981, pruned_loss=0.0249, over 4792.00 frames.], tot_loss[loss=0.147, simple_loss=0.2187, pruned_loss=0.03763, over 970903.06 frames.], batch size: 24, lr: 2.98e-04 2022-05-05 19:18:31,017 INFO [train.py:715] (0/8) Epoch 7, batch 7200, loss[loss=0.1376, simple_loss=0.2034, pruned_loss=0.03584, over 4796.00 frames.], tot_loss[loss=0.1477, simple_loss=0.2191, pruned_loss=0.03812, over 971114.43 frames.], batch size: 12, lr: 2.98e-04 2022-05-05 19:19:10,023 INFO [train.py:715] (0/8) Epoch 7, batch 7250, loss[loss=0.1272, simple_loss=0.1969, pruned_loss=0.02872, over 4976.00 frames.], tot_loss[loss=0.1483, simple_loss=0.2196, 
pruned_loss=0.03851, over 971651.89 frames.], batch size: 14, lr: 2.98e-04 2022-05-05 19:19:49,672 INFO [train.py:715] (0/8) Epoch 7, batch 7300, loss[loss=0.1256, simple_loss=0.2057, pruned_loss=0.02276, over 4888.00 frames.], tot_loss[loss=0.1477, simple_loss=0.2196, pruned_loss=0.03794, over 971610.45 frames.], batch size: 19, lr: 2.98e-04 2022-05-05 19:20:28,205 INFO [train.py:715] (0/8) Epoch 7, batch 7350, loss[loss=0.1627, simple_loss=0.2241, pruned_loss=0.05064, over 4985.00 frames.], tot_loss[loss=0.1474, simple_loss=0.2188, pruned_loss=0.03799, over 971788.19 frames.], batch size: 35, lr: 2.98e-04 2022-05-05 19:21:06,661 INFO [train.py:715] (0/8) Epoch 7, batch 7400, loss[loss=0.1524, simple_loss=0.2242, pruned_loss=0.04027, over 4864.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2186, pruned_loss=0.03752, over 971006.42 frames.], batch size: 16, lr: 2.98e-04 2022-05-05 19:21:45,790 INFO [train.py:715] (0/8) Epoch 7, batch 7450, loss[loss=0.113, simple_loss=0.1834, pruned_loss=0.02132, over 4794.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2177, pruned_loss=0.03708, over 971083.48 frames.], batch size: 14, lr: 2.98e-04 2022-05-05 19:22:23,997 INFO [train.py:715] (0/8) Epoch 7, batch 7500, loss[loss=0.171, simple_loss=0.221, pruned_loss=0.06052, over 4746.00 frames.], tot_loss[loss=0.146, simple_loss=0.2179, pruned_loss=0.03712, over 971235.42 frames.], batch size: 16, lr: 2.98e-04 2022-05-05 19:23:02,796 INFO [train.py:715] (0/8) Epoch 7, batch 7550, loss[loss=0.1501, simple_loss=0.228, pruned_loss=0.03613, over 4810.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2182, pruned_loss=0.03728, over 971655.68 frames.], batch size: 26, lr: 2.98e-04 2022-05-05 19:23:41,660 INFO [train.py:715] (0/8) Epoch 7, batch 7600, loss[loss=0.1368, simple_loss=0.2001, pruned_loss=0.03677, over 4927.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2175, pruned_loss=0.03693, over 971662.05 frames.], batch size: 18, lr: 2.98e-04 2022-05-05 19:24:20,768 INFO [train.py:715] (0/8) Epoch 7, batch 7650, loss[loss=0.1483, simple_loss=0.2105, pruned_loss=0.04298, over 4834.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2168, pruned_loss=0.03667, over 972379.43 frames.], batch size: 15, lr: 2.98e-04 2022-05-05 19:24:59,079 INFO [train.py:715] (0/8) Epoch 7, batch 7700, loss[loss=0.1358, simple_loss=0.2032, pruned_loss=0.03418, over 4854.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2171, pruned_loss=0.03682, over 972246.17 frames.], batch size: 30, lr: 2.98e-04 2022-05-05 19:25:38,045 INFO [train.py:715] (0/8) Epoch 7, batch 7750, loss[loss=0.1371, simple_loss=0.2127, pruned_loss=0.03075, over 4773.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2178, pruned_loss=0.03745, over 971226.48 frames.], batch size: 17, lr: 2.98e-04 2022-05-05 19:26:17,067 INFO [train.py:715] (0/8) Epoch 7, batch 7800, loss[loss=0.1677, simple_loss=0.2374, pruned_loss=0.04899, over 4700.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2186, pruned_loss=0.03814, over 971392.03 frames.], batch size: 15, lr: 2.98e-04 2022-05-05 19:26:55,226 INFO [train.py:715] (0/8) Epoch 7, batch 7850, loss[loss=0.147, simple_loss=0.2227, pruned_loss=0.03562, over 4963.00 frames.], tot_loss[loss=0.1476, simple_loss=0.2191, pruned_loss=0.03808, over 971527.67 frames.], batch size: 15, lr: 2.98e-04 2022-05-05 19:27:34,423 INFO [train.py:715] (0/8) Epoch 7, batch 7900, loss[loss=0.1307, simple_loss=0.2074, pruned_loss=0.02707, over 4731.00 frames.], tot_loss[loss=0.1484, simple_loss=0.2197, pruned_loss=0.03857, over 972210.67 frames.], 
batch size: 16, lr: 2.98e-04 2022-05-05 19:28:13,171 INFO [train.py:715] (0/8) Epoch 7, batch 7950, loss[loss=0.149, simple_loss=0.214, pruned_loss=0.04201, over 4977.00 frames.], tot_loss[loss=0.1481, simple_loss=0.2195, pruned_loss=0.03837, over 972358.96 frames.], batch size: 15, lr: 2.98e-04 2022-05-05 19:28:52,646 INFO [train.py:715] (0/8) Epoch 7, batch 8000, loss[loss=0.1559, simple_loss=0.2285, pruned_loss=0.04163, over 4812.00 frames.], tot_loss[loss=0.1473, simple_loss=0.2188, pruned_loss=0.03791, over 972792.62 frames.], batch size: 21, lr: 2.98e-04 2022-05-05 19:29:30,734 INFO [train.py:715] (0/8) Epoch 7, batch 8050, loss[loss=0.1354, simple_loss=0.2033, pruned_loss=0.03371, over 4874.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2181, pruned_loss=0.03762, over 973220.08 frames.], batch size: 16, lr: 2.98e-04 2022-05-05 19:30:09,294 INFO [train.py:715] (0/8) Epoch 7, batch 8100, loss[loss=0.1443, simple_loss=0.2172, pruned_loss=0.03564, over 4940.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2183, pruned_loss=0.03756, over 971954.56 frames.], batch size: 21, lr: 2.98e-04 2022-05-05 19:30:48,378 INFO [train.py:715] (0/8) Epoch 7, batch 8150, loss[loss=0.1563, simple_loss=0.2185, pruned_loss=0.04706, over 4831.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2178, pruned_loss=0.03762, over 971810.12 frames.], batch size: 13, lr: 2.98e-04 2022-05-05 19:31:26,681 INFO [train.py:715] (0/8) Epoch 7, batch 8200, loss[loss=0.1657, simple_loss=0.2355, pruned_loss=0.04796, over 4935.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2173, pruned_loss=0.0373, over 970632.21 frames.], batch size: 18, lr: 2.98e-04 2022-05-05 19:32:05,123 INFO [train.py:715] (0/8) Epoch 7, batch 8250, loss[loss=0.1598, simple_loss=0.2287, pruned_loss=0.04542, over 4748.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2178, pruned_loss=0.03788, over 970961.27 frames.], batch size: 19, lr: 2.98e-04 2022-05-05 19:32:43,778 INFO [train.py:715] (0/8) Epoch 7, batch 8300, loss[loss=0.1646, simple_loss=0.2279, pruned_loss=0.05059, over 4957.00 frames.], tot_loss[loss=0.1462, simple_loss=0.2174, pruned_loss=0.03753, over 972049.14 frames.], batch size: 21, lr: 2.98e-04 2022-05-05 19:33:22,688 INFO [train.py:715] (0/8) Epoch 7, batch 8350, loss[loss=0.1421, simple_loss=0.2056, pruned_loss=0.03927, over 4913.00 frames.], tot_loss[loss=0.1469, simple_loss=0.218, pruned_loss=0.0379, over 972323.66 frames.], batch size: 17, lr: 2.98e-04 2022-05-05 19:34:00,640 INFO [train.py:715] (0/8) Epoch 7, batch 8400, loss[loss=0.1569, simple_loss=0.2154, pruned_loss=0.04919, over 4851.00 frames.], tot_loss[loss=0.1476, simple_loss=0.2184, pruned_loss=0.0384, over 972783.63 frames.], batch size: 30, lr: 2.98e-04 2022-05-05 19:34:39,717 INFO [train.py:715] (0/8) Epoch 7, batch 8450, loss[loss=0.2081, simple_loss=0.278, pruned_loss=0.06914, over 4970.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2176, pruned_loss=0.03753, over 972741.06 frames.], batch size: 24, lr: 2.98e-04 2022-05-05 19:35:18,873 INFO [train.py:715] (0/8) Epoch 7, batch 8500, loss[loss=0.1446, simple_loss=0.2254, pruned_loss=0.03191, over 4899.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2173, pruned_loss=0.0374, over 972797.16 frames.], batch size: 17, lr: 2.98e-04 2022-05-05 19:35:58,052 INFO [train.py:715] (0/8) Epoch 7, batch 8550, loss[loss=0.154, simple_loss=0.2271, pruned_loss=0.04046, over 4862.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2168, pruned_loss=0.03732, over 972808.39 frames.], batch size: 20, lr: 2.97e-04 2022-05-05 19:36:36,293 
INFO [train.py:715] (0/8) Epoch 7, batch 8600, loss[loss=0.1495, simple_loss=0.2168, pruned_loss=0.04107, over 4881.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2171, pruned_loss=0.03718, over 972824.17 frames.], batch size: 32, lr: 2.97e-04 2022-05-05 19:37:14,957 INFO [train.py:715] (0/8) Epoch 7, batch 8650, loss[loss=0.1252, simple_loss=0.1951, pruned_loss=0.02765, over 4900.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2165, pruned_loss=0.03689, over 972000.07 frames.], batch size: 22, lr: 2.97e-04 2022-05-05 19:37:54,307 INFO [train.py:715] (0/8) Epoch 7, batch 8700, loss[loss=0.1236, simple_loss=0.1976, pruned_loss=0.02474, over 4837.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2174, pruned_loss=0.03712, over 971932.73 frames.], batch size: 30, lr: 2.97e-04 2022-05-05 19:38:32,515 INFO [train.py:715] (0/8) Epoch 7, batch 8750, loss[loss=0.1702, simple_loss=0.2403, pruned_loss=0.05003, over 4897.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2177, pruned_loss=0.0368, over 972407.61 frames.], batch size: 17, lr: 2.97e-04 2022-05-05 19:39:11,383 INFO [train.py:715] (0/8) Epoch 7, batch 8800, loss[loss=0.1569, simple_loss=0.2327, pruned_loss=0.04056, over 4806.00 frames.], tot_loss[loss=0.1458, simple_loss=0.218, pruned_loss=0.03681, over 972149.15 frames.], batch size: 21, lr: 2.97e-04 2022-05-05 19:39:50,315 INFO [train.py:715] (0/8) Epoch 7, batch 8850, loss[loss=0.1804, simple_loss=0.2464, pruned_loss=0.05723, over 4858.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2174, pruned_loss=0.03705, over 972542.72 frames.], batch size: 16, lr: 2.97e-04 2022-05-05 19:40:30,006 INFO [train.py:715] (0/8) Epoch 7, batch 8900, loss[loss=0.1411, simple_loss=0.2154, pruned_loss=0.03341, over 4976.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2175, pruned_loss=0.03701, over 972571.53 frames.], batch size: 25, lr: 2.97e-04 2022-05-05 19:41:08,235 INFO [train.py:715] (0/8) Epoch 7, batch 8950, loss[loss=0.1756, simple_loss=0.2533, pruned_loss=0.04894, over 4781.00 frames.], tot_loss[loss=0.1476, simple_loss=0.2192, pruned_loss=0.03797, over 973174.05 frames.], batch size: 17, lr: 2.97e-04 2022-05-05 19:41:46,833 INFO [train.py:715] (0/8) Epoch 7, batch 9000, loss[loss=0.1584, simple_loss=0.2322, pruned_loss=0.04227, over 4890.00 frames.], tot_loss[loss=0.147, simple_loss=0.2186, pruned_loss=0.03769, over 972278.36 frames.], batch size: 22, lr: 2.97e-04 2022-05-05 19:41:46,834 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 19:41:56,560 INFO [train.py:742] (0/8) Epoch 7, validation: loss=0.1085, simple_loss=0.1932, pruned_loss=0.01192, over 914524.00 frames. 
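Note on the three numbers reported in each record above: within rounding they are not independent. Both the per-batch totals and the validation entry just above are consistent with loss being roughly 0.5 * simple_loss + pruned_loss. The 0.5 weight is inferred from the logged values in this section, not read out of the training script, so the snippet below is a hedged sanity check rather than a statement of how train.py actually combines its losses; the helper name combined_loss is chosen here for illustration.

# Hedged sanity check (not the training code itself): recombine the two logged
# components with an assumed simple-loss weight of 0.5 and compare to the total.
def combined_loss(simple_loss: float, pruned_loss: float, simple_scale: float = 0.5) -> float:
    """Assumed recombination of the two logged loss components."""
    return simple_scale * simple_loss + pruned_loss

# Validation entry above: loss=0.1085, simple_loss=0.1932, pruned_loss=0.01192
assert abs(combined_loss(0.1932, 0.01192) - 0.1085) < 1e-3
# Per-batch total from batch 8600 above: loss=0.1458, simple_loss=0.2171, pruned_loss=0.03718
assert abs(combined_loss(0.2171, 0.03718) - 0.1458) < 1e-3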
2022-05-05 19:42:35,334 INFO [train.py:715] (0/8) Epoch 7, batch 9050, loss[loss=0.1382, simple_loss=0.2093, pruned_loss=0.03356, over 4881.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2186, pruned_loss=0.0375, over 972114.22 frames.], batch size: 16, lr: 2.97e-04 2022-05-05 19:43:15,392 INFO [train.py:715] (0/8) Epoch 7, batch 9100, loss[loss=0.1677, simple_loss=0.2359, pruned_loss=0.04979, over 4750.00 frames.], tot_loss[loss=0.147, simple_loss=0.2187, pruned_loss=0.03763, over 972269.70 frames.], batch size: 16, lr: 2.97e-04 2022-05-05 19:43:54,067 INFO [train.py:715] (0/8) Epoch 7, batch 9150, loss[loss=0.139, simple_loss=0.2243, pruned_loss=0.02687, over 4962.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2183, pruned_loss=0.03714, over 972013.94 frames.], batch size: 24, lr: 2.97e-04 2022-05-05 19:44:32,871 INFO [train.py:715] (0/8) Epoch 7, batch 9200, loss[loss=0.1963, simple_loss=0.2691, pruned_loss=0.06171, over 4779.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2179, pruned_loss=0.03686, over 971614.12 frames.], batch size: 18, lr: 2.97e-04 2022-05-05 19:45:12,205 INFO [train.py:715] (0/8) Epoch 7, batch 9250, loss[loss=0.1629, simple_loss=0.237, pruned_loss=0.04443, over 4767.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2179, pruned_loss=0.03665, over 971773.45 frames.], batch size: 18, lr: 2.97e-04 2022-05-05 19:45:51,289 INFO [train.py:715] (0/8) Epoch 7, batch 9300, loss[loss=0.127, simple_loss=0.1955, pruned_loss=0.02923, over 4971.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2178, pruned_loss=0.03685, over 971052.61 frames.], batch size: 24, lr: 2.97e-04 2022-05-05 19:46:30,346 INFO [train.py:715] (0/8) Epoch 7, batch 9350, loss[loss=0.1293, simple_loss=0.2034, pruned_loss=0.02763, over 4821.00 frames.], tot_loss[loss=0.1445, simple_loss=0.2165, pruned_loss=0.03628, over 971446.19 frames.], batch size: 13, lr: 2.97e-04 2022-05-05 19:47:08,477 INFO [train.py:715] (0/8) Epoch 7, batch 9400, loss[loss=0.147, simple_loss=0.2118, pruned_loss=0.04104, over 4849.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2166, pruned_loss=0.03638, over 971873.51 frames.], batch size: 32, lr: 2.97e-04 2022-05-05 19:47:48,270 INFO [train.py:715] (0/8) Epoch 7, batch 9450, loss[loss=0.1458, simple_loss=0.2171, pruned_loss=0.03729, over 4904.00 frames.], tot_loss[loss=0.1449, simple_loss=0.2167, pruned_loss=0.03656, over 972067.28 frames.], batch size: 17, lr: 2.97e-04 2022-05-05 19:48:27,276 INFO [train.py:715] (0/8) Epoch 7, batch 9500, loss[loss=0.1403, simple_loss=0.2045, pruned_loss=0.03807, over 4795.00 frames.], tot_loss[loss=0.146, simple_loss=0.2175, pruned_loss=0.03722, over 971318.60 frames.], batch size: 14, lr: 2.97e-04 2022-05-05 19:49:05,876 INFO [train.py:715] (0/8) Epoch 7, batch 9550, loss[loss=0.1255, simple_loss=0.2036, pruned_loss=0.02368, over 4951.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2175, pruned_loss=0.03706, over 971042.60 frames.], batch size: 21, lr: 2.97e-04 2022-05-05 19:49:44,838 INFO [train.py:715] (0/8) Epoch 7, batch 9600, loss[loss=0.1679, simple_loss=0.2378, pruned_loss=0.04903, over 4821.00 frames.], tot_loss[loss=0.1471, simple_loss=0.2185, pruned_loss=0.03781, over 971769.65 frames.], batch size: 26, lr: 2.97e-04 2022-05-05 19:50:23,438 INFO [train.py:715] (0/8) Epoch 7, batch 9650, loss[loss=0.131, simple_loss=0.2034, pruned_loss=0.02928, over 4982.00 frames.], tot_loss[loss=0.147, simple_loss=0.2181, pruned_loss=0.03793, over 972001.32 frames.], batch size: 25, lr: 2.97e-04 2022-05-05 19:51:02,956 INFO [train.py:715] (0/8) 
Epoch 7, batch 9700, loss[loss=0.1575, simple_loss=0.2282, pruned_loss=0.04335, over 4968.00 frames.], tot_loss[loss=0.1473, simple_loss=0.2183, pruned_loss=0.03815, over 971733.88 frames.], batch size: 31, lr: 2.97e-04 2022-05-05 19:51:41,571 INFO [train.py:715] (0/8) Epoch 7, batch 9750, loss[loss=0.1622, simple_loss=0.2276, pruned_loss=0.04844, over 4985.00 frames.], tot_loss[loss=0.1476, simple_loss=0.2183, pruned_loss=0.0385, over 972612.01 frames.], batch size: 28, lr: 2.97e-04 2022-05-05 19:52:20,958 INFO [train.py:715] (0/8) Epoch 7, batch 9800, loss[loss=0.1156, simple_loss=0.1865, pruned_loss=0.0224, over 4817.00 frames.], tot_loss[loss=0.1476, simple_loss=0.2186, pruned_loss=0.03833, over 972483.15 frames.], batch size: 26, lr: 2.97e-04 2022-05-05 19:52:59,040 INFO [train.py:715] (0/8) Epoch 7, batch 9850, loss[loss=0.2173, simple_loss=0.2767, pruned_loss=0.079, over 4891.00 frames.], tot_loss[loss=0.1474, simple_loss=0.2185, pruned_loss=0.03816, over 972688.01 frames.], batch size: 16, lr: 2.97e-04 2022-05-05 19:53:37,271 INFO [train.py:715] (0/8) Epoch 7, batch 9900, loss[loss=0.1327, simple_loss=0.219, pruned_loss=0.02323, over 4787.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2182, pruned_loss=0.03767, over 972401.75 frames.], batch size: 18, lr: 2.97e-04 2022-05-05 19:54:16,176 INFO [train.py:715] (0/8) Epoch 7, batch 9950, loss[loss=0.1445, simple_loss=0.2275, pruned_loss=0.03078, over 4794.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2186, pruned_loss=0.03756, over 972447.30 frames.], batch size: 24, lr: 2.97e-04 2022-05-05 19:54:55,284 INFO [train.py:715] (0/8) Epoch 7, batch 10000, loss[loss=0.1254, simple_loss=0.1937, pruned_loss=0.02856, over 4885.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2182, pruned_loss=0.03764, over 972570.78 frames.], batch size: 16, lr: 2.97e-04 2022-05-05 19:55:33,940 INFO [train.py:715] (0/8) Epoch 7, batch 10050, loss[loss=0.1265, simple_loss=0.1991, pruned_loss=0.02694, over 4821.00 frames.], tot_loss[loss=0.146, simple_loss=0.2177, pruned_loss=0.03714, over 972138.86 frames.], batch size: 13, lr: 2.97e-04 2022-05-05 19:56:12,503 INFO [train.py:715] (0/8) Epoch 7, batch 10100, loss[loss=0.1677, simple_loss=0.2394, pruned_loss=0.04801, over 4829.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2184, pruned_loss=0.03763, over 972100.08 frames.], batch size: 25, lr: 2.97e-04 2022-05-05 19:56:51,795 INFO [train.py:715] (0/8) Epoch 7, batch 10150, loss[loss=0.1468, simple_loss=0.2075, pruned_loss=0.04305, over 4978.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2184, pruned_loss=0.03728, over 972282.82 frames.], batch size: 15, lr: 2.97e-04 2022-05-05 19:57:30,413 INFO [train.py:715] (0/8) Epoch 7, batch 10200, loss[loss=0.1479, simple_loss=0.2255, pruned_loss=0.03519, over 4790.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2186, pruned_loss=0.03758, over 972176.47 frames.], batch size: 18, lr: 2.97e-04 2022-05-05 19:58:09,057 INFO [train.py:715] (0/8) Epoch 7, batch 10250, loss[loss=0.1579, simple_loss=0.2358, pruned_loss=0.03997, over 4911.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2183, pruned_loss=0.03718, over 972469.89 frames.], batch size: 18, lr: 2.96e-04 2022-05-05 19:58:48,250 INFO [train.py:715] (0/8) Epoch 7, batch 10300, loss[loss=0.1632, simple_loss=0.232, pruned_loss=0.04724, over 4800.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2186, pruned_loss=0.03722, over 971327.07 frames.], batch size: 21, lr: 2.96e-04 2022-05-05 19:59:26,897 INFO [train.py:715] (0/8) Epoch 7, batch 10350, loss[loss=0.1275, 
simple_loss=0.2026, pruned_loss=0.02626, over 4825.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2187, pruned_loss=0.03738, over 971288.00 frames.], batch size: 27, lr: 2.96e-04 2022-05-05 20:00:05,909 INFO [train.py:715] (0/8) Epoch 7, batch 10400, loss[loss=0.1451, simple_loss=0.2152, pruned_loss=0.03749, over 4802.00 frames.], tot_loss[loss=0.1473, simple_loss=0.219, pruned_loss=0.03786, over 971479.58 frames.], batch size: 25, lr: 2.96e-04 2022-05-05 20:00:44,690 INFO [train.py:715] (0/8) Epoch 7, batch 10450, loss[loss=0.139, simple_loss=0.2066, pruned_loss=0.03567, over 4784.00 frames.], tot_loss[loss=0.1476, simple_loss=0.2192, pruned_loss=0.03801, over 971471.59 frames.], batch size: 12, lr: 2.96e-04 2022-05-05 20:01:24,293 INFO [train.py:715] (0/8) Epoch 7, batch 10500, loss[loss=0.1406, simple_loss=0.213, pruned_loss=0.03411, over 4966.00 frames.], tot_loss[loss=0.147, simple_loss=0.2185, pruned_loss=0.0377, over 971602.34 frames.], batch size: 24, lr: 2.96e-04 2022-05-05 20:02:03,020 INFO [train.py:715] (0/8) Epoch 7, batch 10550, loss[loss=0.153, simple_loss=0.2236, pruned_loss=0.04126, over 4940.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2183, pruned_loss=0.03756, over 972104.32 frames.], batch size: 21, lr: 2.96e-04 2022-05-05 20:02:41,162 INFO [train.py:715] (0/8) Epoch 7, batch 10600, loss[loss=0.1543, simple_loss=0.2244, pruned_loss=0.04217, over 4888.00 frames.], tot_loss[loss=0.1471, simple_loss=0.2187, pruned_loss=0.03773, over 972879.63 frames.], batch size: 22, lr: 2.96e-04 2022-05-05 20:03:20,353 INFO [train.py:715] (0/8) Epoch 7, batch 10650, loss[loss=0.1299, simple_loss=0.202, pruned_loss=0.02886, over 4922.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2179, pruned_loss=0.03717, over 972386.08 frames.], batch size: 23, lr: 2.96e-04 2022-05-05 20:03:59,391 INFO [train.py:715] (0/8) Epoch 7, batch 10700, loss[loss=0.1261, simple_loss=0.2011, pruned_loss=0.02553, over 4687.00 frames.], tot_loss[loss=0.146, simple_loss=0.218, pruned_loss=0.03703, over 972535.66 frames.], batch size: 15, lr: 2.96e-04 2022-05-05 20:04:38,879 INFO [train.py:715] (0/8) Epoch 7, batch 10750, loss[loss=0.1342, simple_loss=0.2076, pruned_loss=0.03034, over 4976.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2173, pruned_loss=0.03663, over 971733.34 frames.], batch size: 35, lr: 2.96e-04 2022-05-05 20:05:17,663 INFO [train.py:715] (0/8) Epoch 7, batch 10800, loss[loss=0.1559, simple_loss=0.2346, pruned_loss=0.03858, over 4849.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2176, pruned_loss=0.03663, over 972247.12 frames.], batch size: 20, lr: 2.96e-04 2022-05-05 20:05:57,419 INFO [train.py:715] (0/8) Epoch 7, batch 10850, loss[loss=0.1257, simple_loss=0.1986, pruned_loss=0.0264, over 4752.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2179, pruned_loss=0.03681, over 972639.76 frames.], batch size: 12, lr: 2.96e-04 2022-05-05 20:06:35,663 INFO [train.py:715] (0/8) Epoch 7, batch 10900, loss[loss=0.1713, simple_loss=0.2376, pruned_loss=0.05249, over 4924.00 frames.], tot_loss[loss=0.146, simple_loss=0.2175, pruned_loss=0.03722, over 972834.03 frames.], batch size: 35, lr: 2.96e-04 2022-05-05 20:07:14,753 INFO [train.py:715] (0/8) Epoch 7, batch 10950, loss[loss=0.1793, simple_loss=0.2487, pruned_loss=0.05497, over 4799.00 frames.], tot_loss[loss=0.145, simple_loss=0.2166, pruned_loss=0.03671, over 972360.12 frames.], batch size: 21, lr: 2.96e-04 2022-05-05 20:07:53,905 INFO [train.py:715] (0/8) Epoch 7, batch 11000, loss[loss=0.1083, simple_loss=0.1776, pruned_loss=0.01948, 
over 4864.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2171, pruned_loss=0.03653, over 971808.58 frames.], batch size: 12, lr: 2.96e-04 2022-05-05 20:08:32,744 INFO [train.py:715] (0/8) Epoch 7, batch 11050, loss[loss=0.1523, simple_loss=0.22, pruned_loss=0.04228, over 4865.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2178, pruned_loss=0.0372, over 971482.99 frames.], batch size: 30, lr: 2.96e-04 2022-05-05 20:09:11,468 INFO [train.py:715] (0/8) Epoch 7, batch 11100, loss[loss=0.1493, simple_loss=0.2239, pruned_loss=0.03728, over 4833.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2163, pruned_loss=0.0365, over 971953.18 frames.], batch size: 13, lr: 2.96e-04 2022-05-05 20:09:50,079 INFO [train.py:715] (0/8) Epoch 7, batch 11150, loss[loss=0.1386, simple_loss=0.2177, pruned_loss=0.02974, over 4741.00 frames.], tot_loss[loss=0.145, simple_loss=0.2166, pruned_loss=0.03666, over 971525.72 frames.], batch size: 16, lr: 2.96e-04 2022-05-05 20:10:29,707 INFO [train.py:715] (0/8) Epoch 7, batch 11200, loss[loss=0.1377, simple_loss=0.2128, pruned_loss=0.03132, over 4900.00 frames.], tot_loss[loss=0.1466, simple_loss=0.218, pruned_loss=0.03766, over 972215.80 frames.], batch size: 19, lr: 2.96e-04 2022-05-05 20:11:08,076 INFO [train.py:715] (0/8) Epoch 7, batch 11250, loss[loss=0.1276, simple_loss=0.1985, pruned_loss=0.02836, over 4967.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2179, pruned_loss=0.03792, over 971799.54 frames.], batch size: 14, lr: 2.96e-04 2022-05-05 20:11:46,236 INFO [train.py:715] (0/8) Epoch 7, batch 11300, loss[loss=0.1225, simple_loss=0.196, pruned_loss=0.02451, over 4988.00 frames.], tot_loss[loss=0.1462, simple_loss=0.2174, pruned_loss=0.03751, over 972551.70 frames.], batch size: 28, lr: 2.96e-04 2022-05-05 20:12:25,977 INFO [train.py:715] (0/8) Epoch 7, batch 11350, loss[loss=0.136, simple_loss=0.2054, pruned_loss=0.03323, over 4977.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2169, pruned_loss=0.03678, over 972548.54 frames.], batch size: 25, lr: 2.96e-04 2022-05-05 20:13:04,520 INFO [train.py:715] (0/8) Epoch 7, batch 11400, loss[loss=0.1574, simple_loss=0.2259, pruned_loss=0.04443, over 4977.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2167, pruned_loss=0.03677, over 971838.36 frames.], batch size: 28, lr: 2.96e-04 2022-05-05 20:13:43,551 INFO [train.py:715] (0/8) Epoch 7, batch 11450, loss[loss=0.1264, simple_loss=0.2134, pruned_loss=0.01969, over 4988.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2174, pruned_loss=0.03688, over 972379.97 frames.], batch size: 28, lr: 2.96e-04 2022-05-05 20:14:22,144 INFO [train.py:715] (0/8) Epoch 7, batch 11500, loss[loss=0.1583, simple_loss=0.2126, pruned_loss=0.05198, over 4687.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2174, pruned_loss=0.0369, over 972072.39 frames.], batch size: 15, lr: 2.96e-04 2022-05-05 20:15:01,727 INFO [train.py:715] (0/8) Epoch 7, batch 11550, loss[loss=0.1624, simple_loss=0.2326, pruned_loss=0.04607, over 4851.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2175, pruned_loss=0.0371, over 972141.05 frames.], batch size: 15, lr: 2.96e-04 2022-05-05 20:15:39,997 INFO [train.py:715] (0/8) Epoch 7, batch 11600, loss[loss=0.1484, simple_loss=0.2129, pruned_loss=0.04196, over 4926.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2171, pruned_loss=0.03682, over 972746.98 frames.], batch size: 18, lr: 2.96e-04 2022-05-05 20:16:18,806 INFO [train.py:715] (0/8) Epoch 7, batch 11650, loss[loss=0.1344, simple_loss=0.2023, pruned_loss=0.03324, over 4853.00 frames.], 
tot_loss[loss=0.1441, simple_loss=0.2159, pruned_loss=0.03612, over 972171.26 frames.], batch size: 20, lr: 2.96e-04 2022-05-05 20:16:58,201 INFO [train.py:715] (0/8) Epoch 7, batch 11700, loss[loss=0.1477, simple_loss=0.2189, pruned_loss=0.03821, over 4813.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2156, pruned_loss=0.03614, over 972021.63 frames.], batch size: 13, lr: 2.96e-04 2022-05-05 20:17:36,278 INFO [train.py:715] (0/8) Epoch 7, batch 11750, loss[loss=0.1412, simple_loss=0.2225, pruned_loss=0.02995, over 4954.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2152, pruned_loss=0.03578, over 971600.58 frames.], batch size: 21, lr: 2.96e-04 2022-05-05 20:18:15,074 INFO [train.py:715] (0/8) Epoch 7, batch 11800, loss[loss=0.1572, simple_loss=0.235, pruned_loss=0.03964, over 4804.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2157, pruned_loss=0.03543, over 972238.23 frames.], batch size: 21, lr: 2.96e-04 2022-05-05 20:18:54,263 INFO [train.py:715] (0/8) Epoch 7, batch 11850, loss[loss=0.1309, simple_loss=0.209, pruned_loss=0.02641, over 4975.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2157, pruned_loss=0.03564, over 972550.65 frames.], batch size: 15, lr: 2.96e-04 2022-05-05 20:19:32,624 INFO [train.py:715] (0/8) Epoch 7, batch 11900, loss[loss=0.1306, simple_loss=0.2166, pruned_loss=0.02228, over 4973.00 frames.], tot_loss[loss=0.1445, simple_loss=0.2163, pruned_loss=0.03633, over 972513.99 frames.], batch size: 40, lr: 2.96e-04 2022-05-05 20:20:11,919 INFO [train.py:715] (0/8) Epoch 7, batch 11950, loss[loss=0.1502, simple_loss=0.2266, pruned_loss=0.03685, over 4983.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2167, pruned_loss=0.03646, over 973094.59 frames.], batch size: 25, lr: 2.96e-04 2022-05-05 20:20:50,617 INFO [train.py:715] (0/8) Epoch 7, batch 12000, loss[loss=0.1781, simple_loss=0.2459, pruned_loss=0.05515, over 4837.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2171, pruned_loss=0.03665, over 973232.14 frames.], batch size: 15, lr: 2.95e-04 2022-05-05 20:20:50,618 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 20:21:00,228 INFO [train.py:742] (0/8) Epoch 7, validation: loss=0.108, simple_loss=0.193, pruned_loss=0.01154, over 914524.00 frames. 
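Validation in this stretch runs every 3000 batches (batches 3000, 6000, 9000 and 12000 of epoch 7), always over the same 914524.00 held-out frames, while the learning rate drifts from about 3.02e-04 down to 2.95e-04. The validation loss is nearly flat over those roughly 9000 batches; the tiny script below just quantifies that, using the four values copied from the entries above.

# Validation losses copied from the four validation records in this section.
val_losses = [0.1084, 0.1085, 0.1085, 0.108]
rel_change = (val_losses[0] - val_losses[-1]) / val_losses[0]
print(f"relative improvement across this section: {rel_change:.2%}")  # about 0.37%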
2022-05-05 20:21:38,892 INFO [train.py:715] (0/8) Epoch 7, batch 12050, loss[loss=0.1353, simple_loss=0.2198, pruned_loss=0.02539, over 4989.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2171, pruned_loss=0.03709, over 973336.43 frames.], batch size: 28, lr: 2.95e-04 2022-05-05 20:22:18,259 INFO [train.py:715] (0/8) Epoch 7, batch 12100, loss[loss=0.1323, simple_loss=0.1988, pruned_loss=0.03291, over 4930.00 frames.], tot_loss[loss=0.146, simple_loss=0.2174, pruned_loss=0.03732, over 972998.98 frames.], batch size: 29, lr: 2.95e-04 2022-05-05 20:22:56,852 INFO [train.py:715] (0/8) Epoch 7, batch 12150, loss[loss=0.1623, simple_loss=0.228, pruned_loss=0.04832, over 4905.00 frames.], tot_loss[loss=0.1462, simple_loss=0.2179, pruned_loss=0.03727, over 972845.42 frames.], batch size: 19, lr: 2.95e-04 2022-05-05 20:23:35,614 INFO [train.py:715] (0/8) Epoch 7, batch 12200, loss[loss=0.1369, simple_loss=0.215, pruned_loss=0.02935, over 4783.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2176, pruned_loss=0.03693, over 972809.12 frames.], batch size: 18, lr: 2.95e-04 2022-05-05 20:24:14,742 INFO [train.py:715] (0/8) Epoch 7, batch 12250, loss[loss=0.1362, simple_loss=0.2058, pruned_loss=0.0333, over 4832.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2173, pruned_loss=0.03692, over 972746.21 frames.], batch size: 15, lr: 2.95e-04 2022-05-05 20:24:53,358 INFO [train.py:715] (0/8) Epoch 7, batch 12300, loss[loss=0.1776, simple_loss=0.2512, pruned_loss=0.05198, over 4968.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2173, pruned_loss=0.03712, over 973318.48 frames.], batch size: 24, lr: 2.95e-04 2022-05-05 20:25:18,624 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-256000.pt 2022-05-05 20:25:35,087 INFO [train.py:715] (0/8) Epoch 7, batch 12350, loss[loss=0.1424, simple_loss=0.2217, pruned_loss=0.03151, over 4933.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2174, pruned_loss=0.03704, over 972974.73 frames.], batch size: 21, lr: 2.95e-04 2022-05-05 20:26:13,787 INFO [train.py:715] (0/8) Epoch 7, batch 12400, loss[loss=0.1571, simple_loss=0.2322, pruned_loss=0.04101, over 4927.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2173, pruned_loss=0.03694, over 973103.21 frames.], batch size: 23, lr: 2.95e-04 2022-05-05 20:26:53,000 INFO [train.py:715] (0/8) Epoch 7, batch 12450, loss[loss=0.1597, simple_loss=0.2267, pruned_loss=0.04638, over 4959.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2175, pruned_loss=0.03709, over 973068.01 frames.], batch size: 24, lr: 2.95e-04 2022-05-05 20:27:31,399 INFO [train.py:715] (0/8) Epoch 7, batch 12500, loss[loss=0.1737, simple_loss=0.2382, pruned_loss=0.05462, over 4987.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2178, pruned_loss=0.03721, over 972846.84 frames.], batch size: 25, lr: 2.95e-04 2022-05-05 20:28:10,095 INFO [train.py:715] (0/8) Epoch 7, batch 12550, loss[loss=0.1434, simple_loss=0.2184, pruned_loss=0.03421, over 4907.00 frames.], tot_loss[loss=0.1466, simple_loss=0.2178, pruned_loss=0.0377, over 972257.11 frames.], batch size: 18, lr: 2.95e-04 2022-05-05 20:28:49,192 INFO [train.py:715] (0/8) Epoch 7, batch 12600, loss[loss=0.1426, simple_loss=0.2212, pruned_loss=0.03198, over 4757.00 frames.], tot_loss[loss=0.1468, simple_loss=0.218, pruned_loss=0.03787, over 972639.58 frames.], batch size: 19, lr: 2.95e-04 2022-05-05 20:29:27,373 INFO [train.py:715] (0/8) Epoch 7, batch 12650, loss[loss=0.1343, simple_loss=0.2129, pruned_loss=0.0278, over 4766.00 frames.], tot_loss[loss=0.1473, 
simple_loss=0.2187, pruned_loss=0.03793, over 971910.23 frames.], batch size: 19, lr: 2.95e-04 2022-05-05 20:30:06,575 INFO [train.py:715] (0/8) Epoch 7, batch 12700, loss[loss=0.1406, simple_loss=0.2044, pruned_loss=0.03839, over 4683.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2185, pruned_loss=0.03761, over 971395.20 frames.], batch size: 15, lr: 2.95e-04 2022-05-05 20:30:44,736 INFO [train.py:715] (0/8) Epoch 7, batch 12750, loss[loss=0.1415, simple_loss=0.2162, pruned_loss=0.03345, over 4930.00 frames.], tot_loss[loss=0.1474, simple_loss=0.219, pruned_loss=0.03789, over 972660.43 frames.], batch size: 23, lr: 2.95e-04 2022-05-05 20:31:23,965 INFO [train.py:715] (0/8) Epoch 7, batch 12800, loss[loss=0.1252, simple_loss=0.2021, pruned_loss=0.02417, over 4915.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2182, pruned_loss=0.03729, over 972718.49 frames.], batch size: 19, lr: 2.95e-04 2022-05-05 20:32:02,913 INFO [train.py:715] (0/8) Epoch 7, batch 12850, loss[loss=0.135, simple_loss=0.2016, pruned_loss=0.03416, over 4966.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2167, pruned_loss=0.03647, over 972829.20 frames.], batch size: 35, lr: 2.95e-04 2022-05-05 20:32:41,508 INFO [train.py:715] (0/8) Epoch 7, batch 12900, loss[loss=0.1875, simple_loss=0.2539, pruned_loss=0.0606, over 4890.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2174, pruned_loss=0.03687, over 972699.30 frames.], batch size: 22, lr: 2.95e-04 2022-05-05 20:33:20,982 INFO [train.py:715] (0/8) Epoch 7, batch 12950, loss[loss=0.1384, simple_loss=0.2122, pruned_loss=0.03229, over 4877.00 frames.], tot_loss[loss=0.1466, simple_loss=0.2184, pruned_loss=0.03739, over 973141.10 frames.], batch size: 20, lr: 2.95e-04 2022-05-05 20:33:59,927 INFO [train.py:715] (0/8) Epoch 7, batch 13000, loss[loss=0.1414, simple_loss=0.2166, pruned_loss=0.03315, over 4963.00 frames.], tot_loss[loss=0.147, simple_loss=0.2185, pruned_loss=0.03775, over 972841.93 frames.], batch size: 21, lr: 2.95e-04 2022-05-05 20:34:38,875 INFO [train.py:715] (0/8) Epoch 7, batch 13050, loss[loss=0.121, simple_loss=0.1969, pruned_loss=0.02252, over 4987.00 frames.], tot_loss[loss=0.146, simple_loss=0.2177, pruned_loss=0.03716, over 973319.41 frames.], batch size: 25, lr: 2.95e-04 2022-05-05 20:35:17,654 INFO [train.py:715] (0/8) Epoch 7, batch 13100, loss[loss=0.1366, simple_loss=0.2064, pruned_loss=0.03342, over 4824.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2179, pruned_loss=0.03735, over 972879.83 frames.], batch size: 26, lr: 2.95e-04 2022-05-05 20:35:57,323 INFO [train.py:715] (0/8) Epoch 7, batch 13150, loss[loss=0.1499, simple_loss=0.2301, pruned_loss=0.03484, over 4704.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2186, pruned_loss=0.03761, over 972806.81 frames.], batch size: 15, lr: 2.95e-04 2022-05-05 20:36:35,849 INFO [train.py:715] (0/8) Epoch 7, batch 13200, loss[loss=0.131, simple_loss=0.2069, pruned_loss=0.02758, over 4769.00 frames.], tot_loss[loss=0.1474, simple_loss=0.2188, pruned_loss=0.03796, over 972635.93 frames.], batch size: 17, lr: 2.95e-04 2022-05-05 20:37:15,489 INFO [train.py:715] (0/8) Epoch 7, batch 13250, loss[loss=0.1583, simple_loss=0.2221, pruned_loss=0.04728, over 4937.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2184, pruned_loss=0.03758, over 973151.17 frames.], batch size: 23, lr: 2.95e-04 2022-05-05 20:37:54,873 INFO [train.py:715] (0/8) Epoch 7, batch 13300, loss[loss=0.1505, simple_loss=0.2207, pruned_loss=0.04012, over 4913.00 frames.], tot_loss[loss=0.1465, simple_loss=0.218, 
pruned_loss=0.03745, over 974434.98 frames.], batch size: 18, lr: 2.95e-04 2022-05-05 20:38:33,803 INFO [train.py:715] (0/8) Epoch 7, batch 13350, loss[loss=0.1326, simple_loss=0.2061, pruned_loss=0.02955, over 4768.00 frames.], tot_loss[loss=0.1466, simple_loss=0.218, pruned_loss=0.03757, over 973365.35 frames.], batch size: 18, lr: 2.95e-04 2022-05-05 20:39:12,809 INFO [train.py:715] (0/8) Epoch 7, batch 13400, loss[loss=0.1229, simple_loss=0.1973, pruned_loss=0.02431, over 4965.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2173, pruned_loss=0.03715, over 973185.02 frames.], batch size: 15, lr: 2.95e-04 2022-05-05 20:39:51,467 INFO [train.py:715] (0/8) Epoch 7, batch 13450, loss[loss=0.1568, simple_loss=0.234, pruned_loss=0.03986, over 4756.00 frames.], tot_loss[loss=0.146, simple_loss=0.2173, pruned_loss=0.0373, over 973196.90 frames.], batch size: 16, lr: 2.95e-04 2022-05-05 20:40:30,900 INFO [train.py:715] (0/8) Epoch 7, batch 13500, loss[loss=0.1245, simple_loss=0.1949, pruned_loss=0.02706, over 4981.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2174, pruned_loss=0.03739, over 972564.47 frames.], batch size: 28, lr: 2.95e-04 2022-05-05 20:41:09,545 INFO [train.py:715] (0/8) Epoch 7, batch 13550, loss[loss=0.1411, simple_loss=0.2104, pruned_loss=0.03594, over 4942.00 frames.], tot_loss[loss=0.146, simple_loss=0.217, pruned_loss=0.03748, over 972636.73 frames.], batch size: 39, lr: 2.95e-04 2022-05-05 20:41:48,021 INFO [train.py:715] (0/8) Epoch 7, batch 13600, loss[loss=0.131, simple_loss=0.2, pruned_loss=0.031, over 4814.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2183, pruned_loss=0.03775, over 973391.00 frames.], batch size: 13, lr: 2.95e-04 2022-05-05 20:42:26,943 INFO [train.py:715] (0/8) Epoch 7, batch 13650, loss[loss=0.117, simple_loss=0.1995, pruned_loss=0.01725, over 4988.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2177, pruned_loss=0.03705, over 973085.66 frames.], batch size: 27, lr: 2.95e-04 2022-05-05 20:43:05,963 INFO [train.py:715] (0/8) Epoch 7, batch 13700, loss[loss=0.1419, simple_loss=0.2213, pruned_loss=0.03121, over 4800.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2179, pruned_loss=0.03685, over 972691.56 frames.], batch size: 24, lr: 2.95e-04 2022-05-05 20:43:44,941 INFO [train.py:715] (0/8) Epoch 7, batch 13750, loss[loss=0.1714, simple_loss=0.2396, pruned_loss=0.0516, over 4988.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2179, pruned_loss=0.03734, over 972432.96 frames.], batch size: 28, lr: 2.94e-04 2022-05-05 20:44:23,920 INFO [train.py:715] (0/8) Epoch 7, batch 13800, loss[loss=0.1446, simple_loss=0.2106, pruned_loss=0.03926, over 4845.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2174, pruned_loss=0.03696, over 971723.64 frames.], batch size: 30, lr: 2.94e-04 2022-05-05 20:45:03,232 INFO [train.py:715] (0/8) Epoch 7, batch 13850, loss[loss=0.1158, simple_loss=0.1856, pruned_loss=0.02297, over 4887.00 frames.], tot_loss[loss=0.1451, simple_loss=0.217, pruned_loss=0.03661, over 971254.24 frames.], batch size: 22, lr: 2.94e-04 2022-05-05 20:45:41,498 INFO [train.py:715] (0/8) Epoch 7, batch 13900, loss[loss=0.1474, simple_loss=0.2293, pruned_loss=0.03276, over 4946.00 frames.], tot_loss[loss=0.1455, simple_loss=0.217, pruned_loss=0.03704, over 971196.05 frames.], batch size: 21, lr: 2.94e-04 2022-05-05 20:46:20,515 INFO [train.py:715] (0/8) Epoch 7, batch 13950, loss[loss=0.1662, simple_loss=0.2363, pruned_loss=0.04805, over 4959.00 frames.], tot_loss[loss=0.145, simple_loss=0.2164, pruned_loss=0.03681, over 971027.81 frames.], 
batch size: 24, lr: 2.94e-04 2022-05-05 20:46:59,558 INFO [train.py:715] (0/8) Epoch 7, batch 14000, loss[loss=0.1522, simple_loss=0.224, pruned_loss=0.04022, over 4946.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2171, pruned_loss=0.03655, over 971404.50 frames.], batch size: 29, lr: 2.94e-04 2022-05-05 20:47:38,922 INFO [train.py:715] (0/8) Epoch 7, batch 14050, loss[loss=0.1177, simple_loss=0.1857, pruned_loss=0.0248, over 4990.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2163, pruned_loss=0.03567, over 971681.55 frames.], batch size: 14, lr: 2.94e-04 2022-05-05 20:48:18,048 INFO [train.py:715] (0/8) Epoch 7, batch 14100, loss[loss=0.1639, simple_loss=0.2409, pruned_loss=0.04345, over 4846.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2172, pruned_loss=0.03621, over 972751.60 frames.], batch size: 30, lr: 2.94e-04 2022-05-05 20:48:56,861 INFO [train.py:715] (0/8) Epoch 7, batch 14150, loss[loss=0.1668, simple_loss=0.2418, pruned_loss=0.04588, over 4826.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2173, pruned_loss=0.03669, over 972491.83 frames.], batch size: 26, lr: 2.94e-04 2022-05-05 20:49:36,148 INFO [train.py:715] (0/8) Epoch 7, batch 14200, loss[loss=0.1303, simple_loss=0.2069, pruned_loss=0.02689, over 4972.00 frames.], tot_loss[loss=0.145, simple_loss=0.2168, pruned_loss=0.03665, over 972209.60 frames.], batch size: 24, lr: 2.94e-04 2022-05-05 20:50:14,405 INFO [train.py:715] (0/8) Epoch 7, batch 14250, loss[loss=0.1463, simple_loss=0.2197, pruned_loss=0.03639, over 4865.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2166, pruned_loss=0.03688, over 972106.90 frames.], batch size: 20, lr: 2.94e-04 2022-05-05 20:50:53,727 INFO [train.py:715] (0/8) Epoch 7, batch 14300, loss[loss=0.1499, simple_loss=0.2177, pruned_loss=0.04105, over 4985.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2166, pruned_loss=0.03683, over 971897.60 frames.], batch size: 28, lr: 2.94e-04 2022-05-05 20:51:33,004 INFO [train.py:715] (0/8) Epoch 7, batch 14350, loss[loss=0.1244, simple_loss=0.2003, pruned_loss=0.02422, over 4951.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2168, pruned_loss=0.03716, over 971751.39 frames.], batch size: 21, lr: 2.94e-04 2022-05-05 20:52:12,025 INFO [train.py:715] (0/8) Epoch 7, batch 14400, loss[loss=0.1483, simple_loss=0.2181, pruned_loss=0.03926, over 4889.00 frames.], tot_loss[loss=0.1462, simple_loss=0.2177, pruned_loss=0.03741, over 972240.52 frames.], batch size: 19, lr: 2.94e-04 2022-05-05 20:52:50,726 INFO [train.py:715] (0/8) Epoch 7, batch 14450, loss[loss=0.1406, simple_loss=0.2077, pruned_loss=0.03681, over 4898.00 frames.], tot_loss[loss=0.1463, simple_loss=0.218, pruned_loss=0.03736, over 972464.28 frames.], batch size: 17, lr: 2.94e-04 2022-05-05 20:53:29,523 INFO [train.py:715] (0/8) Epoch 7, batch 14500, loss[loss=0.1456, simple_loss=0.2207, pruned_loss=0.0352, over 4963.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2181, pruned_loss=0.03729, over 972550.35 frames.], batch size: 24, lr: 2.94e-04 2022-05-05 20:54:09,104 INFO [train.py:715] (0/8) Epoch 7, batch 14550, loss[loss=0.1346, simple_loss=0.2085, pruned_loss=0.03041, over 4774.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2184, pruned_loss=0.03724, over 971882.48 frames.], batch size: 12, lr: 2.94e-04 2022-05-05 20:54:47,911 INFO [train.py:715] (0/8) Epoch 7, batch 14600, loss[loss=0.1059, simple_loss=0.1791, pruned_loss=0.01628, over 4938.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2184, pruned_loss=0.03733, over 972344.22 frames.], batch size: 21, lr: 2.94e-04 
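The [train.py:715] entries above each report three components, loss, simple_loss, and pruned_loss, for both the current batch and the running total. The logged numbers are consistent with the total being a fixed 0.5-weighted combination of the other two: for the batch-14600 entry just above, 0.5 * 0.2184 + 0.03733 ≈ 0.1465. The check below is a minimal sketch of that relationship; the 0.5 weight is inferred from the logged values rather than taken from the training code.

# Sketch: verify loss ≈ 0.5 * simple_loss + pruned_loss for two entries copied
# from the batch-14600 line above. The 0.5 weight is an inferred assumption.
entries = [
    # (loss, simple_loss, pruned_loss)
    (0.1059, 0.1791, 0.01628),  # per-batch values at batch 14600
    (0.1465, 0.2184, 0.03733),  # running tot_loss values at batch 14600
]
for loss, simple_loss, pruned_loss in entries:
    reconstructed = 0.5 * simple_loss + pruned_loss
    assert abs(reconstructed - loss) < 5e-4, (reconstructed, loss)

The same identity holds for the validation lines further down in this section, so it appears to describe how the reported loss is assembled throughout the run.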
2022-05-05 20:55:26,848 INFO [train.py:715] (0/8) Epoch 7, batch 14650, loss[loss=0.1605, simple_loss=0.2434, pruned_loss=0.0388, over 4801.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2176, pruned_loss=0.03696, over 971662.17 frames.], batch size: 24, lr: 2.94e-04 2022-05-05 20:56:05,807 INFO [train.py:715] (0/8) Epoch 7, batch 14700, loss[loss=0.1579, simple_loss=0.2293, pruned_loss=0.04321, over 4826.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2169, pruned_loss=0.0367, over 971090.81 frames.], batch size: 26, lr: 2.94e-04 2022-05-05 20:56:44,941 INFO [train.py:715] (0/8) Epoch 7, batch 14750, loss[loss=0.1303, simple_loss=0.2004, pruned_loss=0.03011, over 4766.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2176, pruned_loss=0.03725, over 970942.55 frames.], batch size: 14, lr: 2.94e-04 2022-05-05 20:57:23,492 INFO [train.py:715] (0/8) Epoch 7, batch 14800, loss[loss=0.1647, simple_loss=0.2316, pruned_loss=0.04887, over 4761.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2174, pruned_loss=0.03764, over 971032.69 frames.], batch size: 19, lr: 2.94e-04 2022-05-05 20:58:03,008 INFO [train.py:715] (0/8) Epoch 7, batch 14850, loss[loss=0.2001, simple_loss=0.2767, pruned_loss=0.06173, over 4979.00 frames.], tot_loss[loss=0.1472, simple_loss=0.2187, pruned_loss=0.0379, over 971708.16 frames.], batch size: 26, lr: 2.94e-04 2022-05-05 20:58:41,950 INFO [train.py:715] (0/8) Epoch 7, batch 14900, loss[loss=0.1123, simple_loss=0.1768, pruned_loss=0.02395, over 4755.00 frames.], tot_loss[loss=0.147, simple_loss=0.2183, pruned_loss=0.03783, over 972098.76 frames.], batch size: 12, lr: 2.94e-04 2022-05-05 20:59:20,314 INFO [train.py:715] (0/8) Epoch 7, batch 14950, loss[loss=0.1808, simple_loss=0.2473, pruned_loss=0.05719, over 4933.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2187, pruned_loss=0.03811, over 972287.12 frames.], batch size: 35, lr: 2.94e-04 2022-05-05 20:59:59,924 INFO [train.py:715] (0/8) Epoch 7, batch 15000, loss[loss=0.1751, simple_loss=0.2411, pruned_loss=0.05457, over 4988.00 frames.], tot_loss[loss=0.1471, simple_loss=0.2183, pruned_loss=0.03793, over 971822.29 frames.], batch size: 25, lr: 2.94e-04 2022-05-05 20:59:59,926 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 21:00:14,355 INFO [train.py:742] (0/8) Epoch 7, validation: loss=0.1083, simple_loss=0.1931, pruned_loss=0.01175, over 914524.00 frames. 
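Validation runs appear on a fixed schedule: the pass above fires at batch 15000, and the later ones in this section land at batches 18000, 21000, and 24000, i.e. every 3000 training batches. Each validation line also reports the same "over 914524.00 frames.", which suggests a full pass over a fixed dev set with a frame-weighted average rather than a sampled subset. The helpers below are a minimal sketch of that pattern; the names (should_validate, frame_weighted_average, the 3000-batch default interval) are illustrative assumptions, not symbols taken from train.py.

from typing import Iterable, Tuple


def should_validate(batch_idx_train: int, valid_interval: int = 3000) -> bool:
    # Fire a validation pass every `valid_interval` global training batches.
    return batch_idx_train > 0 and batch_idx_train % valid_interval == 0


def frame_weighted_average(batch_stats: Iterable[Tuple[float, float]]) -> float:
    # batch_stats yields (loss summed over frames, number of frames) per batch;
    # the result is the dev-set loss "over N frames" printed in the log.
    total_loss = 0.0
    total_frames = 0.0
    for loss_sum, num_frames in batch_stats:
        total_loss += loss_sum
        total_frames += num_frames
    return total_loss / total_frames

A frame-weighted average keeps long and short cuts from counting equally, which is why the denominator in the validation line is a frame count (914524.00) rather than a batch or utterance count.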
2022-05-05 21:00:53,495 INFO [train.py:715] (0/8) Epoch 7, batch 15050, loss[loss=0.1293, simple_loss=0.2138, pruned_loss=0.02243, over 4800.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2188, pruned_loss=0.03812, over 971398.07 frames.], batch size: 24, lr: 2.94e-04 2022-05-05 21:01:32,726 INFO [train.py:715] (0/8) Epoch 7, batch 15100, loss[loss=0.1317, simple_loss=0.2126, pruned_loss=0.02543, over 4827.00 frames.], tot_loss[loss=0.1475, simple_loss=0.219, pruned_loss=0.03804, over 970399.20 frames.], batch size: 13, lr: 2.94e-04 2022-05-05 21:02:11,973 INFO [train.py:715] (0/8) Epoch 7, batch 15150, loss[loss=0.123, simple_loss=0.1928, pruned_loss=0.02658, over 4854.00 frames.], tot_loss[loss=0.1481, simple_loss=0.2197, pruned_loss=0.03822, over 970996.72 frames.], batch size: 13, lr: 2.94e-04 2022-05-05 21:02:50,721 INFO [train.py:715] (0/8) Epoch 7, batch 15200, loss[loss=0.1579, simple_loss=0.2213, pruned_loss=0.04729, over 4965.00 frames.], tot_loss[loss=0.148, simple_loss=0.2195, pruned_loss=0.03824, over 971937.78 frames.], batch size: 35, lr: 2.94e-04 2022-05-05 21:03:30,194 INFO [train.py:715] (0/8) Epoch 7, batch 15250, loss[loss=0.1317, simple_loss=0.2072, pruned_loss=0.02813, over 4816.00 frames.], tot_loss[loss=0.1477, simple_loss=0.2192, pruned_loss=0.03813, over 972344.17 frames.], batch size: 26, lr: 2.94e-04 2022-05-05 21:04:09,389 INFO [train.py:715] (0/8) Epoch 7, batch 15300, loss[loss=0.177, simple_loss=0.2428, pruned_loss=0.0556, over 4979.00 frames.], tot_loss[loss=0.1474, simple_loss=0.2191, pruned_loss=0.03779, over 972768.96 frames.], batch size: 31, lr: 2.94e-04 2022-05-05 21:04:48,394 INFO [train.py:715] (0/8) Epoch 7, batch 15350, loss[loss=0.205, simple_loss=0.2784, pruned_loss=0.06584, over 4854.00 frames.], tot_loss[loss=0.1481, simple_loss=0.2198, pruned_loss=0.03819, over 973176.19 frames.], batch size: 32, lr: 2.94e-04 2022-05-05 21:05:27,506 INFO [train.py:715] (0/8) Epoch 7, batch 15400, loss[loss=0.1416, simple_loss=0.2089, pruned_loss=0.03715, over 4705.00 frames.], tot_loss[loss=0.1472, simple_loss=0.219, pruned_loss=0.03769, over 973070.13 frames.], batch size: 15, lr: 2.94e-04 2022-05-05 21:06:05,997 INFO [train.py:715] (0/8) Epoch 7, batch 15450, loss[loss=0.1367, simple_loss=0.2171, pruned_loss=0.02815, over 4828.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2179, pruned_loss=0.03691, over 973273.97 frames.], batch size: 26, lr: 2.94e-04 2022-05-05 21:06:45,042 INFO [train.py:715] (0/8) Epoch 7, batch 15500, loss[loss=0.1233, simple_loss=0.1958, pruned_loss=0.02537, over 4962.00 frames.], tot_loss[loss=0.1458, simple_loss=0.218, pruned_loss=0.03679, over 973401.00 frames.], batch size: 24, lr: 2.93e-04 2022-05-05 21:07:23,161 INFO [train.py:715] (0/8) Epoch 7, batch 15550, loss[loss=0.142, simple_loss=0.2164, pruned_loss=0.03385, over 4841.00 frames.], tot_loss[loss=0.146, simple_loss=0.2181, pruned_loss=0.03691, over 973099.86 frames.], batch size: 30, lr: 2.93e-04 2022-05-05 21:08:02,566 INFO [train.py:715] (0/8) Epoch 7, batch 15600, loss[loss=0.1558, simple_loss=0.2244, pruned_loss=0.04362, over 4909.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2185, pruned_loss=0.0369, over 973380.74 frames.], batch size: 18, lr: 2.93e-04 2022-05-05 21:08:42,117 INFO [train.py:715] (0/8) Epoch 7, batch 15650, loss[loss=0.1363, simple_loss=0.2093, pruned_loss=0.0317, over 4957.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2174, pruned_loss=0.0367, over 972315.25 frames.], batch size: 15, lr: 2.93e-04 2022-05-05 21:09:20,364 INFO [train.py:715] 
(0/8) Epoch 7, batch 15700, loss[loss=0.1365, simple_loss=0.211, pruned_loss=0.03097, over 4868.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2171, pruned_loss=0.03619, over 972121.95 frames.], batch size: 32, lr: 2.93e-04 2022-05-05 21:09:59,355 INFO [train.py:715] (0/8) Epoch 7, batch 15750, loss[loss=0.1391, simple_loss=0.2102, pruned_loss=0.03398, over 4875.00 frames.], tot_loss[loss=0.1442, simple_loss=0.2164, pruned_loss=0.03597, over 971598.35 frames.], batch size: 22, lr: 2.93e-04 2022-05-05 21:10:39,019 INFO [train.py:715] (0/8) Epoch 7, batch 15800, loss[loss=0.1302, simple_loss=0.2087, pruned_loss=0.02588, over 4806.00 frames.], tot_loss[loss=0.1449, simple_loss=0.2169, pruned_loss=0.03641, over 971648.20 frames.], batch size: 25, lr: 2.93e-04 2022-05-05 21:11:18,129 INFO [train.py:715] (0/8) Epoch 7, batch 15850, loss[loss=0.152, simple_loss=0.2336, pruned_loss=0.03518, over 4919.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2171, pruned_loss=0.03649, over 972200.75 frames.], batch size: 18, lr: 2.93e-04 2022-05-05 21:11:57,173 INFO [train.py:715] (0/8) Epoch 7, batch 15900, loss[loss=0.1302, simple_loss=0.2008, pruned_loss=0.02978, over 4911.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2175, pruned_loss=0.03693, over 971446.87 frames.], batch size: 18, lr: 2.93e-04 2022-05-05 21:12:36,477 INFO [train.py:715] (0/8) Epoch 7, batch 15950, loss[loss=0.148, simple_loss=0.2117, pruned_loss=0.04218, over 4969.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2174, pruned_loss=0.03713, over 971453.37 frames.], batch size: 14, lr: 2.93e-04 2022-05-05 21:13:15,929 INFO [train.py:715] (0/8) Epoch 7, batch 16000, loss[loss=0.1609, simple_loss=0.2248, pruned_loss=0.04849, over 4822.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2168, pruned_loss=0.03704, over 972084.30 frames.], batch size: 13, lr: 2.93e-04 2022-05-05 21:13:54,025 INFO [train.py:715] (0/8) Epoch 7, batch 16050, loss[loss=0.1719, simple_loss=0.2389, pruned_loss=0.05251, over 4963.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2166, pruned_loss=0.03695, over 971935.03 frames.], batch size: 15, lr: 2.93e-04 2022-05-05 21:14:33,353 INFO [train.py:715] (0/8) Epoch 7, batch 16100, loss[loss=0.1488, simple_loss=0.2099, pruned_loss=0.04386, over 4956.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2161, pruned_loss=0.03635, over 973061.81 frames.], batch size: 15, lr: 2.93e-04 2022-05-05 21:15:12,279 INFO [train.py:715] (0/8) Epoch 7, batch 16150, loss[loss=0.1378, simple_loss=0.1997, pruned_loss=0.03793, over 4908.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2165, pruned_loss=0.03637, over 972981.88 frames.], batch size: 19, lr: 2.93e-04 2022-05-05 21:15:50,928 INFO [train.py:715] (0/8) Epoch 7, batch 16200, loss[loss=0.1394, simple_loss=0.2134, pruned_loss=0.03273, over 4967.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2173, pruned_loss=0.03695, over 973753.98 frames.], batch size: 24, lr: 2.93e-04 2022-05-05 21:16:30,077 INFO [train.py:715] (0/8) Epoch 7, batch 16250, loss[loss=0.1666, simple_loss=0.2369, pruned_loss=0.04815, over 4909.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2175, pruned_loss=0.03717, over 974227.24 frames.], batch size: 17, lr: 2.93e-04 2022-05-05 21:17:08,722 INFO [train.py:715] (0/8) Epoch 7, batch 16300, loss[loss=0.1329, simple_loss=0.2007, pruned_loss=0.03256, over 4774.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2167, pruned_loss=0.03689, over 974489.65 frames.], batch size: 12, lr: 2.93e-04 2022-05-05 21:17:48,270 INFO [train.py:715] (0/8) Epoch 7, batch 16350, 
loss[loss=0.1757, simple_loss=0.2427, pruned_loss=0.05431, over 4776.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2172, pruned_loss=0.03706, over 973457.47 frames.], batch size: 17, lr: 2.93e-04 2022-05-05 21:18:26,606 INFO [train.py:715] (0/8) Epoch 7, batch 16400, loss[loss=0.1326, simple_loss=0.1959, pruned_loss=0.03469, over 4798.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2175, pruned_loss=0.03706, over 972101.55 frames.], batch size: 24, lr: 2.93e-04 2022-05-05 21:19:05,502 INFO [train.py:715] (0/8) Epoch 7, batch 16450, loss[loss=0.1537, simple_loss=0.2257, pruned_loss=0.0409, over 4848.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2174, pruned_loss=0.03697, over 971902.17 frames.], batch size: 30, lr: 2.93e-04 2022-05-05 21:19:44,555 INFO [train.py:715] (0/8) Epoch 7, batch 16500, loss[loss=0.1692, simple_loss=0.2384, pruned_loss=0.05001, over 4897.00 frames.], tot_loss[loss=0.1458, simple_loss=0.218, pruned_loss=0.03686, over 972367.78 frames.], batch size: 17, lr: 2.93e-04 2022-05-05 21:20:22,825 INFO [train.py:715] (0/8) Epoch 7, batch 16550, loss[loss=0.1433, simple_loss=0.2227, pruned_loss=0.03196, over 4767.00 frames.], tot_loss[loss=0.146, simple_loss=0.2183, pruned_loss=0.0368, over 971428.04 frames.], batch size: 19, lr: 2.93e-04 2022-05-05 21:21:02,228 INFO [train.py:715] (0/8) Epoch 7, batch 16600, loss[loss=0.1411, simple_loss=0.1998, pruned_loss=0.04123, over 4947.00 frames.], tot_loss[loss=0.1456, simple_loss=0.218, pruned_loss=0.03659, over 971456.51 frames.], batch size: 24, lr: 2.93e-04 2022-05-05 21:21:41,394 INFO [train.py:715] (0/8) Epoch 7, batch 16650, loss[loss=0.129, simple_loss=0.2078, pruned_loss=0.02513, over 4989.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2183, pruned_loss=0.03716, over 971443.89 frames.], batch size: 28, lr: 2.93e-04 2022-05-05 21:22:20,541 INFO [train.py:715] (0/8) Epoch 7, batch 16700, loss[loss=0.1458, simple_loss=0.2192, pruned_loss=0.03624, over 4802.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2191, pruned_loss=0.03733, over 971978.05 frames.], batch size: 21, lr: 2.93e-04 2022-05-05 21:22:59,810 INFO [train.py:715] (0/8) Epoch 7, batch 16750, loss[loss=0.136, simple_loss=0.2115, pruned_loss=0.03022, over 4925.00 frames.], tot_loss[loss=0.1477, simple_loss=0.2197, pruned_loss=0.0378, over 971857.20 frames.], batch size: 23, lr: 2.93e-04 2022-05-05 21:23:38,666 INFO [train.py:715] (0/8) Epoch 7, batch 16800, loss[loss=0.1219, simple_loss=0.192, pruned_loss=0.02587, over 4986.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2191, pruned_loss=0.03737, over 972348.05 frames.], batch size: 35, lr: 2.93e-04 2022-05-05 21:24:17,710 INFO [train.py:715] (0/8) Epoch 7, batch 16850, loss[loss=0.1558, simple_loss=0.2196, pruned_loss=0.04603, over 4781.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2184, pruned_loss=0.03708, over 972190.75 frames.], batch size: 14, lr: 2.93e-04 2022-05-05 21:24:56,996 INFO [train.py:715] (0/8) Epoch 7, batch 16900, loss[loss=0.1822, simple_loss=0.2547, pruned_loss=0.05483, over 4801.00 frames.], tot_loss[loss=0.1458, simple_loss=0.218, pruned_loss=0.03683, over 971174.98 frames.], batch size: 24, lr: 2.93e-04 2022-05-05 21:25:36,245 INFO [train.py:715] (0/8) Epoch 7, batch 16950, loss[loss=0.1816, simple_loss=0.2584, pruned_loss=0.05244, over 4769.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2172, pruned_loss=0.03668, over 971759.55 frames.], batch size: 14, lr: 2.93e-04 2022-05-05 21:26:14,895 INFO [train.py:715] (0/8) Epoch 7, batch 17000, loss[loss=0.1515, simple_loss=0.2265, 
pruned_loss=0.03824, over 4875.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2184, pruned_loss=0.03724, over 971679.42 frames.], batch size: 30, lr: 2.93e-04 2022-05-05 21:26:54,051 INFO [train.py:715] (0/8) Epoch 7, batch 17050, loss[loss=0.1225, simple_loss=0.1862, pruned_loss=0.02938, over 4798.00 frames.], tot_loss[loss=0.1466, simple_loss=0.2183, pruned_loss=0.03743, over 971613.11 frames.], batch size: 21, lr: 2.93e-04 2022-05-05 21:27:32,507 INFO [train.py:715] (0/8) Epoch 7, batch 17100, loss[loss=0.1122, simple_loss=0.1852, pruned_loss=0.01959, over 4917.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2173, pruned_loss=0.03678, over 971650.62 frames.], batch size: 35, lr: 2.93e-04 2022-05-05 21:28:11,643 INFO [train.py:715] (0/8) Epoch 7, batch 17150, loss[loss=0.1266, simple_loss=0.2049, pruned_loss=0.02414, over 4819.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2173, pruned_loss=0.03672, over 972646.13 frames.], batch size: 26, lr: 2.93e-04 2022-05-05 21:28:50,899 INFO [train.py:715] (0/8) Epoch 7, batch 17200, loss[loss=0.1476, simple_loss=0.2074, pruned_loss=0.04388, over 4985.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2174, pruned_loss=0.03698, over 972148.74 frames.], batch size: 28, lr: 2.93e-04 2022-05-05 21:29:29,220 INFO [train.py:715] (0/8) Epoch 7, batch 17250, loss[loss=0.1309, simple_loss=0.2172, pruned_loss=0.0223, over 4822.00 frames.], tot_loss[loss=0.1465, simple_loss=0.218, pruned_loss=0.03752, over 970940.13 frames.], batch size: 25, lr: 2.92e-04 2022-05-05 21:30:08,290 INFO [train.py:715] (0/8) Epoch 7, batch 17300, loss[loss=0.1525, simple_loss=0.2232, pruned_loss=0.0409, over 4843.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2184, pruned_loss=0.03748, over 971483.27 frames.], batch size: 15, lr: 2.92e-04 2022-05-05 21:30:46,573 INFO [train.py:715] (0/8) Epoch 7, batch 17350, loss[loss=0.201, simple_loss=0.2656, pruned_loss=0.06825, over 4908.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2185, pruned_loss=0.0375, over 972495.00 frames.], batch size: 19, lr: 2.92e-04 2022-05-05 21:31:25,648 INFO [train.py:715] (0/8) Epoch 7, batch 17400, loss[loss=0.1195, simple_loss=0.1943, pruned_loss=0.02231, over 4894.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2177, pruned_loss=0.03702, over 972412.96 frames.], batch size: 19, lr: 2.92e-04 2022-05-05 21:32:04,439 INFO [train.py:715] (0/8) Epoch 7, batch 17450, loss[loss=0.1297, simple_loss=0.2088, pruned_loss=0.02529, over 4924.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2171, pruned_loss=0.03677, over 972445.15 frames.], batch size: 18, lr: 2.92e-04 2022-05-05 21:32:43,215 INFO [train.py:715] (0/8) Epoch 7, batch 17500, loss[loss=0.1776, simple_loss=0.2427, pruned_loss=0.05624, over 4810.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2183, pruned_loss=0.03759, over 971875.31 frames.], batch size: 21, lr: 2.92e-04 2022-05-05 21:33:22,410 INFO [train.py:715] (0/8) Epoch 7, batch 17550, loss[loss=0.1399, simple_loss=0.2085, pruned_loss=0.03562, over 4907.00 frames.], tot_loss[loss=0.1463, simple_loss=0.218, pruned_loss=0.03734, over 971910.60 frames.], batch size: 19, lr: 2.92e-04 2022-05-05 21:34:00,738 INFO [train.py:715] (0/8) Epoch 7, batch 17600, loss[loss=0.1403, simple_loss=0.2099, pruned_loss=0.03531, over 4856.00 frames.], tot_loss[loss=0.146, simple_loss=0.2174, pruned_loss=0.03732, over 972381.78 frames.], batch size: 15, lr: 2.92e-04 2022-05-05 21:34:39,807 INFO [train.py:715] (0/8) Epoch 7, batch 17650, loss[loss=0.1318, simple_loss=0.2088, pruned_loss=0.02745, over 4834.00 
frames.], tot_loss[loss=0.1457, simple_loss=0.2173, pruned_loss=0.0371, over 972420.76 frames.], batch size: 13, lr: 2.92e-04 2022-05-05 21:35:19,108 INFO [train.py:715] (0/8) Epoch 7, batch 17700, loss[loss=0.1668, simple_loss=0.2399, pruned_loss=0.04686, over 4862.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2161, pruned_loss=0.03672, over 972224.43 frames.], batch size: 30, lr: 2.92e-04 2022-05-05 21:35:58,204 INFO [train.py:715] (0/8) Epoch 7, batch 17750, loss[loss=0.1725, simple_loss=0.2454, pruned_loss=0.04981, over 4971.00 frames.], tot_loss[loss=0.1442, simple_loss=0.2155, pruned_loss=0.03642, over 973185.97 frames.], batch size: 35, lr: 2.92e-04 2022-05-05 21:36:37,511 INFO [train.py:715] (0/8) Epoch 7, batch 17800, loss[loss=0.147, simple_loss=0.2204, pruned_loss=0.03675, over 4787.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2154, pruned_loss=0.0367, over 973094.34 frames.], batch size: 21, lr: 2.92e-04 2022-05-05 21:37:16,000 INFO [train.py:715] (0/8) Epoch 7, batch 17850, loss[loss=0.1386, simple_loss=0.2082, pruned_loss=0.03453, over 4882.00 frames.], tot_loss[loss=0.1449, simple_loss=0.216, pruned_loss=0.03685, over 973007.42 frames.], batch size: 19, lr: 2.92e-04 2022-05-05 21:37:55,611 INFO [train.py:715] (0/8) Epoch 7, batch 17900, loss[loss=0.139, simple_loss=0.2119, pruned_loss=0.03311, over 4687.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2165, pruned_loss=0.03694, over 973315.16 frames.], batch size: 15, lr: 2.92e-04 2022-05-05 21:38:34,074 INFO [train.py:715] (0/8) Epoch 7, batch 17950, loss[loss=0.1909, simple_loss=0.2434, pruned_loss=0.06924, over 4974.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2172, pruned_loss=0.0373, over 973117.79 frames.], batch size: 15, lr: 2.92e-04 2022-05-05 21:39:13,131 INFO [train.py:715] (0/8) Epoch 7, batch 18000, loss[loss=0.1357, simple_loss=0.2138, pruned_loss=0.02885, over 4750.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2175, pruned_loss=0.03756, over 972991.16 frames.], batch size: 19, lr: 2.92e-04 2022-05-05 21:39:13,132 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 21:39:22,794 INFO [train.py:742] (0/8) Epoch 7, validation: loss=0.1081, simple_loss=0.193, pruned_loss=0.01158, over 914524.00 frames. 
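The learning rate drifts down smoothly within the epoch, from lr: 2.95e-04 near the top of this section to 2.88e-04 by its end, with no step changes at epoch boundaries. That behaviour is consistent with an Eden-style schedule, which scales a base learning rate by slowly decaying batch- and epoch-dependent factors. The function below is a sketch of that formula; the base rate, lr_batches, lr_epochs, and the global batch index (read off the checkpoint-256000.pt filename earlier in this section) are assumptions used only to show that such a schedule reproduces the logged values.

def eden_lr(base_lr: float, batch: int, epoch: int,
            lr_batches: float = 5000.0, lr_epochs: float = 4.0) -> float:
    # Smoothly decaying factors in both the global batch index and the epoch.
    batch_factor = ((batch ** 2 + lr_batches ** 2) / lr_batches ** 2) ** -0.25
    epoch_factor = ((epoch ** 2 + lr_epochs ** 2) / lr_epochs ** 2) ** -0.25
    return base_lr * batch_factor * epoch_factor


# Around global batch 256000, epoch 7, an assumed base rate of 3e-3 gives the
# value seen in the surrounding log lines:
print(f"{eden_lr(3e-3, 256_000, 7):.2e}")  # -> 2.95e-04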
2022-05-05 21:40:01,808 INFO [train.py:715] (0/8) Epoch 7, batch 18050, loss[loss=0.1572, simple_loss=0.2279, pruned_loss=0.04326, over 4985.00 frames.], tot_loss[loss=0.147, simple_loss=0.2182, pruned_loss=0.03795, over 973437.93 frames.], batch size: 31, lr: 2.92e-04 2022-05-05 21:40:41,006 INFO [train.py:715] (0/8) Epoch 7, batch 18100, loss[loss=0.1641, simple_loss=0.2189, pruned_loss=0.0547, over 4857.00 frames.], tot_loss[loss=0.1485, simple_loss=0.2196, pruned_loss=0.03866, over 972529.38 frames.], batch size: 32, lr: 2.92e-04 2022-05-05 21:41:19,565 INFO [train.py:715] (0/8) Epoch 7, batch 18150, loss[loss=0.1441, simple_loss=0.2173, pruned_loss=0.03545, over 4877.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2182, pruned_loss=0.03784, over 971660.08 frames.], batch size: 22, lr: 2.92e-04 2022-05-05 21:41:57,880 INFO [train.py:715] (0/8) Epoch 7, batch 18200, loss[loss=0.1347, simple_loss=0.2134, pruned_loss=0.02799, over 4823.00 frames.], tot_loss[loss=0.1474, simple_loss=0.2184, pruned_loss=0.0382, over 971313.84 frames.], batch size: 27, lr: 2.92e-04 2022-05-05 21:42:36,253 INFO [train.py:715] (0/8) Epoch 7, batch 18250, loss[loss=0.1141, simple_loss=0.1828, pruned_loss=0.02268, over 4814.00 frames.], tot_loss[loss=0.148, simple_loss=0.2188, pruned_loss=0.03863, over 971417.99 frames.], batch size: 27, lr: 2.92e-04 2022-05-05 21:43:15,541 INFO [train.py:715] (0/8) Epoch 7, batch 18300, loss[loss=0.1162, simple_loss=0.1896, pruned_loss=0.02139, over 4932.00 frames.], tot_loss[loss=0.148, simple_loss=0.2192, pruned_loss=0.03837, over 972287.61 frames.], batch size: 29, lr: 2.92e-04 2022-05-05 21:43:53,554 INFO [train.py:715] (0/8) Epoch 7, batch 18350, loss[loss=0.1503, simple_loss=0.2152, pruned_loss=0.04267, over 4939.00 frames.], tot_loss[loss=0.1476, simple_loss=0.219, pruned_loss=0.03804, over 971372.36 frames.], batch size: 29, lr: 2.92e-04 2022-05-05 21:44:31,933 INFO [train.py:715] (0/8) Epoch 7, batch 18400, loss[loss=0.1543, simple_loss=0.2273, pruned_loss=0.04071, over 4792.00 frames.], tot_loss[loss=0.1474, simple_loss=0.2191, pruned_loss=0.03785, over 971698.21 frames.], batch size: 17, lr: 2.92e-04 2022-05-05 21:45:11,786 INFO [train.py:715] (0/8) Epoch 7, batch 18450, loss[loss=0.152, simple_loss=0.2151, pruned_loss=0.0445, over 4796.00 frames.], tot_loss[loss=0.1475, simple_loss=0.219, pruned_loss=0.03804, over 971546.69 frames.], batch size: 14, lr: 2.92e-04 2022-05-05 21:45:50,714 INFO [train.py:715] (0/8) Epoch 7, batch 18500, loss[loss=0.1245, simple_loss=0.1893, pruned_loss=0.02992, over 4777.00 frames.], tot_loss[loss=0.1478, simple_loss=0.2192, pruned_loss=0.03823, over 971823.95 frames.], batch size: 14, lr: 2.92e-04 2022-05-05 21:46:29,376 INFO [train.py:715] (0/8) Epoch 7, batch 18550, loss[loss=0.1412, simple_loss=0.2146, pruned_loss=0.03394, over 4643.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2185, pruned_loss=0.03824, over 971778.63 frames.], batch size: 13, lr: 2.92e-04 2022-05-05 21:47:08,448 INFO [train.py:715] (0/8) Epoch 7, batch 18600, loss[loss=0.1353, simple_loss=0.2255, pruned_loss=0.02259, over 4876.00 frames.], tot_loss[loss=0.147, simple_loss=0.2185, pruned_loss=0.03771, over 973344.83 frames.], batch size: 22, lr: 2.92e-04 2022-05-05 21:47:47,270 INFO [train.py:715] (0/8) Epoch 7, batch 18650, loss[loss=0.1261, simple_loss=0.2059, pruned_loss=0.0232, over 4913.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2177, pruned_loss=0.03704, over 973460.99 frames.], batch size: 17, lr: 2.92e-04 2022-05-05 21:48:25,122 INFO 
[train.py:715] (0/8) Epoch 7, batch 18700, loss[loss=0.1564, simple_loss=0.2271, pruned_loss=0.04283, over 4854.00 frames.], tot_loss[loss=0.1462, simple_loss=0.2183, pruned_loss=0.03705, over 972456.45 frames.], batch size: 30, lr: 2.92e-04 2022-05-05 21:49:03,388 INFO [train.py:715] (0/8) Epoch 7, batch 18750, loss[loss=0.1522, simple_loss=0.2228, pruned_loss=0.04082, over 4863.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2188, pruned_loss=0.03703, over 973137.64 frames.], batch size: 16, lr: 2.92e-04 2022-05-05 21:49:42,759 INFO [train.py:715] (0/8) Epoch 7, batch 18800, loss[loss=0.1318, simple_loss=0.2101, pruned_loss=0.02675, over 4807.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2184, pruned_loss=0.03689, over 973185.41 frames.], batch size: 25, lr: 2.92e-04 2022-05-05 21:50:21,357 INFO [train.py:715] (0/8) Epoch 7, batch 18850, loss[loss=0.1505, simple_loss=0.2131, pruned_loss=0.04395, over 4857.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2184, pruned_loss=0.03748, over 973245.48 frames.], batch size: 30, lr: 2.92e-04 2022-05-05 21:50:59,414 INFO [train.py:715] (0/8) Epoch 7, batch 18900, loss[loss=0.1307, simple_loss=0.2066, pruned_loss=0.02739, over 4756.00 frames.], tot_loss[loss=0.1464, simple_loss=0.218, pruned_loss=0.03739, over 972718.74 frames.], batch size: 19, lr: 2.92e-04 2022-05-05 21:51:36,457 INFO [train.py:715] (0/8) Epoch 7, batch 18950, loss[loss=0.2024, simple_loss=0.2736, pruned_loss=0.06563, over 4915.00 frames.], tot_loss[loss=0.1472, simple_loss=0.2186, pruned_loss=0.03792, over 972768.22 frames.], batch size: 19, lr: 2.92e-04 2022-05-05 21:52:14,912 INFO [train.py:715] (0/8) Epoch 7, batch 19000, loss[loss=0.1361, simple_loss=0.2088, pruned_loss=0.0317, over 4941.00 frames.], tot_loss[loss=0.1469, simple_loss=0.218, pruned_loss=0.03786, over 972136.00 frames.], batch size: 23, lr: 2.92e-04 2022-05-05 21:52:52,513 INFO [train.py:715] (0/8) Epoch 7, batch 19050, loss[loss=0.1205, simple_loss=0.1978, pruned_loss=0.02161, over 4926.00 frames.], tot_loss[loss=0.1454, simple_loss=0.217, pruned_loss=0.03694, over 971777.46 frames.], batch size: 23, lr: 2.91e-04 2022-05-05 21:53:30,741 INFO [train.py:715] (0/8) Epoch 7, batch 19100, loss[loss=0.1249, simple_loss=0.2019, pruned_loss=0.02398, over 4926.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2162, pruned_loss=0.03635, over 971882.22 frames.], batch size: 23, lr: 2.91e-04 2022-05-05 21:54:09,410 INFO [train.py:715] (0/8) Epoch 7, batch 19150, loss[loss=0.1862, simple_loss=0.258, pruned_loss=0.05718, over 4773.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2166, pruned_loss=0.03628, over 971765.93 frames.], batch size: 17, lr: 2.91e-04 2022-05-05 21:54:47,122 INFO [train.py:715] (0/8) Epoch 7, batch 19200, loss[loss=0.147, simple_loss=0.2257, pruned_loss=0.03416, over 4918.00 frames.], tot_loss[loss=0.1439, simple_loss=0.216, pruned_loss=0.03595, over 971962.78 frames.], batch size: 18, lr: 2.91e-04 2022-05-05 21:55:24,836 INFO [train.py:715] (0/8) Epoch 7, batch 19250, loss[loss=0.1337, simple_loss=0.209, pruned_loss=0.02922, over 4860.00 frames.], tot_loss[loss=0.1443, simple_loss=0.2163, pruned_loss=0.0361, over 972516.62 frames.], batch size: 13, lr: 2.91e-04 2022-05-05 21:56:02,876 INFO [train.py:715] (0/8) Epoch 7, batch 19300, loss[loss=0.1595, simple_loss=0.2273, pruned_loss=0.04586, over 4979.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2175, pruned_loss=0.03681, over 973169.54 frames.], batch size: 28, lr: 2.91e-04 2022-05-05 21:56:41,356 INFO [train.py:715] (0/8) Epoch 7, batch 
19350, loss[loss=0.1247, simple_loss=0.1991, pruned_loss=0.02519, over 4853.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2174, pruned_loss=0.03674, over 973143.27 frames.], batch size: 20, lr: 2.91e-04 2022-05-05 21:57:18,828 INFO [train.py:715] (0/8) Epoch 7, batch 19400, loss[loss=0.1534, simple_loss=0.2295, pruned_loss=0.03861, over 4897.00 frames.], tot_loss[loss=0.1447, simple_loss=0.217, pruned_loss=0.0362, over 972556.73 frames.], batch size: 17, lr: 2.91e-04 2022-05-05 21:57:56,262 INFO [train.py:715] (0/8) Epoch 7, batch 19450, loss[loss=0.1611, simple_loss=0.2278, pruned_loss=0.04719, over 4858.00 frames.], tot_loss[loss=0.1449, simple_loss=0.217, pruned_loss=0.03643, over 972394.42 frames.], batch size: 32, lr: 2.91e-04 2022-05-05 21:58:34,320 INFO [train.py:715] (0/8) Epoch 7, batch 19500, loss[loss=0.1521, simple_loss=0.2221, pruned_loss=0.04102, over 4960.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2175, pruned_loss=0.03676, over 971374.59 frames.], batch size: 24, lr: 2.91e-04 2022-05-05 21:59:11,838 INFO [train.py:715] (0/8) Epoch 7, batch 19550, loss[loss=0.1486, simple_loss=0.2204, pruned_loss=0.03838, over 4844.00 frames.], tot_loss[loss=0.146, simple_loss=0.2178, pruned_loss=0.03714, over 971459.41 frames.], batch size: 34, lr: 2.91e-04 2022-05-05 21:59:49,561 INFO [train.py:715] (0/8) Epoch 7, batch 19600, loss[loss=0.146, simple_loss=0.2262, pruned_loss=0.03288, over 4810.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2172, pruned_loss=0.03673, over 972060.53 frames.], batch size: 26, lr: 2.91e-04 2022-05-05 22:00:27,122 INFO [train.py:715] (0/8) Epoch 7, batch 19650, loss[loss=0.1505, simple_loss=0.2158, pruned_loss=0.04264, over 4962.00 frames.], tot_loss[loss=0.1449, simple_loss=0.2166, pruned_loss=0.03662, over 971888.89 frames.], batch size: 35, lr: 2.91e-04 2022-05-05 22:01:05,535 INFO [train.py:715] (0/8) Epoch 7, batch 19700, loss[loss=0.1511, simple_loss=0.2203, pruned_loss=0.04095, over 4859.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2176, pruned_loss=0.03707, over 972376.32 frames.], batch size: 32, lr: 2.91e-04 2022-05-05 22:01:42,743 INFO [train.py:715] (0/8) Epoch 7, batch 19750, loss[loss=0.1708, simple_loss=0.2485, pruned_loss=0.04655, over 4851.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2178, pruned_loss=0.03755, over 972607.09 frames.], batch size: 20, lr: 2.91e-04 2022-05-05 22:02:20,220 INFO [train.py:715] (0/8) Epoch 7, batch 19800, loss[loss=0.1404, simple_loss=0.2145, pruned_loss=0.03312, over 4783.00 frames.], tot_loss[loss=0.1463, simple_loss=0.218, pruned_loss=0.03727, over 972960.12 frames.], batch size: 14, lr: 2.91e-04 2022-05-05 22:02:58,031 INFO [train.py:715] (0/8) Epoch 7, batch 19850, loss[loss=0.154, simple_loss=0.2312, pruned_loss=0.03836, over 4907.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2176, pruned_loss=0.03732, over 972719.33 frames.], batch size: 19, lr: 2.91e-04 2022-05-05 22:03:35,855 INFO [train.py:715] (0/8) Epoch 7, batch 19900, loss[loss=0.1313, simple_loss=0.1942, pruned_loss=0.03425, over 4923.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2179, pruned_loss=0.03731, over 972802.72 frames.], batch size: 18, lr: 2.91e-04 2022-05-05 22:04:12,818 INFO [train.py:715] (0/8) Epoch 7, batch 19950, loss[loss=0.159, simple_loss=0.2291, pruned_loss=0.04447, over 4873.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2186, pruned_loss=0.03757, over 973338.33 frames.], batch size: 32, lr: 2.91e-04 2022-05-05 22:04:50,678 INFO [train.py:715] (0/8) Epoch 7, batch 20000, loss[loss=0.1312, 
simple_loss=0.2066, pruned_loss=0.02788, over 4806.00 frames.], tot_loss[loss=0.1477, simple_loss=0.2195, pruned_loss=0.03791, over 973428.35 frames.], batch size: 24, lr: 2.91e-04 2022-05-05 22:05:28,962 INFO [train.py:715] (0/8) Epoch 7, batch 20050, loss[loss=0.1545, simple_loss=0.2278, pruned_loss=0.0406, over 4920.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2186, pruned_loss=0.03763, over 973246.38 frames.], batch size: 18, lr: 2.91e-04 2022-05-05 22:06:06,295 INFO [train.py:715] (0/8) Epoch 7, batch 20100, loss[loss=0.145, simple_loss=0.2164, pruned_loss=0.03682, over 4913.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2182, pruned_loss=0.03739, over 973237.27 frames.], batch size: 18, lr: 2.91e-04 2022-05-05 22:06:43,746 INFO [train.py:715] (0/8) Epoch 7, batch 20150, loss[loss=0.1489, simple_loss=0.2075, pruned_loss=0.04514, over 4874.00 frames.], tot_loss[loss=0.1463, simple_loss=0.218, pruned_loss=0.03728, over 972906.14 frames.], batch size: 32, lr: 2.91e-04 2022-05-05 22:07:21,913 INFO [train.py:715] (0/8) Epoch 7, batch 20200, loss[loss=0.1413, simple_loss=0.2108, pruned_loss=0.03587, over 4945.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2174, pruned_loss=0.03687, over 972942.16 frames.], batch size: 21, lr: 2.91e-04 2022-05-05 22:08:00,052 INFO [train.py:715] (0/8) Epoch 7, batch 20250, loss[loss=0.1386, simple_loss=0.2041, pruned_loss=0.03651, over 4748.00 frames.], tot_loss[loss=0.1466, simple_loss=0.2182, pruned_loss=0.03746, over 972668.29 frames.], batch size: 19, lr: 2.91e-04 2022-05-05 22:08:37,462 INFO [train.py:715] (0/8) Epoch 7, batch 20300, loss[loss=0.1426, simple_loss=0.2222, pruned_loss=0.03152, over 4919.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2184, pruned_loss=0.03757, over 972597.14 frames.], batch size: 17, lr: 2.91e-04 2022-05-05 22:09:02,831 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-264000.pt 2022-05-05 22:09:17,214 INFO [train.py:715] (0/8) Epoch 7, batch 20350, loss[loss=0.1421, simple_loss=0.2067, pruned_loss=0.03872, over 4959.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2182, pruned_loss=0.03726, over 972359.97 frames.], batch size: 24, lr: 2.91e-04 2022-05-05 22:09:55,127 INFO [train.py:715] (0/8) Epoch 7, batch 20400, loss[loss=0.1505, simple_loss=0.2272, pruned_loss=0.03686, over 4817.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2177, pruned_loss=0.03673, over 972238.90 frames.], batch size: 15, lr: 2.91e-04 2022-05-05 22:10:33,043 INFO [train.py:715] (0/8) Epoch 7, batch 20450, loss[loss=0.1618, simple_loss=0.2194, pruned_loss=0.05207, over 4836.00 frames.], tot_loss[loss=0.1449, simple_loss=0.2169, pruned_loss=0.0364, over 972268.85 frames.], batch size: 13, lr: 2.91e-04 2022-05-05 22:11:10,604 INFO [train.py:715] (0/8) Epoch 7, batch 20500, loss[loss=0.1535, simple_loss=0.2223, pruned_loss=0.04234, over 4973.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2176, pruned_loss=0.03703, over 972590.69 frames.], batch size: 28, lr: 2.91e-04 2022-05-05 22:11:48,689 INFO [train.py:715] (0/8) Epoch 7, batch 20550, loss[loss=0.1498, simple_loss=0.2267, pruned_loss=0.0364, over 4884.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2186, pruned_loss=0.03718, over 971958.08 frames.], batch size: 22, lr: 2.91e-04 2022-05-05 22:12:26,839 INFO [train.py:715] (0/8) Epoch 7, batch 20600, loss[loss=0.1145, simple_loss=0.1887, pruned_loss=0.02015, over 4945.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2179, pruned_loss=0.03674, over 972660.44 frames.], batch size: 18, lr: 
2.91e-04 2022-05-05 22:13:04,067 INFO [train.py:715] (0/8) Epoch 7, batch 20650, loss[loss=0.1513, simple_loss=0.2363, pruned_loss=0.03321, over 4917.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2171, pruned_loss=0.03677, over 972081.12 frames.], batch size: 23, lr: 2.91e-04 2022-05-05 22:13:41,768 INFO [train.py:715] (0/8) Epoch 7, batch 20700, loss[loss=0.1401, simple_loss=0.2168, pruned_loss=0.03176, over 4832.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2179, pruned_loss=0.03689, over 972077.74 frames.], batch size: 25, lr: 2.91e-04 2022-05-05 22:14:19,740 INFO [train.py:715] (0/8) Epoch 7, batch 20750, loss[loss=0.1044, simple_loss=0.1749, pruned_loss=0.0169, over 4808.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2177, pruned_loss=0.03664, over 971438.04 frames.], batch size: 12, lr: 2.91e-04 2022-05-05 22:14:57,386 INFO [train.py:715] (0/8) Epoch 7, batch 20800, loss[loss=0.1463, simple_loss=0.2098, pruned_loss=0.0414, over 4815.00 frames.], tot_loss[loss=0.1462, simple_loss=0.2179, pruned_loss=0.03726, over 972311.80 frames.], batch size: 14, lr: 2.91e-04 2022-05-05 22:15:34,690 INFO [train.py:715] (0/8) Epoch 7, batch 20850, loss[loss=0.171, simple_loss=0.2413, pruned_loss=0.05028, over 4772.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2182, pruned_loss=0.03775, over 973460.99 frames.], batch size: 17, lr: 2.90e-04 2022-05-05 22:16:13,016 INFO [train.py:715] (0/8) Epoch 7, batch 20900, loss[loss=0.1438, simple_loss=0.2305, pruned_loss=0.02861, over 4708.00 frames.], tot_loss[loss=0.1466, simple_loss=0.218, pruned_loss=0.03762, over 973186.70 frames.], batch size: 15, lr: 2.90e-04 2022-05-05 22:16:50,907 INFO [train.py:715] (0/8) Epoch 7, batch 20950, loss[loss=0.1315, simple_loss=0.2071, pruned_loss=0.02797, over 4794.00 frames.], tot_loss[loss=0.1465, simple_loss=0.218, pruned_loss=0.0375, over 973132.23 frames.], batch size: 21, lr: 2.90e-04 2022-05-05 22:17:29,165 INFO [train.py:715] (0/8) Epoch 7, batch 21000, loss[loss=0.1358, simple_loss=0.2106, pruned_loss=0.03047, over 4953.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2176, pruned_loss=0.03731, over 972812.95 frames.], batch size: 29, lr: 2.90e-04 2022-05-05 22:17:29,166 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 22:17:39,073 INFO [train.py:742] (0/8) Epoch 7, validation: loss=0.1082, simple_loss=0.193, pruned_loss=0.01169, over 914524.00 frames. 
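Two checkpoints are written in this section: checkpoint-256000.pt between the batch-12300 and batch-12350 log lines near the top, and checkpoint-264000.pt between the batch-20300 and batch-20350 lines just above, exactly 8000 training batches apart. The filenames track a global batch counter rather than the epoch, so saving looks interval-based. The helper below is a minimal sketch of that pattern; the maybe_save name and the save_every_n default are illustrative assumptions, not the actual icefall API.

from pathlib import Path
from typing import Optional


def maybe_save(batch_idx_train: int, exp_dir: Path,
               save_every_n: int = 8000) -> Optional[Path]:
    # Return the checkpoint path to write on this batch, or None to skip.
    if batch_idx_train == 0 or batch_idx_train % save_every_n != 0:
        return None
    return exp_dir / f"checkpoint-{batch_idx_train}.pt"


print(maybe_save(264_000, Path("pruned_transducer_stateless2/exp/v2")))
# -> pruned_transducer_stateless2/exp/v2/checkpoint-264000.pt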
2022-05-05 22:18:17,063 INFO [train.py:715] (0/8) Epoch 7, batch 21050, loss[loss=0.1679, simple_loss=0.2302, pruned_loss=0.05278, over 4947.00 frames.], tot_loss[loss=0.1463, simple_loss=0.218, pruned_loss=0.03731, over 972833.68 frames.], batch size: 21, lr: 2.90e-04 2022-05-05 22:18:54,962 INFO [train.py:715] (0/8) Epoch 7, batch 21100, loss[loss=0.1734, simple_loss=0.2446, pruned_loss=0.05109, over 4832.00 frames.], tot_loss[loss=0.146, simple_loss=0.2177, pruned_loss=0.03713, over 973249.66 frames.], batch size: 30, lr: 2.90e-04 2022-05-05 22:19:32,987 INFO [train.py:715] (0/8) Epoch 7, batch 21150, loss[loss=0.1291, simple_loss=0.1948, pruned_loss=0.03166, over 4973.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2174, pruned_loss=0.03672, over 972864.54 frames.], batch size: 35, lr: 2.90e-04 2022-05-05 22:20:10,773 INFO [train.py:715] (0/8) Epoch 7, batch 21200, loss[loss=0.1521, simple_loss=0.229, pruned_loss=0.03757, over 4904.00 frames.], tot_loss[loss=0.146, simple_loss=0.2178, pruned_loss=0.03713, over 973043.72 frames.], batch size: 18, lr: 2.90e-04 2022-05-05 22:20:48,997 INFO [train.py:715] (0/8) Epoch 7, batch 21250, loss[loss=0.1489, simple_loss=0.2203, pruned_loss=0.03875, over 4910.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2174, pruned_loss=0.03713, over 973430.95 frames.], batch size: 18, lr: 2.90e-04 2022-05-05 22:21:27,128 INFO [train.py:715] (0/8) Epoch 7, batch 21300, loss[loss=0.195, simple_loss=0.2612, pruned_loss=0.06436, over 4944.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2168, pruned_loss=0.03705, over 972693.99 frames.], batch size: 35, lr: 2.90e-04 2022-05-05 22:22:04,499 INFO [train.py:715] (0/8) Epoch 7, batch 21350, loss[loss=0.1399, simple_loss=0.2018, pruned_loss=0.03904, over 4750.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2161, pruned_loss=0.03654, over 973609.71 frames.], batch size: 14, lr: 2.90e-04 2022-05-05 22:22:42,285 INFO [train.py:715] (0/8) Epoch 7, batch 21400, loss[loss=0.1142, simple_loss=0.1921, pruned_loss=0.01821, over 4922.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2161, pruned_loss=0.03678, over 973440.39 frames.], batch size: 29, lr: 2.90e-04 2022-05-05 22:23:20,546 INFO [train.py:715] (0/8) Epoch 7, batch 21450, loss[loss=0.1815, simple_loss=0.2493, pruned_loss=0.05683, over 4818.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2167, pruned_loss=0.03684, over 973254.07 frames.], batch size: 26, lr: 2.90e-04 2022-05-05 22:23:58,719 INFO [train.py:715] (0/8) Epoch 7, batch 21500, loss[loss=0.1443, simple_loss=0.2203, pruned_loss=0.03418, over 4820.00 frames.], tot_loss[loss=0.1458, simple_loss=0.217, pruned_loss=0.03725, over 972790.72 frames.], batch size: 25, lr: 2.90e-04 2022-05-05 22:24:36,574 INFO [train.py:715] (0/8) Epoch 7, batch 21550, loss[loss=0.1933, simple_loss=0.2629, pruned_loss=0.06189, over 4698.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2173, pruned_loss=0.03771, over 972854.14 frames.], batch size: 15, lr: 2.90e-04 2022-05-05 22:25:14,828 INFO [train.py:715] (0/8) Epoch 7, batch 21600, loss[loss=0.1665, simple_loss=0.2348, pruned_loss=0.0491, over 4820.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2178, pruned_loss=0.03779, over 972281.58 frames.], batch size: 13, lr: 2.90e-04 2022-05-05 22:25:53,298 INFO [train.py:715] (0/8) Epoch 7, batch 21650, loss[loss=0.1457, simple_loss=0.216, pruned_loss=0.03771, over 4819.00 frames.], tot_loss[loss=0.1458, simple_loss=0.217, pruned_loss=0.03735, over 972535.19 frames.], batch size: 26, lr: 2.90e-04 2022-05-05 22:26:30,666 INFO 
[train.py:715] (0/8) Epoch 7, batch 21700, loss[loss=0.1374, simple_loss=0.2087, pruned_loss=0.03303, over 4849.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2169, pruned_loss=0.03745, over 972643.11 frames.], batch size: 15, lr: 2.90e-04 2022-05-05 22:27:08,756 INFO [train.py:715] (0/8) Epoch 7, batch 21750, loss[loss=0.1359, simple_loss=0.2121, pruned_loss=0.02989, over 4994.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2165, pruned_loss=0.03699, over 973000.15 frames.], batch size: 16, lr: 2.90e-04 2022-05-05 22:27:46,868 INFO [train.py:715] (0/8) Epoch 7, batch 21800, loss[loss=0.1738, simple_loss=0.2528, pruned_loss=0.04737, over 4968.00 frames.], tot_loss[loss=0.1455, simple_loss=0.217, pruned_loss=0.03703, over 973052.41 frames.], batch size: 24, lr: 2.90e-04 2022-05-05 22:28:24,960 INFO [train.py:715] (0/8) Epoch 7, batch 21850, loss[loss=0.1544, simple_loss=0.2082, pruned_loss=0.05034, over 4786.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2169, pruned_loss=0.03706, over 972598.36 frames.], batch size: 17, lr: 2.90e-04 2022-05-05 22:29:02,870 INFO [train.py:715] (0/8) Epoch 7, batch 21900, loss[loss=0.156, simple_loss=0.2329, pruned_loss=0.03951, over 4954.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2174, pruned_loss=0.03717, over 972931.00 frames.], batch size: 39, lr: 2.90e-04 2022-05-05 22:29:40,812 INFO [train.py:715] (0/8) Epoch 7, batch 21950, loss[loss=0.1536, simple_loss=0.2145, pruned_loss=0.04638, over 4903.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2173, pruned_loss=0.03726, over 973107.21 frames.], batch size: 19, lr: 2.90e-04 2022-05-05 22:30:19,540 INFO [train.py:715] (0/8) Epoch 7, batch 22000, loss[loss=0.1542, simple_loss=0.237, pruned_loss=0.03571, over 4774.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2164, pruned_loss=0.03648, over 972823.30 frames.], batch size: 18, lr: 2.90e-04 2022-05-05 22:30:57,076 INFO [train.py:715] (0/8) Epoch 7, batch 22050, loss[loss=0.1304, simple_loss=0.1949, pruned_loss=0.033, over 4990.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2168, pruned_loss=0.03708, over 972699.52 frames.], batch size: 27, lr: 2.90e-04 2022-05-05 22:31:35,215 INFO [train.py:715] (0/8) Epoch 7, batch 22100, loss[loss=0.1576, simple_loss=0.2168, pruned_loss=0.04919, over 4658.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2167, pruned_loss=0.03752, over 972092.85 frames.], batch size: 13, lr: 2.90e-04 2022-05-05 22:32:13,473 INFO [train.py:715] (0/8) Epoch 7, batch 22150, loss[loss=0.1175, simple_loss=0.1882, pruned_loss=0.02337, over 4976.00 frames.], tot_loss[loss=0.1458, simple_loss=0.217, pruned_loss=0.03731, over 972515.41 frames.], batch size: 14, lr: 2.90e-04 2022-05-05 22:32:51,982 INFO [train.py:715] (0/8) Epoch 7, batch 22200, loss[loss=0.1658, simple_loss=0.2365, pruned_loss=0.04753, over 4859.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2177, pruned_loss=0.03751, over 972141.02 frames.], batch size: 20, lr: 2.90e-04 2022-05-05 22:33:29,478 INFO [train.py:715] (0/8) Epoch 7, batch 22250, loss[loss=0.1472, simple_loss=0.2348, pruned_loss=0.02975, over 4933.00 frames.], tot_loss[loss=0.1462, simple_loss=0.2174, pruned_loss=0.03752, over 971943.01 frames.], batch size: 21, lr: 2.90e-04 2022-05-05 22:34:07,235 INFO [train.py:715] (0/8) Epoch 7, batch 22300, loss[loss=0.1309, simple_loss=0.2001, pruned_loss=0.03087, over 4844.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2182, pruned_loss=0.03774, over 971410.79 frames.], batch size: 30, lr: 2.90e-04 2022-05-05 22:34:45,532 INFO [train.py:715] (0/8) Epoch 7, batch 
22350, loss[loss=0.1701, simple_loss=0.2424, pruned_loss=0.04892, over 4974.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2183, pruned_loss=0.03772, over 972577.74 frames.], batch size: 35, lr: 2.90e-04 2022-05-05 22:35:22,810 INFO [train.py:715] (0/8) Epoch 7, batch 22400, loss[loss=0.1517, simple_loss=0.2359, pruned_loss=0.03375, over 4954.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2182, pruned_loss=0.03766, over 972205.26 frames.], batch size: 24, lr: 2.90e-04 2022-05-05 22:36:00,503 INFO [train.py:715] (0/8) Epoch 7, batch 22450, loss[loss=0.1426, simple_loss=0.2099, pruned_loss=0.03766, over 4863.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2184, pruned_loss=0.03746, over 972345.03 frames.], batch size: 16, lr: 2.90e-04 2022-05-05 22:36:38,647 INFO [train.py:715] (0/8) Epoch 7, batch 22500, loss[loss=0.1506, simple_loss=0.2229, pruned_loss=0.03916, over 4848.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2175, pruned_loss=0.03704, over 972715.95 frames.], batch size: 34, lr: 2.90e-04 2022-05-05 22:37:16,687 INFO [train.py:715] (0/8) Epoch 7, batch 22550, loss[loss=0.1405, simple_loss=0.2169, pruned_loss=0.032, over 4815.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2181, pruned_loss=0.03737, over 972691.16 frames.], batch size: 25, lr: 2.90e-04 2022-05-05 22:37:54,354 INFO [train.py:715] (0/8) Epoch 7, batch 22600, loss[loss=0.1748, simple_loss=0.2479, pruned_loss=0.05084, over 4807.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2179, pruned_loss=0.0372, over 973200.33 frames.], batch size: 25, lr: 2.90e-04 2022-05-05 22:38:32,385 INFO [train.py:715] (0/8) Epoch 7, batch 22650, loss[loss=0.1389, simple_loss=0.2113, pruned_loss=0.03328, over 4971.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2171, pruned_loss=0.03655, over 974070.62 frames.], batch size: 15, lr: 2.90e-04 2022-05-05 22:39:10,748 INFO [train.py:715] (0/8) Epoch 7, batch 22700, loss[loss=0.1506, simple_loss=0.2141, pruned_loss=0.04359, over 4965.00 frames.], tot_loss[loss=0.1447, simple_loss=0.217, pruned_loss=0.03623, over 973839.52 frames.], batch size: 14, lr: 2.89e-04 2022-05-05 22:39:48,093 INFO [train.py:715] (0/8) Epoch 7, batch 22750, loss[loss=0.1398, simple_loss=0.2064, pruned_loss=0.03667, over 4971.00 frames.], tot_loss[loss=0.1456, simple_loss=0.218, pruned_loss=0.03653, over 972800.84 frames.], batch size: 14, lr: 2.89e-04 2022-05-05 22:40:25,725 INFO [train.py:715] (0/8) Epoch 7, batch 22800, loss[loss=0.1505, simple_loss=0.2219, pruned_loss=0.03962, over 4877.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2186, pruned_loss=0.03677, over 972921.81 frames.], batch size: 32, lr: 2.89e-04 2022-05-05 22:41:03,915 INFO [train.py:715] (0/8) Epoch 7, batch 22850, loss[loss=0.1571, simple_loss=0.214, pruned_loss=0.05013, over 4840.00 frames.], tot_loss[loss=0.146, simple_loss=0.2181, pruned_loss=0.03694, over 973488.79 frames.], batch size: 15, lr: 2.89e-04 2022-05-05 22:41:41,489 INFO [train.py:715] (0/8) Epoch 7, batch 22900, loss[loss=0.1099, simple_loss=0.1817, pruned_loss=0.01904, over 4988.00 frames.], tot_loss[loss=0.1449, simple_loss=0.2169, pruned_loss=0.03641, over 973258.57 frames.], batch size: 14, lr: 2.89e-04 2022-05-05 22:42:19,135 INFO [train.py:715] (0/8) Epoch 7, batch 22950, loss[loss=0.1632, simple_loss=0.2292, pruned_loss=0.04865, over 4975.00 frames.], tot_loss[loss=0.1447, simple_loss=0.217, pruned_loss=0.03617, over 974022.61 frames.], batch size: 35, lr: 2.89e-04 2022-05-05 22:42:57,043 INFO [train.py:715] (0/8) Epoch 7, batch 23000, loss[loss=0.1262, 
simple_loss=0.1906, pruned_loss=0.03091, over 4844.00 frames.], tot_loss[loss=0.1466, simple_loss=0.2184, pruned_loss=0.03742, over 973896.80 frames.], batch size: 15, lr: 2.89e-04 2022-05-05 22:43:35,191 INFO [train.py:715] (0/8) Epoch 7, batch 23050, loss[loss=0.1544, simple_loss=0.2293, pruned_loss=0.03977, over 4949.00 frames.], tot_loss[loss=0.1462, simple_loss=0.2182, pruned_loss=0.0371, over 974716.08 frames.], batch size: 39, lr: 2.89e-04 2022-05-05 22:44:12,637 INFO [train.py:715] (0/8) Epoch 7, batch 23100, loss[loss=0.1309, simple_loss=0.2019, pruned_loss=0.02992, over 4925.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2175, pruned_loss=0.03697, over 973964.32 frames.], batch size: 39, lr: 2.89e-04 2022-05-05 22:44:49,933 INFO [train.py:715] (0/8) Epoch 7, batch 23150, loss[loss=0.1402, simple_loss=0.2134, pruned_loss=0.03347, over 4858.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2176, pruned_loss=0.03688, over 973946.26 frames.], batch size: 20, lr: 2.89e-04 2022-05-05 22:45:28,251 INFO [train.py:715] (0/8) Epoch 7, batch 23200, loss[loss=0.1375, simple_loss=0.221, pruned_loss=0.02702, over 4947.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2173, pruned_loss=0.03663, over 973677.10 frames.], batch size: 21, lr: 2.89e-04 2022-05-05 22:46:06,317 INFO [train.py:715] (0/8) Epoch 7, batch 23250, loss[loss=0.1554, simple_loss=0.2308, pruned_loss=0.04005, over 4897.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2174, pruned_loss=0.0364, over 973946.20 frames.], batch size: 17, lr: 2.89e-04 2022-05-05 22:46:43,798 INFO [train.py:715] (0/8) Epoch 7, batch 23300, loss[loss=0.147, simple_loss=0.2227, pruned_loss=0.03564, over 4861.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2179, pruned_loss=0.03667, over 973732.80 frames.], batch size: 32, lr: 2.89e-04 2022-05-05 22:47:22,578 INFO [train.py:715] (0/8) Epoch 7, batch 23350, loss[loss=0.1741, simple_loss=0.2392, pruned_loss=0.05449, over 4755.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2177, pruned_loss=0.03671, over 972867.55 frames.], batch size: 19, lr: 2.89e-04 2022-05-05 22:48:01,699 INFO [train.py:715] (0/8) Epoch 7, batch 23400, loss[loss=0.1127, simple_loss=0.1803, pruned_loss=0.02259, over 4795.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2176, pruned_loss=0.03685, over 972432.00 frames.], batch size: 12, lr: 2.89e-04 2022-05-05 22:48:40,122 INFO [train.py:715] (0/8) Epoch 7, batch 23450, loss[loss=0.139, simple_loss=0.2096, pruned_loss=0.03419, over 4760.00 frames.], tot_loss[loss=0.1454, simple_loss=0.217, pruned_loss=0.0369, over 971716.16 frames.], batch size: 19, lr: 2.89e-04 2022-05-05 22:49:18,246 INFO [train.py:715] (0/8) Epoch 7, batch 23500, loss[loss=0.1528, simple_loss=0.224, pruned_loss=0.0408, over 4800.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2175, pruned_loss=0.03696, over 971454.10 frames.], batch size: 12, lr: 2.89e-04 2022-05-05 22:49:56,227 INFO [train.py:715] (0/8) Epoch 7, batch 23550, loss[loss=0.142, simple_loss=0.2034, pruned_loss=0.04032, over 4968.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2176, pruned_loss=0.03731, over 971821.96 frames.], batch size: 35, lr: 2.89e-04 2022-05-05 22:50:34,436 INFO [train.py:715] (0/8) Epoch 7, batch 23600, loss[loss=0.1507, simple_loss=0.2133, pruned_loss=0.04403, over 4913.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2175, pruned_loss=0.03683, over 971853.24 frames.], batch size: 18, lr: 2.89e-04 2022-05-05 22:51:11,413 INFO [train.py:715] (0/8) Epoch 7, batch 23650, loss[loss=0.1575, simple_loss=0.2211, pruned_loss=0.04697, 
over 4802.00 frames.], tot_loss[loss=0.144, simple_loss=0.2161, pruned_loss=0.03594, over 971026.19 frames.], batch size: 21, lr: 2.89e-04 2022-05-05 22:51:49,263 INFO [train.py:715] (0/8) Epoch 7, batch 23700, loss[loss=0.1517, simple_loss=0.2283, pruned_loss=0.03753, over 4928.00 frames.], tot_loss[loss=0.1445, simple_loss=0.2162, pruned_loss=0.03637, over 972777.95 frames.], batch size: 18, lr: 2.89e-04 2022-05-05 22:52:27,393 INFO [train.py:715] (0/8) Epoch 7, batch 23750, loss[loss=0.1654, simple_loss=0.2336, pruned_loss=0.04865, over 4799.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2157, pruned_loss=0.03601, over 972095.01 frames.], batch size: 21, lr: 2.89e-04 2022-05-05 22:53:04,573 INFO [train.py:715] (0/8) Epoch 7, batch 23800, loss[loss=0.1379, simple_loss=0.2118, pruned_loss=0.03199, over 4812.00 frames.], tot_loss[loss=0.144, simple_loss=0.216, pruned_loss=0.03605, over 971294.20 frames.], batch size: 13, lr: 2.89e-04 2022-05-05 22:53:42,348 INFO [train.py:715] (0/8) Epoch 7, batch 23850, loss[loss=0.1294, simple_loss=0.1981, pruned_loss=0.03035, over 4841.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2175, pruned_loss=0.03717, over 971341.60 frames.], batch size: 26, lr: 2.89e-04 2022-05-05 22:54:21,019 INFO [train.py:715] (0/8) Epoch 7, batch 23900, loss[loss=0.119, simple_loss=0.1933, pruned_loss=0.02229, over 4813.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2166, pruned_loss=0.03634, over 971537.15 frames.], batch size: 21, lr: 2.89e-04 2022-05-05 22:54:59,163 INFO [train.py:715] (0/8) Epoch 7, batch 23950, loss[loss=0.1488, simple_loss=0.2312, pruned_loss=0.03324, over 4943.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2172, pruned_loss=0.03652, over 971655.66 frames.], batch size: 21, lr: 2.89e-04 2022-05-05 22:55:36,635 INFO [train.py:715] (0/8) Epoch 7, batch 24000, loss[loss=0.1477, simple_loss=0.2171, pruned_loss=0.03911, over 4991.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2176, pruned_loss=0.03707, over 972229.23 frames.], batch size: 25, lr: 2.89e-04 2022-05-05 22:55:36,636 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 22:55:46,187 INFO [train.py:742] (0/8) Epoch 7, validation: loss=0.108, simple_loss=0.1929, pruned_loss=0.01156, over 914524.00 frames. 
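Each training line carries two bracketed blocks: the first, loss[...], is measured over the current batch (a few thousand frames), while tot_loss[...] is a running aggregate whose frame count hovers around 972,000, roughly 200 times the typical 4,800-5,000 frames per batch. That behaviour is consistent with an exponentially decayed, frame-weighted running sum, which is why tot_loss moves much more slowly than the per-batch loss. The class below is a minimal sketch of such a tracker; the 1/200 decay constant and the RunningLoss name are assumptions inferred from the frame counts, not taken from train.py.

class RunningLoss:
    # Exponentially decayed, frame-weighted running average of the batch loss.
    def __init__(self, decay: float = 1.0 - 1.0 / 200):
        self.decay = decay
        self.loss_sum = 0.0  # decayed sum of (batch loss * frames)
        self.frames = 0.0    # decayed frame count

    def update(self, batch_loss: float, num_frames: float) -> float:
        self.loss_sum = self.decay * self.loss_sum + batch_loss * num_frames
        self.frames = self.decay * self.frames + num_frames
        return self.loss_sum / self.frames  # the value reported as tot_loss

With a 1/200 decay and batches of about 4,870 frames, the steady-state frame count is about 200 * 4,870 ≈ 974,000, which matches the "over ~972,000 frames" figures in the surrounding lines.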
2022-05-05 22:56:23,725 INFO [train.py:715] (0/8) Epoch 7, batch 24050, loss[loss=0.139, simple_loss=0.2154, pruned_loss=0.0313, over 4902.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2163, pruned_loss=0.03651, over 972628.70 frames.], batch size: 17, lr: 2.89e-04 2022-05-05 22:57:02,030 INFO [train.py:715] (0/8) Epoch 7, batch 24100, loss[loss=0.1471, simple_loss=0.2194, pruned_loss=0.03734, over 4840.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2162, pruned_loss=0.03654, over 972848.06 frames.], batch size: 32, lr: 2.89e-04 2022-05-05 22:57:40,435 INFO [train.py:715] (0/8) Epoch 7, batch 24150, loss[loss=0.1697, simple_loss=0.2378, pruned_loss=0.05081, over 4910.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2158, pruned_loss=0.03597, over 973272.02 frames.], batch size: 19, lr: 2.89e-04 2022-05-05 22:58:18,170 INFO [train.py:715] (0/8) Epoch 7, batch 24200, loss[loss=0.1319, simple_loss=0.2162, pruned_loss=0.02376, over 4799.00 frames.], tot_loss[loss=0.1443, simple_loss=0.2163, pruned_loss=0.03615, over 974083.97 frames.], batch size: 21, lr: 2.89e-04 2022-05-05 22:58:55,937 INFO [train.py:715] (0/8) Epoch 7, batch 24250, loss[loss=0.1632, simple_loss=0.2241, pruned_loss=0.05117, over 4901.00 frames.], tot_loss[loss=0.1455, simple_loss=0.217, pruned_loss=0.037, over 974101.01 frames.], batch size: 19, lr: 2.89e-04 2022-05-05 22:59:34,583 INFO [train.py:715] (0/8) Epoch 7, batch 24300, loss[loss=0.1489, simple_loss=0.2127, pruned_loss=0.04261, over 4949.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2173, pruned_loss=0.03681, over 973246.08 frames.], batch size: 24, lr: 2.89e-04 2022-05-05 23:00:12,421 INFO [train.py:715] (0/8) Epoch 7, batch 24350, loss[loss=0.1585, simple_loss=0.2303, pruned_loss=0.04338, over 4828.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2183, pruned_loss=0.03717, over 972517.15 frames.], batch size: 15, lr: 2.89e-04 2022-05-05 23:00:50,086 INFO [train.py:715] (0/8) Epoch 7, batch 24400, loss[loss=0.1447, simple_loss=0.2132, pruned_loss=0.03807, over 4927.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2183, pruned_loss=0.03722, over 972472.34 frames.], batch size: 29, lr: 2.89e-04 2022-05-05 23:01:28,242 INFO [train.py:715] (0/8) Epoch 7, batch 24450, loss[loss=0.1592, simple_loss=0.2246, pruned_loss=0.04691, over 4900.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2183, pruned_loss=0.03758, over 972230.11 frames.], batch size: 19, lr: 2.89e-04 2022-05-05 23:02:06,215 INFO [train.py:715] (0/8) Epoch 7, batch 24500, loss[loss=0.1439, simple_loss=0.2095, pruned_loss=0.03917, over 4936.00 frames.], tot_loss[loss=0.1471, simple_loss=0.2185, pruned_loss=0.03789, over 971686.81 frames.], batch size: 35, lr: 2.89e-04 2022-05-05 23:02:43,831 INFO [train.py:715] (0/8) Epoch 7, batch 24550, loss[loss=0.1395, simple_loss=0.2097, pruned_loss=0.0346, over 4760.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2191, pruned_loss=0.03801, over 971402.45 frames.], batch size: 18, lr: 2.88e-04 2022-05-05 23:03:22,001 INFO [train.py:715] (0/8) Epoch 7, batch 24600, loss[loss=0.1323, simple_loss=0.2095, pruned_loss=0.02754, over 4952.00 frames.], tot_loss[loss=0.148, simple_loss=0.2194, pruned_loss=0.03831, over 972186.04 frames.], batch size: 29, lr: 2.88e-04 2022-05-05 23:04:01,120 INFO [train.py:715] (0/8) Epoch 7, batch 24650, loss[loss=0.1435, simple_loss=0.207, pruned_loss=0.03995, over 4827.00 frames.], tot_loss[loss=0.1471, simple_loss=0.2186, pruned_loss=0.03782, over 972285.89 frames.], batch size: 30, lr: 2.88e-04 2022-05-05 23:04:39,569 INFO 
[train.py:715] (0/8) Epoch 7, batch 24700, loss[loss=0.1182, simple_loss=0.1894, pruned_loss=0.02349, over 4795.00 frames.], tot_loss[loss=0.1462, simple_loss=0.2178, pruned_loss=0.03736, over 972037.64 frames.], batch size: 12, lr: 2.88e-04 2022-05-05 23:05:17,694 INFO [train.py:715] (0/8) Epoch 7, batch 24750, loss[loss=0.1394, simple_loss=0.2151, pruned_loss=0.03192, over 4949.00 frames.], tot_loss[loss=0.1451, simple_loss=0.217, pruned_loss=0.03665, over 971466.63 frames.], batch size: 35, lr: 2.88e-04 2022-05-05 23:05:56,158 INFO [train.py:715] (0/8) Epoch 7, batch 24800, loss[loss=0.1184, simple_loss=0.1982, pruned_loss=0.01931, over 4836.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2166, pruned_loss=0.03624, over 971614.61 frames.], batch size: 20, lr: 2.88e-04 2022-05-05 23:06:35,233 INFO [train.py:715] (0/8) Epoch 7, batch 24850, loss[loss=0.1314, simple_loss=0.2032, pruned_loss=0.02982, over 4710.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2175, pruned_loss=0.03642, over 971845.02 frames.], batch size: 15, lr: 2.88e-04 2022-05-05 23:07:13,829 INFO [train.py:715] (0/8) Epoch 7, batch 24900, loss[loss=0.1819, simple_loss=0.2439, pruned_loss=0.05992, over 4964.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2176, pruned_loss=0.0364, over 971370.90 frames.], batch size: 15, lr: 2.88e-04 2022-05-05 23:07:53,087 INFO [train.py:715] (0/8) Epoch 7, batch 24950, loss[loss=0.1555, simple_loss=0.2156, pruned_loss=0.04765, over 4802.00 frames.], tot_loss[loss=0.1449, simple_loss=0.2171, pruned_loss=0.03639, over 971908.52 frames.], batch size: 14, lr: 2.88e-04 2022-05-05 23:08:32,939 INFO [train.py:715] (0/8) Epoch 7, batch 25000, loss[loss=0.1804, simple_loss=0.2454, pruned_loss=0.0577, over 4778.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2175, pruned_loss=0.03664, over 972255.08 frames.], batch size: 17, lr: 2.88e-04 2022-05-05 23:09:12,206 INFO [train.py:715] (0/8) Epoch 7, batch 25050, loss[loss=0.1449, simple_loss=0.2166, pruned_loss=0.03658, over 4790.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2176, pruned_loss=0.03664, over 972098.46 frames.], batch size: 14, lr: 2.88e-04 2022-05-05 23:09:51,229 INFO [train.py:715] (0/8) Epoch 7, batch 25100, loss[loss=0.1566, simple_loss=0.2269, pruned_loss=0.04316, over 4879.00 frames.], tot_loss[loss=0.1457, simple_loss=0.218, pruned_loss=0.0367, over 972275.03 frames.], batch size: 32, lr: 2.88e-04 2022-05-05 23:10:31,399 INFO [train.py:715] (0/8) Epoch 7, batch 25150, loss[loss=0.1461, simple_loss=0.2199, pruned_loss=0.03615, over 4819.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2182, pruned_loss=0.03723, over 972489.92 frames.], batch size: 15, lr: 2.88e-04 2022-05-05 23:11:11,712 INFO [train.py:715] (0/8) Epoch 7, batch 25200, loss[loss=0.1321, simple_loss=0.2018, pruned_loss=0.03118, over 4799.00 frames.], tot_loss[loss=0.1472, simple_loss=0.2188, pruned_loss=0.03776, over 972721.04 frames.], batch size: 12, lr: 2.88e-04 2022-05-05 23:11:51,359 INFO [train.py:715] (0/8) Epoch 7, batch 25250, loss[loss=0.1499, simple_loss=0.2207, pruned_loss=0.03956, over 4860.00 frames.], tot_loss[loss=0.1475, simple_loss=0.2192, pruned_loss=0.03791, over 972250.28 frames.], batch size: 30, lr: 2.88e-04 2022-05-05 23:12:31,932 INFO [train.py:715] (0/8) Epoch 7, batch 25300, loss[loss=0.1682, simple_loss=0.2433, pruned_loss=0.04658, over 4788.00 frames.], tot_loss[loss=0.147, simple_loss=0.2186, pruned_loss=0.03772, over 972940.15 frames.], batch size: 17, lr: 2.88e-04 2022-05-05 23:13:13,663 INFO [train.py:715] (0/8) Epoch 7, batch 
25350, loss[loss=0.1754, simple_loss=0.2327, pruned_loss=0.05904, over 4966.00 frames.], tot_loss[loss=0.1463, simple_loss=0.2175, pruned_loss=0.0375, over 972610.09 frames.], batch size: 15, lr: 2.88e-04 2022-05-05 23:13:55,231 INFO [train.py:715] (0/8) Epoch 7, batch 25400, loss[loss=0.1476, simple_loss=0.2101, pruned_loss=0.04255, over 4796.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2183, pruned_loss=0.03774, over 971380.61 frames.], batch size: 14, lr: 2.88e-04 2022-05-05 23:14:36,168 INFO [train.py:715] (0/8) Epoch 7, batch 25450, loss[loss=0.1658, simple_loss=0.2372, pruned_loss=0.04723, over 4911.00 frames.], tot_loss[loss=0.1471, simple_loss=0.2183, pruned_loss=0.03791, over 972797.67 frames.], batch size: 17, lr: 2.88e-04 2022-05-05 23:15:18,371 INFO [train.py:715] (0/8) Epoch 7, batch 25500, loss[loss=0.159, simple_loss=0.2159, pruned_loss=0.05102, over 4840.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2176, pruned_loss=0.03772, over 972955.93 frames.], batch size: 32, lr: 2.88e-04 2022-05-05 23:16:00,248 INFO [train.py:715] (0/8) Epoch 7, batch 25550, loss[loss=0.1499, simple_loss=0.2091, pruned_loss=0.04534, over 4875.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2178, pruned_loss=0.03761, over 972820.57 frames.], batch size: 16, lr: 2.88e-04 2022-05-05 23:16:41,005 INFO [train.py:715] (0/8) Epoch 7, batch 25600, loss[loss=0.1349, simple_loss=0.2021, pruned_loss=0.03379, over 4918.00 frames.], tot_loss[loss=0.1464, simple_loss=0.218, pruned_loss=0.03741, over 973002.02 frames.], batch size: 18, lr: 2.88e-04 2022-05-05 23:17:22,270 INFO [train.py:715] (0/8) Epoch 7, batch 25650, loss[loss=0.1343, simple_loss=0.2132, pruned_loss=0.02776, over 4972.00 frames.], tot_loss[loss=0.1462, simple_loss=0.2179, pruned_loss=0.03727, over 972470.07 frames.], batch size: 25, lr: 2.88e-04 2022-05-05 23:18:03,670 INFO [train.py:715] (0/8) Epoch 7, batch 25700, loss[loss=0.1536, simple_loss=0.2208, pruned_loss=0.04324, over 4782.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2178, pruned_loss=0.03704, over 972566.08 frames.], batch size: 17, lr: 2.88e-04 2022-05-05 23:18:45,503 INFO [train.py:715] (0/8) Epoch 7, batch 25750, loss[loss=0.1423, simple_loss=0.2215, pruned_loss=0.03154, over 4783.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2168, pruned_loss=0.03626, over 971222.71 frames.], batch size: 18, lr: 2.88e-04 2022-05-05 23:19:26,136 INFO [train.py:715] (0/8) Epoch 7, batch 25800, loss[loss=0.1272, simple_loss=0.2012, pruned_loss=0.02657, over 4856.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2167, pruned_loss=0.03633, over 971625.98 frames.], batch size: 20, lr: 2.88e-04 2022-05-05 23:20:08,458 INFO [train.py:715] (0/8) Epoch 7, batch 25850, loss[loss=0.1418, simple_loss=0.2186, pruned_loss=0.03246, over 4892.00 frames.], tot_loss[loss=0.1443, simple_loss=0.2161, pruned_loss=0.03626, over 971575.69 frames.], batch size: 39, lr: 2.88e-04 2022-05-05 23:20:50,387 INFO [train.py:715] (0/8) Epoch 7, batch 25900, loss[loss=0.1638, simple_loss=0.2347, pruned_loss=0.04649, over 4942.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2165, pruned_loss=0.03641, over 971975.41 frames.], batch size: 21, lr: 2.88e-04 2022-05-05 23:21:31,302 INFO [train.py:715] (0/8) Epoch 7, batch 25950, loss[loss=0.1548, simple_loss=0.2316, pruned_loss=0.03903, over 4878.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2171, pruned_loss=0.03702, over 971749.12 frames.], batch size: 22, lr: 2.88e-04 2022-05-05 23:22:12,742 INFO [train.py:715] (0/8) Epoch 7, batch 26000, loss[loss=0.1338, 
simple_loss=0.2085, pruned_loss=0.02952, over 4848.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2178, pruned_loss=0.03762, over 971356.00 frames.], batch size: 32, lr: 2.88e-04 2022-05-05 23:22:54,186 INFO [train.py:715] (0/8) Epoch 7, batch 26050, loss[loss=0.1487, simple_loss=0.2359, pruned_loss=0.03074, over 4815.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2172, pruned_loss=0.03712, over 971197.34 frames.], batch size: 26, lr: 2.88e-04 2022-05-05 23:23:36,131 INFO [train.py:715] (0/8) Epoch 7, batch 26100, loss[loss=0.1196, simple_loss=0.194, pruned_loss=0.02263, over 4927.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2165, pruned_loss=0.03679, over 971647.63 frames.], batch size: 18, lr: 2.88e-04 2022-05-05 23:24:16,474 INFO [train.py:715] (0/8) Epoch 7, batch 26150, loss[loss=0.1492, simple_loss=0.2141, pruned_loss=0.04215, over 4974.00 frames.], tot_loss[loss=0.1445, simple_loss=0.2162, pruned_loss=0.03641, over 971226.86 frames.], batch size: 39, lr: 2.88e-04 2022-05-05 23:24:57,988 INFO [train.py:715] (0/8) Epoch 7, batch 26200, loss[loss=0.1423, simple_loss=0.219, pruned_loss=0.0328, over 4884.00 frames.], tot_loss[loss=0.1449, simple_loss=0.2164, pruned_loss=0.03665, over 971855.80 frames.], batch size: 19, lr: 2.88e-04 2022-05-05 23:25:39,231 INFO [train.py:715] (0/8) Epoch 7, batch 26250, loss[loss=0.1415, simple_loss=0.2198, pruned_loss=0.03158, over 4935.00 frames.], tot_loss[loss=0.1442, simple_loss=0.2158, pruned_loss=0.03629, over 971965.89 frames.], batch size: 21, lr: 2.88e-04 2022-05-05 23:26:19,597 INFO [train.py:715] (0/8) Epoch 7, batch 26300, loss[loss=0.156, simple_loss=0.2216, pruned_loss=0.04519, over 4846.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2159, pruned_loss=0.03641, over 971913.00 frames.], batch size: 15, lr: 2.88e-04 2022-05-05 23:26:59,775 INFO [train.py:715] (0/8) Epoch 7, batch 26350, loss[loss=0.1138, simple_loss=0.1829, pruned_loss=0.02236, over 4685.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2154, pruned_loss=0.03611, over 971056.46 frames.], batch size: 15, lr: 2.88e-04 2022-05-05 23:27:40,223 INFO [train.py:715] (0/8) Epoch 7, batch 26400, loss[loss=0.1362, simple_loss=0.2067, pruned_loss=0.03285, over 4693.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2166, pruned_loss=0.03682, over 970672.82 frames.], batch size: 15, lr: 2.87e-04 2022-05-05 23:28:20,878 INFO [train.py:715] (0/8) Epoch 7, batch 26450, loss[loss=0.1269, simple_loss=0.194, pruned_loss=0.02994, over 4752.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2168, pruned_loss=0.03688, over 970077.40 frames.], batch size: 18, lr: 2.87e-04 2022-05-05 23:29:00,623 INFO [train.py:715] (0/8) Epoch 7, batch 26500, loss[loss=0.1801, simple_loss=0.2444, pruned_loss=0.05791, over 4946.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2173, pruned_loss=0.03682, over 970679.10 frames.], batch size: 35, lr: 2.87e-04 2022-05-05 23:29:40,311 INFO [train.py:715] (0/8) Epoch 7, batch 26550, loss[loss=0.1523, simple_loss=0.2371, pruned_loss=0.0337, over 4940.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2175, pruned_loss=0.03681, over 970945.33 frames.], batch size: 23, lr: 2.87e-04 2022-05-05 23:30:20,783 INFO [train.py:715] (0/8) Epoch 7, batch 26600, loss[loss=0.1626, simple_loss=0.24, pruned_loss=0.04257, over 4810.00 frames.], tot_loss[loss=0.1442, simple_loss=0.2162, pruned_loss=0.03608, over 971045.69 frames.], batch size: 25, lr: 2.87e-04 2022-05-05 23:31:00,456 INFO [train.py:715] (0/8) Epoch 7, batch 26650, loss[loss=0.1574, simple_loss=0.2237, 
pruned_loss=0.04556, over 4767.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2171, pruned_loss=0.03656, over 971361.00 frames.], batch size: 17, lr: 2.87e-04 2022-05-05 23:31:40,551 INFO [train.py:715] (0/8) Epoch 7, batch 26700, loss[loss=0.1392, simple_loss=0.2159, pruned_loss=0.03129, over 4976.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2172, pruned_loss=0.03668, over 972023.59 frames.], batch size: 25, lr: 2.87e-04 2022-05-05 23:32:21,227 INFO [train.py:715] (0/8) Epoch 7, batch 26750, loss[loss=0.1536, simple_loss=0.2241, pruned_loss=0.04154, over 4740.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2175, pruned_loss=0.03676, over 972062.10 frames.], batch size: 16, lr: 2.87e-04 2022-05-05 23:33:01,187 INFO [train.py:715] (0/8) Epoch 7, batch 26800, loss[loss=0.1308, simple_loss=0.2064, pruned_loss=0.02754, over 4815.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2171, pruned_loss=0.03681, over 972176.42 frames.], batch size: 27, lr: 2.87e-04 2022-05-05 23:33:40,946 INFO [train.py:715] (0/8) Epoch 7, batch 26850, loss[loss=0.1635, simple_loss=0.2401, pruned_loss=0.0435, over 4965.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2173, pruned_loss=0.03687, over 971612.81 frames.], batch size: 21, lr: 2.87e-04 2022-05-05 23:34:21,586 INFO [train.py:715] (0/8) Epoch 7, batch 26900, loss[loss=0.1681, simple_loss=0.2259, pruned_loss=0.0552, over 4783.00 frames.], tot_loss[loss=0.146, simple_loss=0.2173, pruned_loss=0.03734, over 971820.79 frames.], batch size: 18, lr: 2.87e-04 2022-05-05 23:35:02,612 INFO [train.py:715] (0/8) Epoch 7, batch 26950, loss[loss=0.1714, simple_loss=0.2479, pruned_loss=0.04747, over 4833.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2177, pruned_loss=0.03724, over 972345.40 frames.], batch size: 15, lr: 2.87e-04 2022-05-05 23:35:42,943 INFO [train.py:715] (0/8) Epoch 7, batch 27000, loss[loss=0.1712, simple_loss=0.237, pruned_loss=0.05264, over 4837.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2171, pruned_loss=0.03703, over 971854.89 frames.], batch size: 32, lr: 2.87e-04 2022-05-05 23:35:42,944 INFO [train.py:733] (0/8) Computing validation loss 2022-05-05 23:35:52,669 INFO [train.py:742] (0/8) Epoch 7, validation: loss=0.108, simple_loss=0.1928, pruned_loss=0.01156, over 914524.00 frames. 
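The per-batch lines share a fixed layout, so they can be scraped directly if a loss curve needs to be plotted from the text log alone. A small parsing sketch (the regex mirrors the layout shown here; the function and field names are chosen for this example and are not part of the training code):

    import re

    LINE_RE = re.compile(
        r"Epoch (?P<epoch>\d+), batch (?P<batch>\d+), "
        r"loss\[loss=(?P<loss>[\d.]+), simple_loss=(?P<simple_loss>[\d.]+), "
        r"pruned_loss=(?P<pruned_loss>[\d.]+), over (?P<frames>[\d.]+) frames\.\], "
        r"tot_loss\[loss=(?P<tot_loss>[\d.]+)"
    )

    def parse_batch_line(line):
        # Returns None for lines without per-batch fields (validation, checkpoint saves).
        m = LINE_RE.search(line)
        return {k: float(v) for k, v in m.groupdict().items()} if m else None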
2022-05-05 23:36:33,215 INFO [train.py:715] (0/8) Epoch 7, batch 27050, loss[loss=0.1426, simple_loss=0.2175, pruned_loss=0.03387, over 4836.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2171, pruned_loss=0.03679, over 972118.94 frames.], batch size: 15, lr: 2.87e-04 2022-05-05 23:37:14,391 INFO [train.py:715] (0/8) Epoch 7, batch 27100, loss[loss=0.1254, simple_loss=0.1857, pruned_loss=0.03252, over 4796.00 frames.], tot_loss[loss=0.1459, simple_loss=0.218, pruned_loss=0.0369, over 972116.87 frames.], batch size: 12, lr: 2.87e-04 2022-05-05 23:37:56,265 INFO [train.py:715] (0/8) Epoch 7, batch 27150, loss[loss=0.1637, simple_loss=0.2322, pruned_loss=0.0476, over 4932.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2182, pruned_loss=0.03701, over 971674.33 frames.], batch size: 21, lr: 2.87e-04 2022-05-05 23:38:37,512 INFO [train.py:715] (0/8) Epoch 7, batch 27200, loss[loss=0.1455, simple_loss=0.2053, pruned_loss=0.04287, over 4793.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2176, pruned_loss=0.03667, over 971651.70 frames.], batch size: 24, lr: 2.87e-04 2022-05-05 23:39:18,967 INFO [train.py:715] (0/8) Epoch 7, batch 27250, loss[loss=0.126, simple_loss=0.2015, pruned_loss=0.02527, over 4768.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2172, pruned_loss=0.03653, over 971801.89 frames.], batch size: 19, lr: 2.87e-04 2022-05-05 23:40:00,805 INFO [train.py:715] (0/8) Epoch 7, batch 27300, loss[loss=0.1723, simple_loss=0.2395, pruned_loss=0.05257, over 4901.00 frames.], tot_loss[loss=0.1462, simple_loss=0.2181, pruned_loss=0.03711, over 972919.28 frames.], batch size: 17, lr: 2.87e-04 2022-05-05 23:40:41,770 INFO [train.py:715] (0/8) Epoch 7, batch 27350, loss[loss=0.1424, simple_loss=0.2144, pruned_loss=0.03523, over 4901.00 frames.], tot_loss[loss=0.1461, simple_loss=0.218, pruned_loss=0.03715, over 973501.70 frames.], batch size: 19, lr: 2.87e-04 2022-05-05 23:41:23,070 INFO [train.py:715] (0/8) Epoch 7, batch 27400, loss[loss=0.1323, simple_loss=0.2111, pruned_loss=0.02677, over 4943.00 frames.], tot_loss[loss=0.1462, simple_loss=0.218, pruned_loss=0.03717, over 973653.54 frames.], batch size: 39, lr: 2.87e-04 2022-05-05 23:42:04,090 INFO [train.py:715] (0/8) Epoch 7, batch 27450, loss[loss=0.1474, simple_loss=0.2181, pruned_loss=0.03832, over 4802.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2174, pruned_loss=0.03663, over 973073.47 frames.], batch size: 21, lr: 2.87e-04 2022-05-05 23:42:45,312 INFO [train.py:715] (0/8) Epoch 7, batch 27500, loss[loss=0.1554, simple_loss=0.2094, pruned_loss=0.05068, over 4961.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2179, pruned_loss=0.03684, over 972852.48 frames.], batch size: 33, lr: 2.87e-04 2022-05-05 23:43:25,883 INFO [train.py:715] (0/8) Epoch 7, batch 27550, loss[loss=0.1305, simple_loss=0.2062, pruned_loss=0.0274, over 4811.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2184, pruned_loss=0.03727, over 973749.66 frames.], batch size: 21, lr: 2.87e-04 2022-05-05 23:44:06,407 INFO [train.py:715] (0/8) Epoch 7, batch 27600, loss[loss=0.1331, simple_loss=0.2127, pruned_loss=0.02672, over 4710.00 frames.], tot_loss[loss=0.1464, simple_loss=0.2185, pruned_loss=0.03715, over 973529.57 frames.], batch size: 15, lr: 2.87e-04 2022-05-05 23:44:47,815 INFO [train.py:715] (0/8) Epoch 7, batch 27650, loss[loss=0.1219, simple_loss=0.192, pruned_loss=0.02593, over 4841.00 frames.], tot_loss[loss=0.1462, simple_loss=0.2184, pruned_loss=0.03699, over 972988.12 frames.], batch size: 12, lr: 2.87e-04 2022-05-05 23:45:28,512 INFO 
[train.py:715] (0/8) Epoch 7, batch 27700, loss[loss=0.1652, simple_loss=0.2383, pruned_loss=0.04607, over 4838.00 frames.], tot_loss[loss=0.146, simple_loss=0.218, pruned_loss=0.03703, over 973303.09 frames.], batch size: 15, lr: 2.87e-04 2022-05-05 23:46:09,252 INFO [train.py:715] (0/8) Epoch 7, batch 27750, loss[loss=0.1414, simple_loss=0.2166, pruned_loss=0.03305, over 4866.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2178, pruned_loss=0.03678, over 973761.30 frames.], batch size: 20, lr: 2.87e-04 2022-05-05 23:46:50,129 INFO [train.py:715] (0/8) Epoch 7, batch 27800, loss[loss=0.1418, simple_loss=0.2101, pruned_loss=0.03675, over 4787.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2176, pruned_loss=0.03661, over 973307.78 frames.], batch size: 17, lr: 2.87e-04 2022-05-05 23:47:31,344 INFO [train.py:715] (0/8) Epoch 7, batch 27850, loss[loss=0.1839, simple_loss=0.2334, pruned_loss=0.06715, over 4925.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2173, pruned_loss=0.03645, over 973335.21 frames.], batch size: 39, lr: 2.87e-04 2022-05-05 23:48:11,409 INFO [train.py:715] (0/8) Epoch 7, batch 27900, loss[loss=0.1264, simple_loss=0.1977, pruned_loss=0.02754, over 4640.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2178, pruned_loss=0.03677, over 972513.67 frames.], batch size: 13, lr: 2.87e-04 2022-05-05 23:48:52,369 INFO [train.py:715] (0/8) Epoch 7, batch 27950, loss[loss=0.2099, simple_loss=0.2705, pruned_loss=0.0747, over 4857.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2177, pruned_loss=0.03696, over 973207.53 frames.], batch size: 30, lr: 2.87e-04 2022-05-05 23:49:33,562 INFO [train.py:715] (0/8) Epoch 7, batch 28000, loss[loss=0.119, simple_loss=0.1912, pruned_loss=0.02336, over 4858.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2174, pruned_loss=0.03669, over 973403.17 frames.], batch size: 20, lr: 2.87e-04 2022-05-05 23:50:14,249 INFO [train.py:715] (0/8) Epoch 7, batch 28050, loss[loss=0.1465, simple_loss=0.2117, pruned_loss=0.04058, over 4983.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2173, pruned_loss=0.03658, over 973448.09 frames.], batch size: 14, lr: 2.87e-04 2022-05-05 23:50:54,414 INFO [train.py:715] (0/8) Epoch 7, batch 28100, loss[loss=0.1508, simple_loss=0.2216, pruned_loss=0.04001, over 4753.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2171, pruned_loss=0.03669, over 973247.29 frames.], batch size: 19, lr: 2.87e-04 2022-05-05 23:51:35,211 INFO [train.py:715] (0/8) Epoch 7, batch 28150, loss[loss=0.1192, simple_loss=0.1898, pruned_loss=0.02433, over 4944.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2172, pruned_loss=0.03667, over 973074.34 frames.], batch size: 21, lr: 2.87e-04 2022-05-05 23:52:16,642 INFO [train.py:715] (0/8) Epoch 7, batch 28200, loss[loss=0.1429, simple_loss=0.2083, pruned_loss=0.03876, over 4790.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2174, pruned_loss=0.03706, over 973016.23 frames.], batch size: 24, lr: 2.87e-04 2022-05-05 23:52:56,845 INFO [train.py:715] (0/8) Epoch 7, batch 28250, loss[loss=0.1566, simple_loss=0.2317, pruned_loss=0.04075, over 4760.00 frames.], tot_loss[loss=0.146, simple_loss=0.2179, pruned_loss=0.03703, over 972806.77 frames.], batch size: 18, lr: 2.87e-04 2022-05-05 23:53:38,367 INFO [train.py:715] (0/8) Epoch 7, batch 28300, loss[loss=0.1697, simple_loss=0.2315, pruned_loss=0.054, over 4813.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2174, pruned_loss=0.03642, over 972620.46 frames.], batch size: 27, lr: 2.86e-04 2022-05-05 23:54:05,433 INFO [checkpoint.py:70] (0/8) Saving 
checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-272000.pt 2022-05-05 23:54:21,477 INFO [train.py:715] (0/8) Epoch 7, batch 28350, loss[loss=0.1727, simple_loss=0.2297, pruned_loss=0.05791, over 4847.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2168, pruned_loss=0.03672, over 973059.75 frames.], batch size: 30, lr: 2.86e-04 2022-05-05 23:55:01,295 INFO [train.py:715] (0/8) Epoch 7, batch 28400, loss[loss=0.1369, simple_loss=0.2029, pruned_loss=0.0355, over 4771.00 frames.], tot_loss[loss=0.145, simple_loss=0.2168, pruned_loss=0.03665, over 973221.49 frames.], batch size: 19, lr: 2.86e-04 2022-05-05 23:55:40,834 INFO [train.py:715] (0/8) Epoch 7, batch 28450, loss[loss=0.1174, simple_loss=0.2082, pruned_loss=0.01327, over 4986.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2172, pruned_loss=0.0367, over 974850.22 frames.], batch size: 25, lr: 2.86e-04 2022-05-05 23:56:20,928 INFO [train.py:715] (0/8) Epoch 7, batch 28500, loss[loss=0.1285, simple_loss=0.1945, pruned_loss=0.03126, over 4900.00 frames.], tot_loss[loss=0.145, simple_loss=0.2166, pruned_loss=0.03666, over 974130.49 frames.], batch size: 18, lr: 2.86e-04 2022-05-05 23:57:01,425 INFO [train.py:715] (0/8) Epoch 7, batch 28550, loss[loss=0.1643, simple_loss=0.2309, pruned_loss=0.04882, over 4838.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2161, pruned_loss=0.03663, over 972603.89 frames.], batch size: 30, lr: 2.86e-04 2022-05-05 23:57:41,423 INFO [train.py:715] (0/8) Epoch 7, batch 28600, loss[loss=0.1484, simple_loss=0.2194, pruned_loss=0.03875, over 4775.00 frames.], tot_loss[loss=0.1449, simple_loss=0.2165, pruned_loss=0.03662, over 972350.46 frames.], batch size: 17, lr: 2.86e-04 2022-05-05 23:58:21,633 INFO [train.py:715] (0/8) Epoch 7, batch 28650, loss[loss=0.1416, simple_loss=0.2188, pruned_loss=0.03223, over 4839.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2167, pruned_loss=0.03636, over 972403.68 frames.], batch size: 30, lr: 2.86e-04 2022-05-05 23:59:03,077 INFO [train.py:715] (0/8) Epoch 7, batch 28700, loss[loss=0.1498, simple_loss=0.2292, pruned_loss=0.03514, over 4945.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2177, pruned_loss=0.03685, over 973011.15 frames.], batch size: 39, lr: 2.86e-04 2022-05-05 23:59:43,953 INFO [train.py:715] (0/8) Epoch 7, batch 28750, loss[loss=0.1607, simple_loss=0.2166, pruned_loss=0.05237, over 4815.00 frames.], tot_loss[loss=0.1466, simple_loss=0.2183, pruned_loss=0.03741, over 972280.20 frames.], batch size: 13, lr: 2.86e-04 2022-05-06 00:00:24,190 INFO [train.py:715] (0/8) Epoch 7, batch 28800, loss[loss=0.1439, simple_loss=0.2171, pruned_loss=0.03539, over 4912.00 frames.], tot_loss[loss=0.146, simple_loss=0.2178, pruned_loss=0.03712, over 972230.65 frames.], batch size: 18, lr: 2.86e-04 2022-05-06 00:01:04,802 INFO [train.py:715] (0/8) Epoch 7, batch 28850, loss[loss=0.1416, simple_loss=0.2118, pruned_loss=0.03571, over 4938.00 frames.], tot_loss[loss=0.1462, simple_loss=0.2183, pruned_loss=0.03707, over 972328.87 frames.], batch size: 39, lr: 2.86e-04 2022-05-06 00:01:45,168 INFO [train.py:715] (0/8) Epoch 7, batch 28900, loss[loss=0.155, simple_loss=0.2392, pruned_loss=0.03543, over 4757.00 frames.], tot_loss[loss=0.146, simple_loss=0.2183, pruned_loss=0.03689, over 972435.28 frames.], batch size: 16, lr: 2.86e-04 2022-05-06 00:02:24,694 INFO [train.py:715] (0/8) Epoch 7, batch 28950, loss[loss=0.1631, simple_loss=0.2296, pruned_loss=0.04834, over 4721.00 frames.], tot_loss[loss=0.1462, simple_loss=0.2184, pruned_loss=0.03701, over 972041.00 
frames.], batch size: 15, lr: 2.86e-04 2022-05-06 00:03:04,244 INFO [train.py:715] (0/8) Epoch 7, batch 29000, loss[loss=0.1358, simple_loss=0.2065, pruned_loss=0.03254, over 4873.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2183, pruned_loss=0.03677, over 972050.18 frames.], batch size: 39, lr: 2.86e-04 2022-05-06 00:03:44,903 INFO [train.py:715] (0/8) Epoch 7, batch 29050, loss[loss=0.1271, simple_loss=0.1982, pruned_loss=0.02804, over 4832.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2189, pruned_loss=0.03733, over 972269.38 frames.], batch size: 13, lr: 2.86e-04 2022-05-06 00:04:24,475 INFO [train.py:715] (0/8) Epoch 7, batch 29100, loss[loss=0.1796, simple_loss=0.2595, pruned_loss=0.04984, over 4840.00 frames.], tot_loss[loss=0.1468, simple_loss=0.2186, pruned_loss=0.03756, over 972435.70 frames.], batch size: 32, lr: 2.86e-04 2022-05-06 00:05:04,249 INFO [train.py:715] (0/8) Epoch 7, batch 29150, loss[loss=0.1449, simple_loss=0.2263, pruned_loss=0.03176, over 4859.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2186, pruned_loss=0.03759, over 972848.45 frames.], batch size: 20, lr: 2.86e-04 2022-05-06 00:05:44,142 INFO [train.py:715] (0/8) Epoch 7, batch 29200, loss[loss=0.139, simple_loss=0.2052, pruned_loss=0.03638, over 4898.00 frames.], tot_loss[loss=0.1469, simple_loss=0.2185, pruned_loss=0.03761, over 972469.01 frames.], batch size: 19, lr: 2.86e-04 2022-05-06 00:06:24,418 INFO [train.py:715] (0/8) Epoch 7, batch 29250, loss[loss=0.1313, simple_loss=0.2049, pruned_loss=0.02884, over 4834.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2172, pruned_loss=0.03703, over 973206.84 frames.], batch size: 25, lr: 2.86e-04 2022-05-06 00:07:04,326 INFO [train.py:715] (0/8) Epoch 7, batch 29300, loss[loss=0.1316, simple_loss=0.2029, pruned_loss=0.03016, over 4943.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2167, pruned_loss=0.037, over 972534.85 frames.], batch size: 21, lr: 2.86e-04 2022-05-06 00:07:44,012 INFO [train.py:715] (0/8) Epoch 7, batch 29350, loss[loss=0.1321, simple_loss=0.2136, pruned_loss=0.02533, over 4825.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2174, pruned_loss=0.03689, over 972803.66 frames.], batch size: 15, lr: 2.86e-04 2022-05-06 00:08:24,278 INFO [train.py:715] (0/8) Epoch 7, batch 29400, loss[loss=0.145, simple_loss=0.2213, pruned_loss=0.03434, over 4844.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2168, pruned_loss=0.03632, over 972575.91 frames.], batch size: 20, lr: 2.86e-04 2022-05-06 00:09:03,560 INFO [train.py:715] (0/8) Epoch 7, batch 29450, loss[loss=0.1489, simple_loss=0.234, pruned_loss=0.03195, over 4807.00 frames.], tot_loss[loss=0.145, simple_loss=0.2171, pruned_loss=0.03643, over 972442.09 frames.], batch size: 26, lr: 2.86e-04 2022-05-06 00:09:43,845 INFO [train.py:715] (0/8) Epoch 7, batch 29500, loss[loss=0.1527, simple_loss=0.2217, pruned_loss=0.04191, over 4955.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2173, pruned_loss=0.03667, over 972579.60 frames.], batch size: 39, lr: 2.86e-04 2022-05-06 00:10:23,571 INFO [train.py:715] (0/8) Epoch 7, batch 29550, loss[loss=0.1476, simple_loss=0.2219, pruned_loss=0.03661, over 4986.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2175, pruned_loss=0.03698, over 973182.76 frames.], batch size: 15, lr: 2.86e-04 2022-05-06 00:11:03,250 INFO [train.py:715] (0/8) Epoch 7, batch 29600, loss[loss=0.1759, simple_loss=0.2293, pruned_loss=0.06126, over 4752.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2176, pruned_loss=0.03682, over 972690.40 frames.], batch size: 16, lr: 
2.86e-04 2022-05-06 00:11:43,204 INFO [train.py:715] (0/8) Epoch 7, batch 29650, loss[loss=0.1547, simple_loss=0.2202, pruned_loss=0.04458, over 4821.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2171, pruned_loss=0.03665, over 972694.34 frames.], batch size: 26, lr: 2.86e-04 2022-05-06 00:12:23,001 INFO [train.py:715] (0/8) Epoch 7, batch 29700, loss[loss=0.1086, simple_loss=0.175, pruned_loss=0.02113, over 4862.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2171, pruned_loss=0.03658, over 971334.92 frames.], batch size: 13, lr: 2.86e-04 2022-05-06 00:13:02,661 INFO [train.py:715] (0/8) Epoch 7, batch 29750, loss[loss=0.1618, simple_loss=0.2292, pruned_loss=0.04723, over 4962.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2176, pruned_loss=0.0367, over 971256.52 frames.], batch size: 15, lr: 2.86e-04 2022-05-06 00:13:42,292 INFO [train.py:715] (0/8) Epoch 7, batch 29800, loss[loss=0.1369, simple_loss=0.2056, pruned_loss=0.03413, over 4895.00 frames.], tot_loss[loss=0.145, simple_loss=0.2172, pruned_loss=0.03639, over 971890.30 frames.], batch size: 19, lr: 2.86e-04 2022-05-06 00:14:22,409 INFO [train.py:715] (0/8) Epoch 7, batch 29850, loss[loss=0.1399, simple_loss=0.1962, pruned_loss=0.04179, over 4877.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2177, pruned_loss=0.03676, over 972622.89 frames.], batch size: 32, lr: 2.86e-04 2022-05-06 00:15:02,280 INFO [train.py:715] (0/8) Epoch 7, batch 29900, loss[loss=0.1315, simple_loss=0.2022, pruned_loss=0.0304, over 4893.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2169, pruned_loss=0.03618, over 972739.43 frames.], batch size: 19, lr: 2.86e-04 2022-05-06 00:15:41,861 INFO [train.py:715] (0/8) Epoch 7, batch 29950, loss[loss=0.1615, simple_loss=0.2323, pruned_loss=0.04535, over 4772.00 frames.], tot_loss[loss=0.1443, simple_loss=0.2163, pruned_loss=0.03616, over 973016.02 frames.], batch size: 18, lr: 2.86e-04 2022-05-06 00:16:21,223 INFO [train.py:715] (0/8) Epoch 7, batch 30000, loss[loss=0.1509, simple_loss=0.2289, pruned_loss=0.03648, over 4903.00 frames.], tot_loss[loss=0.1445, simple_loss=0.2165, pruned_loss=0.03619, over 972790.72 frames.], batch size: 17, lr: 2.86e-04 2022-05-06 00:16:21,224 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 00:16:41,747 INFO [train.py:742] (0/8) Epoch 7, validation: loss=0.1081, simple_loss=0.1929, pruned_loss=0.01164, over 914524.00 frames. 
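The checkpoint-272000.pt file written a little earlier in this stretch can be inspected offline. A hedged sketch, assuming the file is a torch-serialized dict with a "model" entry holding the state dict (the key layout is an assumption, not something these lines state):

    import torch

    ckpt = torch.load(
        "pruned_transducer_stateless2/exp/v2/checkpoint-272000.pt",
        map_location="cpu",
    )
    print(sorted(ckpt.keys()))           # e.g. "model" plus optimizer/sampler state (assumed)
    model_state = ckpt.get("model", {})  # fall back to an empty dict if the key differs
    print(sum(t.numel() for t in model_state.values()))  # rough parameter count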
2022-05-06 00:17:21,553 INFO [train.py:715] (0/8) Epoch 7, batch 30050, loss[loss=0.1788, simple_loss=0.2465, pruned_loss=0.05552, over 4819.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2167, pruned_loss=0.03609, over 973278.31 frames.], batch size: 25, lr: 2.86e-04 2022-05-06 00:18:00,846 INFO [train.py:715] (0/8) Epoch 7, batch 30100, loss[loss=0.1503, simple_loss=0.221, pruned_loss=0.03978, over 4747.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2162, pruned_loss=0.03581, over 972120.13 frames.], batch size: 16, lr: 2.86e-04 2022-05-06 00:18:40,785 INFO [train.py:715] (0/8) Epoch 7, batch 30150, loss[loss=0.1443, simple_loss=0.2144, pruned_loss=0.03716, over 4774.00 frames.], tot_loss[loss=0.1442, simple_loss=0.216, pruned_loss=0.03616, over 973248.20 frames.], batch size: 18, lr: 2.86e-04 2022-05-06 00:19:20,428 INFO [train.py:715] (0/8) Epoch 7, batch 30200, loss[loss=0.1456, simple_loss=0.2085, pruned_loss=0.04135, over 4982.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2157, pruned_loss=0.03575, over 972449.13 frames.], batch size: 14, lr: 2.85e-04 2022-05-06 00:20:00,694 INFO [train.py:715] (0/8) Epoch 7, batch 30250, loss[loss=0.1479, simple_loss=0.2298, pruned_loss=0.03297, over 4932.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2164, pruned_loss=0.03617, over 972838.71 frames.], batch size: 23, lr: 2.85e-04 2022-05-06 00:20:39,865 INFO [train.py:715] (0/8) Epoch 7, batch 30300, loss[loss=0.1452, simple_loss=0.2182, pruned_loss=0.03607, over 4975.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2175, pruned_loss=0.03674, over 972473.06 frames.], batch size: 31, lr: 2.85e-04 2022-05-06 00:21:19,489 INFO [train.py:715] (0/8) Epoch 7, batch 30350, loss[loss=0.1214, simple_loss=0.1962, pruned_loss=0.02333, over 4988.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2174, pruned_loss=0.03674, over 972596.17 frames.], batch size: 28, lr: 2.85e-04 2022-05-06 00:21:58,985 INFO [train.py:715] (0/8) Epoch 7, batch 30400, loss[loss=0.1555, simple_loss=0.2332, pruned_loss=0.03895, over 4883.00 frames.], tot_loss[loss=0.146, simple_loss=0.2178, pruned_loss=0.03704, over 972705.31 frames.], batch size: 16, lr: 2.85e-04 2022-05-06 00:22:38,975 INFO [train.py:715] (0/8) Epoch 7, batch 30450, loss[loss=0.1396, simple_loss=0.2149, pruned_loss=0.03216, over 4958.00 frames.], tot_loss[loss=0.145, simple_loss=0.217, pruned_loss=0.03655, over 972341.95 frames.], batch size: 28, lr: 2.85e-04 2022-05-06 00:23:18,895 INFO [train.py:715] (0/8) Epoch 7, batch 30500, loss[loss=0.1456, simple_loss=0.2111, pruned_loss=0.04002, over 4899.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2169, pruned_loss=0.03634, over 972237.73 frames.], batch size: 19, lr: 2.85e-04 2022-05-06 00:23:58,828 INFO [train.py:715] (0/8) Epoch 7, batch 30550, loss[loss=0.1311, simple_loss=0.2009, pruned_loss=0.03068, over 4967.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2164, pruned_loss=0.0362, over 972851.74 frames.], batch size: 24, lr: 2.85e-04 2022-05-06 00:24:38,527 INFO [train.py:715] (0/8) Epoch 7, batch 30600, loss[loss=0.1283, simple_loss=0.2009, pruned_loss=0.02781, over 4881.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2163, pruned_loss=0.03646, over 972857.95 frames.], batch size: 16, lr: 2.85e-04 2022-05-06 00:25:18,165 INFO [train.py:715] (0/8) Epoch 7, batch 30650, loss[loss=0.1366, simple_loss=0.212, pruned_loss=0.03057, over 4820.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2177, pruned_loss=0.03673, over 973138.18 frames.], batch size: 25, lr: 2.85e-04 2022-05-06 00:25:57,785 INFO 
[train.py:715] (0/8) Epoch 7, batch 30700, loss[loss=0.153, simple_loss=0.217, pruned_loss=0.04448, over 4890.00 frames.], tot_loss[loss=0.145, simple_loss=0.2171, pruned_loss=0.03644, over 972942.25 frames.], batch size: 22, lr: 2.85e-04 2022-05-06 00:26:36,833 INFO [train.py:715] (0/8) Epoch 7, batch 30750, loss[loss=0.1537, simple_loss=0.2198, pruned_loss=0.0438, over 4897.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2171, pruned_loss=0.03609, over 972905.88 frames.], batch size: 19, lr: 2.85e-04 2022-05-06 00:27:15,901 INFO [train.py:715] (0/8) Epoch 7, batch 30800, loss[loss=0.1536, simple_loss=0.2158, pruned_loss=0.04573, over 4848.00 frames.], tot_loss[loss=0.1456, simple_loss=0.218, pruned_loss=0.03662, over 972805.75 frames.], batch size: 32, lr: 2.85e-04 2022-05-06 00:27:55,685 INFO [train.py:715] (0/8) Epoch 7, batch 30850, loss[loss=0.1499, simple_loss=0.2102, pruned_loss=0.04475, over 4939.00 frames.], tot_loss[loss=0.1449, simple_loss=0.2172, pruned_loss=0.03628, over 972832.20 frames.], batch size: 21, lr: 2.85e-04 2022-05-06 00:28:35,188 INFO [train.py:715] (0/8) Epoch 7, batch 30900, loss[loss=0.1282, simple_loss=0.2104, pruned_loss=0.02297, over 4812.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2171, pruned_loss=0.03587, over 972304.20 frames.], batch size: 26, lr: 2.85e-04 2022-05-06 00:29:15,589 INFO [train.py:715] (0/8) Epoch 7, batch 30950, loss[loss=0.1649, simple_loss=0.2325, pruned_loss=0.0486, over 4845.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2163, pruned_loss=0.03563, over 972124.04 frames.], batch size: 30, lr: 2.85e-04 2022-05-06 00:29:54,978 INFO [train.py:715] (0/8) Epoch 7, batch 31000, loss[loss=0.1221, simple_loss=0.1954, pruned_loss=0.02443, over 4794.00 frames.], tot_loss[loss=0.1443, simple_loss=0.2163, pruned_loss=0.03609, over 972647.80 frames.], batch size: 14, lr: 2.85e-04 2022-05-06 00:30:34,536 INFO [train.py:715] (0/8) Epoch 7, batch 31050, loss[loss=0.1421, simple_loss=0.2144, pruned_loss=0.03494, over 4849.00 frames.], tot_loss[loss=0.1449, simple_loss=0.217, pruned_loss=0.03638, over 972563.67 frames.], batch size: 15, lr: 2.85e-04 2022-05-06 00:31:14,371 INFO [train.py:715] (0/8) Epoch 7, batch 31100, loss[loss=0.1441, simple_loss=0.2238, pruned_loss=0.0322, over 4982.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2175, pruned_loss=0.03673, over 973261.22 frames.], batch size: 35, lr: 2.85e-04 2022-05-06 00:31:54,490 INFO [train.py:715] (0/8) Epoch 7, batch 31150, loss[loss=0.1408, simple_loss=0.2081, pruned_loss=0.03674, over 4886.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2175, pruned_loss=0.0364, over 972877.05 frames.], batch size: 32, lr: 2.85e-04 2022-05-06 00:32:33,846 INFO [train.py:715] (0/8) Epoch 7, batch 31200, loss[loss=0.1303, simple_loss=0.1934, pruned_loss=0.03354, over 4838.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2169, pruned_loss=0.03633, over 972692.34 frames.], batch size: 13, lr: 2.85e-04 2022-05-06 00:33:13,814 INFO [train.py:715] (0/8) Epoch 7, batch 31250, loss[loss=0.1639, simple_loss=0.2321, pruned_loss=0.04786, over 4839.00 frames.], tot_loss[loss=0.1462, simple_loss=0.2179, pruned_loss=0.03723, over 972590.55 frames.], batch size: 30, lr: 2.85e-04 2022-05-06 00:33:54,538 INFO [train.py:715] (0/8) Epoch 7, batch 31300, loss[loss=0.1565, simple_loss=0.2215, pruned_loss=0.04571, over 4789.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2174, pruned_loss=0.03711, over 973421.80 frames.], batch size: 17, lr: 2.85e-04 2022-05-06 00:34:34,118 INFO [train.py:715] (0/8) Epoch 7, batch 
31350, loss[loss=0.1242, simple_loss=0.2059, pruned_loss=0.02128, over 4814.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2174, pruned_loss=0.03707, over 972805.87 frames.], batch size: 27, lr: 2.85e-04 2022-05-06 00:35:14,066 INFO [train.py:715] (0/8) Epoch 7, batch 31400, loss[loss=0.153, simple_loss=0.228, pruned_loss=0.03899, over 4776.00 frames.], tot_loss[loss=0.1465, simple_loss=0.218, pruned_loss=0.0375, over 972418.25 frames.], batch size: 18, lr: 2.85e-04 2022-05-06 00:35:53,409 INFO [train.py:715] (0/8) Epoch 7, batch 31450, loss[loss=0.1652, simple_loss=0.225, pruned_loss=0.05266, over 4905.00 frames.], tot_loss[loss=0.1472, simple_loss=0.2187, pruned_loss=0.03781, over 973311.94 frames.], batch size: 17, lr: 2.85e-04 2022-05-06 00:36:33,190 INFO [train.py:715] (0/8) Epoch 7, batch 31500, loss[loss=0.1557, simple_loss=0.2221, pruned_loss=0.04461, over 4787.00 frames.], tot_loss[loss=0.1466, simple_loss=0.2179, pruned_loss=0.03762, over 973463.82 frames.], batch size: 18, lr: 2.85e-04 2022-05-06 00:37:12,315 INFO [train.py:715] (0/8) Epoch 7, batch 31550, loss[loss=0.1388, simple_loss=0.2096, pruned_loss=0.03397, over 4819.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2178, pruned_loss=0.03755, over 973811.20 frames.], batch size: 27, lr: 2.85e-04 2022-05-06 00:37:52,277 INFO [train.py:715] (0/8) Epoch 7, batch 31600, loss[loss=0.1141, simple_loss=0.1841, pruned_loss=0.02202, over 4969.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2165, pruned_loss=0.03648, over 973699.40 frames.], batch size: 24, lr: 2.85e-04 2022-05-06 00:38:32,097 INFO [train.py:715] (0/8) Epoch 7, batch 31650, loss[loss=0.1373, simple_loss=0.218, pruned_loss=0.0283, over 4909.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2164, pruned_loss=0.03641, over 974862.10 frames.], batch size: 39, lr: 2.85e-04 2022-05-06 00:39:11,524 INFO [train.py:715] (0/8) Epoch 7, batch 31700, loss[loss=0.128, simple_loss=0.2031, pruned_loss=0.02651, over 4800.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2166, pruned_loss=0.03648, over 973983.19 frames.], batch size: 12, lr: 2.85e-04 2022-05-06 00:39:51,224 INFO [train.py:715] (0/8) Epoch 7, batch 31750, loss[loss=0.1439, simple_loss=0.2188, pruned_loss=0.03452, over 4882.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2173, pruned_loss=0.03714, over 973563.87 frames.], batch size: 19, lr: 2.85e-04 2022-05-06 00:40:30,487 INFO [train.py:715] (0/8) Epoch 7, batch 31800, loss[loss=0.1267, simple_loss=0.2004, pruned_loss=0.02644, over 4891.00 frames.], tot_loss[loss=0.145, simple_loss=0.2169, pruned_loss=0.03655, over 973177.46 frames.], batch size: 17, lr: 2.85e-04 2022-05-06 00:41:09,606 INFO [train.py:715] (0/8) Epoch 7, batch 31850, loss[loss=0.1418, simple_loss=0.2169, pruned_loss=0.03337, over 4916.00 frames.], tot_loss[loss=0.1443, simple_loss=0.2162, pruned_loss=0.03616, over 973416.64 frames.], batch size: 17, lr: 2.85e-04 2022-05-06 00:41:49,862 INFO [train.py:715] (0/8) Epoch 7, batch 31900, loss[loss=0.1183, simple_loss=0.1891, pruned_loss=0.02371, over 4929.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2159, pruned_loss=0.03599, over 973110.25 frames.], batch size: 29, lr: 2.85e-04 2022-05-06 00:42:30,602 INFO [train.py:715] (0/8) Epoch 7, batch 31950, loss[loss=0.1561, simple_loss=0.2238, pruned_loss=0.04418, over 4911.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2159, pruned_loss=0.03595, over 973660.95 frames.], batch size: 19, lr: 2.85e-04 2022-05-06 00:43:11,063 INFO [train.py:715] (0/8) Epoch 7, batch 32000, loss[loss=0.1708, 
simple_loss=0.247, pruned_loss=0.04733, over 4917.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2163, pruned_loss=0.03651, over 972737.99 frames.], batch size: 29, lr: 2.85e-04 2022-05-06 00:43:50,731 INFO [train.py:715] (0/8) Epoch 7, batch 32050, loss[loss=0.1487, simple_loss=0.2183, pruned_loss=0.03954, over 4767.00 frames.], tot_loss[loss=0.1452, simple_loss=0.217, pruned_loss=0.03671, over 971693.73 frames.], batch size: 18, lr: 2.85e-04 2022-05-06 00:44:30,663 INFO [train.py:715] (0/8) Epoch 7, batch 32100, loss[loss=0.1858, simple_loss=0.2477, pruned_loss=0.06197, over 4944.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2173, pruned_loss=0.03699, over 971804.50 frames.], batch size: 23, lr: 2.85e-04 2022-05-06 00:45:10,478 INFO [train.py:715] (0/8) Epoch 7, batch 32150, loss[loss=0.1516, simple_loss=0.2221, pruned_loss=0.04051, over 4940.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2172, pruned_loss=0.03723, over 972260.77 frames.], batch size: 21, lr: 2.84e-04 2022-05-06 00:45:50,030 INFO [train.py:715] (0/8) Epoch 7, batch 32200, loss[loss=0.1687, simple_loss=0.2272, pruned_loss=0.0551, over 4836.00 frames.], tot_loss[loss=0.1459, simple_loss=0.217, pruned_loss=0.03743, over 971826.56 frames.], batch size: 30, lr: 2.84e-04 2022-05-06 00:46:29,882 INFO [train.py:715] (0/8) Epoch 7, batch 32250, loss[loss=0.1238, simple_loss=0.1927, pruned_loss=0.02741, over 4775.00 frames.], tot_loss[loss=0.1459, simple_loss=0.217, pruned_loss=0.03743, over 971420.65 frames.], batch size: 12, lr: 2.84e-04 2022-05-06 00:47:09,674 INFO [train.py:715] (0/8) Epoch 7, batch 32300, loss[loss=0.1377, simple_loss=0.2107, pruned_loss=0.03231, over 4981.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2179, pruned_loss=0.03761, over 972211.96 frames.], batch size: 28, lr: 2.84e-04 2022-05-06 00:47:50,011 INFO [train.py:715] (0/8) Epoch 7, batch 32350, loss[loss=0.1207, simple_loss=0.2064, pruned_loss=0.01749, over 4763.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2171, pruned_loss=0.03668, over 972639.46 frames.], batch size: 16, lr: 2.84e-04 2022-05-06 00:48:29,371 INFO [train.py:715] (0/8) Epoch 7, batch 32400, loss[loss=0.1259, simple_loss=0.1964, pruned_loss=0.02769, over 4875.00 frames.], tot_loss[loss=0.1454, simple_loss=0.217, pruned_loss=0.03692, over 973443.81 frames.], batch size: 16, lr: 2.84e-04 2022-05-06 00:49:09,262 INFO [train.py:715] (0/8) Epoch 7, batch 32450, loss[loss=0.1192, simple_loss=0.1933, pruned_loss=0.02257, over 4877.00 frames.], tot_loss[loss=0.146, simple_loss=0.2179, pruned_loss=0.03703, over 973444.93 frames.], batch size: 16, lr: 2.84e-04 2022-05-06 00:49:48,736 INFO [train.py:715] (0/8) Epoch 7, batch 32500, loss[loss=0.1433, simple_loss=0.2203, pruned_loss=0.03318, over 4805.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2176, pruned_loss=0.03695, over 972957.15 frames.], batch size: 21, lr: 2.84e-04 2022-05-06 00:50:28,300 INFO [train.py:715] (0/8) Epoch 7, batch 32550, loss[loss=0.1697, simple_loss=0.2394, pruned_loss=0.05, over 4772.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2175, pruned_loss=0.03672, over 973395.41 frames.], batch size: 19, lr: 2.84e-04 2022-05-06 00:51:08,052 INFO [train.py:715] (0/8) Epoch 7, batch 32600, loss[loss=0.1159, simple_loss=0.1831, pruned_loss=0.02436, over 4775.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2169, pruned_loss=0.03663, over 972848.63 frames.], batch size: 12, lr: 2.84e-04 2022-05-06 00:51:47,565 INFO [train.py:715] (0/8) Epoch 7, batch 32650, loss[loss=0.1372, simple_loss=0.2156, pruned_loss=0.02938, 
over 4926.00 frames.], tot_loss[loss=0.145, simple_loss=0.2171, pruned_loss=0.03642, over 972158.88 frames.], batch size: 39, lr: 2.84e-04 2022-05-06 00:52:27,384 INFO [train.py:715] (0/8) Epoch 7, batch 32700, loss[loss=0.1165, simple_loss=0.1858, pruned_loss=0.02355, over 4980.00 frames.], tot_loss[loss=0.1449, simple_loss=0.2168, pruned_loss=0.03646, over 972232.20 frames.], batch size: 33, lr: 2.84e-04 2022-05-06 00:53:06,815 INFO [train.py:715] (0/8) Epoch 7, batch 32750, loss[loss=0.1414, simple_loss=0.2245, pruned_loss=0.02918, over 4974.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2168, pruned_loss=0.03665, over 971798.38 frames.], batch size: 24, lr: 2.84e-04 2022-05-06 00:53:47,305 INFO [train.py:715] (0/8) Epoch 7, batch 32800, loss[loss=0.1501, simple_loss=0.2221, pruned_loss=0.03905, over 4962.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2176, pruned_loss=0.03699, over 971605.66 frames.], batch size: 24, lr: 2.84e-04 2022-05-06 00:54:27,991 INFO [train.py:715] (0/8) Epoch 7, batch 32850, loss[loss=0.1315, simple_loss=0.2028, pruned_loss=0.0301, over 4967.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2171, pruned_loss=0.03665, over 971291.69 frames.], batch size: 24, lr: 2.84e-04 2022-05-06 00:55:08,133 INFO [train.py:715] (0/8) Epoch 7, batch 32900, loss[loss=0.1196, simple_loss=0.1937, pruned_loss=0.02272, over 4813.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2171, pruned_loss=0.0367, over 971410.92 frames.], batch size: 14, lr: 2.84e-04 2022-05-06 00:55:48,470 INFO [train.py:715] (0/8) Epoch 7, batch 32950, loss[loss=0.1309, simple_loss=0.2124, pruned_loss=0.02468, over 4844.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2169, pruned_loss=0.0363, over 971321.58 frames.], batch size: 20, lr: 2.84e-04 2022-05-06 00:56:28,427 INFO [train.py:715] (0/8) Epoch 7, batch 33000, loss[loss=0.1576, simple_loss=0.2267, pruned_loss=0.04423, over 4905.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2168, pruned_loss=0.03618, over 971642.61 frames.], batch size: 17, lr: 2.84e-04 2022-05-06 00:56:28,428 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 00:56:38,008 INFO [train.py:742] (0/8) Epoch 7, validation: loss=0.108, simple_loss=0.1927, pruned_loss=0.01164, over 914524.00 frames. 
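Within this span the learning-rate column drifts down slowly (2.89e-04 near batch 23050, 2.83e-04 by batch 34100) and steps down further when epoch 8 starts a little later in the log (2.69e-04). One schedule consistent with that shape is an Eden-style rule of the form below; treating this as the schedule actually used for this run is an assumption, since the log only prints the resulting values:

    \mathrm{lr}(b, e) = \mathrm{lr}_0
        \left(\frac{b^2 + b_0^2}{b_0^2}\right)^{-1/4}
        \left(\frac{e^2 + e_0^2}{e_0^2}\right)^{-1/4}

where b is the cumulative batch count, e is the epoch index, and lr_0, b_0, e_0 are fixed hyperparameters of the schedule.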
2022-05-06 00:57:17,518 INFO [train.py:715] (0/8) Epoch 7, batch 33050, loss[loss=0.1699, simple_loss=0.2421, pruned_loss=0.04887, over 4956.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2163, pruned_loss=0.03647, over 971660.27 frames.], batch size: 21, lr: 2.84e-04 2022-05-06 00:57:57,503 INFO [train.py:715] (0/8) Epoch 7, batch 33100, loss[loss=0.1139, simple_loss=0.1738, pruned_loss=0.02705, over 4790.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2166, pruned_loss=0.03634, over 971343.10 frames.], batch size: 14, lr: 2.84e-04 2022-05-06 00:58:36,954 INFO [train.py:715] (0/8) Epoch 7, batch 33150, loss[loss=0.1594, simple_loss=0.2316, pruned_loss=0.04362, over 4967.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2165, pruned_loss=0.03634, over 972186.53 frames.], batch size: 24, lr: 2.84e-04 2022-05-06 00:59:16,724 INFO [train.py:715] (0/8) Epoch 7, batch 33200, loss[loss=0.168, simple_loss=0.2319, pruned_loss=0.05201, over 4791.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2162, pruned_loss=0.03661, over 972164.10 frames.], batch size: 17, lr: 2.84e-04 2022-05-06 00:59:56,294 INFO [train.py:715] (0/8) Epoch 7, batch 33250, loss[loss=0.1696, simple_loss=0.2549, pruned_loss=0.04218, over 4784.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2175, pruned_loss=0.03706, over 971818.58 frames.], batch size: 17, lr: 2.84e-04 2022-05-06 01:00:35,755 INFO [train.py:715] (0/8) Epoch 7, batch 33300, loss[loss=0.1595, simple_loss=0.223, pruned_loss=0.04798, over 4854.00 frames.], tot_loss[loss=0.145, simple_loss=0.2167, pruned_loss=0.0366, over 971923.24 frames.], batch size: 32, lr: 2.84e-04 2022-05-06 01:01:15,275 INFO [train.py:715] (0/8) Epoch 7, batch 33350, loss[loss=0.1165, simple_loss=0.189, pruned_loss=0.02203, over 4825.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2162, pruned_loss=0.03666, over 971675.63 frames.], batch size: 26, lr: 2.84e-04 2022-05-06 01:01:55,573 INFO [train.py:715] (0/8) Epoch 7, batch 33400, loss[loss=0.1512, simple_loss=0.2253, pruned_loss=0.03852, over 4807.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2163, pruned_loss=0.03626, over 972556.00 frames.], batch size: 21, lr: 2.84e-04 2022-05-06 01:02:35,668 INFO [train.py:715] (0/8) Epoch 7, batch 33450, loss[loss=0.1471, simple_loss=0.2225, pruned_loss=0.03579, over 4920.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2158, pruned_loss=0.0356, over 971753.89 frames.], batch size: 19, lr: 2.84e-04 2022-05-06 01:03:16,252 INFO [train.py:715] (0/8) Epoch 7, batch 33500, loss[loss=0.1366, simple_loss=0.2121, pruned_loss=0.03059, over 4986.00 frames.], tot_loss[loss=0.1442, simple_loss=0.2166, pruned_loss=0.03585, over 972030.19 frames.], batch size: 28, lr: 2.84e-04 2022-05-06 01:03:56,830 INFO [train.py:715] (0/8) Epoch 7, batch 33550, loss[loss=0.1366, simple_loss=0.2032, pruned_loss=0.03496, over 4892.00 frames.], tot_loss[loss=0.1445, simple_loss=0.2169, pruned_loss=0.03604, over 971704.53 frames.], batch size: 16, lr: 2.84e-04 2022-05-06 01:04:37,432 INFO [train.py:715] (0/8) Epoch 7, batch 33600, loss[loss=0.1576, simple_loss=0.2177, pruned_loss=0.04876, over 4865.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2174, pruned_loss=0.0365, over 971445.52 frames.], batch size: 32, lr: 2.84e-04 2022-05-06 01:05:17,937 INFO [train.py:715] (0/8) Epoch 7, batch 33650, loss[loss=0.1235, simple_loss=0.1954, pruned_loss=0.02584, over 4747.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2178, pruned_loss=0.03655, over 971561.79 frames.], batch size: 16, lr: 2.84e-04 2022-05-06 01:05:57,808 INFO 
[train.py:715] (0/8) Epoch 7, batch 33700, loss[loss=0.1924, simple_loss=0.2559, pruned_loss=0.06444, over 4908.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2173, pruned_loss=0.0367, over 971919.72 frames.], batch size: 17, lr: 2.84e-04 2022-05-06 01:06:37,960 INFO [train.py:715] (0/8) Epoch 7, batch 33750, loss[loss=0.1392, simple_loss=0.2065, pruned_loss=0.03595, over 4968.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2172, pruned_loss=0.03668, over 972167.69 frames.], batch size: 14, lr: 2.84e-04 2022-05-06 01:07:17,449 INFO [train.py:715] (0/8) Epoch 7, batch 33800, loss[loss=0.1355, simple_loss=0.1949, pruned_loss=0.03805, over 4800.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2174, pruned_loss=0.03656, over 971725.26 frames.], batch size: 25, lr: 2.84e-04 2022-05-06 01:07:58,042 INFO [train.py:715] (0/8) Epoch 7, batch 33850, loss[loss=0.1285, simple_loss=0.1948, pruned_loss=0.03113, over 4797.00 frames.], tot_loss[loss=0.1445, simple_loss=0.2167, pruned_loss=0.03611, over 972463.74 frames.], batch size: 14, lr: 2.84e-04 2022-05-06 01:08:37,723 INFO [train.py:715] (0/8) Epoch 7, batch 33900, loss[loss=0.1384, simple_loss=0.209, pruned_loss=0.03389, over 4765.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2162, pruned_loss=0.03557, over 972331.25 frames.], batch size: 19, lr: 2.84e-04 2022-05-06 01:09:17,826 INFO [train.py:715] (0/8) Epoch 7, batch 33950, loss[loss=0.1465, simple_loss=0.2246, pruned_loss=0.03423, over 4760.00 frames.], tot_loss[loss=0.144, simple_loss=0.2169, pruned_loss=0.03554, over 972336.25 frames.], batch size: 19, lr: 2.84e-04 2022-05-06 01:09:57,283 INFO [train.py:715] (0/8) Epoch 7, batch 34000, loss[loss=0.1414, simple_loss=0.2145, pruned_loss=0.03413, over 4912.00 frames.], tot_loss[loss=0.1442, simple_loss=0.217, pruned_loss=0.03573, over 972731.79 frames.], batch size: 19, lr: 2.84e-04 2022-05-06 01:10:37,474 INFO [train.py:715] (0/8) Epoch 7, batch 34050, loss[loss=0.1512, simple_loss=0.2163, pruned_loss=0.043, over 4907.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2172, pruned_loss=0.03608, over 973609.37 frames.], batch size: 19, lr: 2.84e-04 2022-05-06 01:11:17,475 INFO [train.py:715] (0/8) Epoch 7, batch 34100, loss[loss=0.1552, simple_loss=0.2248, pruned_loss=0.04279, over 4930.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2193, pruned_loss=0.03708, over 974607.34 frames.], batch size: 23, lr: 2.83e-04 2022-05-06 01:11:57,001 INFO [train.py:715] (0/8) Epoch 7, batch 34150, loss[loss=0.1307, simple_loss=0.206, pruned_loss=0.02768, over 4875.00 frames.], tot_loss[loss=0.146, simple_loss=0.2183, pruned_loss=0.03686, over 974203.12 frames.], batch size: 30, lr: 2.83e-04 2022-05-06 01:12:37,401 INFO [train.py:715] (0/8) Epoch 7, batch 34200, loss[loss=0.1659, simple_loss=0.2363, pruned_loss=0.04778, over 4825.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2177, pruned_loss=0.03673, over 973300.36 frames.], batch size: 15, lr: 2.83e-04 2022-05-06 01:13:17,638 INFO [train.py:715] (0/8) Epoch 7, batch 34250, loss[loss=0.1076, simple_loss=0.1877, pruned_loss=0.01378, over 4935.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2175, pruned_loss=0.03638, over 973444.59 frames.], batch size: 29, lr: 2.83e-04 2022-05-06 01:13:58,299 INFO [train.py:715] (0/8) Epoch 7, batch 34300, loss[loss=0.1754, simple_loss=0.2397, pruned_loss=0.05556, over 4955.00 frames.], tot_loss[loss=0.1454, simple_loss=0.218, pruned_loss=0.0364, over 973337.51 frames.], batch size: 15, lr: 2.83e-04 2022-05-06 01:14:38,113 INFO [train.py:715] (0/8) Epoch 7, batch 
34350, loss[loss=0.1284, simple_loss=0.2025, pruned_loss=0.02719, over 4949.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2175, pruned_loss=0.03595, over 972821.01 frames.], batch size: 21, lr: 2.83e-04 2022-05-06 01:15:18,244 INFO [train.py:715] (0/8) Epoch 7, batch 34400, loss[loss=0.1552, simple_loss=0.2221, pruned_loss=0.04414, over 4788.00 frames.], tot_loss[loss=0.1449, simple_loss=0.2173, pruned_loss=0.03625, over 972576.91 frames.], batch size: 21, lr: 2.83e-04 2022-05-06 01:15:58,920 INFO [train.py:715] (0/8) Epoch 7, batch 34450, loss[loss=0.1464, simple_loss=0.2214, pruned_loss=0.03571, over 4973.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2182, pruned_loss=0.03671, over 972074.73 frames.], batch size: 14, lr: 2.83e-04 2022-05-06 01:16:38,144 INFO [train.py:715] (0/8) Epoch 7, batch 34500, loss[loss=0.1189, simple_loss=0.1907, pruned_loss=0.0236, over 4866.00 frames.], tot_loss[loss=0.1467, simple_loss=0.2191, pruned_loss=0.03712, over 972194.76 frames.], batch size: 16, lr: 2.83e-04 2022-05-06 01:17:18,208 INFO [train.py:715] (0/8) Epoch 7, batch 34550, loss[loss=0.1509, simple_loss=0.211, pruned_loss=0.0454, over 4845.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2182, pruned_loss=0.03683, over 972849.70 frames.], batch size: 20, lr: 2.83e-04 2022-05-06 01:17:58,845 INFO [train.py:715] (0/8) Epoch 7, batch 34600, loss[loss=0.1461, simple_loss=0.2245, pruned_loss=0.03387, over 4986.00 frames.], tot_loss[loss=0.145, simple_loss=0.217, pruned_loss=0.03645, over 972763.58 frames.], batch size: 28, lr: 2.83e-04 2022-05-06 01:18:38,810 INFO [train.py:715] (0/8) Epoch 7, batch 34650, loss[loss=0.174, simple_loss=0.2493, pruned_loss=0.0493, over 4816.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2173, pruned_loss=0.0369, over 972953.15 frames.], batch size: 25, lr: 2.83e-04 2022-05-06 01:19:19,025 INFO [train.py:715] (0/8) Epoch 7, batch 34700, loss[loss=0.1524, simple_loss=0.2264, pruned_loss=0.03919, over 4846.00 frames.], tot_loss[loss=0.1457, simple_loss=0.2176, pruned_loss=0.03689, over 972108.24 frames.], batch size: 26, lr: 2.83e-04 2022-05-06 01:19:57,500 INFO [train.py:715] (0/8) Epoch 7, batch 34750, loss[loss=0.1457, simple_loss=0.215, pruned_loss=0.0382, over 4766.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2174, pruned_loss=0.03675, over 971427.30 frames.], batch size: 12, lr: 2.83e-04 2022-05-06 01:20:35,931 INFO [train.py:715] (0/8) Epoch 7, batch 34800, loss[loss=0.1381, simple_loss=0.1982, pruned_loss=0.03903, over 4777.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2166, pruned_loss=0.0365, over 970310.74 frames.], batch size: 12, lr: 2.83e-04 2022-05-06 01:20:44,700 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/epoch-7.pt 2022-05-06 01:21:27,009 INFO [train.py:715] (0/8) Epoch 8, batch 0, loss[loss=0.1502, simple_loss=0.2334, pruned_loss=0.03355, over 4819.00 frames.], tot_loss[loss=0.1502, simple_loss=0.2334, pruned_loss=0.03355, over 4819.00 frames.], batch size: 25, lr: 2.69e-04 2022-05-06 01:22:06,297 INFO [train.py:715] (0/8) Epoch 8, batch 50, loss[loss=0.1458, simple_loss=0.2215, pruned_loss=0.03506, over 4780.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2168, pruned_loss=0.03498, over 219726.68 frames.], batch size: 14, lr: 2.69e-04 2022-05-06 01:22:47,062 INFO [train.py:715] (0/8) Epoch 8, batch 100, loss[loss=0.1302, simple_loss=0.1985, pruned_loss=0.03098, over 4986.00 frames.], tot_loss[loss=0.143, simple_loss=0.2156, pruned_loss=0.03522, over 387228.63 frames.], batch size: 25, lr: 
2.69e-04 2022-05-06 01:23:26,800 INFO [train.py:715] (0/8) Epoch 8, batch 150, loss[loss=0.1622, simple_loss=0.2259, pruned_loss=0.0493, over 4750.00 frames.], tot_loss[loss=0.1425, simple_loss=0.214, pruned_loss=0.03549, over 517093.91 frames.], batch size: 16, lr: 2.69e-04 2022-05-06 01:24:07,303 INFO [train.py:715] (0/8) Epoch 8, batch 200, loss[loss=0.1484, simple_loss=0.2224, pruned_loss=0.03724, over 4985.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2139, pruned_loss=0.03543, over 617918.26 frames.], batch size: 40, lr: 2.69e-04 2022-05-06 01:24:47,112 INFO [train.py:715] (0/8) Epoch 8, batch 250, loss[loss=0.1304, simple_loss=0.2147, pruned_loss=0.02303, over 4926.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2134, pruned_loss=0.03493, over 696588.68 frames.], batch size: 29, lr: 2.69e-04 2022-05-06 01:25:27,369 INFO [train.py:715] (0/8) Epoch 8, batch 300, loss[loss=0.1152, simple_loss=0.1917, pruned_loss=0.01937, over 4865.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2127, pruned_loss=0.03456, over 757076.01 frames.], batch size: 16, lr: 2.69e-04 2022-05-06 01:26:07,149 INFO [train.py:715] (0/8) Epoch 8, batch 350, loss[loss=0.1376, simple_loss=0.2084, pruned_loss=0.03341, over 4805.00 frames.], tot_loss[loss=0.1409, simple_loss=0.213, pruned_loss=0.03439, over 805124.99 frames.], batch size: 26, lr: 2.69e-04 2022-05-06 01:26:46,034 INFO [train.py:715] (0/8) Epoch 8, batch 400, loss[loss=0.1471, simple_loss=0.2168, pruned_loss=0.03871, over 4802.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2133, pruned_loss=0.03458, over 842726.38 frames.], batch size: 21, lr: 2.69e-04 2022-05-06 01:27:26,631 INFO [train.py:715] (0/8) Epoch 8, batch 450, loss[loss=0.1726, simple_loss=0.2357, pruned_loss=0.05476, over 4875.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2144, pruned_loss=0.03502, over 870657.84 frames.], batch size: 32, lr: 2.69e-04 2022-05-06 01:28:06,606 INFO [train.py:715] (0/8) Epoch 8, batch 500, loss[loss=0.113, simple_loss=0.1906, pruned_loss=0.01767, over 4986.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2158, pruned_loss=0.03586, over 893308.09 frames.], batch size: 28, lr: 2.69e-04 2022-05-06 01:28:47,251 INFO [train.py:715] (0/8) Epoch 8, batch 550, loss[loss=0.146, simple_loss=0.2218, pruned_loss=0.0351, over 4986.00 frames.], tot_loss[loss=0.1435, simple_loss=0.216, pruned_loss=0.03549, over 911125.61 frames.], batch size: 28, lr: 2.69e-04 2022-05-06 01:29:26,913 INFO [train.py:715] (0/8) Epoch 8, batch 600, loss[loss=0.1345, simple_loss=0.2139, pruned_loss=0.02754, over 4829.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2166, pruned_loss=0.0361, over 924851.65 frames.], batch size: 15, lr: 2.69e-04 2022-05-06 01:30:07,133 INFO [train.py:715] (0/8) Epoch 8, batch 650, loss[loss=0.1909, simple_loss=0.2532, pruned_loss=0.06431, over 4883.00 frames.], tot_loss[loss=0.1445, simple_loss=0.2166, pruned_loss=0.0362, over 934680.60 frames.], batch size: 16, lr: 2.68e-04 2022-05-06 01:30:47,388 INFO [train.py:715] (0/8) Epoch 8, batch 700, loss[loss=0.1403, simple_loss=0.2161, pruned_loss=0.03228, over 4747.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2163, pruned_loss=0.03598, over 942265.77 frames.], batch size: 19, lr: 2.68e-04 2022-05-06 01:31:27,084 INFO [train.py:715] (0/8) Epoch 8, batch 750, loss[loss=0.1163, simple_loss=0.1845, pruned_loss=0.02406, over 4953.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2161, pruned_loss=0.03563, over 948515.89 frames.], batch size: 14, lr: 2.68e-04 2022-05-06 01:32:07,141 INFO [train.py:715] (0/8) Epoch 
8, batch 800, loss[loss=0.1279, simple_loss=0.201, pruned_loss=0.02736, over 4783.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2156, pruned_loss=0.03547, over 953499.50 frames.], batch size: 17, lr: 2.68e-04 2022-05-06 01:32:47,131 INFO [train.py:715] (0/8) Epoch 8, batch 850, loss[loss=0.1446, simple_loss=0.2246, pruned_loss=0.03233, over 4941.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2157, pruned_loss=0.03565, over 957697.67 frames.], batch size: 21, lr: 2.68e-04 2022-05-06 01:33:28,550 INFO [train.py:715] (0/8) Epoch 8, batch 900, loss[loss=0.1574, simple_loss=0.2293, pruned_loss=0.04279, over 4821.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2157, pruned_loss=0.03585, over 961268.53 frames.], batch size: 21, lr: 2.68e-04 2022-05-06 01:34:08,652 INFO [train.py:715] (0/8) Epoch 8, batch 950, loss[loss=0.1543, simple_loss=0.2238, pruned_loss=0.04244, over 4860.00 frames.], tot_loss[loss=0.1449, simple_loss=0.2167, pruned_loss=0.03657, over 964196.66 frames.], batch size: 34, lr: 2.68e-04 2022-05-06 01:34:49,697 INFO [train.py:715] (0/8) Epoch 8, batch 1000, loss[loss=0.1248, simple_loss=0.2042, pruned_loss=0.02272, over 4827.00 frames.], tot_loss[loss=0.1452, simple_loss=0.217, pruned_loss=0.0367, over 966223.61 frames.], batch size: 13, lr: 2.68e-04 2022-05-06 01:35:30,785 INFO [train.py:715] (0/8) Epoch 8, batch 1050, loss[loss=0.1416, simple_loss=0.21, pruned_loss=0.03662, over 4824.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2166, pruned_loss=0.03653, over 967457.37 frames.], batch size: 30, lr: 2.68e-04 2022-05-06 01:36:11,900 INFO [train.py:715] (0/8) Epoch 8, batch 1100, loss[loss=0.1519, simple_loss=0.2256, pruned_loss=0.03909, over 4983.00 frames.], tot_loss[loss=0.1449, simple_loss=0.2166, pruned_loss=0.03655, over 968288.84 frames.], batch size: 25, lr: 2.68e-04 2022-05-06 01:36:52,402 INFO [train.py:715] (0/8) Epoch 8, batch 1150, loss[loss=0.1634, simple_loss=0.23, pruned_loss=0.0484, over 4789.00 frames.], tot_loss[loss=0.1449, simple_loss=0.2165, pruned_loss=0.0366, over 968544.27 frames.], batch size: 17, lr: 2.68e-04 2022-05-06 01:37:33,430 INFO [train.py:715] (0/8) Epoch 8, batch 1200, loss[loss=0.1537, simple_loss=0.219, pruned_loss=0.04421, over 4925.00 frames.], tot_loss[loss=0.1449, simple_loss=0.2166, pruned_loss=0.03655, over 969440.96 frames.], batch size: 18, lr: 2.68e-04 2022-05-06 01:38:14,753 INFO [train.py:715] (0/8) Epoch 8, batch 1250, loss[loss=0.1329, simple_loss=0.2023, pruned_loss=0.03179, over 4889.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2164, pruned_loss=0.03656, over 970221.01 frames.], batch size: 22, lr: 2.68e-04 2022-05-06 01:38:55,091 INFO [train.py:715] (0/8) Epoch 8, batch 1300, loss[loss=0.1267, simple_loss=0.1955, pruned_loss=0.02895, over 4950.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2165, pruned_loss=0.03652, over 970372.89 frames.], batch size: 21, lr: 2.68e-04 2022-05-06 01:39:36,446 INFO [train.py:715] (0/8) Epoch 8, batch 1350, loss[loss=0.142, simple_loss=0.2087, pruned_loss=0.03766, over 4783.00 frames.], tot_loss[loss=0.1445, simple_loss=0.2165, pruned_loss=0.03628, over 970737.78 frames.], batch size: 12, lr: 2.68e-04 2022-05-06 01:40:17,095 INFO [train.py:715] (0/8) Epoch 8, batch 1400, loss[loss=0.131, simple_loss=0.2, pruned_loss=0.03097, over 4803.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2165, pruned_loss=0.03639, over 971155.33 frames.], batch size: 12, lr: 2.68e-04 2022-05-06 01:40:57,929 INFO [train.py:715] (0/8) Epoch 8, batch 1450, loss[loss=0.1807, simple_loss=0.252, 
pruned_loss=0.05469, over 4889.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2174, pruned_loss=0.03685, over 971938.36 frames.], batch size: 16, lr: 2.68e-04 2022-05-06 01:41:37,778 INFO [train.py:715] (0/8) Epoch 8, batch 1500, loss[loss=0.1127, simple_loss=0.178, pruned_loss=0.02377, over 4886.00 frames.], tot_loss[loss=0.1443, simple_loss=0.2166, pruned_loss=0.03597, over 972330.56 frames.], batch size: 22, lr: 2.68e-04 2022-05-06 01:41:56,727 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-280000.pt 2022-05-06 01:42:20,410 INFO [train.py:715] (0/8) Epoch 8, batch 1550, loss[loss=0.1444, simple_loss=0.2181, pruned_loss=0.03532, over 4769.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2169, pruned_loss=0.03636, over 972856.77 frames.], batch size: 19, lr: 2.68e-04 2022-05-06 01:43:00,530 INFO [train.py:715] (0/8) Epoch 8, batch 1600, loss[loss=0.1456, simple_loss=0.2133, pruned_loss=0.039, over 4635.00 frames.], tot_loss[loss=0.1448, simple_loss=0.217, pruned_loss=0.03625, over 972463.76 frames.], batch size: 13, lr: 2.68e-04 2022-05-06 01:43:39,973 INFO [train.py:715] (0/8) Epoch 8, batch 1650, loss[loss=0.1712, simple_loss=0.2321, pruned_loss=0.05519, over 4902.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2164, pruned_loss=0.03621, over 972440.19 frames.], batch size: 17, lr: 2.68e-04 2022-05-06 01:44:20,203 INFO [train.py:715] (0/8) Epoch 8, batch 1700, loss[loss=0.1601, simple_loss=0.2342, pruned_loss=0.04297, over 4849.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2162, pruned_loss=0.03626, over 972069.64 frames.], batch size: 30, lr: 2.68e-04 2022-05-06 01:44:59,606 INFO [train.py:715] (0/8) Epoch 8, batch 1750, loss[loss=0.1716, simple_loss=0.2467, pruned_loss=0.04821, over 4985.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2171, pruned_loss=0.03617, over 972542.78 frames.], batch size: 25, lr: 2.68e-04 2022-05-06 01:45:39,054 INFO [train.py:715] (0/8) Epoch 8, batch 1800, loss[loss=0.1453, simple_loss=0.2144, pruned_loss=0.03814, over 4847.00 frames.], tot_loss[loss=0.1445, simple_loss=0.2169, pruned_loss=0.0361, over 972917.74 frames.], batch size: 30, lr: 2.68e-04 2022-05-06 01:46:18,112 INFO [train.py:715] (0/8) Epoch 8, batch 1850, loss[loss=0.1572, simple_loss=0.2275, pruned_loss=0.04342, over 4693.00 frames.], tot_loss[loss=0.1448, simple_loss=0.217, pruned_loss=0.03628, over 972629.18 frames.], batch size: 15, lr: 2.68e-04 2022-05-06 01:46:57,508 INFO [train.py:715] (0/8) Epoch 8, batch 1900, loss[loss=0.1294, simple_loss=0.1947, pruned_loss=0.03208, over 4838.00 frames.], tot_loss[loss=0.145, simple_loss=0.2171, pruned_loss=0.03644, over 973398.43 frames.], batch size: 30, lr: 2.68e-04 2022-05-06 01:47:37,011 INFO [train.py:715] (0/8) Epoch 8, batch 1950, loss[loss=0.1396, simple_loss=0.2069, pruned_loss=0.03612, over 4751.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2158, pruned_loss=0.03598, over 971973.15 frames.], batch size: 19, lr: 2.68e-04 2022-05-06 01:48:16,130 INFO [train.py:715] (0/8) Epoch 8, batch 2000, loss[loss=0.147, simple_loss=0.2198, pruned_loss=0.03716, over 4926.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2167, pruned_loss=0.03677, over 971485.51 frames.], batch size: 18, lr: 2.68e-04 2022-05-06 01:48:56,142 INFO [train.py:715] (0/8) Epoch 8, batch 2050, loss[loss=0.1529, simple_loss=0.2167, pruned_loss=0.0446, over 4987.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2165, pruned_loss=0.0365, over 971584.32 frames.], batch size: 28, lr: 2.68e-04 2022-05-06 01:49:35,099 INFO 
[train.py:715] (0/8) Epoch 8, batch 2100, loss[loss=0.1469, simple_loss=0.2215, pruned_loss=0.03616, over 4807.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2162, pruned_loss=0.03658, over 971077.82 frames.], batch size: 21, lr: 2.68e-04 2022-05-06 01:50:14,047 INFO [train.py:715] (0/8) Epoch 8, batch 2150, loss[loss=0.1822, simple_loss=0.2493, pruned_loss=0.0575, over 4699.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2157, pruned_loss=0.0361, over 970997.82 frames.], batch size: 15, lr: 2.68e-04 2022-05-06 01:50:53,032 INFO [train.py:715] (0/8) Epoch 8, batch 2200, loss[loss=0.1408, simple_loss=0.2181, pruned_loss=0.03182, over 4823.00 frames.], tot_loss[loss=0.144, simple_loss=0.2157, pruned_loss=0.03619, over 971347.11 frames.], batch size: 27, lr: 2.68e-04 2022-05-06 01:51:32,659 INFO [train.py:715] (0/8) Epoch 8, batch 2250, loss[loss=0.1401, simple_loss=0.2121, pruned_loss=0.034, over 4947.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2156, pruned_loss=0.03626, over 972364.55 frames.], batch size: 29, lr: 2.68e-04 2022-05-06 01:52:12,075 INFO [train.py:715] (0/8) Epoch 8, batch 2300, loss[loss=0.1253, simple_loss=0.1985, pruned_loss=0.02603, over 4960.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2153, pruned_loss=0.03574, over 973252.08 frames.], batch size: 15, lr: 2.68e-04 2022-05-06 01:52:50,785 INFO [train.py:715] (0/8) Epoch 8, batch 2350, loss[loss=0.1426, simple_loss=0.2182, pruned_loss=0.0335, over 4849.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2159, pruned_loss=0.03595, over 972913.41 frames.], batch size: 15, lr: 2.68e-04 2022-05-06 01:53:30,840 INFO [train.py:715] (0/8) Epoch 8, batch 2400, loss[loss=0.1413, simple_loss=0.2123, pruned_loss=0.03515, over 4705.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2149, pruned_loss=0.0359, over 973216.72 frames.], batch size: 15, lr: 2.68e-04 2022-05-06 01:54:10,337 INFO [train.py:715] (0/8) Epoch 8, batch 2450, loss[loss=0.137, simple_loss=0.2165, pruned_loss=0.02882, over 4768.00 frames.], tot_loss[loss=0.1449, simple_loss=0.2162, pruned_loss=0.03684, over 972666.27 frames.], batch size: 19, lr: 2.68e-04 2022-05-06 01:54:49,892 INFO [train.py:715] (0/8) Epoch 8, batch 2500, loss[loss=0.1583, simple_loss=0.2378, pruned_loss=0.03938, over 4976.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2163, pruned_loss=0.03659, over 972900.82 frames.], batch size: 14, lr: 2.68e-04 2022-05-06 01:55:28,673 INFO [train.py:715] (0/8) Epoch 8, batch 2550, loss[loss=0.1345, simple_loss=0.2147, pruned_loss=0.02717, over 4850.00 frames.], tot_loss[loss=0.1443, simple_loss=0.2159, pruned_loss=0.03631, over 972865.87 frames.], batch size: 13, lr: 2.68e-04 2022-05-06 01:56:08,301 INFO [train.py:715] (0/8) Epoch 8, batch 2600, loss[loss=0.1294, simple_loss=0.2043, pruned_loss=0.02728, over 4966.00 frames.], tot_loss[loss=0.1442, simple_loss=0.2156, pruned_loss=0.0364, over 972326.97 frames.], batch size: 15, lr: 2.68e-04 2022-05-06 01:56:47,550 INFO [train.py:715] (0/8) Epoch 8, batch 2650, loss[loss=0.1342, simple_loss=0.2148, pruned_loss=0.02678, over 4835.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2154, pruned_loss=0.03606, over 972207.29 frames.], batch size: 32, lr: 2.68e-04 2022-05-06 01:57:27,032 INFO [train.py:715] (0/8) Epoch 8, batch 2700, loss[loss=0.1501, simple_loss=0.2225, pruned_loss=0.03886, over 4709.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2151, pruned_loss=0.03569, over 972082.99 frames.], batch size: 15, lr: 2.68e-04 2022-05-06 01:58:06,370 INFO [train.py:715] (0/8) Epoch 8, batch 2750, 
loss[loss=0.1423, simple_loss=0.207, pruned_loss=0.03879, over 4986.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2155, pruned_loss=0.03567, over 972646.60 frames.], batch size: 28, lr: 2.67e-04 2022-05-06 01:58:45,748 INFO [train.py:715] (0/8) Epoch 8, batch 2800, loss[loss=0.1345, simple_loss=0.2067, pruned_loss=0.03114, over 4819.00 frames.], tot_loss[loss=0.1428, simple_loss=0.215, pruned_loss=0.03531, over 973090.84 frames.], batch size: 25, lr: 2.67e-04 2022-05-06 01:59:24,995 INFO [train.py:715] (0/8) Epoch 8, batch 2850, loss[loss=0.1367, simple_loss=0.2119, pruned_loss=0.03069, over 4681.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2153, pruned_loss=0.03546, over 973414.78 frames.], batch size: 15, lr: 2.67e-04 2022-05-06 02:00:03,840 INFO [train.py:715] (0/8) Epoch 8, batch 2900, loss[loss=0.1427, simple_loss=0.2214, pruned_loss=0.03203, over 4928.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2151, pruned_loss=0.0353, over 973382.86 frames.], batch size: 18, lr: 2.67e-04 2022-05-06 02:00:43,806 INFO [train.py:715] (0/8) Epoch 8, batch 2950, loss[loss=0.1395, simple_loss=0.2159, pruned_loss=0.03154, over 4814.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2145, pruned_loss=0.03498, over 973425.01 frames.], batch size: 25, lr: 2.67e-04 2022-05-06 02:01:22,465 INFO [train.py:715] (0/8) Epoch 8, batch 3000, loss[loss=0.1561, simple_loss=0.2262, pruned_loss=0.04303, over 4746.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2142, pruned_loss=0.03473, over 973788.18 frames.], batch size: 16, lr: 2.67e-04 2022-05-06 02:01:22,466 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 02:01:32,130 INFO [train.py:742] (0/8) Epoch 8, validation: loss=0.1076, simple_loss=0.1923, pruned_loss=0.0115, over 914524.00 frames. 2022-05-06 02:02:11,362 INFO [train.py:715] (0/8) Epoch 8, batch 3050, loss[loss=0.1255, simple_loss=0.2077, pruned_loss=0.02167, over 4801.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2146, pruned_loss=0.03489, over 974298.30 frames.], batch size: 24, lr: 2.67e-04 2022-05-06 02:02:50,367 INFO [train.py:715] (0/8) Epoch 8, batch 3100, loss[loss=0.1293, simple_loss=0.1969, pruned_loss=0.03078, over 4917.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2153, pruned_loss=0.03509, over 975231.84 frames.], batch size: 19, lr: 2.67e-04 2022-05-06 02:03:29,323 INFO [train.py:715] (0/8) Epoch 8, batch 3150, loss[loss=0.1578, simple_loss=0.2274, pruned_loss=0.04406, over 4881.00 frames.], tot_loss[loss=0.143, simple_loss=0.216, pruned_loss=0.03501, over 974197.98 frames.], batch size: 16, lr: 2.67e-04 2022-05-06 02:04:09,013 INFO [train.py:715] (0/8) Epoch 8, batch 3200, loss[loss=0.1244, simple_loss=0.2029, pruned_loss=0.02291, over 4967.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2154, pruned_loss=0.03489, over 973525.40 frames.], batch size: 24, lr: 2.67e-04 2022-05-06 02:04:48,444 INFO [train.py:715] (0/8) Epoch 8, batch 3250, loss[loss=0.1307, simple_loss=0.1978, pruned_loss=0.0318, over 4799.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2165, pruned_loss=0.03561, over 972765.22 frames.], batch size: 21, lr: 2.67e-04 2022-05-06 02:05:28,478 INFO [train.py:715] (0/8) Epoch 8, batch 3300, loss[loss=0.1567, simple_loss=0.2258, pruned_loss=0.0438, over 4703.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2162, pruned_loss=0.03566, over 972540.25 frames.], batch size: 15, lr: 2.67e-04 2022-05-06 02:06:08,833 INFO [train.py:715] (0/8) Epoch 8, batch 3350, loss[loss=0.1567, simple_loss=0.2242, pruned_loss=0.04458, over 4946.00 frames.], 
tot_loss[loss=0.1437, simple_loss=0.2159, pruned_loss=0.03573, over 971755.66 frames.], batch size: 39, lr: 2.67e-04 2022-05-06 02:06:49,934 INFO [train.py:715] (0/8) Epoch 8, batch 3400, loss[loss=0.1389, simple_loss=0.2155, pruned_loss=0.03116, over 4757.00 frames.], tot_loss[loss=0.143, simple_loss=0.2155, pruned_loss=0.03528, over 972561.81 frames.], batch size: 14, lr: 2.67e-04 2022-05-06 02:07:30,801 INFO [train.py:715] (0/8) Epoch 8, batch 3450, loss[loss=0.1146, simple_loss=0.1837, pruned_loss=0.02273, over 4939.00 frames.], tot_loss[loss=0.1436, simple_loss=0.216, pruned_loss=0.03562, over 972120.88 frames.], batch size: 29, lr: 2.67e-04 2022-05-06 02:08:11,008 INFO [train.py:715] (0/8) Epoch 8, batch 3500, loss[loss=0.1628, simple_loss=0.2277, pruned_loss=0.04897, over 4797.00 frames.], tot_loss[loss=0.1443, simple_loss=0.2162, pruned_loss=0.03622, over 972886.92 frames.], batch size: 17, lr: 2.67e-04 2022-05-06 02:08:52,347 INFO [train.py:715] (0/8) Epoch 8, batch 3550, loss[loss=0.1411, simple_loss=0.2096, pruned_loss=0.03632, over 4940.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2168, pruned_loss=0.03638, over 971914.24 frames.], batch size: 29, lr: 2.67e-04 2022-05-06 02:09:33,200 INFO [train.py:715] (0/8) Epoch 8, batch 3600, loss[loss=0.1106, simple_loss=0.1853, pruned_loss=0.01789, over 4797.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2162, pruned_loss=0.03645, over 972309.80 frames.], batch size: 12, lr: 2.67e-04 2022-05-06 02:10:13,454 INFO [train.py:715] (0/8) Epoch 8, batch 3650, loss[loss=0.155, simple_loss=0.2228, pruned_loss=0.0436, over 4869.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2172, pruned_loss=0.03686, over 972303.01 frames.], batch size: 20, lr: 2.67e-04 2022-05-06 02:10:53,933 INFO [train.py:715] (0/8) Epoch 8, batch 3700, loss[loss=0.1331, simple_loss=0.2024, pruned_loss=0.03192, over 4690.00 frames.], tot_loss[loss=0.1439, simple_loss=0.216, pruned_loss=0.03587, over 972011.80 frames.], batch size: 15, lr: 2.67e-04 2022-05-06 02:11:34,276 INFO [train.py:715] (0/8) Epoch 8, batch 3750, loss[loss=0.09542, simple_loss=0.1627, pruned_loss=0.01409, over 4796.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2152, pruned_loss=0.03529, over 972214.40 frames.], batch size: 12, lr: 2.67e-04 2022-05-06 02:12:13,640 INFO [train.py:715] (0/8) Epoch 8, batch 3800, loss[loss=0.1532, simple_loss=0.2207, pruned_loss=0.04284, over 4927.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2155, pruned_loss=0.03567, over 972070.35 frames.], batch size: 18, lr: 2.67e-04 2022-05-06 02:12:54,029 INFO [train.py:715] (0/8) Epoch 8, batch 3850, loss[loss=0.1574, simple_loss=0.2255, pruned_loss=0.0447, over 4775.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2156, pruned_loss=0.03567, over 971803.36 frames.], batch size: 17, lr: 2.67e-04 2022-05-06 02:13:34,217 INFO [train.py:715] (0/8) Epoch 8, batch 3900, loss[loss=0.1387, simple_loss=0.2028, pruned_loss=0.03728, over 4907.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2156, pruned_loss=0.03546, over 971654.71 frames.], batch size: 17, lr: 2.67e-04 2022-05-06 02:14:14,989 INFO [train.py:715] (0/8) Epoch 8, batch 3950, loss[loss=0.1537, simple_loss=0.2277, pruned_loss=0.03983, over 4783.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2157, pruned_loss=0.03533, over 971158.08 frames.], batch size: 18, lr: 2.67e-04 2022-05-06 02:14:54,900 INFO [train.py:715] (0/8) Epoch 8, batch 4000, loss[loss=0.1316, simple_loss=0.2051, pruned_loss=0.0291, over 4796.00 frames.], tot_loss[loss=0.143, simple_loss=0.2154, 
pruned_loss=0.03529, over 970378.50 frames.], batch size: 17, lr: 2.67e-04 2022-05-06 02:15:35,362 INFO [train.py:715] (0/8) Epoch 8, batch 4050, loss[loss=0.1345, simple_loss=0.2105, pruned_loss=0.02925, over 4935.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2149, pruned_loss=0.03496, over 970521.39 frames.], batch size: 23, lr: 2.67e-04 2022-05-06 02:16:16,171 INFO [train.py:715] (0/8) Epoch 8, batch 4100, loss[loss=0.1413, simple_loss=0.205, pruned_loss=0.03878, over 4847.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2147, pruned_loss=0.03517, over 970384.66 frames.], batch size: 34, lr: 2.67e-04 2022-05-06 02:16:55,923 INFO [train.py:715] (0/8) Epoch 8, batch 4150, loss[loss=0.1538, simple_loss=0.2193, pruned_loss=0.04412, over 4805.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2156, pruned_loss=0.03594, over 970162.51 frames.], batch size: 25, lr: 2.67e-04 2022-05-06 02:17:35,658 INFO [train.py:715] (0/8) Epoch 8, batch 4200, loss[loss=0.1464, simple_loss=0.2178, pruned_loss=0.03752, over 4947.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2163, pruned_loss=0.0362, over 971133.39 frames.], batch size: 21, lr: 2.67e-04 2022-05-06 02:18:15,230 INFO [train.py:715] (0/8) Epoch 8, batch 4250, loss[loss=0.1402, simple_loss=0.2036, pruned_loss=0.03845, over 4815.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2164, pruned_loss=0.0365, over 971956.99 frames.], batch size: 12, lr: 2.67e-04 2022-05-06 02:18:54,986 INFO [train.py:715] (0/8) Epoch 8, batch 4300, loss[loss=0.14, simple_loss=0.216, pruned_loss=0.032, over 4913.00 frames.], tot_loss[loss=0.1449, simple_loss=0.2167, pruned_loss=0.03648, over 971180.63 frames.], batch size: 23, lr: 2.67e-04 2022-05-06 02:19:34,150 INFO [train.py:715] (0/8) Epoch 8, batch 4350, loss[loss=0.1712, simple_loss=0.2479, pruned_loss=0.04722, over 4733.00 frames.], tot_loss[loss=0.1451, simple_loss=0.217, pruned_loss=0.03659, over 971793.88 frames.], batch size: 16, lr: 2.67e-04 2022-05-06 02:20:13,542 INFO [train.py:715] (0/8) Epoch 8, batch 4400, loss[loss=0.1462, simple_loss=0.2305, pruned_loss=0.03093, over 4902.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2173, pruned_loss=0.03672, over 971373.56 frames.], batch size: 19, lr: 2.67e-04 2022-05-06 02:20:53,463 INFO [train.py:715] (0/8) Epoch 8, batch 4450, loss[loss=0.14, simple_loss=0.2046, pruned_loss=0.03774, over 4726.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2171, pruned_loss=0.03697, over 972382.78 frames.], batch size: 16, lr: 2.67e-04 2022-05-06 02:21:33,235 INFO [train.py:715] (0/8) Epoch 8, batch 4500, loss[loss=0.1238, simple_loss=0.2016, pruned_loss=0.02299, over 4816.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2162, pruned_loss=0.03654, over 972748.99 frames.], batch size: 27, lr: 2.67e-04 2022-05-06 02:22:12,203 INFO [train.py:715] (0/8) Epoch 8, batch 4550, loss[loss=0.1483, simple_loss=0.2213, pruned_loss=0.03762, over 4971.00 frames.], tot_loss[loss=0.1443, simple_loss=0.216, pruned_loss=0.03634, over 972363.79 frames.], batch size: 39, lr: 2.67e-04 2022-05-06 02:22:52,185 INFO [train.py:715] (0/8) Epoch 8, batch 4600, loss[loss=0.1536, simple_loss=0.2304, pruned_loss=0.03845, over 4712.00 frames.], tot_loss[loss=0.1444, simple_loss=0.216, pruned_loss=0.0364, over 971042.57 frames.], batch size: 15, lr: 2.67e-04 2022-05-06 02:23:31,718 INFO [train.py:715] (0/8) Epoch 8, batch 4650, loss[loss=0.158, simple_loss=0.2297, pruned_loss=0.04315, over 4912.00 frames.], tot_loss[loss=0.145, simple_loss=0.2166, pruned_loss=0.03675, over 972840.94 frames.], batch size: 
23, lr: 2.67e-04 2022-05-06 02:24:11,298 INFO [train.py:715] (0/8) Epoch 8, batch 4700, loss[loss=0.1518, simple_loss=0.2188, pruned_loss=0.04239, over 4963.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2165, pruned_loss=0.03646, over 972303.62 frames.], batch size: 35, lr: 2.67e-04 2022-05-06 02:24:50,827 INFO [train.py:715] (0/8) Epoch 8, batch 4750, loss[loss=0.1236, simple_loss=0.2018, pruned_loss=0.02266, over 4820.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2163, pruned_loss=0.03648, over 972641.54 frames.], batch size: 25, lr: 2.67e-04 2022-05-06 02:25:30,486 INFO [train.py:715] (0/8) Epoch 8, batch 4800, loss[loss=0.1674, simple_loss=0.2278, pruned_loss=0.05355, over 4688.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2164, pruned_loss=0.03647, over 973239.93 frames.], batch size: 15, lr: 2.67e-04 2022-05-06 02:26:10,388 INFO [train.py:715] (0/8) Epoch 8, batch 4850, loss[loss=0.1557, simple_loss=0.2334, pruned_loss=0.03898, over 4811.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2159, pruned_loss=0.03618, over 972434.92 frames.], batch size: 27, lr: 2.66e-04 2022-05-06 02:26:49,513 INFO [train.py:715] (0/8) Epoch 8, batch 4900, loss[loss=0.1668, simple_loss=0.2452, pruned_loss=0.04417, over 4980.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2159, pruned_loss=0.03558, over 972362.57 frames.], batch size: 24, lr: 2.66e-04 2022-05-06 02:27:29,276 INFO [train.py:715] (0/8) Epoch 8, batch 4950, loss[loss=0.1341, simple_loss=0.2075, pruned_loss=0.03039, over 4814.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2159, pruned_loss=0.03545, over 971839.59 frames.], batch size: 25, lr: 2.66e-04 2022-05-06 02:28:08,940 INFO [train.py:715] (0/8) Epoch 8, batch 5000, loss[loss=0.1492, simple_loss=0.2242, pruned_loss=0.03714, over 4834.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2147, pruned_loss=0.03489, over 971279.64 frames.], batch size: 30, lr: 2.66e-04 2022-05-06 02:28:47,811 INFO [train.py:715] (0/8) Epoch 8, batch 5050, loss[loss=0.1468, simple_loss=0.2137, pruned_loss=0.03994, over 4987.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2145, pruned_loss=0.03493, over 972193.36 frames.], batch size: 14, lr: 2.66e-04 2022-05-06 02:29:26,962 INFO [train.py:715] (0/8) Epoch 8, batch 5100, loss[loss=0.1338, simple_loss=0.2094, pruned_loss=0.02907, over 4928.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2141, pruned_loss=0.03464, over 972146.98 frames.], batch size: 18, lr: 2.66e-04 2022-05-06 02:30:06,422 INFO [train.py:715] (0/8) Epoch 8, batch 5150, loss[loss=0.1499, simple_loss=0.218, pruned_loss=0.04089, over 4946.00 frames.], tot_loss[loss=0.1419, simple_loss=0.2144, pruned_loss=0.03474, over 971564.39 frames.], batch size: 18, lr: 2.66e-04 2022-05-06 02:30:45,327 INFO [train.py:715] (0/8) Epoch 8, batch 5200, loss[loss=0.1557, simple_loss=0.2235, pruned_loss=0.04401, over 4848.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2136, pruned_loss=0.03461, over 971839.04 frames.], batch size: 16, lr: 2.66e-04 2022-05-06 02:31:24,025 INFO [train.py:715] (0/8) Epoch 8, batch 5250, loss[loss=0.1753, simple_loss=0.2412, pruned_loss=0.05469, over 4865.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2149, pruned_loss=0.03541, over 972403.79 frames.], batch size: 32, lr: 2.66e-04 2022-05-06 02:32:04,133 INFO [train.py:715] (0/8) Epoch 8, batch 5300, loss[loss=0.1806, simple_loss=0.2394, pruned_loss=0.06094, over 4856.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2159, pruned_loss=0.03561, over 973090.17 frames.], batch size: 32, lr: 2.66e-04 2022-05-06 02:32:43,755 INFO 
[train.py:715] (0/8) Epoch 8, batch 5350, loss[loss=0.1377, simple_loss=0.2097, pruned_loss=0.03289, over 4759.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2156, pruned_loss=0.03558, over 972732.58 frames.], batch size: 19, lr: 2.66e-04 2022-05-06 02:33:23,691 INFO [train.py:715] (0/8) Epoch 8, batch 5400, loss[loss=0.1573, simple_loss=0.2226, pruned_loss=0.04598, over 4792.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2159, pruned_loss=0.03579, over 973112.03 frames.], batch size: 24, lr: 2.66e-04 2022-05-06 02:34:04,180 INFO [train.py:715] (0/8) Epoch 8, batch 5450, loss[loss=0.1445, simple_loss=0.2034, pruned_loss=0.04282, over 4962.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2162, pruned_loss=0.03561, over 973817.05 frames.], batch size: 35, lr: 2.66e-04 2022-05-06 02:34:44,675 INFO [train.py:715] (0/8) Epoch 8, batch 5500, loss[loss=0.1665, simple_loss=0.2432, pruned_loss=0.04491, over 4783.00 frames.], tot_loss[loss=0.1434, simple_loss=0.216, pruned_loss=0.03542, over 973875.27 frames.], batch size: 18, lr: 2.66e-04 2022-05-06 02:35:24,970 INFO [train.py:715] (0/8) Epoch 8, batch 5550, loss[loss=0.1179, simple_loss=0.1953, pruned_loss=0.02029, over 4770.00 frames.], tot_loss[loss=0.144, simple_loss=0.2164, pruned_loss=0.03577, over 972992.06 frames.], batch size: 18, lr: 2.66e-04 2022-05-06 02:36:04,809 INFO [train.py:715] (0/8) Epoch 8, batch 5600, loss[loss=0.1478, simple_loss=0.2128, pruned_loss=0.0414, over 4867.00 frames.], tot_loss[loss=0.1445, simple_loss=0.2171, pruned_loss=0.03592, over 971850.27 frames.], batch size: 16, lr: 2.66e-04 2022-05-06 02:36:44,875 INFO [train.py:715] (0/8) Epoch 8, batch 5650, loss[loss=0.1463, simple_loss=0.2294, pruned_loss=0.03164, over 4718.00 frames.], tot_loss[loss=0.144, simple_loss=0.2162, pruned_loss=0.03594, over 970782.62 frames.], batch size: 15, lr: 2.66e-04 2022-05-06 02:37:23,998 INFO [train.py:715] (0/8) Epoch 8, batch 5700, loss[loss=0.134, simple_loss=0.2041, pruned_loss=0.03193, over 4779.00 frames.], tot_loss[loss=0.144, simple_loss=0.2159, pruned_loss=0.03602, over 972047.87 frames.], batch size: 17, lr: 2.66e-04 2022-05-06 02:38:03,515 INFO [train.py:715] (0/8) Epoch 8, batch 5750, loss[loss=0.1634, simple_loss=0.2293, pruned_loss=0.04872, over 4982.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2158, pruned_loss=0.03577, over 971939.06 frames.], batch size: 35, lr: 2.66e-04 2022-05-06 02:38:42,299 INFO [train.py:715] (0/8) Epoch 8, batch 5800, loss[loss=0.1377, simple_loss=0.2055, pruned_loss=0.03494, over 4838.00 frames.], tot_loss[loss=0.1445, simple_loss=0.2169, pruned_loss=0.03607, over 972404.78 frames.], batch size: 15, lr: 2.66e-04 2022-05-06 02:39:21,798 INFO [train.py:715] (0/8) Epoch 8, batch 5850, loss[loss=0.1357, simple_loss=0.2162, pruned_loss=0.02758, over 4757.00 frames.], tot_loss[loss=0.144, simple_loss=0.2164, pruned_loss=0.03579, over 972173.98 frames.], batch size: 19, lr: 2.66e-04 2022-05-06 02:40:00,567 INFO [train.py:715] (0/8) Epoch 8, batch 5900, loss[loss=0.1266, simple_loss=0.2001, pruned_loss=0.02655, over 4765.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2161, pruned_loss=0.03537, over 971978.76 frames.], batch size: 16, lr: 2.66e-04 2022-05-06 02:40:40,148 INFO [train.py:715] (0/8) Epoch 8, batch 5950, loss[loss=0.1586, simple_loss=0.2237, pruned_loss=0.0467, over 4642.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2162, pruned_loss=0.03563, over 971866.71 frames.], batch size: 13, lr: 2.66e-04 2022-05-06 02:41:20,034 INFO [train.py:715] (0/8) Epoch 8, batch 6000, 
loss[loss=0.1243, simple_loss=0.1947, pruned_loss=0.02699, over 4863.00 frames.], tot_loss[loss=0.1436, simple_loss=0.216, pruned_loss=0.03561, over 972675.29 frames.], batch size: 20, lr: 2.66e-04 2022-05-06 02:41:20,034 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 02:41:29,608 INFO [train.py:742] (0/8) Epoch 8, validation: loss=0.1075, simple_loss=0.1921, pruned_loss=0.01146, over 914524.00 frames. 2022-05-06 02:42:09,072 INFO [train.py:715] (0/8) Epoch 8, batch 6050, loss[loss=0.1242, simple_loss=0.2009, pruned_loss=0.02377, over 4805.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2168, pruned_loss=0.03625, over 972530.74 frames.], batch size: 25, lr: 2.66e-04 2022-05-06 02:42:48,765 INFO [train.py:715] (0/8) Epoch 8, batch 6100, loss[loss=0.1627, simple_loss=0.2241, pruned_loss=0.05063, over 4865.00 frames.], tot_loss[loss=0.1449, simple_loss=0.2172, pruned_loss=0.03628, over 972981.03 frames.], batch size: 32, lr: 2.66e-04 2022-05-06 02:43:28,430 INFO [train.py:715] (0/8) Epoch 8, batch 6150, loss[loss=0.1486, simple_loss=0.2259, pruned_loss=0.03565, over 4863.00 frames.], tot_loss[loss=0.1443, simple_loss=0.2169, pruned_loss=0.03583, over 973174.79 frames.], batch size: 20, lr: 2.66e-04 2022-05-06 02:44:08,981 INFO [train.py:715] (0/8) Epoch 8, batch 6200, loss[loss=0.1843, simple_loss=0.256, pruned_loss=0.05625, over 4763.00 frames.], tot_loss[loss=0.1442, simple_loss=0.2163, pruned_loss=0.03603, over 973136.35 frames.], batch size: 16, lr: 2.66e-04 2022-05-06 02:44:49,470 INFO [train.py:715] (0/8) Epoch 8, batch 6250, loss[loss=0.1612, simple_loss=0.2292, pruned_loss=0.04657, over 4652.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2153, pruned_loss=0.03579, over 973054.66 frames.], batch size: 13, lr: 2.66e-04 2022-05-06 02:45:29,140 INFO [train.py:715] (0/8) Epoch 8, batch 6300, loss[loss=0.1531, simple_loss=0.2315, pruned_loss=0.03737, over 4860.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2151, pruned_loss=0.03528, over 973012.00 frames.], batch size: 16, lr: 2.66e-04 2022-05-06 02:46:08,059 INFO [train.py:715] (0/8) Epoch 8, batch 6350, loss[loss=0.152, simple_loss=0.2184, pruned_loss=0.04278, over 4964.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2157, pruned_loss=0.03563, over 973173.53 frames.], batch size: 24, lr: 2.66e-04 2022-05-06 02:46:47,829 INFO [train.py:715] (0/8) Epoch 8, batch 6400, loss[loss=0.1321, simple_loss=0.1972, pruned_loss=0.03352, over 4945.00 frames.], tot_loss[loss=0.1439, simple_loss=0.216, pruned_loss=0.03596, over 973379.68 frames.], batch size: 35, lr: 2.66e-04 2022-05-06 02:47:27,061 INFO [train.py:715] (0/8) Epoch 8, batch 6450, loss[loss=0.1526, simple_loss=0.2166, pruned_loss=0.04427, over 4858.00 frames.], tot_loss[loss=0.1443, simple_loss=0.2165, pruned_loss=0.03606, over 972859.25 frames.], batch size: 32, lr: 2.66e-04 2022-05-06 02:48:06,512 INFO [train.py:715] (0/8) Epoch 8, batch 6500, loss[loss=0.1547, simple_loss=0.2241, pruned_loss=0.04268, over 4746.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2177, pruned_loss=0.0366, over 972481.37 frames.], batch size: 12, lr: 2.66e-04 2022-05-06 02:48:45,638 INFO [train.py:715] (0/8) Epoch 8, batch 6550, loss[loss=0.1491, simple_loss=0.2169, pruned_loss=0.04059, over 4846.00 frames.], tot_loss[loss=0.145, simple_loss=0.2173, pruned_loss=0.03633, over 972219.43 frames.], batch size: 15, lr: 2.66e-04 2022-05-06 02:49:25,291 INFO [train.py:715] (0/8) Epoch 8, batch 6600, loss[loss=0.135, simple_loss=0.2116, pruned_loss=0.02923, over 4927.00 frames.], 
tot_loss[loss=0.1445, simple_loss=0.2169, pruned_loss=0.03601, over 972674.59 frames.], batch size: 18, lr: 2.66e-04 2022-05-06 02:50:04,617 INFO [train.py:715] (0/8) Epoch 8, batch 6650, loss[loss=0.1403, simple_loss=0.2163, pruned_loss=0.03216, over 4917.00 frames.], tot_loss[loss=0.1443, simple_loss=0.2166, pruned_loss=0.03603, over 972649.64 frames.], batch size: 17, lr: 2.66e-04 2022-05-06 02:50:43,402 INFO [train.py:715] (0/8) Epoch 8, batch 6700, loss[loss=0.1594, simple_loss=0.2355, pruned_loss=0.04163, over 4903.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2161, pruned_loss=0.03606, over 973304.49 frames.], batch size: 22, lr: 2.66e-04 2022-05-06 02:51:23,630 INFO [train.py:715] (0/8) Epoch 8, batch 6750, loss[loss=0.1283, simple_loss=0.2018, pruned_loss=0.02743, over 4964.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2168, pruned_loss=0.03641, over 973585.73 frames.], batch size: 24, lr: 2.66e-04 2022-05-06 02:52:03,055 INFO [train.py:715] (0/8) Epoch 8, batch 6800, loss[loss=0.1398, simple_loss=0.2068, pruned_loss=0.03642, over 4887.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2159, pruned_loss=0.03577, over 974340.07 frames.], batch size: 16, lr: 2.66e-04 2022-05-06 02:52:42,025 INFO [train.py:715] (0/8) Epoch 8, batch 6850, loss[loss=0.1521, simple_loss=0.2198, pruned_loss=0.04222, over 4899.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2161, pruned_loss=0.03567, over 973726.61 frames.], batch size: 19, lr: 2.66e-04 2022-05-06 02:53:21,947 INFO [train.py:715] (0/8) Epoch 8, batch 6900, loss[loss=0.1629, simple_loss=0.2285, pruned_loss=0.04866, over 4690.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2155, pruned_loss=0.03558, over 973078.40 frames.], batch size: 15, lr: 2.66e-04 2022-05-06 02:54:02,354 INFO [train.py:715] (0/8) Epoch 8, batch 6950, loss[loss=0.1055, simple_loss=0.187, pruned_loss=0.012, over 4921.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2158, pruned_loss=0.03556, over 973092.27 frames.], batch size: 23, lr: 2.66e-04 2022-05-06 02:54:42,170 INFO [train.py:715] (0/8) Epoch 8, batch 7000, loss[loss=0.1335, simple_loss=0.2012, pruned_loss=0.03294, over 4985.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2159, pruned_loss=0.03517, over 972889.59 frames.], batch size: 28, lr: 2.65e-04 2022-05-06 02:55:21,780 INFO [train.py:715] (0/8) Epoch 8, batch 7050, loss[loss=0.1383, simple_loss=0.2092, pruned_loss=0.03369, over 4798.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2151, pruned_loss=0.03523, over 972849.06 frames.], batch size: 21, lr: 2.65e-04 2022-05-06 02:56:01,470 INFO [train.py:715] (0/8) Epoch 8, batch 7100, loss[loss=0.1473, simple_loss=0.2118, pruned_loss=0.04139, over 4875.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2153, pruned_loss=0.0352, over 973548.55 frames.], batch size: 16, lr: 2.65e-04 2022-05-06 02:56:41,139 INFO [train.py:715] (0/8) Epoch 8, batch 7150, loss[loss=0.1297, simple_loss=0.1953, pruned_loss=0.03207, over 4795.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2154, pruned_loss=0.03562, over 973605.92 frames.], batch size: 12, lr: 2.65e-04 2022-05-06 02:57:20,442 INFO [train.py:715] (0/8) Epoch 8, batch 7200, loss[loss=0.1601, simple_loss=0.244, pruned_loss=0.03807, over 4808.00 frames.], tot_loss[loss=0.1439, simple_loss=0.216, pruned_loss=0.03594, over 972981.32 frames.], batch size: 25, lr: 2.65e-04 2022-05-06 02:57:59,445 INFO [train.py:715] (0/8) Epoch 8, batch 7250, loss[loss=0.1282, simple_loss=0.1958, pruned_loss=0.03029, over 4815.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2166, 
pruned_loss=0.03585, over 973478.90 frames.], batch size: 12, lr: 2.65e-04 2022-05-06 02:58:39,555 INFO [train.py:715] (0/8) Epoch 8, batch 7300, loss[loss=0.1142, simple_loss=0.1857, pruned_loss=0.02131, over 4776.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2164, pruned_loss=0.03533, over 973884.22 frames.], batch size: 17, lr: 2.65e-04 2022-05-06 02:59:18,928 INFO [train.py:715] (0/8) Epoch 8, batch 7350, loss[loss=0.158, simple_loss=0.2317, pruned_loss=0.04213, over 4790.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2155, pruned_loss=0.03533, over 973965.47 frames.], batch size: 21, lr: 2.65e-04 2022-05-06 02:59:58,517 INFO [train.py:715] (0/8) Epoch 8, batch 7400, loss[loss=0.1508, simple_loss=0.2213, pruned_loss=0.04012, over 4859.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2156, pruned_loss=0.03559, over 972619.27 frames.], batch size: 32, lr: 2.65e-04 2022-05-06 03:00:38,454 INFO [train.py:715] (0/8) Epoch 8, batch 7450, loss[loss=0.1489, simple_loss=0.212, pruned_loss=0.04294, over 4978.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2147, pruned_loss=0.03532, over 972601.63 frames.], batch size: 14, lr: 2.65e-04 2022-05-06 03:01:18,179 INFO [train.py:715] (0/8) Epoch 8, batch 7500, loss[loss=0.1224, simple_loss=0.2007, pruned_loss=0.02205, over 4875.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2159, pruned_loss=0.03573, over 972566.62 frames.], batch size: 16, lr: 2.65e-04 2022-05-06 03:01:57,871 INFO [train.py:715] (0/8) Epoch 8, batch 7550, loss[loss=0.1209, simple_loss=0.188, pruned_loss=0.0269, over 4779.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2161, pruned_loss=0.03633, over 972170.31 frames.], batch size: 14, lr: 2.65e-04 2022-05-06 03:02:37,818 INFO [train.py:715] (0/8) Epoch 8, batch 7600, loss[loss=0.1365, simple_loss=0.2114, pruned_loss=0.03081, over 4848.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2156, pruned_loss=0.03576, over 972478.49 frames.], batch size: 32, lr: 2.65e-04 2022-05-06 03:03:17,988 INFO [train.py:715] (0/8) Epoch 8, batch 7650, loss[loss=0.1797, simple_loss=0.2469, pruned_loss=0.05625, over 4828.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2153, pruned_loss=0.0356, over 973425.24 frames.], batch size: 30, lr: 2.65e-04 2022-05-06 03:03:57,439 INFO [train.py:715] (0/8) Epoch 8, batch 7700, loss[loss=0.1634, simple_loss=0.2298, pruned_loss=0.04852, over 4885.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2165, pruned_loss=0.03617, over 973671.10 frames.], batch size: 22, lr: 2.65e-04 2022-05-06 03:04:36,608 INFO [train.py:715] (0/8) Epoch 8, batch 7750, loss[loss=0.1872, simple_loss=0.2577, pruned_loss=0.05836, over 4877.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2169, pruned_loss=0.03639, over 973519.58 frames.], batch size: 16, lr: 2.65e-04 2022-05-06 03:05:16,799 INFO [train.py:715] (0/8) Epoch 8, batch 7800, loss[loss=0.1284, simple_loss=0.2047, pruned_loss=0.02606, over 4849.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2173, pruned_loss=0.03648, over 973189.64 frames.], batch size: 20, lr: 2.65e-04 2022-05-06 03:05:56,861 INFO [train.py:715] (0/8) Epoch 8, batch 7850, loss[loss=0.1793, simple_loss=0.2497, pruned_loss=0.05442, over 4789.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2169, pruned_loss=0.03627, over 972576.27 frames.], batch size: 18, lr: 2.65e-04 2022-05-06 03:06:35,514 INFO [train.py:715] (0/8) Epoch 8, batch 7900, loss[loss=0.1303, simple_loss=0.1971, pruned_loss=0.03178, over 4805.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2172, pruned_loss=0.03647, over 973163.66 frames.], 
batch size: 21, lr: 2.65e-04 2022-05-06 03:07:15,006 INFO [train.py:715] (0/8) Epoch 8, batch 7950, loss[loss=0.1314, simple_loss=0.2078, pruned_loss=0.0275, over 4928.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2171, pruned_loss=0.03617, over 973515.47 frames.], batch size: 23, lr: 2.65e-04 2022-05-06 03:07:54,691 INFO [train.py:715] (0/8) Epoch 8, batch 8000, loss[loss=0.133, simple_loss=0.2103, pruned_loss=0.0278, over 4774.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2163, pruned_loss=0.03626, over 973023.54 frames.], batch size: 18, lr: 2.65e-04 2022-05-06 03:08:33,646 INFO [train.py:715] (0/8) Epoch 8, batch 8050, loss[loss=0.1594, simple_loss=0.226, pruned_loss=0.04642, over 4960.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2158, pruned_loss=0.0362, over 973749.97 frames.], batch size: 15, lr: 2.65e-04 2022-05-06 03:09:12,019 INFO [train.py:715] (0/8) Epoch 8, batch 8100, loss[loss=0.1487, simple_loss=0.2205, pruned_loss=0.03849, over 4934.00 frames.], tot_loss[loss=0.1443, simple_loss=0.2157, pruned_loss=0.03639, over 972745.00 frames.], batch size: 23, lr: 2.65e-04 2022-05-06 03:09:51,245 INFO [train.py:715] (0/8) Epoch 8, batch 8150, loss[loss=0.1319, simple_loss=0.1978, pruned_loss=0.03298, over 4806.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2154, pruned_loss=0.03601, over 973256.32 frames.], batch size: 14, lr: 2.65e-04 2022-05-06 03:10:31,278 INFO [train.py:715] (0/8) Epoch 8, batch 8200, loss[loss=0.1215, simple_loss=0.1836, pruned_loss=0.02972, over 4818.00 frames.], tot_loss[loss=0.1442, simple_loss=0.2158, pruned_loss=0.03629, over 973031.39 frames.], batch size: 12, lr: 2.65e-04 2022-05-06 03:11:09,939 INFO [train.py:715] (0/8) Epoch 8, batch 8250, loss[loss=0.1548, simple_loss=0.2245, pruned_loss=0.04254, over 4919.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2161, pruned_loss=0.03663, over 972723.76 frames.], batch size: 18, lr: 2.65e-04 2022-05-06 03:11:48,871 INFO [train.py:715] (0/8) Epoch 8, batch 8300, loss[loss=0.1978, simple_loss=0.2579, pruned_loss=0.0689, over 4910.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2172, pruned_loss=0.03727, over 973100.25 frames.], batch size: 17, lr: 2.65e-04 2022-05-06 03:12:28,296 INFO [train.py:715] (0/8) Epoch 8, batch 8350, loss[loss=0.1468, simple_loss=0.2155, pruned_loss=0.03905, over 4919.00 frames.], tot_loss[loss=0.146, simple_loss=0.2178, pruned_loss=0.03708, over 973457.69 frames.], batch size: 17, lr: 2.65e-04 2022-05-06 03:13:07,311 INFO [train.py:715] (0/8) Epoch 8, batch 8400, loss[loss=0.1463, simple_loss=0.2238, pruned_loss=0.03444, over 4816.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2173, pruned_loss=0.03683, over 972525.60 frames.], batch size: 21, lr: 2.65e-04 2022-05-06 03:13:45,968 INFO [train.py:715] (0/8) Epoch 8, batch 8450, loss[loss=0.1139, simple_loss=0.1817, pruned_loss=0.02305, over 4964.00 frames.], tot_loss[loss=0.146, simple_loss=0.2175, pruned_loss=0.03729, over 972740.84 frames.], batch size: 14, lr: 2.65e-04 2022-05-06 03:14:25,530 INFO [train.py:715] (0/8) Epoch 8, batch 8500, loss[loss=0.1252, simple_loss=0.1956, pruned_loss=0.02743, over 4996.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2166, pruned_loss=0.03697, over 973153.45 frames.], batch size: 14, lr: 2.65e-04 2022-05-06 03:15:05,497 INFO [train.py:715] (0/8) Epoch 8, batch 8550, loss[loss=0.1196, simple_loss=0.1937, pruned_loss=0.02275, over 4801.00 frames.], tot_loss[loss=0.1449, simple_loss=0.2166, pruned_loss=0.03665, over 972849.22 frames.], batch size: 21, lr: 2.65e-04 2022-05-06 03:15:44,162 
INFO [train.py:715] (0/8) Epoch 8, batch 8600, loss[loss=0.1403, simple_loss=0.2184, pruned_loss=0.03112, over 4907.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2166, pruned_loss=0.03647, over 972688.90 frames.], batch size: 17, lr: 2.65e-04 2022-05-06 03:16:23,282 INFO [train.py:715] (0/8) Epoch 8, batch 8650, loss[loss=0.1183, simple_loss=0.1949, pruned_loss=0.02082, over 4950.00 frames.], tot_loss[loss=0.144, simple_loss=0.216, pruned_loss=0.03602, over 972233.42 frames.], batch size: 23, lr: 2.65e-04 2022-05-06 03:17:02,901 INFO [train.py:715] (0/8) Epoch 8, batch 8700, loss[loss=0.1213, simple_loss=0.1886, pruned_loss=0.02706, over 4909.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2153, pruned_loss=0.03565, over 972623.71 frames.], batch size: 19, lr: 2.65e-04 2022-05-06 03:17:41,700 INFO [train.py:715] (0/8) Epoch 8, batch 8750, loss[loss=0.1485, simple_loss=0.2197, pruned_loss=0.0386, over 4889.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2153, pruned_loss=0.03552, over 972227.64 frames.], batch size: 32, lr: 2.65e-04 2022-05-06 03:18:20,675 INFO [train.py:715] (0/8) Epoch 8, batch 8800, loss[loss=0.1589, simple_loss=0.2303, pruned_loss=0.04377, over 4949.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2162, pruned_loss=0.03577, over 972760.13 frames.], batch size: 39, lr: 2.65e-04 2022-05-06 03:19:00,217 INFO [train.py:715] (0/8) Epoch 8, batch 8850, loss[loss=0.1314, simple_loss=0.2062, pruned_loss=0.02825, over 4749.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2161, pruned_loss=0.03589, over 971633.66 frames.], batch size: 16, lr: 2.65e-04 2022-05-06 03:19:39,729 INFO [train.py:715] (0/8) Epoch 8, batch 8900, loss[loss=0.1624, simple_loss=0.247, pruned_loss=0.03892, over 4813.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2162, pruned_loss=0.03605, over 972406.14 frames.], batch size: 21, lr: 2.65e-04 2022-05-06 03:20:18,228 INFO [train.py:715] (0/8) Epoch 8, batch 8950, loss[loss=0.1339, simple_loss=0.2132, pruned_loss=0.02725, over 4758.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2177, pruned_loss=0.03646, over 972074.97 frames.], batch size: 18, lr: 2.65e-04 2022-05-06 03:20:57,337 INFO [train.py:715] (0/8) Epoch 8, batch 9000, loss[loss=0.1672, simple_loss=0.2409, pruned_loss=0.04675, over 4974.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2175, pruned_loss=0.03658, over 972021.95 frames.], batch size: 35, lr: 2.65e-04 2022-05-06 03:20:57,339 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 03:21:06,881 INFO [train.py:742] (0/8) Epoch 8, validation: loss=0.1075, simple_loss=0.1922, pruned_loss=0.01144, over 914524.00 frames. 
2022-05-06 03:21:46,743 INFO [train.py:715] (0/8) Epoch 8, batch 9050, loss[loss=0.1408, simple_loss=0.212, pruned_loss=0.03479, over 4921.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2158, pruned_loss=0.03564, over 971796.54 frames.], batch size: 29, lr: 2.65e-04 2022-05-06 03:22:26,221 INFO [train.py:715] (0/8) Epoch 8, batch 9100, loss[loss=0.1545, simple_loss=0.2177, pruned_loss=0.04563, over 4879.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2162, pruned_loss=0.03562, over 971482.55 frames.], batch size: 16, lr: 2.65e-04 2022-05-06 03:23:05,922 INFO [train.py:715] (0/8) Epoch 8, batch 9150, loss[loss=0.1389, simple_loss=0.2122, pruned_loss=0.03283, over 4754.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2153, pruned_loss=0.0349, over 971250.96 frames.], batch size: 19, lr: 2.64e-04 2022-05-06 03:23:44,123 INFO [train.py:715] (0/8) Epoch 8, batch 9200, loss[loss=0.1503, simple_loss=0.2314, pruned_loss=0.03459, over 4917.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2159, pruned_loss=0.03513, over 972246.41 frames.], batch size: 18, lr: 2.64e-04 2022-05-06 03:24:23,669 INFO [train.py:715] (0/8) Epoch 8, batch 9250, loss[loss=0.1158, simple_loss=0.1953, pruned_loss=0.01819, over 4826.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2156, pruned_loss=0.03532, over 971966.69 frames.], batch size: 13, lr: 2.64e-04 2022-05-06 03:25:03,200 INFO [train.py:715] (0/8) Epoch 8, batch 9300, loss[loss=0.1582, simple_loss=0.2328, pruned_loss=0.04182, over 4828.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2164, pruned_loss=0.03535, over 972776.05 frames.], batch size: 30, lr: 2.64e-04 2022-05-06 03:25:42,057 INFO [train.py:715] (0/8) Epoch 8, batch 9350, loss[loss=0.1326, simple_loss=0.2175, pruned_loss=0.0238, over 4836.00 frames.], tot_loss[loss=0.143, simple_loss=0.2159, pruned_loss=0.03505, over 972229.98 frames.], batch size: 15, lr: 2.64e-04 2022-05-06 03:26:20,916 INFO [train.py:715] (0/8) Epoch 8, batch 9400, loss[loss=0.1761, simple_loss=0.255, pruned_loss=0.04857, over 4935.00 frames.], tot_loss[loss=0.1433, simple_loss=0.216, pruned_loss=0.03528, over 972451.39 frames.], batch size: 23, lr: 2.64e-04 2022-05-06 03:27:00,376 INFO [train.py:715] (0/8) Epoch 8, batch 9450, loss[loss=0.1533, simple_loss=0.2176, pruned_loss=0.04454, over 4785.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2163, pruned_loss=0.03537, over 972399.31 frames.], batch size: 18, lr: 2.64e-04 2022-05-06 03:27:40,542 INFO [train.py:715] (0/8) Epoch 8, batch 9500, loss[loss=0.1559, simple_loss=0.2277, pruned_loss=0.04204, over 4962.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2163, pruned_loss=0.03527, over 973118.51 frames.], batch size: 35, lr: 2.64e-04 2022-05-06 03:27:58,600 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-288000.pt 2022-05-06 03:28:21,699 INFO [train.py:715] (0/8) Epoch 8, batch 9550, loss[loss=0.169, simple_loss=0.2163, pruned_loss=0.06082, over 4857.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2151, pruned_loss=0.03503, over 971968.62 frames.], batch size: 34, lr: 2.64e-04 2022-05-06 03:29:01,734 INFO [train.py:715] (0/8) Epoch 8, batch 9600, loss[loss=0.1313, simple_loss=0.2126, pruned_loss=0.025, over 4920.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2147, pruned_loss=0.03437, over 971437.92 frames.], batch size: 21, lr: 2.64e-04 2022-05-06 03:29:41,770 INFO [train.py:715] (0/8) Epoch 8, batch 9650, loss[loss=0.1498, simple_loss=0.2315, pruned_loss=0.03402, over 4934.00 frames.], tot_loss[loss=0.1414, 
simple_loss=0.2142, pruned_loss=0.03432, over 971458.67 frames.], batch size: 21, lr: 2.64e-04 2022-05-06 03:30:21,096 INFO [train.py:715] (0/8) Epoch 8, batch 9700, loss[loss=0.113, simple_loss=0.1856, pruned_loss=0.02018, over 4870.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2151, pruned_loss=0.03406, over 972075.14 frames.], batch size: 22, lr: 2.64e-04 2022-05-06 03:30:59,866 INFO [train.py:715] (0/8) Epoch 8, batch 9750, loss[loss=0.1485, simple_loss=0.2279, pruned_loss=0.03455, over 4825.00 frames.], tot_loss[loss=0.142, simple_loss=0.2148, pruned_loss=0.03456, over 972179.82 frames.], batch size: 15, lr: 2.64e-04 2022-05-06 03:31:39,480 INFO [train.py:715] (0/8) Epoch 8, batch 9800, loss[loss=0.1582, simple_loss=0.2322, pruned_loss=0.04212, over 4848.00 frames.], tot_loss[loss=0.1424, simple_loss=0.215, pruned_loss=0.03492, over 972168.21 frames.], batch size: 32, lr: 2.64e-04 2022-05-06 03:32:18,970 INFO [train.py:715] (0/8) Epoch 8, batch 9850, loss[loss=0.1356, simple_loss=0.2067, pruned_loss=0.03223, over 4983.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2158, pruned_loss=0.03524, over 972167.07 frames.], batch size: 28, lr: 2.64e-04 2022-05-06 03:32:58,275 INFO [train.py:715] (0/8) Epoch 8, batch 9900, loss[loss=0.152, simple_loss=0.2192, pruned_loss=0.04238, over 4980.00 frames.], tot_loss[loss=0.144, simple_loss=0.2166, pruned_loss=0.03568, over 971811.60 frames.], batch size: 24, lr: 2.64e-04 2022-05-06 03:33:37,620 INFO [train.py:715] (0/8) Epoch 8, batch 9950, loss[loss=0.1181, simple_loss=0.1881, pruned_loss=0.02411, over 4834.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2164, pruned_loss=0.03571, over 972436.39 frames.], batch size: 13, lr: 2.64e-04 2022-05-06 03:34:17,529 INFO [train.py:715] (0/8) Epoch 8, batch 10000, loss[loss=0.1205, simple_loss=0.199, pruned_loss=0.02104, over 4781.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2163, pruned_loss=0.03569, over 972416.79 frames.], batch size: 18, lr: 2.64e-04 2022-05-06 03:34:56,513 INFO [train.py:715] (0/8) Epoch 8, batch 10050, loss[loss=0.1196, simple_loss=0.1926, pruned_loss=0.02329, over 4765.00 frames.], tot_loss[loss=0.1437, simple_loss=0.216, pruned_loss=0.0357, over 972242.77 frames.], batch size: 12, lr: 2.64e-04 2022-05-06 03:35:35,061 INFO [train.py:715] (0/8) Epoch 8, batch 10100, loss[loss=0.1691, simple_loss=0.2387, pruned_loss=0.04973, over 4966.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2156, pruned_loss=0.03562, over 972597.48 frames.], batch size: 35, lr: 2.64e-04 2022-05-06 03:36:15,138 INFO [train.py:715] (0/8) Epoch 8, batch 10150, loss[loss=0.1712, simple_loss=0.2397, pruned_loss=0.05132, over 4924.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2151, pruned_loss=0.03524, over 973669.85 frames.], batch size: 23, lr: 2.64e-04 2022-05-06 03:36:55,125 INFO [train.py:715] (0/8) Epoch 8, batch 10200, loss[loss=0.1434, simple_loss=0.2157, pruned_loss=0.03559, over 4973.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2147, pruned_loss=0.03505, over 973476.90 frames.], batch size: 25, lr: 2.64e-04 2022-05-06 03:37:34,624 INFO [train.py:715] (0/8) Epoch 8, batch 10250, loss[loss=0.1308, simple_loss=0.2026, pruned_loss=0.0295, over 4761.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2149, pruned_loss=0.03506, over 973719.41 frames.], batch size: 19, lr: 2.64e-04 2022-05-06 03:38:14,429 INFO [train.py:715] (0/8) Epoch 8, batch 10300, loss[loss=0.1452, simple_loss=0.21, pruned_loss=0.04017, over 4860.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2148, pruned_loss=0.03521, over 
973664.98 frames.], batch size: 32, lr: 2.64e-04 2022-05-06 03:38:53,946 INFO [train.py:715] (0/8) Epoch 8, batch 10350, loss[loss=0.1329, simple_loss=0.2097, pruned_loss=0.02803, over 4756.00 frames.], tot_loss[loss=0.1437, simple_loss=0.216, pruned_loss=0.03568, over 973568.06 frames.], batch size: 19, lr: 2.64e-04 2022-05-06 03:39:32,637 INFO [train.py:715] (0/8) Epoch 8, batch 10400, loss[loss=0.1245, simple_loss=0.2013, pruned_loss=0.02385, over 4803.00 frames.], tot_loss[loss=0.1442, simple_loss=0.2165, pruned_loss=0.03598, over 973007.19 frames.], batch size: 21, lr: 2.64e-04 2022-05-06 03:40:12,239 INFO [train.py:715] (0/8) Epoch 8, batch 10450, loss[loss=0.1373, simple_loss=0.2187, pruned_loss=0.02796, over 4931.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2167, pruned_loss=0.03609, over 971813.29 frames.], batch size: 29, lr: 2.64e-04 2022-05-06 03:40:51,307 INFO [train.py:715] (0/8) Epoch 8, batch 10500, loss[loss=0.1418, simple_loss=0.2148, pruned_loss=0.03439, over 4976.00 frames.], tot_loss[loss=0.144, simple_loss=0.216, pruned_loss=0.03602, over 972636.54 frames.], batch size: 25, lr: 2.64e-04 2022-05-06 03:41:30,154 INFO [train.py:715] (0/8) Epoch 8, batch 10550, loss[loss=0.1371, simple_loss=0.2195, pruned_loss=0.02732, over 4890.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2155, pruned_loss=0.03575, over 972234.43 frames.], batch size: 16, lr: 2.64e-04 2022-05-06 03:42:08,779 INFO [train.py:715] (0/8) Epoch 8, batch 10600, loss[loss=0.1604, simple_loss=0.2404, pruned_loss=0.04018, over 4961.00 frames.], tot_loss[loss=0.1435, simple_loss=0.216, pruned_loss=0.0355, over 971459.56 frames.], batch size: 15, lr: 2.64e-04 2022-05-06 03:42:48,071 INFO [train.py:715] (0/8) Epoch 8, batch 10650, loss[loss=0.1542, simple_loss=0.2248, pruned_loss=0.04179, over 4977.00 frames.], tot_loss[loss=0.1445, simple_loss=0.2172, pruned_loss=0.03591, over 971005.02 frames.], batch size: 35, lr: 2.64e-04 2022-05-06 03:43:27,255 INFO [train.py:715] (0/8) Epoch 8, batch 10700, loss[loss=0.145, simple_loss=0.2066, pruned_loss=0.04172, over 4781.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2163, pruned_loss=0.03578, over 971353.44 frames.], batch size: 14, lr: 2.64e-04 2022-05-06 03:44:06,354 INFO [train.py:715] (0/8) Epoch 8, batch 10750, loss[loss=0.1499, simple_loss=0.2323, pruned_loss=0.03373, over 4866.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2161, pruned_loss=0.03567, over 971773.29 frames.], batch size: 22, lr: 2.64e-04 2022-05-06 03:44:46,292 INFO [train.py:715] (0/8) Epoch 8, batch 10800, loss[loss=0.1492, simple_loss=0.219, pruned_loss=0.03966, over 4848.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2149, pruned_loss=0.03515, over 972111.95 frames.], batch size: 27, lr: 2.64e-04 2022-05-06 03:45:26,103 INFO [train.py:715] (0/8) Epoch 8, batch 10850, loss[loss=0.1357, simple_loss=0.2167, pruned_loss=0.02732, over 4753.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2147, pruned_loss=0.03507, over 972472.68 frames.], batch size: 19, lr: 2.64e-04 2022-05-06 03:46:05,367 INFO [train.py:715] (0/8) Epoch 8, batch 10900, loss[loss=0.173, simple_loss=0.2523, pruned_loss=0.04679, over 4846.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2157, pruned_loss=0.03552, over 972479.62 frames.], batch size: 15, lr: 2.64e-04 2022-05-06 03:46:44,374 INFO [train.py:715] (0/8) Epoch 8, batch 10950, loss[loss=0.1739, simple_loss=0.2404, pruned_loss=0.05371, over 4753.00 frames.], tot_loss[loss=0.1442, simple_loss=0.2164, pruned_loss=0.03603, over 972392.78 frames.], batch size: 16, lr: 
2.64e-04 2022-05-06 03:47:24,375 INFO [train.py:715] (0/8) Epoch 8, batch 11000, loss[loss=0.136, simple_loss=0.204, pruned_loss=0.034, over 4874.00 frames.], tot_loss[loss=0.1442, simple_loss=0.2165, pruned_loss=0.036, over 972389.85 frames.], batch size: 16, lr: 2.64e-04 2022-05-06 03:48:03,906 INFO [train.py:715] (0/8) Epoch 8, batch 11050, loss[loss=0.1569, simple_loss=0.2351, pruned_loss=0.03938, over 4922.00 frames.], tot_loss[loss=0.1445, simple_loss=0.217, pruned_loss=0.03604, over 972048.38 frames.], batch size: 29, lr: 2.64e-04 2022-05-06 03:48:42,668 INFO [train.py:715] (0/8) Epoch 8, batch 11100, loss[loss=0.1546, simple_loss=0.2264, pruned_loss=0.0414, over 4872.00 frames.], tot_loss[loss=0.1442, simple_loss=0.2166, pruned_loss=0.03589, over 971759.56 frames.], batch size: 22, lr: 2.64e-04 2022-05-06 03:49:22,142 INFO [train.py:715] (0/8) Epoch 8, batch 11150, loss[loss=0.1132, simple_loss=0.1825, pruned_loss=0.02199, over 4822.00 frames.], tot_loss[loss=0.1449, simple_loss=0.217, pruned_loss=0.03641, over 972253.29 frames.], batch size: 13, lr: 2.64e-04 2022-05-06 03:50:01,940 INFO [train.py:715] (0/8) Epoch 8, batch 11200, loss[loss=0.1393, simple_loss=0.2036, pruned_loss=0.0375, over 4809.00 frames.], tot_loss[loss=0.1442, simple_loss=0.2161, pruned_loss=0.03611, over 972018.73 frames.], batch size: 12, lr: 2.64e-04 2022-05-06 03:50:40,567 INFO [train.py:715] (0/8) Epoch 8, batch 11250, loss[loss=0.122, simple_loss=0.2032, pruned_loss=0.02045, over 4817.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2159, pruned_loss=0.03586, over 971495.13 frames.], batch size: 27, lr: 2.64e-04 2022-05-06 03:51:19,594 INFO [train.py:715] (0/8) Epoch 8, batch 11300, loss[loss=0.1436, simple_loss=0.2082, pruned_loss=0.03947, over 4994.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2149, pruned_loss=0.03546, over 971729.49 frames.], batch size: 14, lr: 2.64e-04 2022-05-06 03:51:58,925 INFO [train.py:715] (0/8) Epoch 8, batch 11350, loss[loss=0.1687, simple_loss=0.2487, pruned_loss=0.04431, over 4914.00 frames.], tot_loss[loss=0.1428, simple_loss=0.215, pruned_loss=0.03531, over 971092.44 frames.], batch size: 39, lr: 2.63e-04 2022-05-06 03:52:37,404 INFO [train.py:715] (0/8) Epoch 8, batch 11400, loss[loss=0.1249, simple_loss=0.2007, pruned_loss=0.02456, over 4828.00 frames.], tot_loss[loss=0.1443, simple_loss=0.2164, pruned_loss=0.03609, over 970714.81 frames.], batch size: 26, lr: 2.63e-04 2022-05-06 03:53:16,046 INFO [train.py:715] (0/8) Epoch 8, batch 11450, loss[loss=0.1584, simple_loss=0.2259, pruned_loss=0.04544, over 4914.00 frames.], tot_loss[loss=0.1442, simple_loss=0.2162, pruned_loss=0.03609, over 971029.48 frames.], batch size: 17, lr: 2.63e-04 2022-05-06 03:53:55,350 INFO [train.py:715] (0/8) Epoch 8, batch 11500, loss[loss=0.1697, simple_loss=0.2349, pruned_loss=0.05229, over 4876.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2168, pruned_loss=0.03639, over 971553.18 frames.], batch size: 16, lr: 2.63e-04 2022-05-06 03:54:34,452 INFO [train.py:715] (0/8) Epoch 8, batch 11550, loss[loss=0.1345, simple_loss=0.202, pruned_loss=0.03355, over 4765.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2166, pruned_loss=0.03626, over 971572.74 frames.], batch size: 12, lr: 2.63e-04 2022-05-06 03:55:13,509 INFO [train.py:715] (0/8) Epoch 8, batch 11600, loss[loss=0.1416, simple_loss=0.2154, pruned_loss=0.03396, over 4758.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2157, pruned_loss=0.03574, over 971320.40 frames.], batch size: 19, lr: 2.63e-04 2022-05-06 03:55:53,443 INFO 
[train.py:715] (0/8) Epoch 8, batch 11650, loss[loss=0.142, simple_loss=0.2039, pruned_loss=0.04003, over 4933.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2148, pruned_loss=0.03486, over 972062.73 frames.], batch size: 23, lr: 2.63e-04 2022-05-06 03:56:33,837 INFO [train.py:715] (0/8) Epoch 8, batch 11700, loss[loss=0.1785, simple_loss=0.2373, pruned_loss=0.0598, over 4958.00 frames.], tot_loss[loss=0.1427, simple_loss=0.215, pruned_loss=0.03513, over 972157.98 frames.], batch size: 24, lr: 2.63e-04 2022-05-06 03:57:13,266 INFO [train.py:715] (0/8) Epoch 8, batch 11750, loss[loss=0.1424, simple_loss=0.218, pruned_loss=0.03342, over 4635.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2155, pruned_loss=0.03504, over 972054.63 frames.], batch size: 13, lr: 2.63e-04 2022-05-06 03:57:52,302 INFO [train.py:715] (0/8) Epoch 8, batch 11800, loss[loss=0.1371, simple_loss=0.2116, pruned_loss=0.03129, over 4975.00 frames.], tot_loss[loss=0.1424, simple_loss=0.215, pruned_loss=0.03489, over 972381.00 frames.], batch size: 24, lr: 2.63e-04 2022-05-06 03:58:32,046 INFO [train.py:715] (0/8) Epoch 8, batch 11850, loss[loss=0.1295, simple_loss=0.2038, pruned_loss=0.0276, over 4859.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2155, pruned_loss=0.03487, over 972974.06 frames.], batch size: 16, lr: 2.63e-04 2022-05-06 03:59:11,744 INFO [train.py:715] (0/8) Epoch 8, batch 11900, loss[loss=0.1523, simple_loss=0.2189, pruned_loss=0.04283, over 4939.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2151, pruned_loss=0.03487, over 973878.93 frames.], batch size: 39, lr: 2.63e-04 2022-05-06 03:59:51,344 INFO [train.py:715] (0/8) Epoch 8, batch 11950, loss[loss=0.1791, simple_loss=0.2482, pruned_loss=0.05497, over 4988.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2138, pruned_loss=0.03437, over 972505.85 frames.], batch size: 31, lr: 2.63e-04 2022-05-06 04:00:30,528 INFO [train.py:715] (0/8) Epoch 8, batch 12000, loss[loss=0.1375, simple_loss=0.2116, pruned_loss=0.03169, over 4928.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2148, pruned_loss=0.03491, over 973104.09 frames.], batch size: 21, lr: 2.63e-04 2022-05-06 04:00:30,529 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 04:00:40,091 INFO [train.py:742] (0/8) Epoch 8, validation: loss=0.1076, simple_loss=0.1923, pruned_loss=0.0115, over 914524.00 frames. 
2022-05-06 04:01:19,840 INFO [train.py:715] (0/8) Epoch 8, batch 12050, loss[loss=0.1361, simple_loss=0.2125, pruned_loss=0.02983, over 4933.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2149, pruned_loss=0.03513, over 972452.07 frames.], batch size: 23, lr: 2.63e-04 2022-05-06 04:01:59,445 INFO [train.py:715] (0/8) Epoch 8, batch 12100, loss[loss=0.1175, simple_loss=0.1978, pruned_loss=0.01863, over 4865.00 frames.], tot_loss[loss=0.1437, simple_loss=0.216, pruned_loss=0.03573, over 972032.40 frames.], batch size: 20, lr: 2.63e-04 2022-05-06 04:02:38,521 INFO [train.py:715] (0/8) Epoch 8, batch 12150, loss[loss=0.1782, simple_loss=0.2443, pruned_loss=0.05603, over 4903.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2162, pruned_loss=0.03598, over 972073.56 frames.], batch size: 23, lr: 2.63e-04 2022-05-06 04:03:17,591 INFO [train.py:715] (0/8) Epoch 8, batch 12200, loss[loss=0.1514, simple_loss=0.2173, pruned_loss=0.04271, over 4753.00 frames.], tot_loss[loss=0.1442, simple_loss=0.2162, pruned_loss=0.03605, over 972083.93 frames.], batch size: 14, lr: 2.63e-04 2022-05-06 04:03:57,160 INFO [train.py:715] (0/8) Epoch 8, batch 12250, loss[loss=0.1142, simple_loss=0.1879, pruned_loss=0.02025, over 4986.00 frames.], tot_loss[loss=0.144, simple_loss=0.216, pruned_loss=0.03599, over 972037.97 frames.], batch size: 28, lr: 2.63e-04 2022-05-06 04:04:36,392 INFO [train.py:715] (0/8) Epoch 8, batch 12300, loss[loss=0.1603, simple_loss=0.2202, pruned_loss=0.0502, over 4995.00 frames.], tot_loss[loss=0.1443, simple_loss=0.2162, pruned_loss=0.03622, over 972075.12 frames.], batch size: 15, lr: 2.63e-04 2022-05-06 04:05:15,234 INFO [train.py:715] (0/8) Epoch 8, batch 12350, loss[loss=0.1543, simple_loss=0.2225, pruned_loss=0.04304, over 4903.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2162, pruned_loss=0.03577, over 972284.73 frames.], batch size: 17, lr: 2.63e-04 2022-05-06 04:05:54,657 INFO [train.py:715] (0/8) Epoch 8, batch 12400, loss[loss=0.1339, simple_loss=0.2067, pruned_loss=0.03057, over 4803.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2173, pruned_loss=0.03645, over 972987.10 frames.], batch size: 25, lr: 2.63e-04 2022-05-06 04:06:34,250 INFO [train.py:715] (0/8) Epoch 8, batch 12450, loss[loss=0.124, simple_loss=0.1978, pruned_loss=0.02514, over 4818.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2176, pruned_loss=0.03652, over 972976.22 frames.], batch size: 26, lr: 2.63e-04 2022-05-06 04:07:13,255 INFO [train.py:715] (0/8) Epoch 8, batch 12500, loss[loss=0.1431, simple_loss=0.2097, pruned_loss=0.0382, over 4795.00 frames.], tot_loss[loss=0.1452, simple_loss=0.217, pruned_loss=0.03669, over 973021.78 frames.], batch size: 21, lr: 2.63e-04 2022-05-06 04:07:52,124 INFO [train.py:715] (0/8) Epoch 8, batch 12550, loss[loss=0.1855, simple_loss=0.2567, pruned_loss=0.05714, over 4854.00 frames.], tot_loss[loss=0.1461, simple_loss=0.2177, pruned_loss=0.03726, over 972790.45 frames.], batch size: 20, lr: 2.63e-04 2022-05-06 04:08:31,829 INFO [train.py:715] (0/8) Epoch 8, batch 12600, loss[loss=0.1786, simple_loss=0.2519, pruned_loss=0.05264, over 4700.00 frames.], tot_loss[loss=0.1459, simple_loss=0.2174, pruned_loss=0.03719, over 972531.09 frames.], batch size: 15, lr: 2.63e-04 2022-05-06 04:09:10,875 INFO [train.py:715] (0/8) Epoch 8, batch 12650, loss[loss=0.1817, simple_loss=0.2422, pruned_loss=0.06062, over 4907.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2172, pruned_loss=0.03696, over 972739.00 frames.], batch size: 18, lr: 2.63e-04 2022-05-06 04:09:50,736 INFO 
[train.py:715] (0/8) Epoch 8, batch 12700, loss[loss=0.1324, simple_loss=0.208, pruned_loss=0.02836, over 4847.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2165, pruned_loss=0.03636, over 972403.02 frames.], batch size: 26, lr: 2.63e-04 2022-05-06 04:10:30,123 INFO [train.py:715] (0/8) Epoch 8, batch 12750, loss[loss=0.2307, simple_loss=0.3, pruned_loss=0.08067, over 4908.00 frames.], tot_loss[loss=0.1443, simple_loss=0.2163, pruned_loss=0.03613, over 971154.92 frames.], batch size: 19, lr: 2.63e-04 2022-05-06 04:11:10,321 INFO [train.py:715] (0/8) Epoch 8, batch 12800, loss[loss=0.124, simple_loss=0.2013, pruned_loss=0.02333, over 4987.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2162, pruned_loss=0.03601, over 971816.75 frames.], batch size: 35, lr: 2.63e-04 2022-05-06 04:11:48,981 INFO [train.py:715] (0/8) Epoch 8, batch 12850, loss[loss=0.1287, simple_loss=0.1991, pruned_loss=0.02909, over 4794.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2168, pruned_loss=0.03626, over 971785.11 frames.], batch size: 17, lr: 2.63e-04 2022-05-06 04:12:28,012 INFO [train.py:715] (0/8) Epoch 8, batch 12900, loss[loss=0.1246, simple_loss=0.1983, pruned_loss=0.02548, over 4822.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2168, pruned_loss=0.03639, over 971634.02 frames.], batch size: 27, lr: 2.63e-04 2022-05-06 04:13:07,524 INFO [train.py:715] (0/8) Epoch 8, batch 12950, loss[loss=0.1667, simple_loss=0.2457, pruned_loss=0.04385, over 4782.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2167, pruned_loss=0.03638, over 972027.62 frames.], batch size: 17, lr: 2.63e-04 2022-05-06 04:13:46,911 INFO [train.py:715] (0/8) Epoch 8, batch 13000, loss[loss=0.1413, simple_loss=0.2218, pruned_loss=0.03034, over 4936.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2162, pruned_loss=0.03568, over 972321.25 frames.], batch size: 21, lr: 2.63e-04 2022-05-06 04:14:26,213 INFO [train.py:715] (0/8) Epoch 8, batch 13050, loss[loss=0.1602, simple_loss=0.2299, pruned_loss=0.04521, over 4882.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2158, pruned_loss=0.03552, over 972307.40 frames.], batch size: 22, lr: 2.63e-04 2022-05-06 04:15:05,641 INFO [train.py:715] (0/8) Epoch 8, batch 13100, loss[loss=0.1652, simple_loss=0.247, pruned_loss=0.04169, over 4954.00 frames.], tot_loss[loss=0.144, simple_loss=0.2164, pruned_loss=0.03581, over 972852.22 frames.], batch size: 15, lr: 2.63e-04 2022-05-06 04:15:45,373 INFO [train.py:715] (0/8) Epoch 8, batch 13150, loss[loss=0.1384, simple_loss=0.2184, pruned_loss=0.02918, over 4761.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2161, pruned_loss=0.03585, over 973800.21 frames.], batch size: 19, lr: 2.63e-04 2022-05-06 04:16:24,326 INFO [train.py:715] (0/8) Epoch 8, batch 13200, loss[loss=0.1409, simple_loss=0.2194, pruned_loss=0.03121, over 4969.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2164, pruned_loss=0.03593, over 973184.57 frames.], batch size: 35, lr: 2.63e-04 2022-05-06 04:17:03,715 INFO [train.py:715] (0/8) Epoch 8, batch 13250, loss[loss=0.1261, simple_loss=0.2012, pruned_loss=0.02548, over 4943.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2153, pruned_loss=0.03521, over 972542.72 frames.], batch size: 29, lr: 2.63e-04 2022-05-06 04:17:43,333 INFO [train.py:715] (0/8) Epoch 8, batch 13300, loss[loss=0.1663, simple_loss=0.2285, pruned_loss=0.05209, over 4914.00 frames.], tot_loss[loss=0.143, simple_loss=0.2151, pruned_loss=0.03546, over 971765.01 frames.], batch size: 19, lr: 2.63e-04 2022-05-06 04:18:22,357 INFO [train.py:715] (0/8) Epoch 8, batch 
13350, loss[loss=0.1362, simple_loss=0.2036, pruned_loss=0.03439, over 4935.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2151, pruned_loss=0.03591, over 972189.48 frames.], batch size: 29, lr: 2.63e-04 2022-05-06 04:19:00,998 INFO [train.py:715] (0/8) Epoch 8, batch 13400, loss[loss=0.1582, simple_loss=0.2463, pruned_loss=0.03501, over 4903.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2155, pruned_loss=0.03538, over 972298.41 frames.], batch size: 17, lr: 2.63e-04 2022-05-06 04:19:39,800 INFO [train.py:715] (0/8) Epoch 8, batch 13450, loss[loss=0.1258, simple_loss=0.2036, pruned_loss=0.02398, over 4808.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2162, pruned_loss=0.03602, over 972014.93 frames.], batch size: 21, lr: 2.63e-04 2022-05-06 04:20:19,852 INFO [train.py:715] (0/8) Epoch 8, batch 13500, loss[loss=0.1324, simple_loss=0.2019, pruned_loss=0.03141, over 4804.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2153, pruned_loss=0.03608, over 973269.53 frames.], batch size: 13, lr: 2.63e-04 2022-05-06 04:20:58,644 INFO [train.py:715] (0/8) Epoch 8, batch 13550, loss[loss=0.1478, simple_loss=0.2212, pruned_loss=0.03719, over 4795.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2154, pruned_loss=0.03587, over 973773.13 frames.], batch size: 14, lr: 2.62e-04 2022-05-06 04:21:37,838 INFO [train.py:715] (0/8) Epoch 8, batch 13600, loss[loss=0.1424, simple_loss=0.2153, pruned_loss=0.03471, over 4647.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2155, pruned_loss=0.03545, over 973766.37 frames.], batch size: 13, lr: 2.62e-04 2022-05-06 04:22:16,978 INFO [train.py:715] (0/8) Epoch 8, batch 13650, loss[loss=0.1727, simple_loss=0.2374, pruned_loss=0.05396, over 4968.00 frames.], tot_loss[loss=0.144, simple_loss=0.2162, pruned_loss=0.03596, over 973195.56 frames.], batch size: 15, lr: 2.62e-04 2022-05-06 04:22:56,126 INFO [train.py:715] (0/8) Epoch 8, batch 13700, loss[loss=0.1642, simple_loss=0.235, pruned_loss=0.04672, over 4841.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2161, pruned_loss=0.0357, over 972692.35 frames.], batch size: 30, lr: 2.62e-04 2022-05-06 04:23:34,767 INFO [train.py:715] (0/8) Epoch 8, batch 13750, loss[loss=0.1337, simple_loss=0.2076, pruned_loss=0.02992, over 4987.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2158, pruned_loss=0.03596, over 972900.06 frames.], batch size: 14, lr: 2.62e-04 2022-05-06 04:24:13,491 INFO [train.py:715] (0/8) Epoch 8, batch 13800, loss[loss=0.153, simple_loss=0.2302, pruned_loss=0.03794, over 4778.00 frames.], tot_loss[loss=0.145, simple_loss=0.217, pruned_loss=0.0365, over 973223.14 frames.], batch size: 17, lr: 2.62e-04 2022-05-06 04:24:52,946 INFO [train.py:715] (0/8) Epoch 8, batch 13850, loss[loss=0.1228, simple_loss=0.1993, pruned_loss=0.02318, over 4899.00 frames.], tot_loss[loss=0.1452, simple_loss=0.217, pruned_loss=0.03667, over 972212.67 frames.], batch size: 18, lr: 2.62e-04 2022-05-06 04:25:31,239 INFO [train.py:715] (0/8) Epoch 8, batch 13900, loss[loss=0.1345, simple_loss=0.2051, pruned_loss=0.03192, over 4856.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2173, pruned_loss=0.03679, over 972468.30 frames.], batch size: 20, lr: 2.62e-04 2022-05-06 04:26:10,330 INFO [train.py:715] (0/8) Epoch 8, batch 13950, loss[loss=0.1318, simple_loss=0.203, pruned_loss=0.03033, over 4928.00 frames.], tot_loss[loss=0.1455, simple_loss=0.2174, pruned_loss=0.03679, over 972265.51 frames.], batch size: 29, lr: 2.62e-04 2022-05-06 04:26:49,429 INFO [train.py:715] (0/8) Epoch 8, batch 14000, loss[loss=0.09505, 
simple_loss=0.1688, pruned_loss=0.01063, over 4783.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2164, pruned_loss=0.03617, over 972096.87 frames.], batch size: 12, lr: 2.62e-04 2022-05-06 04:27:28,485 INFO [train.py:715] (0/8) Epoch 8, batch 14050, loss[loss=0.1469, simple_loss=0.2293, pruned_loss=0.03231, over 4792.00 frames.], tot_loss[loss=0.144, simple_loss=0.2161, pruned_loss=0.036, over 971791.03 frames.], batch size: 21, lr: 2.62e-04 2022-05-06 04:28:06,675 INFO [train.py:715] (0/8) Epoch 8, batch 14100, loss[loss=0.151, simple_loss=0.2229, pruned_loss=0.03953, over 4949.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2161, pruned_loss=0.03584, over 972180.80 frames.], batch size: 21, lr: 2.62e-04 2022-05-06 04:28:45,328 INFO [train.py:715] (0/8) Epoch 8, batch 14150, loss[loss=0.1456, simple_loss=0.2151, pruned_loss=0.0381, over 4844.00 frames.], tot_loss[loss=0.145, simple_loss=0.217, pruned_loss=0.03646, over 972174.89 frames.], batch size: 30, lr: 2.62e-04 2022-05-06 04:29:25,590 INFO [train.py:715] (0/8) Epoch 8, batch 14200, loss[loss=0.1691, simple_loss=0.236, pruned_loss=0.05111, over 4935.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2173, pruned_loss=0.03657, over 972763.24 frames.], batch size: 23, lr: 2.62e-04 2022-05-06 04:30:04,162 INFO [train.py:715] (0/8) Epoch 8, batch 14250, loss[loss=0.167, simple_loss=0.2415, pruned_loss=0.04623, over 4741.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2176, pruned_loss=0.0366, over 972372.08 frames.], batch size: 19, lr: 2.62e-04 2022-05-06 04:30:44,069 INFO [train.py:715] (0/8) Epoch 8, batch 14300, loss[loss=0.1688, simple_loss=0.2482, pruned_loss=0.0447, over 4946.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2173, pruned_loss=0.03617, over 972457.02 frames.], batch size: 21, lr: 2.62e-04 2022-05-06 04:31:23,530 INFO [train.py:715] (0/8) Epoch 8, batch 14350, loss[loss=0.1532, simple_loss=0.2277, pruned_loss=0.03942, over 4835.00 frames.], tot_loss[loss=0.1442, simple_loss=0.2166, pruned_loss=0.03586, over 972217.86 frames.], batch size: 15, lr: 2.62e-04 2022-05-06 04:32:02,822 INFO [train.py:715] (0/8) Epoch 8, batch 14400, loss[loss=0.1576, simple_loss=0.2318, pruned_loss=0.04169, over 4776.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2179, pruned_loss=0.03641, over 971434.97 frames.], batch size: 18, lr: 2.62e-04 2022-05-06 04:32:41,516 INFO [train.py:715] (0/8) Epoch 8, batch 14450, loss[loss=0.1733, simple_loss=0.2433, pruned_loss=0.05168, over 4919.00 frames.], tot_loss[loss=0.1456, simple_loss=0.2179, pruned_loss=0.03667, over 972055.03 frames.], batch size: 18, lr: 2.62e-04 2022-05-06 04:33:20,777 INFO [train.py:715] (0/8) Epoch 8, batch 14500, loss[loss=0.1216, simple_loss=0.1957, pruned_loss=0.02375, over 4838.00 frames.], tot_loss[loss=0.1465, simple_loss=0.2186, pruned_loss=0.03717, over 973289.84 frames.], batch size: 15, lr: 2.62e-04 2022-05-06 04:34:00,253 INFO [train.py:715] (0/8) Epoch 8, batch 14550, loss[loss=0.1257, simple_loss=0.1958, pruned_loss=0.02785, over 4756.00 frames.], tot_loss[loss=0.146, simple_loss=0.2182, pruned_loss=0.03691, over 972994.78 frames.], batch size: 16, lr: 2.62e-04 2022-05-06 04:34:38,290 INFO [train.py:715] (0/8) Epoch 8, batch 14600, loss[loss=0.142, simple_loss=0.2257, pruned_loss=0.02919, over 4847.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2177, pruned_loss=0.03632, over 974166.20 frames.], batch size: 27, lr: 2.62e-04 2022-05-06 04:35:17,875 INFO [train.py:715] (0/8) Epoch 8, batch 14650, loss[loss=0.1351, simple_loss=0.2173, pruned_loss=0.02648, 
over 4709.00 frames.], tot_loss[loss=0.1445, simple_loss=0.217, pruned_loss=0.03603, over 973662.39 frames.], batch size: 15, lr: 2.62e-04 2022-05-06 04:35:57,137 INFO [train.py:715] (0/8) Epoch 8, batch 14700, loss[loss=0.143, simple_loss=0.217, pruned_loss=0.03452, over 4889.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2159, pruned_loss=0.03532, over 973898.35 frames.], batch size: 22, lr: 2.62e-04 2022-05-06 04:36:35,956 INFO [train.py:715] (0/8) Epoch 8, batch 14750, loss[loss=0.1658, simple_loss=0.2361, pruned_loss=0.04771, over 4968.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2154, pruned_loss=0.03562, over 973796.60 frames.], batch size: 24, lr: 2.62e-04 2022-05-06 04:37:14,352 INFO [train.py:715] (0/8) Epoch 8, batch 14800, loss[loss=0.1206, simple_loss=0.1989, pruned_loss=0.02121, over 4839.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2158, pruned_loss=0.03566, over 973266.48 frames.], batch size: 26, lr: 2.62e-04 2022-05-06 04:37:54,165 INFO [train.py:715] (0/8) Epoch 8, batch 14850, loss[loss=0.1727, simple_loss=0.2436, pruned_loss=0.05093, over 4841.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2158, pruned_loss=0.0356, over 972971.13 frames.], batch size: 13, lr: 2.62e-04 2022-05-06 04:38:33,084 INFO [train.py:715] (0/8) Epoch 8, batch 14900, loss[loss=0.1248, simple_loss=0.1997, pruned_loss=0.02493, over 4805.00 frames.], tot_loss[loss=0.143, simple_loss=0.2154, pruned_loss=0.03528, over 972052.76 frames.], batch size: 12, lr: 2.62e-04 2022-05-06 04:39:11,868 INFO [train.py:715] (0/8) Epoch 8, batch 14950, loss[loss=0.1743, simple_loss=0.2428, pruned_loss=0.05294, over 4876.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2145, pruned_loss=0.03501, over 972579.43 frames.], batch size: 22, lr: 2.62e-04 2022-05-06 04:39:51,070 INFO [train.py:715] (0/8) Epoch 8, batch 15000, loss[loss=0.137, simple_loss=0.211, pruned_loss=0.03155, over 4739.00 frames.], tot_loss[loss=0.1418, simple_loss=0.214, pruned_loss=0.03478, over 972500.99 frames.], batch size: 16, lr: 2.62e-04 2022-05-06 04:39:51,071 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 04:40:00,793 INFO [train.py:742] (0/8) Epoch 8, validation: loss=0.1076, simple_loss=0.1921, pruned_loss=0.01153, over 914524.00 frames. 
2022-05-06 04:40:40,556 INFO [train.py:715] (0/8) Epoch 8, batch 15050, loss[loss=0.1315, simple_loss=0.217, pruned_loss=0.02303, over 4950.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2133, pruned_loss=0.03446, over 972326.95 frames.], batch size: 24, lr: 2.62e-04 2022-05-06 04:41:19,876 INFO [train.py:715] (0/8) Epoch 8, batch 15100, loss[loss=0.1216, simple_loss=0.1913, pruned_loss=0.02599, over 4842.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2138, pruned_loss=0.03469, over 972449.88 frames.], batch size: 13, lr: 2.62e-04 2022-05-06 04:41:59,413 INFO [train.py:715] (0/8) Epoch 8, batch 15150, loss[loss=0.1604, simple_loss=0.2341, pruned_loss=0.04333, over 4696.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2137, pruned_loss=0.03447, over 972315.15 frames.], batch size: 15, lr: 2.62e-04 2022-05-06 04:42:38,831 INFO [train.py:715] (0/8) Epoch 8, batch 15200, loss[loss=0.1207, simple_loss=0.1961, pruned_loss=0.02261, over 4775.00 frames.], tot_loss[loss=0.1424, simple_loss=0.215, pruned_loss=0.03493, over 972325.00 frames.], batch size: 14, lr: 2.62e-04 2022-05-06 04:43:18,560 INFO [train.py:715] (0/8) Epoch 8, batch 15250, loss[loss=0.1683, simple_loss=0.2303, pruned_loss=0.05313, over 4845.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2148, pruned_loss=0.03484, over 972370.60 frames.], batch size: 30, lr: 2.62e-04 2022-05-06 04:43:58,532 INFO [train.py:715] (0/8) Epoch 8, batch 15300, loss[loss=0.1924, simple_loss=0.2668, pruned_loss=0.05904, over 4805.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2146, pruned_loss=0.03482, over 972235.20 frames.], batch size: 21, lr: 2.62e-04 2022-05-06 04:44:37,106 INFO [train.py:715] (0/8) Epoch 8, batch 15350, loss[loss=0.1311, simple_loss=0.2048, pruned_loss=0.02863, over 4994.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2146, pruned_loss=0.035, over 972587.49 frames.], batch size: 15, lr: 2.62e-04 2022-05-06 04:45:16,994 INFO [train.py:715] (0/8) Epoch 8, batch 15400, loss[loss=0.1191, simple_loss=0.1976, pruned_loss=0.02034, over 4935.00 frames.], tot_loss[loss=0.143, simple_loss=0.2152, pruned_loss=0.03539, over 973145.22 frames.], batch size: 23, lr: 2.62e-04 2022-05-06 04:45:55,982 INFO [train.py:715] (0/8) Epoch 8, batch 15450, loss[loss=0.1732, simple_loss=0.2349, pruned_loss=0.05578, over 4811.00 frames.], tot_loss[loss=0.143, simple_loss=0.2152, pruned_loss=0.03544, over 972138.26 frames.], batch size: 21, lr: 2.62e-04 2022-05-06 04:46:34,942 INFO [train.py:715] (0/8) Epoch 8, batch 15500, loss[loss=0.1118, simple_loss=0.1783, pruned_loss=0.02261, over 4689.00 frames.], tot_loss[loss=0.143, simple_loss=0.2148, pruned_loss=0.03555, over 972043.88 frames.], batch size: 15, lr: 2.62e-04 2022-05-06 04:47:13,673 INFO [train.py:715] (0/8) Epoch 8, batch 15550, loss[loss=0.1149, simple_loss=0.1903, pruned_loss=0.01977, over 4869.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2152, pruned_loss=0.03577, over 972414.85 frames.], batch size: 20, lr: 2.62e-04 2022-05-06 04:47:52,417 INFO [train.py:715] (0/8) Epoch 8, batch 15600, loss[loss=0.1399, simple_loss=0.2002, pruned_loss=0.03977, over 4691.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2158, pruned_loss=0.03593, over 971670.87 frames.], batch size: 15, lr: 2.62e-04 2022-05-06 04:48:32,583 INFO [train.py:715] (0/8) Epoch 8, batch 15650, loss[loss=0.1282, simple_loss=0.2041, pruned_loss=0.02615, over 4759.00 frames.], tot_loss[loss=0.143, simple_loss=0.2149, pruned_loss=0.0355, over 971446.91 frames.], batch size: 19, lr: 2.62e-04 2022-05-06 04:49:11,086 INFO 
[train.py:715] (0/8) Epoch 8, batch 15700, loss[loss=0.1637, simple_loss=0.2294, pruned_loss=0.04902, over 4960.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2145, pruned_loss=0.03538, over 972141.45 frames.], batch size: 35, lr: 2.62e-04 2022-05-06 04:49:50,910 INFO [train.py:715] (0/8) Epoch 8, batch 15750, loss[loss=0.1527, simple_loss=0.2313, pruned_loss=0.03707, over 4811.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2156, pruned_loss=0.03594, over 972095.15 frames.], batch size: 24, lr: 2.62e-04 2022-05-06 04:50:30,388 INFO [train.py:715] (0/8) Epoch 8, batch 15800, loss[loss=0.1688, simple_loss=0.2393, pruned_loss=0.04913, over 4887.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2152, pruned_loss=0.03584, over 971440.21 frames.], batch size: 19, lr: 2.61e-04 2022-05-06 04:51:09,452 INFO [train.py:715] (0/8) Epoch 8, batch 15850, loss[loss=0.1434, simple_loss=0.2048, pruned_loss=0.04098, over 4978.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2155, pruned_loss=0.03608, over 972751.78 frames.], batch size: 28, lr: 2.61e-04 2022-05-06 04:51:48,553 INFO [train.py:715] (0/8) Epoch 8, batch 15900, loss[loss=0.1405, simple_loss=0.2215, pruned_loss=0.02974, over 4982.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2154, pruned_loss=0.03645, over 973644.48 frames.], batch size: 25, lr: 2.61e-04 2022-05-06 04:52:27,775 INFO [train.py:715] (0/8) Epoch 8, batch 15950, loss[loss=0.1253, simple_loss=0.1991, pruned_loss=0.02573, over 4699.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2154, pruned_loss=0.03642, over 972999.74 frames.], batch size: 15, lr: 2.61e-04 2022-05-06 04:53:07,055 INFO [train.py:715] (0/8) Epoch 8, batch 16000, loss[loss=0.1234, simple_loss=0.1981, pruned_loss=0.02436, over 4904.00 frames.], tot_loss[loss=0.144, simple_loss=0.2156, pruned_loss=0.03614, over 973276.51 frames.], batch size: 17, lr: 2.61e-04 2022-05-06 04:53:45,655 INFO [train.py:715] (0/8) Epoch 8, batch 16050, loss[loss=0.1571, simple_loss=0.2289, pruned_loss=0.04263, over 4767.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2152, pruned_loss=0.03576, over 973831.19 frames.], batch size: 18, lr: 2.61e-04 2022-05-06 04:54:25,525 INFO [train.py:715] (0/8) Epoch 8, batch 16100, loss[loss=0.1465, simple_loss=0.2212, pruned_loss=0.03592, over 4982.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2159, pruned_loss=0.03591, over 973907.17 frames.], batch size: 28, lr: 2.61e-04 2022-05-06 04:55:04,003 INFO [train.py:715] (0/8) Epoch 8, batch 16150, loss[loss=0.2032, simple_loss=0.2512, pruned_loss=0.07763, over 4846.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2159, pruned_loss=0.036, over 972492.37 frames.], batch size: 30, lr: 2.61e-04 2022-05-06 04:55:43,546 INFO [train.py:715] (0/8) Epoch 8, batch 16200, loss[loss=0.1438, simple_loss=0.2204, pruned_loss=0.03366, over 4971.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2153, pruned_loss=0.03557, over 972389.40 frames.], batch size: 15, lr: 2.61e-04 2022-05-06 04:56:21,931 INFO [train.py:715] (0/8) Epoch 8, batch 16250, loss[loss=0.1485, simple_loss=0.2183, pruned_loss=0.03935, over 4721.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2158, pruned_loss=0.03574, over 972385.27 frames.], batch size: 16, lr: 2.61e-04 2022-05-06 04:57:01,392 INFO [train.py:715] (0/8) Epoch 8, batch 16300, loss[loss=0.1374, simple_loss=0.2084, pruned_loss=0.03318, over 4808.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2151, pruned_loss=0.0352, over 972939.65 frames.], batch size: 21, lr: 2.61e-04 2022-05-06 04:57:40,823 INFO [train.py:715] (0/8) Epoch 8, batch 
16350, loss[loss=0.1555, simple_loss=0.2317, pruned_loss=0.03964, over 4817.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2155, pruned_loss=0.03572, over 973120.74 frames.], batch size: 13, lr: 2.61e-04 2022-05-06 04:58:19,596 INFO [train.py:715] (0/8) Epoch 8, batch 16400, loss[loss=0.1543, simple_loss=0.2326, pruned_loss=0.03801, over 4777.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2149, pruned_loss=0.03535, over 972864.24 frames.], batch size: 14, lr: 2.61e-04 2022-05-06 04:58:58,712 INFO [train.py:715] (0/8) Epoch 8, batch 16450, loss[loss=0.1254, simple_loss=0.2027, pruned_loss=0.02408, over 4926.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2158, pruned_loss=0.03588, over 973111.74 frames.], batch size: 23, lr: 2.61e-04 2022-05-06 04:59:37,559 INFO [train.py:715] (0/8) Epoch 8, batch 16500, loss[loss=0.1332, simple_loss=0.2034, pruned_loss=0.0315, over 4814.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2161, pruned_loss=0.03606, over 972728.08 frames.], batch size: 12, lr: 2.61e-04 2022-05-06 05:00:17,262 INFO [train.py:715] (0/8) Epoch 8, batch 16550, loss[loss=0.1593, simple_loss=0.223, pruned_loss=0.04779, over 4954.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2166, pruned_loss=0.03645, over 971850.57 frames.], batch size: 39, lr: 2.61e-04 2022-05-06 05:00:56,284 INFO [train.py:715] (0/8) Epoch 8, batch 16600, loss[loss=0.1265, simple_loss=0.2028, pruned_loss=0.02509, over 4798.00 frames.], tot_loss[loss=0.1453, simple_loss=0.2173, pruned_loss=0.03667, over 972080.55 frames.], batch size: 24, lr: 2.61e-04 2022-05-06 05:01:35,312 INFO [train.py:715] (0/8) Epoch 8, batch 16650, loss[loss=0.1641, simple_loss=0.2399, pruned_loss=0.04419, over 4909.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2169, pruned_loss=0.03635, over 971705.89 frames.], batch size: 18, lr: 2.61e-04 2022-05-06 05:02:14,555 INFO [train.py:715] (0/8) Epoch 8, batch 16700, loss[loss=0.1727, simple_loss=0.2461, pruned_loss=0.04961, over 4911.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2171, pruned_loss=0.03657, over 972210.80 frames.], batch size: 19, lr: 2.61e-04 2022-05-06 05:02:53,471 INFO [train.py:715] (0/8) Epoch 8, batch 16750, loss[loss=0.1696, simple_loss=0.2275, pruned_loss=0.05589, over 4762.00 frames.], tot_loss[loss=0.1442, simple_loss=0.2159, pruned_loss=0.03628, over 972635.21 frames.], batch size: 18, lr: 2.61e-04 2022-05-06 05:03:33,066 INFO [train.py:715] (0/8) Epoch 8, batch 16800, loss[loss=0.1287, simple_loss=0.2082, pruned_loss=0.02462, over 4857.00 frames.], tot_loss[loss=0.1442, simple_loss=0.2157, pruned_loss=0.0363, over 972764.34 frames.], batch size: 13, lr: 2.61e-04 2022-05-06 05:04:12,042 INFO [train.py:715] (0/8) Epoch 8, batch 16850, loss[loss=0.149, simple_loss=0.215, pruned_loss=0.04144, over 4988.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2149, pruned_loss=0.03578, over 972437.91 frames.], batch size: 24, lr: 2.61e-04 2022-05-06 05:04:51,954 INFO [train.py:715] (0/8) Epoch 8, batch 16900, loss[loss=0.1604, simple_loss=0.238, pruned_loss=0.04146, over 4979.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2161, pruned_loss=0.03628, over 973355.98 frames.], batch size: 25, lr: 2.61e-04 2022-05-06 05:05:30,451 INFO [train.py:715] (0/8) Epoch 8, batch 16950, loss[loss=0.1365, simple_loss=0.2174, pruned_loss=0.02777, over 4961.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2156, pruned_loss=0.03615, over 972768.33 frames.], batch size: 21, lr: 2.61e-04 2022-05-06 05:06:10,146 INFO [train.py:715] (0/8) Epoch 8, batch 17000, loss[loss=0.1259, 
simple_loss=0.2107, pruned_loss=0.02059, over 4809.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2156, pruned_loss=0.03549, over 972283.85 frames.], batch size: 21, lr: 2.61e-04 2022-05-06 05:06:49,659 INFO [train.py:715] (0/8) Epoch 8, batch 17050, loss[loss=0.1262, simple_loss=0.2005, pruned_loss=0.02593, over 4816.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2156, pruned_loss=0.03563, over 972572.22 frames.], batch size: 25, lr: 2.61e-04 2022-05-06 05:07:28,338 INFO [train.py:715] (0/8) Epoch 8, batch 17100, loss[loss=0.1369, simple_loss=0.2058, pruned_loss=0.03403, over 4943.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2148, pruned_loss=0.03518, over 971474.21 frames.], batch size: 21, lr: 2.61e-04 2022-05-06 05:08:08,031 INFO [train.py:715] (0/8) Epoch 8, batch 17150, loss[loss=0.1292, simple_loss=0.2076, pruned_loss=0.02538, over 4842.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2155, pruned_loss=0.03569, over 971805.60 frames.], batch size: 15, lr: 2.61e-04 2022-05-06 05:08:47,209 INFO [train.py:715] (0/8) Epoch 8, batch 17200, loss[loss=0.1141, simple_loss=0.1831, pruned_loss=0.02259, over 4780.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2154, pruned_loss=0.03552, over 972663.54 frames.], batch size: 12, lr: 2.61e-04 2022-05-06 05:09:26,324 INFO [train.py:715] (0/8) Epoch 8, batch 17250, loss[loss=0.1392, simple_loss=0.2212, pruned_loss=0.02858, over 4802.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2154, pruned_loss=0.0354, over 972537.63 frames.], batch size: 21, lr: 2.61e-04 2022-05-06 05:10:04,657 INFO [train.py:715] (0/8) Epoch 8, batch 17300, loss[loss=0.1117, simple_loss=0.1899, pruned_loss=0.01678, over 4801.00 frames.], tot_loss[loss=0.1428, simple_loss=0.215, pruned_loss=0.03529, over 971914.78 frames.], batch size: 25, lr: 2.61e-04 2022-05-06 05:10:44,496 INFO [train.py:715] (0/8) Epoch 8, batch 17350, loss[loss=0.2215, simple_loss=0.2679, pruned_loss=0.08755, over 4938.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2155, pruned_loss=0.03574, over 972607.52 frames.], batch size: 18, lr: 2.61e-04 2022-05-06 05:11:23,596 INFO [train.py:715] (0/8) Epoch 8, batch 17400, loss[loss=0.1395, simple_loss=0.1946, pruned_loss=0.04225, over 4773.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2161, pruned_loss=0.03583, over 972631.51 frames.], batch size: 12, lr: 2.61e-04 2022-05-06 05:12:02,691 INFO [train.py:715] (0/8) Epoch 8, batch 17450, loss[loss=0.1219, simple_loss=0.2051, pruned_loss=0.01931, over 4824.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2173, pruned_loss=0.03659, over 971999.78 frames.], batch size: 25, lr: 2.61e-04 2022-05-06 05:12:42,120 INFO [train.py:715] (0/8) Epoch 8, batch 17500, loss[loss=0.1389, simple_loss=0.2093, pruned_loss=0.03423, over 4919.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2171, pruned_loss=0.03666, over 972014.12 frames.], batch size: 23, lr: 2.61e-04 2022-05-06 05:12:59,929 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-296000.pt 2022-05-06 05:13:23,164 INFO [train.py:715] (0/8) Epoch 8, batch 17550, loss[loss=0.1818, simple_loss=0.2509, pruned_loss=0.05641, over 4968.00 frames.], tot_loss[loss=0.1458, simple_loss=0.218, pruned_loss=0.03684, over 972073.50 frames.], batch size: 35, lr: 2.61e-04 2022-05-06 05:14:02,974 INFO [train.py:715] (0/8) Epoch 8, batch 17600, loss[loss=0.1342, simple_loss=0.2087, pruned_loss=0.02981, over 4944.00 frames.], tot_loss[loss=0.1458, simple_loss=0.2179, pruned_loss=0.03686, over 972035.47 frames.], batch size: 23, lr: 
2.61e-04 2022-05-06 05:14:41,722 INFO [train.py:715] (0/8) Epoch 8, batch 17650, loss[loss=0.1473, simple_loss=0.2119, pruned_loss=0.04129, over 4740.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2174, pruned_loss=0.03649, over 971501.89 frames.], batch size: 16, lr: 2.61e-04 2022-05-06 05:15:22,839 INFO [train.py:715] (0/8) Epoch 8, batch 17700, loss[loss=0.1523, simple_loss=0.228, pruned_loss=0.03833, over 4741.00 frames.], tot_loss[loss=0.1448, simple_loss=0.217, pruned_loss=0.0363, over 971705.06 frames.], batch size: 19, lr: 2.61e-04 2022-05-06 05:16:02,814 INFO [train.py:715] (0/8) Epoch 8, batch 17750, loss[loss=0.1575, simple_loss=0.2282, pruned_loss=0.04346, over 4770.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2156, pruned_loss=0.03591, over 971681.17 frames.], batch size: 18, lr: 2.61e-04 2022-05-06 05:16:43,287 INFO [train.py:715] (0/8) Epoch 8, batch 17800, loss[loss=0.1489, simple_loss=0.2156, pruned_loss=0.04112, over 4758.00 frames.], tot_loss[loss=0.1454, simple_loss=0.2172, pruned_loss=0.03684, over 971951.39 frames.], batch size: 14, lr: 2.61e-04 2022-05-06 05:17:23,948 INFO [train.py:715] (0/8) Epoch 8, batch 17850, loss[loss=0.139, simple_loss=0.2047, pruned_loss=0.03667, over 4855.00 frames.], tot_loss[loss=0.145, simple_loss=0.2169, pruned_loss=0.03654, over 971982.78 frames.], batch size: 13, lr: 2.61e-04 2022-05-06 05:18:04,810 INFO [train.py:715] (0/8) Epoch 8, batch 17900, loss[loss=0.1564, simple_loss=0.2269, pruned_loss=0.04292, over 4881.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2156, pruned_loss=0.03598, over 971722.79 frames.], batch size: 16, lr: 2.61e-04 2022-05-06 05:18:46,217 INFO [train.py:715] (0/8) Epoch 8, batch 17950, loss[loss=0.1383, simple_loss=0.2149, pruned_loss=0.03084, over 4856.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2161, pruned_loss=0.0365, over 970717.37 frames.], batch size: 15, lr: 2.61e-04 2022-05-06 05:19:26,622 INFO [train.py:715] (0/8) Epoch 8, batch 18000, loss[loss=0.108, simple_loss=0.1707, pruned_loss=0.02259, over 4783.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2162, pruned_loss=0.03667, over 971798.00 frames.], batch size: 12, lr: 2.61e-04 2022-05-06 05:19:26,623 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 05:19:36,398 INFO [train.py:742] (0/8) Epoch 8, validation: loss=0.1073, simple_loss=0.1919, pruned_loss=0.01138, over 914524.00 frames. 
2022-05-06 05:20:17,010 INFO [train.py:715] (0/8) Epoch 8, batch 18050, loss[loss=0.1236, simple_loss=0.19, pruned_loss=0.02867, over 4835.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2149, pruned_loss=0.03599, over 971848.05 frames.], batch size: 13, lr: 2.60e-04 2022-05-06 05:20:59,052 INFO [train.py:715] (0/8) Epoch 8, batch 18100, loss[loss=0.1669, simple_loss=0.2235, pruned_loss=0.05517, over 4749.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2146, pruned_loss=0.03582, over 971583.47 frames.], batch size: 16, lr: 2.60e-04 2022-05-06 05:21:40,108 INFO [train.py:715] (0/8) Epoch 8, batch 18150, loss[loss=0.1402, simple_loss=0.211, pruned_loss=0.03468, over 4704.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2153, pruned_loss=0.0363, over 971598.52 frames.], batch size: 15, lr: 2.60e-04 2022-05-06 05:22:21,015 INFO [train.py:715] (0/8) Epoch 8, batch 18200, loss[loss=0.1493, simple_loss=0.2151, pruned_loss=0.04177, over 4744.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2153, pruned_loss=0.03612, over 972497.14 frames.], batch size: 19, lr: 2.60e-04 2022-05-06 05:23:02,795 INFO [train.py:715] (0/8) Epoch 8, batch 18250, loss[loss=0.1482, simple_loss=0.2198, pruned_loss=0.03827, over 4936.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2154, pruned_loss=0.03613, over 972368.51 frames.], batch size: 23, lr: 2.60e-04 2022-05-06 05:23:43,811 INFO [train.py:715] (0/8) Epoch 8, batch 18300, loss[loss=0.1339, simple_loss=0.2104, pruned_loss=0.02873, over 4903.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2162, pruned_loss=0.03629, over 971684.94 frames.], batch size: 19, lr: 2.60e-04 2022-05-06 05:24:25,290 INFO [train.py:715] (0/8) Epoch 8, batch 18350, loss[loss=0.134, simple_loss=0.2016, pruned_loss=0.03323, over 4922.00 frames.], tot_loss[loss=0.1442, simple_loss=0.216, pruned_loss=0.03623, over 971376.80 frames.], batch size: 18, lr: 2.60e-04 2022-05-06 05:25:06,146 INFO [train.py:715] (0/8) Epoch 8, batch 18400, loss[loss=0.1459, simple_loss=0.2176, pruned_loss=0.03707, over 4968.00 frames.], tot_loss[loss=0.144, simple_loss=0.2157, pruned_loss=0.03612, over 971399.79 frames.], batch size: 39, lr: 2.60e-04 2022-05-06 05:25:47,832 INFO [train.py:715] (0/8) Epoch 8, batch 18450, loss[loss=0.1334, simple_loss=0.2124, pruned_loss=0.02718, over 4983.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2151, pruned_loss=0.03576, over 972195.07 frames.], batch size: 25, lr: 2.60e-04 2022-05-06 05:26:28,560 INFO [train.py:715] (0/8) Epoch 8, batch 18500, loss[loss=0.1609, simple_loss=0.2355, pruned_loss=0.04315, over 4879.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2149, pruned_loss=0.03522, over 972631.46 frames.], batch size: 22, lr: 2.60e-04 2022-05-06 05:27:08,962 INFO [train.py:715] (0/8) Epoch 8, batch 18550, loss[loss=0.1462, simple_loss=0.2263, pruned_loss=0.03303, over 4858.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2149, pruned_loss=0.03528, over 971553.12 frames.], batch size: 32, lr: 2.60e-04 2022-05-06 05:27:50,211 INFO [train.py:715] (0/8) Epoch 8, batch 18600, loss[loss=0.1257, simple_loss=0.1944, pruned_loss=0.02848, over 4806.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2149, pruned_loss=0.03522, over 972158.46 frames.], batch size: 13, lr: 2.60e-04 2022-05-06 05:28:30,414 INFO [train.py:715] (0/8) Epoch 8, batch 18650, loss[loss=0.1752, simple_loss=0.245, pruned_loss=0.05273, over 4803.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2147, pruned_loss=0.03508, over 972472.36 frames.], batch size: 21, lr: 2.60e-04 2022-05-06 05:29:09,922 INFO 
[train.py:715] (0/8) Epoch 8, batch 18700, loss[loss=0.1697, simple_loss=0.2304, pruned_loss=0.05448, over 4867.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2152, pruned_loss=0.03567, over 971393.55 frames.], batch size: 16, lr: 2.60e-04 2022-05-06 05:29:49,891 INFO [train.py:715] (0/8) Epoch 8, batch 18750, loss[loss=0.1293, simple_loss=0.2046, pruned_loss=0.027, over 4776.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2153, pruned_loss=0.03584, over 971798.15 frames.], batch size: 18, lr: 2.60e-04 2022-05-06 05:30:30,982 INFO [train.py:715] (0/8) Epoch 8, batch 18800, loss[loss=0.1216, simple_loss=0.196, pruned_loss=0.02361, over 4924.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2144, pruned_loss=0.03539, over 971836.88 frames.], batch size: 21, lr: 2.60e-04 2022-05-06 05:31:10,610 INFO [train.py:715] (0/8) Epoch 8, batch 18850, loss[loss=0.1542, simple_loss=0.2166, pruned_loss=0.04589, over 4996.00 frames.], tot_loss[loss=0.1425, simple_loss=0.214, pruned_loss=0.03551, over 971388.85 frames.], batch size: 15, lr: 2.60e-04 2022-05-06 05:31:50,017 INFO [train.py:715] (0/8) Epoch 8, batch 18900, loss[loss=0.1448, simple_loss=0.2153, pruned_loss=0.03719, over 4918.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2152, pruned_loss=0.03604, over 972046.88 frames.], batch size: 18, lr: 2.60e-04 2022-05-06 05:32:30,282 INFO [train.py:715] (0/8) Epoch 8, batch 18950, loss[loss=0.1542, simple_loss=0.2314, pruned_loss=0.03853, over 4757.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2162, pruned_loss=0.03602, over 972106.24 frames.], batch size: 14, lr: 2.60e-04 2022-05-06 05:33:10,181 INFO [train.py:715] (0/8) Epoch 8, batch 19000, loss[loss=0.146, simple_loss=0.2347, pruned_loss=0.0286, over 4887.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2163, pruned_loss=0.03574, over 971157.98 frames.], batch size: 22, lr: 2.60e-04 2022-05-06 05:33:50,120 INFO [train.py:715] (0/8) Epoch 8, batch 19050, loss[loss=0.146, simple_loss=0.2139, pruned_loss=0.03902, over 4973.00 frames.], tot_loss[loss=0.144, simple_loss=0.2165, pruned_loss=0.03573, over 971103.35 frames.], batch size: 28, lr: 2.60e-04 2022-05-06 05:34:31,417 INFO [train.py:715] (0/8) Epoch 8, batch 19100, loss[loss=0.1388, simple_loss=0.2097, pruned_loss=0.034, over 4830.00 frames.], tot_loss[loss=0.1437, simple_loss=0.216, pruned_loss=0.03571, over 971937.00 frames.], batch size: 26, lr: 2.60e-04 2022-05-06 05:35:13,334 INFO [train.py:715] (0/8) Epoch 8, batch 19150, loss[loss=0.1439, simple_loss=0.2212, pruned_loss=0.03332, over 4987.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2162, pruned_loss=0.03565, over 971889.99 frames.], batch size: 28, lr: 2.60e-04 2022-05-06 05:35:55,000 INFO [train.py:715] (0/8) Epoch 8, batch 19200, loss[loss=0.1378, simple_loss=0.2075, pruned_loss=0.03408, over 4904.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2164, pruned_loss=0.03587, over 972243.87 frames.], batch size: 23, lr: 2.60e-04 2022-05-06 05:36:35,262 INFO [train.py:715] (0/8) Epoch 8, batch 19250, loss[loss=0.1492, simple_loss=0.2256, pruned_loss=0.03643, over 4814.00 frames.], tot_loss[loss=0.1444, simple_loss=0.217, pruned_loss=0.03593, over 972565.26 frames.], batch size: 27, lr: 2.60e-04 2022-05-06 05:37:17,458 INFO [train.py:715] (0/8) Epoch 8, batch 19300, loss[loss=0.1309, simple_loss=0.2019, pruned_loss=0.02989, over 4900.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2161, pruned_loss=0.03544, over 971760.17 frames.], batch size: 22, lr: 2.60e-04 2022-05-06 05:37:58,609 INFO [train.py:715] (0/8) Epoch 8, batch 19350, 
loss[loss=0.1497, simple_loss=0.213, pruned_loss=0.04315, over 4980.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2162, pruned_loss=0.03562, over 972020.86 frames.], batch size: 25, lr: 2.60e-04 2022-05-06 05:38:39,851 INFO [train.py:715] (0/8) Epoch 8, batch 19400, loss[loss=0.1433, simple_loss=0.2226, pruned_loss=0.03202, over 4876.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2151, pruned_loss=0.03511, over 972562.68 frames.], batch size: 16, lr: 2.60e-04 2022-05-06 05:39:21,783 INFO [train.py:715] (0/8) Epoch 8, batch 19450, loss[loss=0.147, simple_loss=0.2179, pruned_loss=0.03804, over 4812.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2147, pruned_loss=0.0349, over 972789.22 frames.], batch size: 26, lr: 2.60e-04 2022-05-06 05:40:03,277 INFO [train.py:715] (0/8) Epoch 8, batch 19500, loss[loss=0.1268, simple_loss=0.1984, pruned_loss=0.02762, over 4815.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2142, pruned_loss=0.03459, over 972543.66 frames.], batch size: 12, lr: 2.60e-04 2022-05-06 05:40:44,567 INFO [train.py:715] (0/8) Epoch 8, batch 19550, loss[loss=0.1325, simple_loss=0.2078, pruned_loss=0.02866, over 4864.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2139, pruned_loss=0.03417, over 972873.23 frames.], batch size: 20, lr: 2.60e-04 2022-05-06 05:41:25,035 INFO [train.py:715] (0/8) Epoch 8, batch 19600, loss[loss=0.1328, simple_loss=0.202, pruned_loss=0.03186, over 4859.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2142, pruned_loss=0.03473, over 971647.05 frames.], batch size: 20, lr: 2.60e-04 2022-05-06 05:42:06,544 INFO [train.py:715] (0/8) Epoch 8, batch 19650, loss[loss=0.1416, simple_loss=0.217, pruned_loss=0.03307, over 4907.00 frames.], tot_loss[loss=0.142, simple_loss=0.2147, pruned_loss=0.03463, over 972263.94 frames.], batch size: 19, lr: 2.60e-04 2022-05-06 05:42:47,226 INFO [train.py:715] (0/8) Epoch 8, batch 19700, loss[loss=0.1417, simple_loss=0.2241, pruned_loss=0.02972, over 4856.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2136, pruned_loss=0.03412, over 972340.48 frames.], batch size: 20, lr: 2.60e-04 2022-05-06 05:43:28,211 INFO [train.py:715] (0/8) Epoch 8, batch 19750, loss[loss=0.1355, simple_loss=0.2121, pruned_loss=0.02943, over 4873.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2139, pruned_loss=0.03444, over 971979.73 frames.], batch size: 32, lr: 2.60e-04 2022-05-06 05:44:09,860 INFO [train.py:715] (0/8) Epoch 8, batch 19800, loss[loss=0.1322, simple_loss=0.2114, pruned_loss=0.02647, over 4956.00 frames.], tot_loss[loss=0.141, simple_loss=0.2138, pruned_loss=0.03415, over 971341.02 frames.], batch size: 21, lr: 2.60e-04 2022-05-06 05:44:50,904 INFO [train.py:715] (0/8) Epoch 8, batch 19850, loss[loss=0.1547, simple_loss=0.2221, pruned_loss=0.04364, over 4881.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2143, pruned_loss=0.03467, over 971732.61 frames.], batch size: 22, lr: 2.60e-04 2022-05-06 05:45:31,216 INFO [train.py:715] (0/8) Epoch 8, batch 19900, loss[loss=0.1386, simple_loss=0.2195, pruned_loss=0.02879, over 4960.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2145, pruned_loss=0.03425, over 972649.85 frames.], batch size: 24, lr: 2.60e-04 2022-05-06 05:46:10,975 INFO [train.py:715] (0/8) Epoch 8, batch 19950, loss[loss=0.1275, simple_loss=0.2034, pruned_loss=0.02586, over 4801.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2159, pruned_loss=0.03518, over 972617.61 frames.], batch size: 13, lr: 2.60e-04 2022-05-06 05:46:51,592 INFO [train.py:715] (0/8) Epoch 8, batch 20000, loss[loss=0.1578, simple_loss=0.2243, 
pruned_loss=0.04569, over 4916.00 frames.], tot_loss[loss=0.143, simple_loss=0.2158, pruned_loss=0.03514, over 972395.02 frames.], batch size: 23, lr: 2.60e-04 2022-05-06 05:47:32,112 INFO [train.py:715] (0/8) Epoch 8, batch 20050, loss[loss=0.1353, simple_loss=0.2066, pruned_loss=0.03202, over 4920.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2159, pruned_loss=0.03524, over 973305.29 frames.], batch size: 23, lr: 2.60e-04 2022-05-06 05:48:12,624 INFO [train.py:715] (0/8) Epoch 8, batch 20100, loss[loss=0.1493, simple_loss=0.2203, pruned_loss=0.03913, over 4745.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2161, pruned_loss=0.03529, over 972932.43 frames.], batch size: 19, lr: 2.60e-04 2022-05-06 05:48:53,760 INFO [train.py:715] (0/8) Epoch 8, batch 20150, loss[loss=0.122, simple_loss=0.1914, pruned_loss=0.02633, over 4941.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2159, pruned_loss=0.03525, over 972917.93 frames.], batch size: 29, lr: 2.60e-04 2022-05-06 05:49:34,568 INFO [train.py:715] (0/8) Epoch 8, batch 20200, loss[loss=0.146, simple_loss=0.2199, pruned_loss=0.03602, over 4937.00 frames.], tot_loss[loss=0.143, simple_loss=0.2161, pruned_loss=0.03498, over 972047.66 frames.], batch size: 29, lr: 2.60e-04 2022-05-06 05:50:15,439 INFO [train.py:715] (0/8) Epoch 8, batch 20250, loss[loss=0.1454, simple_loss=0.2161, pruned_loss=0.03728, over 4831.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2157, pruned_loss=0.03526, over 972285.14 frames.], batch size: 13, lr: 2.60e-04 2022-05-06 05:50:56,708 INFO [train.py:715] (0/8) Epoch 8, batch 20300, loss[loss=0.1203, simple_loss=0.1918, pruned_loss=0.0244, over 4790.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2147, pruned_loss=0.03508, over 971894.95 frames.], batch size: 17, lr: 2.60e-04 2022-05-06 05:51:37,706 INFO [train.py:715] (0/8) Epoch 8, batch 20350, loss[loss=0.1848, simple_loss=0.2694, pruned_loss=0.05009, over 4846.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2157, pruned_loss=0.03577, over 972439.13 frames.], batch size: 20, lr: 2.59e-04 2022-05-06 05:52:18,257 INFO [train.py:715] (0/8) Epoch 8, batch 20400, loss[loss=0.1379, simple_loss=0.2133, pruned_loss=0.03126, over 4974.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2157, pruned_loss=0.03553, over 972972.43 frames.], batch size: 24, lr: 2.59e-04 2022-05-06 05:52:58,517 INFO [train.py:715] (0/8) Epoch 8, batch 20450, loss[loss=0.1178, simple_loss=0.1843, pruned_loss=0.0256, over 4962.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2159, pruned_loss=0.03564, over 972714.86 frames.], batch size: 15, lr: 2.59e-04 2022-05-06 05:53:39,595 INFO [train.py:715] (0/8) Epoch 8, batch 20500, loss[loss=0.1276, simple_loss=0.1939, pruned_loss=0.03069, over 4760.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2161, pruned_loss=0.03568, over 972884.19 frames.], batch size: 12, lr: 2.59e-04 2022-05-06 05:54:20,089 INFO [train.py:715] (0/8) Epoch 8, batch 20550, loss[loss=0.1512, simple_loss=0.2293, pruned_loss=0.03656, over 4705.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2161, pruned_loss=0.03549, over 972630.83 frames.], batch size: 15, lr: 2.59e-04 2022-05-06 05:55:00,453 INFO [train.py:715] (0/8) Epoch 8, batch 20600, loss[loss=0.1371, simple_loss=0.2072, pruned_loss=0.03346, over 4797.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2159, pruned_loss=0.0352, over 972671.01 frames.], batch size: 14, lr: 2.59e-04 2022-05-06 05:55:41,411 INFO [train.py:715] (0/8) Epoch 8, batch 20650, loss[loss=0.1469, simple_loss=0.2169, pruned_loss=0.03838, over 4966.00 
frames.], tot_loss[loss=0.1436, simple_loss=0.2165, pruned_loss=0.0354, over 971999.31 frames.], batch size: 35, lr: 2.59e-04 2022-05-06 05:56:22,564 INFO [train.py:715] (0/8) Epoch 8, batch 20700, loss[loss=0.1479, simple_loss=0.22, pruned_loss=0.0379, over 4940.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2157, pruned_loss=0.03504, over 972719.83 frames.], batch size: 21, lr: 2.59e-04 2022-05-06 05:57:02,756 INFO [train.py:715] (0/8) Epoch 8, batch 20750, loss[loss=0.1419, simple_loss=0.2178, pruned_loss=0.03295, over 4805.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2157, pruned_loss=0.03506, over 971811.43 frames.], batch size: 25, lr: 2.59e-04 2022-05-06 05:57:42,967 INFO [train.py:715] (0/8) Epoch 8, batch 20800, loss[loss=0.1129, simple_loss=0.1824, pruned_loss=0.02174, over 4772.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2156, pruned_loss=0.03479, over 971575.81 frames.], batch size: 14, lr: 2.59e-04 2022-05-06 05:58:24,022 INFO [train.py:715] (0/8) Epoch 8, batch 20850, loss[loss=0.1389, simple_loss=0.217, pruned_loss=0.03036, over 4991.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2159, pruned_loss=0.03494, over 971552.83 frames.], batch size: 28, lr: 2.59e-04 2022-05-06 05:59:04,430 INFO [train.py:715] (0/8) Epoch 8, batch 20900, loss[loss=0.1283, simple_loss=0.1947, pruned_loss=0.03098, over 4843.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2156, pruned_loss=0.03488, over 971829.15 frames.], batch size: 30, lr: 2.59e-04 2022-05-06 05:59:43,021 INFO [train.py:715] (0/8) Epoch 8, batch 20950, loss[loss=0.1541, simple_loss=0.2202, pruned_loss=0.04402, over 4989.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2155, pruned_loss=0.03495, over 971875.52 frames.], batch size: 28, lr: 2.59e-04 2022-05-06 06:00:22,703 INFO [train.py:715] (0/8) Epoch 8, batch 21000, loss[loss=0.1346, simple_loss=0.212, pruned_loss=0.02861, over 4891.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2151, pruned_loss=0.03488, over 971736.73 frames.], batch size: 19, lr: 2.59e-04 2022-05-06 06:00:22,705 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 06:00:32,256 INFO [train.py:742] (0/8) Epoch 8, validation: loss=0.1072, simple_loss=0.1919, pruned_loss=0.01129, over 914524.00 frames. 
2022-05-06 06:01:12,647 INFO [train.py:715] (0/8) Epoch 8, batch 21050, loss[loss=0.1452, simple_loss=0.2093, pruned_loss=0.04053, over 4985.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2145, pruned_loss=0.03487, over 970974.46 frames.], batch size: 15, lr: 2.59e-04 2022-05-06 06:01:52,988 INFO [train.py:715] (0/8) Epoch 8, batch 21100, loss[loss=0.1313, simple_loss=0.2085, pruned_loss=0.02704, over 4758.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2142, pruned_loss=0.03516, over 971596.66 frames.], batch size: 19, lr: 2.59e-04 2022-05-06 06:02:31,462 INFO [train.py:715] (0/8) Epoch 8, batch 21150, loss[loss=0.1667, simple_loss=0.2402, pruned_loss=0.04654, over 4988.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2141, pruned_loss=0.03509, over 972130.82 frames.], batch size: 25, lr: 2.59e-04 2022-05-06 06:03:10,264 INFO [train.py:715] (0/8) Epoch 8, batch 21200, loss[loss=0.1548, simple_loss=0.23, pruned_loss=0.03981, over 4939.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2153, pruned_loss=0.03547, over 973124.72 frames.], batch size: 21, lr: 2.59e-04 2022-05-06 06:03:49,967 INFO [train.py:715] (0/8) Epoch 8, batch 21250, loss[loss=0.1484, simple_loss=0.2072, pruned_loss=0.04477, over 4755.00 frames.], tot_loss[loss=0.144, simple_loss=0.2159, pruned_loss=0.03604, over 973560.94 frames.], batch size: 19, lr: 2.59e-04 2022-05-06 06:04:29,228 INFO [train.py:715] (0/8) Epoch 8, batch 21300, loss[loss=0.1424, simple_loss=0.2222, pruned_loss=0.03129, over 4820.00 frames.], tot_loss[loss=0.1437, simple_loss=0.216, pruned_loss=0.03575, over 972968.58 frames.], batch size: 25, lr: 2.59e-04 2022-05-06 06:05:07,758 INFO [train.py:715] (0/8) Epoch 8, batch 21350, loss[loss=0.1724, simple_loss=0.2422, pruned_loss=0.05127, over 4896.00 frames.], tot_loss[loss=0.1445, simple_loss=0.2166, pruned_loss=0.03618, over 972702.19 frames.], batch size: 19, lr: 2.59e-04 2022-05-06 06:05:47,404 INFO [train.py:715] (0/8) Epoch 8, batch 21400, loss[loss=0.1554, simple_loss=0.2197, pruned_loss=0.0455, over 4863.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2168, pruned_loss=0.03623, over 972872.45 frames.], batch size: 16, lr: 2.59e-04 2022-05-06 06:06:27,497 INFO [train.py:715] (0/8) Epoch 8, batch 21450, loss[loss=0.1532, simple_loss=0.2214, pruned_loss=0.04252, over 4861.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2163, pruned_loss=0.03537, over 973315.49 frames.], batch size: 13, lr: 2.59e-04 2022-05-06 06:07:06,790 INFO [train.py:715] (0/8) Epoch 8, batch 21500, loss[loss=0.1414, simple_loss=0.2182, pruned_loss=0.03229, over 4975.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2162, pruned_loss=0.03523, over 973868.82 frames.], batch size: 15, lr: 2.59e-04 2022-05-06 06:07:45,790 INFO [train.py:715] (0/8) Epoch 8, batch 21550, loss[loss=0.1219, simple_loss=0.1939, pruned_loss=0.02497, over 4816.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2163, pruned_loss=0.03589, over 973472.35 frames.], batch size: 13, lr: 2.59e-04 2022-05-06 06:08:25,814 INFO [train.py:715] (0/8) Epoch 8, batch 21600, loss[loss=0.1747, simple_loss=0.2405, pruned_loss=0.05442, over 4733.00 frames.], tot_loss[loss=0.1451, simple_loss=0.217, pruned_loss=0.03656, over 973759.05 frames.], batch size: 16, lr: 2.59e-04 2022-05-06 06:09:04,794 INFO [train.py:715] (0/8) Epoch 8, batch 21650, loss[loss=0.1457, simple_loss=0.2122, pruned_loss=0.03963, over 4954.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2166, pruned_loss=0.03653, over 972814.32 frames.], batch size: 21, lr: 2.59e-04 2022-05-06 06:09:43,492 INFO 
[train.py:715] (0/8) Epoch 8, batch 21700, loss[loss=0.1608, simple_loss=0.2376, pruned_loss=0.04202, over 4922.00 frames.], tot_loss[loss=0.1445, simple_loss=0.2165, pruned_loss=0.03621, over 972727.03 frames.], batch size: 18, lr: 2.59e-04 2022-05-06 06:10:23,858 INFO [train.py:715] (0/8) Epoch 8, batch 21750, loss[loss=0.1432, simple_loss=0.2247, pruned_loss=0.03083, over 4816.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2162, pruned_loss=0.03601, over 972995.64 frames.], batch size: 27, lr: 2.59e-04 2022-05-06 06:11:03,698 INFO [train.py:715] (0/8) Epoch 8, batch 21800, loss[loss=0.1724, simple_loss=0.2349, pruned_loss=0.0549, over 4893.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2155, pruned_loss=0.03549, over 972524.33 frames.], batch size: 19, lr: 2.59e-04 2022-05-06 06:11:42,811 INFO [train.py:715] (0/8) Epoch 8, batch 21850, loss[loss=0.1239, simple_loss=0.1925, pruned_loss=0.02765, over 4853.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2149, pruned_loss=0.03503, over 972552.53 frames.], batch size: 32, lr: 2.59e-04 2022-05-06 06:12:21,177 INFO [train.py:715] (0/8) Epoch 8, batch 21900, loss[loss=0.1848, simple_loss=0.2558, pruned_loss=0.05688, over 4820.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2148, pruned_loss=0.03496, over 971916.78 frames.], batch size: 26, lr: 2.59e-04 2022-05-06 06:13:00,619 INFO [train.py:715] (0/8) Epoch 8, batch 21950, loss[loss=0.1567, simple_loss=0.2277, pruned_loss=0.04285, over 4984.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2148, pruned_loss=0.03515, over 971679.76 frames.], batch size: 24, lr: 2.59e-04 2022-05-06 06:13:39,700 INFO [train.py:715] (0/8) Epoch 8, batch 22000, loss[loss=0.1108, simple_loss=0.1858, pruned_loss=0.01792, over 4895.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2149, pruned_loss=0.03543, over 972572.36 frames.], batch size: 19, lr: 2.59e-04 2022-05-06 06:14:18,326 INFO [train.py:715] (0/8) Epoch 8, batch 22050, loss[loss=0.1377, simple_loss=0.2089, pruned_loss=0.03326, over 4942.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2151, pruned_loss=0.03554, over 973332.36 frames.], batch size: 29, lr: 2.59e-04 2022-05-06 06:14:58,045 INFO [train.py:715] (0/8) Epoch 8, batch 22100, loss[loss=0.1699, simple_loss=0.2461, pruned_loss=0.0469, over 4889.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2158, pruned_loss=0.03578, over 972740.21 frames.], batch size: 39, lr: 2.59e-04 2022-05-06 06:15:37,419 INFO [train.py:715] (0/8) Epoch 8, batch 22150, loss[loss=0.1468, simple_loss=0.2236, pruned_loss=0.03503, over 4907.00 frames.], tot_loss[loss=0.1442, simple_loss=0.2159, pruned_loss=0.03629, over 972026.24 frames.], batch size: 19, lr: 2.59e-04 2022-05-06 06:16:16,518 INFO [train.py:715] (0/8) Epoch 8, batch 22200, loss[loss=0.1297, simple_loss=0.2059, pruned_loss=0.02678, over 4910.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2151, pruned_loss=0.03566, over 971971.22 frames.], batch size: 19, lr: 2.59e-04 2022-05-06 06:16:55,348 INFO [train.py:715] (0/8) Epoch 8, batch 22250, loss[loss=0.1538, simple_loss=0.2317, pruned_loss=0.03794, over 4889.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2148, pruned_loss=0.03517, over 973506.82 frames.], batch size: 22, lr: 2.59e-04 2022-05-06 06:17:34,563 INFO [train.py:715] (0/8) Epoch 8, batch 22300, loss[loss=0.1372, simple_loss=0.2057, pruned_loss=0.03436, over 4879.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2147, pruned_loss=0.03512, over 973073.32 frames.], batch size: 22, lr: 2.59e-04 2022-05-06 06:18:13,307 INFO [train.py:715] (0/8) Epoch 8, 
batch 22350, loss[loss=0.1574, simple_loss=0.225, pruned_loss=0.04489, over 4973.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2151, pruned_loss=0.03539, over 973416.56 frames.], batch size: 24, lr: 2.59e-04 2022-05-06 06:18:51,905 INFO [train.py:715] (0/8) Epoch 8, batch 22400, loss[loss=0.1479, simple_loss=0.2249, pruned_loss=0.03546, over 4913.00 frames.], tot_loss[loss=0.143, simple_loss=0.2152, pruned_loss=0.03539, over 972866.01 frames.], batch size: 23, lr: 2.59e-04 2022-05-06 06:19:31,232 INFO [train.py:715] (0/8) Epoch 8, batch 22450, loss[loss=0.1309, simple_loss=0.202, pruned_loss=0.02986, over 4913.00 frames.], tot_loss[loss=0.1429, simple_loss=0.215, pruned_loss=0.03533, over 972270.15 frames.], batch size: 17, lr: 2.59e-04 2022-05-06 06:20:10,733 INFO [train.py:715] (0/8) Epoch 8, batch 22500, loss[loss=0.1276, simple_loss=0.2066, pruned_loss=0.02427, over 4866.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2145, pruned_loss=0.03545, over 971859.46 frames.], batch size: 16, lr: 2.59e-04 2022-05-06 06:20:49,333 INFO [train.py:715] (0/8) Epoch 8, batch 22550, loss[loss=0.1617, simple_loss=0.2205, pruned_loss=0.05148, over 4930.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2147, pruned_loss=0.03537, over 971905.43 frames.], batch size: 18, lr: 2.59e-04 2022-05-06 06:21:28,251 INFO [train.py:715] (0/8) Epoch 8, batch 22600, loss[loss=0.1568, simple_loss=0.2242, pruned_loss=0.04473, over 4834.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2146, pruned_loss=0.0352, over 971929.71 frames.], batch size: 25, lr: 2.59e-04 2022-05-06 06:22:07,734 INFO [train.py:715] (0/8) Epoch 8, batch 22650, loss[loss=0.1293, simple_loss=0.2039, pruned_loss=0.02733, over 4842.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2143, pruned_loss=0.03496, over 971922.87 frames.], batch size: 32, lr: 2.58e-04 2022-05-06 06:22:46,456 INFO [train.py:715] (0/8) Epoch 8, batch 22700, loss[loss=0.147, simple_loss=0.2288, pruned_loss=0.03259, over 4826.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2152, pruned_loss=0.03565, over 971877.41 frames.], batch size: 25, lr: 2.58e-04 2022-05-06 06:23:24,774 INFO [train.py:715] (0/8) Epoch 8, batch 22750, loss[loss=0.127, simple_loss=0.199, pruned_loss=0.02745, over 4760.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2158, pruned_loss=0.0359, over 972402.86 frames.], batch size: 19, lr: 2.58e-04 2022-05-06 06:24:04,595 INFO [train.py:715] (0/8) Epoch 8, batch 22800, loss[loss=0.1356, simple_loss=0.2091, pruned_loss=0.03107, over 4810.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2145, pruned_loss=0.03524, over 971607.11 frames.], batch size: 25, lr: 2.58e-04 2022-05-06 06:24:43,768 INFO [train.py:715] (0/8) Epoch 8, batch 22850, loss[loss=0.128, simple_loss=0.1893, pruned_loss=0.03338, over 4814.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2149, pruned_loss=0.03545, over 971816.55 frames.], batch size: 13, lr: 2.58e-04 2022-05-06 06:25:22,843 INFO [train.py:715] (0/8) Epoch 8, batch 22900, loss[loss=0.1145, simple_loss=0.1783, pruned_loss=0.02538, over 4934.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2147, pruned_loss=0.03525, over 972403.46 frames.], batch size: 18, lr: 2.58e-04 2022-05-06 06:26:01,956 INFO [train.py:715] (0/8) Epoch 8, batch 22950, loss[loss=0.1365, simple_loss=0.2055, pruned_loss=0.03379, over 4953.00 frames.], tot_loss[loss=0.1427, simple_loss=0.215, pruned_loss=0.03525, over 971688.19 frames.], batch size: 24, lr: 2.58e-04 2022-05-06 06:26:41,731 INFO [train.py:715] (0/8) Epoch 8, batch 23000, loss[loss=0.1626, 
simple_loss=0.2264, pruned_loss=0.04943, over 4986.00 frames.], tot_loss[loss=0.143, simple_loss=0.215, pruned_loss=0.03548, over 971566.59 frames.], batch size: 15, lr: 2.58e-04 2022-05-06 06:27:20,525 INFO [train.py:715] (0/8) Epoch 8, batch 23050, loss[loss=0.1379, simple_loss=0.206, pruned_loss=0.03488, over 4906.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2154, pruned_loss=0.03556, over 971633.64 frames.], batch size: 17, lr: 2.58e-04 2022-05-06 06:27:59,218 INFO [train.py:715] (0/8) Epoch 8, batch 23100, loss[loss=0.1697, simple_loss=0.2272, pruned_loss=0.05616, over 4751.00 frames.], tot_loss[loss=0.1445, simple_loss=0.217, pruned_loss=0.03602, over 971917.65 frames.], batch size: 16, lr: 2.58e-04 2022-05-06 06:28:39,374 INFO [train.py:715] (0/8) Epoch 8, batch 23150, loss[loss=0.1335, simple_loss=0.2046, pruned_loss=0.0312, over 4975.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2162, pruned_loss=0.03597, over 973035.78 frames.], batch size: 35, lr: 2.58e-04 2022-05-06 06:29:18,751 INFO [train.py:715] (0/8) Epoch 8, batch 23200, loss[loss=0.1342, simple_loss=0.2223, pruned_loss=0.02307, over 4865.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2163, pruned_loss=0.03593, over 971478.64 frames.], batch size: 20, lr: 2.58e-04 2022-05-06 06:29:57,395 INFO [train.py:715] (0/8) Epoch 8, batch 23250, loss[loss=0.1352, simple_loss=0.2068, pruned_loss=0.03183, over 4785.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2158, pruned_loss=0.03573, over 971967.40 frames.], batch size: 14, lr: 2.58e-04 2022-05-06 06:30:36,511 INFO [train.py:715] (0/8) Epoch 8, batch 23300, loss[loss=0.1598, simple_loss=0.2449, pruned_loss=0.0373, over 4968.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2161, pruned_loss=0.03601, over 972172.24 frames.], batch size: 25, lr: 2.58e-04 2022-05-06 06:31:16,262 INFO [train.py:715] (0/8) Epoch 8, batch 23350, loss[loss=0.1441, simple_loss=0.2213, pruned_loss=0.03345, over 4756.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2161, pruned_loss=0.0355, over 972277.79 frames.], batch size: 16, lr: 2.58e-04 2022-05-06 06:31:55,024 INFO [train.py:715] (0/8) Epoch 8, batch 23400, loss[loss=0.1333, simple_loss=0.1972, pruned_loss=0.03464, over 4819.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2153, pruned_loss=0.03518, over 973262.58 frames.], batch size: 15, lr: 2.58e-04 2022-05-06 06:32:33,887 INFO [train.py:715] (0/8) Epoch 8, batch 23450, loss[loss=0.1605, simple_loss=0.2259, pruned_loss=0.0476, over 4914.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2156, pruned_loss=0.03542, over 971948.10 frames.], batch size: 18, lr: 2.58e-04 2022-05-06 06:33:13,364 INFO [train.py:715] (0/8) Epoch 8, batch 23500, loss[loss=0.1202, simple_loss=0.1844, pruned_loss=0.02796, over 4826.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2147, pruned_loss=0.03508, over 971980.83 frames.], batch size: 13, lr: 2.58e-04 2022-05-06 06:33:52,529 INFO [train.py:715] (0/8) Epoch 8, batch 23550, loss[loss=0.1927, simple_loss=0.2578, pruned_loss=0.06382, over 4691.00 frames.], tot_loss[loss=0.1427, simple_loss=0.215, pruned_loss=0.03519, over 971916.54 frames.], batch size: 15, lr: 2.58e-04 2022-05-06 06:34:31,316 INFO [train.py:715] (0/8) Epoch 8, batch 23600, loss[loss=0.165, simple_loss=0.2279, pruned_loss=0.05099, over 4799.00 frames.], tot_loss[loss=0.1431, simple_loss=0.215, pruned_loss=0.03566, over 972880.66 frames.], batch size: 18, lr: 2.58e-04 2022-05-06 06:35:10,236 INFO [train.py:715] (0/8) Epoch 8, batch 23650, loss[loss=0.1707, simple_loss=0.2428, pruned_loss=0.04932, 
over 4851.00 frames.], tot_loss[loss=0.143, simple_loss=0.2149, pruned_loss=0.03557, over 972043.50 frames.], batch size: 15, lr: 2.58e-04 2022-05-06 06:35:50,045 INFO [train.py:715] (0/8) Epoch 8, batch 23700, loss[loss=0.1296, simple_loss=0.1919, pruned_loss=0.03363, over 4830.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2156, pruned_loss=0.036, over 972284.37 frames.], batch size: 15, lr: 2.58e-04 2022-05-06 06:36:28,664 INFO [train.py:715] (0/8) Epoch 8, batch 23750, loss[loss=0.1393, simple_loss=0.2164, pruned_loss=0.03111, over 4772.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2155, pruned_loss=0.03583, over 971535.19 frames.], batch size: 14, lr: 2.58e-04 2022-05-06 06:37:07,512 INFO [train.py:715] (0/8) Epoch 8, batch 23800, loss[loss=0.1598, simple_loss=0.2248, pruned_loss=0.04734, over 4749.00 frames.], tot_loss[loss=0.1442, simple_loss=0.216, pruned_loss=0.03619, over 971955.75 frames.], batch size: 16, lr: 2.58e-04 2022-05-06 06:37:46,984 INFO [train.py:715] (0/8) Epoch 8, batch 23850, loss[loss=0.1404, simple_loss=0.2185, pruned_loss=0.03113, over 4921.00 frames.], tot_loss[loss=0.1442, simple_loss=0.2162, pruned_loss=0.03607, over 972442.92 frames.], batch size: 29, lr: 2.58e-04 2022-05-06 06:38:26,639 INFO [train.py:715] (0/8) Epoch 8, batch 23900, loss[loss=0.1531, simple_loss=0.2341, pruned_loss=0.03604, over 4943.00 frames.], tot_loss[loss=0.1451, simple_loss=0.2173, pruned_loss=0.03644, over 972769.16 frames.], batch size: 21, lr: 2.58e-04 2022-05-06 06:39:05,506 INFO [train.py:715] (0/8) Epoch 8, batch 23950, loss[loss=0.1296, simple_loss=0.2082, pruned_loss=0.02551, over 4923.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2165, pruned_loss=0.03609, over 973253.36 frames.], batch size: 23, lr: 2.58e-04 2022-05-06 06:39:44,886 INFO [train.py:715] (0/8) Epoch 8, batch 24000, loss[loss=0.1415, simple_loss=0.2072, pruned_loss=0.03789, over 4894.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2154, pruned_loss=0.03512, over 973873.30 frames.], batch size: 19, lr: 2.58e-04 2022-05-06 06:39:44,887 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 06:39:54,531 INFO [train.py:742] (0/8) Epoch 8, validation: loss=0.1075, simple_loss=0.192, pruned_loss=0.01146, over 914524.00 frames. 
2022-05-06 06:40:33,718 INFO [train.py:715] (0/8) Epoch 8, batch 24050, loss[loss=0.1338, simple_loss=0.2059, pruned_loss=0.03084, over 4816.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2153, pruned_loss=0.03493, over 973557.75 frames.], batch size: 25, lr: 2.58e-04 2022-05-06 06:41:13,148 INFO [train.py:715] (0/8) Epoch 8, batch 24100, loss[loss=0.1085, simple_loss=0.1908, pruned_loss=0.01309, over 4765.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2153, pruned_loss=0.03469, over 973161.97 frames.], batch size: 19, lr: 2.58e-04 2022-05-06 06:41:52,116 INFO [train.py:715] (0/8) Epoch 8, batch 24150, loss[loss=0.1182, simple_loss=0.1961, pruned_loss=0.02014, over 4982.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2152, pruned_loss=0.03475, over 972382.47 frames.], batch size: 14, lr: 2.58e-04 2022-05-06 06:42:31,048 INFO [train.py:715] (0/8) Epoch 8, batch 24200, loss[loss=0.1641, simple_loss=0.2281, pruned_loss=0.05002, over 4779.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2151, pruned_loss=0.03498, over 972411.61 frames.], batch size: 14, lr: 2.58e-04 2022-05-06 06:43:11,237 INFO [train.py:715] (0/8) Epoch 8, batch 24250, loss[loss=0.1414, simple_loss=0.2135, pruned_loss=0.03467, over 4964.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2151, pruned_loss=0.03505, over 972382.27 frames.], batch size: 15, lr: 2.58e-04 2022-05-06 06:43:50,601 INFO [train.py:715] (0/8) Epoch 8, batch 24300, loss[loss=0.1215, simple_loss=0.2022, pruned_loss=0.02046, over 4810.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2155, pruned_loss=0.03548, over 972354.97 frames.], batch size: 24, lr: 2.58e-04 2022-05-06 06:44:29,312 INFO [train.py:715] (0/8) Epoch 8, batch 24350, loss[loss=0.1361, simple_loss=0.211, pruned_loss=0.03061, over 4810.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2161, pruned_loss=0.0357, over 972552.17 frames.], batch size: 12, lr: 2.58e-04 2022-05-06 06:45:08,113 INFO [train.py:715] (0/8) Epoch 8, batch 24400, loss[loss=0.1354, simple_loss=0.2071, pruned_loss=0.03187, over 4992.00 frames.], tot_loss[loss=0.1442, simple_loss=0.2165, pruned_loss=0.03593, over 973251.40 frames.], batch size: 14, lr: 2.58e-04 2022-05-06 06:45:47,149 INFO [train.py:715] (0/8) Epoch 8, batch 24450, loss[loss=0.1622, simple_loss=0.2336, pruned_loss=0.04534, over 4876.00 frames.], tot_loss[loss=0.1439, simple_loss=0.216, pruned_loss=0.03589, over 973820.64 frames.], batch size: 16, lr: 2.58e-04 2022-05-06 06:46:26,135 INFO [train.py:715] (0/8) Epoch 8, batch 24500, loss[loss=0.1452, simple_loss=0.2161, pruned_loss=0.03717, over 4864.00 frames.], tot_loss[loss=0.144, simple_loss=0.2159, pruned_loss=0.03603, over 972858.51 frames.], batch size: 30, lr: 2.58e-04 2022-05-06 06:47:04,985 INFO [train.py:715] (0/8) Epoch 8, batch 24550, loss[loss=0.1206, simple_loss=0.1925, pruned_loss=0.02438, over 4987.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2153, pruned_loss=0.03563, over 973273.68 frames.], batch size: 25, lr: 2.58e-04 2022-05-06 06:47:44,929 INFO [train.py:715] (0/8) Epoch 8, batch 24600, loss[loss=0.1247, simple_loss=0.2036, pruned_loss=0.02285, over 4783.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2156, pruned_loss=0.03534, over 973378.07 frames.], batch size: 18, lr: 2.58e-04 2022-05-06 06:48:24,239 INFO [train.py:715] (0/8) Epoch 8, batch 24650, loss[loss=0.1176, simple_loss=0.1862, pruned_loss=0.02449, over 4958.00 frames.], tot_loss[loss=0.1423, simple_loss=0.215, pruned_loss=0.03481, over 973331.21 frames.], batch size: 15, lr: 2.58e-04 2022-05-06 06:49:02,876 INFO 
[train.py:715] (0/8) Epoch 8, batch 24700, loss[loss=0.151, simple_loss=0.2178, pruned_loss=0.04212, over 4745.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2151, pruned_loss=0.03512, over 972023.53 frames.], batch size: 16, lr: 2.58e-04 2022-05-06 06:49:42,048 INFO [train.py:715] (0/8) Epoch 8, batch 24750, loss[loss=0.1208, simple_loss=0.1936, pruned_loss=0.02394, over 4829.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2157, pruned_loss=0.03592, over 972592.54 frames.], batch size: 30, lr: 2.58e-04 2022-05-06 06:50:21,620 INFO [train.py:715] (0/8) Epoch 8, batch 24800, loss[loss=0.1178, simple_loss=0.1843, pruned_loss=0.02569, over 4747.00 frames.], tot_loss[loss=0.1442, simple_loss=0.2162, pruned_loss=0.0361, over 972496.73 frames.], batch size: 12, lr: 2.58e-04 2022-05-06 06:51:00,470 INFO [train.py:715] (0/8) Epoch 8, batch 24850, loss[loss=0.1223, simple_loss=0.1937, pruned_loss=0.02547, over 4753.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2153, pruned_loss=0.03585, over 972370.40 frames.], batch size: 19, lr: 2.58e-04 2022-05-06 06:51:39,139 INFO [train.py:715] (0/8) Epoch 8, batch 24900, loss[loss=0.1484, simple_loss=0.2076, pruned_loss=0.04454, over 4797.00 frames.], tot_loss[loss=0.142, simple_loss=0.2139, pruned_loss=0.03511, over 972749.95 frames.], batch size: 24, lr: 2.58e-04 2022-05-06 06:52:19,143 INFO [train.py:715] (0/8) Epoch 8, batch 24950, loss[loss=0.1388, simple_loss=0.2075, pruned_loss=0.03499, over 4938.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2152, pruned_loss=0.03555, over 973002.81 frames.], batch size: 39, lr: 2.58e-04 2022-05-06 06:52:58,629 INFO [train.py:715] (0/8) Epoch 8, batch 25000, loss[loss=0.1261, simple_loss=0.2004, pruned_loss=0.02595, over 4892.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2145, pruned_loss=0.03527, over 973456.19 frames.], batch size: 22, lr: 2.57e-04 2022-05-06 06:53:37,563 INFO [train.py:715] (0/8) Epoch 8, batch 25050, loss[loss=0.1628, simple_loss=0.2309, pruned_loss=0.04732, over 4845.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2147, pruned_loss=0.03482, over 973145.90 frames.], batch size: 15, lr: 2.57e-04 2022-05-06 06:54:16,391 INFO [train.py:715] (0/8) Epoch 8, batch 25100, loss[loss=0.1368, simple_loss=0.2069, pruned_loss=0.03333, over 4810.00 frames.], tot_loss[loss=0.1426, simple_loss=0.215, pruned_loss=0.03513, over 972781.85 frames.], batch size: 25, lr: 2.57e-04 2022-05-06 06:54:55,805 INFO [train.py:715] (0/8) Epoch 8, batch 25150, loss[loss=0.1176, simple_loss=0.1987, pruned_loss=0.01823, over 4813.00 frames.], tot_loss[loss=0.143, simple_loss=0.2152, pruned_loss=0.03537, over 972346.28 frames.], batch size: 26, lr: 2.57e-04 2022-05-06 06:55:34,830 INFO [train.py:715] (0/8) Epoch 8, batch 25200, loss[loss=0.1364, simple_loss=0.2072, pruned_loss=0.03282, over 4777.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2156, pruned_loss=0.03587, over 971985.49 frames.], batch size: 18, lr: 2.57e-04 2022-05-06 06:56:13,821 INFO [train.py:715] (0/8) Epoch 8, batch 25250, loss[loss=0.1424, simple_loss=0.2104, pruned_loss=0.03719, over 4841.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2153, pruned_loss=0.03592, over 972234.67 frames.], batch size: 30, lr: 2.57e-04 2022-05-06 06:56:53,389 INFO [train.py:715] (0/8) Epoch 8, batch 25300, loss[loss=0.1556, simple_loss=0.2244, pruned_loss=0.04335, over 4842.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2159, pruned_loss=0.03642, over 971162.83 frames.], batch size: 15, lr: 2.57e-04 2022-05-06 06:57:32,352 INFO [train.py:715] (0/8) Epoch 8, batch 
25350, loss[loss=0.1238, simple_loss=0.1966, pruned_loss=0.02548, over 4871.00 frames.], tot_loss[loss=0.144, simple_loss=0.2157, pruned_loss=0.03613, over 971871.66 frames.], batch size: 22, lr: 2.57e-04 2022-05-06 06:58:11,171 INFO [train.py:715] (0/8) Epoch 8, batch 25400, loss[loss=0.19, simple_loss=0.2529, pruned_loss=0.06354, over 4913.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2161, pruned_loss=0.03657, over 972082.03 frames.], batch size: 18, lr: 2.57e-04 2022-05-06 06:58:50,228 INFO [train.py:715] (0/8) Epoch 8, batch 25450, loss[loss=0.1202, simple_loss=0.2023, pruned_loss=0.01899, over 4812.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2153, pruned_loss=0.03611, over 972274.80 frames.], batch size: 25, lr: 2.57e-04 2022-05-06 06:59:30,371 INFO [train.py:715] (0/8) Epoch 8, batch 25500, loss[loss=0.1309, simple_loss=0.2033, pruned_loss=0.02924, over 4836.00 frames.], tot_loss[loss=0.1434, simple_loss=0.215, pruned_loss=0.03589, over 972353.59 frames.], batch size: 25, lr: 2.57e-04 2022-05-06 06:59:47,788 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-304000.pt 2022-05-06 07:00:12,380 INFO [train.py:715] (0/8) Epoch 8, batch 25550, loss[loss=0.1488, simple_loss=0.2112, pruned_loss=0.04314, over 4930.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2145, pruned_loss=0.03543, over 972694.87 frames.], batch size: 23, lr: 2.57e-04 2022-05-06 07:00:51,652 INFO [train.py:715] (0/8) Epoch 8, batch 25600, loss[loss=0.1418, simple_loss=0.2175, pruned_loss=0.0331, over 4991.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2143, pruned_loss=0.03571, over 972677.99 frames.], batch size: 14, lr: 2.57e-04 2022-05-06 07:01:30,734 INFO [train.py:715] (0/8) Epoch 8, batch 25650, loss[loss=0.1326, simple_loss=0.2076, pruned_loss=0.02876, over 4983.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2136, pruned_loss=0.03483, over 972731.07 frames.], batch size: 26, lr: 2.57e-04 2022-05-06 07:02:09,696 INFO [train.py:715] (0/8) Epoch 8, batch 25700, loss[loss=0.1361, simple_loss=0.2046, pruned_loss=0.03378, over 4795.00 frames.], tot_loss[loss=0.1419, simple_loss=0.2137, pruned_loss=0.03501, over 972337.64 frames.], batch size: 24, lr: 2.57e-04 2022-05-06 07:02:48,863 INFO [train.py:715] (0/8) Epoch 8, batch 25750, loss[loss=0.15, simple_loss=0.2213, pruned_loss=0.03935, over 4828.00 frames.], tot_loss[loss=0.1422, simple_loss=0.214, pruned_loss=0.03518, over 972857.40 frames.], batch size: 15, lr: 2.57e-04 2022-05-06 07:03:27,687 INFO [train.py:715] (0/8) Epoch 8, batch 25800, loss[loss=0.1347, simple_loss=0.206, pruned_loss=0.03172, over 4753.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2142, pruned_loss=0.03517, over 973041.21 frames.], batch size: 16, lr: 2.57e-04 2022-05-06 07:04:06,653 INFO [train.py:715] (0/8) Epoch 8, batch 25850, loss[loss=0.1274, simple_loss=0.2036, pruned_loss=0.02566, over 4987.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2143, pruned_loss=0.0355, over 972812.76 frames.], batch size: 14, lr: 2.57e-04 2022-05-06 07:04:45,938 INFO [train.py:715] (0/8) Epoch 8, batch 25900, loss[loss=0.1488, simple_loss=0.2223, pruned_loss=0.03763, over 4888.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2143, pruned_loss=0.03524, over 972614.92 frames.], batch size: 22, lr: 2.57e-04 2022-05-06 07:05:24,608 INFO [train.py:715] (0/8) Epoch 8, batch 25950, loss[loss=0.1643, simple_loss=0.2337, pruned_loss=0.0475, over 4857.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2147, pruned_loss=0.03545, over 973476.08 frames.], 
batch size: 15, lr: 2.57e-04 2022-05-06 07:06:03,740 INFO [train.py:715] (0/8) Epoch 8, batch 26000, loss[loss=0.1353, simple_loss=0.2191, pruned_loss=0.02577, over 4890.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2144, pruned_loss=0.03525, over 972228.54 frames.], batch size: 19, lr: 2.57e-04 2022-05-06 07:06:42,906 INFO [train.py:715] (0/8) Epoch 8, batch 26050, loss[loss=0.159, simple_loss=0.2241, pruned_loss=0.04693, over 4974.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2146, pruned_loss=0.03589, over 972428.21 frames.], batch size: 28, lr: 2.57e-04 2022-05-06 07:07:21,666 INFO [train.py:715] (0/8) Epoch 8, batch 26100, loss[loss=0.1241, simple_loss=0.1894, pruned_loss=0.02941, over 4810.00 frames.], tot_loss[loss=0.1431, simple_loss=0.215, pruned_loss=0.03561, over 972036.68 frames.], batch size: 13, lr: 2.57e-04 2022-05-06 07:08:01,300 INFO [train.py:715] (0/8) Epoch 8, batch 26150, loss[loss=0.1058, simple_loss=0.1796, pruned_loss=0.01605, over 4989.00 frames.], tot_loss[loss=0.1422, simple_loss=0.214, pruned_loss=0.03516, over 973006.29 frames.], batch size: 25, lr: 2.57e-04 2022-05-06 07:08:40,492 INFO [train.py:715] (0/8) Epoch 8, batch 26200, loss[loss=0.1615, simple_loss=0.2282, pruned_loss=0.04744, over 4766.00 frames.], tot_loss[loss=0.1422, simple_loss=0.214, pruned_loss=0.03516, over 973463.75 frames.], batch size: 19, lr: 2.57e-04 2022-05-06 07:09:19,621 INFO [train.py:715] (0/8) Epoch 8, batch 26250, loss[loss=0.1295, simple_loss=0.1965, pruned_loss=0.03129, over 4816.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2131, pruned_loss=0.03464, over 973245.18 frames.], batch size: 15, lr: 2.57e-04 2022-05-06 07:09:57,935 INFO [train.py:715] (0/8) Epoch 8, batch 26300, loss[loss=0.1216, simple_loss=0.1918, pruned_loss=0.0257, over 4785.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2134, pruned_loss=0.03458, over 973120.99 frames.], batch size: 12, lr: 2.57e-04 2022-05-06 07:10:37,572 INFO [train.py:715] (0/8) Epoch 8, batch 26350, loss[loss=0.171, simple_loss=0.2321, pruned_loss=0.05492, over 4839.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2144, pruned_loss=0.03547, over 972290.19 frames.], batch size: 15, lr: 2.57e-04 2022-05-06 07:11:16,888 INFO [train.py:715] (0/8) Epoch 8, batch 26400, loss[loss=0.164, simple_loss=0.2344, pruned_loss=0.04682, over 4781.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2158, pruned_loss=0.03567, over 971619.17 frames.], batch size: 14, lr: 2.57e-04 2022-05-06 07:11:55,836 INFO [train.py:715] (0/8) Epoch 8, batch 26450, loss[loss=0.1292, simple_loss=0.1979, pruned_loss=0.0302, over 4993.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2161, pruned_loss=0.03555, over 971612.70 frames.], batch size: 14, lr: 2.57e-04 2022-05-06 07:12:34,672 INFO [train.py:715] (0/8) Epoch 8, batch 26500, loss[loss=0.1434, simple_loss=0.2059, pruned_loss=0.0404, over 4975.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2165, pruned_loss=0.03586, over 972073.73 frames.], batch size: 15, lr: 2.57e-04 2022-05-06 07:13:13,273 INFO [train.py:715] (0/8) Epoch 8, batch 26550, loss[loss=0.1087, simple_loss=0.1697, pruned_loss=0.02384, over 4975.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2157, pruned_loss=0.03526, over 972343.57 frames.], batch size: 14, lr: 2.57e-04 2022-05-06 07:13:52,657 INFO [train.py:715] (0/8) Epoch 8, batch 26600, loss[loss=0.1289, simple_loss=0.2052, pruned_loss=0.02633, over 4925.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2151, pruned_loss=0.035, over 972892.48 frames.], batch size: 17, lr: 2.57e-04 2022-05-06 
07:14:30,714 INFO [train.py:715] (0/8) Epoch 8, batch 26650, loss[loss=0.185, simple_loss=0.2624, pruned_loss=0.05381, over 4922.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2146, pruned_loss=0.03505, over 972973.50 frames.], batch size: 18, lr: 2.57e-04 2022-05-06 07:15:10,077 INFO [train.py:715] (0/8) Epoch 8, batch 26700, loss[loss=0.1378, simple_loss=0.213, pruned_loss=0.03129, over 4948.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2146, pruned_loss=0.03503, over 973615.89 frames.], batch size: 24, lr: 2.57e-04 2022-05-06 07:15:49,149 INFO [train.py:715] (0/8) Epoch 8, batch 26750, loss[loss=0.1404, simple_loss=0.215, pruned_loss=0.03288, over 4859.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2155, pruned_loss=0.03549, over 973050.89 frames.], batch size: 13, lr: 2.57e-04 2022-05-06 07:16:27,929 INFO [train.py:715] (0/8) Epoch 8, batch 26800, loss[loss=0.1526, simple_loss=0.2304, pruned_loss=0.03742, over 4795.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2164, pruned_loss=0.03647, over 972872.71 frames.], batch size: 18, lr: 2.57e-04 2022-05-06 07:17:07,164 INFO [train.py:715] (0/8) Epoch 8, batch 26850, loss[loss=0.1153, simple_loss=0.1947, pruned_loss=0.01799, over 4919.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2163, pruned_loss=0.03621, over 974371.86 frames.], batch size: 18, lr: 2.57e-04 2022-05-06 07:17:46,413 INFO [train.py:715] (0/8) Epoch 8, batch 26900, loss[loss=0.1364, simple_loss=0.1963, pruned_loss=0.03823, over 4795.00 frames.], tot_loss[loss=0.144, simple_loss=0.2161, pruned_loss=0.03594, over 973747.74 frames.], batch size: 14, lr: 2.57e-04 2022-05-06 07:18:25,463 INFO [train.py:715] (0/8) Epoch 8, batch 26950, loss[loss=0.1453, simple_loss=0.217, pruned_loss=0.03675, over 4775.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2164, pruned_loss=0.03617, over 972903.27 frames.], batch size: 17, lr: 2.57e-04 2022-05-06 07:19:04,350 INFO [train.py:715] (0/8) Epoch 8, batch 27000, loss[loss=0.158, simple_loss=0.2176, pruned_loss=0.04916, over 4876.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2159, pruned_loss=0.03591, over 972856.99 frames.], batch size: 32, lr: 2.57e-04 2022-05-06 07:19:04,352 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 07:19:13,680 INFO [train.py:742] (0/8) Epoch 8, validation: loss=0.1072, simple_loss=0.1919, pruned_loss=0.01129, over 914524.00 frames. 
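
Note: a few entries earlier the trainer saved pruned_transducer_stateless2/exp/v2/checkpoint-304000.pt. Below is a minimal sketch for inspecting such a file offline; it only assumes the file is an ordinary torch-serialized object and makes no assumption about which keys (model weights, optimizer state, counters, ...) it actually contains.

# Minimal sketch: inspect a periodically saved checkpoint offline.
# Assumption: the .pt file is a regular torch pickle; its exact contents are not assumed here.
import torch

ckpt_path = "pruned_transducer_stateless2/exp/v2/checkpoint-304000.pt"  # path taken from the log above
ckpt = torch.load(ckpt_path, map_location="cpu")

if isinstance(ckpt, dict):
    for key, value in ckpt.items():
        print(key, type(value).__name__)
else:
    print(type(ckpt).__name__)
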
2022-05-06 07:19:52,525 INFO [train.py:715] (0/8) Epoch 8, batch 27050, loss[loss=0.115, simple_loss=0.1833, pruned_loss=0.02336, over 4900.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2153, pruned_loss=0.03566, over 972985.69 frames.], batch size: 23, lr: 2.57e-04 2022-05-06 07:20:31,871 INFO [train.py:715] (0/8) Epoch 8, batch 27100, loss[loss=0.1647, simple_loss=0.225, pruned_loss=0.05219, over 4835.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2148, pruned_loss=0.03533, over 972420.66 frames.], batch size: 15, lr: 2.57e-04 2022-05-06 07:21:10,969 INFO [train.py:715] (0/8) Epoch 8, batch 27150, loss[loss=0.1663, simple_loss=0.2375, pruned_loss=0.04751, over 4872.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2155, pruned_loss=0.0352, over 972384.25 frames.], batch size: 32, lr: 2.57e-04 2022-05-06 07:21:49,181 INFO [train.py:715] (0/8) Epoch 8, batch 27200, loss[loss=0.1572, simple_loss=0.2241, pruned_loss=0.0452, over 4967.00 frames.], tot_loss[loss=0.1436, simple_loss=0.216, pruned_loss=0.03561, over 972883.10 frames.], batch size: 35, lr: 2.57e-04 2022-05-06 07:22:28,509 INFO [train.py:715] (0/8) Epoch 8, batch 27250, loss[loss=0.1512, simple_loss=0.2206, pruned_loss=0.04091, over 4942.00 frames.], tot_loss[loss=0.1437, simple_loss=0.216, pruned_loss=0.03564, over 972953.81 frames.], batch size: 35, lr: 2.57e-04 2022-05-06 07:23:07,825 INFO [train.py:715] (0/8) Epoch 8, batch 27300, loss[loss=0.1727, simple_loss=0.2362, pruned_loss=0.05459, over 4801.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2162, pruned_loss=0.03555, over 973298.14 frames.], batch size: 21, lr: 2.57e-04 2022-05-06 07:23:46,492 INFO [train.py:715] (0/8) Epoch 8, batch 27350, loss[loss=0.1162, simple_loss=0.1912, pruned_loss=0.02057, over 4830.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2158, pruned_loss=0.03533, over 973563.72 frames.], batch size: 26, lr: 2.57e-04 2022-05-06 07:24:25,181 INFO [train.py:715] (0/8) Epoch 8, batch 27400, loss[loss=0.1551, simple_loss=0.2308, pruned_loss=0.0397, over 4780.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2158, pruned_loss=0.03533, over 973389.11 frames.], batch size: 17, lr: 2.56e-04 2022-05-06 07:25:04,320 INFO [train.py:715] (0/8) Epoch 8, batch 27450, loss[loss=0.1527, simple_loss=0.2196, pruned_loss=0.04286, over 4699.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2157, pruned_loss=0.03539, over 972874.35 frames.], batch size: 15, lr: 2.56e-04 2022-05-06 07:25:43,018 INFO [train.py:715] (0/8) Epoch 8, batch 27500, loss[loss=0.1247, simple_loss=0.1898, pruned_loss=0.02983, over 4980.00 frames.], tot_loss[loss=0.1427, simple_loss=0.215, pruned_loss=0.03515, over 972837.19 frames.], batch size: 14, lr: 2.56e-04 2022-05-06 07:26:21,672 INFO [train.py:715] (0/8) Epoch 8, batch 27550, loss[loss=0.1648, simple_loss=0.224, pruned_loss=0.05278, over 4700.00 frames.], tot_loss[loss=0.1436, simple_loss=0.216, pruned_loss=0.03559, over 972328.77 frames.], batch size: 15, lr: 2.56e-04 2022-05-06 07:27:01,333 INFO [train.py:715] (0/8) Epoch 8, batch 27600, loss[loss=0.1456, simple_loss=0.2229, pruned_loss=0.03414, over 4776.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2158, pruned_loss=0.03568, over 972142.28 frames.], batch size: 18, lr: 2.56e-04 2022-05-06 07:27:40,424 INFO [train.py:715] (0/8) Epoch 8, batch 27650, loss[loss=0.1197, simple_loss=0.1881, pruned_loss=0.02571, over 4837.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2155, pruned_loss=0.03582, over 972507.96 frames.], batch size: 12, lr: 2.56e-04 2022-05-06 07:28:19,091 INFO 
[train.py:715] (0/8) Epoch 8, batch 27700, loss[loss=0.1318, simple_loss=0.2068, pruned_loss=0.02838, over 4933.00 frames.], tot_loss[loss=0.143, simple_loss=0.2149, pruned_loss=0.0355, over 973111.75 frames.], batch size: 29, lr: 2.56e-04 2022-05-06 07:28:58,322 INFO [train.py:715] (0/8) Epoch 8, batch 27750, loss[loss=0.1616, simple_loss=0.2405, pruned_loss=0.04134, over 4883.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2146, pruned_loss=0.03529, over 972929.96 frames.], batch size: 22, lr: 2.56e-04 2022-05-06 07:29:38,018 INFO [train.py:715] (0/8) Epoch 8, batch 27800, loss[loss=0.1565, simple_loss=0.2337, pruned_loss=0.03968, over 4752.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2149, pruned_loss=0.03529, over 973150.83 frames.], batch size: 16, lr: 2.56e-04 2022-05-06 07:30:16,790 INFO [train.py:715] (0/8) Epoch 8, batch 27850, loss[loss=0.1196, simple_loss=0.1928, pruned_loss=0.02318, over 4945.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2154, pruned_loss=0.03547, over 972720.43 frames.], batch size: 29, lr: 2.56e-04 2022-05-06 07:30:54,913 INFO [train.py:715] (0/8) Epoch 8, batch 27900, loss[loss=0.1653, simple_loss=0.2405, pruned_loss=0.04503, over 4916.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2154, pruned_loss=0.03519, over 973344.23 frames.], batch size: 18, lr: 2.56e-04 2022-05-06 07:31:34,147 INFO [train.py:715] (0/8) Epoch 8, batch 27950, loss[loss=0.1283, simple_loss=0.2115, pruned_loss=0.02256, over 4695.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2151, pruned_loss=0.03471, over 972914.41 frames.], batch size: 15, lr: 2.56e-04 2022-05-06 07:32:13,473 INFO [train.py:715] (0/8) Epoch 8, batch 28000, loss[loss=0.1694, simple_loss=0.2341, pruned_loss=0.05233, over 4708.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2153, pruned_loss=0.03496, over 972590.48 frames.], batch size: 15, lr: 2.56e-04 2022-05-06 07:32:51,690 INFO [train.py:715] (0/8) Epoch 8, batch 28050, loss[loss=0.1372, simple_loss=0.2285, pruned_loss=0.02296, over 4882.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2152, pruned_loss=0.03496, over 972439.17 frames.], batch size: 16, lr: 2.56e-04 2022-05-06 07:33:31,444 INFO [train.py:715] (0/8) Epoch 8, batch 28100, loss[loss=0.1453, simple_loss=0.2168, pruned_loss=0.03687, over 4965.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2155, pruned_loss=0.03534, over 972301.12 frames.], batch size: 15, lr: 2.56e-04 2022-05-06 07:34:10,512 INFO [train.py:715] (0/8) Epoch 8, batch 28150, loss[loss=0.1178, simple_loss=0.1984, pruned_loss=0.01858, over 4851.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2153, pruned_loss=0.03546, over 972432.19 frames.], batch size: 20, lr: 2.56e-04 2022-05-06 07:34:49,969 INFO [train.py:715] (0/8) Epoch 8, batch 28200, loss[loss=0.1445, simple_loss=0.2079, pruned_loss=0.04053, over 4837.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2157, pruned_loss=0.03579, over 972011.00 frames.], batch size: 13, lr: 2.56e-04 2022-05-06 07:35:29,402 INFO [train.py:715] (0/8) Epoch 8, batch 28250, loss[loss=0.1461, simple_loss=0.2221, pruned_loss=0.03501, over 4901.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2158, pruned_loss=0.03564, over 971544.68 frames.], batch size: 19, lr: 2.56e-04 2022-05-06 07:36:09,670 INFO [train.py:715] (0/8) Epoch 8, batch 28300, loss[loss=0.1204, simple_loss=0.1959, pruned_loss=0.02245, over 4929.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2153, pruned_loss=0.03527, over 972167.12 frames.], batch size: 21, lr: 2.56e-04 2022-05-06 07:36:49,587 INFO [train.py:715] (0/8) Epoch 8, 
batch 28350, loss[loss=0.1444, simple_loss=0.2223, pruned_loss=0.03326, over 4771.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2152, pruned_loss=0.0352, over 972171.85 frames.], batch size: 18, lr: 2.56e-04 2022-05-06 07:37:28,934 INFO [train.py:715] (0/8) Epoch 8, batch 28400, loss[loss=0.1144, simple_loss=0.1888, pruned_loss=0.01997, over 4861.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2153, pruned_loss=0.03517, over 972369.96 frames.], batch size: 12, lr: 2.56e-04 2022-05-06 07:38:08,993 INFO [train.py:715] (0/8) Epoch 8, batch 28450, loss[loss=0.1226, simple_loss=0.184, pruned_loss=0.03053, over 4774.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2146, pruned_loss=0.03481, over 972735.59 frames.], batch size: 12, lr: 2.56e-04 2022-05-06 07:38:48,158 INFO [train.py:715] (0/8) Epoch 8, batch 28500, loss[loss=0.1345, simple_loss=0.2048, pruned_loss=0.03208, over 4839.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2156, pruned_loss=0.03526, over 972726.99 frames.], batch size: 30, lr: 2.56e-04 2022-05-06 07:39:26,865 INFO [train.py:715] (0/8) Epoch 8, batch 28550, loss[loss=0.1305, simple_loss=0.2032, pruned_loss=0.02888, over 4788.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2158, pruned_loss=0.03555, over 972686.98 frames.], batch size: 14, lr: 2.56e-04 2022-05-06 07:40:05,724 INFO [train.py:715] (0/8) Epoch 8, batch 28600, loss[loss=0.1425, simple_loss=0.2226, pruned_loss=0.03118, over 4820.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2155, pruned_loss=0.03534, over 973306.55 frames.], batch size: 26, lr: 2.56e-04 2022-05-06 07:40:45,398 INFO [train.py:715] (0/8) Epoch 8, batch 28650, loss[loss=0.1316, simple_loss=0.2006, pruned_loss=0.03133, over 4832.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2153, pruned_loss=0.03549, over 972461.09 frames.], batch size: 13, lr: 2.56e-04 2022-05-06 07:41:24,251 INFO [train.py:715] (0/8) Epoch 8, batch 28700, loss[loss=0.1392, simple_loss=0.2138, pruned_loss=0.03229, over 4751.00 frames.], tot_loss[loss=0.1439, simple_loss=0.216, pruned_loss=0.03587, over 973000.13 frames.], batch size: 16, lr: 2.56e-04 2022-05-06 07:42:02,599 INFO [train.py:715] (0/8) Epoch 8, batch 28750, loss[loss=0.133, simple_loss=0.2101, pruned_loss=0.02796, over 4918.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2153, pruned_loss=0.03556, over 972613.67 frames.], batch size: 18, lr: 2.56e-04 2022-05-06 07:42:42,144 INFO [train.py:715] (0/8) Epoch 8, batch 28800, loss[loss=0.1234, simple_loss=0.2016, pruned_loss=0.02261, over 4970.00 frames.], tot_loss[loss=0.143, simple_loss=0.2153, pruned_loss=0.03538, over 972424.41 frames.], batch size: 15, lr: 2.56e-04 2022-05-06 07:43:21,537 INFO [train.py:715] (0/8) Epoch 8, batch 28850, loss[loss=0.1305, simple_loss=0.2027, pruned_loss=0.02916, over 4984.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2156, pruned_loss=0.0354, over 972930.47 frames.], batch size: 14, lr: 2.56e-04 2022-05-06 07:44:00,548 INFO [train.py:715] (0/8) Epoch 8, batch 28900, loss[loss=0.1417, simple_loss=0.2106, pruned_loss=0.03638, over 4955.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2157, pruned_loss=0.03539, over 973150.20 frames.], batch size: 35, lr: 2.56e-04 2022-05-06 07:44:39,169 INFO [train.py:715] (0/8) Epoch 8, batch 28950, loss[loss=0.1246, simple_loss=0.1977, pruned_loss=0.02578, over 4911.00 frames.], tot_loss[loss=0.143, simple_loss=0.2153, pruned_loss=0.03533, over 973089.26 frames.], batch size: 17, lr: 2.56e-04 2022-05-06 07:45:18,517 INFO [train.py:715] (0/8) Epoch 8, batch 29000, loss[loss=0.1289, 
simple_loss=0.2012, pruned_loss=0.02832, over 4831.00 frames.], tot_loss[loss=0.143, simple_loss=0.2154, pruned_loss=0.03532, over 972818.49 frames.], batch size: 26, lr: 2.56e-04 2022-05-06 07:45:57,177 INFO [train.py:715] (0/8) Epoch 8, batch 29050, loss[loss=0.1407, simple_loss=0.2192, pruned_loss=0.03105, over 4827.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2159, pruned_loss=0.03536, over 973057.53 frames.], batch size: 13, lr: 2.56e-04 2022-05-06 07:46:36,416 INFO [train.py:715] (0/8) Epoch 8, batch 29100, loss[loss=0.1162, simple_loss=0.196, pruned_loss=0.01825, over 4826.00 frames.], tot_loss[loss=0.1437, simple_loss=0.216, pruned_loss=0.03569, over 973405.19 frames.], batch size: 13, lr: 2.56e-04 2022-05-06 07:47:14,942 INFO [train.py:715] (0/8) Epoch 8, batch 29150, loss[loss=0.158, simple_loss=0.2324, pruned_loss=0.04176, over 4970.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2156, pruned_loss=0.03544, over 973198.35 frames.], batch size: 24, lr: 2.56e-04 2022-05-06 07:47:54,240 INFO [train.py:715] (0/8) Epoch 8, batch 29200, loss[loss=0.1412, simple_loss=0.2156, pruned_loss=0.0334, over 4887.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2145, pruned_loss=0.03507, over 972903.02 frames.], batch size: 16, lr: 2.56e-04 2022-05-06 07:48:32,867 INFO [train.py:715] (0/8) Epoch 8, batch 29250, loss[loss=0.1471, simple_loss=0.2241, pruned_loss=0.03505, over 4986.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2151, pruned_loss=0.03506, over 973107.91 frames.], batch size: 24, lr: 2.56e-04 2022-05-06 07:49:11,136 INFO [train.py:715] (0/8) Epoch 8, batch 29300, loss[loss=0.1399, simple_loss=0.2252, pruned_loss=0.02734, over 4939.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2158, pruned_loss=0.0352, over 973116.05 frames.], batch size: 29, lr: 2.56e-04 2022-05-06 07:49:50,320 INFO [train.py:715] (0/8) Epoch 8, batch 29350, loss[loss=0.1642, simple_loss=0.2346, pruned_loss=0.04694, over 4984.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2164, pruned_loss=0.03545, over 973817.02 frames.], batch size: 15, lr: 2.56e-04 2022-05-06 07:50:29,150 INFO [train.py:715] (0/8) Epoch 8, batch 29400, loss[loss=0.1249, simple_loss=0.2066, pruned_loss=0.02162, over 4807.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2163, pruned_loss=0.03552, over 972263.57 frames.], batch size: 13, lr: 2.56e-04 2022-05-06 07:51:08,796 INFO [train.py:715] (0/8) Epoch 8, batch 29450, loss[loss=0.1691, simple_loss=0.2388, pruned_loss=0.04971, over 4879.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2165, pruned_loss=0.0358, over 972680.06 frames.], batch size: 16, lr: 2.56e-04 2022-05-06 07:51:48,077 INFO [train.py:715] (0/8) Epoch 8, batch 29500, loss[loss=0.1349, simple_loss=0.2003, pruned_loss=0.0347, over 4870.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2163, pruned_loss=0.03578, over 973176.19 frames.], batch size: 20, lr: 2.56e-04 2022-05-06 07:52:27,548 INFO [train.py:715] (0/8) Epoch 8, batch 29550, loss[loss=0.1438, simple_loss=0.2136, pruned_loss=0.03693, over 4884.00 frames.], tot_loss[loss=0.1441, simple_loss=0.2159, pruned_loss=0.03612, over 972379.77 frames.], batch size: 22, lr: 2.56e-04 2022-05-06 07:53:06,113 INFO [train.py:715] (0/8) Epoch 8, batch 29600, loss[loss=0.1299, simple_loss=0.2129, pruned_loss=0.02348, over 4785.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2166, pruned_loss=0.03652, over 972644.48 frames.], batch size: 18, lr: 2.56e-04 2022-05-06 07:53:45,381 INFO [train.py:715] (0/8) Epoch 8, batch 29650, loss[loss=0.1337, simple_loss=0.2036, 
pruned_loss=0.03188, over 4808.00 frames.], tot_loss[loss=0.1441, simple_loss=0.216, pruned_loss=0.0361, over 972039.45 frames.], batch size: 25, lr: 2.56e-04 2022-05-06 07:54:24,982 INFO [train.py:715] (0/8) Epoch 8, batch 29700, loss[loss=0.1425, simple_loss=0.2135, pruned_loss=0.03576, over 4835.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2143, pruned_loss=0.03555, over 971887.67 frames.], batch size: 13, lr: 2.56e-04 2022-05-06 07:55:03,538 INFO [train.py:715] (0/8) Epoch 8, batch 29750, loss[loss=0.1616, simple_loss=0.2275, pruned_loss=0.04788, over 4810.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2144, pruned_loss=0.03548, over 972902.55 frames.], batch size: 15, lr: 2.56e-04 2022-05-06 07:55:42,375 INFO [train.py:715] (0/8) Epoch 8, batch 29800, loss[loss=0.1526, simple_loss=0.2243, pruned_loss=0.04041, over 4709.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2147, pruned_loss=0.03534, over 971858.06 frames.], batch size: 15, lr: 2.55e-04 2022-05-06 07:56:21,282 INFO [train.py:715] (0/8) Epoch 8, batch 29850, loss[loss=0.214, simple_loss=0.2793, pruned_loss=0.07432, over 4757.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2155, pruned_loss=0.03578, over 971778.90 frames.], batch size: 18, lr: 2.55e-04 2022-05-06 07:57:00,652 INFO [train.py:715] (0/8) Epoch 8, batch 29900, loss[loss=0.1738, simple_loss=0.2445, pruned_loss=0.05157, over 4967.00 frames.], tot_loss[loss=0.1445, simple_loss=0.2164, pruned_loss=0.03627, over 972225.38 frames.], batch size: 15, lr: 2.55e-04 2022-05-06 07:57:39,544 INFO [train.py:715] (0/8) Epoch 8, batch 29950, loss[loss=0.1363, simple_loss=0.2189, pruned_loss=0.02685, over 4992.00 frames.], tot_loss[loss=0.1452, simple_loss=0.217, pruned_loss=0.03664, over 972763.16 frames.], batch size: 25, lr: 2.55e-04 2022-05-06 07:58:18,654 INFO [train.py:715] (0/8) Epoch 8, batch 30000, loss[loss=0.1469, simple_loss=0.1976, pruned_loss=0.04812, over 4781.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2169, pruned_loss=0.03628, over 972825.73 frames.], batch size: 12, lr: 2.55e-04 2022-05-06 07:58:18,655 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 07:58:28,241 INFO [train.py:742] (0/8) Epoch 8, validation: loss=0.1073, simple_loss=0.1918, pruned_loss=0.01141, over 914524.00 frames. 
2022-05-06 07:59:07,023 INFO [train.py:715] (0/8) Epoch 8, batch 30050, loss[loss=0.1358, simple_loss=0.2119, pruned_loss=0.02986, over 4892.00 frames.], tot_loss[loss=0.1452, simple_loss=0.2177, pruned_loss=0.03633, over 972865.31 frames.], batch size: 22, lr: 2.55e-04 2022-05-06 07:59:46,355 INFO [train.py:715] (0/8) Epoch 8, batch 30100, loss[loss=0.1524, simple_loss=0.2323, pruned_loss=0.03626, over 4966.00 frames.], tot_loss[loss=0.144, simple_loss=0.2167, pruned_loss=0.03569, over 972885.33 frames.], batch size: 21, lr: 2.55e-04 2022-05-06 08:00:25,656 INFO [train.py:715] (0/8) Epoch 8, batch 30150, loss[loss=0.1521, simple_loss=0.2155, pruned_loss=0.04439, over 4962.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2155, pruned_loss=0.03576, over 972157.46 frames.], batch size: 15, lr: 2.55e-04 2022-05-06 08:01:04,250 INFO [train.py:715] (0/8) Epoch 8, batch 30200, loss[loss=0.1483, simple_loss=0.2202, pruned_loss=0.03824, over 4927.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2162, pruned_loss=0.03555, over 972057.86 frames.], batch size: 18, lr: 2.55e-04 2022-05-06 08:01:43,181 INFO [train.py:715] (0/8) Epoch 8, batch 30250, loss[loss=0.1634, simple_loss=0.2308, pruned_loss=0.04803, over 4895.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2159, pruned_loss=0.03553, over 972500.47 frames.], batch size: 19, lr: 2.55e-04 2022-05-06 08:02:22,869 INFO [train.py:715] (0/8) Epoch 8, batch 30300, loss[loss=0.1381, simple_loss=0.2132, pruned_loss=0.0315, over 4886.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2147, pruned_loss=0.03499, over 973106.31 frames.], batch size: 22, lr: 2.55e-04 2022-05-06 08:03:01,867 INFO [train.py:715] (0/8) Epoch 8, batch 30350, loss[loss=0.1287, simple_loss=0.2144, pruned_loss=0.0215, over 4941.00 frames.], tot_loss[loss=0.1426, simple_loss=0.215, pruned_loss=0.03512, over 972945.65 frames.], batch size: 23, lr: 2.55e-04 2022-05-06 08:03:40,563 INFO [train.py:715] (0/8) Epoch 8, batch 30400, loss[loss=0.1301, simple_loss=0.2085, pruned_loss=0.02582, over 4777.00 frames.], tot_loss[loss=0.142, simple_loss=0.2145, pruned_loss=0.03476, over 972271.26 frames.], batch size: 18, lr: 2.55e-04 2022-05-06 08:04:19,868 INFO [train.py:715] (0/8) Epoch 8, batch 30450, loss[loss=0.1492, simple_loss=0.2208, pruned_loss=0.03883, over 4866.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2145, pruned_loss=0.03454, over 971449.78 frames.], batch size: 32, lr: 2.55e-04 2022-05-06 08:04:58,850 INFO [train.py:715] (0/8) Epoch 8, batch 30500, loss[loss=0.1582, simple_loss=0.2356, pruned_loss=0.04044, over 4780.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2165, pruned_loss=0.03536, over 972732.03 frames.], batch size: 17, lr: 2.55e-04 2022-05-06 08:05:37,496 INFO [train.py:715] (0/8) Epoch 8, batch 30550, loss[loss=0.1557, simple_loss=0.2136, pruned_loss=0.04889, over 4685.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2164, pruned_loss=0.03566, over 972013.35 frames.], batch size: 15, lr: 2.55e-04 2022-05-06 08:06:16,535 INFO [train.py:715] (0/8) Epoch 8, batch 30600, loss[loss=0.1584, simple_loss=0.2336, pruned_loss=0.04158, over 4896.00 frames.], tot_loss[loss=0.1443, simple_loss=0.217, pruned_loss=0.0358, over 971881.33 frames.], batch size: 16, lr: 2.55e-04 2022-05-06 08:06:56,252 INFO [train.py:715] (0/8) Epoch 8, batch 30650, loss[loss=0.1333, simple_loss=0.2054, pruned_loss=0.03056, over 4755.00 frames.], tot_loss[loss=0.1431, simple_loss=0.216, pruned_loss=0.03507, over 972129.84 frames.], batch size: 19, lr: 2.55e-04 2022-05-06 08:07:35,434 INFO 
[train.py:715] (0/8) Epoch 8, batch 30700, loss[loss=0.1235, simple_loss=0.2004, pruned_loss=0.02326, over 4915.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2154, pruned_loss=0.03506, over 972980.12 frames.], batch size: 23, lr: 2.55e-04 2022-05-06 08:08:15,306 INFO [train.py:715] (0/8) Epoch 8, batch 30750, loss[loss=0.1055, simple_loss=0.1764, pruned_loss=0.01735, over 4811.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2149, pruned_loss=0.03487, over 972439.59 frames.], batch size: 13, lr: 2.55e-04 2022-05-06 08:08:55,422 INFO [train.py:715] (0/8) Epoch 8, batch 30800, loss[loss=0.1512, simple_loss=0.2248, pruned_loss=0.03884, over 4897.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2152, pruned_loss=0.03514, over 972070.33 frames.], batch size: 19, lr: 2.55e-04 2022-05-06 08:09:33,881 INFO [train.py:715] (0/8) Epoch 8, batch 30850, loss[loss=0.1329, simple_loss=0.2118, pruned_loss=0.02697, over 4805.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2154, pruned_loss=0.03525, over 972634.29 frames.], batch size: 21, lr: 2.55e-04 2022-05-06 08:10:12,783 INFO [train.py:715] (0/8) Epoch 8, batch 30900, loss[loss=0.1774, simple_loss=0.2494, pruned_loss=0.05271, over 4904.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2154, pruned_loss=0.03557, over 971411.72 frames.], batch size: 17, lr: 2.55e-04 2022-05-06 08:10:52,539 INFO [train.py:715] (0/8) Epoch 8, batch 30950, loss[loss=0.1197, simple_loss=0.1911, pruned_loss=0.02412, over 4857.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2148, pruned_loss=0.03533, over 971485.11 frames.], batch size: 20, lr: 2.55e-04 2022-05-06 08:11:32,557 INFO [train.py:715] (0/8) Epoch 8, batch 31000, loss[loss=0.118, simple_loss=0.1899, pruned_loss=0.02303, over 4959.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2143, pruned_loss=0.03465, over 971677.91 frames.], batch size: 25, lr: 2.55e-04 2022-05-06 08:12:11,802 INFO [train.py:715] (0/8) Epoch 8, batch 31050, loss[loss=0.1513, simple_loss=0.2211, pruned_loss=0.04073, over 4981.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2148, pruned_loss=0.0351, over 970542.95 frames.], batch size: 14, lr: 2.55e-04 2022-05-06 08:12:51,403 INFO [train.py:715] (0/8) Epoch 8, batch 31100, loss[loss=0.1506, simple_loss=0.2323, pruned_loss=0.03442, over 4976.00 frames.], tot_loss[loss=0.142, simple_loss=0.2145, pruned_loss=0.03469, over 971039.80 frames.], batch size: 14, lr: 2.55e-04 2022-05-06 08:13:30,937 INFO [train.py:715] (0/8) Epoch 8, batch 31150, loss[loss=0.1327, simple_loss=0.2172, pruned_loss=0.02408, over 4991.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2151, pruned_loss=0.03489, over 971388.54 frames.], batch size: 28, lr: 2.55e-04 2022-05-06 08:14:09,972 INFO [train.py:715] (0/8) Epoch 8, batch 31200, loss[loss=0.1487, simple_loss=0.2234, pruned_loss=0.03699, over 4922.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2153, pruned_loss=0.03542, over 971302.75 frames.], batch size: 18, lr: 2.55e-04 2022-05-06 08:14:48,710 INFO [train.py:715] (0/8) Epoch 8, batch 31250, loss[loss=0.1216, simple_loss=0.1891, pruned_loss=0.02703, over 4899.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2154, pruned_loss=0.03552, over 971168.52 frames.], batch size: 17, lr: 2.55e-04 2022-05-06 08:15:28,179 INFO [train.py:715] (0/8) Epoch 8, batch 31300, loss[loss=0.124, simple_loss=0.2042, pruned_loss=0.02186, over 4806.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2159, pruned_loss=0.03542, over 972102.73 frames.], batch size: 25, lr: 2.55e-04 2022-05-06 08:16:07,662 INFO [train.py:715] (0/8) Epoch 8, batch 
31350, loss[loss=0.1305, simple_loss=0.2042, pruned_loss=0.02836, over 4829.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2157, pruned_loss=0.03532, over 971135.48 frames.], batch size: 26, lr: 2.55e-04 2022-05-06 08:16:46,293 INFO [train.py:715] (0/8) Epoch 8, batch 31400, loss[loss=0.1234, simple_loss=0.1873, pruned_loss=0.02976, over 4779.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2155, pruned_loss=0.0356, over 971606.68 frames.], batch size: 17, lr: 2.55e-04 2022-05-06 08:17:25,748 INFO [train.py:715] (0/8) Epoch 8, batch 31450, loss[loss=0.1371, simple_loss=0.2164, pruned_loss=0.02891, over 4876.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2157, pruned_loss=0.03548, over 971682.97 frames.], batch size: 22, lr: 2.55e-04 2022-05-06 08:18:05,872 INFO [train.py:715] (0/8) Epoch 8, batch 31500, loss[loss=0.1564, simple_loss=0.2175, pruned_loss=0.04766, over 4753.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2157, pruned_loss=0.03548, over 971319.48 frames.], batch size: 16, lr: 2.55e-04 2022-05-06 08:18:45,119 INFO [train.py:715] (0/8) Epoch 8, batch 31550, loss[loss=0.133, simple_loss=0.2057, pruned_loss=0.03015, over 4981.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2155, pruned_loss=0.03548, over 972293.98 frames.], batch size: 25, lr: 2.55e-04 2022-05-06 08:19:24,100 INFO [train.py:715] (0/8) Epoch 8, batch 31600, loss[loss=0.148, simple_loss=0.2167, pruned_loss=0.03965, over 4946.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2155, pruned_loss=0.03549, over 972972.19 frames.], batch size: 35, lr: 2.55e-04 2022-05-06 08:20:03,755 INFO [train.py:715] (0/8) Epoch 8, batch 31650, loss[loss=0.1596, simple_loss=0.2251, pruned_loss=0.04704, over 4901.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2162, pruned_loss=0.03554, over 973553.21 frames.], batch size: 17, lr: 2.55e-04 2022-05-06 08:20:43,076 INFO [train.py:715] (0/8) Epoch 8, batch 31700, loss[loss=0.1348, simple_loss=0.2111, pruned_loss=0.02925, over 4827.00 frames.], tot_loss[loss=0.1443, simple_loss=0.2166, pruned_loss=0.03602, over 973484.35 frames.], batch size: 13, lr: 2.55e-04 2022-05-06 08:21:22,757 INFO [train.py:715] (0/8) Epoch 8, batch 31750, loss[loss=0.158, simple_loss=0.2477, pruned_loss=0.03416, over 4927.00 frames.], tot_loss[loss=0.1444, simple_loss=0.2169, pruned_loss=0.03594, over 973608.59 frames.], batch size: 29, lr: 2.55e-04 2022-05-06 08:22:01,967 INFO [train.py:715] (0/8) Epoch 8, batch 31800, loss[loss=0.1528, simple_loss=0.2224, pruned_loss=0.04155, over 4816.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2163, pruned_loss=0.03552, over 973336.66 frames.], batch size: 15, lr: 2.55e-04 2022-05-06 08:22:41,011 INFO [train.py:715] (0/8) Epoch 8, batch 31850, loss[loss=0.1577, simple_loss=0.2255, pruned_loss=0.045, over 4889.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2155, pruned_loss=0.03536, over 973887.40 frames.], batch size: 16, lr: 2.55e-04 2022-05-06 08:23:19,919 INFO [train.py:715] (0/8) Epoch 8, batch 31900, loss[loss=0.1468, simple_loss=0.2303, pruned_loss=0.03165, over 4853.00 frames.], tot_loss[loss=0.1428, simple_loss=0.215, pruned_loss=0.03529, over 974136.61 frames.], batch size: 20, lr: 2.55e-04 2022-05-06 08:23:58,317 INFO [train.py:715] (0/8) Epoch 8, batch 31950, loss[loss=0.126, simple_loss=0.1975, pruned_loss=0.02721, over 4969.00 frames.], tot_loss[loss=0.143, simple_loss=0.2153, pruned_loss=0.03539, over 973434.05 frames.], batch size: 15, lr: 2.55e-04 2022-05-06 08:24:37,605 INFO [train.py:715] (0/8) Epoch 8, batch 32000, loss[loss=0.1312, 
simple_loss=0.2148, pruned_loss=0.02383, over 4882.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2151, pruned_loss=0.03498, over 972994.27 frames.], batch size: 20, lr: 2.55e-04 2022-05-06 08:25:17,169 INFO [train.py:715] (0/8) Epoch 8, batch 32050, loss[loss=0.1431, simple_loss=0.212, pruned_loss=0.03712, over 4866.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2151, pruned_loss=0.03508, over 972480.37 frames.], batch size: 32, lr: 2.55e-04 2022-05-06 08:25:55,736 INFO [train.py:715] (0/8) Epoch 8, batch 32100, loss[loss=0.1266, simple_loss=0.1941, pruned_loss=0.02951, over 4958.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2158, pruned_loss=0.03488, over 972091.12 frames.], batch size: 35, lr: 2.55e-04 2022-05-06 08:26:34,468 INFO [train.py:715] (0/8) Epoch 8, batch 32150, loss[loss=0.1436, simple_loss=0.2137, pruned_loss=0.03676, over 4786.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2157, pruned_loss=0.03505, over 971497.81 frames.], batch size: 14, lr: 2.55e-04 2022-05-06 08:27:14,042 INFO [train.py:715] (0/8) Epoch 8, batch 32200, loss[loss=0.133, simple_loss=0.2036, pruned_loss=0.03119, over 4848.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2162, pruned_loss=0.03561, over 970769.94 frames.], batch size: 30, lr: 2.54e-04 2022-05-06 08:27:52,861 INFO [train.py:715] (0/8) Epoch 8, batch 32250, loss[loss=0.1421, simple_loss=0.2141, pruned_loss=0.03507, over 4751.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2161, pruned_loss=0.03551, over 970946.77 frames.], batch size: 19, lr: 2.54e-04 2022-05-06 08:28:32,330 INFO [train.py:715] (0/8) Epoch 8, batch 32300, loss[loss=0.1442, simple_loss=0.2183, pruned_loss=0.03503, over 4928.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2168, pruned_loss=0.03619, over 970206.76 frames.], batch size: 18, lr: 2.54e-04 2022-05-06 08:29:11,538 INFO [train.py:715] (0/8) Epoch 8, batch 32350, loss[loss=0.1281, simple_loss=0.2026, pruned_loss=0.02685, over 4815.00 frames.], tot_loss[loss=0.1449, simple_loss=0.2172, pruned_loss=0.03634, over 970700.36 frames.], batch size: 26, lr: 2.54e-04 2022-05-06 08:29:51,454 INFO [train.py:715] (0/8) Epoch 8, batch 32400, loss[loss=0.1201, simple_loss=0.1998, pruned_loss=0.0202, over 4859.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2171, pruned_loss=0.03629, over 970861.52 frames.], batch size: 20, lr: 2.54e-04 2022-05-06 08:30:30,382 INFO [train.py:715] (0/8) Epoch 8, batch 32450, loss[loss=0.1671, simple_loss=0.2353, pruned_loss=0.04941, over 4898.00 frames.], tot_loss[loss=0.1448, simple_loss=0.2169, pruned_loss=0.03629, over 970885.79 frames.], batch size: 19, lr: 2.54e-04 2022-05-06 08:31:09,405 INFO [train.py:715] (0/8) Epoch 8, batch 32500, loss[loss=0.1597, simple_loss=0.2386, pruned_loss=0.04041, over 4703.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2169, pruned_loss=0.0361, over 970279.25 frames.], batch size: 15, lr: 2.54e-04 2022-05-06 08:31:48,951 INFO [train.py:715] (0/8) Epoch 8, batch 32550, loss[loss=0.129, simple_loss=0.2066, pruned_loss=0.02567, over 4768.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2158, pruned_loss=0.03547, over 970078.06 frames.], batch size: 19, lr: 2.54e-04 2022-05-06 08:32:27,498 INFO [train.py:715] (0/8) Epoch 8, batch 32600, loss[loss=0.1553, simple_loss=0.218, pruned_loss=0.04631, over 4691.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2161, pruned_loss=0.03586, over 970011.88 frames.], batch size: 15, lr: 2.54e-04 2022-05-06 08:33:06,726 INFO [train.py:715] (0/8) Epoch 8, batch 32650, loss[loss=0.1432, simple_loss=0.2205, 
pruned_loss=0.03299, over 4763.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2153, pruned_loss=0.03562, over 969919.15 frames.], batch size: 19, lr: 2.54e-04 2022-05-06 08:33:45,981 INFO [train.py:715] (0/8) Epoch 8, batch 32700, loss[loss=0.1259, simple_loss=0.2051, pruned_loss=0.0233, over 4927.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2155, pruned_loss=0.03591, over 970909.76 frames.], batch size: 18, lr: 2.54e-04 2022-05-06 08:34:26,177 INFO [train.py:715] (0/8) Epoch 8, batch 32750, loss[loss=0.1551, simple_loss=0.2282, pruned_loss=0.04097, over 4699.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2147, pruned_loss=0.03538, over 970500.91 frames.], batch size: 15, lr: 2.54e-04 2022-05-06 08:35:04,663 INFO [train.py:715] (0/8) Epoch 8, batch 32800, loss[loss=0.122, simple_loss=0.2024, pruned_loss=0.02085, over 4922.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2148, pruned_loss=0.03465, over 970437.01 frames.], batch size: 19, lr: 2.54e-04 2022-05-06 08:35:43,306 INFO [train.py:715] (0/8) Epoch 8, batch 32850, loss[loss=0.1587, simple_loss=0.2262, pruned_loss=0.04557, over 4949.00 frames.], tot_loss[loss=0.1419, simple_loss=0.2145, pruned_loss=0.0346, over 971395.99 frames.], batch size: 39, lr: 2.54e-04 2022-05-06 08:36:22,463 INFO [train.py:715] (0/8) Epoch 8, batch 32900, loss[loss=0.1259, simple_loss=0.2018, pruned_loss=0.02499, over 4910.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2136, pruned_loss=0.03433, over 972042.01 frames.], batch size: 17, lr: 2.54e-04 2022-05-06 08:37:00,741 INFO [train.py:715] (0/8) Epoch 8, batch 32950, loss[loss=0.1354, simple_loss=0.208, pruned_loss=0.03142, over 4766.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2134, pruned_loss=0.03453, over 971727.36 frames.], batch size: 16, lr: 2.54e-04 2022-05-06 08:37:39,627 INFO [train.py:715] (0/8) Epoch 8, batch 33000, loss[loss=0.131, simple_loss=0.2135, pruned_loss=0.02429, over 4939.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2136, pruned_loss=0.03453, over 971834.81 frames.], batch size: 23, lr: 2.54e-04 2022-05-06 08:37:39,628 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 08:37:52,641 INFO [train.py:742] (0/8) Epoch 8, validation: loss=0.1071, simple_loss=0.1917, pruned_loss=0.01126, over 914524.00 frames. 
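Note on the three numbers in each entry: throughout this stretch of the log the reported loss is consistent with a fixed weighted sum of the other two components, loss ~= 0.5 * simple_loss + pruned_loss (for the validation entry just above, 0.5 * 0.1917 + 0.01126 ~= 0.1071). The short check below is a sketch using values copied from the log; the 0.5 weight is inferred from the printed numbers themselves, not taken from the training script.

```python
# Consistency check for values copied from the log above.
# Assumption: the printed "loss" is 0.5 * simple_loss + pruned_loss;
# the 0.5 weight is inferred from the numbers, not read from the recipe.
entries = [
    # (loss, simple_loss, pruned_loss)
    (0.1433, 0.2155, 0.0356),   # Epoch 8, batch 31400, tot_loss
    (0.1071, 0.1917, 0.01126),  # Epoch 8 validation, just above
]
for loss, simple_loss, pruned_loss in entries:
    recon = 0.5 * simple_loss + pruned_loss
    print(f"reported={loss:.4f}  reconstructed={recon:.4f}  diff={abs(loss - recon):.1e}")
```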
2022-05-06 08:38:31,970 INFO [train.py:715] (0/8) Epoch 8, batch 33050, loss[loss=0.1415, simple_loss=0.2171, pruned_loss=0.03294, over 4978.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2139, pruned_loss=0.03447, over 972508.04 frames.], batch size: 26, lr: 2.54e-04 2022-05-06 08:39:10,825 INFO [train.py:715] (0/8) Epoch 8, batch 33100, loss[loss=0.1464, simple_loss=0.2162, pruned_loss=0.03828, over 4976.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2141, pruned_loss=0.03449, over 972271.70 frames.], batch size: 14, lr: 2.54e-04 2022-05-06 08:39:50,121 INFO [train.py:715] (0/8) Epoch 8, batch 33150, loss[loss=0.1392, simple_loss=0.2051, pruned_loss=0.03667, over 4830.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2147, pruned_loss=0.03443, over 972626.41 frames.], batch size: 13, lr: 2.54e-04 2022-05-06 08:40:28,829 INFO [train.py:715] (0/8) Epoch 8, batch 33200, loss[loss=0.1168, simple_loss=0.1949, pruned_loss=0.01932, over 4986.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2149, pruned_loss=0.03428, over 973110.90 frames.], batch size: 33, lr: 2.54e-04 2022-05-06 08:41:08,504 INFO [train.py:715] (0/8) Epoch 8, batch 33250, loss[loss=0.1174, simple_loss=0.187, pruned_loss=0.02388, over 4769.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2149, pruned_loss=0.03436, over 973087.72 frames.], batch size: 12, lr: 2.54e-04 2022-05-06 08:41:48,100 INFO [train.py:715] (0/8) Epoch 8, batch 33300, loss[loss=0.1362, simple_loss=0.2097, pruned_loss=0.03137, over 4782.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2159, pruned_loss=0.03487, over 973244.30 frames.], batch size: 17, lr: 2.54e-04 2022-05-06 08:42:26,898 INFO [train.py:715] (0/8) Epoch 8, batch 33350, loss[loss=0.1529, simple_loss=0.2347, pruned_loss=0.0356, over 4953.00 frames.], tot_loss[loss=0.1427, simple_loss=0.216, pruned_loss=0.03472, over 973693.08 frames.], batch size: 21, lr: 2.54e-04 2022-05-06 08:43:06,262 INFO [train.py:715] (0/8) Epoch 8, batch 33400, loss[loss=0.1565, simple_loss=0.233, pruned_loss=0.04004, over 4933.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2162, pruned_loss=0.03501, over 973502.35 frames.], batch size: 39, lr: 2.54e-04 2022-05-06 08:43:45,174 INFO [train.py:715] (0/8) Epoch 8, batch 33450, loss[loss=0.113, simple_loss=0.1888, pruned_loss=0.01855, over 4933.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2151, pruned_loss=0.03458, over 973160.45 frames.], batch size: 29, lr: 2.54e-04 2022-05-06 08:44:24,006 INFO [train.py:715] (0/8) Epoch 8, batch 33500, loss[loss=0.1216, simple_loss=0.198, pruned_loss=0.02266, over 4752.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2157, pruned_loss=0.03507, over 972404.24 frames.], batch size: 16, lr: 2.54e-04 2022-05-06 08:44:41,886 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-312000.pt 2022-05-06 08:45:05,010 INFO [train.py:715] (0/8) Epoch 8, batch 33550, loss[loss=0.1621, simple_loss=0.2396, pruned_loss=0.04227, over 4876.00 frames.], tot_loss[loss=0.143, simple_loss=0.2158, pruned_loss=0.03514, over 972050.77 frames.], batch size: 16, lr: 2.54e-04 2022-05-06 08:45:44,463 INFO [train.py:715] (0/8) Epoch 8, batch 33600, loss[loss=0.1676, simple_loss=0.2428, pruned_loss=0.04619, over 4694.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2153, pruned_loss=0.03484, over 971031.34 frames.], batch size: 15, lr: 2.54e-04 2022-05-06 08:46:23,907 INFO [train.py:715] (0/8) Epoch 8, batch 33650, loss[loss=0.2193, simple_loss=0.3072, pruned_loss=0.0657, over 4928.00 frames.], tot_loss[loss=0.1427, 
simple_loss=0.2156, pruned_loss=0.03488, over 971219.87 frames.], batch size: 23, lr: 2.54e-04 2022-05-06 08:47:02,972 INFO [train.py:715] (0/8) Epoch 8, batch 33700, loss[loss=0.1193, simple_loss=0.1964, pruned_loss=0.02114, over 4788.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2156, pruned_loss=0.03504, over 971241.37 frames.], batch size: 14, lr: 2.54e-04 2022-05-06 08:47:41,961 INFO [train.py:715] (0/8) Epoch 8, batch 33750, loss[loss=0.1416, simple_loss=0.2183, pruned_loss=0.03249, over 4868.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2149, pruned_loss=0.03471, over 971784.61 frames.], batch size: 20, lr: 2.54e-04 2022-05-06 08:48:20,683 INFO [train.py:715] (0/8) Epoch 8, batch 33800, loss[loss=0.1312, simple_loss=0.2075, pruned_loss=0.02746, over 4808.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2153, pruned_loss=0.03466, over 971887.48 frames.], batch size: 21, lr: 2.54e-04 2022-05-06 08:48:59,305 INFO [train.py:715] (0/8) Epoch 8, batch 33850, loss[loss=0.1533, simple_loss=0.2257, pruned_loss=0.04049, over 4949.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2155, pruned_loss=0.03464, over 972280.16 frames.], batch size: 29, lr: 2.54e-04 2022-05-06 08:49:38,115 INFO [train.py:715] (0/8) Epoch 8, batch 33900, loss[loss=0.1707, simple_loss=0.2368, pruned_loss=0.05231, over 4942.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2152, pruned_loss=0.03489, over 972215.06 frames.], batch size: 39, lr: 2.54e-04 2022-05-06 08:50:17,035 INFO [train.py:715] (0/8) Epoch 8, batch 33950, loss[loss=0.1279, simple_loss=0.1931, pruned_loss=0.03142, over 4861.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2149, pruned_loss=0.03479, over 972442.50 frames.], batch size: 20, lr: 2.54e-04 2022-05-06 08:50:56,631 INFO [train.py:715] (0/8) Epoch 8, batch 34000, loss[loss=0.2077, simple_loss=0.2663, pruned_loss=0.07454, over 4975.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2161, pruned_loss=0.03538, over 972907.75 frames.], batch size: 15, lr: 2.54e-04 2022-05-06 08:51:35,546 INFO [train.py:715] (0/8) Epoch 8, batch 34050, loss[loss=0.1946, simple_loss=0.2552, pruned_loss=0.06704, over 4954.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2162, pruned_loss=0.03551, over 972595.81 frames.], batch size: 39, lr: 2.54e-04 2022-05-06 08:52:14,816 INFO [train.py:715] (0/8) Epoch 8, batch 34100, loss[loss=0.146, simple_loss=0.2312, pruned_loss=0.03039, over 4699.00 frames.], tot_loss[loss=0.1435, simple_loss=0.216, pruned_loss=0.03547, over 972962.47 frames.], batch size: 15, lr: 2.54e-04 2022-05-06 08:52:53,780 INFO [train.py:715] (0/8) Epoch 8, batch 34150, loss[loss=0.1246, simple_loss=0.2017, pruned_loss=0.02379, over 4695.00 frames.], tot_loss[loss=0.1439, simple_loss=0.2157, pruned_loss=0.03603, over 972219.79 frames.], batch size: 15, lr: 2.54e-04 2022-05-06 08:53:32,396 INFO [train.py:715] (0/8) Epoch 8, batch 34200, loss[loss=0.1782, simple_loss=0.2446, pruned_loss=0.05592, over 4761.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2149, pruned_loss=0.03576, over 971814.65 frames.], batch size: 16, lr: 2.54e-04 2022-05-06 08:54:11,303 INFO [train.py:715] (0/8) Epoch 8, batch 34250, loss[loss=0.1508, simple_loss=0.2193, pruned_loss=0.04113, over 4847.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2144, pruned_loss=0.03507, over 972186.43 frames.], batch size: 30, lr: 2.54e-04 2022-05-06 08:54:50,277 INFO [train.py:715] (0/8) Epoch 8, batch 34300, loss[loss=0.1415, simple_loss=0.2162, pruned_loss=0.03338, over 4865.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2136, 
pruned_loss=0.03476, over 971919.17 frames.], batch size: 32, lr: 2.54e-04 2022-05-06 08:55:29,025 INFO [train.py:715] (0/8) Epoch 8, batch 34350, loss[loss=0.1397, simple_loss=0.2131, pruned_loss=0.03316, over 4768.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2133, pruned_loss=0.03426, over 971996.19 frames.], batch size: 19, lr: 2.54e-04 2022-05-06 08:56:07,453 INFO [train.py:715] (0/8) Epoch 8, batch 34400, loss[loss=0.1291, simple_loss=0.2096, pruned_loss=0.02431, over 4904.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2144, pruned_loss=0.03462, over 972464.26 frames.], batch size: 19, lr: 2.54e-04 2022-05-06 08:56:46,675 INFO [train.py:715] (0/8) Epoch 8, batch 34450, loss[loss=0.1228, simple_loss=0.1917, pruned_loss=0.02696, over 4909.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2139, pruned_loss=0.03441, over 971902.08 frames.], batch size: 18, lr: 2.54e-04 2022-05-06 08:57:26,047 INFO [train.py:715] (0/8) Epoch 8, batch 34500, loss[loss=0.1501, simple_loss=0.2303, pruned_loss=0.03492, over 4851.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2138, pruned_loss=0.0343, over 972255.46 frames.], batch size: 20, lr: 2.54e-04 2022-05-06 08:58:04,289 INFO [train.py:715] (0/8) Epoch 8, batch 34550, loss[loss=0.1182, simple_loss=0.1858, pruned_loss=0.02525, over 4992.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2145, pruned_loss=0.03411, over 972397.36 frames.], batch size: 14, lr: 2.54e-04 2022-05-06 08:58:42,924 INFO [train.py:715] (0/8) Epoch 8, batch 34600, loss[loss=0.1695, simple_loss=0.2511, pruned_loss=0.04393, over 4801.00 frames.], tot_loss[loss=0.141, simple_loss=0.2143, pruned_loss=0.03384, over 972500.14 frames.], batch size: 21, lr: 2.54e-04 2022-05-06 08:59:21,847 INFO [train.py:715] (0/8) Epoch 8, batch 34650, loss[loss=0.113, simple_loss=0.1867, pruned_loss=0.01962, over 4852.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2147, pruned_loss=0.03432, over 972735.19 frames.], batch size: 20, lr: 2.53e-04 2022-05-06 09:00:01,503 INFO [train.py:715] (0/8) Epoch 8, batch 34700, loss[loss=0.1529, simple_loss=0.2153, pruned_loss=0.04526, over 4854.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2143, pruned_loss=0.03464, over 971902.46 frames.], batch size: 30, lr: 2.53e-04 2022-05-06 09:00:38,661 INFO [train.py:715] (0/8) Epoch 8, batch 34750, loss[loss=0.1628, simple_loss=0.235, pruned_loss=0.04523, over 4848.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2152, pruned_loss=0.03495, over 971263.11 frames.], batch size: 20, lr: 2.53e-04 2022-05-06 09:01:15,261 INFO [train.py:715] (0/8) Epoch 8, batch 34800, loss[loss=0.1317, simple_loss=0.1927, pruned_loss=0.03539, over 4756.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2139, pruned_loss=0.03514, over 970571.91 frames.], batch size: 12, lr: 2.53e-04 2022-05-06 09:01:23,627 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/epoch-8.pt 2022-05-06 09:02:04,636 INFO [train.py:715] (0/8) Epoch 9, batch 0, loss[loss=0.1389, simple_loss=0.2133, pruned_loss=0.03229, over 4865.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2133, pruned_loss=0.03229, over 4865.00 frames.], batch size: 20, lr: 2.42e-04 2022-05-06 09:02:43,973 INFO [train.py:715] (0/8) Epoch 9, batch 50, loss[loss=0.132, simple_loss=0.2116, pruned_loss=0.02621, over 4929.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2151, pruned_loss=0.03507, over 218922.19 frames.], batch size: 29, lr: 2.41e-04 2022-05-06 09:03:23,609 INFO [train.py:715] (0/8) Epoch 9, batch 100, loss[loss=0.1655, simple_loss=0.2333, 
pruned_loss=0.04888, over 4815.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2137, pruned_loss=0.03471, over 385518.26 frames.], batch size: 26, lr: 2.41e-04 2022-05-06 09:04:02,101 INFO [train.py:715] (0/8) Epoch 9, batch 150, loss[loss=0.1218, simple_loss=0.1894, pruned_loss=0.02713, over 4793.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2149, pruned_loss=0.03542, over 515619.38 frames.], batch size: 17, lr: 2.41e-04 2022-05-06 09:04:42,540 INFO [train.py:715] (0/8) Epoch 9, batch 200, loss[loss=0.1396, simple_loss=0.2138, pruned_loss=0.0327, over 4981.00 frames.], tot_loss[loss=0.142, simple_loss=0.2144, pruned_loss=0.03484, over 618021.85 frames.], batch size: 14, lr: 2.41e-04 2022-05-06 09:05:21,803 INFO [train.py:715] (0/8) Epoch 9, batch 250, loss[loss=0.1275, simple_loss=0.1933, pruned_loss=0.03083, over 4859.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2156, pruned_loss=0.03513, over 696493.17 frames.], batch size: 30, lr: 2.41e-04 2022-05-06 09:06:01,094 INFO [train.py:715] (0/8) Epoch 9, batch 300, loss[loss=0.1378, simple_loss=0.2087, pruned_loss=0.03349, over 4800.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2155, pruned_loss=0.03538, over 757480.08 frames.], batch size: 24, lr: 2.41e-04 2022-05-06 09:06:40,654 INFO [train.py:715] (0/8) Epoch 9, batch 350, loss[loss=0.1298, simple_loss=0.2065, pruned_loss=0.02655, over 4888.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2158, pruned_loss=0.03549, over 804597.05 frames.], batch size: 16, lr: 2.41e-04 2022-05-06 09:07:20,401 INFO [train.py:715] (0/8) Epoch 9, batch 400, loss[loss=0.1581, simple_loss=0.2357, pruned_loss=0.0402, over 4817.00 frames.], tot_loss[loss=0.1443, simple_loss=0.2164, pruned_loss=0.03609, over 842527.93 frames.], batch size: 13, lr: 2.41e-04 2022-05-06 09:07:59,734 INFO [train.py:715] (0/8) Epoch 9, batch 450, loss[loss=0.1148, simple_loss=0.1898, pruned_loss=0.01988, over 4879.00 frames.], tot_loss[loss=0.1446, simple_loss=0.2165, pruned_loss=0.03638, over 871581.02 frames.], batch size: 22, lr: 2.41e-04 2022-05-06 09:08:38,886 INFO [train.py:715] (0/8) Epoch 9, batch 500, loss[loss=0.1178, simple_loss=0.1884, pruned_loss=0.02356, over 4900.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2156, pruned_loss=0.03561, over 894977.33 frames.], batch size: 18, lr: 2.41e-04 2022-05-06 09:09:19,203 INFO [train.py:715] (0/8) Epoch 9, batch 550, loss[loss=0.1597, simple_loss=0.2268, pruned_loss=0.04631, over 4984.00 frames.], tot_loss[loss=0.1435, simple_loss=0.216, pruned_loss=0.03554, over 912540.79 frames.], batch size: 24, lr: 2.41e-04 2022-05-06 09:09:58,806 INFO [train.py:715] (0/8) Epoch 9, batch 600, loss[loss=0.1572, simple_loss=0.2187, pruned_loss=0.04786, over 4793.00 frames.], tot_loss[loss=0.1447, simple_loss=0.2169, pruned_loss=0.0362, over 925394.12 frames.], batch size: 24, lr: 2.41e-04 2022-05-06 09:10:37,826 INFO [train.py:715] (0/8) Epoch 9, batch 650, loss[loss=0.1618, simple_loss=0.2251, pruned_loss=0.04928, over 4775.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2163, pruned_loss=0.03555, over 935512.41 frames.], batch size: 14, lr: 2.41e-04 2022-05-06 09:11:16,916 INFO [train.py:715] (0/8) Epoch 9, batch 700, loss[loss=0.1757, simple_loss=0.2298, pruned_loss=0.06082, over 4760.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2163, pruned_loss=0.03563, over 943175.24 frames.], batch size: 19, lr: 2.41e-04 2022-05-06 09:11:56,397 INFO [train.py:715] (0/8) Epoch 9, batch 750, loss[loss=0.1442, simple_loss=0.2109, pruned_loss=0.0388, over 4689.00 frames.], 
tot_loss[loss=0.1438, simple_loss=0.216, pruned_loss=0.0358, over 949575.40 frames.], batch size: 15, lr: 2.41e-04 2022-05-06 09:12:35,539 INFO [train.py:715] (0/8) Epoch 9, batch 800, loss[loss=0.1196, simple_loss=0.1941, pruned_loss=0.02251, over 4922.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2148, pruned_loss=0.03491, over 955009.36 frames.], batch size: 18, lr: 2.41e-04 2022-05-06 09:13:14,319 INFO [train.py:715] (0/8) Epoch 9, batch 850, loss[loss=0.1024, simple_loss=0.174, pruned_loss=0.01543, over 4907.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2143, pruned_loss=0.0346, over 959045.22 frames.], batch size: 17, lr: 2.41e-04 2022-05-06 09:13:53,318 INFO [train.py:715] (0/8) Epoch 9, batch 900, loss[loss=0.1169, simple_loss=0.1965, pruned_loss=0.01865, over 4742.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2152, pruned_loss=0.03505, over 962389.97 frames.], batch size: 16, lr: 2.41e-04 2022-05-06 09:14:32,594 INFO [train.py:715] (0/8) Epoch 9, batch 950, loss[loss=0.1752, simple_loss=0.2463, pruned_loss=0.05204, over 4904.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2148, pruned_loss=0.03484, over 965401.14 frames.], batch size: 19, lr: 2.41e-04 2022-05-06 09:15:12,210 INFO [train.py:715] (0/8) Epoch 9, batch 1000, loss[loss=0.1641, simple_loss=0.2336, pruned_loss=0.04735, over 4932.00 frames.], tot_loss[loss=0.143, simple_loss=0.2151, pruned_loss=0.03541, over 966722.88 frames.], batch size: 18, lr: 2.41e-04 2022-05-06 09:15:50,367 INFO [train.py:715] (0/8) Epoch 9, batch 1050, loss[loss=0.1306, simple_loss=0.2077, pruned_loss=0.02681, over 4821.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2156, pruned_loss=0.03576, over 967371.58 frames.], batch size: 26, lr: 2.41e-04 2022-05-06 09:16:30,510 INFO [train.py:715] (0/8) Epoch 9, batch 1100, loss[loss=0.128, simple_loss=0.2049, pruned_loss=0.02551, over 4838.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2155, pruned_loss=0.03593, over 967576.44 frames.], batch size: 20, lr: 2.41e-04 2022-05-06 09:17:10,346 INFO [train.py:715] (0/8) Epoch 9, batch 1150, loss[loss=0.1461, simple_loss=0.2219, pruned_loss=0.03518, over 4911.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2156, pruned_loss=0.03574, over 968032.55 frames.], batch size: 18, lr: 2.41e-04 2022-05-06 09:17:49,482 INFO [train.py:715] (0/8) Epoch 9, batch 1200, loss[loss=0.1765, simple_loss=0.2409, pruned_loss=0.05602, over 4890.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2151, pruned_loss=0.03554, over 967944.34 frames.], batch size: 22, lr: 2.41e-04 2022-05-06 09:18:28,818 INFO [train.py:715] (0/8) Epoch 9, batch 1250, loss[loss=0.1225, simple_loss=0.1956, pruned_loss=0.02473, over 4795.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2158, pruned_loss=0.03578, over 968792.39 frames.], batch size: 24, lr: 2.41e-04 2022-05-06 09:19:08,559 INFO [train.py:715] (0/8) Epoch 9, batch 1300, loss[loss=0.09651, simple_loss=0.1696, pruned_loss=0.01174, over 4847.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2148, pruned_loss=0.03545, over 968879.49 frames.], batch size: 12, lr: 2.41e-04 2022-05-06 09:19:48,098 INFO [train.py:715] (0/8) Epoch 9, batch 1350, loss[loss=0.1456, simple_loss=0.2232, pruned_loss=0.03397, over 4865.00 frames.], tot_loss[loss=0.1431, simple_loss=0.215, pruned_loss=0.03565, over 970152.41 frames.], batch size: 16, lr: 2.41e-04 2022-05-06 09:20:26,895 INFO [train.py:715] (0/8) Epoch 9, batch 1400, loss[loss=0.1216, simple_loss=0.1923, pruned_loss=0.02541, over 4931.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2145, 
pruned_loss=0.03502, over 970486.21 frames.], batch size: 23, lr: 2.41e-04 2022-05-06 09:21:06,499 INFO [train.py:715] (0/8) Epoch 9, batch 1450, loss[loss=0.1405, simple_loss=0.2002, pruned_loss=0.04047, over 4883.00 frames.], tot_loss[loss=0.1419, simple_loss=0.214, pruned_loss=0.03489, over 971069.10 frames.], batch size: 32, lr: 2.41e-04 2022-05-06 09:21:45,309 INFO [train.py:715] (0/8) Epoch 9, batch 1500, loss[loss=0.1281, simple_loss=0.204, pruned_loss=0.02611, over 4893.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2136, pruned_loss=0.03438, over 971561.16 frames.], batch size: 22, lr: 2.41e-04 2022-05-06 09:22:24,143 INFO [train.py:715] (0/8) Epoch 9, batch 1550, loss[loss=0.1974, simple_loss=0.2705, pruned_loss=0.06215, over 4935.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2148, pruned_loss=0.0347, over 971069.72 frames.], batch size: 21, lr: 2.41e-04 2022-05-06 09:23:03,177 INFO [train.py:715] (0/8) Epoch 9, batch 1600, loss[loss=0.1353, simple_loss=0.2061, pruned_loss=0.03231, over 4823.00 frames.], tot_loss[loss=0.142, simple_loss=0.2146, pruned_loss=0.03474, over 970921.62 frames.], batch size: 15, lr: 2.41e-04 2022-05-06 09:23:42,083 INFO [train.py:715] (0/8) Epoch 9, batch 1650, loss[loss=0.1473, simple_loss=0.2139, pruned_loss=0.0403, over 4819.00 frames.], tot_loss[loss=0.142, simple_loss=0.2148, pruned_loss=0.03463, over 971714.64 frames.], batch size: 25, lr: 2.41e-04 2022-05-06 09:24:21,076 INFO [train.py:715] (0/8) Epoch 9, batch 1700, loss[loss=0.1578, simple_loss=0.2188, pruned_loss=0.04835, over 4868.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2151, pruned_loss=0.03509, over 972331.94 frames.], batch size: 20, lr: 2.41e-04 2022-05-06 09:25:00,146 INFO [train.py:715] (0/8) Epoch 9, batch 1750, loss[loss=0.1468, simple_loss=0.2142, pruned_loss=0.03969, over 4963.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2151, pruned_loss=0.03518, over 971270.83 frames.], batch size: 35, lr: 2.41e-04 2022-05-06 09:25:39,673 INFO [train.py:715] (0/8) Epoch 9, batch 1800, loss[loss=0.1154, simple_loss=0.1838, pruned_loss=0.02352, over 4856.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2144, pruned_loss=0.03511, over 971164.19 frames.], batch size: 15, lr: 2.41e-04 2022-05-06 09:26:18,854 INFO [train.py:715] (0/8) Epoch 9, batch 1850, loss[loss=0.1293, simple_loss=0.1975, pruned_loss=0.03054, over 4925.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2148, pruned_loss=0.03523, over 971011.12 frames.], batch size: 29, lr: 2.41e-04 2022-05-06 09:26:57,984 INFO [train.py:715] (0/8) Epoch 9, batch 1900, loss[loss=0.1227, simple_loss=0.1902, pruned_loss=0.02761, over 4977.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2144, pruned_loss=0.03545, over 971174.92 frames.], batch size: 28, lr: 2.41e-04 2022-05-06 09:27:37,985 INFO [train.py:715] (0/8) Epoch 9, batch 1950, loss[loss=0.129, simple_loss=0.2143, pruned_loss=0.02181, over 4988.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2141, pruned_loss=0.0351, over 971256.84 frames.], batch size: 28, lr: 2.41e-04 2022-05-06 09:28:17,667 INFO [train.py:715] (0/8) Epoch 9, batch 2000, loss[loss=0.1453, simple_loss=0.2135, pruned_loss=0.03857, over 4772.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2145, pruned_loss=0.03521, over 971389.33 frames.], batch size: 14, lr: 2.41e-04 2022-05-06 09:28:56,802 INFO [train.py:715] (0/8) Epoch 9, batch 2050, loss[loss=0.1412, simple_loss=0.2148, pruned_loss=0.03377, over 4761.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2137, pruned_loss=0.0344, over 971426.42 frames.], batch 
size: 14, lr: 2.41e-04 2022-05-06 09:29:35,323 INFO [train.py:715] (0/8) Epoch 9, batch 2100, loss[loss=0.1442, simple_loss=0.2336, pruned_loss=0.02739, over 4793.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2139, pruned_loss=0.03434, over 971905.93 frames.], batch size: 18, lr: 2.41e-04 2022-05-06 09:30:14,645 INFO [train.py:715] (0/8) Epoch 9, batch 2150, loss[loss=0.1399, simple_loss=0.2212, pruned_loss=0.02934, over 4815.00 frames.], tot_loss[loss=0.1415, simple_loss=0.214, pruned_loss=0.03452, over 972123.95 frames.], batch size: 26, lr: 2.41e-04 2022-05-06 09:30:53,732 INFO [train.py:715] (0/8) Epoch 9, batch 2200, loss[loss=0.1117, simple_loss=0.1832, pruned_loss=0.02013, over 4736.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2138, pruned_loss=0.03455, over 972155.78 frames.], batch size: 12, lr: 2.41e-04 2022-05-06 09:31:32,490 INFO [train.py:715] (0/8) Epoch 9, batch 2250, loss[loss=0.1296, simple_loss=0.2131, pruned_loss=0.02303, over 4805.00 frames.], tot_loss[loss=0.1418, simple_loss=0.214, pruned_loss=0.03481, over 971279.54 frames.], batch size: 25, lr: 2.41e-04 2022-05-06 09:32:11,656 INFO [train.py:715] (0/8) Epoch 9, batch 2300, loss[loss=0.1383, simple_loss=0.2046, pruned_loss=0.03603, over 4950.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2134, pruned_loss=0.03452, over 970956.46 frames.], batch size: 35, lr: 2.41e-04 2022-05-06 09:32:50,735 INFO [train.py:715] (0/8) Epoch 9, batch 2350, loss[loss=0.1502, simple_loss=0.2083, pruned_loss=0.04599, over 4710.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2131, pruned_loss=0.03406, over 970959.28 frames.], batch size: 15, lr: 2.41e-04 2022-05-06 09:33:30,101 INFO [train.py:715] (0/8) Epoch 9, batch 2400, loss[loss=0.1351, simple_loss=0.2134, pruned_loss=0.02842, over 4810.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2123, pruned_loss=0.03365, over 970930.41 frames.], batch size: 21, lr: 2.41e-04 2022-05-06 09:34:08,885 INFO [train.py:715] (0/8) Epoch 9, batch 2450, loss[loss=0.1236, simple_loss=0.1998, pruned_loss=0.02375, over 4928.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2129, pruned_loss=0.03408, over 971344.93 frames.], batch size: 29, lr: 2.41e-04 2022-05-06 09:34:48,498 INFO [train.py:715] (0/8) Epoch 9, batch 2500, loss[loss=0.1271, simple_loss=0.2018, pruned_loss=0.02621, over 4806.00 frames.], tot_loss[loss=0.1413, simple_loss=0.214, pruned_loss=0.03434, over 971715.55 frames.], batch size: 12, lr: 2.41e-04 2022-05-06 09:35:27,020 INFO [train.py:715] (0/8) Epoch 9, batch 2550, loss[loss=0.1232, simple_loss=0.1934, pruned_loss=0.02648, over 4785.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2139, pruned_loss=0.03441, over 972846.28 frames.], batch size: 21, lr: 2.41e-04 2022-05-06 09:36:06,037 INFO [train.py:715] (0/8) Epoch 9, batch 2600, loss[loss=0.1607, simple_loss=0.227, pruned_loss=0.04724, over 4841.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2139, pruned_loss=0.03439, over 973438.27 frames.], batch size: 32, lr: 2.41e-04 2022-05-06 09:36:45,108 INFO [train.py:715] (0/8) Epoch 9, batch 2650, loss[loss=0.1206, simple_loss=0.1936, pruned_loss=0.02382, over 4879.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2141, pruned_loss=0.03462, over 972419.59 frames.], batch size: 16, lr: 2.41e-04 2022-05-06 09:37:24,473 INFO [train.py:715] (0/8) Epoch 9, batch 2700, loss[loss=0.1595, simple_loss=0.2375, pruned_loss=0.04077, over 4887.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2147, pruned_loss=0.03486, over 971031.20 frames.], batch size: 22, lr: 2.40e-04 2022-05-06 09:38:03,292 
INFO [train.py:715] (0/8) Epoch 9, batch 2750, loss[loss=0.1279, simple_loss=0.1963, pruned_loss=0.02977, over 4689.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2145, pruned_loss=0.03485, over 970645.43 frames.], batch size: 15, lr: 2.40e-04 2022-05-06 09:38:42,264 INFO [train.py:715] (0/8) Epoch 9, batch 2800, loss[loss=0.133, simple_loss=0.202, pruned_loss=0.03199, over 4978.00 frames.], tot_loss[loss=0.1419, simple_loss=0.2145, pruned_loss=0.03464, over 971048.08 frames.], batch size: 28, lr: 2.40e-04 2022-05-06 09:39:21,841 INFO [train.py:715] (0/8) Epoch 9, batch 2850, loss[loss=0.1342, simple_loss=0.2024, pruned_loss=0.03303, over 4951.00 frames.], tot_loss[loss=0.141, simple_loss=0.2137, pruned_loss=0.03413, over 971801.93 frames.], batch size: 29, lr: 2.40e-04 2022-05-06 09:40:00,909 INFO [train.py:715] (0/8) Epoch 9, batch 2900, loss[loss=0.1837, simple_loss=0.2472, pruned_loss=0.06011, over 4975.00 frames.], tot_loss[loss=0.1415, simple_loss=0.214, pruned_loss=0.03454, over 971496.53 frames.], batch size: 15, lr: 2.40e-04 2022-05-06 09:40:39,677 INFO [train.py:715] (0/8) Epoch 9, batch 2950, loss[loss=0.1213, simple_loss=0.1889, pruned_loss=0.02688, over 4969.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2133, pruned_loss=0.03417, over 971789.73 frames.], batch size: 35, lr: 2.40e-04 2022-05-06 09:41:18,903 INFO [train.py:715] (0/8) Epoch 9, batch 3000, loss[loss=0.1218, simple_loss=0.2022, pruned_loss=0.02074, over 4843.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2137, pruned_loss=0.03433, over 971675.44 frames.], batch size: 20, lr: 2.40e-04 2022-05-06 09:41:18,903 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 09:41:28,535 INFO [train.py:742] (0/8) Epoch 9, validation: loss=0.1069, simple_loss=0.1915, pruned_loss=0.01118, over 914524.00 frames. 
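Note on the tot_loss frame counts: within an epoch the statistics are reported over a frame count that starts small (4865.00 frames at Epoch 9, batch 0, about 2.2e5 by batch 50) and then levels off near 9.7e5 frames instead of growing for the whole epoch. That pattern is what a decaying running sum produces: each new batch is added to statistics that are first shrunk by a constant factor, so the printed value tracks recent training loss rather than the full-epoch average. A minimal sketch of such an accumulator follows; the decay constant and the class are illustrative assumptions, not the recipe's actual code.

```python
# Illustrative decaying accumulator for the per-frame "tot_loss" statistics.
# With decay = 1 - 1/200 the steady-state frame count is roughly
# 200 * (frames per batch) ~= 9.7e5 for ~4.9e3-frame batches, which is the
# level the frame counts above settle at. The decay constant and the class
# itself are assumptions for illustration, not the recipe's actual code.
class RunningLoss:
    def __init__(self, decay: float = 1.0 - 1.0 / 200):
        self.decay = decay
        self.loss_sum = 0.0
        self.frames = 0.0

    def update(self, batch_loss_sum: float, batch_frames: float) -> None:
        # Shrink the old statistics, then add the new batch.
        self.loss_sum = self.loss_sum * self.decay + batch_loss_sum
        self.frames = self.frames * self.decay + batch_frames

    @property
    def loss(self) -> float:
        # Per-frame loss, the quantity printed as tot_loss[loss=...].
        return self.loss_sum / max(self.frames, 1.0)

# Usage mirroring the first entry of an epoch (one batch seen so far):
tracker = RunningLoss()
tracker.update(batch_loss_sum=0.1389 * 4865.0, batch_frames=4865.0)
print(f"loss={tracker.loss:.4f}, over {tracker.frames:.2f} frames")
```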
2022-05-06 09:42:08,248 INFO [train.py:715] (0/8) Epoch 9, batch 3050, loss[loss=0.1289, simple_loss=0.1971, pruned_loss=0.03037, over 4814.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2131, pruned_loss=0.03413, over 972325.20 frames.], batch size: 13, lr: 2.40e-04 2022-05-06 09:42:47,738 INFO [train.py:715] (0/8) Epoch 9, batch 3100, loss[loss=0.1381, simple_loss=0.2049, pruned_loss=0.03565, over 4831.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2127, pruned_loss=0.0343, over 971806.18 frames.], batch size: 15, lr: 2.40e-04 2022-05-06 09:43:27,212 INFO [train.py:715] (0/8) Epoch 9, batch 3150, loss[loss=0.1266, simple_loss=0.2027, pruned_loss=0.02523, over 4817.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2135, pruned_loss=0.0346, over 972516.17 frames.], batch size: 26, lr: 2.40e-04 2022-05-06 09:44:06,425 INFO [train.py:715] (0/8) Epoch 9, batch 3200, loss[loss=0.1581, simple_loss=0.2244, pruned_loss=0.04587, over 4798.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2132, pruned_loss=0.03448, over 971666.76 frames.], batch size: 13, lr: 2.40e-04 2022-05-06 09:44:45,579 INFO [train.py:715] (0/8) Epoch 9, batch 3250, loss[loss=0.1286, simple_loss=0.2026, pruned_loss=0.02731, over 4976.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2138, pruned_loss=0.0348, over 972010.93 frames.], batch size: 14, lr: 2.40e-04 2022-05-06 09:45:24,838 INFO [train.py:715] (0/8) Epoch 9, batch 3300, loss[loss=0.1353, simple_loss=0.2236, pruned_loss=0.02346, over 4962.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2135, pruned_loss=0.03407, over 971821.87 frames.], batch size: 15, lr: 2.40e-04 2022-05-06 09:46:03,659 INFO [train.py:715] (0/8) Epoch 9, batch 3350, loss[loss=0.1595, simple_loss=0.2351, pruned_loss=0.04199, over 4918.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2138, pruned_loss=0.03445, over 972731.89 frames.], batch size: 17, lr: 2.40e-04 2022-05-06 09:46:42,960 INFO [train.py:715] (0/8) Epoch 9, batch 3400, loss[loss=0.1533, simple_loss=0.226, pruned_loss=0.04029, over 4983.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2137, pruned_loss=0.03477, over 973008.48 frames.], batch size: 27, lr: 2.40e-04 2022-05-06 09:47:22,069 INFO [train.py:715] (0/8) Epoch 9, batch 3450, loss[loss=0.1196, simple_loss=0.1844, pruned_loss=0.02744, over 4820.00 frames.], tot_loss[loss=0.1419, simple_loss=0.2141, pruned_loss=0.03484, over 973104.24 frames.], batch size: 12, lr: 2.40e-04 2022-05-06 09:48:00,720 INFO [train.py:715] (0/8) Epoch 9, batch 3500, loss[loss=0.1342, simple_loss=0.2131, pruned_loss=0.02764, over 4802.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2137, pruned_loss=0.03453, over 972743.51 frames.], batch size: 21, lr: 2.40e-04 2022-05-06 09:48:40,284 INFO [train.py:715] (0/8) Epoch 9, batch 3550, loss[loss=0.1208, simple_loss=0.1868, pruned_loss=0.02739, over 4820.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2138, pruned_loss=0.03434, over 973368.98 frames.], batch size: 12, lr: 2.40e-04 2022-05-06 09:49:19,725 INFO [train.py:715] (0/8) Epoch 9, batch 3600, loss[loss=0.1446, simple_loss=0.2172, pruned_loss=0.03605, over 4761.00 frames.], tot_loss[loss=0.141, simple_loss=0.2136, pruned_loss=0.03419, over 972727.75 frames.], batch size: 16, lr: 2.40e-04 2022-05-06 09:49:59,014 INFO [train.py:715] (0/8) Epoch 9, batch 3650, loss[loss=0.1605, simple_loss=0.2508, pruned_loss=0.03509, over 4924.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2137, pruned_loss=0.034, over 972377.68 frames.], batch size: 23, lr: 2.40e-04 2022-05-06 09:50:37,658 INFO [train.py:715] (0/8) 
Epoch 9, batch 3700, loss[loss=0.1342, simple_loss=0.2105, pruned_loss=0.02893, over 4904.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2147, pruned_loss=0.03474, over 972263.99 frames.], batch size: 19, lr: 2.40e-04 2022-05-06 09:51:17,145 INFO [train.py:715] (0/8) Epoch 9, batch 3750, loss[loss=0.1153, simple_loss=0.1864, pruned_loss=0.02209, over 4981.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2142, pruned_loss=0.03468, over 972497.05 frames.], batch size: 15, lr: 2.40e-04 2022-05-06 09:51:56,917 INFO [train.py:715] (0/8) Epoch 9, batch 3800, loss[loss=0.1172, simple_loss=0.1916, pruned_loss=0.02143, over 4983.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2139, pruned_loss=0.03457, over 971819.11 frames.], batch size: 28, lr: 2.40e-04 2022-05-06 09:52:35,336 INFO [train.py:715] (0/8) Epoch 9, batch 3850, loss[loss=0.1829, simple_loss=0.2629, pruned_loss=0.05143, over 4941.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2142, pruned_loss=0.03504, over 972643.54 frames.], batch size: 21, lr: 2.40e-04 2022-05-06 09:53:14,340 INFO [train.py:715] (0/8) Epoch 9, batch 3900, loss[loss=0.1852, simple_loss=0.2579, pruned_loss=0.05623, over 4853.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2149, pruned_loss=0.03508, over 973141.87 frames.], batch size: 20, lr: 2.40e-04 2022-05-06 09:53:53,824 INFO [train.py:715] (0/8) Epoch 9, batch 3950, loss[loss=0.1067, simple_loss=0.1773, pruned_loss=0.01804, over 4988.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2147, pruned_loss=0.03483, over 973604.88 frames.], batch size: 14, lr: 2.40e-04 2022-05-06 09:54:33,399 INFO [train.py:715] (0/8) Epoch 9, batch 4000, loss[loss=0.1431, simple_loss=0.2095, pruned_loss=0.03834, over 4836.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2141, pruned_loss=0.03443, over 973646.51 frames.], batch size: 32, lr: 2.40e-04 2022-05-06 09:55:12,126 INFO [train.py:715] (0/8) Epoch 9, batch 4050, loss[loss=0.1246, simple_loss=0.1988, pruned_loss=0.0252, over 4849.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2143, pruned_loss=0.03437, over 973681.65 frames.], batch size: 13, lr: 2.40e-04 2022-05-06 09:55:52,104 INFO [train.py:715] (0/8) Epoch 9, batch 4100, loss[loss=0.1318, simple_loss=0.2125, pruned_loss=0.02558, over 4820.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2141, pruned_loss=0.03437, over 972064.32 frames.], batch size: 26, lr: 2.40e-04 2022-05-06 09:56:30,805 INFO [train.py:715] (0/8) Epoch 9, batch 4150, loss[loss=0.1559, simple_loss=0.2177, pruned_loss=0.04708, over 4976.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2144, pruned_loss=0.03453, over 972747.29 frames.], batch size: 35, lr: 2.40e-04 2022-05-06 09:57:10,157 INFO [train.py:715] (0/8) Epoch 9, batch 4200, loss[loss=0.1713, simple_loss=0.2493, pruned_loss=0.0466, over 4807.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2141, pruned_loss=0.03462, over 972326.66 frames.], batch size: 26, lr: 2.40e-04 2022-05-06 09:57:49,720 INFO [train.py:715] (0/8) Epoch 9, batch 4250, loss[loss=0.128, simple_loss=0.2074, pruned_loss=0.02426, over 4778.00 frames.], tot_loss[loss=0.1415, simple_loss=0.214, pruned_loss=0.03454, over 972713.55 frames.], batch size: 18, lr: 2.40e-04 2022-05-06 09:58:29,616 INFO [train.py:715] (0/8) Epoch 9, batch 4300, loss[loss=0.1347, simple_loss=0.2067, pruned_loss=0.03128, over 4861.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2143, pruned_loss=0.03434, over 972315.92 frames.], batch size: 34, lr: 2.40e-04 2022-05-06 09:59:09,596 INFO [train.py:715] (0/8) Epoch 9, batch 4350, loss[loss=0.1361, 
simple_loss=0.2122, pruned_loss=0.03003, over 4821.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2148, pruned_loss=0.03445, over 971584.65 frames.], batch size: 26, lr: 2.40e-04 2022-05-06 09:59:48,191 INFO [train.py:715] (0/8) Epoch 9, batch 4400, loss[loss=0.1806, simple_loss=0.2503, pruned_loss=0.05543, over 4723.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2153, pruned_loss=0.03502, over 971716.67 frames.], batch size: 16, lr: 2.40e-04 2022-05-06 10:00:27,688 INFO [train.py:715] (0/8) Epoch 9, batch 4450, loss[loss=0.1562, simple_loss=0.2324, pruned_loss=0.03993, over 4644.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2158, pruned_loss=0.03538, over 972178.47 frames.], batch size: 13, lr: 2.40e-04 2022-05-06 10:01:06,480 INFO [train.py:715] (0/8) Epoch 9, batch 4500, loss[loss=0.1348, simple_loss=0.2126, pruned_loss=0.02846, over 4752.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2148, pruned_loss=0.03472, over 972809.89 frames.], batch size: 19, lr: 2.40e-04 2022-05-06 10:01:45,447 INFO [train.py:715] (0/8) Epoch 9, batch 4550, loss[loss=0.1351, simple_loss=0.2021, pruned_loss=0.03405, over 4858.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2153, pruned_loss=0.03494, over 973265.22 frames.], batch size: 30, lr: 2.40e-04 2022-05-06 10:02:24,723 INFO [train.py:715] (0/8) Epoch 9, batch 4600, loss[loss=0.1304, simple_loss=0.1986, pruned_loss=0.03115, over 4825.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2141, pruned_loss=0.03449, over 973430.35 frames.], batch size: 12, lr: 2.40e-04 2022-05-06 10:03:04,291 INFO [train.py:715] (0/8) Epoch 9, batch 4650, loss[loss=0.1508, simple_loss=0.2296, pruned_loss=0.036, over 4756.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2141, pruned_loss=0.03473, over 972999.06 frames.], batch size: 19, lr: 2.40e-04 2022-05-06 10:03:43,902 INFO [train.py:715] (0/8) Epoch 9, batch 4700, loss[loss=0.1428, simple_loss=0.208, pruned_loss=0.03875, over 4633.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2148, pruned_loss=0.03522, over 972386.67 frames.], batch size: 13, lr: 2.40e-04 2022-05-06 10:04:22,846 INFO [train.py:715] (0/8) Epoch 9, batch 4750, loss[loss=0.1377, simple_loss=0.2186, pruned_loss=0.02837, over 4829.00 frames.], tot_loss[loss=0.1426, simple_loss=0.215, pruned_loss=0.03511, over 972820.83 frames.], batch size: 27, lr: 2.40e-04 2022-05-06 10:05:02,421 INFO [train.py:715] (0/8) Epoch 9, batch 4800, loss[loss=0.1388, simple_loss=0.1966, pruned_loss=0.04043, over 4968.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2146, pruned_loss=0.03519, over 973075.30 frames.], batch size: 14, lr: 2.40e-04 2022-05-06 10:05:41,418 INFO [train.py:715] (0/8) Epoch 9, batch 4850, loss[loss=0.1165, simple_loss=0.1946, pruned_loss=0.0192, over 4940.00 frames.], tot_loss[loss=0.142, simple_loss=0.2142, pruned_loss=0.03488, over 972978.88 frames.], batch size: 21, lr: 2.40e-04 2022-05-06 10:06:20,852 INFO [train.py:715] (0/8) Epoch 9, batch 4900, loss[loss=0.1604, simple_loss=0.2286, pruned_loss=0.04613, over 4981.00 frames.], tot_loss[loss=0.1419, simple_loss=0.214, pruned_loss=0.0349, over 973278.60 frames.], batch size: 15, lr: 2.40e-04 2022-05-06 10:06:59,736 INFO [train.py:715] (0/8) Epoch 9, batch 4950, loss[loss=0.1641, simple_loss=0.2386, pruned_loss=0.04477, over 4847.00 frames.], tot_loss[loss=0.1419, simple_loss=0.2144, pruned_loss=0.03471, over 972979.06 frames.], batch size: 30, lr: 2.40e-04 2022-05-06 10:07:39,114 INFO [train.py:715] (0/8) Epoch 9, batch 5000, loss[loss=0.1528, simple_loss=0.2168, pruned_loss=0.04441, over 
4915.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2151, pruned_loss=0.03518, over 972797.23 frames.], batch size: 17, lr: 2.40e-04 2022-05-06 10:08:18,415 INFO [train.py:715] (0/8) Epoch 9, batch 5050, loss[loss=0.1106, simple_loss=0.1751, pruned_loss=0.02302, over 4937.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2146, pruned_loss=0.03479, over 972379.14 frames.], batch size: 29, lr: 2.40e-04 2022-05-06 10:08:57,171 INFO [train.py:715] (0/8) Epoch 9, batch 5100, loss[loss=0.1321, simple_loss=0.2063, pruned_loss=0.02896, over 4972.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2152, pruned_loss=0.0351, over 973593.77 frames.], batch size: 15, lr: 2.40e-04 2022-05-06 10:09:36,558 INFO [train.py:715] (0/8) Epoch 9, batch 5150, loss[loss=0.1428, simple_loss=0.2201, pruned_loss=0.03269, over 4990.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2143, pruned_loss=0.03457, over 973766.63 frames.], batch size: 28, lr: 2.40e-04 2022-05-06 10:10:15,464 INFO [train.py:715] (0/8) Epoch 9, batch 5200, loss[loss=0.1707, simple_loss=0.2372, pruned_loss=0.05211, over 4825.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2146, pruned_loss=0.03442, over 973545.26 frames.], batch size: 26, lr: 2.40e-04 2022-05-06 10:10:54,748 INFO [train.py:715] (0/8) Epoch 9, batch 5250, loss[loss=0.141, simple_loss=0.2145, pruned_loss=0.03376, over 4748.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2147, pruned_loss=0.03444, over 972949.67 frames.], batch size: 19, lr: 2.40e-04 2022-05-06 10:11:33,953 INFO [train.py:715] (0/8) Epoch 9, batch 5300, loss[loss=0.1426, simple_loss=0.2076, pruned_loss=0.03886, over 4870.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2154, pruned_loss=0.03513, over 972503.59 frames.], batch size: 32, lr: 2.39e-04 2022-05-06 10:12:13,442 INFO [train.py:715] (0/8) Epoch 9, batch 5350, loss[loss=0.1442, simple_loss=0.211, pruned_loss=0.03865, over 4844.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2154, pruned_loss=0.035, over 971794.94 frames.], batch size: 30, lr: 2.39e-04 2022-05-06 10:12:52,103 INFO [train.py:715] (0/8) Epoch 9, batch 5400, loss[loss=0.1476, simple_loss=0.2289, pruned_loss=0.03315, over 4974.00 frames.], tot_loss[loss=0.1422, simple_loss=0.215, pruned_loss=0.03468, over 972799.08 frames.], batch size: 15, lr: 2.39e-04 2022-05-06 10:13:30,896 INFO [train.py:715] (0/8) Epoch 9, batch 5450, loss[loss=0.1347, simple_loss=0.2032, pruned_loss=0.03312, over 4835.00 frames.], tot_loss[loss=0.1424, simple_loss=0.215, pruned_loss=0.03491, over 972601.07 frames.], batch size: 13, lr: 2.39e-04 2022-05-06 10:14:10,208 INFO [train.py:715] (0/8) Epoch 9, batch 5500, loss[loss=0.131, simple_loss=0.2101, pruned_loss=0.02595, over 4804.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2157, pruned_loss=0.03522, over 973201.49 frames.], batch size: 21, lr: 2.39e-04 2022-05-06 10:14:49,297 INFO [train.py:715] (0/8) Epoch 9, batch 5550, loss[loss=0.1232, simple_loss=0.2, pruned_loss=0.02316, over 4817.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2151, pruned_loss=0.03517, over 973108.67 frames.], batch size: 26, lr: 2.39e-04 2022-05-06 10:15:28,465 INFO [train.py:715] (0/8) Epoch 9, batch 5600, loss[loss=0.1796, simple_loss=0.2354, pruned_loss=0.06192, over 4886.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2153, pruned_loss=0.03566, over 973230.84 frames.], batch size: 39, lr: 2.39e-04 2022-05-06 10:16:07,455 INFO [train.py:715] (0/8) Epoch 9, batch 5650, loss[loss=0.1547, simple_loss=0.2223, pruned_loss=0.04359, over 4693.00 frames.], tot_loss[loss=0.1433, 
simple_loss=0.2154, pruned_loss=0.0356, over 972647.69 frames.], batch size: 15, lr: 2.39e-04 2022-05-06 10:16:47,096 INFO [train.py:715] (0/8) Epoch 9, batch 5700, loss[loss=0.1799, simple_loss=0.2457, pruned_loss=0.05701, over 4844.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2155, pruned_loss=0.0355, over 972853.34 frames.], batch size: 15, lr: 2.39e-04 2022-05-06 10:17:26,140 INFO [train.py:715] (0/8) Epoch 9, batch 5750, loss[loss=0.1404, simple_loss=0.2196, pruned_loss=0.03056, over 4934.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2141, pruned_loss=0.03441, over 973276.26 frames.], batch size: 23, lr: 2.39e-04 2022-05-06 10:18:04,788 INFO [train.py:715] (0/8) Epoch 9, batch 5800, loss[loss=0.1269, simple_loss=0.2068, pruned_loss=0.02346, over 4821.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2149, pruned_loss=0.03505, over 973953.46 frames.], batch size: 25, lr: 2.39e-04 2022-05-06 10:18:44,312 INFO [train.py:715] (0/8) Epoch 9, batch 5850, loss[loss=0.1394, simple_loss=0.2162, pruned_loss=0.03125, over 4821.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2142, pruned_loss=0.03462, over 973627.16 frames.], batch size: 27, lr: 2.39e-04 2022-05-06 10:19:23,128 INFO [train.py:715] (0/8) Epoch 9, batch 5900, loss[loss=0.1375, simple_loss=0.2084, pruned_loss=0.0333, over 4894.00 frames.], tot_loss[loss=0.141, simple_loss=0.2135, pruned_loss=0.03422, over 973093.08 frames.], batch size: 19, lr: 2.39e-04 2022-05-06 10:20:02,776 INFO [train.py:715] (0/8) Epoch 9, batch 5950, loss[loss=0.1462, simple_loss=0.2203, pruned_loss=0.03611, over 4702.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2141, pruned_loss=0.03431, over 972644.49 frames.], batch size: 15, lr: 2.39e-04 2022-05-06 10:20:41,533 INFO [train.py:715] (0/8) Epoch 9, batch 6000, loss[loss=0.1242, simple_loss=0.2072, pruned_loss=0.02064, over 4989.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2143, pruned_loss=0.03428, over 972427.14 frames.], batch size: 25, lr: 2.39e-04 2022-05-06 10:20:41,534 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 10:20:51,194 INFO [train.py:742] (0/8) Epoch 9, validation: loss=0.107, simple_loss=0.1914, pruned_loss=0.0113, over 914524.00 frames. 
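Note on the printed learning rate: it decays slowly with the global batch count (from 2.55e-04 near the start of this stretch to 2.39e-04 by this point) and takes a visible extra step down when a new epoch begins (2.53e-04 at the end of Epoch 8 versus 2.42e-04 at Epoch 9, batch 0). That shape matches a schedule that discounts a base rate by both the batch index and the epoch index. Below is a hypothetical Eden-style sketch of such a schedule; the constants in the example call are placeholders chosen only to land in the same range as the printed values, not settings read from this run.

```python
# Hypothetical Eden-style schedule: the rate shrinks smoothly with the global
# batch index and takes an extra step down as the epoch index grows. All
# constants in the example call below are placeholders for illustration;
# they are not read from this training run.
def scheduled_lr(base_lr: float, batch_idx: int, epoch: int,
                 lr_batches: float, lr_epochs: float) -> float:
    batch_factor = ((batch_idx ** 2 + lr_batches ** 2) / lr_batches ** 2) ** -0.25
    epoch_factor = ((epoch ** 2 + lr_epochs ** 2) / lr_epochs ** 2) ** -0.25
    return base_lr * batch_factor * epoch_factor

# Placeholder settings: a base rate of 3e-3 decayed over 5000 batches / 4 epochs.
# This reproduces both the slow within-epoch decay and the step at an epoch
# boundary seen in the log.
for epoch, batch_idx in [(8, 310_000), (9, 313_000), (9, 320_000)]:
    print(epoch, batch_idx, f"{scheduled_lr(3e-3, batch_idx, epoch, 5000.0, 4.0):.2e}")
```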
2022-05-06 10:21:30,883 INFO [train.py:715] (0/8) Epoch 9, batch 6050, loss[loss=0.1419, simple_loss=0.2163, pruned_loss=0.03373, over 4732.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2133, pruned_loss=0.03372, over 972251.95 frames.], batch size: 16, lr: 2.39e-04 2022-05-06 10:22:10,754 INFO [train.py:715] (0/8) Epoch 9, batch 6100, loss[loss=0.1379, simple_loss=0.2129, pruned_loss=0.03141, over 4784.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2136, pruned_loss=0.03404, over 971767.33 frames.], batch size: 18, lr: 2.39e-04 2022-05-06 10:22:49,972 INFO [train.py:715] (0/8) Epoch 9, batch 6150, loss[loss=0.1399, simple_loss=0.209, pruned_loss=0.03535, over 4876.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2141, pruned_loss=0.0343, over 972222.88 frames.], batch size: 16, lr: 2.39e-04 2022-05-06 10:23:28,784 INFO [train.py:715] (0/8) Epoch 9, batch 6200, loss[loss=0.1447, simple_loss=0.2184, pruned_loss=0.03549, over 4783.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2133, pruned_loss=0.03412, over 973365.44 frames.], batch size: 17, lr: 2.39e-04 2022-05-06 10:24:08,419 INFO [train.py:715] (0/8) Epoch 9, batch 6250, loss[loss=0.1509, simple_loss=0.2283, pruned_loss=0.03675, over 4948.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2128, pruned_loss=0.03402, over 972883.76 frames.], batch size: 21, lr: 2.39e-04 2022-05-06 10:24:47,200 INFO [train.py:715] (0/8) Epoch 9, batch 6300, loss[loss=0.1353, simple_loss=0.2129, pruned_loss=0.02882, over 4787.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2127, pruned_loss=0.03386, over 971742.72 frames.], batch size: 24, lr: 2.39e-04 2022-05-06 10:25:26,318 INFO [train.py:715] (0/8) Epoch 9, batch 6350, loss[loss=0.1471, simple_loss=0.2212, pruned_loss=0.03646, over 4968.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2129, pruned_loss=0.03382, over 971912.38 frames.], batch size: 31, lr: 2.39e-04 2022-05-06 10:26:05,951 INFO [train.py:715] (0/8) Epoch 9, batch 6400, loss[loss=0.1155, simple_loss=0.198, pruned_loss=0.01654, over 4978.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2131, pruned_loss=0.03383, over 972353.52 frames.], batch size: 25, lr: 2.39e-04 2022-05-06 10:26:46,099 INFO [train.py:715] (0/8) Epoch 9, batch 6450, loss[loss=0.1302, simple_loss=0.209, pruned_loss=0.02573, over 4790.00 frames.], tot_loss[loss=0.14, simple_loss=0.2123, pruned_loss=0.03387, over 972351.26 frames.], batch size: 18, lr: 2.39e-04 2022-05-06 10:27:25,423 INFO [train.py:715] (0/8) Epoch 9, batch 6500, loss[loss=0.1225, simple_loss=0.1997, pruned_loss=0.02271, over 4804.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2125, pruned_loss=0.03437, over 972365.13 frames.], batch size: 21, lr: 2.39e-04 2022-05-06 10:28:04,257 INFO [train.py:715] (0/8) Epoch 9, batch 6550, loss[loss=0.1288, simple_loss=0.206, pruned_loss=0.02579, over 4945.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2124, pruned_loss=0.03458, over 972794.12 frames.], batch size: 23, lr: 2.39e-04 2022-05-06 10:28:44,037 INFO [train.py:715] (0/8) Epoch 9, batch 6600, loss[loss=0.151, simple_loss=0.2194, pruned_loss=0.04131, over 4878.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2141, pruned_loss=0.0351, over 972674.20 frames.], batch size: 32, lr: 2.39e-04 2022-05-06 10:29:23,593 INFO [train.py:715] (0/8) Epoch 9, batch 6650, loss[loss=0.1307, simple_loss=0.2035, pruned_loss=0.0289, over 4901.00 frames.], tot_loss[loss=0.1423, simple_loss=0.214, pruned_loss=0.03524, over 973355.39 frames.], batch size: 19, lr: 2.39e-04 2022-05-06 10:30:02,746 INFO [train.py:715] (0/8) Epoch 
9, batch 6700, loss[loss=0.1689, simple_loss=0.2266, pruned_loss=0.05565, over 4879.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2151, pruned_loss=0.03596, over 973366.05 frames.], batch size: 16, lr: 2.39e-04 2022-05-06 10:30:14,231 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-320000.pt 2022-05-06 10:30:44,171 INFO [train.py:715] (0/8) Epoch 9, batch 6750, loss[loss=0.1652, simple_loss=0.2432, pruned_loss=0.04358, over 4910.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2149, pruned_loss=0.03566, over 972053.32 frames.], batch size: 19, lr: 2.39e-04 2022-05-06 10:31:23,599 INFO [train.py:715] (0/8) Epoch 9, batch 6800, loss[loss=0.1501, simple_loss=0.2239, pruned_loss=0.03819, over 4969.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2149, pruned_loss=0.03538, over 973010.25 frames.], batch size: 24, lr: 2.39e-04 2022-05-06 10:32:02,555 INFO [train.py:715] (0/8) Epoch 9, batch 6850, loss[loss=0.1327, simple_loss=0.2085, pruned_loss=0.02848, over 4871.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2153, pruned_loss=0.03511, over 972767.60 frames.], batch size: 20, lr: 2.39e-04 2022-05-06 10:32:40,750 INFO [train.py:715] (0/8) Epoch 9, batch 6900, loss[loss=0.1432, simple_loss=0.2128, pruned_loss=0.03678, over 4687.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2157, pruned_loss=0.03564, over 972467.93 frames.], batch size: 15, lr: 2.39e-04 2022-05-06 10:33:20,057 INFO [train.py:715] (0/8) Epoch 9, batch 6950, loss[loss=0.1653, simple_loss=0.226, pruned_loss=0.05228, over 4782.00 frames.], tot_loss[loss=0.1426, simple_loss=0.215, pruned_loss=0.03514, over 972507.40 frames.], batch size: 14, lr: 2.39e-04 2022-05-06 10:33:59,863 INFO [train.py:715] (0/8) Epoch 9, batch 7000, loss[loss=0.1439, simple_loss=0.2145, pruned_loss=0.03666, over 4888.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2149, pruned_loss=0.03504, over 972692.27 frames.], batch size: 32, lr: 2.39e-04 2022-05-06 10:34:38,723 INFO [train.py:715] (0/8) Epoch 9, batch 7050, loss[loss=0.1438, simple_loss=0.2208, pruned_loss=0.03339, over 4943.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2153, pruned_loss=0.03526, over 971573.81 frames.], batch size: 23, lr: 2.39e-04 2022-05-06 10:35:17,347 INFO [train.py:715] (0/8) Epoch 9, batch 7100, loss[loss=0.1424, simple_loss=0.2072, pruned_loss=0.03881, over 4788.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2155, pruned_loss=0.03562, over 971566.10 frames.], batch size: 18, lr: 2.39e-04 2022-05-06 10:35:56,810 INFO [train.py:715] (0/8) Epoch 9, batch 7150, loss[loss=0.1431, simple_loss=0.2118, pruned_loss=0.03719, over 4910.00 frames.], tot_loss[loss=0.1424, simple_loss=0.215, pruned_loss=0.03492, over 971278.58 frames.], batch size: 18, lr: 2.39e-04 2022-05-06 10:36:35,506 INFO [train.py:715] (0/8) Epoch 9, batch 7200, loss[loss=0.1521, simple_loss=0.2203, pruned_loss=0.0419, over 4787.00 frames.], tot_loss[loss=0.143, simple_loss=0.2157, pruned_loss=0.03519, over 971889.09 frames.], batch size: 14, lr: 2.39e-04 2022-05-06 10:37:14,249 INFO [train.py:715] (0/8) Epoch 9, batch 7250, loss[loss=0.1887, simple_loss=0.2597, pruned_loss=0.05879, over 4969.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2155, pruned_loss=0.03511, over 972746.82 frames.], batch size: 39, lr: 2.39e-04 2022-05-06 10:37:53,495 INFO [train.py:715] (0/8) Epoch 9, batch 7300, loss[loss=0.178, simple_loss=0.2497, pruned_loss=0.0532, over 4767.00 frames.], tot_loss[loss=0.144, simple_loss=0.2164, pruned_loss=0.03576, over 972296.69 frames.], batch 
size: 17, lr: 2.39e-04 2022-05-06 10:38:32,800 INFO [train.py:715] (0/8) Epoch 9, batch 7350, loss[loss=0.1435, simple_loss=0.2218, pruned_loss=0.03264, over 4821.00 frames.], tot_loss[loss=0.1443, simple_loss=0.2169, pruned_loss=0.03589, over 972847.02 frames.], batch size: 27, lr: 2.39e-04 2022-05-06 10:39:11,302 INFO [train.py:715] (0/8) Epoch 9, batch 7400, loss[loss=0.1579, simple_loss=0.2231, pruned_loss=0.0464, over 4977.00 frames.], tot_loss[loss=0.1436, simple_loss=0.216, pruned_loss=0.03563, over 973284.52 frames.], batch size: 28, lr: 2.39e-04 2022-05-06 10:39:50,257 INFO [train.py:715] (0/8) Epoch 9, batch 7450, loss[loss=0.146, simple_loss=0.2128, pruned_loss=0.03964, over 4840.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2153, pruned_loss=0.03524, over 972987.18 frames.], batch size: 13, lr: 2.39e-04 2022-05-06 10:40:30,202 INFO [train.py:715] (0/8) Epoch 9, batch 7500, loss[loss=0.1438, simple_loss=0.2084, pruned_loss=0.03954, over 4945.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2152, pruned_loss=0.03485, over 973274.29 frames.], batch size: 35, lr: 2.39e-04 2022-05-06 10:41:09,245 INFO [train.py:715] (0/8) Epoch 9, batch 7550, loss[loss=0.1654, simple_loss=0.2398, pruned_loss=0.0455, over 4964.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2146, pruned_loss=0.03452, over 972737.67 frames.], batch size: 35, lr: 2.39e-04 2022-05-06 10:41:48,087 INFO [train.py:715] (0/8) Epoch 9, batch 7600, loss[loss=0.1349, simple_loss=0.2129, pruned_loss=0.02842, over 4800.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2138, pruned_loss=0.03436, over 972264.17 frames.], batch size: 25, lr: 2.39e-04 2022-05-06 10:42:27,542 INFO [train.py:715] (0/8) Epoch 9, batch 7650, loss[loss=0.1402, simple_loss=0.2105, pruned_loss=0.03497, over 4945.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2144, pruned_loss=0.03444, over 973159.75 frames.], batch size: 21, lr: 2.39e-04 2022-05-06 10:43:06,740 INFO [train.py:715] (0/8) Epoch 9, batch 7700, loss[loss=0.1189, simple_loss=0.1944, pruned_loss=0.02176, over 4946.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2141, pruned_loss=0.03459, over 971142.33 frames.], batch size: 29, lr: 2.39e-04 2022-05-06 10:43:45,564 INFO [train.py:715] (0/8) Epoch 9, batch 7750, loss[loss=0.1406, simple_loss=0.2379, pruned_loss=0.02168, over 4975.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2137, pruned_loss=0.03442, over 971618.54 frames.], batch size: 25, lr: 2.39e-04 2022-05-06 10:44:24,374 INFO [train.py:715] (0/8) Epoch 9, batch 7800, loss[loss=0.1185, simple_loss=0.1983, pruned_loss=0.0193, over 4771.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2151, pruned_loss=0.03492, over 972128.92 frames.], batch size: 14, lr: 2.39e-04 2022-05-06 10:45:04,414 INFO [train.py:715] (0/8) Epoch 9, batch 7850, loss[loss=0.1322, simple_loss=0.2053, pruned_loss=0.02952, over 4696.00 frames.], tot_loss[loss=0.1419, simple_loss=0.2147, pruned_loss=0.03453, over 972155.10 frames.], batch size: 15, lr: 2.39e-04 2022-05-06 10:45:43,397 INFO [train.py:715] (0/8) Epoch 9, batch 7900, loss[loss=0.1561, simple_loss=0.2228, pruned_loss=0.04471, over 4904.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2148, pruned_loss=0.03494, over 972514.70 frames.], batch size: 39, lr: 2.39e-04 2022-05-06 10:46:21,524 INFO [train.py:715] (0/8) Epoch 9, batch 7950, loss[loss=0.1266, simple_loss=0.1983, pruned_loss=0.02752, over 4908.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2146, pruned_loss=0.03451, over 972686.44 frames.], batch size: 17, lr: 2.39e-04 2022-05-06 10:47:00,913 
INFO [train.py:715] (0/8) Epoch 9, batch 8000, loss[loss=0.1406, simple_loss=0.2142, pruned_loss=0.03349, over 4807.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2152, pruned_loss=0.03501, over 972377.01 frames.], batch size: 17, lr: 2.38e-04 2022-05-06 10:47:39,932 INFO [train.py:715] (0/8) Epoch 9, batch 8050, loss[loss=0.1335, simple_loss=0.2048, pruned_loss=0.03111, over 4776.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2155, pruned_loss=0.03499, over 972498.38 frames.], batch size: 18, lr: 2.38e-04 2022-05-06 10:48:18,558 INFO [train.py:715] (0/8) Epoch 9, batch 8100, loss[loss=0.1386, simple_loss=0.2202, pruned_loss=0.02854, over 4740.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2163, pruned_loss=0.03545, over 971885.71 frames.], batch size: 16, lr: 2.38e-04 2022-05-06 10:48:57,106 INFO [train.py:715] (0/8) Epoch 9, batch 8150, loss[loss=0.123, simple_loss=0.1987, pruned_loss=0.02361, over 4813.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2157, pruned_loss=0.03523, over 971475.32 frames.], batch size: 26, lr: 2.38e-04 2022-05-06 10:49:36,459 INFO [train.py:715] (0/8) Epoch 9, batch 8200, loss[loss=0.1574, simple_loss=0.2277, pruned_loss=0.04356, over 4813.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2152, pruned_loss=0.03493, over 971497.12 frames.], batch size: 15, lr: 2.38e-04 2022-05-06 10:50:15,124 INFO [train.py:715] (0/8) Epoch 9, batch 8250, loss[loss=0.1131, simple_loss=0.188, pruned_loss=0.01908, over 4785.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2151, pruned_loss=0.0349, over 972595.39 frames.], batch size: 18, lr: 2.38e-04 2022-05-06 10:50:53,694 INFO [train.py:715] (0/8) Epoch 9, batch 8300, loss[loss=0.1479, simple_loss=0.233, pruned_loss=0.03136, over 4799.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2153, pruned_loss=0.03495, over 972425.55 frames.], batch size: 17, lr: 2.38e-04 2022-05-06 10:51:32,738 INFO [train.py:715] (0/8) Epoch 9, batch 8350, loss[loss=0.1315, simple_loss=0.2024, pruned_loss=0.03028, over 4853.00 frames.], tot_loss[loss=0.1425, simple_loss=0.215, pruned_loss=0.03502, over 971974.18 frames.], batch size: 30, lr: 2.38e-04 2022-05-06 10:52:12,413 INFO [train.py:715] (0/8) Epoch 9, batch 8400, loss[loss=0.1638, simple_loss=0.233, pruned_loss=0.04729, over 4747.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2149, pruned_loss=0.03512, over 973303.59 frames.], batch size: 16, lr: 2.38e-04 2022-05-06 10:52:50,770 INFO [train.py:715] (0/8) Epoch 9, batch 8450, loss[loss=0.1173, simple_loss=0.1989, pruned_loss=0.01788, over 4800.00 frames.], tot_loss[loss=0.1425, simple_loss=0.215, pruned_loss=0.03507, over 973674.77 frames.], batch size: 17, lr: 2.38e-04 2022-05-06 10:53:29,411 INFO [train.py:715] (0/8) Epoch 9, batch 8500, loss[loss=0.1333, simple_loss=0.2181, pruned_loss=0.02423, over 4926.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2144, pruned_loss=0.03488, over 974073.15 frames.], batch size: 21, lr: 2.38e-04 2022-05-06 10:54:08,958 INFO [train.py:715] (0/8) Epoch 9, batch 8550, loss[loss=0.1413, simple_loss=0.2256, pruned_loss=0.02848, over 4855.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2149, pruned_loss=0.03477, over 974082.68 frames.], batch size: 20, lr: 2.38e-04 2022-05-06 10:54:48,127 INFO [train.py:715] (0/8) Epoch 9, batch 8600, loss[loss=0.1461, simple_loss=0.207, pruned_loss=0.04264, over 4835.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2143, pruned_loss=0.03418, over 973776.58 frames.], batch size: 30, lr: 2.38e-04 2022-05-06 10:55:26,982 INFO [train.py:715] (0/8) Epoch 9, batch 8650, 
loss[loss=0.1353, simple_loss=0.2171, pruned_loss=0.02674, over 4792.00 frames.], tot_loss[loss=0.141, simple_loss=0.2139, pruned_loss=0.034, over 972843.92 frames.], batch size: 24, lr: 2.38e-04 2022-05-06 10:56:06,796 INFO [train.py:715] (0/8) Epoch 9, batch 8700, loss[loss=0.1721, simple_loss=0.2577, pruned_loss=0.04329, over 4876.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2139, pruned_loss=0.03353, over 972778.20 frames.], batch size: 22, lr: 2.38e-04 2022-05-06 10:56:46,700 INFO [train.py:715] (0/8) Epoch 9, batch 8750, loss[loss=0.1461, simple_loss=0.2191, pruned_loss=0.0365, over 4789.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2135, pruned_loss=0.03393, over 972787.86 frames.], batch size: 17, lr: 2.38e-04 2022-05-06 10:57:25,011 INFO [train.py:715] (0/8) Epoch 9, batch 8800, loss[loss=0.1535, simple_loss=0.2329, pruned_loss=0.03705, over 4761.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2143, pruned_loss=0.03439, over 972724.63 frames.], batch size: 17, lr: 2.38e-04 2022-05-06 10:58:04,390 INFO [train.py:715] (0/8) Epoch 9, batch 8850, loss[loss=0.1937, simple_loss=0.2685, pruned_loss=0.05944, over 4818.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2143, pruned_loss=0.03419, over 972158.49 frames.], batch size: 15, lr: 2.38e-04 2022-05-06 10:58:43,836 INFO [train.py:715] (0/8) Epoch 9, batch 8900, loss[loss=0.151, simple_loss=0.2224, pruned_loss=0.03978, over 4953.00 frames.], tot_loss[loss=0.141, simple_loss=0.2138, pruned_loss=0.03415, over 972145.02 frames.], batch size: 39, lr: 2.38e-04 2022-05-06 10:59:22,964 INFO [train.py:715] (0/8) Epoch 9, batch 8950, loss[loss=0.1442, simple_loss=0.2216, pruned_loss=0.03336, over 4913.00 frames.], tot_loss[loss=0.1413, simple_loss=0.214, pruned_loss=0.03427, over 972473.05 frames.], batch size: 39, lr: 2.38e-04 2022-05-06 11:00:01,616 INFO [train.py:715] (0/8) Epoch 9, batch 9000, loss[loss=0.1639, simple_loss=0.2349, pruned_loss=0.04643, over 4874.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2133, pruned_loss=0.03419, over 972376.58 frames.], batch size: 32, lr: 2.38e-04 2022-05-06 11:00:01,617 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 11:00:11,232 INFO [train.py:742] (0/8) Epoch 9, validation: loss=0.107, simple_loss=0.1914, pruned_loss=0.0113, over 914524.00 frames. 
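Every per-batch entry above reports three numbers, and in this excerpt they are mutually consistent with the combined loss being a weighted sum of its two components, roughly loss ~= 0.5 * simple_loss + pruned_loss (batch 9000 above: 0.5 * 0.2349 + 0.04643 ~= 0.1639; the batch 9000 validation entry: 0.5 * 0.1914 + 0.0113 = 0.107). The 0.5 weight is inferred from the logged values alone, not read out of train.py; the short sketch below only checks that relationship against the entries above.

def combined_loss(simple_loss: float, pruned_loss: float, simple_scale: float = 0.5) -> float:
    # Reconstruct the logged `loss` from its two components under the assumed 0.5 weighting.
    return simple_scale * simple_loss + pruned_loss

# Batch 9000 entry above:           0.5 * 0.2349 + 0.04643 ~= 0.1639
assert abs(combined_loss(0.2349, 0.04643) - 0.1639) < 1e-3
# Validation entry at batch 9000:   0.5 * 0.1914 + 0.0113   = 0.107
assert abs(combined_loss(0.1914, 0.0113) - 0.107) < 1e-3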
2022-05-06 11:00:49,916 INFO [train.py:715] (0/8) Epoch 9, batch 9050, loss[loss=0.1705, simple_loss=0.2312, pruned_loss=0.05488, over 4863.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2127, pruned_loss=0.03429, over 971925.41 frames.], batch size: 32, lr: 2.38e-04 2022-05-06 11:01:30,082 INFO [train.py:715] (0/8) Epoch 9, batch 9100, loss[loss=0.1239, simple_loss=0.1998, pruned_loss=0.02401, over 4817.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2133, pruned_loss=0.03449, over 972948.30 frames.], batch size: 27, lr: 2.38e-04 2022-05-06 11:02:09,669 INFO [train.py:715] (0/8) Epoch 9, batch 9150, loss[loss=0.1491, simple_loss=0.2174, pruned_loss=0.04038, over 4899.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2136, pruned_loss=0.03458, over 972928.31 frames.], batch size: 19, lr: 2.38e-04 2022-05-06 11:02:48,632 INFO [train.py:715] (0/8) Epoch 9, batch 9200, loss[loss=0.1642, simple_loss=0.2386, pruned_loss=0.0449, over 4929.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2138, pruned_loss=0.0344, over 973158.95 frames.], batch size: 23, lr: 2.38e-04 2022-05-06 11:03:28,185 INFO [train.py:715] (0/8) Epoch 9, batch 9250, loss[loss=0.1305, simple_loss=0.2056, pruned_loss=0.02772, over 4894.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2141, pruned_loss=0.03437, over 972289.15 frames.], batch size: 19, lr: 2.38e-04 2022-05-06 11:04:07,599 INFO [train.py:715] (0/8) Epoch 9, batch 9300, loss[loss=0.1414, simple_loss=0.2106, pruned_loss=0.03606, over 4817.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2138, pruned_loss=0.03455, over 971841.95 frames.], batch size: 21, lr: 2.38e-04 2022-05-06 11:04:46,769 INFO [train.py:715] (0/8) Epoch 9, batch 9350, loss[loss=0.1242, simple_loss=0.2034, pruned_loss=0.02252, over 4989.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2132, pruned_loss=0.03457, over 972319.55 frames.], batch size: 28, lr: 2.38e-04 2022-05-06 11:05:25,231 INFO [train.py:715] (0/8) Epoch 9, batch 9400, loss[loss=0.1744, simple_loss=0.2388, pruned_loss=0.05498, over 4854.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2136, pruned_loss=0.0347, over 971635.53 frames.], batch size: 34, lr: 2.38e-04 2022-05-06 11:06:05,135 INFO [train.py:715] (0/8) Epoch 9, batch 9450, loss[loss=0.1444, simple_loss=0.2182, pruned_loss=0.03536, over 4991.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2148, pruned_loss=0.03513, over 972363.84 frames.], batch size: 26, lr: 2.38e-04 2022-05-06 11:06:44,278 INFO [train.py:715] (0/8) Epoch 9, batch 9500, loss[loss=0.1296, simple_loss=0.2011, pruned_loss=0.02903, over 4772.00 frames.], tot_loss[loss=0.1419, simple_loss=0.2144, pruned_loss=0.0347, over 972427.51 frames.], batch size: 14, lr: 2.38e-04 2022-05-06 11:07:22,931 INFO [train.py:715] (0/8) Epoch 9, batch 9550, loss[loss=0.1433, simple_loss=0.2189, pruned_loss=0.03383, over 4885.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2151, pruned_loss=0.03519, over 972424.38 frames.], batch size: 22, lr: 2.38e-04 2022-05-06 11:08:02,129 INFO [train.py:715] (0/8) Epoch 9, batch 9600, loss[loss=0.1372, simple_loss=0.2018, pruned_loss=0.03636, over 4711.00 frames.], tot_loss[loss=0.1419, simple_loss=0.2143, pruned_loss=0.03478, over 971845.91 frames.], batch size: 15, lr: 2.38e-04 2022-05-06 11:08:41,397 INFO [train.py:715] (0/8) Epoch 9, batch 9650, loss[loss=0.1535, simple_loss=0.2172, pruned_loss=0.04496, over 4778.00 frames.], tot_loss[loss=0.141, simple_loss=0.2134, pruned_loss=0.03432, over 972165.07 frames.], batch size: 14, lr: 2.38e-04 2022-05-06 11:09:20,425 INFO [train.py:715] (0/8) 
Epoch 9, batch 9700, loss[loss=0.1482, simple_loss=0.2197, pruned_loss=0.03834, over 4959.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2136, pruned_loss=0.03429, over 972083.49 frames.], batch size: 24, lr: 2.38e-04 2022-05-06 11:09:58,453 INFO [train.py:715] (0/8) Epoch 9, batch 9750, loss[loss=0.1301, simple_loss=0.2064, pruned_loss=0.02693, over 4944.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2141, pruned_loss=0.03451, over 972405.79 frames.], batch size: 29, lr: 2.38e-04 2022-05-06 11:10:38,590 INFO [train.py:715] (0/8) Epoch 9, batch 9800, loss[loss=0.1277, simple_loss=0.1928, pruned_loss=0.03134, over 4848.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2144, pruned_loss=0.03463, over 972367.47 frames.], batch size: 32, lr: 2.38e-04 2022-05-06 11:11:18,277 INFO [train.py:715] (0/8) Epoch 9, batch 9850, loss[loss=0.146, simple_loss=0.2194, pruned_loss=0.03634, over 4924.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2143, pruned_loss=0.03446, over 971278.18 frames.], batch size: 18, lr: 2.38e-04 2022-05-06 11:11:56,606 INFO [train.py:715] (0/8) Epoch 9, batch 9900, loss[loss=0.1462, simple_loss=0.2139, pruned_loss=0.03922, over 4922.00 frames.], tot_loss[loss=0.142, simple_loss=0.2145, pruned_loss=0.03479, over 971058.97 frames.], batch size: 29, lr: 2.38e-04 2022-05-06 11:12:35,815 INFO [train.py:715] (0/8) Epoch 9, batch 9950, loss[loss=0.1437, simple_loss=0.221, pruned_loss=0.03319, over 4935.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2143, pruned_loss=0.03454, over 971922.93 frames.], batch size: 21, lr: 2.38e-04 2022-05-06 11:13:15,756 INFO [train.py:715] (0/8) Epoch 9, batch 10000, loss[loss=0.1498, simple_loss=0.2229, pruned_loss=0.03835, over 4924.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2148, pruned_loss=0.035, over 971958.72 frames.], batch size: 39, lr: 2.38e-04 2022-05-06 11:13:55,090 INFO [train.py:715] (0/8) Epoch 9, batch 10050, loss[loss=0.14, simple_loss=0.2127, pruned_loss=0.03365, over 4774.00 frames.], tot_loss[loss=0.1419, simple_loss=0.2145, pruned_loss=0.03462, over 972378.84 frames.], batch size: 14, lr: 2.38e-04 2022-05-06 11:14:33,374 INFO [train.py:715] (0/8) Epoch 9, batch 10100, loss[loss=0.1519, simple_loss=0.2174, pruned_loss=0.04321, over 4863.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2147, pruned_loss=0.03484, over 973005.99 frames.], batch size: 32, lr: 2.38e-04 2022-05-06 11:15:12,910 INFO [train.py:715] (0/8) Epoch 9, batch 10150, loss[loss=0.1377, simple_loss=0.2158, pruned_loss=0.02984, over 4811.00 frames.], tot_loss[loss=0.142, simple_loss=0.2145, pruned_loss=0.03476, over 972464.27 frames.], batch size: 13, lr: 2.38e-04 2022-05-06 11:15:52,569 INFO [train.py:715] (0/8) Epoch 9, batch 10200, loss[loss=0.1394, simple_loss=0.2124, pruned_loss=0.03323, over 4848.00 frames.], tot_loss[loss=0.1419, simple_loss=0.2143, pruned_loss=0.0347, over 972174.43 frames.], batch size: 20, lr: 2.38e-04 2022-05-06 11:16:31,359 INFO [train.py:715] (0/8) Epoch 9, batch 10250, loss[loss=0.1539, simple_loss=0.2316, pruned_loss=0.03814, over 4745.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2148, pruned_loss=0.03468, over 972038.29 frames.], batch size: 16, lr: 2.38e-04 2022-05-06 11:17:10,102 INFO [train.py:715] (0/8) Epoch 9, batch 10300, loss[loss=0.1427, simple_loss=0.2112, pruned_loss=0.03707, over 4959.00 frames.], tot_loss[loss=0.142, simple_loss=0.2148, pruned_loss=0.03458, over 972227.34 frames.], batch size: 35, lr: 2.38e-04 2022-05-06 11:17:49,723 INFO [train.py:715] (0/8) Epoch 9, batch 10350, loss[loss=0.1538, 
simple_loss=0.2261, pruned_loss=0.04079, over 4932.00 frames.], tot_loss[loss=0.142, simple_loss=0.2148, pruned_loss=0.03456, over 971816.98 frames.], batch size: 18, lr: 2.38e-04 2022-05-06 11:18:28,419 INFO [train.py:715] (0/8) Epoch 9, batch 10400, loss[loss=0.1394, simple_loss=0.2166, pruned_loss=0.03113, over 4689.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2143, pruned_loss=0.03418, over 972162.93 frames.], batch size: 15, lr: 2.38e-04 2022-05-06 11:19:06,741 INFO [train.py:715] (0/8) Epoch 9, batch 10450, loss[loss=0.1454, simple_loss=0.2156, pruned_loss=0.03756, over 4880.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2147, pruned_loss=0.03441, over 972446.29 frames.], batch size: 16, lr: 2.38e-04 2022-05-06 11:19:45,852 INFO [train.py:715] (0/8) Epoch 9, batch 10500, loss[loss=0.1249, simple_loss=0.1937, pruned_loss=0.02803, over 4925.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2145, pruned_loss=0.03406, over 971878.84 frames.], batch size: 17, lr: 2.38e-04 2022-05-06 11:20:25,283 INFO [train.py:715] (0/8) Epoch 9, batch 10550, loss[loss=0.159, simple_loss=0.2296, pruned_loss=0.04415, over 4664.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2145, pruned_loss=0.03393, over 970788.84 frames.], batch size: 13, lr: 2.38e-04 2022-05-06 11:21:04,100 INFO [train.py:715] (0/8) Epoch 9, batch 10600, loss[loss=0.1199, simple_loss=0.194, pruned_loss=0.02293, over 4913.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2148, pruned_loss=0.03429, over 971597.69 frames.], batch size: 17, lr: 2.38e-04 2022-05-06 11:21:42,609 INFO [train.py:715] (0/8) Epoch 9, batch 10650, loss[loss=0.178, simple_loss=0.2552, pruned_loss=0.05045, over 4803.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2149, pruned_loss=0.03431, over 971830.93 frames.], batch size: 24, lr: 2.38e-04 2022-05-06 11:22:21,911 INFO [train.py:715] (0/8) Epoch 9, batch 10700, loss[loss=0.1303, simple_loss=0.1992, pruned_loss=0.03072, over 4932.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2144, pruned_loss=0.03386, over 971746.00 frames.], batch size: 29, lr: 2.37e-04 2022-05-06 11:23:01,947 INFO [train.py:715] (0/8) Epoch 9, batch 10750, loss[loss=0.1603, simple_loss=0.2401, pruned_loss=0.04024, over 4735.00 frames.], tot_loss[loss=0.1407, simple_loss=0.214, pruned_loss=0.03372, over 971523.13 frames.], batch size: 16, lr: 2.37e-04 2022-05-06 11:23:40,538 INFO [train.py:715] (0/8) Epoch 9, batch 10800, loss[loss=0.1267, simple_loss=0.2015, pruned_loss=0.02591, over 4830.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2135, pruned_loss=0.03387, over 971300.00 frames.], batch size: 26, lr: 2.37e-04 2022-05-06 11:24:20,015 INFO [train.py:715] (0/8) Epoch 9, batch 10850, loss[loss=0.1374, simple_loss=0.2084, pruned_loss=0.03322, over 4870.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2136, pruned_loss=0.03356, over 972194.33 frames.], batch size: 16, lr: 2.37e-04 2022-05-06 11:24:59,848 INFO [train.py:715] (0/8) Epoch 9, batch 10900, loss[loss=0.1342, simple_loss=0.2164, pruned_loss=0.02601, over 4760.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2138, pruned_loss=0.03387, over 972361.55 frames.], batch size: 19, lr: 2.37e-04 2022-05-06 11:25:40,137 INFO [train.py:715] (0/8) Epoch 9, batch 10950, loss[loss=0.156, simple_loss=0.2386, pruned_loss=0.03669, over 4845.00 frames.], tot_loss[loss=0.1408, simple_loss=0.214, pruned_loss=0.03381, over 973378.02 frames.], batch size: 20, lr: 2.37e-04 2022-05-06 11:26:20,015 INFO [train.py:715] (0/8) Epoch 9, batch 11000, loss[loss=0.1495, simple_loss=0.2349, 
pruned_loss=0.0321, over 4752.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2136, pruned_loss=0.03339, over 973428.60 frames.], batch size: 16, lr: 2.37e-04 2022-05-06 11:27:00,851 INFO [train.py:715] (0/8) Epoch 9, batch 11050, loss[loss=0.1694, simple_loss=0.2446, pruned_loss=0.04713, over 4942.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2136, pruned_loss=0.03359, over 973718.71 frames.], batch size: 39, lr: 2.37e-04 2022-05-06 11:27:42,121 INFO [train.py:715] (0/8) Epoch 9, batch 11100, loss[loss=0.1232, simple_loss=0.1968, pruned_loss=0.02478, over 4937.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2144, pruned_loss=0.0339, over 973074.68 frames.], batch size: 21, lr: 2.37e-04 2022-05-06 11:28:22,784 INFO [train.py:715] (0/8) Epoch 9, batch 11150, loss[loss=0.1347, simple_loss=0.2039, pruned_loss=0.0327, over 4642.00 frames.], tot_loss[loss=0.1417, simple_loss=0.215, pruned_loss=0.03424, over 972519.56 frames.], batch size: 13, lr: 2.37e-04 2022-05-06 11:29:03,601 INFO [train.py:715] (0/8) Epoch 9, batch 11200, loss[loss=0.1427, simple_loss=0.2187, pruned_loss=0.03334, over 4762.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2143, pruned_loss=0.03412, over 972408.74 frames.], batch size: 14, lr: 2.37e-04 2022-05-06 11:29:45,086 INFO [train.py:715] (0/8) Epoch 9, batch 11250, loss[loss=0.1351, simple_loss=0.2121, pruned_loss=0.02904, over 4838.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2146, pruned_loss=0.03441, over 972842.14 frames.], batch size: 13, lr: 2.37e-04 2022-05-06 11:30:26,204 INFO [train.py:715] (0/8) Epoch 9, batch 11300, loss[loss=0.1277, simple_loss=0.2054, pruned_loss=0.02498, over 4844.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2145, pruned_loss=0.03453, over 973623.55 frames.], batch size: 20, lr: 2.37e-04 2022-05-06 11:31:06,648 INFO [train.py:715] (0/8) Epoch 9, batch 11350, loss[loss=0.1371, simple_loss=0.2128, pruned_loss=0.03074, over 4937.00 frames.], tot_loss[loss=0.142, simple_loss=0.2151, pruned_loss=0.03445, over 974029.92 frames.], batch size: 23, lr: 2.37e-04 2022-05-06 11:31:47,930 INFO [train.py:715] (0/8) Epoch 9, batch 11400, loss[loss=0.1171, simple_loss=0.1949, pruned_loss=0.01971, over 4828.00 frames.], tot_loss[loss=0.142, simple_loss=0.215, pruned_loss=0.03449, over 973291.15 frames.], batch size: 26, lr: 2.37e-04 2022-05-06 11:32:29,498 INFO [train.py:715] (0/8) Epoch 9, batch 11450, loss[loss=0.1215, simple_loss=0.2015, pruned_loss=0.02077, over 4941.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2161, pruned_loss=0.03499, over 972201.86 frames.], batch size: 29, lr: 2.37e-04 2022-05-06 11:33:10,076 INFO [train.py:715] (0/8) Epoch 9, batch 11500, loss[loss=0.1498, simple_loss=0.2058, pruned_loss=0.04692, over 4855.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2146, pruned_loss=0.03425, over 972643.16 frames.], batch size: 32, lr: 2.37e-04 2022-05-06 11:33:50,769 INFO [train.py:715] (0/8) Epoch 9, batch 11550, loss[loss=0.1387, simple_loss=0.2099, pruned_loss=0.03374, over 4787.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2127, pruned_loss=0.03344, over 972707.77 frames.], batch size: 14, lr: 2.37e-04 2022-05-06 11:34:32,084 INFO [train.py:715] (0/8) Epoch 9, batch 11600, loss[loss=0.162, simple_loss=0.2296, pruned_loss=0.04723, over 4975.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2136, pruned_loss=0.03357, over 972295.18 frames.], batch size: 15, lr: 2.37e-04 2022-05-06 11:35:13,599 INFO [train.py:715] (0/8) Epoch 9, batch 11650, loss[loss=0.1314, simple_loss=0.2127, pruned_loss=0.02509, over 4692.00 
frames.], tot_loss[loss=0.1404, simple_loss=0.2139, pruned_loss=0.03341, over 972395.50 frames.], batch size: 15, lr: 2.37e-04 2022-05-06 11:35:53,522 INFO [train.py:715] (0/8) Epoch 9, batch 11700, loss[loss=0.1631, simple_loss=0.2307, pruned_loss=0.04775, over 4694.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2144, pruned_loss=0.03357, over 971943.95 frames.], batch size: 15, lr: 2.37e-04 2022-05-06 11:36:34,967 INFO [train.py:715] (0/8) Epoch 9, batch 11750, loss[loss=0.1264, simple_loss=0.2042, pruned_loss=0.02434, over 4943.00 frames.], tot_loss[loss=0.141, simple_loss=0.2143, pruned_loss=0.03382, over 971638.41 frames.], batch size: 29, lr: 2.37e-04 2022-05-06 11:37:16,469 INFO [train.py:715] (0/8) Epoch 9, batch 11800, loss[loss=0.144, simple_loss=0.2163, pruned_loss=0.03588, over 4837.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2145, pruned_loss=0.03449, over 971513.96 frames.], batch size: 26, lr: 2.37e-04 2022-05-06 11:37:56,811 INFO [train.py:715] (0/8) Epoch 9, batch 11850, loss[loss=0.1386, simple_loss=0.2063, pruned_loss=0.03543, over 4839.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2148, pruned_loss=0.03484, over 971969.51 frames.], batch size: 30, lr: 2.37e-04 2022-05-06 11:38:37,232 INFO [train.py:715] (0/8) Epoch 9, batch 11900, loss[loss=0.1597, simple_loss=0.2237, pruned_loss=0.04787, over 4778.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2141, pruned_loss=0.03465, over 972472.72 frames.], batch size: 14, lr: 2.37e-04 2022-05-06 11:39:18,263 INFO [train.py:715] (0/8) Epoch 9, batch 11950, loss[loss=0.1629, simple_loss=0.2427, pruned_loss=0.04156, over 4744.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2141, pruned_loss=0.03447, over 972166.94 frames.], batch size: 16, lr: 2.37e-04 2022-05-06 11:39:59,369 INFO [train.py:715] (0/8) Epoch 9, batch 12000, loss[loss=0.1474, simple_loss=0.219, pruned_loss=0.0379, over 4886.00 frames.], tot_loss[loss=0.142, simple_loss=0.2147, pruned_loss=0.0347, over 971917.75 frames.], batch size: 19, lr: 2.37e-04 2022-05-06 11:39:59,370 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 11:40:09,082 INFO [train.py:742] (0/8) Epoch 9, validation: loss=0.107, simple_loss=0.1913, pruned_loss=0.01136, over 914524.00 frames. 
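The tot_loss entries are labelled "over N frames", and N hovers around 9.7e5 throughout the epoch instead of growing with the batch index, so the aggregate is evidently a frame-weighted average over a bounded recent window rather than over everything seen so far. A minimal sketch of a decayed, frame-weighted tracker of that kind follows; the decay constant is a made-up placeholder and the actual bookkeeping in train.py may differ.

class DecayedFrameAverage:
    def __init__(self, decay: float = 0.995) -> None:
        self.decay = decay
        self.weighted_loss = 0.0  # decayed sum of loss * frames
        self.frames = 0.0         # decayed sum of frames

    def update(self, batch_loss: float, batch_frames: float) -> None:
        # Each new batch contributes loss weighted by its frame count; older
        # batches fade out geometrically, keeping the effective window bounded.
        self.weighted_loss = self.decay * self.weighted_loss + batch_loss * batch_frames
        self.frames = self.decay * self.frames + batch_frames

    @property
    def value(self) -> float:
        return self.weighted_loss / max(self.frames, 1.0)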
2022-05-06 11:40:50,128 INFO [train.py:715] (0/8) Epoch 9, batch 12050, loss[loss=0.1606, simple_loss=0.2319, pruned_loss=0.04468, over 4802.00 frames.], tot_loss[loss=0.1419, simple_loss=0.2147, pruned_loss=0.03455, over 972185.32 frames.], batch size: 21, lr: 2.37e-04 2022-05-06 11:41:29,622 INFO [train.py:715] (0/8) Epoch 9, batch 12100, loss[loss=0.1118, simple_loss=0.1898, pruned_loss=0.01692, over 4777.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2148, pruned_loss=0.03471, over 971551.84 frames.], batch size: 14, lr: 2.37e-04 2022-05-06 11:42:10,006 INFO [train.py:715] (0/8) Epoch 9, batch 12150, loss[loss=0.1507, simple_loss=0.2252, pruned_loss=0.03813, over 4790.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2146, pruned_loss=0.03475, over 970116.82 frames.], batch size: 12, lr: 2.37e-04 2022-05-06 11:42:50,007 INFO [train.py:715] (0/8) Epoch 9, batch 12200, loss[loss=0.1187, simple_loss=0.1997, pruned_loss=0.0188, over 4809.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2152, pruned_loss=0.03492, over 970705.37 frames.], batch size: 26, lr: 2.37e-04 2022-05-06 11:43:29,257 INFO [train.py:715] (0/8) Epoch 9, batch 12250, loss[loss=0.1237, simple_loss=0.1979, pruned_loss=0.02472, over 4966.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2153, pruned_loss=0.03468, over 970338.24 frames.], batch size: 24, lr: 2.37e-04 2022-05-06 11:44:08,221 INFO [train.py:715] (0/8) Epoch 9, batch 12300, loss[loss=0.1431, simple_loss=0.214, pruned_loss=0.03613, over 4858.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2161, pruned_loss=0.03509, over 970901.03 frames.], batch size: 20, lr: 2.37e-04 2022-05-06 11:44:47,993 INFO [train.py:715] (0/8) Epoch 9, batch 12350, loss[loss=0.1484, simple_loss=0.2209, pruned_loss=0.03799, over 4910.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2156, pruned_loss=0.0349, over 970405.35 frames.], batch size: 23, lr: 2.37e-04 2022-05-06 11:45:28,031 INFO [train.py:715] (0/8) Epoch 9, batch 12400, loss[loss=0.1669, simple_loss=0.2459, pruned_loss=0.04397, over 4861.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2153, pruned_loss=0.03467, over 970758.85 frames.], batch size: 20, lr: 2.37e-04 2022-05-06 11:46:07,544 INFO [train.py:715] (0/8) Epoch 9, batch 12450, loss[loss=0.1262, simple_loss=0.198, pruned_loss=0.02724, over 4895.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2152, pruned_loss=0.03449, over 970751.11 frames.], batch size: 32, lr: 2.37e-04 2022-05-06 11:46:47,597 INFO [train.py:715] (0/8) Epoch 9, batch 12500, loss[loss=0.1424, simple_loss=0.2203, pruned_loss=0.03228, over 4942.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2144, pruned_loss=0.034, over 971039.39 frames.], batch size: 21, lr: 2.37e-04 2022-05-06 11:47:27,726 INFO [train.py:715] (0/8) Epoch 9, batch 12550, loss[loss=0.1541, simple_loss=0.2296, pruned_loss=0.03933, over 4812.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2151, pruned_loss=0.03418, over 970607.04 frames.], batch size: 26, lr: 2.37e-04 2022-05-06 11:48:07,690 INFO [train.py:715] (0/8) Epoch 9, batch 12600, loss[loss=0.1173, simple_loss=0.1981, pruned_loss=0.01821, over 4892.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2144, pruned_loss=0.03408, over 971605.70 frames.], batch size: 16, lr: 2.37e-04 2022-05-06 11:48:46,460 INFO [train.py:715] (0/8) Epoch 9, batch 12650, loss[loss=0.146, simple_loss=0.2183, pruned_loss=0.0369, over 4792.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2141, pruned_loss=0.03411, over 971383.14 frames.], batch size: 24, lr: 2.37e-04 2022-05-06 11:49:26,596 INFO 
[train.py:715] (0/8) Epoch 9, batch 12700, loss[loss=0.139, simple_loss=0.2088, pruned_loss=0.03458, over 4855.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2144, pruned_loss=0.03436, over 971236.79 frames.], batch size: 20, lr: 2.37e-04 2022-05-06 11:50:06,589 INFO [train.py:715] (0/8) Epoch 9, batch 12750, loss[loss=0.1125, simple_loss=0.184, pruned_loss=0.02046, over 4704.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2146, pruned_loss=0.03441, over 972081.39 frames.], batch size: 12, lr: 2.37e-04 2022-05-06 11:50:45,758 INFO [train.py:715] (0/8) Epoch 9, batch 12800, loss[loss=0.1423, simple_loss=0.2094, pruned_loss=0.03761, over 4723.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2144, pruned_loss=0.03429, over 972231.18 frames.], batch size: 12, lr: 2.37e-04 2022-05-06 11:51:25,603 INFO [train.py:715] (0/8) Epoch 9, batch 12850, loss[loss=0.1488, simple_loss=0.208, pruned_loss=0.0448, over 4781.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2155, pruned_loss=0.03503, over 972007.93 frames.], batch size: 17, lr: 2.37e-04 2022-05-06 11:52:05,495 INFO [train.py:715] (0/8) Epoch 9, batch 12900, loss[loss=0.1245, simple_loss=0.1971, pruned_loss=0.02598, over 4864.00 frames.], tot_loss[loss=0.1436, simple_loss=0.2161, pruned_loss=0.03554, over 971772.59 frames.], batch size: 20, lr: 2.37e-04 2022-05-06 11:52:45,473 INFO [train.py:715] (0/8) Epoch 9, batch 12950, loss[loss=0.1477, simple_loss=0.2183, pruned_loss=0.03855, over 4988.00 frames.], tot_loss[loss=0.1433, simple_loss=0.216, pruned_loss=0.03529, over 971230.25 frames.], batch size: 35, lr: 2.37e-04 2022-05-06 11:53:24,503 INFO [train.py:715] (0/8) Epoch 9, batch 13000, loss[loss=0.158, simple_loss=0.2334, pruned_loss=0.04124, over 4859.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2164, pruned_loss=0.03527, over 971955.81 frames.], batch size: 20, lr: 2.37e-04 2022-05-06 11:54:04,855 INFO [train.py:715] (0/8) Epoch 9, batch 13050, loss[loss=0.1202, simple_loss=0.1904, pruned_loss=0.02501, over 4934.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2154, pruned_loss=0.03466, over 972137.69 frames.], batch size: 21, lr: 2.37e-04 2022-05-06 11:54:44,622 INFO [train.py:715] (0/8) Epoch 9, batch 13100, loss[loss=0.1326, simple_loss=0.2025, pruned_loss=0.0314, over 4840.00 frames.], tot_loss[loss=0.1423, simple_loss=0.215, pruned_loss=0.03476, over 972185.15 frames.], batch size: 32, lr: 2.37e-04 2022-05-06 11:55:23,867 INFO [train.py:715] (0/8) Epoch 9, batch 13150, loss[loss=0.1205, simple_loss=0.1931, pruned_loss=0.02393, over 4991.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2158, pruned_loss=0.035, over 971832.61 frames.], batch size: 14, lr: 2.37e-04 2022-05-06 11:56:03,852 INFO [train.py:715] (0/8) Epoch 9, batch 13200, loss[loss=0.1192, simple_loss=0.1947, pruned_loss=0.02184, over 4768.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2153, pruned_loss=0.03471, over 971714.70 frames.], batch size: 18, lr: 2.37e-04 2022-05-06 11:56:44,166 INFO [train.py:715] (0/8) Epoch 9, batch 13250, loss[loss=0.1442, simple_loss=0.2217, pruned_loss=0.03338, over 4730.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2146, pruned_loss=0.03452, over 971338.51 frames.], batch size: 16, lr: 2.37e-04 2022-05-06 11:57:23,739 INFO [train.py:715] (0/8) Epoch 9, batch 13300, loss[loss=0.132, simple_loss=0.2105, pruned_loss=0.02677, over 4757.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2147, pruned_loss=0.03432, over 971660.27 frames.], batch size: 19, lr: 2.37e-04 2022-05-06 11:58:03,448 INFO [train.py:715] (0/8) Epoch 9, batch 13350, 
loss[loss=0.1523, simple_loss=0.239, pruned_loss=0.03278, over 4948.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2156, pruned_loss=0.03491, over 971887.21 frames.], batch size: 24, lr: 2.37e-04 2022-05-06 11:58:43,523 INFO [train.py:715] (0/8) Epoch 9, batch 13400, loss[loss=0.09152, simple_loss=0.1552, pruned_loss=0.01394, over 4788.00 frames.], tot_loss[loss=0.142, simple_loss=0.2149, pruned_loss=0.03457, over 971425.25 frames.], batch size: 12, lr: 2.37e-04 2022-05-06 11:59:23,792 INFO [train.py:715] (0/8) Epoch 9, batch 13450, loss[loss=0.1344, simple_loss=0.2071, pruned_loss=0.03081, over 4932.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2143, pruned_loss=0.03431, over 971645.02 frames.], batch size: 18, lr: 2.36e-04 2022-05-06 12:00:02,967 INFO [train.py:715] (0/8) Epoch 9, batch 13500, loss[loss=0.1682, simple_loss=0.2422, pruned_loss=0.04707, over 4896.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2141, pruned_loss=0.0347, over 971614.57 frames.], batch size: 39, lr: 2.36e-04 2022-05-06 12:00:42,983 INFO [train.py:715] (0/8) Epoch 9, batch 13550, loss[loss=0.163, simple_loss=0.2388, pruned_loss=0.04365, over 4972.00 frames.], tot_loss[loss=0.1423, simple_loss=0.215, pruned_loss=0.03478, over 971583.81 frames.], batch size: 39, lr: 2.36e-04 2022-05-06 12:01:22,499 INFO [train.py:715] (0/8) Epoch 9, batch 13600, loss[loss=0.1474, simple_loss=0.2143, pruned_loss=0.0402, over 4822.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2141, pruned_loss=0.03429, over 971978.59 frames.], batch size: 15, lr: 2.36e-04 2022-05-06 12:02:01,620 INFO [train.py:715] (0/8) Epoch 9, batch 13650, loss[loss=0.1664, simple_loss=0.2307, pruned_loss=0.05102, over 4785.00 frames.], tot_loss[loss=0.142, simple_loss=0.2146, pruned_loss=0.03476, over 971774.36 frames.], batch size: 14, lr: 2.36e-04 2022-05-06 12:02:40,851 INFO [train.py:715] (0/8) Epoch 9, batch 13700, loss[loss=0.1291, simple_loss=0.209, pruned_loss=0.02461, over 4926.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2136, pruned_loss=0.0347, over 971380.53 frames.], batch size: 29, lr: 2.36e-04 2022-05-06 12:03:20,731 INFO [train.py:715] (0/8) Epoch 9, batch 13750, loss[loss=0.1444, simple_loss=0.2033, pruned_loss=0.04279, over 4802.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2136, pruned_loss=0.03484, over 970926.54 frames.], batch size: 14, lr: 2.36e-04 2022-05-06 12:03:59,884 INFO [train.py:715] (0/8) Epoch 9, batch 13800, loss[loss=0.1379, simple_loss=0.2095, pruned_loss=0.03312, over 4981.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2134, pruned_loss=0.0347, over 971729.81 frames.], batch size: 33, lr: 2.36e-04 2022-05-06 12:04:38,385 INFO [train.py:715] (0/8) Epoch 9, batch 13850, loss[loss=0.1318, simple_loss=0.2081, pruned_loss=0.02778, over 4975.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2134, pruned_loss=0.03454, over 971801.55 frames.], batch size: 28, lr: 2.36e-04 2022-05-06 12:05:17,812 INFO [train.py:715] (0/8) Epoch 9, batch 13900, loss[loss=0.1324, simple_loss=0.2082, pruned_loss=0.02833, over 4743.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2136, pruned_loss=0.03452, over 971456.12 frames.], batch size: 16, lr: 2.36e-04 2022-05-06 12:05:57,958 INFO [train.py:715] (0/8) Epoch 9, batch 13950, loss[loss=0.111, simple_loss=0.181, pruned_loss=0.02054, over 4814.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2131, pruned_loss=0.03397, over 972422.81 frames.], batch size: 21, lr: 2.36e-04 2022-05-06 12:06:36,917 INFO [train.py:715] (0/8) Epoch 9, batch 14000, loss[loss=0.1324, simple_loss=0.2052, 
pruned_loss=0.02982, over 4882.00 frames.], tot_loss[loss=0.141, simple_loss=0.2135, pruned_loss=0.03428, over 972099.15 frames.], batch size: 22, lr: 2.36e-04 2022-05-06 12:07:16,027 INFO [train.py:715] (0/8) Epoch 9, batch 14050, loss[loss=0.1773, simple_loss=0.2515, pruned_loss=0.05154, over 4784.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2128, pruned_loss=0.034, over 972462.32 frames.], batch size: 18, lr: 2.36e-04 2022-05-06 12:07:55,562 INFO [train.py:715] (0/8) Epoch 9, batch 14100, loss[loss=0.1434, simple_loss=0.2156, pruned_loss=0.03559, over 4932.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2139, pruned_loss=0.03452, over 972614.50 frames.], batch size: 29, lr: 2.36e-04 2022-05-06 12:08:35,130 INFO [train.py:715] (0/8) Epoch 9, batch 14150, loss[loss=0.1419, simple_loss=0.2213, pruned_loss=0.03122, over 4921.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2144, pruned_loss=0.03443, over 972092.41 frames.], batch size: 29, lr: 2.36e-04 2022-05-06 12:09:14,477 INFO [train.py:715] (0/8) Epoch 9, batch 14200, loss[loss=0.1267, simple_loss=0.1926, pruned_loss=0.03039, over 4844.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2149, pruned_loss=0.03517, over 971885.98 frames.], batch size: 13, lr: 2.36e-04 2022-05-06 12:09:53,801 INFO [train.py:715] (0/8) Epoch 9, batch 14250, loss[loss=0.1281, simple_loss=0.2045, pruned_loss=0.02588, over 4904.00 frames.], tot_loss[loss=0.1434, simple_loss=0.2157, pruned_loss=0.03554, over 973686.87 frames.], batch size: 19, lr: 2.36e-04 2022-05-06 12:10:33,294 INFO [train.py:715] (0/8) Epoch 9, batch 14300, loss[loss=0.126, simple_loss=0.2036, pruned_loss=0.02421, over 4687.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2149, pruned_loss=0.03471, over 973152.75 frames.], batch size: 15, lr: 2.36e-04 2022-05-06 12:11:11,972 INFO [train.py:715] (0/8) Epoch 9, batch 14350, loss[loss=0.1303, simple_loss=0.215, pruned_loss=0.02283, over 4761.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2148, pruned_loss=0.03467, over 973230.09 frames.], batch size: 16, lr: 2.36e-04 2022-05-06 12:11:50,596 INFO [train.py:715] (0/8) Epoch 9, batch 14400, loss[loss=0.1432, simple_loss=0.2139, pruned_loss=0.03628, over 4955.00 frames.], tot_loss[loss=0.1429, simple_loss=0.2156, pruned_loss=0.03508, over 972954.00 frames.], batch size: 21, lr: 2.36e-04 2022-05-06 12:12:30,355 INFO [train.py:715] (0/8) Epoch 9, batch 14450, loss[loss=0.1392, simple_loss=0.2229, pruned_loss=0.02772, over 4811.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2148, pruned_loss=0.03479, over 972372.50 frames.], batch size: 25, lr: 2.36e-04 2022-05-06 12:13:09,686 INFO [train.py:715] (0/8) Epoch 9, batch 14500, loss[loss=0.1631, simple_loss=0.2322, pruned_loss=0.04697, over 4932.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2148, pruned_loss=0.03492, over 972167.82 frames.], batch size: 39, lr: 2.36e-04 2022-05-06 12:13:48,631 INFO [train.py:715] (0/8) Epoch 9, batch 14550, loss[loss=0.1407, simple_loss=0.2098, pruned_loss=0.03585, over 4948.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2137, pruned_loss=0.03431, over 972572.03 frames.], batch size: 23, lr: 2.36e-04 2022-05-06 12:14:27,682 INFO [train.py:715] (0/8) Epoch 9, batch 14600, loss[loss=0.1115, simple_loss=0.1855, pruned_loss=0.01881, over 4834.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2137, pruned_loss=0.03431, over 972870.61 frames.], batch size: 12, lr: 2.36e-04 2022-05-06 12:15:07,384 INFO [train.py:715] (0/8) Epoch 9, batch 14650, loss[loss=0.1268, simple_loss=0.2041, pruned_loss=0.02471, over 4867.00 
frames.], tot_loss[loss=0.1407, simple_loss=0.2133, pruned_loss=0.03406, over 972188.14 frames.], batch size: 16, lr: 2.36e-04 2022-05-06 12:15:45,918 INFO [train.py:715] (0/8) Epoch 9, batch 14700, loss[loss=0.1828, simple_loss=0.2469, pruned_loss=0.05934, over 4753.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2129, pruned_loss=0.0341, over 971828.26 frames.], batch size: 16, lr: 2.36e-04 2022-05-06 12:15:57,297 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-328000.pt 2022-05-06 12:16:27,517 INFO [train.py:715] (0/8) Epoch 9, batch 14750, loss[loss=0.1192, simple_loss=0.2088, pruned_loss=0.01485, over 4811.00 frames.], tot_loss[loss=0.141, simple_loss=0.2136, pruned_loss=0.03414, over 971251.40 frames.], batch size: 25, lr: 2.36e-04 2022-05-06 12:17:06,570 INFO [train.py:715] (0/8) Epoch 9, batch 14800, loss[loss=0.1368, simple_loss=0.2039, pruned_loss=0.03487, over 4879.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2141, pruned_loss=0.03409, over 971453.26 frames.], batch size: 16, lr: 2.36e-04 2022-05-06 12:17:45,493 INFO [train.py:715] (0/8) Epoch 9, batch 14850, loss[loss=0.1402, simple_loss=0.2109, pruned_loss=0.03477, over 4832.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2143, pruned_loss=0.03413, over 971681.52 frames.], batch size: 13, lr: 2.36e-04 2022-05-06 12:18:24,545 INFO [train.py:715] (0/8) Epoch 9, batch 14900, loss[loss=0.1437, simple_loss=0.2231, pruned_loss=0.03215, over 4780.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2148, pruned_loss=0.03484, over 972152.21 frames.], batch size: 18, lr: 2.36e-04 2022-05-06 12:19:03,080 INFO [train.py:715] (0/8) Epoch 9, batch 14950, loss[loss=0.1399, simple_loss=0.2162, pruned_loss=0.03179, over 4988.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2161, pruned_loss=0.03577, over 972251.53 frames.], batch size: 25, lr: 2.36e-04 2022-05-06 12:19:42,678 INFO [train.py:715] (0/8) Epoch 9, batch 15000, loss[loss=0.1845, simple_loss=0.2529, pruned_loss=0.05805, over 4968.00 frames.], tot_loss[loss=0.1438, simple_loss=0.2161, pruned_loss=0.03576, over 972802.62 frames.], batch size: 14, lr: 2.36e-04 2022-05-06 12:19:42,679 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 12:19:52,343 INFO [train.py:742] (0/8) Epoch 9, validation: loss=0.1071, simple_loss=0.1915, pruned_loss=0.01139, over 914524.00 frames. 
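The checkpoint line above writes pruned_transducer_stateless2/exp/v2/checkpoint-328000.pt, i.e. checkpoints are named by the global training-batch counter rather than by epoch. A hedged sketch of such a helper follows; the save interval is not stated in this excerpt, so save_every_n is a hypothetical parameter, and the contents of the saved dict are assumed for illustration.

from pathlib import Path
import torch

def maybe_save_checkpoint(model, optimizer, exp_dir: Path,
                          global_batch: int, save_every_n: int) -> None:
    # Write <exp_dir>/checkpoint-<global_batch>.pt whenever the global batch
    # counter reaches a multiple of the (assumed) save interval.
    if global_batch % save_every_n != 0:
        return
    path = exp_dir / f"checkpoint-{global_batch}.pt"
    torch.save(
        {
            "model": model.state_dict(),
            "optimizer": optimizer.state_dict(),
            "batch_idx_train": global_batch,
        },
        path,
    )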
2022-05-06 12:20:32,094 INFO [train.py:715] (0/8) Epoch 9, batch 15050, loss[loss=0.1272, simple_loss=0.2098, pruned_loss=0.02234, over 4891.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2147, pruned_loss=0.03516, over 972853.42 frames.], batch size: 22, lr: 2.36e-04 2022-05-06 12:21:11,100 INFO [train.py:715] (0/8) Epoch 9, batch 15100, loss[loss=0.1074, simple_loss=0.1792, pruned_loss=0.01781, over 4862.00 frames.], tot_loss[loss=0.142, simple_loss=0.2143, pruned_loss=0.03482, over 972477.80 frames.], batch size: 13, lr: 2.36e-04 2022-05-06 12:21:50,196 INFO [train.py:715] (0/8) Epoch 9, batch 15150, loss[loss=0.1395, simple_loss=0.2174, pruned_loss=0.0308, over 4806.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2141, pruned_loss=0.03469, over 972004.27 frames.], batch size: 21, lr: 2.36e-04 2022-05-06 12:22:30,008 INFO [train.py:715] (0/8) Epoch 9, batch 15200, loss[loss=0.1147, simple_loss=0.1795, pruned_loss=0.02488, over 4728.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2131, pruned_loss=0.0343, over 973217.97 frames.], batch size: 12, lr: 2.36e-04 2022-05-06 12:23:09,319 INFO [train.py:715] (0/8) Epoch 9, batch 15250, loss[loss=0.1339, simple_loss=0.2016, pruned_loss=0.03315, over 4795.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2138, pruned_loss=0.03434, over 972467.17 frames.], batch size: 21, lr: 2.36e-04 2022-05-06 12:23:48,031 INFO [train.py:715] (0/8) Epoch 9, batch 15300, loss[loss=0.1063, simple_loss=0.1835, pruned_loss=0.01456, over 4784.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2132, pruned_loss=0.03391, over 972679.19 frames.], batch size: 12, lr: 2.36e-04 2022-05-06 12:24:27,147 INFO [train.py:715] (0/8) Epoch 9, batch 15350, loss[loss=0.1144, simple_loss=0.191, pruned_loss=0.01892, over 4829.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2129, pruned_loss=0.03379, over 972547.27 frames.], batch size: 12, lr: 2.36e-04 2022-05-06 12:25:06,184 INFO [train.py:715] (0/8) Epoch 9, batch 15400, loss[loss=0.182, simple_loss=0.2438, pruned_loss=0.06009, over 4829.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2131, pruned_loss=0.03413, over 972812.60 frames.], batch size: 15, lr: 2.36e-04 2022-05-06 12:25:44,957 INFO [train.py:715] (0/8) Epoch 9, batch 15450, loss[loss=0.1867, simple_loss=0.2461, pruned_loss=0.06361, over 4966.00 frames.], tot_loss[loss=0.1416, simple_loss=0.214, pruned_loss=0.03461, over 973103.28 frames.], batch size: 29, lr: 2.36e-04 2022-05-06 12:26:23,385 INFO [train.py:715] (0/8) Epoch 9, batch 15500, loss[loss=0.1356, simple_loss=0.1939, pruned_loss=0.03868, over 4886.00 frames.], tot_loss[loss=0.1427, simple_loss=0.215, pruned_loss=0.03521, over 972478.37 frames.], batch size: 16, lr: 2.36e-04 2022-05-06 12:27:03,111 INFO [train.py:715] (0/8) Epoch 9, batch 15550, loss[loss=0.1203, simple_loss=0.191, pruned_loss=0.02485, over 4777.00 frames.], tot_loss[loss=0.1427, simple_loss=0.215, pruned_loss=0.03518, over 972479.84 frames.], batch size: 17, lr: 2.36e-04 2022-05-06 12:27:41,871 INFO [train.py:715] (0/8) Epoch 9, batch 15600, loss[loss=0.1358, simple_loss=0.2186, pruned_loss=0.02646, over 4752.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2147, pruned_loss=0.03533, over 972522.78 frames.], batch size: 19, lr: 2.36e-04 2022-05-06 12:28:20,221 INFO [train.py:715] (0/8) Epoch 9, batch 15650, loss[loss=0.1405, simple_loss=0.2087, pruned_loss=0.03614, over 4984.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2141, pruned_loss=0.03529, over 972668.39 frames.], batch size: 33, lr: 2.36e-04 2022-05-06 12:28:59,316 INFO 
[train.py:715] (0/8) Epoch 9, batch 15700, loss[loss=0.1317, simple_loss=0.2143, pruned_loss=0.02453, over 4826.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2144, pruned_loss=0.03494, over 973122.75 frames.], batch size: 25, lr: 2.36e-04 2022-05-06 12:29:39,071 INFO [train.py:715] (0/8) Epoch 9, batch 15750, loss[loss=0.1335, simple_loss=0.207, pruned_loss=0.02998, over 4836.00 frames.], tot_loss[loss=0.142, simple_loss=0.2144, pruned_loss=0.03476, over 972513.67 frames.], batch size: 26, lr: 2.36e-04 2022-05-06 12:30:17,872 INFO [train.py:715] (0/8) Epoch 9, batch 15800, loss[loss=0.1422, simple_loss=0.2095, pruned_loss=0.03748, over 4818.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2139, pruned_loss=0.03412, over 972507.16 frames.], batch size: 15, lr: 2.36e-04 2022-05-06 12:30:56,794 INFO [train.py:715] (0/8) Epoch 9, batch 15850, loss[loss=0.1736, simple_loss=0.2426, pruned_loss=0.05225, over 4941.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2143, pruned_loss=0.03441, over 974275.52 frames.], batch size: 39, lr: 2.36e-04 2022-05-06 12:31:36,398 INFO [train.py:715] (0/8) Epoch 9, batch 15900, loss[loss=0.1821, simple_loss=0.2745, pruned_loss=0.04481, over 4941.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2144, pruned_loss=0.03422, over 974748.27 frames.], batch size: 23, lr: 2.36e-04 2022-05-06 12:32:15,973 INFO [train.py:715] (0/8) Epoch 9, batch 15950, loss[loss=0.1373, simple_loss=0.2156, pruned_loss=0.02948, over 4901.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2139, pruned_loss=0.03418, over 973736.51 frames.], batch size: 17, lr: 2.36e-04 2022-05-06 12:32:54,616 INFO [train.py:715] (0/8) Epoch 9, batch 16000, loss[loss=0.1149, simple_loss=0.1892, pruned_loss=0.02035, over 4917.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2141, pruned_loss=0.03427, over 973373.37 frames.], batch size: 23, lr: 2.36e-04 2022-05-06 12:33:33,295 INFO [train.py:715] (0/8) Epoch 9, batch 16050, loss[loss=0.2032, simple_loss=0.2817, pruned_loss=0.0624, over 4965.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2143, pruned_loss=0.03439, over 972687.19 frames.], batch size: 15, lr: 2.36e-04 2022-05-06 12:34:12,505 INFO [train.py:715] (0/8) Epoch 9, batch 16100, loss[loss=0.1476, simple_loss=0.2234, pruned_loss=0.03588, over 4898.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2139, pruned_loss=0.03423, over 973584.78 frames.], batch size: 17, lr: 2.36e-04 2022-05-06 12:34:51,591 INFO [train.py:715] (0/8) Epoch 9, batch 16150, loss[loss=0.1683, simple_loss=0.2371, pruned_loss=0.04979, over 4823.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2147, pruned_loss=0.0344, over 973093.90 frames.], batch size: 15, lr: 2.36e-04 2022-05-06 12:35:30,767 INFO [train.py:715] (0/8) Epoch 9, batch 16200, loss[loss=0.1217, simple_loss=0.1908, pruned_loss=0.02635, over 4887.00 frames.], tot_loss[loss=0.1413, simple_loss=0.214, pruned_loss=0.0343, over 973810.78 frames.], batch size: 22, lr: 2.36e-04 2022-05-06 12:36:10,108 INFO [train.py:715] (0/8) Epoch 9, batch 16250, loss[loss=0.1549, simple_loss=0.2304, pruned_loss=0.03969, over 4976.00 frames.], tot_loss[loss=0.1419, simple_loss=0.2146, pruned_loss=0.03459, over 973645.17 frames.], batch size: 15, lr: 2.35e-04 2022-05-06 12:36:49,785 INFO [train.py:715] (0/8) Epoch 9, batch 16300, loss[loss=0.1175, simple_loss=0.1924, pruned_loss=0.02126, over 4777.00 frames.], tot_loss[loss=0.1419, simple_loss=0.2146, pruned_loss=0.03455, over 973857.79 frames.], batch size: 12, lr: 2.35e-04 2022-05-06 12:37:27,727 INFO [train.py:715] (0/8) Epoch 9, batch 
16350, loss[loss=0.1474, simple_loss=0.2266, pruned_loss=0.03405, over 4832.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2152, pruned_loss=0.03516, over 973344.61 frames.], batch size: 15, lr: 2.35e-04 2022-05-06 12:38:07,158 INFO [train.py:715] (0/8) Epoch 9, batch 16400, loss[loss=0.1571, simple_loss=0.2337, pruned_loss=0.04024, over 4983.00 frames.], tot_loss[loss=0.1425, simple_loss=0.215, pruned_loss=0.03499, over 973020.01 frames.], batch size: 20, lr: 2.35e-04 2022-05-06 12:38:47,052 INFO [train.py:715] (0/8) Epoch 9, batch 16450, loss[loss=0.1422, simple_loss=0.2086, pruned_loss=0.0379, over 4986.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2149, pruned_loss=0.03478, over 972165.90 frames.], batch size: 25, lr: 2.35e-04 2022-05-06 12:39:25,802 INFO [train.py:715] (0/8) Epoch 9, batch 16500, loss[loss=0.1534, simple_loss=0.2232, pruned_loss=0.04182, over 4831.00 frames.], tot_loss[loss=0.143, simple_loss=0.2154, pruned_loss=0.03528, over 971949.00 frames.], batch size: 25, lr: 2.35e-04 2022-05-06 12:40:04,381 INFO [train.py:715] (0/8) Epoch 9, batch 16550, loss[loss=0.1518, simple_loss=0.2298, pruned_loss=0.03688, over 4955.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2153, pruned_loss=0.03546, over 971928.32 frames.], batch size: 39, lr: 2.35e-04 2022-05-06 12:40:43,844 INFO [train.py:715] (0/8) Epoch 9, batch 16600, loss[loss=0.1345, simple_loss=0.2199, pruned_loss=0.0246, over 4795.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2145, pruned_loss=0.03519, over 971908.07 frames.], batch size: 17, lr: 2.35e-04 2022-05-06 12:41:23,434 INFO [train.py:715] (0/8) Epoch 9, batch 16650, loss[loss=0.1484, simple_loss=0.2194, pruned_loss=0.03875, over 4807.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2144, pruned_loss=0.03495, over 971692.26 frames.], batch size: 25, lr: 2.35e-04 2022-05-06 12:42:02,347 INFO [train.py:715] (0/8) Epoch 9, batch 16700, loss[loss=0.1985, simple_loss=0.243, pruned_loss=0.07703, over 4689.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2153, pruned_loss=0.0355, over 971417.89 frames.], batch size: 15, lr: 2.35e-04 2022-05-06 12:42:41,612 INFO [train.py:715] (0/8) Epoch 9, batch 16750, loss[loss=0.134, simple_loss=0.2121, pruned_loss=0.02794, over 4763.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2155, pruned_loss=0.03577, over 971126.18 frames.], batch size: 19, lr: 2.35e-04 2022-05-06 12:43:21,409 INFO [train.py:715] (0/8) Epoch 9, batch 16800, loss[loss=0.1162, simple_loss=0.1966, pruned_loss=0.01791, over 4933.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2155, pruned_loss=0.03574, over 971085.31 frames.], batch size: 23, lr: 2.35e-04 2022-05-06 12:44:01,037 INFO [train.py:715] (0/8) Epoch 9, batch 16850, loss[loss=0.1941, simple_loss=0.2543, pruned_loss=0.06695, over 4953.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2152, pruned_loss=0.03562, over 971507.20 frames.], batch size: 39, lr: 2.35e-04 2022-05-06 12:44:40,456 INFO [train.py:715] (0/8) Epoch 9, batch 16900, loss[loss=0.1262, simple_loss=0.1962, pruned_loss=0.02809, over 4759.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2152, pruned_loss=0.03574, over 971803.28 frames.], batch size: 14, lr: 2.35e-04 2022-05-06 12:45:20,531 INFO [train.py:715] (0/8) Epoch 9, batch 16950, loss[loss=0.1196, simple_loss=0.1938, pruned_loss=0.02266, over 4802.00 frames.], tot_loss[loss=0.143, simple_loss=0.215, pruned_loss=0.0355, over 972643.64 frames.], batch size: 17, lr: 2.35e-04 2022-05-06 12:46:00,234 INFO [train.py:715] (0/8) Epoch 9, batch 17000, loss[loss=0.1361, 
simple_loss=0.2075, pruned_loss=0.03234, over 4844.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2149, pruned_loss=0.03539, over 972478.60 frames.], batch size: 20, lr: 2.35e-04 2022-05-06 12:46:38,811 INFO [train.py:715] (0/8) Epoch 9, batch 17050, loss[loss=0.1441, simple_loss=0.2252, pruned_loss=0.03148, over 4916.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2157, pruned_loss=0.03539, over 972972.20 frames.], batch size: 29, lr: 2.35e-04 2022-05-06 12:47:18,384 INFO [train.py:715] (0/8) Epoch 9, batch 17100, loss[loss=0.141, simple_loss=0.2147, pruned_loss=0.03366, over 4819.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2158, pruned_loss=0.03559, over 972743.44 frames.], batch size: 27, lr: 2.35e-04 2022-05-06 12:47:58,062 INFO [train.py:715] (0/8) Epoch 9, batch 17150, loss[loss=0.1337, simple_loss=0.2138, pruned_loss=0.02682, over 4909.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2152, pruned_loss=0.03518, over 973433.23 frames.], batch size: 17, lr: 2.35e-04 2022-05-06 12:48:37,317 INFO [train.py:715] (0/8) Epoch 9, batch 17200, loss[loss=0.1718, simple_loss=0.2449, pruned_loss=0.04933, over 4852.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2144, pruned_loss=0.03489, over 974025.15 frames.], batch size: 32, lr: 2.35e-04 2022-05-06 12:49:15,989 INFO [train.py:715] (0/8) Epoch 9, batch 17250, loss[loss=0.151, simple_loss=0.2289, pruned_loss=0.03656, over 4919.00 frames.], tot_loss[loss=0.1419, simple_loss=0.2145, pruned_loss=0.0346, over 973133.36 frames.], batch size: 17, lr: 2.35e-04 2022-05-06 12:49:54,884 INFO [train.py:715] (0/8) Epoch 9, batch 17300, loss[loss=0.131, simple_loss=0.2129, pruned_loss=0.02456, over 4987.00 frames.], tot_loss[loss=0.1419, simple_loss=0.2146, pruned_loss=0.03462, over 972921.35 frames.], batch size: 25, lr: 2.35e-04 2022-05-06 12:50:33,969 INFO [train.py:715] (0/8) Epoch 9, batch 17350, loss[loss=0.1294, simple_loss=0.2118, pruned_loss=0.02349, over 4828.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2147, pruned_loss=0.03484, over 973318.09 frames.], batch size: 13, lr: 2.35e-04 2022-05-06 12:51:13,076 INFO [train.py:715] (0/8) Epoch 9, batch 17400, loss[loss=0.13, simple_loss=0.2092, pruned_loss=0.02535, over 4787.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2141, pruned_loss=0.03448, over 973858.19 frames.], batch size: 14, lr: 2.35e-04 2022-05-06 12:51:52,390 INFO [train.py:715] (0/8) Epoch 9, batch 17450, loss[loss=0.1157, simple_loss=0.1871, pruned_loss=0.02219, over 4918.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2139, pruned_loss=0.03475, over 973434.24 frames.], batch size: 29, lr: 2.35e-04 2022-05-06 12:52:31,600 INFO [train.py:715] (0/8) Epoch 9, batch 17500, loss[loss=0.1543, simple_loss=0.2183, pruned_loss=0.04509, over 4878.00 frames.], tot_loss[loss=0.141, simple_loss=0.2135, pruned_loss=0.03424, over 973079.57 frames.], batch size: 16, lr: 2.35e-04 2022-05-06 12:53:10,810 INFO [train.py:715] (0/8) Epoch 9, batch 17550, loss[loss=0.1425, simple_loss=0.2117, pruned_loss=0.03664, over 4768.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2128, pruned_loss=0.03378, over 973660.65 frames.], batch size: 14, lr: 2.35e-04 2022-05-06 12:53:49,890 INFO [train.py:715] (0/8) Epoch 9, batch 17600, loss[loss=0.1419, simple_loss=0.2177, pruned_loss=0.0331, over 4783.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2129, pruned_loss=0.03385, over 974458.79 frames.], batch size: 18, lr: 2.35e-04 2022-05-06 12:54:29,584 INFO [train.py:715] (0/8) Epoch 9, batch 17650, loss[loss=0.1266, simple_loss=0.197, pruned_loss=0.0281, 
over 4923.00 frames.], tot_loss[loss=0.141, simple_loss=0.2134, pruned_loss=0.03429, over 974059.01 frames.], batch size: 21, lr: 2.35e-04 2022-05-06 12:55:08,477 INFO [train.py:715] (0/8) Epoch 9, batch 17700, loss[loss=0.1302, simple_loss=0.2103, pruned_loss=0.02507, over 4905.00 frames.], tot_loss[loss=0.1408, simple_loss=0.213, pruned_loss=0.03425, over 973854.12 frames.], batch size: 17, lr: 2.35e-04 2022-05-06 12:55:47,741 INFO [train.py:715] (0/8) Epoch 9, batch 17750, loss[loss=0.1319, simple_loss=0.2107, pruned_loss=0.0265, over 4697.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2131, pruned_loss=0.03464, over 973164.95 frames.], batch size: 15, lr: 2.35e-04 2022-05-06 12:56:27,545 INFO [train.py:715] (0/8) Epoch 9, batch 17800, loss[loss=0.1521, simple_loss=0.2176, pruned_loss=0.0433, over 4741.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2132, pruned_loss=0.03478, over 973634.75 frames.], batch size: 16, lr: 2.35e-04 2022-05-06 12:57:06,520 INFO [train.py:715] (0/8) Epoch 9, batch 17850, loss[loss=0.1812, simple_loss=0.2592, pruned_loss=0.05162, over 4817.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2136, pruned_loss=0.03464, over 972858.14 frames.], batch size: 15, lr: 2.35e-04 2022-05-06 12:57:45,747 INFO [train.py:715] (0/8) Epoch 9, batch 17900, loss[loss=0.141, simple_loss=0.2039, pruned_loss=0.03901, over 4834.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2129, pruned_loss=0.0343, over 973138.08 frames.], batch size: 13, lr: 2.35e-04 2022-05-06 12:58:25,611 INFO [train.py:715] (0/8) Epoch 9, batch 17950, loss[loss=0.1264, simple_loss=0.1975, pruned_loss=0.02765, over 4795.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2129, pruned_loss=0.03423, over 972380.77 frames.], batch size: 25, lr: 2.35e-04 2022-05-06 12:59:04,967 INFO [train.py:715] (0/8) Epoch 9, batch 18000, loss[loss=0.1333, simple_loss=0.1947, pruned_loss=0.03593, over 4842.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2134, pruned_loss=0.03472, over 971189.40 frames.], batch size: 30, lr: 2.35e-04 2022-05-06 12:59:04,968 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 12:59:14,501 INFO [train.py:742] (0/8) Epoch 9, validation: loss=0.1068, simple_loss=0.1912, pruned_loss=0.01121, over 914524.00 frames. 
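Within this excerpt the "Computing validation loss" entries recur every 3,000 training batches (batches 9000, 12000, 15000 and 18000 above), and each reports its loss over the same 914,524 dev frames, so the dev set is fixed and scanned in full each time. The sketch below shows that cadence in outline; compute_loss and valid_dl are illustrative names, not the actual train.py API.

import torch

def run_validation(model, valid_dl, compute_loss) -> float:
    # Frame-weighted average loss over one full pass of the dev loader.
    model.eval()
    tot_loss, tot_frames = 0.0, 0.0
    with torch.no_grad():
        for batch in valid_dl:
            loss, num_frames = compute_loss(model, batch)
            tot_loss += float(loss) * num_frames
            tot_frames += num_frames
    model.train()
    return tot_loss / max(tot_frames, 1.0)

# In the training loop (illustrative cadence, matching the entries above):
# if batch_idx % 3000 == 0:
#     valid_loss = run_validation(model, valid_dl, compute_loss)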
2022-05-06 12:59:53,952 INFO [train.py:715] (0/8) Epoch 9, batch 18050, loss[loss=0.1529, simple_loss=0.2131, pruned_loss=0.04631, over 4750.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2133, pruned_loss=0.03466, over 971316.15 frames.], batch size: 16, lr: 2.35e-04 2022-05-06 13:00:33,770 INFO [train.py:715] (0/8) Epoch 9, batch 18100, loss[loss=0.1669, simple_loss=0.2456, pruned_loss=0.04412, over 4929.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2142, pruned_loss=0.03468, over 971908.79 frames.], batch size: 18, lr: 2.35e-04 2022-05-06 13:01:13,061 INFO [train.py:715] (0/8) Epoch 9, batch 18150, loss[loss=0.161, simple_loss=0.2338, pruned_loss=0.04412, over 4869.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2146, pruned_loss=0.03485, over 972204.39 frames.], batch size: 20, lr: 2.35e-04 2022-05-06 13:01:52,670 INFO [train.py:715] (0/8) Epoch 9, batch 18200, loss[loss=0.151, simple_loss=0.2297, pruned_loss=0.0362, over 4909.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2144, pruned_loss=0.03458, over 972251.87 frames.], batch size: 18, lr: 2.35e-04 2022-05-06 13:02:31,898 INFO [train.py:715] (0/8) Epoch 9, batch 18250, loss[loss=0.2078, simple_loss=0.2669, pruned_loss=0.07429, over 4773.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2151, pruned_loss=0.03471, over 971685.53 frames.], batch size: 17, lr: 2.35e-04 2022-05-06 13:03:11,073 INFO [train.py:715] (0/8) Epoch 9, batch 18300, loss[loss=0.1797, simple_loss=0.2511, pruned_loss=0.05416, over 4953.00 frames.], tot_loss[loss=0.1433, simple_loss=0.2162, pruned_loss=0.03522, over 971441.03 frames.], batch size: 15, lr: 2.35e-04 2022-05-06 13:03:50,428 INFO [train.py:715] (0/8) Epoch 9, batch 18350, loss[loss=0.13, simple_loss=0.2047, pruned_loss=0.02765, over 4847.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2152, pruned_loss=0.03484, over 970886.72 frames.], batch size: 20, lr: 2.35e-04 2022-05-06 13:04:29,592 INFO [train.py:715] (0/8) Epoch 9, batch 18400, loss[loss=0.1635, simple_loss=0.2369, pruned_loss=0.04506, over 4808.00 frames.], tot_loss[loss=0.143, simple_loss=0.2157, pruned_loss=0.0351, over 972002.45 frames.], batch size: 25, lr: 2.35e-04 2022-05-06 13:05:08,635 INFO [train.py:715] (0/8) Epoch 9, batch 18450, loss[loss=0.1672, simple_loss=0.2346, pruned_loss=0.0499, over 4975.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2153, pruned_loss=0.03512, over 972565.39 frames.], batch size: 35, lr: 2.35e-04 2022-05-06 13:05:47,601 INFO [train.py:715] (0/8) Epoch 9, batch 18500, loss[loss=0.1529, simple_loss=0.2256, pruned_loss=0.04005, over 4967.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2151, pruned_loss=0.0349, over 972971.57 frames.], batch size: 24, lr: 2.35e-04 2022-05-06 13:06:26,993 INFO [train.py:715] (0/8) Epoch 9, batch 18550, loss[loss=0.1451, simple_loss=0.2067, pruned_loss=0.04179, over 4777.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2142, pruned_loss=0.03457, over 973517.87 frames.], batch size: 18, lr: 2.35e-04 2022-05-06 13:07:06,065 INFO [train.py:715] (0/8) Epoch 9, batch 18600, loss[loss=0.1747, simple_loss=0.238, pruned_loss=0.05571, over 4705.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2146, pruned_loss=0.03486, over 972951.06 frames.], batch size: 15, lr: 2.35e-04 2022-05-06 13:07:44,915 INFO [train.py:715] (0/8) Epoch 9, batch 18650, loss[loss=0.1525, simple_loss=0.2146, pruned_loss=0.04522, over 4840.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2137, pruned_loss=0.03436, over 972215.31 frames.], batch size: 30, lr: 2.35e-04 2022-05-06 13:08:24,470 INFO 
[train.py:715] (0/8) Epoch 9, batch 18700, loss[loss=0.1377, simple_loss=0.2142, pruned_loss=0.03057, over 4909.00 frames.], tot_loss[loss=0.1411, simple_loss=0.214, pruned_loss=0.0341, over 972079.11 frames.], batch size: 17, lr: 2.35e-04 2022-05-06 13:09:03,185 INFO [train.py:715] (0/8) Epoch 9, batch 18750, loss[loss=0.137, simple_loss=0.207, pruned_loss=0.03348, over 4970.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2132, pruned_loss=0.03403, over 971785.30 frames.], batch size: 14, lr: 2.35e-04 2022-05-06 13:09:42,757 INFO [train.py:715] (0/8) Epoch 9, batch 18800, loss[loss=0.1517, simple_loss=0.2243, pruned_loss=0.03952, over 4882.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2127, pruned_loss=0.03377, over 972095.40 frames.], batch size: 22, lr: 2.35e-04 2022-05-06 13:10:21,583 INFO [train.py:715] (0/8) Epoch 9, batch 18850, loss[loss=0.1116, simple_loss=0.1858, pruned_loss=0.01866, over 4944.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2138, pruned_loss=0.03418, over 971575.82 frames.], batch size: 29, lr: 2.35e-04 2022-05-06 13:11:00,814 INFO [train.py:715] (0/8) Epoch 9, batch 18900, loss[loss=0.1734, simple_loss=0.2231, pruned_loss=0.06186, over 4715.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2136, pruned_loss=0.03427, over 971028.00 frames.], batch size: 15, lr: 2.35e-04 2022-05-06 13:11:40,162 INFO [train.py:715] (0/8) Epoch 9, batch 18950, loss[loss=0.128, simple_loss=0.2056, pruned_loss=0.02521, over 4947.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2139, pruned_loss=0.03389, over 971787.59 frames.], batch size: 23, lr: 2.35e-04 2022-05-06 13:12:18,868 INFO [train.py:715] (0/8) Epoch 9, batch 19000, loss[loss=0.143, simple_loss=0.2184, pruned_loss=0.03382, over 4874.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2132, pruned_loss=0.03354, over 973192.30 frames.], batch size: 16, lr: 2.35e-04 2022-05-06 13:12:58,958 INFO [train.py:715] (0/8) Epoch 9, batch 19050, loss[loss=0.1287, simple_loss=0.2015, pruned_loss=0.028, over 4794.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2139, pruned_loss=0.03397, over 972501.33 frames.], batch size: 21, lr: 2.34e-04 2022-05-06 13:13:38,425 INFO [train.py:715] (0/8) Epoch 9, batch 19100, loss[loss=0.1444, simple_loss=0.2156, pruned_loss=0.03665, over 4903.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2136, pruned_loss=0.03404, over 972919.63 frames.], batch size: 17, lr: 2.34e-04 2022-05-06 13:14:17,258 INFO [train.py:715] (0/8) Epoch 9, batch 19150, loss[loss=0.1781, simple_loss=0.2405, pruned_loss=0.05787, over 4804.00 frames.], tot_loss[loss=0.141, simple_loss=0.2136, pruned_loss=0.03416, over 973049.72 frames.], batch size: 24, lr: 2.34e-04 2022-05-06 13:14:57,085 INFO [train.py:715] (0/8) Epoch 9, batch 19200, loss[loss=0.1215, simple_loss=0.194, pruned_loss=0.02452, over 4960.00 frames.], tot_loss[loss=0.141, simple_loss=0.2138, pruned_loss=0.03413, over 973549.14 frames.], batch size: 24, lr: 2.34e-04 2022-05-06 13:15:36,589 INFO [train.py:715] (0/8) Epoch 9, batch 19250, loss[loss=0.1241, simple_loss=0.1947, pruned_loss=0.02676, over 4822.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2129, pruned_loss=0.03374, over 972946.39 frames.], batch size: 13, lr: 2.34e-04 2022-05-06 13:16:15,484 INFO [train.py:715] (0/8) Epoch 9, batch 19300, loss[loss=0.1587, simple_loss=0.221, pruned_loss=0.04824, over 4814.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2136, pruned_loss=0.03404, over 971114.41 frames.], batch size: 13, lr: 2.34e-04 2022-05-06 13:16:54,060 INFO [train.py:715] (0/8) Epoch 9, batch 19350, 
loss[loss=0.1483, simple_loss=0.2199, pruned_loss=0.03838, over 4766.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2125, pruned_loss=0.03384, over 970387.40 frames.], batch size: 18, lr: 2.34e-04 2022-05-06 13:17:34,090 INFO [train.py:715] (0/8) Epoch 9, batch 19400, loss[loss=0.1619, simple_loss=0.2298, pruned_loss=0.04706, over 4860.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2135, pruned_loss=0.034, over 970697.47 frames.], batch size: 30, lr: 2.34e-04 2022-05-06 13:18:13,121 INFO [train.py:715] (0/8) Epoch 9, batch 19450, loss[loss=0.1423, simple_loss=0.2061, pruned_loss=0.03921, over 4834.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2135, pruned_loss=0.03417, over 971030.11 frames.], batch size: 30, lr: 2.34e-04 2022-05-06 13:18:51,810 INFO [train.py:715] (0/8) Epoch 9, batch 19500, loss[loss=0.1614, simple_loss=0.2188, pruned_loss=0.05194, over 4635.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2133, pruned_loss=0.03395, over 970812.07 frames.], batch size: 13, lr: 2.34e-04 2022-05-06 13:19:30,942 INFO [train.py:715] (0/8) Epoch 9, batch 19550, loss[loss=0.1357, simple_loss=0.2065, pruned_loss=0.03241, over 4937.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2132, pruned_loss=0.034, over 970282.99 frames.], batch size: 35, lr: 2.34e-04 2022-05-06 13:20:10,204 INFO [train.py:715] (0/8) Epoch 9, batch 19600, loss[loss=0.1347, simple_loss=0.2155, pruned_loss=0.02696, over 4962.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2139, pruned_loss=0.03396, over 970550.63 frames.], batch size: 24, lr: 2.34e-04 2022-05-06 13:20:48,780 INFO [train.py:715] (0/8) Epoch 9, batch 19650, loss[loss=0.1263, simple_loss=0.1985, pruned_loss=0.02708, over 4862.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2134, pruned_loss=0.03346, over 970477.21 frames.], batch size: 13, lr: 2.34e-04 2022-05-06 13:21:27,267 INFO [train.py:715] (0/8) Epoch 9, batch 19700, loss[loss=0.1427, simple_loss=0.2117, pruned_loss=0.03685, over 4767.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2136, pruned_loss=0.03369, over 970444.75 frames.], batch size: 14, lr: 2.34e-04 2022-05-06 13:22:07,182 INFO [train.py:715] (0/8) Epoch 9, batch 19750, loss[loss=0.1475, simple_loss=0.2182, pruned_loss=0.03841, over 4832.00 frames.], tot_loss[loss=0.1409, simple_loss=0.214, pruned_loss=0.03385, over 971168.88 frames.], batch size: 25, lr: 2.34e-04 2022-05-06 13:22:46,853 INFO [train.py:715] (0/8) Epoch 9, batch 19800, loss[loss=0.1691, simple_loss=0.2318, pruned_loss=0.05324, over 4833.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2136, pruned_loss=0.03391, over 970634.42 frames.], batch size: 13, lr: 2.34e-04 2022-05-06 13:23:26,648 INFO [train.py:715] (0/8) Epoch 9, batch 19850, loss[loss=0.1371, simple_loss=0.2056, pruned_loss=0.03425, over 4965.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2141, pruned_loss=0.03422, over 971019.53 frames.], batch size: 35, lr: 2.34e-04 2022-05-06 13:24:06,293 INFO [train.py:715] (0/8) Epoch 9, batch 19900, loss[loss=0.1168, simple_loss=0.1894, pruned_loss=0.02212, over 4824.00 frames.], tot_loss[loss=0.1419, simple_loss=0.2148, pruned_loss=0.03457, over 970966.01 frames.], batch size: 26, lr: 2.34e-04 2022-05-06 13:24:45,453 INFO [train.py:715] (0/8) Epoch 9, batch 19950, loss[loss=0.1214, simple_loss=0.2051, pruned_loss=0.01886, over 4759.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2143, pruned_loss=0.03441, over 971675.36 frames.], batch size: 19, lr: 2.34e-04 2022-05-06 13:25:24,505 INFO [train.py:715] (0/8) Epoch 9, batch 20000, loss[loss=0.1393, 
simple_loss=0.2109, pruned_loss=0.03388, over 4774.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2144, pruned_loss=0.03446, over 971437.71 frames.], batch size: 18, lr: 2.34e-04 2022-05-06 13:26:02,949 INFO [train.py:715] (0/8) Epoch 9, batch 20050, loss[loss=0.1229, simple_loss=0.1964, pruned_loss=0.02471, over 4972.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2149, pruned_loss=0.03495, over 972018.58 frames.], batch size: 25, lr: 2.34e-04 2022-05-06 13:26:42,421 INFO [train.py:715] (0/8) Epoch 9, batch 20100, loss[loss=0.1172, simple_loss=0.1931, pruned_loss=0.02065, over 4926.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2147, pruned_loss=0.03507, over 972131.01 frames.], batch size: 29, lr: 2.34e-04 2022-05-06 13:27:21,486 INFO [train.py:715] (0/8) Epoch 9, batch 20150, loss[loss=0.1194, simple_loss=0.1959, pruned_loss=0.02149, over 4813.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2142, pruned_loss=0.03457, over 971164.88 frames.], batch size: 21, lr: 2.34e-04 2022-05-06 13:27:59,970 INFO [train.py:715] (0/8) Epoch 9, batch 20200, loss[loss=0.1348, simple_loss=0.2114, pruned_loss=0.02905, over 4829.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2141, pruned_loss=0.0347, over 970953.73 frames.], batch size: 27, lr: 2.34e-04 2022-05-06 13:28:39,473 INFO [train.py:715] (0/8) Epoch 9, batch 20250, loss[loss=0.1125, simple_loss=0.1824, pruned_loss=0.02133, over 4813.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2147, pruned_loss=0.03495, over 972055.50 frames.], batch size: 12, lr: 2.34e-04 2022-05-06 13:29:18,327 INFO [train.py:715] (0/8) Epoch 9, batch 20300, loss[loss=0.1311, simple_loss=0.2161, pruned_loss=0.02307, over 4883.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2152, pruned_loss=0.03512, over 971567.46 frames.], batch size: 16, lr: 2.34e-04 2022-05-06 13:29:57,720 INFO [train.py:715] (0/8) Epoch 9, batch 20350, loss[loss=0.1187, simple_loss=0.1905, pruned_loss=0.02346, over 4828.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2147, pruned_loss=0.03476, over 971463.83 frames.], batch size: 15, lr: 2.34e-04 2022-05-06 13:30:37,200 INFO [train.py:715] (0/8) Epoch 9, batch 20400, loss[loss=0.1161, simple_loss=0.1946, pruned_loss=0.01878, over 4646.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2149, pruned_loss=0.03482, over 970465.86 frames.], batch size: 13, lr: 2.34e-04 2022-05-06 13:31:17,089 INFO [train.py:715] (0/8) Epoch 9, batch 20450, loss[loss=0.1362, simple_loss=0.217, pruned_loss=0.02767, over 4906.00 frames.], tot_loss[loss=0.1432, simple_loss=0.216, pruned_loss=0.03521, over 970909.39 frames.], batch size: 19, lr: 2.34e-04 2022-05-06 13:31:56,596 INFO [train.py:715] (0/8) Epoch 9, batch 20500, loss[loss=0.1277, simple_loss=0.2013, pruned_loss=0.02705, over 4992.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2145, pruned_loss=0.03484, over 971641.88 frames.], batch size: 14, lr: 2.34e-04 2022-05-06 13:32:35,667 INFO [train.py:715] (0/8) Epoch 9, batch 20550, loss[loss=0.1416, simple_loss=0.2137, pruned_loss=0.03475, over 4866.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2149, pruned_loss=0.03463, over 971880.85 frames.], batch size: 16, lr: 2.34e-04 2022-05-06 13:33:14,859 INFO [train.py:715] (0/8) Epoch 9, batch 20600, loss[loss=0.1491, simple_loss=0.2201, pruned_loss=0.03898, over 4766.00 frames.], tot_loss[loss=0.143, simple_loss=0.2156, pruned_loss=0.03515, over 971508.00 frames.], batch size: 18, lr: 2.34e-04 2022-05-06 13:33:53,309 INFO [train.py:715] (0/8) Epoch 9, batch 20650, loss[loss=0.1324, simple_loss=0.1973, 
pruned_loss=0.0337, over 4872.00 frames.], tot_loss[loss=0.142, simple_loss=0.2146, pruned_loss=0.03472, over 971365.52 frames.], batch size: 39, lr: 2.34e-04 2022-05-06 13:34:32,411 INFO [train.py:715] (0/8) Epoch 9, batch 20700, loss[loss=0.1746, simple_loss=0.2387, pruned_loss=0.05528, over 4844.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2149, pruned_loss=0.03492, over 971472.13 frames.], batch size: 15, lr: 2.34e-04 2022-05-06 13:35:11,245 INFO [train.py:715] (0/8) Epoch 9, batch 20750, loss[loss=0.143, simple_loss=0.2089, pruned_loss=0.03855, over 4745.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2151, pruned_loss=0.03503, over 972067.49 frames.], batch size: 16, lr: 2.34e-04 2022-05-06 13:35:50,838 INFO [train.py:715] (0/8) Epoch 9, batch 20800, loss[loss=0.1176, simple_loss=0.197, pruned_loss=0.01914, over 4803.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2147, pruned_loss=0.03452, over 972282.74 frames.], batch size: 21, lr: 2.34e-04 2022-05-06 13:36:30,208 INFO [train.py:715] (0/8) Epoch 9, batch 20850, loss[loss=0.1208, simple_loss=0.1866, pruned_loss=0.02752, over 4846.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2141, pruned_loss=0.0346, over 972793.06 frames.], batch size: 13, lr: 2.34e-04 2022-05-06 13:37:09,647 INFO [train.py:715] (0/8) Epoch 9, batch 20900, loss[loss=0.1258, simple_loss=0.2029, pruned_loss=0.02437, over 4768.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2142, pruned_loss=0.0346, over 972253.56 frames.], batch size: 18, lr: 2.34e-04 2022-05-06 13:37:49,142 INFO [train.py:715] (0/8) Epoch 9, batch 20950, loss[loss=0.1904, simple_loss=0.2432, pruned_loss=0.06883, over 4853.00 frames.], tot_loss[loss=0.142, simple_loss=0.2145, pruned_loss=0.03472, over 971273.52 frames.], batch size: 30, lr: 2.34e-04 2022-05-06 13:38:28,445 INFO [train.py:715] (0/8) Epoch 9, batch 21000, loss[loss=0.1609, simple_loss=0.2381, pruned_loss=0.0419, over 4974.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2139, pruned_loss=0.03478, over 971321.37 frames.], batch size: 15, lr: 2.34e-04 2022-05-06 13:38:28,446 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 13:38:38,084 INFO [train.py:742] (0/8) Epoch 9, validation: loss=0.1069, simple_loss=0.1912, pruned_loss=0.01129, over 914524.00 frames. 
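A note on the logged quantities: in the entries above, the reported loss is numerically consistent with 0.5 * simple_loss + pruned_loss; the 0.5 weight is inferred from the logged values themselves, not taken from the training code. A minimal Python check against the batch 21000 tot_loss and the validation entry above:

    # Minimal check: the logged "loss" appears to equal
    # simple_loss_scale * simple_loss + pruned_loss with a scale of 0.5.
    # The 0.5 is inferred from the log itself, not quoted from train.py.
    SIMPLE_LOSS_SCALE = 0.5

    def combined_loss(simple_loss: float, pruned_loss: float) -> float:
        return SIMPLE_LOSS_SCALE * simple_loss + pruned_loss

    # tot_loss of the "Epoch 9, batch 21000" entry above.
    assert abs(combined_loss(0.2139, 0.03478) - 0.1417) < 5e-4
    # The "Epoch 9, validation" entry above.
    assert abs(combined_loss(0.1912, 0.01129) - 0.1069) < 5e-4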
2022-05-06 13:39:17,238 INFO [train.py:715] (0/8) Epoch 9, batch 21050, loss[loss=0.1212, simple_loss=0.2026, pruned_loss=0.01989, over 4824.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2145, pruned_loss=0.03486, over 971501.50 frames.], batch size: 15, lr: 2.34e-04 2022-05-06 13:39:56,156 INFO [train.py:715] (0/8) Epoch 9, batch 21100, loss[loss=0.1434, simple_loss=0.2204, pruned_loss=0.03326, over 4976.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2139, pruned_loss=0.03451, over 972514.41 frames.], batch size: 24, lr: 2.34e-04 2022-05-06 13:40:35,519 INFO [train.py:715] (0/8) Epoch 9, batch 21150, loss[loss=0.1207, simple_loss=0.1845, pruned_loss=0.02847, over 4811.00 frames.], tot_loss[loss=0.141, simple_loss=0.2134, pruned_loss=0.03425, over 973253.53 frames.], batch size: 12, lr: 2.34e-04 2022-05-06 13:41:14,528 INFO [train.py:715] (0/8) Epoch 9, batch 21200, loss[loss=0.1399, simple_loss=0.2091, pruned_loss=0.03536, over 4775.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2138, pruned_loss=0.03469, over 973343.07 frames.], batch size: 12, lr: 2.34e-04 2022-05-06 13:41:54,095 INFO [train.py:715] (0/8) Epoch 9, batch 21250, loss[loss=0.1401, simple_loss=0.2113, pruned_loss=0.03447, over 4900.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2132, pruned_loss=0.03462, over 973958.43 frames.], batch size: 17, lr: 2.34e-04 2022-05-06 13:42:32,486 INFO [train.py:715] (0/8) Epoch 9, batch 21300, loss[loss=0.1571, simple_loss=0.2255, pruned_loss=0.04435, over 4960.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2132, pruned_loss=0.03458, over 973776.58 frames.], batch size: 24, lr: 2.34e-04 2022-05-06 13:43:11,100 INFO [train.py:715] (0/8) Epoch 9, batch 21350, loss[loss=0.1313, simple_loss=0.2097, pruned_loss=0.02643, over 4745.00 frames.], tot_loss[loss=0.141, simple_loss=0.2133, pruned_loss=0.03439, over 973374.33 frames.], batch size: 19, lr: 2.34e-04 2022-05-06 13:43:50,027 INFO [train.py:715] (0/8) Epoch 9, batch 21400, loss[loss=0.1319, simple_loss=0.204, pruned_loss=0.02993, over 4902.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2135, pruned_loss=0.03467, over 974107.55 frames.], batch size: 18, lr: 2.34e-04 2022-05-06 13:44:28,772 INFO [train.py:715] (0/8) Epoch 9, batch 21450, loss[loss=0.1281, simple_loss=0.1957, pruned_loss=0.03026, over 4781.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2139, pruned_loss=0.03471, over 973552.95 frames.], batch size: 18, lr: 2.34e-04 2022-05-06 13:45:07,166 INFO [train.py:715] (0/8) Epoch 9, batch 21500, loss[loss=0.1693, simple_loss=0.2349, pruned_loss=0.05189, over 4914.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2146, pruned_loss=0.0349, over 973439.56 frames.], batch size: 19, lr: 2.34e-04 2022-05-06 13:45:46,283 INFO [train.py:715] (0/8) Epoch 9, batch 21550, loss[loss=0.1439, simple_loss=0.2135, pruned_loss=0.03716, over 4986.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2144, pruned_loss=0.03521, over 973750.21 frames.], batch size: 26, lr: 2.34e-04 2022-05-06 13:46:25,000 INFO [train.py:715] (0/8) Epoch 9, batch 21600, loss[loss=0.1452, simple_loss=0.232, pruned_loss=0.02917, over 4773.00 frames.], tot_loss[loss=0.1425, simple_loss=0.215, pruned_loss=0.03498, over 973044.16 frames.], batch size: 19, lr: 2.34e-04 2022-05-06 13:47:04,088 INFO [train.py:715] (0/8) Epoch 9, batch 21650, loss[loss=0.2069, simple_loss=0.2842, pruned_loss=0.06484, over 4941.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2147, pruned_loss=0.03485, over 973245.48 frames.], batch size: 29, lr: 2.34e-04 2022-05-06 13:47:43,363 INFO 
[train.py:715] (0/8) Epoch 9, batch 21700, loss[loss=0.1642, simple_loss=0.238, pruned_loss=0.04518, over 4814.00 frames.], tot_loss[loss=0.1419, simple_loss=0.2146, pruned_loss=0.03467, over 973074.90 frames.], batch size: 27, lr: 2.34e-04 2022-05-06 13:48:22,454 INFO [train.py:715] (0/8) Epoch 9, batch 21750, loss[loss=0.1295, simple_loss=0.2025, pruned_loss=0.02827, over 4697.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2133, pruned_loss=0.03388, over 973238.72 frames.], batch size: 15, lr: 2.34e-04 2022-05-06 13:49:01,561 INFO [train.py:715] (0/8) Epoch 9, batch 21800, loss[loss=0.1467, simple_loss=0.1988, pruned_loss=0.04727, over 4783.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2135, pruned_loss=0.0341, over 973898.12 frames.], batch size: 12, lr: 2.34e-04 2022-05-06 13:49:41,088 INFO [train.py:715] (0/8) Epoch 9, batch 21850, loss[loss=0.1155, simple_loss=0.188, pruned_loss=0.02151, over 4929.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2132, pruned_loss=0.03404, over 973555.31 frames.], batch size: 29, lr: 2.34e-04 2022-05-06 13:50:20,438 INFO [train.py:715] (0/8) Epoch 9, batch 21900, loss[loss=0.1609, simple_loss=0.2235, pruned_loss=0.04915, over 4687.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2132, pruned_loss=0.03392, over 972007.96 frames.], batch size: 15, lr: 2.33e-04 2022-05-06 13:50:59,008 INFO [train.py:715] (0/8) Epoch 9, batch 21950, loss[loss=0.1743, simple_loss=0.2402, pruned_loss=0.05423, over 4874.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2136, pruned_loss=0.03398, over 971797.31 frames.], batch size: 16, lr: 2.33e-04 2022-05-06 13:51:37,910 INFO [train.py:715] (0/8) Epoch 9, batch 22000, loss[loss=0.1866, simple_loss=0.251, pruned_loss=0.06108, over 4695.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2133, pruned_loss=0.03418, over 971836.90 frames.], batch size: 15, lr: 2.33e-04 2022-05-06 13:52:16,811 INFO [train.py:715] (0/8) Epoch 9, batch 22050, loss[loss=0.113, simple_loss=0.195, pruned_loss=0.01547, over 4955.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2138, pruned_loss=0.03446, over 972229.17 frames.], batch size: 21, lr: 2.33e-04 2022-05-06 13:52:56,514 INFO [train.py:715] (0/8) Epoch 9, batch 22100, loss[loss=0.1384, simple_loss=0.2018, pruned_loss=0.03752, over 4981.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2145, pruned_loss=0.03458, over 971638.55 frames.], batch size: 25, lr: 2.33e-04 2022-05-06 13:53:35,798 INFO [train.py:715] (0/8) Epoch 9, batch 22150, loss[loss=0.1527, simple_loss=0.226, pruned_loss=0.03972, over 4880.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2141, pruned_loss=0.03457, over 971434.91 frames.], batch size: 22, lr: 2.33e-04 2022-05-06 13:54:14,967 INFO [train.py:715] (0/8) Epoch 9, batch 22200, loss[loss=0.1256, simple_loss=0.1973, pruned_loss=0.02692, over 4753.00 frames.], tot_loss[loss=0.141, simple_loss=0.2134, pruned_loss=0.03428, over 971860.45 frames.], batch size: 16, lr: 2.33e-04 2022-05-06 13:54:54,446 INFO [train.py:715] (0/8) Epoch 9, batch 22250, loss[loss=0.1465, simple_loss=0.2247, pruned_loss=0.03416, over 4949.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2141, pruned_loss=0.03459, over 973492.41 frames.], batch size: 21, lr: 2.33e-04 2022-05-06 13:55:33,231 INFO [train.py:715] (0/8) Epoch 9, batch 22300, loss[loss=0.132, simple_loss=0.2052, pruned_loss=0.02938, over 4840.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2135, pruned_loss=0.03403, over 973054.66 frames.], batch size: 32, lr: 2.33e-04 2022-05-06 13:56:11,828 INFO [train.py:715] (0/8) Epoch 9, batch 
22350, loss[loss=0.1641, simple_loss=0.2385, pruned_loss=0.04479, over 4860.00 frames.], tot_loss[loss=0.1419, simple_loss=0.2147, pruned_loss=0.03455, over 973071.62 frames.], batch size: 20, lr: 2.33e-04 2022-05-06 13:56:50,719 INFO [train.py:715] (0/8) Epoch 9, batch 22400, loss[loss=0.1589, simple_loss=0.2337, pruned_loss=0.04202, over 4959.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2145, pruned_loss=0.03485, over 973621.94 frames.], batch size: 39, lr: 2.33e-04 2022-05-06 13:57:29,428 INFO [train.py:715] (0/8) Epoch 9, batch 22450, loss[loss=0.147, simple_loss=0.2207, pruned_loss=0.03662, over 4692.00 frames.], tot_loss[loss=0.142, simple_loss=0.2148, pruned_loss=0.03461, over 972694.52 frames.], batch size: 15, lr: 2.33e-04 2022-05-06 13:58:08,125 INFO [train.py:715] (0/8) Epoch 9, batch 22500, loss[loss=0.1433, simple_loss=0.2185, pruned_loss=0.03403, over 4928.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2142, pruned_loss=0.03428, over 971748.23 frames.], batch size: 29, lr: 2.33e-04 2022-05-06 13:58:47,014 INFO [train.py:715] (0/8) Epoch 9, batch 22550, loss[loss=0.1582, simple_loss=0.2216, pruned_loss=0.04735, over 4896.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2144, pruned_loss=0.03444, over 971599.25 frames.], batch size: 19, lr: 2.33e-04 2022-05-06 13:59:26,034 INFO [train.py:715] (0/8) Epoch 9, batch 22600, loss[loss=0.141, simple_loss=0.218, pruned_loss=0.03201, over 4695.00 frames.], tot_loss[loss=0.142, simple_loss=0.215, pruned_loss=0.03456, over 971704.79 frames.], batch size: 15, lr: 2.33e-04 2022-05-06 14:00:05,202 INFO [train.py:715] (0/8) Epoch 9, batch 22650, loss[loss=0.1535, simple_loss=0.2221, pruned_loss=0.04243, over 4698.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2154, pruned_loss=0.03464, over 972662.07 frames.], batch size: 15, lr: 2.33e-04 2022-05-06 14:00:44,241 INFO [train.py:715] (0/8) Epoch 9, batch 22700, loss[loss=0.1482, simple_loss=0.2137, pruned_loss=0.04136, over 4866.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2157, pruned_loss=0.03484, over 970987.05 frames.], batch size: 32, lr: 2.33e-04 2022-05-06 14:00:55,097 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-336000.pt 2022-05-06 14:01:26,076 INFO [train.py:715] (0/8) Epoch 9, batch 22750, loss[loss=0.1369, simple_loss=0.1973, pruned_loss=0.0382, over 4896.00 frames.], tot_loss[loss=0.1418, simple_loss=0.215, pruned_loss=0.03427, over 971688.80 frames.], batch size: 19, lr: 2.33e-04 2022-05-06 14:02:04,850 INFO [train.py:715] (0/8) Epoch 9, batch 22800, loss[loss=0.1353, simple_loss=0.2135, pruned_loss=0.02861, over 4916.00 frames.], tot_loss[loss=0.141, simple_loss=0.2144, pruned_loss=0.03376, over 971819.97 frames.], batch size: 17, lr: 2.33e-04 2022-05-06 14:02:44,151 INFO [train.py:715] (0/8) Epoch 9, batch 22850, loss[loss=0.1352, simple_loss=0.2085, pruned_loss=0.03101, over 4958.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2152, pruned_loss=0.03405, over 971523.99 frames.], batch size: 15, lr: 2.33e-04 2022-05-06 14:03:22,717 INFO [train.py:715] (0/8) Epoch 9, batch 22900, loss[loss=0.1474, simple_loss=0.2108, pruned_loss=0.04206, over 4785.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2146, pruned_loss=0.03411, over 971387.78 frames.], batch size: 12, lr: 2.33e-04 2022-05-06 14:04:01,801 INFO [train.py:715] (0/8) Epoch 9, batch 22950, loss[loss=0.1573, simple_loss=0.2292, pruned_loss=0.0427, over 4879.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2145, pruned_loss=0.03443, over 971503.80 frames.], 
batch size: 38, lr: 2.33e-04 2022-05-06 14:04:40,855 INFO [train.py:715] (0/8) Epoch 9, batch 23000, loss[loss=0.1213, simple_loss=0.205, pruned_loss=0.01884, over 4956.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2138, pruned_loss=0.03381, over 970936.08 frames.], batch size: 24, lr: 2.33e-04 2022-05-06 14:05:20,246 INFO [train.py:715] (0/8) Epoch 9, batch 23050, loss[loss=0.1613, simple_loss=0.2319, pruned_loss=0.04535, over 4774.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2146, pruned_loss=0.03404, over 970794.18 frames.], batch size: 18, lr: 2.33e-04 2022-05-06 14:05:59,522 INFO [train.py:715] (0/8) Epoch 9, batch 23100, loss[loss=0.1152, simple_loss=0.1958, pruned_loss=0.0173, over 4900.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2135, pruned_loss=0.03343, over 971765.42 frames.], batch size: 17, lr: 2.33e-04 2022-05-06 14:06:38,544 INFO [train.py:715] (0/8) Epoch 9, batch 23150, loss[loss=0.1514, simple_loss=0.2262, pruned_loss=0.03828, over 4803.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2134, pruned_loss=0.03411, over 971617.86 frames.], batch size: 14, lr: 2.33e-04 2022-05-06 14:07:18,158 INFO [train.py:715] (0/8) Epoch 9, batch 23200, loss[loss=0.1172, simple_loss=0.1921, pruned_loss=0.02115, over 4813.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2134, pruned_loss=0.03423, over 971565.99 frames.], batch size: 25, lr: 2.33e-04 2022-05-06 14:07:57,913 INFO [train.py:715] (0/8) Epoch 9, batch 23250, loss[loss=0.1327, simple_loss=0.2113, pruned_loss=0.02704, over 4914.00 frames.], tot_loss[loss=0.1407, simple_loss=0.213, pruned_loss=0.03418, over 971276.85 frames.], batch size: 18, lr: 2.33e-04 2022-05-06 14:08:37,684 INFO [train.py:715] (0/8) Epoch 9, batch 23300, loss[loss=0.17, simple_loss=0.2486, pruned_loss=0.0457, over 4943.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2138, pruned_loss=0.03444, over 971371.96 frames.], batch size: 39, lr: 2.33e-04 2022-05-06 14:09:17,438 INFO [train.py:715] (0/8) Epoch 9, batch 23350, loss[loss=0.1376, simple_loss=0.2033, pruned_loss=0.03596, over 4958.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2141, pruned_loss=0.03456, over 971957.04 frames.], batch size: 15, lr: 2.33e-04 2022-05-06 14:09:56,736 INFO [train.py:715] (0/8) Epoch 9, batch 23400, loss[loss=0.1296, simple_loss=0.2047, pruned_loss=0.02724, over 4864.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2142, pruned_loss=0.03442, over 972595.06 frames.], batch size: 16, lr: 2.33e-04 2022-05-06 14:10:35,591 INFO [train.py:715] (0/8) Epoch 9, batch 23450, loss[loss=0.1629, simple_loss=0.2293, pruned_loss=0.04819, over 4888.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2152, pruned_loss=0.03504, over 973064.34 frames.], batch size: 16, lr: 2.33e-04 2022-05-06 14:11:14,354 INFO [train.py:715] (0/8) Epoch 9, batch 23500, loss[loss=0.1544, simple_loss=0.2285, pruned_loss=0.04015, over 4859.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2144, pruned_loss=0.03455, over 973148.57 frames.], batch size: 20, lr: 2.33e-04 2022-05-06 14:11:52,879 INFO [train.py:715] (0/8) Epoch 9, batch 23550, loss[loss=0.1387, simple_loss=0.2121, pruned_loss=0.03268, over 4977.00 frames.], tot_loss[loss=0.142, simple_loss=0.2146, pruned_loss=0.03466, over 973123.26 frames.], batch size: 25, lr: 2.33e-04 2022-05-06 14:12:32,346 INFO [train.py:715] (0/8) Epoch 9, batch 23600, loss[loss=0.1375, simple_loss=0.206, pruned_loss=0.03452, over 4970.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2137, pruned_loss=0.0343, over 973447.52 frames.], batch size: 24, lr: 2.33e-04 2022-05-06 
14:13:11,495 INFO [train.py:715] (0/8) Epoch 9, batch 23650, loss[loss=0.1542, simple_loss=0.2019, pruned_loss=0.05327, over 4779.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2139, pruned_loss=0.0346, over 972690.89 frames.], batch size: 13, lr: 2.33e-04 2022-05-06 14:13:50,877 INFO [train.py:715] (0/8) Epoch 9, batch 23700, loss[loss=0.1237, simple_loss=0.2045, pruned_loss=0.02139, over 4885.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2129, pruned_loss=0.03374, over 972296.18 frames.], batch size: 16, lr: 2.33e-04 2022-05-06 14:14:30,048 INFO [train.py:715] (0/8) Epoch 9, batch 23750, loss[loss=0.1591, simple_loss=0.231, pruned_loss=0.04365, over 4860.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2139, pruned_loss=0.03429, over 971313.74 frames.], batch size: 32, lr: 2.33e-04 2022-05-06 14:15:09,284 INFO [train.py:715] (0/8) Epoch 9, batch 23800, loss[loss=0.1295, simple_loss=0.209, pruned_loss=0.02504, over 4918.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2135, pruned_loss=0.0341, over 971698.94 frames.], batch size: 17, lr: 2.33e-04 2022-05-06 14:15:48,389 INFO [train.py:715] (0/8) Epoch 9, batch 23850, loss[loss=0.1223, simple_loss=0.194, pruned_loss=0.02528, over 4804.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2138, pruned_loss=0.03419, over 971732.76 frames.], batch size: 12, lr: 2.33e-04 2022-05-06 14:16:27,643 INFO [train.py:715] (0/8) Epoch 9, batch 23900, loss[loss=0.1459, simple_loss=0.2217, pruned_loss=0.03503, over 4957.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2135, pruned_loss=0.03401, over 971050.49 frames.], batch size: 24, lr: 2.33e-04 2022-05-06 14:17:06,534 INFO [train.py:715] (0/8) Epoch 9, batch 23950, loss[loss=0.1576, simple_loss=0.2242, pruned_loss=0.04554, over 4908.00 frames.], tot_loss[loss=0.141, simple_loss=0.2134, pruned_loss=0.0343, over 971794.69 frames.], batch size: 19, lr: 2.33e-04 2022-05-06 14:17:45,501 INFO [train.py:715] (0/8) Epoch 9, batch 24000, loss[loss=0.1551, simple_loss=0.2299, pruned_loss=0.04018, over 4868.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2131, pruned_loss=0.03439, over 971262.32 frames.], batch size: 16, lr: 2.33e-04 2022-05-06 14:17:45,502 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 14:17:55,356 INFO [train.py:742] (0/8) Epoch 9, validation: loss=0.1069, simple_loss=0.1913, pruned_loss=0.01128, over 914524.00 frames. 
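A note on tot_loss: each entry reports it "over" roughly 970,000 frames even though individual batches contain only about 4,700 to 5,000 frames, which suggests a frame-weighted running statistic over a window of recent batches rather than a whole-epoch average. A plausible sketch, assuming an effective window of about 200 batches; the recipe's actual bookkeeping may differ:

    class RunningLoss:
        """Frame-weighted running average with geometric decay; the effective
        window length is roughly `window` batches. Illustrative sketch only,
        not the recipe's actual implementation."""

        def __init__(self, window: int = 200):
            self.decay = 1.0 - 1.0 / window
            self.loss_sum = 0.0   # running sum of loss * frames
            self.frames = 0.0     # running sum of frames

        def update(self, batch_loss: float, batch_frames: float) -> float:
            self.loss_sum = self.loss_sum * self.decay + batch_loss * batch_frames
            self.frames = self.frames * self.decay + batch_frames
            return self.loss_sum / self.frames

    # With ~4,900 frames per batch the frame total converges to about
    # 4,900 * 200 = 980,000, in line with the ~970k frames reported for tot_loss.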
2022-05-06 14:18:34,689 INFO [train.py:715] (0/8) Epoch 9, batch 24050, loss[loss=0.1503, simple_loss=0.2222, pruned_loss=0.03919, over 4868.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2129, pruned_loss=0.03464, over 971346.41 frames.], batch size: 22, lr: 2.33e-04 2022-05-06 14:19:14,961 INFO [train.py:715] (0/8) Epoch 9, batch 24100, loss[loss=0.1525, simple_loss=0.2218, pruned_loss=0.04163, over 4810.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2123, pruned_loss=0.03429, over 971256.52 frames.], batch size: 25, lr: 2.33e-04 2022-05-06 14:19:54,471 INFO [train.py:715] (0/8) Epoch 9, batch 24150, loss[loss=0.1471, simple_loss=0.2168, pruned_loss=0.03871, over 4922.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2122, pruned_loss=0.03427, over 971812.01 frames.], batch size: 29, lr: 2.33e-04 2022-05-06 14:20:33,557 INFO [train.py:715] (0/8) Epoch 9, batch 24200, loss[loss=0.1559, simple_loss=0.2178, pruned_loss=0.04695, over 4691.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2124, pruned_loss=0.03421, over 971674.63 frames.], batch size: 15, lr: 2.33e-04 2022-05-06 14:21:12,482 INFO [train.py:715] (0/8) Epoch 9, batch 24250, loss[loss=0.09863, simple_loss=0.177, pruned_loss=0.01014, over 4802.00 frames.], tot_loss[loss=0.1402, simple_loss=0.212, pruned_loss=0.03418, over 971476.20 frames.], batch size: 24, lr: 2.33e-04 2022-05-06 14:21:52,132 INFO [train.py:715] (0/8) Epoch 9, batch 24300, loss[loss=0.1666, simple_loss=0.2495, pruned_loss=0.04188, over 4987.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2119, pruned_loss=0.03369, over 971227.64 frames.], batch size: 39, lr: 2.33e-04 2022-05-06 14:22:31,313 INFO [train.py:715] (0/8) Epoch 9, batch 24350, loss[loss=0.1354, simple_loss=0.2111, pruned_loss=0.02985, over 4923.00 frames.], tot_loss[loss=0.14, simple_loss=0.2122, pruned_loss=0.03384, over 971318.56 frames.], batch size: 29, lr: 2.33e-04 2022-05-06 14:23:10,724 INFO [train.py:715] (0/8) Epoch 9, batch 24400, loss[loss=0.1399, simple_loss=0.2146, pruned_loss=0.03257, over 4746.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2122, pruned_loss=0.03372, over 971991.75 frames.], batch size: 19, lr: 2.33e-04 2022-05-06 14:23:50,612 INFO [train.py:715] (0/8) Epoch 9, batch 24450, loss[loss=0.1776, simple_loss=0.2415, pruned_loss=0.0568, over 4811.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2121, pruned_loss=0.03353, over 971772.69 frames.], batch size: 15, lr: 2.33e-04 2022-05-06 14:24:30,639 INFO [train.py:715] (0/8) Epoch 9, batch 24500, loss[loss=0.1523, simple_loss=0.2152, pruned_loss=0.04473, over 4883.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2121, pruned_loss=0.03345, over 971107.35 frames.], batch size: 16, lr: 2.33e-04 2022-05-06 14:25:10,998 INFO [train.py:715] (0/8) Epoch 9, batch 24550, loss[loss=0.134, simple_loss=0.2107, pruned_loss=0.0286, over 4764.00 frames.], tot_loss[loss=0.14, simple_loss=0.2125, pruned_loss=0.03372, over 972208.82 frames.], batch size: 19, lr: 2.33e-04 2022-05-06 14:25:50,743 INFO [train.py:715] (0/8) Epoch 9, batch 24600, loss[loss=0.1345, simple_loss=0.2049, pruned_loss=0.032, over 4803.00 frames.], tot_loss[loss=0.1396, simple_loss=0.212, pruned_loss=0.0336, over 971989.47 frames.], batch size: 17, lr: 2.33e-04 2022-05-06 14:26:30,712 INFO [train.py:715] (0/8) Epoch 9, batch 24650, loss[loss=0.1432, simple_loss=0.2031, pruned_loss=0.04169, over 4939.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2133, pruned_loss=0.03468, over 972095.61 frames.], batch size: 29, lr: 2.33e-04 2022-05-06 14:27:09,791 INFO [train.py:715] 
(0/8) Epoch 9, batch 24700, loss[loss=0.1541, simple_loss=0.2259, pruned_loss=0.04119, over 4915.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2138, pruned_loss=0.03487, over 972135.99 frames.], batch size: 17, lr: 2.33e-04 2022-05-06 14:27:48,505 INFO [train.py:715] (0/8) Epoch 9, batch 24750, loss[loss=0.1275, simple_loss=0.1978, pruned_loss=0.02866, over 4912.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2143, pruned_loss=0.03496, over 972654.02 frames.], batch size: 17, lr: 2.33e-04 2022-05-06 14:28:28,023 INFO [train.py:715] (0/8) Epoch 9, batch 24800, loss[loss=0.1611, simple_loss=0.2257, pruned_loss=0.04824, over 4832.00 frames.], tot_loss[loss=0.142, simple_loss=0.2142, pruned_loss=0.03492, over 972231.10 frames.], batch size: 15, lr: 2.32e-04 2022-05-06 14:29:07,569 INFO [train.py:715] (0/8) Epoch 9, batch 24850, loss[loss=0.1238, simple_loss=0.1978, pruned_loss=0.02495, over 4915.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2132, pruned_loss=0.03428, over 972967.53 frames.], batch size: 23, lr: 2.32e-04 2022-05-06 14:29:46,970 INFO [train.py:715] (0/8) Epoch 9, batch 24900, loss[loss=0.1244, simple_loss=0.2068, pruned_loss=0.02103, over 4840.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2136, pruned_loss=0.0341, over 972501.37 frames.], batch size: 30, lr: 2.32e-04 2022-05-06 14:30:26,386 INFO [train.py:715] (0/8) Epoch 9, batch 24950, loss[loss=0.1312, simple_loss=0.2056, pruned_loss=0.02836, over 4810.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2139, pruned_loss=0.03434, over 972843.65 frames.], batch size: 21, lr: 2.32e-04 2022-05-06 14:31:06,082 INFO [train.py:715] (0/8) Epoch 9, batch 25000, loss[loss=0.1239, simple_loss=0.1951, pruned_loss=0.02634, over 4843.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2132, pruned_loss=0.03402, over 973210.02 frames.], batch size: 30, lr: 2.32e-04 2022-05-06 14:31:44,917 INFO [train.py:715] (0/8) Epoch 9, batch 25050, loss[loss=0.1347, simple_loss=0.2022, pruned_loss=0.03363, over 4776.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2143, pruned_loss=0.0346, over 971656.52 frames.], batch size: 18, lr: 2.32e-04 2022-05-06 14:32:24,415 INFO [train.py:715] (0/8) Epoch 9, batch 25100, loss[loss=0.1195, simple_loss=0.1891, pruned_loss=0.02494, over 4932.00 frames.], tot_loss[loss=0.1415, simple_loss=0.214, pruned_loss=0.03448, over 971169.10 frames.], batch size: 18, lr: 2.32e-04 2022-05-06 14:33:03,518 INFO [train.py:715] (0/8) Epoch 9, batch 25150, loss[loss=0.133, simple_loss=0.2043, pruned_loss=0.03087, over 4975.00 frames.], tot_loss[loss=0.1414, simple_loss=0.214, pruned_loss=0.03434, over 972111.96 frames.], batch size: 28, lr: 2.32e-04 2022-05-06 14:33:42,580 INFO [train.py:715] (0/8) Epoch 9, batch 25200, loss[loss=0.147, simple_loss=0.2128, pruned_loss=0.04061, over 4762.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2132, pruned_loss=0.03431, over 971232.10 frames.], batch size: 12, lr: 2.32e-04 2022-05-06 14:34:21,838 INFO [train.py:715] (0/8) Epoch 9, batch 25250, loss[loss=0.1663, simple_loss=0.2319, pruned_loss=0.05037, over 4895.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2144, pruned_loss=0.03526, over 971629.25 frames.], batch size: 19, lr: 2.32e-04 2022-05-06 14:35:00,582 INFO [train.py:715] (0/8) Epoch 9, batch 25300, loss[loss=0.1467, simple_loss=0.2192, pruned_loss=0.03712, over 4762.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2159, pruned_loss=0.03576, over 971599.06 frames.], batch size: 19, lr: 2.32e-04 2022-05-06 14:35:40,267 INFO [train.py:715] (0/8) Epoch 9, batch 25350, 
loss[loss=0.1408, simple_loss=0.2143, pruned_loss=0.0337, over 4966.00 frames.], tot_loss[loss=0.1435, simple_loss=0.2156, pruned_loss=0.03573, over 971819.27 frames.], batch size: 15, lr: 2.32e-04 2022-05-06 14:36:20,106 INFO [train.py:715] (0/8) Epoch 9, batch 25400, loss[loss=0.119, simple_loss=0.187, pruned_loss=0.02549, over 4837.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2144, pruned_loss=0.03496, over 971903.92 frames.], batch size: 12, lr: 2.32e-04 2022-05-06 14:37:00,343 INFO [train.py:715] (0/8) Epoch 9, batch 25450, loss[loss=0.1708, simple_loss=0.2324, pruned_loss=0.05462, over 4988.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2141, pruned_loss=0.03505, over 972851.54 frames.], batch size: 35, lr: 2.32e-04 2022-05-06 14:37:38,913 INFO [train.py:715] (0/8) Epoch 9, batch 25500, loss[loss=0.1349, simple_loss=0.2154, pruned_loss=0.02725, over 4789.00 frames.], tot_loss[loss=0.1419, simple_loss=0.2143, pruned_loss=0.03474, over 972119.42 frames.], batch size: 21, lr: 2.32e-04 2022-05-06 14:38:18,072 INFO [train.py:715] (0/8) Epoch 9, batch 25550, loss[loss=0.1497, simple_loss=0.2252, pruned_loss=0.03712, over 4899.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2131, pruned_loss=0.03416, over 971439.42 frames.], batch size: 16, lr: 2.32e-04 2022-05-06 14:38:57,224 INFO [train.py:715] (0/8) Epoch 9, batch 25600, loss[loss=0.1421, simple_loss=0.2141, pruned_loss=0.03502, over 4754.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2126, pruned_loss=0.0338, over 971158.19 frames.], batch size: 16, lr: 2.32e-04 2022-05-06 14:39:36,162 INFO [train.py:715] (0/8) Epoch 9, batch 25650, loss[loss=0.1219, simple_loss=0.2034, pruned_loss=0.02018, over 4838.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2128, pruned_loss=0.03403, over 971543.50 frames.], batch size: 13, lr: 2.32e-04 2022-05-06 14:40:15,295 INFO [train.py:715] (0/8) Epoch 9, batch 25700, loss[loss=0.1971, simple_loss=0.2454, pruned_loss=0.07442, over 4828.00 frames.], tot_loss[loss=0.1408, simple_loss=0.213, pruned_loss=0.03432, over 971661.28 frames.], batch size: 13, lr: 2.32e-04 2022-05-06 14:40:54,414 INFO [train.py:715] (0/8) Epoch 9, batch 25750, loss[loss=0.1447, simple_loss=0.2227, pruned_loss=0.03335, over 4819.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2134, pruned_loss=0.03454, over 971374.35 frames.], batch size: 25, lr: 2.32e-04 2022-05-06 14:41:33,408 INFO [train.py:715] (0/8) Epoch 9, batch 25800, loss[loss=0.1445, simple_loss=0.2167, pruned_loss=0.03612, over 4759.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2134, pruned_loss=0.03417, over 971993.79 frames.], batch size: 16, lr: 2.32e-04 2022-05-06 14:42:13,625 INFO [train.py:715] (0/8) Epoch 9, batch 25850, loss[loss=0.1458, simple_loss=0.2273, pruned_loss=0.03217, over 4946.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2142, pruned_loss=0.03441, over 971896.34 frames.], batch size: 23, lr: 2.32e-04 2022-05-06 14:42:53,070 INFO [train.py:715] (0/8) Epoch 9, batch 25900, loss[loss=0.1485, simple_loss=0.216, pruned_loss=0.04052, over 4925.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2134, pruned_loss=0.03408, over 971563.00 frames.], batch size: 23, lr: 2.32e-04 2022-05-06 14:43:32,744 INFO [train.py:715] (0/8) Epoch 9, batch 25950, loss[loss=0.1613, simple_loss=0.2355, pruned_loss=0.04353, over 4920.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2134, pruned_loss=0.03361, over 972284.05 frames.], batch size: 39, lr: 2.32e-04 2022-05-06 14:44:11,976 INFO [train.py:715] (0/8) Epoch 9, batch 26000, loss[loss=0.1539, simple_loss=0.2232, 
pruned_loss=0.0423, over 4756.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2137, pruned_loss=0.03396, over 971463.97 frames.], batch size: 19, lr: 2.32e-04 2022-05-06 14:44:51,305 INFO [train.py:715] (0/8) Epoch 9, batch 26050, loss[loss=0.1544, simple_loss=0.2277, pruned_loss=0.0405, over 4907.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2136, pruned_loss=0.03406, over 972444.65 frames.], batch size: 19, lr: 2.32e-04 2022-05-06 14:45:30,099 INFO [train.py:715] (0/8) Epoch 9, batch 26100, loss[loss=0.1293, simple_loss=0.2038, pruned_loss=0.02741, over 4793.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2132, pruned_loss=0.03402, over 971872.85 frames.], batch size: 17, lr: 2.32e-04 2022-05-06 14:46:09,804 INFO [train.py:715] (0/8) Epoch 9, batch 26150, loss[loss=0.1278, simple_loss=0.206, pruned_loss=0.02484, over 4972.00 frames.], tot_loss[loss=0.1404, simple_loss=0.213, pruned_loss=0.03393, over 972478.33 frames.], batch size: 15, lr: 2.32e-04 2022-05-06 14:46:50,051 INFO [train.py:715] (0/8) Epoch 9, batch 26200, loss[loss=0.1253, simple_loss=0.2079, pruned_loss=0.02141, over 4762.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2132, pruned_loss=0.03429, over 972481.02 frames.], batch size: 19, lr: 2.32e-04 2022-05-06 14:47:29,915 INFO [train.py:715] (0/8) Epoch 9, batch 26250, loss[loss=0.1469, simple_loss=0.2242, pruned_loss=0.03484, over 4982.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2134, pruned_loss=0.03423, over 972074.04 frames.], batch size: 40, lr: 2.32e-04 2022-05-06 14:48:09,835 INFO [train.py:715] (0/8) Epoch 9, batch 26300, loss[loss=0.1082, simple_loss=0.181, pruned_loss=0.01768, over 4928.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2133, pruned_loss=0.03411, over 973048.52 frames.], batch size: 23, lr: 2.32e-04 2022-05-06 14:48:49,365 INFO [train.py:715] (0/8) Epoch 9, batch 26350, loss[loss=0.1544, simple_loss=0.2269, pruned_loss=0.04095, over 4935.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2133, pruned_loss=0.03399, over 972833.77 frames.], batch size: 21, lr: 2.32e-04 2022-05-06 14:49:28,722 INFO [train.py:715] (0/8) Epoch 9, batch 26400, loss[loss=0.1693, simple_loss=0.2395, pruned_loss=0.04957, over 4793.00 frames.], tot_loss[loss=0.141, simple_loss=0.2139, pruned_loss=0.03403, over 972424.75 frames.], batch size: 24, lr: 2.32e-04 2022-05-06 14:50:07,638 INFO [train.py:715] (0/8) Epoch 9, batch 26450, loss[loss=0.1337, simple_loss=0.2142, pruned_loss=0.02661, over 4976.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2142, pruned_loss=0.03404, over 972576.03 frames.], batch size: 15, lr: 2.32e-04 2022-05-06 14:50:46,954 INFO [train.py:715] (0/8) Epoch 9, batch 26500, loss[loss=0.1336, simple_loss=0.2076, pruned_loss=0.02982, over 4914.00 frames.], tot_loss[loss=0.1412, simple_loss=0.214, pruned_loss=0.03417, over 972634.70 frames.], batch size: 18, lr: 2.32e-04 2022-05-06 14:51:26,796 INFO [train.py:715] (0/8) Epoch 9, batch 26550, loss[loss=0.146, simple_loss=0.2245, pruned_loss=0.03376, over 4695.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2135, pruned_loss=0.03396, over 971424.53 frames.], batch size: 15, lr: 2.32e-04 2022-05-06 14:52:06,133 INFO [train.py:715] (0/8) Epoch 9, batch 26600, loss[loss=0.1202, simple_loss=0.1917, pruned_loss=0.02433, over 4849.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2135, pruned_loss=0.03386, over 971211.76 frames.], batch size: 32, lr: 2.32e-04 2022-05-06 14:52:46,088 INFO [train.py:715] (0/8) Epoch 9, batch 26650, loss[loss=0.1694, simple_loss=0.243, pruned_loss=0.04787, over 4771.00 
frames.], tot_loss[loss=0.1413, simple_loss=0.2136, pruned_loss=0.03449, over 971360.97 frames.], batch size: 18, lr: 2.32e-04 2022-05-06 14:53:25,378 INFO [train.py:715] (0/8) Epoch 9, batch 26700, loss[loss=0.1269, simple_loss=0.2067, pruned_loss=0.02357, over 4851.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2145, pruned_loss=0.0345, over 970605.13 frames.], batch size: 20, lr: 2.32e-04 2022-05-06 14:54:04,742 INFO [train.py:715] (0/8) Epoch 9, batch 26750, loss[loss=0.1437, simple_loss=0.2141, pruned_loss=0.03668, over 4768.00 frames.], tot_loss[loss=0.1413, simple_loss=0.214, pruned_loss=0.03425, over 970837.66 frames.], batch size: 14, lr: 2.32e-04 2022-05-06 14:54:43,922 INFO [train.py:715] (0/8) Epoch 9, batch 26800, loss[loss=0.1642, simple_loss=0.2218, pruned_loss=0.05331, over 4793.00 frames.], tot_loss[loss=0.1411, simple_loss=0.214, pruned_loss=0.03413, over 970753.40 frames.], batch size: 21, lr: 2.32e-04 2022-05-06 14:55:22,919 INFO [train.py:715] (0/8) Epoch 9, batch 26850, loss[loss=0.1522, simple_loss=0.2314, pruned_loss=0.03648, over 4768.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2136, pruned_loss=0.03425, over 971223.79 frames.], batch size: 18, lr: 2.32e-04 2022-05-06 14:56:02,400 INFO [train.py:715] (0/8) Epoch 9, batch 26900, loss[loss=0.1698, simple_loss=0.2462, pruned_loss=0.04664, over 4810.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2138, pruned_loss=0.03435, over 971652.27 frames.], batch size: 21, lr: 2.32e-04 2022-05-06 14:56:42,226 INFO [train.py:715] (0/8) Epoch 9, batch 26950, loss[loss=0.1337, simple_loss=0.2118, pruned_loss=0.02781, over 4835.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2143, pruned_loss=0.03449, over 972407.38 frames.], batch size: 26, lr: 2.32e-04 2022-05-06 14:57:21,396 INFO [train.py:715] (0/8) Epoch 9, batch 27000, loss[loss=0.1169, simple_loss=0.1968, pruned_loss=0.01853, over 4966.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2137, pruned_loss=0.03422, over 972075.81 frames.], batch size: 24, lr: 2.32e-04 2022-05-06 14:57:21,397 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 14:57:30,964 INFO [train.py:742] (0/8) Epoch 9, validation: loss=0.1068, simple_loss=0.1912, pruned_loss=0.01121, over 914524.00 frames. 
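A note on the validation entries: every "Computing validation loss" line is followed by a summary over the same 914,524 frames, so the dev set is fixed and the reported validation loss is an average over the full dev loader. A hedged sketch of such a pass; compute_loss is a hypothetical helper, and this is not the recipe's exact code:

    import torch

    def validate(model, valid_loader, device) -> float:
        """Average per-frame loss over a fixed dev set; illustrative only."""
        model.eval()
        tot_loss, tot_frames = 0.0, 0.0
        with torch.no_grad():
            for batch in valid_loader:
                # compute_loss is a hypothetical helper returning the batch's
                # summed loss and its number of frames.
                batch_loss, batch_frames = compute_loss(model, batch, device)
                tot_loss += float(batch_loss)
                tot_frames += batch_frames
        model.train()
        return tot_loss / tot_frames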
2022-05-06 14:58:10,506 INFO [train.py:715] (0/8) Epoch 9, batch 27050, loss[loss=0.1713, simple_loss=0.2475, pruned_loss=0.04752, over 4933.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2144, pruned_loss=0.03448, over 971883.22 frames.], batch size: 21, lr: 2.32e-04 2022-05-06 14:58:50,064 INFO [train.py:715] (0/8) Epoch 9, batch 27100, loss[loss=0.13, simple_loss=0.2062, pruned_loss=0.02687, over 4800.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2143, pruned_loss=0.03433, over 971471.43 frames.], batch size: 24, lr: 2.32e-04 2022-05-06 14:59:30,120 INFO [train.py:715] (0/8) Epoch 9, batch 27150, loss[loss=0.1232, simple_loss=0.1986, pruned_loss=0.02384, over 4820.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2134, pruned_loss=0.03368, over 971975.06 frames.], batch size: 27, lr: 2.32e-04 2022-05-06 15:00:09,247 INFO [train.py:715] (0/8) Epoch 9, batch 27200, loss[loss=0.1241, simple_loss=0.2007, pruned_loss=0.02376, over 4894.00 frames.], tot_loss[loss=0.141, simple_loss=0.2139, pruned_loss=0.03405, over 972263.76 frames.], batch size: 19, lr: 2.32e-04 2022-05-06 15:00:48,167 INFO [train.py:715] (0/8) Epoch 9, batch 27250, loss[loss=0.144, simple_loss=0.2168, pruned_loss=0.03562, over 4886.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2139, pruned_loss=0.03425, over 971960.77 frames.], batch size: 19, lr: 2.32e-04 2022-05-06 15:01:27,381 INFO [train.py:715] (0/8) Epoch 9, batch 27300, loss[loss=0.1248, simple_loss=0.2053, pruned_loss=0.02213, over 4951.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2137, pruned_loss=0.03386, over 971321.78 frames.], batch size: 23, lr: 2.32e-04 2022-05-06 15:02:06,271 INFO [train.py:715] (0/8) Epoch 9, batch 27350, loss[loss=0.1274, simple_loss=0.196, pruned_loss=0.02936, over 4944.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2144, pruned_loss=0.03397, over 971623.87 frames.], batch size: 21, lr: 2.32e-04 2022-05-06 15:02:45,302 INFO [train.py:715] (0/8) Epoch 9, batch 27400, loss[loss=0.1433, simple_loss=0.2213, pruned_loss=0.03266, over 4878.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2147, pruned_loss=0.03416, over 972208.42 frames.], batch size: 22, lr: 2.32e-04 2022-05-06 15:03:24,465 INFO [train.py:715] (0/8) Epoch 9, batch 27450, loss[loss=0.1194, simple_loss=0.1915, pruned_loss=0.02369, over 4925.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2141, pruned_loss=0.03407, over 972058.13 frames.], batch size: 17, lr: 2.32e-04 2022-05-06 15:04:03,430 INFO [train.py:715] (0/8) Epoch 9, batch 27500, loss[loss=0.1299, simple_loss=0.2057, pruned_loss=0.02705, over 4923.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2139, pruned_loss=0.03401, over 971929.39 frames.], batch size: 29, lr: 2.32e-04 2022-05-06 15:04:42,461 INFO [train.py:715] (0/8) Epoch 9, batch 27550, loss[loss=0.1377, simple_loss=0.2038, pruned_loss=0.03581, over 4866.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2137, pruned_loss=0.03431, over 972449.01 frames.], batch size: 13, lr: 2.32e-04 2022-05-06 15:05:21,386 INFO [train.py:715] (0/8) Epoch 9, batch 27600, loss[loss=0.1442, simple_loss=0.2075, pruned_loss=0.04051, over 4937.00 frames.], tot_loss[loss=0.1416, simple_loss=0.214, pruned_loss=0.03464, over 971909.38 frames.], batch size: 21, lr: 2.32e-04 2022-05-06 15:06:00,162 INFO [train.py:715] (0/8) Epoch 9, batch 27650, loss[loss=0.1393, simple_loss=0.2135, pruned_loss=0.03256, over 4781.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2142, pruned_loss=0.03465, over 972457.93 frames.], batch size: 18, lr: 2.32e-04 2022-05-06 15:06:39,015 INFO 
[train.py:715] (0/8) Epoch 9, batch 27700, loss[loss=0.1756, simple_loss=0.2402, pruned_loss=0.05549, over 4692.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2139, pruned_loss=0.03456, over 971508.57 frames.], batch size: 15, lr: 2.32e-04 2022-05-06 15:07:18,262 INFO [train.py:715] (0/8) Epoch 9, batch 27750, loss[loss=0.1312, simple_loss=0.1956, pruned_loss=0.03339, over 4898.00 frames.], tot_loss[loss=0.141, simple_loss=0.2133, pruned_loss=0.03434, over 971988.61 frames.], batch size: 19, lr: 2.31e-04 2022-05-06 15:07:57,611 INFO [train.py:715] (0/8) Epoch 9, batch 27800, loss[loss=0.1404, simple_loss=0.2178, pruned_loss=0.03145, over 4971.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2131, pruned_loss=0.034, over 971426.83 frames.], batch size: 28, lr: 2.31e-04 2022-05-06 15:08:36,546 INFO [train.py:715] (0/8) Epoch 9, batch 27850, loss[loss=0.1032, simple_loss=0.1886, pruned_loss=0.008856, over 4885.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2129, pruned_loss=0.03385, over 971073.84 frames.], batch size: 22, lr: 2.31e-04 2022-05-06 15:09:16,411 INFO [train.py:715] (0/8) Epoch 9, batch 27900, loss[loss=0.1494, simple_loss=0.2211, pruned_loss=0.0389, over 4794.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2131, pruned_loss=0.03395, over 971618.78 frames.], batch size: 24, lr: 2.31e-04 2022-05-06 15:09:54,909 INFO [train.py:715] (0/8) Epoch 9, batch 27950, loss[loss=0.1473, simple_loss=0.2229, pruned_loss=0.03589, over 4806.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2134, pruned_loss=0.03417, over 971946.14 frames.], batch size: 21, lr: 2.31e-04 2022-05-06 15:10:34,264 INFO [train.py:715] (0/8) Epoch 9, batch 28000, loss[loss=0.1449, simple_loss=0.2025, pruned_loss=0.04361, over 4785.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2133, pruned_loss=0.03399, over 971296.68 frames.], batch size: 12, lr: 2.31e-04 2022-05-06 15:11:13,571 INFO [train.py:715] (0/8) Epoch 9, batch 28050, loss[loss=0.1344, simple_loss=0.2119, pruned_loss=0.02842, over 4914.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2133, pruned_loss=0.03411, over 970950.63 frames.], batch size: 17, lr: 2.31e-04 2022-05-06 15:11:52,639 INFO [train.py:715] (0/8) Epoch 9, batch 28100, loss[loss=0.1282, simple_loss=0.197, pruned_loss=0.02968, over 4698.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2126, pruned_loss=0.03405, over 969908.76 frames.], batch size: 15, lr: 2.31e-04 2022-05-06 15:12:31,906 INFO [train.py:715] (0/8) Epoch 9, batch 28150, loss[loss=0.1299, simple_loss=0.2088, pruned_loss=0.02552, over 4805.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2134, pruned_loss=0.03421, over 970227.08 frames.], batch size: 25, lr: 2.31e-04 2022-05-06 15:13:10,815 INFO [train.py:715] (0/8) Epoch 9, batch 28200, loss[loss=0.1471, simple_loss=0.2166, pruned_loss=0.03881, over 4921.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2138, pruned_loss=0.03439, over 971145.78 frames.], batch size: 17, lr: 2.31e-04 2022-05-06 15:13:50,247 INFO [train.py:715] (0/8) Epoch 9, batch 28250, loss[loss=0.1336, simple_loss=0.2066, pruned_loss=0.0303, over 4947.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2145, pruned_loss=0.03436, over 970874.31 frames.], batch size: 23, lr: 2.31e-04 2022-05-06 15:14:28,526 INFO [train.py:715] (0/8) Epoch 9, batch 28300, loss[loss=0.1252, simple_loss=0.1988, pruned_loss=0.02586, over 4914.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2138, pruned_loss=0.03401, over 971066.57 frames.], batch size: 17, lr: 2.31e-04 2022-05-06 15:15:07,474 INFO [train.py:715] (0/8) Epoch 9, batch 
28350, loss[loss=0.1305, simple_loss=0.1932, pruned_loss=0.03389, over 4813.00 frames.], tot_loss[loss=0.141, simple_loss=0.2142, pruned_loss=0.03395, over 971237.45 frames.], batch size: 13, lr: 2.31e-04 2022-05-06 15:15:46,873 INFO [train.py:715] (0/8) Epoch 9, batch 28400, loss[loss=0.1429, simple_loss=0.225, pruned_loss=0.03039, over 4969.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2147, pruned_loss=0.03439, over 971577.21 frames.], batch size: 15, lr: 2.31e-04 2022-05-06 15:16:25,950 INFO [train.py:715] (0/8) Epoch 9, batch 28450, loss[loss=0.1457, simple_loss=0.2135, pruned_loss=0.03897, over 4811.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2141, pruned_loss=0.03423, over 972138.41 frames.], batch size: 13, lr: 2.31e-04 2022-05-06 15:17:04,385 INFO [train.py:715] (0/8) Epoch 9, batch 28500, loss[loss=0.1365, simple_loss=0.2224, pruned_loss=0.02534, over 4877.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2142, pruned_loss=0.03413, over 972783.99 frames.], batch size: 16, lr: 2.31e-04 2022-05-06 15:17:43,522 INFO [train.py:715] (0/8) Epoch 9, batch 28550, loss[loss=0.1185, simple_loss=0.1801, pruned_loss=0.0284, over 4847.00 frames.], tot_loss[loss=0.1432, simple_loss=0.2158, pruned_loss=0.03531, over 971987.55 frames.], batch size: 12, lr: 2.31e-04 2022-05-06 15:18:22,912 INFO [train.py:715] (0/8) Epoch 9, batch 28600, loss[loss=0.1075, simple_loss=0.1859, pruned_loss=0.01454, over 4763.00 frames.], tot_loss[loss=0.143, simple_loss=0.2157, pruned_loss=0.03509, over 972461.53 frames.], batch size: 18, lr: 2.31e-04 2022-05-06 15:19:01,329 INFO [train.py:715] (0/8) Epoch 9, batch 28650, loss[loss=0.1227, simple_loss=0.2025, pruned_loss=0.02144, over 4949.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2154, pruned_loss=0.03495, over 972128.78 frames.], batch size: 23, lr: 2.31e-04 2022-05-06 15:19:40,170 INFO [train.py:715] (0/8) Epoch 9, batch 28700, loss[loss=0.1517, simple_loss=0.2186, pruned_loss=0.04238, over 4899.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2154, pruned_loss=0.03481, over 971680.46 frames.], batch size: 19, lr: 2.31e-04 2022-05-06 15:20:19,621 INFO [train.py:715] (0/8) Epoch 9, batch 28750, loss[loss=0.1713, simple_loss=0.2302, pruned_loss=0.05623, over 4988.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2147, pruned_loss=0.0349, over 972185.19 frames.], batch size: 20, lr: 2.31e-04 2022-05-06 15:20:58,322 INFO [train.py:715] (0/8) Epoch 9, batch 28800, loss[loss=0.1185, simple_loss=0.1943, pruned_loss=0.02137, over 4804.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2137, pruned_loss=0.03445, over 971845.37 frames.], batch size: 25, lr: 2.31e-04 2022-05-06 15:21:36,721 INFO [train.py:715] (0/8) Epoch 9, batch 28850, loss[loss=0.1583, simple_loss=0.2334, pruned_loss=0.0416, over 4887.00 frames.], tot_loss[loss=0.1414, simple_loss=0.214, pruned_loss=0.0344, over 971916.96 frames.], batch size: 19, lr: 2.31e-04 2022-05-06 15:22:16,102 INFO [train.py:715] (0/8) Epoch 9, batch 28900, loss[loss=0.1495, simple_loss=0.2303, pruned_loss=0.03434, over 4986.00 frames.], tot_loss[loss=0.143, simple_loss=0.2155, pruned_loss=0.03525, over 972358.87 frames.], batch size: 15, lr: 2.31e-04 2022-05-06 15:22:55,365 INFO [train.py:715] (0/8) Epoch 9, batch 28950, loss[loss=0.1233, simple_loss=0.1964, pruned_loss=0.02508, over 4909.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2149, pruned_loss=0.0348, over 971915.06 frames.], batch size: 18, lr: 2.31e-04 2022-05-06 15:23:33,681 INFO [train.py:715] (0/8) Epoch 9, batch 29000, loss[loss=0.1253, 
simple_loss=0.2047, pruned_loss=0.02296, over 4842.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2144, pruned_loss=0.03446, over 972658.56 frames.], batch size: 20, lr: 2.31e-04 2022-05-06 15:24:12,156 INFO [train.py:715] (0/8) Epoch 9, batch 29050, loss[loss=0.1337, simple_loss=0.2168, pruned_loss=0.02528, over 4800.00 frames.], tot_loss[loss=0.142, simple_loss=0.2149, pruned_loss=0.03459, over 972698.00 frames.], batch size: 24, lr: 2.31e-04 2022-05-06 15:24:51,097 INFO [train.py:715] (0/8) Epoch 9, batch 29100, loss[loss=0.135, simple_loss=0.2037, pruned_loss=0.03315, over 4787.00 frames.], tot_loss[loss=0.1424, simple_loss=0.2154, pruned_loss=0.03474, over 972530.20 frames.], batch size: 18, lr: 2.31e-04 2022-05-06 15:25:30,246 INFO [train.py:715] (0/8) Epoch 9, batch 29150, loss[loss=0.1315, simple_loss=0.1972, pruned_loss=0.03293, over 4748.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2147, pruned_loss=0.03427, over 972689.83 frames.], batch size: 12, lr: 2.31e-04 2022-05-06 15:26:09,095 INFO [train.py:715] (0/8) Epoch 9, batch 29200, loss[loss=0.1665, simple_loss=0.2364, pruned_loss=0.04826, over 4947.00 frames.], tot_loss[loss=0.1428, simple_loss=0.2158, pruned_loss=0.03492, over 973096.87 frames.], batch size: 21, lr: 2.31e-04 2022-05-06 15:26:48,461 INFO [train.py:715] (0/8) Epoch 9, batch 29250, loss[loss=0.1532, simple_loss=0.2224, pruned_loss=0.04204, over 4969.00 frames.], tot_loss[loss=0.1437, simple_loss=0.2163, pruned_loss=0.03557, over 973645.72 frames.], batch size: 15, lr: 2.31e-04 2022-05-06 15:27:27,198 INFO [train.py:715] (0/8) Epoch 9, batch 29300, loss[loss=0.1497, simple_loss=0.2299, pruned_loss=0.03477, over 4874.00 frames.], tot_loss[loss=0.1432, simple_loss=0.216, pruned_loss=0.03522, over 973135.26 frames.], batch size: 16, lr: 2.31e-04 2022-05-06 15:28:06,265 INFO [train.py:715] (0/8) Epoch 9, batch 29350, loss[loss=0.1422, simple_loss=0.2119, pruned_loss=0.03625, over 4978.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2155, pruned_loss=0.03497, over 973115.22 frames.], batch size: 31, lr: 2.31e-04 2022-05-06 15:28:45,210 INFO [train.py:715] (0/8) Epoch 9, batch 29400, loss[loss=0.1254, simple_loss=0.2046, pruned_loss=0.02312, over 4985.00 frames.], tot_loss[loss=0.1431, simple_loss=0.2154, pruned_loss=0.03538, over 972685.01 frames.], batch size: 16, lr: 2.31e-04 2022-05-06 15:29:23,946 INFO [train.py:715] (0/8) Epoch 9, batch 29450, loss[loss=0.1854, simple_loss=0.2482, pruned_loss=0.06134, over 4818.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2146, pruned_loss=0.03494, over 973152.27 frames.], batch size: 15, lr: 2.31e-04 2022-05-06 15:30:02,404 INFO [train.py:715] (0/8) Epoch 9, batch 29500, loss[loss=0.1493, simple_loss=0.2224, pruned_loss=0.03811, over 4896.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2143, pruned_loss=0.03499, over 973132.84 frames.], batch size: 19, lr: 2.31e-04 2022-05-06 15:30:41,335 INFO [train.py:715] (0/8) Epoch 9, batch 29550, loss[loss=0.1501, simple_loss=0.2355, pruned_loss=0.0323, over 4894.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2132, pruned_loss=0.03428, over 973410.19 frames.], batch size: 19, lr: 2.31e-04 2022-05-06 15:31:20,275 INFO [train.py:715] (0/8) Epoch 9, batch 29600, loss[loss=0.1424, simple_loss=0.2217, pruned_loss=0.03155, over 4988.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2132, pruned_loss=0.03452, over 973337.08 frames.], batch size: 28, lr: 2.31e-04 2022-05-06 15:31:59,539 INFO [train.py:715] (0/8) Epoch 9, batch 29650, loss[loss=0.1277, simple_loss=0.1984, 
pruned_loss=0.02854, over 4939.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2135, pruned_loss=0.03442, over 973016.35 frames.], batch size: 21, lr: 2.31e-04 2022-05-06 15:32:39,144 INFO [train.py:715] (0/8) Epoch 9, batch 29700, loss[loss=0.1424, simple_loss=0.2147, pruned_loss=0.03506, over 4871.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2139, pruned_loss=0.03456, over 973182.47 frames.], batch size: 22, lr: 2.31e-04 2022-05-06 15:33:17,089 INFO [train.py:715] (0/8) Epoch 9, batch 29750, loss[loss=0.1314, simple_loss=0.2095, pruned_loss=0.02661, over 4942.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2143, pruned_loss=0.03465, over 973796.42 frames.], batch size: 29, lr: 2.31e-04 2022-05-06 15:33:55,771 INFO [train.py:715] (0/8) Epoch 9, batch 29800, loss[loss=0.181, simple_loss=0.2575, pruned_loss=0.05226, over 4753.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2145, pruned_loss=0.03457, over 974361.37 frames.], batch size: 16, lr: 2.31e-04 2022-05-06 15:34:34,893 INFO [train.py:715] (0/8) Epoch 9, batch 29850, loss[loss=0.1403, simple_loss=0.1996, pruned_loss=0.04056, over 4748.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2147, pruned_loss=0.03433, over 973476.54 frames.], batch size: 19, lr: 2.31e-04 2022-05-06 15:35:13,059 INFO [train.py:715] (0/8) Epoch 9, batch 29900, loss[loss=0.1066, simple_loss=0.1827, pruned_loss=0.01528, over 4825.00 frames.], tot_loss[loss=0.1411, simple_loss=0.214, pruned_loss=0.03413, over 971931.21 frames.], batch size: 12, lr: 2.31e-04 2022-05-06 15:35:52,534 INFO [train.py:715] (0/8) Epoch 9, batch 29950, loss[loss=0.1599, simple_loss=0.2263, pruned_loss=0.04678, over 4973.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2149, pruned_loss=0.03437, over 971531.52 frames.], batch size: 15, lr: 2.31e-04 2022-05-06 15:36:31,403 INFO [train.py:715] (0/8) Epoch 9, batch 30000, loss[loss=0.1367, simple_loss=0.2008, pruned_loss=0.03625, over 4980.00 frames.], tot_loss[loss=0.142, simple_loss=0.2146, pruned_loss=0.03471, over 972309.62 frames.], batch size: 31, lr: 2.31e-04 2022-05-06 15:36:31,404 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 15:36:40,919 INFO [train.py:742] (0/8) Epoch 9, validation: loss=0.1068, simple_loss=0.1911, pruned_loss=0.01124, over 914524.00 frames. 
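A note on the checkpoint lines: the two checkpoints in this stretch of the log, checkpoint-336000.pt above and checkpoint-344000.pt a little further down, are 8,000 training batches apart, which points to a fixed save interval keyed on the global batch counter. A hedged sketch of that pattern, with the interval and file layout inferred from the file names; this is not the recipe's checkpoint.py helper:

    import torch

    SAVE_EVERY_N = 8000  # inferred from the 336000 -> 344000 checkpoint spacing
    EXP_DIR = "pruned_transducer_stateless2/exp/v2"  # directory seen in the log

    def maybe_save_checkpoint(model, optimizer, batch_idx_train: int) -> None:
        """Save a numbered checkpoint every SAVE_EVERY_N batches; illustrative only."""
        if batch_idx_train > 0 and batch_idx_train % SAVE_EVERY_N == 0:
            torch.save(
                {
                    "model": model.state_dict(),
                    "optimizer": optimizer.state_dict(),
                    "batch_idx_train": batch_idx_train,
                },
                f"{EXP_DIR}/checkpoint-{batch_idx_train}.pt",
            )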
2022-05-06 15:37:20,162 INFO [train.py:715] (0/8) Epoch 9, batch 30050, loss[loss=0.1182, simple_loss=0.187, pruned_loss=0.02472, over 4973.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2152, pruned_loss=0.03491, over 972426.57 frames.], batch size: 14, lr: 2.31e-04 2022-05-06 15:37:58,805 INFO [train.py:715] (0/8) Epoch 9, batch 30100, loss[loss=0.1347, simple_loss=0.2089, pruned_loss=0.03021, over 4833.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2149, pruned_loss=0.03485, over 971802.05 frames.], batch size: 13, lr: 2.31e-04 2022-05-06 15:38:38,124 INFO [train.py:715] (0/8) Epoch 9, batch 30150, loss[loss=0.1344, simple_loss=0.2081, pruned_loss=0.03038, over 4796.00 frames.], tot_loss[loss=0.142, simple_loss=0.2146, pruned_loss=0.03471, over 971338.83 frames.], batch size: 24, lr: 2.31e-04 2022-05-06 15:39:17,506 INFO [train.py:715] (0/8) Epoch 9, batch 30200, loss[loss=0.1258, simple_loss=0.1929, pruned_loss=0.02935, over 4791.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2143, pruned_loss=0.03437, over 971435.29 frames.], batch size: 17, lr: 2.31e-04 2022-05-06 15:39:56,687 INFO [train.py:715] (0/8) Epoch 9, batch 30250, loss[loss=0.1292, simple_loss=0.2082, pruned_loss=0.0251, over 4763.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2145, pruned_loss=0.03431, over 971947.18 frames.], batch size: 16, lr: 2.31e-04 2022-05-06 15:40:35,244 INFO [train.py:715] (0/8) Epoch 9, batch 30300, loss[loss=0.1421, simple_loss=0.2147, pruned_loss=0.03471, over 4897.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2143, pruned_loss=0.03429, over 972076.84 frames.], batch size: 19, lr: 2.31e-04 2022-05-06 15:41:14,056 INFO [train.py:715] (0/8) Epoch 9, batch 30350, loss[loss=0.1291, simple_loss=0.2032, pruned_loss=0.0275, over 4859.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2145, pruned_loss=0.0344, over 971205.11 frames.], batch size: 32, lr: 2.31e-04 2022-05-06 15:41:53,483 INFO [train.py:715] (0/8) Epoch 9, batch 30400, loss[loss=0.125, simple_loss=0.1954, pruned_loss=0.02731, over 4836.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2139, pruned_loss=0.03411, over 971929.77 frames.], batch size: 30, lr: 2.31e-04 2022-05-06 15:42:32,290 INFO [train.py:715] (0/8) Epoch 9, batch 30450, loss[loss=0.1342, simple_loss=0.2076, pruned_loss=0.03038, over 4823.00 frames.], tot_loss[loss=0.141, simple_loss=0.2139, pruned_loss=0.03404, over 971647.57 frames.], batch size: 13, lr: 2.31e-04 2022-05-06 15:43:10,914 INFO [train.py:715] (0/8) Epoch 9, batch 30500, loss[loss=0.1177, simple_loss=0.1852, pruned_loss=0.02506, over 4950.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2141, pruned_loss=0.03454, over 971872.44 frames.], batch size: 21, lr: 2.31e-04 2022-05-06 15:43:49,984 INFO [train.py:715] (0/8) Epoch 9, batch 30550, loss[loss=0.1673, simple_loss=0.2366, pruned_loss=0.04901, over 4692.00 frames.], tot_loss[loss=0.142, simple_loss=0.2147, pruned_loss=0.03466, over 971750.68 frames.], batch size: 15, lr: 2.31e-04 2022-05-06 15:44:28,846 INFO [train.py:715] (0/8) Epoch 9, batch 30600, loss[loss=0.1263, simple_loss=0.1971, pruned_loss=0.02771, over 4893.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2146, pruned_loss=0.03441, over 972634.99 frames.], batch size: 19, lr: 2.31e-04 2022-05-06 15:45:06,878 INFO [train.py:715] (0/8) Epoch 9, batch 30650, loss[loss=0.09447, simple_loss=0.1695, pruned_loss=0.009715, over 4817.00 frames.], tot_loss[loss=0.142, simple_loss=0.2146, pruned_loss=0.03466, over 972777.91 frames.], batch size: 13, lr: 2.31e-04 2022-05-06 15:45:45,879 INFO 
[train.py:715] (0/8) Epoch 9, batch 30700, loss[loss=0.1187, simple_loss=0.1917, pruned_loss=0.02288, over 4871.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2134, pruned_loss=0.03415, over 973106.34 frames.], batch size: 20, lr: 2.30e-04 2022-05-06 15:45:56,746 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-344000.pt 2022-05-06 15:46:27,569 INFO [train.py:715] (0/8) Epoch 9, batch 30750, loss[loss=0.1584, simple_loss=0.2324, pruned_loss=0.04223, over 4689.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2137, pruned_loss=0.03427, over 973148.73 frames.], batch size: 15, lr: 2.30e-04 2022-05-06 15:47:06,253 INFO [train.py:715] (0/8) Epoch 9, batch 30800, loss[loss=0.1273, simple_loss=0.1933, pruned_loss=0.03068, over 4897.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2137, pruned_loss=0.03435, over 972681.79 frames.], batch size: 19, lr: 2.30e-04 2022-05-06 15:47:44,602 INFO [train.py:715] (0/8) Epoch 9, batch 30850, loss[loss=0.1408, simple_loss=0.2189, pruned_loss=0.03132, over 4805.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2132, pruned_loss=0.03402, over 972159.36 frames.], batch size: 21, lr: 2.30e-04 2022-05-06 15:48:23,856 INFO [train.py:715] (0/8) Epoch 9, batch 30900, loss[loss=0.1275, simple_loss=0.1974, pruned_loss=0.02879, over 4869.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2134, pruned_loss=0.03422, over 972587.84 frames.], batch size: 32, lr: 2.30e-04 2022-05-06 15:49:03,043 INFO [train.py:715] (0/8) Epoch 9, batch 30950, loss[loss=0.1486, simple_loss=0.2213, pruned_loss=0.03797, over 4702.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2129, pruned_loss=0.0342, over 973046.89 frames.], batch size: 15, lr: 2.30e-04 2022-05-06 15:49:41,532 INFO [train.py:715] (0/8) Epoch 9, batch 31000, loss[loss=0.1089, simple_loss=0.1756, pruned_loss=0.02107, over 4825.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2134, pruned_loss=0.03444, over 972845.69 frames.], batch size: 13, lr: 2.30e-04 2022-05-06 15:50:20,506 INFO [train.py:715] (0/8) Epoch 9, batch 31050, loss[loss=0.1453, simple_loss=0.2316, pruned_loss=0.02952, over 4984.00 frames.], tot_loss[loss=0.1411, simple_loss=0.214, pruned_loss=0.03412, over 972267.28 frames.], batch size: 25, lr: 2.30e-04 2022-05-06 15:50:59,764 INFO [train.py:715] (0/8) Epoch 9, batch 31100, loss[loss=0.1305, simple_loss=0.2076, pruned_loss=0.02675, over 4933.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2141, pruned_loss=0.03412, over 972864.55 frames.], batch size: 21, lr: 2.30e-04 2022-05-06 15:51:38,433 INFO [train.py:715] (0/8) Epoch 9, batch 31150, loss[loss=0.1351, simple_loss=0.2076, pruned_loss=0.03126, over 4852.00 frames.], tot_loss[loss=0.1418, simple_loss=0.215, pruned_loss=0.03436, over 972084.98 frames.], batch size: 20, lr: 2.30e-04 2022-05-06 15:52:17,016 INFO [train.py:715] (0/8) Epoch 9, batch 31200, loss[loss=0.1143, simple_loss=0.194, pruned_loss=0.01737, over 4767.00 frames.], tot_loss[loss=0.1406, simple_loss=0.214, pruned_loss=0.03361, over 972325.54 frames.], batch size: 12, lr: 2.30e-04 2022-05-06 15:52:56,549 INFO [train.py:715] (0/8) Epoch 9, batch 31250, loss[loss=0.1275, simple_loss=0.2097, pruned_loss=0.02269, over 4921.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2144, pruned_loss=0.0337, over 971620.39 frames.], batch size: 23, lr: 2.30e-04 2022-05-06 15:53:35,996 INFO [train.py:715] (0/8) Epoch 9, batch 31300, loss[loss=0.2111, simple_loss=0.2635, pruned_loss=0.07935, over 4858.00 frames.], tot_loss[loss=0.1418, simple_loss=0.215, 
pruned_loss=0.0343, over 971434.80 frames.], batch size: 30, lr: 2.30e-04 2022-05-06 15:54:14,969 INFO [train.py:715] (0/8) Epoch 9, batch 31350, loss[loss=0.1301, simple_loss=0.205, pruned_loss=0.02755, over 4957.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2136, pruned_loss=0.03368, over 971953.62 frames.], batch size: 15, lr: 2.30e-04 2022-05-06 15:54:53,756 INFO [train.py:715] (0/8) Epoch 9, batch 31400, loss[loss=0.1399, simple_loss=0.2051, pruned_loss=0.0374, over 4783.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2133, pruned_loss=0.03362, over 972006.15 frames.], batch size: 12, lr: 2.30e-04 2022-05-06 15:55:32,698 INFO [train.py:715] (0/8) Epoch 9, batch 31450, loss[loss=0.1296, simple_loss=0.1995, pruned_loss=0.02985, over 4906.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2121, pruned_loss=0.03315, over 971851.35 frames.], batch size: 17, lr: 2.30e-04 2022-05-06 15:56:11,769 INFO [train.py:715] (0/8) Epoch 9, batch 31500, loss[loss=0.1496, simple_loss=0.227, pruned_loss=0.03615, over 4778.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2134, pruned_loss=0.03383, over 971692.26 frames.], batch size: 18, lr: 2.30e-04 2022-05-06 15:56:50,181 INFO [train.py:715] (0/8) Epoch 9, batch 31550, loss[loss=0.1714, simple_loss=0.2419, pruned_loss=0.05046, over 4964.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2139, pruned_loss=0.03395, over 972706.75 frames.], batch size: 15, lr: 2.30e-04 2022-05-06 15:57:29,717 INFO [train.py:715] (0/8) Epoch 9, batch 31600, loss[loss=0.126, simple_loss=0.194, pruned_loss=0.02905, over 4816.00 frames.], tot_loss[loss=0.141, simple_loss=0.2139, pruned_loss=0.03405, over 972781.88 frames.], batch size: 13, lr: 2.30e-04 2022-05-06 15:58:09,720 INFO [train.py:715] (0/8) Epoch 9, batch 31650, loss[loss=0.1434, simple_loss=0.2157, pruned_loss=0.03555, over 4772.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2139, pruned_loss=0.03414, over 974161.76 frames.], batch size: 18, lr: 2.30e-04 2022-05-06 15:58:48,440 INFO [train.py:715] (0/8) Epoch 9, batch 31700, loss[loss=0.1372, simple_loss=0.2062, pruned_loss=0.03414, over 4913.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2142, pruned_loss=0.03441, over 973635.66 frames.], batch size: 18, lr: 2.30e-04 2022-05-06 15:59:27,450 INFO [train.py:715] (0/8) Epoch 9, batch 31750, loss[loss=0.1401, simple_loss=0.2149, pruned_loss=0.03263, over 4846.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2144, pruned_loss=0.0344, over 974204.52 frames.], batch size: 32, lr: 2.30e-04 2022-05-06 16:00:06,078 INFO [train.py:715] (0/8) Epoch 9, batch 31800, loss[loss=0.1197, simple_loss=0.1943, pruned_loss=0.02258, over 4785.00 frames.], tot_loss[loss=0.1414, simple_loss=0.214, pruned_loss=0.03441, over 974197.54 frames.], batch size: 14, lr: 2.30e-04 2022-05-06 16:00:45,141 INFO [train.py:715] (0/8) Epoch 9, batch 31850, loss[loss=0.1449, simple_loss=0.2258, pruned_loss=0.03195, over 4816.00 frames.], tot_loss[loss=0.141, simple_loss=0.214, pruned_loss=0.03403, over 974039.38 frames.], batch size: 21, lr: 2.30e-04 2022-05-06 16:01:23,636 INFO [train.py:715] (0/8) Epoch 9, batch 31900, loss[loss=0.1414, simple_loss=0.2067, pruned_loss=0.03802, over 4779.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2143, pruned_loss=0.034, over 973134.56 frames.], batch size: 17, lr: 2.30e-04 2022-05-06 16:02:02,944 INFO [train.py:715] (0/8) Epoch 9, batch 31950, loss[loss=0.1132, simple_loss=0.1952, pruned_loss=0.01561, over 4762.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2136, pruned_loss=0.0338, over 973642.18 frames.], 
batch size: 19, lr: 2.30e-04 2022-05-06 16:02:42,218 INFO [train.py:715] (0/8) Epoch 9, batch 32000, loss[loss=0.1543, simple_loss=0.2316, pruned_loss=0.03848, over 4827.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2139, pruned_loss=0.03425, over 973469.78 frames.], batch size: 25, lr: 2.30e-04 2022-05-06 16:03:20,777 INFO [train.py:715] (0/8) Epoch 9, batch 32050, loss[loss=0.1438, simple_loss=0.2218, pruned_loss=0.03289, over 4898.00 frames.], tot_loss[loss=0.142, simple_loss=0.2147, pruned_loss=0.03467, over 973665.44 frames.], batch size: 17, lr: 2.30e-04 2022-05-06 16:03:59,266 INFO [train.py:715] (0/8) Epoch 9, batch 32100, loss[loss=0.1335, simple_loss=0.2028, pruned_loss=0.03206, over 4839.00 frames.], tot_loss[loss=0.1423, simple_loss=0.2149, pruned_loss=0.0349, over 972904.22 frames.], batch size: 30, lr: 2.30e-04 2022-05-06 16:04:38,259 INFO [train.py:715] (0/8) Epoch 9, batch 32150, loss[loss=0.1015, simple_loss=0.1848, pruned_loss=0.009083, over 4757.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2145, pruned_loss=0.03485, over 972699.63 frames.], batch size: 12, lr: 2.30e-04 2022-05-06 16:05:17,699 INFO [train.py:715] (0/8) Epoch 9, batch 32200, loss[loss=0.1398, simple_loss=0.2073, pruned_loss=0.03619, over 4971.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2137, pruned_loss=0.03443, over 973019.21 frames.], batch size: 25, lr: 2.30e-04 2022-05-06 16:05:55,459 INFO [train.py:715] (0/8) Epoch 9, batch 32250, loss[loss=0.1369, simple_loss=0.2072, pruned_loss=0.03325, over 4970.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2138, pruned_loss=0.03426, over 972415.45 frames.], batch size: 14, lr: 2.30e-04 2022-05-06 16:06:34,663 INFO [train.py:715] (0/8) Epoch 9, batch 32300, loss[loss=0.1269, simple_loss=0.2004, pruned_loss=0.02669, over 4883.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2141, pruned_loss=0.03448, over 972224.62 frames.], batch size: 19, lr: 2.30e-04 2022-05-06 16:07:13,840 INFO [train.py:715] (0/8) Epoch 9, batch 32350, loss[loss=0.138, simple_loss=0.2117, pruned_loss=0.03219, over 4950.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2134, pruned_loss=0.03402, over 971966.96 frames.], batch size: 21, lr: 2.30e-04 2022-05-06 16:07:52,329 INFO [train.py:715] (0/8) Epoch 9, batch 32400, loss[loss=0.1583, simple_loss=0.2372, pruned_loss=0.03971, over 4861.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2137, pruned_loss=0.03444, over 972058.89 frames.], batch size: 20, lr: 2.30e-04 2022-05-06 16:08:31,412 INFO [train.py:715] (0/8) Epoch 9, batch 32450, loss[loss=0.1242, simple_loss=0.1923, pruned_loss=0.02804, over 4957.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2139, pruned_loss=0.03482, over 972595.28 frames.], batch size: 25, lr: 2.30e-04 2022-05-06 16:09:10,514 INFO [train.py:715] (0/8) Epoch 9, batch 32500, loss[loss=0.1232, simple_loss=0.199, pruned_loss=0.02372, over 4854.00 frames.], tot_loss[loss=0.141, simple_loss=0.2135, pruned_loss=0.03425, over 972488.55 frames.], batch size: 13, lr: 2.30e-04 2022-05-06 16:09:49,356 INFO [train.py:715] (0/8) Epoch 9, batch 32550, loss[loss=0.1306, simple_loss=0.2088, pruned_loss=0.02624, over 4818.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2136, pruned_loss=0.03445, over 971672.80 frames.], batch size: 15, lr: 2.30e-04 2022-05-06 16:10:27,860 INFO [train.py:715] (0/8) Epoch 9, batch 32600, loss[loss=0.1442, simple_loss=0.2289, pruned_loss=0.02978, over 4743.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2141, pruned_loss=0.03454, over 972160.49 frames.], batch size: 16, lr: 2.30e-04 
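Each record carries two loss groups: `loss[...]` is computed over the current batch (a few thousand frames), while `tot_loss[...]` is a frame-weighted aggregate over recent batches, which is why it is reported "over" roughly 970k frames. The sketch below illustrates that kind of frame-weighted bookkeeping only; it is not necessarily the recipe's exact implementation (the recipe appears to also decay old statistics, which would explain why the frame count hovers around 970k instead of growing). The example values are taken from two batch records above.

```python
from collections import defaultdict


class RunningLoss:
    """Frame-weighted running averages, illustrating how a line like
    tot_loss[loss=..., over 972006.15 frames.] can be produced (sketch only)."""

    def __init__(self) -> None:
        self.sums = defaultdict(float)  # per-metric sum of value * frames
        self.frames = 0.0

    def update(self, metrics: dict, num_frames: float) -> None:
        for name, value in metrics.items():
            self.sums[name] += value * num_frames
        self.frames += num_frames

    def averages(self) -> dict:
        return {name: total / self.frames for name, total in self.sums.items()}


tracker = RunningLoss()
# Values from the Epoch 9, batch 31400 and 31450 records above.
tracker.update({"loss": 0.1399, "simple_loss": 0.2051, "pruned_loss": 0.0374}, 4783)
tracker.update({"loss": 0.1296, "simple_loss": 0.1995, "pruned_loss": 0.02985}, 4906)
print(tracker.averages())  # frame-weighted, not a plain mean of the two batches
```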
2022-05-06 16:11:06,892 INFO [train.py:715] (0/8) Epoch 9, batch 32650, loss[loss=0.1758, simple_loss=0.2432, pruned_loss=0.05419, over 4956.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2151, pruned_loss=0.03494, over 972053.49 frames.], batch size: 21, lr: 2.30e-04 2022-05-06 16:11:45,870 INFO [train.py:715] (0/8) Epoch 9, batch 32700, loss[loss=0.1332, simple_loss=0.2077, pruned_loss=0.0294, over 4933.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2143, pruned_loss=0.03471, over 971212.54 frames.], batch size: 18, lr: 2.30e-04 2022-05-06 16:12:24,793 INFO [train.py:715] (0/8) Epoch 9, batch 32750, loss[loss=0.1521, simple_loss=0.2236, pruned_loss=0.04028, over 4882.00 frames.], tot_loss[loss=0.141, simple_loss=0.2133, pruned_loss=0.03431, over 972182.21 frames.], batch size: 22, lr: 2.30e-04 2022-05-06 16:13:03,519 INFO [train.py:715] (0/8) Epoch 9, batch 32800, loss[loss=0.1456, simple_loss=0.2082, pruned_loss=0.04148, over 4837.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2138, pruned_loss=0.03438, over 972452.89 frames.], batch size: 30, lr: 2.30e-04 2022-05-06 16:13:42,560 INFO [train.py:715] (0/8) Epoch 9, batch 32850, loss[loss=0.1481, simple_loss=0.2306, pruned_loss=0.03285, over 4827.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2148, pruned_loss=0.03483, over 971879.40 frames.], batch size: 25, lr: 2.30e-04 2022-05-06 16:14:21,302 INFO [train.py:715] (0/8) Epoch 9, batch 32900, loss[loss=0.1169, simple_loss=0.1871, pruned_loss=0.02337, over 4650.00 frames.], tot_loss[loss=0.142, simple_loss=0.2147, pruned_loss=0.03468, over 972398.54 frames.], batch size: 13, lr: 2.30e-04 2022-05-06 16:14:59,681 INFO [train.py:715] (0/8) Epoch 9, batch 32950, loss[loss=0.1342, simple_loss=0.2104, pruned_loss=0.02894, over 4791.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2144, pruned_loss=0.03461, over 973045.34 frames.], batch size: 18, lr: 2.30e-04 2022-05-06 16:15:38,638 INFO [train.py:715] (0/8) Epoch 9, batch 33000, loss[loss=0.1256, simple_loss=0.1911, pruned_loss=0.03008, over 4764.00 frames.], tot_loss[loss=0.1419, simple_loss=0.2142, pruned_loss=0.0348, over 973613.60 frames.], batch size: 12, lr: 2.30e-04 2022-05-06 16:15:38,639 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 16:15:48,001 INFO [train.py:742] (0/8) Epoch 9, validation: loss=0.1068, simple_loss=0.1913, pruned_loss=0.01119, over 914524.00 frames. 
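The "Computing validation loss" records appear every 3000 training batches and always report the same 914524.00 frames, i.e. the full dev set is rescored each time before training resumes. A minimal sketch of that pattern is below, assuming a standard PyTorch setup; `model`, `valid_dl`, and `compute_loss` are placeholder names for illustration, not identifiers from the recipe.

```python
import torch


def validate(model: torch.nn.Module, valid_dl, compute_loss) -> float:
    """Score the whole dev set once, mirroring the periodic
    'Computing validation loss' records in the log (sketch only)."""
    was_training = model.training
    model.eval()
    total_loss, total_frames = 0.0, 0.0
    with torch.no_grad():
        for batch in valid_dl:
            # compute_loss is assumed to return a per-frame loss and frame count.
            loss, num_frames = compute_loss(model, batch)
            total_loss += loss.item() * num_frames
            total_frames += num_frames
    if was_training:
        model.train()
    return total_loss / total_frames  # frame-weighted, like the logged value


# Inside the training loop (illustrative):
# if batch_idx % valid_interval == 0:
#     valid_loss = validate(model, valid_dl, compute_loss)
```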
2022-05-06 16:16:27,262 INFO [train.py:715] (0/8) Epoch 9, batch 33050, loss[loss=0.1622, simple_loss=0.2294, pruned_loss=0.04749, over 4970.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2145, pruned_loss=0.03482, over 973026.30 frames.], batch size: 15, lr: 2.30e-04 2022-05-06 16:17:06,452 INFO [train.py:715] (0/8) Epoch 9, batch 33100, loss[loss=0.1052, simple_loss=0.176, pruned_loss=0.01721, over 4810.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2141, pruned_loss=0.03441, over 972617.13 frames.], batch size: 21, lr: 2.30e-04 2022-05-06 16:17:45,624 INFO [train.py:715] (0/8) Epoch 9, batch 33150, loss[loss=0.1066, simple_loss=0.1817, pruned_loss=0.01572, over 4805.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2142, pruned_loss=0.03461, over 972600.32 frames.], batch size: 21, lr: 2.30e-04 2022-05-06 16:18:25,448 INFO [train.py:715] (0/8) Epoch 9, batch 33200, loss[loss=0.1684, simple_loss=0.2438, pruned_loss=0.04648, over 4970.00 frames.], tot_loss[loss=0.142, simple_loss=0.2147, pruned_loss=0.03472, over 972501.48 frames.], batch size: 24, lr: 2.30e-04 2022-05-06 16:19:04,996 INFO [train.py:715] (0/8) Epoch 9, batch 33250, loss[loss=0.1378, simple_loss=0.2118, pruned_loss=0.03191, over 4873.00 frames.], tot_loss[loss=0.142, simple_loss=0.215, pruned_loss=0.03453, over 973090.10 frames.], batch size: 16, lr: 2.30e-04 2022-05-06 16:19:44,054 INFO [train.py:715] (0/8) Epoch 9, batch 33300, loss[loss=0.1233, simple_loss=0.1957, pruned_loss=0.02542, over 4799.00 frames.], tot_loss[loss=0.1426, simple_loss=0.2153, pruned_loss=0.03491, over 973836.43 frames.], batch size: 13, lr: 2.30e-04 2022-05-06 16:20:23,552 INFO [train.py:715] (0/8) Epoch 9, batch 33350, loss[loss=0.1295, simple_loss=0.2112, pruned_loss=0.02387, over 4780.00 frames.], tot_loss[loss=0.1425, simple_loss=0.2155, pruned_loss=0.03481, over 973208.07 frames.], batch size: 18, lr: 2.30e-04 2022-05-06 16:21:03,299 INFO [train.py:715] (0/8) Epoch 9, batch 33400, loss[loss=0.1531, simple_loss=0.2232, pruned_loss=0.0415, over 4903.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2149, pruned_loss=0.0348, over 973441.14 frames.], batch size: 19, lr: 2.30e-04 2022-05-06 16:21:43,054 INFO [train.py:715] (0/8) Epoch 9, batch 33450, loss[loss=0.1435, simple_loss=0.224, pruned_loss=0.03149, over 4944.00 frames.], tot_loss[loss=0.142, simple_loss=0.2144, pruned_loss=0.03479, over 972363.36 frames.], batch size: 39, lr: 2.30e-04 2022-05-06 16:22:22,074 INFO [train.py:715] (0/8) Epoch 9, batch 33500, loss[loss=0.1534, simple_loss=0.2235, pruned_loss=0.04168, over 4922.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2138, pruned_loss=0.03426, over 971979.41 frames.], batch size: 23, lr: 2.30e-04 2022-05-06 16:23:00,822 INFO [train.py:715] (0/8) Epoch 9, batch 33550, loss[loss=0.1498, simple_loss=0.2327, pruned_loss=0.03342, over 4709.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2134, pruned_loss=0.0339, over 972081.54 frames.], batch size: 15, lr: 2.30e-04 2022-05-06 16:23:40,546 INFO [train.py:715] (0/8) Epoch 9, batch 33600, loss[loss=0.1446, simple_loss=0.2112, pruned_loss=0.03899, over 4906.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2141, pruned_loss=0.03421, over 973018.32 frames.], batch size: 19, lr: 2.30e-04 2022-05-06 16:24:19,318 INFO [train.py:715] (0/8) Epoch 9, batch 33650, loss[loss=0.1371, simple_loss=0.2038, pruned_loss=0.03519, over 4764.00 frames.], tot_loss[loss=0.1414, simple_loss=0.214, pruned_loss=0.03433, over 972937.14 frames.], batch size: 18, lr: 2.30e-04 2022-05-06 16:24:58,234 INFO 
[train.py:715] (0/8) Epoch 9, batch 33700, loss[loss=0.1223, simple_loss=0.2078, pruned_loss=0.01835, over 4780.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2136, pruned_loss=0.03426, over 972101.38 frames.], batch size: 18, lr: 2.29e-04 2022-05-06 16:25:37,405 INFO [train.py:715] (0/8) Epoch 9, batch 33750, loss[loss=0.1441, simple_loss=0.2094, pruned_loss=0.0394, over 4859.00 frames.], tot_loss[loss=0.1407, simple_loss=0.213, pruned_loss=0.03415, over 972522.20 frames.], batch size: 13, lr: 2.29e-04 2022-05-06 16:26:16,196 INFO [train.py:715] (0/8) Epoch 9, batch 33800, loss[loss=0.1483, simple_loss=0.22, pruned_loss=0.03832, over 4848.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2124, pruned_loss=0.0339, over 972057.10 frames.], batch size: 32, lr: 2.29e-04 2022-05-06 16:26:54,910 INFO [train.py:715] (0/8) Epoch 9, batch 33850, loss[loss=0.1358, simple_loss=0.2178, pruned_loss=0.02694, over 4965.00 frames.], tot_loss[loss=0.14, simple_loss=0.2124, pruned_loss=0.03382, over 971841.63 frames.], batch size: 28, lr: 2.29e-04 2022-05-06 16:27:33,753 INFO [train.py:715] (0/8) Epoch 9, batch 33900, loss[loss=0.1318, simple_loss=0.2058, pruned_loss=0.02887, over 4979.00 frames.], tot_loss[loss=0.1409, simple_loss=0.213, pruned_loss=0.03439, over 971538.82 frames.], batch size: 25, lr: 2.29e-04 2022-05-06 16:28:13,483 INFO [train.py:715] (0/8) Epoch 9, batch 33950, loss[loss=0.1373, simple_loss=0.215, pruned_loss=0.02987, over 4803.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2128, pruned_loss=0.03415, over 971494.38 frames.], batch size: 21, lr: 2.29e-04 2022-05-06 16:28:52,280 INFO [train.py:715] (0/8) Epoch 9, batch 34000, loss[loss=0.1442, simple_loss=0.2243, pruned_loss=0.0321, over 4648.00 frames.], tot_loss[loss=0.1405, simple_loss=0.213, pruned_loss=0.03398, over 972230.35 frames.], batch size: 13, lr: 2.29e-04 2022-05-06 16:29:31,510 INFO [train.py:715] (0/8) Epoch 9, batch 34050, loss[loss=0.1287, simple_loss=0.1944, pruned_loss=0.03151, over 4836.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2138, pruned_loss=0.03441, over 972613.92 frames.], batch size: 15, lr: 2.29e-04 2022-05-06 16:30:09,973 INFO [train.py:715] (0/8) Epoch 9, batch 34100, loss[loss=0.1376, simple_loss=0.2136, pruned_loss=0.0308, over 4972.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2132, pruned_loss=0.03374, over 972582.51 frames.], batch size: 24, lr: 2.29e-04 2022-05-06 16:30:49,073 INFO [train.py:715] (0/8) Epoch 9, batch 34150, loss[loss=0.1388, simple_loss=0.2198, pruned_loss=0.02887, over 4845.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2129, pruned_loss=0.03385, over 972643.28 frames.], batch size: 30, lr: 2.29e-04 2022-05-06 16:31:27,538 INFO [train.py:715] (0/8) Epoch 9, batch 34200, loss[loss=0.1757, simple_loss=0.2474, pruned_loss=0.05195, over 4795.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2127, pruned_loss=0.03374, over 972773.82 frames.], batch size: 24, lr: 2.29e-04 2022-05-06 16:32:05,777 INFO [train.py:715] (0/8) Epoch 9, batch 34250, loss[loss=0.1272, simple_loss=0.1955, pruned_loss=0.02949, over 4926.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2124, pruned_loss=0.03353, over 972498.57 frames.], batch size: 23, lr: 2.29e-04 2022-05-06 16:32:45,090 INFO [train.py:715] (0/8) Epoch 9, batch 34300, loss[loss=0.1371, simple_loss=0.2146, pruned_loss=0.02977, over 4827.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2134, pruned_loss=0.03418, over 972791.42 frames.], batch size: 27, lr: 2.29e-04 2022-05-06 16:33:23,848 INFO [train.py:715] (0/8) Epoch 9, batch 34350, 
loss[loss=0.1323, simple_loss=0.2144, pruned_loss=0.02508, over 4752.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2133, pruned_loss=0.03419, over 972655.35 frames.], batch size: 16, lr: 2.29e-04 2022-05-06 16:34:02,522 INFO [train.py:715] (0/8) Epoch 9, batch 34400, loss[loss=0.1192, simple_loss=0.1949, pruned_loss=0.02172, over 4818.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2141, pruned_loss=0.03429, over 971708.31 frames.], batch size: 25, lr: 2.29e-04 2022-05-06 16:34:41,407 INFO [train.py:715] (0/8) Epoch 9, batch 34450, loss[loss=0.1363, simple_loss=0.2151, pruned_loss=0.02872, over 4768.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2139, pruned_loss=0.03448, over 970954.55 frames.], batch size: 14, lr: 2.29e-04 2022-05-06 16:35:20,341 INFO [train.py:715] (0/8) Epoch 9, batch 34500, loss[loss=0.1265, simple_loss=0.1999, pruned_loss=0.02657, over 4839.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2144, pruned_loss=0.03438, over 971683.64 frames.], batch size: 26, lr: 2.29e-04 2022-05-06 16:35:59,374 INFO [train.py:715] (0/8) Epoch 9, batch 34550, loss[loss=0.1294, simple_loss=0.2095, pruned_loss=0.02464, over 4889.00 frames.], tot_loss[loss=0.1427, simple_loss=0.2158, pruned_loss=0.03479, over 971357.74 frames.], batch size: 22, lr: 2.29e-04 2022-05-06 16:36:38,003 INFO [train.py:715] (0/8) Epoch 9, batch 34600, loss[loss=0.1465, simple_loss=0.21, pruned_loss=0.04145, over 4871.00 frames.], tot_loss[loss=0.1422, simple_loss=0.2153, pruned_loss=0.0346, over 971542.14 frames.], batch size: 32, lr: 2.29e-04 2022-05-06 16:37:17,105 INFO [train.py:715] (0/8) Epoch 9, batch 34650, loss[loss=0.1285, simple_loss=0.208, pruned_loss=0.02445, over 4961.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2144, pruned_loss=0.03423, over 971446.47 frames.], batch size: 24, lr: 2.29e-04 2022-05-06 16:37:56,492 INFO [train.py:715] (0/8) Epoch 9, batch 34700, loss[loss=0.1434, simple_loss=0.2125, pruned_loss=0.03714, over 4861.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2137, pruned_loss=0.03379, over 971836.98 frames.], batch size: 32, lr: 2.29e-04 2022-05-06 16:38:34,783 INFO [train.py:715] (0/8) Epoch 9, batch 34750, loss[loss=0.12, simple_loss=0.1885, pruned_loss=0.02571, over 4794.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2138, pruned_loss=0.03382, over 971307.68 frames.], batch size: 18, lr: 2.29e-04 2022-05-06 16:39:12,243 INFO [train.py:715] (0/8) Epoch 9, batch 34800, loss[loss=0.117, simple_loss=0.1721, pruned_loss=0.03092, over 4769.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2134, pruned_loss=0.0339, over 971053.35 frames.], batch size: 12, lr: 2.29e-04 2022-05-06 16:39:20,977 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/epoch-9.pt 2022-05-06 16:40:01,152 INFO [train.py:715] (0/8) Epoch 10, batch 0, loss[loss=0.156, simple_loss=0.231, pruned_loss=0.04054, over 4934.00 frames.], tot_loss[loss=0.156, simple_loss=0.231, pruned_loss=0.04054, over 4934.00 frames.], batch size: 18, lr: 2.19e-04 2022-05-06 16:40:41,028 INFO [train.py:715] (0/8) Epoch 10, batch 50, loss[loss=0.1615, simple_loss=0.2377, pruned_loss=0.04263, over 4927.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2105, pruned_loss=0.03364, over 218886.00 frames.], batch size: 18, lr: 2.19e-04 2022-05-06 16:41:20,753 INFO [train.py:715] (0/8) Epoch 10, batch 100, loss[loss=0.125, simple_loss=0.1937, pruned_loss=0.0282, over 4952.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2135, pruned_loss=0.03483, over 385719.38 frames.], batch size: 29, lr: 2.19e-04 
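The learning rate in these records decays slowly within an epoch (2.31e-04 down to 2.29e-04 across epoch 9) and steps down at the epoch boundary (2.29e-04 to 2.19e-04 when epoch 10 begins). That behaviour matches an Eden-style schedule in which the rate is discounted by both the global batch count and the epoch count. The sketch below reproduces the logged rates under assumed hyperparameters: initial_lr=0.003, lr_batches=5000, lr_epochs=4, and a global batch index of roughly 348000 at the boundary (the checkpoint above was written at batch 344000, a few thousand batches earlier). These are assumptions chosen to fit the logged values, not figures read from this excerpt.

```python
def eden_lr(base_lr: float, batch: int, epoch: int,
            lr_batches: float = 5000.0, lr_epochs: float = 4.0) -> float:
    """Eden-style schedule: decay driven by both batch and epoch counts.
    All hyperparameters here are assumptions that reproduce the logged rates."""
    batch_factor = ((batch ** 2 + lr_batches ** 2) / lr_batches ** 2) ** -0.25
    epoch_factor = ((epoch ** 2 + lr_epochs ** 2) / lr_epochs ** 2) ** -0.25
    return base_lr * batch_factor * epoch_factor


# Around the epoch 9 -> epoch 10 boundary above (global batch ~348000 assumed):
print(f"{eden_lr(0.003, 348000, 9):.2e}")   # ~2.29e-04, end of epoch 9
print(f"{eden_lr(0.003, 348000, 10):.2e}")  # ~2.19e-04, start of epoch 10
```

The log also shows two kinds of checkpoints: batch-indexed snapshots such as checkpoint-344000.pt above, and an epoch-N.pt snapshot written at each epoch boundary (epoch-9.pt just before epoch 10 starts).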
2022-05-06 16:42:00,750 INFO [train.py:715] (0/8) Epoch 10, batch 150, loss[loss=0.1383, simple_loss=0.2023, pruned_loss=0.03717, over 4815.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2127, pruned_loss=0.0344, over 515851.19 frames.], batch size: 12, lr: 2.19e-04 2022-05-06 16:42:41,365 INFO [train.py:715] (0/8) Epoch 10, batch 200, loss[loss=0.1285, simple_loss=0.2105, pruned_loss=0.02324, over 4691.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2113, pruned_loss=0.03365, over 616551.50 frames.], batch size: 15, lr: 2.19e-04 2022-05-06 16:43:22,395 INFO [train.py:715] (0/8) Epoch 10, batch 250, loss[loss=0.19, simple_loss=0.2507, pruned_loss=0.06469, over 4872.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2119, pruned_loss=0.0336, over 695398.50 frames.], batch size: 16, lr: 2.19e-04 2022-05-06 16:44:03,217 INFO [train.py:715] (0/8) Epoch 10, batch 300, loss[loss=0.1257, simple_loss=0.1953, pruned_loss=0.02809, over 4935.00 frames.], tot_loss[loss=0.1392, simple_loss=0.212, pruned_loss=0.03318, over 757569.39 frames.], batch size: 21, lr: 2.19e-04 2022-05-06 16:44:43,667 INFO [train.py:715] (0/8) Epoch 10, batch 350, loss[loss=0.1288, simple_loss=0.2078, pruned_loss=0.02487, over 4955.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2119, pruned_loss=0.03313, over 805811.61 frames.], batch size: 15, lr: 2.19e-04 2022-05-06 16:45:25,021 INFO [train.py:715] (0/8) Epoch 10, batch 400, loss[loss=0.1352, simple_loss=0.2207, pruned_loss=0.02482, over 4750.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2122, pruned_loss=0.03312, over 842844.30 frames.], batch size: 19, lr: 2.19e-04 2022-05-06 16:46:06,716 INFO [train.py:715] (0/8) Epoch 10, batch 450, loss[loss=0.1571, simple_loss=0.2224, pruned_loss=0.04588, over 4947.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2126, pruned_loss=0.03298, over 871577.01 frames.], batch size: 39, lr: 2.19e-04 2022-05-06 16:46:47,445 INFO [train.py:715] (0/8) Epoch 10, batch 500, loss[loss=0.1343, simple_loss=0.194, pruned_loss=0.03733, over 4823.00 frames.], tot_loss[loss=0.14, simple_loss=0.2132, pruned_loss=0.03345, over 894814.75 frames.], batch size: 13, lr: 2.19e-04 2022-05-06 16:47:28,879 INFO [train.py:715] (0/8) Epoch 10, batch 550, loss[loss=0.1275, simple_loss=0.2022, pruned_loss=0.02639, over 4832.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2121, pruned_loss=0.03312, over 912125.78 frames.], batch size: 26, lr: 2.19e-04 2022-05-06 16:48:10,019 INFO [train.py:715] (0/8) Epoch 10, batch 600, loss[loss=0.1172, simple_loss=0.197, pruned_loss=0.01868, over 4897.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2124, pruned_loss=0.03306, over 926489.09 frames.], batch size: 19, lr: 2.19e-04 2022-05-06 16:48:50,533 INFO [train.py:715] (0/8) Epoch 10, batch 650, loss[loss=0.1727, simple_loss=0.2428, pruned_loss=0.05128, over 4915.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2127, pruned_loss=0.03339, over 935920.10 frames.], batch size: 18, lr: 2.19e-04 2022-05-06 16:49:31,186 INFO [train.py:715] (0/8) Epoch 10, batch 700, loss[loss=0.1864, simple_loss=0.2518, pruned_loss=0.06048, over 4869.00 frames.], tot_loss[loss=0.1413, simple_loss=0.214, pruned_loss=0.03431, over 944321.35 frames.], batch size: 20, lr: 2.19e-04 2022-05-06 16:50:12,723 INFO [train.py:715] (0/8) Epoch 10, batch 750, loss[loss=0.1453, simple_loss=0.2153, pruned_loss=0.03764, over 4828.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2131, pruned_loss=0.0339, over 949446.54 frames.], batch size: 13, lr: 2.19e-04 2022-05-06 16:50:54,000 INFO [train.py:715] (0/8) Epoch 
10, batch 800, loss[loss=0.1396, simple_loss=0.2169, pruned_loss=0.03116, over 4822.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2128, pruned_loss=0.03335, over 954955.23 frames.], batch size: 26, lr: 2.19e-04 2022-05-06 16:51:34,423 INFO [train.py:715] (0/8) Epoch 10, batch 850, loss[loss=0.1826, simple_loss=0.258, pruned_loss=0.05355, over 4934.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2127, pruned_loss=0.03339, over 959407.63 frames.], batch size: 35, lr: 2.19e-04 2022-05-06 16:52:15,217 INFO [train.py:715] (0/8) Epoch 10, batch 900, loss[loss=0.1462, simple_loss=0.2277, pruned_loss=0.03235, over 4929.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2129, pruned_loss=0.03344, over 963043.55 frames.], batch size: 18, lr: 2.19e-04 2022-05-06 16:52:55,736 INFO [train.py:715] (0/8) Epoch 10, batch 950, loss[loss=0.1405, simple_loss=0.2084, pruned_loss=0.03631, over 4899.00 frames.], tot_loss[loss=0.14, simple_loss=0.2129, pruned_loss=0.03351, over 965760.79 frames.], batch size: 32, lr: 2.19e-04 2022-05-06 16:53:35,731 INFO [train.py:715] (0/8) Epoch 10, batch 1000, loss[loss=0.151, simple_loss=0.2209, pruned_loss=0.04052, over 4937.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2135, pruned_loss=0.03388, over 967144.07 frames.], batch size: 23, lr: 2.19e-04 2022-05-06 16:54:14,957 INFO [train.py:715] (0/8) Epoch 10, batch 1050, loss[loss=0.12, simple_loss=0.2032, pruned_loss=0.01845, over 4829.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2123, pruned_loss=0.0332, over 967977.48 frames.], batch size: 13, lr: 2.19e-04 2022-05-06 16:54:55,330 INFO [train.py:715] (0/8) Epoch 10, batch 1100, loss[loss=0.1505, simple_loss=0.2152, pruned_loss=0.04291, over 4949.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2126, pruned_loss=0.0331, over 969126.67 frames.], batch size: 21, lr: 2.19e-04 2022-05-06 16:55:34,628 INFO [train.py:715] (0/8) Epoch 10, batch 1150, loss[loss=0.1279, simple_loss=0.199, pruned_loss=0.02835, over 4912.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2121, pruned_loss=0.03299, over 969889.13 frames.], batch size: 19, lr: 2.19e-04 2022-05-06 16:56:13,827 INFO [train.py:715] (0/8) Epoch 10, batch 1200, loss[loss=0.1453, simple_loss=0.2272, pruned_loss=0.03173, over 4770.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2126, pruned_loss=0.03312, over 969307.98 frames.], batch size: 19, lr: 2.19e-04 2022-05-06 16:56:53,599 INFO [train.py:715] (0/8) Epoch 10, batch 1250, loss[loss=0.1429, simple_loss=0.2231, pruned_loss=0.03136, over 4984.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2125, pruned_loss=0.0331, over 970162.48 frames.], batch size: 24, lr: 2.19e-04 2022-05-06 16:57:32,222 INFO [train.py:715] (0/8) Epoch 10, batch 1300, loss[loss=0.1358, simple_loss=0.2129, pruned_loss=0.02937, over 4822.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2129, pruned_loss=0.0333, over 969528.58 frames.], batch size: 25, lr: 2.19e-04 2022-05-06 16:58:11,018 INFO [train.py:715] (0/8) Epoch 10, batch 1350, loss[loss=0.1261, simple_loss=0.1887, pruned_loss=0.03176, over 4966.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2127, pruned_loss=0.03304, over 969676.46 frames.], batch size: 35, lr: 2.19e-04 2022-05-06 16:58:49,195 INFO [train.py:715] (0/8) Epoch 10, batch 1400, loss[loss=0.1282, simple_loss=0.1973, pruned_loss=0.02951, over 4953.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2129, pruned_loss=0.03329, over 969578.89 frames.], batch size: 21, lr: 2.19e-04 2022-05-06 16:59:28,744 INFO [train.py:715] (0/8) Epoch 10, batch 1450, loss[loss=0.1564, 
simple_loss=0.2135, pruned_loss=0.04962, over 4859.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2129, pruned_loss=0.03401, over 970754.44 frames.], batch size: 30, lr: 2.19e-04 2022-05-06 17:00:07,715 INFO [train.py:715] (0/8) Epoch 10, batch 1500, loss[loss=0.1363, simple_loss=0.2179, pruned_loss=0.02729, over 4799.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2127, pruned_loss=0.03395, over 970486.87 frames.], batch size: 21, lr: 2.19e-04 2022-05-06 17:00:46,474 INFO [train.py:715] (0/8) Epoch 10, batch 1550, loss[loss=0.1146, simple_loss=0.1909, pruned_loss=0.01919, over 4903.00 frames.], tot_loss[loss=0.1405, simple_loss=0.213, pruned_loss=0.03394, over 970781.49 frames.], batch size: 17, lr: 2.19e-04 2022-05-06 17:01:25,568 INFO [train.py:715] (0/8) Epoch 10, batch 1600, loss[loss=0.1448, simple_loss=0.214, pruned_loss=0.03782, over 4850.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2137, pruned_loss=0.03393, over 970816.06 frames.], batch size: 20, lr: 2.19e-04 2022-05-06 17:02:04,988 INFO [train.py:715] (0/8) Epoch 10, batch 1650, loss[loss=0.1367, simple_loss=0.2103, pruned_loss=0.03152, over 4835.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2132, pruned_loss=0.03346, over 971293.09 frames.], batch size: 15, lr: 2.19e-04 2022-05-06 17:02:43,706 INFO [train.py:715] (0/8) Epoch 10, batch 1700, loss[loss=0.1465, simple_loss=0.2255, pruned_loss=0.03374, over 4840.00 frames.], tot_loss[loss=0.1398, simple_loss=0.213, pruned_loss=0.03332, over 971139.91 frames.], batch size: 30, lr: 2.19e-04 2022-05-06 17:03:22,052 INFO [train.py:715] (0/8) Epoch 10, batch 1750, loss[loss=0.132, simple_loss=0.2109, pruned_loss=0.02651, over 4770.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2133, pruned_loss=0.03397, over 970960.96 frames.], batch size: 12, lr: 2.19e-04 2022-05-06 17:04:02,176 INFO [train.py:715] (0/8) Epoch 10, batch 1800, loss[loss=0.1393, simple_loss=0.2092, pruned_loss=0.03467, over 4916.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2131, pruned_loss=0.03399, over 971644.94 frames.], batch size: 19, lr: 2.19e-04 2022-05-06 17:04:41,814 INFO [train.py:715] (0/8) Epoch 10, batch 1850, loss[loss=0.1494, simple_loss=0.2177, pruned_loss=0.04057, over 4805.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2131, pruned_loss=0.03408, over 970996.04 frames.], batch size: 21, lr: 2.19e-04 2022-05-06 17:05:20,552 INFO [train.py:715] (0/8) Epoch 10, batch 1900, loss[loss=0.1584, simple_loss=0.2455, pruned_loss=0.03561, over 4879.00 frames.], tot_loss[loss=0.1394, simple_loss=0.212, pruned_loss=0.03342, over 971606.89 frames.], batch size: 16, lr: 2.19e-04 2022-05-06 17:05:59,510 INFO [train.py:715] (0/8) Epoch 10, batch 1950, loss[loss=0.156, simple_loss=0.2198, pruned_loss=0.04606, over 4788.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2121, pruned_loss=0.03351, over 971795.96 frames.], batch size: 21, lr: 2.18e-04 2022-05-06 17:06:39,836 INFO [train.py:715] (0/8) Epoch 10, batch 2000, loss[loss=0.1426, simple_loss=0.2125, pruned_loss=0.0363, over 4874.00 frames.], tot_loss[loss=0.1396, simple_loss=0.212, pruned_loss=0.03362, over 972402.45 frames.], batch size: 32, lr: 2.18e-04 2022-05-06 17:07:19,136 INFO [train.py:715] (0/8) Epoch 10, batch 2050, loss[loss=0.1301, simple_loss=0.2132, pruned_loss=0.02355, over 4832.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2122, pruned_loss=0.03363, over 972988.84 frames.], batch size: 26, lr: 2.18e-04 2022-05-06 17:07:57,715 INFO [train.py:715] (0/8) Epoch 10, batch 2100, loss[loss=0.1073, simple_loss=0.1809, pruned_loss=0.0169, 
over 4756.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2119, pruned_loss=0.03333, over 972455.06 frames.], batch size: 19, lr: 2.18e-04 2022-05-06 17:08:37,348 INFO [train.py:715] (0/8) Epoch 10, batch 2150, loss[loss=0.142, simple_loss=0.2222, pruned_loss=0.03097, over 4875.00 frames.], tot_loss[loss=0.1398, simple_loss=0.213, pruned_loss=0.03332, over 973166.94 frames.], batch size: 22, lr: 2.18e-04 2022-05-06 17:09:16,486 INFO [train.py:715] (0/8) Epoch 10, batch 2200, loss[loss=0.1551, simple_loss=0.2184, pruned_loss=0.04591, over 4858.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2124, pruned_loss=0.03307, over 972671.31 frames.], batch size: 32, lr: 2.18e-04 2022-05-06 17:09:55,193 INFO [train.py:715] (0/8) Epoch 10, batch 2250, loss[loss=0.1507, simple_loss=0.2275, pruned_loss=0.03699, over 4703.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2127, pruned_loss=0.03349, over 972727.40 frames.], batch size: 15, lr: 2.18e-04 2022-05-06 17:10:33,968 INFO [train.py:715] (0/8) Epoch 10, batch 2300, loss[loss=0.1278, simple_loss=0.1984, pruned_loss=0.02859, over 4772.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2124, pruned_loss=0.03329, over 972120.11 frames.], batch size: 17, lr: 2.18e-04 2022-05-06 17:11:13,693 INFO [train.py:715] (0/8) Epoch 10, batch 2350, loss[loss=0.1306, simple_loss=0.2053, pruned_loss=0.02797, over 4932.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2129, pruned_loss=0.03365, over 972529.73 frames.], batch size: 29, lr: 2.18e-04 2022-05-06 17:11:52,501 INFO [train.py:715] (0/8) Epoch 10, batch 2400, loss[loss=0.1485, simple_loss=0.2187, pruned_loss=0.03913, over 4736.00 frames.], tot_loss[loss=0.1402, simple_loss=0.213, pruned_loss=0.03371, over 972169.55 frames.], batch size: 16, lr: 2.18e-04 2022-05-06 17:12:31,236 INFO [train.py:715] (0/8) Epoch 10, batch 2450, loss[loss=0.11, simple_loss=0.18, pruned_loss=0.01994, over 4772.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2119, pruned_loss=0.03297, over 971947.38 frames.], batch size: 14, lr: 2.18e-04 2022-05-06 17:13:10,536 INFO [train.py:715] (0/8) Epoch 10, batch 2500, loss[loss=0.1496, simple_loss=0.2294, pruned_loss=0.03487, over 4787.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2117, pruned_loss=0.03305, over 972422.24 frames.], batch size: 17, lr: 2.18e-04 2022-05-06 17:13:49,920 INFO [train.py:715] (0/8) Epoch 10, batch 2550, loss[loss=0.1424, simple_loss=0.2173, pruned_loss=0.0338, over 4883.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2126, pruned_loss=0.03378, over 972422.53 frames.], batch size: 19, lr: 2.18e-04 2022-05-06 17:14:29,342 INFO [train.py:715] (0/8) Epoch 10, batch 2600, loss[loss=0.1308, simple_loss=0.2012, pruned_loss=0.03018, over 4770.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2126, pruned_loss=0.03361, over 972914.63 frames.], batch size: 14, lr: 2.18e-04 2022-05-06 17:15:08,458 INFO [train.py:715] (0/8) Epoch 10, batch 2650, loss[loss=0.1218, simple_loss=0.2024, pruned_loss=0.02057, over 4700.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2125, pruned_loss=0.03333, over 972305.43 frames.], batch size: 15, lr: 2.18e-04 2022-05-06 17:15:47,658 INFO [train.py:715] (0/8) Epoch 10, batch 2700, loss[loss=0.1749, simple_loss=0.2361, pruned_loss=0.05684, over 4816.00 frames.], tot_loss[loss=0.1412, simple_loss=0.214, pruned_loss=0.03414, over 972063.90 frames.], batch size: 15, lr: 2.18e-04 2022-05-06 17:16:26,376 INFO [train.py:715] (0/8) Epoch 10, batch 2750, loss[loss=0.1228, simple_loss=0.2019, pruned_loss=0.02185, over 4824.00 frames.], 
tot_loss[loss=0.1413, simple_loss=0.2143, pruned_loss=0.03411, over 972815.83 frames.], batch size: 26, lr: 2.18e-04 2022-05-06 17:17:05,079 INFO [train.py:715] (0/8) Epoch 10, batch 2800, loss[loss=0.1335, simple_loss=0.2133, pruned_loss=0.0269, over 4994.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2145, pruned_loss=0.03424, over 973627.65 frames.], batch size: 16, lr: 2.18e-04 2022-05-06 17:17:43,819 INFO [train.py:715] (0/8) Epoch 10, batch 2850, loss[loss=0.1706, simple_loss=0.2293, pruned_loss=0.056, over 4982.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2136, pruned_loss=0.0338, over 973398.41 frames.], batch size: 28, lr: 2.18e-04 2022-05-06 17:18:23,062 INFO [train.py:715] (0/8) Epoch 10, batch 2900, loss[loss=0.1307, simple_loss=0.203, pruned_loss=0.02915, over 4930.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2133, pruned_loss=0.03342, over 973183.57 frames.], batch size: 29, lr: 2.18e-04 2022-05-06 17:19:02,254 INFO [train.py:715] (0/8) Epoch 10, batch 2950, loss[loss=0.1206, simple_loss=0.1958, pruned_loss=0.02272, over 4982.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2129, pruned_loss=0.03293, over 972323.98 frames.], batch size: 14, lr: 2.18e-04 2022-05-06 17:19:40,631 INFO [train.py:715] (0/8) Epoch 10, batch 3000, loss[loss=0.1597, simple_loss=0.2232, pruned_loss=0.04808, over 4842.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2134, pruned_loss=0.03369, over 972962.11 frames.], batch size: 20, lr: 2.18e-04 2022-05-06 17:19:40,632 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 17:19:50,102 INFO [train.py:742] (0/8) Epoch 10, validation: loss=0.1065, simple_loss=0.1908, pruned_loss=0.01113, over 914524.00 frames. 2022-05-06 17:20:28,625 INFO [train.py:715] (0/8) Epoch 10, batch 3050, loss[loss=0.1628, simple_loss=0.2214, pruned_loss=0.05213, over 4849.00 frames.], tot_loss[loss=0.141, simple_loss=0.214, pruned_loss=0.03403, over 973475.44 frames.], batch size: 32, lr: 2.18e-04 2022-05-06 17:21:07,569 INFO [train.py:715] (0/8) Epoch 10, batch 3100, loss[loss=0.1327, simple_loss=0.2072, pruned_loss=0.02914, over 4903.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2133, pruned_loss=0.0335, over 973943.33 frames.], batch size: 17, lr: 2.18e-04 2022-05-06 17:21:46,717 INFO [train.py:715] (0/8) Epoch 10, batch 3150, loss[loss=0.1269, simple_loss=0.2036, pruned_loss=0.02504, over 4794.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2134, pruned_loss=0.03365, over 973592.46 frames.], batch size: 24, lr: 2.18e-04 2022-05-06 17:22:25,530 INFO [train.py:715] (0/8) Epoch 10, batch 3200, loss[loss=0.1212, simple_loss=0.195, pruned_loss=0.02372, over 4919.00 frames.], tot_loss[loss=0.1408, simple_loss=0.214, pruned_loss=0.03383, over 973377.27 frames.], batch size: 29, lr: 2.18e-04 2022-05-06 17:23:03,963 INFO [train.py:715] (0/8) Epoch 10, batch 3250, loss[loss=0.1804, simple_loss=0.2558, pruned_loss=0.05249, over 4923.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2131, pruned_loss=0.03372, over 972269.13 frames.], batch size: 29, lr: 2.18e-04 2022-05-06 17:23:44,490 INFO [train.py:715] (0/8) Epoch 10, batch 3300, loss[loss=0.1376, simple_loss=0.2158, pruned_loss=0.02971, over 4967.00 frames.], tot_loss[loss=0.1417, simple_loss=0.2144, pruned_loss=0.03446, over 971768.43 frames.], batch size: 21, lr: 2.18e-04 2022-05-06 17:24:24,203 INFO [train.py:715] (0/8) Epoch 10, batch 3350, loss[loss=0.1441, simple_loss=0.2279, pruned_loss=0.03016, over 4882.00 frames.], tot_loss[loss=0.1418, simple_loss=0.2147, pruned_loss=0.03446, over 972033.65 frames.], 
batch size: 22, lr: 2.18e-04 2022-05-06 17:25:04,068 INFO [train.py:715] (0/8) Epoch 10, batch 3400, loss[loss=0.1479, simple_loss=0.2189, pruned_loss=0.0384, over 4967.00 frames.], tot_loss[loss=0.1421, simple_loss=0.2149, pruned_loss=0.03464, over 971508.21 frames.], batch size: 24, lr: 2.18e-04 2022-05-06 17:25:44,885 INFO [train.py:715] (0/8) Epoch 10, batch 3450, loss[loss=0.1395, simple_loss=0.2241, pruned_loss=0.0274, over 4965.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2146, pruned_loss=0.03412, over 972623.41 frames.], batch size: 24, lr: 2.18e-04 2022-05-06 17:26:26,601 INFO [train.py:715] (0/8) Epoch 10, batch 3500, loss[loss=0.1279, simple_loss=0.1969, pruned_loss=0.02945, over 4974.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2136, pruned_loss=0.03366, over 972773.49 frames.], batch size: 15, lr: 2.18e-04 2022-05-06 17:27:07,262 INFO [train.py:715] (0/8) Epoch 10, batch 3550, loss[loss=0.1241, simple_loss=0.2071, pruned_loss=0.02049, over 4886.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2132, pruned_loss=0.03319, over 973099.41 frames.], batch size: 16, lr: 2.18e-04 2022-05-06 17:27:48,564 INFO [train.py:715] (0/8) Epoch 10, batch 3600, loss[loss=0.1363, simple_loss=0.2122, pruned_loss=0.03021, over 4847.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2128, pruned_loss=0.03327, over 973594.02 frames.], batch size: 20, lr: 2.18e-04 2022-05-06 17:28:29,167 INFO [train.py:715] (0/8) Epoch 10, batch 3650, loss[loss=0.1153, simple_loss=0.1931, pruned_loss=0.01876, over 4790.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2123, pruned_loss=0.03323, over 974212.20 frames.], batch size: 17, lr: 2.18e-04 2022-05-06 17:29:10,564 INFO [train.py:715] (0/8) Epoch 10, batch 3700, loss[loss=0.1476, simple_loss=0.2146, pruned_loss=0.04029, over 4864.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2123, pruned_loss=0.03346, over 974178.87 frames.], batch size: 20, lr: 2.18e-04 2022-05-06 17:29:51,152 INFO [train.py:715] (0/8) Epoch 10, batch 3750, loss[loss=0.1371, simple_loss=0.2051, pruned_loss=0.0345, over 4938.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2121, pruned_loss=0.03362, over 974365.28 frames.], batch size: 29, lr: 2.18e-04 2022-05-06 17:30:32,377 INFO [train.py:715] (0/8) Epoch 10, batch 3800, loss[loss=0.1512, simple_loss=0.2252, pruned_loss=0.03863, over 4876.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2123, pruned_loss=0.03318, over 973909.47 frames.], batch size: 20, lr: 2.18e-04 2022-05-06 17:31:13,745 INFO [train.py:715] (0/8) Epoch 10, batch 3850, loss[loss=0.1354, simple_loss=0.2033, pruned_loss=0.0338, over 4762.00 frames.], tot_loss[loss=0.139, simple_loss=0.212, pruned_loss=0.03303, over 973635.24 frames.], batch size: 16, lr: 2.18e-04 2022-05-06 17:31:54,692 INFO [train.py:715] (0/8) Epoch 10, batch 3900, loss[loss=0.1243, simple_loss=0.1914, pruned_loss=0.02857, over 4914.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2125, pruned_loss=0.03342, over 973222.58 frames.], batch size: 17, lr: 2.18e-04 2022-05-06 17:31:58,982 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-352000.pt 2022-05-06 17:32:36,891 INFO [train.py:715] (0/8) Epoch 10, batch 3950, loss[loss=0.1057, simple_loss=0.1691, pruned_loss=0.0211, over 4798.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2127, pruned_loss=0.03353, over 973297.87 frames.], batch size: 12, lr: 2.18e-04 2022-05-06 17:33:16,171 INFO [train.py:715] (0/8) Epoch 10, batch 4000, loss[loss=0.1337, simple_loss=0.2043, pruned_loss=0.03151, over 4724.00 
frames.], tot_loss[loss=0.1399, simple_loss=0.2127, pruned_loss=0.03362, over 973571.92 frames.], batch size: 16, lr: 2.18e-04 2022-05-06 17:33:55,834 INFO [train.py:715] (0/8) Epoch 10, batch 4050, loss[loss=0.1291, simple_loss=0.2105, pruned_loss=0.02381, over 4921.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2124, pruned_loss=0.03323, over 973431.77 frames.], batch size: 22, lr: 2.18e-04 2022-05-06 17:34:34,554 INFO [train.py:715] (0/8) Epoch 10, batch 4100, loss[loss=0.1278, simple_loss=0.1946, pruned_loss=0.03052, over 4809.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2129, pruned_loss=0.03323, over 972224.55 frames.], batch size: 12, lr: 2.18e-04 2022-05-06 17:35:13,433 INFO [train.py:715] (0/8) Epoch 10, batch 4150, loss[loss=0.1639, simple_loss=0.2383, pruned_loss=0.04472, over 4806.00 frames.], tot_loss[loss=0.139, simple_loss=0.212, pruned_loss=0.03298, over 972791.33 frames.], batch size: 25, lr: 2.18e-04 2022-05-06 17:35:52,992 INFO [train.py:715] (0/8) Epoch 10, batch 4200, loss[loss=0.1059, simple_loss=0.1737, pruned_loss=0.01908, over 4819.00 frames.], tot_loss[loss=0.1392, simple_loss=0.212, pruned_loss=0.03322, over 972843.36 frames.], batch size: 13, lr: 2.18e-04 2022-05-06 17:36:31,675 INFO [train.py:715] (0/8) Epoch 10, batch 4250, loss[loss=0.14, simple_loss=0.2139, pruned_loss=0.03301, over 4926.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2135, pruned_loss=0.03375, over 972478.17 frames.], batch size: 23, lr: 2.18e-04 2022-05-06 17:37:10,488 INFO [train.py:715] (0/8) Epoch 10, batch 4300, loss[loss=0.1656, simple_loss=0.2388, pruned_loss=0.04616, over 4971.00 frames.], tot_loss[loss=0.1399, simple_loss=0.213, pruned_loss=0.03342, over 972070.77 frames.], batch size: 39, lr: 2.18e-04 2022-05-06 17:37:49,693 INFO [train.py:715] (0/8) Epoch 10, batch 4350, loss[loss=0.141, simple_loss=0.2191, pruned_loss=0.03142, over 4974.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2131, pruned_loss=0.03351, over 971519.89 frames.], batch size: 24, lr: 2.18e-04 2022-05-06 17:38:28,649 INFO [train.py:715] (0/8) Epoch 10, batch 4400, loss[loss=0.1153, simple_loss=0.1891, pruned_loss=0.02071, over 4934.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2129, pruned_loss=0.03331, over 972209.97 frames.], batch size: 29, lr: 2.18e-04 2022-05-06 17:39:07,613 INFO [train.py:715] (0/8) Epoch 10, batch 4450, loss[loss=0.1249, simple_loss=0.1958, pruned_loss=0.027, over 4883.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2127, pruned_loss=0.03331, over 972830.47 frames.], batch size: 22, lr: 2.18e-04 2022-05-06 17:39:46,321 INFO [train.py:715] (0/8) Epoch 10, batch 4500, loss[loss=0.1293, simple_loss=0.2034, pruned_loss=0.02761, over 4855.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2126, pruned_loss=0.03322, over 972535.16 frames.], batch size: 32, lr: 2.18e-04 2022-05-06 17:40:25,797 INFO [train.py:715] (0/8) Epoch 10, batch 4550, loss[loss=0.1853, simple_loss=0.2395, pruned_loss=0.06556, over 4889.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2128, pruned_loss=0.03339, over 973423.96 frames.], batch size: 22, lr: 2.18e-04 2022-05-06 17:41:04,676 INFO [train.py:715] (0/8) Epoch 10, batch 4600, loss[loss=0.1789, simple_loss=0.2343, pruned_loss=0.0618, over 4979.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2132, pruned_loss=0.03377, over 973347.42 frames.], batch size: 24, lr: 2.18e-04 2022-05-06 17:41:43,575 INFO [train.py:715] (0/8) Epoch 10, batch 4650, loss[loss=0.1479, simple_loss=0.2189, pruned_loss=0.03842, over 4774.00 frames.], tot_loss[loss=0.1422, 
simple_loss=0.2149, pruned_loss=0.03475, over 972592.85 frames.], batch size: 17, lr: 2.18e-04 2022-05-06 17:42:23,830 INFO [train.py:715] (0/8) Epoch 10, batch 4700, loss[loss=0.1345, simple_loss=0.2021, pruned_loss=0.03343, over 4868.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2137, pruned_loss=0.03438, over 973042.87 frames.], batch size: 20, lr: 2.18e-04 2022-05-06 17:43:03,973 INFO [train.py:715] (0/8) Epoch 10, batch 4750, loss[loss=0.1479, simple_loss=0.2184, pruned_loss=0.03864, over 4802.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2133, pruned_loss=0.03384, over 973030.38 frames.], batch size: 21, lr: 2.18e-04 2022-05-06 17:43:43,162 INFO [train.py:715] (0/8) Epoch 10, batch 4800, loss[loss=0.1199, simple_loss=0.187, pruned_loss=0.02643, over 4751.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2119, pruned_loss=0.0332, over 973312.20 frames.], batch size: 19, lr: 2.18e-04 2022-05-06 17:44:22,995 INFO [train.py:715] (0/8) Epoch 10, batch 4850, loss[loss=0.1559, simple_loss=0.2243, pruned_loss=0.04374, over 4768.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2117, pruned_loss=0.03352, over 972506.36 frames.], batch size: 19, lr: 2.18e-04 2022-05-06 17:45:02,947 INFO [train.py:715] (0/8) Epoch 10, batch 4900, loss[loss=0.1419, simple_loss=0.2082, pruned_loss=0.03784, over 4823.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2121, pruned_loss=0.03365, over 972948.44 frames.], batch size: 13, lr: 2.18e-04 2022-05-06 17:45:42,396 INFO [train.py:715] (0/8) Epoch 10, batch 4950, loss[loss=0.1266, simple_loss=0.2007, pruned_loss=0.02622, over 4773.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2121, pruned_loss=0.03352, over 973070.40 frames.], batch size: 16, lr: 2.18e-04 2022-05-06 17:46:21,435 INFO [train.py:715] (0/8) Epoch 10, batch 5000, loss[loss=0.1394, simple_loss=0.2102, pruned_loss=0.03429, over 4859.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2128, pruned_loss=0.03372, over 972672.43 frames.], batch size: 16, lr: 2.18e-04 2022-05-06 17:47:00,596 INFO [train.py:715] (0/8) Epoch 10, batch 5050, loss[loss=0.11, simple_loss=0.1808, pruned_loss=0.01958, over 4838.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2125, pruned_loss=0.03366, over 972961.26 frames.], batch size: 13, lr: 2.18e-04 2022-05-06 17:47:39,526 INFO [train.py:715] (0/8) Epoch 10, batch 5100, loss[loss=0.1479, simple_loss=0.2182, pruned_loss=0.03882, over 4833.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2128, pruned_loss=0.03376, over 973267.94 frames.], batch size: 15, lr: 2.18e-04 2022-05-06 17:48:18,800 INFO [train.py:715] (0/8) Epoch 10, batch 5150, loss[loss=0.1352, simple_loss=0.2139, pruned_loss=0.02827, over 4768.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2123, pruned_loss=0.0333, over 973386.45 frames.], batch size: 18, lr: 2.18e-04 2022-05-06 17:48:58,635 INFO [train.py:715] (0/8) Epoch 10, batch 5200, loss[loss=0.1532, simple_loss=0.2243, pruned_loss=0.0411, over 4829.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2125, pruned_loss=0.03354, over 973333.86 frames.], batch size: 26, lr: 2.17e-04 2022-05-06 17:49:38,471 INFO [train.py:715] (0/8) Epoch 10, batch 5250, loss[loss=0.1816, simple_loss=0.2466, pruned_loss=0.05831, over 4835.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2127, pruned_loss=0.03357, over 973747.83 frames.], batch size: 15, lr: 2.17e-04 2022-05-06 17:50:17,850 INFO [train.py:715] (0/8) Epoch 10, batch 5300, loss[loss=0.1203, simple_loss=0.1949, pruned_loss=0.0228, over 4913.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2123, 
pruned_loss=0.03323, over 973988.81 frames.], batch size: 23, lr: 2.17e-04 2022-05-06 17:50:57,192 INFO [train.py:715] (0/8) Epoch 10, batch 5350, loss[loss=0.1429, simple_loss=0.2198, pruned_loss=0.03301, over 4821.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2131, pruned_loss=0.03315, over 974230.03 frames.], batch size: 26, lr: 2.17e-04 2022-05-06 17:51:37,021 INFO [train.py:715] (0/8) Epoch 10, batch 5400, loss[loss=0.129, simple_loss=0.2067, pruned_loss=0.0257, over 4838.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2133, pruned_loss=0.03354, over 974414.42 frames.], batch size: 30, lr: 2.17e-04 2022-05-06 17:52:16,936 INFO [train.py:715] (0/8) Epoch 10, batch 5450, loss[loss=0.1653, simple_loss=0.2332, pruned_loss=0.04869, over 4699.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2137, pruned_loss=0.03377, over 974134.12 frames.], batch size: 15, lr: 2.17e-04 2022-05-06 17:52:56,343 INFO [train.py:715] (0/8) Epoch 10, batch 5500, loss[loss=0.1299, simple_loss=0.2064, pruned_loss=0.02668, over 4952.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2143, pruned_loss=0.0341, over 973785.48 frames.], batch size: 24, lr: 2.17e-04 2022-05-06 17:53:36,103 INFO [train.py:715] (0/8) Epoch 10, batch 5550, loss[loss=0.1905, simple_loss=0.2547, pruned_loss=0.06316, over 4913.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2139, pruned_loss=0.03391, over 972244.89 frames.], batch size: 17, lr: 2.17e-04 2022-05-06 17:54:16,051 INFO [train.py:715] (0/8) Epoch 10, batch 5600, loss[loss=0.1248, simple_loss=0.1945, pruned_loss=0.02754, over 4634.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2133, pruned_loss=0.03381, over 971592.98 frames.], batch size: 13, lr: 2.17e-04 2022-05-06 17:54:55,813 INFO [train.py:715] (0/8) Epoch 10, batch 5650, loss[loss=0.1276, simple_loss=0.2068, pruned_loss=0.02424, over 4882.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2121, pruned_loss=0.03354, over 971791.28 frames.], batch size: 19, lr: 2.17e-04 2022-05-06 17:55:34,979 INFO [train.py:715] (0/8) Epoch 10, batch 5700, loss[loss=0.1532, simple_loss=0.2245, pruned_loss=0.04088, over 4988.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2119, pruned_loss=0.03297, over 971189.73 frames.], batch size: 28, lr: 2.17e-04 2022-05-06 17:56:15,019 INFO [train.py:715] (0/8) Epoch 10, batch 5750, loss[loss=0.1127, simple_loss=0.1769, pruned_loss=0.02422, over 4920.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2121, pruned_loss=0.03324, over 971455.81 frames.], batch size: 18, lr: 2.17e-04 2022-05-06 17:56:54,686 INFO [train.py:715] (0/8) Epoch 10, batch 5800, loss[loss=0.1594, simple_loss=0.233, pruned_loss=0.0429, over 4781.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2117, pruned_loss=0.03297, over 971490.25 frames.], batch size: 17, lr: 2.17e-04 2022-05-06 17:57:34,207 INFO [train.py:715] (0/8) Epoch 10, batch 5850, loss[loss=0.1424, simple_loss=0.21, pruned_loss=0.03742, over 4910.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2131, pruned_loss=0.03353, over 972064.11 frames.], batch size: 23, lr: 2.17e-04 2022-05-06 17:58:14,026 INFO [train.py:715] (0/8) Epoch 10, batch 5900, loss[loss=0.1201, simple_loss=0.1958, pruned_loss=0.02217, over 4898.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2139, pruned_loss=0.03418, over 971579.68 frames.], batch size: 22, lr: 2.17e-04 2022-05-06 17:58:53,762 INFO [train.py:715] (0/8) Epoch 10, batch 5950, loss[loss=0.1504, simple_loss=0.2323, pruned_loss=0.03426, over 4934.00 frames.], tot_loss[loss=0.1413, simple_loss=0.214, pruned_loss=0.0343, over 972468.37 
frames.], batch size: 23, lr: 2.17e-04 2022-05-06 17:59:33,428 INFO [train.py:715] (0/8) Epoch 10, batch 6000, loss[loss=0.1448, simple_loss=0.2105, pruned_loss=0.03956, over 4975.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2137, pruned_loss=0.03407, over 971338.24 frames.], batch size: 35, lr: 2.17e-04 2022-05-06 17:59:33,429 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 17:59:42,753 INFO [train.py:742] (0/8) Epoch 10, validation: loss=0.1067, simple_loss=0.1909, pruned_loss=0.01126, over 914524.00 frames. 2022-05-06 18:00:22,324 INFO [train.py:715] (0/8) Epoch 10, batch 6050, loss[loss=0.1457, simple_loss=0.2122, pruned_loss=0.03964, over 4830.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2143, pruned_loss=0.03441, over 971105.81 frames.], batch size: 30, lr: 2.17e-04 2022-05-06 18:01:00,746 INFO [train.py:715] (0/8) Epoch 10, batch 6100, loss[loss=0.169, simple_loss=0.2548, pruned_loss=0.04163, over 4974.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2138, pruned_loss=0.03428, over 970654.42 frames.], batch size: 25, lr: 2.17e-04 2022-05-06 18:01:40,209 INFO [train.py:715] (0/8) Epoch 10, batch 6150, loss[loss=0.1368, simple_loss=0.2104, pruned_loss=0.03159, over 4738.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2136, pruned_loss=0.03369, over 971245.95 frames.], batch size: 16, lr: 2.17e-04 2022-05-06 18:02:20,068 INFO [train.py:715] (0/8) Epoch 10, batch 6200, loss[loss=0.1392, simple_loss=0.2051, pruned_loss=0.03666, over 4877.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2132, pruned_loss=0.03325, over 971630.14 frames.], batch size: 16, lr: 2.17e-04 2022-05-06 18:02:59,939 INFO [train.py:715] (0/8) Epoch 10, batch 6250, loss[loss=0.1402, simple_loss=0.2102, pruned_loss=0.03506, over 4779.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2129, pruned_loss=0.03319, over 972126.56 frames.], batch size: 14, lr: 2.17e-04 2022-05-06 18:03:39,469 INFO [train.py:715] (0/8) Epoch 10, batch 6300, loss[loss=0.1479, simple_loss=0.21, pruned_loss=0.04295, over 4793.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2128, pruned_loss=0.03315, over 972384.10 frames.], batch size: 17, lr: 2.17e-04 2022-05-06 18:04:19,278 INFO [train.py:715] (0/8) Epoch 10, batch 6350, loss[loss=0.1555, simple_loss=0.2425, pruned_loss=0.03424, over 4795.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2133, pruned_loss=0.03318, over 971626.54 frames.], batch size: 17, lr: 2.17e-04 2022-05-06 18:04:58,326 INFO [train.py:715] (0/8) Epoch 10, batch 6400, loss[loss=0.1657, simple_loss=0.2457, pruned_loss=0.04284, over 4763.00 frames.], tot_loss[loss=0.1392, simple_loss=0.213, pruned_loss=0.03275, over 971783.72 frames.], batch size: 16, lr: 2.17e-04 2022-05-06 18:05:36,736 INFO [train.py:715] (0/8) Epoch 10, batch 6450, loss[loss=0.1182, simple_loss=0.2006, pruned_loss=0.01788, over 4967.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2132, pruned_loss=0.03299, over 971421.34 frames.], batch size: 29, lr: 2.17e-04 2022-05-06 18:06:15,660 INFO [train.py:715] (0/8) Epoch 10, batch 6500, loss[loss=0.1192, simple_loss=0.1911, pruned_loss=0.02361, over 4828.00 frames.], tot_loss[loss=0.1396, simple_loss=0.213, pruned_loss=0.03311, over 972641.99 frames.], batch size: 26, lr: 2.17e-04 2022-05-06 18:06:54,775 INFO [train.py:715] (0/8) Epoch 10, batch 6550, loss[loss=0.1168, simple_loss=0.1948, pruned_loss=0.01934, over 4694.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2121, pruned_loss=0.03279, over 971500.78 frames.], batch size: 15, lr: 2.17e-04 2022-05-06 18:07:33,913 INFO [train.py:715] 
(0/8) Epoch 10, batch 6600, loss[loss=0.1365, simple_loss=0.2067, pruned_loss=0.03311, over 4965.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2126, pruned_loss=0.03316, over 972096.24 frames.], batch size: 21, lr: 2.17e-04 2022-05-06 18:08:12,465 INFO [train.py:715] (0/8) Epoch 10, batch 6650, loss[loss=0.147, simple_loss=0.2105, pruned_loss=0.0417, over 4923.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2129, pruned_loss=0.03331, over 971830.20 frames.], batch size: 17, lr: 2.17e-04 2022-05-06 18:08:52,598 INFO [train.py:715] (0/8) Epoch 10, batch 6700, loss[loss=0.1318, simple_loss=0.2096, pruned_loss=0.02704, over 4916.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2123, pruned_loss=0.033, over 971812.87 frames.], batch size: 17, lr: 2.17e-04 2022-05-06 18:09:31,853 INFO [train.py:715] (0/8) Epoch 10, batch 6750, loss[loss=0.1288, simple_loss=0.1992, pruned_loss=0.02927, over 4916.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2132, pruned_loss=0.03329, over 972244.91 frames.], batch size: 18, lr: 2.17e-04 2022-05-06 18:10:10,543 INFO [train.py:715] (0/8) Epoch 10, batch 6800, loss[loss=0.136, simple_loss=0.1971, pruned_loss=0.03744, over 4827.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2136, pruned_loss=0.03347, over 972474.97 frames.], batch size: 27, lr: 2.17e-04 2022-05-06 18:10:50,411 INFO [train.py:715] (0/8) Epoch 10, batch 6850, loss[loss=0.1605, simple_loss=0.2468, pruned_loss=0.03713, over 4983.00 frames.], tot_loss[loss=0.1396, simple_loss=0.213, pruned_loss=0.03307, over 973331.45 frames.], batch size: 28, lr: 2.17e-04 2022-05-06 18:11:29,655 INFO [train.py:715] (0/8) Epoch 10, batch 6900, loss[loss=0.1553, simple_loss=0.2281, pruned_loss=0.04123, over 4874.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2141, pruned_loss=0.03355, over 973774.64 frames.], batch size: 22, lr: 2.17e-04 2022-05-06 18:12:08,733 INFO [train.py:715] (0/8) Epoch 10, batch 6950, loss[loss=0.1674, simple_loss=0.2496, pruned_loss=0.04257, over 4927.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2139, pruned_loss=0.03373, over 973078.97 frames.], batch size: 18, lr: 2.17e-04 2022-05-06 18:12:48,639 INFO [train.py:715] (0/8) Epoch 10, batch 7000, loss[loss=0.1558, simple_loss=0.2212, pruned_loss=0.0452, over 4882.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2143, pruned_loss=0.03408, over 972362.41 frames.], batch size: 16, lr: 2.17e-04 2022-05-06 18:13:28,548 INFO [train.py:715] (0/8) Epoch 10, batch 7050, loss[loss=0.1203, simple_loss=0.1981, pruned_loss=0.02123, over 4955.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2136, pruned_loss=0.03349, over 972160.82 frames.], batch size: 24, lr: 2.17e-04 2022-05-06 18:14:07,742 INFO [train.py:715] (0/8) Epoch 10, batch 7100, loss[loss=0.1672, simple_loss=0.2449, pruned_loss=0.04474, over 4913.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2137, pruned_loss=0.03372, over 972114.42 frames.], batch size: 18, lr: 2.17e-04 2022-05-06 18:14:46,902 INFO [train.py:715] (0/8) Epoch 10, batch 7150, loss[loss=0.1295, simple_loss=0.2092, pruned_loss=0.02496, over 4813.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2133, pruned_loss=0.03352, over 971623.51 frames.], batch size: 25, lr: 2.17e-04 2022-05-06 18:15:26,296 INFO [train.py:715] (0/8) Epoch 10, batch 7200, loss[loss=0.1564, simple_loss=0.2296, pruned_loss=0.04162, over 4879.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2135, pruned_loss=0.03367, over 971606.44 frames.], batch size: 32, lr: 2.17e-04 2022-05-06 18:16:05,420 INFO [train.py:715] (0/8) Epoch 10, batch 7250, 
loss[loss=0.1277, simple_loss=0.2073, pruned_loss=0.02402, over 4811.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2137, pruned_loss=0.03386, over 972074.65 frames.], batch size: 25, lr: 2.17e-04 2022-05-06 18:16:44,408 INFO [train.py:715] (0/8) Epoch 10, batch 7300, loss[loss=0.1243, simple_loss=0.1946, pruned_loss=0.027, over 4764.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2138, pruned_loss=0.03387, over 972765.18 frames.], batch size: 18, lr: 2.17e-04 2022-05-06 18:17:23,327 INFO [train.py:715] (0/8) Epoch 10, batch 7350, loss[loss=0.1231, simple_loss=0.193, pruned_loss=0.0266, over 4983.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2143, pruned_loss=0.03402, over 972877.65 frames.], batch size: 28, lr: 2.17e-04 2022-05-06 18:18:02,742 INFO [train.py:715] (0/8) Epoch 10, batch 7400, loss[loss=0.1654, simple_loss=0.2316, pruned_loss=0.0496, over 4960.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2137, pruned_loss=0.03375, over 972593.45 frames.], batch size: 35, lr: 2.17e-04 2022-05-06 18:18:41,885 INFO [train.py:715] (0/8) Epoch 10, batch 7450, loss[loss=0.127, simple_loss=0.2049, pruned_loss=0.02452, over 4931.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2127, pruned_loss=0.03332, over 972683.46 frames.], batch size: 23, lr: 2.17e-04 2022-05-06 18:19:20,014 INFO [train.py:715] (0/8) Epoch 10, batch 7500, loss[loss=0.1313, simple_loss=0.1979, pruned_loss=0.03229, over 4649.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2119, pruned_loss=0.03286, over 972383.41 frames.], batch size: 13, lr: 2.17e-04 2022-05-06 18:19:59,643 INFO [train.py:715] (0/8) Epoch 10, batch 7550, loss[loss=0.1224, simple_loss=0.2027, pruned_loss=0.02104, over 4938.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2112, pruned_loss=0.03246, over 972910.07 frames.], batch size: 21, lr: 2.17e-04 2022-05-06 18:20:38,456 INFO [train.py:715] (0/8) Epoch 10, batch 7600, loss[loss=0.1307, simple_loss=0.2042, pruned_loss=0.02862, over 4785.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2119, pruned_loss=0.03312, over 972352.76 frames.], batch size: 17, lr: 2.17e-04 2022-05-06 18:21:17,037 INFO [train.py:715] (0/8) Epoch 10, batch 7650, loss[loss=0.151, simple_loss=0.2361, pruned_loss=0.03295, over 4936.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2123, pruned_loss=0.03309, over 972012.86 frames.], batch size: 29, lr: 2.17e-04 2022-05-06 18:21:56,436 INFO [train.py:715] (0/8) Epoch 10, batch 7700, loss[loss=0.1392, simple_loss=0.2126, pruned_loss=0.03291, over 4889.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2116, pruned_loss=0.03269, over 972806.09 frames.], batch size: 22, lr: 2.17e-04 2022-05-06 18:22:35,793 INFO [train.py:715] (0/8) Epoch 10, batch 7750, loss[loss=0.1479, simple_loss=0.2152, pruned_loss=0.04035, over 4983.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2123, pruned_loss=0.03306, over 972838.77 frames.], batch size: 28, lr: 2.17e-04 2022-05-06 18:23:15,170 INFO [train.py:715] (0/8) Epoch 10, batch 7800, loss[loss=0.1448, simple_loss=0.2233, pruned_loss=0.03314, over 4889.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2121, pruned_loss=0.03322, over 972805.87 frames.], batch size: 19, lr: 2.17e-04 2022-05-06 18:23:53,546 INFO [train.py:715] (0/8) Epoch 10, batch 7850, loss[loss=0.1331, simple_loss=0.2016, pruned_loss=0.03234, over 4763.00 frames.], tot_loss[loss=0.14, simple_loss=0.2129, pruned_loss=0.03351, over 971628.92 frames.], batch size: 19, lr: 2.17e-04 2022-05-06 18:24:33,021 INFO [train.py:715] (0/8) Epoch 10, batch 7900, loss[loss=0.1256, simple_loss=0.1922, 
pruned_loss=0.02951, over 4961.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2126, pruned_loss=0.03311, over 971504.47 frames.], batch size: 14, lr: 2.17e-04 2022-05-06 18:25:12,544 INFO [train.py:715] (0/8) Epoch 10, batch 7950, loss[loss=0.1417, simple_loss=0.2234, pruned_loss=0.03001, over 4810.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2126, pruned_loss=0.03324, over 972016.27 frames.], batch size: 21, lr: 2.17e-04 2022-05-06 18:25:51,356 INFO [train.py:715] (0/8) Epoch 10, batch 8000, loss[loss=0.1312, simple_loss=0.2021, pruned_loss=0.03013, over 4777.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2124, pruned_loss=0.03316, over 971836.84 frames.], batch size: 18, lr: 2.17e-04 2022-05-06 18:26:30,788 INFO [train.py:715] (0/8) Epoch 10, batch 8050, loss[loss=0.1247, simple_loss=0.2034, pruned_loss=0.02302, over 4815.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2124, pruned_loss=0.03302, over 972458.81 frames.], batch size: 25, lr: 2.17e-04 2022-05-06 18:27:10,409 INFO [train.py:715] (0/8) Epoch 10, batch 8100, loss[loss=0.1796, simple_loss=0.2475, pruned_loss=0.05584, over 4945.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2129, pruned_loss=0.03344, over 972445.61 frames.], batch size: 21, lr: 2.17e-04 2022-05-06 18:27:49,301 INFO [train.py:715] (0/8) Epoch 10, batch 8150, loss[loss=0.1081, simple_loss=0.1721, pruned_loss=0.02208, over 4822.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2127, pruned_loss=0.03335, over 971762.50 frames.], batch size: 13, lr: 2.17e-04 2022-05-06 18:28:27,915 INFO [train.py:715] (0/8) Epoch 10, batch 8200, loss[loss=0.1401, simple_loss=0.2143, pruned_loss=0.0329, over 4754.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2125, pruned_loss=0.03326, over 971310.68 frames.], batch size: 19, lr: 2.17e-04 2022-05-06 18:29:07,590 INFO [train.py:715] (0/8) Epoch 10, batch 8250, loss[loss=0.1329, simple_loss=0.2096, pruned_loss=0.02815, over 4909.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2126, pruned_loss=0.03324, over 971861.48 frames.], batch size: 23, lr: 2.17e-04 2022-05-06 18:29:46,987 INFO [train.py:715] (0/8) Epoch 10, batch 8300, loss[loss=0.1328, simple_loss=0.1913, pruned_loss=0.03711, over 4763.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2127, pruned_loss=0.03332, over 971861.19 frames.], batch size: 14, lr: 2.17e-04 2022-05-06 18:30:25,734 INFO [train.py:715] (0/8) Epoch 10, batch 8350, loss[loss=0.1278, simple_loss=0.2048, pruned_loss=0.02545, over 4984.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2127, pruned_loss=0.03318, over 971786.70 frames.], batch size: 28, lr: 2.17e-04 2022-05-06 18:31:05,468 INFO [train.py:715] (0/8) Epoch 10, batch 8400, loss[loss=0.1464, simple_loss=0.2381, pruned_loss=0.0274, over 4933.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2127, pruned_loss=0.03353, over 971506.01 frames.], batch size: 23, lr: 2.17e-04 2022-05-06 18:31:44,985 INFO [train.py:715] (0/8) Epoch 10, batch 8450, loss[loss=0.141, simple_loss=0.1946, pruned_loss=0.04368, over 4914.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2122, pruned_loss=0.03327, over 971393.58 frames.], batch size: 18, lr: 2.16e-04 2022-05-06 18:32:23,259 INFO [train.py:715] (0/8) Epoch 10, batch 8500, loss[loss=0.1276, simple_loss=0.2083, pruned_loss=0.02342, over 4804.00 frames.], tot_loss[loss=0.1393, simple_loss=0.212, pruned_loss=0.03329, over 971562.84 frames.], batch size: 24, lr: 2.16e-04 2022-05-06 18:33:02,050 INFO [train.py:715] (0/8) Epoch 10, batch 8550, loss[loss=0.1583, simple_loss=0.2288, pruned_loss=0.04392, over 4813.00 
frames.], tot_loss[loss=0.1389, simple_loss=0.2115, pruned_loss=0.03311, over 971802.12 frames.], batch size: 25, lr: 2.16e-04 2022-05-06 18:33:41,300 INFO [train.py:715] (0/8) Epoch 10, batch 8600, loss[loss=0.1411, simple_loss=0.2135, pruned_loss=0.03434, over 4950.00 frames.], tot_loss[loss=0.1392, simple_loss=0.212, pruned_loss=0.03314, over 971569.77 frames.], batch size: 40, lr: 2.16e-04 2022-05-06 18:34:19,982 INFO [train.py:715] (0/8) Epoch 10, batch 8650, loss[loss=0.1234, simple_loss=0.1887, pruned_loss=0.02908, over 4826.00 frames.], tot_loss[loss=0.1378, simple_loss=0.211, pruned_loss=0.0323, over 971576.32 frames.], batch size: 12, lr: 2.16e-04 2022-05-06 18:34:58,631 INFO [train.py:715] (0/8) Epoch 10, batch 8700, loss[loss=0.1478, simple_loss=0.2306, pruned_loss=0.03253, over 4905.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2119, pruned_loss=0.03275, over 972082.69 frames.], batch size: 18, lr: 2.16e-04 2022-05-06 18:35:37,457 INFO [train.py:715] (0/8) Epoch 10, batch 8750, loss[loss=0.1304, simple_loss=0.1982, pruned_loss=0.03135, over 4819.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2125, pruned_loss=0.03317, over 972387.93 frames.], batch size: 15, lr: 2.16e-04 2022-05-06 18:36:15,828 INFO [train.py:715] (0/8) Epoch 10, batch 8800, loss[loss=0.1428, simple_loss=0.212, pruned_loss=0.03677, over 4944.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2119, pruned_loss=0.03296, over 972529.69 frames.], batch size: 35, lr: 2.16e-04 2022-05-06 18:36:54,706 INFO [train.py:715] (0/8) Epoch 10, batch 8850, loss[loss=0.1666, simple_loss=0.2454, pruned_loss=0.04392, over 4750.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2127, pruned_loss=0.03351, over 972868.90 frames.], batch size: 19, lr: 2.16e-04 2022-05-06 18:37:34,277 INFO [train.py:715] (0/8) Epoch 10, batch 8900, loss[loss=0.1259, simple_loss=0.1926, pruned_loss=0.02961, over 4866.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2129, pruned_loss=0.03336, over 972378.02 frames.], batch size: 12, lr: 2.16e-04 2022-05-06 18:38:13,788 INFO [train.py:715] (0/8) Epoch 10, batch 8950, loss[loss=0.1255, simple_loss=0.1998, pruned_loss=0.02559, over 4759.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2134, pruned_loss=0.03355, over 971751.71 frames.], batch size: 19, lr: 2.16e-04 2022-05-06 18:38:53,303 INFO [train.py:715] (0/8) Epoch 10, batch 9000, loss[loss=0.155, simple_loss=0.22, pruned_loss=0.04502, over 4968.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2135, pruned_loss=0.03361, over 972182.04 frames.], batch size: 14, lr: 2.16e-04 2022-05-06 18:38:53,304 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 18:39:02,858 INFO [train.py:742] (0/8) Epoch 10, validation: loss=0.1064, simple_loss=0.1907, pruned_loss=0.01106, over 914524.00 frames. 
2022-05-06 18:39:42,085 INFO [train.py:715] (0/8) Epoch 10, batch 9050, loss[loss=0.1291, simple_loss=0.204, pruned_loss=0.02709, over 4881.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2128, pruned_loss=0.03343, over 971608.10 frames.], batch size: 16, lr: 2.16e-04 2022-05-06 18:40:21,149 INFO [train.py:715] (0/8) Epoch 10, batch 9100, loss[loss=0.1264, simple_loss=0.2011, pruned_loss=0.02585, over 4934.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2128, pruned_loss=0.03354, over 971556.38 frames.], batch size: 21, lr: 2.16e-04 2022-05-06 18:41:01,479 INFO [train.py:715] (0/8) Epoch 10, batch 9150, loss[loss=0.1686, simple_loss=0.2358, pruned_loss=0.05074, over 4867.00 frames.], tot_loss[loss=0.1395, simple_loss=0.212, pruned_loss=0.03347, over 972114.33 frames.], batch size: 22, lr: 2.16e-04 2022-05-06 18:41:40,995 INFO [train.py:715] (0/8) Epoch 10, batch 9200, loss[loss=0.1182, simple_loss=0.196, pruned_loss=0.02018, over 4758.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2124, pruned_loss=0.03373, over 972218.36 frames.], batch size: 19, lr: 2.16e-04 2022-05-06 18:42:20,445 INFO [train.py:715] (0/8) Epoch 10, batch 9250, loss[loss=0.1897, simple_loss=0.2538, pruned_loss=0.06286, over 4876.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2128, pruned_loss=0.03395, over 972618.84 frames.], batch size: 16, lr: 2.16e-04 2022-05-06 18:43:00,263 INFO [train.py:715] (0/8) Epoch 10, batch 9300, loss[loss=0.1439, simple_loss=0.208, pruned_loss=0.03989, over 4948.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2121, pruned_loss=0.03361, over 972405.04 frames.], batch size: 21, lr: 2.16e-04 2022-05-06 18:43:39,884 INFO [train.py:715] (0/8) Epoch 10, batch 9350, loss[loss=0.1331, simple_loss=0.218, pruned_loss=0.02407, over 4939.00 frames.], tot_loss[loss=0.139, simple_loss=0.2115, pruned_loss=0.03322, over 973033.34 frames.], batch size: 21, lr: 2.16e-04 2022-05-06 18:44:19,392 INFO [train.py:715] (0/8) Epoch 10, batch 9400, loss[loss=0.1539, simple_loss=0.2329, pruned_loss=0.03743, over 4985.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2113, pruned_loss=0.03314, over 973628.02 frames.], batch size: 26, lr: 2.16e-04 2022-05-06 18:44:58,980 INFO [train.py:715] (0/8) Epoch 10, batch 9450, loss[loss=0.1494, simple_loss=0.2274, pruned_loss=0.03567, over 4927.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2116, pruned_loss=0.03306, over 972823.24 frames.], batch size: 29, lr: 2.16e-04 2022-05-06 18:45:38,374 INFO [train.py:715] (0/8) Epoch 10, batch 9500, loss[loss=0.1239, simple_loss=0.204, pruned_loss=0.02188, over 4807.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2118, pruned_loss=0.03292, over 971904.17 frames.], batch size: 25, lr: 2.16e-04 2022-05-06 18:46:17,355 INFO [train.py:715] (0/8) Epoch 10, batch 9550, loss[loss=0.1268, simple_loss=0.2058, pruned_loss=0.02392, over 4909.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2119, pruned_loss=0.03338, over 971763.15 frames.], batch size: 23, lr: 2.16e-04 2022-05-06 18:46:55,766 INFO [train.py:715] (0/8) Epoch 10, batch 9600, loss[loss=0.1632, simple_loss=0.2282, pruned_loss=0.04909, over 4855.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2123, pruned_loss=0.03369, over 971129.77 frames.], batch size: 32, lr: 2.16e-04 2022-05-06 18:47:34,908 INFO [train.py:715] (0/8) Epoch 10, batch 9650, loss[loss=0.1138, simple_loss=0.1897, pruned_loss=0.01898, over 4925.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2122, pruned_loss=0.03373, over 971257.82 frames.], batch size: 29, lr: 2.16e-04 2022-05-06 18:48:14,560 INFO 
[train.py:715] (0/8) Epoch 10, batch 9700, loss[loss=0.1503, simple_loss=0.2314, pruned_loss=0.03457, over 4868.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2124, pruned_loss=0.03343, over 971985.19 frames.], batch size: 20, lr: 2.16e-04 2022-05-06 18:48:52,977 INFO [train.py:715] (0/8) Epoch 10, batch 9750, loss[loss=0.1962, simple_loss=0.2595, pruned_loss=0.06643, over 4826.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2116, pruned_loss=0.03309, over 972110.61 frames.], batch size: 15, lr: 2.16e-04 2022-05-06 18:49:32,213 INFO [train.py:715] (0/8) Epoch 10, batch 9800, loss[loss=0.09855, simple_loss=0.1722, pruned_loss=0.01244, over 4817.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2127, pruned_loss=0.03359, over 972300.12 frames.], batch size: 26, lr: 2.16e-04 2022-05-06 18:50:11,747 INFO [train.py:715] (0/8) Epoch 10, batch 9850, loss[loss=0.1306, simple_loss=0.2059, pruned_loss=0.0276, over 4921.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2127, pruned_loss=0.03345, over 972559.85 frames.], batch size: 18, lr: 2.16e-04 2022-05-06 18:50:51,054 INFO [train.py:715] (0/8) Epoch 10, batch 9900, loss[loss=0.134, simple_loss=0.2023, pruned_loss=0.03287, over 4866.00 frames.], tot_loss[loss=0.14, simple_loss=0.213, pruned_loss=0.03352, over 973096.78 frames.], batch size: 32, lr: 2.16e-04 2022-05-06 18:51:30,046 INFO [train.py:715] (0/8) Epoch 10, batch 9950, loss[loss=0.1362, simple_loss=0.2084, pruned_loss=0.03197, over 4850.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2134, pruned_loss=0.03345, over 973502.88 frames.], batch size: 20, lr: 2.16e-04 2022-05-06 18:52:10,243 INFO [train.py:715] (0/8) Epoch 10, batch 10000, loss[loss=0.1532, simple_loss=0.2233, pruned_loss=0.04157, over 4795.00 frames.], tot_loss[loss=0.1395, simple_loss=0.213, pruned_loss=0.03305, over 972857.90 frames.], batch size: 21, lr: 2.16e-04 2022-05-06 18:52:49,844 INFO [train.py:715] (0/8) Epoch 10, batch 10050, loss[loss=0.1427, simple_loss=0.2126, pruned_loss=0.0364, over 4785.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2125, pruned_loss=0.03338, over 972242.23 frames.], batch size: 17, lr: 2.16e-04 2022-05-06 18:53:27,866 INFO [train.py:715] (0/8) Epoch 10, batch 10100, loss[loss=0.1352, simple_loss=0.2142, pruned_loss=0.02815, over 4734.00 frames.], tot_loss[loss=0.1392, simple_loss=0.212, pruned_loss=0.0332, over 972471.30 frames.], batch size: 16, lr: 2.16e-04 2022-05-06 18:54:06,607 INFO [train.py:715] (0/8) Epoch 10, batch 10150, loss[loss=0.1283, simple_loss=0.2041, pruned_loss=0.02625, over 4972.00 frames.], tot_loss[loss=0.139, simple_loss=0.2118, pruned_loss=0.03306, over 972467.36 frames.], batch size: 28, lr: 2.16e-04 2022-05-06 18:54:46,538 INFO [train.py:715] (0/8) Epoch 10, batch 10200, loss[loss=0.1569, simple_loss=0.2223, pruned_loss=0.04575, over 4933.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2129, pruned_loss=0.03377, over 971681.33 frames.], batch size: 21, lr: 2.16e-04 2022-05-06 18:55:25,657 INFO [train.py:715] (0/8) Epoch 10, batch 10250, loss[loss=0.1306, simple_loss=0.1957, pruned_loss=0.03275, over 4987.00 frames.], tot_loss[loss=0.1395, simple_loss=0.212, pruned_loss=0.03345, over 971655.32 frames.], batch size: 14, lr: 2.16e-04 2022-05-06 18:56:04,511 INFO [train.py:715] (0/8) Epoch 10, batch 10300, loss[loss=0.1369, simple_loss=0.2123, pruned_loss=0.03074, over 4924.00 frames.], tot_loss[loss=0.14, simple_loss=0.2124, pruned_loss=0.03379, over 972447.15 frames.], batch size: 23, lr: 2.16e-04 2022-05-06 18:56:44,439 INFO [train.py:715] (0/8) Epoch 10, batch 
10350, loss[loss=0.1312, simple_loss=0.1993, pruned_loss=0.03153, over 4786.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2133, pruned_loss=0.03442, over 972004.75 frames.], batch size: 12, lr: 2.16e-04 2022-05-06 18:57:24,435 INFO [train.py:715] (0/8) Epoch 10, batch 10400, loss[loss=0.1361, simple_loss=0.2159, pruned_loss=0.02819, over 4862.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2128, pruned_loss=0.0338, over 971416.17 frames.], batch size: 20, lr: 2.16e-04 2022-05-06 18:58:02,843 INFO [train.py:715] (0/8) Epoch 10, batch 10450, loss[loss=0.1341, simple_loss=0.2063, pruned_loss=0.03093, over 4789.00 frames.], tot_loss[loss=0.141, simple_loss=0.2135, pruned_loss=0.03431, over 972195.49 frames.], batch size: 17, lr: 2.16e-04 2022-05-06 18:58:41,110 INFO [train.py:715] (0/8) Epoch 10, batch 10500, loss[loss=0.1389, simple_loss=0.216, pruned_loss=0.03093, over 4830.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2138, pruned_loss=0.03402, over 971838.76 frames.], batch size: 26, lr: 2.16e-04 2022-05-06 18:59:20,244 INFO [train.py:715] (0/8) Epoch 10, batch 10550, loss[loss=0.1235, simple_loss=0.2029, pruned_loss=0.02203, over 4826.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2139, pruned_loss=0.03379, over 971712.16 frames.], batch size: 26, lr: 2.16e-04 2022-05-06 18:59:59,204 INFO [train.py:715] (0/8) Epoch 10, batch 10600, loss[loss=0.1495, simple_loss=0.2157, pruned_loss=0.04164, over 4850.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2135, pruned_loss=0.03349, over 972520.81 frames.], batch size: 30, lr: 2.16e-04 2022-05-06 19:00:37,418 INFO [train.py:715] (0/8) Epoch 10, batch 10650, loss[loss=0.1638, simple_loss=0.2377, pruned_loss=0.04493, over 4837.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2142, pruned_loss=0.03382, over 973445.40 frames.], batch size: 30, lr: 2.16e-04 2022-05-06 19:01:16,840 INFO [train.py:715] (0/8) Epoch 10, batch 10700, loss[loss=0.1617, simple_loss=0.2246, pruned_loss=0.04934, over 4922.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2144, pruned_loss=0.03393, over 972907.01 frames.], batch size: 23, lr: 2.16e-04 2022-05-06 19:01:56,164 INFO [train.py:715] (0/8) Epoch 10, batch 10750, loss[loss=0.1221, simple_loss=0.1975, pruned_loss=0.02332, over 4701.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2138, pruned_loss=0.03387, over 973213.85 frames.], batch size: 15, lr: 2.16e-04 2022-05-06 19:02:34,990 INFO [train.py:715] (0/8) Epoch 10, batch 10800, loss[loss=0.1352, simple_loss=0.2118, pruned_loss=0.02925, over 4958.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2131, pruned_loss=0.03397, over 972380.18 frames.], batch size: 24, lr: 2.16e-04 2022-05-06 19:03:13,435 INFO [train.py:715] (0/8) Epoch 10, batch 10850, loss[loss=0.1372, simple_loss=0.2062, pruned_loss=0.03408, over 4906.00 frames.], tot_loss[loss=0.14, simple_loss=0.2129, pruned_loss=0.03357, over 972348.51 frames.], batch size: 17, lr: 2.16e-04 2022-05-06 19:03:52,878 INFO [train.py:715] (0/8) Epoch 10, batch 10900, loss[loss=0.1277, simple_loss=0.1914, pruned_loss=0.03197, over 4736.00 frames.], tot_loss[loss=0.1403, simple_loss=0.213, pruned_loss=0.0338, over 972335.06 frames.], batch size: 12, lr: 2.16e-04 2022-05-06 19:04:31,790 INFO [train.py:715] (0/8) Epoch 10, batch 10950, loss[loss=0.1266, simple_loss=0.2043, pruned_loss=0.02442, over 4975.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2129, pruned_loss=0.03341, over 973063.34 frames.], batch size: 24, lr: 2.16e-04 2022-05-06 19:05:10,346 INFO [train.py:715] (0/8) Epoch 10, batch 11000, loss[loss=0.1398, 
simple_loss=0.2112, pruned_loss=0.0342, over 4913.00 frames.], tot_loss[loss=0.1398, simple_loss=0.213, pruned_loss=0.03325, over 972956.96 frames.], batch size: 18, lr: 2.16e-04 2022-05-06 19:05:49,497 INFO [train.py:715] (0/8) Epoch 10, batch 11050, loss[loss=0.1231, simple_loss=0.1979, pruned_loss=0.02419, over 4981.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2127, pruned_loss=0.03292, over 972936.00 frames.], batch size: 15, lr: 2.16e-04 2022-05-06 19:06:29,281 INFO [train.py:715] (0/8) Epoch 10, batch 11100, loss[loss=0.1287, simple_loss=0.2048, pruned_loss=0.02633, over 4916.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2126, pruned_loss=0.03306, over 973156.55 frames.], batch size: 18, lr: 2.16e-04 2022-05-06 19:07:07,078 INFO [train.py:715] (0/8) Epoch 10, batch 11150, loss[loss=0.1406, simple_loss=0.2046, pruned_loss=0.0383, over 4874.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2126, pruned_loss=0.03323, over 973306.59 frames.], batch size: 22, lr: 2.16e-04 2022-05-06 19:07:46,334 INFO [train.py:715] (0/8) Epoch 10, batch 11200, loss[loss=0.1436, simple_loss=0.2279, pruned_loss=0.02962, over 4933.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2128, pruned_loss=0.03329, over 974318.24 frames.], batch size: 23, lr: 2.16e-04 2022-05-06 19:08:25,401 INFO [train.py:715] (0/8) Epoch 10, batch 11250, loss[loss=0.1285, simple_loss=0.2099, pruned_loss=0.02353, over 4799.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2134, pruned_loss=0.03343, over 973720.39 frames.], batch size: 21, lr: 2.16e-04 2022-05-06 19:09:03,756 INFO [train.py:715] (0/8) Epoch 10, batch 11300, loss[loss=0.1431, simple_loss=0.2147, pruned_loss=0.03571, over 4903.00 frames.], tot_loss[loss=0.14, simple_loss=0.213, pruned_loss=0.03344, over 973514.32 frames.], batch size: 19, lr: 2.16e-04 2022-05-06 19:09:42,491 INFO [train.py:715] (0/8) Epoch 10, batch 11350, loss[loss=0.1439, simple_loss=0.2122, pruned_loss=0.03776, over 4983.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2132, pruned_loss=0.03399, over 973484.21 frames.], batch size: 14, lr: 2.16e-04 2022-05-06 19:10:21,473 INFO [train.py:715] (0/8) Epoch 10, batch 11400, loss[loss=0.1839, simple_loss=0.2387, pruned_loss=0.06459, over 4897.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2131, pruned_loss=0.03397, over 972539.55 frames.], batch size: 17, lr: 2.16e-04 2022-05-06 19:11:00,937 INFO [train.py:715] (0/8) Epoch 10, batch 11450, loss[loss=0.1481, simple_loss=0.2131, pruned_loss=0.04156, over 4886.00 frames.], tot_loss[loss=0.141, simple_loss=0.2137, pruned_loss=0.03417, over 972776.43 frames.], batch size: 19, lr: 2.16e-04 2022-05-06 19:11:38,820 INFO [train.py:715] (0/8) Epoch 10, batch 11500, loss[loss=0.1544, simple_loss=0.2272, pruned_loss=0.04077, over 4697.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2131, pruned_loss=0.0338, over 972044.77 frames.], batch size: 15, lr: 2.16e-04 2022-05-06 19:12:17,871 INFO [train.py:715] (0/8) Epoch 10, batch 11550, loss[loss=0.1687, simple_loss=0.2403, pruned_loss=0.04852, over 4737.00 frames.], tot_loss[loss=0.14, simple_loss=0.213, pruned_loss=0.03351, over 972154.74 frames.], batch size: 16, lr: 2.16e-04 2022-05-06 19:12:57,424 INFO [train.py:715] (0/8) Epoch 10, batch 11600, loss[loss=0.1185, simple_loss=0.1933, pruned_loss=0.02189, over 4931.00 frames.], tot_loss[loss=0.14, simple_loss=0.2128, pruned_loss=0.03361, over 972778.03 frames.], batch size: 29, lr: 2.16e-04 2022-05-06 19:13:35,825 INFO [train.py:715] (0/8) Epoch 10, batch 11650, loss[loss=0.1337, simple_loss=0.2041, 
pruned_loss=0.03164, over 4887.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2122, pruned_loss=0.03327, over 973261.58 frames.], batch size: 22, lr: 2.16e-04 2022-05-06 19:14:14,882 INFO [train.py:715] (0/8) Epoch 10, batch 11700, loss[loss=0.1431, simple_loss=0.2247, pruned_loss=0.0307, over 4981.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2115, pruned_loss=0.0329, over 972966.58 frames.], batch size: 27, lr: 2.16e-04 2022-05-06 19:14:53,451 INFO [train.py:715] (0/8) Epoch 10, batch 11750, loss[loss=0.123, simple_loss=0.1932, pruned_loss=0.02639, over 4781.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2116, pruned_loss=0.03267, over 972437.48 frames.], batch size: 18, lr: 2.15e-04 2022-05-06 19:15:32,370 INFO [train.py:715] (0/8) Epoch 10, batch 11800, loss[loss=0.1152, simple_loss=0.185, pruned_loss=0.02273, over 4783.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2108, pruned_loss=0.03242, over 972322.77 frames.], batch size: 17, lr: 2.15e-04 2022-05-06 19:16:10,395 INFO [train.py:715] (0/8) Epoch 10, batch 11850, loss[loss=0.1245, simple_loss=0.205, pruned_loss=0.02198, over 4884.00 frames.], tot_loss[loss=0.1383, simple_loss=0.211, pruned_loss=0.03283, over 972158.37 frames.], batch size: 16, lr: 2.15e-04 2022-05-06 19:16:49,162 INFO [train.py:715] (0/8) Epoch 10, batch 11900, loss[loss=0.1289, simple_loss=0.2043, pruned_loss=0.02677, over 4772.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2113, pruned_loss=0.03273, over 971845.39 frames.], batch size: 17, lr: 2.15e-04 2022-05-06 19:16:52,853 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-360000.pt 2022-05-06 19:17:30,480 INFO [train.py:715] (0/8) Epoch 10, batch 11950, loss[loss=0.119, simple_loss=0.1946, pruned_loss=0.02168, over 4796.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2112, pruned_loss=0.03286, over 971802.59 frames.], batch size: 24, lr: 2.15e-04 2022-05-06 19:18:09,370 INFO [train.py:715] (0/8) Epoch 10, batch 12000, loss[loss=0.1218, simple_loss=0.1941, pruned_loss=0.02475, over 4780.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2116, pruned_loss=0.03278, over 971896.48 frames.], batch size: 18, lr: 2.15e-04 2022-05-06 19:18:09,371 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 19:18:19,016 INFO [train.py:742] (0/8) Epoch 10, validation: loss=0.1065, simple_loss=0.1908, pruned_loss=0.01105, over 914524.00 frames. 
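The checkpoint written above lands at pruned_transducer_stateless2/exp/v2/checkpoint-360000.pt and can be inspected offline without a GPU. A minimal sketch, assuming the file is an ordinary torch.save dictionary (the exact key names are not guaranteed by anything in this log, so the sketch only lists whatever is there):

import torch

# Path copied from the 'Saving checkpoint to ...' line above; load on CPU.
ckpt = torch.load(
    "pruned_transducer_stateless2/exp/v2/checkpoint-360000.pt",
    map_location="cpu",
)
if isinstance(ckpt, dict):
    # Print each top-level entry and its type rather than assuming key names.
    for key in sorted(ckpt):
        print(key, type(ckpt[key]).__name__)
else:
    print(type(ckpt).__name__)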
2022-05-06 19:18:57,898 INFO [train.py:715] (0/8) Epoch 10, batch 12050, loss[loss=0.1533, simple_loss=0.233, pruned_loss=0.03679, over 4955.00 frames.], tot_loss[loss=0.1388, simple_loss=0.212, pruned_loss=0.03283, over 972525.16 frames.], batch size: 15, lr: 2.15e-04 2022-05-06 19:19:37,114 INFO [train.py:715] (0/8) Epoch 10, batch 12100, loss[loss=0.1376, simple_loss=0.2184, pruned_loss=0.0284, over 4849.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2133, pruned_loss=0.0336, over 972327.88 frames.], batch size: 32, lr: 2.15e-04 2022-05-06 19:20:16,372 INFO [train.py:715] (0/8) Epoch 10, batch 12150, loss[loss=0.1382, simple_loss=0.2129, pruned_loss=0.03181, over 4904.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2142, pruned_loss=0.03382, over 972263.64 frames.], batch size: 22, lr: 2.15e-04 2022-05-06 19:20:55,540 INFO [train.py:715] (0/8) Epoch 10, batch 12200, loss[loss=0.1423, simple_loss=0.2177, pruned_loss=0.03345, over 4787.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2134, pruned_loss=0.03323, over 972933.00 frames.], batch size: 18, lr: 2.15e-04 2022-05-06 19:21:34,086 INFO [train.py:715] (0/8) Epoch 10, batch 12250, loss[loss=0.11, simple_loss=0.1836, pruned_loss=0.01815, over 4828.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2124, pruned_loss=0.03326, over 972187.38 frames.], batch size: 15, lr: 2.15e-04 2022-05-06 19:22:13,029 INFO [train.py:715] (0/8) Epoch 10, batch 12300, loss[loss=0.1579, simple_loss=0.2315, pruned_loss=0.04209, over 4968.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2122, pruned_loss=0.0331, over 972097.73 frames.], batch size: 28, lr: 2.15e-04 2022-05-06 19:22:51,957 INFO [train.py:715] (0/8) Epoch 10, batch 12350, loss[loss=0.1444, simple_loss=0.2356, pruned_loss=0.02658, over 4854.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2118, pruned_loss=0.03322, over 972416.85 frames.], batch size: 20, lr: 2.15e-04 2022-05-06 19:23:30,787 INFO [train.py:715] (0/8) Epoch 10, batch 12400, loss[loss=0.1251, simple_loss=0.2047, pruned_loss=0.02271, over 4785.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2115, pruned_loss=0.03293, over 972600.04 frames.], batch size: 18, lr: 2.15e-04 2022-05-06 19:24:09,218 INFO [train.py:715] (0/8) Epoch 10, batch 12450, loss[loss=0.1573, simple_loss=0.2309, pruned_loss=0.04189, over 4847.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2119, pruned_loss=0.03319, over 972810.33 frames.], batch size: 20, lr: 2.15e-04 2022-05-06 19:24:48,248 INFO [train.py:715] (0/8) Epoch 10, batch 12500, loss[loss=0.1389, simple_loss=0.2152, pruned_loss=0.03127, over 4848.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2121, pruned_loss=0.03325, over 973169.27 frames.], batch size: 34, lr: 2.15e-04 2022-05-06 19:25:27,027 INFO [train.py:715] (0/8) Epoch 10, batch 12550, loss[loss=0.1349, simple_loss=0.2109, pruned_loss=0.02943, over 4789.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2122, pruned_loss=0.03344, over 971734.93 frames.], batch size: 17, lr: 2.15e-04 2022-05-06 19:26:05,181 INFO [train.py:715] (0/8) Epoch 10, batch 12600, loss[loss=0.1426, simple_loss=0.2315, pruned_loss=0.02679, over 4909.00 frames.], tot_loss[loss=0.1392, simple_loss=0.212, pruned_loss=0.03323, over 972431.72 frames.], batch size: 39, lr: 2.15e-04 2022-05-06 19:26:43,471 INFO [train.py:715] (0/8) Epoch 10, batch 12650, loss[loss=0.1218, simple_loss=0.1998, pruned_loss=0.02192, over 4773.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2121, pruned_loss=0.03335, over 971301.91 frames.], batch size: 18, lr: 2.15e-04 2022-05-06 19:27:22,409 
INFO [train.py:715] (0/8) Epoch 10, batch 12700, loss[loss=0.1704, simple_loss=0.2508, pruned_loss=0.04494, over 4980.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2129, pruned_loss=0.03385, over 971032.62 frames.], batch size: 28, lr: 2.15e-04 2022-05-06 19:28:00,751 INFO [train.py:715] (0/8) Epoch 10, batch 12750, loss[loss=0.116, simple_loss=0.1899, pruned_loss=0.0211, over 4926.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2131, pruned_loss=0.03374, over 971716.21 frames.], batch size: 23, lr: 2.15e-04 2022-05-06 19:28:39,216 INFO [train.py:715] (0/8) Epoch 10, batch 12800, loss[loss=0.1853, simple_loss=0.2575, pruned_loss=0.05658, over 4975.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2129, pruned_loss=0.03369, over 971828.71 frames.], batch size: 35, lr: 2.15e-04 2022-05-06 19:29:18,645 INFO [train.py:715] (0/8) Epoch 10, batch 12850, loss[loss=0.1581, simple_loss=0.2271, pruned_loss=0.0446, over 4975.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2131, pruned_loss=0.03361, over 971200.93 frames.], batch size: 39, lr: 2.15e-04 2022-05-06 19:29:57,811 INFO [train.py:715] (0/8) Epoch 10, batch 12900, loss[loss=0.1595, simple_loss=0.2401, pruned_loss=0.03939, over 4985.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2126, pruned_loss=0.03381, over 971135.92 frames.], batch size: 25, lr: 2.15e-04 2022-05-06 19:30:36,229 INFO [train.py:715] (0/8) Epoch 10, batch 12950, loss[loss=0.1562, simple_loss=0.2223, pruned_loss=0.04505, over 4879.00 frames.], tot_loss[loss=0.1404, simple_loss=0.213, pruned_loss=0.03392, over 971998.72 frames.], batch size: 16, lr: 2.15e-04 2022-05-06 19:31:14,799 INFO [train.py:715] (0/8) Epoch 10, batch 13000, loss[loss=0.1223, simple_loss=0.1929, pruned_loss=0.02586, over 4859.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2133, pruned_loss=0.03376, over 971809.62 frames.], batch size: 20, lr: 2.15e-04 2022-05-06 19:31:54,376 INFO [train.py:715] (0/8) Epoch 10, batch 13050, loss[loss=0.1402, simple_loss=0.2162, pruned_loss=0.03206, over 4791.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2133, pruned_loss=0.03361, over 971638.81 frames.], batch size: 21, lr: 2.15e-04 2022-05-06 19:32:32,857 INFO [train.py:715] (0/8) Epoch 10, batch 13100, loss[loss=0.1338, simple_loss=0.2185, pruned_loss=0.02454, over 4971.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2137, pruned_loss=0.03371, over 972148.27 frames.], batch size: 24, lr: 2.15e-04 2022-05-06 19:33:11,984 INFO [train.py:715] (0/8) Epoch 10, batch 13150, loss[loss=0.1339, simple_loss=0.2098, pruned_loss=0.02897, over 4966.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2138, pruned_loss=0.0334, over 972180.21 frames.], batch size: 15, lr: 2.15e-04 2022-05-06 19:33:51,010 INFO [train.py:715] (0/8) Epoch 10, batch 13200, loss[loss=0.1417, simple_loss=0.2134, pruned_loss=0.03499, over 4984.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2131, pruned_loss=0.03301, over 972455.73 frames.], batch size: 24, lr: 2.15e-04 2022-05-06 19:34:30,049 INFO [train.py:715] (0/8) Epoch 10, batch 13250, loss[loss=0.1668, simple_loss=0.226, pruned_loss=0.05383, over 4800.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2134, pruned_loss=0.03345, over 970792.89 frames.], batch size: 14, lr: 2.15e-04 2022-05-06 19:35:08,699 INFO [train.py:715] (0/8) Epoch 10, batch 13300, loss[loss=0.1243, simple_loss=0.1946, pruned_loss=0.02701, over 4846.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2133, pruned_loss=0.03356, over 972064.93 frames.], batch size: 12, lr: 2.15e-04 2022-05-06 19:35:47,102 INFO [train.py:715] 
(0/8) Epoch 10, batch 13350, loss[loss=0.119, simple_loss=0.1885, pruned_loss=0.02476, over 4779.00 frames.], tot_loss[loss=0.1409, simple_loss=0.214, pruned_loss=0.03389, over 972035.50 frames.], batch size: 12, lr: 2.15e-04 2022-05-06 19:36:26,373 INFO [train.py:715] (0/8) Epoch 10, batch 13400, loss[loss=0.1741, simple_loss=0.235, pruned_loss=0.05657, over 4824.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2134, pruned_loss=0.03375, over 971524.04 frames.], batch size: 15, lr: 2.15e-04 2022-05-06 19:37:04,725 INFO [train.py:715] (0/8) Epoch 10, batch 13450, loss[loss=0.1385, simple_loss=0.2065, pruned_loss=0.03523, over 4875.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2136, pruned_loss=0.03374, over 971526.30 frames.], batch size: 22, lr: 2.15e-04 2022-05-06 19:37:42,966 INFO [train.py:715] (0/8) Epoch 10, batch 13500, loss[loss=0.1517, simple_loss=0.2282, pruned_loss=0.03761, over 4856.00 frames.], tot_loss[loss=0.141, simple_loss=0.2138, pruned_loss=0.0341, over 971926.04 frames.], batch size: 32, lr: 2.15e-04 2022-05-06 19:38:22,033 INFO [train.py:715] (0/8) Epoch 10, batch 13550, loss[loss=0.134, simple_loss=0.2108, pruned_loss=0.02858, over 4911.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2142, pruned_loss=0.03405, over 970866.48 frames.], batch size: 19, lr: 2.15e-04 2022-05-06 19:39:00,606 INFO [train.py:715] (0/8) Epoch 10, batch 13600, loss[loss=0.1507, simple_loss=0.214, pruned_loss=0.04366, over 4984.00 frames.], tot_loss[loss=0.1409, simple_loss=0.214, pruned_loss=0.03393, over 971241.53 frames.], batch size: 25, lr: 2.15e-04 2022-05-06 19:39:39,004 INFO [train.py:715] (0/8) Epoch 10, batch 13650, loss[loss=0.1479, simple_loss=0.2141, pruned_loss=0.0408, over 4965.00 frames.], tot_loss[loss=0.1413, simple_loss=0.2144, pruned_loss=0.0341, over 971825.81 frames.], batch size: 35, lr: 2.15e-04 2022-05-06 19:40:17,578 INFO [train.py:715] (0/8) Epoch 10, batch 13700, loss[loss=0.1635, simple_loss=0.2286, pruned_loss=0.0492, over 4965.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2132, pruned_loss=0.03366, over 972142.57 frames.], batch size: 15, lr: 2.15e-04 2022-05-06 19:40:57,642 INFO [train.py:715] (0/8) Epoch 10, batch 13750, loss[loss=0.1593, simple_loss=0.2399, pruned_loss=0.03938, over 4793.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2122, pruned_loss=0.03334, over 972407.19 frames.], batch size: 18, lr: 2.15e-04 2022-05-06 19:41:37,005 INFO [train.py:715] (0/8) Epoch 10, batch 13800, loss[loss=0.1508, simple_loss=0.2211, pruned_loss=0.04024, over 4956.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2127, pruned_loss=0.03381, over 972585.12 frames.], batch size: 21, lr: 2.15e-04 2022-05-06 19:42:15,510 INFO [train.py:715] (0/8) Epoch 10, batch 13850, loss[loss=0.1518, simple_loss=0.2083, pruned_loss=0.04768, over 4846.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2132, pruned_loss=0.03381, over 972558.25 frames.], batch size: 32, lr: 2.15e-04 2022-05-06 19:42:55,146 INFO [train.py:715] (0/8) Epoch 10, batch 13900, loss[loss=0.1344, simple_loss=0.2013, pruned_loss=0.03374, over 4889.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2135, pruned_loss=0.03392, over 971616.76 frames.], batch size: 19, lr: 2.15e-04 2022-05-06 19:43:33,823 INFO [train.py:715] (0/8) Epoch 10, batch 13950, loss[loss=0.1328, simple_loss=0.207, pruned_loss=0.02927, over 4904.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2141, pruned_loss=0.03387, over 971832.47 frames.], batch size: 19, lr: 2.15e-04 2022-05-06 19:44:12,831 INFO [train.py:715] (0/8) Epoch 10, batch 14000, 
loss[loss=0.1191, simple_loss=0.196, pruned_loss=0.02109, over 4758.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2131, pruned_loss=0.03304, over 971608.25 frames.], batch size: 19, lr: 2.15e-04 2022-05-06 19:44:51,236 INFO [train.py:715] (0/8) Epoch 10, batch 14050, loss[loss=0.1352, simple_loss=0.211, pruned_loss=0.02972, over 4950.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2127, pruned_loss=0.03299, over 971045.85 frames.], batch size: 23, lr: 2.15e-04 2022-05-06 19:45:30,768 INFO [train.py:715] (0/8) Epoch 10, batch 14100, loss[loss=0.1467, simple_loss=0.2266, pruned_loss=0.03335, over 4863.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2135, pruned_loss=0.03342, over 972328.33 frames.], batch size: 16, lr: 2.15e-04 2022-05-06 19:46:09,123 INFO [train.py:715] (0/8) Epoch 10, batch 14150, loss[loss=0.1205, simple_loss=0.1999, pruned_loss=0.02052, over 4976.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2129, pruned_loss=0.03297, over 972605.00 frames.], batch size: 25, lr: 2.15e-04 2022-05-06 19:46:47,032 INFO [train.py:715] (0/8) Epoch 10, batch 14200, loss[loss=0.1513, simple_loss=0.2289, pruned_loss=0.03685, over 4799.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2129, pruned_loss=0.03325, over 972613.56 frames.], batch size: 24, lr: 2.15e-04 2022-05-06 19:47:26,633 INFO [train.py:715] (0/8) Epoch 10, batch 14250, loss[loss=0.126, simple_loss=0.1947, pruned_loss=0.02864, over 4854.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2125, pruned_loss=0.03317, over 972421.28 frames.], batch size: 20, lr: 2.15e-04 2022-05-06 19:48:05,009 INFO [train.py:715] (0/8) Epoch 10, batch 14300, loss[loss=0.1133, simple_loss=0.1913, pruned_loss=0.01769, over 4988.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2127, pruned_loss=0.03275, over 973341.09 frames.], batch size: 25, lr: 2.15e-04 2022-05-06 19:48:43,126 INFO [train.py:715] (0/8) Epoch 10, batch 14350, loss[loss=0.1309, simple_loss=0.2087, pruned_loss=0.02649, over 4840.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2134, pruned_loss=0.03325, over 972723.84 frames.], batch size: 26, lr: 2.15e-04 2022-05-06 19:49:21,564 INFO [train.py:715] (0/8) Epoch 10, batch 14400, loss[loss=0.1477, simple_loss=0.2175, pruned_loss=0.0389, over 4984.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2126, pruned_loss=0.03286, over 972210.45 frames.], batch size: 28, lr: 2.15e-04 2022-05-06 19:50:01,190 INFO [train.py:715] (0/8) Epoch 10, batch 14450, loss[loss=0.1667, simple_loss=0.2343, pruned_loss=0.04955, over 4985.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2125, pruned_loss=0.03297, over 972100.72 frames.], batch size: 28, lr: 2.15e-04 2022-05-06 19:50:39,558 INFO [train.py:715] (0/8) Epoch 10, batch 14500, loss[loss=0.1209, simple_loss=0.1927, pruned_loss=0.02456, over 4811.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2128, pruned_loss=0.03324, over 972321.21 frames.], batch size: 14, lr: 2.15e-04 2022-05-06 19:51:17,700 INFO [train.py:715] (0/8) Epoch 10, batch 14550, loss[loss=0.1577, simple_loss=0.2317, pruned_loss=0.04184, over 4907.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2127, pruned_loss=0.03316, over 972995.90 frames.], batch size: 39, lr: 2.15e-04 2022-05-06 19:51:57,345 INFO [train.py:715] (0/8) Epoch 10, batch 14600, loss[loss=0.1507, simple_loss=0.237, pruned_loss=0.0322, over 4916.00 frames.], tot_loss[loss=0.14, simple_loss=0.2131, pruned_loss=0.03344, over 971250.04 frames.], batch size: 29, lr: 2.15e-04 2022-05-06 19:52:35,978 INFO [train.py:715] (0/8) Epoch 10, batch 14650, loss[loss=0.1311, 
simple_loss=0.2079, pruned_loss=0.02718, over 4885.00 frames.], tot_loss[loss=0.14, simple_loss=0.2132, pruned_loss=0.03341, over 971719.71 frames.], batch size: 22, lr: 2.15e-04 2022-05-06 19:53:14,366 INFO [train.py:715] (0/8) Epoch 10, batch 14700, loss[loss=0.1349, simple_loss=0.2155, pruned_loss=0.02709, over 4862.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2136, pruned_loss=0.03366, over 972315.58 frames.], batch size: 20, lr: 2.15e-04 2022-05-06 19:53:53,325 INFO [train.py:715] (0/8) Epoch 10, batch 14750, loss[loss=0.1173, simple_loss=0.1988, pruned_loss=0.01786, over 4874.00 frames.], tot_loss[loss=0.14, simple_loss=0.2131, pruned_loss=0.0335, over 972839.33 frames.], batch size: 22, lr: 2.15e-04 2022-05-06 19:54:33,131 INFO [train.py:715] (0/8) Epoch 10, batch 14800, loss[loss=0.1634, simple_loss=0.2438, pruned_loss=0.04147, over 4773.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2134, pruned_loss=0.03393, over 972715.43 frames.], batch size: 17, lr: 2.15e-04 2022-05-06 19:55:12,158 INFO [train.py:715] (0/8) Epoch 10, batch 14850, loss[loss=0.1139, simple_loss=0.193, pruned_loss=0.0174, over 4942.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2131, pruned_loss=0.03404, over 972408.88 frames.], batch size: 29, lr: 2.15e-04 2022-05-06 19:55:50,173 INFO [train.py:715] (0/8) Epoch 10, batch 14900, loss[loss=0.1221, simple_loss=0.196, pruned_loss=0.02409, over 4819.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2134, pruned_loss=0.03406, over 972577.66 frames.], batch size: 25, lr: 2.15e-04 2022-05-06 19:56:30,292 INFO [train.py:715] (0/8) Epoch 10, batch 14950, loss[loss=0.1148, simple_loss=0.1901, pruned_loss=0.01982, over 4974.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2138, pruned_loss=0.034, over 973132.48 frames.], batch size: 15, lr: 2.15e-04 2022-05-06 19:57:09,814 INFO [train.py:715] (0/8) Epoch 10, batch 15000, loss[loss=0.1688, simple_loss=0.2514, pruned_loss=0.0431, over 4812.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2131, pruned_loss=0.03378, over 973428.56 frames.], batch size: 26, lr: 2.15e-04 2022-05-06 19:57:09,815 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 19:57:19,461 INFO [train.py:742] (0/8) Epoch 10, validation: loss=0.1065, simple_loss=0.1909, pruned_loss=0.01111, over 914524.00 frames. 
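Validation runs every 3000 batches in this stretch (batches 6000, 9000, 12000 and 15000), and the validation loss stays essentially flat around 0.106. A minimal sketch for pulling those figures out of a log written in this format; the regex simply mirrors the 'Epoch N, validation: loss=...' lines above, and the file name is a placeholder:

import re

# Matches e.g. 'Epoch 10, validation: loss=0.1065, simple_loss=0.1909, ...'
val_re = re.compile(r"Epoch (\d+), validation: loss=([0-9.]+)")

with open("train-log.txt") as f:  # placeholder file name, not taken from this log
    for line in f:
        m = val_re.search(line)
        if m:
            epoch, val_loss = int(m.group(1)), float(m.group(2))
            print(f"epoch {epoch}: validation loss {val_loss:.4f}")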
2022-05-06 19:57:59,084 INFO [train.py:715] (0/8) Epoch 10, batch 15050, loss[loss=0.1444, simple_loss=0.2231, pruned_loss=0.0328, over 4716.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2134, pruned_loss=0.03404, over 972666.18 frames.], batch size: 15, lr: 2.15e-04 2022-05-06 19:58:38,144 INFO [train.py:715] (0/8) Epoch 10, batch 15100, loss[loss=0.1339, simple_loss=0.2053, pruned_loss=0.03124, over 4773.00 frames.], tot_loss[loss=0.1399, simple_loss=0.213, pruned_loss=0.03338, over 972335.26 frames.], batch size: 17, lr: 2.15e-04 2022-05-06 19:59:17,364 INFO [train.py:715] (0/8) Epoch 10, batch 15150, loss[loss=0.1345, simple_loss=0.202, pruned_loss=0.03351, over 4970.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2129, pruned_loss=0.03339, over 971965.42 frames.], batch size: 15, lr: 2.14e-04 2022-05-06 19:59:56,358 INFO [train.py:715] (0/8) Epoch 10, batch 15200, loss[loss=0.1462, simple_loss=0.2142, pruned_loss=0.03912, over 4949.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2128, pruned_loss=0.03348, over 972028.31 frames.], batch size: 14, lr: 2.14e-04 2022-05-06 20:00:35,742 INFO [train.py:715] (0/8) Epoch 10, batch 15250, loss[loss=0.1115, simple_loss=0.1896, pruned_loss=0.01673, over 4937.00 frames.], tot_loss[loss=0.14, simple_loss=0.2129, pruned_loss=0.03355, over 972108.02 frames.], batch size: 23, lr: 2.14e-04 2022-05-06 20:01:14,787 INFO [train.py:715] (0/8) Epoch 10, batch 15300, loss[loss=0.1418, simple_loss=0.2156, pruned_loss=0.034, over 4791.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2121, pruned_loss=0.03333, over 971637.55 frames.], batch size: 14, lr: 2.14e-04 2022-05-06 20:01:54,057 INFO [train.py:715] (0/8) Epoch 10, batch 15350, loss[loss=0.1369, simple_loss=0.2041, pruned_loss=0.03486, over 4965.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2111, pruned_loss=0.0328, over 972473.65 frames.], batch size: 15, lr: 2.14e-04 2022-05-06 20:02:34,121 INFO [train.py:715] (0/8) Epoch 10, batch 15400, loss[loss=0.146, simple_loss=0.2194, pruned_loss=0.03633, over 4909.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2117, pruned_loss=0.03321, over 972784.95 frames.], batch size: 19, lr: 2.14e-04 2022-05-06 20:03:13,391 INFO [train.py:715] (0/8) Epoch 10, batch 15450, loss[loss=0.1292, simple_loss=0.1951, pruned_loss=0.03166, over 4932.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2123, pruned_loss=0.03367, over 972847.06 frames.], batch size: 23, lr: 2.14e-04 2022-05-06 20:03:53,463 INFO [train.py:715] (0/8) Epoch 10, batch 15500, loss[loss=0.1605, simple_loss=0.229, pruned_loss=0.046, over 4954.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2118, pruned_loss=0.03357, over 972644.04 frames.], batch size: 35, lr: 2.14e-04 2022-05-06 20:04:32,469 INFO [train.py:715] (0/8) Epoch 10, batch 15550, loss[loss=0.1625, simple_loss=0.2269, pruned_loss=0.04909, over 4947.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2123, pruned_loss=0.03377, over 973279.04 frames.], batch size: 35, lr: 2.14e-04 2022-05-06 20:05:11,885 INFO [train.py:715] (0/8) Epoch 10, batch 15600, loss[loss=0.1489, simple_loss=0.2156, pruned_loss=0.04106, over 4912.00 frames.], tot_loss[loss=0.1395, simple_loss=0.212, pruned_loss=0.03346, over 973819.25 frames.], batch size: 19, lr: 2.14e-04 2022-05-06 20:05:50,241 INFO [train.py:715] (0/8) Epoch 10, batch 15650, loss[loss=0.1318, simple_loss=0.2076, pruned_loss=0.02801, over 4877.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2109, pruned_loss=0.03327, over 973558.80 frames.], batch size: 22, lr: 2.14e-04 2022-05-06 20:06:28,930 INFO 
[train.py:715] (0/8) Epoch 10, batch 15700, loss[loss=0.1483, simple_loss=0.2235, pruned_loss=0.03657, over 4822.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2114, pruned_loss=0.03356, over 973495.38 frames.], batch size: 25, lr: 2.14e-04 2022-05-06 20:07:08,405 INFO [train.py:715] (0/8) Epoch 10, batch 15750, loss[loss=0.154, simple_loss=0.2181, pruned_loss=0.04494, over 4891.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2118, pruned_loss=0.03375, over 973685.06 frames.], batch size: 22, lr: 2.14e-04 2022-05-06 20:07:46,971 INFO [train.py:715] (0/8) Epoch 10, batch 15800, loss[loss=0.1346, simple_loss=0.2048, pruned_loss=0.03219, over 4849.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2121, pruned_loss=0.03382, over 972998.19 frames.], batch size: 30, lr: 2.14e-04 2022-05-06 20:08:26,769 INFO [train.py:715] (0/8) Epoch 10, batch 15850, loss[loss=0.1334, simple_loss=0.2039, pruned_loss=0.03147, over 4980.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2118, pruned_loss=0.03365, over 972579.62 frames.], batch size: 28, lr: 2.14e-04 2022-05-06 20:09:05,638 INFO [train.py:715] (0/8) Epoch 10, batch 15900, loss[loss=0.1736, simple_loss=0.2372, pruned_loss=0.05498, over 4882.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2124, pruned_loss=0.03422, over 973731.35 frames.], batch size: 19, lr: 2.14e-04 2022-05-06 20:09:44,835 INFO [train.py:715] (0/8) Epoch 10, batch 15950, loss[loss=0.144, simple_loss=0.2177, pruned_loss=0.03511, over 4824.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2123, pruned_loss=0.03401, over 973654.41 frames.], batch size: 15, lr: 2.14e-04 2022-05-06 20:10:23,754 INFO [train.py:715] (0/8) Epoch 10, batch 16000, loss[loss=0.1146, simple_loss=0.1888, pruned_loss=0.0202, over 4817.00 frames.], tot_loss[loss=0.14, simple_loss=0.2124, pruned_loss=0.03381, over 973585.45 frames.], batch size: 26, lr: 2.14e-04 2022-05-06 20:11:02,643 INFO [train.py:715] (0/8) Epoch 10, batch 16050, loss[loss=0.1298, simple_loss=0.202, pruned_loss=0.02875, over 4861.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2132, pruned_loss=0.03407, over 973978.77 frames.], batch size: 30, lr: 2.14e-04 2022-05-06 20:11:41,914 INFO [train.py:715] (0/8) Epoch 10, batch 16100, loss[loss=0.1544, simple_loss=0.2147, pruned_loss=0.04701, over 4699.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2126, pruned_loss=0.03384, over 972961.60 frames.], batch size: 15, lr: 2.14e-04 2022-05-06 20:12:21,126 INFO [train.py:715] (0/8) Epoch 10, batch 16150, loss[loss=0.1445, simple_loss=0.2186, pruned_loss=0.03519, over 4921.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2129, pruned_loss=0.0342, over 973267.40 frames.], batch size: 39, lr: 2.14e-04 2022-05-06 20:13:01,096 INFO [train.py:715] (0/8) Epoch 10, batch 16200, loss[loss=0.1684, simple_loss=0.2418, pruned_loss=0.04746, over 4959.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2131, pruned_loss=0.03434, over 973131.78 frames.], batch size: 15, lr: 2.14e-04 2022-05-06 20:13:40,632 INFO [train.py:715] (0/8) Epoch 10, batch 16250, loss[loss=0.1025, simple_loss=0.1777, pruned_loss=0.0137, over 4936.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2133, pruned_loss=0.03412, over 972667.20 frames.], batch size: 18, lr: 2.14e-04 2022-05-06 20:14:19,845 INFO [train.py:715] (0/8) Epoch 10, batch 16300, loss[loss=0.1435, simple_loss=0.2035, pruned_loss=0.0418, over 4799.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2131, pruned_loss=0.03419, over 972581.09 frames.], batch size: 12, lr: 2.14e-04 2022-05-06 20:14:59,848 INFO [train.py:715] (0/8) Epoch 
10, batch 16350, loss[loss=0.1325, simple_loss=0.2125, pruned_loss=0.02628, over 4817.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2131, pruned_loss=0.03374, over 972383.35 frames.], batch size: 25, lr: 2.14e-04 2022-05-06 20:15:39,242 INFO [train.py:715] (0/8) Epoch 10, batch 16400, loss[loss=0.1574, simple_loss=0.2221, pruned_loss=0.04637, over 4931.00 frames.], tot_loss[loss=0.14, simple_loss=0.2126, pruned_loss=0.03367, over 972625.17 frames.], batch size: 23, lr: 2.14e-04 2022-05-06 20:16:18,979 INFO [train.py:715] (0/8) Epoch 10, batch 16450, loss[loss=0.1261, simple_loss=0.2188, pruned_loss=0.01668, over 4815.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2127, pruned_loss=0.03386, over 972036.65 frames.], batch size: 25, lr: 2.14e-04 2022-05-06 20:16:57,467 INFO [train.py:715] (0/8) Epoch 10, batch 16500, loss[loss=0.1488, simple_loss=0.2119, pruned_loss=0.04286, over 4966.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2123, pruned_loss=0.03392, over 971830.03 frames.], batch size: 14, lr: 2.14e-04 2022-05-06 20:17:36,175 INFO [train.py:715] (0/8) Epoch 10, batch 16550, loss[loss=0.1364, simple_loss=0.2131, pruned_loss=0.02991, over 4796.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2126, pruned_loss=0.03385, over 971497.79 frames.], batch size: 21, lr: 2.14e-04 2022-05-06 20:18:15,833 INFO [train.py:715] (0/8) Epoch 10, batch 16600, loss[loss=0.1243, simple_loss=0.2046, pruned_loss=0.022, over 4928.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2127, pruned_loss=0.03394, over 970971.19 frames.], batch size: 21, lr: 2.14e-04 2022-05-06 20:18:54,012 INFO [train.py:715] (0/8) Epoch 10, batch 16650, loss[loss=0.1518, simple_loss=0.2258, pruned_loss=0.03891, over 4936.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2124, pruned_loss=0.03362, over 971748.67 frames.], batch size: 39, lr: 2.14e-04 2022-05-06 20:19:33,368 INFO [train.py:715] (0/8) Epoch 10, batch 16700, loss[loss=0.1461, simple_loss=0.2225, pruned_loss=0.03487, over 4895.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2131, pruned_loss=0.03401, over 971979.17 frames.], batch size: 22, lr: 2.14e-04 2022-05-06 20:20:12,355 INFO [train.py:715] (0/8) Epoch 10, batch 16750, loss[loss=0.1444, simple_loss=0.2166, pruned_loss=0.03603, over 4782.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2135, pruned_loss=0.03405, over 972976.49 frames.], batch size: 17, lr: 2.14e-04 2022-05-06 20:20:52,510 INFO [train.py:715] (0/8) Epoch 10, batch 16800, loss[loss=0.1113, simple_loss=0.1864, pruned_loss=0.01806, over 4901.00 frames.], tot_loss[loss=0.1396, simple_loss=0.212, pruned_loss=0.03356, over 972864.36 frames.], batch size: 19, lr: 2.14e-04 2022-05-06 20:21:31,829 INFO [train.py:715] (0/8) Epoch 10, batch 16850, loss[loss=0.119, simple_loss=0.2014, pruned_loss=0.01834, over 4792.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2127, pruned_loss=0.0338, over 973236.71 frames.], batch size: 17, lr: 2.14e-04 2022-05-06 20:22:11,633 INFO [train.py:715] (0/8) Epoch 10, batch 16900, loss[loss=0.1344, simple_loss=0.2038, pruned_loss=0.03248, over 4992.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2128, pruned_loss=0.03354, over 972297.46 frames.], batch size: 20, lr: 2.14e-04 2022-05-06 20:22:51,673 INFO [train.py:715] (0/8) Epoch 10, batch 16950, loss[loss=0.166, simple_loss=0.249, pruned_loss=0.04152, over 4955.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2132, pruned_loss=0.03384, over 972640.07 frames.], batch size: 21, lr: 2.14e-04 2022-05-06 20:23:29,921 INFO [train.py:715] (0/8) Epoch 10, batch 17000, 
loss[loss=0.1413, simple_loss=0.224, pruned_loss=0.02929, over 4908.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2127, pruned_loss=0.03379, over 971506.56 frames.], batch size: 19, lr: 2.14e-04 2022-05-06 20:24:09,509 INFO [train.py:715] (0/8) Epoch 10, batch 17050, loss[loss=0.1486, simple_loss=0.2172, pruned_loss=0.04002, over 4881.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2128, pruned_loss=0.03326, over 972264.73 frames.], batch size: 16, lr: 2.14e-04 2022-05-06 20:24:48,209 INFO [train.py:715] (0/8) Epoch 10, batch 17100, loss[loss=0.1338, simple_loss=0.2134, pruned_loss=0.02715, over 4966.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2136, pruned_loss=0.0334, over 972398.18 frames.], batch size: 24, lr: 2.14e-04 2022-05-06 20:25:27,435 INFO [train.py:715] (0/8) Epoch 10, batch 17150, loss[loss=0.1287, simple_loss=0.2005, pruned_loss=0.0285, over 4943.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2135, pruned_loss=0.03387, over 973023.54 frames.], batch size: 21, lr: 2.14e-04 2022-05-06 20:26:07,395 INFO [train.py:715] (0/8) Epoch 10, batch 17200, loss[loss=0.1405, simple_loss=0.2254, pruned_loss=0.02785, over 4821.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2131, pruned_loss=0.0335, over 972640.95 frames.], batch size: 21, lr: 2.14e-04 2022-05-06 20:26:47,004 INFO [train.py:715] (0/8) Epoch 10, batch 17250, loss[loss=0.1226, simple_loss=0.1966, pruned_loss=0.02426, over 4900.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2128, pruned_loss=0.03353, over 972158.14 frames.], batch size: 19, lr: 2.14e-04 2022-05-06 20:27:26,660 INFO [train.py:715] (0/8) Epoch 10, batch 17300, loss[loss=0.1332, simple_loss=0.1956, pruned_loss=0.03538, over 4840.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2132, pruned_loss=0.0337, over 972488.42 frames.], batch size: 15, lr: 2.14e-04 2022-05-06 20:28:05,426 INFO [train.py:715] (0/8) Epoch 10, batch 17350, loss[loss=0.1505, simple_loss=0.2069, pruned_loss=0.04703, over 4818.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2135, pruned_loss=0.03353, over 972852.30 frames.], batch size: 15, lr: 2.14e-04 2022-05-06 20:28:44,826 INFO [train.py:715] (0/8) Epoch 10, batch 17400, loss[loss=0.1548, simple_loss=0.2251, pruned_loss=0.04227, over 4953.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2133, pruned_loss=0.0336, over 972458.67 frames.], batch size: 24, lr: 2.14e-04 2022-05-06 20:29:24,004 INFO [train.py:715] (0/8) Epoch 10, batch 17450, loss[loss=0.1496, simple_loss=0.2247, pruned_loss=0.03723, over 4959.00 frames.], tot_loss[loss=0.14, simple_loss=0.2131, pruned_loss=0.03347, over 971942.14 frames.], batch size: 35, lr: 2.14e-04 2022-05-06 20:30:02,979 INFO [train.py:715] (0/8) Epoch 10, batch 17500, loss[loss=0.1112, simple_loss=0.1986, pruned_loss=0.01191, over 4849.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2128, pruned_loss=0.03338, over 972530.64 frames.], batch size: 34, lr: 2.14e-04 2022-05-06 20:30:42,974 INFO [train.py:715] (0/8) Epoch 10, batch 17550, loss[loss=0.1324, simple_loss=0.2184, pruned_loss=0.02324, over 4810.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2135, pruned_loss=0.03385, over 971957.67 frames.], batch size: 27, lr: 2.14e-04 2022-05-06 20:31:21,945 INFO [train.py:715] (0/8) Epoch 10, batch 17600, loss[loss=0.1302, simple_loss=0.2066, pruned_loss=0.02693, over 4811.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2137, pruned_loss=0.0341, over 971436.18 frames.], batch size: 27, lr: 2.14e-04 2022-05-06 20:32:01,504 INFO [train.py:715] (0/8) Epoch 10, batch 17650, loss[loss=0.1287, 
simple_loss=0.2022, pruned_loss=0.02761, over 4888.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2133, pruned_loss=0.03392, over 971616.10 frames.], batch size: 22, lr: 2.14e-04 2022-05-06 20:32:40,267 INFO [train.py:715] (0/8) Epoch 10, batch 17700, loss[loss=0.1394, simple_loss=0.2053, pruned_loss=0.0368, over 4814.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2135, pruned_loss=0.03404, over 971615.46 frames.], batch size: 15, lr: 2.14e-04 2022-05-06 20:33:20,044 INFO [train.py:715] (0/8) Epoch 10, batch 17750, loss[loss=0.1195, simple_loss=0.1983, pruned_loss=0.02031, over 4793.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2131, pruned_loss=0.03379, over 971612.79 frames.], batch size: 12, lr: 2.14e-04 2022-05-06 20:33:59,769 INFO [train.py:715] (0/8) Epoch 10, batch 17800, loss[loss=0.1225, simple_loss=0.201, pruned_loss=0.02203, over 4825.00 frames.], tot_loss[loss=0.1403, simple_loss=0.213, pruned_loss=0.03381, over 971750.73 frames.], batch size: 25, lr: 2.14e-04 2022-05-06 20:34:38,716 INFO [train.py:715] (0/8) Epoch 10, batch 17850, loss[loss=0.1471, simple_loss=0.2172, pruned_loss=0.03853, over 4917.00 frames.], tot_loss[loss=0.14, simple_loss=0.2127, pruned_loss=0.03359, over 972350.33 frames.], batch size: 17, lr: 2.14e-04 2022-05-06 20:35:18,470 INFO [train.py:715] (0/8) Epoch 10, batch 17900, loss[loss=0.1602, simple_loss=0.2326, pruned_loss=0.04392, over 4817.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2125, pruned_loss=0.03416, over 971948.11 frames.], batch size: 27, lr: 2.14e-04 2022-05-06 20:35:57,404 INFO [train.py:715] (0/8) Epoch 10, batch 17950, loss[loss=0.1508, simple_loss=0.2214, pruned_loss=0.04005, over 4841.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2138, pruned_loss=0.03464, over 970506.41 frames.], batch size: 15, lr: 2.14e-04 2022-05-06 20:36:36,021 INFO [train.py:715] (0/8) Epoch 10, batch 18000, loss[loss=0.1463, simple_loss=0.2137, pruned_loss=0.03941, over 4970.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2135, pruned_loss=0.03438, over 970588.86 frames.], batch size: 15, lr: 2.14e-04 2022-05-06 20:36:36,022 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 20:36:45,529 INFO [train.py:742] (0/8) Epoch 10, validation: loss=0.1064, simple_loss=0.1906, pruned_loss=0.01104, over 914524.00 frames. 
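Aside on the loss fields above: with simple_loss_scale=0.5 from the hyperparameters at the top of this log, the reported loss in these entries is consistent with a weighted sum of the two transducer losses once warm-up is over. A minimal sanity check under that assumption (the helper below is illustrative, not icefall code):

    # Assumption: after model warm-up the logged `loss` is
    # simple_loss_scale * simple_loss + pruned_loss, with simple_loss_scale = 0.5
    # taken from the run's hyperparameters.
    def combined_loss(simple_loss: float, pruned_loss: float,
                      simple_loss_scale: float = 0.5) -> float:
        return simple_loss_scale * simple_loss + pruned_loss

    # Validation entry just above: loss=0.1064, simple_loss=0.1906, pruned_loss=0.01104
    print(combined_loss(0.1906, 0.01104))  # ~0.1063, matching loss=0.1064 up to rounding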
2022-05-06 20:37:24,882 INFO [train.py:715] (0/8) Epoch 10, batch 18050, loss[loss=0.1459, simple_loss=0.2215, pruned_loss=0.03513, over 4856.00 frames.], tot_loss[loss=0.1406, simple_loss=0.213, pruned_loss=0.03405, over 971128.77 frames.], batch size: 20, lr: 2.14e-04 2022-05-06 20:38:03,974 INFO [train.py:715] (0/8) Epoch 10, batch 18100, loss[loss=0.1355, simple_loss=0.2114, pruned_loss=0.02978, over 4776.00 frames.], tot_loss[loss=0.1405, simple_loss=0.213, pruned_loss=0.03399, over 971657.35 frames.], batch size: 17, lr: 2.14e-04 2022-05-06 20:38:43,262 INFO [train.py:715] (0/8) Epoch 10, batch 18150, loss[loss=0.1045, simple_loss=0.1796, pruned_loss=0.01474, over 4923.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2127, pruned_loss=0.03386, over 971859.41 frames.], batch size: 29, lr: 2.14e-04 2022-05-06 20:39:21,939 INFO [train.py:715] (0/8) Epoch 10, batch 18200, loss[loss=0.1293, simple_loss=0.2091, pruned_loss=0.0248, over 4961.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2119, pruned_loss=0.03298, over 972959.74 frames.], batch size: 24, lr: 2.14e-04 2022-05-06 20:40:00,617 INFO [train.py:715] (0/8) Epoch 10, batch 18250, loss[loss=0.1525, simple_loss=0.2188, pruned_loss=0.04314, over 4836.00 frames.], tot_loss[loss=0.139, simple_loss=0.2113, pruned_loss=0.03328, over 972885.69 frames.], batch size: 30, lr: 2.14e-04 2022-05-06 20:40:40,109 INFO [train.py:715] (0/8) Epoch 10, batch 18300, loss[loss=0.1647, simple_loss=0.2355, pruned_loss=0.04697, over 4899.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2124, pruned_loss=0.03346, over 972860.36 frames.], batch size: 19, lr: 2.14e-04 2022-05-06 20:41:19,472 INFO [train.py:715] (0/8) Epoch 10, batch 18350, loss[loss=0.1584, simple_loss=0.2324, pruned_loss=0.0422, over 4970.00 frames.], tot_loss[loss=0.14, simple_loss=0.2127, pruned_loss=0.03367, over 972659.93 frames.], batch size: 15, lr: 2.14e-04 2022-05-06 20:41:57,958 INFO [train.py:715] (0/8) Epoch 10, batch 18400, loss[loss=0.1835, simple_loss=0.2361, pruned_loss=0.06541, over 4832.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2124, pruned_loss=0.03349, over 972271.86 frames.], batch size: 15, lr: 2.14e-04 2022-05-06 20:42:37,148 INFO [train.py:715] (0/8) Epoch 10, batch 18450, loss[loss=0.1739, simple_loss=0.2213, pruned_loss=0.06323, over 4985.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2123, pruned_loss=0.03357, over 972348.81 frames.], batch size: 14, lr: 2.14e-04 2022-05-06 20:43:16,005 INFO [train.py:715] (0/8) Epoch 10, batch 18500, loss[loss=0.1441, simple_loss=0.2141, pruned_loss=0.0371, over 4949.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2123, pruned_loss=0.03347, over 972677.75 frames.], batch size: 21, lr: 2.14e-04 2022-05-06 20:43:55,526 INFO [train.py:715] (0/8) Epoch 10, batch 18550, loss[loss=0.1536, simple_loss=0.2312, pruned_loss=0.03802, over 4818.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2127, pruned_loss=0.03352, over 971980.08 frames.], batch size: 15, lr: 2.13e-04 2022-05-06 20:44:33,841 INFO [train.py:715] (0/8) Epoch 10, batch 18600, loss[loss=0.1534, simple_loss=0.2202, pruned_loss=0.04331, over 4846.00 frames.], tot_loss[loss=0.1402, simple_loss=0.213, pruned_loss=0.03368, over 972658.56 frames.], batch size: 30, lr: 2.13e-04 2022-05-06 20:45:13,252 INFO [train.py:715] (0/8) Epoch 10, batch 18650, loss[loss=0.1306, simple_loss=0.2076, pruned_loss=0.02675, over 4912.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2121, pruned_loss=0.03319, over 973112.98 frames.], batch size: 17, lr: 2.13e-04 2022-05-06 20:45:52,994 
INFO [train.py:715] (0/8) Epoch 10, batch 18700, loss[loss=0.1998, simple_loss=0.2733, pruned_loss=0.06313, over 4755.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2126, pruned_loss=0.03336, over 972425.76 frames.], batch size: 16, lr: 2.13e-04 2022-05-06 20:46:31,254 INFO [train.py:715] (0/8) Epoch 10, batch 18750, loss[loss=0.1217, simple_loss=0.1933, pruned_loss=0.02499, over 4980.00 frames.], tot_loss[loss=0.1397, simple_loss=0.213, pruned_loss=0.03321, over 972478.01 frames.], batch size: 35, lr: 2.13e-04 2022-05-06 20:47:10,633 INFO [train.py:715] (0/8) Epoch 10, batch 18800, loss[loss=0.135, simple_loss=0.2115, pruned_loss=0.02923, over 4955.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2133, pruned_loss=0.03345, over 972238.55 frames.], batch size: 14, lr: 2.13e-04 2022-05-06 20:47:50,110 INFO [train.py:715] (0/8) Epoch 10, batch 18850, loss[loss=0.1669, simple_loss=0.2372, pruned_loss=0.04824, over 4961.00 frames.], tot_loss[loss=0.1401, simple_loss=0.213, pruned_loss=0.0336, over 971814.57 frames.], batch size: 39, lr: 2.13e-04 2022-05-06 20:48:29,010 INFO [train.py:715] (0/8) Epoch 10, batch 18900, loss[loss=0.116, simple_loss=0.1881, pruned_loss=0.0219, over 4748.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2124, pruned_loss=0.03351, over 972300.59 frames.], batch size: 16, lr: 2.13e-04 2022-05-06 20:49:08,063 INFO [train.py:715] (0/8) Epoch 10, batch 18950, loss[loss=0.1642, simple_loss=0.2452, pruned_loss=0.04161, over 4854.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2125, pruned_loss=0.03351, over 972739.81 frames.], batch size: 20, lr: 2.13e-04 2022-05-06 20:49:48,334 INFO [train.py:715] (0/8) Epoch 10, batch 19000, loss[loss=0.1542, simple_loss=0.2225, pruned_loss=0.04295, over 4928.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2127, pruned_loss=0.03383, over 973489.42 frames.], batch size: 29, lr: 2.13e-04 2022-05-06 20:50:27,639 INFO [train.py:715] (0/8) Epoch 10, batch 19050, loss[loss=0.1503, simple_loss=0.2149, pruned_loss=0.04289, over 4784.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2132, pruned_loss=0.034, over 974132.43 frames.], batch size: 17, lr: 2.13e-04 2022-05-06 20:51:06,451 INFO [train.py:715] (0/8) Epoch 10, batch 19100, loss[loss=0.1606, simple_loss=0.2469, pruned_loss=0.03719, over 4951.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2128, pruned_loss=0.03373, over 973850.03 frames.], batch size: 21, lr: 2.13e-04 2022-05-06 20:51:46,325 INFO [train.py:715] (0/8) Epoch 10, batch 19150, loss[loss=0.1453, simple_loss=0.2095, pruned_loss=0.04059, over 4946.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2123, pruned_loss=0.03338, over 973798.12 frames.], batch size: 21, lr: 2.13e-04 2022-05-06 20:52:26,494 INFO [train.py:715] (0/8) Epoch 10, batch 19200, loss[loss=0.1411, simple_loss=0.2132, pruned_loss=0.03446, over 4816.00 frames.], tot_loss[loss=0.1392, simple_loss=0.212, pruned_loss=0.03321, over 972845.40 frames.], batch size: 27, lr: 2.13e-04 2022-05-06 20:53:06,170 INFO [train.py:715] (0/8) Epoch 10, batch 19250, loss[loss=0.1137, simple_loss=0.1903, pruned_loss=0.01857, over 4843.00 frames.], tot_loss[loss=0.14, simple_loss=0.2131, pruned_loss=0.03348, over 972424.53 frames.], batch size: 15, lr: 2.13e-04 2022-05-06 20:53:46,066 INFO [train.py:715] (0/8) Epoch 10, batch 19300, loss[loss=0.1575, simple_loss=0.2275, pruned_loss=0.0438, over 4975.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2125, pruned_loss=0.03301, over 972787.86 frames.], batch size: 35, lr: 2.13e-04 2022-05-06 20:54:26,471 INFO [train.py:715] (0/8) 
Epoch 10, batch 19350, loss[loss=0.1272, simple_loss=0.2019, pruned_loss=0.02627, over 4896.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2124, pruned_loss=0.03306, over 972049.80 frames.], batch size: 22, lr: 2.13e-04 2022-05-06 20:55:06,649 INFO [train.py:715] (0/8) Epoch 10, batch 19400, loss[loss=0.1527, simple_loss=0.2233, pruned_loss=0.04107, over 4972.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2129, pruned_loss=0.03324, over 971887.18 frames.], batch size: 15, lr: 2.13e-04 2022-05-06 20:55:45,793 INFO [train.py:715] (0/8) Epoch 10, batch 19450, loss[loss=0.1492, simple_loss=0.2154, pruned_loss=0.04153, over 4853.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2127, pruned_loss=0.03325, over 972323.40 frames.], batch size: 20, lr: 2.13e-04 2022-05-06 20:56:25,410 INFO [train.py:715] (0/8) Epoch 10, batch 19500, loss[loss=0.1289, simple_loss=0.2017, pruned_loss=0.02805, over 4827.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2128, pruned_loss=0.03299, over 972849.35 frames.], batch size: 13, lr: 2.13e-04 2022-05-06 20:57:04,607 INFO [train.py:715] (0/8) Epoch 10, batch 19550, loss[loss=0.1343, simple_loss=0.2007, pruned_loss=0.03391, over 4749.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2124, pruned_loss=0.03351, over 973082.00 frames.], batch size: 19, lr: 2.13e-04 2022-05-06 20:57:43,329 INFO [train.py:715] (0/8) Epoch 10, batch 19600, loss[loss=0.1373, simple_loss=0.2039, pruned_loss=0.03537, over 4825.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2123, pruned_loss=0.03343, over 972631.69 frames.], batch size: 15, lr: 2.13e-04 2022-05-06 20:58:22,307 INFO [train.py:715] (0/8) Epoch 10, batch 19650, loss[loss=0.1223, simple_loss=0.1925, pruned_loss=0.02603, over 4841.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2125, pruned_loss=0.0338, over 973184.05 frames.], batch size: 32, lr: 2.13e-04 2022-05-06 20:59:01,942 INFO [train.py:715] (0/8) Epoch 10, batch 19700, loss[loss=0.1663, simple_loss=0.245, pruned_loss=0.04376, over 4706.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2132, pruned_loss=0.03406, over 973257.94 frames.], batch size: 15, lr: 2.13e-04 2022-05-06 20:59:41,298 INFO [train.py:715] (0/8) Epoch 10, batch 19750, loss[loss=0.1393, simple_loss=0.2108, pruned_loss=0.03391, over 4803.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2133, pruned_loss=0.03451, over 971882.78 frames.], batch size: 26, lr: 2.13e-04 2022-05-06 21:00:19,604 INFO [train.py:715] (0/8) Epoch 10, batch 19800, loss[loss=0.1141, simple_loss=0.1875, pruned_loss=0.02039, over 4769.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2127, pruned_loss=0.03446, over 971801.79 frames.], batch size: 18, lr: 2.13e-04 2022-05-06 21:00:59,240 INFO [train.py:715] (0/8) Epoch 10, batch 19850, loss[loss=0.1328, simple_loss=0.2007, pruned_loss=0.03247, over 4855.00 frames.], tot_loss[loss=0.141, simple_loss=0.2131, pruned_loss=0.03448, over 971796.19 frames.], batch size: 13, lr: 2.13e-04 2022-05-06 21:01:38,759 INFO [train.py:715] (0/8) Epoch 10, batch 19900, loss[loss=0.1545, simple_loss=0.2321, pruned_loss=0.03841, over 4761.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2134, pruned_loss=0.03441, over 971729.61 frames.], batch size: 19, lr: 2.13e-04 2022-05-06 21:01:42,489 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-368000.pt 2022-05-06 21:02:19,873 INFO [train.py:715] (0/8) Epoch 10, batch 19950, loss[loss=0.143, simple_loss=0.2093, pruned_loss=0.03833, over 4825.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2134, 
pruned_loss=0.03467, over 972211.38 frames.], batch size: 26, lr: 2.13e-04 2022-05-06 21:02:58,927 INFO [train.py:715] (0/8) Epoch 10, batch 20000, loss[loss=0.1539, simple_loss=0.2211, pruned_loss=0.04338, over 4768.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2137, pruned_loss=0.03454, over 973312.34 frames.], batch size: 17, lr: 2.13e-04 2022-05-06 21:03:37,945 INFO [train.py:715] (0/8) Epoch 10, batch 20050, loss[loss=0.1405, simple_loss=0.2119, pruned_loss=0.03456, over 4809.00 frames.], tot_loss[loss=0.141, simple_loss=0.213, pruned_loss=0.03446, over 973490.98 frames.], batch size: 21, lr: 2.13e-04 2022-05-06 21:04:17,427 INFO [train.py:715] (0/8) Epoch 10, batch 20100, loss[loss=0.1525, simple_loss=0.2244, pruned_loss=0.04032, over 4893.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2129, pruned_loss=0.03435, over 973063.28 frames.], batch size: 17, lr: 2.13e-04 2022-05-06 21:04:55,528 INFO [train.py:715] (0/8) Epoch 10, batch 20150, loss[loss=0.1203, simple_loss=0.1976, pruned_loss=0.02149, over 4922.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2123, pruned_loss=0.03399, over 973022.49 frames.], batch size: 18, lr: 2.13e-04 2022-05-06 21:05:34,941 INFO [train.py:715] (0/8) Epoch 10, batch 20200, loss[loss=0.1206, simple_loss=0.2051, pruned_loss=0.01807, over 4903.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2121, pruned_loss=0.03357, over 973210.17 frames.], batch size: 23, lr: 2.13e-04 2022-05-06 21:06:13,959 INFO [train.py:715] (0/8) Epoch 10, batch 20250, loss[loss=0.1148, simple_loss=0.1922, pruned_loss=0.0187, over 4832.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2124, pruned_loss=0.03388, over 973232.27 frames.], batch size: 26, lr: 2.13e-04 2022-05-06 21:06:52,618 INFO [train.py:715] (0/8) Epoch 10, batch 20300, loss[loss=0.1418, simple_loss=0.2157, pruned_loss=0.0339, over 4905.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2127, pruned_loss=0.03374, over 973342.98 frames.], batch size: 17, lr: 2.13e-04 2022-05-06 21:07:31,400 INFO [train.py:715] (0/8) Epoch 10, batch 20350, loss[loss=0.129, simple_loss=0.2012, pruned_loss=0.02844, over 4821.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2119, pruned_loss=0.03366, over 972183.62 frames.], batch size: 26, lr: 2.13e-04 2022-05-06 21:08:10,508 INFO [train.py:715] (0/8) Epoch 10, batch 20400, loss[loss=0.1395, simple_loss=0.2121, pruned_loss=0.03346, over 4936.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2122, pruned_loss=0.03359, over 971720.64 frames.], batch size: 29, lr: 2.13e-04 2022-05-06 21:08:49,425 INFO [train.py:715] (0/8) Epoch 10, batch 20450, loss[loss=0.1799, simple_loss=0.2512, pruned_loss=0.05428, over 4780.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2124, pruned_loss=0.03365, over 971511.46 frames.], batch size: 14, lr: 2.13e-04 2022-05-06 21:09:27,878 INFO [train.py:715] (0/8) Epoch 10, batch 20500, loss[loss=0.123, simple_loss=0.1977, pruned_loss=0.02416, over 4934.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2128, pruned_loss=0.03387, over 972523.23 frames.], batch size: 29, lr: 2.13e-04 2022-05-06 21:10:06,950 INFO [train.py:715] (0/8) Epoch 10, batch 20550, loss[loss=0.1384, simple_loss=0.2108, pruned_loss=0.03299, over 4823.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2131, pruned_loss=0.03398, over 972937.71 frames.], batch size: 26, lr: 2.13e-04 2022-05-06 21:10:46,031 INFO [train.py:715] (0/8) Epoch 10, batch 20600, loss[loss=0.1398, simple_loss=0.2127, pruned_loss=0.0335, over 4838.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2137, pruned_loss=0.03386, over 
972538.31 frames.], batch size: 15, lr: 2.13e-04 2022-05-06 21:11:25,462 INFO [train.py:715] (0/8) Epoch 10, batch 20650, loss[loss=0.1287, simple_loss=0.2097, pruned_loss=0.02384, over 4987.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2136, pruned_loss=0.0334, over 973439.58 frames.], batch size: 25, lr: 2.13e-04 2022-05-06 21:12:04,256 INFO [train.py:715] (0/8) Epoch 10, batch 20700, loss[loss=0.1249, simple_loss=0.1993, pruned_loss=0.02524, over 4795.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2134, pruned_loss=0.03292, over 973987.21 frames.], batch size: 24, lr: 2.13e-04 2022-05-06 21:12:44,598 INFO [train.py:715] (0/8) Epoch 10, batch 20750, loss[loss=0.1538, simple_loss=0.2213, pruned_loss=0.04318, over 4754.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2131, pruned_loss=0.033, over 974124.19 frames.], batch size: 19, lr: 2.13e-04 2022-05-06 21:13:24,577 INFO [train.py:715] (0/8) Epoch 10, batch 20800, loss[loss=0.1152, simple_loss=0.1951, pruned_loss=0.01762, over 4849.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2129, pruned_loss=0.0331, over 973832.29 frames.], batch size: 20, lr: 2.13e-04 2022-05-06 21:14:03,349 INFO [train.py:715] (0/8) Epoch 10, batch 20850, loss[loss=0.1655, simple_loss=0.2256, pruned_loss=0.05269, over 4841.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2126, pruned_loss=0.0332, over 973965.07 frames.], batch size: 32, lr: 2.13e-04 2022-05-06 21:14:43,292 INFO [train.py:715] (0/8) Epoch 10, batch 20900, loss[loss=0.141, simple_loss=0.2266, pruned_loss=0.0277, over 4825.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2124, pruned_loss=0.03327, over 973396.95 frames.], batch size: 13, lr: 2.13e-04 2022-05-06 21:15:23,752 INFO [train.py:715] (0/8) Epoch 10, batch 20950, loss[loss=0.1554, simple_loss=0.2373, pruned_loss=0.03671, over 4779.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2132, pruned_loss=0.03362, over 973677.45 frames.], batch size: 14, lr: 2.13e-04 2022-05-06 21:16:02,699 INFO [train.py:715] (0/8) Epoch 10, batch 21000, loss[loss=0.1489, simple_loss=0.2261, pruned_loss=0.03583, over 4992.00 frames.], tot_loss[loss=0.1398, simple_loss=0.213, pruned_loss=0.03326, over 974150.76 frames.], batch size: 15, lr: 2.13e-04 2022-05-06 21:16:02,700 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 21:16:12,203 INFO [train.py:742] (0/8) Epoch 10, validation: loss=0.1065, simple_loss=0.1909, pruned_loss=0.01111, over 914524.00 frames. 
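The learning rate printed in these entries drifts from 2.14e-04 down to 2.11e-04 over the course of the epoch. Assuming the run uses the Eden-style schedule implied by the header settings (initial_lr=0.003, lr_batches=5000, lr_epochs=4), the rate is a product of a batch-dependent and an epoch-dependent decay factor; the sketch below reproduces the order of magnitude seen here and is not the library code itself:

    # Sketch of an Eden-style schedule (assumption: this run uses icefall's Eden scheduler
    # with initial_lr=0.003, lr_batches=5000, lr_epochs=4 from the header).
    def eden_lr(batch_idx: int, epoch: float, initial_lr: float = 0.003,
                lr_batches: float = 5000.0, lr_epochs: float = 4.0) -> float:
        batch_factor = ((batch_idx ** 2 + lr_batches ** 2) / lr_batches ** 2) ** -0.25
        epoch_factor = ((epoch ** 2 + lr_epochs ** 2) / lr_epochs ** 2) ** -0.25
        return initial_lr * batch_factor * epoch_factor

    # Roughly 370k global batches in, a bit past epoch 10:
    print(eden_lr(batch_idx=370_000, epoch=10.6))  # ~2.1e-04, in line with the values logged here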
2022-05-06 21:16:51,724 INFO [train.py:715] (0/8) Epoch 10, batch 21050, loss[loss=0.1491, simple_loss=0.2248, pruned_loss=0.03664, over 4851.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2125, pruned_loss=0.03306, over 972848.90 frames.], batch size: 38, lr: 2.13e-04 2022-05-06 21:17:32,533 INFO [train.py:715] (0/8) Epoch 10, batch 21100, loss[loss=0.1441, simple_loss=0.2207, pruned_loss=0.03369, over 4777.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2128, pruned_loss=0.0331, over 973051.44 frames.], batch size: 17, lr: 2.13e-04 2022-05-06 21:18:14,009 INFO [train.py:715] (0/8) Epoch 10, batch 21150, loss[loss=0.1423, simple_loss=0.2232, pruned_loss=0.0307, over 4974.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2123, pruned_loss=0.03259, over 973086.97 frames.], batch size: 15, lr: 2.13e-04 2022-05-06 21:18:55,100 INFO [train.py:715] (0/8) Epoch 10, batch 21200, loss[loss=0.162, simple_loss=0.2437, pruned_loss=0.04012, over 4933.00 frames.], tot_loss[loss=0.139, simple_loss=0.2126, pruned_loss=0.0327, over 973306.76 frames.], batch size: 23, lr: 2.13e-04 2022-05-06 21:19:35,765 INFO [train.py:715] (0/8) Epoch 10, batch 21250, loss[loss=0.1619, simple_loss=0.2442, pruned_loss=0.0398, over 4921.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2125, pruned_loss=0.03311, over 972310.77 frames.], batch size: 18, lr: 2.13e-04 2022-05-06 21:20:17,431 INFO [train.py:715] (0/8) Epoch 10, batch 21300, loss[loss=0.1566, simple_loss=0.2342, pruned_loss=0.03954, over 4909.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2122, pruned_loss=0.03313, over 972103.07 frames.], batch size: 39, lr: 2.13e-04 2022-05-06 21:20:58,696 INFO [train.py:715] (0/8) Epoch 10, batch 21350, loss[loss=0.1383, simple_loss=0.2172, pruned_loss=0.02972, over 4789.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2124, pruned_loss=0.03326, over 972430.92 frames.], batch size: 24, lr: 2.13e-04 2022-05-06 21:21:39,101 INFO [train.py:715] (0/8) Epoch 10, batch 21400, loss[loss=0.1331, simple_loss=0.2094, pruned_loss=0.02841, over 4971.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2121, pruned_loss=0.03317, over 972714.86 frames.], batch size: 24, lr: 2.13e-04 2022-05-06 21:22:20,568 INFO [train.py:715] (0/8) Epoch 10, batch 21450, loss[loss=0.1253, simple_loss=0.1804, pruned_loss=0.03509, over 4905.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2134, pruned_loss=0.03416, over 973012.73 frames.], batch size: 17, lr: 2.13e-04 2022-05-06 21:23:02,350 INFO [train.py:715] (0/8) Epoch 10, batch 21500, loss[loss=0.1403, simple_loss=0.2041, pruned_loss=0.03825, over 4786.00 frames.], tot_loss[loss=0.14, simple_loss=0.2127, pruned_loss=0.0337, over 972316.62 frames.], batch size: 21, lr: 2.13e-04 2022-05-06 21:23:43,373 INFO [train.py:715] (0/8) Epoch 10, batch 21550, loss[loss=0.1555, simple_loss=0.2253, pruned_loss=0.04282, over 4838.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2138, pruned_loss=0.0345, over 972050.33 frames.], batch size: 30, lr: 2.13e-04 2022-05-06 21:24:24,263 INFO [train.py:715] (0/8) Epoch 10, batch 21600, loss[loss=0.1269, simple_loss=0.1938, pruned_loss=0.02999, over 4856.00 frames.], tot_loss[loss=0.1416, simple_loss=0.2141, pruned_loss=0.03449, over 972101.12 frames.], batch size: 30, lr: 2.13e-04 2022-05-06 21:25:06,206 INFO [train.py:715] (0/8) Epoch 10, batch 21650, loss[loss=0.1459, simple_loss=0.2181, pruned_loss=0.03683, over 4933.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2138, pruned_loss=0.03419, over 971871.26 frames.], batch size: 23, lr: 2.13e-04 2022-05-06 21:25:47,747 INFO 
[train.py:715] (0/8) Epoch 10, batch 21700, loss[loss=0.1611, simple_loss=0.2411, pruned_loss=0.04053, over 4915.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2131, pruned_loss=0.03367, over 972488.41 frames.], batch size: 17, lr: 2.13e-04 2022-05-06 21:26:28,009 INFO [train.py:715] (0/8) Epoch 10, batch 21750, loss[loss=0.1364, simple_loss=0.2095, pruned_loss=0.03164, over 4847.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2124, pruned_loss=0.03323, over 971794.35 frames.], batch size: 20, lr: 2.13e-04 2022-05-06 21:27:08,993 INFO [train.py:715] (0/8) Epoch 10, batch 21800, loss[loss=0.1413, simple_loss=0.209, pruned_loss=0.0368, over 4816.00 frames.], tot_loss[loss=0.139, simple_loss=0.2116, pruned_loss=0.03321, over 970611.01 frames.], batch size: 13, lr: 2.13e-04 2022-05-06 21:27:50,694 INFO [train.py:715] (0/8) Epoch 10, batch 21850, loss[loss=0.1519, simple_loss=0.2169, pruned_loss=0.04339, over 4782.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2118, pruned_loss=0.03338, over 970894.12 frames.], batch size: 18, lr: 2.13e-04 2022-05-06 21:28:31,163 INFO [train.py:715] (0/8) Epoch 10, batch 21900, loss[loss=0.132, simple_loss=0.211, pruned_loss=0.02647, over 4841.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2109, pruned_loss=0.03305, over 971406.87 frames.], batch size: 30, lr: 2.13e-04 2022-05-06 21:29:11,913 INFO [train.py:715] (0/8) Epoch 10, batch 21950, loss[loss=0.1291, simple_loss=0.1985, pruned_loss=0.02984, over 4756.00 frames.], tot_loss[loss=0.139, simple_loss=0.2113, pruned_loss=0.0333, over 971210.71 frames.], batch size: 19, lr: 2.13e-04 2022-05-06 21:29:53,129 INFO [train.py:715] (0/8) Epoch 10, batch 22000, loss[loss=0.1655, simple_loss=0.2489, pruned_loss=0.04108, over 4815.00 frames.], tot_loss[loss=0.1396, simple_loss=0.212, pruned_loss=0.03364, over 972218.47 frames.], batch size: 25, lr: 2.12e-04 2022-05-06 21:30:33,465 INFO [train.py:715] (0/8) Epoch 10, batch 22050, loss[loss=0.1602, simple_loss=0.2301, pruned_loss=0.04514, over 4829.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2123, pruned_loss=0.03332, over 973036.21 frames.], batch size: 25, lr: 2.12e-04 2022-05-06 21:31:14,088 INFO [train.py:715] (0/8) Epoch 10, batch 22100, loss[loss=0.1369, simple_loss=0.2061, pruned_loss=0.03385, over 4801.00 frames.], tot_loss[loss=0.1389, simple_loss=0.212, pruned_loss=0.03292, over 972867.81 frames.], batch size: 21, lr: 2.12e-04 2022-05-06 21:31:54,934 INFO [train.py:715] (0/8) Epoch 10, batch 22150, loss[loss=0.1611, simple_loss=0.2314, pruned_loss=0.0454, over 4759.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2121, pruned_loss=0.03285, over 973061.19 frames.], batch size: 16, lr: 2.12e-04 2022-05-06 21:32:35,990 INFO [train.py:715] (0/8) Epoch 10, batch 22200, loss[loss=0.1625, simple_loss=0.2299, pruned_loss=0.04753, over 4848.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2126, pruned_loss=0.03308, over 973824.78 frames.], batch size: 30, lr: 2.12e-04 2022-05-06 21:33:16,084 INFO [train.py:715] (0/8) Epoch 10, batch 22250, loss[loss=0.1554, simple_loss=0.2355, pruned_loss=0.03765, over 4824.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2127, pruned_loss=0.03296, over 973808.94 frames.], batch size: 15, lr: 2.12e-04 2022-05-06 21:33:56,745 INFO [train.py:715] (0/8) Epoch 10, batch 22300, loss[loss=0.1262, simple_loss=0.1939, pruned_loss=0.02923, over 4910.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2129, pruned_loss=0.03334, over 973278.95 frames.], batch size: 17, lr: 2.12e-04 2022-05-06 21:34:37,774 INFO [train.py:715] (0/8) Epoch 
10, batch 22350, loss[loss=0.1294, simple_loss=0.1974, pruned_loss=0.03069, over 4849.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2119, pruned_loss=0.0328, over 972890.23 frames.], batch size: 34, lr: 2.12e-04 2022-05-06 21:35:17,620 INFO [train.py:715] (0/8) Epoch 10, batch 22400, loss[loss=0.1209, simple_loss=0.2007, pruned_loss=0.02056, over 4983.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2116, pruned_loss=0.03258, over 972977.96 frames.], batch size: 25, lr: 2.12e-04 2022-05-06 21:35:56,787 INFO [train.py:715] (0/8) Epoch 10, batch 22450, loss[loss=0.1437, simple_loss=0.2136, pruned_loss=0.03688, over 4783.00 frames.], tot_loss[loss=0.139, simple_loss=0.2122, pruned_loss=0.03293, over 973502.81 frames.], batch size: 17, lr: 2.12e-04 2022-05-06 21:36:36,728 INFO [train.py:715] (0/8) Epoch 10, batch 22500, loss[loss=0.1111, simple_loss=0.1925, pruned_loss=0.01484, over 4982.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2118, pruned_loss=0.03274, over 972985.96 frames.], batch size: 28, lr: 2.12e-04 2022-05-06 21:37:17,612 INFO [train.py:715] (0/8) Epoch 10, batch 22550, loss[loss=0.1256, simple_loss=0.2125, pruned_loss=0.01933, over 4810.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2116, pruned_loss=0.03286, over 973059.20 frames.], batch size: 26, lr: 2.12e-04 2022-05-06 21:37:56,430 INFO [train.py:715] (0/8) Epoch 10, batch 22600, loss[loss=0.1566, simple_loss=0.2252, pruned_loss=0.04399, over 4842.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2125, pruned_loss=0.03347, over 972433.49 frames.], batch size: 30, lr: 2.12e-04 2022-05-06 21:38:37,518 INFO [train.py:715] (0/8) Epoch 10, batch 22650, loss[loss=0.1328, simple_loss=0.2141, pruned_loss=0.02571, over 4813.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2119, pruned_loss=0.03319, over 972867.01 frames.], batch size: 26, lr: 2.12e-04 2022-05-06 21:39:19,371 INFO [train.py:715] (0/8) Epoch 10, batch 22700, loss[loss=0.1576, simple_loss=0.2183, pruned_loss=0.04841, over 4973.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2126, pruned_loss=0.03357, over 972627.31 frames.], batch size: 15, lr: 2.12e-04 2022-05-06 21:40:00,102 INFO [train.py:715] (0/8) Epoch 10, batch 22750, loss[loss=0.1252, simple_loss=0.202, pruned_loss=0.02415, over 4699.00 frames.], tot_loss[loss=0.1403, simple_loss=0.213, pruned_loss=0.0338, over 971824.68 frames.], batch size: 15, lr: 2.12e-04 2022-05-06 21:40:41,328 INFO [train.py:715] (0/8) Epoch 10, batch 22800, loss[loss=0.1426, simple_loss=0.2087, pruned_loss=0.03822, over 4833.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2121, pruned_loss=0.03364, over 972329.75 frames.], batch size: 15, lr: 2.12e-04 2022-05-06 21:41:22,882 INFO [train.py:715] (0/8) Epoch 10, batch 22850, loss[loss=0.1296, simple_loss=0.2043, pruned_loss=0.02749, over 4788.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2116, pruned_loss=0.03343, over 971930.93 frames.], batch size: 17, lr: 2.12e-04 2022-05-06 21:42:04,585 INFO [train.py:715] (0/8) Epoch 10, batch 22900, loss[loss=0.124, simple_loss=0.1927, pruned_loss=0.02762, over 4811.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2125, pruned_loss=0.0336, over 972504.30 frames.], batch size: 27, lr: 2.12e-04 2022-05-06 21:42:45,057 INFO [train.py:715] (0/8) Epoch 10, batch 22950, loss[loss=0.1434, simple_loss=0.2332, pruned_loss=0.02678, over 4938.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2132, pruned_loss=0.034, over 971964.49 frames.], batch size: 21, lr: 2.12e-04 2022-05-06 21:43:27,081 INFO [train.py:715] (0/8) Epoch 10, batch 23000, 
loss[loss=0.1358, simple_loss=0.2024, pruned_loss=0.03454, over 4822.00 frames.], tot_loss[loss=0.141, simple_loss=0.2134, pruned_loss=0.03431, over 972665.86 frames.], batch size: 13, lr: 2.12e-04 2022-05-06 21:44:09,140 INFO [train.py:715] (0/8) Epoch 10, batch 23050, loss[loss=0.1624, simple_loss=0.2428, pruned_loss=0.04099, over 4876.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2133, pruned_loss=0.03414, over 972237.56 frames.], batch size: 22, lr: 2.12e-04 2022-05-06 21:44:49,661 INFO [train.py:715] (0/8) Epoch 10, batch 23100, loss[loss=0.1228, simple_loss=0.1944, pruned_loss=0.02558, over 4752.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2127, pruned_loss=0.03356, over 971582.11 frames.], batch size: 19, lr: 2.12e-04 2022-05-06 21:45:30,871 INFO [train.py:715] (0/8) Epoch 10, batch 23150, loss[loss=0.1442, simple_loss=0.2194, pruned_loss=0.03455, over 4960.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2126, pruned_loss=0.03348, over 972029.38 frames.], batch size: 24, lr: 2.12e-04 2022-05-06 21:46:12,885 INFO [train.py:715] (0/8) Epoch 10, batch 23200, loss[loss=0.1413, simple_loss=0.2165, pruned_loss=0.03304, over 4830.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2128, pruned_loss=0.03347, over 973153.82 frames.], batch size: 15, lr: 2.12e-04 2022-05-06 21:46:54,163 INFO [train.py:715] (0/8) Epoch 10, batch 23250, loss[loss=0.1427, simple_loss=0.2224, pruned_loss=0.03148, over 4948.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2134, pruned_loss=0.03363, over 973098.56 frames.], batch size: 21, lr: 2.12e-04 2022-05-06 21:47:34,834 INFO [train.py:715] (0/8) Epoch 10, batch 23300, loss[loss=0.1183, simple_loss=0.1863, pruned_loss=0.02514, over 4899.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2137, pruned_loss=0.0339, over 973011.59 frames.], batch size: 18, lr: 2.12e-04 2022-05-06 21:48:16,727 INFO [train.py:715] (0/8) Epoch 10, batch 23350, loss[loss=0.1528, simple_loss=0.2104, pruned_loss=0.04754, over 4855.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2134, pruned_loss=0.0337, over 973249.62 frames.], batch size: 20, lr: 2.12e-04 2022-05-06 21:48:58,866 INFO [train.py:715] (0/8) Epoch 10, batch 23400, loss[loss=0.1324, simple_loss=0.2044, pruned_loss=0.03016, over 4932.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2128, pruned_loss=0.03333, over 973335.86 frames.], batch size: 29, lr: 2.12e-04 2022-05-06 21:49:39,775 INFO [train.py:715] (0/8) Epoch 10, batch 23450, loss[loss=0.1273, simple_loss=0.2067, pruned_loss=0.02396, over 4977.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2125, pruned_loss=0.03285, over 973576.85 frames.], batch size: 24, lr: 2.12e-04 2022-05-06 21:50:20,134 INFO [train.py:715] (0/8) Epoch 10, batch 23500, loss[loss=0.1582, simple_loss=0.2336, pruned_loss=0.04141, over 4753.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2133, pruned_loss=0.03348, over 972562.73 frames.], batch size: 19, lr: 2.12e-04 2022-05-06 21:51:02,210 INFO [train.py:715] (0/8) Epoch 10, batch 23550, loss[loss=0.1431, simple_loss=0.22, pruned_loss=0.03311, over 4878.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2126, pruned_loss=0.03309, over 972954.06 frames.], batch size: 22, lr: 2.12e-04 2022-05-06 21:51:43,367 INFO [train.py:715] (0/8) Epoch 10, batch 23600, loss[loss=0.1178, simple_loss=0.1947, pruned_loss=0.02046, over 4970.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2126, pruned_loss=0.03343, over 972247.97 frames.], batch size: 24, lr: 2.12e-04 2022-05-06 21:52:23,131 INFO [train.py:715] (0/8) Epoch 10, batch 23650, loss[loss=0.1275, 
simple_loss=0.1994, pruned_loss=0.02778, over 4788.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2128, pruned_loss=0.03348, over 972873.56 frames.], batch size: 21, lr: 2.12e-04 2022-05-06 21:53:03,642 INFO [train.py:715] (0/8) Epoch 10, batch 23700, loss[loss=0.1352, simple_loss=0.2138, pruned_loss=0.0283, over 4780.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2131, pruned_loss=0.03358, over 972433.62 frames.], batch size: 18, lr: 2.12e-04 2022-05-06 21:53:44,220 INFO [train.py:715] (0/8) Epoch 10, batch 23750, loss[loss=0.1333, simple_loss=0.2127, pruned_loss=0.02695, over 4914.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2127, pruned_loss=0.03333, over 971977.18 frames.], batch size: 23, lr: 2.12e-04 2022-05-06 21:54:24,359 INFO [train.py:715] (0/8) Epoch 10, batch 23800, loss[loss=0.1499, simple_loss=0.2238, pruned_loss=0.03798, over 4759.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2121, pruned_loss=0.03292, over 971907.57 frames.], batch size: 16, lr: 2.12e-04 2022-05-06 21:55:04,949 INFO [train.py:715] (0/8) Epoch 10, batch 23850, loss[loss=0.1324, simple_loss=0.2109, pruned_loss=0.0269, over 4751.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2126, pruned_loss=0.0331, over 971800.49 frames.], batch size: 16, lr: 2.12e-04 2022-05-06 21:55:46,218 INFO [train.py:715] (0/8) Epoch 10, batch 23900, loss[loss=0.1515, simple_loss=0.2177, pruned_loss=0.04264, over 4849.00 frames.], tot_loss[loss=0.1399, simple_loss=0.213, pruned_loss=0.03337, over 970925.80 frames.], batch size: 32, lr: 2.12e-04 2022-05-06 21:56:25,836 INFO [train.py:715] (0/8) Epoch 10, batch 23950, loss[loss=0.1351, simple_loss=0.2139, pruned_loss=0.02812, over 4823.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2119, pruned_loss=0.03268, over 972115.39 frames.], batch size: 13, lr: 2.12e-04 2022-05-06 21:57:06,216 INFO [train.py:715] (0/8) Epoch 10, batch 24000, loss[loss=0.1502, simple_loss=0.2259, pruned_loss=0.03721, over 4816.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2125, pruned_loss=0.03283, over 972471.39 frames.], batch size: 26, lr: 2.12e-04 2022-05-06 21:57:06,217 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 21:57:15,893 INFO [train.py:742] (0/8) Epoch 10, validation: loss=0.1061, simple_loss=0.1905, pruned_loss=0.01087, over 914524.00 frames. 
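Within this epoch, validation runs at batches 18000, 21000, 24000 and 27000, and batch checkpoints such as checkpoint-368000.pt land under pruned_transducer_stateless2/exp/v2. Assuming the cadence follows valid_interval=3000 and save_every_n=8000 from the header, with checkpoints named after the global batch counter, the bookkeeping amounts to something like the following sketch (helper names are illustrative):

    # Illustrative cadence check (assumption): validation every `valid_interval` batches,
    # a checkpoint every `save_every_n` global batches, named after the global batch index.
    VALID_INTERVAL = 3000   # from the header
    SAVE_EVERY_N = 8000     # from the header

    def should_validate(batch_idx: int) -> bool:
        return batch_idx > 0 and batch_idx % VALID_INTERVAL == 0

    def should_save(batch_idx_train: int) -> bool:
        return batch_idx_train > 0 and batch_idx_train % SAVE_EVERY_N == 0

    def checkpoint_path(batch_idx_train: int) -> str:
        return f"pruned_transducer_stateless2/exp/v2/checkpoint-{batch_idx_train}.pt"

    print(should_validate(24000))    # True: matches the validation entry just above
    print(checkpoint_path(368000))   # .../checkpoint-368000.pt, as saved earlier in this log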
2022-05-06 21:57:55,800 INFO [train.py:715] (0/8) Epoch 10, batch 24050, loss[loss=0.1307, simple_loss=0.2007, pruned_loss=0.03036, over 4943.00 frames.], tot_loss[loss=0.139, simple_loss=0.2121, pruned_loss=0.03297, over 972377.37 frames.], batch size: 21, lr: 2.12e-04 2022-05-06 21:58:36,845 INFO [train.py:715] (0/8) Epoch 10, batch 24100, loss[loss=0.1232, simple_loss=0.1938, pruned_loss=0.02629, over 4771.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2117, pruned_loss=0.033, over 973235.93 frames.], batch size: 19, lr: 2.12e-04 2022-05-06 21:59:18,107 INFO [train.py:715] (0/8) Epoch 10, batch 24150, loss[loss=0.1117, simple_loss=0.182, pruned_loss=0.0207, over 4971.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2122, pruned_loss=0.03318, over 972565.11 frames.], batch size: 15, lr: 2.12e-04 2022-05-06 21:59:57,434 INFO [train.py:715] (0/8) Epoch 10, batch 24200, loss[loss=0.1133, simple_loss=0.1895, pruned_loss=0.01855, over 4825.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2122, pruned_loss=0.03331, over 972050.80 frames.], batch size: 25, lr: 2.12e-04 2022-05-06 22:00:38,183 INFO [train.py:715] (0/8) Epoch 10, batch 24250, loss[loss=0.1331, simple_loss=0.2035, pruned_loss=0.03132, over 4826.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2119, pruned_loss=0.03279, over 972051.35 frames.], batch size: 25, lr: 2.12e-04 2022-05-06 22:01:19,294 INFO [train.py:715] (0/8) Epoch 10, batch 24300, loss[loss=0.126, simple_loss=0.2, pruned_loss=0.02606, over 4877.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2122, pruned_loss=0.03297, over 972074.46 frames.], batch size: 13, lr: 2.12e-04 2022-05-06 22:01:59,413 INFO [train.py:715] (0/8) Epoch 10, batch 24350, loss[loss=0.1051, simple_loss=0.182, pruned_loss=0.01405, over 4833.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2127, pruned_loss=0.03298, over 971881.92 frames.], batch size: 15, lr: 2.12e-04 2022-05-06 22:02:39,456 INFO [train.py:715] (0/8) Epoch 10, batch 24400, loss[loss=0.1762, simple_loss=0.237, pruned_loss=0.05769, over 4961.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2132, pruned_loss=0.0335, over 972941.88 frames.], batch size: 39, lr: 2.12e-04 2022-05-06 22:03:20,171 INFO [train.py:715] (0/8) Epoch 10, batch 24450, loss[loss=0.1503, simple_loss=0.2112, pruned_loss=0.04471, over 4923.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2132, pruned_loss=0.03323, over 973007.36 frames.], batch size: 23, lr: 2.12e-04 2022-05-06 22:04:01,137 INFO [train.py:715] (0/8) Epoch 10, batch 24500, loss[loss=0.1416, simple_loss=0.2118, pruned_loss=0.03563, over 4824.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2118, pruned_loss=0.03277, over 973551.39 frames.], batch size: 26, lr: 2.12e-04 2022-05-06 22:04:40,219 INFO [train.py:715] (0/8) Epoch 10, batch 24550, loss[loss=0.1668, simple_loss=0.2371, pruned_loss=0.0483, over 4915.00 frames.], tot_loss[loss=0.1388, simple_loss=0.212, pruned_loss=0.03286, over 972985.38 frames.], batch size: 18, lr: 2.12e-04 2022-05-06 22:05:20,204 INFO [train.py:715] (0/8) Epoch 10, batch 24600, loss[loss=0.1091, simple_loss=0.1796, pruned_loss=0.01924, over 4955.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2109, pruned_loss=0.0325, over 972968.87 frames.], batch size: 21, lr: 2.12e-04 2022-05-06 22:06:00,573 INFO [train.py:715] (0/8) Epoch 10, batch 24650, loss[loss=0.1471, simple_loss=0.2234, pruned_loss=0.03539, over 4820.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2113, pruned_loss=0.03295, over 973065.23 frames.], batch size: 26, lr: 2.12e-04 2022-05-06 22:06:39,584 INFO 
[train.py:715] (0/8) Epoch 10, batch 24700, loss[loss=0.1511, simple_loss=0.2137, pruned_loss=0.04425, over 4841.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2124, pruned_loss=0.03356, over 973393.87 frames.], batch size: 30, lr: 2.12e-04 2022-05-06 22:07:18,185 INFO [train.py:715] (0/8) Epoch 10, batch 24750, loss[loss=0.1321, simple_loss=0.2032, pruned_loss=0.03055, over 4923.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2118, pruned_loss=0.03347, over 972847.98 frames.], batch size: 39, lr: 2.12e-04 2022-05-06 22:07:57,673 INFO [train.py:715] (0/8) Epoch 10, batch 24800, loss[loss=0.1602, simple_loss=0.2279, pruned_loss=0.04628, over 4805.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2114, pruned_loss=0.03298, over 972416.62 frames.], batch size: 21, lr: 2.12e-04 2022-05-06 22:08:36,823 INFO [train.py:715] (0/8) Epoch 10, batch 24850, loss[loss=0.1504, simple_loss=0.216, pruned_loss=0.04241, over 4977.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2114, pruned_loss=0.03302, over 972309.77 frames.], batch size: 35, lr: 2.12e-04 2022-05-06 22:09:14,897 INFO [train.py:715] (0/8) Epoch 10, batch 24900, loss[loss=0.1393, simple_loss=0.2067, pruned_loss=0.0359, over 4972.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2113, pruned_loss=0.03287, over 972310.79 frames.], batch size: 25, lr: 2.12e-04 2022-05-06 22:09:54,522 INFO [train.py:715] (0/8) Epoch 10, batch 24950, loss[loss=0.1209, simple_loss=0.1904, pruned_loss=0.02572, over 4917.00 frames.], tot_loss[loss=0.139, simple_loss=0.2117, pruned_loss=0.03311, over 972411.27 frames.], batch size: 23, lr: 2.12e-04 2022-05-06 22:10:34,375 INFO [train.py:715] (0/8) Epoch 10, batch 25000, loss[loss=0.1485, simple_loss=0.2237, pruned_loss=0.0367, over 4956.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2131, pruned_loss=0.03337, over 971158.44 frames.], batch size: 21, lr: 2.12e-04 2022-05-06 22:11:13,227 INFO [train.py:715] (0/8) Epoch 10, batch 25050, loss[loss=0.1668, simple_loss=0.2298, pruned_loss=0.05189, over 4891.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2135, pruned_loss=0.03359, over 971049.99 frames.], batch size: 19, lr: 2.12e-04 2022-05-06 22:11:52,694 INFO [train.py:715] (0/8) Epoch 10, batch 25100, loss[loss=0.1349, simple_loss=0.2139, pruned_loss=0.02792, over 4928.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2124, pruned_loss=0.03286, over 971343.61 frames.], batch size: 18, lr: 2.12e-04 2022-05-06 22:12:32,718 INFO [train.py:715] (0/8) Epoch 10, batch 25150, loss[loss=0.1464, simple_loss=0.2186, pruned_loss=0.03706, over 4747.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2124, pruned_loss=0.0331, over 971867.13 frames.], batch size: 16, lr: 2.12e-04 2022-05-06 22:13:12,208 INFO [train.py:715] (0/8) Epoch 10, batch 25200, loss[loss=0.1327, simple_loss=0.194, pruned_loss=0.03569, over 4695.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2123, pruned_loss=0.03354, over 972019.39 frames.], batch size: 15, lr: 2.12e-04 2022-05-06 22:13:50,341 INFO [train.py:715] (0/8) Epoch 10, batch 25250, loss[loss=0.1171, simple_loss=0.1839, pruned_loss=0.0251, over 4986.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2117, pruned_loss=0.0333, over 971090.96 frames.], batch size: 14, lr: 2.12e-04 2022-05-06 22:14:29,215 INFO [train.py:715] (0/8) Epoch 10, batch 25300, loss[loss=0.1471, simple_loss=0.2241, pruned_loss=0.03508, over 4756.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2117, pruned_loss=0.03324, over 971451.96 frames.], batch size: 19, lr: 2.12e-04 2022-05-06 22:15:08,861 INFO [train.py:715] (0/8) Epoch 
10, batch 25350, loss[loss=0.2042, simple_loss=0.2734, pruned_loss=0.06746, over 4946.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2114, pruned_loss=0.03318, over 972293.89 frames.], batch size: 39, lr: 2.12e-04 2022-05-06 22:15:47,379 INFO [train.py:715] (0/8) Epoch 10, batch 25400, loss[loss=0.1669, simple_loss=0.2374, pruned_loss=0.0482, over 4979.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2116, pruned_loss=0.0334, over 971845.27 frames.], batch size: 15, lr: 2.12e-04 2022-05-06 22:16:26,237 INFO [train.py:715] (0/8) Epoch 10, batch 25450, loss[loss=0.1373, simple_loss=0.2099, pruned_loss=0.03239, over 4947.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2122, pruned_loss=0.03397, over 971744.90 frames.], batch size: 29, lr: 2.12e-04 2022-05-06 22:17:06,157 INFO [train.py:715] (0/8) Epoch 10, batch 25500, loss[loss=0.1125, simple_loss=0.1763, pruned_loss=0.02433, over 4773.00 frames.], tot_loss[loss=0.1396, simple_loss=0.212, pruned_loss=0.03355, over 970620.94 frames.], batch size: 12, lr: 2.11e-04 2022-05-06 22:17:45,978 INFO [train.py:715] (0/8) Epoch 10, batch 25550, loss[loss=0.1078, simple_loss=0.1826, pruned_loss=0.01653, over 4937.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2123, pruned_loss=0.03346, over 971285.30 frames.], batch size: 18, lr: 2.11e-04 2022-05-06 22:18:24,956 INFO [train.py:715] (0/8) Epoch 10, batch 25600, loss[loss=0.1738, simple_loss=0.242, pruned_loss=0.05283, over 4745.00 frames.], tot_loss[loss=0.139, simple_loss=0.2122, pruned_loss=0.03291, over 971073.47 frames.], batch size: 16, lr: 2.11e-04 2022-05-06 22:19:05,113 INFO [train.py:715] (0/8) Epoch 10, batch 25650, loss[loss=0.09874, simple_loss=0.1644, pruned_loss=0.01653, over 4753.00 frames.], tot_loss[loss=0.139, simple_loss=0.212, pruned_loss=0.03298, over 970823.35 frames.], batch size: 12, lr: 2.11e-04 2022-05-06 22:19:45,489 INFO [train.py:715] (0/8) Epoch 10, batch 25700, loss[loss=0.1552, simple_loss=0.2246, pruned_loss=0.04289, over 4782.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2127, pruned_loss=0.03297, over 971003.87 frames.], batch size: 17, lr: 2.11e-04 2022-05-06 22:20:25,350 INFO [train.py:715] (0/8) Epoch 10, batch 25750, loss[loss=0.181, simple_loss=0.2523, pruned_loss=0.05488, over 4760.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2129, pruned_loss=0.03282, over 971393.43 frames.], batch size: 16, lr: 2.11e-04 2022-05-06 22:21:04,754 INFO [train.py:715] (0/8) Epoch 10, batch 25800, loss[loss=0.1116, simple_loss=0.1836, pruned_loss=0.01985, over 4792.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2119, pruned_loss=0.03259, over 971373.87 frames.], batch size: 21, lr: 2.11e-04 2022-05-06 22:21:45,291 INFO [train.py:715] (0/8) Epoch 10, batch 25850, loss[loss=0.1536, simple_loss=0.2298, pruned_loss=0.03865, over 4835.00 frames.], tot_loss[loss=0.1399, simple_loss=0.213, pruned_loss=0.0334, over 971900.31 frames.], batch size: 15, lr: 2.11e-04 2022-05-06 22:22:25,224 INFO [train.py:715] (0/8) Epoch 10, batch 25900, loss[loss=0.1515, simple_loss=0.2095, pruned_loss=0.04671, over 4905.00 frames.], tot_loss[loss=0.1398, simple_loss=0.213, pruned_loss=0.03334, over 971466.68 frames.], batch size: 19, lr: 2.11e-04 2022-05-06 22:23:03,942 INFO [train.py:715] (0/8) Epoch 10, batch 25950, loss[loss=0.1344, simple_loss=0.214, pruned_loss=0.02739, over 4714.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2129, pruned_loss=0.03367, over 971323.83 frames.], batch size: 15, lr: 2.11e-04 2022-05-06 22:23:42,712 INFO [train.py:715] (0/8) Epoch 10, batch 26000, 
loss[loss=0.1318, simple_loss=0.2038, pruned_loss=0.02996, over 4927.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2127, pruned_loss=0.03353, over 970920.33 frames.], batch size: 23, lr: 2.11e-04 2022-05-06 22:24:21,988 INFO [train.py:715] (0/8) Epoch 10, batch 26050, loss[loss=0.1503, simple_loss=0.2217, pruned_loss=0.03945, over 4952.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2131, pruned_loss=0.03335, over 972296.48 frames.], batch size: 24, lr: 2.11e-04 2022-05-06 22:25:00,973 INFO [train.py:715] (0/8) Epoch 10, batch 26100, loss[loss=0.1114, simple_loss=0.1929, pruned_loss=0.01494, over 4979.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2122, pruned_loss=0.03319, over 972023.74 frames.], batch size: 25, lr: 2.11e-04 2022-05-06 22:25:40,336 INFO [train.py:715] (0/8) Epoch 10, batch 26150, loss[loss=0.1145, simple_loss=0.1891, pruned_loss=0.01995, over 4770.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2122, pruned_loss=0.03357, over 971110.44 frames.], batch size: 12, lr: 2.11e-04 2022-05-06 22:26:21,095 INFO [train.py:715] (0/8) Epoch 10, batch 26200, loss[loss=0.1363, simple_loss=0.2158, pruned_loss=0.02835, over 4919.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2122, pruned_loss=0.03357, over 971613.76 frames.], batch size: 23, lr: 2.11e-04 2022-05-06 22:27:00,351 INFO [train.py:715] (0/8) Epoch 10, batch 26250, loss[loss=0.1313, simple_loss=0.2084, pruned_loss=0.02709, over 4980.00 frames.], tot_loss[loss=0.14, simple_loss=0.2126, pruned_loss=0.03371, over 972240.24 frames.], batch size: 27, lr: 2.11e-04 2022-05-06 22:27:39,998 INFO [train.py:715] (0/8) Epoch 10, batch 26300, loss[loss=0.162, simple_loss=0.2321, pruned_loss=0.04595, over 4906.00 frames.], tot_loss[loss=0.14, simple_loss=0.2128, pruned_loss=0.0336, over 972363.24 frames.], batch size: 17, lr: 2.11e-04 2022-05-06 22:28:19,565 INFO [train.py:715] (0/8) Epoch 10, batch 26350, loss[loss=0.1192, simple_loss=0.1979, pruned_loss=0.02021, over 4815.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2133, pruned_loss=0.03371, over 971769.45 frames.], batch size: 25, lr: 2.11e-04 2022-05-06 22:28:59,165 INFO [train.py:715] (0/8) Epoch 10, batch 26400, loss[loss=0.1476, simple_loss=0.2288, pruned_loss=0.03315, over 4752.00 frames.], tot_loss[loss=0.14, simple_loss=0.2131, pruned_loss=0.03347, over 971773.64 frames.], batch size: 19, lr: 2.11e-04 2022-05-06 22:29:38,872 INFO [train.py:715] (0/8) Epoch 10, batch 26450, loss[loss=0.1576, simple_loss=0.2326, pruned_loss=0.04127, over 4813.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2129, pruned_loss=0.03339, over 971859.30 frames.], batch size: 21, lr: 2.11e-04 2022-05-06 22:30:18,682 INFO [train.py:715] (0/8) Epoch 10, batch 26500, loss[loss=0.1567, simple_loss=0.2315, pruned_loss=0.0409, over 4967.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2129, pruned_loss=0.03319, over 971834.63 frames.], batch size: 31, lr: 2.11e-04 2022-05-06 22:30:59,097 INFO [train.py:715] (0/8) Epoch 10, batch 26550, loss[loss=0.1111, simple_loss=0.1856, pruned_loss=0.01825, over 4826.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2129, pruned_loss=0.03337, over 971990.05 frames.], batch size: 27, lr: 2.11e-04 2022-05-06 22:31:37,638 INFO [train.py:715] (0/8) Epoch 10, batch 26600, loss[loss=0.1304, simple_loss=0.2107, pruned_loss=0.02503, over 4968.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2138, pruned_loss=0.03345, over 972919.14 frames.], batch size: 28, lr: 2.11e-04 2022-05-06 22:32:17,164 INFO [train.py:715] (0/8) Epoch 10, batch 26650, loss[loss=0.1623, 
simple_loss=0.232, pruned_loss=0.04629, over 4963.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2136, pruned_loss=0.03375, over 972262.09 frames.], batch size: 15, lr: 2.11e-04 2022-05-06 22:32:56,672 INFO [train.py:715] (0/8) Epoch 10, batch 26700, loss[loss=0.1295, simple_loss=0.1955, pruned_loss=0.03171, over 4921.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2143, pruned_loss=0.0341, over 972407.83 frames.], batch size: 23, lr: 2.11e-04 2022-05-06 22:33:36,175 INFO [train.py:715] (0/8) Epoch 10, batch 26750, loss[loss=0.1239, simple_loss=0.2022, pruned_loss=0.02282, over 4933.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2144, pruned_loss=0.03418, over 972936.41 frames.], batch size: 18, lr: 2.11e-04 2022-05-06 22:34:14,863 INFO [train.py:715] (0/8) Epoch 10, batch 26800, loss[loss=0.147, simple_loss=0.2089, pruned_loss=0.04251, over 4709.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2142, pruned_loss=0.03399, over 973249.22 frames.], batch size: 15, lr: 2.11e-04 2022-05-06 22:34:54,618 INFO [train.py:715] (0/8) Epoch 10, batch 26850, loss[loss=0.1289, simple_loss=0.1984, pruned_loss=0.02966, over 4886.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2134, pruned_loss=0.03345, over 972776.70 frames.], batch size: 16, lr: 2.11e-04 2022-05-06 22:35:34,128 INFO [train.py:715] (0/8) Epoch 10, batch 26900, loss[loss=0.16, simple_loss=0.2372, pruned_loss=0.04143, over 4980.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2129, pruned_loss=0.03333, over 972511.73 frames.], batch size: 14, lr: 2.11e-04 2022-05-06 22:36:12,940 INFO [train.py:715] (0/8) Epoch 10, batch 26950, loss[loss=0.1236, simple_loss=0.2001, pruned_loss=0.02353, over 4934.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2132, pruned_loss=0.0335, over 973041.69 frames.], batch size: 23, lr: 2.11e-04 2022-05-06 22:36:51,892 INFO [train.py:715] (0/8) Epoch 10, batch 27000, loss[loss=0.1596, simple_loss=0.2388, pruned_loss=0.04015, over 4977.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2132, pruned_loss=0.03394, over 973129.72 frames.], batch size: 15, lr: 2.11e-04 2022-05-06 22:36:51,893 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 22:37:01,643 INFO [train.py:742] (0/8) Epoch 10, validation: loss=0.1063, simple_loss=0.1906, pruned_loss=0.01104, over 914524.00 frames. 
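For post-processing, the tot_loss and lr fields of each [train.py:715] entry can be pulled out of this log with a small parser. The pattern below relies only on the entry format visible here and tolerates entries that wrap across lines; parse_train_log is a hypothetical helper, not part of icefall:

    import re

    # Hypothetical helper: extract (epoch, batch, tot_loss, lr) from entries shaped like
    # "Epoch 10, batch 27000, loss[...], tot_loss[loss=0.1405, ...], batch size: 15, lr: 2.11e-04".
    ENTRY = re.compile(
        r"Epoch\s+(\d+),\s+batch\s+(\d+),.*?tot_loss\[loss=([\d.]+),.*?lr:\s+([\d.e-]+)",
        re.S,
    )

    def parse_train_log(text: str):
        for epoch, batch, tot_loss, lr in ENTRY.findall(text):
            yield int(epoch), int(batch), float(tot_loss), float(lr)

    # Example usage: collect points for a loss/lr curve.
    # points = list(parse_train_log(open("train-log.txt").read()))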
2022-05-06 22:37:41,041 INFO [train.py:715] (0/8) Epoch 10, batch 27050, loss[loss=0.1502, simple_loss=0.217, pruned_loss=0.0417, over 4969.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2138, pruned_loss=0.03449, over 973665.08 frames.], batch size: 35, lr: 2.11e-04 2022-05-06 22:38:20,999 INFO [train.py:715] (0/8) Epoch 10, batch 27100, loss[loss=0.1066, simple_loss=0.1813, pruned_loss=0.01593, over 4805.00 frames.], tot_loss[loss=0.1405, simple_loss=0.213, pruned_loss=0.03399, over 973372.36 frames.], batch size: 25, lr: 2.11e-04 2022-05-06 22:38:59,614 INFO [train.py:715] (0/8) Epoch 10, batch 27150, loss[loss=0.1525, simple_loss=0.2116, pruned_loss=0.04664, over 4948.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2139, pruned_loss=0.03418, over 973170.30 frames.], batch size: 39, lr: 2.11e-04 2022-05-06 22:39:38,788 INFO [train.py:715] (0/8) Epoch 10, batch 27200, loss[loss=0.1311, simple_loss=0.2175, pruned_loss=0.02235, over 4810.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2132, pruned_loss=0.03351, over 973067.87 frames.], batch size: 21, lr: 2.11e-04 2022-05-06 22:40:18,810 INFO [train.py:715] (0/8) Epoch 10, batch 27250, loss[loss=0.1299, simple_loss=0.2033, pruned_loss=0.02828, over 4978.00 frames.], tot_loss[loss=0.14, simple_loss=0.2129, pruned_loss=0.03359, over 973240.62 frames.], batch size: 27, lr: 2.11e-04 2022-05-06 22:40:58,231 INFO [train.py:715] (0/8) Epoch 10, batch 27300, loss[loss=0.1512, simple_loss=0.2182, pruned_loss=0.04214, over 4840.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2129, pruned_loss=0.03364, over 973160.58 frames.], batch size: 20, lr: 2.11e-04 2022-05-06 22:41:36,432 INFO [train.py:715] (0/8) Epoch 10, batch 27350, loss[loss=0.1483, simple_loss=0.2215, pruned_loss=0.03754, over 4750.00 frames.], tot_loss[loss=0.14, simple_loss=0.213, pruned_loss=0.03347, over 973372.93 frames.], batch size: 16, lr: 2.11e-04 2022-05-06 22:42:15,730 INFO [train.py:715] (0/8) Epoch 10, batch 27400, loss[loss=0.1586, simple_loss=0.2401, pruned_loss=0.03852, over 4800.00 frames.], tot_loss[loss=0.14, simple_loss=0.213, pruned_loss=0.03352, over 973921.98 frames.], batch size: 21, lr: 2.11e-04 2022-05-06 22:42:55,893 INFO [train.py:715] (0/8) Epoch 10, batch 27450, loss[loss=0.1681, simple_loss=0.2312, pruned_loss=0.05248, over 4855.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2124, pruned_loss=0.03339, over 973878.80 frames.], batch size: 20, lr: 2.11e-04 2022-05-06 22:43:34,160 INFO [train.py:715] (0/8) Epoch 10, batch 27500, loss[loss=0.121, simple_loss=0.1997, pruned_loss=0.02117, over 4986.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2125, pruned_loss=0.03316, over 973604.32 frames.], batch size: 28, lr: 2.11e-04 2022-05-06 22:44:13,413 INFO [train.py:715] (0/8) Epoch 10, batch 27550, loss[loss=0.1515, simple_loss=0.2216, pruned_loss=0.04073, over 4986.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2125, pruned_loss=0.03293, over 973650.20 frames.], batch size: 35, lr: 2.11e-04 2022-05-06 22:44:52,780 INFO [train.py:715] (0/8) Epoch 10, batch 27600, loss[loss=0.1554, simple_loss=0.2259, pruned_loss=0.04239, over 4941.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2122, pruned_loss=0.0327, over 974630.41 frames.], batch size: 39, lr: 2.11e-04 2022-05-06 22:45:32,114 INFO [train.py:715] (0/8) Epoch 10, batch 27650, loss[loss=0.1356, simple_loss=0.2107, pruned_loss=0.03024, over 4918.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2122, pruned_loss=0.03266, over 974430.21 frames.], batch size: 17, lr: 2.11e-04 2022-05-06 22:46:11,030 INFO 
[train.py:715] (0/8) Epoch 10, batch 27700, loss[loss=0.1303, simple_loss=0.2043, pruned_loss=0.0281, over 4751.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2122, pruned_loss=0.03266, over 973850.41 frames.], batch size: 19, lr: 2.11e-04 2022-05-06 22:46:51,024 INFO [train.py:715] (0/8) Epoch 10, batch 27750, loss[loss=0.1449, simple_loss=0.221, pruned_loss=0.03434, over 4940.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2119, pruned_loss=0.03259, over 973177.54 frames.], batch size: 23, lr: 2.11e-04 2022-05-06 22:47:31,101 INFO [train.py:715] (0/8) Epoch 10, batch 27800, loss[loss=0.1581, simple_loss=0.2294, pruned_loss=0.04339, over 4862.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2126, pruned_loss=0.03308, over 973674.10 frames.], batch size: 20, lr: 2.11e-04 2022-05-06 22:48:10,304 INFO [train.py:715] (0/8) Epoch 10, batch 27850, loss[loss=0.132, simple_loss=0.2137, pruned_loss=0.02513, over 4810.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2128, pruned_loss=0.0331, over 973461.08 frames.], batch size: 26, lr: 2.11e-04 2022-05-06 22:48:50,681 INFO [train.py:715] (0/8) Epoch 10, batch 27900, loss[loss=0.1332, simple_loss=0.2091, pruned_loss=0.02864, over 4843.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2122, pruned_loss=0.03266, over 973007.63 frames.], batch size: 20, lr: 2.11e-04 2022-05-06 22:48:54,583 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-376000.pt 2022-05-06 22:49:34,037 INFO [train.py:715] (0/8) Epoch 10, batch 27950, loss[loss=0.1631, simple_loss=0.2347, pruned_loss=0.0458, over 4963.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2128, pruned_loss=0.03325, over 972974.11 frames.], batch size: 15, lr: 2.11e-04 2022-05-06 22:50:13,531 INFO [train.py:715] (0/8) Epoch 10, batch 28000, loss[loss=0.1844, simple_loss=0.2491, pruned_loss=0.05983, over 4961.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2128, pruned_loss=0.03345, over 973162.03 frames.], batch size: 15, lr: 2.11e-04 2022-05-06 22:50:53,594 INFO [train.py:715] (0/8) Epoch 10, batch 28050, loss[loss=0.1478, simple_loss=0.2201, pruned_loss=0.0378, over 4958.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2121, pruned_loss=0.03321, over 972587.00 frames.], batch size: 15, lr: 2.11e-04 2022-05-06 22:51:34,464 INFO [train.py:715] (0/8) Epoch 10, batch 28100, loss[loss=0.1401, simple_loss=0.2165, pruned_loss=0.03188, over 4773.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2122, pruned_loss=0.03325, over 971871.03 frames.], batch size: 19, lr: 2.11e-04 2022-05-06 22:52:15,138 INFO [train.py:715] (0/8) Epoch 10, batch 28150, loss[loss=0.1159, simple_loss=0.1909, pruned_loss=0.02048, over 4817.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2134, pruned_loss=0.03378, over 971977.44 frames.], batch size: 12, lr: 2.11e-04 2022-05-06 22:52:54,871 INFO [train.py:715] (0/8) Epoch 10, batch 28200, loss[loss=0.1197, simple_loss=0.1891, pruned_loss=0.02514, over 4976.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2137, pruned_loss=0.0339, over 972715.58 frames.], batch size: 15, lr: 2.11e-04 2022-05-06 22:53:35,214 INFO [train.py:715] (0/8) Epoch 10, batch 28250, loss[loss=0.1103, simple_loss=0.1805, pruned_loss=0.02002, over 4884.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2133, pruned_loss=0.0338, over 971991.12 frames.], batch size: 22, lr: 2.11e-04 2022-05-06 22:54:16,801 INFO [train.py:715] (0/8) Epoch 10, batch 28300, loss[loss=0.1269, simple_loss=0.2014, pruned_loss=0.02626, over 4835.00 frames.], tot_loss[loss=0.1414, 
simple_loss=0.2143, pruned_loss=0.03426, over 971665.43 frames.], batch size: 32, lr: 2.11e-04 2022-05-06 22:54:56,893 INFO [train.py:715] (0/8) Epoch 10, batch 28350, loss[loss=0.1457, simple_loss=0.2208, pruned_loss=0.03533, over 4908.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2136, pruned_loss=0.03366, over 971518.64 frames.], batch size: 19, lr: 2.11e-04 2022-05-06 22:55:37,453 INFO [train.py:715] (0/8) Epoch 10, batch 28400, loss[loss=0.1341, simple_loss=0.2145, pruned_loss=0.02684, over 4829.00 frames.], tot_loss[loss=0.14, simple_loss=0.2133, pruned_loss=0.03338, over 971128.85 frames.], batch size: 30, lr: 2.11e-04 2022-05-06 22:56:19,120 INFO [train.py:715] (0/8) Epoch 10, batch 28450, loss[loss=0.1277, simple_loss=0.1933, pruned_loss=0.0311, over 4789.00 frames.], tot_loss[loss=0.1397, simple_loss=0.213, pruned_loss=0.03323, over 971278.25 frames.], batch size: 14, lr: 2.11e-04 2022-05-06 22:57:00,138 INFO [train.py:715] (0/8) Epoch 10, batch 28500, loss[loss=0.1523, simple_loss=0.2154, pruned_loss=0.04456, over 4883.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2127, pruned_loss=0.03323, over 971673.77 frames.], batch size: 32, lr: 2.11e-04 2022-05-06 22:57:40,542 INFO [train.py:715] (0/8) Epoch 10, batch 28550, loss[loss=0.1536, simple_loss=0.2337, pruned_loss=0.03669, over 4943.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2125, pruned_loss=0.03306, over 972087.51 frames.], batch size: 21, lr: 2.11e-04 2022-05-06 22:58:21,443 INFO [train.py:715] (0/8) Epoch 10, batch 28600, loss[loss=0.1362, simple_loss=0.1997, pruned_loss=0.03638, over 4971.00 frames.], tot_loss[loss=0.139, simple_loss=0.2122, pruned_loss=0.03293, over 972709.46 frames.], batch size: 14, lr: 2.11e-04 2022-05-06 22:59:03,592 INFO [train.py:715] (0/8) Epoch 10, batch 28650, loss[loss=0.1187, simple_loss=0.1905, pruned_loss=0.02344, over 4866.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2123, pruned_loss=0.03334, over 972872.52 frames.], batch size: 22, lr: 2.11e-04 2022-05-06 22:59:43,743 INFO [train.py:715] (0/8) Epoch 10, batch 28700, loss[loss=0.1327, simple_loss=0.2108, pruned_loss=0.02733, over 4922.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2127, pruned_loss=0.03334, over 973400.47 frames.], batch size: 18, lr: 2.11e-04 2022-05-06 23:00:24,815 INFO [train.py:715] (0/8) Epoch 10, batch 28750, loss[loss=0.1116, simple_loss=0.1738, pruned_loss=0.02466, over 4847.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2115, pruned_loss=0.03299, over 973606.74 frames.], batch size: 12, lr: 2.11e-04 2022-05-06 23:01:05,928 INFO [train.py:715] (0/8) Epoch 10, batch 28800, loss[loss=0.1521, simple_loss=0.2195, pruned_loss=0.0423, over 4774.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2133, pruned_loss=0.03388, over 973612.36 frames.], batch size: 17, lr: 2.11e-04 2022-05-06 23:01:46,829 INFO [train.py:715] (0/8) Epoch 10, batch 28850, loss[loss=0.1371, simple_loss=0.2216, pruned_loss=0.02628, over 4961.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2131, pruned_loss=0.03338, over 973444.82 frames.], batch size: 21, lr: 2.11e-04 2022-05-06 23:02:27,343 INFO [train.py:715] (0/8) Epoch 10, batch 28900, loss[loss=0.1331, simple_loss=0.2082, pruned_loss=0.02894, over 4840.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2128, pruned_loss=0.0332, over 973270.27 frames.], batch size: 26, lr: 2.11e-04 2022-05-06 23:03:08,206 INFO [train.py:715] (0/8) Epoch 10, batch 28950, loss[loss=0.1224, simple_loss=0.1988, pruned_loss=0.02299, over 4938.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2126, 
pruned_loss=0.03309, over 973286.23 frames.], batch size: 29, lr: 2.11e-04 2022-05-06 23:03:49,280 INFO [train.py:715] (0/8) Epoch 10, batch 29000, loss[loss=0.1517, simple_loss=0.2269, pruned_loss=0.03824, over 4873.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2127, pruned_loss=0.03307, over 972543.18 frames.], batch size: 22, lr: 2.11e-04 2022-05-06 23:04:28,433 INFO [train.py:715] (0/8) Epoch 10, batch 29050, loss[loss=0.1364, simple_loss=0.1961, pruned_loss=0.03835, over 4915.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2127, pruned_loss=0.03321, over 972575.52 frames.], batch size: 17, lr: 2.10e-04 2022-05-06 23:05:07,302 INFO [train.py:715] (0/8) Epoch 10, batch 29100, loss[loss=0.1277, simple_loss=0.1916, pruned_loss=0.03193, over 4968.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2127, pruned_loss=0.0331, over 973111.60 frames.], batch size: 35, lr: 2.10e-04 2022-05-06 23:05:47,478 INFO [train.py:715] (0/8) Epoch 10, batch 29150, loss[loss=0.1119, simple_loss=0.1735, pruned_loss=0.02518, over 4738.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2117, pruned_loss=0.03275, over 972822.00 frames.], batch size: 12, lr: 2.10e-04 2022-05-06 23:06:27,776 INFO [train.py:715] (0/8) Epoch 10, batch 29200, loss[loss=0.1238, simple_loss=0.2006, pruned_loss=0.02345, over 4780.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2125, pruned_loss=0.03317, over 973224.51 frames.], batch size: 18, lr: 2.10e-04 2022-05-06 23:07:06,676 INFO [train.py:715] (0/8) Epoch 10, batch 29250, loss[loss=0.1179, simple_loss=0.1802, pruned_loss=0.02785, over 4750.00 frames.], tot_loss[loss=0.1402, simple_loss=0.213, pruned_loss=0.03367, over 973629.34 frames.], batch size: 19, lr: 2.10e-04 2022-05-06 23:07:46,920 INFO [train.py:715] (0/8) Epoch 10, batch 29300, loss[loss=0.1436, simple_loss=0.2273, pruned_loss=0.02993, over 4986.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2122, pruned_loss=0.033, over 973866.79 frames.], batch size: 27, lr: 2.10e-04 2022-05-06 23:08:27,014 INFO [train.py:715] (0/8) Epoch 10, batch 29350, loss[loss=0.1171, simple_loss=0.1914, pruned_loss=0.02139, over 4959.00 frames.], tot_loss[loss=0.1389, simple_loss=0.212, pruned_loss=0.03293, over 973804.78 frames.], batch size: 35, lr: 2.10e-04 2022-05-06 23:09:06,023 INFO [train.py:715] (0/8) Epoch 10, batch 29400, loss[loss=0.1729, simple_loss=0.2328, pruned_loss=0.05657, over 4887.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2116, pruned_loss=0.03302, over 974428.82 frames.], batch size: 19, lr: 2.10e-04 2022-05-06 23:09:45,804 INFO [train.py:715] (0/8) Epoch 10, batch 29450, loss[loss=0.1599, simple_loss=0.2361, pruned_loss=0.04184, over 4840.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2119, pruned_loss=0.03291, over 974411.67 frames.], batch size: 20, lr: 2.10e-04 2022-05-06 23:10:26,001 INFO [train.py:715] (0/8) Epoch 10, batch 29500, loss[loss=0.1611, simple_loss=0.2339, pruned_loss=0.04419, over 4965.00 frames.], tot_loss[loss=0.139, simple_loss=0.212, pruned_loss=0.03306, over 973577.91 frames.], batch size: 35, lr: 2.10e-04 2022-05-06 23:11:05,706 INFO [train.py:715] (0/8) Epoch 10, batch 29550, loss[loss=0.1899, simple_loss=0.2403, pruned_loss=0.0697, over 4870.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2124, pruned_loss=0.03332, over 973406.15 frames.], batch size: 16, lr: 2.10e-04 2022-05-06 23:11:44,343 INFO [train.py:715] (0/8) Epoch 10, batch 29600, loss[loss=0.1387, simple_loss=0.2182, pruned_loss=0.02962, over 4873.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2129, pruned_loss=0.03345, over 
973259.64 frames.], batch size: 30, lr: 2.10e-04 2022-05-06 23:12:23,999 INFO [train.py:715] (0/8) Epoch 10, batch 29650, loss[loss=0.1735, simple_loss=0.2444, pruned_loss=0.05127, over 4873.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2129, pruned_loss=0.0337, over 973162.59 frames.], batch size: 16, lr: 2.10e-04 2022-05-06 23:13:03,434 INFO [train.py:715] (0/8) Epoch 10, batch 29700, loss[loss=0.1544, simple_loss=0.2242, pruned_loss=0.04226, over 4947.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2131, pruned_loss=0.03365, over 971998.24 frames.], batch size: 35, lr: 2.10e-04 2022-05-06 23:13:42,103 INFO [train.py:715] (0/8) Epoch 10, batch 29750, loss[loss=0.1345, simple_loss=0.2096, pruned_loss=0.02971, over 4880.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2128, pruned_loss=0.03343, over 972059.50 frames.], batch size: 22, lr: 2.10e-04 2022-05-06 23:14:21,080 INFO [train.py:715] (0/8) Epoch 10, batch 29800, loss[loss=0.1255, simple_loss=0.207, pruned_loss=0.022, over 4890.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2117, pruned_loss=0.03268, over 971880.67 frames.], batch size: 22, lr: 2.10e-04 2022-05-06 23:15:00,553 INFO [train.py:715] (0/8) Epoch 10, batch 29850, loss[loss=0.157, simple_loss=0.2351, pruned_loss=0.03939, over 4835.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2124, pruned_loss=0.03286, over 972355.35 frames.], batch size: 20, lr: 2.10e-04 2022-05-06 23:15:39,438 INFO [train.py:715] (0/8) Epoch 10, batch 29900, loss[loss=0.1329, simple_loss=0.1982, pruned_loss=0.03384, over 4812.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2134, pruned_loss=0.03363, over 972371.80 frames.], batch size: 13, lr: 2.10e-04 2022-05-06 23:16:17,896 INFO [train.py:715] (0/8) Epoch 10, batch 29950, loss[loss=0.1253, simple_loss=0.2081, pruned_loss=0.02129, over 4880.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2126, pruned_loss=0.03316, over 971123.80 frames.], batch size: 22, lr: 2.10e-04 2022-05-06 23:16:57,114 INFO [train.py:715] (0/8) Epoch 10, batch 30000, loss[loss=0.1387, simple_loss=0.2091, pruned_loss=0.03415, over 4892.00 frames.], tot_loss[loss=0.14, simple_loss=0.2129, pruned_loss=0.03353, over 971352.66 frames.], batch size: 19, lr: 2.10e-04 2022-05-06 23:16:57,115 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 23:17:06,542 INFO [train.py:742] (0/8) Epoch 10, validation: loss=0.1063, simple_loss=0.1906, pruned_loss=0.01106, over 914524.00 frames. 
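Each training entry carries two loss blocks: loss[... over ~5000 frames] for the current batch and tot_loss[... over roughly 970000 frames], a smoothed value weighted by the number of frames it covers. The roughly constant ~970k-frame window suggests an exponentially decayed, frame-weighted accumulator; the sketch below illustrates that idea only, and the decay of 1/200 is an assumption chosen so the effective window (about 200 batches of ~4900 frames) lands near the frame counts reported here, not a statement of how train.py actually computes tot_loss.

class RunningLoss:
    """Exponentially decayed, frame-weighted loss average (illustrative only)."""

    def __init__(self, decay: float = 1.0 / 200):
        self.decay = decay
        self.loss_sum = 0.0  # decayed sum of loss * frames
        self.frames = 0.0    # decayed sum of frames

    def update(self, batch_loss: float, batch_frames: float) -> None:
        self.loss_sum = (1.0 - self.decay) * self.loss_sum + batch_loss * batch_frames
        self.frames = (1.0 - self.decay) * self.frames + batch_frames

    @property
    def value(self) -> float:
        return self.loss_sum / max(self.frames, 1.0)

# tracker = RunningLoss()
# tracker.update(0.1318, 4927.0)   # per-batch numbers taken from an entry above
# tracker.update(0.1503, 4952.0)
# print(round(tracker.value, 4), round(tracker.frames, 1))

With such an accumulator the decayed frame count converges to batch_frames / decay, i.e. about 4900 * 200 ≈ 980000 frames, which is the order of magnitude the tot_loss blocks report.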
2022-05-06 23:17:46,310 INFO [train.py:715] (0/8) Epoch 10, batch 30050, loss[loss=0.1071, simple_loss=0.1715, pruned_loss=0.02132, over 4770.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2126, pruned_loss=0.03355, over 970759.21 frames.], batch size: 12, lr: 2.10e-04 2022-05-06 23:18:25,804 INFO [train.py:715] (0/8) Epoch 10, batch 30100, loss[loss=0.1546, simple_loss=0.2363, pruned_loss=0.03643, over 4921.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2127, pruned_loss=0.03357, over 971408.68 frames.], batch size: 29, lr: 2.10e-04 2022-05-06 23:19:04,198 INFO [train.py:715] (0/8) Epoch 10, batch 30150, loss[loss=0.1331, simple_loss=0.2004, pruned_loss=0.03284, over 4696.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2127, pruned_loss=0.03354, over 971051.86 frames.], batch size: 15, lr: 2.10e-04 2022-05-06 23:19:44,547 INFO [train.py:715] (0/8) Epoch 10, batch 30200, loss[loss=0.1742, simple_loss=0.2455, pruned_loss=0.05143, over 4886.00 frames.], tot_loss[loss=0.1405, simple_loss=0.2129, pruned_loss=0.03403, over 970889.15 frames.], batch size: 39, lr: 2.10e-04 2022-05-06 23:20:24,569 INFO [train.py:715] (0/8) Epoch 10, batch 30250, loss[loss=0.1318, simple_loss=0.2066, pruned_loss=0.02852, over 4901.00 frames.], tot_loss[loss=0.1398, simple_loss=0.212, pruned_loss=0.03378, over 970338.15 frames.], batch size: 17, lr: 2.10e-04 2022-05-06 23:21:02,961 INFO [train.py:715] (0/8) Epoch 10, batch 30300, loss[loss=0.1727, simple_loss=0.2534, pruned_loss=0.04606, over 4954.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2128, pruned_loss=0.03393, over 970517.75 frames.], batch size: 21, lr: 2.10e-04 2022-05-06 23:21:41,377 INFO [train.py:715] (0/8) Epoch 10, batch 30350, loss[loss=0.1229, simple_loss=0.1938, pruned_loss=0.02596, over 4871.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2128, pruned_loss=0.03396, over 972401.23 frames.], batch size: 32, lr: 2.10e-04 2022-05-06 23:22:21,180 INFO [train.py:715] (0/8) Epoch 10, batch 30400, loss[loss=0.1262, simple_loss=0.1944, pruned_loss=0.02905, over 4951.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2121, pruned_loss=0.03358, over 971505.43 frames.], batch size: 21, lr: 2.10e-04 2022-05-06 23:23:00,548 INFO [train.py:715] (0/8) Epoch 10, batch 30450, loss[loss=0.1661, simple_loss=0.2348, pruned_loss=0.04871, over 4807.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2119, pruned_loss=0.03333, over 971716.65 frames.], batch size: 26, lr: 2.10e-04 2022-05-06 23:23:38,707 INFO [train.py:715] (0/8) Epoch 10, batch 30500, loss[loss=0.1414, simple_loss=0.2099, pruned_loss=0.03645, over 4860.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2121, pruned_loss=0.03349, over 971830.48 frames.], batch size: 20, lr: 2.10e-04 2022-05-06 23:24:18,302 INFO [train.py:715] (0/8) Epoch 10, batch 30550, loss[loss=0.1359, simple_loss=0.2054, pruned_loss=0.0332, over 4775.00 frames.], tot_loss[loss=0.1395, simple_loss=0.212, pruned_loss=0.03345, over 971295.14 frames.], batch size: 18, lr: 2.10e-04 2022-05-06 23:24:57,942 INFO [train.py:715] (0/8) Epoch 10, batch 30600, loss[loss=0.1267, simple_loss=0.1957, pruned_loss=0.02888, over 4758.00 frames.], tot_loss[loss=0.1393, simple_loss=0.212, pruned_loss=0.0333, over 971490.03 frames.], batch size: 12, lr: 2.10e-04 2022-05-06 23:25:36,406 INFO [train.py:715] (0/8) Epoch 10, batch 30650, loss[loss=0.1522, simple_loss=0.2298, pruned_loss=0.03728, over 4751.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2123, pruned_loss=0.03341, over 972449.42 frames.], batch size: 16, lr: 2.10e-04 2022-05-06 23:26:15,884 
INFO [train.py:715] (0/8) Epoch 10, batch 30700, loss[loss=0.1215, simple_loss=0.1967, pruned_loss=0.02315, over 4940.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2123, pruned_loss=0.03338, over 973278.42 frames.], batch size: 23, lr: 2.10e-04 2022-05-06 23:26:55,010 INFO [train.py:715] (0/8) Epoch 10, batch 30750, loss[loss=0.1262, simple_loss=0.2029, pruned_loss=0.02474, over 4810.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2118, pruned_loss=0.03302, over 972923.22 frames.], batch size: 21, lr: 2.10e-04 2022-05-06 23:27:33,899 INFO [train.py:715] (0/8) Epoch 10, batch 30800, loss[loss=0.1333, simple_loss=0.2073, pruned_loss=0.02962, over 4756.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2114, pruned_loss=0.03285, over 972619.99 frames.], batch size: 12, lr: 2.10e-04 2022-05-06 23:28:12,410 INFO [train.py:715] (0/8) Epoch 10, batch 30850, loss[loss=0.1301, simple_loss=0.198, pruned_loss=0.03115, over 4853.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2115, pruned_loss=0.03268, over 973347.42 frames.], batch size: 13, lr: 2.10e-04 2022-05-06 23:28:52,163 INFO [train.py:715] (0/8) Epoch 10, batch 30900, loss[loss=0.1239, simple_loss=0.1993, pruned_loss=0.02425, over 4870.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2112, pruned_loss=0.03213, over 973417.95 frames.], batch size: 32, lr: 2.10e-04 2022-05-06 23:29:32,111 INFO [train.py:715] (0/8) Epoch 10, batch 30950, loss[loss=0.1484, simple_loss=0.2158, pruned_loss=0.04051, over 4702.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2114, pruned_loss=0.03237, over 973707.28 frames.], batch size: 15, lr: 2.10e-04 2022-05-06 23:30:11,644 INFO [train.py:715] (0/8) Epoch 10, batch 31000, loss[loss=0.1453, simple_loss=0.2209, pruned_loss=0.03487, over 4905.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2116, pruned_loss=0.03238, over 973171.81 frames.], batch size: 19, lr: 2.10e-04 2022-05-06 23:30:50,319 INFO [train.py:715] (0/8) Epoch 10, batch 31050, loss[loss=0.1326, simple_loss=0.21, pruned_loss=0.02755, over 4883.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2119, pruned_loss=0.03232, over 972727.43 frames.], batch size: 22, lr: 2.10e-04 2022-05-06 23:31:29,591 INFO [train.py:715] (0/8) Epoch 10, batch 31100, loss[loss=0.1303, simple_loss=0.2027, pruned_loss=0.02899, over 4764.00 frames.], tot_loss[loss=0.1384, simple_loss=0.212, pruned_loss=0.0324, over 971976.49 frames.], batch size: 19, lr: 2.10e-04 2022-05-06 23:32:09,325 INFO [train.py:715] (0/8) Epoch 10, batch 31150, loss[loss=0.1066, simple_loss=0.1911, pruned_loss=0.01103, over 4803.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2128, pruned_loss=0.03294, over 971509.61 frames.], batch size: 18, lr: 2.10e-04 2022-05-06 23:32:47,334 INFO [train.py:715] (0/8) Epoch 10, batch 31200, loss[loss=0.1093, simple_loss=0.1823, pruned_loss=0.01816, over 4982.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2126, pruned_loss=0.03264, over 971974.31 frames.], batch size: 25, lr: 2.10e-04 2022-05-06 23:33:26,828 INFO [train.py:715] (0/8) Epoch 10, batch 31250, loss[loss=0.1525, simple_loss=0.2274, pruned_loss=0.03882, over 4884.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2127, pruned_loss=0.033, over 972688.53 frames.], batch size: 38, lr: 2.10e-04 2022-05-06 23:34:06,252 INFO [train.py:715] (0/8) Epoch 10, batch 31300, loss[loss=0.1278, simple_loss=0.1967, pruned_loss=0.02944, over 4958.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2118, pruned_loss=0.03296, over 972432.01 frames.], batch size: 14, lr: 2.10e-04 2022-05-06 23:34:45,238 INFO [train.py:715] (0/8) 
Epoch 10, batch 31350, loss[loss=0.1291, simple_loss=0.2018, pruned_loss=0.02816, over 4966.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2122, pruned_loss=0.03318, over 972418.25 frames.], batch size: 15, lr: 2.10e-04 2022-05-06 23:35:23,756 INFO [train.py:715] (0/8) Epoch 10, batch 31400, loss[loss=0.1498, simple_loss=0.231, pruned_loss=0.03426, over 4812.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2123, pruned_loss=0.03354, over 973192.40 frames.], batch size: 25, lr: 2.10e-04 2022-05-06 23:36:02,748 INFO [train.py:715] (0/8) Epoch 10, batch 31450, loss[loss=0.1474, simple_loss=0.2294, pruned_loss=0.03266, over 4890.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2134, pruned_loss=0.03343, over 972338.69 frames.], batch size: 17, lr: 2.10e-04 2022-05-06 23:36:42,172 INFO [train.py:715] (0/8) Epoch 10, batch 31500, loss[loss=0.1365, simple_loss=0.2108, pruned_loss=0.03105, over 4893.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2133, pruned_loss=0.03344, over 973532.19 frames.], batch size: 22, lr: 2.10e-04 2022-05-06 23:37:19,853 INFO [train.py:715] (0/8) Epoch 10, batch 31550, loss[loss=0.09945, simple_loss=0.1664, pruned_loss=0.01623, over 4788.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2135, pruned_loss=0.03355, over 973257.68 frames.], batch size: 12, lr: 2.10e-04 2022-05-06 23:37:58,954 INFO [train.py:715] (0/8) Epoch 10, batch 31600, loss[loss=0.1328, simple_loss=0.1979, pruned_loss=0.03379, over 4713.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2143, pruned_loss=0.03423, over 972586.98 frames.], batch size: 12, lr: 2.10e-04 2022-05-06 23:38:38,094 INFO [train.py:715] (0/8) Epoch 10, batch 31650, loss[loss=0.1305, simple_loss=0.2135, pruned_loss=0.02378, over 4818.00 frames.], tot_loss[loss=0.1411, simple_loss=0.214, pruned_loss=0.03412, over 972563.71 frames.], batch size: 26, lr: 2.10e-04 2022-05-06 23:39:17,244 INFO [train.py:715] (0/8) Epoch 10, batch 31700, loss[loss=0.1255, simple_loss=0.2022, pruned_loss=0.02443, over 4865.00 frames.], tot_loss[loss=0.1415, simple_loss=0.2142, pruned_loss=0.03445, over 972593.50 frames.], batch size: 38, lr: 2.10e-04 2022-05-06 23:39:55,911 INFO [train.py:715] (0/8) Epoch 10, batch 31750, loss[loss=0.1688, simple_loss=0.2403, pruned_loss=0.04869, over 4761.00 frames.], tot_loss[loss=0.1412, simple_loss=0.2135, pruned_loss=0.03449, over 972347.40 frames.], batch size: 16, lr: 2.10e-04 2022-05-06 23:40:34,950 INFO [train.py:715] (0/8) Epoch 10, batch 31800, loss[loss=0.1414, simple_loss=0.2162, pruned_loss=0.03327, over 4733.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2127, pruned_loss=0.03432, over 972322.89 frames.], batch size: 16, lr: 2.10e-04 2022-05-06 23:41:14,306 INFO [train.py:715] (0/8) Epoch 10, batch 31850, loss[loss=0.1284, simple_loss=0.2185, pruned_loss=0.01915, over 4878.00 frames.], tot_loss[loss=0.1396, simple_loss=0.212, pruned_loss=0.03354, over 972393.20 frames.], batch size: 32, lr: 2.10e-04 2022-05-06 23:41:52,371 INFO [train.py:715] (0/8) Epoch 10, batch 31900, loss[loss=0.1382, simple_loss=0.2027, pruned_loss=0.03688, over 4831.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2122, pruned_loss=0.03306, over 972941.04 frames.], batch size: 30, lr: 2.10e-04 2022-05-06 23:42:31,526 INFO [train.py:715] (0/8) Epoch 10, batch 31950, loss[loss=0.1261, simple_loss=0.1979, pruned_loss=0.02715, over 4782.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2114, pruned_loss=0.03307, over 972633.25 frames.], batch size: 18, lr: 2.10e-04 2022-05-06 23:43:10,928 INFO [train.py:715] (0/8) Epoch 10, batch 
32000, loss[loss=0.1335, simple_loss=0.199, pruned_loss=0.03403, over 4929.00 frames.], tot_loss[loss=0.1392, simple_loss=0.212, pruned_loss=0.03316, over 972755.57 frames.], batch size: 23, lr: 2.10e-04 2022-05-06 23:43:49,601 INFO [train.py:715] (0/8) Epoch 10, batch 32050, loss[loss=0.1563, simple_loss=0.2333, pruned_loss=0.03965, over 4791.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2119, pruned_loss=0.033, over 971643.04 frames.], batch size: 21, lr: 2.10e-04 2022-05-06 23:44:27,915 INFO [train.py:715] (0/8) Epoch 10, batch 32100, loss[loss=0.1245, simple_loss=0.183, pruned_loss=0.03301, over 4847.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2112, pruned_loss=0.03267, over 971386.58 frames.], batch size: 13, lr: 2.10e-04 2022-05-06 23:45:06,913 INFO [train.py:715] (0/8) Epoch 10, batch 32150, loss[loss=0.14, simple_loss=0.2201, pruned_loss=0.02995, over 4923.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2108, pruned_loss=0.0323, over 971983.29 frames.], batch size: 18, lr: 2.10e-04 2022-05-06 23:45:45,852 INFO [train.py:715] (0/8) Epoch 10, batch 32200, loss[loss=0.1725, simple_loss=0.2441, pruned_loss=0.05047, over 4910.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2113, pruned_loss=0.03281, over 972178.80 frames.], batch size: 19, lr: 2.10e-04 2022-05-06 23:46:23,725 INFO [train.py:715] (0/8) Epoch 10, batch 32250, loss[loss=0.1202, simple_loss=0.1893, pruned_loss=0.02555, over 4945.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2116, pruned_loss=0.03312, over 971858.54 frames.], batch size: 29, lr: 2.10e-04 2022-05-06 23:47:02,885 INFO [train.py:715] (0/8) Epoch 10, batch 32300, loss[loss=0.1158, simple_loss=0.1785, pruned_loss=0.02658, over 4790.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2114, pruned_loss=0.03338, over 972111.07 frames.], batch size: 12, lr: 2.10e-04 2022-05-06 23:47:42,099 INFO [train.py:715] (0/8) Epoch 10, batch 32350, loss[loss=0.1306, simple_loss=0.2059, pruned_loss=0.02766, over 4915.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2117, pruned_loss=0.03353, over 972008.09 frames.], batch size: 17, lr: 2.10e-04 2022-05-06 23:48:20,904 INFO [train.py:715] (0/8) Epoch 10, batch 32400, loss[loss=0.13, simple_loss=0.1984, pruned_loss=0.03083, over 4858.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2123, pruned_loss=0.03356, over 971398.06 frames.], batch size: 13, lr: 2.10e-04 2022-05-06 23:48:59,312 INFO [train.py:715] (0/8) Epoch 10, batch 32450, loss[loss=0.1406, simple_loss=0.2149, pruned_loss=0.0331, over 4870.00 frames.], tot_loss[loss=0.139, simple_loss=0.2119, pruned_loss=0.03304, over 971472.87 frames.], batch size: 32, lr: 2.10e-04 2022-05-06 23:49:38,631 INFO [train.py:715] (0/8) Epoch 10, batch 32500, loss[loss=0.1334, simple_loss=0.2044, pruned_loss=0.03117, over 4781.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2113, pruned_loss=0.03295, over 971419.17 frames.], batch size: 18, lr: 2.10e-04 2022-05-06 23:50:18,345 INFO [train.py:715] (0/8) Epoch 10, batch 32550, loss[loss=0.1414, simple_loss=0.2133, pruned_loss=0.03475, over 4813.00 frames.], tot_loss[loss=0.1384, simple_loss=0.211, pruned_loss=0.03293, over 971839.96 frames.], batch size: 15, lr: 2.10e-04 2022-05-06 23:50:56,261 INFO [train.py:715] (0/8) Epoch 10, batch 32600, loss[loss=0.1571, simple_loss=0.2172, pruned_loss=0.04852, over 4761.00 frames.], tot_loss[loss=0.1389, simple_loss=0.211, pruned_loss=0.03338, over 972478.58 frames.], batch size: 18, lr: 2.10e-04 2022-05-06 23:51:35,796 INFO [train.py:715] (0/8) Epoch 10, batch 32650, loss[loss=0.1394, 
simple_loss=0.2122, pruned_loss=0.03333, over 4935.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2114, pruned_loss=0.03338, over 972781.84 frames.], batch size: 18, lr: 2.10e-04 2022-05-06 23:52:15,566 INFO [train.py:715] (0/8) Epoch 10, batch 32700, loss[loss=0.1237, simple_loss=0.1885, pruned_loss=0.02951, over 4886.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2112, pruned_loss=0.03348, over 972809.18 frames.], batch size: 19, lr: 2.09e-04 2022-05-06 23:52:53,817 INFO [train.py:715] (0/8) Epoch 10, batch 32750, loss[loss=0.1499, simple_loss=0.2289, pruned_loss=0.03543, over 4892.00 frames.], tot_loss[loss=0.1399, simple_loss=0.212, pruned_loss=0.03394, over 972300.46 frames.], batch size: 39, lr: 2.09e-04 2022-05-06 23:53:34,508 INFO [train.py:715] (0/8) Epoch 10, batch 32800, loss[loss=0.1264, simple_loss=0.201, pruned_loss=0.02592, over 4893.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2116, pruned_loss=0.03358, over 972395.82 frames.], batch size: 17, lr: 2.09e-04 2022-05-06 23:54:14,774 INFO [train.py:715] (0/8) Epoch 10, batch 32850, loss[loss=0.1293, simple_loss=0.1967, pruned_loss=0.03092, over 4827.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2123, pruned_loss=0.03404, over 972810.33 frames.], batch size: 15, lr: 2.09e-04 2022-05-06 23:54:54,883 INFO [train.py:715] (0/8) Epoch 10, batch 32900, loss[loss=0.1308, simple_loss=0.1997, pruned_loss=0.03095, over 4857.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2125, pruned_loss=0.03396, over 971950.08 frames.], batch size: 13, lr: 2.09e-04 2022-05-06 23:55:34,227 INFO [train.py:715] (0/8) Epoch 10, batch 32950, loss[loss=0.1241, simple_loss=0.2076, pruned_loss=0.02036, over 4990.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2119, pruned_loss=0.03325, over 971382.03 frames.], batch size: 25, lr: 2.09e-04 2022-05-06 23:56:14,906 INFO [train.py:715] (0/8) Epoch 10, batch 33000, loss[loss=0.1239, simple_loss=0.1938, pruned_loss=0.02701, over 4695.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2122, pruned_loss=0.03322, over 971520.61 frames.], batch size: 15, lr: 2.09e-04 2022-05-06 23:56:14,908 INFO [train.py:733] (0/8) Computing validation loss 2022-05-06 23:56:24,576 INFO [train.py:742] (0/8) Epoch 10, validation: loss=0.1063, simple_loss=0.1905, pruned_loss=0.01103, over 914524.00 frames. 
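Validation appears at batches 27000, 30000 and 33000 in this stretch, i.e. every 3000 training batches, and the validation loss sits essentially flat at 0.1063 over 914524 frames while training continues. Below is a minimal sketch of interleaving such a periodic validation pass into a training loop; the model(batch) interface returning (loss, num_frames) is a placeholder, not the recipe's actual API.

import torch

def run_epoch(model, optimizer, train_dl, valid_dl, valid_interval=3000):
    for batch_idx, batch in enumerate(train_dl):
        model.train()
        loss, _ = model(batch)        # placeholder forward pass returning a scalar loss tensor
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        if batch_idx > 0 and batch_idx % valid_interval == 0:
            model.eval()
            tot_loss, tot_frames = 0.0, 0
            with torch.no_grad():     # no gradients needed on the held-out pass
                for vbatch in valid_dl:
                    vloss, nframes = model(vbatch)
                    tot_loss += vloss.item() * nframes
                    tot_frames += nframes
            print(f"validation: loss={tot_loss / max(tot_frames, 1):.4f}, "
                  f"over {tot_frames} frames")
            model.train()

Averaging over the full, fixed dev set is why the 914524-frame figure is identical across every validation line in this log.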
2022-05-06 23:57:03,963 INFO [train.py:715] (0/8) Epoch 10, batch 33050, loss[loss=0.1344, simple_loss=0.1973, pruned_loss=0.03578, over 4899.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2117, pruned_loss=0.03322, over 971614.96 frames.], batch size: 32, lr: 2.09e-04 2022-05-06 23:57:43,740 INFO [train.py:715] (0/8) Epoch 10, batch 33100, loss[loss=0.1212, simple_loss=0.1983, pruned_loss=0.02204, over 4904.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2113, pruned_loss=0.03313, over 972556.02 frames.], batch size: 19, lr: 2.09e-04 2022-05-06 23:58:21,691 INFO [train.py:715] (0/8) Epoch 10, batch 33150, loss[loss=0.1305, simple_loss=0.1978, pruned_loss=0.0316, over 4930.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2116, pruned_loss=0.03312, over 972411.59 frames.], batch size: 29, lr: 2.09e-04 2022-05-06 23:59:00,822 INFO [train.py:715] (0/8) Epoch 10, batch 33200, loss[loss=0.1135, simple_loss=0.1873, pruned_loss=0.01989, over 4900.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2115, pruned_loss=0.03292, over 971737.64 frames.], batch size: 19, lr: 2.09e-04 2022-05-06 23:59:40,445 INFO [train.py:715] (0/8) Epoch 10, batch 33250, loss[loss=0.1461, simple_loss=0.2197, pruned_loss=0.03627, over 4924.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2123, pruned_loss=0.03304, over 971628.08 frames.], batch size: 23, lr: 2.09e-04 2022-05-07 00:00:18,360 INFO [train.py:715] (0/8) Epoch 10, batch 33300, loss[loss=0.1344, simple_loss=0.2055, pruned_loss=0.03163, over 4939.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2128, pruned_loss=0.03327, over 971282.86 frames.], batch size: 21, lr: 2.09e-04 2022-05-07 00:00:57,770 INFO [train.py:715] (0/8) Epoch 10, batch 33350, loss[loss=0.1267, simple_loss=0.202, pruned_loss=0.02572, over 4805.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2121, pruned_loss=0.03272, over 970723.16 frames.], batch size: 21, lr: 2.09e-04 2022-05-07 00:01:37,019 INFO [train.py:715] (0/8) Epoch 10, batch 33400, loss[loss=0.1324, simple_loss=0.2032, pruned_loss=0.03084, over 4971.00 frames.], tot_loss[loss=0.139, simple_loss=0.2123, pruned_loss=0.03283, over 971170.41 frames.], batch size: 14, lr: 2.09e-04 2022-05-07 00:02:16,544 INFO [train.py:715] (0/8) Epoch 10, batch 33450, loss[loss=0.133, simple_loss=0.208, pruned_loss=0.029, over 4739.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2127, pruned_loss=0.03306, over 971127.01 frames.], batch size: 16, lr: 2.09e-04 2022-05-07 00:02:54,366 INFO [train.py:715] (0/8) Epoch 10, batch 33500, loss[loss=0.1145, simple_loss=0.196, pruned_loss=0.01655, over 4774.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2124, pruned_loss=0.03253, over 971651.40 frames.], batch size: 18, lr: 2.09e-04 2022-05-07 00:03:33,957 INFO [train.py:715] (0/8) Epoch 10, batch 33550, loss[loss=0.1302, simple_loss=0.1959, pruned_loss=0.03225, over 4868.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2122, pruned_loss=0.03278, over 971854.25 frames.], batch size: 20, lr: 2.09e-04 2022-05-07 00:04:13,562 INFO [train.py:715] (0/8) Epoch 10, batch 33600, loss[loss=0.1317, simple_loss=0.21, pruned_loss=0.02665, over 4831.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2128, pruned_loss=0.0328, over 971579.93 frames.], batch size: 15, lr: 2.09e-04 2022-05-07 00:04:52,113 INFO [train.py:715] (0/8) Epoch 10, batch 33650, loss[loss=0.1424, simple_loss=0.2153, pruned_loss=0.03478, over 4836.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2127, pruned_loss=0.03255, over 972115.18 frames.], batch size: 32, lr: 2.09e-04 2022-05-07 00:05:30,841 INFO 
[train.py:715] (0/8) Epoch 10, batch 33700, loss[loss=0.1196, simple_loss=0.2015, pruned_loss=0.01886, over 4881.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2132, pruned_loss=0.03291, over 972027.48 frames.], batch size: 20, lr: 2.09e-04 2022-05-07 00:06:10,495 INFO [train.py:715] (0/8) Epoch 10, batch 33750, loss[loss=0.141, simple_loss=0.1993, pruned_loss=0.04135, over 4824.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2128, pruned_loss=0.03304, over 971785.21 frames.], batch size: 12, lr: 2.09e-04 2022-05-07 00:06:50,168 INFO [train.py:715] (0/8) Epoch 10, batch 33800, loss[loss=0.1331, simple_loss=0.207, pruned_loss=0.02965, over 4797.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2126, pruned_loss=0.03305, over 971731.06 frames.], batch size: 24, lr: 2.09e-04 2022-05-07 00:07:29,174 INFO [train.py:715] (0/8) Epoch 10, batch 33850, loss[loss=0.1293, simple_loss=0.201, pruned_loss=0.02876, over 4868.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2122, pruned_loss=0.03251, over 972581.36 frames.], batch size: 20, lr: 2.09e-04 2022-05-07 00:08:08,840 INFO [train.py:715] (0/8) Epoch 10, batch 33900, loss[loss=0.128, simple_loss=0.1941, pruned_loss=0.03095, over 4789.00 frames.], tot_loss[loss=0.1383, simple_loss=0.212, pruned_loss=0.03232, over 971893.87 frames.], batch size: 12, lr: 2.09e-04 2022-05-07 00:08:48,747 INFO [train.py:715] (0/8) Epoch 10, batch 33950, loss[loss=0.1099, simple_loss=0.1798, pruned_loss=0.01994, over 4772.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2117, pruned_loss=0.03227, over 971846.95 frames.], batch size: 12, lr: 2.09e-04 2022-05-07 00:09:27,307 INFO [train.py:715] (0/8) Epoch 10, batch 34000, loss[loss=0.1475, simple_loss=0.2285, pruned_loss=0.03324, over 4975.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2122, pruned_loss=0.03298, over 972492.28 frames.], batch size: 15, lr: 2.09e-04 2022-05-07 00:10:06,612 INFO [train.py:715] (0/8) Epoch 10, batch 34050, loss[loss=0.1549, simple_loss=0.2319, pruned_loss=0.03897, over 4857.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2126, pruned_loss=0.03309, over 972286.02 frames.], batch size: 20, lr: 2.09e-04 2022-05-07 00:10:45,879 INFO [train.py:715] (0/8) Epoch 10, batch 34100, loss[loss=0.13, simple_loss=0.2058, pruned_loss=0.02715, over 4766.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2124, pruned_loss=0.03293, over 972026.67 frames.], batch size: 16, lr: 2.09e-04 2022-05-07 00:11:25,362 INFO [train.py:715] (0/8) Epoch 10, batch 34150, loss[loss=0.1399, simple_loss=0.2088, pruned_loss=0.03552, over 4771.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2125, pruned_loss=0.03323, over 971539.22 frames.], batch size: 17, lr: 2.09e-04 2022-05-07 00:12:04,923 INFO [train.py:715] (0/8) Epoch 10, batch 34200, loss[loss=0.1536, simple_loss=0.223, pruned_loss=0.04206, over 4901.00 frames.], tot_loss[loss=0.1411, simple_loss=0.2138, pruned_loss=0.03419, over 971248.07 frames.], batch size: 19, lr: 2.09e-04 2022-05-07 00:12:44,140 INFO [train.py:715] (0/8) Epoch 10, batch 34250, loss[loss=0.1395, simple_loss=0.2169, pruned_loss=0.03107, over 4837.00 frames.], tot_loss[loss=0.141, simple_loss=0.2137, pruned_loss=0.03418, over 971341.20 frames.], batch size: 13, lr: 2.09e-04 2022-05-07 00:13:23,645 INFO [train.py:715] (0/8) Epoch 10, batch 34300, loss[loss=0.1367, simple_loss=0.2126, pruned_loss=0.0304, over 4802.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2124, pruned_loss=0.0333, over 971002.79 frames.], batch size: 21, lr: 2.09e-04 2022-05-07 00:14:03,551 INFO [train.py:715] (0/8) Epoch 
10, batch 34350, loss[loss=0.159, simple_loss=0.2229, pruned_loss=0.04758, over 4810.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2122, pruned_loss=0.0334, over 970949.40 frames.], batch size: 27, lr: 2.09e-04 2022-05-07 00:14:43,435 INFO [train.py:715] (0/8) Epoch 10, batch 34400, loss[loss=0.121, simple_loss=0.1964, pruned_loss=0.02279, over 4965.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2126, pruned_loss=0.03351, over 971735.42 frames.], batch size: 25, lr: 2.09e-04 2022-05-07 00:15:23,582 INFO [train.py:715] (0/8) Epoch 10, batch 34450, loss[loss=0.1129, simple_loss=0.1867, pruned_loss=0.01959, over 4820.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2123, pruned_loss=0.03311, over 971632.85 frames.], batch size: 27, lr: 2.09e-04 2022-05-07 00:16:03,651 INFO [train.py:715] (0/8) Epoch 10, batch 34500, loss[loss=0.1398, simple_loss=0.2143, pruned_loss=0.03272, over 4914.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2137, pruned_loss=0.0335, over 971959.10 frames.], batch size: 23, lr: 2.09e-04 2022-05-07 00:16:42,850 INFO [train.py:715] (0/8) Epoch 10, batch 34550, loss[loss=0.1231, simple_loss=0.1963, pruned_loss=0.02499, over 4897.00 frames.], tot_loss[loss=0.14, simple_loss=0.2132, pruned_loss=0.03337, over 971116.61 frames.], batch size: 19, lr: 2.09e-04 2022-05-07 00:17:23,152 INFO [train.py:715] (0/8) Epoch 10, batch 34600, loss[loss=0.1554, simple_loss=0.2319, pruned_loss=0.03947, over 4821.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2135, pruned_loss=0.03368, over 971567.49 frames.], batch size: 25, lr: 2.09e-04 2022-05-07 00:18:03,611 INFO [train.py:715] (0/8) Epoch 10, batch 34650, loss[loss=0.1164, simple_loss=0.1864, pruned_loss=0.0232, over 4796.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2132, pruned_loss=0.03369, over 972573.31 frames.], batch size: 24, lr: 2.09e-04 2022-05-07 00:18:42,649 INFO [train.py:715] (0/8) Epoch 10, batch 34700, loss[loss=0.16, simple_loss=0.2244, pruned_loss=0.04782, over 4869.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2135, pruned_loss=0.03383, over 972407.34 frames.], batch size: 16, lr: 2.09e-04 2022-05-07 00:19:21,229 INFO [train.py:715] (0/8) Epoch 10, batch 34750, loss[loss=0.1297, simple_loss=0.197, pruned_loss=0.03117, over 4815.00 frames.], tot_loss[loss=0.1398, simple_loss=0.213, pruned_loss=0.03331, over 971192.92 frames.], batch size: 12, lr: 2.09e-04 2022-05-07 00:19:57,686 INFO [train.py:715] (0/8) Epoch 10, batch 34800, loss[loss=0.118, simple_loss=0.1856, pruned_loss=0.02517, over 4785.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2117, pruned_loss=0.03289, over 970742.36 frames.], batch size: 12, lr: 2.09e-04 2022-05-07 00:20:06,279 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/epoch-10.pt 2022-05-07 00:20:47,593 INFO [train.py:715] (0/8) Epoch 11, batch 0, loss[loss=0.1361, simple_loss=0.2133, pruned_loss=0.02939, over 4865.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2133, pruned_loss=0.02939, over 4865.00 frames.], batch size: 20, lr: 2.00e-04 2022-05-07 00:21:26,498 INFO [train.py:715] (0/8) Epoch 11, batch 50, loss[loss=0.1335, simple_loss=0.2163, pruned_loss=0.02536, over 4791.00 frames.], tot_loss[loss=0.1414, simple_loss=0.2138, pruned_loss=0.03448, over 219029.94 frames.], batch size: 21, lr: 2.00e-04 2022-05-07 00:22:06,398 INFO [train.py:715] (0/8) Epoch 11, batch 100, loss[loss=0.1517, simple_loss=0.222, pruned_loss=0.04073, over 4796.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2121, pruned_loss=0.03365, over 385561.41 frames.], 
batch size: 21, lr: 2.00e-04 2022-05-07 00:22:46,275 INFO [train.py:715] (0/8) Epoch 11, batch 150, loss[loss=0.1322, simple_loss=0.1993, pruned_loss=0.03257, over 4728.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2124, pruned_loss=0.03415, over 514856.08 frames.], batch size: 12, lr: 2.00e-04 2022-05-07 00:23:26,825 INFO [train.py:715] (0/8) Epoch 11, batch 200, loss[loss=0.1662, simple_loss=0.2384, pruned_loss=0.04695, over 4879.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2126, pruned_loss=0.0339, over 616259.86 frames.], batch size: 16, lr: 2.00e-04 2022-05-07 00:24:06,699 INFO [train.py:715] (0/8) Epoch 11, batch 250, loss[loss=0.1402, simple_loss=0.2152, pruned_loss=0.03266, over 4958.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2128, pruned_loss=0.03387, over 694790.09 frames.], batch size: 35, lr: 2.00e-04 2022-05-07 00:24:45,519 INFO [train.py:715] (0/8) Epoch 11, batch 300, loss[loss=0.1287, simple_loss=0.2108, pruned_loss=0.02327, over 4774.00 frames.], tot_loss[loss=0.139, simple_loss=0.2119, pruned_loss=0.03308, over 756447.02 frames.], batch size: 18, lr: 2.00e-04 2022-05-07 00:25:26,108 INFO [train.py:715] (0/8) Epoch 11, batch 350, loss[loss=0.1588, simple_loss=0.2369, pruned_loss=0.04039, over 4695.00 frames.], tot_loss[loss=0.1392, simple_loss=0.212, pruned_loss=0.03317, over 804826.01 frames.], batch size: 15, lr: 2.00e-04 2022-05-07 00:26:05,774 INFO [train.py:715] (0/8) Epoch 11, batch 400, loss[loss=0.1371, simple_loss=0.2124, pruned_loss=0.0309, over 4917.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2125, pruned_loss=0.03305, over 842763.65 frames.], batch size: 29, lr: 2.00e-04 2022-05-07 00:26:46,459 INFO [train.py:715] (0/8) Epoch 11, batch 450, loss[loss=0.1438, simple_loss=0.212, pruned_loss=0.0378, over 4882.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2123, pruned_loss=0.03268, over 871677.66 frames.], batch size: 22, lr: 2.00e-04 2022-05-07 00:27:27,782 INFO [train.py:715] (0/8) Epoch 11, batch 500, loss[loss=0.116, simple_loss=0.1852, pruned_loss=0.02341, over 4786.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2104, pruned_loss=0.03218, over 892836.54 frames.], batch size: 17, lr: 2.00e-04 2022-05-07 00:28:09,387 INFO [train.py:715] (0/8) Epoch 11, batch 550, loss[loss=0.1282, simple_loss=0.2039, pruned_loss=0.02631, over 4827.00 frames.], tot_loss[loss=0.137, simple_loss=0.2104, pruned_loss=0.03179, over 909806.54 frames.], batch size: 26, lr: 2.00e-04 2022-05-07 00:28:50,702 INFO [train.py:715] (0/8) Epoch 11, batch 600, loss[loss=0.1635, simple_loss=0.2324, pruned_loss=0.04733, over 4983.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2108, pruned_loss=0.03225, over 923598.06 frames.], batch size: 24, lr: 2.00e-04 2022-05-07 00:29:32,047 INFO [train.py:715] (0/8) Epoch 11, batch 650, loss[loss=0.1439, simple_loss=0.2164, pruned_loss=0.03573, over 4965.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2112, pruned_loss=0.03272, over 934478.75 frames.], batch size: 15, lr: 2.00e-04 2022-05-07 00:30:13,300 INFO [train.py:715] (0/8) Epoch 11, batch 700, loss[loss=0.129, simple_loss=0.1947, pruned_loss=0.03161, over 4707.00 frames.], tot_loss[loss=0.139, simple_loss=0.2115, pruned_loss=0.03325, over 942926.43 frames.], batch size: 15, lr: 2.00e-04 2022-05-07 00:30:54,877 INFO [train.py:715] (0/8) Epoch 11, batch 750, loss[loss=0.1579, simple_loss=0.2243, pruned_loss=0.04578, over 4772.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2116, pruned_loss=0.03325, over 948546.69 frames.], batch size: 14, lr: 2.00e-04 2022-05-07 00:31:36,032 
INFO [train.py:715] (0/8) Epoch 11, batch 800, loss[loss=0.1489, simple_loss=0.213, pruned_loss=0.0424, over 4770.00 frames.], tot_loss[loss=0.1392, simple_loss=0.212, pruned_loss=0.03322, over 954453.23 frames.], batch size: 18, lr: 2.00e-04 2022-05-07 00:32:16,761 INFO [train.py:715] (0/8) Epoch 11, batch 850, loss[loss=0.1339, simple_loss=0.2189, pruned_loss=0.02445, over 4786.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2118, pruned_loss=0.03284, over 958476.91 frames.], batch size: 18, lr: 2.00e-04 2022-05-07 00:32:58,360 INFO [train.py:715] (0/8) Epoch 11, batch 900, loss[loss=0.1234, simple_loss=0.199, pruned_loss=0.02394, over 4775.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2117, pruned_loss=0.03292, over 961944.77 frames.], batch size: 18, lr: 2.00e-04 2022-05-07 00:33:38,985 INFO [train.py:715] (0/8) Epoch 11, batch 950, loss[loss=0.1242, simple_loss=0.2041, pruned_loss=0.0221, over 4888.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2117, pruned_loss=0.03276, over 964627.20 frames.], batch size: 16, lr: 2.00e-04 2022-05-07 00:34:19,481 INFO [train.py:715] (0/8) Epoch 11, batch 1000, loss[loss=0.1331, simple_loss=0.2067, pruned_loss=0.02977, over 4781.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2121, pruned_loss=0.03357, over 966533.41 frames.], batch size: 18, lr: 2.00e-04 2022-05-07 00:34:58,893 INFO [train.py:715] (0/8) Epoch 11, batch 1050, loss[loss=0.1443, simple_loss=0.2252, pruned_loss=0.03173, over 4918.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2118, pruned_loss=0.03353, over 968386.27 frames.], batch size: 17, lr: 2.00e-04 2022-05-07 00:35:35,548 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-384000.pt 2022-05-07 00:35:41,046 INFO [train.py:715] (0/8) Epoch 11, batch 1100, loss[loss=0.1251, simple_loss=0.1987, pruned_loss=0.02581, over 4910.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2123, pruned_loss=0.03366, over 968773.83 frames.], batch size: 17, lr: 2.00e-04 2022-05-07 00:36:20,724 INFO [train.py:715] (0/8) Epoch 11, batch 1150, loss[loss=0.1403, simple_loss=0.2113, pruned_loss=0.03467, over 4913.00 frames.], tot_loss[loss=0.14, simple_loss=0.213, pruned_loss=0.0335, over 970158.17 frames.], batch size: 18, lr: 2.00e-04 2022-05-07 00:37:00,329 INFO [train.py:715] (0/8) Epoch 11, batch 1200, loss[loss=0.1642, simple_loss=0.2324, pruned_loss=0.04802, over 4695.00 frames.], tot_loss[loss=0.1399, simple_loss=0.213, pruned_loss=0.03344, over 969759.51 frames.], batch size: 15, lr: 2.00e-04 2022-05-07 00:37:39,162 INFO [train.py:715] (0/8) Epoch 11, batch 1250, loss[loss=0.1327, simple_loss=0.2109, pruned_loss=0.02721, over 4898.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2122, pruned_loss=0.03319, over 970032.01 frames.], batch size: 17, lr: 2.00e-04 2022-05-07 00:38:18,007 INFO [train.py:715] (0/8) Epoch 11, batch 1300, loss[loss=0.1331, simple_loss=0.2118, pruned_loss=0.02723, over 4963.00 frames.], tot_loss[loss=0.1397, simple_loss=0.213, pruned_loss=0.03317, over 970988.54 frames.], batch size: 39, lr: 2.00e-04 2022-05-07 00:38:56,862 INFO [train.py:715] (0/8) Epoch 11, batch 1350, loss[loss=0.1366, simple_loss=0.1881, pruned_loss=0.04256, over 4775.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2123, pruned_loss=0.03316, over 970798.11 frames.], batch size: 18, lr: 2.00e-04 2022-05-07 00:39:35,882 INFO [train.py:715] (0/8) Epoch 11, batch 1400, loss[loss=0.1313, simple_loss=0.1996, pruned_loss=0.03149, over 4815.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2119, 
pruned_loss=0.03295, over 970279.05 frames.], batch size: 25, lr: 2.00e-04 2022-05-07 00:40:14,714 INFO [train.py:715] (0/8) Epoch 11, batch 1450, loss[loss=0.1431, simple_loss=0.2157, pruned_loss=0.03525, over 4857.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2125, pruned_loss=0.03322, over 971365.67 frames.], batch size: 32, lr: 2.00e-04 2022-05-07 00:40:53,347 INFO [train.py:715] (0/8) Epoch 11, batch 1500, loss[loss=0.1254, simple_loss=0.1978, pruned_loss=0.02654, over 4975.00 frames.], tot_loss[loss=0.14, simple_loss=0.2134, pruned_loss=0.03326, over 972571.67 frames.], batch size: 15, lr: 2.00e-04 2022-05-07 00:41:31,712 INFO [train.py:715] (0/8) Epoch 11, batch 1550, loss[loss=0.1179, simple_loss=0.1909, pruned_loss=0.02244, over 4971.00 frames.], tot_loss[loss=0.139, simple_loss=0.2126, pruned_loss=0.03274, over 972767.07 frames.], batch size: 15, lr: 2.00e-04 2022-05-07 00:42:10,771 INFO [train.py:715] (0/8) Epoch 11, batch 1600, loss[loss=0.1193, simple_loss=0.1892, pruned_loss=0.02476, over 4852.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2134, pruned_loss=0.03339, over 972155.55 frames.], batch size: 13, lr: 2.00e-04 2022-05-07 00:42:49,742 INFO [train.py:715] (0/8) Epoch 11, batch 1650, loss[loss=0.1262, simple_loss=0.1972, pruned_loss=0.02759, over 4953.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2132, pruned_loss=0.03321, over 971990.05 frames.], batch size: 29, lr: 2.00e-04 2022-05-07 00:43:28,107 INFO [train.py:715] (0/8) Epoch 11, batch 1700, loss[loss=0.1352, simple_loss=0.2075, pruned_loss=0.03147, over 4961.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2121, pruned_loss=0.03263, over 972351.71 frames.], batch size: 15, lr: 2.00e-04 2022-05-07 00:44:07,381 INFO [train.py:715] (0/8) Epoch 11, batch 1750, loss[loss=0.1593, simple_loss=0.2414, pruned_loss=0.03857, over 4707.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2116, pruned_loss=0.0326, over 972054.41 frames.], batch size: 15, lr: 2.00e-04 2022-05-07 00:44:46,267 INFO [train.py:715] (0/8) Epoch 11, batch 1800, loss[loss=0.1575, simple_loss=0.2254, pruned_loss=0.04475, over 4857.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2117, pruned_loss=0.03264, over 971424.15 frames.], batch size: 30, lr: 2.00e-04 2022-05-07 00:45:25,305 INFO [train.py:715] (0/8) Epoch 11, batch 1850, loss[loss=0.1101, simple_loss=0.195, pruned_loss=0.01259, over 4825.00 frames.], tot_loss[loss=0.139, simple_loss=0.2122, pruned_loss=0.03293, over 971648.09 frames.], batch size: 25, lr: 2.00e-04 2022-05-07 00:46:04,486 INFO [train.py:715] (0/8) Epoch 11, batch 1900, loss[loss=0.1121, simple_loss=0.1787, pruned_loss=0.02274, over 4882.00 frames.], tot_loss[loss=0.1388, simple_loss=0.212, pruned_loss=0.0328, over 971976.33 frames.], batch size: 32, lr: 2.00e-04 2022-05-07 00:46:43,763 INFO [train.py:715] (0/8) Epoch 11, batch 1950, loss[loss=0.1325, simple_loss=0.2038, pruned_loss=0.0306, over 4918.00 frames.], tot_loss[loss=0.1374, simple_loss=0.21, pruned_loss=0.03243, over 972454.79 frames.], batch size: 23, lr: 2.00e-04 2022-05-07 00:47:23,299 INFO [train.py:715] (0/8) Epoch 11, batch 2000, loss[loss=0.1447, simple_loss=0.2202, pruned_loss=0.03463, over 4795.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2112, pruned_loss=0.03285, over 972522.08 frames.], batch size: 21, lr: 2.00e-04 2022-05-07 00:48:01,929 INFO [train.py:715] (0/8) Epoch 11, batch 2050, loss[loss=0.1377, simple_loss=0.2049, pruned_loss=0.03523, over 4840.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2111, pruned_loss=0.03266, over 972826.14 
frames.], batch size: 15, lr: 2.00e-04 2022-05-07 00:48:41,071 INFO [train.py:715] (0/8) Epoch 11, batch 2100, loss[loss=0.1127, simple_loss=0.1892, pruned_loss=0.01811, over 4702.00 frames.], tot_loss[loss=0.138, simple_loss=0.2111, pruned_loss=0.03251, over 972518.60 frames.], batch size: 15, lr: 2.00e-04 2022-05-07 00:49:20,361 INFO [train.py:715] (0/8) Epoch 11, batch 2150, loss[loss=0.1307, simple_loss=0.1979, pruned_loss=0.0318, over 4753.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2101, pruned_loss=0.0322, over 972656.11 frames.], batch size: 19, lr: 2.00e-04 2022-05-07 00:49:59,563 INFO [train.py:715] (0/8) Epoch 11, batch 2200, loss[loss=0.1153, simple_loss=0.1839, pruned_loss=0.02334, over 4907.00 frames.], tot_loss[loss=0.1371, simple_loss=0.21, pruned_loss=0.03206, over 972094.39 frames.], batch size: 23, lr: 2.00e-04 2022-05-07 00:50:38,223 INFO [train.py:715] (0/8) Epoch 11, batch 2250, loss[loss=0.1362, simple_loss=0.2055, pruned_loss=0.03348, over 4807.00 frames.], tot_loss[loss=0.1383, simple_loss=0.211, pruned_loss=0.03275, over 972294.70 frames.], batch size: 25, lr: 2.00e-04 2022-05-07 00:51:17,281 INFO [train.py:715] (0/8) Epoch 11, batch 2300, loss[loss=0.1553, simple_loss=0.2361, pruned_loss=0.03731, over 4954.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2113, pruned_loss=0.03276, over 972513.30 frames.], batch size: 24, lr: 2.00e-04 2022-05-07 00:51:56,681 INFO [train.py:715] (0/8) Epoch 11, batch 2350, loss[loss=0.1664, simple_loss=0.2432, pruned_loss=0.04485, over 4892.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2112, pruned_loss=0.03276, over 972599.10 frames.], batch size: 39, lr: 2.00e-04 2022-05-07 00:52:35,083 INFO [train.py:715] (0/8) Epoch 11, batch 2400, loss[loss=0.1481, simple_loss=0.226, pruned_loss=0.03508, over 4958.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2115, pruned_loss=0.03278, over 972446.71 frames.], batch size: 21, lr: 2.00e-04 2022-05-07 00:53:14,456 INFO [train.py:715] (0/8) Epoch 11, batch 2450, loss[loss=0.12, simple_loss=0.1962, pruned_loss=0.02187, over 4805.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2114, pruned_loss=0.03263, over 973451.90 frames.], batch size: 24, lr: 2.00e-04 2022-05-07 00:53:54,030 INFO [train.py:715] (0/8) Epoch 11, batch 2500, loss[loss=0.1344, simple_loss=0.2098, pruned_loss=0.02945, over 4980.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2118, pruned_loss=0.03296, over 974401.90 frames.], batch size: 15, lr: 2.00e-04 2022-05-07 00:54:33,182 INFO [train.py:715] (0/8) Epoch 11, batch 2550, loss[loss=0.1977, simple_loss=0.27, pruned_loss=0.06263, over 4825.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2111, pruned_loss=0.03263, over 974809.95 frames.], batch size: 15, lr: 2.00e-04 2022-05-07 00:55:12,420 INFO [train.py:715] (0/8) Epoch 11, batch 2600, loss[loss=0.1565, simple_loss=0.2256, pruned_loss=0.04368, over 4928.00 frames.], tot_loss[loss=0.139, simple_loss=0.212, pruned_loss=0.03296, over 974397.23 frames.], batch size: 23, lr: 2.00e-04 2022-05-07 00:55:51,261 INFO [train.py:715] (0/8) Epoch 11, batch 2650, loss[loss=0.1498, simple_loss=0.2173, pruned_loss=0.0412, over 4825.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2124, pruned_loss=0.0327, over 973878.28 frames.], batch size: 13, lr: 2.00e-04 2022-05-07 00:56:30,345 INFO [train.py:715] (0/8) Epoch 11, batch 2700, loss[loss=0.1214, simple_loss=0.1808, pruned_loss=0.03101, over 4839.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2119, pruned_loss=0.03236, over 973498.75 frames.], batch size: 13, lr: 2.00e-04 
2022-05-07 00:57:09,091 INFO [train.py:715] (0/8) Epoch 11, batch 2750, loss[loss=0.1216, simple_loss=0.1907, pruned_loss=0.02624, over 4856.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2118, pruned_loss=0.03255, over 973084.69 frames.], batch size: 20, lr: 2.00e-04 2022-05-07 00:57:48,074 INFO [train.py:715] (0/8) Epoch 11, batch 2800, loss[loss=0.1282, simple_loss=0.1991, pruned_loss=0.02862, over 4777.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2122, pruned_loss=0.03269, over 973747.99 frames.], batch size: 14, lr: 2.00e-04 2022-05-07 00:58:27,247 INFO [train.py:715] (0/8) Epoch 11, batch 2850, loss[loss=0.1279, simple_loss=0.2003, pruned_loss=0.02775, over 4933.00 frames.], tot_loss[loss=0.1386, simple_loss=0.212, pruned_loss=0.03261, over 974198.38 frames.], batch size: 23, lr: 2.00e-04 2022-05-07 00:59:05,707 INFO [train.py:715] (0/8) Epoch 11, batch 2900, loss[loss=0.1226, simple_loss=0.1945, pruned_loss=0.02537, over 4878.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2114, pruned_loss=0.03239, over 973931.48 frames.], batch size: 32, lr: 2.00e-04 2022-05-07 00:59:45,164 INFO [train.py:715] (0/8) Epoch 11, batch 2950, loss[loss=0.1599, simple_loss=0.2156, pruned_loss=0.05207, over 4859.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2119, pruned_loss=0.03288, over 973060.78 frames.], batch size: 20, lr: 2.00e-04 2022-05-07 01:00:25,028 INFO [train.py:715] (0/8) Epoch 11, batch 3000, loss[loss=0.1663, simple_loss=0.2311, pruned_loss=0.05075, over 4806.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2121, pruned_loss=0.03318, over 973230.59 frames.], batch size: 24, lr: 2.00e-04 2022-05-07 01:00:25,029 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 01:00:34,772 INFO [train.py:742] (0/8) Epoch 11, validation: loss=0.1061, simple_loss=0.1902, pruned_loss=0.01097, over 914524.00 frames. 
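The "Computing validation loss" entries appear every 3000 batches in this stretch and always report the same 914524.00-frame total, i.e. a frame-weighted average over the full dev set with training paused. A minimal sketch of such a pass, assuming a generic PyTorch model and a hypothetical compute_loss helper that returns the summed loss and the number of acoustic frames for a batch (names are illustrative, not the actual icefall functions):

    import torch

    def compute_validation_loss(model, valid_dl, compute_loss):
        """Frame-weighted average loss over the dev loader, gradients disabled."""
        model.eval()
        tot_loss, tot_frames = 0.0, 0.0
        with torch.no_grad():
            for batch in valid_dl:
                # compute_loss is a hypothetical helper: summed loss over the
                # batch plus the number of acoustic frames it covers.
                loss_sum, num_frames = compute_loss(model, batch)
                tot_loss += float(loss_sum)
                tot_frames += float(num_frames)
        model.train()
        return tot_loss / max(tot_frames, 1.0)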
2022-05-07 01:01:14,745 INFO [train.py:715] (0/8) Epoch 11, batch 3050, loss[loss=0.1264, simple_loss=0.193, pruned_loss=0.02993, over 4769.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2125, pruned_loss=0.03316, over 972547.40 frames.], batch size: 18, lr: 2.00e-04 2022-05-07 01:01:54,016 INFO [train.py:715] (0/8) Epoch 11, batch 3100, loss[loss=0.1332, simple_loss=0.204, pruned_loss=0.03113, over 4980.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2122, pruned_loss=0.03307, over 972397.01 frames.], batch size: 14, lr: 2.00e-04 2022-05-07 01:02:34,089 INFO [train.py:715] (0/8) Epoch 11, batch 3150, loss[loss=0.1511, simple_loss=0.2225, pruned_loss=0.03989, over 4970.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2125, pruned_loss=0.0334, over 972403.13 frames.], batch size: 24, lr: 2.00e-04 2022-05-07 01:03:13,124 INFO [train.py:715] (0/8) Epoch 11, batch 3200, loss[loss=0.1607, simple_loss=0.247, pruned_loss=0.03721, over 4815.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2132, pruned_loss=0.03383, over 972353.35 frames.], batch size: 21, lr: 2.00e-04 2022-05-07 01:03:52,800 INFO [train.py:715] (0/8) Epoch 11, batch 3250, loss[loss=0.1578, simple_loss=0.2479, pruned_loss=0.03386, over 4925.00 frames.], tot_loss[loss=0.1409, simple_loss=0.2141, pruned_loss=0.03388, over 973039.23 frames.], batch size: 23, lr: 2.00e-04 2022-05-07 01:04:31,532 INFO [train.py:715] (0/8) Epoch 11, batch 3300, loss[loss=0.106, simple_loss=0.1861, pruned_loss=0.01293, over 4940.00 frames.], tot_loss[loss=0.1403, simple_loss=0.214, pruned_loss=0.03335, over 972705.43 frames.], batch size: 23, lr: 2.00e-04 2022-05-07 01:05:10,790 INFO [train.py:715] (0/8) Epoch 11, batch 3350, loss[loss=0.1265, simple_loss=0.1995, pruned_loss=0.02677, over 4805.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2133, pruned_loss=0.03313, over 972422.22 frames.], batch size: 21, lr: 2.00e-04 2022-05-07 01:05:50,444 INFO [train.py:715] (0/8) Epoch 11, batch 3400, loss[loss=0.1576, simple_loss=0.233, pruned_loss=0.04114, over 4854.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2122, pruned_loss=0.03249, over 972913.18 frames.], batch size: 20, lr: 2.00e-04 2022-05-07 01:06:29,434 INFO [train.py:715] (0/8) Epoch 11, batch 3450, loss[loss=0.1429, simple_loss=0.2156, pruned_loss=0.03506, over 4823.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2128, pruned_loss=0.03293, over 972444.60 frames.], batch size: 27, lr: 2.00e-04 2022-05-07 01:07:08,297 INFO [train.py:715] (0/8) Epoch 11, batch 3500, loss[loss=0.1396, simple_loss=0.2071, pruned_loss=0.03604, over 4974.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2129, pruned_loss=0.03316, over 973567.46 frames.], batch size: 33, lr: 1.99e-04 2022-05-07 01:07:47,577 INFO [train.py:715] (0/8) Epoch 11, batch 3550, loss[loss=0.1127, simple_loss=0.1897, pruned_loss=0.01785, over 4947.00 frames.], tot_loss[loss=0.139, simple_loss=0.2122, pruned_loss=0.03284, over 973304.56 frames.], batch size: 24, lr: 1.99e-04 2022-05-07 01:08:27,194 INFO [train.py:715] (0/8) Epoch 11, batch 3600, loss[loss=0.115, simple_loss=0.1959, pruned_loss=0.01706, over 4819.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2114, pruned_loss=0.03266, over 972707.92 frames.], batch size: 25, lr: 1.99e-04 2022-05-07 01:09:05,514 INFO [train.py:715] (0/8) Epoch 11, batch 3650, loss[loss=0.1381, simple_loss=0.2224, pruned_loss=0.02691, over 4800.00 frames.], tot_loss[loss=0.138, simple_loss=0.211, pruned_loss=0.03247, over 971982.96 frames.], batch size: 25, lr: 1.99e-04 2022-05-07 01:09:45,167 INFO 
[train.py:715] (0/8) Epoch 11, batch 3700, loss[loss=0.1148, simple_loss=0.179, pruned_loss=0.02533, over 4821.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2111, pruned_loss=0.03277, over 972020.56 frames.], batch size: 12, lr: 1.99e-04 2022-05-07 01:10:24,603 INFO [train.py:715] (0/8) Epoch 11, batch 3750, loss[loss=0.1368, simple_loss=0.2013, pruned_loss=0.03619, over 4894.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2113, pruned_loss=0.03241, over 971771.44 frames.], batch size: 19, lr: 1.99e-04 2022-05-07 01:11:03,048 INFO [train.py:715] (0/8) Epoch 11, batch 3800, loss[loss=0.1362, simple_loss=0.2095, pruned_loss=0.03144, over 4756.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2105, pruned_loss=0.03218, over 972165.95 frames.], batch size: 19, lr: 1.99e-04 2022-05-07 01:11:42,114 INFO [train.py:715] (0/8) Epoch 11, batch 3850, loss[loss=0.1386, simple_loss=0.2172, pruned_loss=0.03002, over 4698.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2106, pruned_loss=0.03205, over 972046.00 frames.], batch size: 15, lr: 1.99e-04 2022-05-07 01:12:21,424 INFO [train.py:715] (0/8) Epoch 11, batch 3900, loss[loss=0.1319, simple_loss=0.2074, pruned_loss=0.02817, over 4853.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2107, pruned_loss=0.03238, over 972311.15 frames.], batch size: 20, lr: 1.99e-04 2022-05-07 01:13:01,146 INFO [train.py:715] (0/8) Epoch 11, batch 3950, loss[loss=0.145, simple_loss=0.214, pruned_loss=0.03801, over 4973.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2107, pruned_loss=0.03252, over 972200.87 frames.], batch size: 35, lr: 1.99e-04 2022-05-07 01:13:39,993 INFO [train.py:715] (0/8) Epoch 11, batch 4000, loss[loss=0.1086, simple_loss=0.1761, pruned_loss=0.02056, over 4808.00 frames.], tot_loss[loss=0.1379, simple_loss=0.211, pruned_loss=0.03243, over 972173.67 frames.], batch size: 12, lr: 1.99e-04 2022-05-07 01:14:19,836 INFO [train.py:715] (0/8) Epoch 11, batch 4050, loss[loss=0.1129, simple_loss=0.1857, pruned_loss=0.02006, over 4762.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2111, pruned_loss=0.03231, over 972424.07 frames.], batch size: 16, lr: 1.99e-04 2022-05-07 01:14:59,478 INFO [train.py:715] (0/8) Epoch 11, batch 4100, loss[loss=0.1351, simple_loss=0.2171, pruned_loss=0.02656, over 4834.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2108, pruned_loss=0.03255, over 972974.88 frames.], batch size: 26, lr: 1.99e-04 2022-05-07 01:15:38,031 INFO [train.py:715] (0/8) Epoch 11, batch 4150, loss[loss=0.1504, simple_loss=0.2211, pruned_loss=0.03986, over 4910.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2114, pruned_loss=0.03281, over 973031.06 frames.], batch size: 39, lr: 1.99e-04 2022-05-07 01:16:16,415 INFO [train.py:715] (0/8) Epoch 11, batch 4200, loss[loss=0.1604, simple_loss=0.2279, pruned_loss=0.04649, over 4873.00 frames.], tot_loss[loss=0.138, simple_loss=0.2105, pruned_loss=0.03277, over 973907.33 frames.], batch size: 16, lr: 1.99e-04 2022-05-07 01:16:56,661 INFO [train.py:715] (0/8) Epoch 11, batch 4250, loss[loss=0.1467, simple_loss=0.2259, pruned_loss=0.0337, over 4838.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2106, pruned_loss=0.03294, over 973368.48 frames.], batch size: 15, lr: 1.99e-04 2022-05-07 01:17:36,658 INFO [train.py:715] (0/8) Epoch 11, batch 4300, loss[loss=0.1259, simple_loss=0.2065, pruned_loss=0.0227, over 4959.00 frames.], tot_loss[loss=0.1391, simple_loss=0.212, pruned_loss=0.03312, over 973636.76 frames.], batch size: 24, lr: 1.99e-04 2022-05-07 01:18:15,822 INFO [train.py:715] (0/8) Epoch 11, batch 
4350, loss[loss=0.1198, simple_loss=0.1952, pruned_loss=0.02218, over 4836.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2112, pruned_loss=0.03281, over 973152.88 frames.], batch size: 26, lr: 1.99e-04 2022-05-07 01:18:56,181 INFO [train.py:715] (0/8) Epoch 11, batch 4400, loss[loss=0.117, simple_loss=0.2003, pruned_loss=0.01692, over 4770.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2118, pruned_loss=0.03348, over 972458.36 frames.], batch size: 17, lr: 1.99e-04 2022-05-07 01:19:36,296 INFO [train.py:715] (0/8) Epoch 11, batch 4450, loss[loss=0.1396, simple_loss=0.2216, pruned_loss=0.02878, over 4955.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2119, pruned_loss=0.03332, over 972950.82 frames.], batch size: 21, lr: 1.99e-04 2022-05-07 01:20:15,924 INFO [train.py:715] (0/8) Epoch 11, batch 4500, loss[loss=0.1525, simple_loss=0.2218, pruned_loss=0.04162, over 4859.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2118, pruned_loss=0.03343, over 973865.27 frames.], batch size: 32, lr: 1.99e-04 2022-05-07 01:20:55,942 INFO [train.py:715] (0/8) Epoch 11, batch 4550, loss[loss=0.1244, simple_loss=0.1929, pruned_loss=0.02798, over 4977.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2116, pruned_loss=0.03305, over 973719.84 frames.], batch size: 28, lr: 1.99e-04 2022-05-07 01:21:35,993 INFO [train.py:715] (0/8) Epoch 11, batch 4600, loss[loss=0.1158, simple_loss=0.1952, pruned_loss=0.01825, over 4789.00 frames.], tot_loss[loss=0.14, simple_loss=0.2128, pruned_loss=0.03358, over 973435.96 frames.], batch size: 17, lr: 1.99e-04 2022-05-07 01:22:15,463 INFO [train.py:715] (0/8) Epoch 11, batch 4650, loss[loss=0.152, simple_loss=0.214, pruned_loss=0.04498, over 4812.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2115, pruned_loss=0.03308, over 971821.80 frames.], batch size: 13, lr: 1.99e-04 2022-05-07 01:22:55,180 INFO [train.py:715] (0/8) Epoch 11, batch 4700, loss[loss=0.1397, simple_loss=0.2252, pruned_loss=0.02714, over 4936.00 frames.], tot_loss[loss=0.1381, simple_loss=0.211, pruned_loss=0.03256, over 971430.73 frames.], batch size: 21, lr: 1.99e-04 2022-05-07 01:23:35,349 INFO [train.py:715] (0/8) Epoch 11, batch 4750, loss[loss=0.1284, simple_loss=0.1961, pruned_loss=0.03036, over 4823.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2117, pruned_loss=0.033, over 970982.89 frames.], batch size: 15, lr: 1.99e-04 2022-05-07 01:24:15,521 INFO [train.py:715] (0/8) Epoch 11, batch 4800, loss[loss=0.1289, simple_loss=0.2108, pruned_loss=0.0235, over 4744.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2127, pruned_loss=0.03343, over 971713.67 frames.], batch size: 16, lr: 1.99e-04 2022-05-07 01:24:55,132 INFO [train.py:715] (0/8) Epoch 11, batch 4850, loss[loss=0.1367, simple_loss=0.2112, pruned_loss=0.03113, over 4820.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2122, pruned_loss=0.03333, over 971404.35 frames.], batch size: 25, lr: 1.99e-04 2022-05-07 01:25:34,921 INFO [train.py:715] (0/8) Epoch 11, batch 4900, loss[loss=0.1484, simple_loss=0.2249, pruned_loss=0.03596, over 4783.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2121, pruned_loss=0.03347, over 971929.24 frames.], batch size: 14, lr: 1.99e-04 2022-05-07 01:26:14,639 INFO [train.py:715] (0/8) Epoch 11, batch 4950, loss[loss=0.1546, simple_loss=0.2282, pruned_loss=0.04057, over 4871.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2116, pruned_loss=0.03285, over 972006.05 frames.], batch size: 16, lr: 1.99e-04 2022-05-07 01:26:53,440 INFO [train.py:715] (0/8) Epoch 11, batch 5000, loss[loss=0.144, 
simple_loss=0.2175, pruned_loss=0.0352, over 4964.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2113, pruned_loss=0.03245, over 972742.16 frames.], batch size: 24, lr: 1.99e-04 2022-05-07 01:27:31,883 INFO [train.py:715] (0/8) Epoch 11, batch 5050, loss[loss=0.1503, simple_loss=0.2215, pruned_loss=0.03955, over 4887.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2112, pruned_loss=0.03236, over 973366.63 frames.], batch size: 22, lr: 1.99e-04 2022-05-07 01:28:11,143 INFO [train.py:715] (0/8) Epoch 11, batch 5100, loss[loss=0.1562, simple_loss=0.2282, pruned_loss=0.04207, over 4855.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2108, pruned_loss=0.03242, over 973146.49 frames.], batch size: 30, lr: 1.99e-04 2022-05-07 01:28:50,276 INFO [train.py:715] (0/8) Epoch 11, batch 5150, loss[loss=0.1436, simple_loss=0.2168, pruned_loss=0.03517, over 4871.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2118, pruned_loss=0.03267, over 973522.26 frames.], batch size: 16, lr: 1.99e-04 2022-05-07 01:29:29,203 INFO [train.py:715] (0/8) Epoch 11, batch 5200, loss[loss=0.1189, simple_loss=0.1936, pruned_loss=0.02204, over 4689.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2116, pruned_loss=0.03266, over 972945.84 frames.], batch size: 15, lr: 1.99e-04 2022-05-07 01:30:08,608 INFO [train.py:715] (0/8) Epoch 11, batch 5250, loss[loss=0.1362, simple_loss=0.2107, pruned_loss=0.03085, over 4770.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2113, pruned_loss=0.03263, over 973133.94 frames.], batch size: 18, lr: 1.99e-04 2022-05-07 01:30:48,289 INFO [train.py:715] (0/8) Epoch 11, batch 5300, loss[loss=0.1349, simple_loss=0.2084, pruned_loss=0.03064, over 4899.00 frames.], tot_loss[loss=0.138, simple_loss=0.211, pruned_loss=0.0325, over 972938.62 frames.], batch size: 19, lr: 1.99e-04 2022-05-07 01:31:27,444 INFO [train.py:715] (0/8) Epoch 11, batch 5350, loss[loss=0.1445, simple_loss=0.2148, pruned_loss=0.03705, over 4984.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2106, pruned_loss=0.03218, over 972660.82 frames.], batch size: 28, lr: 1.99e-04 2022-05-07 01:32:06,510 INFO [train.py:715] (0/8) Epoch 11, batch 5400, loss[loss=0.1479, simple_loss=0.2128, pruned_loss=0.04146, over 4915.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2101, pruned_loss=0.03211, over 972367.45 frames.], batch size: 18, lr: 1.99e-04 2022-05-07 01:32:45,898 INFO [train.py:715] (0/8) Epoch 11, batch 5450, loss[loss=0.146, simple_loss=0.2202, pruned_loss=0.03593, over 4857.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2102, pruned_loss=0.03237, over 972317.67 frames.], batch size: 38, lr: 1.99e-04 2022-05-07 01:33:25,398 INFO [train.py:715] (0/8) Epoch 11, batch 5500, loss[loss=0.1706, simple_loss=0.2476, pruned_loss=0.04678, over 4891.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2122, pruned_loss=0.03305, over 972428.13 frames.], batch size: 19, lr: 1.99e-04 2022-05-07 01:34:04,251 INFO [train.py:715] (0/8) Epoch 11, batch 5550, loss[loss=0.132, simple_loss=0.2018, pruned_loss=0.03112, over 4914.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2118, pruned_loss=0.03286, over 973578.07 frames.], batch size: 18, lr: 1.99e-04 2022-05-07 01:34:42,706 INFO [train.py:715] (0/8) Epoch 11, batch 5600, loss[loss=0.1277, simple_loss=0.2015, pruned_loss=0.02693, over 4982.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2115, pruned_loss=0.03279, over 974018.38 frames.], batch size: 25, lr: 1.99e-04 2022-05-07 01:35:22,172 INFO [train.py:715] (0/8) Epoch 11, batch 5650, loss[loss=0.1619, simple_loss=0.2297, 
pruned_loss=0.04707, over 4879.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2128, pruned_loss=0.03352, over 974052.58 frames.], batch size: 16, lr: 1.99e-04 2022-05-07 01:36:01,612 INFO [train.py:715] (0/8) Epoch 11, batch 5700, loss[loss=0.1498, simple_loss=0.2189, pruned_loss=0.04036, over 4894.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2126, pruned_loss=0.03319, over 973774.18 frames.], batch size: 17, lr: 1.99e-04 2022-05-07 01:36:40,401 INFO [train.py:715] (0/8) Epoch 11, batch 5750, loss[loss=0.1338, simple_loss=0.2087, pruned_loss=0.02942, over 4698.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2129, pruned_loss=0.03328, over 973367.46 frames.], batch size: 15, lr: 1.99e-04 2022-05-07 01:37:19,378 INFO [train.py:715] (0/8) Epoch 11, batch 5800, loss[loss=0.1521, simple_loss=0.2192, pruned_loss=0.04251, over 4978.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2124, pruned_loss=0.03289, over 973482.91 frames.], batch size: 35, lr: 1.99e-04 2022-05-07 01:37:58,487 INFO [train.py:715] (0/8) Epoch 11, batch 5850, loss[loss=0.1389, simple_loss=0.219, pruned_loss=0.02935, over 4817.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2119, pruned_loss=0.03256, over 973763.59 frames.], batch size: 24, lr: 1.99e-04 2022-05-07 01:38:37,495 INFO [train.py:715] (0/8) Epoch 11, batch 5900, loss[loss=0.1547, simple_loss=0.2246, pruned_loss=0.04239, over 4832.00 frames.], tot_loss[loss=0.1387, simple_loss=0.212, pruned_loss=0.03267, over 973655.08 frames.], batch size: 15, lr: 1.99e-04 2022-05-07 01:39:16,656 INFO [train.py:715] (0/8) Epoch 11, batch 5950, loss[loss=0.1156, simple_loss=0.1917, pruned_loss=0.0198, over 4791.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2128, pruned_loss=0.03282, over 973569.01 frames.], batch size: 24, lr: 1.99e-04 2022-05-07 01:39:56,446 INFO [train.py:715] (0/8) Epoch 11, batch 6000, loss[loss=0.1463, simple_loss=0.2201, pruned_loss=0.03627, over 4827.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2123, pruned_loss=0.03267, over 973792.45 frames.], batch size: 15, lr: 1.99e-04 2022-05-07 01:39:56,447 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 01:40:06,015 INFO [train.py:742] (0/8) Epoch 11, validation: loss=0.1059, simple_loss=0.1901, pruned_loss=0.01082, over 914524.00 frames. 
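The per-batch loss[...] figures describe a single batch, while the tot_loss[...] figures are running statistics: their frame counts hover around 970,000 to 974,000 instead of growing over the epoch, roughly 200 times the typical ~4,800-frame batch. That is what an exponentially decaying, frame-weighted accumulator with decay 1 - 1/200 would produce, since its frame count settles near batch_frames / (1 - decay). A sketch under that assumption (the decay constant is inferred from the logged frame counts, not stated here):

    class RunningLoss:
        """Exponentially decaying, frame-weighted running loss.

        The decay of 1 - 1/200 is an assumption inferred from the logged
        ~972k-frame totals (about 200x a typical ~4,860-frame batch).
        """

        def __init__(self, decay: float = 1.0 - 1.0 / 200):
            self.decay = decay
            self.loss_sum = 0.0
            self.frames = 0.0

        def update(self, batch_loss_sum: float, batch_frames: float) -> None:
            # Shrink the history, then add the new batch's summed loss and frames.
            self.loss_sum = self.loss_sum * self.decay + batch_loss_sum
            self.frames = self.frames * self.decay + batch_frames

        @property
        def average(self) -> float:
            return self.loss_sum / max(self.frames, 1.0)

At steady state self.frames approaches batch_frames / (1 - decay), i.e. about 4,860 * 200 = 972,000 for batches of the size logged here, which matches the totals in the tot_loss entries.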
2022-05-07 01:40:45,574 INFO [train.py:715] (0/8) Epoch 11, batch 6050, loss[loss=0.1146, simple_loss=0.1841, pruned_loss=0.02259, over 4863.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2118, pruned_loss=0.03255, over 974115.41 frames.], batch size: 20, lr: 1.99e-04 2022-05-07 01:41:24,989 INFO [train.py:715] (0/8) Epoch 11, batch 6100, loss[loss=0.1464, simple_loss=0.2201, pruned_loss=0.03628, over 4900.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2121, pruned_loss=0.03243, over 973427.36 frames.], batch size: 19, lr: 1.99e-04 2022-05-07 01:42:03,735 INFO [train.py:715] (0/8) Epoch 11, batch 6150, loss[loss=0.1524, simple_loss=0.2157, pruned_loss=0.04455, over 4993.00 frames.], tot_loss[loss=0.139, simple_loss=0.2121, pruned_loss=0.03298, over 973492.50 frames.], batch size: 14, lr: 1.99e-04 2022-05-07 01:42:43,199 INFO [train.py:715] (0/8) Epoch 11, batch 6200, loss[loss=0.1183, simple_loss=0.1943, pruned_loss=0.02118, over 4765.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2115, pruned_loss=0.03266, over 973581.54 frames.], batch size: 19, lr: 1.99e-04 2022-05-07 01:43:22,227 INFO [train.py:715] (0/8) Epoch 11, batch 6250, loss[loss=0.1496, simple_loss=0.2257, pruned_loss=0.03672, over 4822.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2111, pruned_loss=0.03257, over 973646.44 frames.], batch size: 13, lr: 1.99e-04 2022-05-07 01:44:01,020 INFO [train.py:715] (0/8) Epoch 11, batch 6300, loss[loss=0.1302, simple_loss=0.2078, pruned_loss=0.02629, over 4887.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2112, pruned_loss=0.03275, over 973970.13 frames.], batch size: 22, lr: 1.99e-04 2022-05-07 01:44:39,691 INFO [train.py:715] (0/8) Epoch 11, batch 6350, loss[loss=0.1375, simple_loss=0.2133, pruned_loss=0.03088, over 4766.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2115, pruned_loss=0.03279, over 973627.35 frames.], batch size: 18, lr: 1.99e-04 2022-05-07 01:45:20,269 INFO [train.py:715] (0/8) Epoch 11, batch 6400, loss[loss=0.143, simple_loss=0.2172, pruned_loss=0.03437, over 4932.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2121, pruned_loss=0.03311, over 974491.46 frames.], batch size: 23, lr: 1.99e-04 2022-05-07 01:45:59,613 INFO [train.py:715] (0/8) Epoch 11, batch 6450, loss[loss=0.1246, simple_loss=0.1933, pruned_loss=0.0279, over 4860.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2126, pruned_loss=0.03316, over 974370.36 frames.], batch size: 32, lr: 1.99e-04 2022-05-07 01:46:38,691 INFO [train.py:715] (0/8) Epoch 11, batch 6500, loss[loss=0.1343, simple_loss=0.2013, pruned_loss=0.03363, over 4852.00 frames.], tot_loss[loss=0.139, simple_loss=0.2122, pruned_loss=0.03295, over 974659.92 frames.], batch size: 32, lr: 1.99e-04 2022-05-07 01:47:18,036 INFO [train.py:715] (0/8) Epoch 11, batch 6550, loss[loss=0.1102, simple_loss=0.1814, pruned_loss=0.01948, over 4849.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2116, pruned_loss=0.03247, over 974563.38 frames.], batch size: 13, lr: 1.99e-04 2022-05-07 01:47:58,217 INFO [train.py:715] (0/8) Epoch 11, batch 6600, loss[loss=0.1488, simple_loss=0.2171, pruned_loss=0.04026, over 4759.00 frames.], tot_loss[loss=0.138, simple_loss=0.2109, pruned_loss=0.03252, over 973384.02 frames.], batch size: 19, lr: 1.99e-04 2022-05-07 01:48:38,345 INFO [train.py:715] (0/8) Epoch 11, batch 6650, loss[loss=0.1454, simple_loss=0.2148, pruned_loss=0.03803, over 4906.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2116, pruned_loss=0.03303, over 973556.33 frames.], batch size: 17, lr: 1.99e-04 2022-05-07 01:49:17,551 INFO 
[train.py:715] (0/8) Epoch 11, batch 6700, loss[loss=0.1578, simple_loss=0.2312, pruned_loss=0.04226, over 4808.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2126, pruned_loss=0.03314, over 973037.64 frames.], batch size: 26, lr: 1.99e-04 2022-05-07 01:49:57,810 INFO [train.py:715] (0/8) Epoch 11, batch 6750, loss[loss=0.1256, simple_loss=0.1993, pruned_loss=0.02594, over 4990.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2126, pruned_loss=0.03341, over 973235.86 frames.], batch size: 20, lr: 1.99e-04 2022-05-07 01:50:37,608 INFO [train.py:715] (0/8) Epoch 11, batch 6800, loss[loss=0.1377, simple_loss=0.216, pruned_loss=0.02972, over 4905.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2123, pruned_loss=0.03308, over 972351.84 frames.], batch size: 22, lr: 1.99e-04 2022-05-07 01:51:16,474 INFO [train.py:715] (0/8) Epoch 11, batch 6850, loss[loss=0.1272, simple_loss=0.1935, pruned_loss=0.03043, over 4864.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2124, pruned_loss=0.03318, over 974023.78 frames.], batch size: 32, lr: 1.99e-04 2022-05-07 01:51:55,544 INFO [train.py:715] (0/8) Epoch 11, batch 6900, loss[loss=0.1289, simple_loss=0.1998, pruned_loss=0.029, over 4866.00 frames.], tot_loss[loss=0.1389, simple_loss=0.212, pruned_loss=0.03294, over 973011.45 frames.], batch size: 39, lr: 1.99e-04 2022-05-07 01:52:34,234 INFO [train.py:715] (0/8) Epoch 11, batch 6950, loss[loss=0.1499, simple_loss=0.2128, pruned_loss=0.04353, over 4978.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2125, pruned_loss=0.03299, over 973107.89 frames.], batch size: 33, lr: 1.99e-04 2022-05-07 01:53:13,692 INFO [train.py:715] (0/8) Epoch 11, batch 7000, loss[loss=0.1536, simple_loss=0.2236, pruned_loss=0.04181, over 4907.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2126, pruned_loss=0.0329, over 972505.45 frames.], batch size: 19, lr: 1.99e-04 2022-05-07 01:53:52,255 INFO [train.py:715] (0/8) Epoch 11, batch 7050, loss[loss=0.1579, simple_loss=0.2115, pruned_loss=0.05218, over 4874.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2124, pruned_loss=0.03272, over 972279.21 frames.], batch size: 16, lr: 1.99e-04 2022-05-07 01:54:31,696 INFO [train.py:715] (0/8) Epoch 11, batch 7100, loss[loss=0.1425, simple_loss=0.226, pruned_loss=0.02946, over 4796.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2121, pruned_loss=0.03247, over 972971.06 frames.], batch size: 21, lr: 1.99e-04 2022-05-07 01:55:10,746 INFO [train.py:715] (0/8) Epoch 11, batch 7150, loss[loss=0.1145, simple_loss=0.1887, pruned_loss=0.02009, over 4793.00 frames.], tot_loss[loss=0.1395, simple_loss=0.213, pruned_loss=0.03299, over 972154.95 frames.], batch size: 18, lr: 1.99e-04 2022-05-07 01:55:49,509 INFO [train.py:715] (0/8) Epoch 11, batch 7200, loss[loss=0.1161, simple_loss=0.1908, pruned_loss=0.02071, over 4808.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2132, pruned_loss=0.03323, over 972293.24 frames.], batch size: 25, lr: 1.99e-04 2022-05-07 01:56:28,451 INFO [train.py:715] (0/8) Epoch 11, batch 7250, loss[loss=0.1264, simple_loss=0.2018, pruned_loss=0.02545, over 4852.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2131, pruned_loss=0.03336, over 972791.02 frames.], batch size: 20, lr: 1.99e-04 2022-05-07 01:57:07,427 INFO [train.py:715] (0/8) Epoch 11, batch 7300, loss[loss=0.1154, simple_loss=0.1929, pruned_loss=0.01893, over 4844.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2134, pruned_loss=0.03349, over 972179.19 frames.], batch size: 13, lr: 1.99e-04 2022-05-07 01:57:46,540 INFO [train.py:715] (0/8) Epoch 11, batch 
7350, loss[loss=0.1271, simple_loss=0.1972, pruned_loss=0.0285, over 4807.00 frames.], tot_loss[loss=0.1397, simple_loss=0.213, pruned_loss=0.03323, over 972451.74 frames.], batch size: 12, lr: 1.99e-04 2022-05-07 01:58:25,306 INFO [train.py:715] (0/8) Epoch 11, batch 7400, loss[loss=0.1227, simple_loss=0.2006, pruned_loss=0.02238, over 4816.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2126, pruned_loss=0.03349, over 972419.66 frames.], batch size: 27, lr: 1.98e-04 2022-05-07 01:59:04,699 INFO [train.py:715] (0/8) Epoch 11, batch 7450, loss[loss=0.1485, simple_loss=0.2115, pruned_loss=0.04273, over 4944.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2133, pruned_loss=0.03398, over 972137.38 frames.], batch size: 35, lr: 1.98e-04 2022-05-07 01:59:43,840 INFO [train.py:715] (0/8) Epoch 11, batch 7500, loss[loss=0.1381, simple_loss=0.2181, pruned_loss=0.02901, over 4834.00 frames.], tot_loss[loss=0.1407, simple_loss=0.2132, pruned_loss=0.03409, over 973394.33 frames.], batch size: 26, lr: 1.98e-04 2022-05-07 02:00:23,092 INFO [train.py:715] (0/8) Epoch 11, batch 7550, loss[loss=0.1219, simple_loss=0.2001, pruned_loss=0.02184, over 4927.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2128, pruned_loss=0.03377, over 973323.59 frames.], batch size: 29, lr: 1.98e-04 2022-05-07 02:01:02,843 INFO [train.py:715] (0/8) Epoch 11, batch 7600, loss[loss=0.1393, simple_loss=0.2085, pruned_loss=0.03504, over 4964.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2125, pruned_loss=0.03348, over 973400.48 frames.], batch size: 39, lr: 1.98e-04 2022-05-07 02:01:42,513 INFO [train.py:715] (0/8) Epoch 11, batch 7650, loss[loss=0.16, simple_loss=0.2315, pruned_loss=0.04419, over 4946.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2119, pruned_loss=0.03317, over 973735.45 frames.], batch size: 39, lr: 1.98e-04 2022-05-07 02:02:22,053 INFO [train.py:715] (0/8) Epoch 11, batch 7700, loss[loss=0.1504, simple_loss=0.2292, pruned_loss=0.0358, over 4870.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2115, pruned_loss=0.03296, over 973783.71 frames.], batch size: 22, lr: 1.98e-04 2022-05-07 02:03:01,230 INFO [train.py:715] (0/8) Epoch 11, batch 7750, loss[loss=0.1454, simple_loss=0.2155, pruned_loss=0.0377, over 4877.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2115, pruned_loss=0.03308, over 973100.41 frames.], batch size: 20, lr: 1.98e-04 2022-05-07 02:03:40,565 INFO [train.py:715] (0/8) Epoch 11, batch 7800, loss[loss=0.1266, simple_loss=0.191, pruned_loss=0.03107, over 4834.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2119, pruned_loss=0.03323, over 973077.46 frames.], batch size: 13, lr: 1.98e-04 2022-05-07 02:04:19,852 INFO [train.py:715] (0/8) Epoch 11, batch 7850, loss[loss=0.1305, simple_loss=0.2119, pruned_loss=0.02451, over 4932.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2118, pruned_loss=0.03337, over 972455.68 frames.], batch size: 23, lr: 1.98e-04 2022-05-07 02:04:58,991 INFO [train.py:715] (0/8) Epoch 11, batch 7900, loss[loss=0.1544, simple_loss=0.2215, pruned_loss=0.04367, over 4860.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2124, pruned_loss=0.03359, over 973573.07 frames.], batch size: 32, lr: 1.98e-04 2022-05-07 02:05:37,730 INFO [train.py:715] (0/8) Epoch 11, batch 7950, loss[loss=0.1284, simple_loss=0.1878, pruned_loss=0.03449, over 4752.00 frames.], tot_loss[loss=0.1393, simple_loss=0.212, pruned_loss=0.03334, over 973531.38 frames.], batch size: 16, lr: 1.98e-04 2022-05-07 02:06:18,361 INFO [train.py:715] (0/8) Epoch 11, batch 8000, loss[loss=0.1351, 
simple_loss=0.2123, pruned_loss=0.029, over 4767.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2119, pruned_loss=0.03319, over 972337.25 frames.], batch size: 14, lr: 1.98e-04 2022-05-07 02:06:57,622 INFO [train.py:715] (0/8) Epoch 11, batch 8050, loss[loss=0.1218, simple_loss=0.2009, pruned_loss=0.02136, over 4991.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2129, pruned_loss=0.03332, over 971997.07 frames.], batch size: 28, lr: 1.98e-04 2022-05-07 02:07:37,874 INFO [train.py:715] (0/8) Epoch 11, batch 8100, loss[loss=0.1516, simple_loss=0.216, pruned_loss=0.04359, over 4944.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2129, pruned_loss=0.03338, over 971730.11 frames.], batch size: 21, lr: 1.98e-04 2022-05-07 02:08:17,869 INFO [train.py:715] (0/8) Epoch 11, batch 8150, loss[loss=0.1519, simple_loss=0.225, pruned_loss=0.0394, over 4867.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2128, pruned_loss=0.03306, over 971300.32 frames.], batch size: 16, lr: 1.98e-04 2022-05-07 02:08:57,399 INFO [train.py:715] (0/8) Epoch 11, batch 8200, loss[loss=0.1201, simple_loss=0.1884, pruned_loss=0.02588, over 4834.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2129, pruned_loss=0.03311, over 971573.19 frames.], batch size: 13, lr: 1.98e-04 2022-05-07 02:09:36,729 INFO [train.py:715] (0/8) Epoch 11, batch 8250, loss[loss=0.1671, simple_loss=0.2296, pruned_loss=0.05226, over 4976.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2129, pruned_loss=0.03291, over 971544.75 frames.], batch size: 35, lr: 1.98e-04 2022-05-07 02:10:15,061 INFO [train.py:715] (0/8) Epoch 11, batch 8300, loss[loss=0.1519, simple_loss=0.2175, pruned_loss=0.04318, over 4837.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2127, pruned_loss=0.03282, over 971676.11 frames.], batch size: 30, lr: 1.98e-04 2022-05-07 02:10:54,958 INFO [train.py:715] (0/8) Epoch 11, batch 8350, loss[loss=0.1181, simple_loss=0.1906, pruned_loss=0.02287, over 4825.00 frames.], tot_loss[loss=0.139, simple_loss=0.2123, pruned_loss=0.03285, over 971959.41 frames.], batch size: 25, lr: 1.98e-04 2022-05-07 02:11:34,528 INFO [train.py:715] (0/8) Epoch 11, batch 8400, loss[loss=0.1258, simple_loss=0.2056, pruned_loss=0.02298, over 4784.00 frames.], tot_loss[loss=0.138, simple_loss=0.2117, pruned_loss=0.03217, over 972755.46 frames.], batch size: 18, lr: 1.98e-04 2022-05-07 02:12:13,502 INFO [train.py:715] (0/8) Epoch 11, batch 8450, loss[loss=0.1307, simple_loss=0.206, pruned_loss=0.02766, over 4920.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2117, pruned_loss=0.032, over 973190.36 frames.], batch size: 23, lr: 1.98e-04 2022-05-07 02:12:52,197 INFO [train.py:715] (0/8) Epoch 11, batch 8500, loss[loss=0.1324, simple_loss=0.2127, pruned_loss=0.02611, over 4957.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2129, pruned_loss=0.03284, over 973204.86 frames.], batch size: 29, lr: 1.98e-04 2022-05-07 02:13:32,006 INFO [train.py:715] (0/8) Epoch 11, batch 8550, loss[loss=0.1013, simple_loss=0.1736, pruned_loss=0.01456, over 4845.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2118, pruned_loss=0.03276, over 973206.90 frames.], batch size: 13, lr: 1.98e-04 2022-05-07 02:14:11,210 INFO [train.py:715] (0/8) Epoch 11, batch 8600, loss[loss=0.1361, simple_loss=0.2126, pruned_loss=0.02978, over 4707.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2114, pruned_loss=0.0325, over 973267.07 frames.], batch size: 15, lr: 1.98e-04 2022-05-07 02:14:49,541 INFO [train.py:715] (0/8) Epoch 11, batch 8650, loss[loss=0.1399, simple_loss=0.2129, pruned_loss=0.03346, 
over 4781.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2127, pruned_loss=0.03337, over 972621.59 frames.], batch size: 18, lr: 1.98e-04 2022-05-07 02:15:29,402 INFO [train.py:715] (0/8) Epoch 11, batch 8700, loss[loss=0.1174, simple_loss=0.1932, pruned_loss=0.02078, over 4905.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2123, pruned_loss=0.03308, over 973467.11 frames.], batch size: 23, lr: 1.98e-04 2022-05-07 02:16:08,720 INFO [train.py:715] (0/8) Epoch 11, batch 8750, loss[loss=0.1397, simple_loss=0.2217, pruned_loss=0.02891, over 4891.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2122, pruned_loss=0.03329, over 973856.42 frames.], batch size: 19, lr: 1.98e-04 2022-05-07 02:16:47,702 INFO [train.py:715] (0/8) Epoch 11, batch 8800, loss[loss=0.1231, simple_loss=0.199, pruned_loss=0.02358, over 4766.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2119, pruned_loss=0.03308, over 973858.30 frames.], batch size: 17, lr: 1.98e-04 2022-05-07 02:17:26,836 INFO [train.py:715] (0/8) Epoch 11, batch 8850, loss[loss=0.1423, simple_loss=0.2203, pruned_loss=0.03214, over 4813.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2126, pruned_loss=0.03341, over 974287.17 frames.], batch size: 26, lr: 1.98e-04 2022-05-07 02:18:06,536 INFO [train.py:715] (0/8) Epoch 11, batch 8900, loss[loss=0.1396, simple_loss=0.223, pruned_loss=0.0281, over 4876.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2129, pruned_loss=0.03344, over 973655.59 frames.], batch size: 22, lr: 1.98e-04 2022-05-07 02:18:46,174 INFO [train.py:715] (0/8) Epoch 11, batch 8950, loss[loss=0.1247, simple_loss=0.1984, pruned_loss=0.02543, over 4795.00 frames.], tot_loss[loss=0.14, simple_loss=0.2133, pruned_loss=0.03338, over 973519.04 frames.], batch size: 21, lr: 1.98e-04 2022-05-07 02:19:25,276 INFO [train.py:715] (0/8) Epoch 11, batch 9000, loss[loss=0.1591, simple_loss=0.2326, pruned_loss=0.04277, over 4985.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2131, pruned_loss=0.03322, over 973630.19 frames.], batch size: 25, lr: 1.98e-04 2022-05-07 02:19:25,278 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 02:19:34,857 INFO [train.py:742] (0/8) Epoch 11, validation: loss=0.1061, simple_loss=0.1903, pruned_loss=0.011, over 914524.00 frames. 
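Every entry in this log follows the same fixed format, so the loss curves and the slowly decaying learning rate (2.00e-04 earlier in the epoch, 1.98e-04 by batch 9000 here) can be recovered from the text directly. A small parser sketch written against the line format shown above (the regular expression is an assumption tied to this exact layout and may need adjusting for other recipes):

    import re

    # Matches the per-batch entries, e.g.
    # "Epoch 11, batch 9000, loss[...], tot_loss[loss=0.1398, ...], batch size: 25, lr: 1.98e-04"
    PATTERN = re.compile(
        r"Epoch (\d+), batch (\d+), .*?"
        r"tot_loss\[loss=([\d.]+), simple_loss=([\d.]+), pruned_loss=([\d.]+).*?"
        r"lr: ([\d.e-]+)",
        re.DOTALL,
    )

    def parse_log(text: str):
        """Yield (epoch, batch, tot_loss, simple_loss, pruned_loss, lr) per entry."""
        for m in PATTERN.finditer(text):
            epoch, batch = int(m.group(1)), int(m.group(2))
            tot, simple, pruned = (float(m.group(i)) for i in (3, 4, 5))
            yield epoch, batch, tot, simple, pruned, float(m.group(6))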
2022-05-07 02:20:13,750 INFO [train.py:715] (0/8) Epoch 11, batch 9050, loss[loss=0.1319, simple_loss=0.2094, pruned_loss=0.02718, over 4843.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2137, pruned_loss=0.03347, over 973101.19 frames.], batch size: 30, lr: 1.98e-04 2022-05-07 02:20:50,472 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-392000.pt 2022-05-07 02:20:55,918 INFO [train.py:715] (0/8) Epoch 11, batch 9100, loss[loss=0.1571, simple_loss=0.2484, pruned_loss=0.03285, over 4868.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2131, pruned_loss=0.0333, over 973250.39 frames.], batch size: 16, lr: 1.98e-04 2022-05-07 02:21:35,542 INFO [train.py:715] (0/8) Epoch 11, batch 9150, loss[loss=0.1532, simple_loss=0.2304, pruned_loss=0.03805, over 4990.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2126, pruned_loss=0.03337, over 972439.91 frames.], batch size: 20, lr: 1.98e-04 2022-05-07 02:22:15,054 INFO [train.py:715] (0/8) Epoch 11, batch 9200, loss[loss=0.1415, simple_loss=0.2101, pruned_loss=0.03652, over 4952.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2128, pruned_loss=0.03337, over 971513.52 frames.], batch size: 24, lr: 1.98e-04 2022-05-07 02:22:54,636 INFO [train.py:715] (0/8) Epoch 11, batch 9250, loss[loss=0.1362, simple_loss=0.2232, pruned_loss=0.02463, over 4789.00 frames.], tot_loss[loss=0.139, simple_loss=0.2124, pruned_loss=0.03277, over 971510.05 frames.], batch size: 14, lr: 1.98e-04 2022-05-07 02:23:33,871 INFO [train.py:715] (0/8) Epoch 11, batch 9300, loss[loss=0.1311, simple_loss=0.1987, pruned_loss=0.03175, over 4767.00 frames.], tot_loss[loss=0.139, simple_loss=0.2126, pruned_loss=0.03269, over 971868.97 frames.], batch size: 12, lr: 1.98e-04 2022-05-07 02:24:12,710 INFO [train.py:715] (0/8) Epoch 11, batch 9350, loss[loss=0.1307, simple_loss=0.2066, pruned_loss=0.02738, over 4862.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2121, pruned_loss=0.03283, over 971477.63 frames.], batch size: 32, lr: 1.98e-04 2022-05-07 02:24:51,485 INFO [train.py:715] (0/8) Epoch 11, batch 9400, loss[loss=0.1521, simple_loss=0.2259, pruned_loss=0.03915, over 4760.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2124, pruned_loss=0.03285, over 971503.30 frames.], batch size: 19, lr: 1.98e-04 2022-05-07 02:25:31,001 INFO [train.py:715] (0/8) Epoch 11, batch 9450, loss[loss=0.1716, simple_loss=0.2436, pruned_loss=0.04985, over 4930.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2121, pruned_loss=0.0327, over 971908.07 frames.], batch size: 23, lr: 1.98e-04 2022-05-07 02:26:10,039 INFO [train.py:715] (0/8) Epoch 11, batch 9500, loss[loss=0.1295, simple_loss=0.207, pruned_loss=0.02601, over 4805.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2131, pruned_loss=0.03329, over 971941.72 frames.], batch size: 25, lr: 1.98e-04 2022-05-07 02:26:48,572 INFO [train.py:715] (0/8) Epoch 11, batch 9550, loss[loss=0.1279, simple_loss=0.2008, pruned_loss=0.02747, over 4834.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2123, pruned_loss=0.03292, over 971503.89 frames.], batch size: 32, lr: 1.98e-04 2022-05-07 02:27:28,236 INFO [train.py:715] (0/8) Epoch 11, batch 9600, loss[loss=0.1411, simple_loss=0.2084, pruned_loss=0.03694, over 4975.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2113, pruned_loss=0.03276, over 971880.91 frames.], batch size: 25, lr: 1.98e-04 2022-05-07 02:28:07,060 INFO [train.py:715] (0/8) Epoch 11, batch 9650, loss[loss=0.119, simple_loss=0.1859, pruned_loss=0.02609, over 4778.00 frames.], tot_loss[loss=0.1381, 
simple_loss=0.2108, pruned_loss=0.03272, over 971844.22 frames.], batch size: 14, lr: 1.98e-04 2022-05-07 02:28:45,584 INFO [train.py:715] (0/8) Epoch 11, batch 9700, loss[loss=0.1444, simple_loss=0.23, pruned_loss=0.0294, over 4800.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2116, pruned_loss=0.03287, over 971873.84 frames.], batch size: 21, lr: 1.98e-04 2022-05-07 02:29:24,587 INFO [train.py:715] (0/8) Epoch 11, batch 9750, loss[loss=0.1108, simple_loss=0.1848, pruned_loss=0.01838, over 4990.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2104, pruned_loss=0.03242, over 971012.02 frames.], batch size: 14, lr: 1.98e-04 2022-05-07 02:30:03,697 INFO [train.py:715] (0/8) Epoch 11, batch 9800, loss[loss=0.1446, simple_loss=0.2188, pruned_loss=0.03519, over 4923.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2106, pruned_loss=0.03244, over 971117.95 frames.], batch size: 17, lr: 1.98e-04 2022-05-07 02:30:43,325 INFO [train.py:715] (0/8) Epoch 11, batch 9850, loss[loss=0.1216, simple_loss=0.19, pruned_loss=0.02654, over 4816.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2109, pruned_loss=0.03223, over 971434.78 frames.], batch size: 25, lr: 1.98e-04 2022-05-07 02:31:22,285 INFO [train.py:715] (0/8) Epoch 11, batch 9900, loss[loss=0.1165, simple_loss=0.1933, pruned_loss=0.01983, over 4802.00 frames.], tot_loss[loss=0.1377, simple_loss=0.211, pruned_loss=0.03219, over 971934.46 frames.], batch size: 21, lr: 1.98e-04 2022-05-07 02:32:02,524 INFO [train.py:715] (0/8) Epoch 11, batch 9950, loss[loss=0.1129, simple_loss=0.1825, pruned_loss=0.02169, over 4897.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2108, pruned_loss=0.03219, over 972806.97 frames.], batch size: 19, lr: 1.98e-04 2022-05-07 02:32:41,856 INFO [train.py:715] (0/8) Epoch 11, batch 10000, loss[loss=0.1254, simple_loss=0.1992, pruned_loss=0.02581, over 4987.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2106, pruned_loss=0.03213, over 972527.13 frames.], batch size: 25, lr: 1.98e-04 2022-05-07 02:33:21,580 INFO [train.py:715] (0/8) Epoch 11, batch 10050, loss[loss=0.127, simple_loss=0.1979, pruned_loss=0.02803, over 4828.00 frames.], tot_loss[loss=0.1381, simple_loss=0.211, pruned_loss=0.03259, over 973239.70 frames.], batch size: 27, lr: 1.98e-04 2022-05-07 02:33:59,721 INFO [train.py:715] (0/8) Epoch 11, batch 10100, loss[loss=0.1233, simple_loss=0.2083, pruned_loss=0.0191, over 4832.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2118, pruned_loss=0.03296, over 972335.32 frames.], batch size: 15, lr: 1.98e-04 2022-05-07 02:34:38,757 INFO [train.py:715] (0/8) Epoch 11, batch 10150, loss[loss=0.1197, simple_loss=0.199, pruned_loss=0.02019, over 4859.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2117, pruned_loss=0.03303, over 972138.32 frames.], batch size: 20, lr: 1.98e-04 2022-05-07 02:35:17,190 INFO [train.py:715] (0/8) Epoch 11, batch 10200, loss[loss=0.1209, simple_loss=0.2047, pruned_loss=0.0186, over 4931.00 frames.], tot_loss[loss=0.1393, simple_loss=0.212, pruned_loss=0.03328, over 972693.88 frames.], batch size: 18, lr: 1.98e-04 2022-05-07 02:35:55,358 INFO [train.py:715] (0/8) Epoch 11, batch 10250, loss[loss=0.1235, simple_loss=0.1993, pruned_loss=0.02386, over 4790.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2122, pruned_loss=0.03307, over 972138.88 frames.], batch size: 18, lr: 1.98e-04 2022-05-07 02:36:34,757 INFO [train.py:715] (0/8) Epoch 11, batch 10300, loss[loss=0.1392, simple_loss=0.2099, pruned_loss=0.03428, over 4989.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2119, 
pruned_loss=0.03325, over 972636.87 frames.], batch size: 15, lr: 1.98e-04 2022-05-07 02:37:13,485 INFO [train.py:715] (0/8) Epoch 11, batch 10350, loss[loss=0.1208, simple_loss=0.2021, pruned_loss=0.01978, over 4872.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2121, pruned_loss=0.03329, over 973259.30 frames.], batch size: 16, lr: 1.98e-04 2022-05-07 02:37:52,304 INFO [train.py:715] (0/8) Epoch 11, batch 10400, loss[loss=0.1453, simple_loss=0.211, pruned_loss=0.03981, over 4846.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2122, pruned_loss=0.03354, over 972155.59 frames.], batch size: 32, lr: 1.98e-04 2022-05-07 02:38:30,787 INFO [train.py:715] (0/8) Epoch 11, batch 10450, loss[loss=0.1157, simple_loss=0.1848, pruned_loss=0.02329, over 4873.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2117, pruned_loss=0.03301, over 971133.14 frames.], batch size: 22, lr: 1.98e-04 2022-05-07 02:39:09,426 INFO [train.py:715] (0/8) Epoch 11, batch 10500, loss[loss=0.1245, simple_loss=0.1899, pruned_loss=0.02959, over 4847.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2113, pruned_loss=0.03294, over 971472.96 frames.], batch size: 32, lr: 1.98e-04 2022-05-07 02:39:48,487 INFO [train.py:715] (0/8) Epoch 11, batch 10550, loss[loss=0.1308, simple_loss=0.2003, pruned_loss=0.03064, over 4947.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2107, pruned_loss=0.03256, over 971415.29 frames.], batch size: 35, lr: 1.98e-04 2022-05-07 02:40:27,835 INFO [train.py:715] (0/8) Epoch 11, batch 10600, loss[loss=0.1282, simple_loss=0.1974, pruned_loss=0.02952, over 4836.00 frames.], tot_loss[loss=0.1375, simple_loss=0.21, pruned_loss=0.03255, over 971820.53 frames.], batch size: 12, lr: 1.98e-04 2022-05-07 02:41:06,624 INFO [train.py:715] (0/8) Epoch 11, batch 10650, loss[loss=0.1385, simple_loss=0.208, pruned_loss=0.03453, over 4913.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2092, pruned_loss=0.03217, over 971747.39 frames.], batch size: 18, lr: 1.98e-04 2022-05-07 02:41:45,851 INFO [train.py:715] (0/8) Epoch 11, batch 10700, loss[loss=0.1587, simple_loss=0.2351, pruned_loss=0.04118, over 4780.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2088, pruned_loss=0.03182, over 971786.99 frames.], batch size: 14, lr: 1.98e-04 2022-05-07 02:42:25,054 INFO [train.py:715] (0/8) Epoch 11, batch 10750, loss[loss=0.1499, simple_loss=0.2372, pruned_loss=0.03127, over 4817.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2102, pruned_loss=0.03244, over 972948.59 frames.], batch size: 12, lr: 1.98e-04 2022-05-07 02:43:03,972 INFO [train.py:715] (0/8) Epoch 11, batch 10800, loss[loss=0.1528, simple_loss=0.2242, pruned_loss=0.04066, over 4848.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2108, pruned_loss=0.03248, over 972558.37 frames.], batch size: 30, lr: 1.98e-04 2022-05-07 02:43:43,678 INFO [train.py:715] (0/8) Epoch 11, batch 10850, loss[loss=0.1406, simple_loss=0.202, pruned_loss=0.03958, over 4790.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2109, pruned_loss=0.03274, over 972950.79 frames.], batch size: 18, lr: 1.98e-04 2022-05-07 02:44:23,471 INFO [train.py:715] (0/8) Epoch 11, batch 10900, loss[loss=0.1376, simple_loss=0.2087, pruned_loss=0.03323, over 4971.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2112, pruned_loss=0.03275, over 973336.79 frames.], batch size: 39, lr: 1.98e-04 2022-05-07 02:45:02,831 INFO [train.py:715] (0/8) Epoch 11, batch 10950, loss[loss=0.1432, simple_loss=0.2143, pruned_loss=0.03606, over 4816.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2107, pruned_loss=0.03225, 
over 973754.91 frames.], batch size: 25, lr: 1.98e-04 2022-05-07 02:45:42,046 INFO [train.py:715] (0/8) Epoch 11, batch 11000, loss[loss=0.1675, simple_loss=0.2253, pruned_loss=0.05488, over 4981.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2109, pruned_loss=0.03228, over 973843.90 frames.], batch size: 31, lr: 1.98e-04 2022-05-07 02:46:21,451 INFO [train.py:715] (0/8) Epoch 11, batch 11050, loss[loss=0.1472, simple_loss=0.2134, pruned_loss=0.04052, over 4911.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2116, pruned_loss=0.0327, over 974002.80 frames.], batch size: 18, lr: 1.98e-04 2022-05-07 02:47:00,457 INFO [train.py:715] (0/8) Epoch 11, batch 11100, loss[loss=0.1306, simple_loss=0.2081, pruned_loss=0.02659, over 4871.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2115, pruned_loss=0.03273, over 972718.73 frames.], batch size: 22, lr: 1.98e-04 2022-05-07 02:47:39,069 INFO [train.py:715] (0/8) Epoch 11, batch 11150, loss[loss=0.1401, simple_loss=0.2071, pruned_loss=0.03655, over 4858.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2113, pruned_loss=0.03248, over 972475.04 frames.], batch size: 30, lr: 1.98e-04 2022-05-07 02:48:18,472 INFO [train.py:715] (0/8) Epoch 11, batch 11200, loss[loss=0.1695, simple_loss=0.2374, pruned_loss=0.05086, over 4851.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2115, pruned_loss=0.03248, over 972973.26 frames.], batch size: 32, lr: 1.98e-04 2022-05-07 02:48:57,589 INFO [train.py:715] (0/8) Epoch 11, batch 11250, loss[loss=0.1432, simple_loss=0.2232, pruned_loss=0.03162, over 4850.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2107, pruned_loss=0.03204, over 972548.57 frames.], batch size: 30, lr: 1.98e-04 2022-05-07 02:49:35,929 INFO [train.py:715] (0/8) Epoch 11, batch 11300, loss[loss=0.1388, simple_loss=0.2116, pruned_loss=0.03305, over 4835.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2099, pruned_loss=0.03151, over 973722.37 frames.], batch size: 13, lr: 1.98e-04 2022-05-07 02:50:14,827 INFO [train.py:715] (0/8) Epoch 11, batch 11350, loss[loss=0.1306, simple_loss=0.21, pruned_loss=0.02557, over 4954.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2096, pruned_loss=0.03132, over 973899.64 frames.], batch size: 21, lr: 1.97e-04 2022-05-07 02:50:54,372 INFO [train.py:715] (0/8) Epoch 11, batch 11400, loss[loss=0.1364, simple_loss=0.2099, pruned_loss=0.03145, over 4768.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2098, pruned_loss=0.03156, over 973348.68 frames.], batch size: 18, lr: 1.97e-04 2022-05-07 02:51:32,951 INFO [train.py:715] (0/8) Epoch 11, batch 11450, loss[loss=0.1285, simple_loss=0.2077, pruned_loss=0.02466, over 4878.00 frames.], tot_loss[loss=0.137, simple_loss=0.2102, pruned_loss=0.03184, over 973763.92 frames.], batch size: 16, lr: 1.97e-04 2022-05-07 02:52:11,282 INFO [train.py:715] (0/8) Epoch 11, batch 11500, loss[loss=0.1184, simple_loss=0.1846, pruned_loss=0.02609, over 4838.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2095, pruned_loss=0.03136, over 972906.09 frames.], batch size: 13, lr: 1.97e-04 2022-05-07 02:52:50,111 INFO [train.py:715] (0/8) Epoch 11, batch 11550, loss[loss=0.1583, simple_loss=0.2286, pruned_loss=0.04405, over 4861.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2092, pruned_loss=0.03125, over 973043.35 frames.], batch size: 32, lr: 1.97e-04 2022-05-07 02:53:29,301 INFO [train.py:715] (0/8) Epoch 11, batch 11600, loss[loss=0.1491, simple_loss=0.2212, pruned_loss=0.03851, over 4929.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2098, pruned_loss=0.03172, over 972632.93 
frames.], batch size: 18, lr: 1.97e-04 2022-05-07 02:54:08,233 INFO [train.py:715] (0/8) Epoch 11, batch 11650, loss[loss=0.1229, simple_loss=0.2012, pruned_loss=0.02228, over 4836.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2096, pruned_loss=0.03167, over 972208.82 frames.], batch size: 26, lr: 1.97e-04 2022-05-07 02:54:46,492 INFO [train.py:715] (0/8) Epoch 11, batch 11700, loss[loss=0.1182, simple_loss=0.1876, pruned_loss=0.02444, over 4792.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2101, pruned_loss=0.03189, over 972307.78 frames.], batch size: 12, lr: 1.97e-04 2022-05-07 02:55:25,409 INFO [train.py:715] (0/8) Epoch 11, batch 11750, loss[loss=0.1189, simple_loss=0.1912, pruned_loss=0.02331, over 4846.00 frames.], tot_loss[loss=0.1379, simple_loss=0.211, pruned_loss=0.03243, over 973309.36 frames.], batch size: 12, lr: 1.97e-04 2022-05-07 02:56:04,624 INFO [train.py:715] (0/8) Epoch 11, batch 11800, loss[loss=0.1414, simple_loss=0.2097, pruned_loss=0.03653, over 4964.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2111, pruned_loss=0.03264, over 972318.93 frames.], batch size: 14, lr: 1.97e-04 2022-05-07 02:56:43,714 INFO [train.py:715] (0/8) Epoch 11, batch 11850, loss[loss=0.1171, simple_loss=0.1963, pruned_loss=0.019, over 4933.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2103, pruned_loss=0.03194, over 972332.10 frames.], batch size: 21, lr: 1.97e-04 2022-05-07 02:57:23,413 INFO [train.py:715] (0/8) Epoch 11, batch 11900, loss[loss=0.1564, simple_loss=0.2392, pruned_loss=0.03685, over 4760.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2104, pruned_loss=0.03201, over 972083.46 frames.], batch size: 16, lr: 1.97e-04 2022-05-07 02:58:03,751 INFO [train.py:715] (0/8) Epoch 11, batch 11950, loss[loss=0.1311, simple_loss=0.1994, pruned_loss=0.03139, over 4746.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2102, pruned_loss=0.03204, over 970973.92 frames.], batch size: 16, lr: 1.97e-04 2022-05-07 02:58:43,547 INFO [train.py:715] (0/8) Epoch 11, batch 12000, loss[loss=0.1123, simple_loss=0.1894, pruned_loss=0.01767, over 4797.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2109, pruned_loss=0.0324, over 971006.90 frames.], batch size: 24, lr: 1.97e-04 2022-05-07 02:58:43,548 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 02:58:53,276 INFO [train.py:742] (0/8) Epoch 11, validation: loss=0.1061, simple_loss=0.1902, pruned_loss=0.01096, over 914524.00 frames. 
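The checkpoint.py entry a little further up, "Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-392000.pt", shows that checkpoints are written periodically during the epoch and named by a global batch counter rather than the within-epoch index (392000 vs. the local batch 9050). A minimal sketch of that pattern, assuming a plain dict-style checkpoint (the actual icefall helper stores additional training state):

    from pathlib import Path
    import torch

    def maybe_save_checkpoint(model, optimizer, batch_idx_train: int,
                              exp_dir: Path, every_n: int) -> None:
        """Save a checkpoint named by the global batch index every `every_n` batches.

        `every_n` stands in for whatever interval produced checkpoint-392000.pt;
        it is left as a parameter because the value is not stated in this
        stretch of the log.
        """
        if batch_idx_train == 0 or batch_idx_train % every_n != 0:
            return
        ckpt = {
            "model": model.state_dict(),
            "optimizer": optimizer.state_dict(),
            "batch_idx_train": batch_idx_train,
        }
        torch.save(ckpt, exp_dir / f"checkpoint-{batch_idx_train}.pt")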
2022-05-07 02:59:33,211 INFO [train.py:715] (0/8) Epoch 11, batch 12050, loss[loss=0.1271, simple_loss=0.2029, pruned_loss=0.02571, over 4833.00 frames.], tot_loss[loss=0.137, simple_loss=0.2101, pruned_loss=0.03196, over 971957.38 frames.], batch size: 25, lr: 1.97e-04 2022-05-07 03:00:12,648 INFO [train.py:715] (0/8) Epoch 11, batch 12100, loss[loss=0.1339, simple_loss=0.1966, pruned_loss=0.03559, over 4967.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2112, pruned_loss=0.03249, over 972313.53 frames.], batch size: 35, lr: 1.97e-04 2022-05-07 03:00:51,871 INFO [train.py:715] (0/8) Epoch 11, batch 12150, loss[loss=0.1707, simple_loss=0.2512, pruned_loss=0.04509, over 4784.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2117, pruned_loss=0.03278, over 972107.94 frames.], batch size: 14, lr: 1.97e-04 2022-05-07 03:01:31,401 INFO [train.py:715] (0/8) Epoch 11, batch 12200, loss[loss=0.1277, simple_loss=0.197, pruned_loss=0.02916, over 4808.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2116, pruned_loss=0.03294, over 972125.76 frames.], batch size: 26, lr: 1.97e-04 2022-05-07 03:02:09,902 INFO [train.py:715] (0/8) Epoch 11, batch 12250, loss[loss=0.1447, simple_loss=0.2189, pruned_loss=0.03523, over 4967.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2113, pruned_loss=0.03296, over 971095.98 frames.], batch size: 24, lr: 1.97e-04 2022-05-07 03:02:49,515 INFO [train.py:715] (0/8) Epoch 11, batch 12300, loss[loss=0.1333, simple_loss=0.2164, pruned_loss=0.02505, over 4965.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2117, pruned_loss=0.03308, over 970875.33 frames.], batch size: 15, lr: 1.97e-04 2022-05-07 03:03:29,334 INFO [train.py:715] (0/8) Epoch 11, batch 12350, loss[loss=0.1253, simple_loss=0.1998, pruned_loss=0.02537, over 4991.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2121, pruned_loss=0.03322, over 971077.78 frames.], batch size: 25, lr: 1.97e-04 2022-05-07 03:04:08,692 INFO [train.py:715] (0/8) Epoch 11, batch 12400, loss[loss=0.1253, simple_loss=0.2042, pruned_loss=0.02321, over 4881.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2105, pruned_loss=0.0323, over 971395.92 frames.], batch size: 22, lr: 1.97e-04 2022-05-07 03:04:46,932 INFO [train.py:715] (0/8) Epoch 11, batch 12450, loss[loss=0.1277, simple_loss=0.2016, pruned_loss=0.02688, over 4810.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2104, pruned_loss=0.0322, over 972226.11 frames.], batch size: 27, lr: 1.97e-04 2022-05-07 03:05:26,165 INFO [train.py:715] (0/8) Epoch 11, batch 12500, loss[loss=0.1305, simple_loss=0.2026, pruned_loss=0.02915, over 4877.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2103, pruned_loss=0.0321, over 972971.71 frames.], batch size: 22, lr: 1.97e-04 2022-05-07 03:06:05,434 INFO [train.py:715] (0/8) Epoch 11, batch 12550, loss[loss=0.1031, simple_loss=0.1709, pruned_loss=0.01759, over 4832.00 frames.], tot_loss[loss=0.1371, simple_loss=0.21, pruned_loss=0.03213, over 973013.09 frames.], batch size: 13, lr: 1.97e-04 2022-05-07 03:06:44,091 INFO [train.py:715] (0/8) Epoch 11, batch 12600, loss[loss=0.1216, simple_loss=0.2093, pruned_loss=0.01694, over 4945.00 frames.], tot_loss[loss=0.135, simple_loss=0.2084, pruned_loss=0.03084, over 973002.86 frames.], batch size: 24, lr: 1.97e-04 2022-05-07 03:07:23,079 INFO [train.py:715] (0/8) Epoch 11, batch 12650, loss[loss=0.1223, simple_loss=0.1925, pruned_loss=0.02605, over 4915.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2094, pruned_loss=0.03161, over 973868.06 frames.], batch size: 23, lr: 1.97e-04 2022-05-07 03:08:02,192 
INFO [train.py:715] (0/8) Epoch 11, batch 12700, loss[loss=0.1201, simple_loss=0.1921, pruned_loss=0.02407, over 4953.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2101, pruned_loss=0.03225, over 973643.61 frames.], batch size: 35, lr: 1.97e-04 2022-05-07 03:08:40,888 INFO [train.py:715] (0/8) Epoch 11, batch 12750, loss[loss=0.1195, simple_loss=0.1939, pruned_loss=0.02252, over 4770.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2109, pruned_loss=0.03288, over 973611.32 frames.], batch size: 18, lr: 1.97e-04 2022-05-07 03:09:19,299 INFO [train.py:715] (0/8) Epoch 11, batch 12800, loss[loss=0.1264, simple_loss=0.2056, pruned_loss=0.02361, over 4813.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2108, pruned_loss=0.03235, over 973387.86 frames.], batch size: 25, lr: 1.97e-04 2022-05-07 03:09:58,875 INFO [train.py:715] (0/8) Epoch 11, batch 12850, loss[loss=0.1095, simple_loss=0.1952, pruned_loss=0.0119, over 4933.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2103, pruned_loss=0.03173, over 974146.90 frames.], batch size: 29, lr: 1.97e-04 2022-05-07 03:10:38,288 INFO [train.py:715] (0/8) Epoch 11, batch 12900, loss[loss=0.1092, simple_loss=0.1855, pruned_loss=0.01649, over 4946.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2103, pruned_loss=0.03194, over 974491.00 frames.], batch size: 21, lr: 1.97e-04 2022-05-07 03:11:17,930 INFO [train.py:715] (0/8) Epoch 11, batch 12950, loss[loss=0.1156, simple_loss=0.1916, pruned_loss=0.01984, over 4874.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2109, pruned_loss=0.03233, over 974748.13 frames.], batch size: 22, lr: 1.97e-04 2022-05-07 03:11:56,709 INFO [train.py:715] (0/8) Epoch 11, batch 13000, loss[loss=0.1505, simple_loss=0.2211, pruned_loss=0.03991, over 4941.00 frames.], tot_loss[loss=0.1381, simple_loss=0.211, pruned_loss=0.03255, over 974797.13 frames.], batch size: 35, lr: 1.97e-04 2022-05-07 03:12:36,379 INFO [train.py:715] (0/8) Epoch 11, batch 13050, loss[loss=0.1282, simple_loss=0.2081, pruned_loss=0.02417, over 4923.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2118, pruned_loss=0.03319, over 973452.24 frames.], batch size: 18, lr: 1.97e-04 2022-05-07 03:13:15,473 INFO [train.py:715] (0/8) Epoch 11, batch 13100, loss[loss=0.1326, simple_loss=0.205, pruned_loss=0.03009, over 4943.00 frames.], tot_loss[loss=0.139, simple_loss=0.212, pruned_loss=0.03301, over 972483.36 frames.], batch size: 21, lr: 1.97e-04 2022-05-07 03:13:53,586 INFO [train.py:715] (0/8) Epoch 11, batch 13150, loss[loss=0.1054, simple_loss=0.1856, pruned_loss=0.01259, over 4955.00 frames.], tot_loss[loss=0.1389, simple_loss=0.212, pruned_loss=0.03288, over 973222.74 frames.], batch size: 21, lr: 1.97e-04 2022-05-07 03:14:32,697 INFO [train.py:715] (0/8) Epoch 11, batch 13200, loss[loss=0.1156, simple_loss=0.1874, pruned_loss=0.0219, over 4937.00 frames.], tot_loss[loss=0.139, simple_loss=0.2121, pruned_loss=0.03293, over 973663.00 frames.], batch size: 21, lr: 1.97e-04 2022-05-07 03:15:11,055 INFO [train.py:715] (0/8) Epoch 11, batch 13250, loss[loss=0.1167, simple_loss=0.1879, pruned_loss=0.02276, over 4846.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2118, pruned_loss=0.03306, over 973370.93 frames.], batch size: 15, lr: 1.97e-04 2022-05-07 03:15:50,451 INFO [train.py:715] (0/8) Epoch 11, batch 13300, loss[loss=0.1581, simple_loss=0.2253, pruned_loss=0.04546, over 4982.00 frames.], tot_loss[loss=0.1387, simple_loss=0.212, pruned_loss=0.03275, over 972769.97 frames.], batch size: 14, lr: 1.97e-04 2022-05-07 03:16:29,350 INFO [train.py:715] (0/8) 
Epoch 11, batch 13350, loss[loss=0.1374, simple_loss=0.2147, pruned_loss=0.03006, over 4864.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2123, pruned_loss=0.03324, over 973545.38 frames.], batch size: 20, lr: 1.97e-04 2022-05-07 03:17:08,597 INFO [train.py:715] (0/8) Epoch 11, batch 13400, loss[loss=0.1282, simple_loss=0.2035, pruned_loss=0.02644, over 4818.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2121, pruned_loss=0.03261, over 972755.83 frames.], batch size: 26, lr: 1.97e-04 2022-05-07 03:17:47,307 INFO [train.py:715] (0/8) Epoch 11, batch 13450, loss[loss=0.1601, simple_loss=0.2247, pruned_loss=0.04772, over 4908.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2122, pruned_loss=0.033, over 973225.69 frames.], batch size: 17, lr: 1.97e-04 2022-05-07 03:18:26,308 INFO [train.py:715] (0/8) Epoch 11, batch 13500, loss[loss=0.1834, simple_loss=0.2386, pruned_loss=0.06408, over 4797.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2122, pruned_loss=0.0331, over 972897.39 frames.], batch size: 14, lr: 1.97e-04 2022-05-07 03:19:05,023 INFO [train.py:715] (0/8) Epoch 11, batch 13550, loss[loss=0.1439, simple_loss=0.2161, pruned_loss=0.03583, over 4780.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2124, pruned_loss=0.0329, over 972846.99 frames.], batch size: 17, lr: 1.97e-04 2022-05-07 03:19:44,146 INFO [train.py:715] (0/8) Epoch 11, batch 13600, loss[loss=0.1328, simple_loss=0.2024, pruned_loss=0.03159, over 4875.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2129, pruned_loss=0.03308, over 972584.11 frames.], batch size: 22, lr: 1.97e-04 2022-05-07 03:20:22,535 INFO [train.py:715] (0/8) Epoch 11, batch 13650, loss[loss=0.1209, simple_loss=0.1891, pruned_loss=0.0263, over 4896.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2116, pruned_loss=0.03252, over 973199.86 frames.], batch size: 17, lr: 1.97e-04 2022-05-07 03:21:00,719 INFO [train.py:715] (0/8) Epoch 11, batch 13700, loss[loss=0.119, simple_loss=0.1843, pruned_loss=0.02682, over 4746.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2123, pruned_loss=0.03259, over 973377.06 frames.], batch size: 12, lr: 1.97e-04 2022-05-07 03:21:39,797 INFO [train.py:715] (0/8) Epoch 11, batch 13750, loss[loss=0.1615, simple_loss=0.2271, pruned_loss=0.04796, over 4976.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2131, pruned_loss=0.033, over 973518.91 frames.], batch size: 15, lr: 1.97e-04 2022-05-07 03:22:19,173 INFO [train.py:715] (0/8) Epoch 11, batch 13800, loss[loss=0.1241, simple_loss=0.1992, pruned_loss=0.02451, over 4877.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2129, pruned_loss=0.03308, over 973211.57 frames.], batch size: 16, lr: 1.97e-04 2022-05-07 03:22:57,643 INFO [train.py:715] (0/8) Epoch 11, batch 13850, loss[loss=0.132, simple_loss=0.2084, pruned_loss=0.02778, over 4711.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2128, pruned_loss=0.03327, over 972895.51 frames.], batch size: 15, lr: 1.97e-04 2022-05-07 03:23:37,053 INFO [train.py:715] (0/8) Epoch 11, batch 13900, loss[loss=0.1236, simple_loss=0.1891, pruned_loss=0.02907, over 4852.00 frames.], tot_loss[loss=0.1393, simple_loss=0.212, pruned_loss=0.03324, over 972638.99 frames.], batch size: 13, lr: 1.97e-04 2022-05-07 03:24:15,990 INFO [train.py:715] (0/8) Epoch 11, batch 13950, loss[loss=0.1278, simple_loss=0.1975, pruned_loss=0.02907, over 4842.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2116, pruned_loss=0.03315, over 972087.34 frames.], batch size: 30, lr: 1.97e-04 2022-05-07 03:24:55,180 INFO [train.py:715] (0/8) Epoch 11, batch 14000, 
loss[loss=0.1332, simple_loss=0.2042, pruned_loss=0.03113, over 4931.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2125, pruned_loss=0.03355, over 972753.16 frames.], batch size: 18, lr: 1.97e-04 2022-05-07 03:25:34,610 INFO [train.py:715] (0/8) Epoch 11, batch 14050, loss[loss=0.1396, simple_loss=0.2085, pruned_loss=0.03538, over 4821.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2117, pruned_loss=0.03344, over 972427.07 frames.], batch size: 15, lr: 1.97e-04 2022-05-07 03:26:14,331 INFO [train.py:715] (0/8) Epoch 11, batch 14100, loss[loss=0.135, simple_loss=0.2119, pruned_loss=0.02909, over 4922.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2116, pruned_loss=0.03284, over 972876.25 frames.], batch size: 23, lr: 1.97e-04 2022-05-07 03:26:53,601 INFO [train.py:715] (0/8) Epoch 11, batch 14150, loss[loss=0.1468, simple_loss=0.2156, pruned_loss=0.03894, over 4916.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2108, pruned_loss=0.03242, over 973413.50 frames.], batch size: 35, lr: 1.97e-04 2022-05-07 03:27:32,868 INFO [train.py:715] (0/8) Epoch 11, batch 14200, loss[loss=0.17, simple_loss=0.2429, pruned_loss=0.04849, over 4919.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2106, pruned_loss=0.03219, over 973684.88 frames.], batch size: 19, lr: 1.97e-04 2022-05-07 03:28:13,019 INFO [train.py:715] (0/8) Epoch 11, batch 14250, loss[loss=0.1408, simple_loss=0.2144, pruned_loss=0.0336, over 4685.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2108, pruned_loss=0.0321, over 973560.52 frames.], batch size: 15, lr: 1.97e-04 2022-05-07 03:28:53,022 INFO [train.py:715] (0/8) Epoch 11, batch 14300, loss[loss=0.1299, simple_loss=0.1999, pruned_loss=0.02999, over 4964.00 frames.], tot_loss[loss=0.138, simple_loss=0.2111, pruned_loss=0.03247, over 973775.40 frames.], batch size: 35, lr: 1.97e-04 2022-05-07 03:29:32,289 INFO [train.py:715] (0/8) Epoch 11, batch 14350, loss[loss=0.1609, simple_loss=0.2312, pruned_loss=0.04529, over 4976.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2124, pruned_loss=0.03316, over 973726.97 frames.], batch size: 39, lr: 1.97e-04 2022-05-07 03:30:12,238 INFO [train.py:715] (0/8) Epoch 11, batch 14400, loss[loss=0.1185, simple_loss=0.1965, pruned_loss=0.02024, over 4861.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2114, pruned_loss=0.03251, over 973881.85 frames.], batch size: 32, lr: 1.97e-04 2022-05-07 03:30:52,511 INFO [train.py:715] (0/8) Epoch 11, batch 14450, loss[loss=0.1619, simple_loss=0.2394, pruned_loss=0.04215, over 4976.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2128, pruned_loss=0.03313, over 973336.46 frames.], batch size: 24, lr: 1.97e-04 2022-05-07 03:31:31,923 INFO [train.py:715] (0/8) Epoch 11, batch 14500, loss[loss=0.1335, simple_loss=0.2121, pruned_loss=0.0274, over 4936.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2122, pruned_loss=0.03272, over 972590.44 frames.], batch size: 18, lr: 1.97e-04 2022-05-07 03:32:11,424 INFO [train.py:715] (0/8) Epoch 11, batch 14550, loss[loss=0.1186, simple_loss=0.1914, pruned_loss=0.02295, over 4967.00 frames.], tot_loss[loss=0.1386, simple_loss=0.212, pruned_loss=0.03262, over 973058.15 frames.], batch size: 15, lr: 1.97e-04 2022-05-07 03:32:51,264 INFO [train.py:715] (0/8) Epoch 11, batch 14600, loss[loss=0.1441, simple_loss=0.219, pruned_loss=0.03463, over 4905.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2122, pruned_loss=0.03261, over 972717.61 frames.], batch size: 19, lr: 1.97e-04 2022-05-07 03:33:30,637 INFO [train.py:715] (0/8) Epoch 11, batch 14650, loss[loss=0.1363, 
simple_loss=0.2147, pruned_loss=0.02892, over 4842.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2109, pruned_loss=0.03227, over 971880.65 frames.], batch size: 26, lr: 1.97e-04 2022-05-07 03:34:09,056 INFO [train.py:715] (0/8) Epoch 11, batch 14700, loss[loss=0.1326, simple_loss=0.2139, pruned_loss=0.02564, over 4896.00 frames.], tot_loss[loss=0.1369, simple_loss=0.21, pruned_loss=0.03187, over 971766.49 frames.], batch size: 17, lr: 1.97e-04 2022-05-07 03:34:48,550 INFO [train.py:715] (0/8) Epoch 11, batch 14750, loss[loss=0.1372, simple_loss=0.1987, pruned_loss=0.03784, over 4741.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2095, pruned_loss=0.03195, over 971838.07 frames.], batch size: 16, lr: 1.97e-04 2022-05-07 03:35:27,681 INFO [train.py:715] (0/8) Epoch 11, batch 14800, loss[loss=0.1624, simple_loss=0.2348, pruned_loss=0.04502, over 4907.00 frames.], tot_loss[loss=0.137, simple_loss=0.21, pruned_loss=0.032, over 972375.57 frames.], batch size: 22, lr: 1.97e-04 2022-05-07 03:36:06,358 INFO [train.py:715] (0/8) Epoch 11, batch 14850, loss[loss=0.1426, simple_loss=0.2261, pruned_loss=0.02956, over 4975.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2109, pruned_loss=0.03228, over 972226.20 frames.], batch size: 15, lr: 1.97e-04 2022-05-07 03:36:45,860 INFO [train.py:715] (0/8) Epoch 11, batch 14900, loss[loss=0.1448, simple_loss=0.2235, pruned_loss=0.03309, over 4780.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2124, pruned_loss=0.03243, over 971517.83 frames.], batch size: 18, lr: 1.97e-04 2022-05-07 03:37:25,085 INFO [train.py:715] (0/8) Epoch 11, batch 14950, loss[loss=0.1528, simple_loss=0.2238, pruned_loss=0.04088, over 4737.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2126, pruned_loss=0.03281, over 970854.32 frames.], batch size: 16, lr: 1.97e-04 2022-05-07 03:38:03,590 INFO [train.py:715] (0/8) Epoch 11, batch 15000, loss[loss=0.1501, simple_loss=0.2157, pruned_loss=0.04224, over 4891.00 frames.], tot_loss[loss=0.1386, simple_loss=0.212, pruned_loss=0.03261, over 970820.23 frames.], batch size: 32, lr: 1.97e-04 2022-05-07 03:38:03,591 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 03:38:13,229 INFO [train.py:742] (0/8) Epoch 11, validation: loss=0.106, simple_loss=0.1901, pruned_loss=0.01091, over 914524.00 frames. 
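Every per-batch entry above follows the same fixed layout (timestamp, epoch and batch index, per-batch loss[...], running tot_loss[...], batch size, lr), so the loss trajectory can be recovered from the raw log with a small parser. Below is a minimal sketch of one way to do that; it assumes only the field names visible in the entries above, and that the log has been saved to a local text file ("train-log.txt" is just a placeholder name, not part of the run).

import re

# Matches per-batch entries such as:
#   "Epoch 11, batch 15000, loss[loss=0.1501, ...], tot_loss[loss=0.1386, ...], batch size: 32, lr: 1.97e-04"
BATCH_RE = re.compile(
    r"Epoch (\d+), batch (\d+), loss\[loss=([\d.]+).*?"
    r"tot_loss\[loss=([\d.]+).*?batch size: (\d+), lr: ([\d.e-]+)"
)

def parse_batches(path):
    """Yield (epoch, batch, loss, tot_loss, batch_size, lr) tuples from a saved training log."""
    with open(path) as f:
        text = f.read()
    for m in BATCH_RE.finditer(text):
        epoch, batch, loss, tot, bsz, lr = m.groups()
        yield int(epoch), int(batch), float(loss), float(tot), int(bsz), float(lr)

if __name__ == "__main__":
    rows = list(parse_batches("train-log.txt"))  # placeholder file name
    if rows:
        epoch, batch, loss, tot, bsz, lr = rows[-1]
        print(f"parsed {len(rows)} entries; last: epoch {epoch}, batch {batch}, tot_loss {tot:.4f}, lr {lr:.2e}")
    else:
        print("no batch entries found")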
2022-05-07 03:38:52,000 INFO [train.py:715] (0/8) Epoch 11, batch 15050, loss[loss=0.1713, simple_loss=0.247, pruned_loss=0.04777, over 4865.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2122, pruned_loss=0.03281, over 971083.95 frames.], batch size: 20, lr: 1.97e-04 2022-05-07 03:39:30,959 INFO [train.py:715] (0/8) Epoch 11, batch 15100, loss[loss=0.1108, simple_loss=0.1835, pruned_loss=0.01906, over 4980.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2116, pruned_loss=0.03281, over 971699.78 frames.], batch size: 14, lr: 1.97e-04 2022-05-07 03:40:10,669 INFO [train.py:715] (0/8) Epoch 11, batch 15150, loss[loss=0.1524, simple_loss=0.2299, pruned_loss=0.03741, over 4815.00 frames.], tot_loss[loss=0.1388, simple_loss=0.212, pruned_loss=0.03281, over 971856.81 frames.], batch size: 27, lr: 1.97e-04 2022-05-07 03:40:49,839 INFO [train.py:715] (0/8) Epoch 11, batch 15200, loss[loss=0.1325, simple_loss=0.209, pruned_loss=0.02803, over 4760.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2119, pruned_loss=0.0327, over 972144.70 frames.], batch size: 16, lr: 1.97e-04 2022-05-07 03:41:28,406 INFO [train.py:715] (0/8) Epoch 11, batch 15250, loss[loss=0.1252, simple_loss=0.1977, pruned_loss=0.02639, over 4856.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2126, pruned_loss=0.03314, over 972338.26 frames.], batch size: 38, lr: 1.97e-04 2022-05-07 03:42:07,670 INFO [train.py:715] (0/8) Epoch 11, batch 15300, loss[loss=0.1352, simple_loss=0.2076, pruned_loss=0.0314, over 4759.00 frames.], tot_loss[loss=0.139, simple_loss=0.2123, pruned_loss=0.03282, over 971955.09 frames.], batch size: 19, lr: 1.97e-04 2022-05-07 03:42:46,992 INFO [train.py:715] (0/8) Epoch 11, batch 15350, loss[loss=0.1914, simple_loss=0.2508, pruned_loss=0.06597, over 4983.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2134, pruned_loss=0.03323, over 972109.66 frames.], batch size: 39, lr: 1.96e-04 2022-05-07 03:43:25,863 INFO [train.py:715] (0/8) Epoch 11, batch 15400, loss[loss=0.1558, simple_loss=0.2285, pruned_loss=0.04153, over 4916.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2133, pruned_loss=0.03326, over 971727.34 frames.], batch size: 17, lr: 1.96e-04 2022-05-07 03:44:04,607 INFO [train.py:715] (0/8) Epoch 11, batch 15450, loss[loss=0.148, simple_loss=0.2138, pruned_loss=0.04113, over 4921.00 frames.], tot_loss[loss=0.1395, simple_loss=0.213, pruned_loss=0.03303, over 972372.78 frames.], batch size: 17, lr: 1.96e-04 2022-05-07 03:44:44,025 INFO [train.py:715] (0/8) Epoch 11, batch 15500, loss[loss=0.1727, simple_loss=0.2528, pruned_loss=0.04631, over 4933.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2131, pruned_loss=0.03307, over 972811.57 frames.], batch size: 29, lr: 1.96e-04 2022-05-07 03:45:23,169 INFO [train.py:715] (0/8) Epoch 11, batch 15550, loss[loss=0.1326, simple_loss=0.2169, pruned_loss=0.02411, over 4938.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2128, pruned_loss=0.03308, over 972592.33 frames.], batch size: 21, lr: 1.96e-04 2022-05-07 03:46:01,707 INFO [train.py:715] (0/8) Epoch 11, batch 15600, loss[loss=0.1239, simple_loss=0.1964, pruned_loss=0.02574, over 4835.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2118, pruned_loss=0.03293, over 972414.55 frames.], batch size: 15, lr: 1.96e-04 2022-05-07 03:46:40,881 INFO [train.py:715] (0/8) Epoch 11, batch 15650, loss[loss=0.1645, simple_loss=0.2285, pruned_loss=0.05026, over 4832.00 frames.], tot_loss[loss=0.1378, simple_loss=0.211, pruned_loss=0.03231, over 973048.66 frames.], batch size: 15, lr: 1.96e-04 2022-05-07 03:47:19,841 
INFO [train.py:715] (0/8) Epoch 11, batch 15700, loss[loss=0.115, simple_loss=0.1905, pruned_loss=0.01978, over 4643.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2116, pruned_loss=0.03289, over 972736.96 frames.], batch size: 13, lr: 1.96e-04 2022-05-07 03:47:58,645 INFO [train.py:715] (0/8) Epoch 11, batch 15750, loss[loss=0.1115, simple_loss=0.1868, pruned_loss=0.01814, over 4857.00 frames.], tot_loss[loss=0.1381, simple_loss=0.211, pruned_loss=0.03264, over 973039.04 frames.], batch size: 20, lr: 1.96e-04 2022-05-07 03:48:37,394 INFO [train.py:715] (0/8) Epoch 11, batch 15800, loss[loss=0.1434, simple_loss=0.229, pruned_loss=0.02884, over 4812.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2115, pruned_loss=0.0325, over 973002.03 frames.], batch size: 27, lr: 1.96e-04 2022-05-07 03:49:16,757 INFO [train.py:715] (0/8) Epoch 11, batch 15850, loss[loss=0.1395, simple_loss=0.2158, pruned_loss=0.03158, over 4823.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2115, pruned_loss=0.0327, over 972670.40 frames.], batch size: 15, lr: 1.96e-04 2022-05-07 03:49:55,694 INFO [train.py:715] (0/8) Epoch 11, batch 15900, loss[loss=0.1412, simple_loss=0.2002, pruned_loss=0.04109, over 4849.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2107, pruned_loss=0.03229, over 972467.89 frames.], batch size: 30, lr: 1.96e-04 2022-05-07 03:50:34,610 INFO [train.py:715] (0/8) Epoch 11, batch 15950, loss[loss=0.166, simple_loss=0.2376, pruned_loss=0.04724, over 4934.00 frames.], tot_loss[loss=0.1381, simple_loss=0.211, pruned_loss=0.03264, over 972236.12 frames.], batch size: 29, lr: 1.96e-04 2022-05-07 03:51:13,827 INFO [train.py:715] (0/8) Epoch 11, batch 16000, loss[loss=0.189, simple_loss=0.2731, pruned_loss=0.05248, over 4840.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2116, pruned_loss=0.03277, over 973728.80 frames.], batch size: 13, lr: 1.96e-04 2022-05-07 03:51:53,249 INFO [train.py:715] (0/8) Epoch 11, batch 16050, loss[loss=0.1285, simple_loss=0.2113, pruned_loss=0.02282, over 4815.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2115, pruned_loss=0.03305, over 973623.68 frames.], batch size: 13, lr: 1.96e-04 2022-05-07 03:52:31,938 INFO [train.py:715] (0/8) Epoch 11, batch 16100, loss[loss=0.1619, simple_loss=0.2244, pruned_loss=0.0497, over 4861.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2112, pruned_loss=0.03252, over 972587.90 frames.], batch size: 20, lr: 1.96e-04 2022-05-07 03:53:10,816 INFO [train.py:715] (0/8) Epoch 11, batch 16150, loss[loss=0.1645, simple_loss=0.2283, pruned_loss=0.05034, over 4954.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2119, pruned_loss=0.03233, over 972378.83 frames.], batch size: 35, lr: 1.96e-04 2022-05-07 03:53:50,404 INFO [train.py:715] (0/8) Epoch 11, batch 16200, loss[loss=0.1166, simple_loss=0.1766, pruned_loss=0.02833, over 4970.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2117, pruned_loss=0.03193, over 972657.50 frames.], batch size: 15, lr: 1.96e-04 2022-05-07 03:54:29,887 INFO [train.py:715] (0/8) Epoch 11, batch 16250, loss[loss=0.1483, simple_loss=0.2256, pruned_loss=0.03551, over 4879.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2117, pruned_loss=0.03188, over 972673.94 frames.], batch size: 16, lr: 1.96e-04 2022-05-07 03:55:08,237 INFO [train.py:715] (0/8) Epoch 11, batch 16300, loss[loss=0.1051, simple_loss=0.1849, pruned_loss=0.01269, over 4966.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2117, pruned_loss=0.03182, over 971741.20 frames.], batch size: 24, lr: 1.96e-04 2022-05-07 03:55:47,434 INFO [train.py:715] (0/8) 
Epoch 11, batch 16350, loss[loss=0.1332, simple_loss=0.212, pruned_loss=0.02726, over 4896.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2114, pruned_loss=0.03171, over 971774.65 frames.], batch size: 19, lr: 1.96e-04 2022-05-07 03:56:26,682 INFO [train.py:715] (0/8) Epoch 11, batch 16400, loss[loss=0.1282, simple_loss=0.203, pruned_loss=0.0267, over 4779.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2115, pruned_loss=0.03194, over 972491.29 frames.], batch size: 17, lr: 1.96e-04 2022-05-07 03:57:05,181 INFO [train.py:715] (0/8) Epoch 11, batch 16450, loss[loss=0.1253, simple_loss=0.1907, pruned_loss=0.02993, over 4780.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2105, pruned_loss=0.03143, over 972834.87 frames.], batch size: 14, lr: 1.96e-04 2022-05-07 03:57:44,152 INFO [train.py:715] (0/8) Epoch 11, batch 16500, loss[loss=0.1126, simple_loss=0.1835, pruned_loss=0.0209, over 4939.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2106, pruned_loss=0.03154, over 973156.83 frames.], batch size: 29, lr: 1.96e-04 2022-05-07 03:58:23,669 INFO [train.py:715] (0/8) Epoch 11, batch 16550, loss[loss=0.1741, simple_loss=0.2311, pruned_loss=0.05855, over 4857.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2107, pruned_loss=0.03185, over 972841.79 frames.], batch size: 34, lr: 1.96e-04 2022-05-07 03:59:02,824 INFO [train.py:715] (0/8) Epoch 11, batch 16600, loss[loss=0.1305, simple_loss=0.2026, pruned_loss=0.02921, over 4760.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2111, pruned_loss=0.032, over 972550.55 frames.], batch size: 16, lr: 1.96e-04 2022-05-07 03:59:41,210 INFO [train.py:715] (0/8) Epoch 11, batch 16650, loss[loss=0.1465, simple_loss=0.2228, pruned_loss=0.03513, over 4870.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2113, pruned_loss=0.03191, over 972908.23 frames.], batch size: 22, lr: 1.96e-04 2022-05-07 04:00:20,429 INFO [train.py:715] (0/8) Epoch 11, batch 16700, loss[loss=0.1664, simple_loss=0.2357, pruned_loss=0.04853, over 4891.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2109, pruned_loss=0.0319, over 973043.45 frames.], batch size: 16, lr: 1.96e-04 2022-05-07 04:00:59,408 INFO [train.py:715] (0/8) Epoch 11, batch 16750, loss[loss=0.1166, simple_loss=0.1964, pruned_loss=0.01834, over 4780.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2117, pruned_loss=0.0323, over 972037.68 frames.], batch size: 14, lr: 1.96e-04 2022-05-07 04:01:38,339 INFO [train.py:715] (0/8) Epoch 11, batch 16800, loss[loss=0.1372, simple_loss=0.2142, pruned_loss=0.03011, over 4872.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2107, pruned_loss=0.03172, over 972615.00 frames.], batch size: 22, lr: 1.96e-04 2022-05-07 04:02:17,996 INFO [train.py:715] (0/8) Epoch 11, batch 16850, loss[loss=0.1292, simple_loss=0.2034, pruned_loss=0.02754, over 4986.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2113, pruned_loss=0.03208, over 972702.81 frames.], batch size: 14, lr: 1.96e-04 2022-05-07 04:02:57,558 INFO [train.py:715] (0/8) Epoch 11, batch 16900, loss[loss=0.1231, simple_loss=0.197, pruned_loss=0.02457, over 4943.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2105, pruned_loss=0.03197, over 972531.70 frames.], batch size: 29, lr: 1.96e-04 2022-05-07 04:03:37,026 INFO [train.py:715] (0/8) Epoch 11, batch 16950, loss[loss=0.1245, simple_loss=0.199, pruned_loss=0.02501, over 4992.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2102, pruned_loss=0.03159, over 973241.73 frames.], batch size: 14, lr: 1.96e-04 2022-05-07 04:04:15,764 INFO [train.py:715] (0/8) Epoch 11, batch 17000, 
loss[loss=0.1073, simple_loss=0.1726, pruned_loss=0.02098, over 4789.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2097, pruned_loss=0.03163, over 973396.58 frames.], batch size: 17, lr: 1.96e-04 2022-05-07 04:04:55,493 INFO [train.py:715] (0/8) Epoch 11, batch 17050, loss[loss=0.1679, simple_loss=0.2459, pruned_loss=0.04496, over 4909.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2098, pruned_loss=0.0314, over 973334.55 frames.], batch size: 18, lr: 1.96e-04 2022-05-07 04:05:32,737 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-400000.pt 2022-05-07 04:05:38,132 INFO [train.py:715] (0/8) Epoch 11, batch 17100, loss[loss=0.1773, simple_loss=0.2547, pruned_loss=0.04993, over 4908.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2099, pruned_loss=0.03145, over 973735.70 frames.], batch size: 18, lr: 1.96e-04 2022-05-07 04:06:17,132 INFO [train.py:715] (0/8) Epoch 11, batch 17150, loss[loss=0.1255, simple_loss=0.1966, pruned_loss=0.02726, over 4877.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2096, pruned_loss=0.03169, over 972694.92 frames.], batch size: 22, lr: 1.96e-04 2022-05-07 04:06:56,395 INFO [train.py:715] (0/8) Epoch 11, batch 17200, loss[loss=0.1349, simple_loss=0.2115, pruned_loss=0.02915, over 4811.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2098, pruned_loss=0.0319, over 971602.53 frames.], batch size: 13, lr: 1.96e-04 2022-05-07 04:07:35,864 INFO [train.py:715] (0/8) Epoch 11, batch 17250, loss[loss=0.1245, simple_loss=0.2024, pruned_loss=0.02328, over 4922.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2095, pruned_loss=0.03162, over 971666.44 frames.], batch size: 39, lr: 1.96e-04 2022-05-07 04:08:14,916 INFO [train.py:715] (0/8) Epoch 11, batch 17300, loss[loss=0.1491, simple_loss=0.2245, pruned_loss=0.03688, over 4942.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2097, pruned_loss=0.03174, over 972332.08 frames.], batch size: 21, lr: 1.96e-04 2022-05-07 04:08:53,646 INFO [train.py:715] (0/8) Epoch 11, batch 17350, loss[loss=0.1268, simple_loss=0.2009, pruned_loss=0.02631, over 4948.00 frames.], tot_loss[loss=0.138, simple_loss=0.2108, pruned_loss=0.0326, over 972154.19 frames.], batch size: 29, lr: 1.96e-04 2022-05-07 04:09:33,975 INFO [train.py:715] (0/8) Epoch 11, batch 17400, loss[loss=0.1533, simple_loss=0.2332, pruned_loss=0.03677, over 4817.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2112, pruned_loss=0.03264, over 972182.82 frames.], batch size: 26, lr: 1.96e-04 2022-05-07 04:10:14,469 INFO [train.py:715] (0/8) Epoch 11, batch 17450, loss[loss=0.1299, simple_loss=0.1948, pruned_loss=0.03253, over 4834.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2115, pruned_loss=0.03232, over 972367.47 frames.], batch size: 30, lr: 1.96e-04 2022-05-07 04:10:53,784 INFO [train.py:715] (0/8) Epoch 11, batch 17500, loss[loss=0.155, simple_loss=0.2285, pruned_loss=0.04078, over 4945.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2121, pruned_loss=0.03277, over 972100.63 frames.], batch size: 39, lr: 1.96e-04 2022-05-07 04:11:33,221 INFO [train.py:715] (0/8) Epoch 11, batch 17550, loss[loss=0.1136, simple_loss=0.1893, pruned_loss=0.01893, over 4805.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2126, pruned_loss=0.03306, over 972057.32 frames.], batch size: 21, lr: 1.96e-04 2022-05-07 04:12:12,575 INFO [train.py:715] (0/8) Epoch 11, batch 17600, loss[loss=0.1327, simple_loss=0.2078, pruned_loss=0.02882, over 4985.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2123, pruned_loss=0.03318, over 971954.35 
frames.], batch size: 25, lr: 1.96e-04 2022-05-07 04:12:51,729 INFO [train.py:715] (0/8) Epoch 11, batch 17650, loss[loss=0.1632, simple_loss=0.2402, pruned_loss=0.04312, over 4913.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2122, pruned_loss=0.03303, over 971942.84 frames.], batch size: 18, lr: 1.96e-04 2022-05-07 04:13:29,966 INFO [train.py:715] (0/8) Epoch 11, batch 17700, loss[loss=0.1475, simple_loss=0.2084, pruned_loss=0.0433, over 4843.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2114, pruned_loss=0.03287, over 971836.79 frames.], batch size: 32, lr: 1.96e-04 2022-05-07 04:14:09,453 INFO [train.py:715] (0/8) Epoch 11, batch 17750, loss[loss=0.1459, simple_loss=0.2162, pruned_loss=0.03783, over 4928.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2113, pruned_loss=0.03267, over 972255.82 frames.], batch size: 23, lr: 1.96e-04 2022-05-07 04:14:49,013 INFO [train.py:715] (0/8) Epoch 11, batch 17800, loss[loss=0.1239, simple_loss=0.2056, pruned_loss=0.02113, over 4934.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2122, pruned_loss=0.03303, over 972975.11 frames.], batch size: 29, lr: 1.96e-04 2022-05-07 04:15:27,263 INFO [train.py:715] (0/8) Epoch 11, batch 17850, loss[loss=0.1511, simple_loss=0.2237, pruned_loss=0.03923, over 4884.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2123, pruned_loss=0.03291, over 973224.24 frames.], batch size: 19, lr: 1.96e-04 2022-05-07 04:16:06,255 INFO [train.py:715] (0/8) Epoch 11, batch 17900, loss[loss=0.2116, simple_loss=0.272, pruned_loss=0.07555, over 4731.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2114, pruned_loss=0.03257, over 972386.34 frames.], batch size: 16, lr: 1.96e-04 2022-05-07 04:16:45,879 INFO [train.py:715] (0/8) Epoch 11, batch 17950, loss[loss=0.1601, simple_loss=0.227, pruned_loss=0.04658, over 4902.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2112, pruned_loss=0.03301, over 972954.89 frames.], batch size: 19, lr: 1.96e-04 2022-05-07 04:17:24,873 INFO [train.py:715] (0/8) Epoch 11, batch 18000, loss[loss=0.1524, simple_loss=0.2094, pruned_loss=0.04772, over 4827.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2113, pruned_loss=0.03274, over 973118.85 frames.], batch size: 30, lr: 1.96e-04 2022-05-07 04:17:24,874 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 04:17:34,463 INFO [train.py:742] (0/8) Epoch 11, validation: loss=0.1061, simple_loss=0.1903, pruned_loss=0.01092, over 914524.00 frames. 
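The checkpoint-400000.pt file saved a little earlier in this segment is presumably an ordinary torch.save archive (it carries the usual .pt extension), so it can be opened offline to see what was stored alongside the model. The sketch below assumes only that the file loads into a Python dict; it does not hard-code any particular keys, since those are whatever train.py chose to save.

import torch

# Path copied from the checkpoint log line above; adjust to wherever the experiment directory lives.
CKPT = "pruned_transducer_stateless2/exp/v2/checkpoint-400000.pt"

def inspect_checkpoint(path: str) -> None:
    """Print the top-level keys of a checkpoint and a short summary of each entry."""
    ckpt = torch.load(path, map_location="cpu")
    for key, value in ckpt.items():
        if isinstance(value, dict):
            print(f"{key}: dict with {len(value)} entries")
        elif torch.is_tensor(value):
            print(f"{key}: tensor of shape {tuple(value.shape)}")
        else:
            print(f"{key}: {type(value).__name__} = {value!r}")

if __name__ == "__main__":
    inspect_checkpoint(CKPT)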
2022-05-07 04:18:14,139 INFO [train.py:715] (0/8) Epoch 11, batch 18050, loss[loss=0.1117, simple_loss=0.181, pruned_loss=0.02118, over 4909.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2103, pruned_loss=0.03226, over 972190.03 frames.], batch size: 17, lr: 1.96e-04 2022-05-07 04:18:53,407 INFO [train.py:715] (0/8) Epoch 11, batch 18100, loss[loss=0.1457, simple_loss=0.2108, pruned_loss=0.04027, over 4921.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2117, pruned_loss=0.03252, over 972021.12 frames.], batch size: 23, lr: 1.96e-04 2022-05-07 04:19:32,615 INFO [train.py:715] (0/8) Epoch 11, batch 18150, loss[loss=0.1285, simple_loss=0.2046, pruned_loss=0.02621, over 4911.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2115, pruned_loss=0.03247, over 972277.92 frames.], batch size: 29, lr: 1.96e-04 2022-05-07 04:20:12,192 INFO [train.py:715] (0/8) Epoch 11, batch 18200, loss[loss=0.1309, simple_loss=0.2132, pruned_loss=0.02425, over 4924.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2107, pruned_loss=0.03231, over 972466.20 frames.], batch size: 18, lr: 1.96e-04 2022-05-07 04:20:50,625 INFO [train.py:715] (0/8) Epoch 11, batch 18250, loss[loss=0.1898, simple_loss=0.2628, pruned_loss=0.05837, over 4791.00 frames.], tot_loss[loss=0.1377, simple_loss=0.211, pruned_loss=0.03222, over 972195.19 frames.], batch size: 17, lr: 1.96e-04 2022-05-07 04:21:29,926 INFO [train.py:715] (0/8) Epoch 11, batch 18300, loss[loss=0.1642, simple_loss=0.2358, pruned_loss=0.04634, over 4814.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2118, pruned_loss=0.03262, over 972031.96 frames.], batch size: 15, lr: 1.96e-04 2022-05-07 04:22:09,173 INFO [train.py:715] (0/8) Epoch 11, batch 18350, loss[loss=0.1245, simple_loss=0.1843, pruned_loss=0.0324, over 4745.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2112, pruned_loss=0.0323, over 972341.14 frames.], batch size: 16, lr: 1.96e-04 2022-05-07 04:22:47,569 INFO [train.py:715] (0/8) Epoch 11, batch 18400, loss[loss=0.1413, simple_loss=0.2184, pruned_loss=0.03209, over 4790.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2119, pruned_loss=0.03277, over 972593.36 frames.], batch size: 24, lr: 1.96e-04 2022-05-07 04:23:25,985 INFO [train.py:715] (0/8) Epoch 11, batch 18450, loss[loss=0.1339, simple_loss=0.213, pruned_loss=0.02737, over 4988.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2122, pruned_loss=0.03296, over 973154.22 frames.], batch size: 31, lr: 1.96e-04 2022-05-07 04:24:05,020 INFO [train.py:715] (0/8) Epoch 11, batch 18500, loss[loss=0.1357, simple_loss=0.2132, pruned_loss=0.02908, over 4927.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2115, pruned_loss=0.03239, over 972976.73 frames.], batch size: 23, lr: 1.96e-04 2022-05-07 04:24:44,459 INFO [train.py:715] (0/8) Epoch 11, batch 18550, loss[loss=0.1155, simple_loss=0.1767, pruned_loss=0.02712, over 4813.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2121, pruned_loss=0.03272, over 973005.20 frames.], batch size: 12, lr: 1.96e-04 2022-05-07 04:25:22,563 INFO [train.py:715] (0/8) Epoch 11, batch 18600, loss[loss=0.1315, simple_loss=0.2165, pruned_loss=0.02328, over 4795.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2125, pruned_loss=0.0328, over 972137.65 frames.], batch size: 21, lr: 1.96e-04 2022-05-07 04:26:01,409 INFO [train.py:715] (0/8) Epoch 11, batch 18650, loss[loss=0.1306, simple_loss=0.2119, pruned_loss=0.02464, over 4941.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2125, pruned_loss=0.03254, over 972267.26 frames.], batch size: 23, lr: 1.96e-04 2022-05-07 04:26:40,666 
INFO [train.py:715] (0/8) Epoch 11, batch 18700, loss[loss=0.1759, simple_loss=0.2421, pruned_loss=0.0549, over 4917.00 frames.], tot_loss[loss=0.1392, simple_loss=0.213, pruned_loss=0.0327, over 972072.69 frames.], batch size: 18, lr: 1.96e-04 2022-05-07 04:27:18,909 INFO [train.py:715] (0/8) Epoch 11, batch 18750, loss[loss=0.1695, simple_loss=0.2386, pruned_loss=0.05019, over 4786.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2125, pruned_loss=0.03238, over 971683.42 frames.], batch size: 18, lr: 1.96e-04 2022-05-07 04:27:57,976 INFO [train.py:715] (0/8) Epoch 11, batch 18800, loss[loss=0.1293, simple_loss=0.2006, pruned_loss=0.02896, over 4968.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2121, pruned_loss=0.03253, over 972458.24 frames.], batch size: 35, lr: 1.96e-04 2022-05-07 04:28:36,592 INFO [train.py:715] (0/8) Epoch 11, batch 18850, loss[loss=0.1736, simple_loss=0.2527, pruned_loss=0.04718, over 4983.00 frames.], tot_loss[loss=0.138, simple_loss=0.212, pruned_loss=0.03196, over 972572.89 frames.], batch size: 25, lr: 1.96e-04 2022-05-07 04:29:16,483 INFO [train.py:715] (0/8) Epoch 11, batch 18900, loss[loss=0.1371, simple_loss=0.2099, pruned_loss=0.03212, over 4906.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2124, pruned_loss=0.03219, over 972349.55 frames.], batch size: 17, lr: 1.96e-04 2022-05-07 04:29:55,263 INFO [train.py:715] (0/8) Epoch 11, batch 18950, loss[loss=0.1151, simple_loss=0.1874, pruned_loss=0.02141, over 4864.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2125, pruned_loss=0.03251, over 973091.73 frames.], batch size: 20, lr: 1.96e-04 2022-05-07 04:30:34,361 INFO [train.py:715] (0/8) Epoch 11, batch 19000, loss[loss=0.1345, simple_loss=0.2134, pruned_loss=0.02776, over 4975.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2122, pruned_loss=0.03243, over 972737.70 frames.], batch size: 15, lr: 1.96e-04 2022-05-07 04:31:13,454 INFO [train.py:715] (0/8) Epoch 11, batch 19050, loss[loss=0.1139, simple_loss=0.1862, pruned_loss=0.02078, over 4758.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2114, pruned_loss=0.03189, over 971479.72 frames.], batch size: 12, lr: 1.96e-04 2022-05-07 04:31:52,055 INFO [train.py:715] (0/8) Epoch 11, batch 19100, loss[loss=0.1445, simple_loss=0.2024, pruned_loss=0.04332, over 4753.00 frames.], tot_loss[loss=0.1387, simple_loss=0.212, pruned_loss=0.03268, over 971776.10 frames.], batch size: 19, lr: 1.96e-04 2022-05-07 04:32:31,179 INFO [train.py:715] (0/8) Epoch 11, batch 19150, loss[loss=0.1464, simple_loss=0.2156, pruned_loss=0.03864, over 4967.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2115, pruned_loss=0.03255, over 972669.96 frames.], batch size: 35, lr: 1.96e-04 2022-05-07 04:33:10,075 INFO [train.py:715] (0/8) Epoch 11, batch 19200, loss[loss=0.1318, simple_loss=0.1979, pruned_loss=0.03283, over 4980.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2114, pruned_loss=0.03253, over 972364.78 frames.], batch size: 35, lr: 1.96e-04 2022-05-07 04:33:49,481 INFO [train.py:715] (0/8) Epoch 11, batch 19250, loss[loss=0.1279, simple_loss=0.1966, pruned_loss=0.0296, over 4689.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2122, pruned_loss=0.03275, over 971612.17 frames.], batch size: 15, lr: 1.96e-04 2022-05-07 04:34:27,829 INFO [train.py:715] (0/8) Epoch 11, batch 19300, loss[loss=0.1368, simple_loss=0.2028, pruned_loss=0.03538, over 4914.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2124, pruned_loss=0.03276, over 971496.14 frames.], batch size: 17, lr: 1.96e-04 2022-05-07 04:35:06,979 INFO [train.py:715] (0/8) 
Epoch 11, batch 19350, loss[loss=0.1425, simple_loss=0.207, pruned_loss=0.03897, over 4852.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2131, pruned_loss=0.03294, over 972009.34 frames.], batch size: 32, lr: 1.96e-04 2022-05-07 04:35:46,159 INFO [train.py:715] (0/8) Epoch 11, batch 19400, loss[loss=0.1419, simple_loss=0.2157, pruned_loss=0.03401, over 4823.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2127, pruned_loss=0.03242, over 971920.83 frames.], batch size: 15, lr: 1.96e-04 2022-05-07 04:36:24,109 INFO [train.py:715] (0/8) Epoch 11, batch 19450, loss[loss=0.161, simple_loss=0.2411, pruned_loss=0.04045, over 4950.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2128, pruned_loss=0.03272, over 971823.58 frames.], batch size: 39, lr: 1.95e-04 2022-05-07 04:37:03,250 INFO [train.py:715] (0/8) Epoch 11, batch 19500, loss[loss=0.1438, simple_loss=0.2088, pruned_loss=0.03937, over 4909.00 frames.], tot_loss[loss=0.1389, simple_loss=0.212, pruned_loss=0.03289, over 971611.20 frames.], batch size: 39, lr: 1.95e-04 2022-05-07 04:37:42,216 INFO [train.py:715] (0/8) Epoch 11, batch 19550, loss[loss=0.1483, simple_loss=0.2119, pruned_loss=0.04238, over 4762.00 frames.], tot_loss[loss=0.138, simple_loss=0.211, pruned_loss=0.03245, over 971972.93 frames.], batch size: 19, lr: 1.95e-04 2022-05-07 04:38:20,962 INFO [train.py:715] (0/8) Epoch 11, batch 19600, loss[loss=0.121, simple_loss=0.1981, pruned_loss=0.02193, over 4641.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2113, pruned_loss=0.03259, over 971717.12 frames.], batch size: 13, lr: 1.95e-04 2022-05-07 04:38:59,544 INFO [train.py:715] (0/8) Epoch 11, batch 19650, loss[loss=0.1205, simple_loss=0.1896, pruned_loss=0.02569, over 4830.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2106, pruned_loss=0.03233, over 971292.56 frames.], batch size: 13, lr: 1.95e-04 2022-05-07 04:39:38,337 INFO [train.py:715] (0/8) Epoch 11, batch 19700, loss[loss=0.1404, simple_loss=0.2187, pruned_loss=0.03104, over 4989.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2109, pruned_loss=0.03217, over 972302.12 frames.], batch size: 26, lr: 1.95e-04 2022-05-07 04:40:17,418 INFO [train.py:715] (0/8) Epoch 11, batch 19750, loss[loss=0.1456, simple_loss=0.2158, pruned_loss=0.03765, over 4883.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2115, pruned_loss=0.03248, over 972095.30 frames.], batch size: 16, lr: 1.95e-04 2022-05-07 04:40:55,509 INFO [train.py:715] (0/8) Epoch 11, batch 19800, loss[loss=0.1395, simple_loss=0.2094, pruned_loss=0.03476, over 4964.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2124, pruned_loss=0.03306, over 972321.03 frames.], batch size: 35, lr: 1.95e-04 2022-05-07 04:41:35,006 INFO [train.py:715] (0/8) Epoch 11, batch 19850, loss[loss=0.1649, simple_loss=0.2337, pruned_loss=0.04812, over 4849.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2126, pruned_loss=0.03317, over 971394.31 frames.], batch size: 20, lr: 1.95e-04 2022-05-07 04:42:14,373 INFO [train.py:715] (0/8) Epoch 11, batch 19900, loss[loss=0.1129, simple_loss=0.1816, pruned_loss=0.02212, over 4973.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2128, pruned_loss=0.03318, over 971601.29 frames.], batch size: 14, lr: 1.95e-04 2022-05-07 04:42:53,601 INFO [train.py:715] (0/8) Epoch 11, batch 19950, loss[loss=0.133, simple_loss=0.2104, pruned_loss=0.02777, over 4851.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2128, pruned_loss=0.03302, over 971400.00 frames.], batch size: 20, lr: 1.95e-04 2022-05-07 04:43:32,802 INFO [train.py:715] (0/8) Epoch 11, batch 20000, 
loss[loss=0.1252, simple_loss=0.2031, pruned_loss=0.02366, over 4967.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2119, pruned_loss=0.0327, over 971052.84 frames.], batch size: 21, lr: 1.95e-04 2022-05-07 04:44:11,785 INFO [train.py:715] (0/8) Epoch 11, batch 20050, loss[loss=0.1526, simple_loss=0.2252, pruned_loss=0.04, over 4993.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2118, pruned_loss=0.03263, over 971245.61 frames.], batch size: 16, lr: 1.95e-04 2022-05-07 04:44:51,033 INFO [train.py:715] (0/8) Epoch 11, batch 20100, loss[loss=0.1353, simple_loss=0.2114, pruned_loss=0.02966, over 4893.00 frames.], tot_loss[loss=0.139, simple_loss=0.2126, pruned_loss=0.03272, over 970968.43 frames.], batch size: 17, lr: 1.95e-04 2022-05-07 04:45:29,364 INFO [train.py:715] (0/8) Epoch 11, batch 20150, loss[loss=0.1503, simple_loss=0.2179, pruned_loss=0.04141, over 4900.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2124, pruned_loss=0.03263, over 971427.84 frames.], batch size: 16, lr: 1.95e-04 2022-05-07 04:46:08,141 INFO [train.py:715] (0/8) Epoch 11, batch 20200, loss[loss=0.1512, simple_loss=0.2258, pruned_loss=0.03827, over 4791.00 frames.], tot_loss[loss=0.1395, simple_loss=0.213, pruned_loss=0.03301, over 971544.71 frames.], batch size: 18, lr: 1.95e-04 2022-05-07 04:46:46,983 INFO [train.py:715] (0/8) Epoch 11, batch 20250, loss[loss=0.1162, simple_loss=0.1866, pruned_loss=0.02295, over 4810.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2137, pruned_loss=0.03288, over 971500.48 frames.], batch size: 25, lr: 1.95e-04 2022-05-07 04:47:25,725 INFO [train.py:715] (0/8) Epoch 11, batch 20300, loss[loss=0.1271, simple_loss=0.1984, pruned_loss=0.02794, over 4903.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2133, pruned_loss=0.03253, over 971450.88 frames.], batch size: 19, lr: 1.95e-04 2022-05-07 04:48:04,825 INFO [train.py:715] (0/8) Epoch 11, batch 20350, loss[loss=0.1661, simple_loss=0.2353, pruned_loss=0.04843, over 4939.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2128, pruned_loss=0.03296, over 970716.35 frames.], batch size: 39, lr: 1.95e-04 2022-05-07 04:48:43,798 INFO [train.py:715] (0/8) Epoch 11, batch 20400, loss[loss=0.131, simple_loss=0.2, pruned_loss=0.03098, over 4691.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2117, pruned_loss=0.0324, over 970421.83 frames.], batch size: 15, lr: 1.95e-04 2022-05-07 04:49:23,227 INFO [train.py:715] (0/8) Epoch 11, batch 20450, loss[loss=0.1418, simple_loss=0.217, pruned_loss=0.03332, over 4971.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2115, pruned_loss=0.03254, over 970888.28 frames.], batch size: 14, lr: 1.95e-04 2022-05-07 04:50:01,760 INFO [train.py:715] (0/8) Epoch 11, batch 20500, loss[loss=0.1781, simple_loss=0.2356, pruned_loss=0.06029, over 4777.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2119, pruned_loss=0.03315, over 970799.44 frames.], batch size: 14, lr: 1.95e-04 2022-05-07 04:50:41,079 INFO [train.py:715] (0/8) Epoch 11, batch 20550, loss[loss=0.1556, simple_loss=0.2217, pruned_loss=0.04474, over 4802.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2126, pruned_loss=0.03359, over 971353.67 frames.], batch size: 14, lr: 1.95e-04 2022-05-07 04:51:19,713 INFO [train.py:715] (0/8) Epoch 11, batch 20600, loss[loss=0.1268, simple_loss=0.2054, pruned_loss=0.0241, over 4828.00 frames.], tot_loss[loss=0.1403, simple_loss=0.2133, pruned_loss=0.0337, over 971045.05 frames.], batch size: 15, lr: 1.95e-04 2022-05-07 04:51:57,489 INFO [train.py:715] (0/8) Epoch 11, batch 20650, loss[loss=0.1453, 
simple_loss=0.2224, pruned_loss=0.03408, over 4772.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2128, pruned_loss=0.03327, over 972015.49 frames.], batch size: 18, lr: 1.95e-04 2022-05-07 04:52:36,868 INFO [train.py:715] (0/8) Epoch 11, batch 20700, loss[loss=0.1578, simple_loss=0.2358, pruned_loss=0.0399, over 4821.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2121, pruned_loss=0.03307, over 972780.66 frames.], batch size: 15, lr: 1.95e-04 2022-05-07 04:53:16,096 INFO [train.py:715] (0/8) Epoch 11, batch 20750, loss[loss=0.1495, simple_loss=0.2385, pruned_loss=0.03028, over 4833.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2128, pruned_loss=0.03329, over 972718.39 frames.], batch size: 15, lr: 1.95e-04 2022-05-07 04:53:54,800 INFO [train.py:715] (0/8) Epoch 11, batch 20800, loss[loss=0.1366, simple_loss=0.2208, pruned_loss=0.02616, over 4929.00 frames.], tot_loss[loss=0.1384, simple_loss=0.212, pruned_loss=0.03237, over 972073.58 frames.], batch size: 29, lr: 1.95e-04 2022-05-07 04:54:33,169 INFO [train.py:715] (0/8) Epoch 11, batch 20850, loss[loss=0.1509, simple_loss=0.2205, pruned_loss=0.04061, over 4953.00 frames.], tot_loss[loss=0.1387, simple_loss=0.212, pruned_loss=0.03266, over 972255.79 frames.], batch size: 29, lr: 1.95e-04 2022-05-07 04:55:12,416 INFO [train.py:715] (0/8) Epoch 11, batch 20900, loss[loss=0.1565, simple_loss=0.2296, pruned_loss=0.04174, over 4982.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2127, pruned_loss=0.03301, over 972444.43 frames.], batch size: 40, lr: 1.95e-04 2022-05-07 04:55:52,028 INFO [train.py:715] (0/8) Epoch 11, batch 20950, loss[loss=0.1328, simple_loss=0.2043, pruned_loss=0.0306, over 4747.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2124, pruned_loss=0.03303, over 972512.31 frames.], batch size: 16, lr: 1.95e-04 2022-05-07 04:56:30,992 INFO [train.py:715] (0/8) Epoch 11, batch 21000, loss[loss=0.1157, simple_loss=0.1757, pruned_loss=0.02792, over 4767.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2108, pruned_loss=0.03251, over 971864.53 frames.], batch size: 12, lr: 1.95e-04 2022-05-07 04:56:30,994 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 04:56:40,629 INFO [train.py:742] (0/8) Epoch 11, validation: loss=0.106, simple_loss=0.19, pruned_loss=0.01097, over 914524.00 frames. 
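The timestamps in this segment advance by roughly 39-40 seconds per 50-batch logging interval (and the learning rate steps down slowly, from 1.97e-04 near the top of the segment to 1.95e-04 by batch 19450), so training speed can be estimated straight from the log without extra instrumentation. A sketch under the same assumptions as the parser above (locally saved log under a placeholder name), matching only the "timestamp ... Epoch N, batch M," prefix of each entry:

import re
from datetime import datetime

STAMP_RE = re.compile(
    r"(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}),\d+ INFO \[train\.py:\d+\] \(\d+/\d+\) Epoch \d+, batch (\d+),"
)

def batches_per_second(path):
    """Estimate training throughput from the first and last timestamped batch entries in the log."""
    points = []
    with open(path) as f:
        for m in STAMP_RE.finditer(f.read()):
            stamp = datetime.strptime(m.group(1), "%Y-%m-%d %H:%M:%S")
            points.append((stamp, int(m.group(2))))
    if len(points) < 2:
        return None
    (t0, b0), (t1, b1) = points[0], points[-1]
    elapsed = (t1 - t0).total_seconds()
    return (b1 - b0) / elapsed if elapsed > 0 else None

if __name__ == "__main__":
    rate = batches_per_second("train-log.txt")  # placeholder file name
    if rate:
        print(f"~{rate:.2f} batches/s (~{50 / rate:.0f} s per 50-batch logging interval)")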
2022-05-07 04:57:20,093 INFO [train.py:715] (0/8) Epoch 11, batch 21050, loss[loss=0.1258, simple_loss=0.1885, pruned_loss=0.03151, over 4920.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2112, pruned_loss=0.03272, over 971366.29 frames.], batch size: 18, lr: 1.95e-04 2022-05-07 04:57:59,828 INFO [train.py:715] (0/8) Epoch 11, batch 21100, loss[loss=0.1253, simple_loss=0.2063, pruned_loss=0.02215, over 4937.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2113, pruned_loss=0.03263, over 971590.72 frames.], batch size: 21, lr: 1.95e-04 2022-05-07 04:58:38,863 INFO [train.py:715] (0/8) Epoch 11, batch 21150, loss[loss=0.1247, simple_loss=0.2046, pruned_loss=0.02237, over 4756.00 frames.], tot_loss[loss=0.138, simple_loss=0.2111, pruned_loss=0.03248, over 971943.85 frames.], batch size: 19, lr: 1.95e-04 2022-05-07 04:59:18,200 INFO [train.py:715] (0/8) Epoch 11, batch 21200, loss[loss=0.1561, simple_loss=0.2378, pruned_loss=0.03717, over 4937.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2117, pruned_loss=0.03275, over 972064.56 frames.], batch size: 21, lr: 1.95e-04 2022-05-07 04:59:56,324 INFO [train.py:715] (0/8) Epoch 11, batch 21250, loss[loss=0.1496, simple_loss=0.216, pruned_loss=0.04164, over 4968.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2108, pruned_loss=0.03232, over 973106.30 frames.], batch size: 15, lr: 1.95e-04 2022-05-07 05:00:35,638 INFO [train.py:715] (0/8) Epoch 11, batch 21300, loss[loss=0.1414, simple_loss=0.218, pruned_loss=0.03243, over 4810.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2102, pruned_loss=0.03199, over 973159.73 frames.], batch size: 25, lr: 1.95e-04 2022-05-07 05:01:15,029 INFO [train.py:715] (0/8) Epoch 11, batch 21350, loss[loss=0.1338, simple_loss=0.2051, pruned_loss=0.03121, over 4743.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2098, pruned_loss=0.03194, over 974112.18 frames.], batch size: 16, lr: 1.95e-04 2022-05-07 05:01:53,534 INFO [train.py:715] (0/8) Epoch 11, batch 21400, loss[loss=0.1377, simple_loss=0.2098, pruned_loss=0.03283, over 4921.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2101, pruned_loss=0.03158, over 973867.64 frames.], batch size: 18, lr: 1.95e-04 2022-05-07 05:02:32,174 INFO [train.py:715] (0/8) Epoch 11, batch 21450, loss[loss=0.1244, simple_loss=0.1986, pruned_loss=0.0251, over 4942.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2101, pruned_loss=0.03154, over 973773.76 frames.], batch size: 29, lr: 1.95e-04 2022-05-07 05:03:11,026 INFO [train.py:715] (0/8) Epoch 11, batch 21500, loss[loss=0.123, simple_loss=0.2055, pruned_loss=0.02027, over 4791.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2102, pruned_loss=0.03141, over 973383.54 frames.], batch size: 17, lr: 1.95e-04 2022-05-07 05:03:50,385 INFO [train.py:715] (0/8) Epoch 11, batch 21550, loss[loss=0.1392, simple_loss=0.2144, pruned_loss=0.03198, over 4850.00 frames.], tot_loss[loss=0.137, simple_loss=0.2106, pruned_loss=0.03173, over 973948.06 frames.], batch size: 32, lr: 1.95e-04 2022-05-07 05:04:28,681 INFO [train.py:715] (0/8) Epoch 11, batch 21600, loss[loss=0.1462, simple_loss=0.2248, pruned_loss=0.03374, over 4778.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2111, pruned_loss=0.03177, over 973866.75 frames.], batch size: 18, lr: 1.95e-04 2022-05-07 05:05:07,529 INFO [train.py:715] (0/8) Epoch 11, batch 21650, loss[loss=0.1401, simple_loss=0.211, pruned_loss=0.03462, over 4929.00 frames.], tot_loss[loss=0.137, simple_loss=0.2109, pruned_loss=0.03155, over 972794.86 frames.], batch size: 23, lr: 1.95e-04 2022-05-07 05:05:47,580 
INFO [train.py:715] (0/8) Epoch 11, batch 21700, loss[loss=0.177, simple_loss=0.2487, pruned_loss=0.05268, over 4980.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2112, pruned_loss=0.03176, over 972100.80 frames.], batch size: 25, lr: 1.95e-04 2022-05-07 05:06:26,872 INFO [train.py:715] (0/8) Epoch 11, batch 21750, loss[loss=0.1823, simple_loss=0.2652, pruned_loss=0.04966, over 4754.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2116, pruned_loss=0.03194, over 972045.36 frames.], batch size: 19, lr: 1.95e-04 2022-05-07 05:07:07,058 INFO [train.py:715] (0/8) Epoch 11, batch 21800, loss[loss=0.1439, simple_loss=0.2064, pruned_loss=0.0407, over 4970.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2112, pruned_loss=0.03199, over 972312.46 frames.], batch size: 15, lr: 1.95e-04 2022-05-07 05:07:46,732 INFO [train.py:715] (0/8) Epoch 11, batch 21850, loss[loss=0.1529, simple_loss=0.2128, pruned_loss=0.04653, over 4756.00 frames.], tot_loss[loss=0.138, simple_loss=0.2113, pruned_loss=0.03236, over 971697.74 frames.], batch size: 19, lr: 1.95e-04 2022-05-07 05:08:27,225 INFO [train.py:715] (0/8) Epoch 11, batch 21900, loss[loss=0.1282, simple_loss=0.2044, pruned_loss=0.02598, over 4981.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2118, pruned_loss=0.03257, over 972607.89 frames.], batch size: 28, lr: 1.95e-04 2022-05-07 05:09:06,442 INFO [train.py:715] (0/8) Epoch 11, batch 21950, loss[loss=0.1579, simple_loss=0.241, pruned_loss=0.03736, over 4694.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2115, pruned_loss=0.03255, over 972020.46 frames.], batch size: 15, lr: 1.95e-04 2022-05-07 05:09:46,765 INFO [train.py:715] (0/8) Epoch 11, batch 22000, loss[loss=0.1323, simple_loss=0.2142, pruned_loss=0.02525, over 4786.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2117, pruned_loss=0.03264, over 971186.31 frames.], batch size: 14, lr: 1.95e-04 2022-05-07 05:10:27,228 INFO [train.py:715] (0/8) Epoch 11, batch 22050, loss[loss=0.1652, simple_loss=0.228, pruned_loss=0.05123, over 4858.00 frames.], tot_loss[loss=0.139, simple_loss=0.2123, pruned_loss=0.03283, over 972014.27 frames.], batch size: 30, lr: 1.95e-04 2022-05-07 05:11:05,499 INFO [train.py:715] (0/8) Epoch 11, batch 22100, loss[loss=0.1172, simple_loss=0.1948, pruned_loss=0.01978, over 4930.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2122, pruned_loss=0.03279, over 972603.79 frames.], batch size: 29, lr: 1.95e-04 2022-05-07 05:11:45,097 INFO [train.py:715] (0/8) Epoch 11, batch 22150, loss[loss=0.1586, simple_loss=0.2331, pruned_loss=0.042, over 4819.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2119, pruned_loss=0.03296, over 972294.14 frames.], batch size: 25, lr: 1.95e-04 2022-05-07 05:12:24,689 INFO [train.py:715] (0/8) Epoch 11, batch 22200, loss[loss=0.1306, simple_loss=0.2077, pruned_loss=0.02672, over 4817.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2122, pruned_loss=0.03312, over 972504.78 frames.], batch size: 21, lr: 1.95e-04 2022-05-07 05:13:03,453 INFO [train.py:715] (0/8) Epoch 11, batch 22250, loss[loss=0.1261, simple_loss=0.1931, pruned_loss=0.02948, over 4792.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2119, pruned_loss=0.03296, over 972759.96 frames.], batch size: 13, lr: 1.95e-04 2022-05-07 05:13:41,889 INFO [train.py:715] (0/8) Epoch 11, batch 22300, loss[loss=0.1526, simple_loss=0.2165, pruned_loss=0.04435, over 4943.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2127, pruned_loss=0.03333, over 972668.56 frames.], batch size: 35, lr: 1.95e-04 2022-05-07 05:14:21,102 INFO [train.py:715] (0/8) 
Epoch 11, batch 22350, loss[loss=0.131, simple_loss=0.2066, pruned_loss=0.02769, over 4932.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2125, pruned_loss=0.03308, over 973376.61 frames.], batch size: 21, lr: 1.95e-04 2022-05-07 05:15:00,555 INFO [train.py:715] (0/8) Epoch 11, batch 22400, loss[loss=0.1209, simple_loss=0.198, pruned_loss=0.02192, over 4762.00 frames.], tot_loss[loss=0.139, simple_loss=0.2124, pruned_loss=0.03282, over 973247.18 frames.], batch size: 19, lr: 1.95e-04 2022-05-07 05:15:38,490 INFO [train.py:715] (0/8) Epoch 11, batch 22450, loss[loss=0.1743, simple_loss=0.2451, pruned_loss=0.05171, over 4953.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2123, pruned_loss=0.03277, over 973548.39 frames.], batch size: 15, lr: 1.95e-04 2022-05-07 05:16:18,409 INFO [train.py:715] (0/8) Epoch 11, batch 22500, loss[loss=0.1691, simple_loss=0.2341, pruned_loss=0.05202, over 4656.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2125, pruned_loss=0.03284, over 973361.93 frames.], batch size: 13, lr: 1.95e-04 2022-05-07 05:16:57,483 INFO [train.py:715] (0/8) Epoch 11, batch 22550, loss[loss=0.1311, simple_loss=0.2082, pruned_loss=0.02697, over 4799.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2126, pruned_loss=0.03308, over 973465.25 frames.], batch size: 17, lr: 1.95e-04 2022-05-07 05:17:36,654 INFO [train.py:715] (0/8) Epoch 11, batch 22600, loss[loss=0.1339, simple_loss=0.2036, pruned_loss=0.03214, over 4890.00 frames.], tot_loss[loss=0.139, simple_loss=0.2122, pruned_loss=0.03288, over 972534.96 frames.], batch size: 19, lr: 1.95e-04 2022-05-07 05:18:15,027 INFO [train.py:715] (0/8) Epoch 11, batch 22650, loss[loss=0.1642, simple_loss=0.2467, pruned_loss=0.0409, over 4886.00 frames.], tot_loss[loss=0.14, simple_loss=0.2132, pruned_loss=0.0334, over 972903.77 frames.], batch size: 19, lr: 1.95e-04 2022-05-07 05:18:54,217 INFO [train.py:715] (0/8) Epoch 11, batch 22700, loss[loss=0.1169, simple_loss=0.1918, pruned_loss=0.02097, over 4750.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2133, pruned_loss=0.03354, over 972070.07 frames.], batch size: 19, lr: 1.95e-04 2022-05-07 05:19:34,075 INFO [train.py:715] (0/8) Epoch 11, batch 22750, loss[loss=0.1346, simple_loss=0.2147, pruned_loss=0.02727, over 4919.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2128, pruned_loss=0.03292, over 972242.03 frames.], batch size: 18, lr: 1.95e-04 2022-05-07 05:20:12,495 INFO [train.py:715] (0/8) Epoch 11, batch 22800, loss[loss=0.1214, simple_loss=0.1948, pruned_loss=0.02402, over 4915.00 frames.], tot_loss[loss=0.1394, simple_loss=0.213, pruned_loss=0.03287, over 972603.64 frames.], batch size: 18, lr: 1.95e-04 2022-05-07 05:20:52,293 INFO [train.py:715] (0/8) Epoch 11, batch 22850, loss[loss=0.1338, simple_loss=0.215, pruned_loss=0.02627, over 4791.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2136, pruned_loss=0.03296, over 973268.76 frames.], batch size: 24, lr: 1.95e-04 2022-05-07 05:21:31,217 INFO [train.py:715] (0/8) Epoch 11, batch 22900, loss[loss=0.1128, simple_loss=0.1875, pruned_loss=0.01903, over 4645.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2136, pruned_loss=0.03335, over 973156.72 frames.], batch size: 13, lr: 1.95e-04 2022-05-07 05:22:10,208 INFO [train.py:715] (0/8) Epoch 11, batch 22950, loss[loss=0.1286, simple_loss=0.2004, pruned_loss=0.02836, over 4876.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2134, pruned_loss=0.0334, over 972296.96 frames.], batch size: 22, lr: 1.95e-04 2022-05-07 05:22:48,357 INFO [train.py:715] (0/8) Epoch 11, batch 23000, 
loss[loss=0.1322, simple_loss=0.208, pruned_loss=0.0282, over 4693.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2141, pruned_loss=0.03351, over 971813.70 frames.], batch size: 15, lr: 1.95e-04 2022-05-07 05:23:27,329 INFO [train.py:715] (0/8) Epoch 11, batch 23050, loss[loss=0.1494, simple_loss=0.2218, pruned_loss=0.03854, over 4822.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2143, pruned_loss=0.03362, over 972198.17 frames.], batch size: 15, lr: 1.95e-04 2022-05-07 05:24:06,659 INFO [train.py:715] (0/8) Epoch 11, batch 23100, loss[loss=0.1495, simple_loss=0.2269, pruned_loss=0.03602, over 4860.00 frames.], tot_loss[loss=0.1408, simple_loss=0.2143, pruned_loss=0.03367, over 972556.47 frames.], batch size: 30, lr: 1.95e-04 2022-05-07 05:24:44,405 INFO [train.py:715] (0/8) Epoch 11, batch 23150, loss[loss=0.1443, simple_loss=0.2148, pruned_loss=0.03696, over 4932.00 frames.], tot_loss[loss=0.1406, simple_loss=0.2142, pruned_loss=0.03344, over 973039.83 frames.], batch size: 29, lr: 1.95e-04 2022-05-07 05:25:23,976 INFO [train.py:715] (0/8) Epoch 11, batch 23200, loss[loss=0.1199, simple_loss=0.1909, pruned_loss=0.0245, over 4994.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2138, pruned_loss=0.03324, over 973053.18 frames.], batch size: 15, lr: 1.95e-04 2022-05-07 05:26:02,908 INFO [train.py:715] (0/8) Epoch 11, batch 23250, loss[loss=0.1289, simple_loss=0.1995, pruned_loss=0.02914, over 4872.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2136, pruned_loss=0.03331, over 972474.49 frames.], batch size: 32, lr: 1.95e-04 2022-05-07 05:26:41,981 INFO [train.py:715] (0/8) Epoch 11, batch 23300, loss[loss=0.1568, simple_loss=0.2326, pruned_loss=0.04047, over 4955.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2131, pruned_loss=0.03299, over 973059.76 frames.], batch size: 21, lr: 1.95e-04 2022-05-07 05:27:20,070 INFO [train.py:715] (0/8) Epoch 11, batch 23350, loss[loss=0.1297, simple_loss=0.2034, pruned_loss=0.02801, over 4936.00 frames.], tot_loss[loss=0.139, simple_loss=0.2126, pruned_loss=0.03273, over 974415.76 frames.], batch size: 21, lr: 1.95e-04 2022-05-07 05:27:59,122 INFO [train.py:715] (0/8) Epoch 11, batch 23400, loss[loss=0.1509, simple_loss=0.2234, pruned_loss=0.03925, over 4988.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2127, pruned_loss=0.03316, over 974768.84 frames.], batch size: 15, lr: 1.95e-04 2022-05-07 05:28:38,744 INFO [train.py:715] (0/8) Epoch 11, batch 23450, loss[loss=0.1307, simple_loss=0.2091, pruned_loss=0.02614, over 4896.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2122, pruned_loss=0.03279, over 974514.22 frames.], batch size: 17, lr: 1.95e-04 2022-05-07 05:29:16,869 INFO [train.py:715] (0/8) Epoch 11, batch 23500, loss[loss=0.1483, simple_loss=0.2297, pruned_loss=0.03343, over 4795.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2123, pruned_loss=0.03265, over 973879.24 frames.], batch size: 24, lr: 1.95e-04 2022-05-07 05:29:55,782 INFO [train.py:715] (0/8) Epoch 11, batch 23550, loss[loss=0.1314, simple_loss=0.1995, pruned_loss=0.03166, over 4709.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2118, pruned_loss=0.03272, over 973517.95 frames.], batch size: 15, lr: 1.95e-04 2022-05-07 05:30:34,763 INFO [train.py:715] (0/8) Epoch 11, batch 23600, loss[loss=0.1404, simple_loss=0.223, pruned_loss=0.02887, over 4789.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2118, pruned_loss=0.03277, over 973465.43 frames.], batch size: 18, lr: 1.94e-04 2022-05-07 05:31:14,120 INFO [train.py:715] (0/8) Epoch 11, batch 23650, loss[loss=0.1396, 
simple_loss=0.2211, pruned_loss=0.02911, over 4956.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2121, pruned_loss=0.03309, over 972532.38 frames.], batch size: 15, lr: 1.94e-04 2022-05-07 05:31:51,827 INFO [train.py:715] (0/8) Epoch 11, batch 23700, loss[loss=0.1451, simple_loss=0.2178, pruned_loss=0.03619, over 4710.00 frames.], tot_loss[loss=0.139, simple_loss=0.2119, pruned_loss=0.03304, over 972146.09 frames.], batch size: 15, lr: 1.94e-04 2022-05-07 05:32:30,821 INFO [train.py:715] (0/8) Epoch 11, batch 23750, loss[loss=0.1114, simple_loss=0.1839, pruned_loss=0.01945, over 4948.00 frames.], tot_loss[loss=0.138, simple_loss=0.2112, pruned_loss=0.03237, over 971812.10 frames.], batch size: 23, lr: 1.94e-04 2022-05-07 05:33:09,306 INFO [train.py:715] (0/8) Epoch 11, batch 23800, loss[loss=0.1347, simple_loss=0.2122, pruned_loss=0.02856, over 4818.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2103, pruned_loss=0.03159, over 972424.19 frames.], batch size: 25, lr: 1.94e-04 2022-05-07 05:33:46,735 INFO [train.py:715] (0/8) Epoch 11, batch 23850, loss[loss=0.1501, simple_loss=0.2288, pruned_loss=0.03569, over 4752.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2118, pruned_loss=0.03228, over 971851.73 frames.], batch size: 19, lr: 1.94e-04 2022-05-07 05:34:24,309 INFO [train.py:715] (0/8) Epoch 11, batch 23900, loss[loss=0.1169, simple_loss=0.1861, pruned_loss=0.02388, over 4847.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2119, pruned_loss=0.0324, over 971924.26 frames.], batch size: 13, lr: 1.94e-04 2022-05-07 05:35:01,653 INFO [train.py:715] (0/8) Epoch 11, batch 23950, loss[loss=0.1362, simple_loss=0.2175, pruned_loss=0.02743, over 4948.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2121, pruned_loss=0.03275, over 971414.26 frames.], batch size: 21, lr: 1.94e-04 2022-05-07 05:35:39,340 INFO [train.py:715] (0/8) Epoch 11, batch 24000, loss[loss=0.1416, simple_loss=0.2174, pruned_loss=0.03295, over 4868.00 frames.], tot_loss[loss=0.1389, simple_loss=0.212, pruned_loss=0.03289, over 972365.07 frames.], batch size: 22, lr: 1.94e-04 2022-05-07 05:35:39,341 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 05:35:48,813 INFO [train.py:742] (0/8) Epoch 11, validation: loss=0.1059, simple_loss=0.19, pruned_loss=0.01092, over 914524.00 frames. 
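Four validation passes fall inside this segment (after batches 15000, 18000, 21000 and 24000), and the reported validation loss is essentially flat: 0.106, 0.1061, 0.106, 0.1059. A small helper in the same spirit as the sketches above can collect those entries and flag the lowest one; the file name is again a placeholder, and the pattern relies only on the "validation: loss=..." wording visible in the log.

import re

VALID_RE = re.compile(r"validation: loss=([\d.]+), simple_loss=([\d.]+), pruned_loss=([\d.]+)")

def validation_summary(path):
    """Collect all validation losses from the log and return them with the index of the lowest one."""
    with open(path) as f:
        losses = [float(m.group(1)) for m in VALID_RE.finditer(f.read())]
    if not losses:
        return [], None
    best = min(range(len(losses)), key=losses.__getitem__)
    return losses, best

if __name__ == "__main__":
    losses, best = validation_summary("train-log.txt")  # placeholder file name
    for i, loss in enumerate(losses):
        marker = "  <-- lowest" if i == best else ""
        print(f"validation pass {i + 1}: loss={loss:.4f}{marker}")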
2022-05-07 05:36:27,133 INFO [train.py:715] (0/8) Epoch 11, batch 24050, loss[loss=0.1228, simple_loss=0.1888, pruned_loss=0.0284, over 4802.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2115, pruned_loss=0.03248, over 972696.46 frames.], batch size: 21, lr: 1.94e-04 2022-05-07 05:37:04,264 INFO [train.py:715] (0/8) Epoch 11, batch 24100, loss[loss=0.1255, simple_loss=0.2035, pruned_loss=0.02377, over 4932.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2109, pruned_loss=0.03205, over 973062.74 frames.], batch size: 29, lr: 1.94e-04 2022-05-07 05:37:42,093 INFO [train.py:715] (0/8) Epoch 11, batch 24150, loss[loss=0.1146, simple_loss=0.1833, pruned_loss=0.02296, over 4810.00 frames.], tot_loss[loss=0.138, simple_loss=0.2116, pruned_loss=0.03224, over 973406.12 frames.], batch size: 13, lr: 1.94e-04 2022-05-07 05:38:20,366 INFO [train.py:715] (0/8) Epoch 11, batch 24200, loss[loss=0.1293, simple_loss=0.2134, pruned_loss=0.02262, over 4800.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2125, pruned_loss=0.03253, over 974154.20 frames.], batch size: 21, lr: 1.94e-04 2022-05-07 05:38:57,449 INFO [train.py:715] (0/8) Epoch 11, batch 24250, loss[loss=0.1478, simple_loss=0.2415, pruned_loss=0.02705, over 4683.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2119, pruned_loss=0.03235, over 973993.59 frames.], batch size: 15, lr: 1.94e-04 2022-05-07 05:39:35,482 INFO [train.py:715] (0/8) Epoch 11, batch 24300, loss[loss=0.1377, simple_loss=0.1988, pruned_loss=0.03831, over 4978.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2115, pruned_loss=0.03259, over 973325.89 frames.], batch size: 14, lr: 1.94e-04 2022-05-07 05:40:13,070 INFO [train.py:715] (0/8) Epoch 11, batch 24350, loss[loss=0.1508, simple_loss=0.2209, pruned_loss=0.04035, over 4745.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2119, pruned_loss=0.0324, over 973969.93 frames.], batch size: 12, lr: 1.94e-04 2022-05-07 05:40:50,675 INFO [train.py:715] (0/8) Epoch 11, batch 24400, loss[loss=0.1258, simple_loss=0.2, pruned_loss=0.02576, over 4897.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2127, pruned_loss=0.03314, over 973936.20 frames.], batch size: 32, lr: 1.94e-04 2022-05-07 05:41:28,270 INFO [train.py:715] (0/8) Epoch 11, batch 24450, loss[loss=0.1511, simple_loss=0.231, pruned_loss=0.0356, over 4815.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2127, pruned_loss=0.03293, over 973926.70 frames.], batch size: 25, lr: 1.94e-04 2022-05-07 05:42:06,384 INFO [train.py:715] (0/8) Epoch 11, batch 24500, loss[loss=0.1594, simple_loss=0.2345, pruned_loss=0.04216, over 4814.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2126, pruned_loss=0.03292, over 973667.69 frames.], batch size: 13, lr: 1.94e-04 2022-05-07 05:42:45,025 INFO [train.py:715] (0/8) Epoch 11, batch 24550, loss[loss=0.1719, simple_loss=0.2464, pruned_loss=0.04875, over 4985.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2132, pruned_loss=0.03312, over 972928.66 frames.], batch size: 14, lr: 1.94e-04 2022-05-07 05:43:23,049 INFO [train.py:715] (0/8) Epoch 11, batch 24600, loss[loss=0.1353, simple_loss=0.2106, pruned_loss=0.02996, over 4927.00 frames.], tot_loss[loss=0.1395, simple_loss=0.213, pruned_loss=0.03303, over 972360.12 frames.], batch size: 29, lr: 1.94e-04 2022-05-07 05:44:01,526 INFO [train.py:715] (0/8) Epoch 11, batch 24650, loss[loss=0.1377, simple_loss=0.2168, pruned_loss=0.02931, over 4738.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2124, pruned_loss=0.03305, over 971818.07 frames.], batch size: 19, lr: 1.94e-04 2022-05-07 05:44:39,860 
INFO [train.py:715] (0/8) Epoch 11, batch 24700, loss[loss=0.1204, simple_loss=0.1906, pruned_loss=0.02511, over 4960.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2126, pruned_loss=0.03334, over 971237.21 frames.], batch size: 15, lr: 1.94e-04 2022-05-07 05:45:18,486 INFO [train.py:715] (0/8) Epoch 11, batch 24750, loss[loss=0.1248, simple_loss=0.1986, pruned_loss=0.02547, over 4877.00 frames.], tot_loss[loss=0.1399, simple_loss=0.2126, pruned_loss=0.03358, over 971592.89 frames.], batch size: 16, lr: 1.94e-04 2022-05-07 05:45:56,381 INFO [train.py:715] (0/8) Epoch 11, batch 24800, loss[loss=0.1496, simple_loss=0.2389, pruned_loss=0.03014, over 4927.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2124, pruned_loss=0.03334, over 971835.38 frames.], batch size: 23, lr: 1.94e-04 2022-05-07 05:46:34,701 INFO [train.py:715] (0/8) Epoch 11, batch 24850, loss[loss=0.1398, simple_loss=0.2005, pruned_loss=0.03959, over 4645.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2119, pruned_loss=0.03288, over 971642.80 frames.], batch size: 13, lr: 1.94e-04 2022-05-07 05:47:13,627 INFO [train.py:715] (0/8) Epoch 11, batch 24900, loss[loss=0.1562, simple_loss=0.2382, pruned_loss=0.03715, over 4835.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2126, pruned_loss=0.03312, over 971610.15 frames.], batch size: 15, lr: 1.94e-04 2022-05-07 05:47:51,692 INFO [train.py:715] (0/8) Epoch 11, batch 24950, loss[loss=0.1919, simple_loss=0.2644, pruned_loss=0.05972, over 4857.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2124, pruned_loss=0.03348, over 971840.14 frames.], batch size: 15, lr: 1.94e-04 2022-05-07 05:48:30,021 INFO [train.py:715] (0/8) Epoch 11, batch 25000, loss[loss=0.1328, simple_loss=0.206, pruned_loss=0.02977, over 4923.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2123, pruned_loss=0.03344, over 972835.64 frames.], batch size: 18, lr: 1.94e-04 2022-05-07 05:49:08,316 INFO [train.py:715] (0/8) Epoch 11, batch 25050, loss[loss=0.151, simple_loss=0.2305, pruned_loss=0.03578, over 4882.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2114, pruned_loss=0.03282, over 972543.72 frames.], batch size: 22, lr: 1.94e-04 2022-05-07 05:49:44,225 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-408000.pt 2022-05-07 05:49:49,676 INFO [train.py:715] (0/8) Epoch 11, batch 25100, loss[loss=0.1226, simple_loss=0.1922, pruned_loss=0.0265, over 4782.00 frames.], tot_loss[loss=0.1388, simple_loss=0.212, pruned_loss=0.03276, over 971600.99 frames.], batch size: 17, lr: 1.94e-04 2022-05-07 05:50:27,836 INFO [train.py:715] (0/8) Epoch 11, batch 25150, loss[loss=0.1437, simple_loss=0.2087, pruned_loss=0.03931, over 4824.00 frames.], tot_loss[loss=0.1385, simple_loss=0.212, pruned_loss=0.03254, over 972074.85 frames.], batch size: 13, lr: 1.94e-04 2022-05-07 05:51:06,428 INFO [train.py:715] (0/8) Epoch 11, batch 25200, loss[loss=0.1536, simple_loss=0.2192, pruned_loss=0.04401, over 4980.00 frames.], tot_loss[loss=0.138, simple_loss=0.2115, pruned_loss=0.03219, over 971604.56 frames.], batch size: 15, lr: 1.94e-04 2022-05-07 05:51:45,281 INFO [train.py:715] (0/8) Epoch 11, batch 25250, loss[loss=0.1233, simple_loss=0.1985, pruned_loss=0.02406, over 4815.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2118, pruned_loss=0.03238, over 971106.60 frames.], batch size: 25, lr: 1.94e-04 2022-05-07 05:52:23,559 INFO [train.py:715] (0/8) Epoch 11, batch 25300, loss[loss=0.1273, simple_loss=0.1994, pruned_loss=0.02755, over 4800.00 frames.], tot_loss[loss=0.1383, 
simple_loss=0.2117, pruned_loss=0.03245, over 971990.34 frames.], batch size: 24, lr: 1.94e-04 2022-05-07 05:53:01,958 INFO [train.py:715] (0/8) Epoch 11, batch 25350, loss[loss=0.1352, simple_loss=0.2107, pruned_loss=0.0299, over 4922.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2117, pruned_loss=0.03263, over 972354.64 frames.], batch size: 18, lr: 1.94e-04 2022-05-07 05:53:40,598 INFO [train.py:715] (0/8) Epoch 11, batch 25400, loss[loss=0.1071, simple_loss=0.1845, pruned_loss=0.0148, over 4941.00 frames.], tot_loss[loss=0.138, simple_loss=0.2114, pruned_loss=0.03225, over 971275.55 frames.], batch size: 21, lr: 1.94e-04 2022-05-07 05:54:19,415 INFO [train.py:715] (0/8) Epoch 11, batch 25450, loss[loss=0.1299, simple_loss=0.1939, pruned_loss=0.03297, over 4822.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2108, pruned_loss=0.03199, over 971330.26 frames.], batch size: 26, lr: 1.94e-04 2022-05-07 05:54:57,467 INFO [train.py:715] (0/8) Epoch 11, batch 25500, loss[loss=0.1253, simple_loss=0.2061, pruned_loss=0.02227, over 4902.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2106, pruned_loss=0.03198, over 972654.25 frames.], batch size: 22, lr: 1.94e-04 2022-05-07 05:55:36,078 INFO [train.py:715] (0/8) Epoch 11, batch 25550, loss[loss=0.1343, simple_loss=0.204, pruned_loss=0.0323, over 4973.00 frames.], tot_loss[loss=0.138, simple_loss=0.2114, pruned_loss=0.0323, over 972135.34 frames.], batch size: 15, lr: 1.94e-04 2022-05-07 05:56:15,312 INFO [train.py:715] (0/8) Epoch 11, batch 25600, loss[loss=0.1291, simple_loss=0.2022, pruned_loss=0.02799, over 4947.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2111, pruned_loss=0.03195, over 972022.40 frames.], batch size: 21, lr: 1.94e-04 2022-05-07 05:56:53,591 INFO [train.py:715] (0/8) Epoch 11, batch 25650, loss[loss=0.1283, simple_loss=0.1939, pruned_loss=0.03133, over 4834.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2111, pruned_loss=0.03227, over 972374.03 frames.], batch size: 13, lr: 1.94e-04 2022-05-07 05:57:31,747 INFO [train.py:715] (0/8) Epoch 11, batch 25700, loss[loss=0.1127, simple_loss=0.1821, pruned_loss=0.02164, over 4834.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2112, pruned_loss=0.0322, over 972310.50 frames.], batch size: 12, lr: 1.94e-04 2022-05-07 05:58:10,577 INFO [train.py:715] (0/8) Epoch 11, batch 25750, loss[loss=0.1375, simple_loss=0.2112, pruned_loss=0.03189, over 4800.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2114, pruned_loss=0.03237, over 972290.91 frames.], batch size: 21, lr: 1.94e-04 2022-05-07 05:58:48,900 INFO [train.py:715] (0/8) Epoch 11, batch 25800, loss[loss=0.1391, simple_loss=0.2303, pruned_loss=0.02393, over 4932.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2106, pruned_loss=0.03185, over 972256.78 frames.], batch size: 29, lr: 1.94e-04 2022-05-07 05:59:26,901 INFO [train.py:715] (0/8) Epoch 11, batch 25850, loss[loss=0.1467, simple_loss=0.2254, pruned_loss=0.03396, over 4793.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2109, pruned_loss=0.03209, over 972401.76 frames.], batch size: 24, lr: 1.94e-04 2022-05-07 06:00:05,559 INFO [train.py:715] (0/8) Epoch 11, batch 25900, loss[loss=0.1184, simple_loss=0.1868, pruned_loss=0.02506, over 4981.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2109, pruned_loss=0.03211, over 971811.85 frames.], batch size: 14, lr: 1.94e-04 2022-05-07 06:00:44,265 INFO [train.py:715] (0/8) Epoch 11, batch 25950, loss[loss=0.1633, simple_loss=0.2389, pruned_loss=0.04386, over 4919.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2112, 
pruned_loss=0.0322, over 972271.91 frames.], batch size: 39, lr: 1.94e-04 2022-05-07 06:01:22,316 INFO [train.py:715] (0/8) Epoch 11, batch 26000, loss[loss=0.1195, simple_loss=0.1859, pruned_loss=0.02659, over 4930.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2112, pruned_loss=0.03237, over 973103.25 frames.], batch size: 29, lr: 1.94e-04 2022-05-07 06:02:00,398 INFO [train.py:715] (0/8) Epoch 11, batch 26050, loss[loss=0.1323, simple_loss=0.2103, pruned_loss=0.02711, over 4824.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2105, pruned_loss=0.03218, over 972859.68 frames.], batch size: 12, lr: 1.94e-04 2022-05-07 06:02:38,950 INFO [train.py:715] (0/8) Epoch 11, batch 26100, loss[loss=0.1849, simple_loss=0.2472, pruned_loss=0.06126, over 4775.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2099, pruned_loss=0.03218, over 972845.38 frames.], batch size: 14, lr: 1.94e-04 2022-05-07 06:03:17,341 INFO [train.py:715] (0/8) Epoch 11, batch 26150, loss[loss=0.1301, simple_loss=0.2013, pruned_loss=0.02938, over 4919.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2108, pruned_loss=0.03246, over 972873.14 frames.], batch size: 17, lr: 1.94e-04 2022-05-07 06:03:55,310 INFO [train.py:715] (0/8) Epoch 11, batch 26200, loss[loss=0.1311, simple_loss=0.1966, pruned_loss=0.0328, over 4769.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2114, pruned_loss=0.03281, over 972616.88 frames.], batch size: 12, lr: 1.94e-04 2022-05-07 06:04:32,927 INFO [train.py:715] (0/8) Epoch 11, batch 26250, loss[loss=0.144, simple_loss=0.2132, pruned_loss=0.03738, over 4886.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2119, pruned_loss=0.03295, over 972696.33 frames.], batch size: 19, lr: 1.94e-04 2022-05-07 06:05:10,974 INFO [train.py:715] (0/8) Epoch 11, batch 26300, loss[loss=0.16, simple_loss=0.2293, pruned_loss=0.04535, over 4906.00 frames.], tot_loss[loss=0.1399, simple_loss=0.213, pruned_loss=0.03338, over 972572.07 frames.], batch size: 17, lr: 1.94e-04 2022-05-07 06:05:48,407 INFO [train.py:715] (0/8) Epoch 11, batch 26350, loss[loss=0.1456, simple_loss=0.2232, pruned_loss=0.034, over 4969.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2123, pruned_loss=0.03297, over 972418.43 frames.], batch size: 15, lr: 1.94e-04 2022-05-07 06:06:25,425 INFO [train.py:715] (0/8) Epoch 11, batch 26400, loss[loss=0.1398, simple_loss=0.2301, pruned_loss=0.02473, over 4903.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2127, pruned_loss=0.03297, over 972860.74 frames.], batch size: 17, lr: 1.94e-04 2022-05-07 06:07:03,854 INFO [train.py:715] (0/8) Epoch 11, batch 26450, loss[loss=0.1174, simple_loss=0.1886, pruned_loss=0.02315, over 4913.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2122, pruned_loss=0.03247, over 972604.73 frames.], batch size: 18, lr: 1.94e-04 2022-05-07 06:07:41,338 INFO [train.py:715] (0/8) Epoch 11, batch 26500, loss[loss=0.1183, simple_loss=0.1876, pruned_loss=0.02451, over 4834.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2118, pruned_loss=0.03223, over 972272.53 frames.], batch size: 13, lr: 1.94e-04 2022-05-07 06:08:19,079 INFO [train.py:715] (0/8) Epoch 11, batch 26550, loss[loss=0.1561, simple_loss=0.227, pruned_loss=0.04264, over 4924.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2119, pruned_loss=0.03223, over 971753.75 frames.], batch size: 23, lr: 1.94e-04 2022-05-07 06:08:56,817 INFO [train.py:715] (0/8) Epoch 11, batch 26600, loss[loss=0.154, simple_loss=0.2263, pruned_loss=0.04082, over 4769.00 frames.], tot_loss[loss=0.139, simple_loss=0.2129, pruned_loss=0.03256, over 
971993.37 frames.], batch size: 18, lr: 1.94e-04 2022-05-07 06:09:34,841 INFO [train.py:715] (0/8) Epoch 11, batch 26650, loss[loss=0.112, simple_loss=0.1857, pruned_loss=0.01915, over 4961.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2119, pruned_loss=0.03219, over 971426.10 frames.], batch size: 14, lr: 1.94e-04 2022-05-07 06:10:12,907 INFO [train.py:715] (0/8) Epoch 11, batch 26700, loss[loss=0.1295, simple_loss=0.2169, pruned_loss=0.02098, over 4937.00 frames.], tot_loss[loss=0.138, simple_loss=0.2117, pruned_loss=0.03217, over 971608.42 frames.], batch size: 21, lr: 1.94e-04 2022-05-07 06:10:49,894 INFO [train.py:715] (0/8) Epoch 11, batch 26750, loss[loss=0.1261, simple_loss=0.2027, pruned_loss=0.02472, over 4925.00 frames.], tot_loss[loss=0.139, simple_loss=0.2129, pruned_loss=0.03257, over 972060.12 frames.], batch size: 23, lr: 1.94e-04 2022-05-07 06:11:28,538 INFO [train.py:715] (0/8) Epoch 11, batch 26800, loss[loss=0.1514, simple_loss=0.2174, pruned_loss=0.04274, over 4935.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2127, pruned_loss=0.03303, over 971006.24 frames.], batch size: 39, lr: 1.94e-04 2022-05-07 06:12:06,141 INFO [train.py:715] (0/8) Epoch 11, batch 26850, loss[loss=0.1604, simple_loss=0.2399, pruned_loss=0.04043, over 4872.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2128, pruned_loss=0.03328, over 971060.76 frames.], batch size: 16, lr: 1.94e-04 2022-05-07 06:12:43,628 INFO [train.py:715] (0/8) Epoch 11, batch 26900, loss[loss=0.1753, simple_loss=0.2431, pruned_loss=0.05371, over 4887.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2126, pruned_loss=0.03302, over 971195.95 frames.], batch size: 32, lr: 1.94e-04 2022-05-07 06:13:21,267 INFO [train.py:715] (0/8) Epoch 11, batch 26950, loss[loss=0.1292, simple_loss=0.1941, pruned_loss=0.03212, over 4976.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2122, pruned_loss=0.033, over 971351.23 frames.], batch size: 14, lr: 1.94e-04 2022-05-07 06:13:59,652 INFO [train.py:715] (0/8) Epoch 11, batch 27000, loss[loss=0.1426, simple_loss=0.2209, pruned_loss=0.03218, over 4712.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2124, pruned_loss=0.0332, over 970580.40 frames.], batch size: 15, lr: 1.94e-04 2022-05-07 06:13:59,654 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 06:14:09,117 INFO [train.py:742] (0/8) Epoch 11, validation: loss=0.1059, simple_loss=0.19, pruned_loss=0.01084, over 914524.00 frames. 
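[annotation] The checkpoint filenames in this stretch of the log (checkpoint-408000.pt above, checkpoint-416000.pt further below, then epoch-11.pt at the epoch boundary) suggest two kinds of saves: one keyed to the global training-batch counter every 8000 batches, and one per finished epoch. A hedged sketch of that cadence; the helper names and the 8000-batch interval are inferred from the filenames here, not taken from train.py:

    from pathlib import Path
    from typing import Optional

    SAVE_EVERY_N = 8000   # inferred: 416000 - 408000 = 8000
    EXP_DIR = Path("pruned_transducer_stateless2/exp/v2")

    def batch_checkpoint(batch_idx_train: int) -> Optional[Path]:
        # A batch-indexed checkpoint name, produced only on multiples of SAVE_EVERY_N.
        if batch_idx_train > 0 and batch_idx_train % SAVE_EVERY_N == 0:
            return EXP_DIR / f"checkpoint-{batch_idx_train}.pt"
        return None

    def epoch_checkpoint(epoch: int) -> Path:
        # An epoch-indexed checkpoint name, written when the epoch finishes.
        return EXP_DIR / f"epoch-{epoch}.pt"

    print(batch_checkpoint(408000))  # .../checkpoint-408000.pt, as in the entry above
    print(epoch_checkpoint(11))      # .../epoch-11.pt, saved when epoch 11 ends below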
2022-05-07 06:14:47,545 INFO [train.py:715] (0/8) Epoch 11, batch 27050, loss[loss=0.1523, simple_loss=0.2253, pruned_loss=0.03962, over 4904.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2119, pruned_loss=0.0334, over 972321.54 frames.], batch size: 17, lr: 1.94e-04 2022-05-07 06:15:25,147 INFO [train.py:715] (0/8) Epoch 11, batch 27100, loss[loss=0.1388, simple_loss=0.2229, pruned_loss=0.02735, over 4777.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2118, pruned_loss=0.03324, over 972049.74 frames.], batch size: 14, lr: 1.94e-04 2022-05-07 06:16:02,380 INFO [train.py:715] (0/8) Epoch 11, batch 27150, loss[loss=0.1219, simple_loss=0.2022, pruned_loss=0.02076, over 4927.00 frames.], tot_loss[loss=0.1389, simple_loss=0.212, pruned_loss=0.0329, over 972106.75 frames.], batch size: 29, lr: 1.94e-04 2022-05-07 06:16:41,017 INFO [train.py:715] (0/8) Epoch 11, batch 27200, loss[loss=0.1222, simple_loss=0.197, pruned_loss=0.02368, over 4917.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2111, pruned_loss=0.03272, over 972196.04 frames.], batch size: 29, lr: 1.94e-04 2022-05-07 06:17:18,727 INFO [train.py:715] (0/8) Epoch 11, batch 27250, loss[loss=0.1058, simple_loss=0.1866, pruned_loss=0.01247, over 4813.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2106, pruned_loss=0.03245, over 972360.11 frames.], batch size: 27, lr: 1.94e-04 2022-05-07 06:17:56,615 INFO [train.py:715] (0/8) Epoch 11, batch 27300, loss[loss=0.1421, simple_loss=0.2141, pruned_loss=0.03503, over 4866.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2116, pruned_loss=0.03267, over 972081.03 frames.], batch size: 20, lr: 1.94e-04 2022-05-07 06:18:34,287 INFO [train.py:715] (0/8) Epoch 11, batch 27350, loss[loss=0.1101, simple_loss=0.1786, pruned_loss=0.02077, over 4918.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2109, pruned_loss=0.03245, over 972012.09 frames.], batch size: 18, lr: 1.94e-04 2022-05-07 06:19:13,061 INFO [train.py:715] (0/8) Epoch 11, batch 27400, loss[loss=0.1406, simple_loss=0.209, pruned_loss=0.03611, over 4878.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2113, pruned_loss=0.03231, over 971252.96 frames.], batch size: 32, lr: 1.94e-04 2022-05-07 06:19:50,854 INFO [train.py:715] (0/8) Epoch 11, batch 27450, loss[loss=0.1126, simple_loss=0.1901, pruned_loss=0.0175, over 4814.00 frames.], tot_loss[loss=0.1376, simple_loss=0.211, pruned_loss=0.03214, over 971745.31 frames.], batch size: 27, lr: 1.94e-04 2022-05-07 06:20:28,117 INFO [train.py:715] (0/8) Epoch 11, batch 27500, loss[loss=0.1108, simple_loss=0.1803, pruned_loss=0.02063, over 4837.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2114, pruned_loss=0.03251, over 972303.46 frames.], batch size: 12, lr: 1.94e-04 2022-05-07 06:21:07,283 INFO [train.py:715] (0/8) Epoch 11, batch 27550, loss[loss=0.1339, simple_loss=0.21, pruned_loss=0.02889, over 4970.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2119, pruned_loss=0.03262, over 973472.49 frames.], batch size: 28, lr: 1.94e-04 2022-05-07 06:21:45,750 INFO [train.py:715] (0/8) Epoch 11, batch 27600, loss[loss=0.1079, simple_loss=0.1865, pruned_loss=0.01462, over 4964.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2114, pruned_loss=0.03219, over 973200.06 frames.], batch size: 21, lr: 1.94e-04 2022-05-07 06:22:23,474 INFO [train.py:715] (0/8) Epoch 11, batch 27650, loss[loss=0.1495, simple_loss=0.2219, pruned_loss=0.03853, over 4746.00 frames.], tot_loss[loss=0.138, simple_loss=0.2114, pruned_loss=0.03229, over 972516.18 frames.], batch size: 19, lr: 1.94e-04 2022-05-07 06:23:01,297 INFO 
[train.py:715] (0/8) Epoch 11, batch 27700, loss[loss=0.1197, simple_loss=0.1843, pruned_loss=0.02752, over 4752.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2113, pruned_loss=0.0323, over 972287.47 frames.], batch size: 12, lr: 1.94e-04 2022-05-07 06:23:39,624 INFO [train.py:715] (0/8) Epoch 11, batch 27750, loss[loss=0.1209, simple_loss=0.1962, pruned_loss=0.02282, over 4939.00 frames.], tot_loss[loss=0.1377, simple_loss=0.211, pruned_loss=0.03222, over 971791.33 frames.], batch size: 21, lr: 1.94e-04 2022-05-07 06:24:17,572 INFO [train.py:715] (0/8) Epoch 11, batch 27800, loss[loss=0.1376, simple_loss=0.2069, pruned_loss=0.03413, over 4969.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2102, pruned_loss=0.0324, over 971778.77 frames.], batch size: 31, lr: 1.93e-04 2022-05-07 06:24:54,555 INFO [train.py:715] (0/8) Epoch 11, batch 27850, loss[loss=0.1179, simple_loss=0.1865, pruned_loss=0.02468, over 4917.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2106, pruned_loss=0.03233, over 971035.19 frames.], batch size: 19, lr: 1.93e-04 2022-05-07 06:25:32,917 INFO [train.py:715] (0/8) Epoch 11, batch 27900, loss[loss=0.1384, simple_loss=0.1985, pruned_loss=0.03917, over 4849.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2107, pruned_loss=0.03238, over 971237.36 frames.], batch size: 30, lr: 1.93e-04 2022-05-07 06:26:10,972 INFO [train.py:715] (0/8) Epoch 11, batch 27950, loss[loss=0.1422, simple_loss=0.2191, pruned_loss=0.03265, over 4794.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2103, pruned_loss=0.032, over 970887.77 frames.], batch size: 24, lr: 1.93e-04 2022-05-07 06:26:48,581 INFO [train.py:715] (0/8) Epoch 11, batch 28000, loss[loss=0.1222, simple_loss=0.1957, pruned_loss=0.02437, over 4934.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2109, pruned_loss=0.03278, over 972525.26 frames.], batch size: 29, lr: 1.93e-04 2022-05-07 06:27:26,138 INFO [train.py:715] (0/8) Epoch 11, batch 28050, loss[loss=0.1351, simple_loss=0.2099, pruned_loss=0.03014, over 4885.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2107, pruned_loss=0.03226, over 973076.20 frames.], batch size: 16, lr: 1.93e-04 2022-05-07 06:28:04,141 INFO [train.py:715] (0/8) Epoch 11, batch 28100, loss[loss=0.1213, simple_loss=0.1878, pruned_loss=0.02744, over 4889.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2099, pruned_loss=0.03165, over 972808.59 frames.], batch size: 19, lr: 1.93e-04 2022-05-07 06:28:41,419 INFO [train.py:715] (0/8) Epoch 11, batch 28150, loss[loss=0.141, simple_loss=0.2191, pruned_loss=0.03139, over 4948.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2101, pruned_loss=0.03163, over 973474.18 frames.], batch size: 29, lr: 1.93e-04 2022-05-07 06:29:18,867 INFO [train.py:715] (0/8) Epoch 11, batch 28200, loss[loss=0.1484, simple_loss=0.222, pruned_loss=0.03742, over 4845.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2102, pruned_loss=0.03156, over 973159.31 frames.], batch size: 30, lr: 1.93e-04 2022-05-07 06:29:57,421 INFO [train.py:715] (0/8) Epoch 11, batch 28250, loss[loss=0.1267, simple_loss=0.1999, pruned_loss=0.02671, over 4794.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2105, pruned_loss=0.03163, over 972914.84 frames.], batch size: 14, lr: 1.93e-04 2022-05-07 06:30:34,924 INFO [train.py:715] (0/8) Epoch 11, batch 28300, loss[loss=0.1389, simple_loss=0.2051, pruned_loss=0.0364, over 4742.00 frames.], tot_loss[loss=0.1376, simple_loss=0.211, pruned_loss=0.03206, over 973170.69 frames.], batch size: 16, lr: 1.93e-04 2022-05-07 06:31:12,828 INFO [train.py:715] (0/8) Epoch 
11, batch 28350, loss[loss=0.1635, simple_loss=0.2314, pruned_loss=0.04783, over 4834.00 frames.], tot_loss[loss=0.1377, simple_loss=0.211, pruned_loss=0.03218, over 973331.32 frames.], batch size: 15, lr: 1.93e-04 2022-05-07 06:31:50,524 INFO [train.py:715] (0/8) Epoch 11, batch 28400, loss[loss=0.1616, simple_loss=0.2378, pruned_loss=0.04276, over 4960.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2118, pruned_loss=0.03249, over 973705.08 frames.], batch size: 21, lr: 1.93e-04 2022-05-07 06:32:28,897 INFO [train.py:715] (0/8) Epoch 11, batch 28450, loss[loss=0.165, simple_loss=0.2388, pruned_loss=0.04559, over 4831.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2112, pruned_loss=0.03247, over 973631.41 frames.], batch size: 12, lr: 1.93e-04 2022-05-07 06:33:06,935 INFO [train.py:715] (0/8) Epoch 11, batch 28500, loss[loss=0.1341, simple_loss=0.213, pruned_loss=0.02759, over 4821.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2118, pruned_loss=0.03282, over 972334.47 frames.], batch size: 15, lr: 1.93e-04 2022-05-07 06:33:44,626 INFO [train.py:715] (0/8) Epoch 11, batch 28550, loss[loss=0.1409, simple_loss=0.214, pruned_loss=0.03389, over 4743.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2122, pruned_loss=0.03282, over 971809.99 frames.], batch size: 19, lr: 1.93e-04 2022-05-07 06:34:23,466 INFO [train.py:715] (0/8) Epoch 11, batch 28600, loss[loss=0.1403, simple_loss=0.2123, pruned_loss=0.03414, over 4748.00 frames.], tot_loss[loss=0.139, simple_loss=0.2122, pruned_loss=0.03288, over 971853.55 frames.], batch size: 19, lr: 1.93e-04 2022-05-07 06:35:01,432 INFO [train.py:715] (0/8) Epoch 11, batch 28650, loss[loss=0.126, simple_loss=0.2004, pruned_loss=0.02576, over 4772.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2114, pruned_loss=0.03269, over 971704.95 frames.], batch size: 17, lr: 1.93e-04 2022-05-07 06:35:39,418 INFO [train.py:715] (0/8) Epoch 11, batch 28700, loss[loss=0.1571, simple_loss=0.2305, pruned_loss=0.04187, over 4872.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2117, pruned_loss=0.0329, over 972314.64 frames.], batch size: 20, lr: 1.93e-04 2022-05-07 06:36:17,173 INFO [train.py:715] (0/8) Epoch 11, batch 28750, loss[loss=0.1526, simple_loss=0.234, pruned_loss=0.03559, over 4936.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2113, pruned_loss=0.03287, over 972888.40 frames.], batch size: 21, lr: 1.93e-04 2022-05-07 06:36:55,921 INFO [train.py:715] (0/8) Epoch 11, batch 28800, loss[loss=0.1551, simple_loss=0.2297, pruned_loss=0.04023, over 4869.00 frames.], tot_loss[loss=0.1392, simple_loss=0.212, pruned_loss=0.0332, over 973296.43 frames.], batch size: 32, lr: 1.93e-04 2022-05-07 06:37:33,389 INFO [train.py:715] (0/8) Epoch 11, batch 28850, loss[loss=0.1373, simple_loss=0.2106, pruned_loss=0.03197, over 4815.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2111, pruned_loss=0.03269, over 972815.59 frames.], batch size: 25, lr: 1.93e-04 2022-05-07 06:38:10,804 INFO [train.py:715] (0/8) Epoch 11, batch 28900, loss[loss=0.159, simple_loss=0.2288, pruned_loss=0.04459, over 4778.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2124, pruned_loss=0.03319, over 972730.00 frames.], batch size: 14, lr: 1.93e-04 2022-05-07 06:38:49,565 INFO [train.py:715] (0/8) Epoch 11, batch 28950, loss[loss=0.1411, simple_loss=0.2159, pruned_loss=0.0331, over 4789.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2118, pruned_loss=0.03248, over 972982.56 frames.], batch size: 21, lr: 1.93e-04 2022-05-07 06:39:27,035 INFO [train.py:715] (0/8) Epoch 11, batch 29000, 
loss[loss=0.14, simple_loss=0.2194, pruned_loss=0.03033, over 4700.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2119, pruned_loss=0.03231, over 972399.87 frames.], batch size: 15, lr: 1.93e-04 2022-05-07 06:40:04,955 INFO [train.py:715] (0/8) Epoch 11, batch 29050, loss[loss=0.1385, simple_loss=0.221, pruned_loss=0.02797, over 4926.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2118, pruned_loss=0.03237, over 972736.54 frames.], batch size: 18, lr: 1.93e-04 2022-05-07 06:40:42,748 INFO [train.py:715] (0/8) Epoch 11, batch 29100, loss[loss=0.1099, simple_loss=0.1898, pruned_loss=0.01499, over 4813.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2118, pruned_loss=0.0323, over 973110.24 frames.], batch size: 25, lr: 1.93e-04 2022-05-07 06:41:21,081 INFO [train.py:715] (0/8) Epoch 11, batch 29150, loss[loss=0.123, simple_loss=0.1918, pruned_loss=0.02714, over 4794.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2119, pruned_loss=0.03251, over 972424.26 frames.], batch size: 24, lr: 1.93e-04 2022-05-07 06:41:58,817 INFO [train.py:715] (0/8) Epoch 11, batch 29200, loss[loss=0.1531, simple_loss=0.2208, pruned_loss=0.04273, over 4790.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2124, pruned_loss=0.03255, over 972661.45 frames.], batch size: 12, lr: 1.93e-04 2022-05-07 06:42:36,367 INFO [train.py:715] (0/8) Epoch 11, batch 29250, loss[loss=0.1447, simple_loss=0.2121, pruned_loss=0.03863, over 4809.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2109, pruned_loss=0.03225, over 972706.68 frames.], batch size: 21, lr: 1.93e-04 2022-05-07 06:43:15,060 INFO [train.py:715] (0/8) Epoch 11, batch 29300, loss[loss=0.1518, simple_loss=0.227, pruned_loss=0.03834, over 4754.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2108, pruned_loss=0.03216, over 973229.29 frames.], batch size: 19, lr: 1.93e-04 2022-05-07 06:43:53,134 INFO [train.py:715] (0/8) Epoch 11, batch 29350, loss[loss=0.1341, simple_loss=0.2065, pruned_loss=0.03089, over 4755.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2106, pruned_loss=0.03191, over 973143.08 frames.], batch size: 19, lr: 1.93e-04 2022-05-07 06:44:30,898 INFO [train.py:715] (0/8) Epoch 11, batch 29400, loss[loss=0.1341, simple_loss=0.2066, pruned_loss=0.03082, over 4981.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2109, pruned_loss=0.03194, over 974021.11 frames.], batch size: 16, lr: 1.93e-04 2022-05-07 06:45:08,808 INFO [train.py:715] (0/8) Epoch 11, batch 29450, loss[loss=0.1436, simple_loss=0.2216, pruned_loss=0.03281, over 4806.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2116, pruned_loss=0.03239, over 973850.81 frames.], batch size: 25, lr: 1.93e-04 2022-05-07 06:45:46,706 INFO [train.py:715] (0/8) Epoch 11, batch 29500, loss[loss=0.1532, simple_loss=0.2252, pruned_loss=0.04059, over 4795.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2114, pruned_loss=0.03238, over 973380.17 frames.], batch size: 17, lr: 1.93e-04 2022-05-07 06:46:25,299 INFO [train.py:715] (0/8) Epoch 11, batch 29550, loss[loss=0.1301, simple_loss=0.1997, pruned_loss=0.03024, over 4767.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2106, pruned_loss=0.03189, over 972795.99 frames.], batch size: 12, lr: 1.93e-04 2022-05-07 06:47:02,901 INFO [train.py:715] (0/8) Epoch 11, batch 29600, loss[loss=0.1782, simple_loss=0.2423, pruned_loss=0.05701, over 4849.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2107, pruned_loss=0.0319, over 972747.07 frames.], batch size: 30, lr: 1.93e-04 2022-05-07 06:47:41,470 INFO [train.py:715] (0/8) Epoch 11, batch 29650, loss[loss=0.1522, 
simple_loss=0.2239, pruned_loss=0.04022, over 4838.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2108, pruned_loss=0.03229, over 973681.90 frames.], batch size: 15, lr: 1.93e-04 2022-05-07 06:48:19,462 INFO [train.py:715] (0/8) Epoch 11, batch 29700, loss[loss=0.1262, simple_loss=0.2064, pruned_loss=0.02304, over 4884.00 frames.], tot_loss[loss=0.138, simple_loss=0.2111, pruned_loss=0.03244, over 973248.57 frames.], batch size: 22, lr: 1.93e-04 2022-05-07 06:48:57,622 INFO [train.py:715] (0/8) Epoch 11, batch 29750, loss[loss=0.1087, simple_loss=0.1891, pruned_loss=0.01413, over 4951.00 frames.], tot_loss[loss=0.137, simple_loss=0.2102, pruned_loss=0.03192, over 973324.37 frames.], batch size: 29, lr: 1.93e-04 2022-05-07 06:49:35,433 INFO [train.py:715] (0/8) Epoch 11, batch 29800, loss[loss=0.1571, simple_loss=0.2288, pruned_loss=0.04273, over 4695.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2106, pruned_loss=0.03217, over 972048.85 frames.], batch size: 15, lr: 1.93e-04 2022-05-07 06:50:13,824 INFO [train.py:715] (0/8) Epoch 11, batch 29850, loss[loss=0.1267, simple_loss=0.2006, pruned_loss=0.02634, over 4848.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2108, pruned_loss=0.03204, over 972462.97 frames.], batch size: 30, lr: 1.93e-04 2022-05-07 06:50:52,364 INFO [train.py:715] (0/8) Epoch 11, batch 29900, loss[loss=0.1297, simple_loss=0.1981, pruned_loss=0.03068, over 4877.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2108, pruned_loss=0.03197, over 973008.24 frames.], batch size: 16, lr: 1.93e-04 2022-05-07 06:51:29,988 INFO [train.py:715] (0/8) Epoch 11, batch 29950, loss[loss=0.1102, simple_loss=0.1787, pruned_loss=0.02085, over 4800.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2096, pruned_loss=0.03157, over 971585.68 frames.], batch size: 12, lr: 1.93e-04 2022-05-07 06:52:08,176 INFO [train.py:715] (0/8) Epoch 11, batch 30000, loss[loss=0.1628, simple_loss=0.2346, pruned_loss=0.04557, over 4794.00 frames.], tot_loss[loss=0.137, simple_loss=0.2101, pruned_loss=0.03192, over 970977.41 frames.], batch size: 18, lr: 1.93e-04 2022-05-07 06:52:08,177 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 06:52:17,627 INFO [train.py:742] (0/8) Epoch 11, validation: loss=0.106, simple_loss=0.19, pruned_loss=0.01095, over 914524.00 frames. 
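[annotation] Note that tot_loss[... over ~972000 frames] is not an average over the whole epoch: the frame count stays near 970k while individual batches contribute only ~5k frames. That is the signature of a decayed, frame-weighted running average; I believe the recipe's train loop decays the accumulated totals by roughly (1 - 1/200) per batch, but treat that factor as an assumption. A small sketch that reproduces the roughly constant frame count:

    # Frame-weighted, exponentially decayed loss tracker; the 1/200 decay is an
    # assumption chosen because ~200 batches * ~4900 frames/batch ~= 0.98M,
    # close to the frame counts printed in the tot_loss[...] entries.
    DECAY = 1.0 - 1.0 / 200

    class RunningLoss:
        def __init__(self) -> None:
            self.loss_sum = 0.0   # decayed sum of per-frame loss * frames
            self.frames = 0.0     # decayed sum of frames

        def update(self, batch_loss: float, batch_frames: int) -> None:
            self.loss_sum = self.loss_sum * DECAY + batch_loss * batch_frames
            self.frames = self.frames * DECAY + batch_frames

        @property
        def loss(self) -> float:
            return self.loss_sum / self.frames

    tracker = RunningLoss()
    for _ in range(2000):                  # after many batches the window saturates
        tracker.update(0.138, 4900)
    print(round(tracker.frames), round(tracker.loss, 3))  # ~980000 frames, loss 0.138

So the printed tot_loss is effectively a smoothed view of the last couple hundred batches, which is why it moves much more slowly than the per-batch loss[...] values.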
2022-05-07 06:52:56,512 INFO [train.py:715] (0/8) Epoch 11, batch 30050, loss[loss=0.1501, simple_loss=0.2252, pruned_loss=0.03753, over 4836.00 frames.], tot_loss[loss=0.1377, simple_loss=0.211, pruned_loss=0.03221, over 971396.53 frames.], batch size: 26, lr: 1.93e-04 2022-05-07 06:53:34,385 INFO [train.py:715] (0/8) Epoch 11, batch 30100, loss[loss=0.1472, simple_loss=0.2239, pruned_loss=0.03519, over 4850.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2113, pruned_loss=0.03197, over 971643.39 frames.], batch size: 20, lr: 1.93e-04 2022-05-07 06:54:13,050 INFO [train.py:715] (0/8) Epoch 11, batch 30150, loss[loss=0.1157, simple_loss=0.191, pruned_loss=0.02024, over 4822.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2112, pruned_loss=0.03207, over 972321.37 frames.], batch size: 15, lr: 1.93e-04 2022-05-07 06:54:50,398 INFO [train.py:715] (0/8) Epoch 11, batch 30200, loss[loss=0.1275, simple_loss=0.2074, pruned_loss=0.0238, over 4776.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2114, pruned_loss=0.03212, over 971530.10 frames.], batch size: 17, lr: 1.93e-04 2022-05-07 06:55:29,242 INFO [train.py:715] (0/8) Epoch 11, batch 30250, loss[loss=0.1492, simple_loss=0.2275, pruned_loss=0.03542, over 4972.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2112, pruned_loss=0.03203, over 972067.41 frames.], batch size: 28, lr: 1.93e-04 2022-05-07 06:56:07,226 INFO [train.py:715] (0/8) Epoch 11, batch 30300, loss[loss=0.1546, simple_loss=0.2287, pruned_loss=0.04028, over 4929.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2118, pruned_loss=0.03217, over 972251.31 frames.], batch size: 35, lr: 1.93e-04 2022-05-07 06:56:45,175 INFO [train.py:715] (0/8) Epoch 11, batch 30350, loss[loss=0.127, simple_loss=0.2048, pruned_loss=0.02455, over 4885.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2119, pruned_loss=0.03244, over 971746.34 frames.], batch size: 16, lr: 1.93e-04 2022-05-07 06:57:23,260 INFO [train.py:715] (0/8) Epoch 11, batch 30400, loss[loss=0.1217, simple_loss=0.1952, pruned_loss=0.02412, over 4942.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2111, pruned_loss=0.03208, over 972130.60 frames.], batch size: 23, lr: 1.93e-04 2022-05-07 06:58:01,498 INFO [train.py:715] (0/8) Epoch 11, batch 30450, loss[loss=0.1586, simple_loss=0.2345, pruned_loss=0.04133, over 4843.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2106, pruned_loss=0.03179, over 972448.36 frames.], batch size: 13, lr: 1.93e-04 2022-05-07 06:58:39,329 INFO [train.py:715] (0/8) Epoch 11, batch 30500, loss[loss=0.09509, simple_loss=0.1626, pruned_loss=0.01378, over 4734.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2109, pruned_loss=0.03244, over 972221.07 frames.], batch size: 12, lr: 1.93e-04 2022-05-07 06:59:17,141 INFO [train.py:715] (0/8) Epoch 11, batch 30550, loss[loss=0.1445, simple_loss=0.218, pruned_loss=0.03548, over 4903.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2114, pruned_loss=0.03277, over 971810.88 frames.], batch size: 19, lr: 1.93e-04 2022-05-07 06:59:56,403 INFO [train.py:715] (0/8) Epoch 11, batch 30600, loss[loss=0.1266, simple_loss=0.2016, pruned_loss=0.02575, over 4960.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2112, pruned_loss=0.03276, over 971724.35 frames.], batch size: 21, lr: 1.93e-04 2022-05-07 07:00:35,032 INFO [train.py:715] (0/8) Epoch 11, batch 30650, loss[loss=0.1428, simple_loss=0.2234, pruned_loss=0.0311, over 4825.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2113, pruned_loss=0.03255, over 971892.74 frames.], batch size: 15, lr: 1.93e-04 2022-05-07 07:01:13,826 
INFO [train.py:715] (0/8) Epoch 11, batch 30700, loss[loss=0.1216, simple_loss=0.1938, pruned_loss=0.02473, over 4928.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2097, pruned_loss=0.03177, over 972440.39 frames.], batch size: 29, lr: 1.93e-04 2022-05-07 07:01:52,333 INFO [train.py:715] (0/8) Epoch 11, batch 30750, loss[loss=0.1564, simple_loss=0.2344, pruned_loss=0.03923, over 4839.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2105, pruned_loss=0.03211, over 971833.68 frames.], batch size: 30, lr: 1.93e-04 2022-05-07 07:02:30,945 INFO [train.py:715] (0/8) Epoch 11, batch 30800, loss[loss=0.1105, simple_loss=0.1785, pruned_loss=0.0212, over 4782.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2098, pruned_loss=0.03217, over 971351.42 frames.], batch size: 14, lr: 1.93e-04 2022-05-07 07:03:09,706 INFO [train.py:715] (0/8) Epoch 11, batch 30850, loss[loss=0.09781, simple_loss=0.1671, pruned_loss=0.01428, over 4825.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2103, pruned_loss=0.03239, over 970999.00 frames.], batch size: 13, lr: 1.93e-04 2022-05-07 07:03:48,264 INFO [train.py:715] (0/8) Epoch 11, batch 30900, loss[loss=0.1357, simple_loss=0.2097, pruned_loss=0.03087, over 4868.00 frames.], tot_loss[loss=0.138, simple_loss=0.2109, pruned_loss=0.03252, over 971361.07 frames.], batch size: 30, lr: 1.93e-04 2022-05-07 07:04:27,071 INFO [train.py:715] (0/8) Epoch 11, batch 30950, loss[loss=0.1496, simple_loss=0.2115, pruned_loss=0.04386, over 4980.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2104, pruned_loss=0.0322, over 972080.00 frames.], batch size: 14, lr: 1.93e-04 2022-05-07 07:05:06,004 INFO [train.py:715] (0/8) Epoch 11, batch 31000, loss[loss=0.1274, simple_loss=0.1922, pruned_loss=0.03129, over 4841.00 frames.], tot_loss[loss=0.138, simple_loss=0.211, pruned_loss=0.03249, over 971869.98 frames.], batch size: 13, lr: 1.93e-04 2022-05-07 07:05:44,520 INFO [train.py:715] (0/8) Epoch 11, batch 31050, loss[loss=0.1793, simple_loss=0.2362, pruned_loss=0.06117, over 4986.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2115, pruned_loss=0.03277, over 972649.60 frames.], batch size: 25, lr: 1.93e-04 2022-05-07 07:06:23,341 INFO [train.py:715] (0/8) Epoch 11, batch 31100, loss[loss=0.1291, simple_loss=0.2057, pruned_loss=0.02629, over 4805.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2109, pruned_loss=0.03202, over 972198.98 frames.], batch size: 25, lr: 1.93e-04 2022-05-07 07:07:01,738 INFO [train.py:715] (0/8) Epoch 11, batch 31150, loss[loss=0.1779, simple_loss=0.2362, pruned_loss=0.05979, over 4834.00 frames.], tot_loss[loss=0.1374, simple_loss=0.211, pruned_loss=0.03191, over 972954.36 frames.], batch size: 15, lr: 1.93e-04 2022-05-07 07:07:39,376 INFO [train.py:715] (0/8) Epoch 11, batch 31200, loss[loss=0.1135, simple_loss=0.1862, pruned_loss=0.02039, over 4822.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2123, pruned_loss=0.03264, over 972631.29 frames.], batch size: 13, lr: 1.93e-04 2022-05-07 07:08:17,479 INFO [train.py:715] (0/8) Epoch 11, batch 31250, loss[loss=0.1164, simple_loss=0.1881, pruned_loss=0.0223, over 4786.00 frames.], tot_loss[loss=0.1379, simple_loss=0.211, pruned_loss=0.03234, over 972423.99 frames.], batch size: 18, lr: 1.93e-04 2022-05-07 07:08:55,762 INFO [train.py:715] (0/8) Epoch 11, batch 31300, loss[loss=0.1132, simple_loss=0.1811, pruned_loss=0.02265, over 4847.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2109, pruned_loss=0.03235, over 972973.00 frames.], batch size: 13, lr: 1.93e-04 2022-05-07 07:09:33,558 INFO [train.py:715] (0/8) 
Epoch 11, batch 31350, loss[loss=0.1311, simple_loss=0.2048, pruned_loss=0.02867, over 4874.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2113, pruned_loss=0.03274, over 973620.23 frames.], batch size: 22, lr: 1.93e-04 2022-05-07 07:10:10,907 INFO [train.py:715] (0/8) Epoch 11, batch 31400, loss[loss=0.1518, simple_loss=0.2214, pruned_loss=0.0411, over 4814.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2118, pruned_loss=0.03255, over 972908.74 frames.], batch size: 12, lr: 1.93e-04 2022-05-07 07:10:48,403 INFO [train.py:715] (0/8) Epoch 11, batch 31450, loss[loss=0.1078, simple_loss=0.1817, pruned_loss=0.0169, over 4897.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2107, pruned_loss=0.03209, over 972675.47 frames.], batch size: 19, lr: 1.93e-04 2022-05-07 07:11:26,016 INFO [train.py:715] (0/8) Epoch 11, batch 31500, loss[loss=0.1235, simple_loss=0.2062, pruned_loss=0.02038, over 4916.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2117, pruned_loss=0.03237, over 973248.86 frames.], batch size: 23, lr: 1.93e-04 2022-05-07 07:12:03,663 INFO [train.py:715] (0/8) Epoch 11, batch 31550, loss[loss=0.1505, simple_loss=0.2154, pruned_loss=0.0428, over 4782.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2115, pruned_loss=0.03235, over 973646.67 frames.], batch size: 14, lr: 1.93e-04 2022-05-07 07:12:41,666 INFO [train.py:715] (0/8) Epoch 11, batch 31600, loss[loss=0.1511, simple_loss=0.2285, pruned_loss=0.03686, over 4806.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2115, pruned_loss=0.03258, over 973532.23 frames.], batch size: 25, lr: 1.93e-04 2022-05-07 07:13:19,753 INFO [train.py:715] (0/8) Epoch 11, batch 31650, loss[loss=0.146, simple_loss=0.2311, pruned_loss=0.03048, over 4968.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2126, pruned_loss=0.03322, over 973779.84 frames.], batch size: 24, lr: 1.93e-04 2022-05-07 07:13:57,686 INFO [train.py:715] (0/8) Epoch 11, batch 31700, loss[loss=0.1115, simple_loss=0.1843, pruned_loss=0.01935, over 4814.00 frames.], tot_loss[loss=0.1397, simple_loss=0.2126, pruned_loss=0.0334, over 972861.29 frames.], batch size: 12, lr: 1.93e-04 2022-05-07 07:14:35,210 INFO [train.py:715] (0/8) Epoch 11, batch 31750, loss[loss=0.1435, simple_loss=0.2236, pruned_loss=0.03172, over 4934.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2129, pruned_loss=0.03372, over 972815.76 frames.], batch size: 21, lr: 1.93e-04 2022-05-07 07:15:14,058 INFO [train.py:715] (0/8) Epoch 11, batch 31800, loss[loss=0.1525, simple_loss=0.2218, pruned_loss=0.04159, over 4966.00 frames.], tot_loss[loss=0.1401, simple_loss=0.2129, pruned_loss=0.03359, over 971915.15 frames.], batch size: 35, lr: 1.93e-04 2022-05-07 07:15:52,635 INFO [train.py:715] (0/8) Epoch 11, batch 31850, loss[loss=0.1548, simple_loss=0.2261, pruned_loss=0.04178, over 4776.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2127, pruned_loss=0.03346, over 972040.32 frames.], batch size: 18, lr: 1.93e-04 2022-05-07 07:16:30,868 INFO [train.py:715] (0/8) Epoch 11, batch 31900, loss[loss=0.1272, simple_loss=0.1988, pruned_loss=0.02785, over 4800.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2109, pruned_loss=0.03291, over 971463.82 frames.], batch size: 24, lr: 1.93e-04 2022-05-07 07:17:09,160 INFO [train.py:715] (0/8) Epoch 11, batch 31950, loss[loss=0.134, simple_loss=0.2085, pruned_loss=0.02978, over 4817.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2105, pruned_loss=0.03288, over 971850.29 frames.], batch size: 15, lr: 1.93e-04 2022-05-07 07:17:47,941 INFO [train.py:715] (0/8) Epoch 11, batch 32000, 
loss[loss=0.1484, simple_loss=0.2157, pruned_loss=0.04054, over 4793.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2112, pruned_loss=0.03284, over 972183.43 frames.], batch size: 24, lr: 1.93e-04 2022-05-07 07:18:26,167 INFO [train.py:715] (0/8) Epoch 11, batch 32050, loss[loss=0.1306, simple_loss=0.1959, pruned_loss=0.03266, over 4800.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2114, pruned_loss=0.03274, over 972357.87 frames.], batch size: 12, lr: 1.93e-04 2022-05-07 07:19:04,551 INFO [train.py:715] (0/8) Epoch 11, batch 32100, loss[loss=0.1292, simple_loss=0.2017, pruned_loss=0.02842, over 4849.00 frames.], tot_loss[loss=0.1382, simple_loss=0.211, pruned_loss=0.03272, over 972500.92 frames.], batch size: 32, lr: 1.92e-04 2022-05-07 07:19:42,569 INFO [train.py:715] (0/8) Epoch 11, batch 32150, loss[loss=0.1368, simple_loss=0.2066, pruned_loss=0.03355, over 4840.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2106, pruned_loss=0.03234, over 972003.74 frames.], batch size: 32, lr: 1.92e-04 2022-05-07 07:20:19,988 INFO [train.py:715] (0/8) Epoch 11, batch 32200, loss[loss=0.1408, simple_loss=0.2199, pruned_loss=0.03086, over 4899.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2113, pruned_loss=0.03251, over 971781.36 frames.], batch size: 19, lr: 1.92e-04 2022-05-07 07:20:57,513 INFO [train.py:715] (0/8) Epoch 11, batch 32250, loss[loss=0.1648, simple_loss=0.2252, pruned_loss=0.05218, over 4651.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2111, pruned_loss=0.03298, over 971791.28 frames.], batch size: 13, lr: 1.92e-04 2022-05-07 07:21:35,345 INFO [train.py:715] (0/8) Epoch 11, batch 32300, loss[loss=0.1234, simple_loss=0.1934, pruned_loss=0.02672, over 4785.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2106, pruned_loss=0.0326, over 970818.34 frames.], batch size: 17, lr: 1.92e-04 2022-05-07 07:22:13,995 INFO [train.py:715] (0/8) Epoch 11, batch 32350, loss[loss=0.1769, simple_loss=0.2466, pruned_loss=0.05359, over 4902.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2109, pruned_loss=0.03243, over 971409.99 frames.], batch size: 17, lr: 1.92e-04 2022-05-07 07:22:51,414 INFO [train.py:715] (0/8) Epoch 11, batch 32400, loss[loss=0.13, simple_loss=0.1949, pruned_loss=0.03254, over 4917.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2109, pruned_loss=0.03202, over 972341.06 frames.], batch size: 29, lr: 1.92e-04 2022-05-07 07:23:29,416 INFO [train.py:715] (0/8) Epoch 11, batch 32450, loss[loss=0.154, simple_loss=0.2284, pruned_loss=0.03985, over 4915.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2117, pruned_loss=0.03248, over 972605.30 frames.], batch size: 23, lr: 1.92e-04 2022-05-07 07:24:07,456 INFO [train.py:715] (0/8) Epoch 11, batch 32500, loss[loss=0.1076, simple_loss=0.1892, pruned_loss=0.01303, over 4926.00 frames.], tot_loss[loss=0.1387, simple_loss=0.212, pruned_loss=0.03274, over 973386.23 frames.], batch size: 23, lr: 1.92e-04 2022-05-07 07:24:45,515 INFO [train.py:715] (0/8) Epoch 11, batch 32550, loss[loss=0.1405, simple_loss=0.2196, pruned_loss=0.03065, over 4882.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2123, pruned_loss=0.0329, over 972249.96 frames.], batch size: 22, lr: 1.92e-04 2022-05-07 07:25:23,170 INFO [train.py:715] (0/8) Epoch 11, batch 32600, loss[loss=0.1472, simple_loss=0.2223, pruned_loss=0.03602, over 4770.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2126, pruned_loss=0.03332, over 971638.97 frames.], batch size: 17, lr: 1.92e-04 2022-05-07 07:26:01,255 INFO [train.py:715] (0/8) Epoch 11, batch 32650, loss[loss=0.1452, 
simple_loss=0.233, pruned_loss=0.0287, over 4793.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2122, pruned_loss=0.03303, over 972005.14 frames.], batch size: 21, lr: 1.92e-04 2022-05-07 07:26:39,441 INFO [train.py:715] (0/8) Epoch 11, batch 32700, loss[loss=0.1351, simple_loss=0.2049, pruned_loss=0.0327, over 4855.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2119, pruned_loss=0.03282, over 971691.45 frames.], batch size: 34, lr: 1.92e-04 2022-05-07 07:27:16,888 INFO [train.py:715] (0/8) Epoch 11, batch 32750, loss[loss=0.1114, simple_loss=0.1888, pruned_loss=0.01701, over 4799.00 frames.], tot_loss[loss=0.138, simple_loss=0.2113, pruned_loss=0.03237, over 972023.60 frames.], batch size: 14, lr: 1.92e-04 2022-05-07 07:27:55,655 INFO [train.py:715] (0/8) Epoch 11, batch 32800, loss[loss=0.1268, simple_loss=0.1998, pruned_loss=0.0269, over 4944.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2114, pruned_loss=0.03237, over 972705.76 frames.], batch size: 29, lr: 1.92e-04 2022-05-07 07:28:35,366 INFO [train.py:715] (0/8) Epoch 11, batch 32850, loss[loss=0.1376, simple_loss=0.2093, pruned_loss=0.03298, over 4953.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2116, pruned_loss=0.03244, over 973700.80 frames.], batch size: 21, lr: 1.92e-04 2022-05-07 07:29:13,918 INFO [train.py:715] (0/8) Epoch 11, batch 32900, loss[loss=0.1577, simple_loss=0.2389, pruned_loss=0.03825, over 4809.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2107, pruned_loss=0.032, over 972957.58 frames.], batch size: 21, lr: 1.92e-04 2022-05-07 07:29:52,130 INFO [train.py:715] (0/8) Epoch 11, batch 32950, loss[loss=0.1291, simple_loss=0.2059, pruned_loss=0.02616, over 4775.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2118, pruned_loss=0.0327, over 973080.23 frames.], batch size: 17, lr: 1.92e-04 2022-05-07 07:30:31,046 INFO [train.py:715] (0/8) Epoch 11, batch 33000, loss[loss=0.1305, simple_loss=0.1963, pruned_loss=0.0324, over 4980.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2119, pruned_loss=0.03263, over 973238.75 frames.], batch size: 35, lr: 1.92e-04 2022-05-07 07:30:31,047 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 07:30:40,494 INFO [train.py:742] (0/8) Epoch 11, validation: loss=0.1059, simple_loss=0.1899, pruned_loss=0.0109, over 914524.00 frames. 
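[annotation] The learning rate in these entries decays very slowly within the epoch (1.94e-04 down to 1.92e-04 over roughly ten thousand batches) and then drops to 1.85e-04 the moment epoch 12 starts a few entries below. That pattern matches an Eden-style schedule, where the lr is a base value times a batch-dependent factor and an epoch-dependent factor. A sketch, treating the base lr of 3e-3 and the 5000-batch / 4-epoch constants as assumptions about this run:

    # Assumed Eden-style schedule:
    # lr = base_lr * ((batch^2 + B^2)/B^2)^-0.25 * ((epoch^2 + E^2)/E^2)^-0.25
    BASE_LR = 3e-3     # assumed initial lr for this run
    LR_BATCHES = 5000  # assumed batch constant B
    LR_EPOCHS = 4      # assumed epoch constant E

    def eden_lr(batch: int, epoch: int) -> float:
        batch_factor = ((batch**2 + LR_BATCHES**2) / LR_BATCHES**2) ** -0.25
        epoch_factor = ((epoch**2 + LR_EPOCHS**2) / LR_EPOCHS**2) ** -0.25
        return BASE_LR * batch_factor * epoch_factor

    print(f"{eden_lr(408000, 11):.2e}")  # ~1.94e-04, matching the epoch-11 entries
    print(f"{eden_lr(416000, 12):.2e}")  # ~1.85e-04, matching the first epoch-12 entries below

Under these assumed constants the formula reproduces both the slow within-epoch decay (the batch counter dominates once it is in the hundreds of thousands) and the step change at the epoch boundary visible after the epoch-11.pt save below.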
2022-05-07 07:31:19,414 INFO [train.py:715] (0/8) Epoch 11, batch 33050, loss[loss=0.1297, simple_loss=0.2133, pruned_loss=0.02302, over 4885.00 frames.], tot_loss[loss=0.139, simple_loss=0.2121, pruned_loss=0.03291, over 972913.84 frames.], batch size: 22, lr: 1.92e-04 2022-05-07 07:31:55,481 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-416000.pt 2022-05-07 07:32:00,917 INFO [train.py:715] (0/8) Epoch 11, batch 33100, loss[loss=0.1448, simple_loss=0.228, pruned_loss=0.03075, over 4936.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2129, pruned_loss=0.03296, over 972869.00 frames.], batch size: 29, lr: 1.92e-04 2022-05-07 07:32:38,875 INFO [train.py:715] (0/8) Epoch 11, batch 33150, loss[loss=0.1083, simple_loss=0.1807, pruned_loss=0.01792, over 4780.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2119, pruned_loss=0.03268, over 972624.78 frames.], batch size: 12, lr: 1.92e-04 2022-05-07 07:33:17,489 INFO [train.py:715] (0/8) Epoch 11, batch 33200, loss[loss=0.1121, simple_loss=0.1891, pruned_loss=0.01749, over 4920.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2117, pruned_loss=0.03284, over 972921.06 frames.], batch size: 29, lr: 1.92e-04 2022-05-07 07:33:56,615 INFO [train.py:715] (0/8) Epoch 11, batch 33250, loss[loss=0.1283, simple_loss=0.2068, pruned_loss=0.02494, over 4748.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2118, pruned_loss=0.03274, over 973479.77 frames.], batch size: 19, lr: 1.92e-04 2022-05-07 07:34:35,408 INFO [train.py:715] (0/8) Epoch 11, batch 33300, loss[loss=0.127, simple_loss=0.204, pruned_loss=0.02502, over 4942.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2119, pruned_loss=0.03254, over 973844.33 frames.], batch size: 29, lr: 1.92e-04 2022-05-07 07:35:13,282 INFO [train.py:715] (0/8) Epoch 11, batch 33350, loss[loss=0.1774, simple_loss=0.2389, pruned_loss=0.05793, over 4961.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2109, pruned_loss=0.03205, over 973308.83 frames.], batch size: 24, lr: 1.92e-04 2022-05-07 07:35:51,700 INFO [train.py:715] (0/8) Epoch 11, batch 33400, loss[loss=0.1471, simple_loss=0.2221, pruned_loss=0.03598, over 4753.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2116, pruned_loss=0.03259, over 972619.64 frames.], batch size: 16, lr: 1.92e-04 2022-05-07 07:36:30,390 INFO [train.py:715] (0/8) Epoch 11, batch 33450, loss[loss=0.1433, simple_loss=0.2163, pruned_loss=0.03513, over 4826.00 frames.], tot_loss[loss=0.139, simple_loss=0.2121, pruned_loss=0.03299, over 971108.59 frames.], batch size: 15, lr: 1.92e-04 2022-05-07 07:37:08,728 INFO [train.py:715] (0/8) Epoch 11, batch 33500, loss[loss=0.1345, simple_loss=0.2087, pruned_loss=0.03014, over 4691.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2126, pruned_loss=0.03312, over 970897.62 frames.], batch size: 15, lr: 1.92e-04 2022-05-07 07:37:47,172 INFO [train.py:715] (0/8) Epoch 11, batch 33550, loss[loss=0.141, simple_loss=0.2167, pruned_loss=0.03271, over 4787.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2127, pruned_loss=0.03304, over 971522.53 frames.], batch size: 23, lr: 1.92e-04 2022-05-07 07:38:25,758 INFO [train.py:715] (0/8) Epoch 11, batch 33600, loss[loss=0.1274, simple_loss=0.2033, pruned_loss=0.0257, over 4935.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2125, pruned_loss=0.03248, over 971728.35 frames.], batch size: 29, lr: 1.92e-04 2022-05-07 07:39:04,163 INFO [train.py:715] (0/8) Epoch 11, batch 33650, loss[loss=0.1365, simple_loss=0.2022, pruned_loss=0.03545, over 4780.00 frames.], 
tot_loss[loss=0.1382, simple_loss=0.212, pruned_loss=0.03217, over 971936.48 frames.], batch size: 17, lr: 1.92e-04 2022-05-07 07:39:42,288 INFO [train.py:715] (0/8) Epoch 11, batch 33700, loss[loss=0.1453, simple_loss=0.2089, pruned_loss=0.04084, over 4750.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2113, pruned_loss=0.03221, over 971985.28 frames.], batch size: 16, lr: 1.92e-04 2022-05-07 07:40:20,575 INFO [train.py:715] (0/8) Epoch 11, batch 33750, loss[loss=0.1348, simple_loss=0.2122, pruned_loss=0.02867, over 4874.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2102, pruned_loss=0.03172, over 971661.95 frames.], batch size: 12, lr: 1.92e-04 2022-05-07 07:40:59,157 INFO [train.py:715] (0/8) Epoch 11, batch 33800, loss[loss=0.1516, simple_loss=0.2306, pruned_loss=0.0363, over 4960.00 frames.], tot_loss[loss=0.1364, simple_loss=0.21, pruned_loss=0.03139, over 971832.17 frames.], batch size: 21, lr: 1.92e-04 2022-05-07 07:41:37,144 INFO [train.py:715] (0/8) Epoch 11, batch 33850, loss[loss=0.1372, simple_loss=0.2152, pruned_loss=0.02957, over 4940.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2099, pruned_loss=0.03121, over 972114.32 frames.], batch size: 29, lr: 1.92e-04 2022-05-07 07:42:15,178 INFO [train.py:715] (0/8) Epoch 11, batch 33900, loss[loss=0.1421, simple_loss=0.2136, pruned_loss=0.03527, over 4867.00 frames.], tot_loss[loss=0.1366, simple_loss=0.21, pruned_loss=0.03162, over 971781.91 frames.], batch size: 32, lr: 1.92e-04 2022-05-07 07:42:53,931 INFO [train.py:715] (0/8) Epoch 11, batch 33950, loss[loss=0.1663, simple_loss=0.2295, pruned_loss=0.05159, over 4952.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2104, pruned_loss=0.03168, over 971826.53 frames.], batch size: 35, lr: 1.92e-04 2022-05-07 07:43:32,249 INFO [train.py:715] (0/8) Epoch 11, batch 34000, loss[loss=0.1387, simple_loss=0.2081, pruned_loss=0.03467, over 4851.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2113, pruned_loss=0.03185, over 971421.33 frames.], batch size: 30, lr: 1.92e-04 2022-05-07 07:44:10,348 INFO [train.py:715] (0/8) Epoch 11, batch 34050, loss[loss=0.1282, simple_loss=0.2016, pruned_loss=0.02739, over 4993.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2118, pruned_loss=0.03243, over 971861.61 frames.], batch size: 20, lr: 1.92e-04 2022-05-07 07:44:48,871 INFO [train.py:715] (0/8) Epoch 11, batch 34100, loss[loss=0.1188, simple_loss=0.1916, pruned_loss=0.02297, over 4795.00 frames.], tot_loss[loss=0.1374, simple_loss=0.211, pruned_loss=0.03187, over 971317.60 frames.], batch size: 24, lr: 1.92e-04 2022-05-07 07:45:27,610 INFO [train.py:715] (0/8) Epoch 11, batch 34150, loss[loss=0.1284, simple_loss=0.1962, pruned_loss=0.03031, over 4691.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2124, pruned_loss=0.03262, over 971421.71 frames.], batch size: 15, lr: 1.92e-04 2022-05-07 07:46:05,698 INFO [train.py:715] (0/8) Epoch 11, batch 34200, loss[loss=0.1435, simple_loss=0.2263, pruned_loss=0.03033, over 4799.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2123, pruned_loss=0.0327, over 971759.51 frames.], batch size: 24, lr: 1.92e-04 2022-05-07 07:46:44,123 INFO [train.py:715] (0/8) Epoch 11, batch 34250, loss[loss=0.1629, simple_loss=0.2336, pruned_loss=0.04614, over 4748.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2123, pruned_loss=0.03315, over 972487.83 frames.], batch size: 16, lr: 1.92e-04 2022-05-07 07:47:23,285 INFO [train.py:715] (0/8) Epoch 11, batch 34300, loss[loss=0.1067, simple_loss=0.1717, pruned_loss=0.02084, over 4784.00 frames.], tot_loss[loss=0.1394, 
simple_loss=0.2124, pruned_loss=0.03316, over 972719.77 frames.], batch size: 12, lr: 1.92e-04 2022-05-07 07:48:01,578 INFO [train.py:715] (0/8) Epoch 11, batch 34350, loss[loss=0.1291, simple_loss=0.2046, pruned_loss=0.02685, over 4862.00 frames.], tot_loss[loss=0.14, simple_loss=0.2129, pruned_loss=0.03351, over 973046.54 frames.], batch size: 16, lr: 1.92e-04 2022-05-07 07:48:40,021 INFO [train.py:715] (0/8) Epoch 11, batch 34400, loss[loss=0.1434, simple_loss=0.2129, pruned_loss=0.0369, over 4988.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2125, pruned_loss=0.03312, over 973204.63 frames.], batch size: 14, lr: 1.92e-04 2022-05-07 07:49:18,671 INFO [train.py:715] (0/8) Epoch 11, batch 34450, loss[loss=0.1207, simple_loss=0.1898, pruned_loss=0.02584, over 4794.00 frames.], tot_loss[loss=0.1398, simple_loss=0.2128, pruned_loss=0.03339, over 973102.04 frames.], batch size: 14, lr: 1.92e-04 2022-05-07 07:49:57,848 INFO [train.py:715] (0/8) Epoch 11, batch 34500, loss[loss=0.137, simple_loss=0.2147, pruned_loss=0.02964, over 4917.00 frames.], tot_loss[loss=0.139, simple_loss=0.212, pruned_loss=0.03295, over 972313.39 frames.], batch size: 18, lr: 1.92e-04 2022-05-07 07:50:35,955 INFO [train.py:715] (0/8) Epoch 11, batch 34550, loss[loss=0.1222, simple_loss=0.1974, pruned_loss=0.02353, over 4929.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2118, pruned_loss=0.03228, over 973021.67 frames.], batch size: 29, lr: 1.92e-04 2022-05-07 07:51:12,738 INFO [train.py:715] (0/8) Epoch 11, batch 34600, loss[loss=0.1585, simple_loss=0.2429, pruned_loss=0.03706, over 4890.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2118, pruned_loss=0.03245, over 972721.13 frames.], batch size: 19, lr: 1.92e-04 2022-05-07 07:51:50,529 INFO [train.py:715] (0/8) Epoch 11, batch 34650, loss[loss=0.1239, simple_loss=0.1989, pruned_loss=0.02444, over 4800.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2118, pruned_loss=0.03254, over 973226.95 frames.], batch size: 14, lr: 1.92e-04 2022-05-07 07:52:27,795 INFO [train.py:715] (0/8) Epoch 11, batch 34700, loss[loss=0.1388, simple_loss=0.2189, pruned_loss=0.02936, over 4828.00 frames.], tot_loss[loss=0.1375, simple_loss=0.211, pruned_loss=0.03197, over 972453.78 frames.], batch size: 27, lr: 1.92e-04 2022-05-07 07:53:04,316 INFO [train.py:715] (0/8) Epoch 11, batch 34750, loss[loss=0.13, simple_loss=0.2065, pruned_loss=0.02673, over 4860.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2109, pruned_loss=0.03178, over 971515.27 frames.], batch size: 20, lr: 1.92e-04 2022-05-07 07:53:39,311 INFO [train.py:715] (0/8) Epoch 11, batch 34800, loss[loss=0.1441, simple_loss=0.2202, pruned_loss=0.03395, over 4910.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2112, pruned_loss=0.03194, over 971835.49 frames.], batch size: 18, lr: 1.92e-04 2022-05-07 07:53:47,382 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/epoch-11.pt 2022-05-07 07:54:26,267 INFO [train.py:715] (0/8) Epoch 12, batch 0, loss[loss=0.1359, simple_loss=0.2173, pruned_loss=0.02723, over 4761.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2173, pruned_loss=0.02723, over 4761.00 frames.], batch size: 19, lr: 1.85e-04 2022-05-07 07:55:04,628 INFO [train.py:715] (0/8) Epoch 12, batch 50, loss[loss=0.1313, simple_loss=0.2101, pruned_loss=0.02628, over 4754.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2101, pruned_loss=0.0325, over 219130.32 frames.], batch size: 16, lr: 1.85e-04 2022-05-07 07:55:42,694 INFO [train.py:715] (0/8) Epoch 12, batch 100, loss[loss=0.1216, 
simple_loss=0.1962, pruned_loss=0.02348, over 4963.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2111, pruned_loss=0.0326, over 385793.12 frames.], batch size: 24, lr: 1.85e-04 2022-05-07 07:56:21,317 INFO [train.py:715] (0/8) Epoch 12, batch 150, loss[loss=0.1471, simple_loss=0.2254, pruned_loss=0.03436, over 4786.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2113, pruned_loss=0.03289, over 516624.06 frames.], batch size: 18, lr: 1.85e-04 2022-05-07 07:56:59,061 INFO [train.py:715] (0/8) Epoch 12, batch 200, loss[loss=0.1411, simple_loss=0.2166, pruned_loss=0.03284, over 4853.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2112, pruned_loss=0.0331, over 618638.57 frames.], batch size: 20, lr: 1.85e-04 2022-05-07 07:57:38,278 INFO [train.py:715] (0/8) Epoch 12, batch 250, loss[loss=0.1372, simple_loss=0.2093, pruned_loss=0.03252, over 4814.00 frames.], tot_loss[loss=0.139, simple_loss=0.2119, pruned_loss=0.03301, over 696468.27 frames.], batch size: 25, lr: 1.85e-04 2022-05-07 07:58:16,538 INFO [train.py:715] (0/8) Epoch 12, batch 300, loss[loss=0.134, simple_loss=0.209, pruned_loss=0.0295, over 4702.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2108, pruned_loss=0.03225, over 758517.03 frames.], batch size: 15, lr: 1.84e-04 2022-05-07 07:58:54,469 INFO [train.py:715] (0/8) Epoch 12, batch 350, loss[loss=0.1084, simple_loss=0.1799, pruned_loss=0.01846, over 4804.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2102, pruned_loss=0.03178, over 806026.23 frames.], batch size: 12, lr: 1.84e-04 2022-05-07 07:59:32,944 INFO [train.py:715] (0/8) Epoch 12, batch 400, loss[loss=0.1635, simple_loss=0.2349, pruned_loss=0.04603, over 4776.00 frames.], tot_loss[loss=0.1375, simple_loss=0.211, pruned_loss=0.03206, over 842450.13 frames.], batch size: 18, lr: 1.84e-04 2022-05-07 08:00:10,575 INFO [train.py:715] (0/8) Epoch 12, batch 450, loss[loss=0.1461, simple_loss=0.2251, pruned_loss=0.03352, over 4823.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2111, pruned_loss=0.03201, over 871111.08 frames.], batch size: 15, lr: 1.84e-04 2022-05-07 08:00:48,785 INFO [train.py:715] (0/8) Epoch 12, batch 500, loss[loss=0.1201, simple_loss=0.1879, pruned_loss=0.02614, over 4947.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2111, pruned_loss=0.03213, over 892867.21 frames.], batch size: 24, lr: 1.84e-04 2022-05-07 08:01:26,230 INFO [train.py:715] (0/8) Epoch 12, batch 550, loss[loss=0.1569, simple_loss=0.2317, pruned_loss=0.04108, over 4875.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2118, pruned_loss=0.03252, over 909799.04 frames.], batch size: 16, lr: 1.84e-04 2022-05-07 08:02:04,572 INFO [train.py:715] (0/8) Epoch 12, batch 600, loss[loss=0.1422, simple_loss=0.2122, pruned_loss=0.03609, over 4950.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2114, pruned_loss=0.03249, over 923935.91 frames.], batch size: 39, lr: 1.84e-04 2022-05-07 08:02:41,615 INFO [train.py:715] (0/8) Epoch 12, batch 650, loss[loss=0.1233, simple_loss=0.1975, pruned_loss=0.02458, over 4835.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2111, pruned_loss=0.03215, over 934450.99 frames.], batch size: 20, lr: 1.84e-04 2022-05-07 08:03:20,185 INFO [train.py:715] (0/8) Epoch 12, batch 700, loss[loss=0.1376, simple_loss=0.2101, pruned_loss=0.03255, over 4890.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2117, pruned_loss=0.03267, over 943342.98 frames.], batch size: 22, lr: 1.84e-04 2022-05-07 08:03:58,805 INFO [train.py:715] (0/8) Epoch 12, batch 750, loss[loss=0.1284, simple_loss=0.2152, pruned_loss=0.02079, over 
4803.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2105, pruned_loss=0.03199, over 949221.23 frames.], batch size: 25, lr: 1.84e-04 2022-05-07 08:04:37,569 INFO [train.py:715] (0/8) Epoch 12, batch 800, loss[loss=0.1149, simple_loss=0.1866, pruned_loss=0.02156, over 4772.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2103, pruned_loss=0.03214, over 953585.17 frames.], batch size: 12, lr: 1.84e-04 2022-05-07 08:05:16,047 INFO [train.py:715] (0/8) Epoch 12, batch 850, loss[loss=0.1516, simple_loss=0.2217, pruned_loss=0.04069, over 4973.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2104, pruned_loss=0.03202, over 958250.83 frames.], batch size: 35, lr: 1.84e-04 2022-05-07 08:05:54,149 INFO [train.py:715] (0/8) Epoch 12, batch 900, loss[loss=0.1212, simple_loss=0.2019, pruned_loss=0.02029, over 4754.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2102, pruned_loss=0.03151, over 961710.40 frames.], batch size: 16, lr: 1.84e-04 2022-05-07 08:06:32,487 INFO [train.py:715] (0/8) Epoch 12, batch 950, loss[loss=0.1267, simple_loss=0.1996, pruned_loss=0.0269, over 4846.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2096, pruned_loss=0.03174, over 964249.59 frames.], batch size: 13, lr: 1.84e-04 2022-05-07 08:07:09,847 INFO [train.py:715] (0/8) Epoch 12, batch 1000, loss[loss=0.1198, simple_loss=0.1877, pruned_loss=0.02599, over 4815.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2095, pruned_loss=0.03176, over 966160.46 frames.], batch size: 12, lr: 1.84e-04 2022-05-07 08:07:47,341 INFO [train.py:715] (0/8) Epoch 12, batch 1050, loss[loss=0.1346, simple_loss=0.207, pruned_loss=0.03105, over 4877.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2099, pruned_loss=0.03177, over 968018.65 frames.], batch size: 22, lr: 1.84e-04 2022-05-07 08:08:25,192 INFO [train.py:715] (0/8) Epoch 12, batch 1100, loss[loss=0.1451, simple_loss=0.2285, pruned_loss=0.03085, over 4821.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2106, pruned_loss=0.03204, over 969337.23 frames.], batch size: 15, lr: 1.84e-04 2022-05-07 08:09:03,086 INFO [train.py:715] (0/8) Epoch 12, batch 1150, loss[loss=0.1343, simple_loss=0.2102, pruned_loss=0.02914, over 4876.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2106, pruned_loss=0.03212, over 970244.38 frames.], batch size: 16, lr: 1.84e-04 2022-05-07 08:09:41,428 INFO [train.py:715] (0/8) Epoch 12, batch 1200, loss[loss=0.145, simple_loss=0.2176, pruned_loss=0.03617, over 4807.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2104, pruned_loss=0.03219, over 971449.00 frames.], batch size: 21, lr: 1.84e-04 2022-05-07 08:10:18,736 INFO [train.py:715] (0/8) Epoch 12, batch 1250, loss[loss=0.1329, simple_loss=0.2128, pruned_loss=0.02646, over 4822.00 frames.], tot_loss[loss=0.138, simple_loss=0.2109, pruned_loss=0.03259, over 971492.91 frames.], batch size: 26, lr: 1.84e-04 2022-05-07 08:10:56,841 INFO [train.py:715] (0/8) Epoch 12, batch 1300, loss[loss=0.1874, simple_loss=0.2462, pruned_loss=0.06435, over 4939.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2111, pruned_loss=0.03232, over 971086.01 frames.], batch size: 35, lr: 1.84e-04 2022-05-07 08:11:33,984 INFO [train.py:715] (0/8) Epoch 12, batch 1350, loss[loss=0.1403, simple_loss=0.2165, pruned_loss=0.03208, over 4932.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2107, pruned_loss=0.03178, over 971795.75 frames.], batch size: 35, lr: 1.84e-04 2022-05-07 08:12:12,108 INFO [train.py:715] (0/8) Epoch 12, batch 1400, loss[loss=0.1328, simple_loss=0.2109, pruned_loss=0.02734, over 4823.00 frames.], tot_loss[loss=0.1368, 
simple_loss=0.2103, pruned_loss=0.03162, over 972166.85 frames.], batch size: 26, lr: 1.84e-04 2022-05-07 08:12:49,760 INFO [train.py:715] (0/8) Epoch 12, batch 1450, loss[loss=0.1175, simple_loss=0.1892, pruned_loss=0.02291, over 4871.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2108, pruned_loss=0.03188, over 972697.71 frames.], batch size: 16, lr: 1.84e-04 2022-05-07 08:13:27,677 INFO [train.py:715] (0/8) Epoch 12, batch 1500, loss[loss=0.1032, simple_loss=0.1708, pruned_loss=0.01786, over 4886.00 frames.], tot_loss[loss=0.137, simple_loss=0.2105, pruned_loss=0.03173, over 973823.13 frames.], batch size: 22, lr: 1.84e-04 2022-05-07 08:14:05,275 INFO [train.py:715] (0/8) Epoch 12, batch 1550, loss[loss=0.1313, simple_loss=0.1964, pruned_loss=0.03304, over 4921.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2108, pruned_loss=0.03242, over 972783.83 frames.], batch size: 29, lr: 1.84e-04 2022-05-07 08:14:42,468 INFO [train.py:715] (0/8) Epoch 12, batch 1600, loss[loss=0.1519, simple_loss=0.2137, pruned_loss=0.04503, over 4855.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2106, pruned_loss=0.03236, over 971978.70 frames.], batch size: 34, lr: 1.84e-04 2022-05-07 08:15:20,487 INFO [train.py:715] (0/8) Epoch 12, batch 1650, loss[loss=0.1425, simple_loss=0.2143, pruned_loss=0.03538, over 4693.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2106, pruned_loss=0.03255, over 972446.81 frames.], batch size: 15, lr: 1.84e-04 2022-05-07 08:15:57,865 INFO [train.py:715] (0/8) Epoch 12, batch 1700, loss[loss=0.1497, simple_loss=0.2144, pruned_loss=0.04248, over 4790.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2108, pruned_loss=0.03271, over 972815.51 frames.], batch size: 14, lr: 1.84e-04 2022-05-07 08:16:35,312 INFO [train.py:715] (0/8) Epoch 12, batch 1750, loss[loss=0.1251, simple_loss=0.2044, pruned_loss=0.02291, over 4818.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2111, pruned_loss=0.0329, over 972524.89 frames.], batch size: 25, lr: 1.84e-04 2022-05-07 08:17:12,470 INFO [train.py:715] (0/8) Epoch 12, batch 1800, loss[loss=0.1311, simple_loss=0.2056, pruned_loss=0.02827, over 4979.00 frames.], tot_loss[loss=0.139, simple_loss=0.2115, pruned_loss=0.0332, over 972827.18 frames.], batch size: 25, lr: 1.84e-04 2022-05-07 08:17:50,166 INFO [train.py:715] (0/8) Epoch 12, batch 1850, loss[loss=0.1266, simple_loss=0.2095, pruned_loss=0.02185, over 4725.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2111, pruned_loss=0.0333, over 972954.30 frames.], batch size: 16, lr: 1.84e-04 2022-05-07 08:18:27,656 INFO [train.py:715] (0/8) Epoch 12, batch 1900, loss[loss=0.1192, simple_loss=0.1929, pruned_loss=0.02272, over 4888.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2099, pruned_loss=0.03239, over 972979.53 frames.], batch size: 22, lr: 1.84e-04 2022-05-07 08:19:05,282 INFO [train.py:715] (0/8) Epoch 12, batch 1950, loss[loss=0.1461, simple_loss=0.2299, pruned_loss=0.03114, over 4793.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2097, pruned_loss=0.03198, over 972371.09 frames.], batch size: 18, lr: 1.84e-04 2022-05-07 08:19:43,100 INFO [train.py:715] (0/8) Epoch 12, batch 2000, loss[loss=0.155, simple_loss=0.2199, pruned_loss=0.04504, over 4881.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2104, pruned_loss=0.03215, over 973281.92 frames.], batch size: 16, lr: 1.84e-04 2022-05-07 08:20:21,261 INFO [train.py:715] (0/8) Epoch 12, batch 2050, loss[loss=0.1381, simple_loss=0.2218, pruned_loss=0.02715, over 4936.00 frames.], tot_loss[loss=0.1381, simple_loss=0.211, 
pruned_loss=0.03264, over 973790.01 frames.], batch size: 23, lr: 1.84e-04 2022-05-07 08:20:59,318 INFO [train.py:715] (0/8) Epoch 12, batch 2100, loss[loss=0.1445, simple_loss=0.2169, pruned_loss=0.03607, over 4903.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2112, pruned_loss=0.03271, over 973666.51 frames.], batch size: 19, lr: 1.84e-04 2022-05-07 08:21:36,623 INFO [train.py:715] (0/8) Epoch 12, batch 2150, loss[loss=0.111, simple_loss=0.1907, pruned_loss=0.01564, over 4940.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2114, pruned_loss=0.03273, over 972690.43 frames.], batch size: 23, lr: 1.84e-04 2022-05-07 08:22:14,596 INFO [train.py:715] (0/8) Epoch 12, batch 2200, loss[loss=0.1333, simple_loss=0.2121, pruned_loss=0.02729, over 4768.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2116, pruned_loss=0.03247, over 972476.71 frames.], batch size: 19, lr: 1.84e-04 2022-05-07 08:22:52,520 INFO [train.py:715] (0/8) Epoch 12, batch 2250, loss[loss=0.16, simple_loss=0.2455, pruned_loss=0.03723, over 4889.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2121, pruned_loss=0.03321, over 972413.55 frames.], batch size: 19, lr: 1.84e-04 2022-05-07 08:23:30,598 INFO [train.py:715] (0/8) Epoch 12, batch 2300, loss[loss=0.1369, simple_loss=0.2026, pruned_loss=0.03555, over 4893.00 frames.], tot_loss[loss=0.1385, simple_loss=0.211, pruned_loss=0.03299, over 972442.57 frames.], batch size: 17, lr: 1.84e-04 2022-05-07 08:24:07,785 INFO [train.py:715] (0/8) Epoch 12, batch 2350, loss[loss=0.1213, simple_loss=0.1965, pruned_loss=0.02302, over 4903.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2114, pruned_loss=0.03341, over 972227.65 frames.], batch size: 23, lr: 1.84e-04 2022-05-07 08:24:45,328 INFO [train.py:715] (0/8) Epoch 12, batch 2400, loss[loss=0.1287, simple_loss=0.2018, pruned_loss=0.02779, over 4877.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2117, pruned_loss=0.0334, over 971802.47 frames.], batch size: 16, lr: 1.84e-04 2022-05-07 08:25:23,251 INFO [train.py:715] (0/8) Epoch 12, batch 2450, loss[loss=0.1892, simple_loss=0.2351, pruned_loss=0.07164, over 4832.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2121, pruned_loss=0.03337, over 972691.37 frames.], batch size: 13, lr: 1.84e-04 2022-05-07 08:26:00,041 INFO [train.py:715] (0/8) Epoch 12, batch 2500, loss[loss=0.1704, simple_loss=0.2481, pruned_loss=0.04638, over 4938.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2122, pruned_loss=0.03303, over 972397.72 frames.], batch size: 29, lr: 1.84e-04 2022-05-07 08:26:38,141 INFO [train.py:715] (0/8) Epoch 12, batch 2550, loss[loss=0.1513, simple_loss=0.2232, pruned_loss=0.03969, over 4925.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2116, pruned_loss=0.03255, over 971797.21 frames.], batch size: 18, lr: 1.84e-04 2022-05-07 08:27:15,551 INFO [train.py:715] (0/8) Epoch 12, batch 2600, loss[loss=0.1546, simple_loss=0.2222, pruned_loss=0.0435, over 4843.00 frames.], tot_loss[loss=0.138, simple_loss=0.2113, pruned_loss=0.03236, over 972582.27 frames.], batch size: 32, lr: 1.84e-04 2022-05-07 08:27:54,369 INFO [train.py:715] (0/8) Epoch 12, batch 2650, loss[loss=0.1444, simple_loss=0.2114, pruned_loss=0.03873, over 4777.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2114, pruned_loss=0.03223, over 972516.65 frames.], batch size: 17, lr: 1.84e-04 2022-05-07 08:28:32,741 INFO [train.py:715] (0/8) Epoch 12, batch 2700, loss[loss=0.1482, simple_loss=0.2186, pruned_loss=0.03888, over 4795.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2108, pruned_loss=0.03198, over 972454.44 
frames.], batch size: 12, lr: 1.84e-04 2022-05-07 08:29:11,533 INFO [train.py:715] (0/8) Epoch 12, batch 2750, loss[loss=0.1202, simple_loss=0.1885, pruned_loss=0.02593, over 4973.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2103, pruned_loss=0.03156, over 973171.53 frames.], batch size: 14, lr: 1.84e-04 2022-05-07 08:29:50,410 INFO [train.py:715] (0/8) Epoch 12, batch 2800, loss[loss=0.1484, simple_loss=0.2234, pruned_loss=0.03668, over 4978.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2109, pruned_loss=0.03211, over 973968.75 frames.], batch size: 28, lr: 1.84e-04 2022-05-07 08:30:28,420 INFO [train.py:715] (0/8) Epoch 12, batch 2850, loss[loss=0.1107, simple_loss=0.1905, pruned_loss=0.01546, over 4809.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2113, pruned_loss=0.03247, over 973828.33 frames.], batch size: 12, lr: 1.84e-04 2022-05-07 08:31:07,088 INFO [train.py:715] (0/8) Epoch 12, batch 2900, loss[loss=0.1288, simple_loss=0.2043, pruned_loss=0.0266, over 4867.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2115, pruned_loss=0.03298, over 973539.52 frames.], batch size: 30, lr: 1.84e-04 2022-05-07 08:31:45,561 INFO [train.py:715] (0/8) Epoch 12, batch 2950, loss[loss=0.1253, simple_loss=0.1972, pruned_loss=0.02667, over 4863.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2112, pruned_loss=0.03291, over 973927.31 frames.], batch size: 20, lr: 1.84e-04 2022-05-07 08:32:24,274 INFO [train.py:715] (0/8) Epoch 12, batch 3000, loss[loss=0.1269, simple_loss=0.1832, pruned_loss=0.03534, over 4836.00 frames.], tot_loss[loss=0.138, simple_loss=0.2104, pruned_loss=0.03279, over 973516.07 frames.], batch size: 12, lr: 1.84e-04 2022-05-07 08:32:24,275 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 08:32:33,757 INFO [train.py:742] (0/8) Epoch 12, validation: loss=0.1056, simple_loss=0.1896, pruned_loss=0.01082, over 914524.00 frames. 
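The validation entry above makes the composition of the reported loss easy to check: throughout this section the logged total is consistent with loss ≈ 0.5 * simple_loss + pruned_loss (for the validation point above, 0.5 * 0.1896 + 0.01082 = 0.1056). The 0.5 weight is inferred from the numbers in the log itself, not read from the recipe code, so the snippet below is a consistency check against these lines rather than a statement of how train.py combines the terms.

    # The 0.5 weight is inferred from the logged numbers, not taken from train.py.
    def combined_loss(simple_loss, pruned_loss, simple_weight=0.5):
        return simple_weight * simple_loss + pruned_loss

    # Validation entry above: loss=0.1056, simple_loss=0.1896, pruned_loss=0.01082
    assert abs(combined_loss(0.1896, 0.01082) - 0.1056) < 5e-4
    # A tot_loss entry near the start of this section: loss=0.1382, simple_loss=0.212, pruned_loss=0.03217
    assert abs(combined_loss(0.212, 0.03217) - 0.1382) < 5e-4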
2022-05-07 08:33:11,807 INFO [train.py:715] (0/8) Epoch 12, batch 3050, loss[loss=0.1966, simple_loss=0.2715, pruned_loss=0.06085, over 4788.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2102, pruned_loss=0.03239, over 973419.53 frames.], batch size: 18, lr: 1.84e-04 2022-05-07 08:33:49,490 INFO [train.py:715] (0/8) Epoch 12, batch 3100, loss[loss=0.1388, simple_loss=0.2161, pruned_loss=0.03073, over 4836.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2106, pruned_loss=0.03262, over 973501.88 frames.], batch size: 15, lr: 1.84e-04 2022-05-07 08:34:27,406 INFO [train.py:715] (0/8) Epoch 12, batch 3150, loss[loss=0.154, simple_loss=0.2418, pruned_loss=0.03315, over 4774.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2114, pruned_loss=0.03294, over 972842.27 frames.], batch size: 14, lr: 1.84e-04 2022-05-07 08:35:05,542 INFO [train.py:715] (0/8) Epoch 12, batch 3200, loss[loss=0.2001, simple_loss=0.267, pruned_loss=0.06664, over 4980.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2123, pruned_loss=0.03304, over 971537.20 frames.], batch size: 39, lr: 1.84e-04 2022-05-07 08:35:43,247 INFO [train.py:715] (0/8) Epoch 12, batch 3250, loss[loss=0.117, simple_loss=0.1903, pruned_loss=0.02188, over 4921.00 frames.], tot_loss[loss=0.138, simple_loss=0.2114, pruned_loss=0.03235, over 971985.71 frames.], batch size: 18, lr: 1.84e-04 2022-05-07 08:36:21,482 INFO [train.py:715] (0/8) Epoch 12, batch 3300, loss[loss=0.1389, simple_loss=0.2073, pruned_loss=0.03521, over 4948.00 frames.], tot_loss[loss=0.138, simple_loss=0.2112, pruned_loss=0.03237, over 972688.22 frames.], batch size: 35, lr: 1.84e-04 2022-05-07 08:36:59,234 INFO [train.py:715] (0/8) Epoch 12, batch 3350, loss[loss=0.1284, simple_loss=0.2046, pruned_loss=0.02612, over 4920.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2103, pruned_loss=0.03174, over 972712.54 frames.], batch size: 29, lr: 1.84e-04 2022-05-07 08:37:37,374 INFO [train.py:715] (0/8) Epoch 12, batch 3400, loss[loss=0.1451, simple_loss=0.2187, pruned_loss=0.03578, over 4859.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2106, pruned_loss=0.03167, over 972304.48 frames.], batch size: 38, lr: 1.84e-04 2022-05-07 08:38:14,959 INFO [train.py:715] (0/8) Epoch 12, batch 3450, loss[loss=0.1291, simple_loss=0.2146, pruned_loss=0.02181, over 4931.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2108, pruned_loss=0.03203, over 971762.72 frames.], batch size: 23, lr: 1.84e-04 2022-05-07 08:38:52,881 INFO [train.py:715] (0/8) Epoch 12, batch 3500, loss[loss=0.1564, simple_loss=0.2232, pruned_loss=0.04478, over 4829.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2106, pruned_loss=0.03221, over 971736.80 frames.], batch size: 15, lr: 1.84e-04 2022-05-07 08:39:31,083 INFO [train.py:715] (0/8) Epoch 12, batch 3550, loss[loss=0.1166, simple_loss=0.1929, pruned_loss=0.02018, over 4838.00 frames.], tot_loss[loss=0.1384, simple_loss=0.2113, pruned_loss=0.0327, over 971669.53 frames.], batch size: 30, lr: 1.84e-04 2022-05-07 08:40:08,792 INFO [train.py:715] (0/8) Epoch 12, batch 3600, loss[loss=0.1382, simple_loss=0.2133, pruned_loss=0.03149, over 4957.00 frames.], tot_loss[loss=0.1391, simple_loss=0.212, pruned_loss=0.0331, over 971661.31 frames.], batch size: 39, lr: 1.84e-04 2022-05-07 08:40:46,531 INFO [train.py:715] (0/8) Epoch 12, batch 3650, loss[loss=0.1425, simple_loss=0.2193, pruned_loss=0.03286, over 4803.00 frames.], tot_loss[loss=0.1389, simple_loss=0.2121, pruned_loss=0.03288, over 971429.24 frames.], batch size: 21, lr: 1.84e-04 2022-05-07 08:41:24,466 INFO 
[train.py:715] (0/8) Epoch 12, batch 3700, loss[loss=0.1288, simple_loss=0.2104, pruned_loss=0.02361, over 4750.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2124, pruned_loss=0.03297, over 971863.53 frames.], batch size: 19, lr: 1.84e-04 2022-05-07 08:42:02,372 INFO [train.py:715] (0/8) Epoch 12, batch 3750, loss[loss=0.1617, simple_loss=0.2259, pruned_loss=0.04874, over 4964.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2127, pruned_loss=0.03313, over 972102.67 frames.], batch size: 14, lr: 1.84e-04 2022-05-07 08:42:40,467 INFO [train.py:715] (0/8) Epoch 12, batch 3800, loss[loss=0.131, simple_loss=0.2055, pruned_loss=0.02829, over 4984.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2126, pruned_loss=0.03301, over 972532.04 frames.], batch size: 25, lr: 1.84e-04 2022-05-07 08:43:18,086 INFO [train.py:715] (0/8) Epoch 12, batch 3850, loss[loss=0.1257, simple_loss=0.1992, pruned_loss=0.02608, over 4744.00 frames.], tot_loss[loss=0.1402, simple_loss=0.2136, pruned_loss=0.03342, over 972646.12 frames.], batch size: 16, lr: 1.84e-04 2022-05-07 08:43:55,562 INFO [train.py:715] (0/8) Epoch 12, batch 3900, loss[loss=0.1609, simple_loss=0.2274, pruned_loss=0.04716, over 4922.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2132, pruned_loss=0.03298, over 973061.62 frames.], batch size: 39, lr: 1.84e-04 2022-05-07 08:44:33,438 INFO [train.py:715] (0/8) Epoch 12, batch 3950, loss[loss=0.1187, simple_loss=0.1849, pruned_loss=0.02625, over 4963.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2124, pruned_loss=0.03249, over 972781.70 frames.], batch size: 15, lr: 1.84e-04 2022-05-07 08:45:11,210 INFO [train.py:715] (0/8) Epoch 12, batch 4000, loss[loss=0.1667, simple_loss=0.2499, pruned_loss=0.0417, over 4897.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2127, pruned_loss=0.03294, over 972986.57 frames.], batch size: 22, lr: 1.84e-04 2022-05-07 08:45:49,151 INFO [train.py:715] (0/8) Epoch 12, batch 4050, loss[loss=0.1514, simple_loss=0.2244, pruned_loss=0.03913, over 4791.00 frames.], tot_loss[loss=0.1392, simple_loss=0.2125, pruned_loss=0.03298, over 972537.11 frames.], batch size: 18, lr: 1.84e-04 2022-05-07 08:46:27,043 INFO [train.py:715] (0/8) Epoch 12, batch 4100, loss[loss=0.1425, simple_loss=0.2203, pruned_loss=0.03235, over 4877.00 frames.], tot_loss[loss=0.1398, simple_loss=0.213, pruned_loss=0.03332, over 972170.68 frames.], batch size: 22, lr: 1.84e-04 2022-05-07 08:47:05,068 INFO [train.py:715] (0/8) Epoch 12, batch 4150, loss[loss=0.1382, simple_loss=0.2089, pruned_loss=0.03374, over 4976.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2126, pruned_loss=0.03308, over 972605.38 frames.], batch size: 31, lr: 1.84e-04 2022-05-07 08:47:43,030 INFO [train.py:715] (0/8) Epoch 12, batch 4200, loss[loss=0.1126, simple_loss=0.1853, pruned_loss=0.01992, over 4841.00 frames.], tot_loss[loss=0.1394, simple_loss=0.2126, pruned_loss=0.03311, over 972602.54 frames.], batch size: 32, lr: 1.84e-04 2022-05-07 08:48:20,656 INFO [train.py:715] (0/8) Epoch 12, batch 4250, loss[loss=0.1551, simple_loss=0.2195, pruned_loss=0.04537, over 4778.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2127, pruned_loss=0.0333, over 972718.39 frames.], batch size: 17, lr: 1.84e-04 2022-05-07 08:48:58,344 INFO [train.py:715] (0/8) Epoch 12, batch 4300, loss[loss=0.1621, simple_loss=0.238, pruned_loss=0.04307, over 4960.00 frames.], tot_loss[loss=0.1391, simple_loss=0.212, pruned_loss=0.03307, over 972211.43 frames.], batch size: 39, lr: 1.84e-04 2022-05-07 08:49:37,513 INFO [train.py:715] (0/8) Epoch 12, batch 
4350, loss[loss=0.1386, simple_loss=0.215, pruned_loss=0.03103, over 4865.00 frames.], tot_loss[loss=0.1404, simple_loss=0.2127, pruned_loss=0.03404, over 972667.78 frames.], batch size: 30, lr: 1.84e-04 2022-05-07 08:50:16,262 INFO [train.py:715] (0/8) Epoch 12, batch 4400, loss[loss=0.1792, simple_loss=0.2367, pruned_loss=0.06085, over 4825.00 frames.], tot_loss[loss=0.1396, simple_loss=0.212, pruned_loss=0.0336, over 972004.65 frames.], batch size: 15, lr: 1.84e-04 2022-05-07 08:50:54,763 INFO [train.py:715] (0/8) Epoch 12, batch 4450, loss[loss=0.129, simple_loss=0.1992, pruned_loss=0.02937, over 4758.00 frames.], tot_loss[loss=0.1386, simple_loss=0.2112, pruned_loss=0.03298, over 971569.53 frames.], batch size: 19, lr: 1.84e-04 2022-05-07 08:51:33,200 INFO [train.py:715] (0/8) Epoch 12, batch 4500, loss[loss=0.152, simple_loss=0.2126, pruned_loss=0.04576, over 4843.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2113, pruned_loss=0.03265, over 972471.71 frames.], batch size: 32, lr: 1.84e-04 2022-05-07 08:52:12,278 INFO [train.py:715] (0/8) Epoch 12, batch 4550, loss[loss=0.1033, simple_loss=0.1832, pruned_loss=0.01171, over 4901.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2109, pruned_loss=0.03233, over 971764.79 frames.], batch size: 22, lr: 1.84e-04 2022-05-07 08:52:50,488 INFO [train.py:715] (0/8) Epoch 12, batch 4600, loss[loss=0.1473, simple_loss=0.2144, pruned_loss=0.04007, over 4860.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2108, pruned_loss=0.03216, over 971716.14 frames.], batch size: 32, lr: 1.84e-04 2022-05-07 08:53:29,037 INFO [train.py:715] (0/8) Epoch 12, batch 4650, loss[loss=0.177, simple_loss=0.2481, pruned_loss=0.05291, over 4904.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2118, pruned_loss=0.0328, over 971861.67 frames.], batch size: 19, lr: 1.84e-04 2022-05-07 08:54:07,726 INFO [train.py:715] (0/8) Epoch 12, batch 4700, loss[loss=0.1445, simple_loss=0.2143, pruned_loss=0.03738, over 4948.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2105, pruned_loss=0.03201, over 972841.48 frames.], batch size: 35, lr: 1.84e-04 2022-05-07 08:54:46,296 INFO [train.py:715] (0/8) Epoch 12, batch 4750, loss[loss=0.1155, simple_loss=0.1895, pruned_loss=0.02072, over 4794.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2103, pruned_loss=0.03192, over 972611.51 frames.], batch size: 21, lr: 1.84e-04 2022-05-07 08:55:24,997 INFO [train.py:715] (0/8) Epoch 12, batch 4800, loss[loss=0.1329, simple_loss=0.2034, pruned_loss=0.03118, over 4920.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2105, pruned_loss=0.03218, over 972916.40 frames.], batch size: 18, lr: 1.84e-04 2022-05-07 08:56:03,560 INFO [train.py:715] (0/8) Epoch 12, batch 4850, loss[loss=0.1251, simple_loss=0.1962, pruned_loss=0.02697, over 4825.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2097, pruned_loss=0.03174, over 972315.18 frames.], batch size: 26, lr: 1.84e-04 2022-05-07 08:56:42,591 INFO [train.py:715] (0/8) Epoch 12, batch 4900, loss[loss=0.1462, simple_loss=0.2133, pruned_loss=0.0396, over 4835.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2102, pruned_loss=0.03216, over 971394.91 frames.], batch size: 15, lr: 1.83e-04 2022-05-07 08:57:20,601 INFO [train.py:715] (0/8) Epoch 12, batch 4950, loss[loss=0.1352, simple_loss=0.2246, pruned_loss=0.02288, over 4777.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2102, pruned_loss=0.03235, over 971643.82 frames.], batch size: 17, lr: 1.83e-04 2022-05-07 08:57:58,205 INFO [train.py:715] (0/8) Epoch 12, batch 5000, loss[loss=0.1295, 
simple_loss=0.2063, pruned_loss=0.0263, over 4930.00 frames.], tot_loss[loss=0.1373, simple_loss=0.21, pruned_loss=0.03235, over 971650.31 frames.], batch size: 18, lr: 1.83e-04 2022-05-07 08:58:36,390 INFO [train.py:715] (0/8) Epoch 12, batch 5050, loss[loss=0.1474, simple_loss=0.2155, pruned_loss=0.03966, over 4685.00 frames.], tot_loss[loss=0.1388, simple_loss=0.211, pruned_loss=0.03328, over 972215.41 frames.], batch size: 15, lr: 1.83e-04 2022-05-07 08:59:13,982 INFO [train.py:715] (0/8) Epoch 12, batch 5100, loss[loss=0.1195, simple_loss=0.1931, pruned_loss=0.02295, over 4852.00 frames.], tot_loss[loss=0.138, simple_loss=0.2106, pruned_loss=0.03266, over 971891.45 frames.], batch size: 20, lr: 1.83e-04 2022-05-07 08:59:52,109 INFO [train.py:715] (0/8) Epoch 12, batch 5150, loss[loss=0.1353, simple_loss=0.2045, pruned_loss=0.03299, over 4974.00 frames.], tot_loss[loss=0.138, simple_loss=0.211, pruned_loss=0.03251, over 972474.53 frames.], batch size: 35, lr: 1.83e-04 2022-05-07 09:00:30,012 INFO [train.py:715] (0/8) Epoch 12, batch 5200, loss[loss=0.1556, simple_loss=0.2302, pruned_loss=0.04053, over 4941.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2119, pruned_loss=0.0329, over 972283.38 frames.], batch size: 39, lr: 1.83e-04 2022-05-07 09:01:08,123 INFO [train.py:715] (0/8) Epoch 12, batch 5250, loss[loss=0.1299, simple_loss=0.2072, pruned_loss=0.02628, over 4935.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2117, pruned_loss=0.03269, over 972330.48 frames.], batch size: 21, lr: 1.83e-04 2022-05-07 09:01:45,990 INFO [train.py:715] (0/8) Epoch 12, batch 5300, loss[loss=0.1544, simple_loss=0.2262, pruned_loss=0.04132, over 4850.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2117, pruned_loss=0.033, over 972263.88 frames.], batch size: 20, lr: 1.83e-04 2022-05-07 09:02:24,109 INFO [train.py:715] (0/8) Epoch 12, batch 5350, loss[loss=0.1343, simple_loss=0.2132, pruned_loss=0.02769, over 4917.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2111, pruned_loss=0.03298, over 971941.40 frames.], batch size: 19, lr: 1.83e-04 2022-05-07 09:03:02,668 INFO [train.py:715] (0/8) Epoch 12, batch 5400, loss[loss=0.129, simple_loss=0.2037, pruned_loss=0.02717, over 4875.00 frames.], tot_loss[loss=0.1382, simple_loss=0.211, pruned_loss=0.03271, over 972001.52 frames.], batch size: 32, lr: 1.83e-04 2022-05-07 09:03:40,512 INFO [train.py:715] (0/8) Epoch 12, batch 5450, loss[loss=0.1181, simple_loss=0.1936, pruned_loss=0.02125, over 4842.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2107, pruned_loss=0.0325, over 972882.93 frames.], batch size: 15, lr: 1.83e-04 2022-05-07 09:04:18,717 INFO [train.py:715] (0/8) Epoch 12, batch 5500, loss[loss=0.1572, simple_loss=0.2241, pruned_loss=0.04517, over 4911.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2112, pruned_loss=0.03291, over 972817.67 frames.], batch size: 18, lr: 1.83e-04 2022-05-07 09:04:56,505 INFO [train.py:715] (0/8) Epoch 12, batch 5550, loss[loss=0.1326, simple_loss=0.2082, pruned_loss=0.02854, over 4796.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2108, pruned_loss=0.03251, over 972517.64 frames.], batch size: 17, lr: 1.83e-04 2022-05-07 09:05:35,152 INFO [train.py:715] (0/8) Epoch 12, batch 5600, loss[loss=0.1399, simple_loss=0.2167, pruned_loss=0.03149, over 4821.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2097, pruned_loss=0.03193, over 971011.10 frames.], batch size: 15, lr: 1.83e-04 2022-05-07 09:06:12,946 INFO [train.py:715] (0/8) Epoch 12, batch 5650, loss[loss=0.1533, simple_loss=0.2357, pruned_loss=0.03548, 
over 4759.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2099, pruned_loss=0.03192, over 971628.89 frames.], batch size: 19, lr: 1.83e-04 2022-05-07 09:06:50,898 INFO [train.py:715] (0/8) Epoch 12, batch 5700, loss[loss=0.1495, simple_loss=0.2141, pruned_loss=0.0424, over 4810.00 frames.], tot_loss[loss=0.137, simple_loss=0.2101, pruned_loss=0.03191, over 972247.91 frames.], batch size: 14, lr: 1.83e-04 2022-05-07 09:07:29,805 INFO [train.py:715] (0/8) Epoch 12, batch 5750, loss[loss=0.1496, simple_loss=0.2216, pruned_loss=0.03881, over 4796.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2096, pruned_loss=0.03198, over 972173.16 frames.], batch size: 21, lr: 1.83e-04 2022-05-07 09:08:07,979 INFO [train.py:715] (0/8) Epoch 12, batch 5800, loss[loss=0.1427, simple_loss=0.22, pruned_loss=0.03277, over 4889.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2091, pruned_loss=0.03161, over 972323.18 frames.], batch size: 22, lr: 1.83e-04 2022-05-07 09:08:46,178 INFO [train.py:715] (0/8) Epoch 12, batch 5850, loss[loss=0.1271, simple_loss=0.2007, pruned_loss=0.02675, over 4984.00 frames.], tot_loss[loss=0.136, simple_loss=0.2093, pruned_loss=0.03137, over 972275.20 frames.], batch size: 28, lr: 1.83e-04 2022-05-07 09:09:24,394 INFO [train.py:715] (0/8) Epoch 12, batch 5900, loss[loss=0.1603, simple_loss=0.2309, pruned_loss=0.04483, over 4933.00 frames.], tot_loss[loss=0.136, simple_loss=0.2092, pruned_loss=0.03138, over 972664.61 frames.], batch size: 39, lr: 1.83e-04 2022-05-07 09:10:02,492 INFO [train.py:715] (0/8) Epoch 12, batch 5950, loss[loss=0.1337, simple_loss=0.1996, pruned_loss=0.03386, over 4828.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2091, pruned_loss=0.03115, over 971993.63 frames.], batch size: 15, lr: 1.83e-04 2022-05-07 09:10:40,372 INFO [train.py:715] (0/8) Epoch 12, batch 6000, loss[loss=0.1235, simple_loss=0.1919, pruned_loss=0.02758, over 4861.00 frames.], tot_loss[loss=0.136, simple_loss=0.2094, pruned_loss=0.03133, over 971658.90 frames.], batch size: 16, lr: 1.83e-04 2022-05-07 09:10:40,373 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 09:10:49,853 INFO [train.py:742] (0/8) Epoch 12, validation: loss=0.1057, simple_loss=0.1897, pruned_loss=0.01086, over 914524.00 frames. 
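The tot_loss figures are a running summary rather than a single-batch value: at Epoch 12, batch 0 above, tot_loss equals the batch loss over 4761 frames; by batch 50 it covers roughly 219k frames; and from mid-epoch onward the frame count hovers around 971k-973k. That plateau, at roughly 4,860 frames per logged batch, corresponds to a window of about 200 batches and is consistent with an exponentially decayed, frame-weighted accumulator. The sketch below is an assumption about the bookkeeping (the class name, the 200-batch window and the 4,860 frames per batch are all illustrative), not a transcription of train.py.

    class RunningTotal:
        """Hypothetical frame-weighted, exponentially decayed accumulator."""

        def __init__(self, window=200):      # window chosen to reproduce the log's plateau
            self.decay = 1.0 - 1.0 / window
            self.loss_sum = 0.0              # decayed sum of loss * frames
            self.frames = 0.0                # decayed sum of frames

        def update(self, batch_loss, batch_frames):
            self.loss_sum = self.loss_sum * self.decay + batch_loss * batch_frames
            self.frames = self.frames * self.decay + batch_frames

        @property
        def tot_loss(self):
            return self.loss_sum / self.frames

    tracker = RunningTotal()
    for _ in range(3000):                    # many batches of ~4860 frames each
        tracker.update(batch_loss=0.137, batch_frames=4860.0)
    print(round(tracker.frames))             # ~972000, the plateau seen above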
2022-05-07 09:11:28,464 INFO [train.py:715] (0/8) Epoch 12, batch 6050, loss[loss=0.1643, simple_loss=0.2352, pruned_loss=0.04669, over 4778.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2099, pruned_loss=0.03141, over 971822.15 frames.], batch size: 18, lr: 1.83e-04 2022-05-07 09:12:07,169 INFO [train.py:715] (0/8) Epoch 12, batch 6100, loss[loss=0.1067, simple_loss=0.1877, pruned_loss=0.01288, over 4819.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2098, pruned_loss=0.0314, over 971966.38 frames.], batch size: 27, lr: 1.83e-04 2022-05-07 09:12:46,246 INFO [train.py:715] (0/8) Epoch 12, batch 6150, loss[loss=0.1577, simple_loss=0.2297, pruned_loss=0.04282, over 4914.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2104, pruned_loss=0.03151, over 971421.42 frames.], batch size: 29, lr: 1.83e-04 2022-05-07 09:13:24,042 INFO [train.py:715] (0/8) Epoch 12, batch 6200, loss[loss=0.138, simple_loss=0.2096, pruned_loss=0.03322, over 4969.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2113, pruned_loss=0.03204, over 971764.20 frames.], batch size: 35, lr: 1.83e-04 2022-05-07 09:14:02,107 INFO [train.py:715] (0/8) Epoch 12, batch 6250, loss[loss=0.1393, simple_loss=0.2105, pruned_loss=0.03409, over 4932.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2107, pruned_loss=0.03181, over 971956.50 frames.], batch size: 23, lr: 1.83e-04 2022-05-07 09:14:30,400 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-424000.pt 2022-05-07 09:14:42,629 INFO [train.py:715] (0/8) Epoch 12, batch 6300, loss[loss=0.1349, simple_loss=0.2087, pruned_loss=0.03057, over 4785.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2101, pruned_loss=0.03182, over 971088.31 frames.], batch size: 18, lr: 1.83e-04 2022-05-07 09:15:20,409 INFO [train.py:715] (0/8) Epoch 12, batch 6350, loss[loss=0.1431, simple_loss=0.2198, pruned_loss=0.0332, over 4820.00 frames.], tot_loss[loss=0.137, simple_loss=0.2098, pruned_loss=0.03214, over 971395.27 frames.], batch size: 26, lr: 1.83e-04 2022-05-07 09:15:58,257 INFO [train.py:715] (0/8) Epoch 12, batch 6400, loss[loss=0.1517, simple_loss=0.2252, pruned_loss=0.03907, over 4972.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2094, pruned_loss=0.03176, over 972426.33 frames.], batch size: 24, lr: 1.83e-04 2022-05-07 09:16:36,185 INFO [train.py:715] (0/8) Epoch 12, batch 6450, loss[loss=0.1406, simple_loss=0.2065, pruned_loss=0.03735, over 4917.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2096, pruned_loss=0.03183, over 972545.09 frames.], batch size: 17, lr: 1.83e-04 2022-05-07 09:17:14,179 INFO [train.py:715] (0/8) Epoch 12, batch 6500, loss[loss=0.1366, simple_loss=0.2051, pruned_loss=0.03405, over 4819.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2099, pruned_loss=0.03169, over 973041.57 frames.], batch size: 15, lr: 1.83e-04 2022-05-07 09:17:51,823 INFO [train.py:715] (0/8) Epoch 12, batch 6550, loss[loss=0.1523, simple_loss=0.2226, pruned_loss=0.04097, over 4850.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2101, pruned_loss=0.03214, over 973177.79 frames.], batch size: 20, lr: 1.83e-04 2022-05-07 09:18:29,928 INFO [train.py:715] (0/8) Epoch 12, batch 6600, loss[loss=0.1507, simple_loss=0.2203, pruned_loss=0.04061, over 4754.00 frames.], tot_loss[loss=0.137, simple_loss=0.2102, pruned_loss=0.03188, over 973347.94 frames.], batch size: 19, lr: 1.83e-04 2022-05-07 09:19:08,076 INFO [train.py:715] (0/8) Epoch 12, batch 6650, loss[loss=0.1445, simple_loss=0.2191, pruned_loss=0.03493, over 4919.00 frames.], tot_loss[loss=0.1372, 
simple_loss=0.2106, pruned_loss=0.03188, over 973135.68 frames.], batch size: 18, lr: 1.83e-04 2022-05-07 09:19:46,569 INFO [train.py:715] (0/8) Epoch 12, batch 6700, loss[loss=0.1454, simple_loss=0.216, pruned_loss=0.03744, over 4929.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2102, pruned_loss=0.03148, over 973069.65 frames.], batch size: 39, lr: 1.83e-04 2022-05-07 09:20:24,043 INFO [train.py:715] (0/8) Epoch 12, batch 6750, loss[loss=0.1326, simple_loss=0.1983, pruned_loss=0.0334, over 4863.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2098, pruned_loss=0.03124, over 972421.23 frames.], batch size: 20, lr: 1.83e-04 2022-05-07 09:21:02,172 INFO [train.py:715] (0/8) Epoch 12, batch 6800, loss[loss=0.1343, simple_loss=0.2163, pruned_loss=0.02613, over 4976.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2099, pruned_loss=0.03144, over 973236.82 frames.], batch size: 25, lr: 1.83e-04 2022-05-07 09:21:40,235 INFO [train.py:715] (0/8) Epoch 12, batch 6850, loss[loss=0.1362, simple_loss=0.2092, pruned_loss=0.03161, over 4991.00 frames.], tot_loss[loss=0.1366, simple_loss=0.21, pruned_loss=0.03156, over 973677.21 frames.], batch size: 20, lr: 1.83e-04 2022-05-07 09:22:18,032 INFO [train.py:715] (0/8) Epoch 12, batch 6900, loss[loss=0.1199, simple_loss=0.2038, pruned_loss=0.01798, over 4922.00 frames.], tot_loss[loss=0.137, simple_loss=0.2102, pruned_loss=0.03186, over 973667.01 frames.], batch size: 18, lr: 1.83e-04 2022-05-07 09:22:56,141 INFO [train.py:715] (0/8) Epoch 12, batch 6950, loss[loss=0.1426, simple_loss=0.2198, pruned_loss=0.03271, over 4936.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2102, pruned_loss=0.03154, over 973486.92 frames.], batch size: 21, lr: 1.83e-04 2022-05-07 09:23:34,135 INFO [train.py:715] (0/8) Epoch 12, batch 7000, loss[loss=0.1457, simple_loss=0.2091, pruned_loss=0.04116, over 4809.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2105, pruned_loss=0.03154, over 973011.40 frames.], batch size: 12, lr: 1.83e-04 2022-05-07 09:24:12,554 INFO [train.py:715] (0/8) Epoch 12, batch 7050, loss[loss=0.1493, simple_loss=0.2148, pruned_loss=0.04193, over 4980.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2104, pruned_loss=0.03157, over 972902.64 frames.], batch size: 15, lr: 1.83e-04 2022-05-07 09:24:50,034 INFO [train.py:715] (0/8) Epoch 12, batch 7100, loss[loss=0.1234, simple_loss=0.1921, pruned_loss=0.02738, over 4799.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2103, pruned_loss=0.03156, over 972850.26 frames.], batch size: 24, lr: 1.83e-04 2022-05-07 09:25:28,615 INFO [train.py:715] (0/8) Epoch 12, batch 7150, loss[loss=0.135, simple_loss=0.2168, pruned_loss=0.02657, over 4932.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2094, pruned_loss=0.03091, over 972959.21 frames.], batch size: 21, lr: 1.83e-04 2022-05-07 09:26:06,439 INFO [train.py:715] (0/8) Epoch 12, batch 7200, loss[loss=0.1321, simple_loss=0.2159, pruned_loss=0.02414, over 4760.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2096, pruned_loss=0.03141, over 971681.19 frames.], batch size: 17, lr: 1.83e-04 2022-05-07 09:26:44,289 INFO [train.py:715] (0/8) Epoch 12, batch 7250, loss[loss=0.1476, simple_loss=0.2214, pruned_loss=0.03688, over 4895.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2097, pruned_loss=0.0313, over 972979.14 frames.], batch size: 17, lr: 1.83e-04 2022-05-07 09:27:22,550 INFO [train.py:715] (0/8) Epoch 12, batch 7300, loss[loss=0.1358, simple_loss=0.2111, pruned_loss=0.03022, over 4986.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2098, 
pruned_loss=0.03136, over 972864.47 frames.], batch size: 14, lr: 1.83e-04 2022-05-07 09:28:00,295 INFO [train.py:715] (0/8) Epoch 12, batch 7350, loss[loss=0.1523, simple_loss=0.2324, pruned_loss=0.03609, over 4879.00 frames.], tot_loss[loss=0.1355, simple_loss=0.209, pruned_loss=0.03103, over 973024.38 frames.], batch size: 16, lr: 1.83e-04 2022-05-07 09:28:38,316 INFO [train.py:715] (0/8) Epoch 12, batch 7400, loss[loss=0.14, simple_loss=0.231, pruned_loss=0.02446, over 4795.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2101, pruned_loss=0.03167, over 973452.93 frames.], batch size: 21, lr: 1.83e-04 2022-05-07 09:29:16,068 INFO [train.py:715] (0/8) Epoch 12, batch 7450, loss[loss=0.152, simple_loss=0.215, pruned_loss=0.04447, over 4844.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2105, pruned_loss=0.03202, over 973612.97 frames.], batch size: 20, lr: 1.83e-04 2022-05-07 09:29:54,160 INFO [train.py:715] (0/8) Epoch 12, batch 7500, loss[loss=0.1207, simple_loss=0.2003, pruned_loss=0.02051, over 4876.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2105, pruned_loss=0.03189, over 972792.15 frames.], batch size: 22, lr: 1.83e-04 2022-05-07 09:30:32,180 INFO [train.py:715] (0/8) Epoch 12, batch 7550, loss[loss=0.125, simple_loss=0.2062, pruned_loss=0.02192, over 4932.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2101, pruned_loss=0.03163, over 972494.87 frames.], batch size: 21, lr: 1.83e-04 2022-05-07 09:31:10,030 INFO [train.py:715] (0/8) Epoch 12, batch 7600, loss[loss=0.166, simple_loss=0.2399, pruned_loss=0.04607, over 4977.00 frames.], tot_loss[loss=0.1366, simple_loss=0.21, pruned_loss=0.03161, over 972553.88 frames.], batch size: 14, lr: 1.83e-04 2022-05-07 09:31:48,258 INFO [train.py:715] (0/8) Epoch 12, batch 7650, loss[loss=0.1368, simple_loss=0.2133, pruned_loss=0.0302, over 4804.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2102, pruned_loss=0.03181, over 972227.47 frames.], batch size: 25, lr: 1.83e-04 2022-05-07 09:32:26,436 INFO [train.py:715] (0/8) Epoch 12, batch 7700, loss[loss=0.1239, simple_loss=0.1992, pruned_loss=0.02428, over 4799.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2107, pruned_loss=0.03201, over 971466.59 frames.], batch size: 12, lr: 1.83e-04 2022-05-07 09:33:04,634 INFO [train.py:715] (0/8) Epoch 12, batch 7750, loss[loss=0.1292, simple_loss=0.1993, pruned_loss=0.02952, over 4774.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2112, pruned_loss=0.03169, over 971311.83 frames.], batch size: 18, lr: 1.83e-04 2022-05-07 09:33:42,402 INFO [train.py:715] (0/8) Epoch 12, batch 7800, loss[loss=0.1313, simple_loss=0.2037, pruned_loss=0.02952, over 4831.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2113, pruned_loss=0.03186, over 971694.01 frames.], batch size: 15, lr: 1.83e-04 2022-05-07 09:34:20,593 INFO [train.py:715] (0/8) Epoch 12, batch 7850, loss[loss=0.1296, simple_loss=0.1985, pruned_loss=0.03034, over 4816.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2118, pruned_loss=0.03243, over 971180.43 frames.], batch size: 15, lr: 1.83e-04 2022-05-07 09:34:58,400 INFO [train.py:715] (0/8) Epoch 12, batch 7900, loss[loss=0.124, simple_loss=0.1986, pruned_loss=0.0247, over 4971.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2121, pruned_loss=0.03265, over 972093.64 frames.], batch size: 25, lr: 1.83e-04 2022-05-07 09:35:36,646 INFO [train.py:715] (0/8) Epoch 12, batch 7950, loss[loss=0.1665, simple_loss=0.2388, pruned_loss=0.04706, over 4973.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2116, pruned_loss=0.03241, over 971607.62 
frames.], batch size: 24, lr: 1.83e-04 2022-05-07 09:36:14,619 INFO [train.py:715] (0/8) Epoch 12, batch 8000, loss[loss=0.1428, simple_loss=0.2084, pruned_loss=0.03863, over 4659.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2116, pruned_loss=0.03231, over 971315.08 frames.], batch size: 13, lr: 1.83e-04 2022-05-07 09:36:53,079 INFO [train.py:715] (0/8) Epoch 12, batch 8050, loss[loss=0.129, simple_loss=0.2089, pruned_loss=0.02457, over 4829.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2129, pruned_loss=0.03268, over 971874.48 frames.], batch size: 26, lr: 1.83e-04 2022-05-07 09:37:31,430 INFO [train.py:715] (0/8) Epoch 12, batch 8100, loss[loss=0.1387, simple_loss=0.2013, pruned_loss=0.03804, over 4816.00 frames.], tot_loss[loss=0.1393, simple_loss=0.2129, pruned_loss=0.03286, over 970623.21 frames.], batch size: 13, lr: 1.83e-04 2022-05-07 09:38:09,019 INFO [train.py:715] (0/8) Epoch 12, batch 8150, loss[loss=0.1169, simple_loss=0.1864, pruned_loss=0.02365, over 4979.00 frames.], tot_loss[loss=0.1396, simple_loss=0.2133, pruned_loss=0.03296, over 971169.21 frames.], batch size: 25, lr: 1.83e-04 2022-05-07 09:38:47,281 INFO [train.py:715] (0/8) Epoch 12, batch 8200, loss[loss=0.1173, simple_loss=0.1886, pruned_loss=0.02296, over 4821.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2125, pruned_loss=0.03283, over 971157.15 frames.], batch size: 15, lr: 1.83e-04 2022-05-07 09:39:25,277 INFO [train.py:715] (0/8) Epoch 12, batch 8250, loss[loss=0.1525, simple_loss=0.2225, pruned_loss=0.0412, over 4836.00 frames.], tot_loss[loss=0.139, simple_loss=0.2128, pruned_loss=0.03259, over 971444.33 frames.], batch size: 30, lr: 1.83e-04 2022-05-07 09:40:03,011 INFO [train.py:715] (0/8) Epoch 12, batch 8300, loss[loss=0.1389, simple_loss=0.2072, pruned_loss=0.03532, over 4821.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2121, pruned_loss=0.03226, over 972589.21 frames.], batch size: 27, lr: 1.83e-04 2022-05-07 09:40:41,108 INFO [train.py:715] (0/8) Epoch 12, batch 8350, loss[loss=0.1351, simple_loss=0.2083, pruned_loss=0.03092, over 4973.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2111, pruned_loss=0.03198, over 972756.69 frames.], batch size: 24, lr: 1.83e-04 2022-05-07 09:41:19,296 INFO [train.py:715] (0/8) Epoch 12, batch 8400, loss[loss=0.1258, simple_loss=0.1922, pruned_loss=0.02974, over 4820.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2114, pruned_loss=0.03202, over 973188.79 frames.], batch size: 13, lr: 1.83e-04 2022-05-07 09:41:57,369 INFO [train.py:715] (0/8) Epoch 12, batch 8450, loss[loss=0.1493, simple_loss=0.2227, pruned_loss=0.03795, over 4861.00 frames.], tot_loss[loss=0.1376, simple_loss=0.211, pruned_loss=0.03215, over 973517.89 frames.], batch size: 20, lr: 1.83e-04 2022-05-07 09:42:34,899 INFO [train.py:715] (0/8) Epoch 12, batch 8500, loss[loss=0.1266, simple_loss=0.201, pruned_loss=0.0261, over 4963.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2106, pruned_loss=0.03202, over 974010.38 frames.], batch size: 21, lr: 1.83e-04 2022-05-07 09:43:13,185 INFO [train.py:715] (0/8) Epoch 12, batch 8550, loss[loss=0.1481, simple_loss=0.2307, pruned_loss=0.03281, over 4804.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2116, pruned_loss=0.03229, over 974388.31 frames.], batch size: 26, lr: 1.83e-04 2022-05-07 09:43:51,184 INFO [train.py:715] (0/8) Epoch 12, batch 8600, loss[loss=0.1283, simple_loss=0.1979, pruned_loss=0.0294, over 4795.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2116, pruned_loss=0.03232, over 973134.58 frames.], batch size: 12, lr: 1.83e-04 
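The learning rate in these entries falls in two ways: a step down at the epoch boundary (1.92e-04 at the end of epoch 11 versus 1.85e-04 at the start of epoch 12) and a slow drift within the epoch (1.85e-04 down to 1.83e-04 over a few thousand batches). That shape matches a schedule that decays in both the global batch index and the epoch index, such as the Eden-style schedule used by icefall's pruned-transducer recipes. The sketch below assumes that schedule family; treat the exponent and the hyper-parameter values in the example calls as placeholders chosen only to show the shape of the curve.

    # Assumed Eden-style schedule: decays with both batch count and epoch count.
    def eden_lr(base_lr, batch, epoch, lr_batches, lr_epochs):
        batch_factor = ((batch ** 2 + lr_batches ** 2) / lr_batches ** 2) ** -0.25
        epoch_factor = ((epoch ** 2 + lr_epochs ** 2) / lr_epochs ** 2) ** -0.25
        return base_lr * batch_factor * epoch_factor

    # Qualitative check with placeholder hyper-parameters: the rate steps down
    # when `epoch` increments and drifts down slowly as `batch` grows.
    print(f"{eden_lr(3e-3, batch=415_000, epoch=11, lr_batches=5_000, lr_epochs=4):.2e}")
    print(f"{eden_lr(3e-3, batch=425_000, epoch=12, lr_batches=5_000, lr_epochs=4):.2e}")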
2022-05-07 09:44:28,877 INFO [train.py:715] (0/8) Epoch 12, batch 8650, loss[loss=0.141, simple_loss=0.2137, pruned_loss=0.03411, over 4917.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2118, pruned_loss=0.03278, over 973034.00 frames.], batch size: 29, lr: 1.83e-04 2022-05-07 09:45:07,101 INFO [train.py:715] (0/8) Epoch 12, batch 8700, loss[loss=0.1408, simple_loss=0.2155, pruned_loss=0.03308, over 4856.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2114, pruned_loss=0.03266, over 972755.63 frames.], batch size: 20, lr: 1.83e-04 2022-05-07 09:45:45,268 INFO [train.py:715] (0/8) Epoch 12, batch 8750, loss[loss=0.1381, simple_loss=0.2135, pruned_loss=0.03133, over 4841.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2114, pruned_loss=0.0328, over 973177.60 frames.], batch size: 13, lr: 1.83e-04 2022-05-07 09:46:23,699 INFO [train.py:715] (0/8) Epoch 12, batch 8800, loss[loss=0.1076, simple_loss=0.183, pruned_loss=0.01617, over 4803.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2116, pruned_loss=0.03299, over 972782.84 frames.], batch size: 12, lr: 1.83e-04 2022-05-07 09:47:01,612 INFO [train.py:715] (0/8) Epoch 12, batch 8850, loss[loss=0.134, simple_loss=0.2041, pruned_loss=0.03189, over 4942.00 frames.], tot_loss[loss=0.1391, simple_loss=0.2122, pruned_loss=0.03298, over 972877.99 frames.], batch size: 29, lr: 1.83e-04 2022-05-07 09:47:40,603 INFO [train.py:715] (0/8) Epoch 12, batch 8900, loss[loss=0.1223, simple_loss=0.1988, pruned_loss=0.02291, over 4793.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2111, pruned_loss=0.03259, over 972843.90 frames.], batch size: 18, lr: 1.83e-04 2022-05-07 09:48:20,141 INFO [train.py:715] (0/8) Epoch 12, batch 8950, loss[loss=0.1154, simple_loss=0.1919, pruned_loss=0.01948, over 4936.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2102, pruned_loss=0.03204, over 973081.77 frames.], batch size: 21, lr: 1.83e-04 2022-05-07 09:48:58,102 INFO [train.py:715] (0/8) Epoch 12, batch 9000, loss[loss=0.1253, simple_loss=0.203, pruned_loss=0.02374, over 4811.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2104, pruned_loss=0.03212, over 973032.12 frames.], batch size: 27, lr: 1.83e-04 2022-05-07 09:48:58,103 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 09:49:07,571 INFO [train.py:742] (0/8) Epoch 12, validation: loss=0.1057, simple_loss=0.1898, pruned_loss=0.01084, over 914524.00 frames. 
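Two kinds of checkpoints appear in this section: an end-of-epoch file (epoch-11.pt) and a periodic batch checkpoint (checkpoint-424000.pt), both written under pruned_transducer_stateless2/exp/v2. Either can be inspected offline with plain PyTorch loading; the dictionary key used below ("model") is an assumption about what the recipe stores, so list the keys before relying on them.

    import torch

    # Path taken from the "Saving checkpoint" entry above; the keys are assumptions.
    ckpt = torch.load(
        "pruned_transducer_stateless2/exp/v2/checkpoint-424000.pt",
        map_location="cpu",
    )
    print(sorted(ckpt.keys()))             # see what the recipe actually saved
    state_dict = ckpt.get("model", ckpt)   # fall back to a bare state_dict
    n_params = sum(t.numel() for t in state_dict.values() if hasattr(t, "numel"))
    print(f"{n_params} parameters in the saved model state")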
2022-05-07 09:49:45,342 INFO [train.py:715] (0/8) Epoch 12, batch 9050, loss[loss=0.1232, simple_loss=0.1928, pruned_loss=0.02681, over 4900.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2097, pruned_loss=0.03195, over 972442.53 frames.], batch size: 19, lr: 1.83e-04 2022-05-07 09:50:23,562 INFO [train.py:715] (0/8) Epoch 12, batch 9100, loss[loss=0.1255, simple_loss=0.2037, pruned_loss=0.0237, over 4770.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2102, pruned_loss=0.03232, over 972699.86 frames.], batch size: 18, lr: 1.83e-04 2022-05-07 09:51:01,821 INFO [train.py:715] (0/8) Epoch 12, batch 9150, loss[loss=0.1269, simple_loss=0.2067, pruned_loss=0.0235, over 4816.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2102, pruned_loss=0.03221, over 971991.80 frames.], batch size: 21, lr: 1.83e-04 2022-05-07 09:51:39,539 INFO [train.py:715] (0/8) Epoch 12, batch 9200, loss[loss=0.1168, simple_loss=0.1943, pruned_loss=0.01968, over 4745.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2094, pruned_loss=0.03201, over 970981.56 frames.], batch size: 16, lr: 1.83e-04 2022-05-07 09:52:17,392 INFO [train.py:715] (0/8) Epoch 12, batch 9250, loss[loss=0.1078, simple_loss=0.1735, pruned_loss=0.021, over 4796.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2092, pruned_loss=0.03187, over 970671.99 frames.], batch size: 12, lr: 1.83e-04 2022-05-07 09:52:55,470 INFO [train.py:715] (0/8) Epoch 12, batch 9300, loss[loss=0.1169, simple_loss=0.1894, pruned_loss=0.02224, over 4820.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2094, pruned_loss=0.0322, over 971494.01 frames.], batch size: 25, lr: 1.83e-04 2022-05-07 09:53:33,063 INFO [train.py:715] (0/8) Epoch 12, batch 9350, loss[loss=0.1366, simple_loss=0.2074, pruned_loss=0.03296, over 4963.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2096, pruned_loss=0.03186, over 972304.27 frames.], batch size: 14, lr: 1.83e-04 2022-05-07 09:54:10,840 INFO [train.py:715] (0/8) Epoch 12, batch 9400, loss[loss=0.1287, simple_loss=0.2045, pruned_loss=0.02649, over 4914.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2096, pruned_loss=0.03204, over 971252.69 frames.], batch size: 18, lr: 1.83e-04 2022-05-07 09:54:48,553 INFO [train.py:715] (0/8) Epoch 12, batch 9450, loss[loss=0.1368, simple_loss=0.2036, pruned_loss=0.03502, over 4834.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2103, pruned_loss=0.03201, over 971545.49 frames.], batch size: 32, lr: 1.83e-04 2022-05-07 09:55:26,593 INFO [train.py:715] (0/8) Epoch 12, batch 9500, loss[loss=0.1154, simple_loss=0.185, pruned_loss=0.02291, over 4856.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2106, pruned_loss=0.03202, over 971148.77 frames.], batch size: 20, lr: 1.83e-04 2022-05-07 09:56:04,146 INFO [train.py:715] (0/8) Epoch 12, batch 9550, loss[loss=0.1273, simple_loss=0.2129, pruned_loss=0.02089, over 4863.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2103, pruned_loss=0.03201, over 971176.68 frames.], batch size: 16, lr: 1.82e-04 2022-05-07 09:56:41,638 INFO [train.py:715] (0/8) Epoch 12, batch 9600, loss[loss=0.1523, simple_loss=0.2251, pruned_loss=0.03977, over 4906.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2101, pruned_loss=0.03232, over 971772.39 frames.], batch size: 17, lr: 1.82e-04 2022-05-07 09:57:19,883 INFO [train.py:715] (0/8) Epoch 12, batch 9650, loss[loss=0.1763, simple_loss=0.2361, pruned_loss=0.05823, over 4923.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2105, pruned_loss=0.03254, over 972113.39 frames.], batch size: 39, lr: 1.82e-04 2022-05-07 09:57:57,754 INFO 
[train.py:715] (0/8) Epoch 12, batch 9700, loss[loss=0.1412, simple_loss=0.212, pruned_loss=0.03518, over 4751.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2101, pruned_loss=0.03212, over 972119.12 frames.], batch size: 16, lr: 1.82e-04 2022-05-07 09:58:35,533 INFO [train.py:715] (0/8) Epoch 12, batch 9750, loss[loss=0.17, simple_loss=0.2402, pruned_loss=0.0499, over 4865.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2109, pruned_loss=0.03237, over 971917.92 frames.], batch size: 30, lr: 1.82e-04 2022-05-07 09:59:13,481 INFO [train.py:715] (0/8) Epoch 12, batch 9800, loss[loss=0.1295, simple_loss=0.2064, pruned_loss=0.02628, over 4916.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2107, pruned_loss=0.03224, over 972368.24 frames.], batch size: 18, lr: 1.82e-04 2022-05-07 09:59:51,998 INFO [train.py:715] (0/8) Epoch 12, batch 9850, loss[loss=0.1809, simple_loss=0.2722, pruned_loss=0.04483, over 4887.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2106, pruned_loss=0.03239, over 972575.50 frames.], batch size: 22, lr: 1.82e-04 2022-05-07 10:00:29,631 INFO [train.py:715] (0/8) Epoch 12, batch 9900, loss[loss=0.1218, simple_loss=0.2047, pruned_loss=0.01938, over 4824.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2101, pruned_loss=0.03216, over 972964.12 frames.], batch size: 27, lr: 1.82e-04 2022-05-07 10:01:07,861 INFO [train.py:715] (0/8) Epoch 12, batch 9950, loss[loss=0.1381, simple_loss=0.2183, pruned_loss=0.02893, over 4846.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2104, pruned_loss=0.03205, over 972673.76 frames.], batch size: 20, lr: 1.82e-04 2022-05-07 10:01:46,616 INFO [train.py:715] (0/8) Epoch 12, batch 10000, loss[loss=0.1547, simple_loss=0.2256, pruned_loss=0.0419, over 4988.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2104, pruned_loss=0.03215, over 972691.61 frames.], batch size: 24, lr: 1.82e-04 2022-05-07 10:02:25,148 INFO [train.py:715] (0/8) Epoch 12, batch 10050, loss[loss=0.131, simple_loss=0.2055, pruned_loss=0.02819, over 4785.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2099, pruned_loss=0.03215, over 972503.71 frames.], batch size: 17, lr: 1.82e-04 2022-05-07 10:03:03,486 INFO [train.py:715] (0/8) Epoch 12, batch 10100, loss[loss=0.1162, simple_loss=0.184, pruned_loss=0.0242, over 4707.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2102, pruned_loss=0.03217, over 972216.95 frames.], batch size: 15, lr: 1.82e-04 2022-05-07 10:03:41,894 INFO [train.py:715] (0/8) Epoch 12, batch 10150, loss[loss=0.1314, simple_loss=0.2081, pruned_loss=0.02729, over 4804.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2102, pruned_loss=0.0321, over 972347.34 frames.], batch size: 25, lr: 1.82e-04 2022-05-07 10:04:20,548 INFO [train.py:715] (0/8) Epoch 12, batch 10200, loss[loss=0.1234, simple_loss=0.2008, pruned_loss=0.02305, over 4977.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2108, pruned_loss=0.03275, over 972385.19 frames.], batch size: 28, lr: 1.82e-04 2022-05-07 10:04:57,861 INFO [train.py:715] (0/8) Epoch 12, batch 10250, loss[loss=0.1526, simple_loss=0.2283, pruned_loss=0.03848, over 4751.00 frames.], tot_loss[loss=0.1388, simple_loss=0.2117, pruned_loss=0.033, over 972764.16 frames.], batch size: 19, lr: 1.82e-04 2022-05-07 10:05:36,035 INFO [train.py:715] (0/8) Epoch 12, batch 10300, loss[loss=0.1395, simple_loss=0.2089, pruned_loss=0.03504, over 4784.00 frames.], tot_loss[loss=0.1387, simple_loss=0.2114, pruned_loss=0.03295, over 972898.83 frames.], batch size: 14, lr: 1.82e-04 2022-05-07 10:06:14,193 INFO [train.py:715] (0/8) Epoch 12, 
batch 10350, loss[loss=0.1459, simple_loss=0.2189, pruned_loss=0.0364, over 4980.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2112, pruned_loss=0.03227, over 972992.62 frames.], batch size: 25, lr: 1.82e-04 2022-05-07 10:06:52,238 INFO [train.py:715] (0/8) Epoch 12, batch 10400, loss[loss=0.1421, simple_loss=0.2084, pruned_loss=0.03789, over 4832.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2113, pruned_loss=0.03261, over 972386.17 frames.], batch size: 26, lr: 1.82e-04 2022-05-07 10:07:29,795 INFO [train.py:715] (0/8) Epoch 12, batch 10450, loss[loss=0.1456, simple_loss=0.206, pruned_loss=0.04257, over 4863.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2113, pruned_loss=0.03255, over 972230.54 frames.], batch size: 30, lr: 1.82e-04 2022-05-07 10:08:07,724 INFO [train.py:715] (0/8) Epoch 12, batch 10500, loss[loss=0.125, simple_loss=0.2029, pruned_loss=0.02353, over 4766.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2105, pruned_loss=0.0321, over 971927.83 frames.], batch size: 14, lr: 1.82e-04 2022-05-07 10:08:46,132 INFO [train.py:715] (0/8) Epoch 12, batch 10550, loss[loss=0.132, simple_loss=0.202, pruned_loss=0.03102, over 4955.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2101, pruned_loss=0.03198, over 971303.74 frames.], batch size: 29, lr: 1.82e-04 2022-05-07 10:09:23,507 INFO [train.py:715] (0/8) Epoch 12, batch 10600, loss[loss=0.1455, simple_loss=0.2245, pruned_loss=0.03325, over 4770.00 frames.], tot_loss[loss=0.1369, simple_loss=0.21, pruned_loss=0.03193, over 971029.32 frames.], batch size: 17, lr: 1.82e-04 2022-05-07 10:10:01,495 INFO [train.py:715] (0/8) Epoch 12, batch 10650, loss[loss=0.1192, simple_loss=0.1815, pruned_loss=0.02852, over 4797.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2097, pruned_loss=0.03173, over 970660.44 frames.], batch size: 13, lr: 1.82e-04 2022-05-07 10:10:39,352 INFO [train.py:715] (0/8) Epoch 12, batch 10700, loss[loss=0.1261, simple_loss=0.1962, pruned_loss=0.02795, over 4770.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2099, pruned_loss=0.03163, over 970857.04 frames.], batch size: 18, lr: 1.82e-04 2022-05-07 10:11:16,856 INFO [train.py:715] (0/8) Epoch 12, batch 10750, loss[loss=0.1687, simple_loss=0.2385, pruned_loss=0.04948, over 4876.00 frames.], tot_loss[loss=0.137, simple_loss=0.2105, pruned_loss=0.03173, over 971264.73 frames.], batch size: 16, lr: 1.82e-04 2022-05-07 10:11:54,741 INFO [train.py:715] (0/8) Epoch 12, batch 10800, loss[loss=0.1202, simple_loss=0.1967, pruned_loss=0.02183, over 4793.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2099, pruned_loss=0.03133, over 971182.27 frames.], batch size: 12, lr: 1.82e-04 2022-05-07 10:12:32,731 INFO [train.py:715] (0/8) Epoch 12, batch 10850, loss[loss=0.1455, simple_loss=0.2099, pruned_loss=0.04054, over 4965.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2102, pruned_loss=0.0318, over 971275.31 frames.], batch size: 14, lr: 1.82e-04 2022-05-07 10:13:11,524 INFO [train.py:715] (0/8) Epoch 12, batch 10900, loss[loss=0.1776, simple_loss=0.2574, pruned_loss=0.04891, over 4979.00 frames.], tot_loss[loss=0.138, simple_loss=0.2118, pruned_loss=0.03206, over 970887.51 frames.], batch size: 35, lr: 1.82e-04 2022-05-07 10:13:48,731 INFO [train.py:715] (0/8) Epoch 12, batch 10950, loss[loss=0.1499, simple_loss=0.2038, pruned_loss=0.04801, over 4757.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2117, pruned_loss=0.03207, over 971918.86 frames.], batch size: 19, lr: 1.82e-04 2022-05-07 10:14:26,873 INFO [train.py:715] (0/8) Epoch 12, batch 11000, 
loss[loss=0.1366, simple_loss=0.2089, pruned_loss=0.03218, over 4979.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2107, pruned_loss=0.03186, over 972308.90 frames.], batch size: 31, lr: 1.82e-04 2022-05-07 10:15:05,143 INFO [train.py:715] (0/8) Epoch 12, batch 11050, loss[loss=0.1204, simple_loss=0.2, pruned_loss=0.02039, over 4776.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2109, pruned_loss=0.03189, over 972644.06 frames.], batch size: 12, lr: 1.82e-04 2022-05-07 10:15:42,769 INFO [train.py:715] (0/8) Epoch 12, batch 11100, loss[loss=0.1085, simple_loss=0.1809, pruned_loss=0.01807, over 4920.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2102, pruned_loss=0.03156, over 972206.86 frames.], batch size: 17, lr: 1.82e-04 2022-05-07 10:16:21,270 INFO [train.py:715] (0/8) Epoch 12, batch 11150, loss[loss=0.165, simple_loss=0.2281, pruned_loss=0.05099, over 4850.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2109, pruned_loss=0.03179, over 972785.87 frames.], batch size: 34, lr: 1.82e-04 2022-05-07 10:16:58,887 INFO [train.py:715] (0/8) Epoch 12, batch 11200, loss[loss=0.1479, simple_loss=0.2106, pruned_loss=0.04266, over 4977.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2109, pruned_loss=0.03171, over 972980.36 frames.], batch size: 14, lr: 1.82e-04 2022-05-07 10:17:36,985 INFO [train.py:715] (0/8) Epoch 12, batch 11250, loss[loss=0.1488, simple_loss=0.2267, pruned_loss=0.03542, over 4874.00 frames.], tot_loss[loss=0.1374, simple_loss=0.211, pruned_loss=0.03186, over 973043.20 frames.], batch size: 16, lr: 1.82e-04 2022-05-07 10:18:14,706 INFO [train.py:715] (0/8) Epoch 12, batch 11300, loss[loss=0.1297, simple_loss=0.2103, pruned_loss=0.02459, over 4863.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2112, pruned_loss=0.03191, over 971829.21 frames.], batch size: 20, lr: 1.82e-04 2022-05-07 10:18:51,978 INFO [train.py:715] (0/8) Epoch 12, batch 11350, loss[loss=0.1391, simple_loss=0.2149, pruned_loss=0.03167, over 4686.00 frames.], tot_loss[loss=0.137, simple_loss=0.2108, pruned_loss=0.03164, over 971866.23 frames.], batch size: 15, lr: 1.82e-04 2022-05-07 10:19:30,198 INFO [train.py:715] (0/8) Epoch 12, batch 11400, loss[loss=0.149, simple_loss=0.2091, pruned_loss=0.04448, over 4795.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2107, pruned_loss=0.03128, over 972069.54 frames.], batch size: 13, lr: 1.82e-04 2022-05-07 10:20:07,739 INFO [train.py:715] (0/8) Epoch 12, batch 11450, loss[loss=0.1645, simple_loss=0.2279, pruned_loss=0.05055, over 4843.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2108, pruned_loss=0.03187, over 971831.76 frames.], batch size: 32, lr: 1.82e-04 2022-05-07 10:20:45,254 INFO [train.py:715] (0/8) Epoch 12, batch 11500, loss[loss=0.1139, simple_loss=0.1778, pruned_loss=0.02496, over 4855.00 frames.], tot_loss[loss=0.137, simple_loss=0.2106, pruned_loss=0.03171, over 972200.08 frames.], batch size: 15, lr: 1.82e-04 2022-05-07 10:21:23,001 INFO [train.py:715] (0/8) Epoch 12, batch 11550, loss[loss=0.1412, simple_loss=0.2192, pruned_loss=0.03157, over 4751.00 frames.], tot_loss[loss=0.1364, simple_loss=0.21, pruned_loss=0.03141, over 972581.31 frames.], batch size: 19, lr: 1.82e-04 2022-05-07 10:22:01,391 INFO [train.py:715] (0/8) Epoch 12, batch 11600, loss[loss=0.122, simple_loss=0.1954, pruned_loss=0.02425, over 4936.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2101, pruned_loss=0.03185, over 972893.05 frames.], batch size: 21, lr: 1.82e-04 2022-05-07 10:22:38,878 INFO [train.py:715] (0/8) Epoch 12, batch 11650, loss[loss=0.1499, 
simple_loss=0.2218, pruned_loss=0.03898, over 4688.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2101, pruned_loss=0.03188, over 971400.96 frames.], batch size: 15, lr: 1.82e-04 2022-05-07 10:23:16,089 INFO [train.py:715] (0/8) Epoch 12, batch 11700, loss[loss=0.1528, simple_loss=0.2266, pruned_loss=0.0395, over 4971.00 frames.], tot_loss[loss=0.1369, simple_loss=0.21, pruned_loss=0.03196, over 971112.97 frames.], batch size: 15, lr: 1.82e-04 2022-05-07 10:23:53,747 INFO [train.py:715] (0/8) Epoch 12, batch 11750, loss[loss=0.1253, simple_loss=0.2008, pruned_loss=0.02492, over 4821.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2096, pruned_loss=0.03187, over 971543.33 frames.], batch size: 27, lr: 1.82e-04 2022-05-07 10:24:31,080 INFO [train.py:715] (0/8) Epoch 12, batch 11800, loss[loss=0.1263, simple_loss=0.2109, pruned_loss=0.02091, over 4957.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2094, pruned_loss=0.03157, over 971959.95 frames.], batch size: 21, lr: 1.82e-04 2022-05-07 10:25:08,776 INFO [train.py:715] (0/8) Epoch 12, batch 11850, loss[loss=0.1463, simple_loss=0.2083, pruned_loss=0.04213, over 4847.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2095, pruned_loss=0.03188, over 972453.77 frames.], batch size: 30, lr: 1.82e-04 2022-05-07 10:25:46,623 INFO [train.py:715] (0/8) Epoch 12, batch 11900, loss[loss=0.1336, simple_loss=0.2137, pruned_loss=0.02675, over 4877.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2098, pruned_loss=0.03164, over 972237.88 frames.], batch size: 22, lr: 1.82e-04 2022-05-07 10:26:24,512 INFO [train.py:715] (0/8) Epoch 12, batch 11950, loss[loss=0.1609, simple_loss=0.2191, pruned_loss=0.0513, over 4781.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2106, pruned_loss=0.03197, over 972806.04 frames.], batch size: 18, lr: 1.82e-04 2022-05-07 10:27:01,974 INFO [train.py:715] (0/8) Epoch 12, batch 12000, loss[loss=0.1065, simple_loss=0.1716, pruned_loss=0.02073, over 4845.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2108, pruned_loss=0.03219, over 972913.08 frames.], batch size: 12, lr: 1.82e-04 2022-05-07 10:27:01,976 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 10:27:11,324 INFO [train.py:742] (0/8) Epoch 12, validation: loss=0.1058, simple_loss=0.1897, pruned_loss=0.01095, over 914524.00 frames. 
2022-05-07 10:27:50,012 INFO [train.py:715] (0/8) Epoch 12, batch 12050, loss[loss=0.1273, simple_loss=0.2001, pruned_loss=0.02727, over 4966.00 frames.], tot_loss[loss=0.1374, simple_loss=0.211, pruned_loss=0.0319, over 973363.82 frames.], batch size: 25, lr: 1.82e-04 2022-05-07 10:28:29,089 INFO [train.py:715] (0/8) Epoch 12, batch 12100, loss[loss=0.1338, simple_loss=0.1989, pruned_loss=0.03432, over 4865.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2113, pruned_loss=0.03191, over 973076.80 frames.], batch size: 16, lr: 1.82e-04 2022-05-07 10:29:08,844 INFO [train.py:715] (0/8) Epoch 12, batch 12150, loss[loss=0.1675, simple_loss=0.2534, pruned_loss=0.04084, over 4781.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2118, pruned_loss=0.03218, over 972279.21 frames.], batch size: 14, lr: 1.82e-04 2022-05-07 10:29:47,128 INFO [train.py:715] (0/8) Epoch 12, batch 12200, loss[loss=0.1518, simple_loss=0.2327, pruned_loss=0.03541, over 4898.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2105, pruned_loss=0.03186, over 970901.04 frames.], batch size: 19, lr: 1.82e-04 2022-05-07 10:30:25,382 INFO [train.py:715] (0/8) Epoch 12, batch 12250, loss[loss=0.1498, simple_loss=0.2185, pruned_loss=0.04053, over 4894.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2113, pruned_loss=0.03217, over 972509.00 frames.], batch size: 18, lr: 1.82e-04 2022-05-07 10:31:04,236 INFO [train.py:715] (0/8) Epoch 12, batch 12300, loss[loss=0.1522, simple_loss=0.2215, pruned_loss=0.04141, over 4769.00 frames.], tot_loss[loss=0.1385, simple_loss=0.212, pruned_loss=0.03252, over 971661.53 frames.], batch size: 18, lr: 1.82e-04 2022-05-07 10:31:42,813 INFO [train.py:715] (0/8) Epoch 12, batch 12350, loss[loss=0.1224, simple_loss=0.1965, pruned_loss=0.02419, over 4793.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2118, pruned_loss=0.03229, over 972039.99 frames.], batch size: 24, lr: 1.82e-04 2022-05-07 10:32:20,260 INFO [train.py:715] (0/8) Epoch 12, batch 12400, loss[loss=0.1281, simple_loss=0.2077, pruned_loss=0.02423, over 4979.00 frames.], tot_loss[loss=0.137, simple_loss=0.211, pruned_loss=0.03154, over 972926.11 frames.], batch size: 24, lr: 1.82e-04 2022-05-07 10:32:57,983 INFO [train.py:715] (0/8) Epoch 12, batch 12450, loss[loss=0.1413, simple_loss=0.2172, pruned_loss=0.03273, over 4859.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2113, pruned_loss=0.0315, over 972912.55 frames.], batch size: 20, lr: 1.82e-04 2022-05-07 10:33:36,205 INFO [train.py:715] (0/8) Epoch 12, batch 12500, loss[loss=0.1157, simple_loss=0.191, pruned_loss=0.02018, over 4814.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2105, pruned_loss=0.03123, over 973001.11 frames.], batch size: 27, lr: 1.82e-04 2022-05-07 10:34:13,315 INFO [train.py:715] (0/8) Epoch 12, batch 12550, loss[loss=0.1314, simple_loss=0.2078, pruned_loss=0.02747, over 4882.00 frames.], tot_loss[loss=0.1359, simple_loss=0.21, pruned_loss=0.03094, over 972765.18 frames.], batch size: 22, lr: 1.82e-04 2022-05-07 10:34:51,152 INFO [train.py:715] (0/8) Epoch 12, batch 12600, loss[loss=0.1615, simple_loss=0.2336, pruned_loss=0.04475, over 4983.00 frames.], tot_loss[loss=0.1372, simple_loss=0.211, pruned_loss=0.03173, over 973233.99 frames.], batch size: 15, lr: 1.82e-04 2022-05-07 10:35:28,919 INFO [train.py:715] (0/8) Epoch 12, batch 12650, loss[loss=0.1105, simple_loss=0.1852, pruned_loss=0.01785, over 4788.00 frames.], tot_loss[loss=0.137, simple_loss=0.2105, pruned_loss=0.03175, over 972752.41 frames.], batch size: 14, lr: 1.82e-04 2022-05-07 10:36:06,671 INFO 
[train.py:715] (0/8) Epoch 12, batch 12700, loss[loss=0.1373, simple_loss=0.2103, pruned_loss=0.03218, over 4983.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2115, pruned_loss=0.03192, over 973095.95 frames.], batch size: 14, lr: 1.82e-04 2022-05-07 10:36:44,124 INFO [train.py:715] (0/8) Epoch 12, batch 12750, loss[loss=0.1684, simple_loss=0.2265, pruned_loss=0.05519, over 4832.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2113, pruned_loss=0.03193, over 972877.79 frames.], batch size: 30, lr: 1.82e-04 2022-05-07 10:37:22,152 INFO [train.py:715] (0/8) Epoch 12, batch 12800, loss[loss=0.1687, simple_loss=0.2325, pruned_loss=0.05248, over 4966.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2115, pruned_loss=0.03254, over 973508.04 frames.], batch size: 15, lr: 1.82e-04 2022-05-07 10:38:00,586 INFO [train.py:715] (0/8) Epoch 12, batch 12850, loss[loss=0.1305, simple_loss=0.2036, pruned_loss=0.02867, over 4783.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2109, pruned_loss=0.03191, over 973299.08 frames.], batch size: 17, lr: 1.82e-04 2022-05-07 10:38:37,908 INFO [train.py:715] (0/8) Epoch 12, batch 12900, loss[loss=0.1564, simple_loss=0.2267, pruned_loss=0.04309, over 4945.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2106, pruned_loss=0.03177, over 973088.46 frames.], batch size: 21, lr: 1.82e-04 2022-05-07 10:39:15,001 INFO [train.py:715] (0/8) Epoch 12, batch 12950, loss[loss=0.117, simple_loss=0.1867, pruned_loss=0.02363, over 4808.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2108, pruned_loss=0.03203, over 973437.83 frames.], batch size: 21, lr: 1.82e-04 2022-05-07 10:39:52,995 INFO [train.py:715] (0/8) Epoch 12, batch 13000, loss[loss=0.1251, simple_loss=0.1966, pruned_loss=0.02678, over 4766.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2109, pruned_loss=0.03199, over 972906.31 frames.], batch size: 19, lr: 1.82e-04 2022-05-07 10:40:30,777 INFO [train.py:715] (0/8) Epoch 12, batch 13050, loss[loss=0.1205, simple_loss=0.197, pruned_loss=0.022, over 4982.00 frames.], tot_loss[loss=0.1366, simple_loss=0.21, pruned_loss=0.03166, over 972265.47 frames.], batch size: 24, lr: 1.82e-04 2022-05-07 10:41:08,529 INFO [train.py:715] (0/8) Epoch 12, batch 13100, loss[loss=0.1453, simple_loss=0.2268, pruned_loss=0.03189, over 4810.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2104, pruned_loss=0.03236, over 972706.01 frames.], batch size: 25, lr: 1.82e-04 2022-05-07 10:41:46,119 INFO [train.py:715] (0/8) Epoch 12, batch 13150, loss[loss=0.1441, simple_loss=0.2148, pruned_loss=0.03675, over 4873.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2096, pruned_loss=0.03194, over 971809.52 frames.], batch size: 20, lr: 1.82e-04 2022-05-07 10:42:23,786 INFO [train.py:715] (0/8) Epoch 12, batch 13200, loss[loss=0.1198, simple_loss=0.1979, pruned_loss=0.02088, over 4937.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2094, pruned_loss=0.03212, over 972548.37 frames.], batch size: 23, lr: 1.82e-04 2022-05-07 10:43:01,011 INFO [train.py:715] (0/8) Epoch 12, batch 13250, loss[loss=0.1266, simple_loss=0.2011, pruned_loss=0.026, over 4977.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2096, pruned_loss=0.03176, over 973051.97 frames.], batch size: 29, lr: 1.82e-04 2022-05-07 10:43:38,184 INFO [train.py:715] (0/8) Epoch 12, batch 13300, loss[loss=0.1233, simple_loss=0.2006, pruned_loss=0.02299, over 4830.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2097, pruned_loss=0.03166, over 972905.18 frames.], batch size: 15, lr: 1.82e-04 2022-05-07 10:44:16,074 INFO [train.py:715] (0/8) Epoch 
12, batch 13350, loss[loss=0.132, simple_loss=0.2026, pruned_loss=0.03071, over 4859.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2105, pruned_loss=0.03192, over 973183.99 frames.], batch size: 20, lr: 1.82e-04 2022-05-07 10:44:54,312 INFO [train.py:715] (0/8) Epoch 12, batch 13400, loss[loss=0.1067, simple_loss=0.1778, pruned_loss=0.01779, over 4763.00 frames.], tot_loss[loss=0.1368, simple_loss=0.21, pruned_loss=0.03183, over 973321.99 frames.], batch size: 18, lr: 1.82e-04 2022-05-07 10:45:31,690 INFO [train.py:715] (0/8) Epoch 12, batch 13450, loss[loss=0.1516, simple_loss=0.2249, pruned_loss=0.03912, over 4967.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2099, pruned_loss=0.03188, over 973169.18 frames.], batch size: 15, lr: 1.82e-04 2022-05-07 10:46:09,027 INFO [train.py:715] (0/8) Epoch 12, batch 13500, loss[loss=0.1535, simple_loss=0.2225, pruned_loss=0.04228, over 4912.00 frames.], tot_loss[loss=0.137, simple_loss=0.2099, pruned_loss=0.03208, over 973369.30 frames.], batch size: 39, lr: 1.82e-04 2022-05-07 10:46:47,472 INFO [train.py:715] (0/8) Epoch 12, batch 13550, loss[loss=0.1394, simple_loss=0.2192, pruned_loss=0.02982, over 4884.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2106, pruned_loss=0.03196, over 972812.69 frames.], batch size: 16, lr: 1.82e-04 2022-05-07 10:47:24,684 INFO [train.py:715] (0/8) Epoch 12, batch 13600, loss[loss=0.1331, simple_loss=0.2083, pruned_loss=0.02895, over 4971.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2104, pruned_loss=0.03152, over 973044.85 frames.], batch size: 15, lr: 1.82e-04 2022-05-07 10:48:02,571 INFO [train.py:715] (0/8) Epoch 12, batch 13650, loss[loss=0.1488, simple_loss=0.2239, pruned_loss=0.03689, over 4909.00 frames.], tot_loss[loss=0.137, simple_loss=0.2106, pruned_loss=0.03174, over 972875.93 frames.], batch size: 19, lr: 1.82e-04 2022-05-07 10:48:40,707 INFO [train.py:715] (0/8) Epoch 12, batch 13700, loss[loss=0.1633, simple_loss=0.2191, pruned_loss=0.05372, over 4946.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2101, pruned_loss=0.03155, over 972721.42 frames.], batch size: 35, lr: 1.82e-04 2022-05-07 10:49:18,439 INFO [train.py:715] (0/8) Epoch 12, batch 13750, loss[loss=0.1436, simple_loss=0.2214, pruned_loss=0.03296, over 4897.00 frames.], tot_loss[loss=0.1362, simple_loss=0.21, pruned_loss=0.03123, over 972225.81 frames.], batch size: 17, lr: 1.82e-04 2022-05-07 10:49:56,506 INFO [train.py:715] (0/8) Epoch 12, batch 13800, loss[loss=0.171, simple_loss=0.2408, pruned_loss=0.05061, over 4816.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2101, pruned_loss=0.03161, over 971428.43 frames.], batch size: 25, lr: 1.82e-04 2022-05-07 10:50:34,446 INFO [train.py:715] (0/8) Epoch 12, batch 13850, loss[loss=0.1536, simple_loss=0.2312, pruned_loss=0.03804, over 4828.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2105, pruned_loss=0.03204, over 970784.71 frames.], batch size: 15, lr: 1.82e-04 2022-05-07 10:51:12,968 INFO [train.py:715] (0/8) Epoch 12, batch 13900, loss[loss=0.1113, simple_loss=0.1848, pruned_loss=0.01889, over 4777.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2098, pruned_loss=0.03177, over 970838.61 frames.], batch size: 17, lr: 1.82e-04 2022-05-07 10:51:50,183 INFO [train.py:715] (0/8) Epoch 12, batch 13950, loss[loss=0.1361, simple_loss=0.2034, pruned_loss=0.03445, over 4819.00 frames.], tot_loss[loss=0.1366, simple_loss=0.21, pruned_loss=0.03162, over 970906.00 frames.], batch size: 14, lr: 1.82e-04 2022-05-07 10:52:28,375 INFO [train.py:715] (0/8) Epoch 12, batch 14000, 
loss[loss=0.1297, simple_loss=0.2006, pruned_loss=0.02944, over 4979.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2097, pruned_loss=0.03131, over 971701.85 frames.], batch size: 15, lr: 1.82e-04 2022-05-07 10:53:06,886 INFO [train.py:715] (0/8) Epoch 12, batch 14050, loss[loss=0.1096, simple_loss=0.179, pruned_loss=0.02008, over 4792.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2101, pruned_loss=0.03123, over 972777.35 frames.], batch size: 24, lr: 1.82e-04 2022-05-07 10:53:44,257 INFO [train.py:715] (0/8) Epoch 12, batch 14100, loss[loss=0.1289, simple_loss=0.2068, pruned_loss=0.02551, over 4934.00 frames.], tot_loss[loss=0.136, simple_loss=0.21, pruned_loss=0.03097, over 971903.53 frames.], batch size: 29, lr: 1.82e-04 2022-05-07 10:54:21,692 INFO [train.py:715] (0/8) Epoch 12, batch 14150, loss[loss=0.1434, simple_loss=0.2112, pruned_loss=0.03778, over 4694.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2094, pruned_loss=0.03094, over 972182.06 frames.], batch size: 15, lr: 1.82e-04 2022-05-07 10:55:00,100 INFO [train.py:715] (0/8) Epoch 12, batch 14200, loss[loss=0.1379, simple_loss=0.2002, pruned_loss=0.03783, over 4855.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2088, pruned_loss=0.03094, over 971523.67 frames.], batch size: 34, lr: 1.82e-04 2022-05-07 10:55:38,423 INFO [train.py:715] (0/8) Epoch 12, batch 14250, loss[loss=0.1355, simple_loss=0.2115, pruned_loss=0.02982, over 4972.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2099, pruned_loss=0.03156, over 970992.21 frames.], batch size: 24, lr: 1.81e-04 2022-05-07 10:56:05,475 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-432000.pt 2022-05-07 10:56:18,081 INFO [train.py:715] (0/8) Epoch 12, batch 14300, loss[loss=0.1247, simple_loss=0.2027, pruned_loss=0.02333, over 4808.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2102, pruned_loss=0.03164, over 970939.07 frames.], batch size: 13, lr: 1.81e-04 2022-05-07 10:56:56,579 INFO [train.py:715] (0/8) Epoch 12, batch 14350, loss[loss=0.107, simple_loss=0.186, pruned_loss=0.01401, over 4823.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2104, pruned_loss=0.03134, over 971641.47 frames.], batch size: 27, lr: 1.81e-04 2022-05-07 10:57:35,964 INFO [train.py:715] (0/8) Epoch 12, batch 14400, loss[loss=0.1237, simple_loss=0.2035, pruned_loss=0.0219, over 4815.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2109, pruned_loss=0.03166, over 971502.78 frames.], batch size: 12, lr: 1.81e-04 2022-05-07 10:58:14,109 INFO [train.py:715] (0/8) Epoch 12, batch 14450, loss[loss=0.1362, simple_loss=0.2044, pruned_loss=0.03394, over 4776.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2106, pruned_loss=0.03183, over 970937.62 frames.], batch size: 18, lr: 1.81e-04 2022-05-07 10:58:53,033 INFO [train.py:715] (0/8) Epoch 12, batch 14500, loss[loss=0.111, simple_loss=0.1819, pruned_loss=0.02008, over 4822.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2108, pruned_loss=0.03212, over 970418.59 frames.], batch size: 26, lr: 1.81e-04 2022-05-07 10:59:32,142 INFO [train.py:715] (0/8) Epoch 12, batch 14550, loss[loss=0.1531, simple_loss=0.2145, pruned_loss=0.04582, over 4865.00 frames.], tot_loss[loss=0.1366, simple_loss=0.21, pruned_loss=0.03155, over 970044.76 frames.], batch size: 30, lr: 1.81e-04 2022-05-07 11:00:11,026 INFO [train.py:715] (0/8) Epoch 12, batch 14600, loss[loss=0.1243, simple_loss=0.1968, pruned_loss=0.02594, over 4768.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2099, pruned_loss=0.03146, over 970677.65 
frames.], batch size: 14, lr: 1.81e-04 2022-05-07 11:00:49,642 INFO [train.py:715] (0/8) Epoch 12, batch 14650, loss[loss=0.1289, simple_loss=0.2133, pruned_loss=0.0222, over 4892.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2106, pruned_loss=0.03151, over 971526.87 frames.], batch size: 22, lr: 1.81e-04 2022-05-07 11:01:27,543 INFO [train.py:715] (0/8) Epoch 12, batch 14700, loss[loss=0.1195, simple_loss=0.1979, pruned_loss=0.02051, over 4970.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2107, pruned_loss=0.03153, over 972581.96 frames.], batch size: 24, lr: 1.81e-04 2022-05-07 11:02:06,064 INFO [train.py:715] (0/8) Epoch 12, batch 14750, loss[loss=0.1132, simple_loss=0.1819, pruned_loss=0.02229, over 4787.00 frames.], tot_loss[loss=0.137, simple_loss=0.2107, pruned_loss=0.0317, over 972035.93 frames.], batch size: 14, lr: 1.81e-04 2022-05-07 11:02:43,579 INFO [train.py:715] (0/8) Epoch 12, batch 14800, loss[loss=0.1519, simple_loss=0.2184, pruned_loss=0.04272, over 4928.00 frames.], tot_loss[loss=0.137, simple_loss=0.2105, pruned_loss=0.03172, over 972717.06 frames.], batch size: 18, lr: 1.81e-04 2022-05-07 11:03:21,324 INFO [train.py:715] (0/8) Epoch 12, batch 14850, loss[loss=0.1297, simple_loss=0.2112, pruned_loss=0.02406, over 4879.00 frames.], tot_loss[loss=0.1375, simple_loss=0.211, pruned_loss=0.03196, over 972548.18 frames.], batch size: 22, lr: 1.81e-04 2022-05-07 11:03:59,684 INFO [train.py:715] (0/8) Epoch 12, batch 14900, loss[loss=0.1326, simple_loss=0.2041, pruned_loss=0.03053, over 4782.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2111, pruned_loss=0.03199, over 971937.58 frames.], batch size: 18, lr: 1.81e-04 2022-05-07 11:04:38,244 INFO [train.py:715] (0/8) Epoch 12, batch 14950, loss[loss=0.1485, simple_loss=0.2126, pruned_loss=0.04218, over 4869.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2105, pruned_loss=0.03192, over 972200.55 frames.], batch size: 32, lr: 1.81e-04 2022-05-07 11:05:15,438 INFO [train.py:715] (0/8) Epoch 12, batch 15000, loss[loss=0.1578, simple_loss=0.2378, pruned_loss=0.03891, over 4961.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2107, pruned_loss=0.03198, over 972488.91 frames.], batch size: 39, lr: 1.81e-04 2022-05-07 11:05:15,439 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 11:05:25,070 INFO [train.py:742] (0/8) Epoch 12, validation: loss=0.1057, simple_loss=0.1897, pruned_loss=0.01083, over 914524.00 frames. 
2022-05-07 11:06:02,914 INFO [train.py:715] (0/8) Epoch 12, batch 15050, loss[loss=0.1287, simple_loss=0.2082, pruned_loss=0.02455, over 4963.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2107, pruned_loss=0.03192, over 972237.66 frames.], batch size: 15, lr: 1.81e-04 2022-05-07 11:06:41,206 INFO [train.py:715] (0/8) Epoch 12, batch 15100, loss[loss=0.13, simple_loss=0.2033, pruned_loss=0.02829, over 4895.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2108, pruned_loss=0.03183, over 972747.14 frames.], batch size: 29, lr: 1.81e-04 2022-05-07 11:07:20,386 INFO [train.py:715] (0/8) Epoch 12, batch 15150, loss[loss=0.1365, simple_loss=0.2103, pruned_loss=0.03134, over 4742.00 frames.], tot_loss[loss=0.137, simple_loss=0.2106, pruned_loss=0.03168, over 973107.73 frames.], batch size: 16, lr: 1.81e-04 2022-05-07 11:07:58,859 INFO [train.py:715] (0/8) Epoch 12, batch 15200, loss[loss=0.1171, simple_loss=0.1891, pruned_loss=0.02254, over 4786.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2103, pruned_loss=0.03215, over 972528.96 frames.], batch size: 21, lr: 1.81e-04 2022-05-07 11:08:37,644 INFO [train.py:715] (0/8) Epoch 12, batch 15250, loss[loss=0.155, simple_loss=0.238, pruned_loss=0.03596, over 4875.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2107, pruned_loss=0.03235, over 972680.08 frames.], batch size: 30, lr: 1.81e-04 2022-05-07 11:09:16,363 INFO [train.py:715] (0/8) Epoch 12, batch 15300, loss[loss=0.1361, simple_loss=0.2167, pruned_loss=0.02772, over 4799.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2102, pruned_loss=0.032, over 972939.41 frames.], batch size: 18, lr: 1.81e-04 2022-05-07 11:09:54,565 INFO [train.py:715] (0/8) Epoch 12, batch 15350, loss[loss=0.1172, simple_loss=0.1905, pruned_loss=0.02201, over 4780.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2103, pruned_loss=0.03205, over 972672.41 frames.], batch size: 18, lr: 1.81e-04 2022-05-07 11:10:31,949 INFO [train.py:715] (0/8) Epoch 12, batch 15400, loss[loss=0.1341, simple_loss=0.2091, pruned_loss=0.02952, over 4917.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2107, pruned_loss=0.03213, over 972871.48 frames.], batch size: 29, lr: 1.81e-04 2022-05-07 11:11:09,685 INFO [train.py:715] (0/8) Epoch 12, batch 15450, loss[loss=0.119, simple_loss=0.1935, pruned_loss=0.02224, over 4940.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2099, pruned_loss=0.03224, over 971457.78 frames.], batch size: 21, lr: 1.81e-04 2022-05-07 11:11:48,435 INFO [train.py:715] (0/8) Epoch 12, batch 15500, loss[loss=0.1344, simple_loss=0.2121, pruned_loss=0.0284, over 4781.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2093, pruned_loss=0.03212, over 971233.21 frames.], batch size: 14, lr: 1.81e-04 2022-05-07 11:12:26,561 INFO [train.py:715] (0/8) Epoch 12, batch 15550, loss[loss=0.1312, simple_loss=0.2011, pruned_loss=0.03064, over 4648.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2093, pruned_loss=0.03202, over 971602.84 frames.], batch size: 13, lr: 1.81e-04 2022-05-07 11:13:04,456 INFO [train.py:715] (0/8) Epoch 12, batch 15600, loss[loss=0.1411, simple_loss=0.2164, pruned_loss=0.03288, over 4933.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2095, pruned_loss=0.03179, over 972199.60 frames.], batch size: 39, lr: 1.81e-04 2022-05-07 11:13:42,235 INFO [train.py:715] (0/8) Epoch 12, batch 15650, loss[loss=0.1281, simple_loss=0.206, pruned_loss=0.02517, over 4807.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2102, pruned_loss=0.03199, over 972458.07 frames.], batch size: 25, lr: 1.81e-04 2022-05-07 11:14:20,671 INFO 
[train.py:715] (0/8) Epoch 12, batch 15700, loss[loss=0.1183, simple_loss=0.1864, pruned_loss=0.02507, over 4863.00 frames.], tot_loss[loss=0.137, simple_loss=0.2101, pruned_loss=0.03191, over 972780.58 frames.], batch size: 20, lr: 1.81e-04 2022-05-07 11:14:58,367 INFO [train.py:715] (0/8) Epoch 12, batch 15750, loss[loss=0.136, simple_loss=0.2143, pruned_loss=0.02889, over 4788.00 frames.], tot_loss[loss=0.137, simple_loss=0.2102, pruned_loss=0.03187, over 972355.66 frames.], batch size: 18, lr: 1.81e-04 2022-05-07 11:15:36,100 INFO [train.py:715] (0/8) Epoch 12, batch 15800, loss[loss=0.1323, simple_loss=0.2039, pruned_loss=0.03032, over 4832.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2096, pruned_loss=0.03207, over 972192.38 frames.], batch size: 13, lr: 1.81e-04 2022-05-07 11:16:14,193 INFO [train.py:715] (0/8) Epoch 12, batch 15850, loss[loss=0.1213, simple_loss=0.1904, pruned_loss=0.02607, over 4886.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2098, pruned_loss=0.03225, over 971399.08 frames.], batch size: 16, lr: 1.81e-04 2022-05-07 11:16:51,693 INFO [train.py:715] (0/8) Epoch 12, batch 15900, loss[loss=0.1497, simple_loss=0.2166, pruned_loss=0.04145, over 4850.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2096, pruned_loss=0.03186, over 971539.29 frames.], batch size: 30, lr: 1.81e-04 2022-05-07 11:17:29,505 INFO [train.py:715] (0/8) Epoch 12, batch 15950, loss[loss=0.1398, simple_loss=0.2051, pruned_loss=0.03718, over 4822.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2092, pruned_loss=0.03173, over 971802.89 frames.], batch size: 15, lr: 1.81e-04 2022-05-07 11:18:07,566 INFO [train.py:715] (0/8) Epoch 12, batch 16000, loss[loss=0.1505, simple_loss=0.2223, pruned_loss=0.0393, over 4865.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2086, pruned_loss=0.03127, over 972287.61 frames.], batch size: 32, lr: 1.81e-04 2022-05-07 11:18:47,324 INFO [train.py:715] (0/8) Epoch 12, batch 16050, loss[loss=0.1284, simple_loss=0.1969, pruned_loss=0.02993, over 4790.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2089, pruned_loss=0.03116, over 971464.66 frames.], batch size: 17, lr: 1.81e-04 2022-05-07 11:19:25,274 INFO [train.py:715] (0/8) Epoch 12, batch 16100, loss[loss=0.1122, simple_loss=0.1831, pruned_loss=0.02061, over 4837.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2091, pruned_loss=0.03132, over 971525.86 frames.], batch size: 30, lr: 1.81e-04 2022-05-07 11:20:04,185 INFO [train.py:715] (0/8) Epoch 12, batch 16150, loss[loss=0.1619, simple_loss=0.2299, pruned_loss=0.04694, over 4893.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2099, pruned_loss=0.03132, over 971367.40 frames.], batch size: 39, lr: 1.81e-04 2022-05-07 11:20:43,066 INFO [train.py:715] (0/8) Epoch 12, batch 16200, loss[loss=0.1267, simple_loss=0.2077, pruned_loss=0.02281, over 4987.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2105, pruned_loss=0.0318, over 971534.98 frames.], batch size: 25, lr: 1.81e-04 2022-05-07 11:21:21,839 INFO [train.py:715] (0/8) Epoch 12, batch 16250, loss[loss=0.1307, simple_loss=0.2042, pruned_loss=0.02865, over 4835.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2108, pruned_loss=0.03177, over 972183.56 frames.], batch size: 30, lr: 1.81e-04 2022-05-07 11:21:59,693 INFO [train.py:715] (0/8) Epoch 12, batch 16300, loss[loss=0.1304, simple_loss=0.197, pruned_loss=0.0319, over 4869.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2103, pruned_loss=0.03177, over 971631.15 frames.], batch size: 20, lr: 1.81e-04 2022-05-07 11:22:37,474 INFO [train.py:715] (0/8) 
Epoch 12, batch 16350, loss[loss=0.1319, simple_loss=0.2032, pruned_loss=0.03031, over 4948.00 frames.], tot_loss[loss=0.136, simple_loss=0.2093, pruned_loss=0.03131, over 971475.42 frames.], batch size: 21, lr: 1.81e-04 2022-05-07 11:23:16,249 INFO [train.py:715] (0/8) Epoch 12, batch 16400, loss[loss=0.151, simple_loss=0.2232, pruned_loss=0.03946, over 4847.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2104, pruned_loss=0.03188, over 972409.40 frames.], batch size: 32, lr: 1.81e-04 2022-05-07 11:23:54,208 INFO [train.py:715] (0/8) Epoch 12, batch 16450, loss[loss=0.1377, simple_loss=0.1986, pruned_loss=0.03845, over 4743.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2115, pruned_loss=0.03242, over 972689.43 frames.], batch size: 16, lr: 1.81e-04 2022-05-07 11:24:33,094 INFO [train.py:715] (0/8) Epoch 12, batch 16500, loss[loss=0.1439, simple_loss=0.2106, pruned_loss=0.03861, over 4827.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2109, pruned_loss=0.0321, over 972455.54 frames.], batch size: 13, lr: 1.81e-04 2022-05-07 11:25:12,167 INFO [train.py:715] (0/8) Epoch 12, batch 16550, loss[loss=0.1083, simple_loss=0.1921, pruned_loss=0.01223, over 4958.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2108, pruned_loss=0.03196, over 973098.96 frames.], batch size: 24, lr: 1.81e-04 2022-05-07 11:25:51,303 INFO [train.py:715] (0/8) Epoch 12, batch 16600, loss[loss=0.1347, simple_loss=0.2185, pruned_loss=0.02547, over 4926.00 frames.], tot_loss[loss=0.1374, simple_loss=0.211, pruned_loss=0.03193, over 972992.65 frames.], batch size: 23, lr: 1.81e-04 2022-05-07 11:26:29,842 INFO [train.py:715] (0/8) Epoch 12, batch 16650, loss[loss=0.1432, simple_loss=0.2179, pruned_loss=0.03428, over 4818.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2104, pruned_loss=0.03164, over 972259.19 frames.], batch size: 25, lr: 1.81e-04 2022-05-07 11:27:08,906 INFO [train.py:715] (0/8) Epoch 12, batch 16700, loss[loss=0.1656, simple_loss=0.2425, pruned_loss=0.0443, over 4863.00 frames.], tot_loss[loss=0.1368, simple_loss=0.21, pruned_loss=0.03173, over 971897.52 frames.], batch size: 20, lr: 1.81e-04 2022-05-07 11:27:48,107 INFO [train.py:715] (0/8) Epoch 12, batch 16750, loss[loss=0.1363, simple_loss=0.2164, pruned_loss=0.02807, over 4939.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2108, pruned_loss=0.03218, over 972490.06 frames.], batch size: 21, lr: 1.81e-04 2022-05-07 11:28:26,496 INFO [train.py:715] (0/8) Epoch 12, batch 16800, loss[loss=0.1304, simple_loss=0.198, pruned_loss=0.03138, over 4683.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2106, pruned_loss=0.03237, over 972126.11 frames.], batch size: 15, lr: 1.81e-04 2022-05-07 11:29:05,270 INFO [train.py:715] (0/8) Epoch 12, batch 16850, loss[loss=0.1393, simple_loss=0.2202, pruned_loss=0.02921, over 4931.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2103, pruned_loss=0.03253, over 972193.86 frames.], batch size: 23, lr: 1.81e-04 2022-05-07 11:29:44,421 INFO [train.py:715] (0/8) Epoch 12, batch 16900, loss[loss=0.1406, simple_loss=0.2248, pruned_loss=0.02816, over 4792.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2109, pruned_loss=0.03224, over 971794.91 frames.], batch size: 17, lr: 1.81e-04 2022-05-07 11:30:24,177 INFO [train.py:715] (0/8) Epoch 12, batch 16950, loss[loss=0.1336, simple_loss=0.2073, pruned_loss=0.02993, over 4986.00 frames.], tot_loss[loss=0.1378, simple_loss=0.211, pruned_loss=0.03234, over 973105.02 frames.], batch size: 14, lr: 1.81e-04 2022-05-07 11:31:02,690 INFO [train.py:715] (0/8) Epoch 12, batch 17000, 
loss[loss=0.1124, simple_loss=0.1758, pruned_loss=0.02452, over 4969.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2103, pruned_loss=0.03224, over 972345.21 frames.], batch size: 14, lr: 1.81e-04 2022-05-07 11:31:40,876 INFO [train.py:715] (0/8) Epoch 12, batch 17050, loss[loss=0.1379, simple_loss=0.2147, pruned_loss=0.03059, over 4940.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2105, pruned_loss=0.03247, over 971968.98 frames.], batch size: 35, lr: 1.81e-04 2022-05-07 11:32:19,760 INFO [train.py:715] (0/8) Epoch 12, batch 17100, loss[loss=0.1221, simple_loss=0.1935, pruned_loss=0.02535, over 4809.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2104, pruned_loss=0.03212, over 971456.82 frames.], batch size: 14, lr: 1.81e-04 2022-05-07 11:32:58,564 INFO [train.py:715] (0/8) Epoch 12, batch 17150, loss[loss=0.139, simple_loss=0.2164, pruned_loss=0.03078, over 4974.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2108, pruned_loss=0.03243, over 972115.42 frames.], batch size: 24, lr: 1.81e-04 2022-05-07 11:33:37,592 INFO [train.py:715] (0/8) Epoch 12, batch 17200, loss[loss=0.1412, simple_loss=0.2064, pruned_loss=0.03804, over 4816.00 frames.], tot_loss[loss=0.138, simple_loss=0.2108, pruned_loss=0.03265, over 971597.58 frames.], batch size: 15, lr: 1.81e-04 2022-05-07 11:34:16,024 INFO [train.py:715] (0/8) Epoch 12, batch 17250, loss[loss=0.1438, simple_loss=0.2101, pruned_loss=0.03873, over 4867.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2106, pruned_loss=0.03255, over 971661.86 frames.], batch size: 39, lr: 1.81e-04 2022-05-07 11:34:54,490 INFO [train.py:715] (0/8) Epoch 12, batch 17300, loss[loss=0.1525, simple_loss=0.2304, pruned_loss=0.03729, over 4908.00 frames.], tot_loss[loss=0.1382, simple_loss=0.211, pruned_loss=0.03267, over 971483.98 frames.], batch size: 19, lr: 1.81e-04 2022-05-07 11:35:32,123 INFO [train.py:715] (0/8) Epoch 12, batch 17350, loss[loss=0.1197, simple_loss=0.188, pruned_loss=0.02566, over 4971.00 frames.], tot_loss[loss=0.1385, simple_loss=0.2111, pruned_loss=0.03297, over 972245.36 frames.], batch size: 15, lr: 1.81e-04 2022-05-07 11:36:10,075 INFO [train.py:715] (0/8) Epoch 12, batch 17400, loss[loss=0.1251, simple_loss=0.192, pruned_loss=0.02912, over 4862.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2105, pruned_loss=0.03231, over 973037.92 frames.], batch size: 32, lr: 1.81e-04 2022-05-07 11:36:47,842 INFO [train.py:715] (0/8) Epoch 12, batch 17450, loss[loss=0.1297, simple_loss=0.2025, pruned_loss=0.02843, over 4934.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2102, pruned_loss=0.03206, over 972374.40 frames.], batch size: 23, lr: 1.81e-04 2022-05-07 11:37:26,177 INFO [train.py:715] (0/8) Epoch 12, batch 17500, loss[loss=0.153, simple_loss=0.214, pruned_loss=0.04602, over 4980.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2101, pruned_loss=0.03182, over 971937.38 frames.], batch size: 39, lr: 1.81e-04 2022-05-07 11:38:04,039 INFO [train.py:715] (0/8) Epoch 12, batch 17550, loss[loss=0.1132, simple_loss=0.1925, pruned_loss=0.0169, over 4986.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2094, pruned_loss=0.03121, over 971899.95 frames.], batch size: 14, lr: 1.81e-04 2022-05-07 11:38:42,238 INFO [train.py:715] (0/8) Epoch 12, batch 17600, loss[loss=0.1398, simple_loss=0.2101, pruned_loss=0.03476, over 4819.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2101, pruned_loss=0.03124, over 971398.40 frames.], batch size: 25, lr: 1.81e-04 2022-05-07 11:39:19,882 INFO [train.py:715] (0/8) Epoch 12, batch 17650, loss[loss=0.1373, 
simple_loss=0.2166, pruned_loss=0.02901, over 4964.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2099, pruned_loss=0.03113, over 971624.42 frames.], batch size: 40, lr: 1.81e-04 2022-05-07 11:39:57,986 INFO [train.py:715] (0/8) Epoch 12, batch 17700, loss[loss=0.1356, simple_loss=0.2053, pruned_loss=0.03295, over 4924.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2097, pruned_loss=0.03129, over 971198.56 frames.], batch size: 18, lr: 1.81e-04 2022-05-07 11:40:36,845 INFO [train.py:715] (0/8) Epoch 12, batch 17750, loss[loss=0.1359, simple_loss=0.212, pruned_loss=0.02991, over 4971.00 frames.], tot_loss[loss=0.1365, simple_loss=0.21, pruned_loss=0.03151, over 971543.41 frames.], batch size: 35, lr: 1.81e-04 2022-05-07 11:41:15,686 INFO [train.py:715] (0/8) Epoch 12, batch 17800, loss[loss=0.1325, simple_loss=0.1991, pruned_loss=0.03299, over 4915.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2103, pruned_loss=0.03158, over 972031.90 frames.], batch size: 18, lr: 1.81e-04 2022-05-07 11:41:54,186 INFO [train.py:715] (0/8) Epoch 12, batch 17850, loss[loss=0.1335, simple_loss=0.2019, pruned_loss=0.03262, over 4779.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2099, pruned_loss=0.03139, over 971730.56 frames.], batch size: 14, lr: 1.81e-04 2022-05-07 11:42:32,956 INFO [train.py:715] (0/8) Epoch 12, batch 17900, loss[loss=0.1366, simple_loss=0.2162, pruned_loss=0.02852, over 4919.00 frames.], tot_loss[loss=0.1366, simple_loss=0.21, pruned_loss=0.03156, over 972573.87 frames.], batch size: 17, lr: 1.81e-04 2022-05-07 11:43:10,434 INFO [train.py:715] (0/8) Epoch 12, batch 17950, loss[loss=0.1423, simple_loss=0.2125, pruned_loss=0.0361, over 4980.00 frames.], tot_loss[loss=0.1365, simple_loss=0.21, pruned_loss=0.03152, over 972808.57 frames.], batch size: 24, lr: 1.81e-04 2022-05-07 11:43:48,625 INFO [train.py:715] (0/8) Epoch 12, batch 18000, loss[loss=0.1547, simple_loss=0.2313, pruned_loss=0.03904, over 4906.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2096, pruned_loss=0.03157, over 972981.36 frames.], batch size: 17, lr: 1.81e-04 2022-05-07 11:43:48,627 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 11:43:58,182 INFO [train.py:742] (0/8) Epoch 12, validation: loss=0.106, simple_loss=0.19, pruned_loss=0.011, over 914524.00 frames. 
2022-05-07 11:44:36,606 INFO [train.py:715] (0/8) Epoch 12, batch 18050, loss[loss=0.1674, simple_loss=0.2407, pruned_loss=0.04707, over 4827.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2107, pruned_loss=0.03218, over 972138.49 frames.], batch size: 15, lr: 1.81e-04 2022-05-07 11:45:14,474 INFO [train.py:715] (0/8) Epoch 12, batch 18100, loss[loss=0.145, simple_loss=0.2152, pruned_loss=0.03737, over 4840.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2112, pruned_loss=0.03271, over 972990.10 frames.], batch size: 15, lr: 1.81e-04 2022-05-07 11:45:52,623 INFO [train.py:715] (0/8) Epoch 12, batch 18150, loss[loss=0.1217, simple_loss=0.1982, pruned_loss=0.02259, over 4755.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2106, pruned_loss=0.03228, over 972761.12 frames.], batch size: 16, lr: 1.81e-04 2022-05-07 11:46:30,448 INFO [train.py:715] (0/8) Epoch 12, batch 18200, loss[loss=0.1407, simple_loss=0.2174, pruned_loss=0.032, over 4750.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2105, pruned_loss=0.03221, over 972686.39 frames.], batch size: 19, lr: 1.81e-04 2022-05-07 11:47:08,248 INFO [train.py:715] (0/8) Epoch 12, batch 18250, loss[loss=0.1393, simple_loss=0.2067, pruned_loss=0.03595, over 4941.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2106, pruned_loss=0.03232, over 972099.06 frames.], batch size: 39, lr: 1.81e-04 2022-05-07 11:47:46,405 INFO [train.py:715] (0/8) Epoch 12, batch 18300, loss[loss=0.1189, simple_loss=0.2027, pruned_loss=0.01753, over 4899.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2111, pruned_loss=0.03217, over 972301.70 frames.], batch size: 29, lr: 1.81e-04 2022-05-07 11:48:24,290 INFO [train.py:715] (0/8) Epoch 12, batch 18350, loss[loss=0.1492, simple_loss=0.224, pruned_loss=0.03716, over 4984.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2104, pruned_loss=0.03172, over 971932.16 frames.], batch size: 35, lr: 1.81e-04 2022-05-07 11:49:02,247 INFO [train.py:715] (0/8) Epoch 12, batch 18400, loss[loss=0.1216, simple_loss=0.1913, pruned_loss=0.02591, over 4962.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2101, pruned_loss=0.03135, over 971894.82 frames.], batch size: 21, lr: 1.81e-04 2022-05-07 11:49:39,748 INFO [train.py:715] (0/8) Epoch 12, batch 18450, loss[loss=0.1077, simple_loss=0.1865, pruned_loss=0.01442, over 4930.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2108, pruned_loss=0.03173, over 971727.37 frames.], batch size: 29, lr: 1.81e-04 2022-05-07 11:50:17,834 INFO [train.py:715] (0/8) Epoch 12, batch 18500, loss[loss=0.136, simple_loss=0.2074, pruned_loss=0.03227, over 4952.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2112, pruned_loss=0.03209, over 971855.59 frames.], batch size: 35, lr: 1.81e-04 2022-05-07 11:50:55,663 INFO [train.py:715] (0/8) Epoch 12, batch 18550, loss[loss=0.1325, simple_loss=0.2118, pruned_loss=0.02659, over 4733.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2108, pruned_loss=0.03204, over 971339.63 frames.], batch size: 16, lr: 1.81e-04 2022-05-07 11:51:33,499 INFO [train.py:715] (0/8) Epoch 12, batch 18600, loss[loss=0.1685, simple_loss=0.2254, pruned_loss=0.05574, over 4889.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2102, pruned_loss=0.03206, over 971870.87 frames.], batch size: 16, lr: 1.81e-04 2022-05-07 11:52:11,110 INFO [train.py:715] (0/8) Epoch 12, batch 18650, loss[loss=0.1193, simple_loss=0.1952, pruned_loss=0.02167, over 4827.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2098, pruned_loss=0.0318, over 971744.37 frames.], batch size: 15, lr: 1.81e-04 2022-05-07 11:52:48,670 
INFO [train.py:715] (0/8) Epoch 12, batch 18700, loss[loss=0.1507, simple_loss=0.2257, pruned_loss=0.03781, over 4991.00 frames.], tot_loss[loss=0.137, simple_loss=0.2099, pruned_loss=0.03201, over 971652.39 frames.], batch size: 20, lr: 1.81e-04 2022-05-07 11:53:26,070 INFO [train.py:715] (0/8) Epoch 12, batch 18750, loss[loss=0.1313, simple_loss=0.2167, pruned_loss=0.02297, over 4741.00 frames.], tot_loss[loss=0.1377, simple_loss=0.211, pruned_loss=0.03218, over 972333.39 frames.], batch size: 19, lr: 1.81e-04 2022-05-07 11:54:04,010 INFO [train.py:715] (0/8) Epoch 12, batch 18800, loss[loss=0.1558, simple_loss=0.2266, pruned_loss=0.04248, over 4960.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2111, pruned_loss=0.03204, over 972549.51 frames.], batch size: 35, lr: 1.81e-04 2022-05-07 11:54:41,887 INFO [train.py:715] (0/8) Epoch 12, batch 18850, loss[loss=0.1443, simple_loss=0.2055, pruned_loss=0.04158, over 4761.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2106, pruned_loss=0.032, over 972539.17 frames.], batch size: 19, lr: 1.81e-04 2022-05-07 11:55:19,704 INFO [train.py:715] (0/8) Epoch 12, batch 18900, loss[loss=0.1585, simple_loss=0.2527, pruned_loss=0.03212, over 4809.00 frames.], tot_loss[loss=0.138, simple_loss=0.2112, pruned_loss=0.03238, over 971856.45 frames.], batch size: 25, lr: 1.81e-04 2022-05-07 11:55:57,995 INFO [train.py:715] (0/8) Epoch 12, batch 18950, loss[loss=0.1246, simple_loss=0.2075, pruned_loss=0.02081, over 4959.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2104, pruned_loss=0.03211, over 972383.05 frames.], batch size: 21, lr: 1.81e-04 2022-05-07 11:56:35,805 INFO [train.py:715] (0/8) Epoch 12, batch 19000, loss[loss=0.1205, simple_loss=0.1937, pruned_loss=0.02371, over 4973.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2106, pruned_loss=0.03198, over 972302.99 frames.], batch size: 25, lr: 1.81e-04 2022-05-07 11:57:13,286 INFO [train.py:715] (0/8) Epoch 12, batch 19050, loss[loss=0.1511, simple_loss=0.2167, pruned_loss=0.04275, over 4860.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2106, pruned_loss=0.0319, over 973188.48 frames.], batch size: 32, lr: 1.80e-04 2022-05-07 11:57:50,567 INFO [train.py:715] (0/8) Epoch 12, batch 19100, loss[loss=0.1412, simple_loss=0.2026, pruned_loss=0.03988, over 4805.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2105, pruned_loss=0.03182, over 972989.46 frames.], batch size: 14, lr: 1.80e-04 2022-05-07 11:58:28,556 INFO [train.py:715] (0/8) Epoch 12, batch 19150, loss[loss=0.1291, simple_loss=0.1998, pruned_loss=0.02915, over 4867.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2109, pruned_loss=0.03194, over 972359.66 frames.], batch size: 39, lr: 1.80e-04 2022-05-07 11:59:07,190 INFO [train.py:715] (0/8) Epoch 12, batch 19200, loss[loss=0.1247, simple_loss=0.2039, pruned_loss=0.02273, over 4808.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2103, pruned_loss=0.03163, over 972731.15 frames.], batch size: 21, lr: 1.80e-04 2022-05-07 11:59:45,238 INFO [train.py:715] (0/8) Epoch 12, batch 19250, loss[loss=0.1228, simple_loss=0.1944, pruned_loss=0.02563, over 4839.00 frames.], tot_loss[loss=0.137, simple_loss=0.2104, pruned_loss=0.03177, over 973634.20 frames.], batch size: 30, lr: 1.80e-04 2022-05-07 12:00:23,718 INFO [train.py:715] (0/8) Epoch 12, batch 19300, loss[loss=0.1246, simple_loss=0.1956, pruned_loss=0.02678, over 4759.00 frames.], tot_loss[loss=0.137, simple_loss=0.2103, pruned_loss=0.03188, over 973002.15 frames.], batch size: 14, lr: 1.80e-04 2022-05-07 12:01:01,908 INFO [train.py:715] (0/8) 
Epoch 12, batch 19350, loss[loss=0.132, simple_loss=0.203, pruned_loss=0.03044, over 4865.00 frames.], tot_loss[loss=0.138, simple_loss=0.2113, pruned_loss=0.03239, over 972524.58 frames.], batch size: 22, lr: 1.80e-04 2022-05-07 12:01:39,910 INFO [train.py:715] (0/8) Epoch 12, batch 19400, loss[loss=0.1205, simple_loss=0.1959, pruned_loss=0.02252, over 4789.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2105, pruned_loss=0.03201, over 972021.39 frames.], batch size: 14, lr: 1.80e-04 2022-05-07 12:02:17,948 INFO [train.py:715] (0/8) Epoch 12, batch 19450, loss[loss=0.1267, simple_loss=0.1935, pruned_loss=0.02995, over 4974.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2098, pruned_loss=0.0316, over 971789.44 frames.], batch size: 24, lr: 1.80e-04 2022-05-07 12:02:56,768 INFO [train.py:715] (0/8) Epoch 12, batch 19500, loss[loss=0.129, simple_loss=0.199, pruned_loss=0.02952, over 4836.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2104, pruned_loss=0.03185, over 972261.68 frames.], batch size: 20, lr: 1.80e-04 2022-05-07 12:03:35,593 INFO [train.py:715] (0/8) Epoch 12, batch 19550, loss[loss=0.1438, simple_loss=0.2181, pruned_loss=0.03478, over 4799.00 frames.], tot_loss[loss=0.1375, simple_loss=0.211, pruned_loss=0.03197, over 972854.11 frames.], batch size: 21, lr: 1.80e-04 2022-05-07 12:04:14,315 INFO [train.py:715] (0/8) Epoch 12, batch 19600, loss[loss=0.1694, simple_loss=0.2438, pruned_loss=0.04751, over 4925.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2109, pruned_loss=0.03201, over 972400.67 frames.], batch size: 29, lr: 1.80e-04 2022-05-07 12:04:53,460 INFO [train.py:715] (0/8) Epoch 12, batch 19650, loss[loss=0.1199, simple_loss=0.1934, pruned_loss=0.0232, over 4916.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2108, pruned_loss=0.03213, over 972786.06 frames.], batch size: 23, lr: 1.80e-04 2022-05-07 12:05:32,627 INFO [train.py:715] (0/8) Epoch 12, batch 19700, loss[loss=0.1418, simple_loss=0.2105, pruned_loss=0.0365, over 4936.00 frames.], tot_loss[loss=0.1382, simple_loss=0.2115, pruned_loss=0.03244, over 972231.45 frames.], batch size: 23, lr: 1.80e-04 2022-05-07 12:06:12,004 INFO [train.py:715] (0/8) Epoch 12, batch 19750, loss[loss=0.1338, simple_loss=0.2038, pruned_loss=0.03189, over 4828.00 frames.], tot_loss[loss=0.1395, simple_loss=0.2127, pruned_loss=0.0331, over 971769.21 frames.], batch size: 15, lr: 1.80e-04 2022-05-07 12:06:52,638 INFO [train.py:715] (0/8) Epoch 12, batch 19800, loss[loss=0.1457, simple_loss=0.2379, pruned_loss=0.02673, over 4780.00 frames.], tot_loss[loss=0.1387, simple_loss=0.212, pruned_loss=0.03276, over 971491.03 frames.], batch size: 18, lr: 1.80e-04 2022-05-07 12:07:33,083 INFO [train.py:715] (0/8) Epoch 12, batch 19850, loss[loss=0.1185, simple_loss=0.1994, pruned_loss=0.01883, over 4754.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2108, pruned_loss=0.03185, over 970591.16 frames.], batch size: 19, lr: 1.80e-04 2022-05-07 12:08:14,251 INFO [train.py:715] (0/8) Epoch 12, batch 19900, loss[loss=0.135, simple_loss=0.2165, pruned_loss=0.02669, over 4898.00 frames.], tot_loss[loss=0.138, simple_loss=0.2116, pruned_loss=0.0322, over 970773.38 frames.], batch size: 22, lr: 1.80e-04 2022-05-07 12:08:54,591 INFO [train.py:715] (0/8) Epoch 12, batch 19950, loss[loss=0.124, simple_loss=0.2005, pruned_loss=0.02371, over 4985.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2109, pruned_loss=0.03186, over 970363.37 frames.], batch size: 14, lr: 1.80e-04 2022-05-07 12:09:35,203 INFO [train.py:715] (0/8) Epoch 12, batch 20000, 
loss[loss=0.1275, simple_loss=0.2034, pruned_loss=0.02579, over 4796.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2101, pruned_loss=0.03166, over 970975.70 frames.], batch size: 21, lr: 1.80e-04 2022-05-07 12:10:15,439 INFO [train.py:715] (0/8) Epoch 12, batch 20050, loss[loss=0.1455, simple_loss=0.2268, pruned_loss=0.03215, over 4964.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2102, pruned_loss=0.03163, over 971192.53 frames.], batch size: 24, lr: 1.80e-04 2022-05-07 12:10:55,687 INFO [train.py:715] (0/8) Epoch 12, batch 20100, loss[loss=0.1616, simple_loss=0.2298, pruned_loss=0.04671, over 4978.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2098, pruned_loss=0.03126, over 971226.64 frames.], batch size: 14, lr: 1.80e-04 2022-05-07 12:11:35,651 INFO [train.py:715] (0/8) Epoch 12, batch 20150, loss[loss=0.1509, simple_loss=0.2117, pruned_loss=0.04501, over 4856.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2093, pruned_loss=0.03123, over 971315.31 frames.], batch size: 12, lr: 1.80e-04 2022-05-07 12:12:16,042 INFO [train.py:715] (0/8) Epoch 12, batch 20200, loss[loss=0.1682, simple_loss=0.2326, pruned_loss=0.05195, over 4923.00 frames.], tot_loss[loss=0.1363, simple_loss=0.21, pruned_loss=0.03131, over 971600.95 frames.], batch size: 18, lr: 1.80e-04 2022-05-07 12:12:56,153 INFO [train.py:715] (0/8) Epoch 12, batch 20250, loss[loss=0.1181, simple_loss=0.1941, pruned_loss=0.02101, over 4763.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2097, pruned_loss=0.03126, over 971491.53 frames.], batch size: 19, lr: 1.80e-04 2022-05-07 12:13:36,207 INFO [train.py:715] (0/8) Epoch 12, batch 20300, loss[loss=0.1182, simple_loss=0.2041, pruned_loss=0.01618, over 4986.00 frames.], tot_loss[loss=0.138, simple_loss=0.2117, pruned_loss=0.03216, over 971852.34 frames.], batch size: 20, lr: 1.80e-04 2022-05-07 12:14:16,798 INFO [train.py:715] (0/8) Epoch 12, batch 20350, loss[loss=0.1385, simple_loss=0.2147, pruned_loss=0.03118, over 4800.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2102, pruned_loss=0.03162, over 970875.78 frames.], batch size: 24, lr: 1.80e-04 2022-05-07 12:14:56,465 INFO [train.py:715] (0/8) Epoch 12, batch 20400, loss[loss=0.1227, simple_loss=0.2057, pruned_loss=0.01988, over 4794.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2104, pruned_loss=0.03172, over 972014.03 frames.], batch size: 17, lr: 1.80e-04 2022-05-07 12:15:36,227 INFO [train.py:715] (0/8) Epoch 12, batch 20450, loss[loss=0.132, simple_loss=0.2089, pruned_loss=0.0275, over 4929.00 frames.], tot_loss[loss=0.1366, simple_loss=0.21, pruned_loss=0.03155, over 971822.67 frames.], batch size: 23, lr: 1.80e-04 2022-05-07 12:16:15,874 INFO [train.py:715] (0/8) Epoch 12, batch 20500, loss[loss=0.1422, simple_loss=0.2189, pruned_loss=0.03279, over 4741.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2094, pruned_loss=0.03141, over 971350.08 frames.], batch size: 16, lr: 1.80e-04 2022-05-07 12:16:56,320 INFO [train.py:715] (0/8) Epoch 12, batch 20550, loss[loss=0.1209, simple_loss=0.1967, pruned_loss=0.02257, over 4808.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2106, pruned_loss=0.03192, over 971234.91 frames.], batch size: 26, lr: 1.80e-04 2022-05-07 12:17:36,236 INFO [train.py:715] (0/8) Epoch 12, batch 20600, loss[loss=0.1333, simple_loss=0.2024, pruned_loss=0.0321, over 4777.00 frames.], tot_loss[loss=0.137, simple_loss=0.2103, pruned_loss=0.0318, over 970587.63 frames.], batch size: 17, lr: 1.80e-04 2022-05-07 12:18:15,186 INFO [train.py:715] (0/8) Epoch 12, batch 20650, loss[loss=0.1329, 
simple_loss=0.2028, pruned_loss=0.03144, over 4757.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2109, pruned_loss=0.03195, over 971588.40 frames.], batch size: 19, lr: 1.80e-04 2022-05-07 12:18:54,294 INFO [train.py:715] (0/8) Epoch 12, batch 20700, loss[loss=0.1529, simple_loss=0.2251, pruned_loss=0.04034, over 4744.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2113, pruned_loss=0.03181, over 971672.39 frames.], batch size: 19, lr: 1.80e-04 2022-05-07 12:19:32,270 INFO [train.py:715] (0/8) Epoch 12, batch 20750, loss[loss=0.1222, simple_loss=0.2004, pruned_loss=0.02206, over 4767.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2109, pruned_loss=0.03158, over 971907.23 frames.], batch size: 18, lr: 1.80e-04 2022-05-07 12:20:10,574 INFO [train.py:715] (0/8) Epoch 12, batch 20800, loss[loss=0.165, simple_loss=0.2259, pruned_loss=0.05208, over 4792.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2109, pruned_loss=0.03221, over 971712.73 frames.], batch size: 24, lr: 1.80e-04 2022-05-07 12:20:48,324 INFO [train.py:715] (0/8) Epoch 12, batch 20850, loss[loss=0.1244, simple_loss=0.1955, pruned_loss=0.02663, over 4744.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2106, pruned_loss=0.03222, over 971338.80 frames.], batch size: 12, lr: 1.80e-04 2022-05-07 12:21:26,467 INFO [train.py:715] (0/8) Epoch 12, batch 20900, loss[loss=0.1152, simple_loss=0.1938, pruned_loss=0.01833, over 4910.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2102, pruned_loss=0.03196, over 971257.69 frames.], batch size: 19, lr: 1.80e-04 2022-05-07 12:22:04,009 INFO [train.py:715] (0/8) Epoch 12, batch 20950, loss[loss=0.1408, simple_loss=0.2143, pruned_loss=0.03363, over 4776.00 frames.], tot_loss[loss=0.1379, simple_loss=0.211, pruned_loss=0.03246, over 970779.93 frames.], batch size: 14, lr: 1.80e-04 2022-05-07 12:22:41,372 INFO [train.py:715] (0/8) Epoch 12, batch 21000, loss[loss=0.1307, simple_loss=0.213, pruned_loss=0.02416, over 4892.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2103, pruned_loss=0.03214, over 971517.27 frames.], batch size: 19, lr: 1.80e-04 2022-05-07 12:22:41,373 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 12:22:50,900 INFO [train.py:742] (0/8) Epoch 12, validation: loss=0.1056, simple_loss=0.1896, pruned_loss=0.01081, over 914524.00 frames. 
2022-05-07 12:23:28,719 INFO [train.py:715] (0/8) Epoch 12, batch 21050, loss[loss=0.1299, simple_loss=0.2128, pruned_loss=0.02354, over 4929.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2102, pruned_loss=0.03201, over 971633.77 frames.], batch size: 21, lr: 1.80e-04 2022-05-07 12:24:06,824 INFO [train.py:715] (0/8) Epoch 12, batch 21100, loss[loss=0.161, simple_loss=0.2308, pruned_loss=0.04563, over 4763.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2112, pruned_loss=0.03225, over 972212.76 frames.], batch size: 19, lr: 1.80e-04 2022-05-07 12:24:44,625 INFO [train.py:715] (0/8) Epoch 12, batch 21150, loss[loss=0.1358, simple_loss=0.2095, pruned_loss=0.03108, over 4779.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2104, pruned_loss=0.0316, over 971779.57 frames.], batch size: 14, lr: 1.80e-04 2022-05-07 12:25:22,417 INFO [train.py:715] (0/8) Epoch 12, batch 21200, loss[loss=0.1387, simple_loss=0.2049, pruned_loss=0.03619, over 4968.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2106, pruned_loss=0.03156, over 972094.66 frames.], batch size: 15, lr: 1.80e-04 2022-05-07 12:26:00,690 INFO [train.py:715] (0/8) Epoch 12, batch 21250, loss[loss=0.1476, simple_loss=0.2169, pruned_loss=0.03915, over 4933.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2105, pruned_loss=0.03152, over 971639.95 frames.], batch size: 18, lr: 1.80e-04 2022-05-07 12:26:39,513 INFO [train.py:715] (0/8) Epoch 12, batch 21300, loss[loss=0.1292, simple_loss=0.1996, pruned_loss=0.0294, over 4845.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2108, pruned_loss=0.03216, over 971395.00 frames.], batch size: 20, lr: 1.80e-04 2022-05-07 12:27:17,262 INFO [train.py:715] (0/8) Epoch 12, batch 21350, loss[loss=0.1412, simple_loss=0.2211, pruned_loss=0.03064, over 4804.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2116, pruned_loss=0.03233, over 971251.37 frames.], batch size: 21, lr: 1.80e-04 2022-05-07 12:27:56,355 INFO [train.py:715] (0/8) Epoch 12, batch 21400, loss[loss=0.1252, simple_loss=0.1992, pruned_loss=0.0256, over 4766.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2112, pruned_loss=0.03223, over 971640.45 frames.], batch size: 19, lr: 1.80e-04 2022-05-07 12:28:35,909 INFO [train.py:715] (0/8) Epoch 12, batch 21450, loss[loss=0.1461, simple_loss=0.2122, pruned_loss=0.04003, over 4939.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2107, pruned_loss=0.03232, over 972720.89 frames.], batch size: 18, lr: 1.80e-04 2022-05-07 12:29:14,507 INFO [train.py:715] (0/8) Epoch 12, batch 21500, loss[loss=0.1084, simple_loss=0.1825, pruned_loss=0.01716, over 4774.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2103, pruned_loss=0.03217, over 972843.58 frames.], batch size: 12, lr: 1.80e-04 2022-05-07 12:29:53,096 INFO [train.py:715] (0/8) Epoch 12, batch 21550, loss[loss=0.1491, simple_loss=0.2372, pruned_loss=0.03055, over 4789.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2104, pruned_loss=0.03225, over 972798.21 frames.], batch size: 21, lr: 1.80e-04 2022-05-07 12:30:31,271 INFO [train.py:715] (0/8) Epoch 12, batch 21600, loss[loss=0.138, simple_loss=0.2103, pruned_loss=0.0329, over 4837.00 frames.], tot_loss[loss=0.137, simple_loss=0.2102, pruned_loss=0.03196, over 972006.51 frames.], batch size: 15, lr: 1.80e-04 2022-05-07 12:31:09,731 INFO [train.py:715] (0/8) Epoch 12, batch 21650, loss[loss=0.1198, simple_loss=0.1879, pruned_loss=0.02584, over 4881.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2101, pruned_loss=0.0322, over 972365.05 frames.], batch size: 16, lr: 1.80e-04 2022-05-07 12:31:46,932 
INFO [train.py:715] (0/8) Epoch 12, batch 21700, loss[loss=0.1471, simple_loss=0.2266, pruned_loss=0.03382, over 4828.00 frames.], tot_loss[loss=0.1374, simple_loss=0.211, pruned_loss=0.03193, over 971937.63 frames.], batch size: 15, lr: 1.80e-04 2022-05-07 12:32:25,493 INFO [train.py:715] (0/8) Epoch 12, batch 21750, loss[loss=0.113, simple_loss=0.1755, pruned_loss=0.02524, over 4986.00 frames.], tot_loss[loss=0.1373, simple_loss=0.211, pruned_loss=0.03184, over 972376.60 frames.], batch size: 14, lr: 1.80e-04 2022-05-07 12:33:04,216 INFO [train.py:715] (0/8) Epoch 12, batch 21800, loss[loss=0.134, simple_loss=0.2136, pruned_loss=0.02717, over 4974.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2106, pruned_loss=0.03182, over 971826.10 frames.], batch size: 24, lr: 1.80e-04 2022-05-07 12:33:42,108 INFO [train.py:715] (0/8) Epoch 12, batch 21850, loss[loss=0.106, simple_loss=0.1773, pruned_loss=0.01741, over 4853.00 frames.], tot_loss[loss=0.1366, simple_loss=0.21, pruned_loss=0.03165, over 972057.42 frames.], batch size: 32, lr: 1.80e-04 2022-05-07 12:34:19,724 INFO [train.py:715] (0/8) Epoch 12, batch 21900, loss[loss=0.1392, simple_loss=0.2174, pruned_loss=0.03049, over 4860.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2107, pruned_loss=0.03215, over 972765.40 frames.], batch size: 32, lr: 1.80e-04 2022-05-07 12:34:58,472 INFO [train.py:715] (0/8) Epoch 12, batch 21950, loss[loss=0.1157, simple_loss=0.1924, pruned_loss=0.01953, over 4845.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2096, pruned_loss=0.03111, over 972987.96 frames.], batch size: 30, lr: 1.80e-04 2022-05-07 12:35:37,474 INFO [train.py:715] (0/8) Epoch 12, batch 22000, loss[loss=0.1404, simple_loss=0.2082, pruned_loss=0.03632, over 4916.00 frames.], tot_loss[loss=0.136, simple_loss=0.2096, pruned_loss=0.03121, over 972170.05 frames.], batch size: 17, lr: 1.80e-04 2022-05-07 12:36:15,708 INFO [train.py:715] (0/8) Epoch 12, batch 22050, loss[loss=0.1327, simple_loss=0.2082, pruned_loss=0.02858, over 4988.00 frames.], tot_loss[loss=0.1353, simple_loss=0.209, pruned_loss=0.03078, over 972913.03 frames.], batch size: 28, lr: 1.80e-04 2022-05-07 12:36:54,703 INFO [train.py:715] (0/8) Epoch 12, batch 22100, loss[loss=0.1566, simple_loss=0.2214, pruned_loss=0.04585, over 4959.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2087, pruned_loss=0.03081, over 972098.58 frames.], batch size: 39, lr: 1.80e-04 2022-05-07 12:37:33,656 INFO [train.py:715] (0/8) Epoch 12, batch 22150, loss[loss=0.101, simple_loss=0.1743, pruned_loss=0.01383, over 4828.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2086, pruned_loss=0.03075, over 972212.56 frames.], batch size: 13, lr: 1.80e-04 2022-05-07 12:38:11,931 INFO [train.py:715] (0/8) Epoch 12, batch 22200, loss[loss=0.1369, simple_loss=0.2199, pruned_loss=0.02692, over 4889.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2092, pruned_loss=0.03096, over 972573.44 frames.], batch size: 19, lr: 1.80e-04 2022-05-07 12:38:49,697 INFO [train.py:715] (0/8) Epoch 12, batch 22250, loss[loss=0.1255, simple_loss=0.2019, pruned_loss=0.02457, over 4956.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2098, pruned_loss=0.03148, over 973391.14 frames.], batch size: 15, lr: 1.80e-04 2022-05-07 12:39:18,277 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-440000.pt 2022-05-07 12:39:30,398 INFO [train.py:715] (0/8) Epoch 12, batch 22300, loss[loss=0.1702, simple_loss=0.2289, pruned_loss=0.05577, over 4979.00 frames.], tot_loss[loss=0.1369, 
simple_loss=0.2106, pruned_loss=0.03159, over 973661.65 frames.], batch size: 14, lr: 1.80e-04 2022-05-07 12:40:08,649 INFO [train.py:715] (0/8) Epoch 12, batch 22350, loss[loss=0.1004, simple_loss=0.1733, pruned_loss=0.0137, over 4834.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2102, pruned_loss=0.03126, over 973677.54 frames.], batch size: 26, lr: 1.80e-04 2022-05-07 12:40:46,756 INFO [train.py:715] (0/8) Epoch 12, batch 22400, loss[loss=0.1265, simple_loss=0.2018, pruned_loss=0.02561, over 4694.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2102, pruned_loss=0.03123, over 972798.68 frames.], batch size: 15, lr: 1.80e-04 2022-05-07 12:41:25,340 INFO [train.py:715] (0/8) Epoch 12, batch 22450, loss[loss=0.1384, simple_loss=0.2165, pruned_loss=0.03011, over 4898.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2105, pruned_loss=0.03146, over 973255.46 frames.], batch size: 17, lr: 1.80e-04 2022-05-07 12:42:03,781 INFO [train.py:715] (0/8) Epoch 12, batch 22500, loss[loss=0.1246, simple_loss=0.1984, pruned_loss=0.02538, over 4845.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2111, pruned_loss=0.03194, over 973029.95 frames.], batch size: 30, lr: 1.80e-04 2022-05-07 12:42:42,484 INFO [train.py:715] (0/8) Epoch 12, batch 22550, loss[loss=0.1356, simple_loss=0.2118, pruned_loss=0.02964, over 4828.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2107, pruned_loss=0.03192, over 972802.00 frames.], batch size: 27, lr: 1.80e-04 2022-05-07 12:43:20,632 INFO [train.py:715] (0/8) Epoch 12, batch 22600, loss[loss=0.1408, simple_loss=0.2128, pruned_loss=0.03437, over 4953.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2097, pruned_loss=0.03134, over 972427.58 frames.], batch size: 24, lr: 1.80e-04 2022-05-07 12:43:58,687 INFO [train.py:715] (0/8) Epoch 12, batch 22650, loss[loss=0.1177, simple_loss=0.1918, pruned_loss=0.02185, over 4929.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2093, pruned_loss=0.03105, over 972260.84 frames.], batch size: 21, lr: 1.80e-04 2022-05-07 12:44:36,591 INFO [train.py:715] (0/8) Epoch 12, batch 22700, loss[loss=0.1682, simple_loss=0.2373, pruned_loss=0.04957, over 4876.00 frames.], tot_loss[loss=0.1366, simple_loss=0.21, pruned_loss=0.0316, over 971890.06 frames.], batch size: 22, lr: 1.80e-04 2022-05-07 12:45:14,813 INFO [train.py:715] (0/8) Epoch 12, batch 22750, loss[loss=0.1554, simple_loss=0.2241, pruned_loss=0.04336, over 4776.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2098, pruned_loss=0.03151, over 973107.73 frames.], batch size: 17, lr: 1.80e-04 2022-05-07 12:45:53,316 INFO [train.py:715] (0/8) Epoch 12, batch 22800, loss[loss=0.1212, simple_loss=0.2007, pruned_loss=0.02084, over 4941.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2102, pruned_loss=0.03166, over 973005.87 frames.], batch size: 21, lr: 1.80e-04 2022-05-07 12:46:32,310 INFO [train.py:715] (0/8) Epoch 12, batch 22850, loss[loss=0.1674, simple_loss=0.2436, pruned_loss=0.04562, over 4856.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2107, pruned_loss=0.03195, over 972194.65 frames.], batch size: 30, lr: 1.80e-04 2022-05-07 12:47:10,522 INFO [train.py:715] (0/8) Epoch 12, batch 22900, loss[loss=0.1519, simple_loss=0.218, pruned_loss=0.04294, over 4878.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2101, pruned_loss=0.03157, over 972291.24 frames.], batch size: 16, lr: 1.80e-04 2022-05-07 12:47:48,401 INFO [train.py:715] (0/8) Epoch 12, batch 22950, loss[loss=0.1591, simple_loss=0.2317, pruned_loss=0.0433, over 4907.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2093, 
pruned_loss=0.03143, over 972421.95 frames.], batch size: 18, lr: 1.80e-04 2022-05-07 12:48:26,678 INFO [train.py:715] (0/8) Epoch 12, batch 23000, loss[loss=0.1184, simple_loss=0.1873, pruned_loss=0.02473, over 4815.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2083, pruned_loss=0.03094, over 973207.20 frames.], batch size: 27, lr: 1.80e-04 2022-05-07 12:49:04,948 INFO [train.py:715] (0/8) Epoch 12, batch 23050, loss[loss=0.1543, simple_loss=0.2297, pruned_loss=0.03949, over 4926.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2083, pruned_loss=0.03079, over 973311.94 frames.], batch size: 39, lr: 1.80e-04 2022-05-07 12:49:43,056 INFO [train.py:715] (0/8) Epoch 12, batch 23100, loss[loss=0.1266, simple_loss=0.1988, pruned_loss=0.02718, over 4826.00 frames.], tot_loss[loss=0.135, simple_loss=0.2083, pruned_loss=0.03084, over 972877.20 frames.], batch size: 26, lr: 1.80e-04 2022-05-07 12:50:21,951 INFO [train.py:715] (0/8) Epoch 12, batch 23150, loss[loss=0.1418, simple_loss=0.2154, pruned_loss=0.03411, over 4920.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2081, pruned_loss=0.03064, over 972596.52 frames.], batch size: 18, lr: 1.80e-04 2022-05-07 12:51:01,021 INFO [train.py:715] (0/8) Epoch 12, batch 23200, loss[loss=0.1334, simple_loss=0.2064, pruned_loss=0.03014, over 4786.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2082, pruned_loss=0.03069, over 972572.04 frames.], batch size: 14, lr: 1.80e-04 2022-05-07 12:51:39,414 INFO [train.py:715] (0/8) Epoch 12, batch 23250, loss[loss=0.1293, simple_loss=0.2059, pruned_loss=0.02632, over 4970.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2088, pruned_loss=0.03075, over 972846.70 frames.], batch size: 24, lr: 1.80e-04 2022-05-07 12:52:17,107 INFO [train.py:715] (0/8) Epoch 12, batch 23300, loss[loss=0.1172, simple_loss=0.1987, pruned_loss=0.01783, over 4804.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2088, pruned_loss=0.03053, over 972674.75 frames.], batch size: 21, lr: 1.80e-04 2022-05-07 12:52:55,807 INFO [train.py:715] (0/8) Epoch 12, batch 23350, loss[loss=0.135, simple_loss=0.2128, pruned_loss=0.02858, over 4845.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2093, pruned_loss=0.03094, over 972596.27 frames.], batch size: 34, lr: 1.80e-04 2022-05-07 12:53:33,835 INFO [train.py:715] (0/8) Epoch 12, batch 23400, loss[loss=0.1337, simple_loss=0.211, pruned_loss=0.02819, over 4854.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2106, pruned_loss=0.03155, over 973315.72 frames.], batch size: 30, lr: 1.80e-04 2022-05-07 12:54:11,384 INFO [train.py:715] (0/8) Epoch 12, batch 23450, loss[loss=0.1373, simple_loss=0.2008, pruned_loss=0.03687, over 4855.00 frames.], tot_loss[loss=0.137, simple_loss=0.2107, pruned_loss=0.03162, over 972908.80 frames.], batch size: 20, lr: 1.80e-04 2022-05-07 12:54:49,569 INFO [train.py:715] (0/8) Epoch 12, batch 23500, loss[loss=0.1705, simple_loss=0.2451, pruned_loss=0.04797, over 4928.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2111, pruned_loss=0.03181, over 973059.20 frames.], batch size: 29, lr: 1.80e-04 2022-05-07 12:55:28,409 INFO [train.py:715] (0/8) Epoch 12, batch 23550, loss[loss=0.1358, simple_loss=0.2069, pruned_loss=0.03239, over 4871.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2112, pruned_loss=0.03194, over 973143.47 frames.], batch size: 22, lr: 1.80e-04 2022-05-07 12:56:07,097 INFO [train.py:715] (0/8) Epoch 12, batch 23600, loss[loss=0.1356, simple_loss=0.2048, pruned_loss=0.0332, over 4798.00 frames.], tot_loss[loss=0.138, simple_loss=0.2117, pruned_loss=0.03219, 
over 972391.51 frames.], batch size: 24, lr: 1.80e-04 2022-05-07 12:56:45,796 INFO [train.py:715] (0/8) Epoch 12, batch 23650, loss[loss=0.1386, simple_loss=0.2164, pruned_loss=0.03041, over 4836.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2111, pruned_loss=0.03199, over 971427.98 frames.], batch size: 15, lr: 1.80e-04 2022-05-07 12:57:24,206 INFO [train.py:715] (0/8) Epoch 12, batch 23700, loss[loss=0.133, simple_loss=0.2088, pruned_loss=0.02858, over 4893.00 frames.], tot_loss[loss=0.1375, simple_loss=0.211, pruned_loss=0.03197, over 971404.43 frames.], batch size: 16, lr: 1.80e-04 2022-05-07 12:58:02,486 INFO [train.py:715] (0/8) Epoch 12, batch 23750, loss[loss=0.1149, simple_loss=0.1917, pruned_loss=0.01909, over 4981.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2105, pruned_loss=0.03165, over 972033.84 frames.], batch size: 15, lr: 1.80e-04 2022-05-07 12:58:41,201 INFO [train.py:715] (0/8) Epoch 12, batch 23800, loss[loss=0.1782, simple_loss=0.2518, pruned_loss=0.05224, over 4831.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2097, pruned_loss=0.03163, over 971570.87 frames.], batch size: 15, lr: 1.80e-04 2022-05-07 12:59:20,119 INFO [train.py:715] (0/8) Epoch 12, batch 23850, loss[loss=0.1242, simple_loss=0.1921, pruned_loss=0.0281, over 4759.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2097, pruned_loss=0.03184, over 972299.49 frames.], batch size: 18, lr: 1.80e-04 2022-05-07 12:59:59,696 INFO [train.py:715] (0/8) Epoch 12, batch 23900, loss[loss=0.1434, simple_loss=0.2186, pruned_loss=0.03412, over 4934.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2103, pruned_loss=0.03193, over 972705.85 frames.], batch size: 23, lr: 1.80e-04 2022-05-07 13:00:39,412 INFO [train.py:715] (0/8) Epoch 12, batch 23950, loss[loss=0.1531, simple_loss=0.2318, pruned_loss=0.03713, over 4881.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2112, pruned_loss=0.03247, over 973158.46 frames.], batch size: 22, lr: 1.79e-04 2022-05-07 13:01:18,251 INFO [train.py:715] (0/8) Epoch 12, batch 24000, loss[loss=0.1116, simple_loss=0.1911, pruned_loss=0.01606, over 4796.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2099, pruned_loss=0.03166, over 972388.68 frames.], batch size: 21, lr: 1.79e-04 2022-05-07 13:01:18,252 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 13:01:27,803 INFO [train.py:742] (0/8) Epoch 12, validation: loss=0.1054, simple_loss=0.1895, pruned_loss=0.01071, over 914524.00 frames. 
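Validation passes like the two above (batch 21000 and batch 24000, both reporting a loss of roughly 0.105 over 914524.00 frames) recur at regular intervals in this log. A minimal sketch for pulling those entries out of a saved log for plotting; the regex simply mirrors the "Epoch N, validation: ..." format shown here, and the file name is a placeholder:

import re

# Placeholder path; point this at the saved training log.
LOG_PATH = "train-log.txt"

# Mirrors the "Epoch 12, validation: loss=..., simple_loss=..., pruned_loss=..." entries.
pattern = re.compile(
    r"Epoch (\d+), validation: loss=([\d.]+), simple_loss=([\d.]+), pruned_loss=([\d.]+)"
)

points = []
with open(LOG_PATH) as f:
    for line in f:
        # finditer, because one physical line may hold several log entries here.
        for m in pattern.finditer(line):
            epoch, loss, simple, pruned = m.groups()
            points.append((int(epoch), float(loss), float(simple), float(pruned)))

for p in points:
    print(p)  # e.g. (12, 0.1056, 0.1896, 0.01081)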
2022-05-07 13:02:06,829 INFO [train.py:715] (0/8) Epoch 12, batch 24050, loss[loss=0.1698, simple_loss=0.2286, pruned_loss=0.05553, over 4994.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2094, pruned_loss=0.03163, over 972769.84 frames.], batch size: 15, lr: 1.79e-04 2022-05-07 13:02:47,350 INFO [train.py:715] (0/8) Epoch 12, batch 24100, loss[loss=0.1241, simple_loss=0.1947, pruned_loss=0.02668, over 4895.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2097, pruned_loss=0.03174, over 972683.75 frames.], batch size: 19, lr: 1.79e-04 2022-05-07 13:03:27,826 INFO [train.py:715] (0/8) Epoch 12, batch 24150, loss[loss=0.1258, simple_loss=0.209, pruned_loss=0.02125, over 4886.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2085, pruned_loss=0.03108, over 972727.46 frames.], batch size: 22, lr: 1.79e-04 2022-05-07 13:04:07,864 INFO [train.py:715] (0/8) Epoch 12, batch 24200, loss[loss=0.1049, simple_loss=0.1793, pruned_loss=0.01525, over 4858.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2087, pruned_loss=0.03131, over 972941.96 frames.], batch size: 20, lr: 1.79e-04 2022-05-07 13:04:47,987 INFO [train.py:715] (0/8) Epoch 12, batch 24250, loss[loss=0.1323, simple_loss=0.1982, pruned_loss=0.03322, over 4783.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2089, pruned_loss=0.03145, over 972521.00 frames.], batch size: 18, lr: 1.79e-04 2022-05-07 13:05:28,030 INFO [train.py:715] (0/8) Epoch 12, batch 24300, loss[loss=0.1512, simple_loss=0.2258, pruned_loss=0.03835, over 4880.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2094, pruned_loss=0.03165, over 972088.81 frames.], batch size: 22, lr: 1.79e-04 2022-05-07 13:06:07,756 INFO [train.py:715] (0/8) Epoch 12, batch 24350, loss[loss=0.1199, simple_loss=0.1921, pruned_loss=0.02383, over 4749.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2091, pruned_loss=0.03158, over 972522.22 frames.], batch size: 19, lr: 1.79e-04 2022-05-07 13:06:47,580 INFO [train.py:715] (0/8) Epoch 12, batch 24400, loss[loss=0.1169, simple_loss=0.1899, pruned_loss=0.0219, over 4969.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2092, pruned_loss=0.03181, over 971938.84 frames.], batch size: 24, lr: 1.79e-04 2022-05-07 13:07:27,539 INFO [train.py:715] (0/8) Epoch 12, batch 24450, loss[loss=0.1518, simple_loss=0.2214, pruned_loss=0.04111, over 4916.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2093, pruned_loss=0.03206, over 972056.96 frames.], batch size: 18, lr: 1.79e-04 2022-05-07 13:08:07,310 INFO [train.py:715] (0/8) Epoch 12, batch 24500, loss[loss=0.1367, simple_loss=0.2081, pruned_loss=0.03262, over 4986.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2089, pruned_loss=0.03192, over 972638.22 frames.], batch size: 31, lr: 1.79e-04 2022-05-07 13:08:46,551 INFO [train.py:715] (0/8) Epoch 12, batch 24550, loss[loss=0.1364, simple_loss=0.2184, pruned_loss=0.02719, over 4944.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2091, pruned_loss=0.03193, over 972028.35 frames.], batch size: 39, lr: 1.79e-04 2022-05-07 13:09:26,205 INFO [train.py:715] (0/8) Epoch 12, batch 24600, loss[loss=0.1485, simple_loss=0.22, pruned_loss=0.03848, over 4977.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2096, pruned_loss=0.03206, over 972519.15 frames.], batch size: 26, lr: 1.79e-04 2022-05-07 13:10:05,926 INFO [train.py:715] (0/8) Epoch 12, batch 24650, loss[loss=0.1714, simple_loss=0.2283, pruned_loss=0.05727, over 4972.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2095, pruned_loss=0.03199, over 972324.86 frames.], batch size: 35, lr: 1.79e-04 2022-05-07 
13:10:45,627 INFO [train.py:715] (0/8) Epoch 12, batch 24700, loss[loss=0.1316, simple_loss=0.2067, pruned_loss=0.02827, over 4835.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2105, pruned_loss=0.03241, over 972627.79 frames.], batch size: 15, lr: 1.79e-04 2022-05-07 13:11:24,793 INFO [train.py:715] (0/8) Epoch 12, batch 24750, loss[loss=0.1232, simple_loss=0.1949, pruned_loss=0.02571, over 4834.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2114, pruned_loss=0.03256, over 972458.13 frames.], batch size: 13, lr: 1.79e-04 2022-05-07 13:12:04,997 INFO [train.py:715] (0/8) Epoch 12, batch 24800, loss[loss=0.1654, simple_loss=0.2524, pruned_loss=0.03925, over 4900.00 frames.], tot_loss[loss=0.138, simple_loss=0.2115, pruned_loss=0.03229, over 972792.40 frames.], batch size: 19, lr: 1.79e-04 2022-05-07 13:12:44,862 INFO [train.py:715] (0/8) Epoch 12, batch 24850, loss[loss=0.1345, simple_loss=0.2015, pruned_loss=0.0337, over 4813.00 frames.], tot_loss[loss=0.1375, simple_loss=0.211, pruned_loss=0.032, over 973226.01 frames.], batch size: 13, lr: 1.79e-04 2022-05-07 13:13:24,106 INFO [train.py:715] (0/8) Epoch 12, batch 24900, loss[loss=0.1026, simple_loss=0.1708, pruned_loss=0.01725, over 4787.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2115, pruned_loss=0.03206, over 973367.63 frames.], batch size: 12, lr: 1.79e-04 2022-05-07 13:14:03,439 INFO [train.py:715] (0/8) Epoch 12, batch 24950, loss[loss=0.133, simple_loss=0.2166, pruned_loss=0.02469, over 4897.00 frames.], tot_loss[loss=0.137, simple_loss=0.2107, pruned_loss=0.03166, over 973604.55 frames.], batch size: 19, lr: 1.79e-04 2022-05-07 13:14:42,355 INFO [train.py:715] (0/8) Epoch 12, batch 25000, loss[loss=0.1436, simple_loss=0.2063, pruned_loss=0.04044, over 4780.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2102, pruned_loss=0.03168, over 973436.79 frames.], batch size: 14, lr: 1.79e-04 2022-05-07 13:15:20,409 INFO [train.py:715] (0/8) Epoch 12, batch 25050, loss[loss=0.1613, simple_loss=0.2245, pruned_loss=0.04905, over 4878.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2102, pruned_loss=0.03164, over 973240.16 frames.], batch size: 16, lr: 1.79e-04 2022-05-07 13:15:58,459 INFO [train.py:715] (0/8) Epoch 12, batch 25100, loss[loss=0.1155, simple_loss=0.1865, pruned_loss=0.02227, over 4867.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2109, pruned_loss=0.03182, over 973428.40 frames.], batch size: 16, lr: 1.79e-04 2022-05-07 13:16:36,851 INFO [train.py:715] (0/8) Epoch 12, batch 25150, loss[loss=0.1708, simple_loss=0.238, pruned_loss=0.05179, over 4792.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2108, pruned_loss=0.03134, over 973595.53 frames.], batch size: 14, lr: 1.79e-04 2022-05-07 13:17:15,103 INFO [train.py:715] (0/8) Epoch 12, batch 25200, loss[loss=0.1122, simple_loss=0.185, pruned_loss=0.01965, over 4759.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2103, pruned_loss=0.03129, over 973206.09 frames.], batch size: 18, lr: 1.79e-04 2022-05-07 13:17:52,722 INFO [train.py:715] (0/8) Epoch 12, batch 25250, loss[loss=0.1448, simple_loss=0.2245, pruned_loss=0.03249, over 4780.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2105, pruned_loss=0.03121, over 973209.36 frames.], batch size: 17, lr: 1.79e-04 2022-05-07 13:18:30,733 INFO [train.py:715] (0/8) Epoch 12, batch 25300, loss[loss=0.1196, simple_loss=0.1884, pruned_loss=0.02542, over 4823.00 frames.], tot_loss[loss=0.1368, simple_loss=0.211, pruned_loss=0.03127, over 972391.70 frames.], batch size: 26, lr: 1.79e-04 2022-05-07 13:19:08,815 INFO 
[train.py:715] (0/8) Epoch 12, batch 25350, loss[loss=0.1383, simple_loss=0.2087, pruned_loss=0.03396, over 4793.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2116, pruned_loss=0.03154, over 972576.67 frames.], batch size: 14, lr: 1.79e-04 2022-05-07 13:19:47,788 INFO [train.py:715] (0/8) Epoch 12, batch 25400, loss[loss=0.1547, simple_loss=0.2252, pruned_loss=0.04207, over 4869.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2111, pruned_loss=0.03178, over 972454.79 frames.], batch size: 38, lr: 1.79e-04 2022-05-07 13:20:26,679 INFO [train.py:715] (0/8) Epoch 12, batch 25450, loss[loss=0.1924, simple_loss=0.2585, pruned_loss=0.06319, over 4853.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2107, pruned_loss=0.03218, over 973150.90 frames.], batch size: 15, lr: 1.79e-04 2022-05-07 13:21:06,563 INFO [train.py:715] (0/8) Epoch 12, batch 25500, loss[loss=0.1259, simple_loss=0.1999, pruned_loss=0.02594, over 4769.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2108, pruned_loss=0.03222, over 972479.88 frames.], batch size: 18, lr: 1.79e-04 2022-05-07 13:21:45,629 INFO [train.py:715] (0/8) Epoch 12, batch 25550, loss[loss=0.1308, simple_loss=0.2063, pruned_loss=0.02767, over 4813.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2108, pruned_loss=0.03178, over 972631.42 frames.], batch size: 27, lr: 1.79e-04 2022-05-07 13:22:23,733 INFO [train.py:715] (0/8) Epoch 12, batch 25600, loss[loss=0.1212, simple_loss=0.203, pruned_loss=0.01976, over 4814.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2104, pruned_loss=0.03161, over 972151.50 frames.], batch size: 21, lr: 1.79e-04 2022-05-07 13:23:02,061 INFO [train.py:715] (0/8) Epoch 12, batch 25650, loss[loss=0.1401, simple_loss=0.2094, pruned_loss=0.03542, over 4826.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2094, pruned_loss=0.03089, over 972136.99 frames.], batch size: 25, lr: 1.79e-04 2022-05-07 13:23:40,754 INFO [train.py:715] (0/8) Epoch 12, batch 25700, loss[loss=0.1205, simple_loss=0.1941, pruned_loss=0.02347, over 4982.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2097, pruned_loss=0.0313, over 971156.68 frames.], batch size: 28, lr: 1.79e-04 2022-05-07 13:24:19,539 INFO [train.py:715] (0/8) Epoch 12, batch 25750, loss[loss=0.1444, simple_loss=0.2185, pruned_loss=0.03513, over 4875.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2096, pruned_loss=0.03104, over 970929.88 frames.], batch size: 16, lr: 1.79e-04 2022-05-07 13:24:58,007 INFO [train.py:715] (0/8) Epoch 12, batch 25800, loss[loss=0.1324, simple_loss=0.1994, pruned_loss=0.03269, over 4884.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2108, pruned_loss=0.0318, over 971633.81 frames.], batch size: 22, lr: 1.79e-04 2022-05-07 13:25:36,919 INFO [train.py:715] (0/8) Epoch 12, batch 25850, loss[loss=0.1973, simple_loss=0.249, pruned_loss=0.07279, over 4817.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2102, pruned_loss=0.03181, over 971425.96 frames.], batch size: 25, lr: 1.79e-04 2022-05-07 13:26:15,478 INFO [train.py:715] (0/8) Epoch 12, batch 25900, loss[loss=0.129, simple_loss=0.2071, pruned_loss=0.02548, over 4864.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2098, pruned_loss=0.03151, over 969954.18 frames.], batch size: 39, lr: 1.79e-04 2022-05-07 13:26:53,769 INFO [train.py:715] (0/8) Epoch 12, batch 25950, loss[loss=0.156, simple_loss=0.2198, pruned_loss=0.04615, over 4907.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2094, pruned_loss=0.03146, over 970342.33 frames.], batch size: 19, lr: 1.79e-04 2022-05-07 13:27:31,252 INFO [train.py:715] (0/8) 
Epoch 12, batch 26000, loss[loss=0.1143, simple_loss=0.197, pruned_loss=0.01579, over 4838.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2097, pruned_loss=0.03123, over 971906.20 frames.], batch size: 25, lr: 1.79e-04 2022-05-07 13:28:09,530 INFO [train.py:715] (0/8) Epoch 12, batch 26050, loss[loss=0.1387, simple_loss=0.2176, pruned_loss=0.02986, over 4878.00 frames.], tot_loss[loss=0.136, simple_loss=0.2095, pruned_loss=0.03123, over 971256.76 frames.], batch size: 22, lr: 1.79e-04 2022-05-07 13:28:48,384 INFO [train.py:715] (0/8) Epoch 12, batch 26100, loss[loss=0.136, simple_loss=0.21, pruned_loss=0.03099, over 4798.00 frames.], tot_loss[loss=0.136, simple_loss=0.2093, pruned_loss=0.03131, over 971691.91 frames.], batch size: 25, lr: 1.79e-04 2022-05-07 13:29:27,199 INFO [train.py:715] (0/8) Epoch 12, batch 26150, loss[loss=0.1631, simple_loss=0.2495, pruned_loss=0.03828, over 4780.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2093, pruned_loss=0.03119, over 972324.67 frames.], batch size: 17, lr: 1.79e-04 2022-05-07 13:30:06,151 INFO [train.py:715] (0/8) Epoch 12, batch 26200, loss[loss=0.1507, simple_loss=0.2291, pruned_loss=0.03613, over 4817.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2098, pruned_loss=0.03149, over 972444.37 frames.], batch size: 27, lr: 1.79e-04 2022-05-07 13:30:44,505 INFO [train.py:715] (0/8) Epoch 12, batch 26250, loss[loss=0.1514, simple_loss=0.2273, pruned_loss=0.03774, over 4782.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2104, pruned_loss=0.03173, over 972870.27 frames.], batch size: 18, lr: 1.79e-04 2022-05-07 13:31:23,010 INFO [train.py:715] (0/8) Epoch 12, batch 26300, loss[loss=0.1253, simple_loss=0.1941, pruned_loss=0.02825, over 4829.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2098, pruned_loss=0.0319, over 973242.16 frames.], batch size: 13, lr: 1.79e-04 2022-05-07 13:32:02,151 INFO [train.py:715] (0/8) Epoch 12, batch 26350, loss[loss=0.1333, simple_loss=0.2056, pruned_loss=0.03051, over 4982.00 frames.], tot_loss[loss=0.1363, simple_loss=0.209, pruned_loss=0.03177, over 972902.22 frames.], batch size: 15, lr: 1.79e-04 2022-05-07 13:32:40,218 INFO [train.py:715] (0/8) Epoch 12, batch 26400, loss[loss=0.1636, simple_loss=0.2417, pruned_loss=0.04274, over 4840.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2092, pruned_loss=0.03184, over 971905.79 frames.], batch size: 20, lr: 1.79e-04 2022-05-07 13:33:18,354 INFO [train.py:715] (0/8) Epoch 12, batch 26450, loss[loss=0.173, simple_loss=0.2484, pruned_loss=0.04886, over 4852.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2097, pruned_loss=0.03166, over 971156.47 frames.], batch size: 13, lr: 1.79e-04 2022-05-07 13:33:56,289 INFO [train.py:715] (0/8) Epoch 12, batch 26500, loss[loss=0.1282, simple_loss=0.2024, pruned_loss=0.02703, over 4929.00 frames.], tot_loss[loss=0.1366, simple_loss=0.21, pruned_loss=0.03163, over 971549.08 frames.], batch size: 23, lr: 1.79e-04 2022-05-07 13:34:34,590 INFO [train.py:715] (0/8) Epoch 12, batch 26550, loss[loss=0.1252, simple_loss=0.1998, pruned_loss=0.02532, over 4936.00 frames.], tot_loss[loss=0.1366, simple_loss=0.21, pruned_loss=0.03164, over 971702.95 frames.], batch size: 29, lr: 1.79e-04 2022-05-07 13:35:12,932 INFO [train.py:715] (0/8) Epoch 12, batch 26600, loss[loss=0.1381, simple_loss=0.2179, pruned_loss=0.02914, over 4824.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2102, pruned_loss=0.0321, over 971816.07 frames.], batch size: 26, lr: 1.79e-04 2022-05-07 13:35:51,453 INFO [train.py:715] (0/8) Epoch 12, batch 26650, 
loss[loss=0.1453, simple_loss=0.2174, pruned_loss=0.03657, over 4979.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2099, pruned_loss=0.03197, over 972339.15 frames.], batch size: 15, lr: 1.79e-04 2022-05-07 13:36:30,039 INFO [train.py:715] (0/8) Epoch 12, batch 26700, loss[loss=0.1578, simple_loss=0.2381, pruned_loss=0.03878, over 4694.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2102, pruned_loss=0.03203, over 972552.17 frames.], batch size: 15, lr: 1.79e-04 2022-05-07 13:37:08,434 INFO [train.py:715] (0/8) Epoch 12, batch 26750, loss[loss=0.1222, simple_loss=0.1963, pruned_loss=0.02403, over 4765.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2092, pruned_loss=0.03155, over 972873.75 frames.], batch size: 18, lr: 1.79e-04 2022-05-07 13:37:47,905 INFO [train.py:715] (0/8) Epoch 12, batch 26800, loss[loss=0.1146, simple_loss=0.19, pruned_loss=0.0196, over 4911.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2095, pruned_loss=0.03155, over 972866.47 frames.], batch size: 17, lr: 1.79e-04 2022-05-07 13:38:27,721 INFO [train.py:715] (0/8) Epoch 12, batch 26850, loss[loss=0.1117, simple_loss=0.187, pruned_loss=0.01816, over 4918.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2097, pruned_loss=0.03171, over 972695.65 frames.], batch size: 29, lr: 1.79e-04 2022-05-07 13:39:07,105 INFO [train.py:715] (0/8) Epoch 12, batch 26900, loss[loss=0.1385, simple_loss=0.2144, pruned_loss=0.03131, over 4972.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2092, pruned_loss=0.03152, over 972940.91 frames.], batch size: 15, lr: 1.79e-04 2022-05-07 13:39:45,932 INFO [train.py:715] (0/8) Epoch 12, batch 26950, loss[loss=0.1251, simple_loss=0.2003, pruned_loss=0.0249, over 4771.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2105, pruned_loss=0.03206, over 972265.26 frames.], batch size: 14, lr: 1.79e-04 2022-05-07 13:40:25,464 INFO [train.py:715] (0/8) Epoch 12, batch 27000, loss[loss=0.118, simple_loss=0.192, pruned_loss=0.02203, over 4768.00 frames.], tot_loss[loss=0.1383, simple_loss=0.2115, pruned_loss=0.03258, over 973012.35 frames.], batch size: 14, lr: 1.79e-04 2022-05-07 13:40:25,466 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 13:40:37,912 INFO [train.py:742] (0/8) Epoch 12, validation: loss=0.1054, simple_loss=0.1894, pruned_loss=0.01072, over 914524.00 frames. 
2022-05-07 13:41:17,231 INFO [train.py:715] (0/8) Epoch 12, batch 27050, loss[loss=0.1411, simple_loss=0.2146, pruned_loss=0.0338, over 4947.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2102, pruned_loss=0.03193, over 973471.63 frames.], batch size: 39, lr: 1.79e-04 2022-05-07 13:41:55,460 INFO [train.py:715] (0/8) Epoch 12, batch 27100, loss[loss=0.1367, simple_loss=0.216, pruned_loss=0.02872, over 4917.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2101, pruned_loss=0.03185, over 973148.91 frames.], batch size: 39, lr: 1.79e-04 2022-05-07 13:42:33,818 INFO [train.py:715] (0/8) Epoch 12, batch 27150, loss[loss=0.1114, simple_loss=0.1861, pruned_loss=0.01835, over 4886.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2106, pruned_loss=0.0321, over 973789.41 frames.], batch size: 19, lr: 1.79e-04 2022-05-07 13:43:12,673 INFO [train.py:715] (0/8) Epoch 12, batch 27200, loss[loss=0.1595, simple_loss=0.2177, pruned_loss=0.0506, over 4823.00 frames.], tot_loss[loss=0.137, simple_loss=0.2103, pruned_loss=0.03188, over 974503.29 frames.], batch size: 15, lr: 1.79e-04 2022-05-07 13:43:50,973 INFO [train.py:715] (0/8) Epoch 12, batch 27250, loss[loss=0.1366, simple_loss=0.203, pruned_loss=0.03509, over 4799.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2111, pruned_loss=0.03215, over 974361.27 frames.], batch size: 14, lr: 1.79e-04 2022-05-07 13:44:29,598 INFO [train.py:715] (0/8) Epoch 12, batch 27300, loss[loss=0.1392, simple_loss=0.2126, pruned_loss=0.03294, over 4946.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2111, pruned_loss=0.0323, over 973933.89 frames.], batch size: 21, lr: 1.79e-04 2022-05-07 13:45:08,182 INFO [train.py:715] (0/8) Epoch 12, batch 27350, loss[loss=0.1178, simple_loss=0.1957, pruned_loss=0.01996, over 4920.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2112, pruned_loss=0.032, over 974138.88 frames.], batch size: 29, lr: 1.79e-04 2022-05-07 13:45:47,169 INFO [train.py:715] (0/8) Epoch 12, batch 27400, loss[loss=0.1406, simple_loss=0.2139, pruned_loss=0.0336, over 4790.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2105, pruned_loss=0.03168, over 973848.19 frames.], batch size: 17, lr: 1.79e-04 2022-05-07 13:46:25,845 INFO [train.py:715] (0/8) Epoch 12, batch 27450, loss[loss=0.1335, simple_loss=0.2052, pruned_loss=0.03087, over 4869.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2097, pruned_loss=0.03146, over 974081.06 frames.], batch size: 22, lr: 1.79e-04 2022-05-07 13:47:04,342 INFO [train.py:715] (0/8) Epoch 12, batch 27500, loss[loss=0.1104, simple_loss=0.1848, pruned_loss=0.01798, over 4910.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2102, pruned_loss=0.03184, over 973598.79 frames.], batch size: 18, lr: 1.79e-04 2022-05-07 13:47:43,115 INFO [train.py:715] (0/8) Epoch 12, batch 27550, loss[loss=0.1332, simple_loss=0.1967, pruned_loss=0.03484, over 4861.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2097, pruned_loss=0.03161, over 973464.11 frames.], batch size: 13, lr: 1.79e-04 2022-05-07 13:48:21,797 INFO [train.py:715] (0/8) Epoch 12, batch 27600, loss[loss=0.1513, simple_loss=0.2256, pruned_loss=0.03847, over 4770.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2101, pruned_loss=0.03182, over 972360.01 frames.], batch size: 14, lr: 1.79e-04 2022-05-07 13:49:00,948 INFO [train.py:715] (0/8) Epoch 12, batch 27650, loss[loss=0.1515, simple_loss=0.2277, pruned_loss=0.03758, over 4881.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2111, pruned_loss=0.03236, over 972437.57 frames.], batch size: 22, lr: 1.79e-04 2022-05-07 13:49:39,557 INFO 
[train.py:715] (0/8) Epoch 12, batch 27700, loss[loss=0.1512, simple_loss=0.2216, pruned_loss=0.04045, over 4979.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2109, pruned_loss=0.0323, over 973089.45 frames.], batch size: 15, lr: 1.79e-04 2022-05-07 13:50:18,421 INFO [train.py:715] (0/8) Epoch 12, batch 27750, loss[loss=0.1371, simple_loss=0.2169, pruned_loss=0.02861, over 4769.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2105, pruned_loss=0.0321, over 973303.00 frames.], batch size: 17, lr: 1.79e-04 2022-05-07 13:50:56,326 INFO [train.py:715] (0/8) Epoch 12, batch 27800, loss[loss=0.1525, simple_loss=0.2271, pruned_loss=0.03892, over 4969.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2101, pruned_loss=0.03189, over 973037.78 frames.], batch size: 15, lr: 1.79e-04 2022-05-07 13:51:34,001 INFO [train.py:715] (0/8) Epoch 12, batch 27850, loss[loss=0.1164, simple_loss=0.19, pruned_loss=0.02141, over 4839.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2099, pruned_loss=0.03169, over 972448.15 frames.], batch size: 32, lr: 1.79e-04 2022-05-07 13:52:12,418 INFO [train.py:715] (0/8) Epoch 12, batch 27900, loss[loss=0.1352, simple_loss=0.2054, pruned_loss=0.03254, over 4897.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2106, pruned_loss=0.0321, over 973497.18 frames.], batch size: 19, lr: 1.79e-04 2022-05-07 13:52:50,383 INFO [train.py:715] (0/8) Epoch 12, batch 27950, loss[loss=0.147, simple_loss=0.2121, pruned_loss=0.04099, over 4903.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2104, pruned_loss=0.03209, over 973037.94 frames.], batch size: 17, lr: 1.79e-04 2022-05-07 13:53:28,675 INFO [train.py:715] (0/8) Epoch 12, batch 28000, loss[loss=0.1431, simple_loss=0.2257, pruned_loss=0.03027, over 4862.00 frames.], tot_loss[loss=0.137, simple_loss=0.2104, pruned_loss=0.03185, over 972332.09 frames.], batch size: 20, lr: 1.79e-04 2022-05-07 13:54:06,329 INFO [train.py:715] (0/8) Epoch 12, batch 28050, loss[loss=0.1393, simple_loss=0.2267, pruned_loss=0.02596, over 4765.00 frames.], tot_loss[loss=0.138, simple_loss=0.2112, pruned_loss=0.03238, over 971409.09 frames.], batch size: 16, lr: 1.79e-04 2022-05-07 13:54:44,514 INFO [train.py:715] (0/8) Epoch 12, batch 28100, loss[loss=0.1323, simple_loss=0.2055, pruned_loss=0.02953, over 4964.00 frames.], tot_loss[loss=0.138, simple_loss=0.211, pruned_loss=0.03248, over 972562.94 frames.], batch size: 24, lr: 1.79e-04 2022-05-07 13:55:22,255 INFO [train.py:715] (0/8) Epoch 12, batch 28150, loss[loss=0.1461, simple_loss=0.229, pruned_loss=0.03156, over 4951.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2112, pruned_loss=0.03221, over 971830.64 frames.], batch size: 24, lr: 1.79e-04 2022-05-07 13:56:00,657 INFO [train.py:715] (0/8) Epoch 12, batch 28200, loss[loss=0.143, simple_loss=0.2169, pruned_loss=0.03456, over 4990.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2112, pruned_loss=0.03214, over 973541.32 frames.], batch size: 15, lr: 1.79e-04 2022-05-07 13:56:39,076 INFO [train.py:715] (0/8) Epoch 12, batch 28250, loss[loss=0.1366, simple_loss=0.206, pruned_loss=0.03366, over 4953.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2108, pruned_loss=0.03204, over 973068.46 frames.], batch size: 35, lr: 1.79e-04 2022-05-07 13:57:17,041 INFO [train.py:715] (0/8) Epoch 12, batch 28300, loss[loss=0.1263, simple_loss=0.2023, pruned_loss=0.02512, over 4979.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2112, pruned_loss=0.03209, over 972296.63 frames.], batch size: 40, lr: 1.79e-04 2022-05-07 13:57:55,846 INFO [train.py:715] (0/8) Epoch 12, 
batch 28350, loss[loss=0.1397, simple_loss=0.2167, pruned_loss=0.03129, over 4825.00 frames.], tot_loss[loss=0.138, simple_loss=0.211, pruned_loss=0.0325, over 972044.48 frames.], batch size: 26, lr: 1.79e-04 2022-05-07 13:58:33,885 INFO [train.py:715] (0/8) Epoch 12, batch 28400, loss[loss=0.1269, simple_loss=0.2038, pruned_loss=0.02504, over 4828.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2115, pruned_loss=0.03237, over 972407.36 frames.], batch size: 26, lr: 1.79e-04 2022-05-07 13:59:12,070 INFO [train.py:715] (0/8) Epoch 12, batch 28450, loss[loss=0.1238, simple_loss=0.1962, pruned_loss=0.0257, over 4985.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2111, pruned_loss=0.0319, over 972057.19 frames.], batch size: 33, lr: 1.79e-04 2022-05-07 13:59:49,930 INFO [train.py:715] (0/8) Epoch 12, batch 28500, loss[loss=0.1065, simple_loss=0.1804, pruned_loss=0.01632, over 4976.00 frames.], tot_loss[loss=0.1372, simple_loss=0.211, pruned_loss=0.03167, over 973173.48 frames.], batch size: 14, lr: 1.79e-04 2022-05-07 14:00:27,898 INFO [train.py:715] (0/8) Epoch 12, batch 28550, loss[loss=0.1476, simple_loss=0.2297, pruned_loss=0.03271, over 4795.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2105, pruned_loss=0.03147, over 972642.00 frames.], batch size: 21, lr: 1.79e-04 2022-05-07 14:01:06,316 INFO [train.py:715] (0/8) Epoch 12, batch 28600, loss[loss=0.1147, simple_loss=0.1991, pruned_loss=0.01517, over 4778.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2105, pruned_loss=0.03128, over 972983.55 frames.], batch size: 18, lr: 1.79e-04 2022-05-07 14:01:44,214 INFO [train.py:715] (0/8) Epoch 12, batch 28650, loss[loss=0.1545, simple_loss=0.2202, pruned_loss=0.04444, over 4781.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2104, pruned_loss=0.03159, over 973388.90 frames.], batch size: 23, lr: 1.79e-04 2022-05-07 14:02:23,312 INFO [train.py:715] (0/8) Epoch 12, batch 28700, loss[loss=0.1263, simple_loss=0.2019, pruned_loss=0.02532, over 4838.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2108, pruned_loss=0.03171, over 972642.09 frames.], batch size: 15, lr: 1.79e-04 2022-05-07 14:03:01,816 INFO [train.py:715] (0/8) Epoch 12, batch 28750, loss[loss=0.1232, simple_loss=0.2036, pruned_loss=0.02143, over 4795.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2101, pruned_loss=0.03127, over 972960.83 frames.], batch size: 24, lr: 1.79e-04 2022-05-07 14:03:40,783 INFO [train.py:715] (0/8) Epoch 12, batch 28800, loss[loss=0.137, simple_loss=0.2201, pruned_loss=0.02699, over 4780.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2103, pruned_loss=0.03139, over 972911.79 frames.], batch size: 18, lr: 1.79e-04 2022-05-07 14:04:18,673 INFO [train.py:715] (0/8) Epoch 12, batch 28850, loss[loss=0.1282, simple_loss=0.2102, pruned_loss=0.02311, over 4918.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2098, pruned_loss=0.0312, over 972895.86 frames.], batch size: 18, lr: 1.79e-04 2022-05-07 14:04:57,029 INFO [train.py:715] (0/8) Epoch 12, batch 28900, loss[loss=0.1394, simple_loss=0.2054, pruned_loss=0.03671, over 4906.00 frames.], tot_loss[loss=0.136, simple_loss=0.2097, pruned_loss=0.03113, over 972644.86 frames.], batch size: 23, lr: 1.78e-04 2022-05-07 14:05:35,788 INFO [train.py:715] (0/8) Epoch 12, batch 28950, loss[loss=0.1234, simple_loss=0.1969, pruned_loss=0.025, over 4774.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2098, pruned_loss=0.03139, over 973178.36 frames.], batch size: 17, lr: 1.78e-04 2022-05-07 14:06:14,141 INFO [train.py:715] (0/8) Epoch 12, batch 29000, 
loss[loss=0.1384, simple_loss=0.2166, pruned_loss=0.03012, over 4888.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2099, pruned_loss=0.03125, over 972548.21 frames.], batch size: 22, lr: 1.78e-04 2022-05-07 14:06:53,413 INFO [train.py:715] (0/8) Epoch 12, batch 29050, loss[loss=0.156, simple_loss=0.236, pruned_loss=0.03799, over 4808.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2105, pruned_loss=0.03154, over 972422.88 frames.], batch size: 21, lr: 1.78e-04 2022-05-07 14:07:31,880 INFO [train.py:715] (0/8) Epoch 12, batch 29100, loss[loss=0.1116, simple_loss=0.1936, pruned_loss=0.0148, over 4643.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2101, pruned_loss=0.0312, over 971811.40 frames.], batch size: 13, lr: 1.78e-04 2022-05-07 14:08:10,550 INFO [train.py:715] (0/8) Epoch 12, batch 29150, loss[loss=0.1433, simple_loss=0.2115, pruned_loss=0.03757, over 4779.00 frames.], tot_loss[loss=0.1362, simple_loss=0.21, pruned_loss=0.03114, over 971825.50 frames.], batch size: 18, lr: 1.78e-04 2022-05-07 14:08:48,979 INFO [train.py:715] (0/8) Epoch 12, batch 29200, loss[loss=0.1065, simple_loss=0.1788, pruned_loss=0.01704, over 4762.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2099, pruned_loss=0.03145, over 971637.77 frames.], batch size: 16, lr: 1.78e-04 2022-05-07 14:09:27,673 INFO [train.py:715] (0/8) Epoch 12, batch 29250, loss[loss=0.1464, simple_loss=0.2263, pruned_loss=0.03326, over 4713.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2099, pruned_loss=0.03154, over 971379.64 frames.], batch size: 15, lr: 1.78e-04 2022-05-07 14:10:05,807 INFO [train.py:715] (0/8) Epoch 12, batch 29300, loss[loss=0.1269, simple_loss=0.1888, pruned_loss=0.03245, over 4773.00 frames.], tot_loss[loss=0.137, simple_loss=0.2107, pruned_loss=0.03172, over 971149.13 frames.], batch size: 12, lr: 1.78e-04 2022-05-07 14:10:43,231 INFO [train.py:715] (0/8) Epoch 12, batch 29350, loss[loss=0.1298, simple_loss=0.1991, pruned_loss=0.03031, over 4874.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2106, pruned_loss=0.03228, over 971540.17 frames.], batch size: 22, lr: 1.78e-04 2022-05-07 14:11:22,337 INFO [train.py:715] (0/8) Epoch 12, batch 29400, loss[loss=0.1046, simple_loss=0.1726, pruned_loss=0.01825, over 4820.00 frames.], tot_loss[loss=0.137, simple_loss=0.2103, pruned_loss=0.03182, over 971706.39 frames.], batch size: 13, lr: 1.78e-04 2022-05-07 14:12:00,592 INFO [train.py:715] (0/8) Epoch 12, batch 29450, loss[loss=0.1068, simple_loss=0.1874, pruned_loss=0.01314, over 4883.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2102, pruned_loss=0.032, over 972288.01 frames.], batch size: 16, lr: 1.78e-04 2022-05-07 14:12:38,751 INFO [train.py:715] (0/8) Epoch 12, batch 29500, loss[loss=0.1395, simple_loss=0.2162, pruned_loss=0.03138, over 4951.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2096, pruned_loss=0.03175, over 971153.04 frames.], batch size: 21, lr: 1.78e-04 2022-05-07 14:13:16,877 INFO [train.py:715] (0/8) Epoch 12, batch 29550, loss[loss=0.1085, simple_loss=0.1839, pruned_loss=0.01658, over 4815.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2105, pruned_loss=0.03216, over 971685.33 frames.], batch size: 21, lr: 1.78e-04 2022-05-07 14:13:55,805 INFO [train.py:715] (0/8) Epoch 12, batch 29600, loss[loss=0.1157, simple_loss=0.1816, pruned_loss=0.0249, over 4842.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2091, pruned_loss=0.03136, over 971198.50 frames.], batch size: 26, lr: 1.78e-04 2022-05-07 14:14:34,030 INFO [train.py:715] (0/8) Epoch 12, batch 29650, loss[loss=0.1284, 
simple_loss=0.1988, pruned_loss=0.02899, over 4811.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2086, pruned_loss=0.03118, over 971103.87 frames.], batch size: 25, lr: 1.78e-04 2022-05-07 14:15:11,740 INFO [train.py:715] (0/8) Epoch 12, batch 29700, loss[loss=0.1415, simple_loss=0.2159, pruned_loss=0.03359, over 4941.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2095, pruned_loss=0.03177, over 971461.52 frames.], batch size: 21, lr: 1.78e-04 2022-05-07 14:15:51,270 INFO [train.py:715] (0/8) Epoch 12, batch 29750, loss[loss=0.1195, simple_loss=0.1947, pruned_loss=0.0221, over 4880.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2102, pruned_loss=0.03203, over 971255.72 frames.], batch size: 22, lr: 1.78e-04 2022-05-07 14:16:30,390 INFO [train.py:715] (0/8) Epoch 12, batch 29800, loss[loss=0.1064, simple_loss=0.1843, pruned_loss=0.01424, over 4824.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2099, pruned_loss=0.03162, over 971484.02 frames.], batch size: 25, lr: 1.78e-04 2022-05-07 14:17:09,202 INFO [train.py:715] (0/8) Epoch 12, batch 29850, loss[loss=0.1609, simple_loss=0.2413, pruned_loss=0.04023, over 4824.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2102, pruned_loss=0.03162, over 971790.03 frames.], batch size: 13, lr: 1.78e-04 2022-05-07 14:17:47,532 INFO [train.py:715] (0/8) Epoch 12, batch 29900, loss[loss=0.1257, simple_loss=0.2081, pruned_loss=0.02164, over 4938.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2105, pruned_loss=0.03188, over 972647.73 frames.], batch size: 29, lr: 1.78e-04 2022-05-07 14:18:26,383 INFO [train.py:715] (0/8) Epoch 12, batch 29950, loss[loss=0.1234, simple_loss=0.2092, pruned_loss=0.01883, over 4833.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2113, pruned_loss=0.03212, over 972925.02 frames.], batch size: 25, lr: 1.78e-04 2022-05-07 14:19:04,505 INFO [train.py:715] (0/8) Epoch 12, batch 30000, loss[loss=0.161, simple_loss=0.2361, pruned_loss=0.04292, over 4856.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2105, pruned_loss=0.03135, over 972983.63 frames.], batch size: 32, lr: 1.78e-04 2022-05-07 14:19:04,506 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 14:19:14,014 INFO [train.py:742] (0/8) Epoch 12, validation: loss=0.1054, simple_loss=0.1894, pruned_loss=0.01072, over 914524.00 frames. 
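The learning rate printed with each entry creeps downward through this section: 1.80e-04 at first, 1.79e-04 from about batch 23950, and 1.78e-04 from about batch 28900 onward. Purely as an illustration, this is the shape produced by an Eden-style scheduler of the kind used in icefall pruned-transducer recipes, where the rate shrinks with both the global batch count and the epoch; the formula and the hyperparameters below (initial_lr=3e-3, lr_batches=5000, lr_epochs=4) are assumptions for the sketch, not facts taken from this excerpt:

# Assumed Eden-style schedule; not necessarily the exact code used for this run.
def eden_lr(batch_idx, epoch, initial_lr=3e-3, lr_batches=5000, lr_epochs=4):
    batch_factor = ((batch_idx ** 2 + lr_batches ** 2) / lr_batches ** 2) ** -0.25
    epoch_factor = ((epoch ** 2 + lr_epochs ** 2) / lr_epochs ** 2) ** -0.25
    return initial_lr * batch_factor * epoch_factor

# Global batch indices inferred from the checkpoint names saved in this section.
print(f"{eden_lr(440000, 12):.2e}")  # ~1.80e-04
print(f"{eden_lr(448000, 12):.2e}")  # ~1.78e-04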
2022-05-07 14:19:52,924 INFO [train.py:715] (0/8) Epoch 12, batch 30050, loss[loss=0.1363, simple_loss=0.2228, pruned_loss=0.02494, over 4835.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2096, pruned_loss=0.03099, over 972267.07 frames.], batch size: 26, lr: 1.78e-04 2022-05-07 14:20:31,328 INFO [train.py:715] (0/8) Epoch 12, batch 30100, loss[loss=0.1416, simple_loss=0.2087, pruned_loss=0.03726, over 4988.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2098, pruned_loss=0.03124, over 971896.31 frames.], batch size: 14, lr: 1.78e-04 2022-05-07 14:21:10,494 INFO [train.py:715] (0/8) Epoch 12, batch 30150, loss[loss=0.1511, simple_loss=0.232, pruned_loss=0.03511, over 4953.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2103, pruned_loss=0.03121, over 971916.26 frames.], batch size: 39, lr: 1.78e-04 2022-05-07 14:21:48,957 INFO [train.py:715] (0/8) Epoch 12, batch 30200, loss[loss=0.1267, simple_loss=0.2034, pruned_loss=0.02502, over 4806.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2103, pruned_loss=0.03095, over 972112.58 frames.], batch size: 21, lr: 1.78e-04 2022-05-07 14:22:28,437 INFO [train.py:715] (0/8) Epoch 12, batch 30250, loss[loss=0.1308, simple_loss=0.2038, pruned_loss=0.02887, over 4962.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2106, pruned_loss=0.03095, over 971858.74 frames.], batch size: 24, lr: 1.78e-04 2022-05-07 14:22:55,797 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-448000.pt 2022-05-07 14:23:07,600 INFO [train.py:715] (0/8) Epoch 12, batch 30300, loss[loss=0.1359, simple_loss=0.2031, pruned_loss=0.03436, over 4942.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2108, pruned_loss=0.0315, over 971808.07 frames.], batch size: 23, lr: 1.78e-04 2022-05-07 14:23:45,570 INFO [train.py:715] (0/8) Epoch 12, batch 30350, loss[loss=0.1426, simple_loss=0.2126, pruned_loss=0.03628, over 4851.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2102, pruned_loss=0.03138, over 971301.62 frames.], batch size: 30, lr: 1.78e-04 2022-05-07 14:24:23,562 INFO [train.py:715] (0/8) Epoch 12, batch 30400, loss[loss=0.1577, simple_loss=0.2287, pruned_loss=0.04337, over 4854.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2096, pruned_loss=0.0316, over 971300.24 frames.], batch size: 30, lr: 1.78e-04 2022-05-07 14:25:01,294 INFO [train.py:715] (0/8) Epoch 12, batch 30450, loss[loss=0.139, simple_loss=0.2104, pruned_loss=0.03379, over 4821.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2095, pruned_loss=0.03138, over 970981.50 frames.], batch size: 15, lr: 1.78e-04 2022-05-07 14:25:39,282 INFO [train.py:715] (0/8) Epoch 12, batch 30500, loss[loss=0.1144, simple_loss=0.1982, pruned_loss=0.01529, over 4780.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2101, pruned_loss=0.03153, over 970692.82 frames.], batch size: 17, lr: 1.78e-04 2022-05-07 14:26:17,287 INFO [train.py:715] (0/8) Epoch 12, batch 30550, loss[loss=0.1342, simple_loss=0.206, pruned_loss=0.03119, over 4913.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2104, pruned_loss=0.03127, over 971770.53 frames.], batch size: 18, lr: 1.78e-04 2022-05-07 14:26:55,235 INFO [train.py:715] (0/8) Epoch 12, batch 30600, loss[loss=0.1403, simple_loss=0.2181, pruned_loss=0.03129, over 4984.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2102, pruned_loss=0.03136, over 971863.54 frames.], batch size: 28, lr: 1.78e-04 2022-05-07 14:27:32,187 INFO [train.py:715] (0/8) Epoch 12, batch 30650, loss[loss=0.1356, simple_loss=0.2163, pruned_loss=0.02743, over 4853.00 frames.], 
tot_loss[loss=0.1365, simple_loss=0.2099, pruned_loss=0.03153, over 971641.51 frames.], batch size: 30, lr: 1.78e-04 2022-05-07 14:28:10,731 INFO [train.py:715] (0/8) Epoch 12, batch 30700, loss[loss=0.1346, simple_loss=0.1961, pruned_loss=0.03656, over 4783.00 frames.], tot_loss[loss=0.137, simple_loss=0.2106, pruned_loss=0.03175, over 971351.22 frames.], batch size: 13, lr: 1.78e-04 2022-05-07 14:28:48,652 INFO [train.py:715] (0/8) Epoch 12, batch 30750, loss[loss=0.1246, simple_loss=0.1951, pruned_loss=0.02711, over 4883.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2099, pruned_loss=0.03154, over 971464.80 frames.], batch size: 22, lr: 1.78e-04 2022-05-07 14:29:27,165 INFO [train.py:715] (0/8) Epoch 12, batch 30800, loss[loss=0.1425, simple_loss=0.2105, pruned_loss=0.03721, over 4898.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2097, pruned_loss=0.03141, over 971538.00 frames.], batch size: 39, lr: 1.78e-04 2022-05-07 14:30:05,814 INFO [train.py:715] (0/8) Epoch 12, batch 30850, loss[loss=0.1448, simple_loss=0.2127, pruned_loss=0.03848, over 4849.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2098, pruned_loss=0.03156, over 970859.25 frames.], batch size: 20, lr: 1.78e-04 2022-05-07 14:30:45,019 INFO [train.py:715] (0/8) Epoch 12, batch 30900, loss[loss=0.1476, simple_loss=0.2222, pruned_loss=0.03651, over 4893.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2103, pruned_loss=0.03145, over 971342.50 frames.], batch size: 19, lr: 1.78e-04 2022-05-07 14:31:23,356 INFO [train.py:715] (0/8) Epoch 12, batch 30950, loss[loss=0.1324, simple_loss=0.2152, pruned_loss=0.02481, over 4859.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2111, pruned_loss=0.03191, over 971713.73 frames.], batch size: 20, lr: 1.78e-04 2022-05-07 14:32:02,063 INFO [train.py:715] (0/8) Epoch 12, batch 31000, loss[loss=0.1307, simple_loss=0.2151, pruned_loss=0.02319, over 4859.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2113, pruned_loss=0.0317, over 972934.82 frames.], batch size: 22, lr: 1.78e-04 2022-05-07 14:32:41,211 INFO [train.py:715] (0/8) Epoch 12, batch 31050, loss[loss=0.1086, simple_loss=0.1777, pruned_loss=0.01977, over 4780.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2108, pruned_loss=0.03135, over 972301.79 frames.], batch size: 12, lr: 1.78e-04 2022-05-07 14:33:19,689 INFO [train.py:715] (0/8) Epoch 12, batch 31100, loss[loss=0.1531, simple_loss=0.2243, pruned_loss=0.04098, over 4823.00 frames.], tot_loss[loss=0.137, simple_loss=0.211, pruned_loss=0.03146, over 973056.75 frames.], batch size: 15, lr: 1.78e-04 2022-05-07 14:33:57,504 INFO [train.py:715] (0/8) Epoch 12, batch 31150, loss[loss=0.1333, simple_loss=0.2055, pruned_loss=0.03059, over 4844.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2103, pruned_loss=0.03108, over 972831.63 frames.], batch size: 30, lr: 1.78e-04 2022-05-07 14:34:36,501 INFO [train.py:715] (0/8) Epoch 12, batch 31200, loss[loss=0.1418, simple_loss=0.2227, pruned_loss=0.03045, over 4964.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2105, pruned_loss=0.03141, over 973877.20 frames.], batch size: 39, lr: 1.78e-04 2022-05-07 14:35:15,343 INFO [train.py:715] (0/8) Epoch 12, batch 31250, loss[loss=0.1208, simple_loss=0.1982, pruned_loss=0.02168, over 4791.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2101, pruned_loss=0.03154, over 974534.58 frames.], batch size: 14, lr: 1.78e-04 2022-05-07 14:35:54,046 INFO [train.py:715] (0/8) Epoch 12, batch 31300, loss[loss=0.1294, simple_loss=0.2082, pruned_loss=0.0253, over 4762.00 frames.], tot_loss[loss=0.1367, 
simple_loss=0.2101, pruned_loss=0.03164, over 973697.08 frames.], batch size: 17, lr: 1.78e-04 2022-05-07 14:36:32,567 INFO [train.py:715] (0/8) Epoch 12, batch 31350, loss[loss=0.1624, simple_loss=0.2312, pruned_loss=0.04682, over 4990.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2106, pruned_loss=0.03196, over 972943.89 frames.], batch size: 25, lr: 1.78e-04 2022-05-07 14:37:11,732 INFO [train.py:715] (0/8) Epoch 12, batch 31400, loss[loss=0.1403, simple_loss=0.2033, pruned_loss=0.03861, over 4825.00 frames.], tot_loss[loss=0.137, simple_loss=0.2103, pruned_loss=0.03181, over 973161.50 frames.], batch size: 13, lr: 1.78e-04 2022-05-07 14:37:50,136 INFO [train.py:715] (0/8) Epoch 12, batch 31450, loss[loss=0.1279, simple_loss=0.2022, pruned_loss=0.02682, over 4745.00 frames.], tot_loss[loss=0.1367, simple_loss=0.21, pruned_loss=0.03168, over 972554.96 frames.], batch size: 16, lr: 1.78e-04 2022-05-07 14:38:28,379 INFO [train.py:715] (0/8) Epoch 12, batch 31500, loss[loss=0.1043, simple_loss=0.1883, pruned_loss=0.0101, over 4924.00 frames.], tot_loss[loss=0.137, simple_loss=0.2106, pruned_loss=0.03167, over 972561.04 frames.], batch size: 23, lr: 1.78e-04 2022-05-07 14:39:06,655 INFO [train.py:715] (0/8) Epoch 12, batch 31550, loss[loss=0.1185, simple_loss=0.1803, pruned_loss=0.02835, over 4991.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2109, pruned_loss=0.03217, over 972795.84 frames.], batch size: 14, lr: 1.78e-04 2022-05-07 14:39:45,219 INFO [train.py:715] (0/8) Epoch 12, batch 31600, loss[loss=0.1274, simple_loss=0.1928, pruned_loss=0.03107, over 4831.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2103, pruned_loss=0.03197, over 972572.04 frames.], batch size: 12, lr: 1.78e-04 2022-05-07 14:40:22,888 INFO [train.py:715] (0/8) Epoch 12, batch 31650, loss[loss=0.1231, simple_loss=0.197, pruned_loss=0.02458, over 4774.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2105, pruned_loss=0.03209, over 972409.69 frames.], batch size: 14, lr: 1.78e-04 2022-05-07 14:41:00,516 INFO [train.py:715] (0/8) Epoch 12, batch 31700, loss[loss=0.1193, simple_loss=0.1959, pruned_loss=0.02135, over 4833.00 frames.], tot_loss[loss=0.137, simple_loss=0.2103, pruned_loss=0.03182, over 972268.47 frames.], batch size: 15, lr: 1.78e-04 2022-05-07 14:41:38,631 INFO [train.py:715] (0/8) Epoch 12, batch 31750, loss[loss=0.1164, simple_loss=0.1964, pruned_loss=0.01821, over 4883.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2103, pruned_loss=0.03191, over 972071.30 frames.], batch size: 22, lr: 1.78e-04 2022-05-07 14:42:16,744 INFO [train.py:715] (0/8) Epoch 12, batch 31800, loss[loss=0.1342, simple_loss=0.2139, pruned_loss=0.02726, over 4892.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2098, pruned_loss=0.03143, over 973507.81 frames.], batch size: 22, lr: 1.78e-04 2022-05-07 14:42:54,697 INFO [train.py:715] (0/8) Epoch 12, batch 31850, loss[loss=0.1555, simple_loss=0.2279, pruned_loss=0.04161, over 4946.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2104, pruned_loss=0.03139, over 973107.94 frames.], batch size: 39, lr: 1.78e-04 2022-05-07 14:43:32,403 INFO [train.py:715] (0/8) Epoch 12, batch 31900, loss[loss=0.1412, simple_loss=0.2134, pruned_loss=0.03453, over 4899.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2107, pruned_loss=0.03145, over 973711.84 frames.], batch size: 39, lr: 1.78e-04 2022-05-07 14:44:10,696 INFO [train.py:715] (0/8) Epoch 12, batch 31950, loss[loss=0.1228, simple_loss=0.1982, pruned_loss=0.02372, over 4800.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2101, 
pruned_loss=0.03114, over 973690.06 frames.], batch size: 21, lr: 1.78e-04 2022-05-07 14:44:48,295 INFO [train.py:715] (0/8) Epoch 12, batch 32000, loss[loss=0.1315, simple_loss=0.2087, pruned_loss=0.02712, over 4806.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2095, pruned_loss=0.03077, over 972440.55 frames.], batch size: 21, lr: 1.78e-04 2022-05-07 14:45:26,159 INFO [train.py:715] (0/8) Epoch 12, batch 32050, loss[loss=0.1642, simple_loss=0.2342, pruned_loss=0.04715, over 4693.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2103, pruned_loss=0.03118, over 972864.79 frames.], batch size: 15, lr: 1.78e-04 2022-05-07 14:46:04,018 INFO [train.py:715] (0/8) Epoch 12, batch 32100, loss[loss=0.1339, simple_loss=0.2132, pruned_loss=0.02733, over 4691.00 frames.], tot_loss[loss=0.136, simple_loss=0.21, pruned_loss=0.03102, over 971288.92 frames.], batch size: 15, lr: 1.78e-04 2022-05-07 14:46:42,426 INFO [train.py:715] (0/8) Epoch 12, batch 32150, loss[loss=0.1177, simple_loss=0.1978, pruned_loss=0.01876, over 4904.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2107, pruned_loss=0.03154, over 971335.10 frames.], batch size: 17, lr: 1.78e-04 2022-05-07 14:47:20,021 INFO [train.py:715] (0/8) Epoch 12, batch 32200, loss[loss=0.1398, simple_loss=0.2149, pruned_loss=0.03231, over 4929.00 frames.], tot_loss[loss=0.1365, simple_loss=0.21, pruned_loss=0.03151, over 971491.61 frames.], batch size: 23, lr: 1.78e-04 2022-05-07 14:47:58,172 INFO [train.py:715] (0/8) Epoch 12, batch 32250, loss[loss=0.1258, simple_loss=0.2034, pruned_loss=0.02412, over 4900.00 frames.], tot_loss[loss=0.1364, simple_loss=0.21, pruned_loss=0.03139, over 972258.96 frames.], batch size: 19, lr: 1.78e-04 2022-05-07 14:48:36,822 INFO [train.py:715] (0/8) Epoch 12, batch 32300, loss[loss=0.1117, simple_loss=0.1826, pruned_loss=0.02037, over 4637.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2101, pruned_loss=0.03163, over 972521.81 frames.], batch size: 13, lr: 1.78e-04 2022-05-07 14:49:14,381 INFO [train.py:715] (0/8) Epoch 12, batch 32350, loss[loss=0.1186, simple_loss=0.1963, pruned_loss=0.02048, over 4965.00 frames.], tot_loss[loss=0.137, simple_loss=0.2107, pruned_loss=0.03162, over 972265.19 frames.], batch size: 15, lr: 1.78e-04 2022-05-07 14:49:52,724 INFO [train.py:715] (0/8) Epoch 12, batch 32400, loss[loss=0.1002, simple_loss=0.1743, pruned_loss=0.01299, over 4917.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2111, pruned_loss=0.03204, over 971605.71 frames.], batch size: 18, lr: 1.78e-04 2022-05-07 14:50:30,824 INFO [train.py:715] (0/8) Epoch 12, batch 32450, loss[loss=0.144, simple_loss=0.2297, pruned_loss=0.02911, over 4877.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2117, pruned_loss=0.03208, over 972314.95 frames.], batch size: 16, lr: 1.78e-04 2022-05-07 14:51:09,331 INFO [train.py:715] (0/8) Epoch 12, batch 32500, loss[loss=0.1345, simple_loss=0.1988, pruned_loss=0.03511, over 4831.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2114, pruned_loss=0.03209, over 971732.45 frames.], batch size: 15, lr: 1.78e-04 2022-05-07 14:51:46,827 INFO [train.py:715] (0/8) Epoch 12, batch 32550, loss[loss=0.1422, simple_loss=0.2181, pruned_loss=0.03316, over 4807.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2114, pruned_loss=0.03202, over 971668.45 frames.], batch size: 21, lr: 1.78e-04 2022-05-07 14:52:25,067 INFO [train.py:715] (0/8) Epoch 12, batch 32600, loss[loss=0.1671, simple_loss=0.2321, pruned_loss=0.05102, over 4757.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2105, pruned_loss=0.03199, over 
969983.50 frames.], batch size: 16, lr: 1.78e-04 2022-05-07 14:53:03,205 INFO [train.py:715] (0/8) Epoch 12, batch 32650, loss[loss=0.1175, simple_loss=0.2016, pruned_loss=0.01674, over 4920.00 frames.], tot_loss[loss=0.138, simple_loss=0.2113, pruned_loss=0.03241, over 970325.66 frames.], batch size: 18, lr: 1.78e-04 2022-05-07 14:53:40,735 INFO [train.py:715] (0/8) Epoch 12, batch 32700, loss[loss=0.1487, simple_loss=0.2226, pruned_loss=0.03741, over 4984.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2101, pruned_loss=0.03213, over 970444.47 frames.], batch size: 28, lr: 1.78e-04 2022-05-07 14:54:18,459 INFO [train.py:715] (0/8) Epoch 12, batch 32750, loss[loss=0.1355, simple_loss=0.2048, pruned_loss=0.03308, over 4749.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2099, pruned_loss=0.03213, over 970568.23 frames.], batch size: 16, lr: 1.78e-04 2022-05-07 14:54:56,864 INFO [train.py:715] (0/8) Epoch 12, batch 32800, loss[loss=0.1416, simple_loss=0.2133, pruned_loss=0.03492, over 4762.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2099, pruned_loss=0.0322, over 971305.46 frames.], batch size: 19, lr: 1.78e-04 2022-05-07 14:55:35,232 INFO [train.py:715] (0/8) Epoch 12, batch 32850, loss[loss=0.1037, simple_loss=0.1804, pruned_loss=0.01346, over 4909.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2091, pruned_loss=0.03169, over 971287.49 frames.], batch size: 17, lr: 1.78e-04 2022-05-07 14:56:12,921 INFO [train.py:715] (0/8) Epoch 12, batch 32900, loss[loss=0.1238, simple_loss=0.2069, pruned_loss=0.0203, over 4808.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2091, pruned_loss=0.03131, over 972081.61 frames.], batch size: 24, lr: 1.78e-04 2022-05-07 14:56:51,020 INFO [train.py:715] (0/8) Epoch 12, batch 32950, loss[loss=0.1164, simple_loss=0.1947, pruned_loss=0.01906, over 4903.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2089, pruned_loss=0.03139, over 971328.11 frames.], batch size: 18, lr: 1.78e-04 2022-05-07 14:57:29,165 INFO [train.py:715] (0/8) Epoch 12, batch 33000, loss[loss=0.1398, simple_loss=0.2141, pruned_loss=0.03274, over 4819.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2089, pruned_loss=0.03118, over 971870.03 frames.], batch size: 26, lr: 1.78e-04 2022-05-07 14:57:29,166 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 14:57:38,689 INFO [train.py:742] (0/8) Epoch 12, validation: loss=0.1057, simple_loss=0.1896, pruned_loss=0.01085, over 914524.00 frames. 
2022-05-07 14:58:18,187 INFO [train.py:715] (0/8) Epoch 12, batch 33050, loss[loss=0.1216, simple_loss=0.1971, pruned_loss=0.02305, over 4689.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2091, pruned_loss=0.031, over 972303.21 frames.], batch size: 15, lr: 1.78e-04 2022-05-07 14:58:56,557 INFO [train.py:715] (0/8) Epoch 12, batch 33100, loss[loss=0.1276, simple_loss=0.2099, pruned_loss=0.02268, over 4811.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2093, pruned_loss=0.03114, over 972916.27 frames.], batch size: 21, lr: 1.78e-04 2022-05-07 14:59:34,841 INFO [train.py:715] (0/8) Epoch 12, batch 33150, loss[loss=0.1338, simple_loss=0.2073, pruned_loss=0.03015, over 4906.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2096, pruned_loss=0.03105, over 972491.66 frames.], batch size: 18, lr: 1.78e-04 2022-05-07 15:00:12,865 INFO [train.py:715] (0/8) Epoch 12, batch 33200, loss[loss=0.129, simple_loss=0.2003, pruned_loss=0.02883, over 4986.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2091, pruned_loss=0.03058, over 973097.28 frames.], batch size: 33, lr: 1.78e-04 2022-05-07 15:00:51,448 INFO [train.py:715] (0/8) Epoch 12, batch 33250, loss[loss=0.09711, simple_loss=0.1707, pruned_loss=0.01177, over 4826.00 frames.], tot_loss[loss=0.1356, simple_loss=0.209, pruned_loss=0.03108, over 973024.16 frames.], batch size: 13, lr: 1.78e-04 2022-05-07 15:01:29,589 INFO [train.py:715] (0/8) Epoch 12, batch 33300, loss[loss=0.1636, simple_loss=0.2353, pruned_loss=0.04595, over 4986.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2099, pruned_loss=0.03127, over 973077.76 frames.], batch size: 25, lr: 1.78e-04 2022-05-07 15:02:07,719 INFO [train.py:715] (0/8) Epoch 12, batch 33350, loss[loss=0.1171, simple_loss=0.1911, pruned_loss=0.02152, over 4951.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2098, pruned_loss=0.03122, over 973047.86 frames.], batch size: 24, lr: 1.78e-04 2022-05-07 15:02:46,381 INFO [train.py:715] (0/8) Epoch 12, batch 33400, loss[loss=0.1417, simple_loss=0.2001, pruned_loss=0.04163, over 4765.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2092, pruned_loss=0.03085, over 971782.00 frames.], batch size: 14, lr: 1.78e-04 2022-05-07 15:03:25,035 INFO [train.py:715] (0/8) Epoch 12, batch 33450, loss[loss=0.1173, simple_loss=0.1935, pruned_loss=0.02058, over 4934.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2092, pruned_loss=0.03075, over 971920.75 frames.], batch size: 18, lr: 1.78e-04 2022-05-07 15:04:03,391 INFO [train.py:715] (0/8) Epoch 12, batch 33500, loss[loss=0.118, simple_loss=0.1956, pruned_loss=0.02023, over 4938.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2099, pruned_loss=0.03129, over 972492.86 frames.], batch size: 29, lr: 1.78e-04 2022-05-07 15:04:42,491 INFO [train.py:715] (0/8) Epoch 12, batch 33550, loss[loss=0.154, simple_loss=0.2318, pruned_loss=0.03806, over 4945.00 frames.], tot_loss[loss=0.1352, simple_loss=0.209, pruned_loss=0.03071, over 971847.91 frames.], batch size: 21, lr: 1.78e-04 2022-05-07 15:05:21,126 INFO [train.py:715] (0/8) Epoch 12, batch 33600, loss[loss=0.1328, simple_loss=0.1979, pruned_loss=0.0338, over 4822.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2094, pruned_loss=0.03072, over 971825.49 frames.], batch size: 13, lr: 1.78e-04 2022-05-07 15:05:59,981 INFO [train.py:715] (0/8) Epoch 12, batch 33650, loss[loss=0.1235, simple_loss=0.2055, pruned_loss=0.02076, over 4941.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2098, pruned_loss=0.03105, over 971945.42 frames.], batch size: 29, lr: 1.78e-04 2022-05-07 15:06:38,060 
INFO [train.py:715] (0/8) Epoch 12, batch 33700, loss[loss=0.1034, simple_loss=0.1783, pruned_loss=0.01422, over 4905.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2102, pruned_loss=0.03155, over 971251.31 frames.], batch size: 17, lr: 1.78e-04 2022-05-07 15:07:16,842 INFO [train.py:715] (0/8) Epoch 12, batch 33750, loss[loss=0.1181, simple_loss=0.1927, pruned_loss=0.02178, over 4824.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2101, pruned_loss=0.0313, over 972145.56 frames.], batch size: 26, lr: 1.78e-04 2022-05-07 15:07:55,110 INFO [train.py:715] (0/8) Epoch 12, batch 33800, loss[loss=0.124, simple_loss=0.2013, pruned_loss=0.02339, over 4805.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2104, pruned_loss=0.03158, over 972119.28 frames.], batch size: 13, lr: 1.78e-04 2022-05-07 15:08:32,474 INFO [train.py:715] (0/8) Epoch 12, batch 33850, loss[loss=0.1541, simple_loss=0.2322, pruned_loss=0.03798, over 4815.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2107, pruned_loss=0.03188, over 971827.78 frames.], batch size: 26, lr: 1.78e-04 2022-05-07 15:09:10,654 INFO [train.py:715] (0/8) Epoch 12, batch 33900, loss[loss=0.1158, simple_loss=0.1907, pruned_loss=0.02047, over 4797.00 frames.], tot_loss[loss=0.1363, simple_loss=0.21, pruned_loss=0.03127, over 972119.63 frames.], batch size: 24, lr: 1.78e-04 2022-05-07 15:09:47,907 INFO [train.py:715] (0/8) Epoch 12, batch 33950, loss[loss=0.1403, simple_loss=0.217, pruned_loss=0.03176, over 4957.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2105, pruned_loss=0.03139, over 972634.00 frames.], batch size: 21, lr: 1.77e-04 2022-05-07 15:10:26,068 INFO [train.py:715] (0/8) Epoch 12, batch 34000, loss[loss=0.1288, simple_loss=0.2093, pruned_loss=0.02411, over 4968.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2109, pruned_loss=0.03162, over 973103.42 frames.], batch size: 28, lr: 1.77e-04 2022-05-07 15:11:03,701 INFO [train.py:715] (0/8) Epoch 12, batch 34050, loss[loss=0.1202, simple_loss=0.1999, pruned_loss=0.02022, over 4972.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2099, pruned_loss=0.03111, over 972049.71 frames.], batch size: 15, lr: 1.77e-04 2022-05-07 15:11:41,635 INFO [train.py:715] (0/8) Epoch 12, batch 34100, loss[loss=0.1537, simple_loss=0.2371, pruned_loss=0.03513, over 4916.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2096, pruned_loss=0.03062, over 972107.05 frames.], batch size: 18, lr: 1.77e-04 2022-05-07 15:12:19,668 INFO [train.py:715] (0/8) Epoch 12, batch 34150, loss[loss=0.1149, simple_loss=0.1863, pruned_loss=0.02174, over 4929.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2096, pruned_loss=0.03086, over 972194.69 frames.], batch size: 18, lr: 1.77e-04 2022-05-07 15:12:57,177 INFO [train.py:715] (0/8) Epoch 12, batch 34200, loss[loss=0.1306, simple_loss=0.2147, pruned_loss=0.02326, over 4867.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2109, pruned_loss=0.03095, over 972834.24 frames.], batch size: 34, lr: 1.77e-04 2022-05-07 15:13:35,446 INFO [train.py:715] (0/8) Epoch 12, batch 34250, loss[loss=0.1228, simple_loss=0.1928, pruned_loss=0.02643, over 4790.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2105, pruned_loss=0.03131, over 972439.55 frames.], batch size: 18, lr: 1.77e-04 2022-05-07 15:14:12,815 INFO [train.py:715] (0/8) Epoch 12, batch 34300, loss[loss=0.1621, simple_loss=0.2312, pruned_loss=0.04649, over 4781.00 frames.], tot_loss[loss=0.1369, simple_loss=0.211, pruned_loss=0.03143, over 972446.30 frames.], batch size: 18, lr: 1.77e-04 2022-05-07 15:14:51,103 INFO [train.py:715] 
(0/8) Epoch 12, batch 34350, loss[loss=0.1151, simple_loss=0.187, pruned_loss=0.02157, over 4975.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2101, pruned_loss=0.03086, over 972207.35 frames.], batch size: 14, lr: 1.77e-04 2022-05-07 15:15:28,880 INFO [train.py:715] (0/8) Epoch 12, batch 34400, loss[loss=0.1703, simple_loss=0.2329, pruned_loss=0.05392, over 4969.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2098, pruned_loss=0.03099, over 971997.68 frames.], batch size: 35, lr: 1.77e-04 2022-05-07 15:16:07,248 INFO [train.py:715] (0/8) Epoch 12, batch 34450, loss[loss=0.1284, simple_loss=0.1988, pruned_loss=0.02898, over 4649.00 frames.], tot_loss[loss=0.136, simple_loss=0.2099, pruned_loss=0.03104, over 971945.85 frames.], batch size: 13, lr: 1.77e-04 2022-05-07 15:16:45,347 INFO [train.py:715] (0/8) Epoch 12, batch 34500, loss[loss=0.1807, simple_loss=0.2479, pruned_loss=0.05681, over 4803.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2115, pruned_loss=0.03192, over 972134.84 frames.], batch size: 21, lr: 1.77e-04 2022-05-07 15:17:23,593 INFO [train.py:715] (0/8) Epoch 12, batch 34550, loss[loss=0.1716, simple_loss=0.2432, pruned_loss=0.04996, over 4856.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2119, pruned_loss=0.03218, over 971747.67 frames.], batch size: 20, lr: 1.77e-04 2022-05-07 15:18:02,259 INFO [train.py:715] (0/8) Epoch 12, batch 34600, loss[loss=0.1148, simple_loss=0.1876, pruned_loss=0.02102, over 4798.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2112, pruned_loss=0.03181, over 971867.00 frames.], batch size: 21, lr: 1.77e-04 2022-05-07 15:18:41,655 INFO [train.py:715] (0/8) Epoch 12, batch 34650, loss[loss=0.1585, simple_loss=0.2425, pruned_loss=0.03721, over 4811.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2117, pruned_loss=0.03227, over 972026.46 frames.], batch size: 25, lr: 1.77e-04 2022-05-07 15:19:21,039 INFO [train.py:715] (0/8) Epoch 12, batch 34700, loss[loss=0.1394, simple_loss=0.214, pruned_loss=0.03245, over 4867.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2114, pruned_loss=0.03224, over 972468.00 frames.], batch size: 38, lr: 1.77e-04 2022-05-07 15:19:58,679 INFO [train.py:715] (0/8) Epoch 12, batch 34750, loss[loss=0.118, simple_loss=0.1934, pruned_loss=0.02127, over 4807.00 frames.], tot_loss[loss=0.1385, simple_loss=0.212, pruned_loss=0.03245, over 972598.88 frames.], batch size: 26, lr: 1.77e-04 2022-05-07 15:20:34,678 INFO [train.py:715] (0/8) Epoch 12, batch 34800, loss[loss=0.1243, simple_loss=0.2045, pruned_loss=0.02202, over 4758.00 frames.], tot_loss[loss=0.139, simple_loss=0.2123, pruned_loss=0.03285, over 971767.69 frames.], batch size: 14, lr: 1.77e-04 2022-05-07 15:20:44,105 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/epoch-12.pt 2022-05-07 15:21:23,125 INFO [train.py:715] (0/8) Epoch 13, batch 0, loss[loss=0.1099, simple_loss=0.1879, pruned_loss=0.01596, over 4971.00 frames.], tot_loss[loss=0.1099, simple_loss=0.1879, pruned_loss=0.01596, over 4971.00 frames.], batch size: 25, lr: 1.71e-04 2022-05-07 15:22:01,151 INFO [train.py:715] (0/8) Epoch 13, batch 50, loss[loss=0.1404, simple_loss=0.2158, pruned_loss=0.03255, over 4871.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2083, pruned_loss=0.0307, over 219787.19 frames.], batch size: 30, lr: 1.71e-04 2022-05-07 15:22:39,462 INFO [train.py:715] (0/8) Epoch 13, batch 100, loss[loss=0.1133, simple_loss=0.1791, pruned_loss=0.02381, over 4654.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2085, pruned_loss=0.03043, over 
387099.54 frames.], batch size: 13, lr: 1.71e-04 2022-05-07 15:23:17,854 INFO [train.py:715] (0/8) Epoch 13, batch 150, loss[loss=0.1246, simple_loss=0.1979, pruned_loss=0.02565, over 4732.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2077, pruned_loss=0.03003, over 516431.81 frames.], batch size: 12, lr: 1.71e-04 2022-05-07 15:23:57,323 INFO [train.py:715] (0/8) Epoch 13, batch 200, loss[loss=0.1488, simple_loss=0.2169, pruned_loss=0.04036, over 4963.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2084, pruned_loss=0.0303, over 617811.27 frames.], batch size: 35, lr: 1.71e-04 2022-05-07 15:24:35,733 INFO [train.py:715] (0/8) Epoch 13, batch 250, loss[loss=0.1396, simple_loss=0.2267, pruned_loss=0.02628, over 4812.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2083, pruned_loss=0.03011, over 696700.01 frames.], batch size: 25, lr: 1.71e-04 2022-05-07 15:25:15,228 INFO [train.py:715] (0/8) Epoch 13, batch 300, loss[loss=0.1487, simple_loss=0.2264, pruned_loss=0.0355, over 4883.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2091, pruned_loss=0.03069, over 757712.49 frames.], batch size: 20, lr: 1.71e-04 2022-05-07 15:25:53,990 INFO [train.py:715] (0/8) Epoch 13, batch 350, loss[loss=0.1422, simple_loss=0.2053, pruned_loss=0.03955, over 4774.00 frames.], tot_loss[loss=0.134, simple_loss=0.2078, pruned_loss=0.03008, over 805434.47 frames.], batch size: 14, lr: 1.71e-04 2022-05-07 15:26:33,530 INFO [train.py:715] (0/8) Epoch 13, batch 400, loss[loss=0.1812, simple_loss=0.2425, pruned_loss=0.05998, over 4773.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2079, pruned_loss=0.03056, over 842014.53 frames.], batch size: 18, lr: 1.71e-04 2022-05-07 15:27:13,021 INFO [train.py:715] (0/8) Epoch 13, batch 450, loss[loss=0.165, simple_loss=0.233, pruned_loss=0.04852, over 4966.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2087, pruned_loss=0.03082, over 871015.25 frames.], batch size: 35, lr: 1.71e-04 2022-05-07 15:27:53,172 INFO [train.py:715] (0/8) Epoch 13, batch 500, loss[loss=0.1298, simple_loss=0.2032, pruned_loss=0.02823, over 4895.00 frames.], tot_loss[loss=0.1346, simple_loss=0.208, pruned_loss=0.03056, over 891902.69 frames.], batch size: 22, lr: 1.71e-04 2022-05-07 15:28:33,642 INFO [train.py:715] (0/8) Epoch 13, batch 550, loss[loss=0.1276, simple_loss=0.206, pruned_loss=0.02459, over 4908.00 frames.], tot_loss[loss=0.1358, simple_loss=0.209, pruned_loss=0.03128, over 908832.15 frames.], batch size: 39, lr: 1.71e-04 2022-05-07 15:29:12,909 INFO [train.py:715] (0/8) Epoch 13, batch 600, loss[loss=0.1753, simple_loss=0.2458, pruned_loss=0.05235, over 4754.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2091, pruned_loss=0.03119, over 922811.62 frames.], batch size: 19, lr: 1.71e-04 2022-05-07 15:29:53,383 INFO [train.py:715] (0/8) Epoch 13, batch 650, loss[loss=0.162, simple_loss=0.2327, pruned_loss=0.04563, over 4953.00 frames.], tot_loss[loss=0.1355, simple_loss=0.209, pruned_loss=0.03104, over 934093.67 frames.], batch size: 21, lr: 1.71e-04 2022-05-07 15:30:33,364 INFO [train.py:715] (0/8) Epoch 13, batch 700, loss[loss=0.1317, simple_loss=0.1909, pruned_loss=0.03628, over 4841.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2089, pruned_loss=0.03109, over 942206.18 frames.], batch size: 13, lr: 1.71e-04 2022-05-07 15:31:13,977 INFO [train.py:715] (0/8) Epoch 13, batch 750, loss[loss=0.1106, simple_loss=0.1781, pruned_loss=0.02159, over 4954.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2098, pruned_loss=0.03154, over 948367.49 frames.], batch size: 21, lr: 1.71e-04 
2022-05-07 15:31:53,292 INFO [train.py:715] (0/8) Epoch 13, batch 800, loss[loss=0.117, simple_loss=0.1904, pruned_loss=0.02175, over 4983.00 frames.], tot_loss[loss=0.136, simple_loss=0.2092, pruned_loss=0.03146, over 954284.27 frames.], batch size: 35, lr: 1.71e-04 2022-05-07 15:32:32,561 INFO [train.py:715] (0/8) Epoch 13, batch 850, loss[loss=0.1009, simple_loss=0.1719, pruned_loss=0.01498, over 4788.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2095, pruned_loss=0.03161, over 957651.71 frames.], batch size: 12, lr: 1.71e-04 2022-05-07 15:33:12,796 INFO [train.py:715] (0/8) Epoch 13, batch 900, loss[loss=0.1674, simple_loss=0.2465, pruned_loss=0.04419, over 4978.00 frames.], tot_loss[loss=0.1367, simple_loss=0.21, pruned_loss=0.03166, over 960993.63 frames.], batch size: 15, lr: 1.71e-04 2022-05-07 15:33:52,195 INFO [train.py:715] (0/8) Epoch 13, batch 950, loss[loss=0.135, simple_loss=0.2146, pruned_loss=0.02768, over 4965.00 frames.], tot_loss[loss=0.136, simple_loss=0.2096, pruned_loss=0.03119, over 964389.79 frames.], batch size: 24, lr: 1.71e-04 2022-05-07 15:34:32,777 INFO [train.py:715] (0/8) Epoch 13, batch 1000, loss[loss=0.1436, simple_loss=0.2226, pruned_loss=0.03228, over 4794.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2098, pruned_loss=0.03158, over 965963.69 frames.], batch size: 24, lr: 1.71e-04 2022-05-07 15:35:12,235 INFO [train.py:715] (0/8) Epoch 13, batch 1050, loss[loss=0.1621, simple_loss=0.2418, pruned_loss=0.04118, over 4855.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2106, pruned_loss=0.03176, over 967135.06 frames.], batch size: 32, lr: 1.71e-04 2022-05-07 15:35:52,552 INFO [train.py:715] (0/8) Epoch 13, batch 1100, loss[loss=0.1441, simple_loss=0.2184, pruned_loss=0.0349, over 4908.00 frames.], tot_loss[loss=0.1365, simple_loss=0.21, pruned_loss=0.03148, over 968377.59 frames.], batch size: 17, lr: 1.71e-04 2022-05-07 15:36:32,011 INFO [train.py:715] (0/8) Epoch 13, batch 1150, loss[loss=0.1482, simple_loss=0.2332, pruned_loss=0.03162, over 4939.00 frames.], tot_loss[loss=0.137, simple_loss=0.2108, pruned_loss=0.03163, over 968717.23 frames.], batch size: 29, lr: 1.71e-04 2022-05-07 15:37:11,814 INFO [train.py:715] (0/8) Epoch 13, batch 1200, loss[loss=0.1315, simple_loss=0.188, pruned_loss=0.03753, over 4895.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2101, pruned_loss=0.03165, over 968962.86 frames.], batch size: 17, lr: 1.71e-04 2022-05-07 15:37:52,135 INFO [train.py:715] (0/8) Epoch 13, batch 1250, loss[loss=0.1471, simple_loss=0.2153, pruned_loss=0.03943, over 4944.00 frames.], tot_loss[loss=0.1371, simple_loss=0.21, pruned_loss=0.03204, over 971020.93 frames.], batch size: 29, lr: 1.71e-04 2022-05-07 15:38:31,094 INFO [train.py:715] (0/8) Epoch 13, batch 1300, loss[loss=0.12, simple_loss=0.1996, pruned_loss=0.02023, over 4945.00 frames.], tot_loss[loss=0.137, simple_loss=0.21, pruned_loss=0.03199, over 971270.63 frames.], batch size: 21, lr: 1.71e-04 2022-05-07 15:39:11,006 INFO [train.py:715] (0/8) Epoch 13, batch 1350, loss[loss=0.1498, simple_loss=0.2161, pruned_loss=0.04175, over 4938.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2107, pruned_loss=0.03244, over 971420.79 frames.], batch size: 21, lr: 1.71e-04 2022-05-07 15:39:49,770 INFO [train.py:715] (0/8) Epoch 13, batch 1400, loss[loss=0.1307, simple_loss=0.2041, pruned_loss=0.02868, over 4735.00 frames.], tot_loss[loss=0.1371, simple_loss=0.21, pruned_loss=0.03209, over 971865.94 frames.], batch size: 16, lr: 1.71e-04 2022-05-07 15:40:28,857 INFO [train.py:715] (0/8) Epoch 
13, batch 1450, loss[loss=0.1479, simple_loss=0.2167, pruned_loss=0.03952, over 4899.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2099, pruned_loss=0.03166, over 972538.61 frames.], batch size: 19, lr: 1.71e-04 2022-05-07 15:41:06,532 INFO [train.py:715] (0/8) Epoch 13, batch 1500, loss[loss=0.1431, simple_loss=0.2231, pruned_loss=0.03149, over 4898.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2101, pruned_loss=0.0317, over 972397.77 frames.], batch size: 19, lr: 1.71e-04 2022-05-07 15:41:44,151 INFO [train.py:715] (0/8) Epoch 13, batch 1550, loss[loss=0.1298, simple_loss=0.2003, pruned_loss=0.02967, over 4882.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2099, pruned_loss=0.0317, over 973501.05 frames.], batch size: 32, lr: 1.71e-04 2022-05-07 15:42:22,720 INFO [train.py:715] (0/8) Epoch 13, batch 1600, loss[loss=0.1371, simple_loss=0.213, pruned_loss=0.03059, over 4921.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2094, pruned_loss=0.03171, over 973629.31 frames.], batch size: 21, lr: 1.71e-04 2022-05-07 15:43:00,637 INFO [train.py:715] (0/8) Epoch 13, batch 1650, loss[loss=0.1347, simple_loss=0.2107, pruned_loss=0.02935, over 4809.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2087, pruned_loss=0.03142, over 972698.82 frames.], batch size: 27, lr: 1.71e-04 2022-05-07 15:43:39,375 INFO [train.py:715] (0/8) Epoch 13, batch 1700, loss[loss=0.1299, simple_loss=0.2085, pruned_loss=0.02562, over 4960.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2093, pruned_loss=0.03151, over 973031.24 frames.], batch size: 15, lr: 1.71e-04 2022-05-07 15:44:17,659 INFO [train.py:715] (0/8) Epoch 13, batch 1750, loss[loss=0.1216, simple_loss=0.1927, pruned_loss=0.02523, over 4918.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2083, pruned_loss=0.03117, over 972318.82 frames.], batch size: 23, lr: 1.71e-04 2022-05-07 15:44:57,087 INFO [train.py:715] (0/8) Epoch 13, batch 1800, loss[loss=0.1283, simple_loss=0.203, pruned_loss=0.02678, over 4944.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2082, pruned_loss=0.03115, over 971460.33 frames.], batch size: 21, lr: 1.71e-04 2022-05-07 15:45:35,167 INFO [train.py:715] (0/8) Epoch 13, batch 1850, loss[loss=0.1556, simple_loss=0.2294, pruned_loss=0.04087, over 4882.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2083, pruned_loss=0.03121, over 971766.94 frames.], batch size: 39, lr: 1.71e-04 2022-05-07 15:46:13,427 INFO [train.py:715] (0/8) Epoch 13, batch 1900, loss[loss=0.143, simple_loss=0.223, pruned_loss=0.03151, over 4813.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2093, pruned_loss=0.03184, over 973622.84 frames.], batch size: 25, lr: 1.71e-04 2022-05-07 15:46:52,086 INFO [train.py:715] (0/8) Epoch 13, batch 1950, loss[loss=0.1677, simple_loss=0.2298, pruned_loss=0.05275, over 4872.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2099, pruned_loss=0.03178, over 973747.67 frames.], batch size: 16, lr: 1.71e-04 2022-05-07 15:47:30,462 INFO [train.py:715] (0/8) Epoch 13, batch 2000, loss[loss=0.1358, simple_loss=0.2114, pruned_loss=0.03013, over 4836.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2104, pruned_loss=0.03171, over 973673.76 frames.], batch size: 26, lr: 1.71e-04 2022-05-07 15:48:09,015 INFO [train.py:715] (0/8) Epoch 13, batch 2050, loss[loss=0.1734, simple_loss=0.2401, pruned_loss=0.05335, over 4843.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2108, pruned_loss=0.03193, over 973334.65 frames.], batch size: 13, lr: 1.71e-04 2022-05-07 15:48:47,028 INFO [train.py:715] (0/8) Epoch 13, batch 2100, loss[loss=0.1365, 
simple_loss=0.2111, pruned_loss=0.03092, over 4865.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2102, pruned_loss=0.03167, over 973128.25 frames.], batch size: 16, lr: 1.71e-04 2022-05-07 15:49:26,185 INFO [train.py:715] (0/8) Epoch 13, batch 2150, loss[loss=0.1156, simple_loss=0.1844, pruned_loss=0.02339, over 4855.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2102, pruned_loss=0.0314, over 972759.15 frames.], batch size: 12, lr: 1.71e-04 2022-05-07 15:50:04,053 INFO [train.py:715] (0/8) Epoch 13, batch 2200, loss[loss=0.1599, simple_loss=0.2288, pruned_loss=0.04552, over 4747.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2097, pruned_loss=0.03156, over 973061.99 frames.], batch size: 19, lr: 1.71e-04 2022-05-07 15:50:42,239 INFO [train.py:715] (0/8) Epoch 13, batch 2250, loss[loss=0.1537, simple_loss=0.2273, pruned_loss=0.04003, over 4725.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2097, pruned_loss=0.03189, over 972724.32 frames.], batch size: 15, lr: 1.71e-04 2022-05-07 15:51:20,489 INFO [train.py:715] (0/8) Epoch 13, batch 2300, loss[loss=0.1684, simple_loss=0.2398, pruned_loss=0.04853, over 4867.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2102, pruned_loss=0.03183, over 973142.62 frames.], batch size: 16, lr: 1.71e-04 2022-05-07 15:51:59,641 INFO [train.py:715] (0/8) Epoch 13, batch 2350, loss[loss=0.1382, simple_loss=0.2184, pruned_loss=0.02897, over 4818.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2093, pruned_loss=0.03124, over 973340.15 frames.], batch size: 25, lr: 1.71e-04 2022-05-07 15:52:38,006 INFO [train.py:715] (0/8) Epoch 13, batch 2400, loss[loss=0.1142, simple_loss=0.1933, pruned_loss=0.01749, over 4937.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2089, pruned_loss=0.03113, over 973002.03 frames.], batch size: 18, lr: 1.71e-04 2022-05-07 15:53:16,743 INFO [train.py:715] (0/8) Epoch 13, batch 2450, loss[loss=0.1329, simple_loss=0.2072, pruned_loss=0.02934, over 4915.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2102, pruned_loss=0.03176, over 973917.69 frames.], batch size: 39, lr: 1.71e-04 2022-05-07 15:53:55,649 INFO [train.py:715] (0/8) Epoch 13, batch 2500, loss[loss=0.1392, simple_loss=0.2025, pruned_loss=0.03798, over 4855.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2099, pruned_loss=0.03164, over 973394.04 frames.], batch size: 13, lr: 1.71e-04 2022-05-07 15:54:34,059 INFO [train.py:715] (0/8) Epoch 13, batch 2550, loss[loss=0.1427, simple_loss=0.2004, pruned_loss=0.04247, over 4926.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2112, pruned_loss=0.03214, over 972682.50 frames.], batch size: 29, lr: 1.71e-04 2022-05-07 15:55:12,158 INFO [train.py:715] (0/8) Epoch 13, batch 2600, loss[loss=0.1679, simple_loss=0.2297, pruned_loss=0.05306, over 4976.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2113, pruned_loss=0.03239, over 972188.86 frames.], batch size: 24, lr: 1.71e-04 2022-05-07 15:55:50,580 INFO [train.py:715] (0/8) Epoch 13, batch 2650, loss[loss=0.1388, simple_loss=0.2176, pruned_loss=0.02999, over 4697.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2107, pruned_loss=0.03195, over 972156.30 frames.], batch size: 15, lr: 1.71e-04 2022-05-07 15:56:28,662 INFO [train.py:715] (0/8) Epoch 13, batch 2700, loss[loss=0.1518, simple_loss=0.2089, pruned_loss=0.04739, over 4973.00 frames.], tot_loss[loss=0.138, simple_loss=0.2116, pruned_loss=0.03227, over 972265.68 frames.], batch size: 14, lr: 1.70e-04 2022-05-07 15:57:06,437 INFO [train.py:715] (0/8) Epoch 13, batch 2750, loss[loss=0.1291, simple_loss=0.2084, 
pruned_loss=0.02492, over 4856.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2118, pruned_loss=0.03195, over 971811.89 frames.], batch size: 20, lr: 1.70e-04 2022-05-07 15:57:43,966 INFO [train.py:715] (0/8) Epoch 13, batch 2800, loss[loss=0.1408, simple_loss=0.2264, pruned_loss=0.0276, over 4835.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2115, pruned_loss=0.03164, over 971855.25 frames.], batch size: 15, lr: 1.70e-04 2022-05-07 15:58:22,561 INFO [train.py:715] (0/8) Epoch 13, batch 2850, loss[loss=0.1513, simple_loss=0.2303, pruned_loss=0.03613, over 4854.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2114, pruned_loss=0.03158, over 971905.88 frames.], batch size: 20, lr: 1.70e-04 2022-05-07 15:59:00,067 INFO [train.py:715] (0/8) Epoch 13, batch 2900, loss[loss=0.1922, simple_loss=0.2626, pruned_loss=0.06092, over 4846.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2111, pruned_loss=0.03176, over 972274.96 frames.], batch size: 15, lr: 1.70e-04 2022-05-07 15:59:37,964 INFO [train.py:715] (0/8) Epoch 13, batch 2950, loss[loss=0.1345, simple_loss=0.2158, pruned_loss=0.02658, over 4871.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2108, pruned_loss=0.03144, over 972551.86 frames.], batch size: 22, lr: 1.70e-04 2022-05-07 16:00:15,985 INFO [train.py:715] (0/8) Epoch 13, batch 3000, loss[loss=0.1642, simple_loss=0.2392, pruned_loss=0.04458, over 4878.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2111, pruned_loss=0.03173, over 972773.94 frames.], batch size: 16, lr: 1.70e-04 2022-05-07 16:00:15,987 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 16:00:25,446 INFO [train.py:742] (0/8) Epoch 13, validation: loss=0.1052, simple_loss=0.1893, pruned_loss=0.01058, over 914524.00 frames. 2022-05-07 16:01:03,669 INFO [train.py:715] (0/8) Epoch 13, batch 3050, loss[loss=0.1104, simple_loss=0.1896, pruned_loss=0.01559, over 4832.00 frames.], tot_loss[loss=0.136, simple_loss=0.2101, pruned_loss=0.03094, over 972832.89 frames.], batch size: 12, lr: 1.70e-04 2022-05-07 16:01:42,200 INFO [train.py:715] (0/8) Epoch 13, batch 3100, loss[loss=0.1413, simple_loss=0.2168, pruned_loss=0.03288, over 4948.00 frames.], tot_loss[loss=0.137, simple_loss=0.2108, pruned_loss=0.03158, over 972013.89 frames.], batch size: 23, lr: 1.70e-04 2022-05-07 16:02:19,744 INFO [train.py:715] (0/8) Epoch 13, batch 3150, loss[loss=0.1453, simple_loss=0.2224, pruned_loss=0.03405, over 4970.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2114, pruned_loss=0.03182, over 971755.17 frames.], batch size: 24, lr: 1.70e-04 2022-05-07 16:02:57,073 INFO [train.py:715] (0/8) Epoch 13, batch 3200, loss[loss=0.1516, simple_loss=0.2369, pruned_loss=0.03319, over 4916.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2114, pruned_loss=0.03181, over 971817.66 frames.], batch size: 18, lr: 1.70e-04 2022-05-07 16:03:35,541 INFO [train.py:715] (0/8) Epoch 13, batch 3250, loss[loss=0.139, simple_loss=0.2214, pruned_loss=0.02835, over 4805.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2117, pruned_loss=0.03168, over 972322.02 frames.], batch size: 21, lr: 1.70e-04 2022-05-07 16:04:13,561 INFO [train.py:715] (0/8) Epoch 13, batch 3300, loss[loss=0.1359, simple_loss=0.2228, pruned_loss=0.02447, over 4976.00 frames.], tot_loss[loss=0.1366, simple_loss=0.211, pruned_loss=0.03115, over 972431.74 frames.], batch size: 24, lr: 1.70e-04 2022-05-07 16:04:51,385 INFO [train.py:715] (0/8) Epoch 13, batch 3350, loss[loss=0.1353, simple_loss=0.2174, pruned_loss=0.02656, over 4924.00 frames.], tot_loss[loss=0.1364, 
simple_loss=0.2104, pruned_loss=0.03121, over 972218.94 frames.], batch size: 18, lr: 1.70e-04 2022-05-07 16:05:29,074 INFO [train.py:715] (0/8) Epoch 13, batch 3400, loss[loss=0.1551, simple_loss=0.2185, pruned_loss=0.04579, over 4974.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2099, pruned_loss=0.03146, over 971896.07 frames.], batch size: 15, lr: 1.70e-04 2022-05-07 16:06:07,368 INFO [train.py:715] (0/8) Epoch 13, batch 3450, loss[loss=0.1392, simple_loss=0.2044, pruned_loss=0.03704, over 4845.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2105, pruned_loss=0.03161, over 971833.25 frames.], batch size: 15, lr: 1.70e-04 2022-05-07 16:06:27,354 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-456000.pt 2022-05-07 16:06:47,669 INFO [train.py:715] (0/8) Epoch 13, batch 3500, loss[loss=0.1299, simple_loss=0.1927, pruned_loss=0.03357, over 4985.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2106, pruned_loss=0.03203, over 971345.70 frames.], batch size: 24, lr: 1.70e-04 2022-05-07 16:07:25,029 INFO [train.py:715] (0/8) Epoch 13, batch 3550, loss[loss=0.1481, simple_loss=0.2277, pruned_loss=0.0342, over 4858.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2091, pruned_loss=0.03129, over 971861.47 frames.], batch size: 20, lr: 1.70e-04 2022-05-07 16:08:03,488 INFO [train.py:715] (0/8) Epoch 13, batch 3600, loss[loss=0.1554, simple_loss=0.2342, pruned_loss=0.03826, over 4776.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2088, pruned_loss=0.03106, over 972180.58 frames.], batch size: 18, lr: 1.70e-04 2022-05-07 16:08:41,275 INFO [train.py:715] (0/8) Epoch 13, batch 3650, loss[loss=0.1476, simple_loss=0.2285, pruned_loss=0.0333, over 4806.00 frames.], tot_loss[loss=0.1357, simple_loss=0.209, pruned_loss=0.03121, over 972040.44 frames.], batch size: 24, lr: 1.70e-04 2022-05-07 16:09:18,849 INFO [train.py:715] (0/8) Epoch 13, batch 3700, loss[loss=0.1113, simple_loss=0.1803, pruned_loss=0.0212, over 4850.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2086, pruned_loss=0.03097, over 972342.48 frames.], batch size: 13, lr: 1.70e-04 2022-05-07 16:09:56,566 INFO [train.py:715] (0/8) Epoch 13, batch 3750, loss[loss=0.1056, simple_loss=0.1808, pruned_loss=0.01525, over 4837.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2081, pruned_loss=0.0306, over 971986.80 frames.], batch size: 13, lr: 1.70e-04 2022-05-07 16:10:34,800 INFO [train.py:715] (0/8) Epoch 13, batch 3800, loss[loss=0.1096, simple_loss=0.1758, pruned_loss=0.02167, over 4932.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2089, pruned_loss=0.03133, over 972876.48 frames.], batch size: 29, lr: 1.70e-04 2022-05-07 16:11:11,944 INFO [train.py:715] (0/8) Epoch 13, batch 3850, loss[loss=0.1331, simple_loss=0.2115, pruned_loss=0.0274, over 4886.00 frames.], tot_loss[loss=0.135, simple_loss=0.2082, pruned_loss=0.03091, over 972195.13 frames.], batch size: 22, lr: 1.70e-04 2022-05-07 16:11:49,241 INFO [train.py:715] (0/8) Epoch 13, batch 3900, loss[loss=0.154, simple_loss=0.2323, pruned_loss=0.03781, over 4841.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2086, pruned_loss=0.0311, over 972194.73 frames.], batch size: 15, lr: 1.70e-04 2022-05-07 16:12:27,128 INFO [train.py:715] (0/8) Epoch 13, batch 3950, loss[loss=0.1345, simple_loss=0.212, pruned_loss=0.02846, over 4740.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2087, pruned_loss=0.0308, over 971859.96 frames.], batch size: 16, lr: 1.70e-04 2022-05-07 16:13:05,297 INFO [train.py:715] (0/8) Epoch 13, batch 4000, 
loss[loss=0.1722, simple_loss=0.2483, pruned_loss=0.04807, over 4947.00 frames.], tot_loss[loss=0.136, simple_loss=0.2094, pruned_loss=0.03126, over 972234.34 frames.], batch size: 24, lr: 1.70e-04 2022-05-07 16:13:42,995 INFO [train.py:715] (0/8) Epoch 13, batch 4050, loss[loss=0.148, simple_loss=0.2163, pruned_loss=0.0399, over 4950.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2083, pruned_loss=0.03043, over 972634.56 frames.], batch size: 29, lr: 1.70e-04 2022-05-07 16:14:20,642 INFO [train.py:715] (0/8) Epoch 13, batch 4100, loss[loss=0.1194, simple_loss=0.2016, pruned_loss=0.01856, over 4825.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2084, pruned_loss=0.03045, over 972801.03 frames.], batch size: 26, lr: 1.70e-04 2022-05-07 16:14:59,182 INFO [train.py:715] (0/8) Epoch 13, batch 4150, loss[loss=0.1345, simple_loss=0.2024, pruned_loss=0.0333, over 4787.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2079, pruned_loss=0.03048, over 971621.02 frames.], batch size: 17, lr: 1.70e-04 2022-05-07 16:15:36,525 INFO [train.py:715] (0/8) Epoch 13, batch 4200, loss[loss=0.1516, simple_loss=0.2328, pruned_loss=0.03518, over 4820.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2088, pruned_loss=0.03066, over 971523.93 frames.], batch size: 27, lr: 1.70e-04 2022-05-07 16:16:14,499 INFO [train.py:715] (0/8) Epoch 13, batch 4250, loss[loss=0.1412, simple_loss=0.2204, pruned_loss=0.03097, over 4809.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2093, pruned_loss=0.03087, over 971653.43 frames.], batch size: 26, lr: 1.70e-04 2022-05-07 16:16:52,596 INFO [train.py:715] (0/8) Epoch 13, batch 4300, loss[loss=0.1316, simple_loss=0.2092, pruned_loss=0.02697, over 4899.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2093, pruned_loss=0.03114, over 970964.75 frames.], batch size: 19, lr: 1.70e-04 2022-05-07 16:17:30,602 INFO [train.py:715] (0/8) Epoch 13, batch 4350, loss[loss=0.123, simple_loss=0.1919, pruned_loss=0.02707, over 4841.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2092, pruned_loss=0.0312, over 971239.39 frames.], batch size: 15, lr: 1.70e-04 2022-05-07 16:18:08,275 INFO [train.py:715] (0/8) Epoch 13, batch 4400, loss[loss=0.1305, simple_loss=0.2034, pruned_loss=0.02883, over 4953.00 frames.], tot_loss[loss=0.136, simple_loss=0.2094, pruned_loss=0.03131, over 970930.66 frames.], batch size: 39, lr: 1.70e-04 2022-05-07 16:18:46,443 INFO [train.py:715] (0/8) Epoch 13, batch 4450, loss[loss=0.1028, simple_loss=0.1742, pruned_loss=0.01567, over 4857.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2096, pruned_loss=0.0313, over 971008.37 frames.], batch size: 13, lr: 1.70e-04 2022-05-07 16:19:25,667 INFO [train.py:715] (0/8) Epoch 13, batch 4500, loss[loss=0.1237, simple_loss=0.1948, pruned_loss=0.02631, over 4767.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2102, pruned_loss=0.0315, over 971220.32 frames.], batch size: 19, lr: 1.70e-04 2022-05-07 16:20:03,828 INFO [train.py:715] (0/8) Epoch 13, batch 4550, loss[loss=0.1349, simple_loss=0.2001, pruned_loss=0.03485, over 4789.00 frames.], tot_loss[loss=0.1363, simple_loss=0.21, pruned_loss=0.03131, over 971993.56 frames.], batch size: 17, lr: 1.70e-04 2022-05-07 16:20:40,812 INFO [train.py:715] (0/8) Epoch 13, batch 4600, loss[loss=0.1127, simple_loss=0.1839, pruned_loss=0.02074, over 4773.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2105, pruned_loss=0.03148, over 971392.09 frames.], batch size: 18, lr: 1.70e-04 2022-05-07 16:21:19,530 INFO [train.py:715] (0/8) Epoch 13, batch 4650, loss[loss=0.1426, simple_loss=0.2105, 
pruned_loss=0.03741, over 4934.00 frames.], tot_loss[loss=0.1366, simple_loss=0.21, pruned_loss=0.03158, over 972385.77 frames.], batch size: 21, lr: 1.70e-04 2022-05-07 16:21:57,437 INFO [train.py:715] (0/8) Epoch 13, batch 4700, loss[loss=0.1384, simple_loss=0.2052, pruned_loss=0.0358, over 4813.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2105, pruned_loss=0.03191, over 973118.29 frames.], batch size: 13, lr: 1.70e-04 2022-05-07 16:22:35,609 INFO [train.py:715] (0/8) Epoch 13, batch 4750, loss[loss=0.1436, simple_loss=0.224, pruned_loss=0.03156, over 4776.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2109, pruned_loss=0.03184, over 972981.86 frames.], batch size: 14, lr: 1.70e-04 2022-05-07 16:23:13,888 INFO [train.py:715] (0/8) Epoch 13, batch 4800, loss[loss=0.1897, simple_loss=0.2648, pruned_loss=0.05727, over 4971.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2108, pruned_loss=0.03188, over 972546.98 frames.], batch size: 15, lr: 1.70e-04 2022-05-07 16:23:53,164 INFO [train.py:715] (0/8) Epoch 13, batch 4850, loss[loss=0.1873, simple_loss=0.2525, pruned_loss=0.061, over 4910.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2111, pruned_loss=0.03215, over 973163.05 frames.], batch size: 18, lr: 1.70e-04 2022-05-07 16:24:31,287 INFO [train.py:715] (0/8) Epoch 13, batch 4900, loss[loss=0.1349, simple_loss=0.2105, pruned_loss=0.02963, over 4981.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2108, pruned_loss=0.03185, over 973498.15 frames.], batch size: 15, lr: 1.70e-04 2022-05-07 16:25:10,146 INFO [train.py:715] (0/8) Epoch 13, batch 4950, loss[loss=0.1624, simple_loss=0.2362, pruned_loss=0.04436, over 4878.00 frames.], tot_loss[loss=0.1374, simple_loss=0.211, pruned_loss=0.03194, over 972829.71 frames.], batch size: 39, lr: 1.70e-04 2022-05-07 16:25:49,559 INFO [train.py:715] (0/8) Epoch 13, batch 5000, loss[loss=0.1564, simple_loss=0.2197, pruned_loss=0.04658, over 4742.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2103, pruned_loss=0.03158, over 972493.75 frames.], batch size: 16, lr: 1.70e-04 2022-05-07 16:26:28,895 INFO [train.py:715] (0/8) Epoch 13, batch 5050, loss[loss=0.1293, simple_loss=0.2119, pruned_loss=0.02335, over 4887.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2103, pruned_loss=0.0315, over 972909.67 frames.], batch size: 22, lr: 1.70e-04 2022-05-07 16:27:07,529 INFO [train.py:715] (0/8) Epoch 13, batch 5100, loss[loss=0.1044, simple_loss=0.1755, pruned_loss=0.01665, over 4882.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2099, pruned_loss=0.03179, over 972851.00 frames.], batch size: 16, lr: 1.70e-04 2022-05-07 16:27:46,961 INFO [train.py:715] (0/8) Epoch 13, batch 5150, loss[loss=0.1254, simple_loss=0.1985, pruned_loss=0.02614, over 4816.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2096, pruned_loss=0.03141, over 972775.85 frames.], batch size: 26, lr: 1.70e-04 2022-05-07 16:28:26,705 INFO [train.py:715] (0/8) Epoch 13, batch 5200, loss[loss=0.1273, simple_loss=0.2101, pruned_loss=0.02225, over 4897.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2095, pruned_loss=0.031, over 971854.48 frames.], batch size: 22, lr: 1.70e-04 2022-05-07 16:29:06,547 INFO [train.py:715] (0/8) Epoch 13, batch 5250, loss[loss=0.1274, simple_loss=0.2044, pruned_loss=0.02524, over 4820.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2096, pruned_loss=0.03094, over 972786.25 frames.], batch size: 27, lr: 1.70e-04 2022-05-07 16:29:45,228 INFO [train.py:715] (0/8) Epoch 13, batch 5300, loss[loss=0.1425, simple_loss=0.2126, pruned_loss=0.03622, over 4988.00 
frames.], tot_loss[loss=0.1355, simple_loss=0.2091, pruned_loss=0.03091, over 972203.05 frames.], batch size: 28, lr: 1.70e-04 2022-05-07 16:30:25,369 INFO [train.py:715] (0/8) Epoch 13, batch 5350, loss[loss=0.1509, simple_loss=0.2299, pruned_loss=0.03596, over 4750.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2085, pruned_loss=0.03067, over 972740.67 frames.], batch size: 19, lr: 1.70e-04 2022-05-07 16:31:05,470 INFO [train.py:715] (0/8) Epoch 13, batch 5400, loss[loss=0.1169, simple_loss=0.1868, pruned_loss=0.02346, over 4978.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2085, pruned_loss=0.03053, over 973432.22 frames.], batch size: 35, lr: 1.70e-04 2022-05-07 16:31:45,403 INFO [train.py:715] (0/8) Epoch 13, batch 5450, loss[loss=0.1376, simple_loss=0.2134, pruned_loss=0.03097, over 4810.00 frames.], tot_loss[loss=0.135, simple_loss=0.2086, pruned_loss=0.0307, over 973340.28 frames.], batch size: 21, lr: 1.70e-04 2022-05-07 16:32:24,987 INFO [train.py:715] (0/8) Epoch 13, batch 5500, loss[loss=0.1465, simple_loss=0.2193, pruned_loss=0.03687, over 4782.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2091, pruned_loss=0.03096, over 972542.67 frames.], batch size: 18, lr: 1.70e-04 2022-05-07 16:33:04,811 INFO [train.py:715] (0/8) Epoch 13, batch 5550, loss[loss=0.1452, simple_loss=0.2287, pruned_loss=0.03088, over 4947.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2099, pruned_loss=0.0313, over 972798.40 frames.], batch size: 24, lr: 1.70e-04 2022-05-07 16:33:44,063 INFO [train.py:715] (0/8) Epoch 13, batch 5600, loss[loss=0.1125, simple_loss=0.1753, pruned_loss=0.02489, over 4765.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2104, pruned_loss=0.03152, over 972571.37 frames.], batch size: 12, lr: 1.70e-04 2022-05-07 16:34:23,510 INFO [train.py:715] (0/8) Epoch 13, batch 5650, loss[loss=0.1197, simple_loss=0.1872, pruned_loss=0.02606, over 4795.00 frames.], tot_loss[loss=0.1366, simple_loss=0.21, pruned_loss=0.03155, over 972490.21 frames.], batch size: 12, lr: 1.70e-04 2022-05-07 16:35:03,782 INFO [train.py:715] (0/8) Epoch 13, batch 5700, loss[loss=0.1483, simple_loss=0.236, pruned_loss=0.03025, over 4771.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2103, pruned_loss=0.03132, over 972438.81 frames.], batch size: 18, lr: 1.70e-04 2022-05-07 16:35:43,895 INFO [train.py:715] (0/8) Epoch 13, batch 5750, loss[loss=0.1597, simple_loss=0.2362, pruned_loss=0.04162, over 4763.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2103, pruned_loss=0.0313, over 972753.96 frames.], batch size: 19, lr: 1.70e-04 2022-05-07 16:36:22,748 INFO [train.py:715] (0/8) Epoch 13, batch 5800, loss[loss=0.1405, simple_loss=0.2136, pruned_loss=0.03369, over 4853.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2093, pruned_loss=0.03111, over 972648.04 frames.], batch size: 32, lr: 1.70e-04 2022-05-07 16:37:02,232 INFO [train.py:715] (0/8) Epoch 13, batch 5850, loss[loss=0.1304, simple_loss=0.2098, pruned_loss=0.02546, over 4801.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2095, pruned_loss=0.03092, over 972338.00 frames.], batch size: 24, lr: 1.70e-04 2022-05-07 16:37:42,373 INFO [train.py:715] (0/8) Epoch 13, batch 5900, loss[loss=0.1505, simple_loss=0.2177, pruned_loss=0.04166, over 4780.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2097, pruned_loss=0.03107, over 972780.25 frames.], batch size: 12, lr: 1.70e-04 2022-05-07 16:38:21,731 INFO [train.py:715] (0/8) Epoch 13, batch 5950, loss[loss=0.1196, simple_loss=0.191, pruned_loss=0.02411, over 4815.00 frames.], tot_loss[loss=0.1356, 
simple_loss=0.2094, pruned_loss=0.03084, over 972809.61 frames.], batch size: 13, lr: 1.70e-04 2022-05-07 16:39:01,220 INFO [train.py:715] (0/8) Epoch 13, batch 6000, loss[loss=0.1322, simple_loss=0.2062, pruned_loss=0.02912, over 4908.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2095, pruned_loss=0.03086, over 973843.00 frames.], batch size: 17, lr: 1.70e-04 2022-05-07 16:39:01,222 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 16:39:10,779 INFO [train.py:742] (0/8) Epoch 13, validation: loss=0.1054, simple_loss=0.1893, pruned_loss=0.01078, over 914524.00 frames. 2022-05-07 16:39:50,258 INFO [train.py:715] (0/8) Epoch 13, batch 6050, loss[loss=0.1415, simple_loss=0.2097, pruned_loss=0.03668, over 4801.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2099, pruned_loss=0.0312, over 973061.76 frames.], batch size: 17, lr: 1.70e-04 2022-05-07 16:40:29,775 INFO [train.py:715] (0/8) Epoch 13, batch 6100, loss[loss=0.1509, simple_loss=0.2195, pruned_loss=0.04113, over 4989.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2103, pruned_loss=0.03132, over 972050.34 frames.], batch size: 15, lr: 1.70e-04 2022-05-07 16:41:09,339 INFO [train.py:715] (0/8) Epoch 13, batch 6150, loss[loss=0.1426, simple_loss=0.221, pruned_loss=0.03205, over 4691.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2109, pruned_loss=0.03152, over 972302.56 frames.], batch size: 15, lr: 1.70e-04 2022-05-07 16:41:47,236 INFO [train.py:715] (0/8) Epoch 13, batch 6200, loss[loss=0.1124, simple_loss=0.1774, pruned_loss=0.0237, over 4837.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2103, pruned_loss=0.03144, over 972064.61 frames.], batch size: 13, lr: 1.70e-04 2022-05-07 16:42:26,288 INFO [train.py:715] (0/8) Epoch 13, batch 6250, loss[loss=0.1239, simple_loss=0.2022, pruned_loss=0.02279, over 4707.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2101, pruned_loss=0.03142, over 971946.17 frames.], batch size: 15, lr: 1.70e-04 2022-05-07 16:43:05,821 INFO [train.py:715] (0/8) Epoch 13, batch 6300, loss[loss=0.1534, simple_loss=0.2279, pruned_loss=0.03949, over 4895.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2104, pruned_loss=0.03143, over 972656.85 frames.], batch size: 22, lr: 1.70e-04 2022-05-07 16:43:44,409 INFO [train.py:715] (0/8) Epoch 13, batch 6350, loss[loss=0.1205, simple_loss=0.196, pruned_loss=0.02246, over 4770.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2101, pruned_loss=0.03123, over 972313.90 frames.], batch size: 17, lr: 1.70e-04 2022-05-07 16:44:24,224 INFO [train.py:715] (0/8) Epoch 13, batch 6400, loss[loss=0.1468, simple_loss=0.2262, pruned_loss=0.0337, over 4889.00 frames.], tot_loss[loss=0.1362, simple_loss=0.21, pruned_loss=0.03123, over 973239.27 frames.], batch size: 19, lr: 1.70e-04 2022-05-07 16:45:04,045 INFO [train.py:715] (0/8) Epoch 13, batch 6450, loss[loss=0.147, simple_loss=0.2147, pruned_loss=0.03965, over 4952.00 frames.], tot_loss[loss=0.1364, simple_loss=0.21, pruned_loss=0.03135, over 973590.04 frames.], batch size: 21, lr: 1.70e-04 2022-05-07 16:45:44,138 INFO [train.py:715] (0/8) Epoch 13, batch 6500, loss[loss=0.1443, simple_loss=0.2273, pruned_loss=0.03064, over 4965.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2089, pruned_loss=0.03073, over 973652.51 frames.], batch size: 39, lr: 1.70e-04 2022-05-07 16:46:23,310 INFO [train.py:715] (0/8) Epoch 13, batch 6550, loss[loss=0.1246, simple_loss=0.1983, pruned_loss=0.02538, over 4937.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2085, pruned_loss=0.03049, over 973013.76 frames.], batch size: 29, lr: 
1.70e-04 2022-05-07 16:47:02,646 INFO [train.py:715] (0/8) Epoch 13, batch 6600, loss[loss=0.1305, simple_loss=0.2051, pruned_loss=0.0279, over 4843.00 frames.], tot_loss[loss=0.1351, simple_loss=0.209, pruned_loss=0.03064, over 973317.94 frames.], batch size: 15, lr: 1.70e-04 2022-05-07 16:47:42,040 INFO [train.py:715] (0/8) Epoch 13, batch 6650, loss[loss=0.1292, simple_loss=0.207, pruned_loss=0.02568, over 4889.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2095, pruned_loss=0.03071, over 973669.99 frames.], batch size: 22, lr: 1.70e-04 2022-05-07 16:48:20,273 INFO [train.py:715] (0/8) Epoch 13, batch 6700, loss[loss=0.1334, simple_loss=0.2018, pruned_loss=0.0325, over 4769.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2095, pruned_loss=0.03095, over 973605.92 frames.], batch size: 18, lr: 1.70e-04 2022-05-07 16:48:58,713 INFO [train.py:715] (0/8) Epoch 13, batch 6750, loss[loss=0.1304, simple_loss=0.2084, pruned_loss=0.02626, over 4872.00 frames.], tot_loss[loss=0.1362, simple_loss=0.21, pruned_loss=0.03119, over 973597.15 frames.], batch size: 16, lr: 1.70e-04 2022-05-07 16:49:37,979 INFO [train.py:715] (0/8) Epoch 13, batch 6800, loss[loss=0.1258, simple_loss=0.2084, pruned_loss=0.02155, over 4945.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2097, pruned_loss=0.03104, over 973647.04 frames.], batch size: 39, lr: 1.70e-04 2022-05-07 16:50:17,428 INFO [train.py:715] (0/8) Epoch 13, batch 6850, loss[loss=0.1434, simple_loss=0.2252, pruned_loss=0.03083, over 4803.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2104, pruned_loss=0.03145, over 973976.44 frames.], batch size: 21, lr: 1.70e-04 2022-05-07 16:50:55,360 INFO [train.py:715] (0/8) Epoch 13, batch 6900, loss[loss=0.1211, simple_loss=0.19, pruned_loss=0.02607, over 4987.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2093, pruned_loss=0.03119, over 973464.25 frames.], batch size: 14, lr: 1.70e-04 2022-05-07 16:51:33,400 INFO [train.py:715] (0/8) Epoch 13, batch 6950, loss[loss=0.1518, simple_loss=0.2285, pruned_loss=0.03755, over 4893.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2093, pruned_loss=0.0311, over 973442.57 frames.], batch size: 17, lr: 1.70e-04 2022-05-07 16:52:12,637 INFO [train.py:715] (0/8) Epoch 13, batch 7000, loss[loss=0.1359, simple_loss=0.2028, pruned_loss=0.03454, over 4809.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2096, pruned_loss=0.03071, over 972952.96 frames.], batch size: 12, lr: 1.70e-04 2022-05-07 16:52:51,279 INFO [train.py:715] (0/8) Epoch 13, batch 7050, loss[loss=0.1507, simple_loss=0.2059, pruned_loss=0.04773, over 4966.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2088, pruned_loss=0.03079, over 972232.51 frames.], batch size: 15, lr: 1.70e-04 2022-05-07 16:53:30,243 INFO [train.py:715] (0/8) Epoch 13, batch 7100, loss[loss=0.09874, simple_loss=0.1639, pruned_loss=0.0168, over 4848.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2082, pruned_loss=0.03052, over 971669.29 frames.], batch size: 12, lr: 1.70e-04 2022-05-07 16:54:09,702 INFO [train.py:715] (0/8) Epoch 13, batch 7150, loss[loss=0.1221, simple_loss=0.2035, pruned_loss=0.02033, over 4749.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2083, pruned_loss=0.03041, over 970740.44 frames.], batch size: 19, lr: 1.70e-04 2022-05-07 16:54:49,403 INFO [train.py:715] (0/8) Epoch 13, batch 7200, loss[loss=0.1363, simple_loss=0.2071, pruned_loss=0.03281, over 4931.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2084, pruned_loss=0.03048, over 970387.73 frames.], batch size: 18, lr: 1.70e-04 2022-05-07 16:55:27,552 INFO 
[train.py:715] (0/8) Epoch 13, batch 7250, loss[loss=0.1375, simple_loss=0.2092, pruned_loss=0.03287, over 4848.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2091, pruned_loss=0.03079, over 970106.20 frames.], batch size: 20, lr: 1.70e-04 2022-05-07 16:56:05,822 INFO [train.py:715] (0/8) Epoch 13, batch 7300, loss[loss=0.134, simple_loss=0.2105, pruned_loss=0.02881, over 4891.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2096, pruned_loss=0.03089, over 971444.58 frames.], batch size: 18, lr: 1.70e-04 2022-05-07 16:56:45,074 INFO [train.py:715] (0/8) Epoch 13, batch 7350, loss[loss=0.1717, simple_loss=0.2343, pruned_loss=0.0545, over 4985.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2101, pruned_loss=0.03147, over 971789.89 frames.], batch size: 35, lr: 1.70e-04 2022-05-07 16:57:23,714 INFO [train.py:715] (0/8) Epoch 13, batch 7400, loss[loss=0.1592, simple_loss=0.2288, pruned_loss=0.04478, over 4963.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2107, pruned_loss=0.03194, over 972336.24 frames.], batch size: 21, lr: 1.70e-04 2022-05-07 16:58:01,538 INFO [train.py:715] (0/8) Epoch 13, batch 7450, loss[loss=0.1542, simple_loss=0.2353, pruned_loss=0.03662, over 4942.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2109, pruned_loss=0.03181, over 972743.77 frames.], batch size: 21, lr: 1.70e-04 2022-05-07 16:58:40,991 INFO [train.py:715] (0/8) Epoch 13, batch 7500, loss[loss=0.1456, simple_loss=0.222, pruned_loss=0.03461, over 4911.00 frames.], tot_loss[loss=0.1382, simple_loss=0.212, pruned_loss=0.03219, over 972584.60 frames.], batch size: 18, lr: 1.70e-04 2022-05-07 16:59:20,233 INFO [train.py:715] (0/8) Epoch 13, batch 7550, loss[loss=0.1574, simple_loss=0.2357, pruned_loss=0.03952, over 4923.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2117, pruned_loss=0.03199, over 972268.07 frames.], batch size: 18, lr: 1.70e-04 2022-05-07 16:59:57,834 INFO [train.py:715] (0/8) Epoch 13, batch 7600, loss[loss=0.1501, simple_loss=0.2237, pruned_loss=0.03824, over 4832.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2112, pruned_loss=0.03159, over 972143.77 frames.], batch size: 30, lr: 1.70e-04 2022-05-07 17:00:36,712 INFO [train.py:715] (0/8) Epoch 13, batch 7650, loss[loss=0.1459, simple_loss=0.2161, pruned_loss=0.03787, over 4821.00 frames.], tot_loss[loss=0.1374, simple_loss=0.211, pruned_loss=0.03185, over 971886.26 frames.], batch size: 12, lr: 1.70e-04 2022-05-07 17:01:15,682 INFO [train.py:715] (0/8) Epoch 13, batch 7700, loss[loss=0.1668, simple_loss=0.2294, pruned_loss=0.05203, over 4828.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2106, pruned_loss=0.03149, over 972786.17 frames.], batch size: 15, lr: 1.70e-04 2022-05-07 17:01:54,647 INFO [train.py:715] (0/8) Epoch 13, batch 7750, loss[loss=0.1459, simple_loss=0.2208, pruned_loss=0.03548, over 4855.00 frames.], tot_loss[loss=0.1373, simple_loss=0.211, pruned_loss=0.03182, over 972830.31 frames.], batch size: 30, lr: 1.70e-04 2022-05-07 17:02:32,574 INFO [train.py:715] (0/8) Epoch 13, batch 7800, loss[loss=0.1281, simple_loss=0.205, pruned_loss=0.02555, over 4923.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2103, pruned_loss=0.03125, over 972188.05 frames.], batch size: 23, lr: 1.70e-04 2022-05-07 17:03:11,058 INFO [train.py:715] (0/8) Epoch 13, batch 7850, loss[loss=0.1249, simple_loss=0.1905, pruned_loss=0.02961, over 4789.00 frames.], tot_loss[loss=0.136, simple_loss=0.2101, pruned_loss=0.03088, over 971576.27 frames.], batch size: 12, lr: 1.70e-04 2022-05-07 17:03:50,701 INFO [train.py:715] (0/8) Epoch 13, batch 
7900, loss[loss=0.1199, simple_loss=0.2075, pruned_loss=0.01617, over 4889.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2108, pruned_loss=0.03129, over 972024.72 frames.], batch size: 19, lr: 1.70e-04 2022-05-07 17:04:28,750 INFO [train.py:715] (0/8) Epoch 13, batch 7950, loss[loss=0.1468, simple_loss=0.2235, pruned_loss=0.03501, over 4878.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2106, pruned_loss=0.03136, over 972993.91 frames.], batch size: 16, lr: 1.70e-04 2022-05-07 17:05:07,212 INFO [train.py:715] (0/8) Epoch 13, batch 8000, loss[loss=0.1927, simple_loss=0.2634, pruned_loss=0.06097, over 4908.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2109, pruned_loss=0.03181, over 973214.95 frames.], batch size: 19, lr: 1.70e-04 2022-05-07 17:05:45,979 INFO [train.py:715] (0/8) Epoch 13, batch 8050, loss[loss=0.1555, simple_loss=0.2313, pruned_loss=0.03985, over 4960.00 frames.], tot_loss[loss=0.138, simple_loss=0.2113, pruned_loss=0.03241, over 972694.61 frames.], batch size: 15, lr: 1.70e-04 2022-05-07 17:06:24,530 INFO [train.py:715] (0/8) Epoch 13, batch 8100, loss[loss=0.1367, simple_loss=0.2139, pruned_loss=0.02974, over 4906.00 frames.], tot_loss[loss=0.138, simple_loss=0.2117, pruned_loss=0.0322, over 972569.78 frames.], batch size: 29, lr: 1.69e-04 2022-05-07 17:07:02,514 INFO [train.py:715] (0/8) Epoch 13, batch 8150, loss[loss=0.1408, simple_loss=0.2164, pruned_loss=0.03258, over 4890.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2115, pruned_loss=0.03219, over 972277.21 frames.], batch size: 22, lr: 1.69e-04 2022-05-07 17:07:40,991 INFO [train.py:715] (0/8) Epoch 13, batch 8200, loss[loss=0.1206, simple_loss=0.1894, pruned_loss=0.02591, over 4975.00 frames.], tot_loss[loss=0.137, simple_loss=0.2104, pruned_loss=0.03179, over 973092.65 frames.], batch size: 15, lr: 1.69e-04 2022-05-07 17:08:20,200 INFO [train.py:715] (0/8) Epoch 13, batch 8250, loss[loss=0.1143, simple_loss=0.2015, pruned_loss=0.01356, over 4891.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2105, pruned_loss=0.03198, over 972732.55 frames.], batch size: 22, lr: 1.69e-04 2022-05-07 17:08:58,108 INFO [train.py:715] (0/8) Epoch 13, batch 8300, loss[loss=0.1749, simple_loss=0.2568, pruned_loss=0.04649, over 4789.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2107, pruned_loss=0.03194, over 973237.73 frames.], batch size: 18, lr: 1.69e-04 2022-05-07 17:09:36,538 INFO [train.py:715] (0/8) Epoch 13, batch 8350, loss[loss=0.1357, simple_loss=0.2123, pruned_loss=0.02961, over 4939.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2105, pruned_loss=0.03218, over 973190.78 frames.], batch size: 39, lr: 1.69e-04 2022-05-07 17:10:15,699 INFO [train.py:715] (0/8) Epoch 13, batch 8400, loss[loss=0.1495, simple_loss=0.2155, pruned_loss=0.04172, over 4800.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2109, pruned_loss=0.03206, over 972835.59 frames.], batch size: 17, lr: 1.69e-04 2022-05-07 17:10:54,561 INFO [train.py:715] (0/8) Epoch 13, batch 8450, loss[loss=0.1379, simple_loss=0.2134, pruned_loss=0.03121, over 4935.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2105, pruned_loss=0.0319, over 973029.85 frames.], batch size: 29, lr: 1.69e-04 2022-05-07 17:11:32,552 INFO [train.py:715] (0/8) Epoch 13, batch 8500, loss[loss=0.127, simple_loss=0.2092, pruned_loss=0.02234, over 4979.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2104, pruned_loss=0.03187, over 973011.44 frames.], batch size: 24, lr: 1.69e-04 2022-05-07 17:12:11,692 INFO [train.py:715] (0/8) Epoch 13, batch 8550, loss[loss=0.1446, 
simple_loss=0.216, pruned_loss=0.03658, over 4867.00 frames.], tot_loss[loss=0.1367, simple_loss=0.21, pruned_loss=0.03164, over 973074.24 frames.], batch size: 16, lr: 1.69e-04 2022-05-07 17:12:50,643 INFO [train.py:715] (0/8) Epoch 13, batch 8600, loss[loss=0.1415, simple_loss=0.2239, pruned_loss=0.02949, over 4988.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2105, pruned_loss=0.03166, over 973437.56 frames.], batch size: 28, lr: 1.69e-04 2022-05-07 17:13:28,879 INFO [train.py:715] (0/8) Epoch 13, batch 8650, loss[loss=0.1509, simple_loss=0.2119, pruned_loss=0.04493, over 4910.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2101, pruned_loss=0.03132, over 973521.05 frames.], batch size: 19, lr: 1.69e-04 2022-05-07 17:14:07,223 INFO [train.py:715] (0/8) Epoch 13, batch 8700, loss[loss=0.1215, simple_loss=0.2018, pruned_loss=0.02061, over 4953.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2103, pruned_loss=0.03163, over 973462.81 frames.], batch size: 21, lr: 1.69e-04 2022-05-07 17:14:45,864 INFO [train.py:715] (0/8) Epoch 13, batch 8750, loss[loss=0.1642, simple_loss=0.2357, pruned_loss=0.04637, over 4915.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2105, pruned_loss=0.03184, over 973796.61 frames.], batch size: 19, lr: 1.69e-04 2022-05-07 17:15:24,583 INFO [train.py:715] (0/8) Epoch 13, batch 8800, loss[loss=0.1468, simple_loss=0.2188, pruned_loss=0.03739, over 4874.00 frames.], tot_loss[loss=0.137, simple_loss=0.2105, pruned_loss=0.03173, over 971878.29 frames.], batch size: 16, lr: 1.69e-04 2022-05-07 17:16:02,883 INFO [train.py:715] (0/8) Epoch 13, batch 8850, loss[loss=0.1227, simple_loss=0.1915, pruned_loss=0.02696, over 4739.00 frames.], tot_loss[loss=0.137, simple_loss=0.2104, pruned_loss=0.03184, over 971467.02 frames.], batch size: 16, lr: 1.69e-04 2022-05-07 17:16:40,971 INFO [train.py:715] (0/8) Epoch 13, batch 8900, loss[loss=0.1538, simple_loss=0.2343, pruned_loss=0.0367, over 4942.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2107, pruned_loss=0.03177, over 971622.28 frames.], batch size: 23, lr: 1.69e-04 2022-05-07 17:17:19,696 INFO [train.py:715] (0/8) Epoch 13, batch 8950, loss[loss=0.1398, simple_loss=0.2057, pruned_loss=0.037, over 4730.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2109, pruned_loss=0.03199, over 971078.35 frames.], batch size: 16, lr: 1.69e-04 2022-05-07 17:17:57,823 INFO [train.py:715] (0/8) Epoch 13, batch 9000, loss[loss=0.1597, simple_loss=0.241, pruned_loss=0.03915, over 4823.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2109, pruned_loss=0.0318, over 972411.15 frames.], batch size: 25, lr: 1.69e-04 2022-05-07 17:17:57,824 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 17:18:07,453 INFO [train.py:742] (0/8) Epoch 13, validation: loss=0.1055, simple_loss=0.1893, pruned_loss=0.01084, over 914524.00 frames. 
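The three loss figures reported in these entries appear to be tied together by a fixed weighting: in every record, loss is very close to 0.5 * simple_loss + pruned_loss (for the validation entry just above, 0.5 * 0.1893 + 0.01084 ≈ 0.1055; for batch 7250, 0.5 * 0.2092 + 0.03287 ≈ 0.1375). The short Python check below verifies that relation on values copied from the log; the 0.5 weight is inferred from the logged numbers, not read out of train.py.

# Consistency check: reported loss ≈ 0.5 * simple_loss + pruned_loss.
# The 0.5 weight is an inference from the logged values, not a quoted constant.
entries = [
    # (simple_loss, pruned_loss, reported_loss)
    (0.1893, 0.01084, 0.1055),  # Epoch 13 validation entry above
    (0.2092, 0.03287, 0.1375),  # Epoch 13, batch 7250 (per-batch loss)
    (0.2091, 0.03079, 0.1353),  # Epoch 13, batch 7250 (tot_loss)
]
for simple_loss, pruned_loss, reported in entries:
    combined = 0.5 * simple_loss + pruned_loss
    assert abs(combined - reported) < 5e-4, (combined, reported)
    print(f"{combined:.4f} ~ {reported:.4f}")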
2022-05-07 17:18:45,499 INFO [train.py:715] (0/8) Epoch 13, batch 9050, loss[loss=0.1592, simple_loss=0.2301, pruned_loss=0.04413, over 4895.00 frames.], tot_loss[loss=0.137, simple_loss=0.2107, pruned_loss=0.03169, over 972441.31 frames.], batch size: 38, lr: 1.69e-04 2022-05-07 17:19:23,907 INFO [train.py:715] (0/8) Epoch 13, batch 9100, loss[loss=0.162, simple_loss=0.2354, pruned_loss=0.04434, over 4980.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2105, pruned_loss=0.03182, over 972596.48 frames.], batch size: 15, lr: 1.69e-04 2022-05-07 17:20:03,096 INFO [train.py:715] (0/8) Epoch 13, batch 9150, loss[loss=0.1644, simple_loss=0.2378, pruned_loss=0.04554, over 4857.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2102, pruned_loss=0.0315, over 973134.45 frames.], batch size: 20, lr: 1.69e-04 2022-05-07 17:20:42,093 INFO [train.py:715] (0/8) Epoch 13, batch 9200, loss[loss=0.1268, simple_loss=0.1974, pruned_loss=0.02813, over 4774.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2107, pruned_loss=0.03201, over 973149.23 frames.], batch size: 14, lr: 1.69e-04 2022-05-07 17:21:20,008 INFO [train.py:715] (0/8) Epoch 13, batch 9250, loss[loss=0.1486, simple_loss=0.2138, pruned_loss=0.04174, over 4900.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2106, pruned_loss=0.03178, over 973396.23 frames.], batch size: 39, lr: 1.69e-04 2022-05-07 17:21:58,912 INFO [train.py:715] (0/8) Epoch 13, batch 9300, loss[loss=0.1576, simple_loss=0.2221, pruned_loss=0.0466, over 4765.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2109, pruned_loss=0.03175, over 972900.31 frames.], batch size: 14, lr: 1.69e-04 2022-05-07 17:22:37,764 INFO [train.py:715] (0/8) Epoch 13, batch 9350, loss[loss=0.1465, simple_loss=0.226, pruned_loss=0.03352, over 4766.00 frames.], tot_loss[loss=0.1354, simple_loss=0.209, pruned_loss=0.03091, over 972441.71 frames.], batch size: 18, lr: 1.69e-04 2022-05-07 17:23:15,579 INFO [train.py:715] (0/8) Epoch 13, batch 9400, loss[loss=0.1156, simple_loss=0.18, pruned_loss=0.02559, over 4800.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2099, pruned_loss=0.03169, over 972086.30 frames.], batch size: 12, lr: 1.69e-04 2022-05-07 17:23:54,030 INFO [train.py:715] (0/8) Epoch 13, batch 9450, loss[loss=0.1277, simple_loss=0.2, pruned_loss=0.02765, over 4828.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2094, pruned_loss=0.03159, over 971505.14 frames.], batch size: 15, lr: 1.69e-04 2022-05-07 17:24:32,851 INFO [train.py:715] (0/8) Epoch 13, batch 9500, loss[loss=0.1255, simple_loss=0.1894, pruned_loss=0.03084, over 4845.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2094, pruned_loss=0.03156, over 970536.30 frames.], batch size: 15, lr: 1.69e-04 2022-05-07 17:25:11,106 INFO [train.py:715] (0/8) Epoch 13, batch 9550, loss[loss=0.1294, simple_loss=0.1984, pruned_loss=0.03024, over 4897.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2094, pruned_loss=0.03149, over 971560.41 frames.], batch size: 19, lr: 1.69e-04 2022-05-07 17:25:49,077 INFO [train.py:715] (0/8) Epoch 13, batch 9600, loss[loss=0.1468, simple_loss=0.2105, pruned_loss=0.04157, over 4924.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2086, pruned_loss=0.03111, over 971693.59 frames.], batch size: 23, lr: 1.69e-04 2022-05-07 17:26:28,019 INFO [train.py:715] (0/8) Epoch 13, batch 9650, loss[loss=0.1428, simple_loss=0.2174, pruned_loss=0.03409, over 4920.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2083, pruned_loss=0.03114, over 970783.08 frames.], batch size: 23, lr: 1.69e-04 2022-05-07 17:27:06,434 INFO 
[train.py:715] (0/8) Epoch 13, batch 9700, loss[loss=0.1537, simple_loss=0.2376, pruned_loss=0.03489, over 4979.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2095, pruned_loss=0.03133, over 970785.87 frames.], batch size: 28, lr: 1.69e-04 2022-05-07 17:27:44,976 INFO [train.py:715] (0/8) Epoch 13, batch 9750, loss[loss=0.1242, simple_loss=0.2045, pruned_loss=0.02191, over 4939.00 frames.], tot_loss[loss=0.136, simple_loss=0.2094, pruned_loss=0.03126, over 971669.00 frames.], batch size: 24, lr: 1.69e-04 2022-05-07 17:28:23,840 INFO [train.py:715] (0/8) Epoch 13, batch 9800, loss[loss=0.1544, simple_loss=0.2318, pruned_loss=0.03849, over 4822.00 frames.], tot_loss[loss=0.1364, simple_loss=0.21, pruned_loss=0.03134, over 971859.49 frames.], batch size: 15, lr: 1.69e-04 2022-05-07 17:29:03,033 INFO [train.py:715] (0/8) Epoch 13, batch 9850, loss[loss=0.132, simple_loss=0.2053, pruned_loss=0.02929, over 4780.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2107, pruned_loss=0.03147, over 971926.99 frames.], batch size: 17, lr: 1.69e-04 2022-05-07 17:29:41,579 INFO [train.py:715] (0/8) Epoch 13, batch 9900, loss[loss=0.1526, simple_loss=0.2163, pruned_loss=0.04448, over 4985.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2098, pruned_loss=0.03157, over 971398.40 frames.], batch size: 35, lr: 1.69e-04 2022-05-07 17:30:19,819 INFO [train.py:715] (0/8) Epoch 13, batch 9950, loss[loss=0.1151, simple_loss=0.1863, pruned_loss=0.02199, over 4786.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2097, pruned_loss=0.03124, over 971495.65 frames.], batch size: 21, lr: 1.69e-04 2022-05-07 17:30:58,613 INFO [train.py:715] (0/8) Epoch 13, batch 10000, loss[loss=0.1385, simple_loss=0.2187, pruned_loss=0.02911, over 4834.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2096, pruned_loss=0.03103, over 971685.28 frames.], batch size: 26, lr: 1.69e-04 2022-05-07 17:31:37,783 INFO [train.py:715] (0/8) Epoch 13, batch 10050, loss[loss=0.1425, simple_loss=0.2137, pruned_loss=0.03567, over 4984.00 frames.], tot_loss[loss=0.1365, simple_loss=0.21, pruned_loss=0.03153, over 972538.76 frames.], batch size: 28, lr: 1.69e-04 2022-05-07 17:32:16,718 INFO [train.py:715] (0/8) Epoch 13, batch 10100, loss[loss=0.1416, simple_loss=0.2043, pruned_loss=0.03947, over 4762.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2084, pruned_loss=0.03088, over 972997.25 frames.], batch size: 18, lr: 1.69e-04 2022-05-07 17:32:54,963 INFO [train.py:715] (0/8) Epoch 13, batch 10150, loss[loss=0.1411, simple_loss=0.207, pruned_loss=0.03763, over 4833.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2091, pruned_loss=0.03111, over 972188.96 frames.], batch size: 12, lr: 1.69e-04 2022-05-07 17:33:33,995 INFO [train.py:715] (0/8) Epoch 13, batch 10200, loss[loss=0.1162, simple_loss=0.1893, pruned_loss=0.02149, over 4780.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2094, pruned_loss=0.03115, over 972615.95 frames.], batch size: 12, lr: 1.69e-04 2022-05-07 17:34:13,390 INFO [train.py:715] (0/8) Epoch 13, batch 10250, loss[loss=0.1683, simple_loss=0.2395, pruned_loss=0.04861, over 4985.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2101, pruned_loss=0.03162, over 973060.92 frames.], batch size: 40, lr: 1.69e-04 2022-05-07 17:34:52,080 INFO [train.py:715] (0/8) Epoch 13, batch 10300, loss[loss=0.1326, simple_loss=0.2065, pruned_loss=0.02929, over 4974.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2104, pruned_loss=0.03209, over 972880.38 frames.], batch size: 15, lr: 1.69e-04 2022-05-07 17:35:31,124 INFO [train.py:715] (0/8) Epoch 13, 
batch 10350, loss[loss=0.1307, simple_loss=0.199, pruned_loss=0.03124, over 4765.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2103, pruned_loss=0.03217, over 971960.57 frames.], batch size: 18, lr: 1.69e-04 2022-05-07 17:36:10,302 INFO [train.py:715] (0/8) Epoch 13, batch 10400, loss[loss=0.1482, simple_loss=0.2135, pruned_loss=0.04146, over 4745.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2106, pruned_loss=0.03227, over 972168.67 frames.], batch size: 16, lr: 1.69e-04 2022-05-07 17:36:49,248 INFO [train.py:715] (0/8) Epoch 13, batch 10450, loss[loss=0.1282, simple_loss=0.2004, pruned_loss=0.02797, over 4940.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2104, pruned_loss=0.03195, over 971869.86 frames.], batch size: 35, lr: 1.69e-04 2022-05-07 17:37:26,675 INFO [train.py:715] (0/8) Epoch 13, batch 10500, loss[loss=0.1515, simple_loss=0.229, pruned_loss=0.03703, over 4936.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2111, pruned_loss=0.03221, over 972213.79 frames.], batch size: 29, lr: 1.69e-04 2022-05-07 17:38:05,564 INFO [train.py:715] (0/8) Epoch 13, batch 10550, loss[loss=0.1218, simple_loss=0.2035, pruned_loss=0.02007, over 4754.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2107, pruned_loss=0.03225, over 972208.71 frames.], batch size: 14, lr: 1.69e-04 2022-05-07 17:38:44,485 INFO [train.py:715] (0/8) Epoch 13, batch 10600, loss[loss=0.1559, simple_loss=0.231, pruned_loss=0.04036, over 4943.00 frames.], tot_loss[loss=0.137, simple_loss=0.2103, pruned_loss=0.03183, over 971794.26 frames.], batch size: 23, lr: 1.69e-04 2022-05-07 17:39:22,590 INFO [train.py:715] (0/8) Epoch 13, batch 10650, loss[loss=0.1402, simple_loss=0.2059, pruned_loss=0.03726, over 4861.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2104, pruned_loss=0.03174, over 971535.33 frames.], batch size: 32, lr: 1.69e-04 2022-05-07 17:40:01,762 INFO [train.py:715] (0/8) Epoch 13, batch 10700, loss[loss=0.1306, simple_loss=0.2079, pruned_loss=0.02663, over 4904.00 frames.], tot_loss[loss=0.137, simple_loss=0.2107, pruned_loss=0.03168, over 971758.01 frames.], batch size: 17, lr: 1.69e-04 2022-05-07 17:40:41,060 INFO [train.py:715] (0/8) Epoch 13, batch 10750, loss[loss=0.127, simple_loss=0.1935, pruned_loss=0.0303, over 4966.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2105, pruned_loss=0.03172, over 970868.12 frames.], batch size: 24, lr: 1.69e-04 2022-05-07 17:41:19,857 INFO [train.py:715] (0/8) Epoch 13, batch 10800, loss[loss=0.1586, simple_loss=0.2209, pruned_loss=0.04816, over 4835.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2097, pruned_loss=0.03144, over 971865.11 frames.], batch size: 30, lr: 1.69e-04 2022-05-07 17:41:57,879 INFO [train.py:715] (0/8) Epoch 13, batch 10850, loss[loss=0.1325, simple_loss=0.201, pruned_loss=0.03203, over 4980.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2098, pruned_loss=0.0314, over 972174.97 frames.], batch size: 24, lr: 1.69e-04 2022-05-07 17:42:37,029 INFO [train.py:715] (0/8) Epoch 13, batch 10900, loss[loss=0.1135, simple_loss=0.1858, pruned_loss=0.02061, over 4760.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2095, pruned_loss=0.03156, over 972549.77 frames.], batch size: 16, lr: 1.69e-04 2022-05-07 17:43:16,889 INFO [train.py:715] (0/8) Epoch 13, batch 10950, loss[loss=0.112, simple_loss=0.1919, pruned_loss=0.01609, over 4908.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2094, pruned_loss=0.03135, over 973180.23 frames.], batch size: 19, lr: 1.69e-04 2022-05-07 17:43:56,318 INFO [train.py:715] (0/8) Epoch 13, batch 11000, 
loss[loss=0.1059, simple_loss=0.186, pruned_loss=0.01293, over 4941.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2093, pruned_loss=0.03111, over 973350.61 frames.], batch size: 23, lr: 1.69e-04 2022-05-07 17:44:34,951 INFO [train.py:715] (0/8) Epoch 13, batch 11050, loss[loss=0.1192, simple_loss=0.1961, pruned_loss=0.0212, over 4965.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2099, pruned_loss=0.03121, over 973926.97 frames.], batch size: 14, lr: 1.69e-04 2022-05-07 17:45:14,248 INFO [train.py:715] (0/8) Epoch 13, batch 11100, loss[loss=0.1395, simple_loss=0.2173, pruned_loss=0.03078, over 4898.00 frames.], tot_loss[loss=0.1363, simple_loss=0.21, pruned_loss=0.03134, over 973145.32 frames.], batch size: 22, lr: 1.69e-04 2022-05-07 17:45:53,232 INFO [train.py:715] (0/8) Epoch 13, batch 11150, loss[loss=0.1463, simple_loss=0.2085, pruned_loss=0.04205, over 4850.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2099, pruned_loss=0.03117, over 973500.49 frames.], batch size: 30, lr: 1.69e-04 2022-05-07 17:46:30,987 INFO [train.py:715] (0/8) Epoch 13, batch 11200, loss[loss=0.1304, simple_loss=0.1985, pruned_loss=0.03119, over 4873.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2095, pruned_loss=0.03144, over 973385.77 frames.], batch size: 20, lr: 1.69e-04 2022-05-07 17:47:09,190 INFO [train.py:715] (0/8) Epoch 13, batch 11250, loss[loss=0.1335, simple_loss=0.2097, pruned_loss=0.02868, over 4690.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2094, pruned_loss=0.03102, over 973117.73 frames.], batch size: 15, lr: 1.69e-04 2022-05-07 17:47:48,135 INFO [train.py:715] (0/8) Epoch 13, batch 11300, loss[loss=0.1456, simple_loss=0.2181, pruned_loss=0.03655, over 4745.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2098, pruned_loss=0.03104, over 972627.26 frames.], batch size: 19, lr: 1.69e-04 2022-05-07 17:48:27,085 INFO [train.py:715] (0/8) Epoch 13, batch 11350, loss[loss=0.1452, simple_loss=0.2332, pruned_loss=0.02858, over 4952.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2103, pruned_loss=0.03134, over 972848.81 frames.], batch size: 21, lr: 1.69e-04 2022-05-07 17:49:05,289 INFO [train.py:715] (0/8) Epoch 13, batch 11400, loss[loss=0.124, simple_loss=0.1887, pruned_loss=0.02968, over 4822.00 frames.], tot_loss[loss=0.1362, simple_loss=0.21, pruned_loss=0.03123, over 973206.13 frames.], batch size: 13, lr: 1.69e-04 2022-05-07 17:49:44,154 INFO [train.py:715] (0/8) Epoch 13, batch 11450, loss[loss=0.1211, simple_loss=0.1996, pruned_loss=0.02135, over 4972.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2101, pruned_loss=0.03129, over 972347.68 frames.], batch size: 15, lr: 1.69e-04 2022-05-07 17:50:04,587 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-464000.pt 2022-05-07 17:50:25,715 INFO [train.py:715] (0/8) Epoch 13, batch 11500, loss[loss=0.1528, simple_loss=0.2193, pruned_loss=0.04316, over 4975.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2101, pruned_loss=0.03125, over 971606.70 frames.], batch size: 39, lr: 1.69e-04 2022-05-07 17:51:03,649 INFO [train.py:715] (0/8) Epoch 13, batch 11550, loss[loss=0.1072, simple_loss=0.1823, pruned_loss=0.01602, over 4783.00 frames.], tot_loss[loss=0.135, simple_loss=0.209, pruned_loss=0.03052, over 972122.93 frames.], batch size: 19, lr: 1.69e-04 2022-05-07 17:51:42,313 INFO [train.py:715] (0/8) Epoch 13, batch 11600, loss[loss=0.1409, simple_loss=0.2125, pruned_loss=0.03461, over 4870.00 frames.], tot_loss[loss=0.135, simple_loss=0.209, pruned_loss=0.03049, over 972251.24 frames.], 
batch size: 32, lr: 1.69e-04 2022-05-07 17:52:21,596 INFO [train.py:715] (0/8) Epoch 13, batch 11650, loss[loss=0.1589, simple_loss=0.2244, pruned_loss=0.04667, over 4984.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2089, pruned_loss=0.03015, over 972744.07 frames.], batch size: 15, lr: 1.69e-04 2022-05-07 17:53:00,306 INFO [train.py:715] (0/8) Epoch 13, batch 11700, loss[loss=0.1636, simple_loss=0.2461, pruned_loss=0.04053, over 4904.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2084, pruned_loss=0.02992, over 972801.68 frames.], batch size: 17, lr: 1.69e-04 2022-05-07 17:53:38,274 INFO [train.py:715] (0/8) Epoch 13, batch 11750, loss[loss=0.1246, simple_loss=0.1991, pruned_loss=0.02504, over 4935.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2087, pruned_loss=0.03032, over 971439.58 frames.], batch size: 35, lr: 1.69e-04 2022-05-07 17:54:16,752 INFO [train.py:715] (0/8) Epoch 13, batch 11800, loss[loss=0.1393, simple_loss=0.2159, pruned_loss=0.03131, over 4809.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2089, pruned_loss=0.03065, over 971256.70 frames.], batch size: 25, lr: 1.69e-04 2022-05-07 17:54:55,476 INFO [train.py:715] (0/8) Epoch 13, batch 11850, loss[loss=0.1331, simple_loss=0.2024, pruned_loss=0.03195, over 4905.00 frames.], tot_loss[loss=0.135, simple_loss=0.2088, pruned_loss=0.03054, over 972269.75 frames.], batch size: 18, lr: 1.69e-04 2022-05-07 17:55:32,885 INFO [train.py:715] (0/8) Epoch 13, batch 11900, loss[loss=0.1388, simple_loss=0.2189, pruned_loss=0.02934, over 4895.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2083, pruned_loss=0.03026, over 971510.11 frames.], batch size: 19, lr: 1.69e-04 2022-05-07 17:56:11,541 INFO [train.py:715] (0/8) Epoch 13, batch 11950, loss[loss=0.1414, simple_loss=0.2204, pruned_loss=0.03124, over 4883.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2085, pruned_loss=0.0303, over 971412.90 frames.], batch size: 19, lr: 1.69e-04 2022-05-07 17:56:50,607 INFO [train.py:715] (0/8) Epoch 13, batch 12000, loss[loss=0.131, simple_loss=0.2048, pruned_loss=0.02857, over 4989.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2085, pruned_loss=0.03083, over 972148.02 frames.], batch size: 25, lr: 1.69e-04 2022-05-07 17:56:50,609 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 17:57:00,358 INFO [train.py:742] (0/8) Epoch 13, validation: loss=0.1055, simple_loss=0.1893, pruned_loss=0.01081, over 914524.00 frames. 
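A few entries earlier, checkpoint.py logged a save to pruned_transducer_stateless2/exp/v2/checkpoint-464000.pt. The log does not show what that .pt file contains, so the sketch below only assumes a plain PyTorch dict that stores the model's state_dict under a "model" key (an assumption, not something confirmed by the log) and shows one generic way to resume from such a file; it is not the recipe's own checkpoint-loading code.

import torch

def resume_model(model: torch.nn.Module, ckpt_path: str) -> None:
    # Assumption: the checkpoint is either a bare state_dict or a dict
    # holding the state_dict under the key "model".
    checkpoint = torch.load(ckpt_path, map_location="cpu")
    if isinstance(checkpoint, dict) and "model" in checkpoint:
        checkpoint = checkpoint["model"]
    model.load_state_dict(checkpoint)

# Example (path taken from the log line above):
# resume_model(my_model, "pruned_transducer_stateless2/exp/v2/checkpoint-464000.pt")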
2022-05-07 17:57:40,022 INFO [train.py:715] (0/8) Epoch 13, batch 12050, loss[loss=0.1533, simple_loss=0.2324, pruned_loss=0.03705, over 4890.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2085, pruned_loss=0.03083, over 971701.10 frames.], batch size: 16, lr: 1.69e-04 2022-05-07 17:58:18,318 INFO [train.py:715] (0/8) Epoch 13, batch 12100, loss[loss=0.1382, simple_loss=0.2104, pruned_loss=0.03302, over 4813.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2086, pruned_loss=0.03046, over 971791.62 frames.], batch size: 26, lr: 1.69e-04 2022-05-07 17:58:56,068 INFO [train.py:715] (0/8) Epoch 13, batch 12150, loss[loss=0.1449, simple_loss=0.22, pruned_loss=0.03491, over 4927.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2099, pruned_loss=0.03141, over 972802.09 frames.], batch size: 19, lr: 1.69e-04 2022-05-07 17:59:34,969 INFO [train.py:715] (0/8) Epoch 13, batch 12200, loss[loss=0.1461, simple_loss=0.2217, pruned_loss=0.03532, over 4955.00 frames.], tot_loss[loss=0.137, simple_loss=0.2105, pruned_loss=0.03174, over 972488.95 frames.], batch size: 15, lr: 1.69e-04 2022-05-07 18:00:13,886 INFO [train.py:715] (0/8) Epoch 13, batch 12250, loss[loss=0.1267, simple_loss=0.1972, pruned_loss=0.02811, over 4746.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2109, pruned_loss=0.03168, over 972435.10 frames.], batch size: 19, lr: 1.69e-04 2022-05-07 18:00:52,455 INFO [train.py:715] (0/8) Epoch 13, batch 12300, loss[loss=0.1219, simple_loss=0.1917, pruned_loss=0.026, over 4894.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2106, pruned_loss=0.0318, over 971799.61 frames.], batch size: 17, lr: 1.69e-04 2022-05-07 18:01:30,133 INFO [train.py:715] (0/8) Epoch 13, batch 12350, loss[loss=0.136, simple_loss=0.2083, pruned_loss=0.03184, over 4816.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2102, pruned_loss=0.03165, over 971630.79 frames.], batch size: 13, lr: 1.69e-04 2022-05-07 18:02:09,065 INFO [train.py:715] (0/8) Epoch 13, batch 12400, loss[loss=0.1896, simple_loss=0.272, pruned_loss=0.05362, over 4903.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2098, pruned_loss=0.03122, over 972407.80 frames.], batch size: 17, lr: 1.69e-04 2022-05-07 18:02:47,455 INFO [train.py:715] (0/8) Epoch 13, batch 12450, loss[loss=0.1398, simple_loss=0.2265, pruned_loss=0.02661, over 4888.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2097, pruned_loss=0.03133, over 972093.52 frames.], batch size: 22, lr: 1.69e-04 2022-05-07 18:03:24,469 INFO [train.py:715] (0/8) Epoch 13, batch 12500, loss[loss=0.1422, simple_loss=0.2171, pruned_loss=0.03361, over 4869.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2102, pruned_loss=0.0313, over 972355.48 frames.], batch size: 16, lr: 1.69e-04 2022-05-07 18:04:03,257 INFO [train.py:715] (0/8) Epoch 13, batch 12550, loss[loss=0.1454, simple_loss=0.2152, pruned_loss=0.03775, over 4763.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2113, pruned_loss=0.03167, over 973116.43 frames.], batch size: 14, lr: 1.69e-04 2022-05-07 18:04:41,879 INFO [train.py:715] (0/8) Epoch 13, batch 12600, loss[loss=0.1457, simple_loss=0.2138, pruned_loss=0.03883, over 4833.00 frames.], tot_loss[loss=0.137, simple_loss=0.2111, pruned_loss=0.03143, over 973371.03 frames.], batch size: 15, lr: 1.69e-04 2022-05-07 18:05:20,410 INFO [train.py:715] (0/8) Epoch 13, batch 12650, loss[loss=0.1431, simple_loss=0.2192, pruned_loss=0.03354, over 4973.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2114, pruned_loss=0.03192, over 972517.89 frames.], batch size: 21, lr: 1.69e-04 2022-05-07 18:05:58,203 INFO 
[train.py:715] (0/8) Epoch 13, batch 12700, loss[loss=0.1346, simple_loss=0.2037, pruned_loss=0.03269, over 4982.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2102, pruned_loss=0.03129, over 972047.16 frames.], batch size: 31, lr: 1.69e-04 2022-05-07 18:06:37,490 INFO [train.py:715] (0/8) Epoch 13, batch 12750, loss[loss=0.1523, simple_loss=0.2207, pruned_loss=0.04201, over 4766.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2102, pruned_loss=0.0316, over 972086.77 frames.], batch size: 17, lr: 1.69e-04 2022-05-07 18:07:16,113 INFO [train.py:715] (0/8) Epoch 13, batch 12800, loss[loss=0.1184, simple_loss=0.1976, pruned_loss=0.01966, over 4873.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2101, pruned_loss=0.0315, over 972395.89 frames.], batch size: 16, lr: 1.69e-04 2022-05-07 18:07:53,797 INFO [train.py:715] (0/8) Epoch 13, batch 12850, loss[loss=0.1289, simple_loss=0.1995, pruned_loss=0.02912, over 4836.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2094, pruned_loss=0.03146, over 972209.22 frames.], batch size: 15, lr: 1.69e-04 2022-05-07 18:08:32,299 INFO [train.py:715] (0/8) Epoch 13, batch 12900, loss[loss=0.1325, simple_loss=0.1951, pruned_loss=0.03492, over 4798.00 frames.], tot_loss[loss=0.136, simple_loss=0.2094, pruned_loss=0.03124, over 970963.00 frames.], batch size: 12, lr: 1.69e-04 2022-05-07 18:09:10,900 INFO [train.py:715] (0/8) Epoch 13, batch 12950, loss[loss=0.1272, simple_loss=0.2104, pruned_loss=0.02202, over 4793.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2095, pruned_loss=0.03139, over 970929.27 frames.], batch size: 21, lr: 1.69e-04 2022-05-07 18:09:48,854 INFO [train.py:715] (0/8) Epoch 13, batch 13000, loss[loss=0.1131, simple_loss=0.1793, pruned_loss=0.02346, over 4884.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2096, pruned_loss=0.03135, over 970989.38 frames.], batch size: 32, lr: 1.69e-04 2022-05-07 18:10:26,254 INFO [train.py:715] (0/8) Epoch 13, batch 13050, loss[loss=0.1058, simple_loss=0.1861, pruned_loss=0.01278, over 4971.00 frames.], tot_loss[loss=0.137, simple_loss=0.2105, pruned_loss=0.03178, over 972551.87 frames.], batch size: 28, lr: 1.69e-04 2022-05-07 18:11:05,298 INFO [train.py:715] (0/8) Epoch 13, batch 13100, loss[loss=0.1206, simple_loss=0.1932, pruned_loss=0.02398, over 4730.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2096, pruned_loss=0.03141, over 972554.36 frames.], batch size: 16, lr: 1.69e-04 2022-05-07 18:11:43,993 INFO [train.py:715] (0/8) Epoch 13, batch 13150, loss[loss=0.1311, simple_loss=0.2057, pruned_loss=0.02825, over 4751.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2099, pruned_loss=0.03161, over 972447.62 frames.], batch size: 16, lr: 1.69e-04 2022-05-07 18:12:21,743 INFO [train.py:715] (0/8) Epoch 13, batch 13200, loss[loss=0.1386, simple_loss=0.2056, pruned_loss=0.03583, over 4890.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2096, pruned_loss=0.03141, over 973211.35 frames.], batch size: 16, lr: 1.69e-04 2022-05-07 18:13:00,173 INFO [train.py:715] (0/8) Epoch 13, batch 13250, loss[loss=0.1524, simple_loss=0.2327, pruned_loss=0.03605, over 4827.00 frames.], tot_loss[loss=0.136, simple_loss=0.2097, pruned_loss=0.03117, over 972839.24 frames.], batch size: 27, lr: 1.69e-04 2022-05-07 18:13:38,866 INFO [train.py:715] (0/8) Epoch 13, batch 13300, loss[loss=0.1297, simple_loss=0.1994, pruned_loss=0.02996, over 4910.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2092, pruned_loss=0.03114, over 972666.26 frames.], batch size: 17, lr: 1.69e-04 2022-05-07 18:14:17,602 INFO [train.py:715] (0/8) 
Epoch 13, batch 13350, loss[loss=0.1387, simple_loss=0.2223, pruned_loss=0.02756, over 4795.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2098, pruned_loss=0.03181, over 973105.72 frames.], batch size: 24, lr: 1.69e-04 2022-05-07 18:14:55,891 INFO [train.py:715] (0/8) Epoch 13, batch 13400, loss[loss=0.1239, simple_loss=0.2023, pruned_loss=0.02278, over 4789.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2103, pruned_loss=0.03215, over 972399.12 frames.], batch size: 18, lr: 1.69e-04 2022-05-07 18:15:35,678 INFO [train.py:715] (0/8) Epoch 13, batch 13450, loss[loss=0.1331, simple_loss=0.1985, pruned_loss=0.03386, over 4842.00 frames.], tot_loss[loss=0.138, simple_loss=0.2109, pruned_loss=0.03251, over 972400.27 frames.], batch size: 15, lr: 1.69e-04 2022-05-07 18:16:14,405 INFO [train.py:715] (0/8) Epoch 13, batch 13500, loss[loss=0.1461, simple_loss=0.2302, pruned_loss=0.03097, over 4897.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2104, pruned_loss=0.0319, over 972259.51 frames.], batch size: 19, lr: 1.69e-04 2022-05-07 18:16:52,057 INFO [train.py:715] (0/8) Epoch 13, batch 13550, loss[loss=0.1199, simple_loss=0.2041, pruned_loss=0.01783, over 4930.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2096, pruned_loss=0.03131, over 972187.57 frames.], batch size: 23, lr: 1.69e-04 2022-05-07 18:17:29,847 INFO [train.py:715] (0/8) Epoch 13, batch 13600, loss[loss=0.1331, simple_loss=0.2151, pruned_loss=0.0256, over 4984.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2099, pruned_loss=0.03164, over 972507.15 frames.], batch size: 39, lr: 1.68e-04 2022-05-07 18:18:08,968 INFO [train.py:715] (0/8) Epoch 13, batch 13650, loss[loss=0.119, simple_loss=0.1938, pruned_loss=0.02208, over 4883.00 frames.], tot_loss[loss=0.1376, simple_loss=0.211, pruned_loss=0.03214, over 971769.92 frames.], batch size: 22, lr: 1.68e-04 2022-05-07 18:18:47,084 INFO [train.py:715] (0/8) Epoch 13, batch 13700, loss[loss=0.1595, simple_loss=0.2422, pruned_loss=0.03846, over 4803.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2111, pruned_loss=0.03223, over 972456.18 frames.], batch size: 17, lr: 1.68e-04 2022-05-07 18:19:24,722 INFO [train.py:715] (0/8) Epoch 13, batch 13750, loss[loss=0.09153, simple_loss=0.1664, pruned_loss=0.008325, over 4814.00 frames.], tot_loss[loss=0.1378, simple_loss=0.211, pruned_loss=0.03232, over 972787.18 frames.], batch size: 27, lr: 1.68e-04 2022-05-07 18:20:03,318 INFO [train.py:715] (0/8) Epoch 13, batch 13800, loss[loss=0.1312, simple_loss=0.2178, pruned_loss=0.0223, over 4818.00 frames.], tot_loss[loss=0.1377, simple_loss=0.211, pruned_loss=0.03225, over 971765.01 frames.], batch size: 25, lr: 1.68e-04 2022-05-07 18:20:41,457 INFO [train.py:715] (0/8) Epoch 13, batch 13850, loss[loss=0.1261, simple_loss=0.2002, pruned_loss=0.02601, over 4978.00 frames.], tot_loss[loss=0.1369, simple_loss=0.21, pruned_loss=0.03188, over 972268.35 frames.], batch size: 25, lr: 1.68e-04 2022-05-07 18:21:19,870 INFO [train.py:715] (0/8) Epoch 13, batch 13900, loss[loss=0.1152, simple_loss=0.1837, pruned_loss=0.02334, over 4796.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2092, pruned_loss=0.03178, over 972007.59 frames.], batch size: 25, lr: 1.68e-04 2022-05-07 18:21:58,632 INFO [train.py:715] (0/8) Epoch 13, batch 13950, loss[loss=0.1334, simple_loss=0.2115, pruned_loss=0.02765, over 4905.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2094, pruned_loss=0.03141, over 972241.44 frames.], batch size: 19, lr: 1.68e-04 2022-05-07 18:22:37,438 INFO [train.py:715] (0/8) Epoch 13, batch 14000, 
loss[loss=0.1161, simple_loss=0.1844, pruned_loss=0.02395, over 4938.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2088, pruned_loss=0.03104, over 972996.39 frames.], batch size: 35, lr: 1.68e-04 2022-05-07 18:23:15,658 INFO [train.py:715] (0/8) Epoch 13, batch 14050, loss[loss=0.1086, simple_loss=0.1809, pruned_loss=0.01813, over 4759.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2083, pruned_loss=0.03065, over 972245.47 frames.], batch size: 19, lr: 1.68e-04 2022-05-07 18:23:53,250 INFO [train.py:715] (0/8) Epoch 13, batch 14100, loss[loss=0.1266, simple_loss=0.195, pruned_loss=0.02914, over 4826.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2086, pruned_loss=0.03091, over 972250.23 frames.], batch size: 15, lr: 1.68e-04 2022-05-07 18:24:32,479 INFO [train.py:715] (0/8) Epoch 13, batch 14150, loss[loss=0.1291, simple_loss=0.2091, pruned_loss=0.02457, over 4893.00 frames.], tot_loss[loss=0.136, simple_loss=0.2098, pruned_loss=0.03111, over 972662.21 frames.], batch size: 22, lr: 1.68e-04 2022-05-07 18:25:10,627 INFO [train.py:715] (0/8) Epoch 13, batch 14200, loss[loss=0.1179, simple_loss=0.1968, pruned_loss=0.01948, over 4955.00 frames.], tot_loss[loss=0.136, simple_loss=0.2097, pruned_loss=0.03113, over 972558.22 frames.], batch size: 24, lr: 1.68e-04 2022-05-07 18:25:48,507 INFO [train.py:715] (0/8) Epoch 13, batch 14250, loss[loss=0.1481, simple_loss=0.2242, pruned_loss=0.036, over 4875.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2094, pruned_loss=0.03113, over 972425.57 frames.], batch size: 32, lr: 1.68e-04 2022-05-07 18:26:26,771 INFO [train.py:715] (0/8) Epoch 13, batch 14300, loss[loss=0.1491, simple_loss=0.2213, pruned_loss=0.03846, over 4881.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2089, pruned_loss=0.0307, over 972749.22 frames.], batch size: 19, lr: 1.68e-04 2022-05-07 18:27:06,168 INFO [train.py:715] (0/8) Epoch 13, batch 14350, loss[loss=0.1287, simple_loss=0.2049, pruned_loss=0.02628, over 4931.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2096, pruned_loss=0.03102, over 973018.95 frames.], batch size: 29, lr: 1.68e-04 2022-05-07 18:27:44,506 INFO [train.py:715] (0/8) Epoch 13, batch 14400, loss[loss=0.1167, simple_loss=0.1869, pruned_loss=0.02321, over 4848.00 frames.], tot_loss[loss=0.1352, simple_loss=0.209, pruned_loss=0.03066, over 973026.49 frames.], batch size: 30, lr: 1.68e-04 2022-05-07 18:28:22,433 INFO [train.py:715] (0/8) Epoch 13, batch 14450, loss[loss=0.1311, simple_loss=0.2062, pruned_loss=0.02797, over 4979.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2091, pruned_loss=0.03103, over 973054.96 frames.], batch size: 25, lr: 1.68e-04 2022-05-07 18:29:01,541 INFO [train.py:715] (0/8) Epoch 13, batch 14500, loss[loss=0.1461, simple_loss=0.2284, pruned_loss=0.03191, over 4932.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2088, pruned_loss=0.03108, over 972898.48 frames.], batch size: 29, lr: 1.68e-04 2022-05-07 18:29:40,341 INFO [train.py:715] (0/8) Epoch 13, batch 14550, loss[loss=0.1519, simple_loss=0.2174, pruned_loss=0.04319, over 4863.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2094, pruned_loss=0.03162, over 972782.51 frames.], batch size: 20, lr: 1.68e-04 2022-05-07 18:30:18,691 INFO [train.py:715] (0/8) Epoch 13, batch 14600, loss[loss=0.143, simple_loss=0.221, pruned_loss=0.03249, over 4808.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2096, pruned_loss=0.03187, over 973308.21 frames.], batch size: 26, lr: 1.68e-04 2022-05-07 18:30:57,056 INFO [train.py:715] (0/8) Epoch 13, batch 14650, loss[loss=0.1093, 
simple_loss=0.1815, pruned_loss=0.0185, over 4818.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2099, pruned_loss=0.03185, over 972395.88 frames.], batch size: 13, lr: 1.68e-04 2022-05-07 18:31:35,705 INFO [train.py:715] (0/8) Epoch 13, batch 14700, loss[loss=0.1285, simple_loss=0.1961, pruned_loss=0.03044, over 4858.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2094, pruned_loss=0.03119, over 972254.44 frames.], batch size: 32, lr: 1.68e-04 2022-05-07 18:32:13,644 INFO [train.py:715] (0/8) Epoch 13, batch 14750, loss[loss=0.1526, simple_loss=0.2196, pruned_loss=0.04275, over 4973.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2099, pruned_loss=0.03141, over 972787.20 frames.], batch size: 15, lr: 1.68e-04 2022-05-07 18:32:50,797 INFO [train.py:715] (0/8) Epoch 13, batch 14800, loss[loss=0.1663, simple_loss=0.2234, pruned_loss=0.05463, over 4740.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2098, pruned_loss=0.03138, over 972926.05 frames.], batch size: 16, lr: 1.68e-04 2022-05-07 18:33:29,887 INFO [train.py:715] (0/8) Epoch 13, batch 14850, loss[loss=0.138, simple_loss=0.2042, pruned_loss=0.03589, over 4972.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2094, pruned_loss=0.03116, over 973518.42 frames.], batch size: 25, lr: 1.68e-04 2022-05-07 18:34:08,567 INFO [train.py:715] (0/8) Epoch 13, batch 14900, loss[loss=0.1339, simple_loss=0.1993, pruned_loss=0.0342, over 4895.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2096, pruned_loss=0.03135, over 972708.89 frames.], batch size: 19, lr: 1.68e-04 2022-05-07 18:34:46,489 INFO [train.py:715] (0/8) Epoch 13, batch 14950, loss[loss=0.1462, simple_loss=0.2194, pruned_loss=0.03649, over 4965.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2095, pruned_loss=0.03142, over 972333.10 frames.], batch size: 35, lr: 1.68e-04 2022-05-07 18:35:24,992 INFO [train.py:715] (0/8) Epoch 13, batch 15000, loss[loss=0.1208, simple_loss=0.1928, pruned_loss=0.02442, over 4912.00 frames.], tot_loss[loss=0.1356, simple_loss=0.209, pruned_loss=0.03104, over 972096.18 frames.], batch size: 18, lr: 1.68e-04 2022-05-07 18:35:24,993 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 18:35:34,568 INFO [train.py:742] (0/8) Epoch 13, validation: loss=0.1052, simple_loss=0.189, pruned_loss=0.01074, over 914524.00 frames. 
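Each record pairs a per-batch loss "over N frames" with a tot_loss reported over roughly 970k frames, i.e. an average weighted by frame counts rather than by batches. The log does not show how train.py maintains that aggregate, so the snippet below is only an illustrative way to keep a frames-weighted running average, using two per-batch values copied from the entries above (batches 14000 and 14050); it is not the actual bookkeeping in the training script.

# Illustrative frames-weighted running average of per-batch losses.
class FramesWeightedAverage:
    def __init__(self) -> None:
        self.loss_sum = 0.0  # sum of loss * frames
        self.frames = 0.0    # total frames accumulated

    def update(self, batch_loss: float, batch_frames: float) -> None:
        self.loss_sum += batch_loss * batch_frames
        self.frames += batch_frames

    @property
    def value(self) -> float:
        return self.loss_sum / max(self.frames, 1.0)

avg = FramesWeightedAverage()
avg.update(0.1161, 4938.0)  # Epoch 13, batch 14000
avg.update(0.1086, 4759.0)  # Epoch 13, batch 14050
print(f"average over {avg.frames:.0f} frames: {avg.value:.4f}")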
2022-05-07 18:36:13,158 INFO [train.py:715] (0/8) Epoch 13, batch 15050, loss[loss=0.1433, simple_loss=0.2177, pruned_loss=0.03445, over 4778.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2088, pruned_loss=0.0309, over 971462.01 frames.], batch size: 18, lr: 1.68e-04 2022-05-07 18:36:52,710 INFO [train.py:715] (0/8) Epoch 13, batch 15100, loss[loss=0.1463, simple_loss=0.2258, pruned_loss=0.03337, over 4930.00 frames.], tot_loss[loss=0.1345, simple_loss=0.208, pruned_loss=0.03047, over 971817.32 frames.], batch size: 39, lr: 1.68e-04 2022-05-07 18:37:31,191 INFO [train.py:715] (0/8) Epoch 13, batch 15150, loss[loss=0.1178, simple_loss=0.1917, pruned_loss=0.02198, over 4802.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2082, pruned_loss=0.03032, over 972456.07 frames.], batch size: 21, lr: 1.68e-04 2022-05-07 18:38:09,443 INFO [train.py:715] (0/8) Epoch 13, batch 15200, loss[loss=0.1489, simple_loss=0.2125, pruned_loss=0.04269, over 4862.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2078, pruned_loss=0.0303, over 972834.79 frames.], batch size: 30, lr: 1.68e-04 2022-05-07 18:38:49,226 INFO [train.py:715] (0/8) Epoch 13, batch 15250, loss[loss=0.1003, simple_loss=0.1627, pruned_loss=0.01891, over 4811.00 frames.], tot_loss[loss=0.1343, simple_loss=0.208, pruned_loss=0.03027, over 972899.43 frames.], batch size: 12, lr: 1.68e-04 2022-05-07 18:39:27,972 INFO [train.py:715] (0/8) Epoch 13, batch 15300, loss[loss=0.1398, simple_loss=0.209, pruned_loss=0.03536, over 4865.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2092, pruned_loss=0.03105, over 972351.91 frames.], batch size: 22, lr: 1.68e-04 2022-05-07 18:40:06,012 INFO [train.py:715] (0/8) Epoch 13, batch 15350, loss[loss=0.1376, simple_loss=0.2146, pruned_loss=0.03027, over 4947.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2098, pruned_loss=0.03115, over 972679.47 frames.], batch size: 23, lr: 1.68e-04 2022-05-07 18:40:45,011 INFO [train.py:715] (0/8) Epoch 13, batch 15400, loss[loss=0.1585, simple_loss=0.2285, pruned_loss=0.04429, over 4986.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2102, pruned_loss=0.03157, over 971949.30 frames.], batch size: 31, lr: 1.68e-04 2022-05-07 18:41:23,900 INFO [train.py:715] (0/8) Epoch 13, batch 15450, loss[loss=0.1232, simple_loss=0.1997, pruned_loss=0.02333, over 4838.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2102, pruned_loss=0.0312, over 972228.04 frames.], batch size: 15, lr: 1.68e-04 2022-05-07 18:42:03,710 INFO [train.py:715] (0/8) Epoch 13, batch 15500, loss[loss=0.1348, simple_loss=0.2077, pruned_loss=0.03094, over 4757.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2102, pruned_loss=0.03121, over 972303.96 frames.], batch size: 19, lr: 1.68e-04 2022-05-07 18:42:41,959 INFO [train.py:715] (0/8) Epoch 13, batch 15550, loss[loss=0.1334, simple_loss=0.2027, pruned_loss=0.03204, over 4755.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2095, pruned_loss=0.03063, over 972317.90 frames.], batch size: 19, lr: 1.68e-04 2022-05-07 18:43:21,695 INFO [train.py:715] (0/8) Epoch 13, batch 15600, loss[loss=0.126, simple_loss=0.2071, pruned_loss=0.02243, over 4922.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2096, pruned_loss=0.03081, over 972208.38 frames.], batch size: 29, lr: 1.68e-04 2022-05-07 18:44:01,137 INFO [train.py:715] (0/8) Epoch 13, batch 15650, loss[loss=0.1455, simple_loss=0.216, pruned_loss=0.03746, over 4785.00 frames.], tot_loss[loss=0.136, simple_loss=0.2098, pruned_loss=0.03111, over 973022.13 frames.], batch size: 17, lr: 1.68e-04 2022-05-07 18:44:39,622 
INFO [train.py:715] (0/8) Epoch 13, batch 15700, loss[loss=0.131, simple_loss=0.2078, pruned_loss=0.02709, over 4981.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2094, pruned_loss=0.031, over 973295.28 frames.], batch size: 28, lr: 1.68e-04 2022-05-07 18:45:18,631 INFO [train.py:715] (0/8) Epoch 13, batch 15750, loss[loss=0.1248, simple_loss=0.2084, pruned_loss=0.02057, over 4865.00 frames.], tot_loss[loss=0.137, simple_loss=0.2106, pruned_loss=0.03172, over 973459.41 frames.], batch size: 20, lr: 1.68e-04 2022-05-07 18:45:57,409 INFO [train.py:715] (0/8) Epoch 13, batch 15800, loss[loss=0.1389, simple_loss=0.2133, pruned_loss=0.03229, over 4919.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2111, pruned_loss=0.03164, over 973041.41 frames.], batch size: 29, lr: 1.68e-04 2022-05-07 18:46:35,692 INFO [train.py:715] (0/8) Epoch 13, batch 15850, loss[loss=0.1107, simple_loss=0.1892, pruned_loss=0.01613, over 4823.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2099, pruned_loss=0.0312, over 972270.86 frames.], batch size: 26, lr: 1.68e-04 2022-05-07 18:47:13,601 INFO [train.py:715] (0/8) Epoch 13, batch 15900, loss[loss=0.1397, simple_loss=0.2069, pruned_loss=0.03628, over 4853.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2097, pruned_loss=0.03137, over 971950.57 frames.], batch size: 32, lr: 1.68e-04 2022-05-07 18:47:52,834 INFO [train.py:715] (0/8) Epoch 13, batch 15950, loss[loss=0.1292, simple_loss=0.2081, pruned_loss=0.02518, over 4930.00 frames.], tot_loss[loss=0.136, simple_loss=0.2094, pruned_loss=0.0313, over 972210.16 frames.], batch size: 23, lr: 1.68e-04 2022-05-07 18:48:31,346 INFO [train.py:715] (0/8) Epoch 13, batch 16000, loss[loss=0.1279, simple_loss=0.2022, pruned_loss=0.02678, over 4946.00 frames.], tot_loss[loss=0.1358, simple_loss=0.209, pruned_loss=0.03125, over 972598.77 frames.], batch size: 21, lr: 1.68e-04 2022-05-07 18:49:09,599 INFO [train.py:715] (0/8) Epoch 13, batch 16050, loss[loss=0.145, simple_loss=0.2282, pruned_loss=0.0309, over 4819.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2096, pruned_loss=0.03158, over 972450.82 frames.], batch size: 15, lr: 1.68e-04 2022-05-07 18:49:48,078 INFO [train.py:715] (0/8) Epoch 13, batch 16100, loss[loss=0.1572, simple_loss=0.2401, pruned_loss=0.03717, over 4945.00 frames.], tot_loss[loss=0.137, simple_loss=0.2104, pruned_loss=0.03176, over 971771.40 frames.], batch size: 21, lr: 1.68e-04 2022-05-07 18:50:27,333 INFO [train.py:715] (0/8) Epoch 13, batch 16150, loss[loss=0.1455, simple_loss=0.216, pruned_loss=0.03747, over 4956.00 frames.], tot_loss[loss=0.1365, simple_loss=0.21, pruned_loss=0.03154, over 971433.34 frames.], batch size: 35, lr: 1.68e-04 2022-05-07 18:51:05,992 INFO [train.py:715] (0/8) Epoch 13, batch 16200, loss[loss=0.1353, simple_loss=0.211, pruned_loss=0.02977, over 4809.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2104, pruned_loss=0.03171, over 971556.25 frames.], batch size: 26, lr: 1.68e-04 2022-05-07 18:51:42,921 INFO [train.py:715] (0/8) Epoch 13, batch 16250, loss[loss=0.1237, simple_loss=0.1958, pruned_loss=0.0258, over 4931.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2109, pruned_loss=0.03173, over 972015.45 frames.], batch size: 21, lr: 1.68e-04 2022-05-07 18:52:22,098 INFO [train.py:715] (0/8) Epoch 13, batch 16300, loss[loss=0.1315, simple_loss=0.2086, pruned_loss=0.02725, over 4646.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2093, pruned_loss=0.031, over 971574.90 frames.], batch size: 13, lr: 1.68e-04 2022-05-07 18:53:00,696 INFO [train.py:715] (0/8) Epoch 13, 
batch 16350, loss[loss=0.1126, simple_loss=0.1791, pruned_loss=0.02303, over 4655.00 frames.], tot_loss[loss=0.1363, simple_loss=0.21, pruned_loss=0.03131, over 971719.05 frames.], batch size: 13, lr: 1.68e-04 2022-05-07 18:53:39,042 INFO [train.py:715] (0/8) Epoch 13, batch 16400, loss[loss=0.1312, simple_loss=0.1992, pruned_loss=0.03157, over 4817.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2102, pruned_loss=0.03144, over 972092.04 frames.], batch size: 26, lr: 1.68e-04 2022-05-07 18:54:18,186 INFO [train.py:715] (0/8) Epoch 13, batch 16450, loss[loss=0.1422, simple_loss=0.2253, pruned_loss=0.02952, over 4896.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2105, pruned_loss=0.03126, over 972459.02 frames.], batch size: 39, lr: 1.68e-04 2022-05-07 18:54:57,399 INFO [train.py:715] (0/8) Epoch 13, batch 16500, loss[loss=0.1405, simple_loss=0.2265, pruned_loss=0.02725, over 4903.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2098, pruned_loss=0.03088, over 972362.04 frames.], batch size: 19, lr: 1.68e-04 2022-05-07 18:55:36,541 INFO [train.py:715] (0/8) Epoch 13, batch 16550, loss[loss=0.1487, simple_loss=0.222, pruned_loss=0.03766, over 4746.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2102, pruned_loss=0.03108, over 972552.29 frames.], batch size: 16, lr: 1.68e-04 2022-05-07 18:56:13,922 INFO [train.py:715] (0/8) Epoch 13, batch 16600, loss[loss=0.1473, simple_loss=0.2277, pruned_loss=0.03351, over 4788.00 frames.], tot_loss[loss=0.136, simple_loss=0.2101, pruned_loss=0.03094, over 971542.33 frames.], batch size: 18, lr: 1.68e-04 2022-05-07 18:56:53,162 INFO [train.py:715] (0/8) Epoch 13, batch 16650, loss[loss=0.1604, simple_loss=0.2301, pruned_loss=0.04532, over 4832.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2096, pruned_loss=0.03052, over 971697.60 frames.], batch size: 15, lr: 1.68e-04 2022-05-07 18:57:31,697 INFO [train.py:715] (0/8) Epoch 13, batch 16700, loss[loss=0.142, simple_loss=0.2232, pruned_loss=0.03041, over 4801.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2103, pruned_loss=0.03116, over 971028.49 frames.], batch size: 24, lr: 1.68e-04 2022-05-07 18:58:09,689 INFO [train.py:715] (0/8) Epoch 13, batch 16750, loss[loss=0.14, simple_loss=0.206, pruned_loss=0.03703, over 4810.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2107, pruned_loss=0.03142, over 971286.29 frames.], batch size: 21, lr: 1.68e-04 2022-05-07 18:58:48,287 INFO [train.py:715] (0/8) Epoch 13, batch 16800, loss[loss=0.1709, simple_loss=0.2348, pruned_loss=0.05353, over 4829.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2109, pruned_loss=0.03167, over 971458.53 frames.], batch size: 27, lr: 1.68e-04 2022-05-07 18:59:27,920 INFO [train.py:715] (0/8) Epoch 13, batch 16850, loss[loss=0.1308, simple_loss=0.2064, pruned_loss=0.02759, over 4882.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2113, pruned_loss=0.03185, over 971435.53 frames.], batch size: 16, lr: 1.68e-04 2022-05-07 19:00:06,295 INFO [train.py:715] (0/8) Epoch 13, batch 16900, loss[loss=0.1199, simple_loss=0.1919, pruned_loss=0.02397, over 4977.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2111, pruned_loss=0.03137, over 972259.29 frames.], batch size: 28, lr: 1.68e-04 2022-05-07 19:00:44,801 INFO [train.py:715] (0/8) Epoch 13, batch 16950, loss[loss=0.1292, simple_loss=0.2054, pruned_loss=0.02648, over 4962.00 frames.], tot_loss[loss=0.137, simple_loss=0.2112, pruned_loss=0.03143, over 972821.92 frames.], batch size: 29, lr: 1.68e-04 2022-05-07 19:01:23,720 INFO [train.py:715] (0/8) Epoch 13, batch 17000, 
loss[loss=0.1692, simple_loss=0.2555, pruned_loss=0.0415, over 4940.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2105, pruned_loss=0.03159, over 973234.45 frames.], batch size: 23, lr: 1.68e-04 2022-05-07 19:02:02,411 INFO [train.py:715] (0/8) Epoch 13, batch 17050, loss[loss=0.1845, simple_loss=0.2607, pruned_loss=0.05413, over 4912.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2104, pruned_loss=0.03161, over 972979.59 frames.], batch size: 39, lr: 1.68e-04 2022-05-07 19:02:40,527 INFO [train.py:715] (0/8) Epoch 13, batch 17100, loss[loss=0.1323, simple_loss=0.2136, pruned_loss=0.0255, over 4863.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2109, pruned_loss=0.03177, over 973088.48 frames.], batch size: 20, lr: 1.68e-04 2022-05-07 19:03:19,258 INFO [train.py:715] (0/8) Epoch 13, batch 17150, loss[loss=0.1287, simple_loss=0.201, pruned_loss=0.02814, over 4705.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2115, pruned_loss=0.03177, over 973125.79 frames.], batch size: 15, lr: 1.68e-04 2022-05-07 19:03:58,097 INFO [train.py:715] (0/8) Epoch 13, batch 17200, loss[loss=0.1314, simple_loss=0.2094, pruned_loss=0.02675, over 4805.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2111, pruned_loss=0.03167, over 973546.66 frames.], batch size: 21, lr: 1.68e-04 2022-05-07 19:04:36,804 INFO [train.py:715] (0/8) Epoch 13, batch 17250, loss[loss=0.1162, simple_loss=0.1856, pruned_loss=0.02344, over 4817.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2106, pruned_loss=0.0315, over 973089.88 frames.], batch size: 25, lr: 1.68e-04 2022-05-07 19:05:14,798 INFO [train.py:715] (0/8) Epoch 13, batch 17300, loss[loss=0.1156, simple_loss=0.1886, pruned_loss=0.02125, over 4939.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2108, pruned_loss=0.03183, over 972693.57 frames.], batch size: 23, lr: 1.68e-04 2022-05-07 19:05:53,535 INFO [train.py:715] (0/8) Epoch 13, batch 17350, loss[loss=0.1378, simple_loss=0.2181, pruned_loss=0.02872, over 4942.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2113, pruned_loss=0.0318, over 973024.76 frames.], batch size: 23, lr: 1.68e-04 2022-05-07 19:06:32,449 INFO [train.py:715] (0/8) Epoch 13, batch 17400, loss[loss=0.1141, simple_loss=0.1923, pruned_loss=0.01795, over 4809.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2116, pruned_loss=0.03206, over 971209.18 frames.], batch size: 26, lr: 1.68e-04 2022-05-07 19:07:10,067 INFO [train.py:715] (0/8) Epoch 13, batch 17450, loss[loss=0.1259, simple_loss=0.2035, pruned_loss=0.02416, over 4858.00 frames.], tot_loss[loss=0.138, simple_loss=0.2118, pruned_loss=0.03212, over 971023.18 frames.], batch size: 20, lr: 1.68e-04 2022-05-07 19:07:48,566 INFO [train.py:715] (0/8) Epoch 13, batch 17500, loss[loss=0.1519, simple_loss=0.2202, pruned_loss=0.04179, over 4814.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2112, pruned_loss=0.03193, over 971457.94 frames.], batch size: 27, lr: 1.68e-04 2022-05-07 19:08:27,653 INFO [train.py:715] (0/8) Epoch 13, batch 17550, loss[loss=0.1488, simple_loss=0.2231, pruned_loss=0.03722, over 4892.00 frames.], tot_loss[loss=0.137, simple_loss=0.2106, pruned_loss=0.0317, over 971110.10 frames.], batch size: 19, lr: 1.68e-04 2022-05-07 19:09:06,323 INFO [train.py:715] (0/8) Epoch 13, batch 17600, loss[loss=0.1182, simple_loss=0.1914, pruned_loss=0.02249, over 4776.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2112, pruned_loss=0.03186, over 972137.58 frames.], batch size: 18, lr: 1.68e-04 2022-05-07 19:09:43,966 INFO [train.py:715] (0/8) Epoch 13, batch 17650, loss[loss=0.1393, 
simple_loss=0.2188, pruned_loss=0.02987, over 4882.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2106, pruned_loss=0.03147, over 972731.67 frames.], batch size: 38, lr: 1.68e-04 2022-05-07 19:10:23,205 INFO [train.py:715] (0/8) Epoch 13, batch 17700, loss[loss=0.1277, simple_loss=0.2022, pruned_loss=0.02657, over 4781.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2108, pruned_loss=0.03199, over 971709.09 frames.], batch size: 14, lr: 1.68e-04 2022-05-07 19:11:02,059 INFO [train.py:715] (0/8) Epoch 13, batch 17750, loss[loss=0.118, simple_loss=0.2, pruned_loss=0.01805, over 4830.00 frames.], tot_loss[loss=0.137, simple_loss=0.2104, pruned_loss=0.03185, over 972233.96 frames.], batch size: 26, lr: 1.68e-04 2022-05-07 19:11:39,677 INFO [train.py:715] (0/8) Epoch 13, batch 17800, loss[loss=0.1302, simple_loss=0.199, pruned_loss=0.03065, over 4877.00 frames.], tot_loss[loss=0.136, simple_loss=0.2091, pruned_loss=0.0314, over 972167.66 frames.], batch size: 22, lr: 1.68e-04 2022-05-07 19:12:18,450 INFO [train.py:715] (0/8) Epoch 13, batch 17850, loss[loss=0.128, simple_loss=0.1993, pruned_loss=0.02835, over 4936.00 frames.], tot_loss[loss=0.136, simple_loss=0.2092, pruned_loss=0.0314, over 972301.72 frames.], batch size: 18, lr: 1.68e-04 2022-05-07 19:12:57,282 INFO [train.py:715] (0/8) Epoch 13, batch 17900, loss[loss=0.1297, simple_loss=0.2118, pruned_loss=0.02377, over 4798.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2093, pruned_loss=0.03126, over 972122.58 frames.], batch size: 21, lr: 1.68e-04 2022-05-07 19:13:35,474 INFO [train.py:715] (0/8) Epoch 13, batch 17950, loss[loss=0.1194, simple_loss=0.1938, pruned_loss=0.02252, over 4826.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2094, pruned_loss=0.0312, over 972994.77 frames.], batch size: 26, lr: 1.68e-04 2022-05-07 19:14:13,536 INFO [train.py:715] (0/8) Epoch 13, batch 18000, loss[loss=0.1152, simple_loss=0.1852, pruned_loss=0.02264, over 4786.00 frames.], tot_loss[loss=0.137, simple_loss=0.2107, pruned_loss=0.03163, over 972489.54 frames.], batch size: 17, lr: 1.68e-04 2022-05-07 19:14:13,537 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 19:14:23,029 INFO [train.py:742] (0/8) Epoch 13, validation: loss=0.1055, simple_loss=0.1892, pruned_loss=0.01083, over 914524.00 frames. 
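Across this stretch the logged lr drifts down slowly, from 1.70e-04 near batch 7250 to 1.67e-04 by around batch 19150, as the global batch index grows. The scheduler itself is not reproduced in the log, so the snippet below is only a generic sketch of a smooth inverse-power decay in the batch count; base_lr, decay_batches and the exponent are placeholder values chosen for illustration, not the constants used in this run.

# Generic smooth LR decay in the global batch index (illustrative only).
def decayed_lr(base_lr: float, batch_idx: int, decay_batches: float) -> float:
    # Placeholder functional form: lr shrinks like (1 + t / decay_batches)^-0.5.
    return base_lr * (1.0 + batch_idx / decay_batches) ** -0.5

for step in (100_000, 300_000, 470_000):
    print(step, f"{decayed_lr(1e-3, step, 5_000.0):.2e}")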
2022-05-07 19:15:00,693 INFO [train.py:715] (0/8) Epoch 13, batch 18050, loss[loss=0.1254, simple_loss=0.2034, pruned_loss=0.02372, over 4982.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2099, pruned_loss=0.03131, over 972491.48 frames.], batch size: 15, lr: 1.68e-04 2022-05-07 19:15:39,770 INFO [train.py:715] (0/8) Epoch 13, batch 18100, loss[loss=0.1395, simple_loss=0.2176, pruned_loss=0.03063, over 4880.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2098, pruned_loss=0.03133, over 972085.17 frames.], batch size: 16, lr: 1.68e-04 2022-05-07 19:16:18,120 INFO [train.py:715] (0/8) Epoch 13, batch 18150, loss[loss=0.1463, simple_loss=0.2353, pruned_loss=0.02861, over 4767.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2104, pruned_loss=0.03145, over 971863.01 frames.], batch size: 18, lr: 1.68e-04 2022-05-07 19:16:55,370 INFO [train.py:715] (0/8) Epoch 13, batch 18200, loss[loss=0.126, simple_loss=0.2152, pruned_loss=0.01835, over 4813.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2104, pruned_loss=0.03125, over 972073.10 frames.], batch size: 25, lr: 1.68e-04 2022-05-07 19:17:33,688 INFO [train.py:715] (0/8) Epoch 13, batch 18250, loss[loss=0.1183, simple_loss=0.1948, pruned_loss=0.02084, over 4778.00 frames.], tot_loss[loss=0.137, simple_loss=0.2107, pruned_loss=0.0317, over 972599.37 frames.], batch size: 17, lr: 1.68e-04 2022-05-07 19:18:12,480 INFO [train.py:715] (0/8) Epoch 13, batch 18300, loss[loss=0.1346, simple_loss=0.2125, pruned_loss=0.02832, over 4889.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2099, pruned_loss=0.0315, over 973267.91 frames.], batch size: 22, lr: 1.68e-04 2022-05-07 19:18:51,119 INFO [train.py:715] (0/8) Epoch 13, batch 18350, loss[loss=0.1377, simple_loss=0.2144, pruned_loss=0.0305, over 4985.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2091, pruned_loss=0.03106, over 973211.92 frames.], batch size: 25, lr: 1.68e-04 2022-05-07 19:19:29,019 INFO [train.py:715] (0/8) Epoch 13, batch 18400, loss[loss=0.1325, simple_loss=0.1997, pruned_loss=0.03265, over 4866.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2103, pruned_loss=0.03173, over 972653.37 frames.], batch size: 32, lr: 1.68e-04 2022-05-07 19:20:07,823 INFO [train.py:715] (0/8) Epoch 13, batch 18450, loss[loss=0.1558, simple_loss=0.2241, pruned_loss=0.04371, over 4846.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2099, pruned_loss=0.03131, over 971918.95 frames.], batch size: 34, lr: 1.68e-04 2022-05-07 19:20:46,494 INFO [train.py:715] (0/8) Epoch 13, batch 18500, loss[loss=0.1504, simple_loss=0.2248, pruned_loss=0.03798, over 4985.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2098, pruned_loss=0.0314, over 971224.29 frames.], batch size: 20, lr: 1.68e-04 2022-05-07 19:21:23,935 INFO [train.py:715] (0/8) Epoch 13, batch 18550, loss[loss=0.1483, simple_loss=0.2185, pruned_loss=0.03906, over 4696.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2102, pruned_loss=0.0317, over 970929.43 frames.], batch size: 15, lr: 1.68e-04 2022-05-07 19:22:01,956 INFO [train.py:715] (0/8) Epoch 13, batch 18600, loss[loss=0.133, simple_loss=0.2071, pruned_loss=0.02941, over 4802.00 frames.], tot_loss[loss=0.136, simple_loss=0.2093, pruned_loss=0.03135, over 969930.36 frames.], batch size: 14, lr: 1.68e-04 2022-05-07 19:22:40,552 INFO [train.py:715] (0/8) Epoch 13, batch 18650, loss[loss=0.1405, simple_loss=0.2076, pruned_loss=0.03665, over 4863.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2096, pruned_loss=0.03154, over 970129.64 frames.], batch size: 32, lr: 1.68e-04 2022-05-07 19:23:18,501 
INFO [train.py:715] (0/8) Epoch 13, batch 18700, loss[loss=0.1298, simple_loss=0.2075, pruned_loss=0.02611, over 4845.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2108, pruned_loss=0.03206, over 970227.71 frames.], batch size: 12, lr: 1.68e-04 2022-05-07 19:23:56,287 INFO [train.py:715] (0/8) Epoch 13, batch 18750, loss[loss=0.119, simple_loss=0.1917, pruned_loss=0.0231, over 4940.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2103, pruned_loss=0.03212, over 970442.64 frames.], batch size: 23, lr: 1.68e-04 2022-05-07 19:24:35,594 INFO [train.py:715] (0/8) Epoch 13, batch 18800, loss[loss=0.1122, simple_loss=0.184, pruned_loss=0.02019, over 4817.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2102, pruned_loss=0.03167, over 971475.68 frames.], batch size: 13, lr: 1.68e-04 2022-05-07 19:25:14,013 INFO [train.py:715] (0/8) Epoch 13, batch 18850, loss[loss=0.1203, simple_loss=0.1987, pruned_loss=0.02095, over 4828.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2101, pruned_loss=0.03177, over 971755.01 frames.], batch size: 26, lr: 1.68e-04 2022-05-07 19:25:52,017 INFO [train.py:715] (0/8) Epoch 13, batch 18900, loss[loss=0.1289, simple_loss=0.2071, pruned_loss=0.02535, over 4983.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2096, pruned_loss=0.03146, over 971641.06 frames.], batch size: 24, lr: 1.68e-04 2022-05-07 19:26:30,880 INFO [train.py:715] (0/8) Epoch 13, batch 18950, loss[loss=0.1402, simple_loss=0.2041, pruned_loss=0.03821, over 4857.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2103, pruned_loss=0.03164, over 970718.36 frames.], batch size: 30, lr: 1.68e-04 2022-05-07 19:27:09,765 INFO [train.py:715] (0/8) Epoch 13, batch 19000, loss[loss=0.1231, simple_loss=0.1952, pruned_loss=0.02547, over 4839.00 frames.], tot_loss[loss=0.136, simple_loss=0.2094, pruned_loss=0.03135, over 971761.61 frames.], batch size: 15, lr: 1.68e-04 2022-05-07 19:27:48,112 INFO [train.py:715] (0/8) Epoch 13, batch 19050, loss[loss=0.181, simple_loss=0.2548, pruned_loss=0.05358, over 4701.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2096, pruned_loss=0.03101, over 971132.52 frames.], batch size: 15, lr: 1.68e-04 2022-05-07 19:28:26,434 INFO [train.py:715] (0/8) Epoch 13, batch 19100, loss[loss=0.1157, simple_loss=0.185, pruned_loss=0.02322, over 4886.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2093, pruned_loss=0.03091, over 970906.25 frames.], batch size: 16, lr: 1.68e-04 2022-05-07 19:29:05,439 INFO [train.py:715] (0/8) Epoch 13, batch 19150, loss[loss=0.1827, simple_loss=0.246, pruned_loss=0.05971, over 4750.00 frames.], tot_loss[loss=0.1364, simple_loss=0.21, pruned_loss=0.03143, over 970515.31 frames.], batch size: 16, lr: 1.67e-04 2022-05-07 19:29:44,100 INFO [train.py:715] (0/8) Epoch 13, batch 19200, loss[loss=0.1256, simple_loss=0.2047, pruned_loss=0.02321, over 4768.00 frames.], tot_loss[loss=0.136, simple_loss=0.2093, pruned_loss=0.03139, over 970773.33 frames.], batch size: 16, lr: 1.67e-04 2022-05-07 19:30:21,499 INFO [train.py:715] (0/8) Epoch 13, batch 19250, loss[loss=0.1138, simple_loss=0.1868, pruned_loss=0.02042, over 4798.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2088, pruned_loss=0.03105, over 971294.46 frames.], batch size: 13, lr: 1.67e-04 2022-05-07 19:31:00,076 INFO [train.py:715] (0/8) Epoch 13, batch 19300, loss[loss=0.1869, simple_loss=0.2494, pruned_loss=0.06226, over 4977.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2088, pruned_loss=0.0312, over 970720.60 frames.], batch size: 35, lr: 1.67e-04 2022-05-07 19:31:39,540 INFO [train.py:715] (0/8) 
Epoch 13, batch 19350, loss[loss=0.1287, simple_loss=0.2027, pruned_loss=0.02737, over 4689.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2091, pruned_loss=0.03134, over 970213.83 frames.], batch size: 15, lr: 1.67e-04 2022-05-07 19:32:18,079 INFO [train.py:715] (0/8) Epoch 13, batch 19400, loss[loss=0.189, simple_loss=0.2555, pruned_loss=0.06128, over 4910.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2089, pruned_loss=0.03072, over 970338.60 frames.], batch size: 18, lr: 1.67e-04 2022-05-07 19:32:56,514 INFO [train.py:715] (0/8) Epoch 13, batch 19450, loss[loss=0.1175, simple_loss=0.1935, pruned_loss=0.02072, over 4824.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2095, pruned_loss=0.03108, over 970915.42 frames.], batch size: 13, lr: 1.67e-04 2022-05-07 19:33:17,809 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-472000.pt 2022-05-07 19:33:37,806 INFO [train.py:715] (0/8) Epoch 13, batch 19500, loss[loss=0.1369, simple_loss=0.2037, pruned_loss=0.03507, over 4766.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2094, pruned_loss=0.03077, over 971088.74 frames.], batch size: 14, lr: 1.67e-04 2022-05-07 19:34:16,748 INFO [train.py:715] (0/8) Epoch 13, batch 19550, loss[loss=0.1497, simple_loss=0.2192, pruned_loss=0.04009, over 4764.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2096, pruned_loss=0.03084, over 971318.19 frames.], batch size: 19, lr: 1.67e-04 2022-05-07 19:34:54,316 INFO [train.py:715] (0/8) Epoch 13, batch 19600, loss[loss=0.1528, simple_loss=0.2216, pruned_loss=0.04196, over 4886.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2094, pruned_loss=0.03085, over 970955.32 frames.], batch size: 32, lr: 1.67e-04 2022-05-07 19:35:32,446 INFO [train.py:715] (0/8) Epoch 13, batch 19650, loss[loss=0.1587, simple_loss=0.2209, pruned_loss=0.0483, over 4968.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2096, pruned_loss=0.03108, over 971731.84 frames.], batch size: 35, lr: 1.67e-04 2022-05-07 19:36:11,253 INFO [train.py:715] (0/8) Epoch 13, batch 19700, loss[loss=0.1304, simple_loss=0.2089, pruned_loss=0.02597, over 4827.00 frames.], tot_loss[loss=0.136, simple_loss=0.2099, pruned_loss=0.03103, over 972430.28 frames.], batch size: 26, lr: 1.67e-04 2022-05-07 19:36:49,082 INFO [train.py:715] (0/8) Epoch 13, batch 19750, loss[loss=0.1402, simple_loss=0.2116, pruned_loss=0.03441, over 4969.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2104, pruned_loss=0.03144, over 971709.55 frames.], batch size: 14, lr: 1.67e-04 2022-05-07 19:37:26,938 INFO [train.py:715] (0/8) Epoch 13, batch 19800, loss[loss=0.1034, simple_loss=0.1665, pruned_loss=0.02012, over 4987.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2101, pruned_loss=0.03131, over 971792.95 frames.], batch size: 14, lr: 1.67e-04 2022-05-07 19:38:05,609 INFO [train.py:715] (0/8) Epoch 13, batch 19850, loss[loss=0.1336, simple_loss=0.2026, pruned_loss=0.03228, over 4748.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2099, pruned_loss=0.03101, over 971881.55 frames.], batch size: 16, lr: 1.67e-04 2022-05-07 19:38:44,221 INFO [train.py:715] (0/8) Epoch 13, batch 19900, loss[loss=0.1356, simple_loss=0.215, pruned_loss=0.02804, over 4811.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2102, pruned_loss=0.03123, over 971805.66 frames.], batch size: 13, lr: 1.67e-04 2022-05-07 19:39:22,423 INFO [train.py:715] (0/8) Epoch 13, batch 19950, loss[loss=0.1134, simple_loss=0.1916, pruned_loss=0.01763, over 4969.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2104, 
pruned_loss=0.03123, over 972299.87 frames.], batch size: 24, lr: 1.67e-04 2022-05-07 19:40:01,311 INFO [train.py:715] (0/8) Epoch 13, batch 20000, loss[loss=0.1204, simple_loss=0.201, pruned_loss=0.01995, over 4920.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2104, pruned_loss=0.03137, over 972418.46 frames.], batch size: 23, lr: 1.67e-04 2022-05-07 19:40:39,754 INFO [train.py:715] (0/8) Epoch 13, batch 20050, loss[loss=0.1384, simple_loss=0.2201, pruned_loss=0.02831, over 4921.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2109, pruned_loss=0.03197, over 972275.27 frames.], batch size: 29, lr: 1.67e-04 2022-05-07 19:41:16,928 INFO [train.py:715] (0/8) Epoch 13, batch 20100, loss[loss=0.1313, simple_loss=0.2046, pruned_loss=0.02893, over 4931.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2099, pruned_loss=0.03127, over 972129.91 frames.], batch size: 23, lr: 1.67e-04 2022-05-07 19:41:54,378 INFO [train.py:715] (0/8) Epoch 13, batch 20150, loss[loss=0.1361, simple_loss=0.2076, pruned_loss=0.03229, over 4928.00 frames.], tot_loss[loss=0.136, simple_loss=0.2097, pruned_loss=0.03117, over 971884.64 frames.], batch size: 17, lr: 1.67e-04 2022-05-07 19:42:33,105 INFO [train.py:715] (0/8) Epoch 13, batch 20200, loss[loss=0.1194, simple_loss=0.203, pruned_loss=0.01791, over 4812.00 frames.], tot_loss[loss=0.136, simple_loss=0.2097, pruned_loss=0.03118, over 972443.82 frames.], batch size: 21, lr: 1.67e-04 2022-05-07 19:43:11,185 INFO [train.py:715] (0/8) Epoch 13, batch 20250, loss[loss=0.1157, simple_loss=0.1925, pruned_loss=0.01946, over 4764.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2093, pruned_loss=0.03122, over 972009.54 frames.], batch size: 16, lr: 1.67e-04 2022-05-07 19:43:48,889 INFO [train.py:715] (0/8) Epoch 13, batch 20300, loss[loss=0.1299, simple_loss=0.2013, pruned_loss=0.02929, over 4965.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2083, pruned_loss=0.0307, over 970829.40 frames.], batch size: 35, lr: 1.67e-04 2022-05-07 19:44:26,991 INFO [train.py:715] (0/8) Epoch 13, batch 20350, loss[loss=0.1166, simple_loss=0.1919, pruned_loss=0.02059, over 4853.00 frames.], tot_loss[loss=0.135, simple_loss=0.2082, pruned_loss=0.03087, over 971195.46 frames.], batch size: 20, lr: 1.67e-04 2022-05-07 19:45:05,759 INFO [train.py:715] (0/8) Epoch 13, batch 20400, loss[loss=0.1427, simple_loss=0.218, pruned_loss=0.03373, over 4918.00 frames.], tot_loss[loss=0.1356, simple_loss=0.209, pruned_loss=0.03109, over 972174.16 frames.], batch size: 18, lr: 1.67e-04 2022-05-07 19:45:43,490 INFO [train.py:715] (0/8) Epoch 13, batch 20450, loss[loss=0.151, simple_loss=0.2237, pruned_loss=0.03914, over 4897.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2092, pruned_loss=0.03104, over 972853.97 frames.], batch size: 39, lr: 1.67e-04 2022-05-07 19:46:21,263 INFO [train.py:715] (0/8) Epoch 13, batch 20500, loss[loss=0.1312, simple_loss=0.2068, pruned_loss=0.0278, over 4878.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2093, pruned_loss=0.03095, over 972419.99 frames.], batch size: 16, lr: 1.67e-04 2022-05-07 19:46:59,822 INFO [train.py:715] (0/8) Epoch 13, batch 20550, loss[loss=0.1174, simple_loss=0.1957, pruned_loss=0.01954, over 4821.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2095, pruned_loss=0.03092, over 971626.87 frames.], batch size: 26, lr: 1.67e-04 2022-05-07 19:47:37,478 INFO [train.py:715] (0/8) Epoch 13, batch 20600, loss[loss=0.1512, simple_loss=0.2172, pruned_loss=0.04259, over 4857.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2102, pruned_loss=0.0315, over 
972039.38 frames.], batch size: 32, lr: 1.67e-04 2022-05-07 19:48:15,105 INFO [train.py:715] (0/8) Epoch 13, batch 20650, loss[loss=0.159, simple_loss=0.237, pruned_loss=0.04054, over 4881.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2097, pruned_loss=0.03133, over 972346.80 frames.], batch size: 16, lr: 1.67e-04 2022-05-07 19:48:52,911 INFO [train.py:715] (0/8) Epoch 13, batch 20700, loss[loss=0.1174, simple_loss=0.1978, pruned_loss=0.01852, over 4803.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2098, pruned_loss=0.03119, over 971925.22 frames.], batch size: 13, lr: 1.67e-04 2022-05-07 19:49:31,348 INFO [train.py:715] (0/8) Epoch 13, batch 20750, loss[loss=0.1651, simple_loss=0.2421, pruned_loss=0.04406, over 4935.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2092, pruned_loss=0.03075, over 972006.44 frames.], batch size: 18, lr: 1.67e-04 2022-05-07 19:50:08,693 INFO [train.py:715] (0/8) Epoch 13, batch 20800, loss[loss=0.1288, simple_loss=0.2101, pruned_loss=0.02372, over 4913.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2093, pruned_loss=0.03084, over 973045.13 frames.], batch size: 17, lr: 1.67e-04 2022-05-07 19:50:46,279 INFO [train.py:715] (0/8) Epoch 13, batch 20850, loss[loss=0.1366, simple_loss=0.2265, pruned_loss=0.02338, over 4947.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2092, pruned_loss=0.03092, over 971759.86 frames.], batch size: 21, lr: 1.67e-04 2022-05-07 19:51:24,966 INFO [train.py:715] (0/8) Epoch 13, batch 20900, loss[loss=0.1578, simple_loss=0.2321, pruned_loss=0.04175, over 4746.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2089, pruned_loss=0.03105, over 972709.79 frames.], batch size: 16, lr: 1.67e-04 2022-05-07 19:52:03,237 INFO [train.py:715] (0/8) Epoch 13, batch 20950, loss[loss=0.1523, simple_loss=0.2269, pruned_loss=0.03882, over 4868.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2093, pruned_loss=0.03101, over 971985.84 frames.], batch size: 20, lr: 1.67e-04 2022-05-07 19:52:40,744 INFO [train.py:715] (0/8) Epoch 13, batch 21000, loss[loss=0.1392, simple_loss=0.219, pruned_loss=0.02973, over 4810.00 frames.], tot_loss[loss=0.1356, simple_loss=0.209, pruned_loss=0.03106, over 971765.79 frames.], batch size: 21, lr: 1.67e-04 2022-05-07 19:52:40,745 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 19:52:50,264 INFO [train.py:742] (0/8) Epoch 13, validation: loss=0.1054, simple_loss=0.1891, pruned_loss=0.01084, over 914524.00 frames. 
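Note on the three numbers reported per entry: in this excerpt the printed totals are consistent with total loss ≈ 0.5 · simple_loss + 1.0 · pruned_loss. The 0.5 weight is inferred by fitting the values printed in the log itself, not stated in this excerpt, so treat it as an assumption. A minimal Python check against the validation entry just above (loss=0.1054, simple_loss=0.1891, pruned_loss=0.01084):

    # Sanity check. Assumption: total = 0.5 * simple_loss + 1.0 * pruned_loss,
    # inferred only from the numbers printed in this log excerpt.
    def combined_loss(simple_loss: float, pruned_loss: float,
                      simple_scale: float = 0.5, pruned_scale: float = 1.0) -> float:
        return simple_scale * simple_loss + pruned_scale * pruned_loss

    # Values copied from the "Epoch 13, validation" entry above.
    reported = 0.1054
    reconstructed = combined_loss(simple_loss=0.1891, pruned_loss=0.01084)
    print(f"reconstructed={reconstructed:.4f} vs reported={reported:.4f}")
    # reconstructed=0.1054 vs reported=0.1054 (agrees to the printed precision)
    # The per-batch tot_loss entries check out the same way, e.g. batch 21000:
    # 0.5 * 0.209 + 0.03106 = 0.1356, matching the logged tot_loss.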
2022-05-07 19:53:28,431 INFO [train.py:715] (0/8) Epoch 13, batch 21050, loss[loss=0.1313, simple_loss=0.1955, pruned_loss=0.03355, over 4874.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2088, pruned_loss=0.03119, over 971912.97 frames.], batch size: 34, lr: 1.67e-04 2022-05-07 19:54:06,965 INFO [train.py:715] (0/8) Epoch 13, batch 21100, loss[loss=0.1124, simple_loss=0.1907, pruned_loss=0.01712, over 4938.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2091, pruned_loss=0.03111, over 972965.72 frames.], batch size: 29, lr: 1.67e-04 2022-05-07 19:54:46,054 INFO [train.py:715] (0/8) Epoch 13, batch 21150, loss[loss=0.172, simple_loss=0.2359, pruned_loss=0.0541, over 4974.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2092, pruned_loss=0.03131, over 972750.03 frames.], batch size: 15, lr: 1.67e-04 2022-05-07 19:55:23,876 INFO [train.py:715] (0/8) Epoch 13, batch 21200, loss[loss=0.1277, simple_loss=0.202, pruned_loss=0.0267, over 4839.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2102, pruned_loss=0.03163, over 972552.57 frames.], batch size: 30, lr: 1.67e-04 2022-05-07 19:56:02,463 INFO [train.py:715] (0/8) Epoch 13, batch 21250, loss[loss=0.1424, simple_loss=0.2144, pruned_loss=0.03516, over 4985.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2107, pruned_loss=0.03174, over 972636.20 frames.], batch size: 25, lr: 1.67e-04 2022-05-07 19:56:41,274 INFO [train.py:715] (0/8) Epoch 13, batch 21300, loss[loss=0.1701, simple_loss=0.2207, pruned_loss=0.05975, over 4961.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2102, pruned_loss=0.0315, over 972196.93 frames.], batch size: 35, lr: 1.67e-04 2022-05-07 19:57:19,152 INFO [train.py:715] (0/8) Epoch 13, batch 21350, loss[loss=0.1213, simple_loss=0.2059, pruned_loss=0.01834, over 4845.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2106, pruned_loss=0.03154, over 972300.27 frames.], batch size: 15, lr: 1.67e-04 2022-05-07 19:57:57,080 INFO [train.py:715] (0/8) Epoch 13, batch 21400, loss[loss=0.1333, simple_loss=0.2158, pruned_loss=0.02539, over 4936.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2094, pruned_loss=0.03122, over 971623.46 frames.], batch size: 29, lr: 1.67e-04 2022-05-07 19:58:35,342 INFO [train.py:715] (0/8) Epoch 13, batch 21450, loss[loss=0.1134, simple_loss=0.1931, pruned_loss=0.01688, over 4890.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2101, pruned_loss=0.03145, over 971993.53 frames.], batch size: 16, lr: 1.67e-04 2022-05-07 19:59:14,497 INFO [train.py:715] (0/8) Epoch 13, batch 21500, loss[loss=0.1359, simple_loss=0.203, pruned_loss=0.03442, over 4826.00 frames.], tot_loss[loss=0.1367, simple_loss=0.21, pruned_loss=0.0317, over 972932.18 frames.], batch size: 26, lr: 1.67e-04 2022-05-07 19:59:52,260 INFO [train.py:715] (0/8) Epoch 13, batch 21550, loss[loss=0.1317, simple_loss=0.2031, pruned_loss=0.03019, over 4985.00 frames.], tot_loss[loss=0.136, simple_loss=0.2093, pruned_loss=0.03137, over 972695.87 frames.], batch size: 28, lr: 1.67e-04 2022-05-07 20:00:30,895 INFO [train.py:715] (0/8) Epoch 13, batch 21600, loss[loss=0.1264, simple_loss=0.1971, pruned_loss=0.02784, over 4988.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2085, pruned_loss=0.03088, over 971980.42 frames.], batch size: 28, lr: 1.67e-04 2022-05-07 20:01:09,860 INFO [train.py:715] (0/8) Epoch 13, batch 21650, loss[loss=0.1432, simple_loss=0.2166, pruned_loss=0.03486, over 4851.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2079, pruned_loss=0.03059, over 972044.01 frames.], batch size: 20, lr: 1.67e-04 2022-05-07 20:01:48,620 INFO 
[train.py:715] (0/8) Epoch 13, batch 21700, loss[loss=0.1305, simple_loss=0.2045, pruned_loss=0.02821, over 4697.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2088, pruned_loss=0.03069, over 972710.48 frames.], batch size: 15, lr: 1.67e-04 2022-05-07 20:02:27,464 INFO [train.py:715] (0/8) Epoch 13, batch 21750, loss[loss=0.1413, simple_loss=0.2124, pruned_loss=0.03513, over 4972.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2086, pruned_loss=0.03031, over 972667.01 frames.], batch size: 35, lr: 1.67e-04 2022-05-07 20:03:06,111 INFO [train.py:715] (0/8) Epoch 13, batch 21800, loss[loss=0.1228, simple_loss=0.1866, pruned_loss=0.02944, over 4807.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2093, pruned_loss=0.0307, over 973219.95 frames.], batch size: 21, lr: 1.67e-04 2022-05-07 20:03:45,410 INFO [train.py:715] (0/8) Epoch 13, batch 21850, loss[loss=0.1295, simple_loss=0.2107, pruned_loss=0.02414, over 4816.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2092, pruned_loss=0.03098, over 972961.47 frames.], batch size: 13, lr: 1.67e-04 2022-05-07 20:04:23,519 INFO [train.py:715] (0/8) Epoch 13, batch 21900, loss[loss=0.1777, simple_loss=0.2376, pruned_loss=0.05888, over 4989.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2098, pruned_loss=0.03134, over 973193.69 frames.], batch size: 25, lr: 1.67e-04 2022-05-07 20:05:01,700 INFO [train.py:715] (0/8) Epoch 13, batch 21950, loss[loss=0.109, simple_loss=0.1747, pruned_loss=0.02168, over 4849.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2096, pruned_loss=0.03113, over 973484.07 frames.], batch size: 13, lr: 1.67e-04 2022-05-07 20:05:40,173 INFO [train.py:715] (0/8) Epoch 13, batch 22000, loss[loss=0.1406, simple_loss=0.2119, pruned_loss=0.03465, over 4982.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2089, pruned_loss=0.03031, over 973319.41 frames.], batch size: 15, lr: 1.67e-04 2022-05-07 20:06:17,896 INFO [train.py:715] (0/8) Epoch 13, batch 22050, loss[loss=0.138, simple_loss=0.2161, pruned_loss=0.02989, over 4963.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2082, pruned_loss=0.03028, over 972649.61 frames.], batch size: 24, lr: 1.67e-04 2022-05-07 20:06:55,940 INFO [train.py:715] (0/8) Epoch 13, batch 22100, loss[loss=0.1324, simple_loss=0.203, pruned_loss=0.03092, over 4828.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2093, pruned_loss=0.03079, over 973096.44 frames.], batch size: 12, lr: 1.67e-04 2022-05-07 20:07:33,690 INFO [train.py:715] (0/8) Epoch 13, batch 22150, loss[loss=0.1406, simple_loss=0.2094, pruned_loss=0.03587, over 4859.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2096, pruned_loss=0.03055, over 973188.07 frames.], batch size: 20, lr: 1.67e-04 2022-05-07 20:08:12,645 INFO [train.py:715] (0/8) Epoch 13, batch 22200, loss[loss=0.1226, simple_loss=0.1901, pruned_loss=0.02756, over 4642.00 frames.], tot_loss[loss=0.1349, simple_loss=0.209, pruned_loss=0.03042, over 972284.52 frames.], batch size: 13, lr: 1.67e-04 2022-05-07 20:08:50,191 INFO [train.py:715] (0/8) Epoch 13, batch 22250, loss[loss=0.1544, simple_loss=0.2297, pruned_loss=0.03956, over 4782.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2094, pruned_loss=0.03062, over 970537.71 frames.], batch size: 17, lr: 1.67e-04 2022-05-07 20:09:28,952 INFO [train.py:715] (0/8) Epoch 13, batch 22300, loss[loss=0.1758, simple_loss=0.2553, pruned_loss=0.04814, over 4980.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2107, pruned_loss=0.03104, over 970379.42 frames.], batch size: 39, lr: 1.67e-04 2022-05-07 20:10:07,700 INFO [train.py:715] (0/8) 
Epoch 13, batch 22350, loss[loss=0.1471, simple_loss=0.2155, pruned_loss=0.03931, over 4942.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2102, pruned_loss=0.03075, over 970129.26 frames.], batch size: 24, lr: 1.67e-04 2022-05-07 20:10:45,728 INFO [train.py:715] (0/8) Epoch 13, batch 22400, loss[loss=0.1399, simple_loss=0.2128, pruned_loss=0.03352, over 4774.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2106, pruned_loss=0.03108, over 970576.42 frames.], batch size: 14, lr: 1.67e-04 2022-05-07 20:11:23,409 INFO [train.py:715] (0/8) Epoch 13, batch 22450, loss[loss=0.1199, simple_loss=0.1851, pruned_loss=0.02734, over 4819.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2108, pruned_loss=0.0317, over 970726.68 frames.], batch size: 15, lr: 1.67e-04 2022-05-07 20:12:01,253 INFO [train.py:715] (0/8) Epoch 13, batch 22500, loss[loss=0.1362, simple_loss=0.2021, pruned_loss=0.03519, over 4797.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2108, pruned_loss=0.03151, over 971012.07 frames.], batch size: 21, lr: 1.67e-04 2022-05-07 20:12:39,609 INFO [train.py:715] (0/8) Epoch 13, batch 22550, loss[loss=0.09844, simple_loss=0.1722, pruned_loss=0.01233, over 4806.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2098, pruned_loss=0.03132, over 971112.94 frames.], batch size: 26, lr: 1.67e-04 2022-05-07 20:13:16,801 INFO [train.py:715] (0/8) Epoch 13, batch 22600, loss[loss=0.1572, simple_loss=0.2225, pruned_loss=0.04591, over 4790.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2096, pruned_loss=0.03092, over 972275.29 frames.], batch size: 14, lr: 1.67e-04 2022-05-07 20:13:54,708 INFO [train.py:715] (0/8) Epoch 13, batch 22650, loss[loss=0.1587, simple_loss=0.2316, pruned_loss=0.04292, over 4913.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2103, pruned_loss=0.03133, over 972041.50 frames.], batch size: 23, lr: 1.67e-04 2022-05-07 20:14:32,800 INFO [train.py:715] (0/8) Epoch 13, batch 22700, loss[loss=0.147, simple_loss=0.2262, pruned_loss=0.03391, over 4799.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2105, pruned_loss=0.03154, over 972592.60 frames.], batch size: 21, lr: 1.67e-04 2022-05-07 20:15:11,031 INFO [train.py:715] (0/8) Epoch 13, batch 22750, loss[loss=0.1589, simple_loss=0.2299, pruned_loss=0.04399, over 4792.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2107, pruned_loss=0.0314, over 971507.62 frames.], batch size: 14, lr: 1.67e-04 2022-05-07 20:15:49,011 INFO [train.py:715] (0/8) Epoch 13, batch 22800, loss[loss=0.1247, simple_loss=0.1947, pruned_loss=0.02729, over 4967.00 frames.], tot_loss[loss=0.137, simple_loss=0.2112, pruned_loss=0.03143, over 971995.00 frames.], batch size: 15, lr: 1.67e-04 2022-05-07 20:16:27,578 INFO [train.py:715] (0/8) Epoch 13, batch 22850, loss[loss=0.1663, simple_loss=0.2331, pruned_loss=0.04979, over 4982.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2107, pruned_loss=0.03105, over 972792.26 frames.], batch size: 35, lr: 1.67e-04 2022-05-07 20:17:06,828 INFO [train.py:715] (0/8) Epoch 13, batch 22900, loss[loss=0.1297, simple_loss=0.1985, pruned_loss=0.03048, over 4965.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2111, pruned_loss=0.03171, over 973114.70 frames.], batch size: 14, lr: 1.67e-04 2022-05-07 20:17:44,511 INFO [train.py:715] (0/8) Epoch 13, batch 22950, loss[loss=0.1377, simple_loss=0.213, pruned_loss=0.03117, over 4966.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2108, pruned_loss=0.03121, over 972756.44 frames.], batch size: 39, lr: 1.67e-04 2022-05-07 20:18:23,094 INFO [train.py:715] (0/8) Epoch 13, batch 23000, 
loss[loss=0.1475, simple_loss=0.2222, pruned_loss=0.03636, over 4752.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2104, pruned_loss=0.0312, over 972662.86 frames.], batch size: 16, lr: 1.67e-04 2022-05-07 20:19:01,741 INFO [train.py:715] (0/8) Epoch 13, batch 23050, loss[loss=0.1358, simple_loss=0.2127, pruned_loss=0.02948, over 4788.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2116, pruned_loss=0.03162, over 971438.76 frames.], batch size: 17, lr: 1.67e-04 2022-05-07 20:19:40,068 INFO [train.py:715] (0/8) Epoch 13, batch 23100, loss[loss=0.1572, simple_loss=0.2256, pruned_loss=0.04436, over 4775.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2116, pruned_loss=0.03176, over 971679.37 frames.], batch size: 14, lr: 1.67e-04 2022-05-07 20:20:17,984 INFO [train.py:715] (0/8) Epoch 13, batch 23150, loss[loss=0.1383, simple_loss=0.2147, pruned_loss=0.03094, over 4967.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2113, pruned_loss=0.03169, over 972514.73 frames.], batch size: 35, lr: 1.67e-04 2022-05-07 20:20:56,162 INFO [train.py:715] (0/8) Epoch 13, batch 23200, loss[loss=0.115, simple_loss=0.1884, pruned_loss=0.02079, over 4823.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2104, pruned_loss=0.03134, over 972478.56 frames.], batch size: 12, lr: 1.67e-04 2022-05-07 20:21:34,314 INFO [train.py:715] (0/8) Epoch 13, batch 23250, loss[loss=0.1546, simple_loss=0.2325, pruned_loss=0.03842, over 4974.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2102, pruned_loss=0.03137, over 971994.33 frames.], batch size: 25, lr: 1.67e-04 2022-05-07 20:22:11,783 INFO [train.py:715] (0/8) Epoch 13, batch 23300, loss[loss=0.12, simple_loss=0.1897, pruned_loss=0.02516, over 4748.00 frames.], tot_loss[loss=0.1363, simple_loss=0.21, pruned_loss=0.03124, over 972559.48 frames.], batch size: 16, lr: 1.67e-04 2022-05-07 20:22:50,104 INFO [train.py:715] (0/8) Epoch 13, batch 23350, loss[loss=0.1323, simple_loss=0.2135, pruned_loss=0.02554, over 4756.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2097, pruned_loss=0.03096, over 972106.75 frames.], batch size: 16, lr: 1.67e-04 2022-05-07 20:23:28,674 INFO [train.py:715] (0/8) Epoch 13, batch 23400, loss[loss=0.1287, simple_loss=0.2056, pruned_loss=0.02591, over 4981.00 frames.], tot_loss[loss=0.1353, simple_loss=0.209, pruned_loss=0.0308, over 973303.10 frames.], batch size: 24, lr: 1.67e-04 2022-05-07 20:24:06,991 INFO [train.py:715] (0/8) Epoch 13, batch 23450, loss[loss=0.1514, simple_loss=0.2321, pruned_loss=0.0353, over 4868.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2094, pruned_loss=0.03111, over 974041.50 frames.], batch size: 32, lr: 1.67e-04 2022-05-07 20:24:45,011 INFO [train.py:715] (0/8) Epoch 13, batch 23500, loss[loss=0.1472, simple_loss=0.2137, pruned_loss=0.04034, over 4750.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2096, pruned_loss=0.03106, over 973909.24 frames.], batch size: 16, lr: 1.67e-04 2022-05-07 20:25:23,766 INFO [train.py:715] (0/8) Epoch 13, batch 23550, loss[loss=0.1201, simple_loss=0.1936, pruned_loss=0.0233, over 4750.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2094, pruned_loss=0.03091, over 973973.27 frames.], batch size: 19, lr: 1.67e-04 2022-05-07 20:26:02,265 INFO [train.py:715] (0/8) Epoch 13, batch 23600, loss[loss=0.1122, simple_loss=0.1827, pruned_loss=0.02085, over 4841.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2094, pruned_loss=0.03118, over 972549.29 frames.], batch size: 12, lr: 1.67e-04 2022-05-07 20:26:39,838 INFO [train.py:715] (0/8) Epoch 13, batch 23650, loss[loss=0.1485, 
simple_loss=0.2182, pruned_loss=0.03945, over 4789.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2092, pruned_loss=0.03088, over 972129.33 frames.], batch size: 14, lr: 1.67e-04 2022-05-07 20:27:18,101 INFO [train.py:715] (0/8) Epoch 13, batch 23700, loss[loss=0.1836, simple_loss=0.2614, pruned_loss=0.05294, over 4972.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2099, pruned_loss=0.03081, over 972293.20 frames.], batch size: 39, lr: 1.67e-04 2022-05-07 20:27:56,585 INFO [train.py:715] (0/8) Epoch 13, batch 23750, loss[loss=0.1148, simple_loss=0.1934, pruned_loss=0.01812, over 4929.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2093, pruned_loss=0.03063, over 972267.32 frames.], batch size: 29, lr: 1.67e-04 2022-05-07 20:28:34,757 INFO [train.py:715] (0/8) Epoch 13, batch 23800, loss[loss=0.1244, simple_loss=0.1978, pruned_loss=0.0255, over 4943.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2099, pruned_loss=0.03093, over 972339.23 frames.], batch size: 39, lr: 1.67e-04 2022-05-07 20:29:12,131 INFO [train.py:715] (0/8) Epoch 13, batch 23850, loss[loss=0.1499, simple_loss=0.2306, pruned_loss=0.03462, over 4753.00 frames.], tot_loss[loss=0.1358, simple_loss=0.21, pruned_loss=0.03077, over 971596.79 frames.], batch size: 16, lr: 1.67e-04 2022-05-07 20:29:51,247 INFO [train.py:715] (0/8) Epoch 13, batch 23900, loss[loss=0.1695, simple_loss=0.2603, pruned_loss=0.03939, over 4821.00 frames.], tot_loss[loss=0.1357, simple_loss=0.21, pruned_loss=0.03072, over 970753.75 frames.], batch size: 25, lr: 1.67e-04 2022-05-07 20:30:29,197 INFO [train.py:715] (0/8) Epoch 13, batch 23950, loss[loss=0.1419, simple_loss=0.2086, pruned_loss=0.03761, over 4766.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2092, pruned_loss=0.03073, over 972106.07 frames.], batch size: 16, lr: 1.67e-04 2022-05-07 20:31:06,575 INFO [train.py:715] (0/8) Epoch 13, batch 24000, loss[loss=0.1366, simple_loss=0.2091, pruned_loss=0.03202, over 4941.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2097, pruned_loss=0.03107, over 971753.23 frames.], batch size: 39, lr: 1.67e-04 2022-05-07 20:31:06,576 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 20:31:16,110 INFO [train.py:742] (0/8) Epoch 13, validation: loss=0.1053, simple_loss=0.1891, pruned_loss=0.01069, over 914524.00 frames. 
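If you want to pull the smoothed totals out of entries formatted like the train.py:715 lines above, a small regular expression over the tot_loss[...] field is enough. The helper below is a hypothetical sketch (the regex and function names are illustrative, not part of icefall); the example string is copied from the batch 24000 entry above.

    import re

    # Matches the tot_loss[...] field of train.py:715 entries in this log.
    TOT_LOSS_RE = re.compile(
        r"Epoch (?P<epoch>\d+), batch (?P<batch>\d+).*?"
        r"tot_loss\[loss=(?P<loss>[\d.]+), simple_loss=(?P<simple>[\d.]+), "
        r"pruned_loss=(?P<pruned>[\d.]+), over (?P<frames>[\d.]+) frames\.\]"
    )

    def parse_tot_loss(line: str):
        """Return (epoch, batch, loss, simple_loss, pruned_loss) or None."""
        m = TOT_LOSS_RE.search(line)
        if m is None:
            return None
        return (int(m["epoch"]), int(m["batch"]),
                float(m["loss"]), float(m["simple"]), float(m["pruned"]))

    # Entry copied from the excerpt above.
    example = ("Epoch 13, batch 24000, loss[loss=0.1366, simple_loss=0.2091, "
               "pruned_loss=0.03202, over 4941.00 frames.], tot_loss[loss=0.1359, "
               "simple_loss=0.2097, pruned_loss=0.03107, over 971753.23 frames.], "
               "batch size: 39, lr: 1.67e-04")
    print(parse_tot_loss(example))  # (13, 24000, 0.1359, 0.2097, 0.03107)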
2022-05-07 20:31:53,720 INFO [train.py:715] (0/8) Epoch 13, batch 24050, loss[loss=0.1245, simple_loss=0.2014, pruned_loss=0.02382, over 4977.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2095, pruned_loss=0.03087, over 972457.44 frames.], batch size: 28, lr: 1.67e-04 2022-05-07 20:32:31,538 INFO [train.py:715] (0/8) Epoch 13, batch 24100, loss[loss=0.1632, simple_loss=0.2513, pruned_loss=0.0376, over 4854.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2094, pruned_loss=0.03111, over 972760.62 frames.], batch size: 20, lr: 1.67e-04 2022-05-07 20:33:10,915 INFO [train.py:715] (0/8) Epoch 13, batch 24150, loss[loss=0.1509, simple_loss=0.2262, pruned_loss=0.03776, over 4943.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2087, pruned_loss=0.03058, over 973015.92 frames.], batch size: 21, lr: 1.67e-04 2022-05-07 20:33:49,882 INFO [train.py:715] (0/8) Epoch 13, batch 24200, loss[loss=0.1315, simple_loss=0.2132, pruned_loss=0.02487, over 4924.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2084, pruned_loss=0.03053, over 972193.12 frames.], batch size: 23, lr: 1.67e-04 2022-05-07 20:34:28,084 INFO [train.py:715] (0/8) Epoch 13, batch 24250, loss[loss=0.1293, simple_loss=0.198, pruned_loss=0.03034, over 4770.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2076, pruned_loss=0.03032, over 972683.31 frames.], batch size: 14, lr: 1.67e-04 2022-05-07 20:35:06,948 INFO [train.py:715] (0/8) Epoch 13, batch 24300, loss[loss=0.1181, simple_loss=0.1926, pruned_loss=0.02184, over 4925.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2075, pruned_loss=0.0305, over 973470.69 frames.], batch size: 23, lr: 1.67e-04 2022-05-07 20:35:45,646 INFO [train.py:715] (0/8) Epoch 13, batch 24350, loss[loss=0.1375, simple_loss=0.2102, pruned_loss=0.03238, over 4764.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2084, pruned_loss=0.03062, over 972650.48 frames.], batch size: 18, lr: 1.67e-04 2022-05-07 20:36:23,173 INFO [train.py:715] (0/8) Epoch 13, batch 24400, loss[loss=0.1533, simple_loss=0.2329, pruned_loss=0.03685, over 4770.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2097, pruned_loss=0.03105, over 972942.91 frames.], batch size: 19, lr: 1.67e-04 2022-05-07 20:37:01,580 INFO [train.py:715] (0/8) Epoch 13, batch 24450, loss[loss=0.134, simple_loss=0.2122, pruned_loss=0.02786, over 4826.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2097, pruned_loss=0.03098, over 972726.95 frames.], batch size: 26, lr: 1.67e-04 2022-05-07 20:37:40,233 INFO [train.py:715] (0/8) Epoch 13, batch 24500, loss[loss=0.1525, simple_loss=0.2192, pruned_loss=0.0429, over 4768.00 frames.], tot_loss[loss=0.136, simple_loss=0.2098, pruned_loss=0.03114, over 972690.01 frames.], batch size: 18, lr: 1.67e-04 2022-05-07 20:38:18,540 INFO [train.py:715] (0/8) Epoch 13, batch 24550, loss[loss=0.1386, simple_loss=0.2179, pruned_loss=0.02962, over 4955.00 frames.], tot_loss[loss=0.136, simple_loss=0.21, pruned_loss=0.03098, over 972957.96 frames.], batch size: 29, lr: 1.67e-04 2022-05-07 20:38:56,895 INFO [train.py:715] (0/8) Epoch 13, batch 24600, loss[loss=0.1288, simple_loss=0.2055, pruned_loss=0.02609, over 4916.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2107, pruned_loss=0.031, over 972794.08 frames.], batch size: 18, lr: 1.67e-04 2022-05-07 20:39:36,085 INFO [train.py:715] (0/8) Epoch 13, batch 24650, loss[loss=0.1191, simple_loss=0.1863, pruned_loss=0.02592, over 4865.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2098, pruned_loss=0.03091, over 971639.15 frames.], batch size: 13, lr: 1.67e-04 2022-05-07 20:40:14,984 INFO 
[train.py:715] (0/8) Epoch 13, batch 24700, loss[loss=0.1268, simple_loss=0.1978, pruned_loss=0.02785, over 4839.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2091, pruned_loss=0.03077, over 972024.30 frames.], batch size: 15, lr: 1.67e-04 2022-05-07 20:40:52,891 INFO [train.py:715] (0/8) Epoch 13, batch 24750, loss[loss=0.1436, simple_loss=0.2111, pruned_loss=0.03802, over 4767.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2097, pruned_loss=0.03083, over 971703.08 frames.], batch size: 14, lr: 1.67e-04 2022-05-07 20:41:31,281 INFO [train.py:715] (0/8) Epoch 13, batch 24800, loss[loss=0.1449, simple_loss=0.2154, pruned_loss=0.03717, over 4917.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2103, pruned_loss=0.03124, over 972517.29 frames.], batch size: 19, lr: 1.67e-04 2022-05-07 20:42:10,090 INFO [train.py:715] (0/8) Epoch 13, batch 24850, loss[loss=0.1115, simple_loss=0.1871, pruned_loss=0.01792, over 4877.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2104, pruned_loss=0.0312, over 971993.83 frames.], batch size: 16, lr: 1.66e-04 2022-05-07 20:42:48,214 INFO [train.py:715] (0/8) Epoch 13, batch 24900, loss[loss=0.1511, simple_loss=0.2147, pruned_loss=0.04376, over 4894.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2104, pruned_loss=0.03144, over 972996.58 frames.], batch size: 16, lr: 1.66e-04 2022-05-07 20:43:26,332 INFO [train.py:715] (0/8) Epoch 13, batch 24950, loss[loss=0.1256, simple_loss=0.1965, pruned_loss=0.02738, over 4957.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2104, pruned_loss=0.03159, over 973603.70 frames.], batch size: 24, lr: 1.66e-04 2022-05-07 20:44:04,940 INFO [train.py:715] (0/8) Epoch 13, batch 25000, loss[loss=0.1402, simple_loss=0.216, pruned_loss=0.03225, over 4891.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2103, pruned_loss=0.03162, over 973962.04 frames.], batch size: 19, lr: 1.66e-04 2022-05-07 20:44:43,235 INFO [train.py:715] (0/8) Epoch 13, batch 25050, loss[loss=0.1557, simple_loss=0.216, pruned_loss=0.04774, over 4752.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2098, pruned_loss=0.03115, over 973184.40 frames.], batch size: 14, lr: 1.66e-04 2022-05-07 20:45:20,923 INFO [train.py:715] (0/8) Epoch 13, batch 25100, loss[loss=0.1307, simple_loss=0.2089, pruned_loss=0.0263, over 4879.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2098, pruned_loss=0.03091, over 973714.22 frames.], batch size: 22, lr: 1.66e-04 2022-05-07 20:46:00,037 INFO [train.py:715] (0/8) Epoch 13, batch 25150, loss[loss=0.1174, simple_loss=0.1979, pruned_loss=0.0184, over 4898.00 frames.], tot_loss[loss=0.136, simple_loss=0.2098, pruned_loss=0.03103, over 973668.24 frames.], batch size: 17, lr: 1.66e-04 2022-05-07 20:46:38,584 INFO [train.py:715] (0/8) Epoch 13, batch 25200, loss[loss=0.1212, simple_loss=0.1958, pruned_loss=0.02326, over 4810.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2098, pruned_loss=0.03128, over 973643.39 frames.], batch size: 26, lr: 1.66e-04 2022-05-07 20:47:17,715 INFO [train.py:715] (0/8) Epoch 13, batch 25250, loss[loss=0.1415, simple_loss=0.2096, pruned_loss=0.03668, over 4869.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2103, pruned_loss=0.03159, over 973630.96 frames.], batch size: 32, lr: 1.66e-04 2022-05-07 20:47:55,926 INFO [train.py:715] (0/8) Epoch 13, batch 25300, loss[loss=0.1402, simple_loss=0.207, pruned_loss=0.03667, over 4975.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2102, pruned_loss=0.03128, over 974287.30 frames.], batch size: 14, lr: 1.66e-04 2022-05-07 20:48:34,489 INFO [train.py:715] (0/8) 
Epoch 13, batch 25350, loss[loss=0.1559, simple_loss=0.2177, pruned_loss=0.04702, over 4808.00 frames.], tot_loss[loss=0.1362, simple_loss=0.21, pruned_loss=0.03115, over 973020.15 frames.], batch size: 21, lr: 1.66e-04 2022-05-07 20:49:13,700 INFO [train.py:715] (0/8) Epoch 13, batch 25400, loss[loss=0.1276, simple_loss=0.2043, pruned_loss=0.02543, over 4636.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2098, pruned_loss=0.03125, over 971817.60 frames.], batch size: 13, lr: 1.66e-04 2022-05-07 20:49:51,563 INFO [train.py:715] (0/8) Epoch 13, batch 25450, loss[loss=0.1193, simple_loss=0.1897, pruned_loss=0.0244, over 4745.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2098, pruned_loss=0.03102, over 972083.85 frames.], batch size: 19, lr: 1.66e-04 2022-05-07 20:50:30,628 INFO [train.py:715] (0/8) Epoch 13, batch 25500, loss[loss=0.1412, simple_loss=0.2172, pruned_loss=0.03258, over 4959.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2091, pruned_loss=0.03066, over 971978.50 frames.], batch size: 24, lr: 1.66e-04 2022-05-07 20:51:09,200 INFO [train.py:715] (0/8) Epoch 13, batch 25550, loss[loss=0.1392, simple_loss=0.2122, pruned_loss=0.03309, over 4795.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2091, pruned_loss=0.03035, over 971909.05 frames.], batch size: 17, lr: 1.66e-04 2022-05-07 20:51:47,744 INFO [train.py:715] (0/8) Epoch 13, batch 25600, loss[loss=0.1669, simple_loss=0.2296, pruned_loss=0.05214, over 4820.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2088, pruned_loss=0.03068, over 972373.51 frames.], batch size: 13, lr: 1.66e-04 2022-05-07 20:52:25,798 INFO [train.py:715] (0/8) Epoch 13, batch 25650, loss[loss=0.1438, simple_loss=0.2172, pruned_loss=0.03517, over 4967.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2088, pruned_loss=0.03085, over 973116.52 frames.], batch size: 35, lr: 1.66e-04 2022-05-07 20:53:05,131 INFO [train.py:715] (0/8) Epoch 13, batch 25700, loss[loss=0.1241, simple_loss=0.2051, pruned_loss=0.02159, over 4776.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2086, pruned_loss=0.03061, over 973109.72 frames.], batch size: 18, lr: 1.66e-04 2022-05-07 20:53:43,484 INFO [train.py:715] (0/8) Epoch 13, batch 25750, loss[loss=0.1228, simple_loss=0.2041, pruned_loss=0.02072, over 4746.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2093, pruned_loss=0.0311, over 972209.95 frames.], batch size: 19, lr: 1.66e-04 2022-05-07 20:54:21,669 INFO [train.py:715] (0/8) Epoch 13, batch 25800, loss[loss=0.1471, simple_loss=0.2197, pruned_loss=0.03724, over 4964.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2094, pruned_loss=0.03136, over 972964.75 frames.], batch size: 21, lr: 1.66e-04 2022-05-07 20:55:00,561 INFO [train.py:715] (0/8) Epoch 13, batch 25850, loss[loss=0.1122, simple_loss=0.1983, pruned_loss=0.01304, over 4916.00 frames.], tot_loss[loss=0.136, simple_loss=0.2095, pruned_loss=0.03129, over 973041.57 frames.], batch size: 18, lr: 1.66e-04 2022-05-07 20:55:39,353 INFO [train.py:715] (0/8) Epoch 13, batch 25900, loss[loss=0.1435, simple_loss=0.2115, pruned_loss=0.03772, over 4877.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2097, pruned_loss=0.03144, over 974225.41 frames.], batch size: 16, lr: 1.66e-04 2022-05-07 20:56:18,174 INFO [train.py:715] (0/8) Epoch 13, batch 25950, loss[loss=0.1454, simple_loss=0.216, pruned_loss=0.03735, over 4811.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2093, pruned_loss=0.03087, over 974289.66 frames.], batch size: 21, lr: 1.66e-04 2022-05-07 20:56:57,177 INFO [train.py:715] (0/8) Epoch 13, batch 26000, 
loss[loss=0.1636, simple_loss=0.2382, pruned_loss=0.04448, over 4923.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2092, pruned_loss=0.03072, over 973449.32 frames.], batch size: 39, lr: 1.66e-04 2022-05-07 20:57:36,538 INFO [train.py:715] (0/8) Epoch 13, batch 26050, loss[loss=0.1413, simple_loss=0.2192, pruned_loss=0.03169, over 4863.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2095, pruned_loss=0.03135, over 972520.48 frames.], batch size: 34, lr: 1.66e-04 2022-05-07 20:58:15,735 INFO [train.py:715] (0/8) Epoch 13, batch 26100, loss[loss=0.1271, simple_loss=0.2023, pruned_loss=0.02597, over 4764.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2094, pruned_loss=0.03145, over 971684.05 frames.], batch size: 16, lr: 1.66e-04 2022-05-07 20:58:54,119 INFO [train.py:715] (0/8) Epoch 13, batch 26150, loss[loss=0.1307, simple_loss=0.2025, pruned_loss=0.02949, over 4783.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2098, pruned_loss=0.03179, over 972001.42 frames.], batch size: 18, lr: 1.66e-04 2022-05-07 20:59:33,344 INFO [train.py:715] (0/8) Epoch 13, batch 26200, loss[loss=0.141, simple_loss=0.2091, pruned_loss=0.03643, over 4813.00 frames.], tot_loss[loss=0.136, simple_loss=0.2091, pruned_loss=0.03149, over 971828.15 frames.], batch size: 26, lr: 1.66e-04 2022-05-07 21:00:12,166 INFO [train.py:715] (0/8) Epoch 13, batch 26250, loss[loss=0.149, simple_loss=0.2188, pruned_loss=0.03964, over 4967.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2087, pruned_loss=0.03121, over 972992.99 frames.], batch size: 24, lr: 1.66e-04 2022-05-07 21:00:50,341 INFO [train.py:715] (0/8) Epoch 13, batch 26300, loss[loss=0.1711, simple_loss=0.2471, pruned_loss=0.04751, over 4879.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2092, pruned_loss=0.03084, over 972800.46 frames.], batch size: 16, lr: 1.66e-04 2022-05-07 21:01:28,295 INFO [train.py:715] (0/8) Epoch 13, batch 26350, loss[loss=0.1207, simple_loss=0.1994, pruned_loss=0.02104, over 4813.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2089, pruned_loss=0.03081, over 972888.84 frames.], batch size: 25, lr: 1.66e-04 2022-05-07 21:02:07,162 INFO [train.py:715] (0/8) Epoch 13, batch 26400, loss[loss=0.129, simple_loss=0.2049, pruned_loss=0.02661, over 4745.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2092, pruned_loss=0.03101, over 972185.63 frames.], batch size: 19, lr: 1.66e-04 2022-05-07 21:02:46,101 INFO [train.py:715] (0/8) Epoch 13, batch 26450, loss[loss=0.1304, simple_loss=0.2025, pruned_loss=0.02913, over 4642.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2094, pruned_loss=0.03136, over 972110.54 frames.], batch size: 13, lr: 1.66e-04 2022-05-07 21:03:24,274 INFO [train.py:715] (0/8) Epoch 13, batch 26500, loss[loss=0.1217, simple_loss=0.1953, pruned_loss=0.024, over 4833.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2092, pruned_loss=0.03099, over 971867.07 frames.], batch size: 15, lr: 1.66e-04 2022-05-07 21:04:03,395 INFO [train.py:715] (0/8) Epoch 13, batch 26550, loss[loss=0.1461, simple_loss=0.2272, pruned_loss=0.03252, over 4796.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2083, pruned_loss=0.03075, over 972273.56 frames.], batch size: 18, lr: 1.66e-04 2022-05-07 21:04:41,836 INFO [train.py:715] (0/8) Epoch 13, batch 26600, loss[loss=0.1744, simple_loss=0.2332, pruned_loss=0.05774, over 4822.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2083, pruned_loss=0.03096, over 972331.65 frames.], batch size: 15, lr: 1.66e-04 2022-05-07 21:05:20,063 INFO [train.py:715] (0/8) Epoch 13, batch 26650, loss[loss=0.1142, 
simple_loss=0.1867, pruned_loss=0.02087, over 4837.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2088, pruned_loss=0.03108, over 972241.88 frames.], batch size: 13, lr: 1.66e-04 2022-05-07 21:05:58,313 INFO [train.py:715] (0/8) Epoch 13, batch 26700, loss[loss=0.1142, simple_loss=0.1791, pruned_loss=0.02466, over 4830.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2091, pruned_loss=0.03104, over 971484.64 frames.], batch size: 13, lr: 1.66e-04 2022-05-07 21:06:37,479 INFO [train.py:715] (0/8) Epoch 13, batch 26750, loss[loss=0.1296, simple_loss=0.2035, pruned_loss=0.0279, over 4831.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2087, pruned_loss=0.03057, over 971746.06 frames.], batch size: 26, lr: 1.66e-04 2022-05-07 21:07:15,987 INFO [train.py:715] (0/8) Epoch 13, batch 26800, loss[loss=0.1327, simple_loss=0.2157, pruned_loss=0.0249, over 4911.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2087, pruned_loss=0.03042, over 971806.87 frames.], batch size: 19, lr: 1.66e-04 2022-05-07 21:07:54,599 INFO [train.py:715] (0/8) Epoch 13, batch 26850, loss[loss=0.1517, simple_loss=0.2194, pruned_loss=0.04194, over 4740.00 frames.], tot_loss[loss=0.135, simple_loss=0.2088, pruned_loss=0.03061, over 970947.82 frames.], batch size: 16, lr: 1.66e-04 2022-05-07 21:08:33,344 INFO [train.py:715] (0/8) Epoch 13, batch 26900, loss[loss=0.1597, simple_loss=0.2304, pruned_loss=0.04453, over 4765.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2082, pruned_loss=0.03064, over 970885.86 frames.], batch size: 19, lr: 1.66e-04 2022-05-07 21:09:11,792 INFO [train.py:715] (0/8) Epoch 13, batch 26950, loss[loss=0.1308, simple_loss=0.1981, pruned_loss=0.03177, over 4765.00 frames.], tot_loss[loss=0.1359, simple_loss=0.209, pruned_loss=0.0314, over 970722.20 frames.], batch size: 14, lr: 1.66e-04 2022-05-07 21:09:50,375 INFO [train.py:715] (0/8) Epoch 13, batch 27000, loss[loss=0.1502, simple_loss=0.218, pruned_loss=0.04118, over 4808.00 frames.], tot_loss[loss=0.136, simple_loss=0.2093, pruned_loss=0.03138, over 970966.30 frames.], batch size: 14, lr: 1.66e-04 2022-05-07 21:09:50,376 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 21:09:59,936 INFO [train.py:742] (0/8) Epoch 13, validation: loss=0.1053, simple_loss=0.1891, pruned_loss=0.01077, over 914524.00 frames. 
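Within this excerpt, validation losses are logged at epoch-13 batches 21000, 24000 and 27000, i.e. every 3000 training batches, and the validation loss stays essentially flat near 0.105. A tiny check using only the values visible above:

    # (batch, validation loss) pairs copied from the three validation entries above.
    validations = [(21000, 0.1054), (24000, 0.1053), (27000, 0.1053)]

    intervals = [b2 - b1 for (b1, _), (b2, _) in zip(validations, validations[1:])]
    print(intervals)  # [3000, 3000] -> validation logged every 3000 batches here

    spread = max(v for _, v in validations) - min(v for _, v in validations)
    print(f"spread={spread:.4f}")  # spread=0.0001 -> essentially flat in this window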
2022-05-07 21:10:39,024 INFO [train.py:715] (0/8) Epoch 13, batch 27050, loss[loss=0.1199, simple_loss=0.1907, pruned_loss=0.02457, over 4689.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2102, pruned_loss=0.0316, over 971864.37 frames.], batch size: 12, lr: 1.66e-04 2022-05-07 21:11:17,914 INFO [train.py:715] (0/8) Epoch 13, batch 27100, loss[loss=0.1204, simple_loss=0.1837, pruned_loss=0.02857, over 4802.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2101, pruned_loss=0.03145, over 971804.92 frames.], batch size: 12, lr: 1.66e-04 2022-05-07 21:11:57,146 INFO [train.py:715] (0/8) Epoch 13, batch 27150, loss[loss=0.1274, simple_loss=0.203, pruned_loss=0.02593, over 4955.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2108, pruned_loss=0.03181, over 971046.18 frames.], batch size: 21, lr: 1.66e-04 2022-05-07 21:12:36,107 INFO [train.py:715] (0/8) Epoch 13, batch 27200, loss[loss=0.1221, simple_loss=0.1959, pruned_loss=0.02414, over 4699.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2098, pruned_loss=0.03147, over 971835.21 frames.], batch size: 15, lr: 1.66e-04 2022-05-07 21:13:14,908 INFO [train.py:715] (0/8) Epoch 13, batch 27250, loss[loss=0.1071, simple_loss=0.1799, pruned_loss=0.01715, over 4757.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2107, pruned_loss=0.03137, over 971607.31 frames.], batch size: 12, lr: 1.66e-04 2022-05-07 21:13:54,905 INFO [train.py:715] (0/8) Epoch 13, batch 27300, loss[loss=0.1639, simple_loss=0.2359, pruned_loss=0.04599, over 4875.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2098, pruned_loss=0.03069, over 972032.81 frames.], batch size: 16, lr: 1.66e-04 2022-05-07 21:14:33,861 INFO [train.py:715] (0/8) Epoch 13, batch 27350, loss[loss=0.1133, simple_loss=0.1844, pruned_loss=0.0211, over 4948.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2103, pruned_loss=0.03123, over 972168.59 frames.], batch size: 21, lr: 1.66e-04 2022-05-07 21:15:11,630 INFO [train.py:715] (0/8) Epoch 13, batch 27400, loss[loss=0.1549, simple_loss=0.2247, pruned_loss=0.04248, over 4943.00 frames.], tot_loss[loss=0.1363, simple_loss=0.21, pruned_loss=0.03129, over 972749.00 frames.], batch size: 39, lr: 1.66e-04 2022-05-07 21:15:49,741 INFO [train.py:715] (0/8) Epoch 13, batch 27450, loss[loss=0.1702, simple_loss=0.2223, pruned_loss=0.05902, over 4895.00 frames.], tot_loss[loss=0.1366, simple_loss=0.21, pruned_loss=0.03163, over 972502.97 frames.], batch size: 19, lr: 1.66e-04 2022-05-07 21:16:10,070 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-480000.pt 2022-05-07 21:16:30,586 INFO [train.py:715] (0/8) Epoch 13, batch 27500, loss[loss=0.1313, simple_loss=0.2154, pruned_loss=0.02359, over 4709.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2095, pruned_loss=0.03133, over 972398.20 frames.], batch size: 15, lr: 1.66e-04 2022-05-07 21:17:08,826 INFO [train.py:715] (0/8) Epoch 13, batch 27550, loss[loss=0.1355, simple_loss=0.2077, pruned_loss=0.03167, over 4970.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2093, pruned_loss=0.03129, over 973225.95 frames.], batch size: 35, lr: 1.66e-04 2022-05-07 21:17:46,774 INFO [train.py:715] (0/8) Epoch 13, batch 27600, loss[loss=0.1588, simple_loss=0.2323, pruned_loss=0.04263, over 4827.00 frames.], tot_loss[loss=0.135, simple_loss=0.2085, pruned_loss=0.03077, over 973219.69 frames.], batch size: 15, lr: 1.66e-04 2022-05-07 21:18:25,953 INFO [train.py:715] (0/8) Epoch 13, batch 27650, loss[loss=0.1376, simple_loss=0.2146, pruned_loss=0.03029, over 4947.00 frames.], 
tot_loss[loss=0.1355, simple_loss=0.2089, pruned_loss=0.03104, over 972630.87 frames.], batch size: 39, lr: 1.66e-04 2022-05-07 21:19:03,875 INFO [train.py:715] (0/8) Epoch 13, batch 27700, loss[loss=0.1236, simple_loss=0.2016, pruned_loss=0.02277, over 4787.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2082, pruned_loss=0.03084, over 972255.44 frames.], batch size: 17, lr: 1.66e-04 2022-05-07 21:19:42,878 INFO [train.py:715] (0/8) Epoch 13, batch 27750, loss[loss=0.1623, simple_loss=0.2336, pruned_loss=0.04554, over 4936.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2081, pruned_loss=0.03103, over 972114.37 frames.], batch size: 23, lr: 1.66e-04 2022-05-07 21:20:21,390 INFO [train.py:715] (0/8) Epoch 13, batch 27800, loss[loss=0.1686, simple_loss=0.2407, pruned_loss=0.04829, over 4895.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2094, pruned_loss=0.03142, over 972322.57 frames.], batch size: 22, lr: 1.66e-04 2022-05-07 21:21:00,117 INFO [train.py:715] (0/8) Epoch 13, batch 27850, loss[loss=0.1329, simple_loss=0.2156, pruned_loss=0.02506, over 4927.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2093, pruned_loss=0.03151, over 972785.39 frames.], batch size: 18, lr: 1.66e-04 2022-05-07 21:21:38,312 INFO [train.py:715] (0/8) Epoch 13, batch 27900, loss[loss=0.1174, simple_loss=0.1848, pruned_loss=0.02504, over 4774.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2086, pruned_loss=0.0312, over 972167.20 frames.], batch size: 17, lr: 1.66e-04 2022-05-07 21:22:16,093 INFO [train.py:715] (0/8) Epoch 13, batch 27950, loss[loss=0.1251, simple_loss=0.2022, pruned_loss=0.02405, over 4820.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2095, pruned_loss=0.03159, over 971499.67 frames.], batch size: 26, lr: 1.66e-04 2022-05-07 21:22:55,050 INFO [train.py:715] (0/8) Epoch 13, batch 28000, loss[loss=0.1393, simple_loss=0.2131, pruned_loss=0.03277, over 4861.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2098, pruned_loss=0.03176, over 972321.82 frames.], batch size: 13, lr: 1.66e-04 2022-05-07 21:23:33,521 INFO [train.py:715] (0/8) Epoch 13, batch 28050, loss[loss=0.1316, simple_loss=0.1933, pruned_loss=0.03495, over 4789.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2109, pruned_loss=0.03239, over 972556.28 frames.], batch size: 13, lr: 1.66e-04 2022-05-07 21:24:11,551 INFO [train.py:715] (0/8) Epoch 13, batch 28100, loss[loss=0.1382, simple_loss=0.2022, pruned_loss=0.03711, over 4689.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2105, pruned_loss=0.03259, over 972652.95 frames.], batch size: 15, lr: 1.66e-04 2022-05-07 21:24:49,595 INFO [train.py:715] (0/8) Epoch 13, batch 28150, loss[loss=0.1286, simple_loss=0.2046, pruned_loss=0.02628, over 4900.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2103, pruned_loss=0.03194, over 972723.52 frames.], batch size: 22, lr: 1.66e-04 2022-05-07 21:25:28,813 INFO [train.py:715] (0/8) Epoch 13, batch 28200, loss[loss=0.1121, simple_loss=0.1869, pruned_loss=0.01862, over 4769.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2102, pruned_loss=0.03166, over 972400.05 frames.], batch size: 14, lr: 1.66e-04 2022-05-07 21:26:06,608 INFO [train.py:715] (0/8) Epoch 13, batch 28250, loss[loss=0.1319, simple_loss=0.1939, pruned_loss=0.03497, over 4833.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2105, pruned_loss=0.03202, over 972579.51 frames.], batch size: 13, lr: 1.66e-04 2022-05-07 21:26:44,750 INFO [train.py:715] (0/8) Epoch 13, batch 28300, loss[loss=0.1415, simple_loss=0.214, pruned_loss=0.03454, over 4904.00 frames.], 
tot_loss[loss=0.1368, simple_loss=0.2104, pruned_loss=0.03161, over 972413.11 frames.], batch size: 23, lr: 1.66e-04 2022-05-07 21:27:23,458 INFO [train.py:715] (0/8) Epoch 13, batch 28350, loss[loss=0.1262, simple_loss=0.198, pruned_loss=0.02716, over 4948.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2107, pruned_loss=0.03145, over 972674.58 frames.], batch size: 21, lr: 1.66e-04 2022-05-07 21:28:01,605 INFO [train.py:715] (0/8) Epoch 13, batch 28400, loss[loss=0.154, simple_loss=0.2289, pruned_loss=0.03952, over 4934.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2097, pruned_loss=0.03125, over 972383.40 frames.], batch size: 39, lr: 1.66e-04 2022-05-07 21:28:40,043 INFO [train.py:715] (0/8) Epoch 13, batch 28450, loss[loss=0.1312, simple_loss=0.2137, pruned_loss=0.02431, over 4821.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2085, pruned_loss=0.03055, over 971495.29 frames.], batch size: 25, lr: 1.66e-04 2022-05-07 21:29:18,383 INFO [train.py:715] (0/8) Epoch 13, batch 28500, loss[loss=0.1232, simple_loss=0.2053, pruned_loss=0.02049, over 4921.00 frames.], tot_loss[loss=0.1353, simple_loss=0.209, pruned_loss=0.03082, over 972247.33 frames.], batch size: 29, lr: 1.66e-04 2022-05-07 21:29:57,057 INFO [train.py:715] (0/8) Epoch 13, batch 28550, loss[loss=0.1415, simple_loss=0.2175, pruned_loss=0.03273, over 4693.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2103, pruned_loss=0.03147, over 971574.23 frames.], batch size: 15, lr: 1.66e-04 2022-05-07 21:30:35,257 INFO [train.py:715] (0/8) Epoch 13, batch 28600, loss[loss=0.1147, simple_loss=0.186, pruned_loss=0.02171, over 4978.00 frames.], tot_loss[loss=0.137, simple_loss=0.2102, pruned_loss=0.0319, over 972049.65 frames.], batch size: 35, lr: 1.66e-04 2022-05-07 21:31:13,614 INFO [train.py:715] (0/8) Epoch 13, batch 28650, loss[loss=0.1282, simple_loss=0.2028, pruned_loss=0.02682, over 4990.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2104, pruned_loss=0.03212, over 972527.17 frames.], batch size: 16, lr: 1.66e-04 2022-05-07 21:31:52,260 INFO [train.py:715] (0/8) Epoch 13, batch 28700, loss[loss=0.1206, simple_loss=0.1937, pruned_loss=0.02376, over 4966.00 frames.], tot_loss[loss=0.1365, simple_loss=0.21, pruned_loss=0.03151, over 972343.72 frames.], batch size: 24, lr: 1.66e-04 2022-05-07 21:32:30,331 INFO [train.py:715] (0/8) Epoch 13, batch 28750, loss[loss=0.1162, simple_loss=0.1921, pruned_loss=0.02014, over 4817.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2098, pruned_loss=0.03146, over 971986.16 frames.], batch size: 26, lr: 1.66e-04 2022-05-07 21:33:08,634 INFO [train.py:715] (0/8) Epoch 13, batch 28800, loss[loss=0.1493, simple_loss=0.219, pruned_loss=0.03978, over 4774.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2096, pruned_loss=0.03141, over 972246.49 frames.], batch size: 17, lr: 1.66e-04 2022-05-07 21:33:47,842 INFO [train.py:715] (0/8) Epoch 13, batch 28850, loss[loss=0.1364, simple_loss=0.2225, pruned_loss=0.02518, over 4796.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2098, pruned_loss=0.03157, over 972008.90 frames.], batch size: 21, lr: 1.66e-04 2022-05-07 21:34:26,363 INFO [train.py:715] (0/8) Epoch 13, batch 28900, loss[loss=0.1596, simple_loss=0.2269, pruned_loss=0.04618, over 4870.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2101, pruned_loss=0.032, over 972003.58 frames.], batch size: 16, lr: 1.66e-04 2022-05-07 21:35:04,277 INFO [train.py:715] (0/8) Epoch 13, batch 28950, loss[loss=0.1413, simple_loss=0.2143, pruned_loss=0.03414, over 4689.00 frames.], tot_loss[loss=0.1375, 
simple_loss=0.2103, pruned_loss=0.03241, over 972186.19 frames.], batch size: 15, lr: 1.66e-04 2022-05-07 21:35:42,441 INFO [train.py:715] (0/8) Epoch 13, batch 29000, loss[loss=0.1385, simple_loss=0.207, pruned_loss=0.03501, over 4947.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2106, pruned_loss=0.03283, over 971410.93 frames.], batch size: 21, lr: 1.66e-04 2022-05-07 21:36:21,622 INFO [train.py:715] (0/8) Epoch 13, batch 29050, loss[loss=0.1166, simple_loss=0.1898, pruned_loss=0.02174, over 4785.00 frames.], tot_loss[loss=0.1373, simple_loss=0.21, pruned_loss=0.03233, over 971104.72 frames.], batch size: 14, lr: 1.66e-04 2022-05-07 21:37:00,156 INFO [train.py:715] (0/8) Epoch 13, batch 29100, loss[loss=0.1716, simple_loss=0.24, pruned_loss=0.0516, over 4943.00 frames.], tot_loss[loss=0.137, simple_loss=0.2099, pruned_loss=0.03203, over 970397.42 frames.], batch size: 39, lr: 1.66e-04 2022-05-07 21:37:38,201 INFO [train.py:715] (0/8) Epoch 13, batch 29150, loss[loss=0.1172, simple_loss=0.1897, pruned_loss=0.02242, over 4781.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2101, pruned_loss=0.03176, over 971360.97 frames.], batch size: 14, lr: 1.66e-04 2022-05-07 21:38:16,961 INFO [train.py:715] (0/8) Epoch 13, batch 29200, loss[loss=0.1678, simple_loss=0.2367, pruned_loss=0.04944, over 4758.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2103, pruned_loss=0.03175, over 971672.54 frames.], batch size: 12, lr: 1.66e-04 2022-05-07 21:38:55,208 INFO [train.py:715] (0/8) Epoch 13, batch 29250, loss[loss=0.1386, simple_loss=0.2151, pruned_loss=0.03106, over 4867.00 frames.], tot_loss[loss=0.1373, simple_loss=0.2109, pruned_loss=0.03187, over 972378.50 frames.], batch size: 22, lr: 1.66e-04 2022-05-07 21:39:34,052 INFO [train.py:715] (0/8) Epoch 13, batch 29300, loss[loss=0.1393, simple_loss=0.2156, pruned_loss=0.03145, over 4879.00 frames.], tot_loss[loss=0.138, simple_loss=0.2114, pruned_loss=0.03227, over 973064.65 frames.], batch size: 39, lr: 1.66e-04 2022-05-07 21:40:12,798 INFO [train.py:715] (0/8) Epoch 13, batch 29350, loss[loss=0.1322, simple_loss=0.2156, pruned_loss=0.02443, over 4780.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2113, pruned_loss=0.03182, over 972983.96 frames.], batch size: 14, lr: 1.66e-04 2022-05-07 21:40:51,672 INFO [train.py:715] (0/8) Epoch 13, batch 29400, loss[loss=0.1351, simple_loss=0.2021, pruned_loss=0.03408, over 4888.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2109, pruned_loss=0.03177, over 973087.22 frames.], batch size: 22, lr: 1.66e-04 2022-05-07 21:41:29,695 INFO [train.py:715] (0/8) Epoch 13, batch 29450, loss[loss=0.1249, simple_loss=0.2035, pruned_loss=0.02313, over 4910.00 frames.], tot_loss[loss=0.137, simple_loss=0.2106, pruned_loss=0.03175, over 972464.89 frames.], batch size: 17, lr: 1.66e-04 2022-05-07 21:42:08,735 INFO [train.py:715] (0/8) Epoch 13, batch 29500, loss[loss=0.15, simple_loss=0.2132, pruned_loss=0.04344, over 4648.00 frames.], tot_loss[loss=0.1376, simple_loss=0.2109, pruned_loss=0.03214, over 971445.79 frames.], batch size: 13, lr: 1.66e-04 2022-05-07 21:42:47,369 INFO [train.py:715] (0/8) Epoch 13, batch 29550, loss[loss=0.1197, simple_loss=0.2014, pruned_loss=0.01895, over 4931.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2102, pruned_loss=0.03159, over 971871.45 frames.], batch size: 29, lr: 1.66e-04 2022-05-07 21:43:25,732 INFO [train.py:715] (0/8) Epoch 13, batch 29600, loss[loss=0.1663, simple_loss=0.2486, pruned_loss=0.04206, over 4959.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2104, 
pruned_loss=0.03204, over 972880.38 frames.], batch size: 15, lr: 1.66e-04 2022-05-07 21:44:03,480 INFO [train.py:715] (0/8) Epoch 13, batch 29650, loss[loss=0.116, simple_loss=0.1816, pruned_loss=0.02514, over 4841.00 frames.], tot_loss[loss=0.1375, simple_loss=0.2106, pruned_loss=0.03215, over 973038.41 frames.], batch size: 15, lr: 1.66e-04 2022-05-07 21:44:41,765 INFO [train.py:715] (0/8) Epoch 13, batch 29700, loss[loss=0.1386, simple_loss=0.2095, pruned_loss=0.03386, over 4856.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2106, pruned_loss=0.03183, over 972247.33 frames.], batch size: 12, lr: 1.66e-04 2022-05-07 21:45:20,121 INFO [train.py:715] (0/8) Epoch 13, batch 29750, loss[loss=0.1495, simple_loss=0.2282, pruned_loss=0.03536, over 4959.00 frames.], tot_loss[loss=0.1365, simple_loss=0.21, pruned_loss=0.03153, over 972671.65 frames.], batch size: 24, lr: 1.66e-04 2022-05-07 21:45:59,494 INFO [train.py:715] (0/8) Epoch 13, batch 29800, loss[loss=0.12, simple_loss=0.1986, pruned_loss=0.02068, over 4963.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2095, pruned_loss=0.03163, over 972136.60 frames.], batch size: 14, lr: 1.66e-04 2022-05-07 21:46:38,726 INFO [train.py:715] (0/8) Epoch 13, batch 29850, loss[loss=0.1241, simple_loss=0.2042, pruned_loss=0.02199, over 4811.00 frames.], tot_loss[loss=0.136, simple_loss=0.209, pruned_loss=0.03147, over 972313.28 frames.], batch size: 27, lr: 1.66e-04 2022-05-07 21:47:18,336 INFO [train.py:715] (0/8) Epoch 13, batch 29900, loss[loss=0.1327, simple_loss=0.2045, pruned_loss=0.03044, over 4825.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2091, pruned_loss=0.03127, over 972756.53 frames.], batch size: 25, lr: 1.66e-04 2022-05-07 21:47:57,732 INFO [train.py:715] (0/8) Epoch 13, batch 29950, loss[loss=0.1474, simple_loss=0.2135, pruned_loss=0.04066, over 4733.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2097, pruned_loss=0.0313, over 972397.85 frames.], batch size: 16, lr: 1.66e-04 2022-05-07 21:48:36,361 INFO [train.py:715] (0/8) Epoch 13, batch 30000, loss[loss=0.1402, simple_loss=0.2043, pruned_loss=0.03805, over 4971.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2093, pruned_loss=0.03105, over 972458.78 frames.], batch size: 35, lr: 1.66e-04 2022-05-07 21:48:36,362 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 21:48:45,862 INFO [train.py:742] (0/8) Epoch 13, validation: loss=0.1054, simple_loss=0.1891, pruned_loss=0.01083, over 914524.00 frames. 
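The two checkpoint saves in this excerpt go to pruned_transducer_stateless2/exp/v2/checkpoint-472000.pt and checkpoint-480000.pt, so the filenames are indexed by a global batch counter and are 8000 steps apart here. Assuming that spacing is a fixed save interval (an assumption based only on these two events), the next filename can be anticipated as sketched below; next_checkpoint is an illustrative helper, not part of the training code.

    from pathlib import Path

    EXP_DIR = Path("pruned_transducer_stateless2/exp/v2")

    # Checkpoint filenames observed in this excerpt.
    seen = [EXP_DIR / "checkpoint-472000.pt", EXP_DIR / "checkpoint-480000.pt"]

    def global_step(path: Path) -> int:
        # "checkpoint-480000" -> 480000
        return int(path.stem.split("-")[1])

    steps = [global_step(p) for p in seen]
    interval = steps[1] - steps[0]
    print(interval)  # 8000

    def next_checkpoint(last: Path, interval: int) -> Path:
        # Hypothetical: assumes the save interval stays constant.
        return last.with_name(f"checkpoint-{global_step(last) + interval}.pt")

    print(next_checkpoint(seen[-1], interval))
    # pruned_transducer_stateless2/exp/v2/checkpoint-488000.pt (under the fixed-interval assumption)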
2022-05-07 21:49:25,284 INFO [train.py:715] (0/8) Epoch 13, batch 30050, loss[loss=0.1251, simple_loss=0.1968, pruned_loss=0.02675, over 4835.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2085, pruned_loss=0.0305, over 971626.73 frames.], batch size: 30, lr: 1.66e-04 2022-05-07 21:50:05,093 INFO [train.py:715] (0/8) Epoch 13, batch 30100, loss[loss=0.1129, simple_loss=0.1867, pruned_loss=0.01952, over 4886.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2102, pruned_loss=0.03125, over 971563.51 frames.], batch size: 16, lr: 1.66e-04 2022-05-07 21:50:44,565 INFO [train.py:715] (0/8) Epoch 13, batch 30150, loss[loss=0.1249, simple_loss=0.2004, pruned_loss=0.02468, over 4942.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2101, pruned_loss=0.03125, over 970920.41 frames.], batch size: 21, lr: 1.66e-04 2022-05-07 21:51:23,147 INFO [train.py:715] (0/8) Epoch 13, batch 30200, loss[loss=0.1179, simple_loss=0.2002, pruned_loss=0.0178, over 4876.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2099, pruned_loss=0.03111, over 971305.83 frames.], batch size: 22, lr: 1.66e-04 2022-05-07 21:52:02,986 INFO [train.py:715] (0/8) Epoch 13, batch 30250, loss[loss=0.1027, simple_loss=0.1766, pruned_loss=0.0144, over 4965.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2105, pruned_loss=0.03131, over 971376.77 frames.], batch size: 14, lr: 1.66e-04 2022-05-07 21:52:42,786 INFO [train.py:715] (0/8) Epoch 13, batch 30300, loss[loss=0.1356, simple_loss=0.2165, pruned_loss=0.02732, over 4825.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2103, pruned_loss=0.03159, over 972684.30 frames.], batch size: 27, lr: 1.66e-04 2022-05-07 21:53:22,304 INFO [train.py:715] (0/8) Epoch 13, batch 30350, loss[loss=0.1429, simple_loss=0.2152, pruned_loss=0.03535, over 4804.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2101, pruned_loss=0.03172, over 972604.88 frames.], batch size: 21, lr: 1.66e-04 2022-05-07 21:54:01,875 INFO [train.py:715] (0/8) Epoch 13, batch 30400, loss[loss=0.1348, simple_loss=0.2012, pruned_loss=0.03421, over 4806.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2096, pruned_loss=0.03128, over 971784.48 frames.], batch size: 21, lr: 1.66e-04 2022-05-07 21:54:42,499 INFO [train.py:715] (0/8) Epoch 13, batch 30450, loss[loss=0.1341, simple_loss=0.2145, pruned_loss=0.02687, over 4875.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2098, pruned_loss=0.03133, over 971566.92 frames.], batch size: 16, lr: 1.66e-04 2022-05-07 21:55:22,610 INFO [train.py:715] (0/8) Epoch 13, batch 30500, loss[loss=0.1135, simple_loss=0.1892, pruned_loss=0.0189, over 4946.00 frames.], tot_loss[loss=0.135, simple_loss=0.2087, pruned_loss=0.03067, over 972522.18 frames.], batch size: 29, lr: 1.66e-04 2022-05-07 21:56:02,394 INFO [train.py:715] (0/8) Epoch 13, batch 30550, loss[loss=0.1308, simple_loss=0.2032, pruned_loss=0.02919, over 4790.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2092, pruned_loss=0.03117, over 972810.24 frames.], batch size: 18, lr: 1.66e-04 2022-05-07 21:56:43,835 INFO [train.py:715] (0/8) Epoch 13, batch 30600, loss[loss=0.1167, simple_loss=0.1935, pruned_loss=0.01991, over 4819.00 frames.], tot_loss[loss=0.1357, simple_loss=0.209, pruned_loss=0.03116, over 972263.47 frames.], batch size: 27, lr: 1.66e-04 2022-05-07 21:57:24,953 INFO [train.py:715] (0/8) Epoch 13, batch 30650, loss[loss=0.1099, simple_loss=0.1869, pruned_loss=0.0165, over 4958.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2082, pruned_loss=0.03077, over 972050.48 frames.], batch size: 24, lr: 1.65e-04 2022-05-07 21:58:05,365 
INFO [train.py:715] (0/8) Epoch 13, batch 30700, loss[loss=0.1258, simple_loss=0.2022, pruned_loss=0.02475, over 4927.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2076, pruned_loss=0.03055, over 972060.30 frames.], batch size: 17, lr: 1.65e-04 2022-05-07 21:58:45,824 INFO [train.py:715] (0/8) Epoch 13, batch 30750, loss[loss=0.159, simple_loss=0.2215, pruned_loss=0.04827, over 4873.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2087, pruned_loss=0.03118, over 971971.47 frames.], batch size: 16, lr: 1.65e-04 2022-05-07 21:59:26,816 INFO [train.py:715] (0/8) Epoch 13, batch 30800, loss[loss=0.1433, simple_loss=0.2199, pruned_loss=0.03336, over 4915.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2089, pruned_loss=0.03086, over 972197.16 frames.], batch size: 23, lr: 1.65e-04 2022-05-07 22:00:07,577 INFO [train.py:715] (0/8) Epoch 13, batch 30850, loss[loss=0.1643, simple_loss=0.2298, pruned_loss=0.04936, over 4943.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2095, pruned_loss=0.03116, over 972729.83 frames.], batch size: 35, lr: 1.65e-04 2022-05-07 22:00:48,212 INFO [train.py:715] (0/8) Epoch 13, batch 30900, loss[loss=0.1378, simple_loss=0.2124, pruned_loss=0.03155, over 4873.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2098, pruned_loss=0.03115, over 973170.35 frames.], batch size: 22, lr: 1.65e-04 2022-05-07 22:01:29,254 INFO [train.py:715] (0/8) Epoch 13, batch 30950, loss[loss=0.1113, simple_loss=0.1898, pruned_loss=0.01636, over 4775.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2098, pruned_loss=0.03132, over 972571.56 frames.], batch size: 18, lr: 1.65e-04 2022-05-07 22:02:09,963 INFO [train.py:715] (0/8) Epoch 13, batch 31000, loss[loss=0.14, simple_loss=0.2115, pruned_loss=0.03426, over 4885.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2099, pruned_loss=0.0312, over 972481.44 frames.], batch size: 16, lr: 1.65e-04 2022-05-07 22:02:50,142 INFO [train.py:715] (0/8) Epoch 13, batch 31050, loss[loss=0.1665, simple_loss=0.2327, pruned_loss=0.0501, over 4859.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2099, pruned_loss=0.03095, over 972962.93 frames.], batch size: 32, lr: 1.65e-04 2022-05-07 22:03:30,744 INFO [train.py:715] (0/8) Epoch 13, batch 31100, loss[loss=0.1591, simple_loss=0.2324, pruned_loss=0.04286, over 4906.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2104, pruned_loss=0.0312, over 972550.29 frames.], batch size: 19, lr: 1.65e-04 2022-05-07 22:04:11,680 INFO [train.py:715] (0/8) Epoch 13, batch 31150, loss[loss=0.1581, simple_loss=0.2362, pruned_loss=0.04001, over 4854.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2101, pruned_loss=0.03114, over 972808.66 frames.], batch size: 32, lr: 1.65e-04 2022-05-07 22:04:52,783 INFO [train.py:715] (0/8) Epoch 13, batch 31200, loss[loss=0.1772, simple_loss=0.2373, pruned_loss=0.05857, over 4855.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2104, pruned_loss=0.03138, over 973041.12 frames.], batch size: 32, lr: 1.65e-04 2022-05-07 22:05:32,912 INFO [train.py:715] (0/8) Epoch 13, batch 31250, loss[loss=0.1338, simple_loss=0.2035, pruned_loss=0.03205, over 4978.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2095, pruned_loss=0.03103, over 973435.55 frames.], batch size: 26, lr: 1.65e-04 2022-05-07 22:06:13,239 INFO [train.py:715] (0/8) Epoch 13, batch 31300, loss[loss=0.1527, simple_loss=0.2276, pruned_loss=0.0389, over 4858.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2092, pruned_loss=0.0308, over 973126.36 frames.], batch size: 34, lr: 1.65e-04 2022-05-07 22:06:53,512 INFO [train.py:715] (0/8) 
Epoch 13, batch 31350, loss[loss=0.1065, simple_loss=0.1845, pruned_loss=0.01424, over 4926.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2087, pruned_loss=0.03046, over 972642.37 frames.], batch size: 29, lr: 1.65e-04 2022-05-07 22:07:33,263 INFO [train.py:715] (0/8) Epoch 13, batch 31400, loss[loss=0.1515, simple_loss=0.2257, pruned_loss=0.03863, over 4823.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2088, pruned_loss=0.03068, over 972697.08 frames.], batch size: 26, lr: 1.65e-04 2022-05-07 22:08:13,742 INFO [train.py:715] (0/8) Epoch 13, batch 31450, loss[loss=0.1277, simple_loss=0.1992, pruned_loss=0.02809, over 4864.00 frames.], tot_loss[loss=0.135, simple_loss=0.2091, pruned_loss=0.03049, over 972633.55 frames.], batch size: 20, lr: 1.65e-04 2022-05-07 22:08:54,097 INFO [train.py:715] (0/8) Epoch 13, batch 31500, loss[loss=0.1365, simple_loss=0.211, pruned_loss=0.03099, over 4923.00 frames.], tot_loss[loss=0.1351, simple_loss=0.209, pruned_loss=0.03064, over 972898.02 frames.], batch size: 17, lr: 1.65e-04 2022-05-07 22:09:33,922 INFO [train.py:715] (0/8) Epoch 13, batch 31550, loss[loss=0.1444, simple_loss=0.2099, pruned_loss=0.03948, over 4873.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2093, pruned_loss=0.03072, over 973500.63 frames.], batch size: 16, lr: 1.65e-04 2022-05-07 22:10:14,441 INFO [train.py:715] (0/8) Epoch 13, batch 31600, loss[loss=0.1416, simple_loss=0.2119, pruned_loss=0.03564, over 4928.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2093, pruned_loss=0.03098, over 972839.73 frames.], batch size: 23, lr: 1.65e-04 2022-05-07 22:10:55,014 INFO [train.py:715] (0/8) Epoch 13, batch 31650, loss[loss=0.1607, simple_loss=0.2227, pruned_loss=0.04933, over 4959.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2101, pruned_loss=0.0315, over 973458.26 frames.], batch size: 15, lr: 1.65e-04 2022-05-07 22:11:35,401 INFO [train.py:715] (0/8) Epoch 13, batch 31700, loss[loss=0.1819, simple_loss=0.2486, pruned_loss=0.0576, over 4903.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2093, pruned_loss=0.03132, over 973298.14 frames.], batch size: 39, lr: 1.65e-04 2022-05-07 22:12:15,839 INFO [train.py:715] (0/8) Epoch 13, batch 31750, loss[loss=0.1273, simple_loss=0.2103, pruned_loss=0.02211, over 4795.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2099, pruned_loss=0.03112, over 973311.68 frames.], batch size: 24, lr: 1.65e-04 2022-05-07 22:12:56,364 INFO [train.py:715] (0/8) Epoch 13, batch 31800, loss[loss=0.1314, simple_loss=0.2049, pruned_loss=0.02893, over 4849.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2103, pruned_loss=0.03134, over 973419.87 frames.], batch size: 15, lr: 1.65e-04 2022-05-07 22:13:37,273 INFO [train.py:715] (0/8) Epoch 13, batch 31850, loss[loss=0.1333, simple_loss=0.2195, pruned_loss=0.02354, over 4838.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2107, pruned_loss=0.03161, over 973094.80 frames.], batch size: 26, lr: 1.65e-04 2022-05-07 22:14:18,124 INFO [train.py:715] (0/8) Epoch 13, batch 31900, loss[loss=0.1485, simple_loss=0.2165, pruned_loss=0.04022, over 4986.00 frames.], tot_loss[loss=0.1373, simple_loss=0.211, pruned_loss=0.03185, over 973363.70 frames.], batch size: 14, lr: 1.65e-04 2022-05-07 22:14:59,145 INFO [train.py:715] (0/8) Epoch 13, batch 31950, loss[loss=0.1347, simple_loss=0.2055, pruned_loss=0.03194, over 4786.00 frames.], tot_loss[loss=0.137, simple_loss=0.2105, pruned_loss=0.03175, over 973338.47 frames.], batch size: 18, lr: 1.65e-04 2022-05-07 22:15:39,561 INFO [train.py:715] (0/8) Epoch 13, batch 32000, 
loss[loss=0.1193, simple_loss=0.1865, pruned_loss=0.02607, over 4789.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2099, pruned_loss=0.03124, over 972900.38 frames.], batch size: 21, lr: 1.65e-04 2022-05-07 22:16:20,155 INFO [train.py:715] (0/8) Epoch 13, batch 32050, loss[loss=0.1152, simple_loss=0.1881, pruned_loss=0.02112, over 4970.00 frames.], tot_loss[loss=0.1362, simple_loss=0.21, pruned_loss=0.03124, over 973340.62 frames.], batch size: 14, lr: 1.65e-04 2022-05-07 22:17:00,693 INFO [train.py:715] (0/8) Epoch 13, batch 32100, loss[loss=0.1418, simple_loss=0.2064, pruned_loss=0.03855, over 4969.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2097, pruned_loss=0.03089, over 972669.40 frames.], batch size: 35, lr: 1.65e-04 2022-05-07 22:17:41,700 INFO [train.py:715] (0/8) Epoch 13, batch 32150, loss[loss=0.1192, simple_loss=0.191, pruned_loss=0.02368, over 4759.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2097, pruned_loss=0.0309, over 972633.61 frames.], batch size: 16, lr: 1.65e-04 2022-05-07 22:18:22,395 INFO [train.py:715] (0/8) Epoch 13, batch 32200, loss[loss=0.1595, simple_loss=0.2288, pruned_loss=0.04514, over 4875.00 frames.], tot_loss[loss=0.1349, simple_loss=0.209, pruned_loss=0.03043, over 972427.09 frames.], batch size: 16, lr: 1.65e-04 2022-05-07 22:19:03,054 INFO [train.py:715] (0/8) Epoch 13, batch 32250, loss[loss=0.1531, simple_loss=0.2226, pruned_loss=0.04182, over 4871.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2091, pruned_loss=0.0306, over 972027.87 frames.], batch size: 30, lr: 1.65e-04 2022-05-07 22:19:43,882 INFO [train.py:715] (0/8) Epoch 13, batch 32300, loss[loss=0.1114, simple_loss=0.1847, pruned_loss=0.01906, over 4861.00 frames.], tot_loss[loss=0.136, simple_loss=0.2093, pruned_loss=0.03132, over 970946.57 frames.], batch size: 13, lr: 1.65e-04 2022-05-07 22:20:24,945 INFO [train.py:715] (0/8) Epoch 13, batch 32350, loss[loss=0.122, simple_loss=0.1881, pruned_loss=0.02795, over 4973.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2095, pruned_loss=0.03114, over 971802.76 frames.], batch size: 28, lr: 1.65e-04 2022-05-07 22:21:06,368 INFO [train.py:715] (0/8) Epoch 13, batch 32400, loss[loss=0.1516, simple_loss=0.218, pruned_loss=0.04258, over 4986.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2101, pruned_loss=0.03129, over 972090.56 frames.], batch size: 31, lr: 1.65e-04 2022-05-07 22:21:47,427 INFO [train.py:715] (0/8) Epoch 13, batch 32450, loss[loss=0.1255, simple_loss=0.199, pruned_loss=0.02596, over 4971.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2099, pruned_loss=0.03143, over 971617.95 frames.], batch size: 15, lr: 1.65e-04 2022-05-07 22:22:28,213 INFO [train.py:715] (0/8) Epoch 13, batch 32500, loss[loss=0.1056, simple_loss=0.1714, pruned_loss=0.01991, over 4738.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2099, pruned_loss=0.03175, over 971054.35 frames.], batch size: 16, lr: 1.65e-04 2022-05-07 22:23:09,252 INFO [train.py:715] (0/8) Epoch 13, batch 32550, loss[loss=0.1216, simple_loss=0.1983, pruned_loss=0.02241, over 4870.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2091, pruned_loss=0.03154, over 971476.12 frames.], batch size: 32, lr: 1.65e-04 2022-05-07 22:23:49,650 INFO [train.py:715] (0/8) Epoch 13, batch 32600, loss[loss=0.149, simple_loss=0.2249, pruned_loss=0.03656, over 4924.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2096, pruned_loss=0.03169, over 971898.12 frames.], batch size: 23, lr: 1.65e-04 2022-05-07 22:24:29,999 INFO [train.py:715] (0/8) Epoch 13, batch 32650, loss[loss=0.1081, 
simple_loss=0.1824, pruned_loss=0.01691, over 4787.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2087, pruned_loss=0.03097, over 970787.99 frames.], batch size: 14, lr: 1.65e-04 2022-05-07 22:25:10,590 INFO [train.py:715] (0/8) Epoch 13, batch 32700, loss[loss=0.1346, simple_loss=0.2059, pruned_loss=0.03171, over 4649.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2093, pruned_loss=0.03079, over 971592.82 frames.], batch size: 13, lr: 1.65e-04 2022-05-07 22:25:50,914 INFO [train.py:715] (0/8) Epoch 13, batch 32750, loss[loss=0.1315, simple_loss=0.2034, pruned_loss=0.02978, over 4861.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2082, pruned_loss=0.03034, over 971743.49 frames.], batch size: 32, lr: 1.65e-04 2022-05-07 22:26:31,936 INFO [train.py:715] (0/8) Epoch 13, batch 32800, loss[loss=0.1469, simple_loss=0.2207, pruned_loss=0.03654, over 4865.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2086, pruned_loss=0.03033, over 972664.45 frames.], batch size: 16, lr: 1.65e-04 2022-05-07 22:27:12,668 INFO [train.py:715] (0/8) Epoch 13, batch 32850, loss[loss=0.1521, simple_loss=0.2243, pruned_loss=0.03996, over 4910.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2081, pruned_loss=0.03008, over 973073.80 frames.], batch size: 19, lr: 1.65e-04 2022-05-07 22:27:53,754 INFO [train.py:715] (0/8) Epoch 13, batch 32900, loss[loss=0.1517, simple_loss=0.2214, pruned_loss=0.04097, over 4696.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2082, pruned_loss=0.03025, over 972474.52 frames.], batch size: 15, lr: 1.65e-04 2022-05-07 22:28:33,954 INFO [train.py:715] (0/8) Epoch 13, batch 32950, loss[loss=0.1351, simple_loss=0.2127, pruned_loss=0.02881, over 4791.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2082, pruned_loss=0.03028, over 972462.59 frames.], batch size: 14, lr: 1.65e-04 2022-05-07 22:29:14,627 INFO [train.py:715] (0/8) Epoch 13, batch 33000, loss[loss=0.172, simple_loss=0.2414, pruned_loss=0.05124, over 4771.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2089, pruned_loss=0.03081, over 972082.97 frames.], batch size: 18, lr: 1.65e-04 2022-05-07 22:29:14,628 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 22:29:24,504 INFO [train.py:742] (0/8) Epoch 13, validation: loss=0.1054, simple_loss=0.1892, pruned_loss=0.01081, over 914524.00 frames. 
2022-05-07 22:30:05,559 INFO [train.py:715] (0/8) Epoch 13, batch 33050, loss[loss=0.1602, simple_loss=0.2195, pruned_loss=0.05043, over 4975.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2095, pruned_loss=0.03138, over 973143.56 frames.], batch size: 31, lr: 1.65e-04 2022-05-07 22:30:45,207 INFO [train.py:715] (0/8) Epoch 13, batch 33100, loss[loss=0.1109, simple_loss=0.1762, pruned_loss=0.02284, over 4810.00 frames.], tot_loss[loss=0.136, simple_loss=0.2094, pruned_loss=0.03128, over 972283.40 frames.], batch size: 21, lr: 1.65e-04 2022-05-07 22:31:25,148 INFO [train.py:715] (0/8) Epoch 13, batch 33150, loss[loss=0.1769, simple_loss=0.2405, pruned_loss=0.05667, over 4820.00 frames.], tot_loss[loss=0.1364, simple_loss=0.21, pruned_loss=0.03141, over 972570.18 frames.], batch size: 13, lr: 1.65e-04 2022-05-07 22:32:05,570 INFO [train.py:715] (0/8) Epoch 13, batch 33200, loss[loss=0.1528, simple_loss=0.2226, pruned_loss=0.04151, over 4745.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2099, pruned_loss=0.03133, over 973010.10 frames.], batch size: 16, lr: 1.65e-04 2022-05-07 22:32:46,035 INFO [train.py:715] (0/8) Epoch 13, batch 33250, loss[loss=0.1219, simple_loss=0.1988, pruned_loss=0.02251, over 4774.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2102, pruned_loss=0.03153, over 972756.90 frames.], batch size: 18, lr: 1.65e-04 2022-05-07 22:33:26,588 INFO [train.py:715] (0/8) Epoch 13, batch 33300, loss[loss=0.09581, simple_loss=0.172, pruned_loss=0.009818, over 4780.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2103, pruned_loss=0.03164, over 972649.08 frames.], batch size: 12, lr: 1.65e-04 2022-05-07 22:34:07,015 INFO [train.py:715] (0/8) Epoch 13, batch 33350, loss[loss=0.131, simple_loss=0.202, pruned_loss=0.03003, over 4783.00 frames.], tot_loss[loss=0.136, simple_loss=0.2095, pruned_loss=0.03128, over 971891.58 frames.], batch size: 18, lr: 1.65e-04 2022-05-07 22:34:47,634 INFO [train.py:715] (0/8) Epoch 13, batch 33400, loss[loss=0.1176, simple_loss=0.1942, pruned_loss=0.02044, over 4992.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2084, pruned_loss=0.03059, over 971733.87 frames.], batch size: 14, lr: 1.65e-04 2022-05-07 22:35:28,235 INFO [train.py:715] (0/8) Epoch 13, batch 33450, loss[loss=0.1108, simple_loss=0.1872, pruned_loss=0.01716, over 4933.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2083, pruned_loss=0.0304, over 971385.25 frames.], batch size: 29, lr: 1.65e-04 2022-05-07 22:36:08,958 INFO [train.py:715] (0/8) Epoch 13, batch 33500, loss[loss=0.1505, simple_loss=0.227, pruned_loss=0.03696, over 4899.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2087, pruned_loss=0.03046, over 971512.37 frames.], batch size: 17, lr: 1.65e-04 2022-05-07 22:36:49,607 INFO [train.py:715] (0/8) Epoch 13, batch 33550, loss[loss=0.1414, simple_loss=0.2219, pruned_loss=0.03049, over 4919.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2085, pruned_loss=0.0302, over 971801.75 frames.], batch size: 18, lr: 1.65e-04 2022-05-07 22:37:30,304 INFO [train.py:715] (0/8) Epoch 13, batch 33600, loss[loss=0.1405, simple_loss=0.2196, pruned_loss=0.03065, over 4762.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2089, pruned_loss=0.0303, over 971720.36 frames.], batch size: 19, lr: 1.65e-04 2022-05-07 22:38:10,829 INFO [train.py:715] (0/8) Epoch 13, batch 33650, loss[loss=0.1447, simple_loss=0.2156, pruned_loss=0.03684, over 4895.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2087, pruned_loss=0.0305, over 972106.85 frames.], batch size: 17, lr: 1.65e-04 2022-05-07 22:38:51,062 INFO 
[train.py:715] (0/8) Epoch 13, batch 33700, loss[loss=0.1108, simple_loss=0.1823, pruned_loss=0.01964, over 4788.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2092, pruned_loss=0.03069, over 971365.07 frames.], batch size: 14, lr: 1.65e-04 2022-05-07 22:39:32,036 INFO [train.py:715] (0/8) Epoch 13, batch 33750, loss[loss=0.1436, simple_loss=0.2229, pruned_loss=0.03216, over 4958.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2091, pruned_loss=0.03076, over 972412.79 frames.], batch size: 21, lr: 1.65e-04 2022-05-07 22:40:12,827 INFO [train.py:715] (0/8) Epoch 13, batch 33800, loss[loss=0.1322, simple_loss=0.2134, pruned_loss=0.0255, over 4810.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2102, pruned_loss=0.03101, over 972550.13 frames.], batch size: 21, lr: 1.65e-04 2022-05-07 22:40:53,574 INFO [train.py:715] (0/8) Epoch 13, batch 33850, loss[loss=0.1251, simple_loss=0.2006, pruned_loss=0.02476, over 4988.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2102, pruned_loss=0.03125, over 973074.28 frames.], batch size: 25, lr: 1.65e-04 2022-05-07 22:41:34,021 INFO [train.py:715] (0/8) Epoch 13, batch 33900, loss[loss=0.1283, simple_loss=0.2013, pruned_loss=0.02764, over 4874.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2107, pruned_loss=0.03171, over 973566.75 frames.], batch size: 22, lr: 1.65e-04 2022-05-07 22:42:15,283 INFO [train.py:715] (0/8) Epoch 13, batch 33950, loss[loss=0.1176, simple_loss=0.1942, pruned_loss=0.02043, over 4910.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2101, pruned_loss=0.03147, over 972344.50 frames.], batch size: 19, lr: 1.65e-04 2022-05-07 22:42:56,289 INFO [train.py:715] (0/8) Epoch 13, batch 34000, loss[loss=0.1366, simple_loss=0.2164, pruned_loss=0.02844, over 4897.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2102, pruned_loss=0.03161, over 972944.92 frames.], batch size: 22, lr: 1.65e-04 2022-05-07 22:43:36,859 INFO [train.py:715] (0/8) Epoch 13, batch 34050, loss[loss=0.1189, simple_loss=0.2016, pruned_loss=0.01807, over 4933.00 frames.], tot_loss[loss=0.136, simple_loss=0.2095, pruned_loss=0.03129, over 973816.84 frames.], batch size: 23, lr: 1.65e-04 2022-05-07 22:44:17,681 INFO [train.py:715] (0/8) Epoch 13, batch 34100, loss[loss=0.1155, simple_loss=0.1804, pruned_loss=0.02531, over 4686.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2095, pruned_loss=0.03093, over 973379.95 frames.], batch size: 15, lr: 1.65e-04 2022-05-07 22:44:57,555 INFO [train.py:715] (0/8) Epoch 13, batch 34150, loss[loss=0.1229, simple_loss=0.1944, pruned_loss=0.02567, over 4886.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2086, pruned_loss=0.03077, over 972858.36 frames.], batch size: 39, lr: 1.65e-04 2022-05-07 22:45:38,247 INFO [train.py:715] (0/8) Epoch 13, batch 34200, loss[loss=0.1269, simple_loss=0.1981, pruned_loss=0.02787, over 4933.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2094, pruned_loss=0.03112, over 973045.97 frames.], batch size: 21, lr: 1.65e-04 2022-05-07 22:46:18,600 INFO [train.py:715] (0/8) Epoch 13, batch 34250, loss[loss=0.1476, simple_loss=0.2136, pruned_loss=0.04086, over 4968.00 frames.], tot_loss[loss=0.1355, simple_loss=0.209, pruned_loss=0.03094, over 973199.51 frames.], batch size: 15, lr: 1.65e-04 2022-05-07 22:46:59,526 INFO [train.py:715] (0/8) Epoch 13, batch 34300, loss[loss=0.1165, simple_loss=0.1929, pruned_loss=0.02008, over 4952.00 frames.], tot_loss[loss=0.1355, simple_loss=0.209, pruned_loss=0.03095, over 973446.43 frames.], batch size: 14, lr: 1.65e-04 2022-05-07 22:47:39,594 INFO [train.py:715] (0/8) 
Epoch 13, batch 34350, loss[loss=0.1402, simple_loss=0.2196, pruned_loss=0.03042, over 4946.00 frames.], tot_loss[loss=0.135, simple_loss=0.2087, pruned_loss=0.03062, over 973358.63 frames.], batch size: 35, lr: 1.65e-04
2022-05-07 22:48:20,188 INFO [train.py:715] (0/8) Epoch 13, batch 34400, loss[loss=0.1265, simple_loss=0.1922, pruned_loss=0.03041, over 4952.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2086, pruned_loss=0.03086, over 972934.32 frames.], batch size: 21, lr: 1.65e-04
2022-05-07 22:49:01,282 INFO [train.py:715] (0/8) Epoch 13, batch 34450, loss[loss=0.1266, simple_loss=0.2004, pruned_loss=0.02638, over 4818.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2091, pruned_loss=0.03092, over 973953.37 frames.], batch size: 15, lr: 1.65e-04
2022-05-07 22:49:41,690 INFO [train.py:715] (0/8) Epoch 13, batch 34500, loss[loss=0.1486, simple_loss=0.2304, pruned_loss=0.03339, over 4934.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2102, pruned_loss=0.0316, over 973624.93 frames.], batch size: 18, lr: 1.65e-04
2022-05-07 22:50:21,493 INFO [train.py:715] (0/8) Epoch 13, batch 34550, loss[loss=0.1423, simple_loss=0.2219, pruned_loss=0.03134, over 4811.00 frames.], tot_loss[loss=0.1378, simple_loss=0.2113, pruned_loss=0.03218, over 972788.51 frames.], batch size: 25, lr: 1.65e-04
2022-05-07 22:51:01,498 INFO [train.py:715] (0/8) Epoch 13, batch 34600, loss[loss=0.1358, simple_loss=0.2024, pruned_loss=0.03463, over 4913.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2116, pruned_loss=0.03232, over 972475.95 frames.], batch size: 18, lr: 1.65e-04
2022-05-07 22:51:40,866 INFO [train.py:715] (0/8) Epoch 13, batch 34650, loss[loss=0.129, simple_loss=0.1963, pruned_loss=0.03088, over 4776.00 frames.], tot_loss[loss=0.1381, simple_loss=0.2113, pruned_loss=0.0325, over 972898.01 frames.], batch size: 14, lr: 1.65e-04
2022-05-07 22:52:20,402 INFO [train.py:715] (0/8) Epoch 13, batch 34700, loss[loss=0.12, simple_loss=0.1989, pruned_loss=0.02062, over 4794.00 frames.], tot_loss[loss=0.1377, simple_loss=0.2108, pruned_loss=0.03232, over 972609.80 frames.], batch size: 14, lr: 1.65e-04
2022-05-07 22:52:59,362 INFO [train.py:715] (0/8) Epoch 13, batch 34750, loss[loss=0.161, simple_loss=0.2397, pruned_loss=0.04114, over 4883.00 frames.], tot_loss[loss=0.1379, simple_loss=0.2114, pruned_loss=0.03222, over 973075.98 frames.], batch size: 22, lr: 1.65e-04
2022-05-07 22:53:36,104 INFO [train.py:715] (0/8) Epoch 13, batch 34800, loss[loss=0.1326, simple_loss=0.1988, pruned_loss=0.03323, over 4943.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2106, pruned_loss=0.03165, over 972679.29 frames.], batch size: 23, lr: 1.65e-04
2022-05-07 22:53:44,981 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/epoch-13.pt
2022-05-07 22:54:25,042 INFO [train.py:715] (0/8) Epoch 14, batch 0, loss[loss=0.12, simple_loss=0.1841, pruned_loss=0.02796, over 4839.00 frames.], tot_loss[loss=0.12, simple_loss=0.1841, pruned_loss=0.02796, over 4839.00 frames.], batch size: 30, lr: 1.59e-04
2022-05-07 22:55:04,009 INFO [train.py:715] (0/8) Epoch 14, batch 50, loss[loss=0.1444, simple_loss=0.2105, pruned_loss=0.03917, over 4752.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2047, pruned_loss=0.03113, over 218471.14 frames.], batch size: 19, lr: 1.59e-04
2022-05-07 22:55:42,420 INFO [train.py:715] (0/8) Epoch 14, batch 100, loss[loss=0.1299, simple_loss=0.1984, pruned_loss=0.03072, over 4927.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2077, pruned_loss=0.03023, over 385954.49 frames.], batch size: 23, lr: 1.59e-04
2022-05-07 22:56:21,299 INFO [train.py:715] (0/8) Epoch 14, batch 150, loss[loss=0.1505, simple_loss=0.2225, pruned_loss=0.03928, over 4819.00 frames.], tot_loss[loss=0.1343, simple_loss=0.208, pruned_loss=0.03029, over 516504.71 frames.], batch size: 27, lr: 1.59e-04
2022-05-07 22:56:59,871 INFO [train.py:715] (0/8) Epoch 14, batch 200, loss[loss=0.1199, simple_loss=0.1952, pruned_loss=0.02233, over 4831.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2079, pruned_loss=0.03095, over 617611.85 frames.], batch size: 26, lr: 1.59e-04
2022-05-07 22:57:38,466 INFO [train.py:715] (0/8) Epoch 14, batch 250, loss[loss=0.1529, simple_loss=0.2274, pruned_loss=0.03918, over 4918.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2087, pruned_loss=0.03077, over 695694.70 frames.], batch size: 18, lr: 1.59e-04
2022-05-07 22:58:17,246 INFO [train.py:715] (0/8) Epoch 14, batch 300, loss[loss=0.1648, simple_loss=0.2426, pruned_loss=0.04354, over 4844.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2086, pruned_loss=0.02994, over 757590.81 frames.], batch size: 30, lr: 1.59e-04
2022-05-07 22:58:56,804 INFO [train.py:715] (0/8) Epoch 14, batch 350, loss[loss=0.1393, simple_loss=0.2137, pruned_loss=0.03252, over 4983.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2079, pruned_loss=0.0295, over 805775.90 frames.], batch size: 25, lr: 1.59e-04
2022-05-07 22:59:35,345 INFO [train.py:715] (0/8) Epoch 14, batch 400, loss[loss=0.118, simple_loss=0.1909, pruned_loss=0.02255, over 4797.00 frames.], tot_loss[loss=0.134, simple_loss=0.2083, pruned_loss=0.02984, over 843416.33 frames.], batch size: 25, lr: 1.59e-04
2022-05-07 23:00:14,778 INFO [train.py:715] (0/8) Epoch 14, batch 450, loss[loss=0.1265, simple_loss=0.2016, pruned_loss=0.02571, over 4902.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2079, pruned_loss=0.02946, over 872395.43 frames.], batch size: 17, lr: 1.59e-04
2022-05-07 23:00:54,064 INFO [train.py:715] (0/8) Epoch 14, batch 500, loss[loss=0.1289, simple_loss=0.2102, pruned_loss=0.02374, over 4946.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2084, pruned_loss=0.0302, over 894782.15 frames.], batch size: 24, lr: 1.59e-04
2022-05-07 23:01:33,688 INFO [train.py:715] (0/8) Epoch 14, batch 550, loss[loss=0.1324, simple_loss=0.203, pruned_loss=0.03091, over 4806.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2088, pruned_loss=0.03079, over 912264.06 frames.], batch size: 21, lr: 1.59e-04
2022-05-07 23:02:12,464 INFO [train.py:715] (0/8) Epoch 14, batch 600, loss[loss=0.1611, simple_loss=0.2295, pruned_loss=0.04636, over 4828.00 frames.], tot_loss[loss=0.135, simple_loss=0.2088, pruned_loss=0.03065, over 926386.18 frames.], batch size: 15, lr: 1.59e-04
2022-05-07 23:02:51,131 INFO [train.py:715] (0/8) Epoch 14, batch 650, loss[loss=0.1304, simple_loss=0.2103, pruned_loss=0.0253, over 4917.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2089, pruned_loss=0.03063, over 936840.76 frames.], batch size: 18, lr: 1.59e-04
2022-05-07 23:03:05,268 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-488000.pt
2022-05-07 23:03:32,631 INFO [train.py:715] (0/8) Epoch 14, batch 700, loss[loss=0.1196, simple_loss=0.1953, pruned_loss=0.02199, over 4938.00 frames.], tot_loss[loss=0.1354, simple_loss=0.209, pruned_loss=0.03085, over 944675.59 frames.], batch size: 21, lr: 1.59e-04
2022-05-07 23:04:11,050 INFO [train.py:715] (0/8) Epoch 14, batch 750, loss[loss=0.1157, simple_loss=0.191, pruned_loss=0.02025, over 4775.00 frames.],
tot_loss[loss=0.1346, simple_loss=0.2087, pruned_loss=0.03024, over 950227.68 frames.], batch size: 17, lr: 1.59e-04 2022-05-07 23:04:51,154 INFO [train.py:715] (0/8) Epoch 14, batch 800, loss[loss=0.1526, simple_loss=0.2154, pruned_loss=0.04489, over 4841.00 frames.], tot_loss[loss=0.135, simple_loss=0.2093, pruned_loss=0.03038, over 954291.03 frames.], batch size: 30, lr: 1.59e-04 2022-05-07 23:05:30,247 INFO [train.py:715] (0/8) Epoch 14, batch 850, loss[loss=0.1415, simple_loss=0.2274, pruned_loss=0.02781, over 4801.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2094, pruned_loss=0.03076, over 958453.06 frames.], batch size: 24, lr: 1.59e-04 2022-05-07 23:06:09,705 INFO [train.py:715] (0/8) Epoch 14, batch 900, loss[loss=0.1366, simple_loss=0.2055, pruned_loss=0.03381, over 4896.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2093, pruned_loss=0.03055, over 961793.12 frames.], batch size: 22, lr: 1.59e-04 2022-05-07 23:06:48,328 INFO [train.py:715] (0/8) Epoch 14, batch 950, loss[loss=0.1701, simple_loss=0.2397, pruned_loss=0.05026, over 4991.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2104, pruned_loss=0.03138, over 965148.24 frames.], batch size: 16, lr: 1.59e-04 2022-05-07 23:07:27,883 INFO [train.py:715] (0/8) Epoch 14, batch 1000, loss[loss=0.1419, simple_loss=0.2183, pruned_loss=0.03278, over 4909.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2097, pruned_loss=0.03132, over 966304.63 frames.], batch size: 39, lr: 1.59e-04 2022-05-07 23:08:07,945 INFO [train.py:715] (0/8) Epoch 14, batch 1050, loss[loss=0.1827, simple_loss=0.2512, pruned_loss=0.05711, over 4961.00 frames.], tot_loss[loss=0.1363, simple_loss=0.21, pruned_loss=0.03136, over 968025.74 frames.], batch size: 15, lr: 1.59e-04 2022-05-07 23:08:47,255 INFO [train.py:715] (0/8) Epoch 14, batch 1100, loss[loss=0.1423, simple_loss=0.2136, pruned_loss=0.03555, over 4783.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2103, pruned_loss=0.03132, over 968454.79 frames.], batch size: 18, lr: 1.59e-04 2022-05-07 23:09:26,966 INFO [train.py:715] (0/8) Epoch 14, batch 1150, loss[loss=0.1553, simple_loss=0.2316, pruned_loss=0.03953, over 4965.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2093, pruned_loss=0.03061, over 969044.79 frames.], batch size: 39, lr: 1.59e-04 2022-05-07 23:10:07,019 INFO [train.py:715] (0/8) Epoch 14, batch 1200, loss[loss=0.1364, simple_loss=0.2039, pruned_loss=0.03443, over 4808.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2094, pruned_loss=0.03094, over 970072.36 frames.], batch size: 25, lr: 1.59e-04 2022-05-07 23:10:47,176 INFO [train.py:715] (0/8) Epoch 14, batch 1250, loss[loss=0.1502, simple_loss=0.2128, pruned_loss=0.04382, over 4798.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2082, pruned_loss=0.03042, over 969910.88 frames.], batch size: 21, lr: 1.59e-04 2022-05-07 23:11:26,193 INFO [train.py:715] (0/8) Epoch 14, batch 1300, loss[loss=0.1255, simple_loss=0.1925, pruned_loss=0.02921, over 4841.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2086, pruned_loss=0.03045, over 971334.67 frames.], batch size: 32, lr: 1.59e-04 2022-05-07 23:12:05,711 INFO [train.py:715] (0/8) Epoch 14, batch 1350, loss[loss=0.1354, simple_loss=0.2105, pruned_loss=0.03012, over 4710.00 frames.], tot_loss[loss=0.1351, simple_loss=0.209, pruned_loss=0.03065, over 970706.80 frames.], batch size: 15, lr: 1.59e-04 2022-05-07 23:12:45,082 INFO [train.py:715] (0/8) Epoch 14, batch 1400, loss[loss=0.1219, simple_loss=0.1849, pruned_loss=0.02943, over 4835.00 frames.], tot_loss[loss=0.1347, 
simple_loss=0.2084, pruned_loss=0.03051, over 971795.45 frames.], batch size: 15, lr: 1.59e-04 2022-05-07 23:13:24,654 INFO [train.py:715] (0/8) Epoch 14, batch 1450, loss[loss=0.1652, simple_loss=0.2341, pruned_loss=0.04814, over 4848.00 frames.], tot_loss[loss=0.1356, simple_loss=0.209, pruned_loss=0.03114, over 971601.08 frames.], batch size: 20, lr: 1.59e-04 2022-05-07 23:14:04,622 INFO [train.py:715] (0/8) Epoch 14, batch 1500, loss[loss=0.1237, simple_loss=0.194, pruned_loss=0.02668, over 4901.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2087, pruned_loss=0.03069, over 971295.52 frames.], batch size: 22, lr: 1.59e-04 2022-05-07 23:14:44,302 INFO [train.py:715] (0/8) Epoch 14, batch 1550, loss[loss=0.1367, simple_loss=0.2123, pruned_loss=0.0305, over 4837.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2093, pruned_loss=0.03093, over 972023.60 frames.], batch size: 30, lr: 1.59e-04 2022-05-07 23:15:24,192 INFO [train.py:715] (0/8) Epoch 14, batch 1600, loss[loss=0.1308, simple_loss=0.198, pruned_loss=0.03179, over 4763.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2091, pruned_loss=0.03074, over 971720.55 frames.], batch size: 19, lr: 1.59e-04 2022-05-07 23:16:03,422 INFO [train.py:715] (0/8) Epoch 14, batch 1650, loss[loss=0.1481, simple_loss=0.2258, pruned_loss=0.03516, over 4863.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2092, pruned_loss=0.0308, over 972172.57 frames.], batch size: 32, lr: 1.59e-04 2022-05-07 23:16:43,084 INFO [train.py:715] (0/8) Epoch 14, batch 1700, loss[loss=0.1621, simple_loss=0.2322, pruned_loss=0.04603, over 4895.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2099, pruned_loss=0.03135, over 972775.36 frames.], batch size: 19, lr: 1.59e-04 2022-05-07 23:17:22,556 INFO [train.py:715] (0/8) Epoch 14, batch 1750, loss[loss=0.09908, simple_loss=0.1702, pruned_loss=0.01398, over 4843.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2097, pruned_loss=0.03124, over 973521.86 frames.], batch size: 12, lr: 1.59e-04 2022-05-07 23:18:02,283 INFO [train.py:715] (0/8) Epoch 14, batch 1800, loss[loss=0.1219, simple_loss=0.1899, pruned_loss=0.02692, over 4817.00 frames.], tot_loss[loss=0.136, simple_loss=0.2093, pruned_loss=0.03135, over 972778.24 frames.], batch size: 12, lr: 1.59e-04 2022-05-07 23:18:40,623 INFO [train.py:715] (0/8) Epoch 14, batch 1850, loss[loss=0.1444, simple_loss=0.2131, pruned_loss=0.03781, over 4766.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2092, pruned_loss=0.03135, over 972002.89 frames.], batch size: 19, lr: 1.59e-04 2022-05-07 23:19:19,858 INFO [train.py:715] (0/8) Epoch 14, batch 1900, loss[loss=0.1316, simple_loss=0.1974, pruned_loss=0.03295, over 4699.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2092, pruned_loss=0.03127, over 971907.44 frames.], batch size: 15, lr: 1.59e-04 2022-05-07 23:19:59,661 INFO [train.py:715] (0/8) Epoch 14, batch 1950, loss[loss=0.153, simple_loss=0.2201, pruned_loss=0.04292, over 4821.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2096, pruned_loss=0.03132, over 972243.86 frames.], batch size: 15, lr: 1.59e-04 2022-05-07 23:20:39,803 INFO [train.py:715] (0/8) Epoch 14, batch 2000, loss[loss=0.1411, simple_loss=0.2139, pruned_loss=0.03416, over 4887.00 frames.], tot_loss[loss=0.135, simple_loss=0.2086, pruned_loss=0.03075, over 972258.23 frames.], batch size: 22, lr: 1.59e-04 2022-05-07 23:21:19,091 INFO [train.py:715] (0/8) Epoch 14, batch 2050, loss[loss=0.1064, simple_loss=0.1802, pruned_loss=0.01627, over 4791.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2095, 
pruned_loss=0.03077, over 972007.26 frames.], batch size: 12, lr: 1.59e-04 2022-05-07 23:21:58,530 INFO [train.py:715] (0/8) Epoch 14, batch 2100, loss[loss=0.1534, simple_loss=0.2274, pruned_loss=0.0397, over 4896.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2099, pruned_loss=0.03078, over 972685.84 frames.], batch size: 16, lr: 1.59e-04 2022-05-07 23:22:38,246 INFO [train.py:715] (0/8) Epoch 14, batch 2150, loss[loss=0.1469, simple_loss=0.2119, pruned_loss=0.0409, over 4821.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2095, pruned_loss=0.03129, over 972408.58 frames.], batch size: 15, lr: 1.59e-04 2022-05-07 23:23:16,935 INFO [train.py:715] (0/8) Epoch 14, batch 2200, loss[loss=0.1295, simple_loss=0.2042, pruned_loss=0.02737, over 4933.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2089, pruned_loss=0.03117, over 972479.49 frames.], batch size: 23, lr: 1.59e-04 2022-05-07 23:23:55,888 INFO [train.py:715] (0/8) Epoch 14, batch 2250, loss[loss=0.1405, simple_loss=0.2277, pruned_loss=0.02669, over 4960.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2102, pruned_loss=0.03139, over 972094.40 frames.], batch size: 24, lr: 1.59e-04 2022-05-07 23:24:34,961 INFO [train.py:715] (0/8) Epoch 14, batch 2300, loss[loss=0.1377, simple_loss=0.2118, pruned_loss=0.03184, over 4955.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2097, pruned_loss=0.03103, over 971354.99 frames.], batch size: 21, lr: 1.59e-04 2022-05-07 23:25:14,124 INFO [train.py:715] (0/8) Epoch 14, batch 2350, loss[loss=0.1411, simple_loss=0.2092, pruned_loss=0.03645, over 4960.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2097, pruned_loss=0.03072, over 972027.66 frames.], batch size: 35, lr: 1.59e-04 2022-05-07 23:25:53,237 INFO [train.py:715] (0/8) Epoch 14, batch 2400, loss[loss=0.1412, simple_loss=0.2105, pruned_loss=0.03592, over 4904.00 frames.], tot_loss[loss=0.136, simple_loss=0.21, pruned_loss=0.03096, over 972291.36 frames.], batch size: 17, lr: 1.59e-04 2022-05-07 23:26:32,284 INFO [train.py:715] (0/8) Epoch 14, batch 2450, loss[loss=0.133, simple_loss=0.2037, pruned_loss=0.03115, over 4916.00 frames.], tot_loss[loss=0.136, simple_loss=0.2099, pruned_loss=0.03105, over 971736.32 frames.], batch size: 18, lr: 1.59e-04 2022-05-07 23:27:11,604 INFO [train.py:715] (0/8) Epoch 14, batch 2500, loss[loss=0.1383, simple_loss=0.199, pruned_loss=0.03881, over 4806.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2097, pruned_loss=0.03125, over 972003.27 frames.], batch size: 12, lr: 1.59e-04 2022-05-07 23:27:50,104 INFO [train.py:715] (0/8) Epoch 14, batch 2550, loss[loss=0.1512, simple_loss=0.2184, pruned_loss=0.04201, over 4904.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2092, pruned_loss=0.03076, over 972364.54 frames.], batch size: 19, lr: 1.59e-04 2022-05-07 23:28:29,683 INFO [train.py:715] (0/8) Epoch 14, batch 2600, loss[loss=0.1339, simple_loss=0.2141, pruned_loss=0.02688, over 4798.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2098, pruned_loss=0.03095, over 971806.49 frames.], batch size: 21, lr: 1.59e-04 2022-05-07 23:29:09,135 INFO [train.py:715] (0/8) Epoch 14, batch 2650, loss[loss=0.1448, simple_loss=0.2164, pruned_loss=0.03657, over 4865.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2097, pruned_loss=0.03092, over 971998.04 frames.], batch size: 20, lr: 1.59e-04 2022-05-07 23:29:48,488 INFO [train.py:715] (0/8) Epoch 14, batch 2700, loss[loss=0.1218, simple_loss=0.2015, pruned_loss=0.02103, over 4811.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2095, pruned_loss=0.03071, over 972654.45 
frames.], batch size: 25, lr: 1.59e-04 2022-05-07 23:30:27,054 INFO [train.py:715] (0/8) Epoch 14, batch 2750, loss[loss=0.1204, simple_loss=0.1967, pruned_loss=0.02208, over 4904.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2092, pruned_loss=0.03068, over 971785.16 frames.], batch size: 19, lr: 1.59e-04 2022-05-07 23:31:06,231 INFO [train.py:715] (0/8) Epoch 14, batch 2800, loss[loss=0.1235, simple_loss=0.2083, pruned_loss=0.01933, over 4744.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2086, pruned_loss=0.03036, over 971936.65 frames.], batch size: 14, lr: 1.59e-04 2022-05-07 23:31:45,907 INFO [train.py:715] (0/8) Epoch 14, batch 2850, loss[loss=0.1432, simple_loss=0.2117, pruned_loss=0.03733, over 4890.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2093, pruned_loss=0.03055, over 971702.48 frames.], batch size: 17, lr: 1.59e-04 2022-05-07 23:32:24,346 INFO [train.py:715] (0/8) Epoch 14, batch 2900, loss[loss=0.1613, simple_loss=0.235, pruned_loss=0.04384, over 4930.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2096, pruned_loss=0.03091, over 972004.46 frames.], batch size: 29, lr: 1.59e-04 2022-05-07 23:33:06,133 INFO [train.py:715] (0/8) Epoch 14, batch 2950, loss[loss=0.117, simple_loss=0.1931, pruned_loss=0.02049, over 4832.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2092, pruned_loss=0.03047, over 971479.63 frames.], batch size: 12, lr: 1.59e-04 2022-05-07 23:33:45,670 INFO [train.py:715] (0/8) Epoch 14, batch 3000, loss[loss=0.1601, simple_loss=0.2326, pruned_loss=0.0438, over 4665.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2097, pruned_loss=0.03081, over 971599.84 frames.], batch size: 13, lr: 1.59e-04 2022-05-07 23:33:45,671 INFO [train.py:733] (0/8) Computing validation loss 2022-05-07 23:33:55,240 INFO [train.py:742] (0/8) Epoch 14, validation: loss=0.1052, simple_loss=0.1891, pruned_loss=0.01067, over 914524.00 frames. 
2022-05-07 23:34:34,252 INFO [train.py:715] (0/8) Epoch 14, batch 3050, loss[loss=0.1327, simple_loss=0.2011, pruned_loss=0.03213, over 4788.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2098, pruned_loss=0.03094, over 970952.81 frames.], batch size: 18, lr: 1.59e-04 2022-05-07 23:35:14,214 INFO [train.py:715] (0/8) Epoch 14, batch 3100, loss[loss=0.1125, simple_loss=0.1965, pruned_loss=0.01423, over 4801.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2085, pruned_loss=0.03055, over 971350.86 frames.], batch size: 25, lr: 1.59e-04 2022-05-07 23:35:53,767 INFO [train.py:715] (0/8) Epoch 14, batch 3150, loss[loss=0.1239, simple_loss=0.1989, pruned_loss=0.02446, over 4789.00 frames.], tot_loss[loss=0.1352, simple_loss=0.209, pruned_loss=0.03073, over 971623.10 frames.], batch size: 18, lr: 1.59e-04 2022-05-07 23:36:33,460 INFO [train.py:715] (0/8) Epoch 14, batch 3200, loss[loss=0.1128, simple_loss=0.1886, pruned_loss=0.01848, over 4919.00 frames.], tot_loss[loss=0.1351, simple_loss=0.209, pruned_loss=0.03067, over 972566.73 frames.], batch size: 29, lr: 1.59e-04 2022-05-07 23:37:14,482 INFO [train.py:715] (0/8) Epoch 14, batch 3250, loss[loss=0.1098, simple_loss=0.1788, pruned_loss=0.02039, over 4801.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2089, pruned_loss=0.03084, over 971573.86 frames.], batch size: 12, lr: 1.59e-04 2022-05-07 23:37:54,307 INFO [train.py:715] (0/8) Epoch 14, batch 3300, loss[loss=0.149, simple_loss=0.2227, pruned_loss=0.03768, over 4841.00 frames.], tot_loss[loss=0.1362, simple_loss=0.21, pruned_loss=0.03119, over 971578.00 frames.], batch size: 15, lr: 1.59e-04 2022-05-07 23:38:34,436 INFO [train.py:715] (0/8) Epoch 14, batch 3350, loss[loss=0.1379, simple_loss=0.2115, pruned_loss=0.03217, over 4883.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2093, pruned_loss=0.03045, over 971818.36 frames.], batch size: 16, lr: 1.59e-04 2022-05-07 23:39:15,375 INFO [train.py:715] (0/8) Epoch 14, batch 3400, loss[loss=0.1468, simple_loss=0.2115, pruned_loss=0.04104, over 4903.00 frames.], tot_loss[loss=0.135, simple_loss=0.2089, pruned_loss=0.03057, over 972733.03 frames.], batch size: 17, lr: 1.59e-04 2022-05-07 23:39:56,023 INFO [train.py:715] (0/8) Epoch 14, batch 3450, loss[loss=0.1293, simple_loss=0.2063, pruned_loss=0.0262, over 4860.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2093, pruned_loss=0.03085, over 972068.86 frames.], batch size: 13, lr: 1.59e-04 2022-05-07 23:40:35,904 INFO [train.py:715] (0/8) Epoch 14, batch 3500, loss[loss=0.1148, simple_loss=0.1881, pruned_loss=0.02076, over 4813.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2091, pruned_loss=0.03079, over 972414.97 frames.], batch size: 26, lr: 1.59e-04 2022-05-07 23:41:15,975 INFO [train.py:715] (0/8) Epoch 14, batch 3550, loss[loss=0.1661, simple_loss=0.2484, pruned_loss=0.04192, over 4910.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2093, pruned_loss=0.03107, over 972274.18 frames.], batch size: 39, lr: 1.59e-04 2022-05-07 23:41:56,122 INFO [train.py:715] (0/8) Epoch 14, batch 3600, loss[loss=0.1255, simple_loss=0.196, pruned_loss=0.0275, over 4748.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2092, pruned_loss=0.03107, over 972641.86 frames.], batch size: 16, lr: 1.59e-04 2022-05-07 23:42:36,125 INFO [train.py:715] (0/8) Epoch 14, batch 3650, loss[loss=0.1338, simple_loss=0.204, pruned_loss=0.03185, over 4963.00 frames.], tot_loss[loss=0.1352, simple_loss=0.209, pruned_loss=0.03072, over 972431.38 frames.], batch size: 24, lr: 1.59e-04 2022-05-07 23:43:16,031 INFO 
[train.py:715] (0/8) Epoch 14, batch 3700, loss[loss=0.1162, simple_loss=0.1831, pruned_loss=0.02466, over 4963.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2097, pruned_loss=0.0312, over 973200.99 frames.], batch size: 35, lr: 1.59e-04 2022-05-07 23:43:56,762 INFO [train.py:715] (0/8) Epoch 14, batch 3750, loss[loss=0.1367, simple_loss=0.1993, pruned_loss=0.0371, over 4828.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2082, pruned_loss=0.03076, over 972627.50 frames.], batch size: 12, lr: 1.59e-04 2022-05-07 23:44:36,925 INFO [train.py:715] (0/8) Epoch 14, batch 3800, loss[loss=0.1738, simple_loss=0.2444, pruned_loss=0.05153, over 4891.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2083, pruned_loss=0.03078, over 972485.79 frames.], batch size: 38, lr: 1.59e-04 2022-05-07 23:45:16,168 INFO [train.py:715] (0/8) Epoch 14, batch 3850, loss[loss=0.1292, simple_loss=0.1994, pruned_loss=0.02949, over 4826.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2079, pruned_loss=0.03065, over 972330.87 frames.], batch size: 27, lr: 1.59e-04 2022-05-07 23:45:56,644 INFO [train.py:715] (0/8) Epoch 14, batch 3900, loss[loss=0.162, simple_loss=0.2177, pruned_loss=0.05319, over 4743.00 frames.], tot_loss[loss=0.1357, simple_loss=0.209, pruned_loss=0.03117, over 972071.32 frames.], batch size: 19, lr: 1.59e-04 2022-05-07 23:46:37,885 INFO [train.py:715] (0/8) Epoch 14, batch 3950, loss[loss=0.11, simple_loss=0.177, pruned_loss=0.02153, over 4904.00 frames.], tot_loss[loss=0.135, simple_loss=0.2088, pruned_loss=0.03061, over 972627.11 frames.], batch size: 18, lr: 1.59e-04 2022-05-07 23:47:18,824 INFO [train.py:715] (0/8) Epoch 14, batch 4000, loss[loss=0.1489, simple_loss=0.2212, pruned_loss=0.03831, over 4987.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2093, pruned_loss=0.03068, over 973278.94 frames.], batch size: 14, lr: 1.59e-04 2022-05-07 23:47:59,308 INFO [train.py:715] (0/8) Epoch 14, batch 4050, loss[loss=0.1498, simple_loss=0.2262, pruned_loss=0.03675, over 4894.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2093, pruned_loss=0.03102, over 973048.02 frames.], batch size: 17, lr: 1.59e-04 2022-05-07 23:48:40,128 INFO [train.py:715] (0/8) Epoch 14, batch 4100, loss[loss=0.128, simple_loss=0.2062, pruned_loss=0.02486, over 4857.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2093, pruned_loss=0.03111, over 972755.60 frames.], batch size: 20, lr: 1.59e-04 2022-05-07 23:49:21,559 INFO [train.py:715] (0/8) Epoch 14, batch 4150, loss[loss=0.09836, simple_loss=0.163, pruned_loss=0.01684, over 4749.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2083, pruned_loss=0.03045, over 972746.63 frames.], batch size: 12, lr: 1.59e-04 2022-05-07 23:50:02,217 INFO [train.py:715] (0/8) Epoch 14, batch 4200, loss[loss=0.1159, simple_loss=0.195, pruned_loss=0.01835, over 4942.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2084, pruned_loss=0.03069, over 973421.84 frames.], batch size: 35, lr: 1.59e-04 2022-05-07 23:50:43,285 INFO [train.py:715] (0/8) Epoch 14, batch 4250, loss[loss=0.1366, simple_loss=0.2136, pruned_loss=0.02982, over 4815.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2087, pruned_loss=0.0306, over 972591.51 frames.], batch size: 13, lr: 1.59e-04 2022-05-07 23:51:25,173 INFO [train.py:715] (0/8) Epoch 14, batch 4300, loss[loss=0.1452, simple_loss=0.2117, pruned_loss=0.03936, over 4782.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2092, pruned_loss=0.03082, over 971895.14 frames.], batch size: 17, lr: 1.59e-04 2022-05-07 23:52:06,420 INFO [train.py:715] (0/8) Epoch 14, batch 4350, 
loss[loss=0.1388, simple_loss=0.1956, pruned_loss=0.04103, over 4890.00 frames.], tot_loss[loss=0.1352, simple_loss=0.209, pruned_loss=0.03071, over 972081.52 frames.], batch size: 18, lr: 1.59e-04 2022-05-07 23:52:46,952 INFO [train.py:715] (0/8) Epoch 14, batch 4400, loss[loss=0.1439, simple_loss=0.2107, pruned_loss=0.03858, over 4955.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2095, pruned_loss=0.03113, over 972608.55 frames.], batch size: 21, lr: 1.59e-04 2022-05-07 23:53:27,640 INFO [train.py:715] (0/8) Epoch 14, batch 4450, loss[loss=0.1515, simple_loss=0.234, pruned_loss=0.03452, over 4968.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2096, pruned_loss=0.0311, over 972461.78 frames.], batch size: 15, lr: 1.59e-04 2022-05-07 23:54:08,631 INFO [train.py:715] (0/8) Epoch 14, batch 4500, loss[loss=0.126, simple_loss=0.204, pruned_loss=0.02399, over 4822.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2095, pruned_loss=0.03099, over 973102.22 frames.], batch size: 26, lr: 1.59e-04 2022-05-07 23:54:48,679 INFO [train.py:715] (0/8) Epoch 14, batch 4550, loss[loss=0.1356, simple_loss=0.2236, pruned_loss=0.02379, over 4805.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2098, pruned_loss=0.03104, over 972501.31 frames.], batch size: 25, lr: 1.59e-04 2022-05-07 23:55:27,631 INFO [train.py:715] (0/8) Epoch 14, batch 4600, loss[loss=0.1306, simple_loss=0.2, pruned_loss=0.0306, over 4970.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2097, pruned_loss=0.03106, over 973049.06 frames.], batch size: 14, lr: 1.59e-04 2022-05-07 23:56:08,468 INFO [train.py:715] (0/8) Epoch 14, batch 4650, loss[loss=0.1415, simple_loss=0.2063, pruned_loss=0.03842, over 4748.00 frames.], tot_loss[loss=0.1364, simple_loss=0.21, pruned_loss=0.0314, over 973360.31 frames.], batch size: 19, lr: 1.59e-04 2022-05-07 23:56:48,195 INFO [train.py:715] (0/8) Epoch 14, batch 4700, loss[loss=0.1281, simple_loss=0.2009, pruned_loss=0.02767, over 4963.00 frames.], tot_loss[loss=0.136, simple_loss=0.2095, pruned_loss=0.03129, over 973049.48 frames.], batch size: 35, lr: 1.59e-04 2022-05-07 23:57:26,877 INFO [train.py:715] (0/8) Epoch 14, batch 4750, loss[loss=0.1562, simple_loss=0.2201, pruned_loss=0.04618, over 4642.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2094, pruned_loss=0.031, over 971985.95 frames.], batch size: 13, lr: 1.58e-04 2022-05-07 23:58:06,245 INFO [train.py:715] (0/8) Epoch 14, batch 4800, loss[loss=0.1365, simple_loss=0.2096, pruned_loss=0.03175, over 4779.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2082, pruned_loss=0.03059, over 971837.99 frames.], batch size: 14, lr: 1.58e-04 2022-05-07 23:58:46,078 INFO [train.py:715] (0/8) Epoch 14, batch 4850, loss[loss=0.1677, simple_loss=0.2307, pruned_loss=0.05237, over 4739.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2073, pruned_loss=0.03023, over 971920.68 frames.], batch size: 16, lr: 1.58e-04 2022-05-07 23:59:25,004 INFO [train.py:715] (0/8) Epoch 14, batch 4900, loss[loss=0.145, simple_loss=0.2124, pruned_loss=0.03878, over 4884.00 frames.], tot_loss[loss=0.134, simple_loss=0.2074, pruned_loss=0.0303, over 972509.69 frames.], batch size: 19, lr: 1.58e-04 2022-05-08 00:00:04,154 INFO [train.py:715] (0/8) Epoch 14, batch 4950, loss[loss=0.1544, simple_loss=0.2283, pruned_loss=0.0403, over 4822.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2084, pruned_loss=0.03096, over 972073.22 frames.], batch size: 25, lr: 1.58e-04 2022-05-08 00:00:44,230 INFO [train.py:715] (0/8) Epoch 14, batch 5000, loss[loss=0.1215, simple_loss=0.2023, 
pruned_loss=0.02036, over 4762.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2093, pruned_loss=0.03116, over 972327.14 frames.], batch size: 19, lr: 1.58e-04 2022-05-08 00:01:23,510 INFO [train.py:715] (0/8) Epoch 14, batch 5050, loss[loss=0.1434, simple_loss=0.2119, pruned_loss=0.03752, over 4758.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2096, pruned_loss=0.03112, over 972269.41 frames.], batch size: 16, lr: 1.58e-04 2022-05-08 00:02:02,198 INFO [train.py:715] (0/8) Epoch 14, batch 5100, loss[loss=0.133, simple_loss=0.213, pruned_loss=0.02651, over 4703.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2097, pruned_loss=0.03096, over 972463.82 frames.], batch size: 15, lr: 1.58e-04 2022-05-08 00:02:41,796 INFO [train.py:715] (0/8) Epoch 14, batch 5150, loss[loss=0.1275, simple_loss=0.2091, pruned_loss=0.02293, over 4957.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2097, pruned_loss=0.03074, over 972855.90 frames.], batch size: 24, lr: 1.58e-04 2022-05-08 00:03:21,422 INFO [train.py:715] (0/8) Epoch 14, batch 5200, loss[loss=0.1433, simple_loss=0.2167, pruned_loss=0.03497, over 4985.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2093, pruned_loss=0.03078, over 972547.31 frames.], batch size: 15, lr: 1.58e-04 2022-05-08 00:03:59,947 INFO [train.py:715] (0/8) Epoch 14, batch 5250, loss[loss=0.1357, simple_loss=0.1963, pruned_loss=0.03755, over 4799.00 frames.], tot_loss[loss=0.1355, simple_loss=0.209, pruned_loss=0.03096, over 973226.86 frames.], batch size: 13, lr: 1.58e-04 2022-05-08 00:04:38,440 INFO [train.py:715] (0/8) Epoch 14, batch 5300, loss[loss=0.1247, simple_loss=0.2011, pruned_loss=0.02417, over 4902.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2098, pruned_loss=0.03105, over 973319.53 frames.], batch size: 19, lr: 1.58e-04 2022-05-08 00:05:17,622 INFO [train.py:715] (0/8) Epoch 14, batch 5350, loss[loss=0.136, simple_loss=0.2191, pruned_loss=0.02645, over 4914.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2103, pruned_loss=0.03115, over 973369.68 frames.], batch size: 17, lr: 1.58e-04 2022-05-08 00:05:56,195 INFO [train.py:715] (0/8) Epoch 14, batch 5400, loss[loss=0.1484, simple_loss=0.2188, pruned_loss=0.03906, over 4730.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2112, pruned_loss=0.0316, over 973371.49 frames.], batch size: 16, lr: 1.58e-04 2022-05-08 00:06:34,699 INFO [train.py:715] (0/8) Epoch 14, batch 5450, loss[loss=0.1193, simple_loss=0.1911, pruned_loss=0.02375, over 4785.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2105, pruned_loss=0.03165, over 973045.86 frames.], batch size: 12, lr: 1.58e-04 2022-05-08 00:07:13,534 INFO [train.py:715] (0/8) Epoch 14, batch 5500, loss[loss=0.147, simple_loss=0.2169, pruned_loss=0.03852, over 4683.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2103, pruned_loss=0.03131, over 972709.16 frames.], batch size: 15, lr: 1.58e-04 2022-05-08 00:07:53,211 INFO [train.py:715] (0/8) Epoch 14, batch 5550, loss[loss=0.1124, simple_loss=0.1935, pruned_loss=0.01559, over 4918.00 frames.], tot_loss[loss=0.137, simple_loss=0.2108, pruned_loss=0.03162, over 972838.60 frames.], batch size: 18, lr: 1.58e-04 2022-05-08 00:08:31,524 INFO [train.py:715] (0/8) Epoch 14, batch 5600, loss[loss=0.1304, simple_loss=0.2149, pruned_loss=0.02293, over 4879.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2095, pruned_loss=0.03118, over 972217.95 frames.], batch size: 16, lr: 1.58e-04 2022-05-08 00:09:10,029 INFO [train.py:715] (0/8) Epoch 14, batch 5650, loss[loss=0.1296, simple_loss=0.2021, pruned_loss=0.02855, over 4775.00 
frames.], tot_loss[loss=0.1358, simple_loss=0.2093, pruned_loss=0.03113, over 972465.53 frames.], batch size: 18, lr: 1.58e-04 2022-05-08 00:09:49,143 INFO [train.py:715] (0/8) Epoch 14, batch 5700, loss[loss=0.152, simple_loss=0.2231, pruned_loss=0.04045, over 4856.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2095, pruned_loss=0.031, over 972305.96 frames.], batch size: 20, lr: 1.58e-04 2022-05-08 00:10:27,412 INFO [train.py:715] (0/8) Epoch 14, batch 5750, loss[loss=0.1365, simple_loss=0.2169, pruned_loss=0.02808, over 4798.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2092, pruned_loss=0.03091, over 972844.67 frames.], batch size: 18, lr: 1.58e-04 2022-05-08 00:11:05,794 INFO [train.py:715] (0/8) Epoch 14, batch 5800, loss[loss=0.1561, simple_loss=0.2284, pruned_loss=0.04192, over 4877.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2087, pruned_loss=0.03053, over 973978.75 frames.], batch size: 32, lr: 1.58e-04 2022-05-08 00:11:44,413 INFO [train.py:715] (0/8) Epoch 14, batch 5850, loss[loss=0.1323, simple_loss=0.2091, pruned_loss=0.02771, over 4871.00 frames.], tot_loss[loss=0.135, simple_loss=0.2085, pruned_loss=0.03073, over 973803.98 frames.], batch size: 20, lr: 1.58e-04 2022-05-08 00:12:23,188 INFO [train.py:715] (0/8) Epoch 14, batch 5900, loss[loss=0.1181, simple_loss=0.1898, pruned_loss=0.0232, over 4730.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2087, pruned_loss=0.03079, over 972548.63 frames.], batch size: 16, lr: 1.58e-04 2022-05-08 00:13:02,948 INFO [train.py:715] (0/8) Epoch 14, batch 5950, loss[loss=0.1298, simple_loss=0.2064, pruned_loss=0.02659, over 4977.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2087, pruned_loss=0.03108, over 972237.12 frames.], batch size: 25, lr: 1.58e-04 2022-05-08 00:13:42,644 INFO [train.py:715] (0/8) Epoch 14, batch 6000, loss[loss=0.119, simple_loss=0.192, pruned_loss=0.02293, over 4902.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2094, pruned_loss=0.03153, over 972330.73 frames.], batch size: 17, lr: 1.58e-04 2022-05-08 00:13:42,645 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 00:13:52,504 INFO [train.py:742] (0/8) Epoch 14, validation: loss=0.105, simple_loss=0.1888, pruned_loss=0.01057, over 914524.00 frames. 
2022-05-08 00:14:31,602 INFO [train.py:715] (0/8) Epoch 14, batch 6050, loss[loss=0.1343, simple_loss=0.207, pruned_loss=0.03081, over 4912.00 frames.], tot_loss[loss=0.1356, simple_loss=0.209, pruned_loss=0.03115, over 971696.79 frames.], batch size: 18, lr: 1.58e-04 2022-05-08 00:15:10,779 INFO [train.py:715] (0/8) Epoch 14, batch 6100, loss[loss=0.1523, simple_loss=0.2308, pruned_loss=0.03688, over 4758.00 frames.], tot_loss[loss=0.1354, simple_loss=0.209, pruned_loss=0.03095, over 971821.47 frames.], batch size: 16, lr: 1.58e-04 2022-05-08 00:15:50,781 INFO [train.py:715] (0/8) Epoch 14, batch 6150, loss[loss=0.14, simple_loss=0.2116, pruned_loss=0.03413, over 4852.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2093, pruned_loss=0.03075, over 971707.01 frames.], batch size: 34, lr: 1.58e-04 2022-05-08 00:16:30,387 INFO [train.py:715] (0/8) Epoch 14, batch 6200, loss[loss=0.1267, simple_loss=0.2064, pruned_loss=0.02356, over 4777.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2093, pruned_loss=0.03118, over 970935.40 frames.], batch size: 18, lr: 1.58e-04 2022-05-08 00:17:10,262 INFO [train.py:715] (0/8) Epoch 14, batch 6250, loss[loss=0.1539, simple_loss=0.2275, pruned_loss=0.04016, over 4783.00 frames.], tot_loss[loss=0.1356, simple_loss=0.209, pruned_loss=0.03104, over 970403.69 frames.], batch size: 14, lr: 1.58e-04 2022-05-08 00:17:49,629 INFO [train.py:715] (0/8) Epoch 14, batch 6300, loss[loss=0.1542, simple_loss=0.233, pruned_loss=0.03765, over 4811.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2088, pruned_loss=0.03094, over 970873.80 frames.], batch size: 21, lr: 1.58e-04 2022-05-08 00:18:29,659 INFO [train.py:715] (0/8) Epoch 14, batch 6350, loss[loss=0.1485, simple_loss=0.2193, pruned_loss=0.03888, over 4866.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2097, pruned_loss=0.03132, over 971526.41 frames.], batch size: 20, lr: 1.58e-04 2022-05-08 00:19:09,434 INFO [train.py:715] (0/8) Epoch 14, batch 6400, loss[loss=0.1432, simple_loss=0.2156, pruned_loss=0.03544, over 4887.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2101, pruned_loss=0.03163, over 972288.44 frames.], batch size: 22, lr: 1.58e-04 2022-05-08 00:19:49,525 INFO [train.py:715] (0/8) Epoch 14, batch 6450, loss[loss=0.1191, simple_loss=0.1981, pruned_loss=0.01999, over 4917.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2099, pruned_loss=0.03148, over 972911.74 frames.], batch size: 18, lr: 1.58e-04 2022-05-08 00:20:29,514 INFO [train.py:715] (0/8) Epoch 14, batch 6500, loss[loss=0.1298, simple_loss=0.2035, pruned_loss=0.02807, over 4955.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2098, pruned_loss=0.03127, over 973461.66 frames.], batch size: 21, lr: 1.58e-04 2022-05-08 00:21:09,167 INFO [train.py:715] (0/8) Epoch 14, batch 6550, loss[loss=0.1351, simple_loss=0.2092, pruned_loss=0.03044, over 4961.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2096, pruned_loss=0.03086, over 973447.33 frames.], batch size: 15, lr: 1.58e-04 2022-05-08 00:21:49,050 INFO [train.py:715] (0/8) Epoch 14, batch 6600, loss[loss=0.1288, simple_loss=0.2039, pruned_loss=0.02683, over 4937.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2098, pruned_loss=0.03096, over 973950.84 frames.], batch size: 23, lr: 1.58e-04 2022-05-08 00:22:29,224 INFO [train.py:715] (0/8) Epoch 14, batch 6650, loss[loss=0.1297, simple_loss=0.1975, pruned_loss=0.03092, over 4827.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2095, pruned_loss=0.03108, over 972879.71 frames.], batch size: 15, lr: 1.58e-04 2022-05-08 00:23:08,958 INFO 
[train.py:715] (0/8) Epoch 14, batch 6700, loss[loss=0.1525, simple_loss=0.2197, pruned_loss=0.04266, over 4752.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2098, pruned_loss=0.03126, over 972923.54 frames.], batch size: 16, lr: 1.58e-04 2022-05-08 00:23:48,841 INFO [train.py:715] (0/8) Epoch 14, batch 6750, loss[loss=0.1128, simple_loss=0.1895, pruned_loss=0.01803, over 4703.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2096, pruned_loss=0.03107, over 972124.90 frames.], batch size: 15, lr: 1.58e-04 2022-05-08 00:24:28,834 INFO [train.py:715] (0/8) Epoch 14, batch 6800, loss[loss=0.1273, simple_loss=0.2079, pruned_loss=0.0233, over 4782.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2088, pruned_loss=0.03072, over 972781.74 frames.], batch size: 14, lr: 1.58e-04 2022-05-08 00:25:08,845 INFO [train.py:715] (0/8) Epoch 14, batch 6850, loss[loss=0.1555, simple_loss=0.2336, pruned_loss=0.03868, over 4904.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2101, pruned_loss=0.03114, over 973058.51 frames.], batch size: 19, lr: 1.58e-04 2022-05-08 00:25:48,263 INFO [train.py:715] (0/8) Epoch 14, batch 6900, loss[loss=0.1284, simple_loss=0.1937, pruned_loss=0.0316, over 4761.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2098, pruned_loss=0.03121, over 972397.39 frames.], batch size: 12, lr: 1.58e-04 2022-05-08 00:26:28,457 INFO [train.py:715] (0/8) Epoch 14, batch 6950, loss[loss=0.1219, simple_loss=0.1969, pruned_loss=0.02344, over 4851.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2091, pruned_loss=0.03059, over 972831.06 frames.], batch size: 20, lr: 1.58e-04 2022-05-08 00:27:08,561 INFO [train.py:715] (0/8) Epoch 14, batch 7000, loss[loss=0.1419, simple_loss=0.2134, pruned_loss=0.03515, over 4746.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2096, pruned_loss=0.03091, over 972352.28 frames.], batch size: 16, lr: 1.58e-04 2022-05-08 00:27:48,544 INFO [train.py:715] (0/8) Epoch 14, batch 7050, loss[loss=0.1357, simple_loss=0.2133, pruned_loss=0.02903, over 4974.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2091, pruned_loss=0.03077, over 971803.72 frames.], batch size: 14, lr: 1.58e-04 2022-05-08 00:28:27,873 INFO [train.py:715] (0/8) Epoch 14, batch 7100, loss[loss=0.1268, simple_loss=0.2032, pruned_loss=0.02521, over 4826.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2085, pruned_loss=0.03065, over 972665.23 frames.], batch size: 27, lr: 1.58e-04 2022-05-08 00:29:07,961 INFO [train.py:715] (0/8) Epoch 14, batch 7150, loss[loss=0.1191, simple_loss=0.2002, pruned_loss=0.01895, over 4864.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2084, pruned_loss=0.03056, over 972965.72 frames.], batch size: 16, lr: 1.58e-04 2022-05-08 00:29:48,163 INFO [train.py:715] (0/8) Epoch 14, batch 7200, loss[loss=0.1238, simple_loss=0.2018, pruned_loss=0.0229, over 4986.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2089, pruned_loss=0.03068, over 972740.52 frames.], batch size: 14, lr: 1.58e-04 2022-05-08 00:30:28,009 INFO [train.py:715] (0/8) Epoch 14, batch 7250, loss[loss=0.1201, simple_loss=0.1971, pruned_loss=0.02155, over 4830.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2088, pruned_loss=0.03039, over 973450.56 frames.], batch size: 30, lr: 1.58e-04 2022-05-08 00:31:08,138 INFO [train.py:715] (0/8) Epoch 14, batch 7300, loss[loss=0.1266, simple_loss=0.1954, pruned_loss=0.02896, over 4764.00 frames.], tot_loss[loss=0.134, simple_loss=0.2079, pruned_loss=0.03008, over 973218.05 frames.], batch size: 18, lr: 1.58e-04 2022-05-08 00:31:48,260 INFO [train.py:715] (0/8) Epoch 14, 
batch 7350, loss[loss=0.1381, simple_loss=0.2082, pruned_loss=0.03399, over 4855.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2078, pruned_loss=0.03019, over 972674.29 frames.], batch size: 30, lr: 1.58e-04 2022-05-08 00:32:28,614 INFO [train.py:715] (0/8) Epoch 14, batch 7400, loss[loss=0.1318, simple_loss=0.2023, pruned_loss=0.03063, over 4971.00 frames.], tot_loss[loss=0.135, simple_loss=0.2091, pruned_loss=0.03045, over 972976.03 frames.], batch size: 24, lr: 1.58e-04 2022-05-08 00:33:08,059 INFO [train.py:715] (0/8) Epoch 14, batch 7450, loss[loss=0.1231, simple_loss=0.198, pruned_loss=0.02413, over 4847.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2088, pruned_loss=0.03044, over 971921.59 frames.], batch size: 20, lr: 1.58e-04 2022-05-08 00:33:47,752 INFO [train.py:715] (0/8) Epoch 14, batch 7500, loss[loss=0.1374, simple_loss=0.2136, pruned_loss=0.03055, over 4838.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2084, pruned_loss=0.03032, over 971823.64 frames.], batch size: 30, lr: 1.58e-04 2022-05-08 00:34:27,406 INFO [train.py:715] (0/8) Epoch 14, batch 7550, loss[loss=0.1348, simple_loss=0.2005, pruned_loss=0.0346, over 4989.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2081, pruned_loss=0.0303, over 971353.33 frames.], batch size: 31, lr: 1.58e-04 2022-05-08 00:35:06,408 INFO [train.py:715] (0/8) Epoch 14, batch 7600, loss[loss=0.1393, simple_loss=0.2191, pruned_loss=0.02973, over 4972.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2078, pruned_loss=0.03019, over 971606.51 frames.], batch size: 24, lr: 1.58e-04 2022-05-08 00:35:46,361 INFO [train.py:715] (0/8) Epoch 14, batch 7650, loss[loss=0.1099, simple_loss=0.1875, pruned_loss=0.01609, over 4793.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2083, pruned_loss=0.03021, over 972629.67 frames.], batch size: 24, lr: 1.58e-04 2022-05-08 00:36:25,265 INFO [train.py:715] (0/8) Epoch 14, batch 7700, loss[loss=0.1431, simple_loss=0.2235, pruned_loss=0.03136, over 4915.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2083, pruned_loss=0.03024, over 972746.00 frames.], batch size: 17, lr: 1.58e-04 2022-05-08 00:37:05,577 INFO [train.py:715] (0/8) Epoch 14, batch 7750, loss[loss=0.1404, simple_loss=0.2203, pruned_loss=0.03027, over 4857.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2076, pruned_loss=0.02997, over 972931.57 frames.], batch size: 20, lr: 1.58e-04 2022-05-08 00:37:44,376 INFO [train.py:715] (0/8) Epoch 14, batch 7800, loss[loss=0.117, simple_loss=0.1826, pruned_loss=0.0257, over 4834.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2085, pruned_loss=0.0304, over 973079.48 frames.], batch size: 13, lr: 1.58e-04 2022-05-08 00:38:23,471 INFO [train.py:715] (0/8) Epoch 14, batch 7850, loss[loss=0.1587, simple_loss=0.2207, pruned_loss=0.04838, over 4850.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2092, pruned_loss=0.03062, over 972819.65 frames.], batch size: 30, lr: 1.58e-04 2022-05-08 00:39:03,274 INFO [train.py:715] (0/8) Epoch 14, batch 7900, loss[loss=0.1134, simple_loss=0.1848, pruned_loss=0.02101, over 4814.00 frames.], tot_loss[loss=0.135, simple_loss=0.209, pruned_loss=0.03052, over 973076.32 frames.], batch size: 12, lr: 1.58e-04 2022-05-08 00:39:42,081 INFO [train.py:715] (0/8) Epoch 14, batch 7950, loss[loss=0.1178, simple_loss=0.1973, pruned_loss=0.01916, over 4849.00 frames.], tot_loss[loss=0.135, simple_loss=0.209, pruned_loss=0.03049, over 972404.00 frames.], batch size: 32, lr: 1.58e-04 2022-05-08 00:40:21,687 INFO [train.py:715] (0/8) Epoch 14, batch 8000, loss[loss=0.111, 
simple_loss=0.1778, pruned_loss=0.02213, over 4746.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2081, pruned_loss=0.02991, over 971701.74 frames.], batch size: 16, lr: 1.58e-04 2022-05-08 00:41:00,524 INFO [train.py:715] (0/8) Epoch 14, batch 8050, loss[loss=0.1258, simple_loss=0.2071, pruned_loss=0.02219, over 4863.00 frames.], tot_loss[loss=0.1349, simple_loss=0.209, pruned_loss=0.03041, over 972254.72 frames.], batch size: 20, lr: 1.58e-04 2022-05-08 00:41:40,053 INFO [train.py:715] (0/8) Epoch 14, batch 8100, loss[loss=0.1869, simple_loss=0.2595, pruned_loss=0.05716, over 4912.00 frames.], tot_loss[loss=0.1352, simple_loss=0.209, pruned_loss=0.03065, over 972082.40 frames.], batch size: 39, lr: 1.58e-04 2022-05-08 00:42:18,789 INFO [train.py:715] (0/8) Epoch 14, batch 8150, loss[loss=0.1549, simple_loss=0.2251, pruned_loss=0.04237, over 4831.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2089, pruned_loss=0.03099, over 971340.78 frames.], batch size: 15, lr: 1.58e-04 2022-05-08 00:42:58,274 INFO [train.py:715] (0/8) Epoch 14, batch 8200, loss[loss=0.1353, simple_loss=0.2151, pruned_loss=0.02774, over 4778.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2085, pruned_loss=0.03049, over 971396.13 frames.], batch size: 18, lr: 1.58e-04 2022-05-08 00:43:37,713 INFO [train.py:715] (0/8) Epoch 14, batch 8250, loss[loss=0.1561, simple_loss=0.2397, pruned_loss=0.03628, over 4930.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2088, pruned_loss=0.03087, over 971967.70 frames.], batch size: 23, lr: 1.58e-04 2022-05-08 00:44:17,180 INFO [train.py:715] (0/8) Epoch 14, batch 8300, loss[loss=0.1209, simple_loss=0.1973, pruned_loss=0.02227, over 4821.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2085, pruned_loss=0.03095, over 972864.93 frames.], batch size: 27, lr: 1.58e-04 2022-05-08 00:44:56,127 INFO [train.py:715] (0/8) Epoch 14, batch 8350, loss[loss=0.1203, simple_loss=0.1946, pruned_loss=0.02303, over 4803.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2085, pruned_loss=0.03054, over 972500.84 frames.], batch size: 13, lr: 1.58e-04 2022-05-08 00:45:35,325 INFO [train.py:715] (0/8) Epoch 14, batch 8400, loss[loss=0.1458, simple_loss=0.2146, pruned_loss=0.03854, over 4825.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2086, pruned_loss=0.03043, over 972283.50 frames.], batch size: 15, lr: 1.58e-04 2022-05-08 00:46:14,803 INFO [train.py:715] (0/8) Epoch 14, batch 8450, loss[loss=0.136, simple_loss=0.2187, pruned_loss=0.02667, over 4862.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2096, pruned_loss=0.03106, over 972695.16 frames.], batch size: 20, lr: 1.58e-04 2022-05-08 00:46:53,358 INFO [train.py:715] (0/8) Epoch 14, batch 8500, loss[loss=0.1271, simple_loss=0.1974, pruned_loss=0.02844, over 4981.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2092, pruned_loss=0.03087, over 972716.61 frames.], batch size: 28, lr: 1.58e-04 2022-05-08 00:47:32,487 INFO [train.py:715] (0/8) Epoch 14, batch 8550, loss[loss=0.1472, simple_loss=0.2268, pruned_loss=0.03378, over 4783.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2093, pruned_loss=0.03103, over 972560.80 frames.], batch size: 17, lr: 1.58e-04 2022-05-08 00:48:13,442 INFO [train.py:715] (0/8) Epoch 14, batch 8600, loss[loss=0.1073, simple_loss=0.1776, pruned_loss=0.01855, over 4953.00 frames.], tot_loss[loss=0.135, simple_loss=0.2083, pruned_loss=0.03086, over 972412.04 frames.], batch size: 21, lr: 1.58e-04 2022-05-08 00:48:52,734 INFO [train.py:715] (0/8) Epoch 14, batch 8650, loss[loss=0.1334, simple_loss=0.2197, 
pruned_loss=0.02354, over 4911.00 frames.], tot_loss[loss=0.1348, simple_loss=0.208, pruned_loss=0.03074, over 972969.88 frames.], batch size: 23, lr: 1.58e-04 2022-05-08 00:49:07,211 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-496000.pt 2022-05-08 00:49:34,154 INFO [train.py:715] (0/8) Epoch 14, batch 8700, loss[loss=0.1315, simple_loss=0.2107, pruned_loss=0.0261, over 4932.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2085, pruned_loss=0.03113, over 973060.36 frames.], batch size: 29, lr: 1.58e-04 2022-05-08 00:50:13,523 INFO [train.py:715] (0/8) Epoch 14, batch 8750, loss[loss=0.1142, simple_loss=0.1879, pruned_loss=0.02019, over 4867.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2083, pruned_loss=0.03111, over 973207.20 frames.], batch size: 22, lr: 1.58e-04 2022-05-08 00:50:53,245 INFO [train.py:715] (0/8) Epoch 14, batch 8800, loss[loss=0.1245, simple_loss=0.2032, pruned_loss=0.02292, over 4985.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2079, pruned_loss=0.03086, over 973278.29 frames.], batch size: 25, lr: 1.58e-04 2022-05-08 00:51:32,823 INFO [train.py:715] (0/8) Epoch 14, batch 8850, loss[loss=0.127, simple_loss=0.1896, pruned_loss=0.03217, over 4836.00 frames.], tot_loss[loss=0.134, simple_loss=0.2073, pruned_loss=0.03034, over 972848.03 frames.], batch size: 13, lr: 1.58e-04 2022-05-08 00:52:13,343 INFO [train.py:715] (0/8) Epoch 14, batch 8900, loss[loss=0.1118, simple_loss=0.1914, pruned_loss=0.01606, over 4769.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2078, pruned_loss=0.03069, over 973163.29 frames.], batch size: 17, lr: 1.58e-04 2022-05-08 00:52:53,207 INFO [train.py:715] (0/8) Epoch 14, batch 8950, loss[loss=0.134, simple_loss=0.2096, pruned_loss=0.0292, over 4780.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2083, pruned_loss=0.031, over 972536.29 frames.], batch size: 17, lr: 1.58e-04 2022-05-08 00:53:33,017 INFO [train.py:715] (0/8) Epoch 14, batch 9000, loss[loss=0.129, simple_loss=0.1991, pruned_loss=0.02943, over 4699.00 frames.], tot_loss[loss=0.1359, simple_loss=0.209, pruned_loss=0.0314, over 973128.05 frames.], batch size: 15, lr: 1.58e-04 2022-05-08 00:53:33,018 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 00:53:47,941 INFO [train.py:742] (0/8) Epoch 14, validation: loss=0.1052, simple_loss=0.189, pruned_loss=0.01074, over 914524.00 frames. 
2022-05-08 00:54:27,479 INFO [train.py:715] (0/8) Epoch 14, batch 9050, loss[loss=0.143, simple_loss=0.2162, pruned_loss=0.03496, over 4883.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2098, pruned_loss=0.03179, over 972692.03 frames.], batch size: 16, lr: 1.58e-04 2022-05-08 00:55:07,805 INFO [train.py:715] (0/8) Epoch 14, batch 9100, loss[loss=0.1345, simple_loss=0.2212, pruned_loss=0.02389, over 4977.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2092, pruned_loss=0.03113, over 973265.92 frames.], batch size: 15, lr: 1.58e-04 2022-05-08 00:55:47,311 INFO [train.py:715] (0/8) Epoch 14, batch 9150, loss[loss=0.1448, simple_loss=0.2184, pruned_loss=0.03558, over 4912.00 frames.], tot_loss[loss=0.136, simple_loss=0.2093, pruned_loss=0.0313, over 973235.41 frames.], batch size: 19, lr: 1.58e-04 2022-05-08 00:56:27,173 INFO [train.py:715] (0/8) Epoch 14, batch 9200, loss[loss=0.1532, simple_loss=0.2241, pruned_loss=0.04115, over 4923.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2095, pruned_loss=0.03131, over 971415.88 frames.], batch size: 18, lr: 1.58e-04 2022-05-08 00:57:06,886 INFO [train.py:715] (0/8) Epoch 14, batch 9250, loss[loss=0.1116, simple_loss=0.1845, pruned_loss=0.01937, over 4754.00 frames.], tot_loss[loss=0.136, simple_loss=0.2098, pruned_loss=0.03113, over 971996.69 frames.], batch size: 14, lr: 1.58e-04 2022-05-08 00:57:46,605 INFO [train.py:715] (0/8) Epoch 14, batch 9300, loss[loss=0.1214, simple_loss=0.1979, pruned_loss=0.02242, over 4905.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2105, pruned_loss=0.0312, over 971417.02 frames.], batch size: 19, lr: 1.58e-04 2022-05-08 00:58:26,526 INFO [train.py:715] (0/8) Epoch 14, batch 9350, loss[loss=0.1103, simple_loss=0.1741, pruned_loss=0.02324, over 4956.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2099, pruned_loss=0.03096, over 970709.44 frames.], batch size: 14, lr: 1.58e-04 2022-05-08 00:59:06,673 INFO [train.py:715] (0/8) Epoch 14, batch 9400, loss[loss=0.1266, simple_loss=0.1957, pruned_loss=0.02877, over 4802.00 frames.], tot_loss[loss=0.136, simple_loss=0.21, pruned_loss=0.03097, over 971010.71 frames.], batch size: 24, lr: 1.58e-04 2022-05-08 00:59:46,293 INFO [train.py:715] (0/8) Epoch 14, batch 9450, loss[loss=0.1183, simple_loss=0.1948, pruned_loss=0.02095, over 4694.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2086, pruned_loss=0.0303, over 970467.27 frames.], batch size: 15, lr: 1.58e-04 2022-05-08 01:00:26,046 INFO [train.py:715] (0/8) Epoch 14, batch 9500, loss[loss=0.1212, simple_loss=0.1962, pruned_loss=0.02315, over 4957.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2081, pruned_loss=0.03007, over 970298.56 frames.], batch size: 24, lr: 1.58e-04 2022-05-08 01:01:05,840 INFO [train.py:715] (0/8) Epoch 14, batch 9550, loss[loss=0.1256, simple_loss=0.194, pruned_loss=0.02854, over 4925.00 frames.], tot_loss[loss=0.134, simple_loss=0.2081, pruned_loss=0.02991, over 971358.78 frames.], batch size: 29, lr: 1.58e-04 2022-05-08 01:01:45,995 INFO [train.py:715] (0/8) Epoch 14, batch 9600, loss[loss=0.1272, simple_loss=0.2015, pruned_loss=0.02649, over 4908.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2086, pruned_loss=0.03028, over 971071.30 frames.], batch size: 17, lr: 1.58e-04 2022-05-08 01:02:25,430 INFO [train.py:715] (0/8) Epoch 14, batch 9650, loss[loss=0.1354, simple_loss=0.2047, pruned_loss=0.03303, over 4913.00 frames.], tot_loss[loss=0.134, simple_loss=0.2077, pruned_loss=0.03013, over 971272.97 frames.], batch size: 17, lr: 1.58e-04 2022-05-08 01:03:05,453 INFO [train.py:715] 
(0/8) Epoch 14, batch 9700, loss[loss=0.1347, simple_loss=0.2102, pruned_loss=0.02953, over 4825.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2083, pruned_loss=0.03014, over 971786.04 frames.], batch size: 26, lr: 1.58e-04 2022-05-08 01:03:45,039 INFO [train.py:715] (0/8) Epoch 14, batch 9750, loss[loss=0.1238, simple_loss=0.2023, pruned_loss=0.02263, over 4793.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2089, pruned_loss=0.03045, over 972051.77 frames.], batch size: 21, lr: 1.58e-04 2022-05-08 01:04:25,341 INFO [train.py:715] (0/8) Epoch 14, batch 9800, loss[loss=0.1422, simple_loss=0.2197, pruned_loss=0.03233, over 4915.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2096, pruned_loss=0.03052, over 972475.19 frames.], batch size: 18, lr: 1.58e-04 2022-05-08 01:05:04,562 INFO [train.py:715] (0/8) Epoch 14, batch 9850, loss[loss=0.1313, simple_loss=0.1972, pruned_loss=0.03269, over 4793.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2084, pruned_loss=0.02997, over 973090.44 frames.], batch size: 14, lr: 1.58e-04 2022-05-08 01:05:44,640 INFO [train.py:715] (0/8) Epoch 14, batch 9900, loss[loss=0.1217, simple_loss=0.1962, pruned_loss=0.02363, over 4816.00 frames.], tot_loss[loss=0.1344, simple_loss=0.209, pruned_loss=0.02997, over 973005.40 frames.], batch size: 27, lr: 1.58e-04 2022-05-08 01:06:24,619 INFO [train.py:715] (0/8) Epoch 14, batch 9950, loss[loss=0.1155, simple_loss=0.1909, pruned_loss=0.02011, over 4967.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2092, pruned_loss=0.03025, over 972631.54 frames.], batch size: 28, lr: 1.58e-04 2022-05-08 01:07:03,944 INFO [train.py:715] (0/8) Epoch 14, batch 10000, loss[loss=0.1245, simple_loss=0.1997, pruned_loss=0.02462, over 4882.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2093, pruned_loss=0.03048, over 972685.43 frames.], batch size: 16, lr: 1.58e-04 2022-05-08 01:07:43,989 INFO [train.py:715] (0/8) Epoch 14, batch 10050, loss[loss=0.111, simple_loss=0.186, pruned_loss=0.01798, over 4778.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2088, pruned_loss=0.0297, over 973194.52 frames.], batch size: 18, lr: 1.58e-04 2022-05-08 01:08:23,511 INFO [train.py:715] (0/8) Epoch 14, batch 10100, loss[loss=0.1224, simple_loss=0.2007, pruned_loss=0.02211, over 4894.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2083, pruned_loss=0.02972, over 973047.08 frames.], batch size: 16, lr: 1.58e-04 2022-05-08 01:09:03,293 INFO [train.py:715] (0/8) Epoch 14, batch 10150, loss[loss=0.1456, simple_loss=0.2159, pruned_loss=0.03761, over 4916.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2079, pruned_loss=0.02945, over 972270.82 frames.], batch size: 18, lr: 1.58e-04 2022-05-08 01:09:42,485 INFO [train.py:715] (0/8) Epoch 14, batch 10200, loss[loss=0.1472, simple_loss=0.2195, pruned_loss=0.03745, over 4911.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2086, pruned_loss=0.02992, over 972768.54 frames.], batch size: 17, lr: 1.58e-04 2022-05-08 01:10:22,732 INFO [train.py:715] (0/8) Epoch 14, batch 10250, loss[loss=0.1243, simple_loss=0.2016, pruned_loss=0.0235, over 4797.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2087, pruned_loss=0.03012, over 972850.69 frames.], batch size: 21, lr: 1.58e-04 2022-05-08 01:11:02,453 INFO [train.py:715] (0/8) Epoch 14, batch 10300, loss[loss=0.1046, simple_loss=0.1838, pruned_loss=0.01273, over 4701.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2082, pruned_loss=0.02955, over 973502.46 frames.], batch size: 15, lr: 1.58e-04 2022-05-08 01:11:41,908 INFO [train.py:715] (0/8) Epoch 14, batch 10350, 
loss[loss=0.122, simple_loss=0.195, pruned_loss=0.02448, over 4888.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2079, pruned_loss=0.02958, over 972796.28 frames.], batch size: 22, lr: 1.58e-04 2022-05-08 01:12:22,105 INFO [train.py:715] (0/8) Epoch 14, batch 10400, loss[loss=0.1168, simple_loss=0.1899, pruned_loss=0.02187, over 4931.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2082, pruned_loss=0.02979, over 973291.92 frames.], batch size: 18, lr: 1.58e-04 2022-05-08 01:13:01,499 INFO [train.py:715] (0/8) Epoch 14, batch 10450, loss[loss=0.1651, simple_loss=0.2369, pruned_loss=0.04664, over 4874.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2093, pruned_loss=0.03026, over 973634.09 frames.], batch size: 16, lr: 1.58e-04 2022-05-08 01:13:41,725 INFO [train.py:715] (0/8) Epoch 14, batch 10500, loss[loss=0.1256, simple_loss=0.1935, pruned_loss=0.02887, over 4647.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2078, pruned_loss=0.02967, over 973474.90 frames.], batch size: 13, lr: 1.58e-04 2022-05-08 01:14:21,000 INFO [train.py:715] (0/8) Epoch 14, batch 10550, loss[loss=0.1233, simple_loss=0.1936, pruned_loss=0.02653, over 4913.00 frames.], tot_loss[loss=0.1335, simple_loss=0.208, pruned_loss=0.02947, over 972693.88 frames.], batch size: 17, lr: 1.58e-04 2022-05-08 01:15:01,262 INFO [train.py:715] (0/8) Epoch 14, batch 10600, loss[loss=0.1499, simple_loss=0.225, pruned_loss=0.03743, over 4972.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2084, pruned_loss=0.02997, over 973551.73 frames.], batch size: 15, lr: 1.58e-04 2022-05-08 01:15:40,586 INFO [train.py:715] (0/8) Epoch 14, batch 10650, loss[loss=0.1306, simple_loss=0.2144, pruned_loss=0.02338, over 4807.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2085, pruned_loss=0.03008, over 973481.82 frames.], batch size: 24, lr: 1.58e-04 2022-05-08 01:16:19,711 INFO [train.py:715] (0/8) Epoch 14, batch 10700, loss[loss=0.1269, simple_loss=0.2033, pruned_loss=0.0252, over 4807.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2087, pruned_loss=0.03016, over 972633.11 frames.], batch size: 21, lr: 1.58e-04 2022-05-08 01:16:58,897 INFO [train.py:715] (0/8) Epoch 14, batch 10750, loss[loss=0.1406, simple_loss=0.2133, pruned_loss=0.03393, over 4811.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2092, pruned_loss=0.03056, over 971669.84 frames.], batch size: 27, lr: 1.58e-04 2022-05-08 01:17:38,324 INFO [train.py:715] (0/8) Epoch 14, batch 10800, loss[loss=0.1239, simple_loss=0.1939, pruned_loss=0.02689, over 4959.00 frames.], tot_loss[loss=0.1361, simple_loss=0.21, pruned_loss=0.03108, over 971609.11 frames.], batch size: 15, lr: 1.58e-04 2022-05-08 01:18:17,863 INFO [train.py:715] (0/8) Epoch 14, batch 10850, loss[loss=0.1255, simple_loss=0.1954, pruned_loss=0.02779, over 4907.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2097, pruned_loss=0.03078, over 971529.77 frames.], batch size: 17, lr: 1.58e-04 2022-05-08 01:18:56,524 INFO [train.py:715] (0/8) Epoch 14, batch 10900, loss[loss=0.104, simple_loss=0.1708, pruned_loss=0.01864, over 4840.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2087, pruned_loss=0.03029, over 971642.76 frames.], batch size: 30, lr: 1.58e-04 2022-05-08 01:19:36,718 INFO [train.py:715] (0/8) Epoch 14, batch 10950, loss[loss=0.1492, simple_loss=0.2283, pruned_loss=0.03503, over 4947.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2086, pruned_loss=0.03025, over 971927.63 frames.], batch size: 24, lr: 1.58e-04 2022-05-08 01:20:17,496 INFO [train.py:715] (0/8) Epoch 14, batch 11000, loss[loss=0.1501, 
simple_loss=0.2219, pruned_loss=0.03918, over 4933.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2083, pruned_loss=0.03004, over 971737.06 frames.], batch size: 21, lr: 1.58e-04 2022-05-08 01:20:56,619 INFO [train.py:715] (0/8) Epoch 14, batch 11050, loss[loss=0.1589, simple_loss=0.2324, pruned_loss=0.04273, over 4954.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2094, pruned_loss=0.03075, over 971439.45 frames.], batch size: 39, lr: 1.57e-04 2022-05-08 01:21:37,656 INFO [train.py:715] (0/8) Epoch 14, batch 11100, loss[loss=0.1568, simple_loss=0.2192, pruned_loss=0.04717, over 4980.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2094, pruned_loss=0.03097, over 972313.39 frames.], batch size: 35, lr: 1.57e-04 2022-05-08 01:22:18,216 INFO [train.py:715] (0/8) Epoch 14, batch 11150, loss[loss=0.1728, simple_loss=0.2449, pruned_loss=0.05038, over 4937.00 frames.], tot_loss[loss=0.135, simple_loss=0.209, pruned_loss=0.03052, over 971455.96 frames.], batch size: 21, lr: 1.57e-04 2022-05-08 01:22:58,448 INFO [train.py:715] (0/8) Epoch 14, batch 11200, loss[loss=0.1351, simple_loss=0.2083, pruned_loss=0.03095, over 4947.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2097, pruned_loss=0.03093, over 971995.85 frames.], batch size: 29, lr: 1.57e-04 2022-05-08 01:23:37,883 INFO [train.py:715] (0/8) Epoch 14, batch 11250, loss[loss=0.1777, simple_loss=0.2568, pruned_loss=0.04926, over 4900.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2096, pruned_loss=0.03078, over 972477.30 frames.], batch size: 19, lr: 1.57e-04 2022-05-08 01:24:18,312 INFO [train.py:715] (0/8) Epoch 14, batch 11300, loss[loss=0.1209, simple_loss=0.1865, pruned_loss=0.02767, over 4771.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2084, pruned_loss=0.03014, over 972412.63 frames.], batch size: 17, lr: 1.57e-04 2022-05-08 01:24:58,564 INFO [train.py:715] (0/8) Epoch 14, batch 11350, loss[loss=0.1237, simple_loss=0.2029, pruned_loss=0.02226, over 4799.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2084, pruned_loss=0.03008, over 972382.80 frames.], batch size: 21, lr: 1.57e-04 2022-05-08 01:25:37,722 INFO [train.py:715] (0/8) Epoch 14, batch 11400, loss[loss=0.1305, simple_loss=0.2082, pruned_loss=0.0264, over 4831.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2083, pruned_loss=0.03029, over 971697.77 frames.], batch size: 15, lr: 1.57e-04 2022-05-08 01:26:18,733 INFO [train.py:715] (0/8) Epoch 14, batch 11450, loss[loss=0.1177, simple_loss=0.1952, pruned_loss=0.02011, over 4990.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2079, pruned_loss=0.02974, over 971038.51 frames.], batch size: 16, lr: 1.57e-04 2022-05-08 01:26:59,103 INFO [train.py:715] (0/8) Epoch 14, batch 11500, loss[loss=0.1253, simple_loss=0.2, pruned_loss=0.02529, over 4892.00 frames.], tot_loss[loss=0.134, simple_loss=0.2084, pruned_loss=0.02976, over 970810.90 frames.], batch size: 19, lr: 1.57e-04 2022-05-08 01:27:39,025 INFO [train.py:715] (0/8) Epoch 14, batch 11550, loss[loss=0.1233, simple_loss=0.1978, pruned_loss=0.02438, over 4769.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2074, pruned_loss=0.02979, over 971085.05 frames.], batch size: 14, lr: 1.57e-04 2022-05-08 01:28:18,471 INFO [train.py:715] (0/8) Epoch 14, batch 11600, loss[loss=0.1302, simple_loss=0.2045, pruned_loss=0.02795, over 4978.00 frames.], tot_loss[loss=0.1334, simple_loss=0.207, pruned_loss=0.02988, over 971328.29 frames.], batch size: 39, lr: 1.57e-04 2022-05-08 01:28:58,177 INFO [train.py:715] (0/8) Epoch 14, batch 11650, loss[loss=0.1094, simple_loss=0.187, 
pruned_loss=0.01594, over 4978.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2073, pruned_loss=0.03002, over 971199.08 frames.], batch size: 25, lr: 1.57e-04 2022-05-08 01:29:37,887 INFO [train.py:715] (0/8) Epoch 14, batch 11700, loss[loss=0.1358, simple_loss=0.2053, pruned_loss=0.03316, over 4852.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2071, pruned_loss=0.02993, over 971140.27 frames.], batch size: 34, lr: 1.57e-04 2022-05-08 01:30:17,156 INFO [train.py:715] (0/8) Epoch 14, batch 11750, loss[loss=0.1383, simple_loss=0.2143, pruned_loss=0.03114, over 4906.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2076, pruned_loss=0.03047, over 970563.82 frames.], batch size: 18, lr: 1.57e-04 2022-05-08 01:30:56,857 INFO [train.py:715] (0/8) Epoch 14, batch 11800, loss[loss=0.1288, simple_loss=0.2042, pruned_loss=0.02674, over 4922.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2084, pruned_loss=0.03103, over 971841.88 frames.], batch size: 17, lr: 1.57e-04 2022-05-08 01:31:35,979 INFO [train.py:715] (0/8) Epoch 14, batch 11850, loss[loss=0.1417, simple_loss=0.22, pruned_loss=0.03174, over 4819.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2084, pruned_loss=0.03118, over 972995.69 frames.], batch size: 27, lr: 1.57e-04 2022-05-08 01:32:14,891 INFO [train.py:715] (0/8) Epoch 14, batch 11900, loss[loss=0.1413, simple_loss=0.2202, pruned_loss=0.03126, over 4904.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2088, pruned_loss=0.03097, over 972690.47 frames.], batch size: 19, lr: 1.57e-04 2022-05-08 01:32:54,215 INFO [train.py:715] (0/8) Epoch 14, batch 11950, loss[loss=0.1511, simple_loss=0.2184, pruned_loss=0.04192, over 4851.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2091, pruned_loss=0.03082, over 972148.61 frames.], batch size: 32, lr: 1.57e-04 2022-05-08 01:33:33,582 INFO [train.py:715] (0/8) Epoch 14, batch 12000, loss[loss=0.1173, simple_loss=0.1925, pruned_loss=0.0211, over 4866.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2078, pruned_loss=0.03035, over 971834.90 frames.], batch size: 32, lr: 1.57e-04 2022-05-08 01:33:33,584 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 01:33:43,199 INFO [train.py:742] (0/8) Epoch 14, validation: loss=0.1051, simple_loss=0.1889, pruned_loss=0.01067, over 914524.00 frames. 
2022-05-08 01:34:22,501 INFO [train.py:715] (0/8) Epoch 14, batch 12050, loss[loss=0.1332, simple_loss=0.2029, pruned_loss=0.03178, over 4932.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2083, pruned_loss=0.03031, over 972451.69 frames.], batch size: 18, lr: 1.57e-04 2022-05-08 01:35:01,856 INFO [train.py:715] (0/8) Epoch 14, batch 12100, loss[loss=0.1401, simple_loss=0.2118, pruned_loss=0.03415, over 4961.00 frames.], tot_loss[loss=0.1354, simple_loss=0.209, pruned_loss=0.03088, over 972379.72 frames.], batch size: 39, lr: 1.57e-04 2022-05-08 01:35:41,276 INFO [train.py:715] (0/8) Epoch 14, batch 12150, loss[loss=0.1515, simple_loss=0.2272, pruned_loss=0.0379, over 4944.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2082, pruned_loss=0.03069, over 971772.94 frames.], batch size: 21, lr: 1.57e-04 2022-05-08 01:36:20,620 INFO [train.py:715] (0/8) Epoch 14, batch 12200, loss[loss=0.1414, simple_loss=0.2089, pruned_loss=0.03698, over 4947.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2079, pruned_loss=0.03031, over 971188.76 frames.], batch size: 21, lr: 1.57e-04 2022-05-08 01:37:00,489 INFO [train.py:715] (0/8) Epoch 14, batch 12250, loss[loss=0.114, simple_loss=0.1899, pruned_loss=0.01904, over 4846.00 frames.], tot_loss[loss=0.135, simple_loss=0.2086, pruned_loss=0.03069, over 971102.33 frames.], batch size: 13, lr: 1.57e-04 2022-05-08 01:37:39,678 INFO [train.py:715] (0/8) Epoch 14, batch 12300, loss[loss=0.1438, simple_loss=0.2198, pruned_loss=0.03395, over 4983.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2095, pruned_loss=0.03095, over 971072.50 frames.], batch size: 25, lr: 1.57e-04 2022-05-08 01:38:19,194 INFO [train.py:715] (0/8) Epoch 14, batch 12350, loss[loss=0.1147, simple_loss=0.1876, pruned_loss=0.02086, over 4906.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2096, pruned_loss=0.03093, over 971687.00 frames.], batch size: 22, lr: 1.57e-04 2022-05-08 01:38:58,799 INFO [train.py:715] (0/8) Epoch 14, batch 12400, loss[loss=0.1203, simple_loss=0.1954, pruned_loss=0.02259, over 4756.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2093, pruned_loss=0.03098, over 971548.45 frames.], batch size: 19, lr: 1.57e-04 2022-05-08 01:39:37,832 INFO [train.py:715] (0/8) Epoch 14, batch 12450, loss[loss=0.1401, simple_loss=0.214, pruned_loss=0.03309, over 4794.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2092, pruned_loss=0.03113, over 972473.60 frames.], batch size: 21, lr: 1.57e-04 2022-05-08 01:40:17,253 INFO [train.py:715] (0/8) Epoch 14, batch 12500, loss[loss=0.1287, simple_loss=0.2036, pruned_loss=0.02691, over 4663.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2084, pruned_loss=0.03049, over 971399.71 frames.], batch size: 13, lr: 1.57e-04 2022-05-08 01:40:57,007 INFO [train.py:715] (0/8) Epoch 14, batch 12550, loss[loss=0.1239, simple_loss=0.1957, pruned_loss=0.02602, over 4816.00 frames.], tot_loss[loss=0.1354, simple_loss=0.209, pruned_loss=0.03089, over 971385.99 frames.], batch size: 21, lr: 1.57e-04 2022-05-08 01:41:36,645 INFO [train.py:715] (0/8) Epoch 14, batch 12600, loss[loss=0.1227, simple_loss=0.2032, pruned_loss=0.02107, over 4927.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2086, pruned_loss=0.03051, over 970959.05 frames.], batch size: 29, lr: 1.57e-04 2022-05-08 01:42:15,596 INFO [train.py:715] (0/8) Epoch 14, batch 12650, loss[loss=0.1275, simple_loss=0.1979, pruned_loss=0.02854, over 4737.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2081, pruned_loss=0.03029, over 971898.89 frames.], batch size: 16, lr: 1.57e-04 2022-05-08 01:42:55,481 
INFO [train.py:715] (0/8) Epoch 14, batch 12700, loss[loss=0.16, simple_loss=0.2214, pruned_loss=0.04933, over 4858.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2076, pruned_loss=0.0303, over 972169.51 frames.], batch size: 32, lr: 1.57e-04 2022-05-08 01:43:35,533 INFO [train.py:715] (0/8) Epoch 14, batch 12750, loss[loss=0.144, simple_loss=0.2169, pruned_loss=0.03556, over 4972.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2085, pruned_loss=0.03048, over 972282.67 frames.], batch size: 35, lr: 1.57e-04 2022-05-08 01:44:15,514 INFO [train.py:715] (0/8) Epoch 14, batch 12800, loss[loss=0.1766, simple_loss=0.2515, pruned_loss=0.05086, over 4882.00 frames.], tot_loss[loss=0.135, simple_loss=0.2088, pruned_loss=0.03062, over 972618.90 frames.], batch size: 16, lr: 1.57e-04 2022-05-08 01:44:55,316 INFO [train.py:715] (0/8) Epoch 14, batch 12850, loss[loss=0.1121, simple_loss=0.1834, pruned_loss=0.02038, over 4895.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2091, pruned_loss=0.03075, over 973902.08 frames.], batch size: 16, lr: 1.57e-04 2022-05-08 01:45:35,518 INFO [train.py:715] (0/8) Epoch 14, batch 12900, loss[loss=0.1093, simple_loss=0.1874, pruned_loss=0.0156, over 4765.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2092, pruned_loss=0.03047, over 972890.11 frames.], batch size: 17, lr: 1.57e-04 2022-05-08 01:46:15,878 INFO [train.py:715] (0/8) Epoch 14, batch 12950, loss[loss=0.1749, simple_loss=0.2486, pruned_loss=0.05061, over 4964.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2094, pruned_loss=0.03062, over 973201.08 frames.], batch size: 35, lr: 1.57e-04 2022-05-08 01:46:55,832 INFO [train.py:715] (0/8) Epoch 14, batch 13000, loss[loss=0.1166, simple_loss=0.1921, pruned_loss=0.02055, over 4786.00 frames.], tot_loss[loss=0.1347, simple_loss=0.209, pruned_loss=0.03026, over 972763.71 frames.], batch size: 17, lr: 1.57e-04 2022-05-08 01:47:36,084 INFO [train.py:715] (0/8) Epoch 14, batch 13050, loss[loss=0.1466, simple_loss=0.2198, pruned_loss=0.03675, over 4878.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2091, pruned_loss=0.0308, over 972928.28 frames.], batch size: 16, lr: 1.57e-04 2022-05-08 01:48:16,096 INFO [train.py:715] (0/8) Epoch 14, batch 13100, loss[loss=0.1156, simple_loss=0.1853, pruned_loss=0.02293, over 4942.00 frames.], tot_loss[loss=0.135, simple_loss=0.2087, pruned_loss=0.03063, over 972724.81 frames.], batch size: 35, lr: 1.57e-04 2022-05-08 01:48:56,289 INFO [train.py:715] (0/8) Epoch 14, batch 13150, loss[loss=0.1269, simple_loss=0.199, pruned_loss=0.02737, over 4790.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2089, pruned_loss=0.03073, over 972645.61 frames.], batch size: 24, lr: 1.57e-04 2022-05-08 01:49:36,388 INFO [train.py:715] (0/8) Epoch 14, batch 13200, loss[loss=0.1402, simple_loss=0.2211, pruned_loss=0.02962, over 4803.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2095, pruned_loss=0.03077, over 972346.68 frames.], batch size: 21, lr: 1.57e-04 2022-05-08 01:50:16,589 INFO [train.py:715] (0/8) Epoch 14, batch 13250, loss[loss=0.1217, simple_loss=0.2032, pruned_loss=0.0201, over 4960.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2094, pruned_loss=0.03071, over 971742.76 frames.], batch size: 28, lr: 1.57e-04 2022-05-08 01:50:56,842 INFO [train.py:715] (0/8) Epoch 14, batch 13300, loss[loss=0.1367, simple_loss=0.2041, pruned_loss=0.03466, over 4762.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2097, pruned_loss=0.03105, over 972249.76 frames.], batch size: 16, lr: 1.57e-04 2022-05-08 01:51:36,428 INFO [train.py:715] (0/8) 
Epoch 14, batch 13350, loss[loss=0.1485, simple_loss=0.222, pruned_loss=0.03748, over 4807.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2094, pruned_loss=0.03098, over 972497.53 frames.], batch size: 25, lr: 1.57e-04 2022-05-08 01:52:15,923 INFO [train.py:715] (0/8) Epoch 14, batch 13400, loss[loss=0.1279, simple_loss=0.2027, pruned_loss=0.02658, over 4979.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2088, pruned_loss=0.03075, over 973040.33 frames.], batch size: 25, lr: 1.57e-04 2022-05-08 01:52:55,515 INFO [train.py:715] (0/8) Epoch 14, batch 13450, loss[loss=0.1313, simple_loss=0.2173, pruned_loss=0.02268, over 4836.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2089, pruned_loss=0.03041, over 972936.48 frames.], batch size: 26, lr: 1.57e-04 2022-05-08 01:53:35,076 INFO [train.py:715] (0/8) Epoch 14, batch 13500, loss[loss=0.1209, simple_loss=0.1918, pruned_loss=0.02498, over 4771.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2087, pruned_loss=0.03045, over 972399.19 frames.], batch size: 17, lr: 1.57e-04 2022-05-08 01:54:14,283 INFO [train.py:715] (0/8) Epoch 14, batch 13550, loss[loss=0.1494, simple_loss=0.2204, pruned_loss=0.03925, over 4983.00 frames.], tot_loss[loss=0.1347, simple_loss=0.209, pruned_loss=0.03021, over 971882.63 frames.], batch size: 35, lr: 1.57e-04 2022-05-08 01:54:53,672 INFO [train.py:715] (0/8) Epoch 14, batch 13600, loss[loss=0.1284, simple_loss=0.1983, pruned_loss=0.0293, over 4788.00 frames.], tot_loss[loss=0.135, simple_loss=0.2089, pruned_loss=0.0306, over 971997.90 frames.], batch size: 21, lr: 1.57e-04 2022-05-08 01:55:32,972 INFO [train.py:715] (0/8) Epoch 14, batch 13650, loss[loss=0.1528, simple_loss=0.2284, pruned_loss=0.03859, over 4943.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2093, pruned_loss=0.03059, over 972575.00 frames.], batch size: 21, lr: 1.57e-04 2022-05-08 01:56:12,539 INFO [train.py:715] (0/8) Epoch 14, batch 13700, loss[loss=0.1286, simple_loss=0.2096, pruned_loss=0.02382, over 4801.00 frames.], tot_loss[loss=0.1341, simple_loss=0.208, pruned_loss=0.03014, over 972987.21 frames.], batch size: 24, lr: 1.57e-04 2022-05-08 01:56:51,588 INFO [train.py:715] (0/8) Epoch 14, batch 13750, loss[loss=0.1291, simple_loss=0.1913, pruned_loss=0.03345, over 4765.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2075, pruned_loss=0.03003, over 973052.20 frames.], batch size: 16, lr: 1.57e-04 2022-05-08 01:57:30,923 INFO [train.py:715] (0/8) Epoch 14, batch 13800, loss[loss=0.1164, simple_loss=0.1853, pruned_loss=0.02376, over 4753.00 frames.], tot_loss[loss=0.134, simple_loss=0.2077, pruned_loss=0.03019, over 972905.74 frames.], batch size: 19, lr: 1.57e-04 2022-05-08 01:58:12,499 INFO [train.py:715] (0/8) Epoch 14, batch 13850, loss[loss=0.1468, simple_loss=0.2308, pruned_loss=0.03138, over 4966.00 frames.], tot_loss[loss=0.135, simple_loss=0.209, pruned_loss=0.03055, over 972789.10 frames.], batch size: 28, lr: 1.57e-04 2022-05-08 01:58:51,821 INFO [train.py:715] (0/8) Epoch 14, batch 13900, loss[loss=0.1378, simple_loss=0.2061, pruned_loss=0.03473, over 4976.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2099, pruned_loss=0.03099, over 971653.94 frames.], batch size: 15, lr: 1.57e-04 2022-05-08 01:59:31,444 INFO [train.py:715] (0/8) Epoch 14, batch 13950, loss[loss=0.1315, simple_loss=0.2133, pruned_loss=0.02488, over 4958.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2098, pruned_loss=0.03103, over 971683.34 frames.], batch size: 23, lr: 1.57e-04 2022-05-08 02:00:10,942 INFO [train.py:715] (0/8) Epoch 14, batch 14000, 
loss[loss=0.131, simple_loss=0.2103, pruned_loss=0.02581, over 4831.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2087, pruned_loss=0.0306, over 971951.97 frames.], batch size: 15, lr: 1.57e-04 2022-05-08 02:00:50,386 INFO [train.py:715] (0/8) Epoch 14, batch 14050, loss[loss=0.1415, simple_loss=0.2116, pruned_loss=0.03566, over 4910.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2092, pruned_loss=0.03107, over 971558.90 frames.], batch size: 17, lr: 1.57e-04 2022-05-08 02:01:30,047 INFO [train.py:715] (0/8) Epoch 14, batch 14100, loss[loss=0.1146, simple_loss=0.1889, pruned_loss=0.02017, over 4980.00 frames.], tot_loss[loss=0.1353, simple_loss=0.209, pruned_loss=0.03074, over 972017.78 frames.], batch size: 28, lr: 1.57e-04 2022-05-08 02:02:09,597 INFO [train.py:715] (0/8) Epoch 14, batch 14150, loss[loss=0.1518, simple_loss=0.2261, pruned_loss=0.03872, over 4883.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2098, pruned_loss=0.03118, over 971622.53 frames.], batch size: 39, lr: 1.57e-04 2022-05-08 02:02:49,096 INFO [train.py:715] (0/8) Epoch 14, batch 14200, loss[loss=0.1267, simple_loss=0.2002, pruned_loss=0.02664, over 4872.00 frames.], tot_loss[loss=0.136, simple_loss=0.2099, pruned_loss=0.0311, over 972718.41 frames.], batch size: 16, lr: 1.57e-04 2022-05-08 02:03:28,351 INFO [train.py:715] (0/8) Epoch 14, batch 14250, loss[loss=0.1007, simple_loss=0.1689, pruned_loss=0.01619, over 4647.00 frames.], tot_loss[loss=0.136, simple_loss=0.2098, pruned_loss=0.03112, over 972170.34 frames.], batch size: 13, lr: 1.57e-04 2022-05-08 02:04:08,207 INFO [train.py:715] (0/8) Epoch 14, batch 14300, loss[loss=0.135, simple_loss=0.2139, pruned_loss=0.02811, over 4788.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2099, pruned_loss=0.03116, over 972281.88 frames.], batch size: 17, lr: 1.57e-04 2022-05-08 02:04:47,398 INFO [train.py:715] (0/8) Epoch 14, batch 14350, loss[loss=0.1289, simple_loss=0.1984, pruned_loss=0.02969, over 4959.00 frames.], tot_loss[loss=0.136, simple_loss=0.2101, pruned_loss=0.03099, over 971953.77 frames.], batch size: 39, lr: 1.57e-04 2022-05-08 02:05:26,848 INFO [train.py:715] (0/8) Epoch 14, batch 14400, loss[loss=0.1622, simple_loss=0.2419, pruned_loss=0.04124, over 4962.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2095, pruned_loss=0.03077, over 971916.23 frames.], batch size: 15, lr: 1.57e-04 2022-05-08 02:06:06,346 INFO [train.py:715] (0/8) Epoch 14, batch 14450, loss[loss=0.1378, simple_loss=0.212, pruned_loss=0.03183, over 4979.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2094, pruned_loss=0.03057, over 972732.65 frames.], batch size: 24, lr: 1.57e-04 2022-05-08 02:06:45,902 INFO [train.py:715] (0/8) Epoch 14, batch 14500, loss[loss=0.16, simple_loss=0.2337, pruned_loss=0.04318, over 4688.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2097, pruned_loss=0.03058, over 972736.01 frames.], batch size: 15, lr: 1.57e-04 2022-05-08 02:07:25,177 INFO [train.py:715] (0/8) Epoch 14, batch 14550, loss[loss=0.1312, simple_loss=0.2054, pruned_loss=0.02845, over 4805.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2094, pruned_loss=0.0304, over 973032.76 frames.], batch size: 21, lr: 1.57e-04 2022-05-08 02:08:04,457 INFO [train.py:715] (0/8) Epoch 14, batch 14600, loss[loss=0.1163, simple_loss=0.1884, pruned_loss=0.02217, over 4978.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2099, pruned_loss=0.03081, over 973295.52 frames.], batch size: 25, lr: 1.57e-04 2022-05-08 02:08:44,679 INFO [train.py:715] (0/8) Epoch 14, batch 14650, loss[loss=0.1366, 
simple_loss=0.2125, pruned_loss=0.03035, over 4946.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2094, pruned_loss=0.03072, over 973319.86 frames.], batch size: 21, lr: 1.57e-04 2022-05-08 02:09:24,105 INFO [train.py:715] (0/8) Epoch 14, batch 14700, loss[loss=0.1311, simple_loss=0.1991, pruned_loss=0.03157, over 4982.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2085, pruned_loss=0.03002, over 974047.62 frames.], batch size: 28, lr: 1.57e-04 2022-05-08 02:10:03,918 INFO [train.py:715] (0/8) Epoch 14, batch 14750, loss[loss=0.1419, simple_loss=0.2145, pruned_loss=0.0347, over 4826.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2081, pruned_loss=0.02984, over 972871.85 frames.], batch size: 30, lr: 1.57e-04 2022-05-08 02:10:43,094 INFO [train.py:715] (0/8) Epoch 14, batch 14800, loss[loss=0.1346, simple_loss=0.2013, pruned_loss=0.03398, over 4906.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2078, pruned_loss=0.03, over 972803.93 frames.], batch size: 18, lr: 1.57e-04 2022-05-08 02:11:23,009 INFO [train.py:715] (0/8) Epoch 14, batch 14850, loss[loss=0.1546, simple_loss=0.2228, pruned_loss=0.04323, over 4909.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2077, pruned_loss=0.02999, over 972462.97 frames.], batch size: 23, lr: 1.57e-04 2022-05-08 02:12:02,548 INFO [train.py:715] (0/8) Epoch 14, batch 14900, loss[loss=0.1475, simple_loss=0.225, pruned_loss=0.03504, over 4942.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2081, pruned_loss=0.03019, over 972267.22 frames.], batch size: 21, lr: 1.57e-04 2022-05-08 02:12:41,995 INFO [train.py:715] (0/8) Epoch 14, batch 14950, loss[loss=0.1372, simple_loss=0.2141, pruned_loss=0.03022, over 4926.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2083, pruned_loss=0.03034, over 972997.45 frames.], batch size: 29, lr: 1.57e-04 2022-05-08 02:13:22,055 INFO [train.py:715] (0/8) Epoch 14, batch 15000, loss[loss=0.1329, simple_loss=0.204, pruned_loss=0.03092, over 4951.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2093, pruned_loss=0.03071, over 972594.36 frames.], batch size: 35, lr: 1.57e-04 2022-05-08 02:13:22,056 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 02:13:31,708 INFO [train.py:742] (0/8) Epoch 14, validation: loss=0.1052, simple_loss=0.1889, pruned_loss=0.01079, over 914524.00 frames. 
2022-05-08 02:14:12,552 INFO [train.py:715] (0/8) Epoch 14, batch 15050, loss[loss=0.1148, simple_loss=0.186, pruned_loss=0.02181, over 4792.00 frames.], tot_loss[loss=0.135, simple_loss=0.209, pruned_loss=0.03049, over 971742.11 frames.], batch size: 21, lr: 1.57e-04 2022-05-08 02:14:52,641 INFO [train.py:715] (0/8) Epoch 14, batch 15100, loss[loss=0.1243, simple_loss=0.1845, pruned_loss=0.032, over 4833.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2094, pruned_loss=0.03051, over 971681.59 frames.], batch size: 12, lr: 1.57e-04 2022-05-08 02:15:33,284 INFO [train.py:715] (0/8) Epoch 14, batch 15150, loss[loss=0.1124, simple_loss=0.1838, pruned_loss=0.02044, over 4811.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2092, pruned_loss=0.03052, over 971380.16 frames.], batch size: 12, lr: 1.57e-04 2022-05-08 02:16:13,414 INFO [train.py:715] (0/8) Epoch 14, batch 15200, loss[loss=0.1152, simple_loss=0.183, pruned_loss=0.02371, over 4845.00 frames.], tot_loss[loss=0.135, simple_loss=0.209, pruned_loss=0.03047, over 971454.85 frames.], batch size: 12, lr: 1.57e-04 2022-05-08 02:16:54,052 INFO [train.py:715] (0/8) Epoch 14, batch 15250, loss[loss=0.145, simple_loss=0.2275, pruned_loss=0.0313, over 4886.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2092, pruned_loss=0.0306, over 971498.41 frames.], batch size: 19, lr: 1.57e-04 2022-05-08 02:17:33,931 INFO [train.py:715] (0/8) Epoch 14, batch 15300, loss[loss=0.1564, simple_loss=0.2284, pruned_loss=0.04216, over 4972.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2097, pruned_loss=0.03079, over 971410.23 frames.], batch size: 24, lr: 1.57e-04 2022-05-08 02:18:13,472 INFO [train.py:715] (0/8) Epoch 14, batch 15350, loss[loss=0.1386, simple_loss=0.2076, pruned_loss=0.03481, over 4887.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2098, pruned_loss=0.03093, over 971907.85 frames.], batch size: 19, lr: 1.57e-04 2022-05-08 02:18:53,584 INFO [train.py:715] (0/8) Epoch 14, batch 15400, loss[loss=0.1236, simple_loss=0.1964, pruned_loss=0.02542, over 4840.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2097, pruned_loss=0.03043, over 971435.68 frames.], batch size: 15, lr: 1.57e-04 2022-05-08 02:19:32,972 INFO [train.py:715] (0/8) Epoch 14, batch 15450, loss[loss=0.147, simple_loss=0.2236, pruned_loss=0.03522, over 4914.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2102, pruned_loss=0.03108, over 971029.02 frames.], batch size: 29, lr: 1.57e-04 2022-05-08 02:20:12,214 INFO [train.py:715] (0/8) Epoch 14, batch 15500, loss[loss=0.131, simple_loss=0.2093, pruned_loss=0.02639, over 4762.00 frames.], tot_loss[loss=0.135, simple_loss=0.2092, pruned_loss=0.03038, over 971709.33 frames.], batch size: 16, lr: 1.57e-04 2022-05-08 02:20:51,549 INFO [train.py:715] (0/8) Epoch 14, batch 15550, loss[loss=0.1258, simple_loss=0.1974, pruned_loss=0.02712, over 4960.00 frames.], tot_loss[loss=0.136, simple_loss=0.2098, pruned_loss=0.03116, over 971400.19 frames.], batch size: 24, lr: 1.57e-04 2022-05-08 02:21:31,494 INFO [train.py:715] (0/8) Epoch 14, batch 15600, loss[loss=0.1693, simple_loss=0.2339, pruned_loss=0.05233, over 4884.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2096, pruned_loss=0.03094, over 972217.18 frames.], batch size: 38, lr: 1.57e-04 2022-05-08 02:22:10,937 INFO [train.py:715] (0/8) Epoch 14, batch 15650, loss[loss=0.1188, simple_loss=0.1933, pruned_loss=0.02209, over 4948.00 frames.], tot_loss[loss=0.1361, simple_loss=0.21, pruned_loss=0.03111, over 972990.35 frames.], batch size: 24, lr: 1.57e-04 2022-05-08 02:22:49,323 INFO 
[train.py:715] (0/8) Epoch 14, batch 15700, loss[loss=0.1562, simple_loss=0.2338, pruned_loss=0.03935, over 4949.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2104, pruned_loss=0.03128, over 972523.55 frames.], batch size: 29, lr: 1.57e-04 2022-05-08 02:23:29,530 INFO [train.py:715] (0/8) Epoch 14, batch 15750, loss[loss=0.1416, simple_loss=0.2162, pruned_loss=0.03344, over 4986.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2101, pruned_loss=0.03113, over 973508.07 frames.], batch size: 25, lr: 1.57e-04 2022-05-08 02:24:09,057 INFO [train.py:715] (0/8) Epoch 14, batch 15800, loss[loss=0.1458, simple_loss=0.2184, pruned_loss=0.03661, over 4984.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2097, pruned_loss=0.03089, over 972923.77 frames.], batch size: 14, lr: 1.57e-04 2022-05-08 02:24:48,297 INFO [train.py:715] (0/8) Epoch 14, batch 15850, loss[loss=0.1664, simple_loss=0.233, pruned_loss=0.04989, over 4934.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2095, pruned_loss=0.03082, over 973202.46 frames.], batch size: 23, lr: 1.57e-04 2022-05-08 02:25:27,578 INFO [train.py:715] (0/8) Epoch 14, batch 15900, loss[loss=0.1491, simple_loss=0.2176, pruned_loss=0.04024, over 4799.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2092, pruned_loss=0.03073, over 972816.01 frames.], batch size: 21, lr: 1.57e-04 2022-05-08 02:26:07,619 INFO [train.py:715] (0/8) Epoch 14, batch 15950, loss[loss=0.1012, simple_loss=0.1763, pruned_loss=0.01309, over 4880.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2097, pruned_loss=0.03066, over 974083.75 frames.], batch size: 22, lr: 1.57e-04 2022-05-08 02:26:47,033 INFO [train.py:715] (0/8) Epoch 14, batch 16000, loss[loss=0.1524, simple_loss=0.2229, pruned_loss=0.04093, over 4934.00 frames.], tot_loss[loss=0.135, simple_loss=0.2091, pruned_loss=0.03041, over 974001.45 frames.], batch size: 23, lr: 1.57e-04 2022-05-08 02:27:25,752 INFO [train.py:715] (0/8) Epoch 14, batch 16050, loss[loss=0.1288, simple_loss=0.2011, pruned_loss=0.02828, over 4924.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2087, pruned_loss=0.03033, over 973497.23 frames.], batch size: 29, lr: 1.57e-04 2022-05-08 02:28:04,466 INFO [train.py:715] (0/8) Epoch 14, batch 16100, loss[loss=0.1758, simple_loss=0.2424, pruned_loss=0.05454, over 4823.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2088, pruned_loss=0.03032, over 973093.37 frames.], batch size: 15, lr: 1.57e-04 2022-05-08 02:28:42,590 INFO [train.py:715] (0/8) Epoch 14, batch 16150, loss[loss=0.1394, simple_loss=0.2184, pruned_loss=0.03016, over 4867.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2087, pruned_loss=0.03032, over 973582.13 frames.], batch size: 32, lr: 1.57e-04 2022-05-08 02:29:20,833 INFO [train.py:715] (0/8) Epoch 14, batch 16200, loss[loss=0.1518, simple_loss=0.2255, pruned_loss=0.03905, over 4823.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2087, pruned_loss=0.0301, over 974470.62 frames.], batch size: 13, lr: 1.57e-04 2022-05-08 02:29:59,441 INFO [train.py:715] (0/8) Epoch 14, batch 16250, loss[loss=0.1053, simple_loss=0.1788, pruned_loss=0.01594, over 4831.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2095, pruned_loss=0.03029, over 973741.92 frames.], batch size: 13, lr: 1.57e-04 2022-05-08 02:30:38,584 INFO [train.py:715] (0/8) Epoch 14, batch 16300, loss[loss=0.1337, simple_loss=0.212, pruned_loss=0.02763, over 4944.00 frames.], tot_loss[loss=0.1358, simple_loss=0.21, pruned_loss=0.03081, over 972832.00 frames.], batch size: 29, lr: 1.57e-04 2022-05-08 02:31:16,533 INFO [train.py:715] (0/8) 
Epoch 14, batch 16350, loss[loss=0.1728, simple_loss=0.2304, pruned_loss=0.05764, over 4764.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2101, pruned_loss=0.03113, over 973356.10 frames.], batch size: 19, lr: 1.57e-04 2022-05-08 02:31:55,705 INFO [train.py:715] (0/8) Epoch 14, batch 16400, loss[loss=0.1592, simple_loss=0.2358, pruned_loss=0.0413, over 4877.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2106, pruned_loss=0.03135, over 972236.37 frames.], batch size: 22, lr: 1.57e-04 2022-05-08 02:32:35,419 INFO [train.py:715] (0/8) Epoch 14, batch 16450, loss[loss=0.1269, simple_loss=0.2018, pruned_loss=0.02606, over 4958.00 frames.], tot_loss[loss=0.1364, simple_loss=0.21, pruned_loss=0.03138, over 972412.93 frames.], batch size: 24, lr: 1.57e-04 2022-05-08 02:33:14,859 INFO [train.py:715] (0/8) Epoch 14, batch 16500, loss[loss=0.1189, simple_loss=0.193, pruned_loss=0.02238, over 4664.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2105, pruned_loss=0.03162, over 972381.92 frames.], batch size: 13, lr: 1.57e-04 2022-05-08 02:33:53,747 INFO [train.py:715] (0/8) Epoch 14, batch 16550, loss[loss=0.1202, simple_loss=0.2028, pruned_loss=0.01876, over 4980.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2104, pruned_loss=0.03137, over 972410.97 frames.], batch size: 28, lr: 1.57e-04 2022-05-08 02:34:34,125 INFO [train.py:715] (0/8) Epoch 14, batch 16600, loss[loss=0.131, simple_loss=0.206, pruned_loss=0.028, over 4883.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2105, pruned_loss=0.03144, over 972313.15 frames.], batch size: 12, lr: 1.57e-04 2022-05-08 02:35:13,398 INFO [train.py:715] (0/8) Epoch 14, batch 16650, loss[loss=0.1214, simple_loss=0.1945, pruned_loss=0.02418, over 4855.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2096, pruned_loss=0.03099, over 972089.68 frames.], batch size: 32, lr: 1.57e-04 2022-05-08 02:35:27,164 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-504000.pt 2022-05-08 02:35:55,028 INFO [train.py:715] (0/8) Epoch 14, batch 16700, loss[loss=0.1169, simple_loss=0.1916, pruned_loss=0.02107, over 4901.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2092, pruned_loss=0.0309, over 971411.20 frames.], batch size: 18, lr: 1.57e-04 2022-05-08 02:36:34,909 INFO [train.py:715] (0/8) Epoch 14, batch 16750, loss[loss=0.1151, simple_loss=0.1937, pruned_loss=0.01821, over 4851.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2093, pruned_loss=0.03081, over 971537.72 frames.], batch size: 30, lr: 1.57e-04 2022-05-08 02:37:15,260 INFO [train.py:715] (0/8) Epoch 14, batch 16800, loss[loss=0.1304, simple_loss=0.209, pruned_loss=0.02586, over 4988.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2092, pruned_loss=0.03088, over 972131.60 frames.], batch size: 39, lr: 1.57e-04 2022-05-08 02:37:54,772 INFO [train.py:715] (0/8) Epoch 14, batch 16850, loss[loss=0.156, simple_loss=0.2384, pruned_loss=0.03682, over 4685.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2093, pruned_loss=0.03083, over 972061.04 frames.], batch size: 15, lr: 1.57e-04 2022-05-08 02:38:34,397 INFO [train.py:715] (0/8) Epoch 14, batch 16900, loss[loss=0.1345, simple_loss=0.2208, pruned_loss=0.02412, over 4831.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2091, pruned_loss=0.03077, over 971367.07 frames.], batch size: 26, lr: 1.57e-04 2022-05-08 02:39:15,366 INFO [train.py:715] (0/8) Epoch 14, batch 16950, loss[loss=0.1296, simple_loss=0.2031, pruned_loss=0.02801, over 4968.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2092, pruned_loss=0.03096, 
over 971533.66 frames.], batch size: 28, lr: 1.57e-04 2022-05-08 02:39:56,925 INFO [train.py:715] (0/8) Epoch 14, batch 17000, loss[loss=0.1385, simple_loss=0.2181, pruned_loss=0.02949, over 4929.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2098, pruned_loss=0.03136, over 971816.41 frames.], batch size: 18, lr: 1.57e-04 2022-05-08 02:40:37,816 INFO [train.py:715] (0/8) Epoch 14, batch 17050, loss[loss=0.112, simple_loss=0.1914, pruned_loss=0.0163, over 4754.00 frames.], tot_loss[loss=0.1352, simple_loss=0.209, pruned_loss=0.03068, over 972132.96 frames.], batch size: 19, lr: 1.57e-04 2022-05-08 02:41:18,916 INFO [train.py:715] (0/8) Epoch 14, batch 17100, loss[loss=0.1393, simple_loss=0.2078, pruned_loss=0.03545, over 4946.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2091, pruned_loss=0.03103, over 972464.93 frames.], batch size: 14, lr: 1.57e-04 2022-05-08 02:42:01,003 INFO [train.py:715] (0/8) Epoch 14, batch 17150, loss[loss=0.1635, simple_loss=0.2355, pruned_loss=0.0457, over 4775.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2089, pruned_loss=0.03078, over 972658.62 frames.], batch size: 17, lr: 1.57e-04 2022-05-08 02:42:41,745 INFO [train.py:715] (0/8) Epoch 14, batch 17200, loss[loss=0.1405, simple_loss=0.2296, pruned_loss=0.02566, over 4959.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2091, pruned_loss=0.03091, over 972376.80 frames.], batch size: 24, lr: 1.57e-04 2022-05-08 02:43:22,728 INFO [train.py:715] (0/8) Epoch 14, batch 17250, loss[loss=0.1639, simple_loss=0.2225, pruned_loss=0.05267, over 4975.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2091, pruned_loss=0.03089, over 972664.21 frames.], batch size: 14, lr: 1.57e-04 2022-05-08 02:44:04,207 INFO [train.py:715] (0/8) Epoch 14, batch 17300, loss[loss=0.1165, simple_loss=0.1937, pruned_loss=0.0197, over 4796.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2099, pruned_loss=0.03093, over 972542.12 frames.], batch size: 21, lr: 1.57e-04 2022-05-08 02:44:45,868 INFO [train.py:715] (0/8) Epoch 14, batch 17350, loss[loss=0.1178, simple_loss=0.2008, pruned_loss=0.01742, over 4795.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2093, pruned_loss=0.03057, over 972322.00 frames.], batch size: 24, lr: 1.57e-04 2022-05-08 02:45:26,238 INFO [train.py:715] (0/8) Epoch 14, batch 17400, loss[loss=0.1395, simple_loss=0.2104, pruned_loss=0.03432, over 4775.00 frames.], tot_loss[loss=0.135, simple_loss=0.209, pruned_loss=0.03048, over 971676.00 frames.], batch size: 18, lr: 1.56e-04 2022-05-08 02:46:07,490 INFO [train.py:715] (0/8) Epoch 14, batch 17450, loss[loss=0.1291, simple_loss=0.1966, pruned_loss=0.03077, over 4689.00 frames.], tot_loss[loss=0.136, simple_loss=0.2099, pruned_loss=0.031, over 972385.86 frames.], batch size: 15, lr: 1.56e-04 2022-05-08 02:46:49,067 INFO [train.py:715] (0/8) Epoch 14, batch 17500, loss[loss=0.1395, simple_loss=0.2122, pruned_loss=0.03338, over 4931.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2102, pruned_loss=0.03116, over 971894.89 frames.], batch size: 18, lr: 1.56e-04 2022-05-08 02:47:29,777 INFO [train.py:715] (0/8) Epoch 14, batch 17550, loss[loss=0.1362, simple_loss=0.2194, pruned_loss=0.02652, over 4919.00 frames.], tot_loss[loss=0.136, simple_loss=0.2098, pruned_loss=0.03108, over 973136.66 frames.], batch size: 17, lr: 1.56e-04 2022-05-08 02:48:10,327 INFO [train.py:715] (0/8) Epoch 14, batch 17600, loss[loss=0.1446, simple_loss=0.2134, pruned_loss=0.03791, over 4756.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2101, pruned_loss=0.03129, over 972322.89 frames.], 
batch size: 16, lr: 1.56e-04 2022-05-08 02:48:52,025 INFO [train.py:715] (0/8) Epoch 14, batch 17650, loss[loss=0.1072, simple_loss=0.1696, pruned_loss=0.02244, over 4835.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2091, pruned_loss=0.03117, over 972395.55 frames.], batch size: 12, lr: 1.56e-04 2022-05-08 02:49:33,167 INFO [train.py:715] (0/8) Epoch 14, batch 17700, loss[loss=0.109, simple_loss=0.1717, pruned_loss=0.02313, over 4940.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2093, pruned_loss=0.03157, over 972417.37 frames.], batch size: 18, lr: 1.56e-04 2022-05-08 02:50:13,650 INFO [train.py:715] (0/8) Epoch 14, batch 17750, loss[loss=0.1544, simple_loss=0.2182, pruned_loss=0.04528, over 4969.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2092, pruned_loss=0.03157, over 973054.68 frames.], batch size: 15, lr: 1.56e-04 2022-05-08 02:50:55,021 INFO [train.py:715] (0/8) Epoch 14, batch 17800, loss[loss=0.146, simple_loss=0.2156, pruned_loss=0.03817, over 4975.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2102, pruned_loss=0.03205, over 972860.93 frames.], batch size: 14, lr: 1.56e-04 2022-05-08 02:51:35,970 INFO [train.py:715] (0/8) Epoch 14, batch 17850, loss[loss=0.1301, simple_loss=0.195, pruned_loss=0.03255, over 4964.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2098, pruned_loss=0.03181, over 972008.07 frames.], batch size: 14, lr: 1.56e-04 2022-05-08 02:52:16,730 INFO [train.py:715] (0/8) Epoch 14, batch 17900, loss[loss=0.1539, simple_loss=0.2334, pruned_loss=0.03717, over 4905.00 frames.], tot_loss[loss=0.1356, simple_loss=0.209, pruned_loss=0.03112, over 971918.74 frames.], batch size: 19, lr: 1.56e-04 2022-05-08 02:52:57,221 INFO [train.py:715] (0/8) Epoch 14, batch 17950, loss[loss=0.1466, simple_loss=0.2228, pruned_loss=0.03518, over 4847.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2091, pruned_loss=0.03089, over 971542.86 frames.], batch size: 30, lr: 1.56e-04 2022-05-08 02:53:38,609 INFO [train.py:715] (0/8) Epoch 14, batch 18000, loss[loss=0.1387, simple_loss=0.2162, pruned_loss=0.03065, over 4714.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2082, pruned_loss=0.03043, over 970959.47 frames.], batch size: 15, lr: 1.56e-04 2022-05-08 02:53:38,611 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 02:53:48,449 INFO [train.py:742] (0/8) Epoch 14, validation: loss=0.1052, simple_loss=0.1889, pruned_loss=0.01075, over 914524.00 frames. 
2022-05-08 02:54:29,838 INFO [train.py:715] (0/8) Epoch 14, batch 18050, loss[loss=0.1327, simple_loss=0.2009, pruned_loss=0.03226, over 4848.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2088, pruned_loss=0.03102, over 971386.90 frames.], batch size: 20, lr: 1.56e-04 2022-05-08 02:55:10,988 INFO [train.py:715] (0/8) Epoch 14, batch 18100, loss[loss=0.1623, simple_loss=0.2484, pruned_loss=0.03813, over 4785.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2095, pruned_loss=0.03118, over 971343.37 frames.], batch size: 17, lr: 1.56e-04 2022-05-08 02:55:52,581 INFO [train.py:715] (0/8) Epoch 14, batch 18150, loss[loss=0.1364, simple_loss=0.2133, pruned_loss=0.02978, over 4833.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2096, pruned_loss=0.03107, over 971488.42 frames.], batch size: 26, lr: 1.56e-04 2022-05-08 02:56:33,500 INFO [train.py:715] (0/8) Epoch 14, batch 18200, loss[loss=0.1632, simple_loss=0.228, pruned_loss=0.04918, over 4774.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2097, pruned_loss=0.03097, over 971602.07 frames.], batch size: 18, lr: 1.56e-04 2022-05-08 02:57:15,449 INFO [train.py:715] (0/8) Epoch 14, batch 18250, loss[loss=0.1347, simple_loss=0.2115, pruned_loss=0.02896, over 4911.00 frames.], tot_loss[loss=0.1348, simple_loss=0.209, pruned_loss=0.03027, over 972339.76 frames.], batch size: 29, lr: 1.56e-04 2022-05-08 02:57:56,894 INFO [train.py:715] (0/8) Epoch 14, batch 18300, loss[loss=0.1375, simple_loss=0.2158, pruned_loss=0.02957, over 4777.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2086, pruned_loss=0.03008, over 972061.64 frames.], batch size: 17, lr: 1.56e-04 2022-05-08 02:58:36,498 INFO [train.py:715] (0/8) Epoch 14, batch 18350, loss[loss=0.1123, simple_loss=0.183, pruned_loss=0.02084, over 4883.00 frames.], tot_loss[loss=0.1338, simple_loss=0.208, pruned_loss=0.02984, over 971762.41 frames.], batch size: 16, lr: 1.56e-04 2022-05-08 02:59:17,361 INFO [train.py:715] (0/8) Epoch 14, batch 18400, loss[loss=0.112, simple_loss=0.1793, pruned_loss=0.02236, over 4851.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2082, pruned_loss=0.03039, over 971667.50 frames.], batch size: 13, lr: 1.56e-04 2022-05-08 02:59:57,994 INFO [train.py:715] (0/8) Epoch 14, batch 18450, loss[loss=0.1356, simple_loss=0.2008, pruned_loss=0.03516, over 4772.00 frames.], tot_loss[loss=0.1352, simple_loss=0.209, pruned_loss=0.03065, over 970358.42 frames.], batch size: 19, lr: 1.56e-04 2022-05-08 03:00:38,231 INFO [train.py:715] (0/8) Epoch 14, batch 18500, loss[loss=0.151, simple_loss=0.2302, pruned_loss=0.03585, over 4821.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2092, pruned_loss=0.03059, over 970382.64 frames.], batch size: 25, lr: 1.56e-04 2022-05-08 03:01:18,699 INFO [train.py:715] (0/8) Epoch 14, batch 18550, loss[loss=0.1225, simple_loss=0.1994, pruned_loss=0.02281, over 4950.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2084, pruned_loss=0.03049, over 971000.48 frames.], batch size: 21, lr: 1.56e-04 2022-05-08 03:01:59,557 INFO [train.py:715] (0/8) Epoch 14, batch 18600, loss[loss=0.1149, simple_loss=0.1839, pruned_loss=0.02298, over 4824.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2085, pruned_loss=0.03027, over 970937.72 frames.], batch size: 13, lr: 1.56e-04 2022-05-08 03:02:39,861 INFO [train.py:715] (0/8) Epoch 14, batch 18650, loss[loss=0.1264, simple_loss=0.2048, pruned_loss=0.02398, over 4812.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2091, pruned_loss=0.03059, over 971280.91 frames.], batch size: 27, lr: 1.56e-04 2022-05-08 03:03:20,560 
INFO [train.py:715] (0/8) Epoch 14, batch 18700, loss[loss=0.1395, simple_loss=0.2162, pruned_loss=0.03135, over 4989.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2091, pruned_loss=0.03039, over 971171.71 frames.], batch size: 26, lr: 1.56e-04 2022-05-08 03:04:01,153 INFO [train.py:715] (0/8) Epoch 14, batch 18750, loss[loss=0.1227, simple_loss=0.1888, pruned_loss=0.02828, over 4896.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2088, pruned_loss=0.03021, over 972109.45 frames.], batch size: 19, lr: 1.56e-04 2022-05-08 03:04:41,112 INFO [train.py:715] (0/8) Epoch 14, batch 18800, loss[loss=0.1635, simple_loss=0.2365, pruned_loss=0.04522, over 4745.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2086, pruned_loss=0.03053, over 972031.85 frames.], batch size: 16, lr: 1.56e-04 2022-05-08 03:05:21,091 INFO [train.py:715] (0/8) Epoch 14, batch 18850, loss[loss=0.1312, simple_loss=0.2042, pruned_loss=0.02907, over 4841.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2085, pruned_loss=0.03039, over 972649.35 frames.], batch size: 20, lr: 1.56e-04 2022-05-08 03:06:01,827 INFO [train.py:715] (0/8) Epoch 14, batch 18900, loss[loss=0.119, simple_loss=0.1897, pruned_loss=0.02415, over 4764.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2089, pruned_loss=0.03029, over 972226.63 frames.], batch size: 19, lr: 1.56e-04 2022-05-08 03:06:42,898 INFO [train.py:715] (0/8) Epoch 14, batch 18950, loss[loss=0.1274, simple_loss=0.1941, pruned_loss=0.03036, over 4980.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2095, pruned_loss=0.03096, over 971520.56 frames.], batch size: 15, lr: 1.56e-04 2022-05-08 03:07:23,139 INFO [train.py:715] (0/8) Epoch 14, batch 19000, loss[loss=0.1574, simple_loss=0.2397, pruned_loss=0.03761, over 4933.00 frames.], tot_loss[loss=0.135, simple_loss=0.2089, pruned_loss=0.03057, over 971883.27 frames.], batch size: 21, lr: 1.56e-04 2022-05-08 03:08:04,062 INFO [train.py:715] (0/8) Epoch 14, batch 19050, loss[loss=0.15, simple_loss=0.2209, pruned_loss=0.03957, over 4742.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2095, pruned_loss=0.03099, over 971564.16 frames.], batch size: 16, lr: 1.56e-04 2022-05-08 03:08:45,079 INFO [train.py:715] (0/8) Epoch 14, batch 19100, loss[loss=0.1222, simple_loss=0.2009, pruned_loss=0.02173, over 4865.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2089, pruned_loss=0.03072, over 971609.12 frames.], batch size: 16, lr: 1.56e-04 2022-05-08 03:09:25,463 INFO [train.py:715] (0/8) Epoch 14, batch 19150, loss[loss=0.1529, simple_loss=0.2249, pruned_loss=0.0405, over 4981.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2085, pruned_loss=0.0305, over 972503.93 frames.], batch size: 15, lr: 1.56e-04 2022-05-08 03:10:04,868 INFO [train.py:715] (0/8) Epoch 14, batch 19200, loss[loss=0.125, simple_loss=0.2089, pruned_loss=0.02055, over 4811.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2084, pruned_loss=0.03026, over 972581.07 frames.], batch size: 25, lr: 1.56e-04 2022-05-08 03:10:45,983 INFO [train.py:715] (0/8) Epoch 14, batch 19250, loss[loss=0.1166, simple_loss=0.1899, pruned_loss=0.0217, over 4832.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2081, pruned_loss=0.0301, over 973615.45 frames.], batch size: 27, lr: 1.56e-04 2022-05-08 03:11:26,902 INFO [train.py:715] (0/8) Epoch 14, batch 19300, loss[loss=0.1089, simple_loss=0.173, pruned_loss=0.02243, over 4841.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2091, pruned_loss=0.03071, over 972440.38 frames.], batch size: 13, lr: 1.56e-04 2022-05-08 03:12:06,949 INFO [train.py:715] (0/8) 
Epoch 14, batch 19350, loss[loss=0.1342, simple_loss=0.2012, pruned_loss=0.03357, over 4852.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2092, pruned_loss=0.03078, over 972024.91 frames.], batch size: 32, lr: 1.56e-04 2022-05-08 03:12:47,192 INFO [train.py:715] (0/8) Epoch 14, batch 19400, loss[loss=0.1096, simple_loss=0.1843, pruned_loss=0.01742, over 4785.00 frames.], tot_loss[loss=0.135, simple_loss=0.209, pruned_loss=0.03052, over 972632.30 frames.], batch size: 12, lr: 1.56e-04 2022-05-08 03:13:28,643 INFO [train.py:715] (0/8) Epoch 14, batch 19450, loss[loss=0.1358, simple_loss=0.2085, pruned_loss=0.03151, over 4943.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2087, pruned_loss=0.03068, over 972820.36 frames.], batch size: 21, lr: 1.56e-04 2022-05-08 03:14:08,956 INFO [train.py:715] (0/8) Epoch 14, batch 19500, loss[loss=0.1121, simple_loss=0.1872, pruned_loss=0.01846, over 4866.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2088, pruned_loss=0.0305, over 972209.71 frames.], batch size: 32, lr: 1.56e-04 2022-05-08 03:14:49,604 INFO [train.py:715] (0/8) Epoch 14, batch 19550, loss[loss=0.1395, simple_loss=0.2203, pruned_loss=0.02937, over 4811.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2088, pruned_loss=0.03033, over 972142.45 frames.], batch size: 15, lr: 1.56e-04 2022-05-08 03:15:30,062 INFO [train.py:715] (0/8) Epoch 14, batch 19600, loss[loss=0.1419, simple_loss=0.2155, pruned_loss=0.03418, over 4957.00 frames.], tot_loss[loss=0.135, simple_loss=0.2092, pruned_loss=0.03039, over 972502.44 frames.], batch size: 24, lr: 1.56e-04 2022-05-08 03:16:10,995 INFO [train.py:715] (0/8) Epoch 14, batch 19650, loss[loss=0.177, simple_loss=0.2419, pruned_loss=0.05606, over 4932.00 frames.], tot_loss[loss=0.1351, simple_loss=0.209, pruned_loss=0.03056, over 971770.59 frames.], batch size: 18, lr: 1.56e-04 2022-05-08 03:16:51,969 INFO [train.py:715] (0/8) Epoch 14, batch 19700, loss[loss=0.1528, simple_loss=0.2257, pruned_loss=0.03994, over 4938.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2093, pruned_loss=0.0306, over 971793.37 frames.], batch size: 35, lr: 1.56e-04 2022-05-08 03:17:32,727 INFO [train.py:715] (0/8) Epoch 14, batch 19750, loss[loss=0.143, simple_loss=0.2264, pruned_loss=0.02983, over 4781.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2098, pruned_loss=0.03098, over 971375.44 frames.], batch size: 18, lr: 1.56e-04 2022-05-08 03:18:13,648 INFO [train.py:715] (0/8) Epoch 14, batch 19800, loss[loss=0.13, simple_loss=0.2012, pruned_loss=0.02937, over 4792.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2102, pruned_loss=0.03139, over 971695.20 frames.], batch size: 18, lr: 1.56e-04 2022-05-08 03:18:54,279 INFO [train.py:715] (0/8) Epoch 14, batch 19850, loss[loss=0.1233, simple_loss=0.2062, pruned_loss=0.02021, over 4922.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2099, pruned_loss=0.03093, over 971614.37 frames.], batch size: 39, lr: 1.56e-04 2022-05-08 03:19:35,277 INFO [train.py:715] (0/8) Epoch 14, batch 19900, loss[loss=0.1285, simple_loss=0.1978, pruned_loss=0.02956, over 4790.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2094, pruned_loss=0.0311, over 972543.73 frames.], batch size: 14, lr: 1.56e-04 2022-05-08 03:20:15,387 INFO [train.py:715] (0/8) Epoch 14, batch 19950, loss[loss=0.1237, simple_loss=0.201, pruned_loss=0.02321, over 4817.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2095, pruned_loss=0.03115, over 972625.21 frames.], batch size: 26, lr: 1.56e-04 2022-05-08 03:20:55,686 INFO [train.py:715] (0/8) Epoch 14, batch 20000, 
loss[loss=0.1318, simple_loss=0.2027, pruned_loss=0.03047, over 4800.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2091, pruned_loss=0.03127, over 973617.33 frames.], batch size: 21, lr: 1.56e-04 2022-05-08 03:21:35,496 INFO [train.py:715] (0/8) Epoch 14, batch 20050, loss[loss=0.1442, simple_loss=0.2207, pruned_loss=0.03385, over 4830.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2098, pruned_loss=0.03129, over 973714.84 frames.], batch size: 15, lr: 1.56e-04 2022-05-08 03:22:15,340 INFO [train.py:715] (0/8) Epoch 14, batch 20100, loss[loss=0.1193, simple_loss=0.1911, pruned_loss=0.02376, over 4777.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2091, pruned_loss=0.03069, over 973180.77 frames.], batch size: 14, lr: 1.56e-04 2022-05-08 03:22:55,783 INFO [train.py:715] (0/8) Epoch 14, batch 20150, loss[loss=0.1115, simple_loss=0.1944, pruned_loss=0.0143, over 4975.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2096, pruned_loss=0.03084, over 973768.80 frames.], batch size: 28, lr: 1.56e-04 2022-05-08 03:23:35,886 INFO [train.py:715] (0/8) Epoch 14, batch 20200, loss[loss=0.1205, simple_loss=0.1934, pruned_loss=0.02377, over 4793.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2086, pruned_loss=0.03018, over 972751.35 frames.], batch size: 12, lr: 1.56e-04 2022-05-08 03:24:16,359 INFO [train.py:715] (0/8) Epoch 14, batch 20250, loss[loss=0.1386, simple_loss=0.2052, pruned_loss=0.036, over 4867.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2088, pruned_loss=0.0303, over 972715.62 frames.], batch size: 32, lr: 1.56e-04 2022-05-08 03:24:56,503 INFO [train.py:715] (0/8) Epoch 14, batch 20300, loss[loss=0.1136, simple_loss=0.1896, pruned_loss=0.01879, over 4846.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2097, pruned_loss=0.03053, over 972508.55 frames.], batch size: 20, lr: 1.56e-04 2022-05-08 03:25:37,277 INFO [train.py:715] (0/8) Epoch 14, batch 20350, loss[loss=0.1055, simple_loss=0.1653, pruned_loss=0.02284, over 4855.00 frames.], tot_loss[loss=0.135, simple_loss=0.2091, pruned_loss=0.03048, over 971771.12 frames.], batch size: 13, lr: 1.56e-04 2022-05-08 03:26:17,611 INFO [train.py:715] (0/8) Epoch 14, batch 20400, loss[loss=0.1455, simple_loss=0.2249, pruned_loss=0.03307, over 4824.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2105, pruned_loss=0.03137, over 971761.77 frames.], batch size: 26, lr: 1.56e-04 2022-05-08 03:26:58,061 INFO [train.py:715] (0/8) Epoch 14, batch 20450, loss[loss=0.1289, simple_loss=0.2001, pruned_loss=0.02886, over 4837.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2103, pruned_loss=0.03138, over 971991.36 frames.], batch size: 13, lr: 1.56e-04 2022-05-08 03:27:39,216 INFO [train.py:715] (0/8) Epoch 14, batch 20500, loss[loss=0.1456, simple_loss=0.2182, pruned_loss=0.03656, over 4971.00 frames.], tot_loss[loss=0.1362, simple_loss=0.21, pruned_loss=0.0312, over 971750.20 frames.], batch size: 28, lr: 1.56e-04 2022-05-08 03:28:19,568 INFO [train.py:715] (0/8) Epoch 14, batch 20550, loss[loss=0.1197, simple_loss=0.1925, pruned_loss=0.02347, over 4787.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2102, pruned_loss=0.03124, over 971449.64 frames.], batch size: 12, lr: 1.56e-04 2022-05-08 03:29:00,478 INFO [train.py:715] (0/8) Epoch 14, batch 20600, loss[loss=0.1435, simple_loss=0.226, pruned_loss=0.03052, over 4775.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2102, pruned_loss=0.03124, over 970423.03 frames.], batch size: 18, lr: 1.56e-04 2022-05-08 03:29:41,270 INFO [train.py:715] (0/8) Epoch 14, batch 20650, loss[loss=0.1243, 
simple_loss=0.1961, pruned_loss=0.02623, over 4843.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2098, pruned_loss=0.03105, over 970756.99 frames.], batch size: 15, lr: 1.56e-04 2022-05-08 03:30:22,918 INFO [train.py:715] (0/8) Epoch 14, batch 20700, loss[loss=0.1411, simple_loss=0.2219, pruned_loss=0.03021, over 4864.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2087, pruned_loss=0.03097, over 972581.14 frames.], batch size: 32, lr: 1.56e-04 2022-05-08 03:31:03,260 INFO [train.py:715] (0/8) Epoch 14, batch 20750, loss[loss=0.1202, simple_loss=0.1902, pruned_loss=0.02509, over 4968.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2091, pruned_loss=0.03078, over 972558.12 frames.], batch size: 35, lr: 1.56e-04 2022-05-08 03:31:43,465 INFO [train.py:715] (0/8) Epoch 14, batch 20800, loss[loss=0.1386, simple_loss=0.2169, pruned_loss=0.03013, over 4959.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2085, pruned_loss=0.03023, over 972163.64 frames.], batch size: 24, lr: 1.56e-04 2022-05-08 03:32:24,157 INFO [train.py:715] (0/8) Epoch 14, batch 20850, loss[loss=0.1706, simple_loss=0.2369, pruned_loss=0.05218, over 4776.00 frames.], tot_loss[loss=0.135, simple_loss=0.2091, pruned_loss=0.03047, over 971832.36 frames.], batch size: 14, lr: 1.56e-04 2022-05-08 03:33:04,692 INFO [train.py:715] (0/8) Epoch 14, batch 20900, loss[loss=0.1238, simple_loss=0.1931, pruned_loss=0.02724, over 4913.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2089, pruned_loss=0.03045, over 971941.96 frames.], batch size: 18, lr: 1.56e-04 2022-05-08 03:33:45,369 INFO [train.py:715] (0/8) Epoch 14, batch 20950, loss[loss=0.1309, simple_loss=0.2016, pruned_loss=0.03012, over 4931.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2097, pruned_loss=0.03085, over 972212.43 frames.], batch size: 21, lr: 1.56e-04 2022-05-08 03:34:25,919 INFO [train.py:715] (0/8) Epoch 14, batch 21000, loss[loss=0.1603, simple_loss=0.2356, pruned_loss=0.04248, over 4694.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2098, pruned_loss=0.03103, over 972104.13 frames.], batch size: 15, lr: 1.56e-04 2022-05-08 03:34:25,920 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 03:34:37,000 INFO [train.py:742] (0/8) Epoch 14, validation: loss=0.1051, simple_loss=0.1889, pruned_loss=0.0107, over 914524.00 frames. 
2022-05-08 03:35:17,903 INFO [train.py:715] (0/8) Epoch 14, batch 21050, loss[loss=0.1621, simple_loss=0.239, pruned_loss=0.04259, over 4795.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2107, pruned_loss=0.03157, over 972044.42 frames.], batch size: 17, lr: 1.56e-04 2022-05-08 03:35:58,603 INFO [train.py:715] (0/8) Epoch 14, batch 21100, loss[loss=0.1359, simple_loss=0.2091, pruned_loss=0.03132, over 4933.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2103, pruned_loss=0.03149, over 973034.11 frames.], batch size: 29, lr: 1.56e-04 2022-05-08 03:36:39,418 INFO [train.py:715] (0/8) Epoch 14, batch 21150, loss[loss=0.1324, simple_loss=0.2096, pruned_loss=0.0276, over 4699.00 frames.], tot_loss[loss=0.1361, simple_loss=0.21, pruned_loss=0.03113, over 972503.06 frames.], batch size: 15, lr: 1.56e-04 2022-05-08 03:37:18,910 INFO [train.py:715] (0/8) Epoch 14, batch 21200, loss[loss=0.1053, simple_loss=0.1754, pruned_loss=0.01757, over 4797.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2097, pruned_loss=0.03084, over 971921.85 frames.], batch size: 12, lr: 1.56e-04 2022-05-08 03:37:59,330 INFO [train.py:715] (0/8) Epoch 14, batch 21250, loss[loss=0.1693, simple_loss=0.2377, pruned_loss=0.05044, over 4850.00 frames.], tot_loss[loss=0.135, simple_loss=0.209, pruned_loss=0.03049, over 972541.15 frames.], batch size: 34, lr: 1.56e-04 2022-05-08 03:38:39,031 INFO [train.py:715] (0/8) Epoch 14, batch 21300, loss[loss=0.1277, simple_loss=0.2019, pruned_loss=0.02675, over 4933.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2093, pruned_loss=0.0306, over 972489.80 frames.], batch size: 23, lr: 1.56e-04 2022-05-08 03:39:17,954 INFO [train.py:715] (0/8) Epoch 14, batch 21350, loss[loss=0.1637, simple_loss=0.2417, pruned_loss=0.04287, over 4986.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2091, pruned_loss=0.03093, over 972817.11 frames.], batch size: 24, lr: 1.56e-04 2022-05-08 03:39:58,398 INFO [train.py:715] (0/8) Epoch 14, batch 21400, loss[loss=0.165, simple_loss=0.232, pruned_loss=0.04902, over 4779.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2088, pruned_loss=0.03051, over 973140.11 frames.], batch size: 17, lr: 1.56e-04 2022-05-08 03:40:38,641 INFO [train.py:715] (0/8) Epoch 14, batch 21450, loss[loss=0.1588, simple_loss=0.2345, pruned_loss=0.0416, over 4747.00 frames.], tot_loss[loss=0.135, simple_loss=0.2086, pruned_loss=0.03068, over 972655.82 frames.], batch size: 16, lr: 1.56e-04 2022-05-08 03:41:18,059 INFO [train.py:715] (0/8) Epoch 14, batch 21500, loss[loss=0.1365, simple_loss=0.2086, pruned_loss=0.03215, over 4855.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2092, pruned_loss=0.03105, over 973106.48 frames.], batch size: 32, lr: 1.56e-04 2022-05-08 03:41:57,082 INFO [train.py:715] (0/8) Epoch 14, batch 21550, loss[loss=0.1262, simple_loss=0.2099, pruned_loss=0.02119, over 4826.00 frames.], tot_loss[loss=0.135, simple_loss=0.2084, pruned_loss=0.03079, over 973345.36 frames.], batch size: 15, lr: 1.56e-04 2022-05-08 03:42:37,072 INFO [train.py:715] (0/8) Epoch 14, batch 21600, loss[loss=0.1178, simple_loss=0.1881, pruned_loss=0.02372, over 4790.00 frames.], tot_loss[loss=0.1354, simple_loss=0.209, pruned_loss=0.03084, over 973580.14 frames.], batch size: 17, lr: 1.56e-04 2022-05-08 03:43:16,846 INFO [train.py:715] (0/8) Epoch 14, batch 21650, loss[loss=0.1779, simple_loss=0.2443, pruned_loss=0.05574, over 4992.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2094, pruned_loss=0.03102, over 974129.07 frames.], batch size: 16, lr: 1.56e-04 2022-05-08 03:43:55,944 INFO 
[train.py:715] (0/8) Epoch 14, batch 21700, loss[loss=0.15, simple_loss=0.2299, pruned_loss=0.03512, over 4697.00 frames.], tot_loss[loss=0.1371, simple_loss=0.2107, pruned_loss=0.03172, over 972773.50 frames.], batch size: 15, lr: 1.56e-04 2022-05-08 03:44:36,356 INFO [train.py:715] (0/8) Epoch 14, batch 21750, loss[loss=0.1289, simple_loss=0.2039, pruned_loss=0.02693, over 4810.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2099, pruned_loss=0.03135, over 972701.76 frames.], batch size: 25, lr: 1.56e-04 2022-05-08 03:45:16,755 INFO [train.py:715] (0/8) Epoch 14, batch 21800, loss[loss=0.1769, simple_loss=0.237, pruned_loss=0.05837, over 4827.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2099, pruned_loss=0.03163, over 972375.67 frames.], batch size: 27, lr: 1.56e-04 2022-05-08 03:45:56,148 INFO [train.py:715] (0/8) Epoch 14, batch 21850, loss[loss=0.1364, simple_loss=0.2081, pruned_loss=0.0324, over 4909.00 frames.], tot_loss[loss=0.1369, simple_loss=0.2103, pruned_loss=0.03172, over 973045.72 frames.], batch size: 17, lr: 1.56e-04 2022-05-08 03:46:35,754 INFO [train.py:715] (0/8) Epoch 14, batch 21900, loss[loss=0.1311, simple_loss=0.21, pruned_loss=0.02605, over 4932.00 frames.], tot_loss[loss=0.1364, simple_loss=0.21, pruned_loss=0.03137, over 972728.36 frames.], batch size: 23, lr: 1.56e-04 2022-05-08 03:47:16,026 INFO [train.py:715] (0/8) Epoch 14, batch 21950, loss[loss=0.122, simple_loss=0.2016, pruned_loss=0.02118, over 4912.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2103, pruned_loss=0.03161, over 972780.69 frames.], batch size: 23, lr: 1.56e-04 2022-05-08 03:47:55,288 INFO [train.py:715] (0/8) Epoch 14, batch 22000, loss[loss=0.1619, simple_loss=0.2381, pruned_loss=0.04279, over 4741.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2096, pruned_loss=0.03114, over 972910.91 frames.], batch size: 16, lr: 1.56e-04 2022-05-08 03:48:34,003 INFO [train.py:715] (0/8) Epoch 14, batch 22050, loss[loss=0.1575, simple_loss=0.2415, pruned_loss=0.03673, over 4886.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2089, pruned_loss=0.03091, over 972790.88 frames.], batch size: 22, lr: 1.56e-04 2022-05-08 03:49:14,108 INFO [train.py:715] (0/8) Epoch 14, batch 22100, loss[loss=0.1338, simple_loss=0.2024, pruned_loss=0.03261, over 4804.00 frames.], tot_loss[loss=0.135, simple_loss=0.2083, pruned_loss=0.03085, over 973793.59 frames.], batch size: 15, lr: 1.56e-04 2022-05-08 03:49:53,814 INFO [train.py:715] (0/8) Epoch 14, batch 22150, loss[loss=0.1043, simple_loss=0.1867, pruned_loss=0.01099, over 4829.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2082, pruned_loss=0.03083, over 972915.10 frames.], batch size: 27, lr: 1.56e-04 2022-05-08 03:50:32,845 INFO [train.py:715] (0/8) Epoch 14, batch 22200, loss[loss=0.1642, simple_loss=0.2288, pruned_loss=0.04978, over 4838.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2085, pruned_loss=0.03091, over 973076.12 frames.], batch size: 30, lr: 1.56e-04 2022-05-08 03:51:12,590 INFO [train.py:715] (0/8) Epoch 14, batch 22250, loss[loss=0.1223, simple_loss=0.203, pruned_loss=0.02087, over 4895.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2076, pruned_loss=0.03043, over 972793.55 frames.], batch size: 22, lr: 1.56e-04 2022-05-08 03:51:52,764 INFO [train.py:715] (0/8) Epoch 14, batch 22300, loss[loss=0.1208, simple_loss=0.205, pruned_loss=0.01831, over 4963.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2081, pruned_loss=0.03064, over 972781.41 frames.], batch size: 21, lr: 1.56e-04 2022-05-08 03:52:32,247 INFO [train.py:715] (0/8) Epoch 14, 
batch 22350, loss[loss=0.1169, simple_loss=0.189, pruned_loss=0.02242, over 4988.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2076, pruned_loss=0.03026, over 972159.64 frames.], batch size: 14, lr: 1.56e-04 2022-05-08 03:53:11,403 INFO [train.py:715] (0/8) Epoch 14, batch 22400, loss[loss=0.1212, simple_loss=0.1971, pruned_loss=0.0226, over 4921.00 frames.], tot_loss[loss=0.134, simple_loss=0.2079, pruned_loss=0.03009, over 971663.28 frames.], batch size: 23, lr: 1.56e-04 2022-05-08 03:53:51,751 INFO [train.py:715] (0/8) Epoch 14, batch 22450, loss[loss=0.1152, simple_loss=0.199, pruned_loss=0.01567, over 4792.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2082, pruned_loss=0.03036, over 972403.46 frames.], batch size: 18, lr: 1.56e-04 2022-05-08 03:54:31,161 INFO [train.py:715] (0/8) Epoch 14, batch 22500, loss[loss=0.1562, simple_loss=0.2283, pruned_loss=0.0421, over 4851.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2083, pruned_loss=0.03049, over 972454.73 frames.], batch size: 32, lr: 1.56e-04 2022-05-08 03:55:10,457 INFO [train.py:715] (0/8) Epoch 14, batch 22550, loss[loss=0.1292, simple_loss=0.2066, pruned_loss=0.02592, over 4933.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2082, pruned_loss=0.03004, over 972683.67 frames.], batch size: 35, lr: 1.56e-04 2022-05-08 03:55:50,813 INFO [train.py:715] (0/8) Epoch 14, batch 22600, loss[loss=0.1346, simple_loss=0.2075, pruned_loss=0.03088, over 4932.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2076, pruned_loss=0.02972, over 972060.97 frames.], batch size: 29, lr: 1.56e-04 2022-05-08 03:56:31,693 INFO [train.py:715] (0/8) Epoch 14, batch 22650, loss[loss=0.1505, simple_loss=0.2195, pruned_loss=0.04076, over 4971.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2078, pruned_loss=0.0296, over 971610.11 frames.], batch size: 15, lr: 1.56e-04 2022-05-08 03:57:11,534 INFO [train.py:715] (0/8) Epoch 14, batch 22700, loss[loss=0.133, simple_loss=0.207, pruned_loss=0.02948, over 4779.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2073, pruned_loss=0.02917, over 971170.33 frames.], batch size: 17, lr: 1.56e-04 2022-05-08 03:57:50,668 INFO [train.py:715] (0/8) Epoch 14, batch 22750, loss[loss=0.122, simple_loss=0.1871, pruned_loss=0.02852, over 4777.00 frames.], tot_loss[loss=0.134, simple_loss=0.2083, pruned_loss=0.02982, over 971237.11 frames.], batch size: 12, lr: 1.56e-04 2022-05-08 03:58:32,000 INFO [train.py:715] (0/8) Epoch 14, batch 22800, loss[loss=0.1252, simple_loss=0.2032, pruned_loss=0.02359, over 4970.00 frames.], tot_loss[loss=0.1349, simple_loss=0.209, pruned_loss=0.03041, over 971714.29 frames.], batch size: 39, lr: 1.56e-04 2022-05-08 03:59:12,916 INFO [train.py:715] (0/8) Epoch 14, batch 22850, loss[loss=0.1403, simple_loss=0.2129, pruned_loss=0.03391, over 4842.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2091, pruned_loss=0.03062, over 970158.28 frames.], batch size: 30, lr: 1.56e-04 2022-05-08 03:59:53,206 INFO [train.py:715] (0/8) Epoch 14, batch 22900, loss[loss=0.1918, simple_loss=0.2704, pruned_loss=0.05665, over 4929.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2097, pruned_loss=0.03122, over 970384.35 frames.], batch size: 18, lr: 1.56e-04 2022-05-08 04:00:33,078 INFO [train.py:715] (0/8) Epoch 14, batch 22950, loss[loss=0.1281, simple_loss=0.2073, pruned_loss=0.02447, over 4926.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2102, pruned_loss=0.03145, over 970706.96 frames.], batch size: 23, lr: 1.56e-04 2022-05-08 04:01:13,580 INFO [train.py:715] (0/8) Epoch 14, batch 23000, 
loss[loss=0.1076, simple_loss=0.1896, pruned_loss=0.01278, over 4799.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2103, pruned_loss=0.0313, over 970296.67 frames.], batch size: 12, lr: 1.56e-04 2022-05-08 04:01:53,100 INFO [train.py:715] (0/8) Epoch 14, batch 23050, loss[loss=0.1539, simple_loss=0.2365, pruned_loss=0.03568, over 4815.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2108, pruned_loss=0.03137, over 971208.83 frames.], batch size: 27, lr: 1.56e-04 2022-05-08 04:02:32,414 INFO [train.py:715] (0/8) Epoch 14, batch 23100, loss[loss=0.1307, simple_loss=0.2028, pruned_loss=0.02928, over 4847.00 frames.], tot_loss[loss=0.1371, simple_loss=0.211, pruned_loss=0.03159, over 971720.81 frames.], batch size: 30, lr: 1.56e-04 2022-05-08 04:03:13,066 INFO [train.py:715] (0/8) Epoch 14, batch 23150, loss[loss=0.09902, simple_loss=0.1752, pruned_loss=0.0114, over 4794.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2103, pruned_loss=0.03156, over 971772.18 frames.], batch size: 12, lr: 1.56e-04 2022-05-08 04:03:54,322 INFO [train.py:715] (0/8) Epoch 14, batch 23200, loss[loss=0.1376, simple_loss=0.2154, pruned_loss=0.02985, over 4983.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2096, pruned_loss=0.03144, over 972425.41 frames.], batch size: 28, lr: 1.56e-04 2022-05-08 04:04:33,065 INFO [train.py:715] (0/8) Epoch 14, batch 23250, loss[loss=0.1265, simple_loss=0.2012, pruned_loss=0.02592, over 4911.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2086, pruned_loss=0.03079, over 971968.80 frames.], batch size: 18, lr: 1.56e-04 2022-05-08 04:05:13,472 INFO [train.py:715] (0/8) Epoch 14, batch 23300, loss[loss=0.1174, simple_loss=0.1942, pruned_loss=0.02026, over 4813.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2086, pruned_loss=0.03094, over 971288.24 frames.], batch size: 13, lr: 1.56e-04 2022-05-08 04:05:54,160 INFO [train.py:715] (0/8) Epoch 14, batch 23350, loss[loss=0.1197, simple_loss=0.1991, pruned_loss=0.02013, over 4962.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2084, pruned_loss=0.03064, over 971500.74 frames.], batch size: 14, lr: 1.56e-04 2022-05-08 04:06:33,751 INFO [train.py:715] (0/8) Epoch 14, batch 23400, loss[loss=0.13, simple_loss=0.1982, pruned_loss=0.03091, over 4932.00 frames.], tot_loss[loss=0.134, simple_loss=0.2078, pruned_loss=0.03011, over 972488.04 frames.], batch size: 29, lr: 1.56e-04 2022-05-08 04:07:12,803 INFO [train.py:715] (0/8) Epoch 14, batch 23450, loss[loss=0.1347, simple_loss=0.2123, pruned_loss=0.0286, over 4875.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2078, pruned_loss=0.03029, over 972358.12 frames.], batch size: 16, lr: 1.56e-04 2022-05-08 04:07:53,393 INFO [train.py:715] (0/8) Epoch 14, batch 23500, loss[loss=0.135, simple_loss=0.2117, pruned_loss=0.02917, over 4744.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2077, pruned_loss=0.03009, over 971977.12 frames.], batch size: 19, lr: 1.56e-04 2022-05-08 04:08:34,059 INFO [train.py:715] (0/8) Epoch 14, batch 23550, loss[loss=0.1523, simple_loss=0.2163, pruned_loss=0.04414, over 4837.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2087, pruned_loss=0.03089, over 973212.07 frames.], batch size: 15, lr: 1.56e-04 2022-05-08 04:09:13,318 INFO [train.py:715] (0/8) Epoch 14, batch 23600, loss[loss=0.1481, simple_loss=0.2165, pruned_loss=0.03987, over 4976.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2085, pruned_loss=0.03063, over 973470.52 frames.], batch size: 24, lr: 1.56e-04 2022-05-08 04:09:52,596 INFO [train.py:715] (0/8) Epoch 14, batch 23650, loss[loss=0.1546, 
simple_loss=0.2393, pruned_loss=0.03498, over 4945.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2093, pruned_loss=0.03116, over 972973.01 frames.], batch size: 21, lr: 1.56e-04 2022-05-08 04:10:32,138 INFO [train.py:715] (0/8) Epoch 14, batch 23700, loss[loss=0.1209, simple_loss=0.1889, pruned_loss=0.02647, over 4972.00 frames.], tot_loss[loss=0.136, simple_loss=0.2097, pruned_loss=0.0312, over 973350.86 frames.], batch size: 39, lr: 1.56e-04 2022-05-08 04:11:11,201 INFO [train.py:715] (0/8) Epoch 14, batch 23750, loss[loss=0.1447, simple_loss=0.2292, pruned_loss=0.03009, over 4916.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2093, pruned_loss=0.03057, over 972620.32 frames.], batch size: 18, lr: 1.56e-04 2022-05-08 04:11:50,479 INFO [train.py:715] (0/8) Epoch 14, batch 23800, loss[loss=0.1494, simple_loss=0.2144, pruned_loss=0.04217, over 4883.00 frames.], tot_loss[loss=0.1357, simple_loss=0.21, pruned_loss=0.03073, over 972814.32 frames.], batch size: 22, lr: 1.56e-04 2022-05-08 04:12:30,657 INFO [train.py:715] (0/8) Epoch 14, batch 23850, loss[loss=0.1473, simple_loss=0.2093, pruned_loss=0.04266, over 4824.00 frames.], tot_loss[loss=0.135, simple_loss=0.2091, pruned_loss=0.03043, over 972352.58 frames.], batch size: 26, lr: 1.56e-04 2022-05-08 04:13:10,489 INFO [train.py:715] (0/8) Epoch 14, batch 23900, loss[loss=0.1715, simple_loss=0.2356, pruned_loss=0.05372, over 4768.00 frames.], tot_loss[loss=0.1359, simple_loss=0.21, pruned_loss=0.03088, over 973529.44 frames.], batch size: 17, lr: 1.56e-04 2022-05-08 04:13:49,738 INFO [train.py:715] (0/8) Epoch 14, batch 23950, loss[loss=0.1451, simple_loss=0.2127, pruned_loss=0.03875, over 4830.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2106, pruned_loss=0.03102, over 973083.31 frames.], batch size: 15, lr: 1.55e-04 2022-05-08 04:14:30,061 INFO [train.py:715] (0/8) Epoch 14, batch 24000, loss[loss=0.1087, simple_loss=0.1783, pruned_loss=0.01949, over 4974.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2105, pruned_loss=0.0313, over 972857.67 frames.], batch size: 14, lr: 1.55e-04 2022-05-08 04:14:30,062 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 04:14:41,439 INFO [train.py:742] (0/8) Epoch 14, validation: loss=0.1052, simple_loss=0.1889, pruned_loss=0.01074, over 914524.00 frames. 
2022-05-08 04:15:21,382 INFO [train.py:715] (0/8) Epoch 14, batch 24050, loss[loss=0.142, simple_loss=0.2164, pruned_loss=0.03382, over 4805.00 frames.], tot_loss[loss=0.1358, simple_loss=0.21, pruned_loss=0.0308, over 972983.18 frames.], batch size: 13, lr: 1.55e-04 2022-05-08 04:16:02,434 INFO [train.py:715] (0/8) Epoch 14, batch 24100, loss[loss=0.123, simple_loss=0.2003, pruned_loss=0.02282, over 4849.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2095, pruned_loss=0.03071, over 972896.77 frames.], batch size: 15, lr: 1.55e-04 2022-05-08 04:16:41,514 INFO [train.py:715] (0/8) Epoch 14, batch 24150, loss[loss=0.1542, simple_loss=0.2247, pruned_loss=0.04189, over 4788.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2087, pruned_loss=0.03053, over 973020.74 frames.], batch size: 17, lr: 1.55e-04 2022-05-08 04:17:21,103 INFO [train.py:715] (0/8) Epoch 14, batch 24200, loss[loss=0.1241, simple_loss=0.2001, pruned_loss=0.02401, over 4831.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2083, pruned_loss=0.03038, over 972762.81 frames.], batch size: 15, lr: 1.55e-04 2022-05-08 04:18:01,397 INFO [train.py:715] (0/8) Epoch 14, batch 24250, loss[loss=0.1189, simple_loss=0.1941, pruned_loss=0.02186, over 4639.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2081, pruned_loss=0.0301, over 972896.80 frames.], batch size: 13, lr: 1.55e-04 2022-05-08 04:18:41,644 INFO [train.py:715] (0/8) Epoch 14, batch 24300, loss[loss=0.1214, simple_loss=0.1834, pruned_loss=0.02972, over 4782.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2081, pruned_loss=0.03036, over 973929.57 frames.], batch size: 18, lr: 1.55e-04 2022-05-08 04:19:20,605 INFO [train.py:715] (0/8) Epoch 14, batch 24350, loss[loss=0.1411, simple_loss=0.213, pruned_loss=0.03463, over 4770.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2089, pruned_loss=0.03088, over 974097.79 frames.], batch size: 14, lr: 1.55e-04 2022-05-08 04:20:01,377 INFO [train.py:715] (0/8) Epoch 14, batch 24400, loss[loss=0.143, simple_loss=0.223, pruned_loss=0.03152, over 4940.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2082, pruned_loss=0.03033, over 973575.98 frames.], batch size: 21, lr: 1.55e-04 2022-05-08 04:20:43,002 INFO [train.py:715] (0/8) Epoch 14, batch 24450, loss[loss=0.1426, simple_loss=0.2134, pruned_loss=0.03589, over 4952.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2088, pruned_loss=0.03066, over 972852.37 frames.], batch size: 21, lr: 1.55e-04 2022-05-08 04:21:22,339 INFO [train.py:715] (0/8) Epoch 14, batch 24500, loss[loss=0.1502, simple_loss=0.224, pruned_loss=0.03817, over 4804.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2087, pruned_loss=0.03073, over 973577.85 frames.], batch size: 24, lr: 1.55e-04 2022-05-08 04:22:02,600 INFO [train.py:715] (0/8) Epoch 14, batch 24550, loss[loss=0.1401, simple_loss=0.2246, pruned_loss=0.02781, over 4749.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2092, pruned_loss=0.03062, over 973000.28 frames.], batch size: 16, lr: 1.55e-04 2022-05-08 04:22:43,756 INFO [train.py:715] (0/8) Epoch 14, batch 24600, loss[loss=0.1119, simple_loss=0.1932, pruned_loss=0.0153, over 4747.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2087, pruned_loss=0.03041, over 972093.36 frames.], batch size: 19, lr: 1.55e-04 2022-05-08 04:23:25,376 INFO [train.py:715] (0/8) Epoch 14, batch 24650, loss[loss=0.1448, simple_loss=0.2182, pruned_loss=0.03566, over 4902.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2087, pruned_loss=0.03028, over 972137.11 frames.], batch size: 19, lr: 1.55e-04 2022-05-08 04:23:39,458 INFO 
[checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-512000.pt 2022-05-08 04:24:07,556 INFO [train.py:715] (0/8) Epoch 14, batch 24700, loss[loss=0.1107, simple_loss=0.1772, pruned_loss=0.02215, over 4784.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2092, pruned_loss=0.0305, over 972524.80 frames.], batch size: 12, lr: 1.55e-04 2022-05-08 04:24:48,437 INFO [train.py:715] (0/8) Epoch 14, batch 24750, loss[loss=0.1509, simple_loss=0.2253, pruned_loss=0.03825, over 4957.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2093, pruned_loss=0.03056, over 972921.73 frames.], batch size: 21, lr: 1.55e-04 2022-05-08 04:25:30,054 INFO [train.py:715] (0/8) Epoch 14, batch 24800, loss[loss=0.1402, simple_loss=0.2078, pruned_loss=0.03628, over 4922.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2091, pruned_loss=0.03039, over 971423.05 frames.], batch size: 39, lr: 1.55e-04 2022-05-08 04:26:10,631 INFO [train.py:715] (0/8) Epoch 14, batch 24850, loss[loss=0.1117, simple_loss=0.1866, pruned_loss=0.01835, over 4976.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2084, pruned_loss=0.03028, over 971111.18 frames.], batch size: 24, lr: 1.55e-04 2022-05-08 04:26:50,219 INFO [train.py:715] (0/8) Epoch 14, batch 24900, loss[loss=0.114, simple_loss=0.1867, pruned_loss=0.02059, over 4892.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2076, pruned_loss=0.02989, over 971982.72 frames.], batch size: 19, lr: 1.55e-04 2022-05-08 04:27:31,157 INFO [train.py:715] (0/8) Epoch 14, batch 24950, loss[loss=0.2026, simple_loss=0.283, pruned_loss=0.06114, over 4888.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2094, pruned_loss=0.03037, over 972960.19 frames.], batch size: 38, lr: 1.55e-04 2022-05-08 04:28:12,056 INFO [train.py:715] (0/8) Epoch 14, batch 25000, loss[loss=0.1112, simple_loss=0.1777, pruned_loss=0.02234, over 4677.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2092, pruned_loss=0.03034, over 973614.85 frames.], batch size: 15, lr: 1.55e-04 2022-05-08 04:28:51,319 INFO [train.py:715] (0/8) Epoch 14, batch 25050, loss[loss=0.1231, simple_loss=0.2011, pruned_loss=0.02258, over 4961.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2088, pruned_loss=0.03015, over 973544.58 frames.], batch size: 28, lr: 1.55e-04 2022-05-08 04:29:32,179 INFO [train.py:715] (0/8) Epoch 14, batch 25100, loss[loss=0.1655, simple_loss=0.2412, pruned_loss=0.04487, over 4982.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2088, pruned_loss=0.03014, over 974284.14 frames.], batch size: 15, lr: 1.55e-04 2022-05-08 04:30:13,139 INFO [train.py:715] (0/8) Epoch 14, batch 25150, loss[loss=0.1264, simple_loss=0.1868, pruned_loss=0.03296, over 4972.00 frames.], tot_loss[loss=0.135, simple_loss=0.2092, pruned_loss=0.03038, over 974053.14 frames.], batch size: 15, lr: 1.55e-04 2022-05-08 04:30:53,339 INFO [train.py:715] (0/8) Epoch 14, batch 25200, loss[loss=0.1313, simple_loss=0.2016, pruned_loss=0.03052, over 4984.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2089, pruned_loss=0.03048, over 973624.58 frames.], batch size: 14, lr: 1.55e-04 2022-05-08 04:31:31,964 INFO [train.py:715] (0/8) Epoch 14, batch 25250, loss[loss=0.1497, simple_loss=0.222, pruned_loss=0.03872, over 4686.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2091, pruned_loss=0.03069, over 973504.64 frames.], batch size: 15, lr: 1.55e-04 2022-05-08 04:32:12,611 INFO [train.py:715] (0/8) Epoch 14, batch 25300, loss[loss=0.1401, simple_loss=0.2138, pruned_loss=0.03319, over 4922.00 frames.], tot_loss[loss=0.1363, 
simple_loss=0.2096, pruned_loss=0.03151, over 973389.73 frames.], batch size: 18, lr: 1.55e-04 2022-05-08 04:32:53,030 INFO [train.py:715] (0/8) Epoch 14, batch 25350, loss[loss=0.127, simple_loss=0.2042, pruned_loss=0.02484, over 4964.00 frames.], tot_loss[loss=0.1372, simple_loss=0.2107, pruned_loss=0.03184, over 973591.15 frames.], batch size: 24, lr: 1.55e-04 2022-05-08 04:33:31,589 INFO [train.py:715] (0/8) Epoch 14, batch 25400, loss[loss=0.11, simple_loss=0.1802, pruned_loss=0.01994, over 4834.00 frames.], tot_loss[loss=0.1371, simple_loss=0.211, pruned_loss=0.03158, over 973771.30 frames.], batch size: 12, lr: 1.55e-04 2022-05-08 04:34:11,984 INFO [train.py:715] (0/8) Epoch 14, batch 25450, loss[loss=0.1371, simple_loss=0.2093, pruned_loss=0.03246, over 4848.00 frames.], tot_loss[loss=0.1371, simple_loss=0.211, pruned_loss=0.03157, over 973111.49 frames.], batch size: 32, lr: 1.55e-04 2022-05-08 04:34:52,389 INFO [train.py:715] (0/8) Epoch 14, batch 25500, loss[loss=0.1332, simple_loss=0.2025, pruned_loss=0.03198, over 4763.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2103, pruned_loss=0.03139, over 972941.53 frames.], batch size: 18, lr: 1.55e-04 2022-05-08 04:35:31,819 INFO [train.py:715] (0/8) Epoch 14, batch 25550, loss[loss=0.1185, simple_loss=0.1893, pruned_loss=0.02383, over 4963.00 frames.], tot_loss[loss=0.1367, simple_loss=0.2102, pruned_loss=0.03163, over 973508.20 frames.], batch size: 35, lr: 1.55e-04 2022-05-08 04:36:10,560 INFO [train.py:715] (0/8) Epoch 14, batch 25600, loss[loss=0.1637, simple_loss=0.2223, pruned_loss=0.05252, over 4861.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2097, pruned_loss=0.03146, over 973342.58 frames.], batch size: 30, lr: 1.55e-04 2022-05-08 04:36:50,632 INFO [train.py:715] (0/8) Epoch 14, batch 25650, loss[loss=0.1544, simple_loss=0.2327, pruned_loss=0.03804, over 4915.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2085, pruned_loss=0.03045, over 973571.81 frames.], batch size: 39, lr: 1.55e-04 2022-05-08 04:37:30,748 INFO [train.py:715] (0/8) Epoch 14, batch 25700, loss[loss=0.1339, simple_loss=0.2043, pruned_loss=0.03178, over 4793.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2096, pruned_loss=0.03077, over 973351.92 frames.], batch size: 12, lr: 1.55e-04 2022-05-08 04:38:09,215 INFO [train.py:715] (0/8) Epoch 14, batch 25750, loss[loss=0.1504, simple_loss=0.2233, pruned_loss=0.03872, over 4768.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2092, pruned_loss=0.03074, over 973703.29 frames.], batch size: 17, lr: 1.55e-04 2022-05-08 04:38:48,532 INFO [train.py:715] (0/8) Epoch 14, batch 25800, loss[loss=0.1914, simple_loss=0.2543, pruned_loss=0.0642, over 4698.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2085, pruned_loss=0.03039, over 973538.97 frames.], batch size: 15, lr: 1.55e-04 2022-05-08 04:39:28,741 INFO [train.py:715] (0/8) Epoch 14, batch 25850, loss[loss=0.1582, simple_loss=0.2202, pruned_loss=0.04812, over 4962.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2081, pruned_loss=0.03039, over 972782.09 frames.], batch size: 15, lr: 1.55e-04 2022-05-08 04:40:07,963 INFO [train.py:715] (0/8) Epoch 14, batch 25900, loss[loss=0.1291, simple_loss=0.2044, pruned_loss=0.02693, over 4969.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2072, pruned_loss=0.03011, over 973076.65 frames.], batch size: 15, lr: 1.55e-04 2022-05-08 04:40:46,743 INFO [train.py:715] (0/8) Epoch 14, batch 25950, loss[loss=0.1264, simple_loss=0.1892, pruned_loss=0.03182, over 4964.00 frames.], tot_loss[loss=0.134, simple_loss=0.2073, 
pruned_loss=0.03035, over 972455.98 frames.], batch size: 35, lr: 1.55e-04 2022-05-08 04:41:26,881 INFO [train.py:715] (0/8) Epoch 14, batch 26000, loss[loss=0.1432, simple_loss=0.2079, pruned_loss=0.03924, over 4808.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2078, pruned_loss=0.03044, over 971993.51 frames.], batch size: 21, lr: 1.55e-04 2022-05-08 04:42:06,871 INFO [train.py:715] (0/8) Epoch 14, batch 26050, loss[loss=0.12, simple_loss=0.1943, pruned_loss=0.02281, over 4987.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2082, pruned_loss=0.03059, over 973331.74 frames.], batch size: 25, lr: 1.55e-04 2022-05-08 04:42:44,781 INFO [train.py:715] (0/8) Epoch 14, batch 26100, loss[loss=0.1314, simple_loss=0.1954, pruned_loss=0.03369, over 4888.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2083, pruned_loss=0.03056, over 973902.56 frames.], batch size: 32, lr: 1.55e-04 2022-05-08 04:43:24,716 INFO [train.py:715] (0/8) Epoch 14, batch 26150, loss[loss=0.09875, simple_loss=0.1659, pruned_loss=0.01583, over 4765.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2079, pruned_loss=0.03054, over 973469.05 frames.], batch size: 12, lr: 1.55e-04 2022-05-08 04:44:05,199 INFO [train.py:715] (0/8) Epoch 14, batch 26200, loss[loss=0.1541, simple_loss=0.2285, pruned_loss=0.03987, over 4844.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2089, pruned_loss=0.03126, over 973015.06 frames.], batch size: 32, lr: 1.55e-04 2022-05-08 04:44:44,005 INFO [train.py:715] (0/8) Epoch 14, batch 26250, loss[loss=0.1504, simple_loss=0.2273, pruned_loss=0.03676, over 4802.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2087, pruned_loss=0.03101, over 972632.84 frames.], batch size: 25, lr: 1.55e-04 2022-05-08 04:45:23,187 INFO [train.py:715] (0/8) Epoch 14, batch 26300, loss[loss=0.1478, simple_loss=0.2193, pruned_loss=0.0381, over 4898.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2083, pruned_loss=0.03122, over 972069.36 frames.], batch size: 22, lr: 1.55e-04 2022-05-08 04:46:03,662 INFO [train.py:715] (0/8) Epoch 14, batch 26350, loss[loss=0.1352, simple_loss=0.2126, pruned_loss=0.02886, over 4751.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2093, pruned_loss=0.03112, over 971759.98 frames.], batch size: 16, lr: 1.55e-04 2022-05-08 04:46:43,193 INFO [train.py:715] (0/8) Epoch 14, batch 26400, loss[loss=0.1248, simple_loss=0.2117, pruned_loss=0.01892, over 4899.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2101, pruned_loss=0.03151, over 972429.56 frames.], batch size: 17, lr: 1.55e-04 2022-05-08 04:47:21,821 INFO [train.py:715] (0/8) Epoch 14, batch 26450, loss[loss=0.1274, simple_loss=0.2156, pruned_loss=0.01959, over 4987.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2094, pruned_loss=0.03091, over 972552.95 frames.], batch size: 25, lr: 1.55e-04 2022-05-08 04:48:02,182 INFO [train.py:715] (0/8) Epoch 14, batch 26500, loss[loss=0.1423, simple_loss=0.2272, pruned_loss=0.02868, over 4767.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2102, pruned_loss=0.03124, over 972802.54 frames.], batch size: 17, lr: 1.55e-04 2022-05-08 04:48:42,601 INFO [train.py:715] (0/8) Epoch 14, batch 26550, loss[loss=0.1405, simple_loss=0.2109, pruned_loss=0.03508, over 4920.00 frames.], tot_loss[loss=0.1361, simple_loss=0.21, pruned_loss=0.0311, over 972828.78 frames.], batch size: 23, lr: 1.55e-04 2022-05-08 04:49:21,896 INFO [train.py:715] (0/8) Epoch 14, batch 26600, loss[loss=0.1211, simple_loss=0.1804, pruned_loss=0.03087, over 4960.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2097, pruned_loss=0.031, over 
972308.11 frames.], batch size: 15, lr: 1.55e-04 2022-05-08 04:50:00,863 INFO [train.py:715] (0/8) Epoch 14, batch 26650, loss[loss=0.1346, simple_loss=0.2191, pruned_loss=0.02511, over 4773.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2087, pruned_loss=0.03055, over 972699.27 frames.], batch size: 19, lr: 1.55e-04 2022-05-08 04:50:41,183 INFO [train.py:715] (0/8) Epoch 14, batch 26700, loss[loss=0.1259, simple_loss=0.2121, pruned_loss=0.01989, over 4787.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2086, pruned_loss=0.03057, over 973204.99 frames.], batch size: 18, lr: 1.55e-04 2022-05-08 04:51:21,683 INFO [train.py:715] (0/8) Epoch 14, batch 26750, loss[loss=0.1458, simple_loss=0.2185, pruned_loss=0.03653, over 4952.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2081, pruned_loss=0.03016, over 973247.69 frames.], batch size: 21, lr: 1.55e-04 2022-05-08 04:52:00,695 INFO [train.py:715] (0/8) Epoch 14, batch 26800, loss[loss=0.1259, simple_loss=0.2071, pruned_loss=0.02235, over 4940.00 frames.], tot_loss[loss=0.134, simple_loss=0.2081, pruned_loss=0.02997, over 973308.71 frames.], batch size: 21, lr: 1.55e-04 2022-05-08 04:52:40,482 INFO [train.py:715] (0/8) Epoch 14, batch 26850, loss[loss=0.1149, simple_loss=0.1891, pruned_loss=0.02034, over 4830.00 frames.], tot_loss[loss=0.134, simple_loss=0.2082, pruned_loss=0.02991, over 972235.31 frames.], batch size: 30, lr: 1.55e-04 2022-05-08 04:53:20,916 INFO [train.py:715] (0/8) Epoch 14, batch 26900, loss[loss=0.1567, simple_loss=0.2233, pruned_loss=0.04502, over 4982.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2088, pruned_loss=0.03053, over 972844.37 frames.], batch size: 31, lr: 1.55e-04 2022-05-08 04:54:00,760 INFO [train.py:715] (0/8) Epoch 14, batch 26950, loss[loss=0.1633, simple_loss=0.2278, pruned_loss=0.04938, over 4745.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2089, pruned_loss=0.03072, over 972610.10 frames.], batch size: 16, lr: 1.55e-04 2022-05-08 04:54:39,967 INFO [train.py:715] (0/8) Epoch 14, batch 27000, loss[loss=0.1261, simple_loss=0.1981, pruned_loss=0.02702, over 4822.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2084, pruned_loss=0.03055, over 972110.95 frames.], batch size: 27, lr: 1.55e-04 2022-05-08 04:54:39,968 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 04:54:49,614 INFO [train.py:742] (0/8) Epoch 14, validation: loss=0.1049, simple_loss=0.1886, pruned_loss=0.01053, over 914524.00 frames. 
2022-05-08 04:55:29,146 INFO [train.py:715] (0/8) Epoch 14, batch 27050, loss[loss=0.1087, simple_loss=0.1759, pruned_loss=0.02081, over 4799.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2085, pruned_loss=0.03019, over 972375.93 frames.], batch size: 12, lr: 1.55e-04 2022-05-08 04:56:09,801 INFO [train.py:715] (0/8) Epoch 14, batch 27100, loss[loss=0.1498, simple_loss=0.2217, pruned_loss=0.03893, over 4971.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2079, pruned_loss=0.03013, over 972460.21 frames.], batch size: 24, lr: 1.55e-04 2022-05-08 04:56:50,324 INFO [train.py:715] (0/8) Epoch 14, batch 27150, loss[loss=0.1314, simple_loss=0.2118, pruned_loss=0.02552, over 4887.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2079, pruned_loss=0.03033, over 971854.35 frames.], batch size: 16, lr: 1.55e-04 2022-05-08 04:57:29,049 INFO [train.py:715] (0/8) Epoch 14, batch 27200, loss[loss=0.1195, simple_loss=0.1954, pruned_loss=0.0218, over 4764.00 frames.], tot_loss[loss=0.1344, simple_loss=0.208, pruned_loss=0.0304, over 972240.27 frames.], batch size: 18, lr: 1.55e-04 2022-05-08 04:58:08,433 INFO [train.py:715] (0/8) Epoch 14, batch 27250, loss[loss=0.1483, simple_loss=0.2391, pruned_loss=0.02876, over 4865.00 frames.], tot_loss[loss=0.134, simple_loss=0.2075, pruned_loss=0.03021, over 971957.27 frames.], batch size: 20, lr: 1.55e-04 2022-05-08 04:58:48,577 INFO [train.py:715] (0/8) Epoch 14, batch 27300, loss[loss=0.1393, simple_loss=0.2166, pruned_loss=0.03098, over 4976.00 frames.], tot_loss[loss=0.134, simple_loss=0.2072, pruned_loss=0.03043, over 971499.94 frames.], batch size: 24, lr: 1.55e-04 2022-05-08 04:59:28,195 INFO [train.py:715] (0/8) Epoch 14, batch 27350, loss[loss=0.1176, simple_loss=0.1798, pruned_loss=0.02769, over 4979.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2084, pruned_loss=0.03057, over 972942.47 frames.], batch size: 15, lr: 1.55e-04 2022-05-08 05:00:06,591 INFO [train.py:715] (0/8) Epoch 14, batch 27400, loss[loss=0.1485, simple_loss=0.2141, pruned_loss=0.04141, over 4761.00 frames.], tot_loss[loss=0.1345, simple_loss=0.208, pruned_loss=0.03049, over 972278.91 frames.], batch size: 16, lr: 1.55e-04 2022-05-08 05:00:46,870 INFO [train.py:715] (0/8) Epoch 14, batch 27450, loss[loss=0.1375, simple_loss=0.2077, pruned_loss=0.0337, over 4778.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2092, pruned_loss=0.03114, over 972721.92 frames.], batch size: 17, lr: 1.55e-04 2022-05-08 05:01:26,692 INFO [train.py:715] (0/8) Epoch 14, batch 27500, loss[loss=0.1641, simple_loss=0.2489, pruned_loss=0.03967, over 4877.00 frames.], tot_loss[loss=0.136, simple_loss=0.2097, pruned_loss=0.03115, over 972903.91 frames.], batch size: 16, lr: 1.55e-04 2022-05-08 05:02:05,458 INFO [train.py:715] (0/8) Epoch 14, batch 27550, loss[loss=0.1359, simple_loss=0.2147, pruned_loss=0.02857, over 4899.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2102, pruned_loss=0.03098, over 972881.45 frames.], batch size: 19, lr: 1.55e-04 2022-05-08 05:02:45,162 INFO [train.py:715] (0/8) Epoch 14, batch 27600, loss[loss=0.1292, simple_loss=0.1899, pruned_loss=0.03418, over 4809.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2103, pruned_loss=0.03114, over 972097.10 frames.], batch size: 15, lr: 1.55e-04 2022-05-08 05:03:25,491 INFO [train.py:715] (0/8) Epoch 14, batch 27650, loss[loss=0.1199, simple_loss=0.1943, pruned_loss=0.02276, over 4914.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2105, pruned_loss=0.03121, over 972132.45 frames.], batch size: 18, lr: 1.55e-04 2022-05-08 05:04:04,757 
INFO [train.py:715] (0/8) Epoch 14, batch 27700, loss[loss=0.1317, simple_loss=0.2081, pruned_loss=0.02768, over 4839.00 frames.], tot_loss[loss=0.1374, simple_loss=0.2112, pruned_loss=0.03181, over 972398.39 frames.], batch size: 30, lr: 1.55e-04 2022-05-08 05:04:43,288 INFO [train.py:715] (0/8) Epoch 14, batch 27750, loss[loss=0.131, simple_loss=0.2092, pruned_loss=0.02643, over 4926.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2097, pruned_loss=0.03131, over 971923.63 frames.], batch size: 18, lr: 1.55e-04 2022-05-08 05:05:23,449 INFO [train.py:715] (0/8) Epoch 14, batch 27800, loss[loss=0.1443, simple_loss=0.2262, pruned_loss=0.03119, over 4909.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2088, pruned_loss=0.03093, over 971948.94 frames.], batch size: 19, lr: 1.55e-04 2022-05-08 05:06:03,188 INFO [train.py:715] (0/8) Epoch 14, batch 27850, loss[loss=0.12, simple_loss=0.2049, pruned_loss=0.01755, over 4991.00 frames.], tot_loss[loss=0.1355, simple_loss=0.209, pruned_loss=0.03102, over 972031.75 frames.], batch size: 14, lr: 1.55e-04 2022-05-08 05:06:41,703 INFO [train.py:715] (0/8) Epoch 14, batch 27900, loss[loss=0.1346, simple_loss=0.2095, pruned_loss=0.02982, over 4820.00 frames.], tot_loss[loss=0.135, simple_loss=0.2086, pruned_loss=0.03073, over 971805.22 frames.], batch size: 25, lr: 1.55e-04 2022-05-08 05:07:21,723 INFO [train.py:715] (0/8) Epoch 14, batch 27950, loss[loss=0.1408, simple_loss=0.2208, pruned_loss=0.03042, over 4777.00 frames.], tot_loss[loss=0.1346, simple_loss=0.208, pruned_loss=0.03059, over 971352.26 frames.], batch size: 18, lr: 1.55e-04 2022-05-08 05:08:01,578 INFO [train.py:715] (0/8) Epoch 14, batch 28000, loss[loss=0.1134, simple_loss=0.1873, pruned_loss=0.01976, over 4776.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2082, pruned_loss=0.03069, over 971543.14 frames.], batch size: 17, lr: 1.55e-04 2022-05-08 05:08:40,620 INFO [train.py:715] (0/8) Epoch 14, batch 28050, loss[loss=0.122, simple_loss=0.1958, pruned_loss=0.02407, over 4806.00 frames.], tot_loss[loss=0.135, simple_loss=0.2085, pruned_loss=0.03077, over 971329.18 frames.], batch size: 15, lr: 1.55e-04 2022-05-08 05:09:19,677 INFO [train.py:715] (0/8) Epoch 14, batch 28100, loss[loss=0.1431, simple_loss=0.2182, pruned_loss=0.03406, over 4977.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2088, pruned_loss=0.03097, over 972335.47 frames.], batch size: 39, lr: 1.55e-04 2022-05-08 05:10:00,235 INFO [train.py:715] (0/8) Epoch 14, batch 28150, loss[loss=0.1348, simple_loss=0.2116, pruned_loss=0.02905, over 4793.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2079, pruned_loss=0.03068, over 973117.12 frames.], batch size: 24, lr: 1.55e-04 2022-05-08 05:10:39,942 INFO [train.py:715] (0/8) Epoch 14, batch 28200, loss[loss=0.1553, simple_loss=0.236, pruned_loss=0.03725, over 4905.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2076, pruned_loss=0.03035, over 971930.95 frames.], batch size: 19, lr: 1.55e-04 2022-05-08 05:11:17,983 INFO [train.py:715] (0/8) Epoch 14, batch 28250, loss[loss=0.1307, simple_loss=0.2126, pruned_loss=0.02438, over 4938.00 frames.], tot_loss[loss=0.134, simple_loss=0.2075, pruned_loss=0.0303, over 972390.75 frames.], batch size: 21, lr: 1.55e-04 2022-05-08 05:11:58,125 INFO [train.py:715] (0/8) Epoch 14, batch 28300, loss[loss=0.1362, simple_loss=0.2074, pruned_loss=0.03249, over 4914.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2077, pruned_loss=0.03021, over 972553.40 frames.], batch size: 17, lr: 1.55e-04 2022-05-08 05:12:38,003 INFO [train.py:715] (0/8) 
Epoch 14, batch 28350, loss[loss=0.1419, simple_loss=0.2145, pruned_loss=0.03467, over 4934.00 frames.], tot_loss[loss=0.1337, simple_loss=0.207, pruned_loss=0.03014, over 973204.06 frames.], batch size: 39, lr: 1.55e-04 2022-05-08 05:13:16,551 INFO [train.py:715] (0/8) Epoch 14, batch 28400, loss[loss=0.128, simple_loss=0.2096, pruned_loss=0.02316, over 4802.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2079, pruned_loss=0.03051, over 972614.17 frames.], batch size: 21, lr: 1.55e-04 2022-05-08 05:13:56,138 INFO [train.py:715] (0/8) Epoch 14, batch 28450, loss[loss=0.1373, simple_loss=0.2106, pruned_loss=0.03197, over 4884.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2087, pruned_loss=0.0311, over 972409.54 frames.], batch size: 16, lr: 1.55e-04 2022-05-08 05:14:36,396 INFO [train.py:715] (0/8) Epoch 14, batch 28500, loss[loss=0.1379, simple_loss=0.2162, pruned_loss=0.02981, over 4985.00 frames.], tot_loss[loss=0.1356, simple_loss=0.209, pruned_loss=0.03116, over 973089.20 frames.], batch size: 14, lr: 1.55e-04 2022-05-08 05:15:15,663 INFO [train.py:715] (0/8) Epoch 14, batch 28550, loss[loss=0.1467, simple_loss=0.2187, pruned_loss=0.03733, over 4830.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2086, pruned_loss=0.03095, over 973203.07 frames.], batch size: 30, lr: 1.55e-04 2022-05-08 05:15:54,183 INFO [train.py:715] (0/8) Epoch 14, batch 28600, loss[loss=0.1391, simple_loss=0.2144, pruned_loss=0.03188, over 4938.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2092, pruned_loss=0.03098, over 973887.70 frames.], batch size: 39, lr: 1.55e-04 2022-05-08 05:16:34,511 INFO [train.py:715] (0/8) Epoch 14, batch 28650, loss[loss=0.1099, simple_loss=0.1853, pruned_loss=0.01724, over 4707.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2092, pruned_loss=0.03035, over 973674.31 frames.], batch size: 15, lr: 1.55e-04 2022-05-08 05:17:14,552 INFO [train.py:715] (0/8) Epoch 14, batch 28700, loss[loss=0.1201, simple_loss=0.193, pruned_loss=0.02357, over 4782.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2089, pruned_loss=0.03013, over 973531.22 frames.], batch size: 14, lr: 1.55e-04 2022-05-08 05:17:52,652 INFO [train.py:715] (0/8) Epoch 14, batch 28750, loss[loss=0.1355, simple_loss=0.2076, pruned_loss=0.03172, over 4810.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2086, pruned_loss=0.03009, over 973298.71 frames.], batch size: 13, lr: 1.55e-04 2022-05-08 05:18:32,376 INFO [train.py:715] (0/8) Epoch 14, batch 28800, loss[loss=0.1309, simple_loss=0.2119, pruned_loss=0.02502, over 4825.00 frames.], tot_loss[loss=0.134, simple_loss=0.2085, pruned_loss=0.02971, over 972535.12 frames.], batch size: 26, lr: 1.55e-04 2022-05-08 05:19:12,482 INFO [train.py:715] (0/8) Epoch 14, batch 28850, loss[loss=0.1548, simple_loss=0.234, pruned_loss=0.03777, over 4978.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2088, pruned_loss=0.03, over 971599.29 frames.], batch size: 15, lr: 1.55e-04 2022-05-08 05:19:52,375 INFO [train.py:715] (0/8) Epoch 14, batch 28900, loss[loss=0.1102, simple_loss=0.1789, pruned_loss=0.02078, over 4835.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2087, pruned_loss=0.02989, over 971964.43 frames.], batch size: 25, lr: 1.55e-04 2022-05-08 05:20:30,226 INFO [train.py:715] (0/8) Epoch 14, batch 28950, loss[loss=0.1917, simple_loss=0.2588, pruned_loss=0.06237, over 4971.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2093, pruned_loss=0.03018, over 971725.61 frames.], batch size: 39, lr: 1.55e-04 2022-05-08 05:21:10,703 INFO [train.py:715] (0/8) Epoch 14, batch 29000, 
loss[loss=0.1447, simple_loss=0.2156, pruned_loss=0.03697, over 4747.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2075, pruned_loss=0.0297, over 971276.78 frames.], batch size: 16, lr: 1.55e-04 2022-05-08 05:21:50,335 INFO [train.py:715] (0/8) Epoch 14, batch 29050, loss[loss=0.1603, simple_loss=0.2317, pruned_loss=0.04443, over 4935.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2085, pruned_loss=0.03017, over 971300.78 frames.], batch size: 18, lr: 1.55e-04 2022-05-08 05:22:29,105 INFO [train.py:715] (0/8) Epoch 14, batch 29100, loss[loss=0.1248, simple_loss=0.2066, pruned_loss=0.02148, over 4794.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2096, pruned_loss=0.03064, over 971390.98 frames.], batch size: 21, lr: 1.55e-04 2022-05-08 05:23:08,498 INFO [train.py:715] (0/8) Epoch 14, batch 29150, loss[loss=0.136, simple_loss=0.2055, pruned_loss=0.03326, over 4879.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2095, pruned_loss=0.03066, over 971318.10 frames.], batch size: 32, lr: 1.55e-04 2022-05-08 05:23:48,533 INFO [train.py:715] (0/8) Epoch 14, batch 29200, loss[loss=0.1586, simple_loss=0.2281, pruned_loss=0.04455, over 4940.00 frames.], tot_loss[loss=0.1363, simple_loss=0.2104, pruned_loss=0.03112, over 971992.48 frames.], batch size: 21, lr: 1.55e-04 2022-05-08 05:24:28,394 INFO [train.py:715] (0/8) Epoch 14, batch 29250, loss[loss=0.1275, simple_loss=0.201, pruned_loss=0.02698, over 4808.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2106, pruned_loss=0.03085, over 971523.26 frames.], batch size: 25, lr: 1.55e-04 2022-05-08 05:25:06,490 INFO [train.py:715] (0/8) Epoch 14, batch 29300, loss[loss=0.1092, simple_loss=0.1885, pruned_loss=0.0149, over 4972.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2089, pruned_loss=0.02988, over 970852.65 frames.], batch size: 15, lr: 1.55e-04 2022-05-08 05:25:46,608 INFO [train.py:715] (0/8) Epoch 14, batch 29350, loss[loss=0.1188, simple_loss=0.1914, pruned_loss=0.02306, over 4956.00 frames.], tot_loss[loss=0.1348, simple_loss=0.209, pruned_loss=0.03026, over 970921.66 frames.], batch size: 21, lr: 1.55e-04 2022-05-08 05:26:26,517 INFO [train.py:715] (0/8) Epoch 14, batch 29400, loss[loss=0.1151, simple_loss=0.1938, pruned_loss=0.01817, over 4817.00 frames.], tot_loss[loss=0.135, simple_loss=0.209, pruned_loss=0.0305, over 970785.89 frames.], batch size: 27, lr: 1.55e-04 2022-05-08 05:27:05,392 INFO [train.py:715] (0/8) Epoch 14, batch 29450, loss[loss=0.1182, simple_loss=0.1985, pruned_loss=0.01894, over 4890.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2094, pruned_loss=0.03064, over 971179.56 frames.], batch size: 16, lr: 1.55e-04 2022-05-08 05:27:45,242 INFO [train.py:715] (0/8) Epoch 14, batch 29500, loss[loss=0.1029, simple_loss=0.1748, pruned_loss=0.01551, over 4820.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2092, pruned_loss=0.03066, over 971324.60 frames.], batch size: 26, lr: 1.55e-04 2022-05-08 05:28:25,577 INFO [train.py:715] (0/8) Epoch 14, batch 29550, loss[loss=0.1455, simple_loss=0.2266, pruned_loss=0.03221, over 4893.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2084, pruned_loss=0.03031, over 971667.90 frames.], batch size: 19, lr: 1.55e-04 2022-05-08 05:29:05,389 INFO [train.py:715] (0/8) Epoch 14, batch 29600, loss[loss=0.1413, simple_loss=0.2157, pruned_loss=0.03343, over 4968.00 frames.], tot_loss[loss=0.134, simple_loss=0.2083, pruned_loss=0.02987, over 971273.02 frames.], batch size: 24, lr: 1.55e-04 2022-05-08 05:29:44,397 INFO [train.py:715] (0/8) Epoch 14, batch 29650, loss[loss=0.1334, 
simple_loss=0.2079, pruned_loss=0.02946, over 4902.00 frames.], tot_loss[loss=0.134, simple_loss=0.2081, pruned_loss=0.02999, over 972493.00 frames.], batch size: 19, lr: 1.55e-04 2022-05-08 05:30:25,188 INFO [train.py:715] (0/8) Epoch 14, batch 29700, loss[loss=0.1399, simple_loss=0.2109, pruned_loss=0.03447, over 4877.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2088, pruned_loss=0.03032, over 972908.07 frames.], batch size: 16, lr: 1.55e-04 2022-05-08 05:31:06,295 INFO [train.py:715] (0/8) Epoch 14, batch 29750, loss[loss=0.2078, simple_loss=0.2427, pruned_loss=0.0865, over 4779.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2098, pruned_loss=0.0309, over 972521.93 frames.], batch size: 14, lr: 1.55e-04 2022-05-08 05:31:45,875 INFO [train.py:715] (0/8) Epoch 14, batch 29800, loss[loss=0.1393, simple_loss=0.2045, pruned_loss=0.03703, over 4844.00 frames.], tot_loss[loss=0.1357, simple_loss=0.21, pruned_loss=0.03075, over 972898.12 frames.], batch size: 15, lr: 1.55e-04 2022-05-08 05:32:26,694 INFO [train.py:715] (0/8) Epoch 14, batch 29850, loss[loss=0.1332, simple_loss=0.1994, pruned_loss=0.03354, over 4989.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2094, pruned_loss=0.03043, over 972719.68 frames.], batch size: 14, lr: 1.55e-04 2022-05-08 05:33:06,689 INFO [train.py:715] (0/8) Epoch 14, batch 29900, loss[loss=0.1255, simple_loss=0.2085, pruned_loss=0.02129, over 4929.00 frames.], tot_loss[loss=0.1359, simple_loss=0.21, pruned_loss=0.03088, over 973406.70 frames.], batch size: 21, lr: 1.55e-04 2022-05-08 05:33:46,333 INFO [train.py:715] (0/8) Epoch 14, batch 29950, loss[loss=0.1343, simple_loss=0.205, pruned_loss=0.03179, over 4917.00 frames.], tot_loss[loss=0.135, simple_loss=0.2093, pruned_loss=0.03038, over 972408.63 frames.], batch size: 29, lr: 1.55e-04 2022-05-08 05:34:25,098 INFO [train.py:715] (0/8) Epoch 14, batch 30000, loss[loss=0.1242, simple_loss=0.205, pruned_loss=0.02171, over 4852.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2094, pruned_loss=0.03015, over 972688.43 frames.], batch size: 20, lr: 1.55e-04 2022-05-08 05:34:25,099 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 05:34:42,243 INFO [train.py:742] (0/8) Epoch 14, validation: loss=0.1052, simple_loss=0.189, pruned_loss=0.01075, over 914524.00 frames. 
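Entries in this shape are regular enough to extract a training curve from. The sketch below is a hypothetical parser written against the exact line format visible in this excerpt; the regular expression and the helper name are my own assumptions and are not part of train.py.

import re

# Hypothetical parser for entries shaped like the ones above: it pulls
# the epoch, batch index, and the running tot_loss value.
TOT_LOSS_RE = re.compile(
    r"Epoch (\d+), batch (\d+), .*?tot_loss\[loss=([\d.]+)"
)

def tot_loss_curve(log_text):
    """Return (epoch, batch, tot_loss) tuples suitable for plotting."""
    points = []
    for m in TOT_LOSS_RE.finditer(log_text):
        points.append((int(m.group(1)), int(m.group(2)), float(m.group(3))))
    return points

# Example on a fragment copied from the log above:
sample = ("Epoch 14, batch 30000, loss[loss=0.1242, simple_loss=0.205, "
          "pruned_loss=0.02171, over 4852.00 frames.], tot_loss[loss=0.1348, "
          "simple_loss=0.2094, pruned_loss=0.03015, over 972688.43 frames.]")
print(tot_loss_curve(sample))  # [(14, 30000, 0.1348)]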
2022-05-08 05:35:21,212 INFO [train.py:715] (0/8) Epoch 14, batch 30050, loss[loss=0.1194, simple_loss=0.1956, pruned_loss=0.02158, over 4973.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2091, pruned_loss=0.02986, over 972625.18 frames.], batch size: 25, lr: 1.55e-04 2022-05-08 05:36:01,186 INFO [train.py:715] (0/8) Epoch 14, batch 30100, loss[loss=0.1592, simple_loss=0.2399, pruned_loss=0.03925, over 4930.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2096, pruned_loss=0.03073, over 972989.64 frames.], batch size: 18, lr: 1.55e-04 2022-05-08 05:36:42,307 INFO [train.py:715] (0/8) Epoch 14, batch 30150, loss[loss=0.1223, simple_loss=0.1934, pruned_loss=0.02567, over 4929.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2092, pruned_loss=0.03108, over 972928.03 frames.], batch size: 17, lr: 1.55e-04 2022-05-08 05:37:21,240 INFO [train.py:715] (0/8) Epoch 14, batch 30200, loss[loss=0.1506, simple_loss=0.2305, pruned_loss=0.03535, over 4817.00 frames.], tot_loss[loss=0.1361, simple_loss=0.21, pruned_loss=0.03115, over 972034.84 frames.], batch size: 26, lr: 1.55e-04 2022-05-08 05:38:01,175 INFO [train.py:715] (0/8) Epoch 14, batch 30250, loss[loss=0.1569, simple_loss=0.2143, pruned_loss=0.04977, over 4847.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2094, pruned_loss=0.0311, over 972626.12 frames.], batch size: 32, lr: 1.55e-04 2022-05-08 05:38:41,854 INFO [train.py:715] (0/8) Epoch 14, batch 30300, loss[loss=0.1183, simple_loss=0.1951, pruned_loss=0.02073, over 4931.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2092, pruned_loss=0.031, over 973309.09 frames.], batch size: 29, lr: 1.55e-04 2022-05-08 05:39:21,369 INFO [train.py:715] (0/8) Epoch 14, batch 30350, loss[loss=0.1581, simple_loss=0.2227, pruned_loss=0.04673, over 4973.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2092, pruned_loss=0.03102, over 973594.40 frames.], batch size: 31, lr: 1.55e-04 2022-05-08 05:40:00,592 INFO [train.py:715] (0/8) Epoch 14, batch 30400, loss[loss=0.1215, simple_loss=0.2021, pruned_loss=0.02049, over 4844.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2087, pruned_loss=0.0307, over 973233.58 frames.], batch size: 15, lr: 1.55e-04 2022-05-08 05:40:40,494 INFO [train.py:715] (0/8) Epoch 14, batch 30450, loss[loss=0.1348, simple_loss=0.2106, pruned_loss=0.02948, over 4971.00 frames.], tot_loss[loss=0.1359, simple_loss=0.21, pruned_loss=0.03093, over 973429.31 frames.], batch size: 39, lr: 1.55e-04 2022-05-08 05:41:20,818 INFO [train.py:715] (0/8) Epoch 14, batch 30500, loss[loss=0.1416, simple_loss=0.2062, pruned_loss=0.03851, over 4932.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2102, pruned_loss=0.03126, over 973578.69 frames.], batch size: 23, lr: 1.55e-04 2022-05-08 05:41:59,759 INFO [train.py:715] (0/8) Epoch 14, batch 30550, loss[loss=0.1249, simple_loss=0.194, pruned_loss=0.02793, over 4955.00 frames.], tot_loss[loss=0.1352, simple_loss=0.209, pruned_loss=0.03074, over 973360.27 frames.], batch size: 14, lr: 1.54e-04 2022-05-08 05:42:39,654 INFO [train.py:715] (0/8) Epoch 14, batch 30600, loss[loss=0.1251, simple_loss=0.1909, pruned_loss=0.02965, over 4980.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2087, pruned_loss=0.03101, over 973240.41 frames.], batch size: 14, lr: 1.54e-04 2022-05-08 05:43:20,415 INFO [train.py:715] (0/8) Epoch 14, batch 30650, loss[loss=0.1068, simple_loss=0.1861, pruned_loss=0.01379, over 4815.00 frames.], tot_loss[loss=0.135, simple_loss=0.2087, pruned_loss=0.03068, over 972952.11 frames.], batch size: 27, lr: 1.54e-04 2022-05-08 05:43:59,992 INFO 
[train.py:715] (0/8) Epoch 14, batch 30700, loss[loss=0.1372, simple_loss=0.2142, pruned_loss=0.03009, over 4933.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2082, pruned_loss=0.03016, over 973080.83 frames.], batch size: 18, lr: 1.54e-04 2022-05-08 05:44:39,756 INFO [train.py:715] (0/8) Epoch 14, batch 30750, loss[loss=0.1385, simple_loss=0.2095, pruned_loss=0.03381, over 4984.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2077, pruned_loss=0.02976, over 973035.48 frames.], batch size: 20, lr: 1.54e-04 2022-05-08 05:45:19,655 INFO [train.py:715] (0/8) Epoch 14, batch 30800, loss[loss=0.1344, simple_loss=0.2095, pruned_loss=0.02965, over 4666.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2075, pruned_loss=0.0298, over 972931.84 frames.], batch size: 13, lr: 1.54e-04 2022-05-08 05:46:00,444 INFO [train.py:715] (0/8) Epoch 14, batch 30850, loss[loss=0.1329, simple_loss=0.2069, pruned_loss=0.02947, over 4847.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2077, pruned_loss=0.02982, over 973246.58 frames.], batch size: 34, lr: 1.54e-04 2022-05-08 05:46:39,511 INFO [train.py:715] (0/8) Epoch 14, batch 30900, loss[loss=0.1402, simple_loss=0.2243, pruned_loss=0.02803, over 4937.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2079, pruned_loss=0.02994, over 973355.42 frames.], batch size: 23, lr: 1.54e-04 2022-05-08 05:47:18,041 INFO [train.py:715] (0/8) Epoch 14, batch 30950, loss[loss=0.1264, simple_loss=0.2103, pruned_loss=0.02124, over 4830.00 frames.], tot_loss[loss=0.134, simple_loss=0.2081, pruned_loss=0.0299, over 973335.33 frames.], batch size: 15, lr: 1.54e-04 2022-05-08 05:47:57,803 INFO [train.py:715] (0/8) Epoch 14, batch 31000, loss[loss=0.1348, simple_loss=0.2029, pruned_loss=0.03338, over 4751.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2084, pruned_loss=0.02997, over 973206.16 frames.], batch size: 16, lr: 1.54e-04 2022-05-08 05:48:37,481 INFO [train.py:715] (0/8) Epoch 14, batch 31050, loss[loss=0.1241, simple_loss=0.2018, pruned_loss=0.02316, over 4904.00 frames.], tot_loss[loss=0.134, simple_loss=0.208, pruned_loss=0.03, over 972090.23 frames.], batch size: 17, lr: 1.54e-04 2022-05-08 05:49:17,850 INFO [train.py:715] (0/8) Epoch 14, batch 31100, loss[loss=0.1118, simple_loss=0.1892, pruned_loss=0.01723, over 4814.00 frames.], tot_loss[loss=0.1349, simple_loss=0.209, pruned_loss=0.03037, over 972193.08 frames.], batch size: 25, lr: 1.54e-04 2022-05-08 05:49:58,976 INFO [train.py:715] (0/8) Epoch 14, batch 31150, loss[loss=0.1335, simple_loss=0.2111, pruned_loss=0.02795, over 4966.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2099, pruned_loss=0.03079, over 972691.60 frames.], batch size: 24, lr: 1.54e-04 2022-05-08 05:50:40,126 INFO [train.py:715] (0/8) Epoch 14, batch 31200, loss[loss=0.1261, simple_loss=0.2007, pruned_loss=0.02578, over 4903.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2094, pruned_loss=0.03073, over 972510.47 frames.], batch size: 17, lr: 1.54e-04 2022-05-08 05:51:19,909 INFO [train.py:715] (0/8) Epoch 14, batch 31250, loss[loss=0.1501, simple_loss=0.2226, pruned_loss=0.03876, over 4951.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2097, pruned_loss=0.03074, over 972428.87 frames.], batch size: 21, lr: 1.54e-04 2022-05-08 05:52:00,311 INFO [train.py:715] (0/8) Epoch 14, batch 31300, loss[loss=0.131, simple_loss=0.2003, pruned_loss=0.03086, over 4851.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2089, pruned_loss=0.03032, over 973015.09 frames.], batch size: 32, lr: 1.54e-04 2022-05-08 05:52:41,151 INFO [train.py:715] (0/8) Epoch 
14, batch 31350, loss[loss=0.1168, simple_loss=0.1945, pruned_loss=0.01955, over 4835.00 frames.], tot_loss[loss=0.135, simple_loss=0.2089, pruned_loss=0.03054, over 972598.97 frames.], batch size: 15, lr: 1.54e-04 2022-05-08 05:53:21,041 INFO [train.py:715] (0/8) Epoch 14, batch 31400, loss[loss=0.1398, simple_loss=0.2146, pruned_loss=0.03254, over 4791.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2081, pruned_loss=0.03011, over 971904.73 frames.], batch size: 14, lr: 1.54e-04 2022-05-08 05:54:00,718 INFO [train.py:715] (0/8) Epoch 14, batch 31450, loss[loss=0.1386, simple_loss=0.2105, pruned_loss=0.03333, over 4987.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2085, pruned_loss=0.03034, over 972715.82 frames.], batch size: 27, lr: 1.54e-04 2022-05-08 05:54:40,749 INFO [train.py:715] (0/8) Epoch 14, batch 31500, loss[loss=0.1059, simple_loss=0.1861, pruned_loss=0.01283, over 4856.00 frames.], tot_loss[loss=0.134, simple_loss=0.2083, pruned_loss=0.0299, over 972321.75 frames.], batch size: 20, lr: 1.54e-04 2022-05-08 05:55:21,334 INFO [train.py:715] (0/8) Epoch 14, batch 31550, loss[loss=0.1499, simple_loss=0.2199, pruned_loss=0.03993, over 4738.00 frames.], tot_loss[loss=0.1338, simple_loss=0.208, pruned_loss=0.02974, over 973165.97 frames.], batch size: 16, lr: 1.54e-04 2022-05-08 05:56:01,190 INFO [train.py:715] (0/8) Epoch 14, batch 31600, loss[loss=0.1326, simple_loss=0.2133, pruned_loss=0.02598, over 4963.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2086, pruned_loss=0.03016, over 972846.69 frames.], batch size: 24, lr: 1.54e-04 2022-05-08 05:56:40,694 INFO [train.py:715] (0/8) Epoch 14, batch 31650, loss[loss=0.1514, simple_loss=0.2192, pruned_loss=0.04176, over 4840.00 frames.], tot_loss[loss=0.135, simple_loss=0.2089, pruned_loss=0.03053, over 972906.15 frames.], batch size: 20, lr: 1.54e-04 2022-05-08 05:57:21,072 INFO [train.py:715] (0/8) Epoch 14, batch 31700, loss[loss=0.1248, simple_loss=0.1969, pruned_loss=0.0263, over 4953.00 frames.], tot_loss[loss=0.1345, simple_loss=0.208, pruned_loss=0.03046, over 973501.22 frames.], batch size: 21, lr: 1.54e-04 2022-05-08 05:58:00,669 INFO [train.py:715] (0/8) Epoch 14, batch 31750, loss[loss=0.1127, simple_loss=0.1875, pruned_loss=0.01898, over 4768.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2083, pruned_loss=0.03056, over 972913.26 frames.], batch size: 19, lr: 1.54e-04 2022-05-08 05:58:40,572 INFO [train.py:715] (0/8) Epoch 14, batch 31800, loss[loss=0.1314, simple_loss=0.2055, pruned_loss=0.02866, over 4891.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2087, pruned_loss=0.03089, over 972801.53 frames.], batch size: 22, lr: 1.54e-04 2022-05-08 05:59:20,878 INFO [train.py:715] (0/8) Epoch 14, batch 31850, loss[loss=0.1302, simple_loss=0.2076, pruned_loss=0.02637, over 4882.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2102, pruned_loss=0.03134, over 973211.79 frames.], batch size: 16, lr: 1.54e-04 2022-05-08 06:00:01,588 INFO [train.py:715] (0/8) Epoch 14, batch 31900, loss[loss=0.1397, simple_loss=0.2118, pruned_loss=0.03385, over 4952.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2099, pruned_loss=0.03127, over 973599.79 frames.], batch size: 35, lr: 1.54e-04 2022-05-08 06:00:40,980 INFO [train.py:715] (0/8) Epoch 14, batch 31950, loss[loss=0.1463, simple_loss=0.2175, pruned_loss=0.03751, over 4643.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2096, pruned_loss=0.03108, over 972650.44 frames.], batch size: 13, lr: 1.54e-04 2022-05-08 06:01:20,562 INFO [train.py:715] (0/8) Epoch 14, batch 32000, 
loss[loss=0.1076, simple_loss=0.1855, pruned_loss=0.01488, over 4838.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2086, pruned_loss=0.03076, over 973272.89 frames.], batch size: 26, lr: 1.54e-04 2022-05-08 06:02:01,141 INFO [train.py:715] (0/8) Epoch 14, batch 32050, loss[loss=0.1309, simple_loss=0.1967, pruned_loss=0.03254, over 4839.00 frames.], tot_loss[loss=0.135, simple_loss=0.2087, pruned_loss=0.03072, over 972756.94 frames.], batch size: 13, lr: 1.54e-04 2022-05-08 06:02:40,612 INFO [train.py:715] (0/8) Epoch 14, batch 32100, loss[loss=0.1379, simple_loss=0.2131, pruned_loss=0.03136, over 4781.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2091, pruned_loss=0.03071, over 972651.80 frames.], batch size: 17, lr: 1.54e-04 2022-05-08 06:03:20,376 INFO [train.py:715] (0/8) Epoch 14, batch 32150, loss[loss=0.1203, simple_loss=0.1874, pruned_loss=0.02664, over 4882.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2092, pruned_loss=0.03097, over 971924.48 frames.], batch size: 20, lr: 1.54e-04 2022-05-08 06:04:00,807 INFO [train.py:715] (0/8) Epoch 14, batch 32200, loss[loss=0.1249, simple_loss=0.2102, pruned_loss=0.01977, over 4835.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2091, pruned_loss=0.03066, over 972415.47 frames.], batch size: 26, lr: 1.54e-04 2022-05-08 06:04:41,244 INFO [train.py:715] (0/8) Epoch 14, batch 32250, loss[loss=0.1414, simple_loss=0.2061, pruned_loss=0.03836, over 4881.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2084, pruned_loss=0.0302, over 972725.88 frames.], batch size: 16, lr: 1.54e-04 2022-05-08 06:05:20,518 INFO [train.py:715] (0/8) Epoch 14, batch 32300, loss[loss=0.1046, simple_loss=0.1772, pruned_loss=0.01597, over 4839.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2088, pruned_loss=0.03048, over 973270.68 frames.], batch size: 27, lr: 1.54e-04 2022-05-08 06:06:00,147 INFO [train.py:715] (0/8) Epoch 14, batch 32350, loss[loss=0.1159, simple_loss=0.1841, pruned_loss=0.0239, over 4945.00 frames.], tot_loss[loss=0.135, simple_loss=0.2086, pruned_loss=0.03068, over 973638.72 frames.], batch size: 21, lr: 1.54e-04 2022-05-08 06:06:40,260 INFO [train.py:715] (0/8) Epoch 14, batch 32400, loss[loss=0.1341, simple_loss=0.215, pruned_loss=0.02658, over 4943.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2094, pruned_loss=0.03116, over 974287.81 frames.], batch size: 21, lr: 1.54e-04 2022-05-08 06:07:19,949 INFO [train.py:715] (0/8) Epoch 14, batch 32450, loss[loss=0.1377, simple_loss=0.213, pruned_loss=0.03119, over 4799.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2102, pruned_loss=0.03113, over 973565.21 frames.], batch size: 18, lr: 1.54e-04 2022-05-08 06:07:59,615 INFO [train.py:715] (0/8) Epoch 14, batch 32500, loss[loss=0.1349, simple_loss=0.2068, pruned_loss=0.03151, over 4833.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2097, pruned_loss=0.03061, over 972909.17 frames.], batch size: 13, lr: 1.54e-04 2022-05-08 06:08:39,985 INFO [train.py:715] (0/8) Epoch 14, batch 32550, loss[loss=0.133, simple_loss=0.2154, pruned_loss=0.02528, over 4854.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2097, pruned_loss=0.03053, over 973089.69 frames.], batch size: 20, lr: 1.54e-04 2022-05-08 06:09:20,732 INFO [train.py:715] (0/8) Epoch 14, batch 32600, loss[loss=0.1383, simple_loss=0.2052, pruned_loss=0.03571, over 4687.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2103, pruned_loss=0.03091, over 972573.11 frames.], batch size: 15, lr: 1.54e-04 2022-05-08 06:10:00,310 INFO [train.py:715] (0/8) Epoch 14, batch 32650, loss[loss=0.1543, 
simple_loss=0.233, pruned_loss=0.03783, over 4873.00 frames.], tot_loss[loss=0.1366, simple_loss=0.2109, pruned_loss=0.03115, over 972288.67 frames.], batch size: 16, lr: 1.54e-04 2022-05-08 06:10:15,192 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-520000.pt 2022-05-08 06:10:43,603 INFO [train.py:715] (0/8) Epoch 14, batch 32700, loss[loss=0.1255, simple_loss=0.1967, pruned_loss=0.02712, over 4705.00 frames.], tot_loss[loss=0.136, simple_loss=0.21, pruned_loss=0.031, over 971627.60 frames.], batch size: 15, lr: 1.54e-04 2022-05-08 06:11:24,766 INFO [train.py:715] (0/8) Epoch 14, batch 32750, loss[loss=0.1186, simple_loss=0.1985, pruned_loss=0.0193, over 4944.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2095, pruned_loss=0.03068, over 972012.68 frames.], batch size: 21, lr: 1.54e-04 2022-05-08 06:12:05,085 INFO [train.py:715] (0/8) Epoch 14, batch 32800, loss[loss=0.1265, simple_loss=0.2049, pruned_loss=0.02404, over 4774.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2096, pruned_loss=0.03099, over 971654.65 frames.], batch size: 18, lr: 1.54e-04 2022-05-08 06:12:45,511 INFO [train.py:715] (0/8) Epoch 14, batch 32850, loss[loss=0.1238, simple_loss=0.2035, pruned_loss=0.02203, over 4806.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2092, pruned_loss=0.03094, over 971495.72 frames.], batch size: 21, lr: 1.54e-04 2022-05-08 06:13:26,805 INFO [train.py:715] (0/8) Epoch 14, batch 32900, loss[loss=0.1402, simple_loss=0.2146, pruned_loss=0.03291, over 4895.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2081, pruned_loss=0.0302, over 971932.14 frames.], batch size: 19, lr: 1.54e-04 2022-05-08 06:14:07,966 INFO [train.py:715] (0/8) Epoch 14, batch 32950, loss[loss=0.1089, simple_loss=0.1711, pruned_loss=0.02331, over 4976.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2084, pruned_loss=0.03041, over 972053.48 frames.], batch size: 14, lr: 1.54e-04 2022-05-08 06:14:47,666 INFO [train.py:715] (0/8) Epoch 14, batch 33000, loss[loss=0.1161, simple_loss=0.1945, pruned_loss=0.01888, over 4783.00 frames.], tot_loss[loss=0.134, simple_loss=0.2078, pruned_loss=0.03015, over 972203.75 frames.], batch size: 18, lr: 1.54e-04 2022-05-08 06:14:47,667 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 06:15:25,560 INFO [train.py:742] (0/8) Epoch 14, validation: loss=0.1051, simple_loss=0.1889, pruned_loss=0.01071, over 914524.00 frames. 
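The checkpoint written just above (pruned_transducer_stateless2/exp/v2/checkpoint-520000.pt) can be inspected offline. A minimal sketch, assuming the file is an ordinary dictionary produced by torch.save; this log alone does not say which keys checkpoint.py stores, so none are asserted here.

import torch

# Assumption: the checkpoint is a plain torch.save'd dict; only list what
# is actually inside rather than asserting any particular key names.
ckpt_path = "pruned_transducer_stateless2/exp/v2/checkpoint-520000.pt"
ckpt = torch.load(ckpt_path, map_location="cpu")

if isinstance(ckpt, dict):
    for key in ckpt:
        print(key, type(ckpt[key]).__name__)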
2022-05-08 06:16:05,295 INFO [train.py:715] (0/8) Epoch 14, batch 33050, loss[loss=0.1151, simple_loss=0.1907, pruned_loss=0.01971, over 4811.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2079, pruned_loss=0.03015, over 971744.96 frames.], batch size: 26, lr: 1.54e-04 2022-05-08 06:16:46,141 INFO [train.py:715] (0/8) Epoch 14, batch 33100, loss[loss=0.123, simple_loss=0.197, pruned_loss=0.02453, over 4789.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2081, pruned_loss=0.03036, over 971659.68 frames.], batch size: 17, lr: 1.54e-04 2022-05-08 06:17:27,367 INFO [train.py:715] (0/8) Epoch 14, batch 33150, loss[loss=0.117, simple_loss=0.191, pruned_loss=0.02149, over 4902.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2075, pruned_loss=0.02997, over 971359.90 frames.], batch size: 17, lr: 1.54e-04 2022-05-08 06:18:07,433 INFO [train.py:715] (0/8) Epoch 14, batch 33200, loss[loss=0.142, simple_loss=0.2163, pruned_loss=0.03383, over 4915.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2083, pruned_loss=0.03024, over 971989.85 frames.], batch size: 18, lr: 1.54e-04 2022-05-08 06:18:47,771 INFO [train.py:715] (0/8) Epoch 14, batch 33250, loss[loss=0.1363, simple_loss=0.1968, pruned_loss=0.0379, over 4978.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2085, pruned_loss=0.03086, over 971930.91 frames.], batch size: 35, lr: 1.54e-04 2022-05-08 06:19:28,523 INFO [train.py:715] (0/8) Epoch 14, batch 33300, loss[loss=0.1297, simple_loss=0.203, pruned_loss=0.02815, over 4828.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2086, pruned_loss=0.03049, over 972134.16 frames.], batch size: 25, lr: 1.54e-04 2022-05-08 06:20:09,723 INFO [train.py:715] (0/8) Epoch 14, batch 33350, loss[loss=0.1107, simple_loss=0.1784, pruned_loss=0.02154, over 4812.00 frames.], tot_loss[loss=0.135, simple_loss=0.2087, pruned_loss=0.03065, over 971840.79 frames.], batch size: 25, lr: 1.54e-04 2022-05-08 06:20:49,894 INFO [train.py:715] (0/8) Epoch 14, batch 33400, loss[loss=0.1272, simple_loss=0.208, pruned_loss=0.02319, over 4741.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2095, pruned_loss=0.03073, over 972042.29 frames.], batch size: 16, lr: 1.54e-04 2022-05-08 06:21:30,272 INFO [train.py:715] (0/8) Epoch 14, batch 33450, loss[loss=0.1297, simple_loss=0.2012, pruned_loss=0.0291, over 4990.00 frames.], tot_loss[loss=0.136, simple_loss=0.21, pruned_loss=0.03099, over 972372.46 frames.], batch size: 16, lr: 1.54e-04 2022-05-08 06:22:11,507 INFO [train.py:715] (0/8) Epoch 14, batch 33500, loss[loss=0.1122, simple_loss=0.1797, pruned_loss=0.02234, over 4824.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2099, pruned_loss=0.03096, over 972803.40 frames.], batch size: 15, lr: 1.54e-04 2022-05-08 06:22:51,805 INFO [train.py:715] (0/8) Epoch 14, batch 33550, loss[loss=0.1507, simple_loss=0.2181, pruned_loss=0.0417, over 4914.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2095, pruned_loss=0.03088, over 972730.31 frames.], batch size: 18, lr: 1.54e-04 2022-05-08 06:23:33,001 INFO [train.py:715] (0/8) Epoch 14, batch 33600, loss[loss=0.1274, simple_loss=0.2109, pruned_loss=0.022, over 4785.00 frames.], tot_loss[loss=0.1351, simple_loss=0.209, pruned_loss=0.03056, over 972194.60 frames.], batch size: 14, lr: 1.54e-04 2022-05-08 06:24:14,048 INFO [train.py:715] (0/8) Epoch 14, batch 33650, loss[loss=0.142, simple_loss=0.2152, pruned_loss=0.03441, over 4836.00 frames.], tot_loss[loss=0.1348, simple_loss=0.209, pruned_loss=0.03027, over 972867.86 frames.], batch size: 32, lr: 1.54e-04 2022-05-08 06:24:54,969 INFO 
[train.py:715] (0/8) Epoch 14, batch 33700, loss[loss=0.1212, simple_loss=0.1921, pruned_loss=0.02519, over 4836.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2092, pruned_loss=0.03025, over 972909.44 frames.], batch size: 30, lr: 1.54e-04 2022-05-08 06:25:35,113 INFO [train.py:715] (0/8) Epoch 14, batch 33750, loss[loss=0.1624, simple_loss=0.2309, pruned_loss=0.04698, over 4800.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2096, pruned_loss=0.03049, over 973151.74 frames.], batch size: 18, lr: 1.54e-04 2022-05-08 06:26:15,682 INFO [train.py:715] (0/8) Epoch 14, batch 33800, loss[loss=0.1417, simple_loss=0.2034, pruned_loss=0.03996, over 4851.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2091, pruned_loss=0.03057, over 972722.18 frames.], batch size: 30, lr: 1.54e-04 2022-05-08 06:26:56,936 INFO [train.py:715] (0/8) Epoch 14, batch 33850, loss[loss=0.1228, simple_loss=0.1937, pruned_loss=0.026, over 4927.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2085, pruned_loss=0.03068, over 972521.21 frames.], batch size: 29, lr: 1.54e-04 2022-05-08 06:27:37,011 INFO [train.py:715] (0/8) Epoch 14, batch 33900, loss[loss=0.1216, simple_loss=0.1938, pruned_loss=0.02471, over 4861.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2084, pruned_loss=0.03115, over 972841.16 frames.], batch size: 30, lr: 1.54e-04 2022-05-08 06:28:17,551 INFO [train.py:715] (0/8) Epoch 14, batch 33950, loss[loss=0.1306, simple_loss=0.2089, pruned_loss=0.02616, over 4776.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2086, pruned_loss=0.03124, over 973056.15 frames.], batch size: 18, lr: 1.54e-04 2022-05-08 06:28:58,267 INFO [train.py:715] (0/8) Epoch 14, batch 34000, loss[loss=0.1453, simple_loss=0.2279, pruned_loss=0.03136, over 4761.00 frames.], tot_loss[loss=0.136, simple_loss=0.2091, pruned_loss=0.03147, over 972770.46 frames.], batch size: 14, lr: 1.54e-04 2022-05-08 06:29:39,253 INFO [train.py:715] (0/8) Epoch 14, batch 34050, loss[loss=0.108, simple_loss=0.1837, pruned_loss=0.01613, over 4945.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2094, pruned_loss=0.03118, over 972810.04 frames.], batch size: 23, lr: 1.54e-04 2022-05-08 06:30:19,208 INFO [train.py:715] (0/8) Epoch 14, batch 34100, loss[loss=0.1215, simple_loss=0.1923, pruned_loss=0.02531, over 4902.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2101, pruned_loss=0.03136, over 973544.73 frames.], batch size: 22, lr: 1.54e-04 2022-05-08 06:30:59,698 INFO [train.py:715] (0/8) Epoch 14, batch 34150, loss[loss=0.1076, simple_loss=0.1779, pruned_loss=0.01868, over 4848.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2093, pruned_loss=0.03097, over 973553.34 frames.], batch size: 34, lr: 1.54e-04 2022-05-08 06:31:40,130 INFO [train.py:715] (0/8) Epoch 14, batch 34200, loss[loss=0.1454, simple_loss=0.2326, pruned_loss=0.02908, over 4900.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2086, pruned_loss=0.03044, over 973196.95 frames.], batch size: 39, lr: 1.54e-04 2022-05-08 06:32:20,300 INFO [train.py:715] (0/8) Epoch 14, batch 34250, loss[loss=0.131, simple_loss=0.2092, pruned_loss=0.02636, over 4968.00 frames.], tot_loss[loss=0.134, simple_loss=0.2082, pruned_loss=0.02989, over 973835.70 frames.], batch size: 39, lr: 1.54e-04 2022-05-08 06:33:00,833 INFO [train.py:715] (0/8) Epoch 14, batch 34300, loss[loss=0.1794, simple_loss=0.2388, pruned_loss=0.05998, over 4985.00 frames.], tot_loss[loss=0.1342, simple_loss=0.208, pruned_loss=0.03016, over 974017.42 frames.], batch size: 15, lr: 1.54e-04 2022-05-08 06:33:41,485 INFO [train.py:715] (0/8) 
Epoch 14, batch 34350, loss[loss=0.1447, simple_loss=0.2061, pruned_loss=0.04168, over 4982.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2081, pruned_loss=0.03034, over 974580.81 frames.], batch size: 31, lr: 1.54e-04 2022-05-08 06:34:22,181 INFO [train.py:715] (0/8) Epoch 14, batch 34400, loss[loss=0.1553, simple_loss=0.2247, pruned_loss=0.04295, over 4845.00 frames.], tot_loss[loss=0.135, simple_loss=0.2087, pruned_loss=0.0307, over 974434.13 frames.], batch size: 20, lr: 1.54e-04 2022-05-08 06:35:01,769 INFO [train.py:715] (0/8) Epoch 14, batch 34450, loss[loss=0.1173, simple_loss=0.188, pruned_loss=0.0233, over 4786.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2083, pruned_loss=0.03004, over 974473.38 frames.], batch size: 14, lr: 1.54e-04 2022-05-08 06:35:42,625 INFO [train.py:715] (0/8) Epoch 14, batch 34500, loss[loss=0.1312, simple_loss=0.2028, pruned_loss=0.02976, over 4736.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2083, pruned_loss=0.03003, over 973908.37 frames.], batch size: 16, lr: 1.54e-04 2022-05-08 06:36:23,321 INFO [train.py:715] (0/8) Epoch 14, batch 34550, loss[loss=0.1337, simple_loss=0.2111, pruned_loss=0.02816, over 4971.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2081, pruned_loss=0.02982, over 973865.66 frames.], batch size: 15, lr: 1.54e-04 2022-05-08 06:37:03,433 INFO [train.py:715] (0/8) Epoch 14, batch 34600, loss[loss=0.1345, simple_loss=0.2053, pruned_loss=0.03182, over 4831.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2084, pruned_loss=0.0299, over 973203.79 frames.], batch size: 30, lr: 1.54e-04 2022-05-08 06:37:43,652 INFO [train.py:715] (0/8) Epoch 14, batch 34650, loss[loss=0.149, simple_loss=0.2203, pruned_loss=0.03887, over 4911.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2085, pruned_loss=0.02982, over 972333.17 frames.], batch size: 18, lr: 1.54e-04 2022-05-08 06:38:24,394 INFO [train.py:715] (0/8) Epoch 14, batch 34700, loss[loss=0.1296, simple_loss=0.2076, pruned_loss=0.02584, over 4980.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2083, pruned_loss=0.03042, over 972310.60 frames.], batch size: 24, lr: 1.54e-04 2022-05-08 06:39:03,272 INFO [train.py:715] (0/8) Epoch 14, batch 34750, loss[loss=0.1183, simple_loss=0.1952, pruned_loss=0.02076, over 4867.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2088, pruned_loss=0.03069, over 972540.64 frames.], batch size: 20, lr: 1.54e-04 2022-05-08 06:39:40,043 INFO [train.py:715] (0/8) Epoch 14, batch 34800, loss[loss=0.1388, simple_loss=0.2228, pruned_loss=0.02733, over 4925.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2084, pruned_loss=0.03027, over 971735.52 frames.], batch size: 18, lr: 1.54e-04 2022-05-08 06:39:48,794 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/epoch-14.pt 2022-05-08 06:40:33,609 INFO [train.py:715] (0/8) Epoch 15, batch 0, loss[loss=0.1222, simple_loss=0.2026, pruned_loss=0.02094, over 4818.00 frames.], tot_loss[loss=0.1222, simple_loss=0.2026, pruned_loss=0.02094, over 4818.00 frames.], batch size: 26, lr: 1.49e-04 2022-05-08 06:41:12,922 INFO [train.py:715] (0/8) Epoch 15, batch 50, loss[loss=0.1202, simple_loss=0.1922, pruned_loss=0.02412, over 4882.00 frames.], tot_loss[loss=0.1365, simple_loss=0.2091, pruned_loss=0.03192, over 219492.10 frames.], batch size: 17, lr: 1.49e-04 2022-05-08 06:41:54,166 INFO [train.py:715] (0/8) Epoch 15, batch 100, loss[loss=0.1121, simple_loss=0.188, pruned_loss=0.01805, over 4905.00 frames.], tot_loss[loss=0.1368, simple_loss=0.2099, pruned_loss=0.03191, over 386759.21 
frames.], batch size: 17, lr: 1.49e-04 2022-05-08 06:42:35,661 INFO [train.py:715] (0/8) Epoch 15, batch 150, loss[loss=0.1237, simple_loss=0.1928, pruned_loss=0.02734, over 4796.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2076, pruned_loss=0.0301, over 516830.45 frames.], batch size: 14, lr: 1.49e-04 2022-05-08 06:43:15,917 INFO [train.py:715] (0/8) Epoch 15, batch 200, loss[loss=0.1436, simple_loss=0.2147, pruned_loss=0.03624, over 4966.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2085, pruned_loss=0.03116, over 617763.65 frames.], batch size: 15, lr: 1.49e-04 2022-05-08 06:43:56,383 INFO [train.py:715] (0/8) Epoch 15, batch 250, loss[loss=0.1318, simple_loss=0.2115, pruned_loss=0.02604, over 4924.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2077, pruned_loss=0.03042, over 696084.42 frames.], batch size: 23, lr: 1.49e-04 2022-05-08 06:44:37,765 INFO [train.py:715] (0/8) Epoch 15, batch 300, loss[loss=0.1134, simple_loss=0.1831, pruned_loss=0.02184, over 4812.00 frames.], tot_loss[loss=0.134, simple_loss=0.2075, pruned_loss=0.0303, over 757045.37 frames.], batch size: 12, lr: 1.49e-04 2022-05-08 06:45:18,786 INFO [train.py:715] (0/8) Epoch 15, batch 350, loss[loss=0.1358, simple_loss=0.2146, pruned_loss=0.02845, over 4786.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2074, pruned_loss=0.0304, over 803746.96 frames.], batch size: 17, lr: 1.49e-04 2022-05-08 06:45:58,472 INFO [train.py:715] (0/8) Epoch 15, batch 400, loss[loss=0.1293, simple_loss=0.206, pruned_loss=0.02627, over 4814.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2078, pruned_loss=0.03053, over 840460.11 frames.], batch size: 25, lr: 1.49e-04 2022-05-08 06:46:39,365 INFO [train.py:715] (0/8) Epoch 15, batch 450, loss[loss=0.1319, simple_loss=0.2083, pruned_loss=0.02775, over 4952.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2083, pruned_loss=0.03059, over 869852.47 frames.], batch size: 29, lr: 1.49e-04 2022-05-08 06:47:20,100 INFO [train.py:715] (0/8) Epoch 15, batch 500, loss[loss=0.1354, simple_loss=0.1999, pruned_loss=0.03548, over 4796.00 frames.], tot_loss[loss=0.1343, simple_loss=0.208, pruned_loss=0.03025, over 892991.07 frames.], batch size: 14, lr: 1.49e-04 2022-05-08 06:48:00,513 INFO [train.py:715] (0/8) Epoch 15, batch 550, loss[loss=0.1215, simple_loss=0.1983, pruned_loss=0.02239, over 4824.00 frames.], tot_loss[loss=0.1345, simple_loss=0.208, pruned_loss=0.0305, over 910379.92 frames.], batch size: 13, lr: 1.49e-04 2022-05-08 06:48:40,064 INFO [train.py:715] (0/8) Epoch 15, batch 600, loss[loss=0.1367, simple_loss=0.2112, pruned_loss=0.03108, over 4817.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2085, pruned_loss=0.03025, over 923950.73 frames.], batch size: 27, lr: 1.49e-04 2022-05-08 06:49:21,140 INFO [train.py:715] (0/8) Epoch 15, batch 650, loss[loss=0.1364, simple_loss=0.2063, pruned_loss=0.03326, over 4960.00 frames.], tot_loss[loss=0.136, simple_loss=0.2099, pruned_loss=0.03099, over 934878.46 frames.], batch size: 35, lr: 1.49e-04 2022-05-08 06:50:01,507 INFO [train.py:715] (0/8) Epoch 15, batch 700, loss[loss=0.142, simple_loss=0.2153, pruned_loss=0.03439, over 4801.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2098, pruned_loss=0.03084, over 943641.61 frames.], batch size: 14, lr: 1.49e-04 2022-05-08 06:50:41,536 INFO [train.py:715] (0/8) Epoch 15, batch 750, loss[loss=0.1403, simple_loss=0.2179, pruned_loss=0.03132, over 4866.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2104, pruned_loss=0.03074, over 950531.25 frames.], batch size: 30, lr: 1.49e-04 2022-05-08 
06:51:22,002 INFO [train.py:715] (0/8) Epoch 15, batch 800, loss[loss=0.1135, simple_loss=0.1813, pruned_loss=0.02283, over 4858.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2094, pruned_loss=0.03045, over 955498.37 frames.], batch size: 20, lr: 1.49e-04 2022-05-08 06:52:02,782 INFO [train.py:715] (0/8) Epoch 15, batch 850, loss[loss=0.1115, simple_loss=0.1914, pruned_loss=0.01577, over 4814.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2085, pruned_loss=0.03008, over 959497.10 frames.], batch size: 26, lr: 1.49e-04 2022-05-08 06:52:43,860 INFO [train.py:715] (0/8) Epoch 15, batch 900, loss[loss=0.1456, simple_loss=0.224, pruned_loss=0.03359, over 4934.00 frames.], tot_loss[loss=0.134, simple_loss=0.2078, pruned_loss=0.03013, over 962607.95 frames.], batch size: 39, lr: 1.49e-04 2022-05-08 06:53:23,524 INFO [train.py:715] (0/8) Epoch 15, batch 950, loss[loss=0.1195, simple_loss=0.1837, pruned_loss=0.0276, over 4736.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2077, pruned_loss=0.03041, over 963904.53 frames.], batch size: 12, lr: 1.49e-04 2022-05-08 06:54:04,072 INFO [train.py:715] (0/8) Epoch 15, batch 1000, loss[loss=0.1315, simple_loss=0.2077, pruned_loss=0.02767, over 4762.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2088, pruned_loss=0.03122, over 965626.05 frames.], batch size: 19, lr: 1.49e-04 2022-05-08 06:54:44,298 INFO [train.py:715] (0/8) Epoch 15, batch 1050, loss[loss=0.1347, simple_loss=0.2021, pruned_loss=0.03366, over 4787.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2092, pruned_loss=0.03105, over 967425.86 frames.], batch size: 14, lr: 1.49e-04 2022-05-08 06:55:23,575 INFO [train.py:715] (0/8) Epoch 15, batch 1100, loss[loss=0.1182, simple_loss=0.1957, pruned_loss=0.02032, over 4828.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2091, pruned_loss=0.0307, over 968434.99 frames.], batch size: 27, lr: 1.49e-04 2022-05-08 06:56:04,684 INFO [train.py:715] (0/8) Epoch 15, batch 1150, loss[loss=0.1292, simple_loss=0.2032, pruned_loss=0.02762, over 4834.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2097, pruned_loss=0.03067, over 969392.91 frames.], batch size: 32, lr: 1.49e-04 2022-05-08 06:56:45,825 INFO [train.py:715] (0/8) Epoch 15, batch 1200, loss[loss=0.1475, simple_loss=0.2168, pruned_loss=0.03914, over 4913.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2096, pruned_loss=0.03083, over 970926.55 frames.], batch size: 18, lr: 1.49e-04 2022-05-08 06:57:26,535 INFO [train.py:715] (0/8) Epoch 15, batch 1250, loss[loss=0.1491, simple_loss=0.2212, pruned_loss=0.03847, over 4972.00 frames.], tot_loss[loss=0.1348, simple_loss=0.209, pruned_loss=0.03028, over 971417.02 frames.], batch size: 25, lr: 1.49e-04 2022-05-08 06:58:06,001 INFO [train.py:715] (0/8) Epoch 15, batch 1300, loss[loss=0.135, simple_loss=0.208, pruned_loss=0.03105, over 4969.00 frames.], tot_loss[loss=0.134, simple_loss=0.2082, pruned_loss=0.0299, over 971933.58 frames.], batch size: 35, lr: 1.49e-04 2022-05-08 06:58:46,682 INFO [train.py:715] (0/8) Epoch 15, batch 1350, loss[loss=0.1676, simple_loss=0.2409, pruned_loss=0.0471, over 4919.00 frames.], tot_loss[loss=0.135, simple_loss=0.2088, pruned_loss=0.03057, over 972171.05 frames.], batch size: 23, lr: 1.49e-04 2022-05-08 06:59:27,344 INFO [train.py:715] (0/8) Epoch 15, batch 1400, loss[loss=0.1327, simple_loss=0.2047, pruned_loss=0.03035, over 4876.00 frames.], tot_loss[loss=0.135, simple_loss=0.2088, pruned_loss=0.03062, over 972552.29 frames.], batch size: 16, lr: 1.49e-04 2022-05-08 07:00:07,533 INFO [train.py:715] (0/8) Epoch 
15, batch 1450, loss[loss=0.124, simple_loss=0.1965, pruned_loss=0.02576, over 4924.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2086, pruned_loss=0.03061, over 971797.52 frames.], batch size: 18, lr: 1.49e-04 2022-05-08 07:00:47,338 INFO [train.py:715] (0/8) Epoch 15, batch 1500, loss[loss=0.1212, simple_loss=0.1903, pruned_loss=0.02606, over 4958.00 frames.], tot_loss[loss=0.1352, simple_loss=0.209, pruned_loss=0.0307, over 972748.31 frames.], batch size: 14, lr: 1.49e-04 2022-05-08 07:01:28,514 INFO [train.py:715] (0/8) Epoch 15, batch 1550, loss[loss=0.1655, simple_loss=0.2377, pruned_loss=0.04668, over 4798.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2091, pruned_loss=0.03058, over 972282.46 frames.], batch size: 24, lr: 1.49e-04 2022-05-08 07:02:08,732 INFO [train.py:715] (0/8) Epoch 15, batch 1600, loss[loss=0.1274, simple_loss=0.2015, pruned_loss=0.02662, over 4775.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2091, pruned_loss=0.03039, over 972339.18 frames.], batch size: 17, lr: 1.49e-04 2022-05-08 07:02:47,771 INFO [train.py:715] (0/8) Epoch 15, batch 1650, loss[loss=0.1231, simple_loss=0.1903, pruned_loss=0.02799, over 4829.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2086, pruned_loss=0.03043, over 972419.64 frames.], batch size: 13, lr: 1.49e-04 2022-05-08 07:03:28,302 INFO [train.py:715] (0/8) Epoch 15, batch 1700, loss[loss=0.1537, simple_loss=0.2267, pruned_loss=0.04031, over 4908.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2086, pruned_loss=0.03083, over 972853.24 frames.], batch size: 17, lr: 1.49e-04 2022-05-08 07:04:08,883 INFO [train.py:715] (0/8) Epoch 15, batch 1750, loss[loss=0.1289, simple_loss=0.1967, pruned_loss=0.03052, over 4986.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2085, pruned_loss=0.03058, over 973292.67 frames.], batch size: 26, lr: 1.49e-04 2022-05-08 07:04:48,962 INFO [train.py:715] (0/8) Epoch 15, batch 1800, loss[loss=0.1337, simple_loss=0.2147, pruned_loss=0.02636, over 4879.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2088, pruned_loss=0.03037, over 972846.75 frames.], batch size: 22, lr: 1.49e-04 2022-05-08 07:05:28,944 INFO [train.py:715] (0/8) Epoch 15, batch 1850, loss[loss=0.1344, simple_loss=0.2048, pruned_loss=0.03199, over 4927.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2093, pruned_loss=0.03071, over 973313.10 frames.], batch size: 18, lr: 1.49e-04 2022-05-08 07:06:09,781 INFO [train.py:715] (0/8) Epoch 15, batch 1900, loss[loss=0.1791, simple_loss=0.2465, pruned_loss=0.05584, over 4874.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2088, pruned_loss=0.03025, over 973036.49 frames.], batch size: 16, lr: 1.49e-04 2022-05-08 07:06:50,232 INFO [train.py:715] (0/8) Epoch 15, batch 1950, loss[loss=0.1471, simple_loss=0.2257, pruned_loss=0.03431, over 4895.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2077, pruned_loss=0.0297, over 973498.71 frames.], batch size: 19, lr: 1.49e-04 2022-05-08 07:07:29,409 INFO [train.py:715] (0/8) Epoch 15, batch 2000, loss[loss=0.1302, simple_loss=0.2061, pruned_loss=0.02711, over 4911.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2074, pruned_loss=0.02937, over 972965.00 frames.], batch size: 17, lr: 1.49e-04 2022-05-08 07:08:10,501 INFO [train.py:715] (0/8) Epoch 15, batch 2050, loss[loss=0.1286, simple_loss=0.2013, pruned_loss=0.02795, over 4959.00 frames.], tot_loss[loss=0.1339, simple_loss=0.208, pruned_loss=0.02992, over 972069.45 frames.], batch size: 24, lr: 1.49e-04 2022-05-08 07:08:50,814 INFO [train.py:715] (0/8) Epoch 15, batch 2100, loss[loss=0.1256, 
simple_loss=0.2048, pruned_loss=0.02317, over 4976.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2074, pruned_loss=0.02999, over 972911.58 frames.], batch size: 15, lr: 1.49e-04 2022-05-08 07:09:30,721 INFO [train.py:715] (0/8) Epoch 15, batch 2150, loss[loss=0.1721, simple_loss=0.2341, pruned_loss=0.05507, over 4685.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2079, pruned_loss=0.03025, over 972164.77 frames.], batch size: 15, lr: 1.49e-04 2022-05-08 07:10:10,976 INFO [train.py:715] (0/8) Epoch 15, batch 2200, loss[loss=0.1349, simple_loss=0.2172, pruned_loss=0.02628, over 4976.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2086, pruned_loss=0.03056, over 972164.10 frames.], batch size: 28, lr: 1.49e-04 2022-05-08 07:10:51,403 INFO [train.py:715] (0/8) Epoch 15, batch 2250, loss[loss=0.1126, simple_loss=0.1895, pruned_loss=0.01788, over 4929.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2075, pruned_loss=0.03036, over 971689.40 frames.], batch size: 18, lr: 1.49e-04 2022-05-08 07:11:31,547 INFO [train.py:715] (0/8) Epoch 15, batch 2300, loss[loss=0.1698, simple_loss=0.2473, pruned_loss=0.04614, over 4961.00 frames.], tot_loss[loss=0.1336, simple_loss=0.207, pruned_loss=0.03012, over 972395.77 frames.], batch size: 24, lr: 1.49e-04 2022-05-08 07:12:11,042 INFO [train.py:715] (0/8) Epoch 15, batch 2350, loss[loss=0.1066, simple_loss=0.1839, pruned_loss=0.0147, over 4937.00 frames.], tot_loss[loss=0.134, simple_loss=0.2078, pruned_loss=0.03012, over 973255.24 frames.], batch size: 21, lr: 1.49e-04 2022-05-08 07:12:51,315 INFO [train.py:715] (0/8) Epoch 15, batch 2400, loss[loss=0.1551, simple_loss=0.2146, pruned_loss=0.04775, over 4838.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2086, pruned_loss=0.03121, over 972832.26 frames.], batch size: 32, lr: 1.49e-04 2022-05-08 07:13:31,565 INFO [train.py:715] (0/8) Epoch 15, batch 2450, loss[loss=0.1219, simple_loss=0.1969, pruned_loss=0.02346, over 4903.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2087, pruned_loss=0.03072, over 971659.29 frames.], batch size: 23, lr: 1.49e-04 2022-05-08 07:14:11,488 INFO [train.py:715] (0/8) Epoch 15, batch 2500, loss[loss=0.1034, simple_loss=0.1731, pruned_loss=0.01691, over 4857.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2082, pruned_loss=0.03025, over 971357.43 frames.], batch size: 13, lr: 1.49e-04 2022-05-08 07:14:50,604 INFO [train.py:715] (0/8) Epoch 15, batch 2550, loss[loss=0.1324, simple_loss=0.2194, pruned_loss=0.02267, over 4821.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2081, pruned_loss=0.03005, over 971044.15 frames.], batch size: 26, lr: 1.49e-04 2022-05-08 07:15:31,415 INFO [train.py:715] (0/8) Epoch 15, batch 2600, loss[loss=0.1645, simple_loss=0.2422, pruned_loss=0.04338, over 4771.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2076, pruned_loss=0.03002, over 970675.76 frames.], batch size: 18, lr: 1.49e-04 2022-05-08 07:16:12,101 INFO [train.py:715] (0/8) Epoch 15, batch 2650, loss[loss=0.1177, simple_loss=0.1833, pruned_loss=0.02605, over 4698.00 frames.], tot_loss[loss=0.1341, simple_loss=0.208, pruned_loss=0.03012, over 971188.18 frames.], batch size: 15, lr: 1.49e-04 2022-05-08 07:16:51,592 INFO [train.py:715] (0/8) Epoch 15, batch 2700, loss[loss=0.1351, simple_loss=0.2093, pruned_loss=0.03049, over 4684.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2076, pruned_loss=0.02968, over 971481.50 frames.], batch size: 15, lr: 1.49e-04 2022-05-08 07:17:33,117 INFO [train.py:715] (0/8) Epoch 15, batch 2750, loss[loss=0.1344, simple_loss=0.2182, 
pruned_loss=0.0253, over 4894.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2075, pruned_loss=0.0296, over 972199.39 frames.], batch size: 22, lr: 1.49e-04 2022-05-08 07:18:14,186 INFO [train.py:715] (0/8) Epoch 15, batch 2800, loss[loss=0.1349, simple_loss=0.2206, pruned_loss=0.02467, over 4785.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2075, pruned_loss=0.02973, over 972644.34 frames.], batch size: 21, lr: 1.49e-04 2022-05-08 07:18:54,886 INFO [train.py:715] (0/8) Epoch 15, batch 2850, loss[loss=0.1286, simple_loss=0.2028, pruned_loss=0.02714, over 4978.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2073, pruned_loss=0.02961, over 972610.12 frames.], batch size: 15, lr: 1.49e-04 2022-05-08 07:19:34,217 INFO [train.py:715] (0/8) Epoch 15, batch 2900, loss[loss=0.1413, simple_loss=0.2157, pruned_loss=0.0335, over 4930.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2077, pruned_loss=0.02969, over 972594.26 frames.], batch size: 17, lr: 1.49e-04 2022-05-08 07:20:14,827 INFO [train.py:715] (0/8) Epoch 15, batch 2950, loss[loss=0.14, simple_loss=0.2052, pruned_loss=0.03742, over 4720.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2075, pruned_loss=0.0301, over 972260.09 frames.], batch size: 15, lr: 1.49e-04 2022-05-08 07:20:55,620 INFO [train.py:715] (0/8) Epoch 15, batch 3000, loss[loss=0.1427, simple_loss=0.2207, pruned_loss=0.03229, over 4850.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2085, pruned_loss=0.03029, over 972119.24 frames.], batch size: 13, lr: 1.49e-04 2022-05-08 07:20:55,621 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 07:21:13,097 INFO [train.py:742] (0/8) Epoch 15, validation: loss=0.1049, simple_loss=0.1887, pruned_loss=0.01057, over 914524.00 frames. 2022-05-08 07:21:54,025 INFO [train.py:715] (0/8) Epoch 15, batch 3050, loss[loss=0.1383, simple_loss=0.2244, pruned_loss=0.02606, over 4895.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2089, pruned_loss=0.03026, over 971549.01 frames.], batch size: 22, lr: 1.49e-04 2022-05-08 07:22:33,940 INFO [train.py:715] (0/8) Epoch 15, batch 3100, loss[loss=0.1092, simple_loss=0.1832, pruned_loss=0.01762, over 4931.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2092, pruned_loss=0.03068, over 971100.99 frames.], batch size: 29, lr: 1.49e-04 2022-05-08 07:23:14,656 INFO [train.py:715] (0/8) Epoch 15, batch 3150, loss[loss=0.1378, simple_loss=0.2209, pruned_loss=0.0274, over 4808.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2089, pruned_loss=0.03076, over 972449.97 frames.], batch size: 21, lr: 1.49e-04 2022-05-08 07:23:55,203 INFO [train.py:715] (0/8) Epoch 15, batch 3200, loss[loss=0.1476, simple_loss=0.2069, pruned_loss=0.04413, over 4813.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2085, pruned_loss=0.03018, over 972178.12 frames.], batch size: 13, lr: 1.49e-04 2022-05-08 07:24:35,381 INFO [train.py:715] (0/8) Epoch 15, batch 3250, loss[loss=0.1348, simple_loss=0.2155, pruned_loss=0.02706, over 4939.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2083, pruned_loss=0.03013, over 972550.56 frames.], batch size: 18, lr: 1.49e-04 2022-05-08 07:25:15,327 INFO [train.py:715] (0/8) Epoch 15, batch 3300, loss[loss=0.1333, simple_loss=0.1933, pruned_loss=0.03665, over 4964.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2087, pruned_loss=0.03044, over 972859.97 frames.], batch size: 15, lr: 1.49e-04 2022-05-08 07:25:56,106 INFO [train.py:715] (0/8) Epoch 15, batch 3350, loss[loss=0.1088, simple_loss=0.1815, pruned_loss=0.01803, over 4884.00 frames.], tot_loss[loss=0.1341, 
simple_loss=0.2083, pruned_loss=0.02999, over 973686.01 frames.], batch size: 22, lr: 1.49e-04 2022-05-08 07:26:36,435 INFO [train.py:715] (0/8) Epoch 15, batch 3400, loss[loss=0.113, simple_loss=0.1878, pruned_loss=0.01911, over 4766.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2078, pruned_loss=0.02987, over 973779.85 frames.], batch size: 12, lr: 1.49e-04 2022-05-08 07:27:16,660 INFO [train.py:715] (0/8) Epoch 15, batch 3450, loss[loss=0.1274, simple_loss=0.1975, pruned_loss=0.02864, over 4776.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2081, pruned_loss=0.03009, over 974037.95 frames.], batch size: 18, lr: 1.49e-04 2022-05-08 07:27:56,928 INFO [train.py:715] (0/8) Epoch 15, batch 3500, loss[loss=0.1243, simple_loss=0.2076, pruned_loss=0.02046, over 4823.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2078, pruned_loss=0.02984, over 973806.09 frames.], batch size: 27, lr: 1.49e-04 2022-05-08 07:28:37,335 INFO [train.py:715] (0/8) Epoch 15, batch 3550, loss[loss=0.1366, simple_loss=0.21, pruned_loss=0.03165, over 4773.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2076, pruned_loss=0.03009, over 973320.76 frames.], batch size: 14, lr: 1.49e-04 2022-05-08 07:29:17,844 INFO [train.py:715] (0/8) Epoch 15, batch 3600, loss[loss=0.1309, simple_loss=0.214, pruned_loss=0.02384, over 4703.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2082, pruned_loss=0.0301, over 972787.12 frames.], batch size: 15, lr: 1.49e-04 2022-05-08 07:29:57,634 INFO [train.py:715] (0/8) Epoch 15, batch 3650, loss[loss=0.1465, simple_loss=0.2194, pruned_loss=0.03687, over 4882.00 frames.], tot_loss[loss=0.1343, simple_loss=0.208, pruned_loss=0.03026, over 973436.60 frames.], batch size: 39, lr: 1.48e-04 2022-05-08 07:30:38,279 INFO [train.py:715] (0/8) Epoch 15, batch 3700, loss[loss=0.1509, simple_loss=0.2218, pruned_loss=0.04003, over 4754.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2084, pruned_loss=0.03041, over 973101.13 frames.], batch size: 16, lr: 1.48e-04 2022-05-08 07:31:19,133 INFO [train.py:715] (0/8) Epoch 15, batch 3750, loss[loss=0.1414, simple_loss=0.2233, pruned_loss=0.02971, over 4765.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2079, pruned_loss=0.02992, over 973430.50 frames.], batch size: 14, lr: 1.48e-04 2022-05-08 07:31:58,795 INFO [train.py:715] (0/8) Epoch 15, batch 3800, loss[loss=0.1227, simple_loss=0.1873, pruned_loss=0.02901, over 4891.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2082, pruned_loss=0.03051, over 973619.32 frames.], batch size: 16, lr: 1.48e-04 2022-05-08 07:32:38,798 INFO [train.py:715] (0/8) Epoch 15, batch 3850, loss[loss=0.1598, simple_loss=0.2291, pruned_loss=0.04528, over 4843.00 frames.], tot_loss[loss=0.134, simple_loss=0.2078, pruned_loss=0.03005, over 972853.49 frames.], batch size: 30, lr: 1.48e-04 2022-05-08 07:33:19,082 INFO [train.py:715] (0/8) Epoch 15, batch 3900, loss[loss=0.1321, simple_loss=0.2004, pruned_loss=0.03184, over 4791.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2072, pruned_loss=0.0297, over 972751.38 frames.], batch size: 17, lr: 1.48e-04 2022-05-08 07:33:58,267 INFO [train.py:715] (0/8) Epoch 15, batch 3950, loss[loss=0.1311, simple_loss=0.2075, pruned_loss=0.02733, over 4840.00 frames.], tot_loss[loss=0.1342, simple_loss=0.208, pruned_loss=0.03015, over 972317.81 frames.], batch size: 30, lr: 1.48e-04 2022-05-08 07:34:37,987 INFO [train.py:715] (0/8) Epoch 15, batch 4000, loss[loss=0.1309, simple_loss=0.2077, pruned_loss=0.02706, over 4871.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2073, pruned_loss=0.02986, 
over 971406.74 frames.], batch size: 20, lr: 1.48e-04 2022-05-08 07:35:17,768 INFO [train.py:715] (0/8) Epoch 15, batch 4050, loss[loss=0.1446, simple_loss=0.2215, pruned_loss=0.03386, over 4870.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2075, pruned_loss=0.02953, over 971829.36 frames.], batch size: 16, lr: 1.48e-04 2022-05-08 07:35:58,778 INFO [train.py:715] (0/8) Epoch 15, batch 4100, loss[loss=0.1096, simple_loss=0.188, pruned_loss=0.01558, over 4900.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2075, pruned_loss=0.02966, over 972224.19 frames.], batch size: 19, lr: 1.48e-04 2022-05-08 07:36:37,608 INFO [train.py:715] (0/8) Epoch 15, batch 4150, loss[loss=0.1093, simple_loss=0.1877, pruned_loss=0.01549, over 4950.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2074, pruned_loss=0.02978, over 971508.20 frames.], batch size: 23, lr: 1.48e-04 2022-05-08 07:37:17,774 INFO [train.py:715] (0/8) Epoch 15, batch 4200, loss[loss=0.1367, simple_loss=0.2197, pruned_loss=0.0269, over 4946.00 frames.], tot_loss[loss=0.134, simple_loss=0.2076, pruned_loss=0.03017, over 971140.25 frames.], batch size: 21, lr: 1.48e-04 2022-05-08 07:37:58,200 INFO [train.py:715] (0/8) Epoch 15, batch 4250, loss[loss=0.1083, simple_loss=0.1841, pruned_loss=0.01623, over 4925.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2085, pruned_loss=0.03034, over 971329.00 frames.], batch size: 23, lr: 1.48e-04 2022-05-08 07:38:38,200 INFO [train.py:715] (0/8) Epoch 15, batch 4300, loss[loss=0.1239, simple_loss=0.2077, pruned_loss=0.02007, over 4967.00 frames.], tot_loss[loss=0.135, simple_loss=0.2089, pruned_loss=0.0305, over 971663.85 frames.], batch size: 24, lr: 1.48e-04 2022-05-08 07:39:18,221 INFO [train.py:715] (0/8) Epoch 15, batch 4350, loss[loss=0.1277, simple_loss=0.2001, pruned_loss=0.02762, over 4797.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2089, pruned_loss=0.03044, over 971818.97 frames.], batch size: 14, lr: 1.48e-04 2022-05-08 07:39:58,267 INFO [train.py:715] (0/8) Epoch 15, batch 4400, loss[loss=0.146, simple_loss=0.2162, pruned_loss=0.03789, over 4963.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2088, pruned_loss=0.03029, over 971973.84 frames.], batch size: 14, lr: 1.48e-04 2022-05-08 07:40:38,800 INFO [train.py:715] (0/8) Epoch 15, batch 4450, loss[loss=0.1408, simple_loss=0.22, pruned_loss=0.03077, over 4870.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2083, pruned_loss=0.02992, over 971799.77 frames.], batch size: 22, lr: 1.48e-04 2022-05-08 07:41:18,467 INFO [train.py:715] (0/8) Epoch 15, batch 4500, loss[loss=0.1213, simple_loss=0.1946, pruned_loss=0.02396, over 4900.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2079, pruned_loss=0.02966, over 972707.27 frames.], batch size: 19, lr: 1.48e-04 2022-05-08 07:41:58,876 INFO [train.py:715] (0/8) Epoch 15, batch 4550, loss[loss=0.1613, simple_loss=0.2282, pruned_loss=0.04716, over 4964.00 frames.], tot_loss[loss=0.134, simple_loss=0.2083, pruned_loss=0.02989, over 972359.28 frames.], batch size: 15, lr: 1.48e-04 2022-05-08 07:42:39,513 INFO [train.py:715] (0/8) Epoch 15, batch 4600, loss[loss=0.1314, simple_loss=0.2038, pruned_loss=0.02956, over 4784.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2088, pruned_loss=0.02975, over 972375.31 frames.], batch size: 14, lr: 1.48e-04 2022-05-08 07:43:19,665 INFO [train.py:715] (0/8) Epoch 15, batch 4650, loss[loss=0.1214, simple_loss=0.1999, pruned_loss=0.02144, over 4798.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2087, pruned_loss=0.02956, over 972253.04 frames.], batch size: 24, 
lr: 1.48e-04 2022-05-08 07:43:59,056 INFO [train.py:715] (0/8) Epoch 15, batch 4700, loss[loss=0.1114, simple_loss=0.194, pruned_loss=0.01438, over 4947.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2088, pruned_loss=0.02972, over 972138.77 frames.], batch size: 24, lr: 1.48e-04 2022-05-08 07:44:39,327 INFO [train.py:715] (0/8) Epoch 15, batch 4750, loss[loss=0.1098, simple_loss=0.1785, pruned_loss=0.02056, over 4833.00 frames.], tot_loss[loss=0.134, simple_loss=0.2087, pruned_loss=0.02962, over 972230.46 frames.], batch size: 12, lr: 1.48e-04 2022-05-08 07:45:20,566 INFO [train.py:715] (0/8) Epoch 15, batch 4800, loss[loss=0.1276, simple_loss=0.1949, pruned_loss=0.03015, over 4891.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2074, pruned_loss=0.0291, over 972471.06 frames.], batch size: 32, lr: 1.48e-04 2022-05-08 07:46:00,531 INFO [train.py:715] (0/8) Epoch 15, batch 4850, loss[loss=0.1397, simple_loss=0.214, pruned_loss=0.03267, over 4865.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2067, pruned_loss=0.02913, over 973055.23 frames.], batch size: 20, lr: 1.48e-04 2022-05-08 07:46:41,238 INFO [train.py:715] (0/8) Epoch 15, batch 4900, loss[loss=0.1442, simple_loss=0.2177, pruned_loss=0.03531, over 4840.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2072, pruned_loss=0.02919, over 973287.05 frames.], batch size: 32, lr: 1.48e-04 2022-05-08 07:47:21,679 INFO [train.py:715] (0/8) Epoch 15, batch 4950, loss[loss=0.1299, simple_loss=0.1915, pruned_loss=0.03418, over 4833.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2074, pruned_loss=0.02969, over 973717.28 frames.], batch size: 12, lr: 1.48e-04 2022-05-08 07:48:02,260 INFO [train.py:715] (0/8) Epoch 15, batch 5000, loss[loss=0.1533, simple_loss=0.2288, pruned_loss=0.03893, over 4955.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2073, pruned_loss=0.02954, over 973011.56 frames.], batch size: 24, lr: 1.48e-04 2022-05-08 07:48:41,750 INFO [train.py:715] (0/8) Epoch 15, batch 5050, loss[loss=0.1553, simple_loss=0.2269, pruned_loss=0.04187, over 4874.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2085, pruned_loss=0.03024, over 972616.89 frames.], batch size: 16, lr: 1.48e-04 2022-05-08 07:49:21,836 INFO [train.py:715] (0/8) Epoch 15, batch 5100, loss[loss=0.1168, simple_loss=0.1888, pruned_loss=0.02236, over 4754.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2081, pruned_loss=0.03024, over 972870.09 frames.], batch size: 16, lr: 1.48e-04 2022-05-08 07:50:02,141 INFO [train.py:715] (0/8) Epoch 15, batch 5150, loss[loss=0.1588, simple_loss=0.2323, pruned_loss=0.04269, over 4849.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2084, pruned_loss=0.03058, over 972191.84 frames.], batch size: 26, lr: 1.48e-04 2022-05-08 07:50:42,059 INFO [train.py:715] (0/8) Epoch 15, batch 5200, loss[loss=0.1361, simple_loss=0.2087, pruned_loss=0.03175, over 4777.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2093, pruned_loss=0.03096, over 971910.15 frames.], batch size: 17, lr: 1.48e-04 2022-05-08 07:51:22,077 INFO [train.py:715] (0/8) Epoch 15, batch 5250, loss[loss=0.1183, simple_loss=0.201, pruned_loss=0.01778, over 4863.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2089, pruned_loss=0.03085, over 971728.70 frames.], batch size: 22, lr: 1.48e-04 2022-05-08 07:52:03,629 INFO [train.py:715] (0/8) Epoch 15, batch 5300, loss[loss=0.134, simple_loss=0.1979, pruned_loss=0.03506, over 4785.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2088, pruned_loss=0.03091, over 971793.85 frames.], batch size: 12, lr: 1.48e-04 2022-05-08 07:52:45,868 
INFO [train.py:715] (0/8) Epoch 15, batch 5350, loss[loss=0.1433, simple_loss=0.21, pruned_loss=0.03834, over 4897.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2083, pruned_loss=0.03071, over 971986.28 frames.], batch size: 18, lr: 1.48e-04 2022-05-08 07:53:26,876 INFO [train.py:715] (0/8) Epoch 15, batch 5400, loss[loss=0.1348, simple_loss=0.2072, pruned_loss=0.03126, over 4762.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2083, pruned_loss=0.03048, over 972671.06 frames.], batch size: 14, lr: 1.48e-04 2022-05-08 07:54:08,831 INFO [train.py:715] (0/8) Epoch 15, batch 5450, loss[loss=0.1468, simple_loss=0.2126, pruned_loss=0.04053, over 4973.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2084, pruned_loss=0.03071, over 972907.47 frames.], batch size: 35, lr: 1.48e-04 2022-05-08 07:54:50,467 INFO [train.py:715] (0/8) Epoch 15, batch 5500, loss[loss=0.1402, simple_loss=0.22, pruned_loss=0.03017, over 4796.00 frames.], tot_loss[loss=0.1359, simple_loss=0.2095, pruned_loss=0.0312, over 974164.85 frames.], batch size: 24, lr: 1.48e-04 2022-05-08 07:55:32,152 INFO [train.py:715] (0/8) Epoch 15, batch 5550, loss[loss=0.1402, simple_loss=0.2096, pruned_loss=0.03539, over 4809.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2084, pruned_loss=0.03064, over 973619.42 frames.], batch size: 24, lr: 1.48e-04 2022-05-08 07:56:12,919 INFO [train.py:715] (0/8) Epoch 15, batch 5600, loss[loss=0.1155, simple_loss=0.1931, pruned_loss=0.01896, over 4703.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2091, pruned_loss=0.03105, over 973825.92 frames.], batch size: 15, lr: 1.48e-04 2022-05-08 07:56:54,779 INFO [train.py:715] (0/8) Epoch 15, batch 5650, loss[loss=0.1241, simple_loss=0.2074, pruned_loss=0.02042, over 4983.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2087, pruned_loss=0.03093, over 973967.16 frames.], batch size: 16, lr: 1.48e-04 2022-05-08 07:57:37,298 INFO [train.py:715] (0/8) Epoch 15, batch 5700, loss[loss=0.1319, simple_loss=0.1991, pruned_loss=0.03239, over 4871.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2087, pruned_loss=0.03092, over 974350.61 frames.], batch size: 32, lr: 1.48e-04 2022-05-08 07:58:18,519 INFO [train.py:715] (0/8) Epoch 15, batch 5750, loss[loss=0.1587, simple_loss=0.2325, pruned_loss=0.04246, over 4786.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2091, pruned_loss=0.03116, over 973974.01 frames.], batch size: 14, lr: 1.48e-04 2022-05-08 07:58:59,971 INFO [train.py:715] (0/8) Epoch 15, batch 5800, loss[loss=0.1397, simple_loss=0.2189, pruned_loss=0.03029, over 4852.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2093, pruned_loss=0.03083, over 974707.82 frames.], batch size: 30, lr: 1.48e-04 2022-05-08 07:59:41,227 INFO [train.py:715] (0/8) Epoch 15, batch 5850, loss[loss=0.1132, simple_loss=0.1922, pruned_loss=0.01705, over 4879.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2087, pruned_loss=0.03048, over 973401.18 frames.], batch size: 22, lr: 1.48e-04 2022-05-08 07:59:48,136 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-528000.pt 2022-05-08 08:00:25,529 INFO [train.py:715] (0/8) Epoch 15, batch 5900, loss[loss=0.1572, simple_loss=0.2412, pruned_loss=0.03657, over 4968.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2085, pruned_loss=0.03029, over 973799.59 frames.], batch size: 40, lr: 1.48e-04 2022-05-08 08:01:06,124 INFO [train.py:715] (0/8) Epoch 15, batch 5950, loss[loss=0.153, simple_loss=0.2373, pruned_loss=0.0344, over 4763.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2091, 
pruned_loss=0.03028, over 973103.68 frames.], batch size: 19, lr: 1.48e-04 2022-05-08 08:01:47,585 INFO [train.py:715] (0/8) Epoch 15, batch 6000, loss[loss=0.169, simple_loss=0.2396, pruned_loss=0.04918, over 4890.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2082, pruned_loss=0.02967, over 973596.61 frames.], batch size: 39, lr: 1.48e-04 2022-05-08 08:01:47,587 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 08:01:57,158 INFO [train.py:742] (0/8) Epoch 15, validation: loss=0.1051, simple_loss=0.1887, pruned_loss=0.01077, over 914524.00 frames. 2022-05-08 08:02:38,331 INFO [train.py:715] (0/8) Epoch 15, batch 6050, loss[loss=0.1346, simple_loss=0.2144, pruned_loss=0.02741, over 4797.00 frames.], tot_loss[loss=0.1343, simple_loss=0.209, pruned_loss=0.02985, over 974286.20 frames.], batch size: 21, lr: 1.48e-04 2022-05-08 08:03:20,376 INFO [train.py:715] (0/8) Epoch 15, batch 6100, loss[loss=0.1252, simple_loss=0.1988, pruned_loss=0.02585, over 4830.00 frames.], tot_loss[loss=0.135, simple_loss=0.2095, pruned_loss=0.03027, over 974383.69 frames.], batch size: 13, lr: 1.48e-04 2022-05-08 08:04:00,129 INFO [train.py:715] (0/8) Epoch 15, batch 6150, loss[loss=0.14, simple_loss=0.2115, pruned_loss=0.03422, over 4876.00 frames.], tot_loss[loss=0.1358, simple_loss=0.2103, pruned_loss=0.03071, over 974439.21 frames.], batch size: 16, lr: 1.48e-04 2022-05-08 08:04:41,011 INFO [train.py:715] (0/8) Epoch 15, batch 6200, loss[loss=0.1314, simple_loss=0.204, pruned_loss=0.02934, over 4952.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2096, pruned_loss=0.03039, over 974249.18 frames.], batch size: 23, lr: 1.48e-04 2022-05-08 08:05:20,637 INFO [train.py:715] (0/8) Epoch 15, batch 6250, loss[loss=0.1415, simple_loss=0.2242, pruned_loss=0.02936, over 4938.00 frames.], tot_loss[loss=0.1346, simple_loss=0.209, pruned_loss=0.03009, over 973895.95 frames.], batch size: 23, lr: 1.48e-04 2022-05-08 08:06:01,394 INFO [train.py:715] (0/8) Epoch 15, batch 6300, loss[loss=0.1465, simple_loss=0.2408, pruned_loss=0.02606, over 4764.00 frames.], tot_loss[loss=0.134, simple_loss=0.2086, pruned_loss=0.02968, over 973248.76 frames.], batch size: 19, lr: 1.48e-04 2022-05-08 08:06:41,233 INFO [train.py:715] (0/8) Epoch 15, batch 6350, loss[loss=0.1337, simple_loss=0.2135, pruned_loss=0.02697, over 4935.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2089, pruned_loss=0.02992, over 972975.73 frames.], batch size: 23, lr: 1.48e-04 2022-05-08 08:07:21,275 INFO [train.py:715] (0/8) Epoch 15, batch 6400, loss[loss=0.14, simple_loss=0.2195, pruned_loss=0.03024, over 4893.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2099, pruned_loss=0.03072, over 972385.77 frames.], batch size: 39, lr: 1.48e-04 2022-05-08 08:08:01,848 INFO [train.py:715] (0/8) Epoch 15, batch 6450, loss[loss=0.133, simple_loss=0.2044, pruned_loss=0.03079, over 4844.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2099, pruned_loss=0.03074, over 972263.29 frames.], batch size: 30, lr: 1.48e-04 2022-05-08 08:08:41,398 INFO [train.py:715] (0/8) Epoch 15, batch 6500, loss[loss=0.1397, simple_loss=0.2019, pruned_loss=0.03875, over 4845.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2081, pruned_loss=0.03004, over 972342.80 frames.], batch size: 30, lr: 1.48e-04 2022-05-08 08:09:21,818 INFO [train.py:715] (0/8) Epoch 15, batch 6550, loss[loss=0.1403, simple_loss=0.2137, pruned_loss=0.03344, over 4912.00 frames.], tot_loss[loss=0.1339, simple_loss=0.208, pruned_loss=0.02987, over 972291.21 frames.], batch size: 17, lr: 1.48e-04 2022-05-08 
08:10:02,129 INFO [train.py:715] (0/8) Epoch 15, batch 6600, loss[loss=0.1618, simple_loss=0.2308, pruned_loss=0.04639, over 4858.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2087, pruned_loss=0.03028, over 971946.57 frames.], batch size: 20, lr: 1.48e-04 2022-05-08 08:10:42,818 INFO [train.py:715] (0/8) Epoch 15, batch 6650, loss[loss=0.1215, simple_loss=0.196, pruned_loss=0.02343, over 4772.00 frames.], tot_loss[loss=0.135, simple_loss=0.2092, pruned_loss=0.03041, over 972079.99 frames.], batch size: 17, lr: 1.48e-04 2022-05-08 08:11:22,253 INFO [train.py:715] (0/8) Epoch 15, batch 6700, loss[loss=0.13, simple_loss=0.2043, pruned_loss=0.02788, over 4804.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2092, pruned_loss=0.0305, over 971501.97 frames.], batch size: 24, lr: 1.48e-04 2022-05-08 08:12:02,756 INFO [train.py:715] (0/8) Epoch 15, batch 6750, loss[loss=0.1495, simple_loss=0.2225, pruned_loss=0.03819, over 4937.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2093, pruned_loss=0.03042, over 971762.39 frames.], batch size: 21, lr: 1.48e-04 2022-05-08 08:12:44,119 INFO [train.py:715] (0/8) Epoch 15, batch 6800, loss[loss=0.1311, simple_loss=0.2045, pruned_loss=0.02886, over 4969.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2087, pruned_loss=0.02988, over 972560.34 frames.], batch size: 24, lr: 1.48e-04 2022-05-08 08:13:23,953 INFO [train.py:715] (0/8) Epoch 15, batch 6850, loss[loss=0.1403, simple_loss=0.2238, pruned_loss=0.02842, over 4802.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2088, pruned_loss=0.03, over 973118.31 frames.], batch size: 24, lr: 1.48e-04 2022-05-08 08:14:03,536 INFO [train.py:715] (0/8) Epoch 15, batch 6900, loss[loss=0.1563, simple_loss=0.2257, pruned_loss=0.04346, over 4818.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2087, pruned_loss=0.02986, over 972372.31 frames.], batch size: 25, lr: 1.48e-04 2022-05-08 08:14:44,361 INFO [train.py:715] (0/8) Epoch 15, batch 6950, loss[loss=0.1787, simple_loss=0.2415, pruned_loss=0.05795, over 4920.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2089, pruned_loss=0.03018, over 972087.24 frames.], batch size: 17, lr: 1.48e-04 2022-05-08 08:15:24,996 INFO [train.py:715] (0/8) Epoch 15, batch 7000, loss[loss=0.1383, simple_loss=0.2154, pruned_loss=0.03056, over 4785.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2082, pruned_loss=0.02974, over 973118.25 frames.], batch size: 18, lr: 1.48e-04 2022-05-08 08:16:03,960 INFO [train.py:715] (0/8) Epoch 15, batch 7050, loss[loss=0.1205, simple_loss=0.1943, pruned_loss=0.02332, over 4780.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2081, pruned_loss=0.02967, over 972714.51 frames.], batch size: 14, lr: 1.48e-04 2022-05-08 08:16:44,703 INFO [train.py:715] (0/8) Epoch 15, batch 7100, loss[loss=0.1267, simple_loss=0.1982, pruned_loss=0.02764, over 4830.00 frames.], tot_loss[loss=0.134, simple_loss=0.2082, pruned_loss=0.02991, over 972784.57 frames.], batch size: 13, lr: 1.48e-04 2022-05-08 08:17:25,230 INFO [train.py:715] (0/8) Epoch 15, batch 7150, loss[loss=0.1305, simple_loss=0.2009, pruned_loss=0.02998, over 4820.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2078, pruned_loss=0.02962, over 973085.00 frames.], batch size: 15, lr: 1.48e-04 2022-05-08 08:18:05,132 INFO [train.py:715] (0/8) Epoch 15, batch 7200, loss[loss=0.1262, simple_loss=0.1996, pruned_loss=0.02642, over 4984.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2081, pruned_loss=0.03002, over 973235.77 frames.], batch size: 15, lr: 1.48e-04 2022-05-08 08:18:44,343 INFO [train.py:715] (0/8) 
Epoch 15, batch 7250, loss[loss=0.1381, simple_loss=0.2159, pruned_loss=0.03017, over 4931.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2076, pruned_loss=0.02986, over 972235.10 frames.], batch size: 21, lr: 1.48e-04 2022-05-08 08:19:25,089 INFO [train.py:715] (0/8) Epoch 15, batch 7300, loss[loss=0.1211, simple_loss=0.2036, pruned_loss=0.01927, over 4878.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2076, pruned_loss=0.0298, over 972533.97 frames.], batch size: 22, lr: 1.48e-04 2022-05-08 08:20:06,074 INFO [train.py:715] (0/8) Epoch 15, batch 7350, loss[loss=0.1378, simple_loss=0.2118, pruned_loss=0.03193, over 4775.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2076, pruned_loss=0.02954, over 972479.03 frames.], batch size: 18, lr: 1.48e-04 2022-05-08 08:20:45,517 INFO [train.py:715] (0/8) Epoch 15, batch 7400, loss[loss=0.1017, simple_loss=0.1825, pruned_loss=0.01045, over 4813.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2081, pruned_loss=0.02927, over 972279.79 frames.], batch size: 25, lr: 1.48e-04 2022-05-08 08:21:25,988 INFO [train.py:715] (0/8) Epoch 15, batch 7450, loss[loss=0.1019, simple_loss=0.182, pruned_loss=0.01086, over 4809.00 frames.], tot_loss[loss=0.134, simple_loss=0.2085, pruned_loss=0.02976, over 972951.28 frames.], batch size: 25, lr: 1.48e-04 2022-05-08 08:22:06,380 INFO [train.py:715] (0/8) Epoch 15, batch 7500, loss[loss=0.1244, simple_loss=0.2052, pruned_loss=0.02175, over 4955.00 frames.], tot_loss[loss=0.134, simple_loss=0.2083, pruned_loss=0.02983, over 971926.71 frames.], batch size: 21, lr: 1.48e-04 2022-05-08 08:22:46,666 INFO [train.py:715] (0/8) Epoch 15, batch 7550, loss[loss=0.1473, simple_loss=0.222, pruned_loss=0.03631, over 4972.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2077, pruned_loss=0.02945, over 971908.59 frames.], batch size: 24, lr: 1.48e-04 2022-05-08 08:23:25,907 INFO [train.py:715] (0/8) Epoch 15, batch 7600, loss[loss=0.1351, simple_loss=0.2191, pruned_loss=0.02552, over 4779.00 frames.], tot_loss[loss=0.134, simple_loss=0.2083, pruned_loss=0.02986, over 971583.33 frames.], batch size: 17, lr: 1.48e-04 2022-05-08 08:24:05,909 INFO [train.py:715] (0/8) Epoch 15, batch 7650, loss[loss=0.1208, simple_loss=0.2007, pruned_loss=0.0205, over 4814.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2086, pruned_loss=0.02979, over 971476.32 frames.], batch size: 27, lr: 1.48e-04 2022-05-08 08:24:45,953 INFO [train.py:715] (0/8) Epoch 15, batch 7700, loss[loss=0.1381, simple_loss=0.2117, pruned_loss=0.03227, over 4867.00 frames.], tot_loss[loss=0.1345, simple_loss=0.209, pruned_loss=0.03, over 971691.18 frames.], batch size: 38, lr: 1.48e-04 2022-05-08 08:25:24,891 INFO [train.py:715] (0/8) Epoch 15, batch 7750, loss[loss=0.126, simple_loss=0.2028, pruned_loss=0.02458, over 4983.00 frames.], tot_loss[loss=0.134, simple_loss=0.2084, pruned_loss=0.02979, over 972769.37 frames.], batch size: 20, lr: 1.48e-04 2022-05-08 08:26:04,530 INFO [train.py:715] (0/8) Epoch 15, batch 7800, loss[loss=0.1276, simple_loss=0.199, pruned_loss=0.02808, over 4869.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2091, pruned_loss=0.03014, over 972740.58 frames.], batch size: 16, lr: 1.48e-04 2022-05-08 08:26:43,768 INFO [train.py:715] (0/8) Epoch 15, batch 7850, loss[loss=0.1211, simple_loss=0.1935, pruned_loss=0.02432, over 4822.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2088, pruned_loss=0.03025, over 972172.79 frames.], batch size: 15, lr: 1.48e-04 2022-05-08 08:27:23,762 INFO [train.py:715] (0/8) Epoch 15, batch 7900, loss[loss=0.1386, 
simple_loss=0.2163, pruned_loss=0.03044, over 4987.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2083, pruned_loss=0.0302, over 972640.40 frames.], batch size: 28, lr: 1.48e-04 2022-05-08 08:28:01,920 INFO [train.py:715] (0/8) Epoch 15, batch 7950, loss[loss=0.1389, simple_loss=0.2151, pruned_loss=0.03137, over 4984.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2087, pruned_loss=0.03073, over 973235.59 frames.], batch size: 26, lr: 1.48e-04 2022-05-08 08:28:41,232 INFO [train.py:715] (0/8) Epoch 15, batch 8000, loss[loss=0.1441, simple_loss=0.2176, pruned_loss=0.03532, over 4837.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2091, pruned_loss=0.03087, over 973786.50 frames.], batch size: 30, lr: 1.48e-04 2022-05-08 08:29:20,795 INFO [train.py:715] (0/8) Epoch 15, batch 8050, loss[loss=0.1595, simple_loss=0.2325, pruned_loss=0.04322, over 4864.00 frames.], tot_loss[loss=0.1354, simple_loss=0.209, pruned_loss=0.03089, over 972922.62 frames.], batch size: 20, lr: 1.48e-04 2022-05-08 08:29:59,766 INFO [train.py:715] (0/8) Epoch 15, batch 8100, loss[loss=0.126, simple_loss=0.1986, pruned_loss=0.02667, over 4887.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2084, pruned_loss=0.03037, over 972367.53 frames.], batch size: 22, lr: 1.48e-04 2022-05-08 08:30:38,763 INFO [train.py:715] (0/8) Epoch 15, batch 8150, loss[loss=0.1335, simple_loss=0.1978, pruned_loss=0.03463, over 4779.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2078, pruned_loss=0.03049, over 972650.61 frames.], batch size: 18, lr: 1.48e-04 2022-05-08 08:31:18,838 INFO [train.py:715] (0/8) Epoch 15, batch 8200, loss[loss=0.1027, simple_loss=0.1782, pruned_loss=0.01363, over 4972.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2076, pruned_loss=0.03039, over 972136.61 frames.], batch size: 14, lr: 1.48e-04 2022-05-08 08:31:57,570 INFO [train.py:715] (0/8) Epoch 15, batch 8250, loss[loss=0.1294, simple_loss=0.2055, pruned_loss=0.02667, over 4842.00 frames.], tot_loss[loss=0.1344, simple_loss=0.208, pruned_loss=0.03043, over 971893.69 frames.], batch size: 15, lr: 1.48e-04 2022-05-08 08:32:36,544 INFO [train.py:715] (0/8) Epoch 15, batch 8300, loss[loss=0.1562, simple_loss=0.2296, pruned_loss=0.04135, over 4923.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2084, pruned_loss=0.03049, over 972476.21 frames.], batch size: 21, lr: 1.48e-04 2022-05-08 08:33:15,771 INFO [train.py:715] (0/8) Epoch 15, batch 8350, loss[loss=0.1237, simple_loss=0.1938, pruned_loss=0.02683, over 4975.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2085, pruned_loss=0.03059, over 973532.84 frames.], batch size: 28, lr: 1.48e-04 2022-05-08 08:33:55,966 INFO [train.py:715] (0/8) Epoch 15, batch 8400, loss[loss=0.1207, simple_loss=0.1906, pruned_loss=0.02538, over 4818.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2086, pruned_loss=0.03062, over 973096.24 frames.], batch size: 12, lr: 1.48e-04 2022-05-08 08:34:35,509 INFO [train.py:715] (0/8) Epoch 15, batch 8450, loss[loss=0.1505, simple_loss=0.2174, pruned_loss=0.04185, over 4871.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2095, pruned_loss=0.03091, over 972716.17 frames.], batch size: 30, lr: 1.48e-04 2022-05-08 08:35:14,649 INFO [train.py:715] (0/8) Epoch 15, batch 8500, loss[loss=0.1108, simple_loss=0.1878, pruned_loss=0.01684, over 4808.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2085, pruned_loss=0.03056, over 972655.82 frames.], batch size: 26, lr: 1.48e-04 2022-05-08 08:35:54,832 INFO [train.py:715] (0/8) Epoch 15, batch 8550, loss[loss=0.1383, simple_loss=0.2116, 
pruned_loss=0.03247, over 4870.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2085, pruned_loss=0.0305, over 971952.50 frames.], batch size: 20, lr: 1.48e-04 2022-05-08 08:36:33,506 INFO [train.py:715] (0/8) Epoch 15, batch 8600, loss[loss=0.1449, simple_loss=0.2106, pruned_loss=0.03958, over 4923.00 frames.], tot_loss[loss=0.1345, simple_loss=0.208, pruned_loss=0.03045, over 972310.23 frames.], batch size: 23, lr: 1.48e-04 2022-05-08 08:37:12,322 INFO [train.py:715] (0/8) Epoch 15, batch 8650, loss[loss=0.1572, simple_loss=0.2199, pruned_loss=0.04724, over 4841.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2091, pruned_loss=0.03079, over 971292.38 frames.], batch size: 15, lr: 1.48e-04 2022-05-08 08:37:51,172 INFO [train.py:715] (0/8) Epoch 15, batch 8700, loss[loss=0.1402, simple_loss=0.213, pruned_loss=0.03367, over 4961.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2086, pruned_loss=0.03044, over 972478.65 frames.], batch size: 29, lr: 1.48e-04 2022-05-08 08:38:30,422 INFO [train.py:715] (0/8) Epoch 15, batch 8750, loss[loss=0.1148, simple_loss=0.1869, pruned_loss=0.02133, over 4971.00 frames.], tot_loss[loss=0.1341, simple_loss=0.208, pruned_loss=0.0301, over 972427.12 frames.], batch size: 14, lr: 1.48e-04 2022-05-08 08:39:08,909 INFO [train.py:715] (0/8) Epoch 15, batch 8800, loss[loss=0.1622, simple_loss=0.2259, pruned_loss=0.0492, over 4902.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2089, pruned_loss=0.03035, over 972151.91 frames.], batch size: 22, lr: 1.48e-04 2022-05-08 08:39:47,402 INFO [train.py:715] (0/8) Epoch 15, batch 8850, loss[loss=0.1251, simple_loss=0.1967, pruned_loss=0.02673, over 4667.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2082, pruned_loss=0.03022, over 971373.14 frames.], batch size: 14, lr: 1.48e-04 2022-05-08 08:40:26,829 INFO [train.py:715] (0/8) Epoch 15, batch 8900, loss[loss=0.1565, simple_loss=0.224, pruned_loss=0.04453, over 4955.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2082, pruned_loss=0.03032, over 971635.26 frames.], batch size: 15, lr: 1.48e-04 2022-05-08 08:41:06,363 INFO [train.py:715] (0/8) Epoch 15, batch 8950, loss[loss=0.118, simple_loss=0.1897, pruned_loss=0.02312, over 4795.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2084, pruned_loss=0.03053, over 971472.17 frames.], batch size: 21, lr: 1.48e-04 2022-05-08 08:41:45,469 INFO [train.py:715] (0/8) Epoch 15, batch 9000, loss[loss=0.157, simple_loss=0.2276, pruned_loss=0.04327, over 4805.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2086, pruned_loss=0.03045, over 971254.71 frames.], batch size: 21, lr: 1.48e-04 2022-05-08 08:41:45,470 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 08:42:05,029 INFO [train.py:742] (0/8) Epoch 15, validation: loss=0.1051, simple_loss=0.1887, pruned_loss=0.01074, over 914524.00 frames. 
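A note on reading these entries: throughout this section the reported total is consistent with a weighted sum of the two components, loss ≈ 0.5 * simple_loss + pruned_loss; the 0.5 weight is inferred from the logged numbers themselves rather than stated in the log. A minimal sketch of that arithmetic check, using the validation entry directly above (the variable names are illustrative and not taken from train.py):

# Hypothetical sanity check, not part of train.py: confirm that the reported
# validation total matches ~0.5 * simple_loss + pruned_loss.
simple_loss = 0.1887
pruned_loss = 0.01074
reported_total = 0.1051

assumed_weight = 0.5  # inferred from the logged values, an assumption
reconstructed = assumed_weight * simple_loss + pruned_loss
print(round(reconstructed, 5))                     # 0.10509
print(abs(reconstructed - reported_total) < 1e-3)  # True
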
2022-05-08 08:42:44,048 INFO [train.py:715] (0/8) Epoch 15, batch 9050, loss[loss=0.1358, simple_loss=0.2093, pruned_loss=0.03114, over 4813.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2078, pruned_loss=0.02998, over 972260.59 frames.], batch size: 21, lr: 1.48e-04 2022-05-08 08:43:23,563 INFO [train.py:715] (0/8) Epoch 15, batch 9100, loss[loss=0.1271, simple_loss=0.1997, pruned_loss=0.02728, over 4934.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2079, pruned_loss=0.03012, over 971934.76 frames.], batch size: 23, lr: 1.48e-04 2022-05-08 08:44:03,261 INFO [train.py:715] (0/8) Epoch 15, batch 9150, loss[loss=0.1225, simple_loss=0.1999, pruned_loss=0.02253, over 4926.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2085, pruned_loss=0.03019, over 971147.35 frames.], batch size: 21, lr: 1.48e-04 2022-05-08 08:44:42,058 INFO [train.py:715] (0/8) Epoch 15, batch 9200, loss[loss=0.1607, simple_loss=0.2465, pruned_loss=0.03748, over 4786.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2078, pruned_loss=0.02981, over 971267.94 frames.], batch size: 17, lr: 1.48e-04 2022-05-08 08:45:21,335 INFO [train.py:715] (0/8) Epoch 15, batch 9250, loss[loss=0.1277, simple_loss=0.1995, pruned_loss=0.02799, over 4877.00 frames.], tot_loss[loss=0.1338, simple_loss=0.208, pruned_loss=0.02986, over 970862.81 frames.], batch size: 22, lr: 1.48e-04 2022-05-08 08:46:01,211 INFO [train.py:715] (0/8) Epoch 15, batch 9300, loss[loss=0.09607, simple_loss=0.168, pruned_loss=0.01207, over 4812.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2075, pruned_loss=0.02976, over 971301.76 frames.], batch size: 13, lr: 1.48e-04 2022-05-08 08:46:41,139 INFO [train.py:715] (0/8) Epoch 15, batch 9350, loss[loss=0.1483, simple_loss=0.2237, pruned_loss=0.03644, over 4986.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2075, pruned_loss=0.03004, over 971135.29 frames.], batch size: 35, lr: 1.48e-04 2022-05-08 08:47:19,980 INFO [train.py:715] (0/8) Epoch 15, batch 9400, loss[loss=0.1352, simple_loss=0.2112, pruned_loss=0.0296, over 4831.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2076, pruned_loss=0.03015, over 972058.49 frames.], batch size: 26, lr: 1.48e-04 2022-05-08 08:47:59,294 INFO [train.py:715] (0/8) Epoch 15, batch 9450, loss[loss=0.1701, simple_loss=0.2322, pruned_loss=0.05404, over 4834.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2086, pruned_loss=0.03039, over 971439.38 frames.], batch size: 30, lr: 1.48e-04 2022-05-08 08:48:38,581 INFO [train.py:715] (0/8) Epoch 15, batch 9500, loss[loss=0.1235, simple_loss=0.2029, pruned_loss=0.02201, over 4845.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2077, pruned_loss=0.02994, over 971628.02 frames.], batch size: 13, lr: 1.48e-04 2022-05-08 08:49:16,966 INFO [train.py:715] (0/8) Epoch 15, batch 9550, loss[loss=0.1302, simple_loss=0.2009, pruned_loss=0.02977, over 4824.00 frames.], tot_loss[loss=0.134, simple_loss=0.208, pruned_loss=0.03002, over 972247.03 frames.], batch size: 13, lr: 1.48e-04 2022-05-08 08:49:56,302 INFO [train.py:715] (0/8) Epoch 15, batch 9600, loss[loss=0.1516, simple_loss=0.223, pruned_loss=0.04007, over 4777.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2084, pruned_loss=0.03071, over 972336.05 frames.], batch size: 19, lr: 1.48e-04 2022-05-08 08:50:35,891 INFO [train.py:715] (0/8) Epoch 15, batch 9650, loss[loss=0.1617, simple_loss=0.2275, pruned_loss=0.048, over 4915.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2091, pruned_loss=0.03069, over 972862.51 frames.], batch size: 39, lr: 1.48e-04 2022-05-08 08:51:15,443 INFO 
[train.py:715] (0/8) Epoch 15, batch 9700, loss[loss=0.1424, simple_loss=0.2248, pruned_loss=0.03002, over 4968.00 frames.], tot_loss[loss=0.135, simple_loss=0.2091, pruned_loss=0.0305, over 972269.35 frames.], batch size: 15, lr: 1.48e-04 2022-05-08 08:51:53,980 INFO [train.py:715] (0/8) Epoch 15, batch 9750, loss[loss=0.1351, simple_loss=0.2161, pruned_loss=0.02709, over 4943.00 frames.], tot_loss[loss=0.135, simple_loss=0.2091, pruned_loss=0.0305, over 972455.45 frames.], batch size: 21, lr: 1.48e-04 2022-05-08 08:52:33,220 INFO [train.py:715] (0/8) Epoch 15, batch 9800, loss[loss=0.1443, simple_loss=0.2082, pruned_loss=0.04022, over 4905.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2096, pruned_loss=0.03048, over 972648.98 frames.], batch size: 17, lr: 1.48e-04 2022-05-08 08:53:12,407 INFO [train.py:715] (0/8) Epoch 15, batch 9850, loss[loss=0.1344, simple_loss=0.2144, pruned_loss=0.02723, over 4778.00 frames.], tot_loss[loss=0.136, simple_loss=0.2105, pruned_loss=0.03076, over 973299.80 frames.], batch size: 18, lr: 1.48e-04 2022-05-08 08:53:50,989 INFO [train.py:715] (0/8) Epoch 15, batch 9900, loss[loss=0.1235, simple_loss=0.1938, pruned_loss=0.02662, over 4950.00 frames.], tot_loss[loss=0.135, simple_loss=0.2092, pruned_loss=0.03034, over 973654.57 frames.], batch size: 21, lr: 1.48e-04 2022-05-08 08:54:30,412 INFO [train.py:715] (0/8) Epoch 15, batch 9950, loss[loss=0.1178, simple_loss=0.1907, pruned_loss=0.02241, over 4970.00 frames.], tot_loss[loss=0.1336, simple_loss=0.208, pruned_loss=0.02964, over 974474.09 frames.], batch size: 14, lr: 1.48e-04 2022-05-08 08:55:09,385 INFO [train.py:715] (0/8) Epoch 15, batch 10000, loss[loss=0.1585, simple_loss=0.2165, pruned_loss=0.05031, over 4847.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2074, pruned_loss=0.02942, over 973265.56 frames.], batch size: 34, lr: 1.48e-04 2022-05-08 08:55:48,598 INFO [train.py:715] (0/8) Epoch 15, batch 10050, loss[loss=0.134, simple_loss=0.2093, pruned_loss=0.02934, over 4922.00 frames.], tot_loss[loss=0.133, simple_loss=0.2076, pruned_loss=0.02918, over 972906.92 frames.], batch size: 18, lr: 1.48e-04 2022-05-08 08:56:26,955 INFO [train.py:715] (0/8) Epoch 15, batch 10100, loss[loss=0.1207, simple_loss=0.2026, pruned_loss=0.01937, over 4814.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2073, pruned_loss=0.02897, over 972507.52 frames.], batch size: 26, lr: 1.48e-04 2022-05-08 08:57:05,755 INFO [train.py:715] (0/8) Epoch 15, batch 10150, loss[loss=0.128, simple_loss=0.2122, pruned_loss=0.02183, over 4974.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2081, pruned_loss=0.02924, over 971977.92 frames.], batch size: 25, lr: 1.48e-04 2022-05-08 08:57:45,598 INFO [train.py:715] (0/8) Epoch 15, batch 10200, loss[loss=0.1256, simple_loss=0.2055, pruned_loss=0.02285, over 4815.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2079, pruned_loss=0.02925, over 972136.94 frames.], batch size: 27, lr: 1.48e-04 2022-05-08 08:58:23,920 INFO [train.py:715] (0/8) Epoch 15, batch 10250, loss[loss=0.1246, simple_loss=0.2057, pruned_loss=0.02177, over 4961.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2083, pruned_loss=0.02943, over 972809.10 frames.], batch size: 29, lr: 1.48e-04 2022-05-08 08:59:03,220 INFO [train.py:715] (0/8) Epoch 15, batch 10300, loss[loss=0.149, simple_loss=0.2199, pruned_loss=0.039, over 4758.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2079, pruned_loss=0.0292, over 972190.96 frames.], batch size: 16, lr: 1.48e-04 2022-05-08 08:59:42,678 INFO [train.py:715] (0/8) Epoch 15, batch 
10350, loss[loss=0.1425, simple_loss=0.2199, pruned_loss=0.03259, over 4943.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2073, pruned_loss=0.02897, over 971887.28 frames.], batch size: 21, lr: 1.48e-04 2022-05-08 09:00:21,802 INFO [train.py:715] (0/8) Epoch 15, batch 10400, loss[loss=0.129, simple_loss=0.2089, pruned_loss=0.02457, over 4863.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2063, pruned_loss=0.02865, over 972281.89 frames.], batch size: 20, lr: 1.48e-04 2022-05-08 09:00:59,818 INFO [train.py:715] (0/8) Epoch 15, batch 10450, loss[loss=0.1155, simple_loss=0.1919, pruned_loss=0.01955, over 4743.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2072, pruned_loss=0.02898, over 972637.36 frames.], batch size: 16, lr: 1.48e-04 2022-05-08 09:01:38,800 INFO [train.py:715] (0/8) Epoch 15, batch 10500, loss[loss=0.1294, simple_loss=0.2029, pruned_loss=0.0279, over 4858.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2073, pruned_loss=0.02927, over 972701.03 frames.], batch size: 22, lr: 1.48e-04 2022-05-08 09:02:18,516 INFO [train.py:715] (0/8) Epoch 15, batch 10550, loss[loss=0.1131, simple_loss=0.1945, pruned_loss=0.01583, over 4830.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2078, pruned_loss=0.02927, over 973011.66 frames.], batch size: 26, lr: 1.48e-04 2022-05-08 09:02:56,676 INFO [train.py:715] (0/8) Epoch 15, batch 10600, loss[loss=0.1074, simple_loss=0.1729, pruned_loss=0.02096, over 4806.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2078, pruned_loss=0.02944, over 972898.30 frames.], batch size: 12, lr: 1.48e-04 2022-05-08 09:03:35,330 INFO [train.py:715] (0/8) Epoch 15, batch 10650, loss[loss=0.1012, simple_loss=0.1677, pruned_loss=0.01739, over 4873.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2076, pruned_loss=0.02977, over 972682.73 frames.], batch size: 12, lr: 1.48e-04 2022-05-08 09:04:14,415 INFO [train.py:715] (0/8) Epoch 15, batch 10700, loss[loss=0.1684, simple_loss=0.2368, pruned_loss=0.05002, over 4857.00 frames.], tot_loss[loss=0.134, simple_loss=0.2079, pruned_loss=0.03001, over 973377.50 frames.], batch size: 20, lr: 1.48e-04 2022-05-08 09:04:53,623 INFO [train.py:715] (0/8) Epoch 15, batch 10750, loss[loss=0.1518, simple_loss=0.218, pruned_loss=0.0428, over 4752.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2087, pruned_loss=0.03046, over 972247.48 frames.], batch size: 19, lr: 1.48e-04 2022-05-08 09:05:31,573 INFO [train.py:715] (0/8) Epoch 15, batch 10800, loss[loss=0.1506, simple_loss=0.2401, pruned_loss=0.03059, over 4807.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2089, pruned_loss=0.03021, over 971579.64 frames.], batch size: 21, lr: 1.47e-04 2022-05-08 09:06:11,095 INFO [train.py:715] (0/8) Epoch 15, batch 10850, loss[loss=0.1493, simple_loss=0.2204, pruned_loss=0.03908, over 4984.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2093, pruned_loss=0.03003, over 971502.84 frames.], batch size: 24, lr: 1.47e-04 2022-05-08 09:06:50,383 INFO [train.py:715] (0/8) Epoch 15, batch 10900, loss[loss=0.1071, simple_loss=0.1808, pruned_loss=0.01676, over 4882.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2088, pruned_loss=0.03016, over 971651.49 frames.], batch size: 22, lr: 1.47e-04 2022-05-08 09:07:28,754 INFO [train.py:715] (0/8) Epoch 15, batch 10950, loss[loss=0.1427, simple_loss=0.2105, pruned_loss=0.03741, over 4989.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2092, pruned_loss=0.03016, over 972093.17 frames.], batch size: 14, lr: 1.47e-04 2022-05-08 09:08:06,751 INFO [train.py:715] (0/8) Epoch 15, batch 11000, 
loss[loss=0.1303, simple_loss=0.1984, pruned_loss=0.03115, over 4772.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2091, pruned_loss=0.03036, over 971954.57 frames.], batch size: 17, lr: 1.47e-04 2022-05-08 09:08:45,831 INFO [train.py:715] (0/8) Epoch 15, batch 11050, loss[loss=0.1322, simple_loss=0.2141, pruned_loss=0.0252, over 4768.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2097, pruned_loss=0.03072, over 971129.60 frames.], batch size: 14, lr: 1.47e-04 2022-05-08 09:09:25,312 INFO [train.py:715] (0/8) Epoch 15, batch 11100, loss[loss=0.1459, simple_loss=0.2208, pruned_loss=0.03548, over 4895.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2096, pruned_loss=0.03078, over 971288.06 frames.], batch size: 17, lr: 1.47e-04 2022-05-08 09:10:03,223 INFO [train.py:715] (0/8) Epoch 15, batch 11150, loss[loss=0.1425, simple_loss=0.2141, pruned_loss=0.03547, over 4870.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2091, pruned_loss=0.0303, over 971697.87 frames.], batch size: 20, lr: 1.47e-04 2022-05-08 09:10:41,860 INFO [train.py:715] (0/8) Epoch 15, batch 11200, loss[loss=0.1375, simple_loss=0.2063, pruned_loss=0.03437, over 4846.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2096, pruned_loss=0.0307, over 971915.70 frames.], batch size: 32, lr: 1.47e-04 2022-05-08 09:11:20,810 INFO [train.py:715] (0/8) Epoch 15, batch 11250, loss[loss=0.1222, simple_loss=0.1879, pruned_loss=0.02827, over 4853.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2095, pruned_loss=0.03068, over 971819.42 frames.], batch size: 34, lr: 1.47e-04 2022-05-08 09:11:59,329 INFO [train.py:715] (0/8) Epoch 15, batch 11300, loss[loss=0.1263, simple_loss=0.2016, pruned_loss=0.0255, over 4943.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2094, pruned_loss=0.03082, over 971595.51 frames.], batch size: 29, lr: 1.47e-04 2022-05-08 09:12:37,832 INFO [train.py:715] (0/8) Epoch 15, batch 11350, loss[loss=0.1374, simple_loss=0.2109, pruned_loss=0.03196, over 4655.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2093, pruned_loss=0.031, over 971956.37 frames.], batch size: 13, lr: 1.47e-04 2022-05-08 09:13:17,185 INFO [train.py:715] (0/8) Epoch 15, batch 11400, loss[loss=0.1242, simple_loss=0.2018, pruned_loss=0.02333, over 4875.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2082, pruned_loss=0.03015, over 971941.06 frames.], batch size: 16, lr: 1.47e-04 2022-05-08 09:13:55,478 INFO [train.py:715] (0/8) Epoch 15, batch 11450, loss[loss=0.1129, simple_loss=0.1779, pruned_loss=0.02393, over 4826.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2081, pruned_loss=0.03046, over 971862.42 frames.], batch size: 12, lr: 1.47e-04 2022-05-08 09:14:34,168 INFO [train.py:715] (0/8) Epoch 15, batch 11500, loss[loss=0.08676, simple_loss=0.157, pruned_loss=0.008281, over 4773.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2076, pruned_loss=0.03036, over 971630.28 frames.], batch size: 12, lr: 1.47e-04 2022-05-08 09:15:13,128 INFO [train.py:715] (0/8) Epoch 15, batch 11550, loss[loss=0.1157, simple_loss=0.1917, pruned_loss=0.01989, over 4914.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2071, pruned_loss=0.03016, over 971037.79 frames.], batch size: 19, lr: 1.47e-04 2022-05-08 09:15:52,413 INFO [train.py:715] (0/8) Epoch 15, batch 11600, loss[loss=0.1457, simple_loss=0.2155, pruned_loss=0.03795, over 4847.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2076, pruned_loss=0.03014, over 971178.12 frames.], batch size: 30, lr: 1.47e-04 2022-05-08 09:16:30,742 INFO [train.py:715] (0/8) Epoch 15, batch 11650, loss[loss=0.1502, 
simple_loss=0.2149, pruned_loss=0.0428, over 4863.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2074, pruned_loss=0.0299, over 971512.66 frames.], batch size: 32, lr: 1.47e-04 2022-05-08 09:17:09,218 INFO [train.py:715] (0/8) Epoch 15, batch 11700, loss[loss=0.1219, simple_loss=0.1964, pruned_loss=0.02371, over 4788.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2079, pruned_loss=0.03045, over 972343.05 frames.], batch size: 14, lr: 1.47e-04 2022-05-08 09:17:48,439 INFO [train.py:715] (0/8) Epoch 15, batch 11750, loss[loss=0.1235, simple_loss=0.2018, pruned_loss=0.02263, over 4980.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2081, pruned_loss=0.03048, over 973284.43 frames.], batch size: 28, lr: 1.47e-04 2022-05-08 09:18:27,443 INFO [train.py:715] (0/8) Epoch 15, batch 11800, loss[loss=0.1354, simple_loss=0.208, pruned_loss=0.03141, over 4683.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2084, pruned_loss=0.03002, over 973496.27 frames.], batch size: 15, lr: 1.47e-04 2022-05-08 09:19:05,511 INFO [train.py:715] (0/8) Epoch 15, batch 11850, loss[loss=0.1159, simple_loss=0.1917, pruned_loss=0.02004, over 4862.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2084, pruned_loss=0.03014, over 973889.08 frames.], batch size: 32, lr: 1.47e-04 2022-05-08 09:19:45,037 INFO [train.py:715] (0/8) Epoch 15, batch 11900, loss[loss=0.1365, simple_loss=0.2113, pruned_loss=0.03084, over 4912.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2085, pruned_loss=0.03032, over 972896.00 frames.], batch size: 17, lr: 1.47e-04 2022-05-08 09:20:25,118 INFO [train.py:715] (0/8) Epoch 15, batch 11950, loss[loss=0.1405, simple_loss=0.2113, pruned_loss=0.03482, over 4965.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2075, pruned_loss=0.02965, over 973229.05 frames.], batch size: 35, lr: 1.47e-04 2022-05-08 09:21:03,705 INFO [train.py:715] (0/8) Epoch 15, batch 12000, loss[loss=0.1363, simple_loss=0.2108, pruned_loss=0.03088, over 4706.00 frames.], tot_loss[loss=0.1338, simple_loss=0.208, pruned_loss=0.02977, over 972650.87 frames.], batch size: 15, lr: 1.47e-04 2022-05-08 09:21:03,706 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 09:21:20,396 INFO [train.py:742] (0/8) Epoch 15, validation: loss=0.105, simple_loss=0.1887, pruned_loss=0.01066, over 914524.00 frames. 
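For post-processing logs in this format, a small parser is often handy, for example to plot tot_loss against the batch index. The sketch below is illustrative only and assumes exactly the line layout shown in this log; the regular expression and the function name parse_tot_loss are not part of the icefall codebase:

# Illustrative helper: extract (epoch, batch, tot_loss) triples from log text
# in the format shown above, e.g. for plotting a training-loss curve.
import re

PATTERN = re.compile(r"Epoch (\d+), batch (\d+),.*?tot_loss\[loss=([\d.]+)", re.DOTALL)

def parse_tot_loss(log_text):
    """Return a list of (epoch, batch, tot_loss) tuples found in log_text."""
    return [(int(e), int(b), float(v)) for e, b, v in PATTERN.findall(log_text)]

# Example on a fragment of the entries above:
fragment = ("Epoch 15, batch 12000, loss[loss=0.1363, simple_loss=0.2108, "
            "pruned_loss=0.03088, over 4706.00 frames.], tot_loss[loss=0.1338, "
            "simple_loss=0.208, pruned_loss=0.02977, over 972650.87 frames.]")
print(parse_tot_loss(fragment))  # [(15, 12000, 0.1338)]
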
2022-05-08 09:21:59,110 INFO [train.py:715] (0/8) Epoch 15, batch 12050, loss[loss=0.1117, simple_loss=0.1819, pruned_loss=0.02075, over 4828.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2075, pruned_loss=0.0296, over 972840.88 frames.], batch size: 13, lr: 1.47e-04 2022-05-08 09:22:38,251 INFO [train.py:715] (0/8) Epoch 15, batch 12100, loss[loss=0.1212, simple_loss=0.1943, pruned_loss=0.02411, over 4842.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2075, pruned_loss=0.02975, over 972988.13 frames.], batch size: 25, lr: 1.47e-04 2022-05-08 09:23:17,961 INFO [train.py:715] (0/8) Epoch 15, batch 12150, loss[loss=0.1412, simple_loss=0.2177, pruned_loss=0.03234, over 4766.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2079, pruned_loss=0.02967, over 972886.78 frames.], batch size: 16, lr: 1.47e-04 2022-05-08 09:23:56,407 INFO [train.py:715] (0/8) Epoch 15, batch 12200, loss[loss=0.1261, simple_loss=0.1952, pruned_loss=0.02852, over 4995.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2078, pruned_loss=0.0296, over 973660.80 frames.], batch size: 14, lr: 1.47e-04 2022-05-08 09:24:35,173 INFO [train.py:715] (0/8) Epoch 15, batch 12250, loss[loss=0.1359, simple_loss=0.2073, pruned_loss=0.03225, over 4704.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2088, pruned_loss=0.03038, over 972564.06 frames.], batch size: 15, lr: 1.47e-04 2022-05-08 09:25:14,192 INFO [train.py:715] (0/8) Epoch 15, batch 12300, loss[loss=0.1605, simple_loss=0.2283, pruned_loss=0.0463, over 4905.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2085, pruned_loss=0.03013, over 972319.30 frames.], batch size: 39, lr: 1.47e-04 2022-05-08 09:25:54,055 INFO [train.py:715] (0/8) Epoch 15, batch 12350, loss[loss=0.1433, simple_loss=0.2178, pruned_loss=0.03437, over 4978.00 frames.], tot_loss[loss=0.1338, simple_loss=0.208, pruned_loss=0.02979, over 972313.06 frames.], batch size: 14, lr: 1.47e-04 2022-05-08 09:26:32,321 INFO [train.py:715] (0/8) Epoch 15, batch 12400, loss[loss=0.1333, simple_loss=0.2053, pruned_loss=0.03062, over 4828.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2084, pruned_loss=0.03017, over 972661.52 frames.], batch size: 26, lr: 1.47e-04 2022-05-08 09:27:11,072 INFO [train.py:715] (0/8) Epoch 15, batch 12450, loss[loss=0.1318, simple_loss=0.1937, pruned_loss=0.03498, over 4870.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2081, pruned_loss=0.03018, over 972456.18 frames.], batch size: 22, lr: 1.47e-04 2022-05-08 09:27:51,090 INFO [train.py:715] (0/8) Epoch 15, batch 12500, loss[loss=0.1275, simple_loss=0.1921, pruned_loss=0.03148, over 4735.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2076, pruned_loss=0.03014, over 972393.58 frames.], batch size: 16, lr: 1.47e-04 2022-05-08 09:28:29,285 INFO [train.py:715] (0/8) Epoch 15, batch 12550, loss[loss=0.1392, simple_loss=0.2073, pruned_loss=0.0355, over 4704.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2083, pruned_loss=0.03059, over 972444.42 frames.], batch size: 15, lr: 1.47e-04 2022-05-08 09:29:08,350 INFO [train.py:715] (0/8) Epoch 15, batch 12600, loss[loss=0.1594, simple_loss=0.2295, pruned_loss=0.04469, over 4937.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2088, pruned_loss=0.03049, over 973495.95 frames.], batch size: 39, lr: 1.47e-04 2022-05-08 09:29:46,864 INFO [train.py:715] (0/8) Epoch 15, batch 12650, loss[loss=0.1264, simple_loss=0.2053, pruned_loss=0.02381, over 4700.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2086, pruned_loss=0.03022, over 973128.25 frames.], batch size: 15, lr: 1.47e-04 2022-05-08 09:30:26,457 
INFO [train.py:715] (0/8) Epoch 15, batch 12700, loss[loss=0.1634, simple_loss=0.2368, pruned_loss=0.04499, over 4878.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2081, pruned_loss=0.03004, over 972197.77 frames.], batch size: 20, lr: 1.47e-04 2022-05-08 09:31:04,799 INFO [train.py:715] (0/8) Epoch 15, batch 12750, loss[loss=0.1384, simple_loss=0.2112, pruned_loss=0.03279, over 4874.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2086, pruned_loss=0.03049, over 972188.21 frames.], batch size: 16, lr: 1.47e-04 2022-05-08 09:31:43,633 INFO [train.py:715] (0/8) Epoch 15, batch 12800, loss[loss=0.1285, simple_loss=0.2038, pruned_loss=0.02663, over 4970.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2083, pruned_loss=0.03025, over 972705.83 frames.], batch size: 25, lr: 1.47e-04 2022-05-08 09:32:23,163 INFO [train.py:715] (0/8) Epoch 15, batch 12850, loss[loss=0.1401, simple_loss=0.2143, pruned_loss=0.03299, over 4871.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2082, pruned_loss=0.03014, over 972446.10 frames.], batch size: 16, lr: 1.47e-04 2022-05-08 09:33:01,775 INFO [train.py:715] (0/8) Epoch 15, batch 12900, loss[loss=0.1418, simple_loss=0.2101, pruned_loss=0.0367, over 4968.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2088, pruned_loss=0.03045, over 972784.84 frames.], batch size: 15, lr: 1.47e-04 2022-05-08 09:33:40,811 INFO [train.py:715] (0/8) Epoch 15, batch 12950, loss[loss=0.1327, simple_loss=0.2141, pruned_loss=0.02562, over 4847.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2082, pruned_loss=0.03042, over 972665.77 frames.], batch size: 15, lr: 1.47e-04 2022-05-08 09:34:20,124 INFO [train.py:715] (0/8) Epoch 15, batch 13000, loss[loss=0.1207, simple_loss=0.201, pruned_loss=0.02022, over 4896.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2094, pruned_loss=0.03054, over 972921.73 frames.], batch size: 19, lr: 1.47e-04 2022-05-08 09:34:59,662 INFO [train.py:715] (0/8) Epoch 15, batch 13050, loss[loss=0.1217, simple_loss=0.1969, pruned_loss=0.02325, over 4809.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2088, pruned_loss=0.03023, over 972171.65 frames.], batch size: 25, lr: 1.47e-04 2022-05-08 09:35:38,144 INFO [train.py:715] (0/8) Epoch 15, batch 13100, loss[loss=0.1234, simple_loss=0.2062, pruned_loss=0.02028, over 4860.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2083, pruned_loss=0.02976, over 971972.32 frames.], batch size: 20, lr: 1.47e-04 2022-05-08 09:36:17,619 INFO [train.py:715] (0/8) Epoch 15, batch 13150, loss[loss=0.1163, simple_loss=0.2003, pruned_loss=0.01618, over 4818.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2092, pruned_loss=0.02992, over 972862.83 frames.], batch size: 25, lr: 1.47e-04 2022-05-08 09:36:57,400 INFO [train.py:715] (0/8) Epoch 15, batch 13200, loss[loss=0.1213, simple_loss=0.2007, pruned_loss=0.02093, over 4812.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2092, pruned_loss=0.02999, over 972246.09 frames.], batch size: 25, lr: 1.47e-04 2022-05-08 09:37:35,196 INFO [train.py:715] (0/8) Epoch 15, batch 13250, loss[loss=0.1158, simple_loss=0.1961, pruned_loss=0.01777, over 4906.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2096, pruned_loss=0.03011, over 972914.37 frames.], batch size: 18, lr: 1.47e-04 2022-05-08 09:38:14,328 INFO [train.py:715] (0/8) Epoch 15, batch 13300, loss[loss=0.1559, simple_loss=0.2296, pruned_loss=0.04113, over 4908.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2095, pruned_loss=0.03039, over 972458.52 frames.], batch size: 29, lr: 1.47e-04 2022-05-08 09:38:53,944 INFO [train.py:715] 
(0/8) Epoch 15, batch 13350, loss[loss=0.1535, simple_loss=0.2164, pruned_loss=0.04529, over 4776.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2098, pruned_loss=0.03054, over 973574.62 frames.], batch size: 14, lr: 1.47e-04 2022-05-08 09:39:34,552 INFO [train.py:715] (0/8) Epoch 15, batch 13400, loss[loss=0.1358, simple_loss=0.2182, pruned_loss=0.02668, over 4983.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2093, pruned_loss=0.03048, over 973467.60 frames.], batch size: 40, lr: 1.47e-04 2022-05-08 09:40:13,181 INFO [train.py:715] (0/8) Epoch 15, batch 13450, loss[loss=0.1454, simple_loss=0.2292, pruned_loss=0.03076, over 4939.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2094, pruned_loss=0.03062, over 972700.27 frames.], batch size: 21, lr: 1.47e-04 2022-05-08 09:40:51,763 INFO [train.py:715] (0/8) Epoch 15, batch 13500, loss[loss=0.1446, simple_loss=0.2241, pruned_loss=0.03254, over 4850.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2082, pruned_loss=0.02973, over 972889.61 frames.], batch size: 13, lr: 1.47e-04 2022-05-08 09:41:31,303 INFO [train.py:715] (0/8) Epoch 15, batch 13550, loss[loss=0.122, simple_loss=0.1954, pruned_loss=0.02432, over 4638.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2077, pruned_loss=0.02948, over 972426.58 frames.], batch size: 13, lr: 1.47e-04 2022-05-08 09:42:09,580 INFO [train.py:715] (0/8) Epoch 15, batch 13600, loss[loss=0.1433, simple_loss=0.2212, pruned_loss=0.03266, over 4914.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2072, pruned_loss=0.02943, over 972082.93 frames.], batch size: 17, lr: 1.47e-04 2022-05-08 09:42:48,563 INFO [train.py:715] (0/8) Epoch 15, batch 13650, loss[loss=0.1469, simple_loss=0.2016, pruned_loss=0.04611, over 4775.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2074, pruned_loss=0.0297, over 972083.33 frames.], batch size: 18, lr: 1.47e-04 2022-05-08 09:43:27,801 INFO [train.py:715] (0/8) Epoch 15, batch 13700, loss[loss=0.1491, simple_loss=0.2179, pruned_loss=0.04022, over 4924.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2078, pruned_loss=0.02996, over 972225.19 frames.], batch size: 23, lr: 1.47e-04 2022-05-08 09:44:06,257 INFO [train.py:715] (0/8) Epoch 15, batch 13750, loss[loss=0.1113, simple_loss=0.1841, pruned_loss=0.01925, over 4827.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2077, pruned_loss=0.02966, over 971738.46 frames.], batch size: 30, lr: 1.47e-04 2022-05-08 09:44:44,975 INFO [train.py:715] (0/8) Epoch 15, batch 13800, loss[loss=0.1436, simple_loss=0.216, pruned_loss=0.03562, over 4786.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2079, pruned_loss=0.02998, over 971889.06 frames.], batch size: 17, lr: 1.47e-04 2022-05-08 09:45:23,197 INFO [train.py:715] (0/8) Epoch 15, batch 13850, loss[loss=0.1464, simple_loss=0.2121, pruned_loss=0.04033, over 4990.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2073, pruned_loss=0.03011, over 971631.81 frames.], batch size: 15, lr: 1.47e-04 2022-05-08 09:45:29,836 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-536000.pt 2022-05-08 09:46:05,201 INFO [train.py:715] (0/8) Epoch 15, batch 13900, loss[loss=0.1175, simple_loss=0.1809, pruned_loss=0.02706, over 4876.00 frames.], tot_loss[loss=0.133, simple_loss=0.2066, pruned_loss=0.0297, over 972064.84 frames.], batch size: 16, lr: 1.47e-04 2022-05-08 09:46:43,309 INFO [train.py:715] (0/8) Epoch 15, batch 13950, loss[loss=0.1168, simple_loss=0.1924, pruned_loss=0.02054, over 4763.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2071, 
pruned_loss=0.02973, over 971287.75 frames.], batch size: 12, lr: 1.47e-04 2022-05-08 09:47:21,599 INFO [train.py:715] (0/8) Epoch 15, batch 14000, loss[loss=0.1303, simple_loss=0.2021, pruned_loss=0.02925, over 4838.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2065, pruned_loss=0.02931, over 971016.95 frames.], batch size: 15, lr: 1.47e-04 2022-05-08 09:48:00,895 INFO [train.py:715] (0/8) Epoch 15, batch 14050, loss[loss=0.1269, simple_loss=0.2095, pruned_loss=0.02214, over 4784.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2063, pruned_loss=0.02925, over 970752.34 frames.], batch size: 17, lr: 1.47e-04 2022-05-08 09:48:38,843 INFO [train.py:715] (0/8) Epoch 15, batch 14100, loss[loss=0.1332, simple_loss=0.2001, pruned_loss=0.03317, over 4956.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2069, pruned_loss=0.02939, over 970998.24 frames.], batch size: 14, lr: 1.47e-04 2022-05-08 09:49:17,891 INFO [train.py:715] (0/8) Epoch 15, batch 14150, loss[loss=0.1094, simple_loss=0.1802, pruned_loss=0.01928, over 4938.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2072, pruned_loss=0.03005, over 970860.89 frames.], batch size: 23, lr: 1.47e-04 2022-05-08 09:49:56,534 INFO [train.py:715] (0/8) Epoch 15, batch 14200, loss[loss=0.1223, simple_loss=0.1953, pruned_loss=0.02465, over 4760.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2079, pruned_loss=0.03014, over 971381.29 frames.], batch size: 18, lr: 1.47e-04 2022-05-08 09:50:35,491 INFO [train.py:715] (0/8) Epoch 15, batch 14250, loss[loss=0.1321, simple_loss=0.2067, pruned_loss=0.02878, over 4766.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2085, pruned_loss=0.03069, over 971990.53 frames.], batch size: 17, lr: 1.47e-04 2022-05-08 09:51:13,318 INFO [train.py:715] (0/8) Epoch 15, batch 14300, loss[loss=0.1329, simple_loss=0.21, pruned_loss=0.02789, over 4834.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2081, pruned_loss=0.03023, over 971945.75 frames.], batch size: 20, lr: 1.47e-04 2022-05-08 09:51:51,764 INFO [train.py:715] (0/8) Epoch 15, batch 14350, loss[loss=0.1514, simple_loss=0.2283, pruned_loss=0.03721, over 4856.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2091, pruned_loss=0.03071, over 973163.06 frames.], batch size: 20, lr: 1.47e-04 2022-05-08 09:52:30,857 INFO [train.py:715] (0/8) Epoch 15, batch 14400, loss[loss=0.1746, simple_loss=0.2481, pruned_loss=0.05055, over 4851.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2088, pruned_loss=0.03011, over 973571.39 frames.], batch size: 15, lr: 1.47e-04 2022-05-08 09:53:08,604 INFO [train.py:715] (0/8) Epoch 15, batch 14450, loss[loss=0.1528, simple_loss=0.2303, pruned_loss=0.03768, over 4980.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2084, pruned_loss=0.03008, over 973124.66 frames.], batch size: 15, lr: 1.47e-04 2022-05-08 09:53:47,582 INFO [train.py:715] (0/8) Epoch 15, batch 14500, loss[loss=0.1662, simple_loss=0.2286, pruned_loss=0.05189, over 4884.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2096, pruned_loss=0.03073, over 973347.70 frames.], batch size: 19, lr: 1.47e-04 2022-05-08 09:54:25,856 INFO [train.py:715] (0/8) Epoch 15, batch 14550, loss[loss=0.131, simple_loss=0.2028, pruned_loss=0.02961, over 4971.00 frames.], tot_loss[loss=0.1348, simple_loss=0.209, pruned_loss=0.03037, over 972896.06 frames.], batch size: 15, lr: 1.47e-04 2022-05-08 09:55:04,846 INFO [train.py:715] (0/8) Epoch 15, batch 14600, loss[loss=0.1209, simple_loss=0.1955, pruned_loss=0.02317, over 4937.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2085, pruned_loss=0.03027, 
over 973146.22 frames.], batch size: 29, lr: 1.47e-04 2022-05-08 09:55:42,670 INFO [train.py:715] (0/8) Epoch 15, batch 14650, loss[loss=0.1668, simple_loss=0.2455, pruned_loss=0.044, over 4802.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2091, pruned_loss=0.03039, over 972609.93 frames.], batch size: 21, lr: 1.47e-04 2022-05-08 09:56:20,648 INFO [train.py:715] (0/8) Epoch 15, batch 14700, loss[loss=0.1299, simple_loss=0.2016, pruned_loss=0.0291, over 4761.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2095, pruned_loss=0.03065, over 972931.23 frames.], batch size: 16, lr: 1.47e-04 2022-05-08 09:56:59,717 INFO [train.py:715] (0/8) Epoch 15, batch 14750, loss[loss=0.1242, simple_loss=0.1935, pruned_loss=0.02741, over 4828.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2088, pruned_loss=0.03049, over 972266.98 frames.], batch size: 30, lr: 1.47e-04 2022-05-08 09:57:37,351 INFO [train.py:715] (0/8) Epoch 15, batch 14800, loss[loss=0.1364, simple_loss=0.2191, pruned_loss=0.02682, over 4958.00 frames.], tot_loss[loss=0.135, simple_loss=0.2089, pruned_loss=0.03053, over 971717.50 frames.], batch size: 24, lr: 1.47e-04 2022-05-08 09:58:16,191 INFO [train.py:715] (0/8) Epoch 15, batch 14850, loss[loss=0.1302, simple_loss=0.2158, pruned_loss=0.0223, over 4979.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2083, pruned_loss=0.03, over 971787.20 frames.], batch size: 26, lr: 1.47e-04 2022-05-08 09:58:55,095 INFO [train.py:715] (0/8) Epoch 15, batch 14900, loss[loss=0.1219, simple_loss=0.1951, pruned_loss=0.02439, over 4879.00 frames.], tot_loss[loss=0.1347, simple_loss=0.209, pruned_loss=0.03017, over 971938.27 frames.], batch size: 22, lr: 1.47e-04 2022-05-08 09:59:33,269 INFO [train.py:715] (0/8) Epoch 15, batch 14950, loss[loss=0.1358, simple_loss=0.2006, pruned_loss=0.03556, over 4946.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2086, pruned_loss=0.03014, over 972472.52 frames.], batch size: 35, lr: 1.47e-04 2022-05-08 10:00:11,562 INFO [train.py:715] (0/8) Epoch 15, batch 15000, loss[loss=0.1263, simple_loss=0.202, pruned_loss=0.02527, over 4945.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2079, pruned_loss=0.02986, over 972711.37 frames.], batch size: 29, lr: 1.47e-04 2022-05-08 10:00:11,563 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 10:00:26,345 INFO [train.py:742] (0/8) Epoch 15, validation: loss=0.1051, simple_loss=0.1887, pruned_loss=0.01077, over 914524.00 frames. 
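The three loss figures in each record above are numerically consistent with the total being a weighted sum of the other two, loss ≈ 0.5 * simple_loss + pruned_loss (an observation from the logged values only, not a statement about the exact code path in train.py). A minimal check against two records from this section, with the figures copied verbatim from the log:

# Illustrative sanity check: the logged totals are consistent with
# loss ~= 0.5 * simple_loss + pruned_loss.  Values copied from the log above.
records = [
    # (loss, simple_loss, pruned_loss)
    (0.1341, 0.2081, 0.03004),   # Epoch 15, batch 12700, tot_loss
    (0.1051, 0.1887, 0.01077),   # Epoch 15, batch 15000, validation
]
for loss, simple, pruned in records:
    reconstructed = 0.5 * simple + pruned
    assert abs(reconstructed - loss) < 5e-4, (loss, reconstructed)
    print(f"logged={loss:.4f}  0.5*simple+pruned={reconstructed:.4f}")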
2022-05-08 10:01:05,810 INFO [train.py:715] (0/8) Epoch 15, batch 15050, loss[loss=0.1823, simple_loss=0.2432, pruned_loss=0.0607, over 4776.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2082, pruned_loss=0.03018, over 972767.80 frames.], batch size: 14, lr: 1.47e-04 2022-05-08 10:01:43,976 INFO [train.py:715] (0/8) Epoch 15, batch 15100, loss[loss=0.1298, simple_loss=0.2062, pruned_loss=0.02675, over 4892.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2083, pruned_loss=0.03038, over 972772.61 frames.], batch size: 22, lr: 1.47e-04 2022-05-08 10:02:23,328 INFO [train.py:715] (0/8) Epoch 15, batch 15150, loss[loss=0.13, simple_loss=0.1995, pruned_loss=0.03024, over 4980.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2094, pruned_loss=0.03086, over 972518.31 frames.], batch size: 28, lr: 1.47e-04 2022-05-08 10:03:01,051 INFO [train.py:715] (0/8) Epoch 15, batch 15200, loss[loss=0.1365, simple_loss=0.2078, pruned_loss=0.03264, over 4889.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2095, pruned_loss=0.03088, over 972743.87 frames.], batch size: 19, lr: 1.47e-04 2022-05-08 10:03:39,349 INFO [train.py:715] (0/8) Epoch 15, batch 15250, loss[loss=0.1639, simple_loss=0.2311, pruned_loss=0.04839, over 4972.00 frames.], tot_loss[loss=0.136, simple_loss=0.2101, pruned_loss=0.03097, over 971794.73 frames.], batch size: 14, lr: 1.47e-04 2022-05-08 10:04:18,901 INFO [train.py:715] (0/8) Epoch 15, batch 15300, loss[loss=0.1518, simple_loss=0.2325, pruned_loss=0.03552, over 4781.00 frames.], tot_loss[loss=0.136, simple_loss=0.2105, pruned_loss=0.03074, over 971492.63 frames.], batch size: 18, lr: 1.47e-04 2022-05-08 10:04:56,981 INFO [train.py:715] (0/8) Epoch 15, batch 15350, loss[loss=0.1401, simple_loss=0.2131, pruned_loss=0.03351, over 4890.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2099, pruned_loss=0.03041, over 971297.82 frames.], batch size: 19, lr: 1.47e-04 2022-05-08 10:05:35,888 INFO [train.py:715] (0/8) Epoch 15, batch 15400, loss[loss=0.1262, simple_loss=0.1911, pruned_loss=0.03064, over 4945.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2094, pruned_loss=0.03041, over 972016.30 frames.], batch size: 21, lr: 1.47e-04 2022-05-08 10:06:13,984 INFO [train.py:715] (0/8) Epoch 15, batch 15450, loss[loss=0.1496, simple_loss=0.2357, pruned_loss=0.03176, over 4914.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2098, pruned_loss=0.03073, over 971782.26 frames.], batch size: 29, lr: 1.47e-04 2022-05-08 10:06:52,878 INFO [train.py:715] (0/8) Epoch 15, batch 15500, loss[loss=0.1184, simple_loss=0.1927, pruned_loss=0.02204, over 4801.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2104, pruned_loss=0.03097, over 971522.67 frames.], batch size: 21, lr: 1.47e-04 2022-05-08 10:07:31,443 INFO [train.py:715] (0/8) Epoch 15, batch 15550, loss[loss=0.1447, simple_loss=0.2086, pruned_loss=0.04039, over 4849.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2099, pruned_loss=0.0312, over 971448.63 frames.], batch size: 13, lr: 1.47e-04 2022-05-08 10:08:10,326 INFO [train.py:715] (0/8) Epoch 15, batch 15600, loss[loss=0.1468, simple_loss=0.2164, pruned_loss=0.03856, over 4871.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2092, pruned_loss=0.03109, over 971488.48 frames.], batch size: 32, lr: 1.47e-04 2022-05-08 10:08:49,132 INFO [train.py:715] (0/8) Epoch 15, batch 15650, loss[loss=0.1289, simple_loss=0.2007, pruned_loss=0.02858, over 4969.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2081, pruned_loss=0.03047, over 971337.97 frames.], batch size: 15, lr: 1.47e-04 2022-05-08 10:09:27,214 
INFO [train.py:715] (0/8) Epoch 15, batch 15700, loss[loss=0.1184, simple_loss=0.2072, pruned_loss=0.01483, over 4984.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2084, pruned_loss=0.03009, over 970920.43 frames.], batch size: 28, lr: 1.47e-04 2022-05-08 10:10:05,785 INFO [train.py:715] (0/8) Epoch 15, batch 15750, loss[loss=0.1321, simple_loss=0.2212, pruned_loss=0.0215, over 4743.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2084, pruned_loss=0.03006, over 971557.54 frames.], batch size: 16, lr: 1.47e-04 2022-05-08 10:10:44,346 INFO [train.py:715] (0/8) Epoch 15, batch 15800, loss[loss=0.1355, simple_loss=0.204, pruned_loss=0.03356, over 4908.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2076, pruned_loss=0.02963, over 971614.25 frames.], batch size: 18, lr: 1.47e-04 2022-05-08 10:11:23,018 INFO [train.py:715] (0/8) Epoch 15, batch 15850, loss[loss=0.1201, simple_loss=0.1948, pruned_loss=0.02273, over 4815.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2082, pruned_loss=0.03004, over 971683.30 frames.], batch size: 27, lr: 1.47e-04 2022-05-08 10:12:01,144 INFO [train.py:715] (0/8) Epoch 15, batch 15900, loss[loss=0.1358, simple_loss=0.2116, pruned_loss=0.02999, over 4706.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2084, pruned_loss=0.02997, over 972794.99 frames.], batch size: 15, lr: 1.47e-04 2022-05-08 10:12:39,304 INFO [train.py:715] (0/8) Epoch 15, batch 15950, loss[loss=0.137, simple_loss=0.2194, pruned_loss=0.02724, over 4791.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2091, pruned_loss=0.03037, over 972706.68 frames.], batch size: 21, lr: 1.47e-04 2022-05-08 10:13:18,367 INFO [train.py:715] (0/8) Epoch 15, batch 16000, loss[loss=0.1403, simple_loss=0.2139, pruned_loss=0.03336, over 4865.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2094, pruned_loss=0.03042, over 972953.75 frames.], batch size: 20, lr: 1.47e-04 2022-05-08 10:13:55,999 INFO [train.py:715] (0/8) Epoch 15, batch 16050, loss[loss=0.1225, simple_loss=0.192, pruned_loss=0.0265, over 4923.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2089, pruned_loss=0.03007, over 972835.05 frames.], batch size: 23, lr: 1.47e-04 2022-05-08 10:14:34,589 INFO [train.py:715] (0/8) Epoch 15, batch 16100, loss[loss=0.1467, simple_loss=0.2228, pruned_loss=0.03533, over 4949.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2085, pruned_loss=0.02969, over 972929.23 frames.], batch size: 29, lr: 1.47e-04 2022-05-08 10:15:13,036 INFO [train.py:715] (0/8) Epoch 15, batch 16150, loss[loss=0.1379, simple_loss=0.2089, pruned_loss=0.03342, over 4721.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2078, pruned_loss=0.02953, over 973004.78 frames.], batch size: 16, lr: 1.47e-04 2022-05-08 10:15:51,553 INFO [train.py:715] (0/8) Epoch 15, batch 16200, loss[loss=0.1242, simple_loss=0.2021, pruned_loss=0.02317, over 4790.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2082, pruned_loss=0.03, over 973569.55 frames.], batch size: 17, lr: 1.47e-04 2022-05-08 10:16:29,835 INFO [train.py:715] (0/8) Epoch 15, batch 16250, loss[loss=0.1202, simple_loss=0.2072, pruned_loss=0.01658, over 4817.00 frames.], tot_loss[loss=0.134, simple_loss=0.2082, pruned_loss=0.02992, over 973561.12 frames.], batch size: 13, lr: 1.47e-04 2022-05-08 10:17:08,214 INFO [train.py:715] (0/8) Epoch 15, batch 16300, loss[loss=0.1061, simple_loss=0.1878, pruned_loss=0.0122, over 4783.00 frames.], tot_loss[loss=0.1338, simple_loss=0.208, pruned_loss=0.02977, over 972767.96 frames.], batch size: 14, lr: 1.47e-04 2022-05-08 10:17:46,750 INFO [train.py:715] (0/8) 
Epoch 15, batch 16350, loss[loss=0.125, simple_loss=0.2096, pruned_loss=0.0202, over 4839.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2082, pruned_loss=0.02961, over 972051.19 frames.], batch size: 13, lr: 1.47e-04 2022-05-08 10:18:24,619 INFO [train.py:715] (0/8) Epoch 15, batch 16400, loss[loss=0.1279, simple_loss=0.2112, pruned_loss=0.02233, over 4865.00 frames.], tot_loss[loss=0.1336, simple_loss=0.208, pruned_loss=0.02957, over 972540.96 frames.], batch size: 16, lr: 1.47e-04 2022-05-08 10:19:03,501 INFO [train.py:715] (0/8) Epoch 15, batch 16450, loss[loss=0.1236, simple_loss=0.198, pruned_loss=0.02457, over 4959.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2074, pruned_loss=0.02967, over 972937.85 frames.], batch size: 24, lr: 1.47e-04 2022-05-08 10:19:41,757 INFO [train.py:715] (0/8) Epoch 15, batch 16500, loss[loss=0.1365, simple_loss=0.2147, pruned_loss=0.02917, over 4905.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2073, pruned_loss=0.02993, over 972704.64 frames.], batch size: 19, lr: 1.47e-04 2022-05-08 10:20:20,129 INFO [train.py:715] (0/8) Epoch 15, batch 16550, loss[loss=0.1353, simple_loss=0.1933, pruned_loss=0.03868, over 4790.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2083, pruned_loss=0.0306, over 972270.63 frames.], batch size: 14, lr: 1.47e-04 2022-05-08 10:20:58,276 INFO [train.py:715] (0/8) Epoch 15, batch 16600, loss[loss=0.1148, simple_loss=0.1909, pruned_loss=0.01933, over 4986.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2074, pruned_loss=0.02999, over 972214.36 frames.], batch size: 15, lr: 1.47e-04 2022-05-08 10:21:37,033 INFO [train.py:715] (0/8) Epoch 15, batch 16650, loss[loss=0.12, simple_loss=0.2047, pruned_loss=0.01766, over 4681.00 frames.], tot_loss[loss=0.1332, simple_loss=0.207, pruned_loss=0.02976, over 971798.78 frames.], batch size: 15, lr: 1.47e-04 2022-05-08 10:22:16,799 INFO [train.py:715] (0/8) Epoch 15, batch 16700, loss[loss=0.1442, simple_loss=0.216, pruned_loss=0.03621, over 4984.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2074, pruned_loss=0.02972, over 972767.84 frames.], batch size: 33, lr: 1.47e-04 2022-05-08 10:22:55,520 INFO [train.py:715] (0/8) Epoch 15, batch 16750, loss[loss=0.1134, simple_loss=0.1855, pruned_loss=0.02066, over 4842.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2075, pruned_loss=0.02979, over 972811.84 frames.], batch size: 15, lr: 1.47e-04 2022-05-08 10:23:34,511 INFO [train.py:715] (0/8) Epoch 15, batch 16800, loss[loss=0.125, simple_loss=0.2081, pruned_loss=0.02098, over 4822.00 frames.], tot_loss[loss=0.1327, simple_loss=0.207, pruned_loss=0.02922, over 973682.92 frames.], batch size: 26, lr: 1.47e-04 2022-05-08 10:24:13,667 INFO [train.py:715] (0/8) Epoch 15, batch 16850, loss[loss=0.1192, simple_loss=0.1819, pruned_loss=0.02831, over 4959.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2074, pruned_loss=0.02959, over 973421.58 frames.], batch size: 35, lr: 1.47e-04 2022-05-08 10:24:52,751 INFO [train.py:715] (0/8) Epoch 15, batch 16900, loss[loss=0.1483, simple_loss=0.2249, pruned_loss=0.03591, over 4953.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2079, pruned_loss=0.02993, over 973022.96 frames.], batch size: 35, lr: 1.47e-04 2022-05-08 10:25:31,719 INFO [train.py:715] (0/8) Epoch 15, batch 16950, loss[loss=0.1227, simple_loss=0.194, pruned_loss=0.0257, over 4812.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2083, pruned_loss=0.03009, over 973172.49 frames.], batch size: 26, lr: 1.47e-04 2022-05-08 10:26:10,070 INFO [train.py:715] (0/8) Epoch 15, batch 17000, 
loss[loss=0.1215, simple_loss=0.1895, pruned_loss=0.02679, over 4935.00 frames.], tot_loss[loss=0.1338, simple_loss=0.208, pruned_loss=0.02982, over 972837.64 frames.], batch size: 35, lr: 1.47e-04 2022-05-08 10:26:49,332 INFO [train.py:715] (0/8) Epoch 15, batch 17050, loss[loss=0.1271, simple_loss=0.1953, pruned_loss=0.02948, over 4914.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2082, pruned_loss=0.02964, over 972311.61 frames.], batch size: 19, lr: 1.47e-04 2022-05-08 10:27:27,260 INFO [train.py:715] (0/8) Epoch 15, batch 17100, loss[loss=0.1356, simple_loss=0.2185, pruned_loss=0.02641, over 4807.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2087, pruned_loss=0.0298, over 972679.12 frames.], batch size: 14, lr: 1.47e-04 2022-05-08 10:28:06,073 INFO [train.py:715] (0/8) Epoch 15, batch 17150, loss[loss=0.1377, simple_loss=0.2144, pruned_loss=0.03051, over 4753.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2082, pruned_loss=0.02949, over 973055.02 frames.], batch size: 16, lr: 1.47e-04 2022-05-08 10:28:44,475 INFO [train.py:715] (0/8) Epoch 15, batch 17200, loss[loss=0.1354, simple_loss=0.2061, pruned_loss=0.03239, over 4926.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2084, pruned_loss=0.02994, over 973178.72 frames.], batch size: 18, lr: 1.47e-04 2022-05-08 10:29:23,144 INFO [train.py:715] (0/8) Epoch 15, batch 17250, loss[loss=0.1172, simple_loss=0.2, pruned_loss=0.01722, over 4767.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2088, pruned_loss=0.03015, over 972664.06 frames.], batch size: 19, lr: 1.47e-04 2022-05-08 10:30:01,720 INFO [train.py:715] (0/8) Epoch 15, batch 17300, loss[loss=0.1488, simple_loss=0.2218, pruned_loss=0.03796, over 4805.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2094, pruned_loss=0.03049, over 972387.85 frames.], batch size: 14, lr: 1.47e-04 2022-05-08 10:30:40,359 INFO [train.py:715] (0/8) Epoch 15, batch 17350, loss[loss=0.1226, simple_loss=0.2064, pruned_loss=0.01939, over 4787.00 frames.], tot_loss[loss=0.135, simple_loss=0.2091, pruned_loss=0.03047, over 973087.31 frames.], batch size: 14, lr: 1.47e-04 2022-05-08 10:31:19,949 INFO [train.py:715] (0/8) Epoch 15, batch 17400, loss[loss=0.1129, simple_loss=0.1859, pruned_loss=0.01996, over 4838.00 frames.], tot_loss[loss=0.1349, simple_loss=0.209, pruned_loss=0.03042, over 972955.00 frames.], batch size: 30, lr: 1.47e-04 2022-05-08 10:31:57,889 INFO [train.py:715] (0/8) Epoch 15, batch 17450, loss[loss=0.1505, simple_loss=0.2269, pruned_loss=0.03706, over 4965.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2092, pruned_loss=0.03014, over 973549.45 frames.], batch size: 35, lr: 1.47e-04 2022-05-08 10:32:36,868 INFO [train.py:715] (0/8) Epoch 15, batch 17500, loss[loss=0.123, simple_loss=0.1995, pruned_loss=0.02327, over 4708.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2099, pruned_loss=0.03053, over 973299.98 frames.], batch size: 15, lr: 1.47e-04 2022-05-08 10:33:15,842 INFO [train.py:715] (0/8) Epoch 15, batch 17550, loss[loss=0.1444, simple_loss=0.2127, pruned_loss=0.03805, over 4821.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2098, pruned_loss=0.03043, over 972288.15 frames.], batch size: 26, lr: 1.47e-04 2022-05-08 10:33:54,431 INFO [train.py:715] (0/8) Epoch 15, batch 17600, loss[loss=0.1456, simple_loss=0.2162, pruned_loss=0.03754, over 4850.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2093, pruned_loss=0.03001, over 972918.12 frames.], batch size: 32, lr: 1.47e-04 2022-05-08 10:34:32,811 INFO [train.py:715] (0/8) Epoch 15, batch 17650, loss[loss=0.1323, 
simple_loss=0.2007, pruned_loss=0.03194, over 4898.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2088, pruned_loss=0.03015, over 972894.53 frames.], batch size: 19, lr: 1.47e-04 2022-05-08 10:35:11,433 INFO [train.py:715] (0/8) Epoch 15, batch 17700, loss[loss=0.1629, simple_loss=0.2373, pruned_loss=0.04423, over 4890.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2089, pruned_loss=0.03023, over 973355.34 frames.], batch size: 38, lr: 1.47e-04 2022-05-08 10:35:50,322 INFO [train.py:715] (0/8) Epoch 15, batch 17750, loss[loss=0.1319, simple_loss=0.2141, pruned_loss=0.02487, over 4943.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2087, pruned_loss=0.03019, over 973838.42 frames.], batch size: 21, lr: 1.47e-04 2022-05-08 10:36:28,682 INFO [train.py:715] (0/8) Epoch 15, batch 17800, loss[loss=0.1362, simple_loss=0.2071, pruned_loss=0.03262, over 4924.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2073, pruned_loss=0.02978, over 973015.75 frames.], batch size: 18, lr: 1.47e-04 2022-05-08 10:37:07,664 INFO [train.py:715] (0/8) Epoch 15, batch 17850, loss[loss=0.1497, simple_loss=0.2204, pruned_loss=0.03947, over 4767.00 frames.], tot_loss[loss=0.133, simple_loss=0.2069, pruned_loss=0.02953, over 973804.28 frames.], batch size: 16, lr: 1.47e-04 2022-05-08 10:37:46,654 INFO [train.py:715] (0/8) Epoch 15, batch 17900, loss[loss=0.1157, simple_loss=0.1921, pruned_loss=0.0196, over 4760.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2072, pruned_loss=0.02991, over 973858.31 frames.], batch size: 19, lr: 1.47e-04 2022-05-08 10:38:25,485 INFO [train.py:715] (0/8) Epoch 15, batch 17950, loss[loss=0.1397, simple_loss=0.2189, pruned_loss=0.03031, over 4815.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2075, pruned_loss=0.03008, over 972995.14 frames.], batch size: 25, lr: 1.47e-04 2022-05-08 10:39:03,817 INFO [train.py:715] (0/8) Epoch 15, batch 18000, loss[loss=0.129, simple_loss=0.2162, pruned_loss=0.02093, over 4762.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2071, pruned_loss=0.02998, over 972575.74 frames.], batch size: 19, lr: 1.47e-04 2022-05-08 10:39:03,818 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 10:39:13,330 INFO [train.py:742] (0/8) Epoch 15, validation: loss=0.1048, simple_loss=0.1885, pruned_loss=0.01059, over 914524.00 frames. 
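The tot_loss entries above are reported "over" roughly 971k to 974k frames, and that frame count stays essentially constant as training advances, which suggests a decayed, frame-weighted running average rather than a plain cumulative sum. A minimal sketch of such an accumulator, assuming an exponential decay chosen only for illustration (this is not the tracker actually used by train.py):

# Hypothetical sketch: an exponentially decayed, frame-weighted loss accumulator.
# The decay keeps the effective frame count bounded, matching the way the
# "over ~972k frames" figure above stays roughly constant during training.
class DecayedLoss:
    def __init__(self, decay: float = 0.995) -> None:
        self.decay = decay
        self.loss_sum = 0.0   # decayed sum of (per-frame loss * frames)
        self.frames = 0.0     # decayed sum of frames

    def update(self, batch_loss: float, batch_frames: float) -> None:
        # batch_loss is already normalised per frame, so weight it by frame count.
        self.loss_sum = self.decay * self.loss_sum + batch_loss * batch_frames
        self.frames = self.decay * self.frames + batch_frames

    @property
    def value(self) -> float:
        return self.loss_sum / max(self.frames, 1.0)

tracker = DecayedLoss()
tracker.update(0.1634, 4878.0)   # per-batch figures copied from the log above
tracker.update(0.1384, 4874.0)
print(f"running loss = {tracker.value:.4f} over {tracker.frames:.0f} frames")

With roughly 4,900 frames per batch, a decay of 0.995 settles near 980k effective frames, in the same range as the ~972k reported in the log.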
2022-05-08 10:39:51,807 INFO [train.py:715] (0/8) Epoch 15, batch 18050, loss[loss=0.1287, simple_loss=0.2079, pruned_loss=0.02475, over 4911.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2083, pruned_loss=0.03032, over 972889.52 frames.], batch size: 18, lr: 1.47e-04 2022-05-08 10:40:30,475 INFO [train.py:715] (0/8) Epoch 15, batch 18100, loss[loss=0.1339, simple_loss=0.2168, pruned_loss=0.02553, over 4957.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2087, pruned_loss=0.0302, over 973280.43 frames.], batch size: 24, lr: 1.46e-04 2022-05-08 10:41:09,232 INFO [train.py:715] (0/8) Epoch 15, batch 18150, loss[loss=0.1339, simple_loss=0.217, pruned_loss=0.02536, over 4744.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2083, pruned_loss=0.03007, over 972581.76 frames.], batch size: 16, lr: 1.46e-04 2022-05-08 10:41:47,119 INFO [train.py:715] (0/8) Epoch 15, batch 18200, loss[loss=0.1557, simple_loss=0.2338, pruned_loss=0.0388, over 4787.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2073, pruned_loss=0.02965, over 971046.63 frames.], batch size: 14, lr: 1.46e-04 2022-05-08 10:42:25,780 INFO [train.py:715] (0/8) Epoch 15, batch 18250, loss[loss=0.159, simple_loss=0.2396, pruned_loss=0.03924, over 4740.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2074, pruned_loss=0.02952, over 971491.88 frames.], batch size: 16, lr: 1.46e-04 2022-05-08 10:43:04,442 INFO [train.py:715] (0/8) Epoch 15, batch 18300, loss[loss=0.112, simple_loss=0.1909, pruned_loss=0.01655, over 4882.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2076, pruned_loss=0.02933, over 971661.38 frames.], batch size: 16, lr: 1.46e-04 2022-05-08 10:43:42,530 INFO [train.py:715] (0/8) Epoch 15, batch 18350, loss[loss=0.1553, simple_loss=0.2259, pruned_loss=0.04234, over 4699.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2077, pruned_loss=0.02958, over 971575.68 frames.], batch size: 15, lr: 1.46e-04 2022-05-08 10:44:21,119 INFO [train.py:715] (0/8) Epoch 15, batch 18400, loss[loss=0.1453, simple_loss=0.2196, pruned_loss=0.03545, over 4978.00 frames.], tot_loss[loss=0.1348, simple_loss=0.209, pruned_loss=0.03025, over 971354.52 frames.], batch size: 25, lr: 1.46e-04 2022-05-08 10:44:59,617 INFO [train.py:715] (0/8) Epoch 15, batch 18450, loss[loss=0.1311, simple_loss=0.219, pruned_loss=0.02156, over 4931.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2098, pruned_loss=0.03059, over 971082.66 frames.], batch size: 18, lr: 1.46e-04 2022-05-08 10:45:38,877 INFO [train.py:715] (0/8) Epoch 15, batch 18500, loss[loss=0.144, simple_loss=0.2219, pruned_loss=0.03306, over 4912.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2099, pruned_loss=0.03062, over 971325.42 frames.], batch size: 39, lr: 1.46e-04 2022-05-08 10:46:17,372 INFO [train.py:715] (0/8) Epoch 15, batch 18550, loss[loss=0.1124, simple_loss=0.1852, pruned_loss=0.0198, over 4819.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2102, pruned_loss=0.03047, over 970954.12 frames.], batch size: 26, lr: 1.46e-04 2022-05-08 10:46:55,961 INFO [train.py:715] (0/8) Epoch 15, batch 18600, loss[loss=0.126, simple_loss=0.202, pruned_loss=0.02496, over 4831.00 frames.], tot_loss[loss=0.1356, simple_loss=0.21, pruned_loss=0.03055, over 971151.23 frames.], batch size: 26, lr: 1.46e-04 2022-05-08 10:47:34,878 INFO [train.py:715] (0/8) Epoch 15, batch 18650, loss[loss=0.1069, simple_loss=0.183, pruned_loss=0.01536, over 4831.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2097, pruned_loss=0.03043, over 971154.93 frames.], batch size: 26, lr: 1.46e-04 2022-05-08 10:48:13,534 INFO 
[train.py:715] (0/8) Epoch 15, batch 18700, loss[loss=0.1403, simple_loss=0.216, pruned_loss=0.03229, over 4756.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2094, pruned_loss=0.03052, over 971401.90 frames.], batch size: 19, lr: 1.46e-04 2022-05-08 10:48:52,382 INFO [train.py:715] (0/8) Epoch 15, batch 18750, loss[loss=0.1312, simple_loss=0.2072, pruned_loss=0.02764, over 4945.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2097, pruned_loss=0.03077, over 971397.82 frames.], batch size: 23, lr: 1.46e-04 2022-05-08 10:49:31,671 INFO [train.py:715] (0/8) Epoch 15, batch 18800, loss[loss=0.1108, simple_loss=0.1776, pruned_loss=0.02195, over 4837.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2098, pruned_loss=0.03081, over 971609.24 frames.], batch size: 13, lr: 1.46e-04 2022-05-08 10:50:10,915 INFO [train.py:715] (0/8) Epoch 15, batch 18850, loss[loss=0.134, simple_loss=0.1967, pruned_loss=0.0356, over 4815.00 frames.], tot_loss[loss=0.135, simple_loss=0.209, pruned_loss=0.03047, over 971961.89 frames.], batch size: 13, lr: 1.46e-04 2022-05-08 10:50:49,335 INFO [train.py:715] (0/8) Epoch 15, batch 18900, loss[loss=0.1362, simple_loss=0.2037, pruned_loss=0.03435, over 4737.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2093, pruned_loss=0.03063, over 971899.65 frames.], batch size: 16, lr: 1.46e-04 2022-05-08 10:51:28,546 INFO [train.py:715] (0/8) Epoch 15, batch 18950, loss[loss=0.1185, simple_loss=0.1869, pruned_loss=0.02503, over 4782.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2084, pruned_loss=0.03046, over 972757.02 frames.], batch size: 17, lr: 1.46e-04 2022-05-08 10:52:07,866 INFO [train.py:715] (0/8) Epoch 15, batch 19000, loss[loss=0.1004, simple_loss=0.1747, pruned_loss=0.01305, over 4642.00 frames.], tot_loss[loss=0.1333, simple_loss=0.207, pruned_loss=0.02982, over 971396.67 frames.], batch size: 13, lr: 1.46e-04 2022-05-08 10:52:46,219 INFO [train.py:715] (0/8) Epoch 15, batch 19050, loss[loss=0.1387, simple_loss=0.1971, pruned_loss=0.0402, over 4772.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2063, pruned_loss=0.02945, over 970991.07 frames.], batch size: 12, lr: 1.46e-04 2022-05-08 10:53:25,393 INFO [train.py:715] (0/8) Epoch 15, batch 19100, loss[loss=0.1369, simple_loss=0.2118, pruned_loss=0.03103, over 4787.00 frames.], tot_loss[loss=0.133, simple_loss=0.2067, pruned_loss=0.02963, over 971156.85 frames.], batch size: 24, lr: 1.46e-04 2022-05-08 10:54:03,694 INFO [train.py:715] (0/8) Epoch 15, batch 19150, loss[loss=0.1228, simple_loss=0.2039, pruned_loss=0.02085, over 4793.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2069, pruned_loss=0.02981, over 971375.33 frames.], batch size: 24, lr: 1.46e-04 2022-05-08 10:54:41,927 INFO [train.py:715] (0/8) Epoch 15, batch 19200, loss[loss=0.1401, simple_loss=0.2195, pruned_loss=0.0303, over 4909.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2074, pruned_loss=0.02995, over 972065.09 frames.], batch size: 23, lr: 1.46e-04 2022-05-08 10:55:19,944 INFO [train.py:715] (0/8) Epoch 15, batch 19250, loss[loss=0.1315, simple_loss=0.2065, pruned_loss=0.02829, over 4899.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2074, pruned_loss=0.02994, over 972720.02 frames.], batch size: 17, lr: 1.46e-04 2022-05-08 10:55:58,062 INFO [train.py:715] (0/8) Epoch 15, batch 19300, loss[loss=0.1489, simple_loss=0.2206, pruned_loss=0.03863, over 4914.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2073, pruned_loss=0.02972, over 972252.04 frames.], batch size: 17, lr: 1.46e-04 2022-05-08 10:56:36,947 INFO [train.py:715] (0/8) Epoch 
15, batch 19350, loss[loss=0.1171, simple_loss=0.1904, pruned_loss=0.02193, over 4888.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2073, pruned_loss=0.02966, over 972815.48 frames.], batch size: 22, lr: 1.46e-04 2022-05-08 10:57:14,725 INFO [train.py:715] (0/8) Epoch 15, batch 19400, loss[loss=0.1144, simple_loss=0.1812, pruned_loss=0.02377, over 4988.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2073, pruned_loss=0.02996, over 973067.68 frames.], batch size: 25, lr: 1.46e-04 2022-05-08 10:57:53,611 INFO [train.py:715] (0/8) Epoch 15, batch 19450, loss[loss=0.1191, simple_loss=0.2024, pruned_loss=0.01788, over 4833.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2071, pruned_loss=0.03002, over 972976.50 frames.], batch size: 26, lr: 1.46e-04 2022-05-08 10:58:31,633 INFO [train.py:715] (0/8) Epoch 15, batch 19500, loss[loss=0.1311, simple_loss=0.216, pruned_loss=0.0231, over 4780.00 frames.], tot_loss[loss=0.133, simple_loss=0.2068, pruned_loss=0.02961, over 972145.54 frames.], batch size: 14, lr: 1.46e-04 2022-05-08 10:59:09,767 INFO [train.py:715] (0/8) Epoch 15, batch 19550, loss[loss=0.1305, simple_loss=0.2056, pruned_loss=0.02766, over 4870.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2075, pruned_loss=0.02984, over 972428.22 frames.], batch size: 20, lr: 1.46e-04 2022-05-08 10:59:48,215 INFO [train.py:715] (0/8) Epoch 15, batch 19600, loss[loss=0.1021, simple_loss=0.1744, pruned_loss=0.01484, over 4861.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2076, pruned_loss=0.03006, over 972721.76 frames.], batch size: 13, lr: 1.46e-04 2022-05-08 11:00:26,253 INFO [train.py:715] (0/8) Epoch 15, batch 19650, loss[loss=0.148, simple_loss=0.2366, pruned_loss=0.02971, over 4748.00 frames.], tot_loss[loss=0.134, simple_loss=0.2078, pruned_loss=0.0301, over 971434.78 frames.], batch size: 16, lr: 1.46e-04 2022-05-08 11:01:05,290 INFO [train.py:715] (0/8) Epoch 15, batch 19700, loss[loss=0.1812, simple_loss=0.2619, pruned_loss=0.05024, over 4935.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2075, pruned_loss=0.02997, over 971754.00 frames.], batch size: 39, lr: 1.46e-04 2022-05-08 11:01:42,986 INFO [train.py:715] (0/8) Epoch 15, batch 19750, loss[loss=0.136, simple_loss=0.2265, pruned_loss=0.02279, over 4818.00 frames.], tot_loss[loss=0.134, simple_loss=0.2081, pruned_loss=0.0299, over 971844.31 frames.], batch size: 25, lr: 1.46e-04 2022-05-08 11:02:21,389 INFO [train.py:715] (0/8) Epoch 15, batch 19800, loss[loss=0.1173, simple_loss=0.1787, pruned_loss=0.02793, over 4771.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2082, pruned_loss=0.03019, over 971628.33 frames.], batch size: 18, lr: 1.46e-04 2022-05-08 11:02:59,696 INFO [train.py:715] (0/8) Epoch 15, batch 19850, loss[loss=0.1167, simple_loss=0.189, pruned_loss=0.02222, over 4686.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2083, pruned_loss=0.03021, over 971350.48 frames.], batch size: 15, lr: 1.46e-04 2022-05-08 11:03:37,784 INFO [train.py:715] (0/8) Epoch 15, batch 19900, loss[loss=0.1522, simple_loss=0.2195, pruned_loss=0.04247, over 4740.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2083, pruned_loss=0.03006, over 971991.07 frames.], batch size: 16, lr: 1.46e-04 2022-05-08 11:04:16,954 INFO [train.py:715] (0/8) Epoch 15, batch 19950, loss[loss=0.1107, simple_loss=0.1783, pruned_loss=0.02154, over 4962.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2086, pruned_loss=0.03037, over 972155.57 frames.], batch size: 35, lr: 1.46e-04 2022-05-08 11:04:55,166 INFO [train.py:715] (0/8) Epoch 15, batch 20000, 
loss[loss=0.1347, simple_loss=0.2122, pruned_loss=0.02862, over 4895.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2083, pruned_loss=0.03033, over 973207.01 frames.], batch size: 22, lr: 1.46e-04 2022-05-08 11:05:33,565 INFO [train.py:715] (0/8) Epoch 15, batch 20050, loss[loss=0.1361, simple_loss=0.2119, pruned_loss=0.03019, over 4835.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2081, pruned_loss=0.03025, over 973144.04 frames.], batch size: 26, lr: 1.46e-04 2022-05-08 11:06:11,835 INFO [train.py:715] (0/8) Epoch 15, batch 20100, loss[loss=0.1601, simple_loss=0.2317, pruned_loss=0.04427, over 4969.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2083, pruned_loss=0.03047, over 972699.84 frames.], batch size: 39, lr: 1.46e-04 2022-05-08 11:06:50,115 INFO [train.py:715] (0/8) Epoch 15, batch 20150, loss[loss=0.0974, simple_loss=0.1715, pruned_loss=0.01166, over 4748.00 frames.], tot_loss[loss=0.1341, simple_loss=0.208, pruned_loss=0.03004, over 972316.24 frames.], batch size: 12, lr: 1.46e-04 2022-05-08 11:07:28,121 INFO [train.py:715] (0/8) Epoch 15, batch 20200, loss[loss=0.1587, simple_loss=0.2244, pruned_loss=0.04653, over 4686.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2073, pruned_loss=0.02985, over 971451.58 frames.], batch size: 15, lr: 1.46e-04 2022-05-08 11:08:05,818 INFO [train.py:715] (0/8) Epoch 15, batch 20250, loss[loss=0.1366, simple_loss=0.2075, pruned_loss=0.03286, over 4876.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2074, pruned_loss=0.02999, over 972687.08 frames.], batch size: 22, lr: 1.46e-04 2022-05-08 11:08:44,516 INFO [train.py:715] (0/8) Epoch 15, batch 20300, loss[loss=0.1378, simple_loss=0.2303, pruned_loss=0.02265, over 4801.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2071, pruned_loss=0.02979, over 972599.14 frames.], batch size: 21, lr: 1.46e-04 2022-05-08 11:09:22,702 INFO [train.py:715] (0/8) Epoch 15, batch 20350, loss[loss=0.1197, simple_loss=0.1902, pruned_loss=0.02461, over 4809.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2073, pruned_loss=0.02983, over 973236.46 frames.], batch size: 25, lr: 1.46e-04 2022-05-08 11:10:01,088 INFO [train.py:715] (0/8) Epoch 15, batch 20400, loss[loss=0.1466, simple_loss=0.2148, pruned_loss=0.03919, over 4799.00 frames.], tot_loss[loss=0.133, simple_loss=0.207, pruned_loss=0.02953, over 973065.30 frames.], batch size: 14, lr: 1.46e-04 2022-05-08 11:10:38,944 INFO [train.py:715] (0/8) Epoch 15, batch 20450, loss[loss=0.1422, simple_loss=0.2134, pruned_loss=0.03546, over 4942.00 frames.], tot_loss[loss=0.1339, simple_loss=0.208, pruned_loss=0.02987, over 972615.25 frames.], batch size: 23, lr: 1.46e-04 2022-05-08 11:11:17,695 INFO [train.py:715] (0/8) Epoch 15, batch 20500, loss[loss=0.1377, simple_loss=0.2191, pruned_loss=0.02813, over 4775.00 frames.], tot_loss[loss=0.1338, simple_loss=0.208, pruned_loss=0.02979, over 971497.15 frames.], batch size: 18, lr: 1.46e-04 2022-05-08 11:11:55,868 INFO [train.py:715] (0/8) Epoch 15, batch 20550, loss[loss=0.1246, simple_loss=0.1984, pruned_loss=0.02543, over 4779.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2078, pruned_loss=0.02991, over 971120.36 frames.], batch size: 18, lr: 1.46e-04 2022-05-08 11:12:33,917 INFO [train.py:715] (0/8) Epoch 15, batch 20600, loss[loss=0.1302, simple_loss=0.2089, pruned_loss=0.02579, over 4808.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2076, pruned_loss=0.02955, over 971102.78 frames.], batch size: 25, lr: 1.46e-04 2022-05-08 11:13:12,977 INFO [train.py:715] (0/8) Epoch 15, batch 20650, loss[loss=0.1159, 
simple_loss=0.1932, pruned_loss=0.01928, over 4687.00 frames.], tot_loss[loss=0.1327, simple_loss=0.207, pruned_loss=0.02923, over 971445.14 frames.], batch size: 15, lr: 1.46e-04 2022-05-08 11:13:51,735 INFO [train.py:715] (0/8) Epoch 15, batch 20700, loss[loss=0.1521, simple_loss=0.2117, pruned_loss=0.04632, over 4792.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2069, pruned_loss=0.02889, over 970954.41 frames.], batch size: 14, lr: 1.46e-04 2022-05-08 11:14:31,080 INFO [train.py:715] (0/8) Epoch 15, batch 20750, loss[loss=0.1364, simple_loss=0.2048, pruned_loss=0.034, over 4964.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2074, pruned_loss=0.02896, over 971695.46 frames.], batch size: 14, lr: 1.46e-04 2022-05-08 11:15:09,384 INFO [train.py:715] (0/8) Epoch 15, batch 20800, loss[loss=0.1287, simple_loss=0.2009, pruned_loss=0.02824, over 4909.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2072, pruned_loss=0.02884, over 971631.74 frames.], batch size: 19, lr: 1.46e-04 2022-05-08 11:15:48,760 INFO [train.py:715] (0/8) Epoch 15, batch 20850, loss[loss=0.1433, simple_loss=0.2224, pruned_loss=0.03216, over 4894.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2071, pruned_loss=0.02902, over 971893.91 frames.], batch size: 23, lr: 1.46e-04 2022-05-08 11:16:27,988 INFO [train.py:715] (0/8) Epoch 15, batch 20900, loss[loss=0.1471, simple_loss=0.2237, pruned_loss=0.03523, over 4850.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2076, pruned_loss=0.02932, over 972179.77 frames.], batch size: 16, lr: 1.46e-04 2022-05-08 11:17:06,240 INFO [train.py:715] (0/8) Epoch 15, batch 20950, loss[loss=0.1187, simple_loss=0.1931, pruned_loss=0.02214, over 4800.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2078, pruned_loss=0.02949, over 972702.11 frames.], batch size: 25, lr: 1.46e-04 2022-05-08 11:17:45,523 INFO [train.py:715] (0/8) Epoch 15, batch 21000, loss[loss=0.1301, simple_loss=0.1963, pruned_loss=0.03193, over 4816.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2079, pruned_loss=0.02973, over 972604.20 frames.], batch size: 12, lr: 1.46e-04 2022-05-08 11:17:45,525 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 11:17:56,038 INFO [train.py:742] (0/8) Epoch 15, validation: loss=0.1051, simple_loss=0.1887, pruned_loss=0.01075, over 914524.00 frames. 
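Within this section, "Computing validation loss" appears immediately before batches 15000, 18000, and 21000, i.e. every 3000 training batches, and the validation figures are always reported over the same 914524.00 frames, so the dev set is fixed. A minimal sketch of that cadence, assuming placeholder names (model, dev_loader, compute_loss) rather than the actual objects in train.py:

# Hypothetical sketch of the validation cadence visible in this log: every
# 3000 training batches, a full pass over a fixed dev set is scored without
# gradients and reported as a frame-weighted average.
import torch

VALID_INTERVAL = 3000  # inferred from the batch indices of the validation lines

def maybe_validate(batch_idx: int, model, dev_loader, compute_loss):
    if batch_idx == 0 or batch_idx % VALID_INTERVAL != 0:
        return None
    model.eval()
    total_loss, total_frames = 0.0, 0.0
    with torch.no_grad():
        for batch in dev_loader:
            loss, frames = compute_loss(model, batch)  # per-frame loss, frame count
            total_loss += loss * frames
            total_frames += frames
    model.train()
    return total_loss / total_frames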
2022-05-08 11:18:35,284 INFO [train.py:715] (0/8) Epoch 15, batch 21050, loss[loss=0.1473, simple_loss=0.2221, pruned_loss=0.03627, over 4917.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2089, pruned_loss=0.03011, over 972943.10 frames.], batch size: 18, lr: 1.46e-04 2022-05-08 11:19:14,767 INFO [train.py:715] (0/8) Epoch 15, batch 21100, loss[loss=0.1314, simple_loss=0.2129, pruned_loss=0.02491, over 4972.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2079, pruned_loss=0.02963, over 972761.57 frames.], batch size: 15, lr: 1.46e-04 2022-05-08 11:19:53,784 INFO [train.py:715] (0/8) Epoch 15, batch 21150, loss[loss=0.1663, simple_loss=0.2352, pruned_loss=0.04869, over 4961.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2078, pruned_loss=0.02986, over 972731.89 frames.], batch size: 39, lr: 1.46e-04 2022-05-08 11:20:32,264 INFO [train.py:715] (0/8) Epoch 15, batch 21200, loss[loss=0.1276, simple_loss=0.2055, pruned_loss=0.02486, over 4988.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2085, pruned_loss=0.02985, over 972477.80 frames.], batch size: 28, lr: 1.46e-04 2022-05-08 11:21:11,102 INFO [train.py:715] (0/8) Epoch 15, batch 21250, loss[loss=0.1373, simple_loss=0.2218, pruned_loss=0.02643, over 4989.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2086, pruned_loss=0.02987, over 972439.63 frames.], batch size: 25, lr: 1.46e-04 2022-05-08 11:21:49,142 INFO [train.py:715] (0/8) Epoch 15, batch 21300, loss[loss=0.1229, simple_loss=0.1999, pruned_loss=0.02295, over 4860.00 frames.], tot_loss[loss=0.135, simple_loss=0.2089, pruned_loss=0.03057, over 973144.12 frames.], batch size: 38, lr: 1.46e-04 2022-05-08 11:22:26,786 INFO [train.py:715] (0/8) Epoch 15, batch 21350, loss[loss=0.1395, simple_loss=0.2059, pruned_loss=0.0366, over 4942.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2083, pruned_loss=0.0303, over 972873.48 frames.], batch size: 21, lr: 1.46e-04 2022-05-08 11:23:05,094 INFO [train.py:715] (0/8) Epoch 15, batch 21400, loss[loss=0.1149, simple_loss=0.1777, pruned_loss=0.02612, over 4772.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2072, pruned_loss=0.02985, over 972313.90 frames.], batch size: 12, lr: 1.46e-04 2022-05-08 11:23:43,351 INFO [train.py:715] (0/8) Epoch 15, batch 21450, loss[loss=0.1265, simple_loss=0.2033, pruned_loss=0.02483, over 4682.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2078, pruned_loss=0.03031, over 972745.74 frames.], batch size: 15, lr: 1.46e-04 2022-05-08 11:24:21,355 INFO [train.py:715] (0/8) Epoch 15, batch 21500, loss[loss=0.1813, simple_loss=0.2324, pruned_loss=0.06509, over 4772.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2073, pruned_loss=0.03015, over 972660.99 frames.], batch size: 14, lr: 1.46e-04 2022-05-08 11:24:59,645 INFO [train.py:715] (0/8) Epoch 15, batch 21550, loss[loss=0.1373, simple_loss=0.2107, pruned_loss=0.03195, over 4860.00 frames.], tot_loss[loss=0.134, simple_loss=0.2078, pruned_loss=0.03008, over 973252.82 frames.], batch size: 20, lr: 1.46e-04 2022-05-08 11:25:38,150 INFO [train.py:715] (0/8) Epoch 15, batch 21600, loss[loss=0.1247, simple_loss=0.2032, pruned_loss=0.02309, over 4827.00 frames.], tot_loss[loss=0.134, simple_loss=0.2079, pruned_loss=0.03008, over 973973.44 frames.], batch size: 26, lr: 1.46e-04 2022-05-08 11:26:16,017 INFO [train.py:715] (0/8) Epoch 15, batch 21650, loss[loss=0.141, simple_loss=0.2202, pruned_loss=0.03088, over 4829.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2086, pruned_loss=0.03057, over 972729.13 frames.], batch size: 25, lr: 1.46e-04 2022-05-08 11:26:54,236 
INFO [train.py:715] (0/8) Epoch 15, batch 21700, loss[loss=0.1365, simple_loss=0.2189, pruned_loss=0.02705, over 4745.00 frames.], tot_loss[loss=0.135, simple_loss=0.2087, pruned_loss=0.03062, over 972623.58 frames.], batch size: 16, lr: 1.46e-04 2022-05-08 11:27:32,375 INFO [train.py:715] (0/8) Epoch 15, batch 21750, loss[loss=0.1502, simple_loss=0.2206, pruned_loss=0.03993, over 4974.00 frames.], tot_loss[loss=0.1355, simple_loss=0.209, pruned_loss=0.03101, over 973463.65 frames.], batch size: 24, lr: 1.46e-04 2022-05-08 11:28:10,474 INFO [train.py:715] (0/8) Epoch 15, batch 21800, loss[loss=0.129, simple_loss=0.1979, pruned_loss=0.03007, over 4877.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2088, pruned_loss=0.03085, over 974618.67 frames.], batch size: 32, lr: 1.46e-04 2022-05-08 11:28:48,402 INFO [train.py:715] (0/8) Epoch 15, batch 21850, loss[loss=0.126, simple_loss=0.1976, pruned_loss=0.02724, over 4971.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2087, pruned_loss=0.03075, over 974745.73 frames.], batch size: 15, lr: 1.46e-04 2022-05-08 11:28:55,318 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-544000.pt 2022-05-08 11:29:29,582 INFO [train.py:715] (0/8) Epoch 15, batch 21900, loss[loss=0.1348, simple_loss=0.2219, pruned_loss=0.02387, over 4767.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2083, pruned_loss=0.03045, over 974856.60 frames.], batch size: 18, lr: 1.46e-04 2022-05-08 11:30:08,734 INFO [train.py:715] (0/8) Epoch 15, batch 21950, loss[loss=0.122, simple_loss=0.2014, pruned_loss=0.02127, over 4938.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2077, pruned_loss=0.03033, over 974351.50 frames.], batch size: 21, lr: 1.46e-04 2022-05-08 11:30:47,267 INFO [train.py:715] (0/8) Epoch 15, batch 22000, loss[loss=0.1284, simple_loss=0.2046, pruned_loss=0.02611, over 4941.00 frames.], tot_loss[loss=0.1346, simple_loss=0.208, pruned_loss=0.03055, over 973551.97 frames.], batch size: 21, lr: 1.46e-04 2022-05-08 11:31:25,788 INFO [train.py:715] (0/8) Epoch 15, batch 22050, loss[loss=0.1222, simple_loss=0.2073, pruned_loss=0.01858, over 4798.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2084, pruned_loss=0.03054, over 972267.92 frames.], batch size: 24, lr: 1.46e-04 2022-05-08 11:32:05,111 INFO [train.py:715] (0/8) Epoch 15, batch 22100, loss[loss=0.1295, simple_loss=0.2048, pruned_loss=0.02713, over 4804.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2089, pruned_loss=0.03049, over 972442.77 frames.], batch size: 21, lr: 1.46e-04 2022-05-08 11:32:43,912 INFO [train.py:715] (0/8) Epoch 15, batch 22150, loss[loss=0.1161, simple_loss=0.1885, pruned_loss=0.02181, over 4762.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2085, pruned_loss=0.03028, over 971841.67 frames.], batch size: 12, lr: 1.46e-04 2022-05-08 11:33:22,283 INFO [train.py:715] (0/8) Epoch 15, batch 22200, loss[loss=0.1028, simple_loss=0.1769, pruned_loss=0.01435, over 4963.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2088, pruned_loss=0.03051, over 972150.97 frames.], batch size: 15, lr: 1.46e-04 2022-05-08 11:34:01,325 INFO [train.py:715] (0/8) Epoch 15, batch 22250, loss[loss=0.1473, simple_loss=0.2167, pruned_loss=0.03896, over 4916.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2088, pruned_loss=0.03035, over 971449.24 frames.], batch size: 17, lr: 1.46e-04 2022-05-08 11:34:40,256 INFO [train.py:715] (0/8) Epoch 15, batch 22300, loss[loss=0.1498, simple_loss=0.2169, pruned_loss=0.04131, over 4841.00 frames.], tot_loss[loss=0.1348, 
simple_loss=0.2088, pruned_loss=0.03034, over 971379.03 frames.], batch size: 30, lr: 1.46e-04 2022-05-08 11:35:18,793 INFO [train.py:715] (0/8) Epoch 15, batch 22350, loss[loss=0.1166, simple_loss=0.1885, pruned_loss=0.02229, over 4812.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2086, pruned_loss=0.03022, over 970547.92 frames.], batch size: 13, lr: 1.46e-04 2022-05-08 11:35:57,364 INFO [train.py:715] (0/8) Epoch 15, batch 22400, loss[loss=0.1351, simple_loss=0.2042, pruned_loss=0.03301, over 4957.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2093, pruned_loss=0.03101, over 971092.57 frames.], batch size: 15, lr: 1.46e-04 2022-05-08 11:36:36,627 INFO [train.py:715] (0/8) Epoch 15, batch 22450, loss[loss=0.1741, simple_loss=0.2473, pruned_loss=0.0504, over 4781.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2091, pruned_loss=0.03049, over 970690.81 frames.], batch size: 17, lr: 1.46e-04 2022-05-08 11:37:15,516 INFO [train.py:715] (0/8) Epoch 15, batch 22500, loss[loss=0.1449, simple_loss=0.2196, pruned_loss=0.03513, over 4932.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2089, pruned_loss=0.03063, over 971775.55 frames.], batch size: 29, lr: 1.46e-04 2022-05-08 11:37:54,239 INFO [train.py:715] (0/8) Epoch 15, batch 22550, loss[loss=0.1234, simple_loss=0.1846, pruned_loss=0.03109, over 4790.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2079, pruned_loss=0.03018, over 973198.55 frames.], batch size: 17, lr: 1.46e-04 2022-05-08 11:38:32,795 INFO [train.py:715] (0/8) Epoch 15, batch 22600, loss[loss=0.1225, simple_loss=0.1909, pruned_loss=0.02704, over 4873.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2084, pruned_loss=0.03012, over 972921.42 frames.], batch size: 16, lr: 1.46e-04 2022-05-08 11:39:11,716 INFO [train.py:715] (0/8) Epoch 15, batch 22650, loss[loss=0.139, simple_loss=0.2054, pruned_loss=0.03623, over 4639.00 frames.], tot_loss[loss=0.1339, simple_loss=0.208, pruned_loss=0.02995, over 972600.77 frames.], batch size: 13, lr: 1.46e-04 2022-05-08 11:39:50,373 INFO [train.py:715] (0/8) Epoch 15, batch 22700, loss[loss=0.112, simple_loss=0.1906, pruned_loss=0.01677, over 4751.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2086, pruned_loss=0.03032, over 973450.82 frames.], batch size: 19, lr: 1.46e-04 2022-05-08 11:40:29,112 INFO [train.py:715] (0/8) Epoch 15, batch 22750, loss[loss=0.1281, simple_loss=0.2001, pruned_loss=0.02801, over 4779.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2089, pruned_loss=0.03043, over 973092.86 frames.], batch size: 14, lr: 1.46e-04 2022-05-08 11:41:08,454 INFO [train.py:715] (0/8) Epoch 15, batch 22800, loss[loss=0.1268, simple_loss=0.2124, pruned_loss=0.02064, over 4775.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2084, pruned_loss=0.02989, over 973641.59 frames.], batch size: 17, lr: 1.46e-04 2022-05-08 11:41:47,278 INFO [train.py:715] (0/8) Epoch 15, batch 22850, loss[loss=0.1168, simple_loss=0.2026, pruned_loss=0.01549, over 4759.00 frames.], tot_loss[loss=0.134, simple_loss=0.2082, pruned_loss=0.02987, over 973117.09 frames.], batch size: 14, lr: 1.46e-04 2022-05-08 11:42:26,021 INFO [train.py:715] (0/8) Epoch 15, batch 22900, loss[loss=0.1307, simple_loss=0.1997, pruned_loss=0.03081, over 4949.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2074, pruned_loss=0.02988, over 973792.69 frames.], batch size: 24, lr: 1.46e-04 2022-05-08 11:43:05,208 INFO [train.py:715] (0/8) Epoch 15, batch 22950, loss[loss=0.1485, simple_loss=0.2233, pruned_loss=0.03685, over 4987.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2078, 
pruned_loss=0.02992, over 973479.71 frames.], batch size: 25, lr: 1.46e-04 2022-05-08 11:43:43,826 INFO [train.py:715] (0/8) Epoch 15, batch 23000, loss[loss=0.1212, simple_loss=0.1912, pruned_loss=0.02562, over 4961.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2071, pruned_loss=0.02989, over 973202.68 frames.], batch size: 15, lr: 1.46e-04 2022-05-08 11:44:22,235 INFO [train.py:715] (0/8) Epoch 15, batch 23050, loss[loss=0.1295, simple_loss=0.1997, pruned_loss=0.02966, over 4850.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2082, pruned_loss=0.03045, over 972822.24 frames.], batch size: 20, lr: 1.46e-04 2022-05-08 11:45:00,624 INFO [train.py:715] (0/8) Epoch 15, batch 23100, loss[loss=0.1359, simple_loss=0.1999, pruned_loss=0.03593, over 4837.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2077, pruned_loss=0.03059, over 973694.97 frames.], batch size: 30, lr: 1.46e-04 2022-05-08 11:45:39,477 INFO [train.py:715] (0/8) Epoch 15, batch 23150, loss[loss=0.127, simple_loss=0.1955, pruned_loss=0.02925, over 4646.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2083, pruned_loss=0.03054, over 972859.97 frames.], batch size: 13, lr: 1.46e-04 2022-05-08 11:46:17,449 INFO [train.py:715] (0/8) Epoch 15, batch 23200, loss[loss=0.1379, simple_loss=0.206, pruned_loss=0.03486, over 4895.00 frames.], tot_loss[loss=0.1353, simple_loss=0.209, pruned_loss=0.03076, over 972324.78 frames.], batch size: 19, lr: 1.46e-04 2022-05-08 11:46:55,706 INFO [train.py:715] (0/8) Epoch 15, batch 23250, loss[loss=0.1238, simple_loss=0.1975, pruned_loss=0.02503, over 4905.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2085, pruned_loss=0.03057, over 971919.77 frames.], batch size: 19, lr: 1.46e-04 2022-05-08 11:47:34,384 INFO [train.py:715] (0/8) Epoch 15, batch 23300, loss[loss=0.113, simple_loss=0.1853, pruned_loss=0.02031, over 4773.00 frames.], tot_loss[loss=0.1338, simple_loss=0.208, pruned_loss=0.02979, over 972275.83 frames.], batch size: 17, lr: 1.46e-04 2022-05-08 11:48:12,426 INFO [train.py:715] (0/8) Epoch 15, batch 23350, loss[loss=0.1252, simple_loss=0.2, pruned_loss=0.02522, over 4868.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2074, pruned_loss=0.02944, over 972875.25 frames.], batch size: 13, lr: 1.46e-04 2022-05-08 11:48:50,736 INFO [train.py:715] (0/8) Epoch 15, batch 23400, loss[loss=0.154, simple_loss=0.2256, pruned_loss=0.04127, over 4755.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2088, pruned_loss=0.03031, over 972582.00 frames.], batch size: 16, lr: 1.46e-04 2022-05-08 11:49:28,560 INFO [train.py:715] (0/8) Epoch 15, batch 23450, loss[loss=0.1294, simple_loss=0.2131, pruned_loss=0.02278, over 4878.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2084, pruned_loss=0.03021, over 972113.86 frames.], batch size: 22, lr: 1.46e-04 2022-05-08 11:50:07,083 INFO [train.py:715] (0/8) Epoch 15, batch 23500, loss[loss=0.1583, simple_loss=0.2199, pruned_loss=0.0483, over 4929.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2086, pruned_loss=0.03061, over 972235.83 frames.], batch size: 18, lr: 1.46e-04 2022-05-08 11:50:44,856 INFO [train.py:715] (0/8) Epoch 15, batch 23550, loss[loss=0.1218, simple_loss=0.204, pruned_loss=0.01983, over 4816.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2083, pruned_loss=0.03032, over 972848.19 frames.], batch size: 25, lr: 1.46e-04 2022-05-08 11:51:22,852 INFO [train.py:715] (0/8) Epoch 15, batch 23600, loss[loss=0.1735, simple_loss=0.241, pruned_loss=0.05295, over 4872.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2087, pruned_loss=0.03038, over 
972238.68 frames.], batch size: 16, lr: 1.46e-04 2022-05-08 11:52:01,276 INFO [train.py:715] (0/8) Epoch 15, batch 23650, loss[loss=0.133, simple_loss=0.2201, pruned_loss=0.02297, over 4695.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2093, pruned_loss=0.03065, over 971349.01 frames.], batch size: 15, lr: 1.46e-04 2022-05-08 11:52:39,164 INFO [train.py:715] (0/8) Epoch 15, batch 23700, loss[loss=0.1143, simple_loss=0.1797, pruned_loss=0.02449, over 4734.00 frames.], tot_loss[loss=0.135, simple_loss=0.209, pruned_loss=0.03054, over 971174.60 frames.], batch size: 12, lr: 1.46e-04 2022-05-08 11:53:17,229 INFO [train.py:715] (0/8) Epoch 15, batch 23750, loss[loss=0.1374, simple_loss=0.2107, pruned_loss=0.03203, over 4871.00 frames.], tot_loss[loss=0.135, simple_loss=0.2088, pruned_loss=0.03056, over 971432.87 frames.], batch size: 32, lr: 1.46e-04 2022-05-08 11:53:55,056 INFO [train.py:715] (0/8) Epoch 15, batch 23800, loss[loss=0.1187, simple_loss=0.1946, pruned_loss=0.02138, over 4772.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2092, pruned_loss=0.03073, over 971123.44 frames.], batch size: 14, lr: 1.46e-04 2022-05-08 11:54:33,043 INFO [train.py:715] (0/8) Epoch 15, batch 23850, loss[loss=0.1595, simple_loss=0.2344, pruned_loss=0.04228, over 4753.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2097, pruned_loss=0.03075, over 972831.79 frames.], batch size: 16, lr: 1.46e-04 2022-05-08 11:55:11,361 INFO [train.py:715] (0/8) Epoch 15, batch 23900, loss[loss=0.1444, simple_loss=0.2137, pruned_loss=0.03758, over 4688.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2088, pruned_loss=0.03027, over 972167.22 frames.], batch size: 15, lr: 1.46e-04 2022-05-08 11:55:48,898 INFO [train.py:715] (0/8) Epoch 15, batch 23950, loss[loss=0.1342, simple_loss=0.2162, pruned_loss=0.02609, over 4816.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2083, pruned_loss=0.03026, over 972340.13 frames.], batch size: 27, lr: 1.46e-04 2022-05-08 11:56:27,440 INFO [train.py:715] (0/8) Epoch 15, batch 24000, loss[loss=0.145, simple_loss=0.227, pruned_loss=0.03152, over 4703.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2084, pruned_loss=0.0303, over 971819.81 frames.], batch size: 15, lr: 1.46e-04 2022-05-08 11:56:27,441 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 11:56:37,035 INFO [train.py:742] (0/8) Epoch 15, validation: loss=0.105, simple_loss=0.1886, pruned_loss=0.01071, over 914524.00 frames. 
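Checkpoints in this section are written to pruned_transducer_stateless2/exp/v2/checkpoint-536000.pt and checkpoint-544000.pt, 8000 global batches apart and named after the global batch counter. A minimal sketch of that pattern, assuming a hypothetical maybe_save helper rather than icefall's actual checkpoint utilities:

# Hypothetical sketch of the checkpoint cadence visible in this log:
# a checkpoint named after the global batch counter, written every 8000
# batches (interval inferred from the two checkpoint filenames above).
from pathlib import Path
import torch

SAVE_EVERY_N = 8000
EXP_DIR = Path("pruned_transducer_stateless2/exp/v2")

def maybe_save(global_batch: int, model, optimizer) -> None:
    if global_batch == 0 or global_batch % SAVE_EVERY_N:
        return
    path = EXP_DIR / f"checkpoint-{global_batch}.pt"
    torch.save(
        {
            "model": model.state_dict(),
            "optimizer": optimizer.state_dict(),
            "global_batch": global_batch,
        },
        path,
    )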
2022-05-08 11:57:15,620 INFO [train.py:715] (0/8) Epoch 15, batch 24050, loss[loss=0.1206, simple_loss=0.1926, pruned_loss=0.0243, over 4817.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2075, pruned_loss=0.02969, over 972341.39 frames.], batch size: 15, lr: 1.46e-04 2022-05-08 11:57:54,184 INFO [train.py:715] (0/8) Epoch 15, batch 24100, loss[loss=0.127, simple_loss=0.1972, pruned_loss=0.02846, over 4834.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2072, pruned_loss=0.02957, over 973370.33 frames.], batch size: 15, lr: 1.46e-04 2022-05-08 11:58:32,188 INFO [train.py:715] (0/8) Epoch 15, batch 24150, loss[loss=0.1584, simple_loss=0.2176, pruned_loss=0.04964, over 4748.00 frames.], tot_loss[loss=0.1329, simple_loss=0.207, pruned_loss=0.02944, over 972815.95 frames.], batch size: 16, lr: 1.46e-04 2022-05-08 11:59:10,404 INFO [train.py:715] (0/8) Epoch 15, batch 24200, loss[loss=0.1368, simple_loss=0.2148, pruned_loss=0.02943, over 4694.00 frames.], tot_loss[loss=0.133, simple_loss=0.2071, pruned_loss=0.02944, over 972426.70 frames.], batch size: 15, lr: 1.46e-04 2022-05-08 11:59:48,420 INFO [train.py:715] (0/8) Epoch 15, batch 24250, loss[loss=0.1208, simple_loss=0.2001, pruned_loss=0.02073, over 4852.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2078, pruned_loss=0.02963, over 972450.88 frames.], batch size: 13, lr: 1.46e-04 2022-05-08 12:00:26,745 INFO [train.py:715] (0/8) Epoch 15, batch 24300, loss[loss=0.1309, simple_loss=0.1971, pruned_loss=0.03234, over 4912.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2086, pruned_loss=0.03048, over 972705.65 frames.], batch size: 17, lr: 1.46e-04 2022-05-08 12:01:03,896 INFO [train.py:715] (0/8) Epoch 15, batch 24350, loss[loss=0.1478, simple_loss=0.2318, pruned_loss=0.03193, over 4940.00 frames.], tot_loss[loss=0.135, simple_loss=0.2086, pruned_loss=0.03068, over 972790.81 frames.], batch size: 18, lr: 1.46e-04 2022-05-08 12:01:42,319 INFO [train.py:715] (0/8) Epoch 15, batch 24400, loss[loss=0.1463, simple_loss=0.2281, pruned_loss=0.03229, over 4763.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2083, pruned_loss=0.03007, over 972506.14 frames.], batch size: 18, lr: 1.46e-04 2022-05-08 12:02:20,843 INFO [train.py:715] (0/8) Epoch 15, batch 24450, loss[loss=0.1139, simple_loss=0.1885, pruned_loss=0.01963, over 4820.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2076, pruned_loss=0.03007, over 971966.62 frames.], batch size: 26, lr: 1.46e-04 2022-05-08 12:02:58,823 INFO [train.py:715] (0/8) Epoch 15, batch 24500, loss[loss=0.1345, simple_loss=0.2038, pruned_loss=0.03256, over 4917.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2065, pruned_loss=0.0293, over 971219.96 frames.], batch size: 22, lr: 1.46e-04 2022-05-08 12:03:36,483 INFO [train.py:715] (0/8) Epoch 15, batch 24550, loss[loss=0.1292, simple_loss=0.2041, pruned_loss=0.0272, over 4883.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2067, pruned_loss=0.02931, over 972074.49 frames.], batch size: 22, lr: 1.46e-04 2022-05-08 12:04:14,725 INFO [train.py:715] (0/8) Epoch 15, batch 24600, loss[loss=0.1353, simple_loss=0.2146, pruned_loss=0.02796, over 4989.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2076, pruned_loss=0.02968, over 971531.10 frames.], batch size: 25, lr: 1.46e-04 2022-05-08 12:04:53,491 INFO [train.py:715] (0/8) Epoch 15, batch 24650, loss[loss=0.144, simple_loss=0.2162, pruned_loss=0.03594, over 4845.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2079, pruned_loss=0.02969, over 971872.31 frames.], batch size: 30, lr: 1.46e-04 2022-05-08 12:05:31,174 
INFO [train.py:715] (0/8) Epoch 15, batch 24700, loss[loss=0.1301, simple_loss=0.206, pruned_loss=0.02715, over 4782.00 frames.], tot_loss[loss=0.1339, simple_loss=0.208, pruned_loss=0.02991, over 971524.99 frames.], batch size: 14, lr: 1.46e-04 2022-05-08 12:06:09,578 INFO [train.py:715] (0/8) Epoch 15, batch 24750, loss[loss=0.1201, simple_loss=0.1986, pruned_loss=0.02081, over 4774.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2081, pruned_loss=0.02971, over 972298.00 frames.], batch size: 19, lr: 1.46e-04 2022-05-08 12:06:47,905 INFO [train.py:715] (0/8) Epoch 15, batch 24800, loss[loss=0.1341, simple_loss=0.2017, pruned_loss=0.03327, over 4838.00 frames.], tot_loss[loss=0.1336, simple_loss=0.208, pruned_loss=0.02963, over 972089.86 frames.], batch size: 30, lr: 1.46e-04 2022-05-08 12:07:25,657 INFO [train.py:715] (0/8) Epoch 15, batch 24850, loss[loss=0.1233, simple_loss=0.206, pruned_loss=0.02028, over 4944.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2088, pruned_loss=0.02994, over 972069.11 frames.], batch size: 29, lr: 1.46e-04 2022-05-08 12:08:03,589 INFO [train.py:715] (0/8) Epoch 15, batch 24900, loss[loss=0.1438, simple_loss=0.2218, pruned_loss=0.03295, over 4828.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2091, pruned_loss=0.02992, over 972352.74 frames.], batch size: 13, lr: 1.46e-04 2022-05-08 12:08:41,836 INFO [train.py:715] (0/8) Epoch 15, batch 24950, loss[loss=0.1499, simple_loss=0.2199, pruned_loss=0.03992, over 4941.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2097, pruned_loss=0.03023, over 972199.67 frames.], batch size: 39, lr: 1.46e-04 2022-05-08 12:09:20,945 INFO [train.py:715] (0/8) Epoch 15, batch 25000, loss[loss=0.1448, simple_loss=0.2229, pruned_loss=0.03336, over 4808.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2091, pruned_loss=0.03005, over 972954.63 frames.], batch size: 25, lr: 1.46e-04 2022-05-08 12:09:58,496 INFO [train.py:715] (0/8) Epoch 15, batch 25050, loss[loss=0.1397, simple_loss=0.214, pruned_loss=0.03276, over 4990.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2092, pruned_loss=0.03027, over 972960.15 frames.], batch size: 14, lr: 1.46e-04 2022-05-08 12:10:36,534 INFO [train.py:715] (0/8) Epoch 15, batch 25100, loss[loss=0.1366, simple_loss=0.2061, pruned_loss=0.03359, over 4862.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2086, pruned_loss=0.02998, over 972515.12 frames.], batch size: 32, lr: 1.46e-04 2022-05-08 12:11:14,985 INFO [train.py:715] (0/8) Epoch 15, batch 25150, loss[loss=0.1459, simple_loss=0.2196, pruned_loss=0.03612, over 4824.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2088, pruned_loss=0.03011, over 972332.37 frames.], batch size: 30, lr: 1.46e-04 2022-05-08 12:11:53,001 INFO [train.py:715] (0/8) Epoch 15, batch 25200, loss[loss=0.1072, simple_loss=0.1764, pruned_loss=0.01898, over 4826.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2086, pruned_loss=0.03002, over 972624.24 frames.], batch size: 13, lr: 1.46e-04 2022-05-08 12:12:30,792 INFO [train.py:715] (0/8) Epoch 15, batch 25250, loss[loss=0.1344, simple_loss=0.2139, pruned_loss=0.02743, over 4893.00 frames.], tot_loss[loss=0.134, simple_loss=0.2085, pruned_loss=0.02978, over 973016.43 frames.], batch size: 22, lr: 1.46e-04 2022-05-08 12:13:09,117 INFO [train.py:715] (0/8) Epoch 15, batch 25300, loss[loss=0.1392, simple_loss=0.2085, pruned_loss=0.03493, over 4980.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2088, pruned_loss=0.03012, over 973121.65 frames.], batch size: 35, lr: 1.46e-04 2022-05-08 12:13:47,199 INFO [train.py:715] 
(0/8) Epoch 15, batch 25350, loss[loss=0.1245, simple_loss=0.2027, pruned_loss=0.02319, over 4819.00 frames.], tot_loss[loss=0.134, simple_loss=0.2084, pruned_loss=0.02985, over 973220.25 frames.], batch size: 26, lr: 1.46e-04 2022-05-08 12:14:24,742 INFO [train.py:715] (0/8) Epoch 15, batch 25400, loss[loss=0.1318, simple_loss=0.2, pruned_loss=0.0318, over 4826.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2074, pruned_loss=0.02964, over 972946.38 frames.], batch size: 15, lr: 1.46e-04 2022-05-08 12:15:02,816 INFO [train.py:715] (0/8) Epoch 15, batch 25450, loss[loss=0.1066, simple_loss=0.1736, pruned_loss=0.01976, over 4815.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2075, pruned_loss=0.02965, over 972797.09 frames.], batch size: 12, lr: 1.46e-04 2022-05-08 12:15:41,205 INFO [train.py:715] (0/8) Epoch 15, batch 25500, loss[loss=0.1393, simple_loss=0.2084, pruned_loss=0.03511, over 4931.00 frames.], tot_loss[loss=0.133, simple_loss=0.2072, pruned_loss=0.0294, over 973397.86 frames.], batch size: 18, lr: 1.46e-04 2022-05-08 12:16:18,761 INFO [train.py:715] (0/8) Epoch 15, batch 25550, loss[loss=0.1196, simple_loss=0.1854, pruned_loss=0.02693, over 4917.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2078, pruned_loss=0.02973, over 973464.19 frames.], batch size: 29, lr: 1.45e-04 2022-05-08 12:16:56,913 INFO [train.py:715] (0/8) Epoch 15, batch 25600, loss[loss=0.1401, simple_loss=0.2173, pruned_loss=0.03144, over 4989.00 frames.], tot_loss[loss=0.133, simple_loss=0.2074, pruned_loss=0.02932, over 973700.01 frames.], batch size: 20, lr: 1.45e-04 2022-05-08 12:17:35,535 INFO [train.py:715] (0/8) Epoch 15, batch 25650, loss[loss=0.1398, simple_loss=0.2202, pruned_loss=0.02977, over 4879.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2074, pruned_loss=0.0296, over 973100.20 frames.], batch size: 39, lr: 1.45e-04 2022-05-08 12:18:13,813 INFO [train.py:715] (0/8) Epoch 15, batch 25700, loss[loss=0.1054, simple_loss=0.1733, pruned_loss=0.01871, over 4862.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2068, pruned_loss=0.0294, over 972724.74 frames.], batch size: 13, lr: 1.45e-04 2022-05-08 12:18:51,200 INFO [train.py:715] (0/8) Epoch 15, batch 25750, loss[loss=0.1627, simple_loss=0.2316, pruned_loss=0.04694, over 4748.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2075, pruned_loss=0.02952, over 972506.95 frames.], batch size: 16, lr: 1.45e-04 2022-05-08 12:19:29,346 INFO [train.py:715] (0/8) Epoch 15, batch 25800, loss[loss=0.1566, simple_loss=0.2241, pruned_loss=0.04452, over 4846.00 frames.], tot_loss[loss=0.133, simple_loss=0.2072, pruned_loss=0.02941, over 972570.32 frames.], batch size: 30, lr: 1.45e-04 2022-05-08 12:20:07,972 INFO [train.py:715] (0/8) Epoch 15, batch 25850, loss[loss=0.1575, simple_loss=0.227, pruned_loss=0.04405, over 4786.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2068, pruned_loss=0.0295, over 972095.68 frames.], batch size: 18, lr: 1.45e-04 2022-05-08 12:20:45,415 INFO [train.py:715] (0/8) Epoch 15, batch 25900, loss[loss=0.1277, simple_loss=0.2059, pruned_loss=0.02475, over 4913.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2074, pruned_loss=0.02975, over 972397.13 frames.], batch size: 29, lr: 1.45e-04 2022-05-08 12:21:24,011 INFO [train.py:715] (0/8) Epoch 15, batch 25950, loss[loss=0.1318, simple_loss=0.2065, pruned_loss=0.0285, over 4986.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2076, pruned_loss=0.02959, over 972598.94 frames.], batch size: 14, lr: 1.45e-04 2022-05-08 12:22:02,164 INFO [train.py:715] (0/8) Epoch 15, batch 26000, 
loss[loss=0.1341, simple_loss=0.2028, pruned_loss=0.03271, over 4974.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2066, pruned_loss=0.02929, over 972621.88 frames.], batch size: 15, lr: 1.45e-04 2022-05-08 12:22:39,850 INFO [train.py:715] (0/8) Epoch 15, batch 26050, loss[loss=0.1259, simple_loss=0.1938, pruned_loss=0.02904, over 4817.00 frames.], tot_loss[loss=0.1328, simple_loss=0.207, pruned_loss=0.02928, over 972652.39 frames.], batch size: 13, lr: 1.45e-04 2022-05-08 12:23:17,642 INFO [train.py:715] (0/8) Epoch 15, batch 26100, loss[loss=0.157, simple_loss=0.2314, pruned_loss=0.04135, over 4758.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2069, pruned_loss=0.02923, over 972883.94 frames.], batch size: 18, lr: 1.45e-04 2022-05-08 12:23:56,073 INFO [train.py:715] (0/8) Epoch 15, batch 26150, loss[loss=0.1643, simple_loss=0.2422, pruned_loss=0.04318, over 4982.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2072, pruned_loss=0.02917, over 972967.79 frames.], batch size: 28, lr: 1.45e-04 2022-05-08 12:24:33,866 INFO [train.py:715] (0/8) Epoch 15, batch 26200, loss[loss=0.1348, simple_loss=0.2084, pruned_loss=0.03053, over 4914.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2068, pruned_loss=0.02914, over 972437.93 frames.], batch size: 18, lr: 1.45e-04 2022-05-08 12:25:11,684 INFO [train.py:715] (0/8) Epoch 15, batch 26250, loss[loss=0.1487, simple_loss=0.2166, pruned_loss=0.04038, over 4943.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2062, pruned_loss=0.02859, over 972739.24 frames.], batch size: 35, lr: 1.45e-04 2022-05-08 12:25:50,001 INFO [train.py:715] (0/8) Epoch 15, batch 26300, loss[loss=0.1317, simple_loss=0.2001, pruned_loss=0.03164, over 4778.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2076, pruned_loss=0.02931, over 972221.05 frames.], batch size: 14, lr: 1.45e-04 2022-05-08 12:26:28,458 INFO [train.py:715] (0/8) Epoch 15, batch 26350, loss[loss=0.1362, simple_loss=0.2097, pruned_loss=0.03139, over 4924.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2075, pruned_loss=0.02932, over 971722.38 frames.], batch size: 23, lr: 1.45e-04 2022-05-08 12:27:06,216 INFO [train.py:715] (0/8) Epoch 15, batch 26400, loss[loss=0.1073, simple_loss=0.1934, pruned_loss=0.0106, over 4976.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2076, pruned_loss=0.02907, over 971658.32 frames.], batch size: 28, lr: 1.45e-04 2022-05-08 12:27:44,348 INFO [train.py:715] (0/8) Epoch 15, batch 26450, loss[loss=0.1185, simple_loss=0.1881, pruned_loss=0.02438, over 4741.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2067, pruned_loss=0.02891, over 971049.01 frames.], batch size: 16, lr: 1.45e-04 2022-05-08 12:28:22,634 INFO [train.py:715] (0/8) Epoch 15, batch 26500, loss[loss=0.1007, simple_loss=0.1828, pruned_loss=0.009281, over 4830.00 frames.], tot_loss[loss=0.132, simple_loss=0.2064, pruned_loss=0.02884, over 971624.93 frames.], batch size: 25, lr: 1.45e-04 2022-05-08 12:29:00,415 INFO [train.py:715] (0/8) Epoch 15, batch 26550, loss[loss=0.1146, simple_loss=0.1887, pruned_loss=0.02024, over 4694.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2056, pruned_loss=0.02867, over 971348.96 frames.], batch size: 15, lr: 1.45e-04 2022-05-08 12:29:38,152 INFO [train.py:715] (0/8) Epoch 15, batch 26600, loss[loss=0.1352, simple_loss=0.194, pruned_loss=0.03819, over 4874.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2069, pruned_loss=0.02924, over 971796.49 frames.], batch size: 30, lr: 1.45e-04 2022-05-08 12:30:16,185 INFO [train.py:715] (0/8) Epoch 15, batch 26650, loss[loss=0.1224, 
simple_loss=0.1977, pruned_loss=0.02362, over 4955.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2068, pruned_loss=0.02908, over 971565.41 frames.], batch size: 21, lr: 1.45e-04 2022-05-08 12:30:54,317 INFO [train.py:715] (0/8) Epoch 15, batch 26700, loss[loss=0.1272, simple_loss=0.2001, pruned_loss=0.02715, over 4769.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2078, pruned_loss=0.0295, over 971521.61 frames.], batch size: 19, lr: 1.45e-04 2022-05-08 12:31:31,945 INFO [train.py:715] (0/8) Epoch 15, batch 26750, loss[loss=0.1311, simple_loss=0.2123, pruned_loss=0.02499, over 4825.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2072, pruned_loss=0.02916, over 972557.65 frames.], batch size: 13, lr: 1.45e-04 2022-05-08 12:32:10,362 INFO [train.py:715] (0/8) Epoch 15, batch 26800, loss[loss=0.1416, simple_loss=0.2237, pruned_loss=0.02977, over 4747.00 frames.], tot_loss[loss=0.134, simple_loss=0.2084, pruned_loss=0.02978, over 972473.28 frames.], batch size: 16, lr: 1.45e-04 2022-05-08 12:32:48,678 INFO [train.py:715] (0/8) Epoch 15, batch 26850, loss[loss=0.1574, simple_loss=0.2256, pruned_loss=0.04461, over 4749.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2077, pruned_loss=0.02949, over 972713.25 frames.], batch size: 16, lr: 1.45e-04 2022-05-08 12:33:26,759 INFO [train.py:715] (0/8) Epoch 15, batch 26900, loss[loss=0.125, simple_loss=0.1955, pruned_loss=0.0273, over 4696.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2085, pruned_loss=0.02993, over 971501.31 frames.], batch size: 15, lr: 1.45e-04 2022-05-08 12:34:04,499 INFO [train.py:715] (0/8) Epoch 15, batch 26950, loss[loss=0.1408, simple_loss=0.2196, pruned_loss=0.03094, over 4876.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2083, pruned_loss=0.03002, over 972388.74 frames.], batch size: 22, lr: 1.45e-04 2022-05-08 12:34:42,582 INFO [train.py:715] (0/8) Epoch 15, batch 27000, loss[loss=0.1378, simple_loss=0.2203, pruned_loss=0.02766, over 4892.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2083, pruned_loss=0.02974, over 971987.65 frames.], batch size: 22, lr: 1.45e-04 2022-05-08 12:34:42,583 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 12:34:52,204 INFO [train.py:742] (0/8) Epoch 15, validation: loss=0.1049, simple_loss=0.1884, pruned_loss=0.01064, over 914524.00 frames. 
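Note how the per-batch loss[...] values bounce around (each covers only ~4,600-5,000 frames) while tot_loss[...] stays smooth and is always reported over roughly 972,000 frames, a window that barely changes from entry to entry. That behaviour is what a decayed running sum with a horizon of about 200 batches would produce; a sketch of that bookkeeping under that assumption (the actual accounting lives in train.py and may differ in detail):

    # Decayed running total: a horizon of ~200 batches gives the near-constant
    # ~972k-frame window seen in the tot_loss entries above (assumption, not read
    # from the code).
    class RunningTotal:
        def __init__(self, horizon=200):
            self.decay = 1.0 - 1.0 / horizon
            self.loss_sum = 0.0
            self.frames = 0.0

        def update(self, batch_loss_sum, batch_frames):
            self.loss_sum = self.loss_sum * self.decay + batch_loss_sum
            self.frames = self.frames * self.decay + batch_frames
            # per-frame tot_loss and the effective frame window it covers
            return self.loss_sum / self.frames, self.frames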
2022-05-08 12:35:31,294 INFO [train.py:715] (0/8) Epoch 15, batch 27050, loss[loss=0.1156, simple_loss=0.1797, pruned_loss=0.02581, over 4836.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2078, pruned_loss=0.02954, over 971967.06 frames.], batch size: 32, lr: 1.45e-04 2022-05-08 12:36:10,017 INFO [train.py:715] (0/8) Epoch 15, batch 27100, loss[loss=0.1408, simple_loss=0.2084, pruned_loss=0.03658, over 4843.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2093, pruned_loss=0.03005, over 972581.00 frames.], batch size: 20, lr: 1.45e-04 2022-05-08 12:36:48,672 INFO [train.py:715] (0/8) Epoch 15, batch 27150, loss[loss=0.1248, simple_loss=0.1885, pruned_loss=0.03055, over 4919.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2091, pruned_loss=0.03027, over 972420.08 frames.], batch size: 18, lr: 1.45e-04 2022-05-08 12:37:26,863 INFO [train.py:715] (0/8) Epoch 15, batch 27200, loss[loss=0.1178, simple_loss=0.1963, pruned_loss=0.01968, over 4818.00 frames.], tot_loss[loss=0.1347, simple_loss=0.209, pruned_loss=0.03022, over 972523.07 frames.], batch size: 27, lr: 1.45e-04 2022-05-08 12:38:05,900 INFO [train.py:715] (0/8) Epoch 15, batch 27250, loss[loss=0.1482, simple_loss=0.2204, pruned_loss=0.03797, over 4964.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2091, pruned_loss=0.03024, over 973632.91 frames.], batch size: 24, lr: 1.45e-04 2022-05-08 12:38:43,691 INFO [train.py:715] (0/8) Epoch 15, batch 27300, loss[loss=0.1372, simple_loss=0.2073, pruned_loss=0.03358, over 4842.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2094, pruned_loss=0.03006, over 973368.15 frames.], batch size: 30, lr: 1.45e-04 2022-05-08 12:39:21,919 INFO [train.py:715] (0/8) Epoch 15, batch 27350, loss[loss=0.1324, simple_loss=0.2019, pruned_loss=0.03144, over 4828.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2079, pruned_loss=0.02985, over 973161.33 frames.], batch size: 25, lr: 1.45e-04 2022-05-08 12:40:00,094 INFO [train.py:715] (0/8) Epoch 15, batch 27400, loss[loss=0.1586, simple_loss=0.2316, pruned_loss=0.04282, over 4921.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2076, pruned_loss=0.02959, over 973298.16 frames.], batch size: 18, lr: 1.45e-04 2022-05-08 12:40:38,393 INFO [train.py:715] (0/8) Epoch 15, batch 27450, loss[loss=0.1217, simple_loss=0.2027, pruned_loss=0.02035, over 4823.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2082, pruned_loss=0.02994, over 973573.37 frames.], batch size: 15, lr: 1.45e-04 2022-05-08 12:41:16,657 INFO [train.py:715] (0/8) Epoch 15, batch 27500, loss[loss=0.1467, simple_loss=0.2138, pruned_loss=0.03976, over 4908.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2085, pruned_loss=0.03043, over 973438.48 frames.], batch size: 39, lr: 1.45e-04 2022-05-08 12:41:54,847 INFO [train.py:715] (0/8) Epoch 15, batch 27550, loss[loss=0.1295, simple_loss=0.2073, pruned_loss=0.02578, over 4990.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2095, pruned_loss=0.03072, over 973126.70 frames.], batch size: 14, lr: 1.45e-04 2022-05-08 12:42:33,398 INFO [train.py:715] (0/8) Epoch 15, batch 27600, loss[loss=0.1215, simple_loss=0.2089, pruned_loss=0.01702, over 4899.00 frames.], tot_loss[loss=0.135, simple_loss=0.209, pruned_loss=0.03047, over 973117.34 frames.], batch size: 19, lr: 1.45e-04 2022-05-08 12:43:10,757 INFO [train.py:715] (0/8) Epoch 15, batch 27650, loss[loss=0.1158, simple_loss=0.192, pruned_loss=0.01983, over 4794.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2086, pruned_loss=0.02981, over 973106.10 frames.], batch size: 14, lr: 1.45e-04 2022-05-08 
12:43:49,456 INFO [train.py:715] (0/8) Epoch 15, batch 27700, loss[loss=0.1203, simple_loss=0.1876, pruned_loss=0.02651, over 4955.00 frames.], tot_loss[loss=0.134, simple_loss=0.208, pruned_loss=0.03001, over 973715.50 frames.], batch size: 24, lr: 1.45e-04 2022-05-08 12:44:27,753 INFO [train.py:715] (0/8) Epoch 15, batch 27750, loss[loss=0.1361, simple_loss=0.2136, pruned_loss=0.02931, over 4927.00 frames.], tot_loss[loss=0.1339, simple_loss=0.208, pruned_loss=0.02996, over 974166.52 frames.], batch size: 23, lr: 1.45e-04 2022-05-08 12:45:06,219 INFO [train.py:715] (0/8) Epoch 15, batch 27800, loss[loss=0.1209, simple_loss=0.2, pruned_loss=0.02093, over 4912.00 frames.], tot_loss[loss=0.134, simple_loss=0.2082, pruned_loss=0.02995, over 972756.14 frames.], batch size: 17, lr: 1.45e-04 2022-05-08 12:45:44,230 INFO [train.py:715] (0/8) Epoch 15, batch 27850, loss[loss=0.1576, simple_loss=0.2356, pruned_loss=0.03976, over 4822.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2085, pruned_loss=0.03011, over 973343.30 frames.], batch size: 27, lr: 1.45e-04 2022-05-08 12:46:21,971 INFO [train.py:715] (0/8) Epoch 15, batch 27900, loss[loss=0.1147, simple_loss=0.1827, pruned_loss=0.02336, over 4874.00 frames.], tot_loss[loss=0.1348, simple_loss=0.209, pruned_loss=0.03031, over 973556.75 frames.], batch size: 22, lr: 1.45e-04 2022-05-08 12:47:00,796 INFO [train.py:715] (0/8) Epoch 15, batch 27950, loss[loss=0.1508, simple_loss=0.2186, pruned_loss=0.04147, over 4966.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2091, pruned_loss=0.0303, over 974067.26 frames.], batch size: 15, lr: 1.45e-04 2022-05-08 12:47:38,665 INFO [train.py:715] (0/8) Epoch 15, batch 28000, loss[loss=0.14, simple_loss=0.22, pruned_loss=0.03004, over 4816.00 frames.], tot_loss[loss=0.134, simple_loss=0.208, pruned_loss=0.02998, over 973542.19 frames.], batch size: 27, lr: 1.45e-04 2022-05-08 12:48:16,878 INFO [train.py:715] (0/8) Epoch 15, batch 28050, loss[loss=0.1277, simple_loss=0.2137, pruned_loss=0.02086, over 4750.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2085, pruned_loss=0.03025, over 972543.13 frames.], batch size: 16, lr: 1.45e-04 2022-05-08 12:48:55,110 INFO [train.py:715] (0/8) Epoch 15, batch 28100, loss[loss=0.1261, simple_loss=0.2071, pruned_loss=0.02257, over 4972.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2082, pruned_loss=0.03003, over 971611.44 frames.], batch size: 24, lr: 1.45e-04 2022-05-08 12:49:33,360 INFO [train.py:715] (0/8) Epoch 15, batch 28150, loss[loss=0.1471, simple_loss=0.2315, pruned_loss=0.03137, over 4820.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2087, pruned_loss=0.02995, over 971597.09 frames.], batch size: 15, lr: 1.45e-04 2022-05-08 12:50:11,121 INFO [train.py:715] (0/8) Epoch 15, batch 28200, loss[loss=0.1337, simple_loss=0.2189, pruned_loss=0.0243, over 4890.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2078, pruned_loss=0.02958, over 971979.03 frames.], batch size: 16, lr: 1.45e-04 2022-05-08 12:50:49,024 INFO [train.py:715] (0/8) Epoch 15, batch 28250, loss[loss=0.1407, simple_loss=0.214, pruned_loss=0.03371, over 4773.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2083, pruned_loss=0.02976, over 971898.11 frames.], batch size: 17, lr: 1.45e-04 2022-05-08 12:51:28,178 INFO [train.py:715] (0/8) Epoch 15, batch 28300, loss[loss=0.1305, simple_loss=0.2111, pruned_loss=0.02494, over 4977.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2088, pruned_loss=0.03012, over 972269.49 frames.], batch size: 24, lr: 1.45e-04 2022-05-08 12:52:05,674 INFO [train.py:715] 
(0/8) Epoch 15, batch 28350, loss[loss=0.1548, simple_loss=0.2235, pruned_loss=0.04299, over 4786.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2086, pruned_loss=0.02997, over 972090.89 frames.], batch size: 18, lr: 1.45e-04 2022-05-08 12:52:43,903 INFO [train.py:715] (0/8) Epoch 15, batch 28400, loss[loss=0.172, simple_loss=0.2166, pruned_loss=0.06375, over 4701.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2078, pruned_loss=0.02994, over 971816.07 frames.], batch size: 15, lr: 1.45e-04 2022-05-08 12:53:22,222 INFO [train.py:715] (0/8) Epoch 15, batch 28450, loss[loss=0.1246, simple_loss=0.2106, pruned_loss=0.01932, over 4862.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2081, pruned_loss=0.02973, over 971622.63 frames.], batch size: 16, lr: 1.45e-04 2022-05-08 12:54:00,372 INFO [train.py:715] (0/8) Epoch 15, batch 28500, loss[loss=0.1389, simple_loss=0.2228, pruned_loss=0.02748, over 4860.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2086, pruned_loss=0.02993, over 970763.11 frames.], batch size: 20, lr: 1.45e-04 2022-05-08 12:54:38,500 INFO [train.py:715] (0/8) Epoch 15, batch 28550, loss[loss=0.1211, simple_loss=0.2001, pruned_loss=0.02101, over 4777.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2085, pruned_loss=0.02997, over 970798.69 frames.], batch size: 17, lr: 1.45e-04 2022-05-08 12:55:16,668 INFO [train.py:715] (0/8) Epoch 15, batch 28600, loss[loss=0.1192, simple_loss=0.189, pruned_loss=0.02469, over 4909.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2079, pruned_loss=0.02984, over 972253.16 frames.], batch size: 17, lr: 1.45e-04 2022-05-08 12:55:55,084 INFO [train.py:715] (0/8) Epoch 15, batch 28650, loss[loss=0.1122, simple_loss=0.1903, pruned_loss=0.01703, over 4980.00 frames.], tot_loss[loss=0.1339, simple_loss=0.208, pruned_loss=0.02988, over 972308.76 frames.], batch size: 28, lr: 1.45e-04 2022-05-08 12:56:32,944 INFO [train.py:715] (0/8) Epoch 15, batch 28700, loss[loss=0.1146, simple_loss=0.1925, pruned_loss=0.01831, over 4848.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2081, pruned_loss=0.02984, over 972872.89 frames.], batch size: 20, lr: 1.45e-04 2022-05-08 12:57:11,381 INFO [train.py:715] (0/8) Epoch 15, batch 28750, loss[loss=0.1136, simple_loss=0.1907, pruned_loss=0.01826, over 4976.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2081, pruned_loss=0.02988, over 974120.08 frames.], batch size: 26, lr: 1.45e-04 2022-05-08 12:57:50,112 INFO [train.py:715] (0/8) Epoch 15, batch 28800, loss[loss=0.1308, simple_loss=0.2007, pruned_loss=0.03041, over 4905.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2091, pruned_loss=0.03059, over 973092.40 frames.], batch size: 17, lr: 1.45e-04 2022-05-08 12:58:28,470 INFO [train.py:715] (0/8) Epoch 15, batch 28850, loss[loss=0.09886, simple_loss=0.1677, pruned_loss=0.015, over 4866.00 frames.], tot_loss[loss=0.1341, simple_loss=0.208, pruned_loss=0.03009, over 972242.06 frames.], batch size: 12, lr: 1.45e-04 2022-05-08 12:59:06,963 INFO [train.py:715] (0/8) Epoch 15, batch 28900, loss[loss=0.1193, simple_loss=0.1837, pruned_loss=0.02743, over 4647.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2085, pruned_loss=0.03038, over 971893.93 frames.], batch size: 13, lr: 1.45e-04 2022-05-08 12:59:45,679 INFO [train.py:715] (0/8) Epoch 15, batch 28950, loss[loss=0.1313, simple_loss=0.2122, pruned_loss=0.02516, over 4924.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2091, pruned_loss=0.03039, over 972575.92 frames.], batch size: 23, lr: 1.45e-04 2022-05-08 13:00:24,852 INFO [train.py:715] (0/8) Epoch 15, batch 
29000, loss[loss=0.1325, simple_loss=0.2214, pruned_loss=0.02181, over 4861.00 frames.], tot_loss[loss=0.1351, simple_loss=0.209, pruned_loss=0.03066, over 971984.38 frames.], batch size: 20, lr: 1.45e-04 2022-05-08 13:01:03,425 INFO [train.py:715] (0/8) Epoch 15, batch 29050, loss[loss=0.1354, simple_loss=0.2052, pruned_loss=0.03276, over 4926.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2092, pruned_loss=0.03081, over 972697.48 frames.], batch size: 23, lr: 1.45e-04 2022-05-08 13:01:42,348 INFO [train.py:715] (0/8) Epoch 15, batch 29100, loss[loss=0.1701, simple_loss=0.2377, pruned_loss=0.05128, over 4753.00 frames.], tot_loss[loss=0.1349, simple_loss=0.209, pruned_loss=0.0304, over 973220.66 frames.], batch size: 16, lr: 1.45e-04 2022-05-08 13:02:21,516 INFO [train.py:715] (0/8) Epoch 15, batch 29150, loss[loss=0.1257, simple_loss=0.2106, pruned_loss=0.02043, over 4901.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2092, pruned_loss=0.03017, over 972567.74 frames.], batch size: 19, lr: 1.45e-04 2022-05-08 13:03:00,528 INFO [train.py:715] (0/8) Epoch 15, batch 29200, loss[loss=0.12, simple_loss=0.2038, pruned_loss=0.01809, over 4936.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2092, pruned_loss=0.03079, over 972841.91 frames.], batch size: 21, lr: 1.45e-04 2022-05-08 13:03:38,940 INFO [train.py:715] (0/8) Epoch 15, batch 29250, loss[loss=0.1446, simple_loss=0.212, pruned_loss=0.03862, over 4877.00 frames.], tot_loss[loss=0.135, simple_loss=0.2091, pruned_loss=0.03045, over 972140.78 frames.], batch size: 39, lr: 1.45e-04 2022-05-08 13:04:17,998 INFO [train.py:715] (0/8) Epoch 15, batch 29300, loss[loss=0.1267, simple_loss=0.2024, pruned_loss=0.02553, over 4645.00 frames.], tot_loss[loss=0.1361, simple_loss=0.2101, pruned_loss=0.031, over 971307.73 frames.], batch size: 13, lr: 1.45e-04 2022-05-08 13:04:56,887 INFO [train.py:715] (0/8) Epoch 15, batch 29350, loss[loss=0.1386, simple_loss=0.2201, pruned_loss=0.0286, over 4945.00 frames.], tot_loss[loss=0.1357, simple_loss=0.2097, pruned_loss=0.03091, over 971869.27 frames.], batch size: 21, lr: 1.45e-04 2022-05-08 13:05:35,474 INFO [train.py:715] (0/8) Epoch 15, batch 29400, loss[loss=0.1372, simple_loss=0.2128, pruned_loss=0.03082, over 4922.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2095, pruned_loss=0.03066, over 972014.58 frames.], batch size: 18, lr: 1.45e-04 2022-05-08 13:06:14,531 INFO [train.py:715] (0/8) Epoch 15, batch 29450, loss[loss=0.1523, simple_loss=0.2102, pruned_loss=0.04717, over 4854.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2091, pruned_loss=0.03063, over 971914.87 frames.], batch size: 32, lr: 1.45e-04 2022-05-08 13:06:53,807 INFO [train.py:715] (0/8) Epoch 15, batch 29500, loss[loss=0.142, simple_loss=0.2202, pruned_loss=0.03188, over 4961.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2086, pruned_loss=0.03047, over 972885.18 frames.], batch size: 35, lr: 1.45e-04 2022-05-08 13:07:31,953 INFO [train.py:715] (0/8) Epoch 15, batch 29550, loss[loss=0.1389, simple_loss=0.2176, pruned_loss=0.0301, over 4868.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2083, pruned_loss=0.03022, over 973240.98 frames.], batch size: 20, lr: 1.45e-04 2022-05-08 13:08:09,732 INFO [train.py:715] (0/8) Epoch 15, batch 29600, loss[loss=0.1608, simple_loss=0.223, pruned_loss=0.04928, over 4836.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2086, pruned_loss=0.03012, over 973065.71 frames.], batch size: 15, lr: 1.45e-04 2022-05-08 13:08:48,777 INFO [train.py:715] (0/8) Epoch 15, batch 29650, loss[loss=0.122, 
simple_loss=0.196, pruned_loss=0.02403, over 4986.00 frames.], tot_loss[loss=0.1338, simple_loss=0.208, pruned_loss=0.0298, over 973183.01 frames.], batch size: 31, lr: 1.45e-04 2022-05-08 13:09:27,541 INFO [train.py:715] (0/8) Epoch 15, batch 29700, loss[loss=0.1666, simple_loss=0.2318, pruned_loss=0.05071, over 4815.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2081, pruned_loss=0.03019, over 972332.25 frames.], batch size: 27, lr: 1.45e-04 2022-05-08 13:10:05,826 INFO [train.py:715] (0/8) Epoch 15, batch 29750, loss[loss=0.108, simple_loss=0.1869, pruned_loss=0.01455, over 4880.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2073, pruned_loss=0.02999, over 972697.43 frames.], batch size: 22, lr: 1.45e-04 2022-05-08 13:10:43,492 INFO [train.py:715] (0/8) Epoch 15, batch 29800, loss[loss=0.1235, simple_loss=0.1922, pruned_loss=0.02736, over 4989.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2073, pruned_loss=0.02974, over 973682.05 frames.], batch size: 14, lr: 1.45e-04 2022-05-08 13:11:22,785 INFO [train.py:715] (0/8) Epoch 15, batch 29850, loss[loss=0.1173, simple_loss=0.1951, pruned_loss=0.01974, over 4820.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2076, pruned_loss=0.02957, over 974011.10 frames.], batch size: 27, lr: 1.45e-04 2022-05-08 13:11:29,443 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-552000.pt 2022-05-08 13:12:04,495 INFO [train.py:715] (0/8) Epoch 15, batch 29900, loss[loss=0.1357, simple_loss=0.2129, pruned_loss=0.02923, over 4682.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2084, pruned_loss=0.02998, over 972686.46 frames.], batch size: 15, lr: 1.45e-04 2022-05-08 13:12:43,056 INFO [train.py:715] (0/8) Epoch 15, batch 29950, loss[loss=0.1371, simple_loss=0.2128, pruned_loss=0.03071, over 4802.00 frames.], tot_loss[loss=0.134, simple_loss=0.2083, pruned_loss=0.02987, over 972270.33 frames.], batch size: 21, lr: 1.45e-04 2022-05-08 13:13:21,387 INFO [train.py:715] (0/8) Epoch 15, batch 30000, loss[loss=0.122, simple_loss=0.1915, pruned_loss=0.02621, over 4848.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2086, pruned_loss=0.02984, over 972450.12 frames.], batch size: 15, lr: 1.45e-04 2022-05-08 13:13:21,388 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 13:13:30,916 INFO [train.py:742] (0/8) Epoch 15, validation: loss=0.1049, simple_loss=0.1885, pruned_loss=0.01066, over 914524.00 frames. 
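The checkpoint written just before this validation pass (checkpoint-552000.pt) is named after the global batch counter, i.e. batches counted across all epochs rather than the per-epoch batch numbers shown in the progress lines, and lands in the experiment directory pruned_transducer_stateless2/exp/v2. A quick way to inspect such a file with plain PyTorch; the key names below are typical rather than guaranteed, so check ckpt.keys() for what this icefall version actually stored:

    import torch

    # Inspect a periodic checkpoint saved by the run above.
    ckpt = torch.load(
        "pruned_transducer_stateless2/exp/v2/checkpoint-552000.pt", map_location="cpu"
    )
    print(sorted(ckpt.keys()))           # e.g. model / optimizer / scheduler state (names may vary)
    print(ckpt.get("batch_idx_train"))   # global batch counter, if this version stores it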
2022-05-08 13:14:09,966 INFO [train.py:715] (0/8) Epoch 15, batch 30050, loss[loss=0.128, simple_loss=0.1965, pruned_loss=0.0298, over 4923.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2079, pruned_loss=0.02967, over 972791.94 frames.], batch size: 23, lr: 1.45e-04 2022-05-08 13:14:49,055 INFO [train.py:715] (0/8) Epoch 15, batch 30100, loss[loss=0.1097, simple_loss=0.1731, pruned_loss=0.02314, over 4832.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2082, pruned_loss=0.02954, over 972305.63 frames.], batch size: 15, lr: 1.45e-04 2022-05-08 13:15:28,214 INFO [train.py:715] (0/8) Epoch 15, batch 30150, loss[loss=0.1177, simple_loss=0.1911, pruned_loss=0.02213, over 4784.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2082, pruned_loss=0.02963, over 972339.27 frames.], batch size: 18, lr: 1.45e-04 2022-05-08 13:16:07,077 INFO [train.py:715] (0/8) Epoch 15, batch 30200, loss[loss=0.149, simple_loss=0.2244, pruned_loss=0.03677, over 4972.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2081, pruned_loss=0.02944, over 972689.71 frames.], batch size: 15, lr: 1.45e-04 2022-05-08 13:16:46,372 INFO [train.py:715] (0/8) Epoch 15, batch 30250, loss[loss=0.1185, simple_loss=0.1861, pruned_loss=0.02542, over 4907.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2088, pruned_loss=0.02975, over 972503.61 frames.], batch size: 18, lr: 1.45e-04 2022-05-08 13:17:25,195 INFO [train.py:715] (0/8) Epoch 15, batch 30300, loss[loss=0.1411, simple_loss=0.2208, pruned_loss=0.03075, over 4959.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2086, pruned_loss=0.02994, over 973579.08 frames.], batch size: 35, lr: 1.45e-04 2022-05-08 13:18:03,166 INFO [train.py:715] (0/8) Epoch 15, batch 30350, loss[loss=0.1447, simple_loss=0.2322, pruned_loss=0.02859, over 4879.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2083, pruned_loss=0.02992, over 973369.64 frames.], batch size: 22, lr: 1.45e-04 2022-05-08 13:18:42,388 INFO [train.py:715] (0/8) Epoch 15, batch 30400, loss[loss=0.1219, simple_loss=0.1951, pruned_loss=0.02434, over 4757.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2079, pruned_loss=0.02968, over 971726.74 frames.], batch size: 19, lr: 1.45e-04 2022-05-08 13:19:21,251 INFO [train.py:715] (0/8) Epoch 15, batch 30450, loss[loss=0.1349, simple_loss=0.2103, pruned_loss=0.02976, over 4934.00 frames.], tot_loss[loss=0.134, simple_loss=0.2084, pruned_loss=0.02975, over 973128.86 frames.], batch size: 23, lr: 1.45e-04 2022-05-08 13:20:00,130 INFO [train.py:715] (0/8) Epoch 15, batch 30500, loss[loss=0.1472, simple_loss=0.227, pruned_loss=0.03375, over 4988.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2086, pruned_loss=0.02991, over 973465.21 frames.], batch size: 15, lr: 1.45e-04 2022-05-08 13:20:38,341 INFO [train.py:715] (0/8) Epoch 15, batch 30550, loss[loss=0.1431, simple_loss=0.2206, pruned_loss=0.03278, over 4842.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2088, pruned_loss=0.02999, over 972288.66 frames.], batch size: 20, lr: 1.45e-04 2022-05-08 13:21:17,385 INFO [train.py:715] (0/8) Epoch 15, batch 30600, loss[loss=0.1388, simple_loss=0.2043, pruned_loss=0.03667, over 4795.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2086, pruned_loss=0.03002, over 972854.42 frames.], batch size: 13, lr: 1.45e-04 2022-05-08 13:21:56,191 INFO [train.py:715] (0/8) Epoch 15, batch 30650, loss[loss=0.1557, simple_loss=0.2203, pruned_loss=0.04556, over 4970.00 frames.], tot_loss[loss=0.134, simple_loss=0.2078, pruned_loss=0.03007, over 972961.31 frames.], batch size: 35, lr: 1.45e-04 2022-05-08 13:22:34,335 
INFO [train.py:715] (0/8) Epoch 15, batch 30700, loss[loss=0.1389, simple_loss=0.2083, pruned_loss=0.03471, over 4772.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2078, pruned_loss=0.03026, over 973554.17 frames.], batch size: 14, lr: 1.45e-04 2022-05-08 13:23:13,401 INFO [train.py:715] (0/8) Epoch 15, batch 30750, loss[loss=0.1289, simple_loss=0.2181, pruned_loss=0.01983, over 4832.00 frames.], tot_loss[loss=0.1343, simple_loss=0.208, pruned_loss=0.03029, over 973001.53 frames.], batch size: 15, lr: 1.45e-04 2022-05-08 13:23:52,073 INFO [train.py:715] (0/8) Epoch 15, batch 30800, loss[loss=0.1262, simple_loss=0.2087, pruned_loss=0.02188, over 4891.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2078, pruned_loss=0.02979, over 973390.96 frames.], batch size: 22, lr: 1.45e-04 2022-05-08 13:24:30,178 INFO [train.py:715] (0/8) Epoch 15, batch 30850, loss[loss=0.1105, simple_loss=0.1807, pruned_loss=0.02016, over 4830.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2077, pruned_loss=0.03006, over 973319.60 frames.], batch size: 25, lr: 1.45e-04 2022-05-08 13:25:08,414 INFO [train.py:715] (0/8) Epoch 15, batch 30900, loss[loss=0.1431, simple_loss=0.2219, pruned_loss=0.03219, over 4901.00 frames.], tot_loss[loss=0.1341, simple_loss=0.208, pruned_loss=0.0301, over 972375.61 frames.], batch size: 17, lr: 1.45e-04 2022-05-08 13:25:46,853 INFO [train.py:715] (0/8) Epoch 15, batch 30950, loss[loss=0.1296, simple_loss=0.2054, pruned_loss=0.02691, over 4983.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2085, pruned_loss=0.03013, over 971964.64 frames.], batch size: 35, lr: 1.45e-04 2022-05-08 13:26:25,008 INFO [train.py:715] (0/8) Epoch 15, batch 31000, loss[loss=0.1414, simple_loss=0.2248, pruned_loss=0.02901, over 4891.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2086, pruned_loss=0.03006, over 972529.14 frames.], batch size: 17, lr: 1.45e-04 2022-05-08 13:27:02,420 INFO [train.py:715] (0/8) Epoch 15, batch 31050, loss[loss=0.1156, simple_loss=0.1962, pruned_loss=0.01752, over 4822.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2084, pruned_loss=0.02998, over 973288.75 frames.], batch size: 26, lr: 1.45e-04 2022-05-08 13:27:40,732 INFO [train.py:715] (0/8) Epoch 15, batch 31100, loss[loss=0.1328, simple_loss=0.2, pruned_loss=0.03275, over 4793.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2083, pruned_loss=0.02993, over 974152.20 frames.], batch size: 14, lr: 1.45e-04 2022-05-08 13:28:18,884 INFO [train.py:715] (0/8) Epoch 15, batch 31150, loss[loss=0.1093, simple_loss=0.1859, pruned_loss=0.01632, over 4863.00 frames.], tot_loss[loss=0.1336, simple_loss=0.208, pruned_loss=0.02955, over 974078.28 frames.], batch size: 20, lr: 1.45e-04 2022-05-08 13:28:57,276 INFO [train.py:715] (0/8) Epoch 15, batch 31200, loss[loss=0.1299, simple_loss=0.2088, pruned_loss=0.02554, over 4987.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2078, pruned_loss=0.02943, over 974273.89 frames.], batch size: 25, lr: 1.45e-04 2022-05-08 13:29:34,873 INFO [train.py:715] (0/8) Epoch 15, batch 31250, loss[loss=0.1404, simple_loss=0.2131, pruned_loss=0.03383, over 4939.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2076, pruned_loss=0.02953, over 973135.51 frames.], batch size: 35, lr: 1.45e-04 2022-05-08 13:30:13,195 INFO [train.py:715] (0/8) Epoch 15, batch 31300, loss[loss=0.1139, simple_loss=0.1934, pruned_loss=0.01724, over 4786.00 frames.], tot_loss[loss=0.133, simple_loss=0.2074, pruned_loss=0.0293, over 973422.56 frames.], batch size: 18, lr: 1.45e-04 2022-05-08 13:30:51,246 INFO [train.py:715] (0/8) 
Epoch 15, batch 31350, loss[loss=0.1069, simple_loss=0.1816, pruned_loss=0.01611, over 4792.00 frames.], tot_loss[loss=0.1328, simple_loss=0.207, pruned_loss=0.02932, over 973081.95 frames.], batch size: 21, lr: 1.45e-04 2022-05-08 13:31:28,509 INFO [train.py:715] (0/8) Epoch 15, batch 31400, loss[loss=0.1388, simple_loss=0.2111, pruned_loss=0.03324, over 4988.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2065, pruned_loss=0.02903, over 972946.41 frames.], batch size: 31, lr: 1.45e-04 2022-05-08 13:32:06,852 INFO [train.py:715] (0/8) Epoch 15, batch 31450, loss[loss=0.1268, simple_loss=0.2057, pruned_loss=0.02397, over 4807.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2068, pruned_loss=0.02907, over 973005.91 frames.], batch size: 24, lr: 1.45e-04 2022-05-08 13:32:45,115 INFO [train.py:715] (0/8) Epoch 15, batch 31500, loss[loss=0.1109, simple_loss=0.1892, pruned_loss=0.0163, over 4759.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2054, pruned_loss=0.02856, over 973299.72 frames.], batch size: 12, lr: 1.45e-04 2022-05-08 13:33:23,443 INFO [train.py:715] (0/8) Epoch 15, batch 31550, loss[loss=0.124, simple_loss=0.1998, pruned_loss=0.02414, over 4779.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2059, pruned_loss=0.02895, over 972545.60 frames.], batch size: 14, lr: 1.45e-04 2022-05-08 13:34:01,210 INFO [train.py:715] (0/8) Epoch 15, batch 31600, loss[loss=0.1188, simple_loss=0.1926, pruned_loss=0.02247, over 4907.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2068, pruned_loss=0.0291, over 972359.43 frames.], batch size: 18, lr: 1.45e-04 2022-05-08 13:34:39,665 INFO [train.py:715] (0/8) Epoch 15, batch 31650, loss[loss=0.1495, simple_loss=0.2233, pruned_loss=0.03779, over 4791.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2069, pruned_loss=0.02918, over 972470.02 frames.], batch size: 21, lr: 1.45e-04 2022-05-08 13:35:17,998 INFO [train.py:715] (0/8) Epoch 15, batch 31700, loss[loss=0.1411, simple_loss=0.2187, pruned_loss=0.03174, over 4934.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2066, pruned_loss=0.02919, over 973359.91 frames.], batch size: 21, lr: 1.45e-04 2022-05-08 13:35:55,493 INFO [train.py:715] (0/8) Epoch 15, batch 31750, loss[loss=0.1367, simple_loss=0.2038, pruned_loss=0.0348, over 4976.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2082, pruned_loss=0.03043, over 973146.11 frames.], batch size: 15, lr: 1.45e-04 2022-05-08 13:36:34,375 INFO [train.py:715] (0/8) Epoch 15, batch 31800, loss[loss=0.1378, simple_loss=0.2108, pruned_loss=0.03239, over 4773.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2076, pruned_loss=0.02999, over 972553.58 frames.], batch size: 19, lr: 1.45e-04 2022-05-08 13:37:12,847 INFO [train.py:715] (0/8) Epoch 15, batch 31850, loss[loss=0.1308, simple_loss=0.1997, pruned_loss=0.03098, over 4975.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2069, pruned_loss=0.03001, over 973142.02 frames.], batch size: 14, lr: 1.45e-04 2022-05-08 13:37:52,386 INFO [train.py:715] (0/8) Epoch 15, batch 31900, loss[loss=0.1279, simple_loss=0.211, pruned_loss=0.02243, over 4769.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2076, pruned_loss=0.03008, over 973091.73 frames.], batch size: 18, lr: 1.45e-04 2022-05-08 13:38:29,675 INFO [train.py:715] (0/8) Epoch 15, batch 31950, loss[loss=0.1313, simple_loss=0.2157, pruned_loss=0.02349, over 4893.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2076, pruned_loss=0.03016, over 972641.37 frames.], batch size: 17, lr: 1.45e-04 2022-05-08 13:39:08,338 INFO [train.py:715] (0/8) Epoch 15, batch 32000, 
loss[loss=0.138, simple_loss=0.2157, pruned_loss=0.03021, over 4948.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2074, pruned_loss=0.02964, over 972787.50 frames.], batch size: 39, lr: 1.45e-04 2022-05-08 13:39:46,515 INFO [train.py:715] (0/8) Epoch 15, batch 32050, loss[loss=0.117, simple_loss=0.1887, pruned_loss=0.02265, over 4792.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2075, pruned_loss=0.02968, over 971938.97 frames.], batch size: 24, lr: 1.45e-04 2022-05-08 13:40:23,945 INFO [train.py:715] (0/8) Epoch 15, batch 32100, loss[loss=0.1204, simple_loss=0.1996, pruned_loss=0.02056, over 4807.00 frames.], tot_loss[loss=0.133, simple_loss=0.2074, pruned_loss=0.02926, over 972228.76 frames.], batch size: 25, lr: 1.45e-04 2022-05-08 13:41:02,339 INFO [train.py:715] (0/8) Epoch 15, batch 32150, loss[loss=0.1356, simple_loss=0.212, pruned_loss=0.02958, over 4850.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2078, pruned_loss=0.02948, over 972791.08 frames.], batch size: 22, lr: 1.45e-04 2022-05-08 13:41:40,500 INFO [train.py:715] (0/8) Epoch 15, batch 32200, loss[loss=0.1196, simple_loss=0.1937, pruned_loss=0.02277, over 4761.00 frames.], tot_loss[loss=0.134, simple_loss=0.2085, pruned_loss=0.02977, over 972232.71 frames.], batch size: 17, lr: 1.45e-04 2022-05-08 13:42:19,020 INFO [train.py:715] (0/8) Epoch 15, batch 32250, loss[loss=0.1496, simple_loss=0.2309, pruned_loss=0.03414, over 4707.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2081, pruned_loss=0.02977, over 971140.58 frames.], batch size: 15, lr: 1.45e-04 2022-05-08 13:42:56,887 INFO [train.py:715] (0/8) Epoch 15, batch 32300, loss[loss=0.139, simple_loss=0.2265, pruned_loss=0.02579, over 4843.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2079, pruned_loss=0.02932, over 971381.72 frames.], batch size: 26, lr: 1.45e-04 2022-05-08 13:43:35,756 INFO [train.py:715] (0/8) Epoch 15, batch 32350, loss[loss=0.1322, simple_loss=0.2036, pruned_loss=0.03044, over 4710.00 frames.], tot_loss[loss=0.1334, simple_loss=0.208, pruned_loss=0.02936, over 971196.20 frames.], batch size: 15, lr: 1.45e-04 2022-05-08 13:44:14,151 INFO [train.py:715] (0/8) Epoch 15, batch 32400, loss[loss=0.1067, simple_loss=0.1749, pruned_loss=0.01928, over 4968.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2071, pruned_loss=0.02926, over 972014.36 frames.], batch size: 14, lr: 1.45e-04 2022-05-08 13:44:51,906 INFO [train.py:715] (0/8) Epoch 15, batch 32450, loss[loss=0.1559, simple_loss=0.217, pruned_loss=0.04737, over 4885.00 frames.], tot_loss[loss=0.133, simple_loss=0.2067, pruned_loss=0.02959, over 972288.78 frames.], batch size: 16, lr: 1.45e-04 2022-05-08 13:45:30,472 INFO [train.py:715] (0/8) Epoch 15, batch 32500, loss[loss=0.1115, simple_loss=0.1896, pruned_loss=0.01676, over 4638.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2065, pruned_loss=0.0294, over 971993.85 frames.], batch size: 13, lr: 1.45e-04 2022-05-08 13:46:08,933 INFO [train.py:715] (0/8) Epoch 15, batch 32550, loss[loss=0.142, simple_loss=0.2159, pruned_loss=0.03403, over 4834.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2061, pruned_loss=0.02924, over 971361.46 frames.], batch size: 15, lr: 1.45e-04 2022-05-08 13:46:47,812 INFO [train.py:715] (0/8) Epoch 15, batch 32600, loss[loss=0.1366, simple_loss=0.2049, pruned_loss=0.03411, over 4847.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2054, pruned_loss=0.02917, over 971591.39 frames.], batch size: 20, lr: 1.45e-04 2022-05-08 13:47:26,414 INFO [train.py:715] (0/8) Epoch 15, batch 32650, loss[loss=0.1444, 
simple_loss=0.2146, pruned_loss=0.03715, over 4825.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2054, pruned_loss=0.0289, over 972035.71 frames.], batch size: 15, lr: 1.45e-04 2022-05-08 13:48:05,084 INFO [train.py:715] (0/8) Epoch 15, batch 32700, loss[loss=0.1311, simple_loss=0.2044, pruned_loss=0.02894, over 4831.00 frames.], tot_loss[loss=0.131, simple_loss=0.2049, pruned_loss=0.0286, over 972551.19 frames.], batch size: 30, lr: 1.45e-04 2022-05-08 13:48:43,310 INFO [train.py:715] (0/8) Epoch 15, batch 32750, loss[loss=0.1418, simple_loss=0.2212, pruned_loss=0.03122, over 4850.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2057, pruned_loss=0.02929, over 972591.44 frames.], batch size: 30, lr: 1.45e-04 2022-05-08 13:49:21,518 INFO [train.py:715] (0/8) Epoch 15, batch 32800, loss[loss=0.1256, simple_loss=0.2013, pruned_loss=0.02498, over 4794.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2062, pruned_loss=0.02901, over 973123.23 frames.], batch size: 24, lr: 1.45e-04 2022-05-08 13:49:59,265 INFO [train.py:715] (0/8) Epoch 15, batch 32850, loss[loss=0.1537, simple_loss=0.2262, pruned_loss=0.04055, over 4777.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2068, pruned_loss=0.02939, over 972054.33 frames.], batch size: 14, lr: 1.45e-04 2022-05-08 13:50:37,483 INFO [train.py:715] (0/8) Epoch 15, batch 32900, loss[loss=0.1232, simple_loss=0.1972, pruned_loss=0.02458, over 4833.00 frames.], tot_loss[loss=0.133, simple_loss=0.2068, pruned_loss=0.02962, over 972195.33 frames.], batch size: 30, lr: 1.45e-04 2022-05-08 13:51:16,076 INFO [train.py:715] (0/8) Epoch 15, batch 32950, loss[loss=0.1264, simple_loss=0.1985, pruned_loss=0.02712, over 4834.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2063, pruned_loss=0.02947, over 971944.20 frames.], batch size: 30, lr: 1.45e-04 2022-05-08 13:51:54,463 INFO [train.py:715] (0/8) Epoch 15, batch 33000, loss[loss=0.1568, simple_loss=0.2294, pruned_loss=0.04212, over 4841.00 frames.], tot_loss[loss=0.133, simple_loss=0.2066, pruned_loss=0.02968, over 971027.10 frames.], batch size: 20, lr: 1.45e-04 2022-05-08 13:51:54,464 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 13:52:03,986 INFO [train.py:742] (0/8) Epoch 15, validation: loss=0.1052, simple_loss=0.1886, pruned_loss=0.01088, over 914524.00 frames. 
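The four validation passes in this stretch land at 0.105, 0.1049, 0.1049 and 0.1052, so the held-out loss is essentially flat across epoch 15. Pulling those entries out of a log like this one takes only a few lines; the log path below is a placeholder:

    import re

    # Print every "Epoch N, validation: loss=..." entry from a training log.
    val_re = re.compile(r"Epoch (\d+), validation: loss=([\d.]+)")
    with open("exp/v2/train.log") as f:  # placeholder path
        for line in f:
            for epoch, loss in val_re.findall(line):
                print(f"epoch {epoch}: validation loss {loss}")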
2022-05-08 13:52:42,022 INFO [train.py:715] (0/8) Epoch 15, batch 33050, loss[loss=0.1196, simple_loss=0.1917, pruned_loss=0.02373, over 4988.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2063, pruned_loss=0.02946, over 971969.01 frames.], batch size: 28, lr: 1.45e-04 2022-05-08 13:53:20,374 INFO [train.py:715] (0/8) Epoch 15, batch 33100, loss[loss=0.135, simple_loss=0.2306, pruned_loss=0.01972, over 4987.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2066, pruned_loss=0.02964, over 973026.12 frames.], batch size: 15, lr: 1.45e-04 2022-05-08 13:53:58,081 INFO [train.py:715] (0/8) Epoch 15, batch 33150, loss[loss=0.1429, simple_loss=0.2219, pruned_loss=0.03199, over 4781.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2072, pruned_loss=0.02985, over 973293.76 frames.], batch size: 17, lr: 1.44e-04 2022-05-08 13:54:37,160 INFO [train.py:715] (0/8) Epoch 15, batch 33200, loss[loss=0.1249, simple_loss=0.2035, pruned_loss=0.02316, over 4791.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2074, pruned_loss=0.02984, over 971917.30 frames.], batch size: 17, lr: 1.44e-04 2022-05-08 13:55:15,595 INFO [train.py:715] (0/8) Epoch 15, batch 33250, loss[loss=0.1237, simple_loss=0.2036, pruned_loss=0.02191, over 4878.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2073, pruned_loss=0.0298, over 971799.19 frames.], batch size: 22, lr: 1.44e-04 2022-05-08 13:55:53,704 INFO [train.py:715] (0/8) Epoch 15, batch 33300, loss[loss=0.1517, simple_loss=0.2384, pruned_loss=0.03248, over 4916.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2081, pruned_loss=0.03031, over 970955.72 frames.], batch size: 17, lr: 1.44e-04 2022-05-08 13:56:31,674 INFO [train.py:715] (0/8) Epoch 15, batch 33350, loss[loss=0.1542, simple_loss=0.2346, pruned_loss=0.03691, over 4832.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2076, pruned_loss=0.03009, over 971096.34 frames.], batch size: 27, lr: 1.44e-04 2022-05-08 13:57:09,331 INFO [train.py:715] (0/8) Epoch 15, batch 33400, loss[loss=0.1314, simple_loss=0.2005, pruned_loss=0.03118, over 4902.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2088, pruned_loss=0.03039, over 970639.06 frames.], batch size: 19, lr: 1.44e-04 2022-05-08 13:57:47,383 INFO [train.py:715] (0/8) Epoch 15, batch 33450, loss[loss=0.1239, simple_loss=0.1898, pruned_loss=0.02904, over 4970.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2085, pruned_loss=0.03037, over 971323.63 frames.], batch size: 15, lr: 1.44e-04 2022-05-08 13:58:25,100 INFO [train.py:715] (0/8) Epoch 15, batch 33500, loss[loss=0.1122, simple_loss=0.1917, pruned_loss=0.01642, over 4949.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2084, pruned_loss=0.03017, over 971758.96 frames.], batch size: 21, lr: 1.44e-04 2022-05-08 13:59:02,922 INFO [train.py:715] (0/8) Epoch 15, batch 33550, loss[loss=0.1381, simple_loss=0.2015, pruned_loss=0.03737, over 4926.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2076, pruned_loss=0.0297, over 971924.65 frames.], batch size: 18, lr: 1.44e-04 2022-05-08 13:59:40,599 INFO [train.py:715] (0/8) Epoch 15, batch 33600, loss[loss=0.1313, simple_loss=0.1932, pruned_loss=0.03466, over 4879.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2083, pruned_loss=0.02974, over 972176.85 frames.], batch size: 20, lr: 1.44e-04 2022-05-08 14:00:18,650 INFO [train.py:715] (0/8) Epoch 15, batch 33650, loss[loss=0.1298, simple_loss=0.2061, pruned_loss=0.02681, over 4787.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2087, pruned_loss=0.02984, over 972391.80 frames.], batch size: 17, lr: 1.44e-04 2022-05-08 
14:00:56,117 INFO [train.py:715] (0/8) Epoch 15, batch 33700, loss[loss=0.1426, simple_loss=0.221, pruned_loss=0.03212, over 4834.00 frames.], tot_loss[loss=0.1338, simple_loss=0.208, pruned_loss=0.02979, over 972169.44 frames.], batch size: 30, lr: 1.44e-04 2022-05-08 14:01:33,643 INFO [train.py:715] (0/8) Epoch 15, batch 33750, loss[loss=0.147, simple_loss=0.2197, pruned_loss=0.03711, over 4871.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2079, pruned_loss=0.02977, over 972769.03 frames.], batch size: 20, lr: 1.44e-04 2022-05-08 14:02:11,482 INFO [train.py:715] (0/8) Epoch 15, batch 33800, loss[loss=0.1333, simple_loss=0.2176, pruned_loss=0.02456, over 4907.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2079, pruned_loss=0.0299, over 973471.38 frames.], batch size: 19, lr: 1.44e-04 2022-05-08 14:02:48,672 INFO [train.py:715] (0/8) Epoch 15, batch 33850, loss[loss=0.1211, simple_loss=0.1983, pruned_loss=0.02196, over 4961.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2079, pruned_loss=0.0301, over 973158.99 frames.], batch size: 24, lr: 1.44e-04 2022-05-08 14:03:26,482 INFO [train.py:715] (0/8) Epoch 15, batch 33900, loss[loss=0.126, simple_loss=0.1968, pruned_loss=0.02759, over 4842.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2079, pruned_loss=0.0299, over 974299.57 frames.], batch size: 30, lr: 1.44e-04 2022-05-08 14:04:04,821 INFO [train.py:715] (0/8) Epoch 15, batch 33950, loss[loss=0.1353, simple_loss=0.2043, pruned_loss=0.03317, over 4768.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2082, pruned_loss=0.02954, over 973590.66 frames.], batch size: 18, lr: 1.44e-04 2022-05-08 14:04:42,872 INFO [train.py:715] (0/8) Epoch 15, batch 34000, loss[loss=0.1202, simple_loss=0.2023, pruned_loss=0.01904, over 4833.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2086, pruned_loss=0.02957, over 973624.72 frames.], batch size: 26, lr: 1.44e-04 2022-05-08 14:05:20,766 INFO [train.py:715] (0/8) Epoch 15, batch 34050, loss[loss=0.1117, simple_loss=0.1866, pruned_loss=0.01839, over 4807.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2088, pruned_loss=0.02989, over 973530.46 frames.], batch size: 21, lr: 1.44e-04 2022-05-08 14:05:58,928 INFO [train.py:715] (0/8) Epoch 15, batch 34100, loss[loss=0.1463, simple_loss=0.2275, pruned_loss=0.0325, over 4699.00 frames.], tot_loss[loss=0.1345, simple_loss=0.209, pruned_loss=0.03, over 972842.86 frames.], batch size: 12, lr: 1.44e-04 2022-05-08 14:06:37,186 INFO [train.py:715] (0/8) Epoch 15, batch 34150, loss[loss=0.1265, simple_loss=0.2068, pruned_loss=0.02313, over 4983.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2089, pruned_loss=0.03007, over 972596.60 frames.], batch size: 26, lr: 1.44e-04 2022-05-08 14:07:14,885 INFO [train.py:715] (0/8) Epoch 15, batch 34200, loss[loss=0.1223, simple_loss=0.2026, pruned_loss=0.02098, over 4803.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2081, pruned_loss=0.0295, over 972223.62 frames.], batch size: 25, lr: 1.44e-04 2022-05-08 14:07:52,717 INFO [train.py:715] (0/8) Epoch 15, batch 34250, loss[loss=0.1305, simple_loss=0.2185, pruned_loss=0.02127, over 4815.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2073, pruned_loss=0.02914, over 972339.87 frames.], batch size: 25, lr: 1.44e-04 2022-05-08 14:08:30,682 INFO [train.py:715] (0/8) Epoch 15, batch 34300, loss[loss=0.1371, simple_loss=0.2157, pruned_loss=0.0293, over 4978.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2076, pruned_loss=0.02951, over 972900.79 frames.], batch size: 28, lr: 1.44e-04 2022-05-08 14:09:08,609 INFO [train.py:715] 
(0/8) Epoch 15, batch 34350, loss[loss=0.1269, simple_loss=0.2043, pruned_loss=0.02479, over 4946.00 frames.], tot_loss[loss=0.133, simple_loss=0.2069, pruned_loss=0.02956, over 972799.71 frames.], batch size: 23, lr: 1.44e-04 2022-05-08 14:09:45,969 INFO [train.py:715] (0/8) Epoch 15, batch 34400, loss[loss=0.1471, simple_loss=0.232, pruned_loss=0.03114, over 4780.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2077, pruned_loss=0.02964, over 972663.56 frames.], batch size: 17, lr: 1.44e-04 2022-05-08 14:10:24,170 INFO [train.py:715] (0/8) Epoch 15, batch 34450, loss[loss=0.1275, simple_loss=0.1968, pruned_loss=0.02908, over 4903.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2078, pruned_loss=0.02956, over 971958.98 frames.], batch size: 17, lr: 1.44e-04 2022-05-08 14:11:02,051 INFO [train.py:715] (0/8) Epoch 15, batch 34500, loss[loss=0.1548, simple_loss=0.2248, pruned_loss=0.04246, over 4758.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2072, pruned_loss=0.02949, over 972088.25 frames.], batch size: 19, lr: 1.44e-04 2022-05-08 14:11:39,388 INFO [train.py:715] (0/8) Epoch 15, batch 34550, loss[loss=0.1399, simple_loss=0.2082, pruned_loss=0.0358, over 4957.00 frames.], tot_loss[loss=0.133, simple_loss=0.2071, pruned_loss=0.0294, over 972316.19 frames.], batch size: 21, lr: 1.44e-04 2022-05-08 14:12:17,003 INFO [train.py:715] (0/8) Epoch 15, batch 34600, loss[loss=0.1501, simple_loss=0.2239, pruned_loss=0.03815, over 4975.00 frames.], tot_loss[loss=0.1335, simple_loss=0.208, pruned_loss=0.02951, over 972308.12 frames.], batch size: 24, lr: 1.44e-04 2022-05-08 14:12:54,929 INFO [train.py:715] (0/8) Epoch 15, batch 34650, loss[loss=0.1404, simple_loss=0.212, pruned_loss=0.03442, over 4872.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2082, pruned_loss=0.02983, over 972551.82 frames.], batch size: 32, lr: 1.44e-04 2022-05-08 14:13:32,472 INFO [train.py:715] (0/8) Epoch 15, batch 34700, loss[loss=0.1036, simple_loss=0.1734, pruned_loss=0.01686, over 4824.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2081, pruned_loss=0.02989, over 972288.42 frames.], batch size: 12, lr: 1.44e-04 2022-05-08 14:14:09,601 INFO [train.py:715] (0/8) Epoch 15, batch 34750, loss[loss=0.1845, simple_loss=0.2553, pruned_loss=0.05688, over 4769.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2082, pruned_loss=0.02982, over 971831.02 frames.], batch size: 19, lr: 1.44e-04 2022-05-08 14:14:44,837 INFO [train.py:715] (0/8) Epoch 15, batch 34800, loss[loss=0.147, simple_loss=0.2245, pruned_loss=0.03471, over 4910.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2077, pruned_loss=0.02999, over 972201.07 frames.], batch size: 18, lr: 1.44e-04 2022-05-08 14:14:52,795 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/epoch-15.pt 2022-05-08 14:15:33,453 INFO [train.py:715] (0/8) Epoch 16, batch 0, loss[loss=0.1271, simple_loss=0.2045, pruned_loss=0.02485, over 4883.00 frames.], tot_loss[loss=0.1271, simple_loss=0.2045, pruned_loss=0.02485, over 4883.00 frames.], batch size: 22, lr: 1.40e-04 2022-05-08 14:16:11,640 INFO [train.py:715] (0/8) Epoch 16, batch 50, loss[loss=0.1202, simple_loss=0.1996, pruned_loss=0.02041, over 4961.00 frames.], tot_loss[loss=0.1364, simple_loss=0.2103, pruned_loss=0.03126, over 219919.52 frames.], batch size: 39, lr: 1.40e-04 2022-05-08 14:16:50,210 INFO [train.py:715] (0/8) Epoch 16, batch 100, loss[loss=0.1141, simple_loss=0.196, pruned_loss=0.01613, over 4944.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2084, pruned_loss=0.03066, over 
386929.14 frames.], batch size: 29, lr: 1.40e-04 2022-05-08 14:17:27,943 INFO [train.py:715] (0/8) Epoch 16, batch 150, loss[loss=0.1386, simple_loss=0.2149, pruned_loss=0.03116, over 4807.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2092, pruned_loss=0.03087, over 516314.72 frames.], batch size: 21, lr: 1.40e-04 2022-05-08 14:18:06,149 INFO [train.py:715] (0/8) Epoch 16, batch 200, loss[loss=0.1516, simple_loss=0.233, pruned_loss=0.03512, over 4834.00 frames.], tot_loss[loss=0.134, simple_loss=0.2081, pruned_loss=0.02999, over 617352.39 frames.], batch size: 26, lr: 1.40e-04 2022-05-08 14:18:44,276 INFO [train.py:715] (0/8) Epoch 16, batch 250, loss[loss=0.1308, simple_loss=0.2126, pruned_loss=0.02453, over 4779.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2087, pruned_loss=0.03021, over 694990.68 frames.], batch size: 17, lr: 1.40e-04 2022-05-08 14:19:22,598 INFO [train.py:715] (0/8) Epoch 16, batch 300, loss[loss=0.1358, simple_loss=0.1932, pruned_loss=0.03923, over 4871.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2092, pruned_loss=0.03002, over 756212.26 frames.], batch size: 32, lr: 1.40e-04 2022-05-08 14:20:01,026 INFO [train.py:715] (0/8) Epoch 16, batch 350, loss[loss=0.1271, simple_loss=0.1941, pruned_loss=0.03002, over 4752.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2097, pruned_loss=0.03022, over 803541.36 frames.], batch size: 12, lr: 1.40e-04 2022-05-08 14:20:38,705 INFO [train.py:715] (0/8) Epoch 16, batch 400, loss[loss=0.115, simple_loss=0.1959, pruned_loss=0.01707, over 4880.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2096, pruned_loss=0.0301, over 840813.38 frames.], batch size: 16, lr: 1.40e-04 2022-05-08 14:21:17,410 INFO [train.py:715] (0/8) Epoch 16, batch 450, loss[loss=0.155, simple_loss=0.2252, pruned_loss=0.0424, over 4967.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2086, pruned_loss=0.02961, over 870174.52 frames.], batch size: 14, lr: 1.40e-04 2022-05-08 14:21:55,824 INFO [train.py:715] (0/8) Epoch 16, batch 500, loss[loss=0.1563, simple_loss=0.2258, pruned_loss=0.04338, over 4859.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2081, pruned_loss=0.02973, over 892976.10 frames.], batch size: 32, lr: 1.40e-04 2022-05-08 14:22:33,534 INFO [train.py:715] (0/8) Epoch 16, batch 550, loss[loss=0.1262, simple_loss=0.2126, pruned_loss=0.0199, over 4895.00 frames.], tot_loss[loss=0.135, simple_loss=0.2089, pruned_loss=0.03055, over 910350.03 frames.], batch size: 19, lr: 1.40e-04 2022-05-08 14:23:12,209 INFO [train.py:715] (0/8) Epoch 16, batch 600, loss[loss=0.1138, simple_loss=0.1788, pruned_loss=0.02436, over 4715.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2078, pruned_loss=0.03034, over 923656.78 frames.], batch size: 12, lr: 1.40e-04 2022-05-08 14:23:50,873 INFO [train.py:715] (0/8) Epoch 16, batch 650, loss[loss=0.1283, simple_loss=0.1937, pruned_loss=0.03144, over 4978.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2071, pruned_loss=0.03004, over 934549.76 frames.], batch size: 15, lr: 1.40e-04 2022-05-08 14:24:28,543 INFO [train.py:715] (0/8) Epoch 16, batch 700, loss[loss=0.1138, simple_loss=0.1672, pruned_loss=0.03026, over 4777.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2063, pruned_loss=0.02977, over 942942.19 frames.], batch size: 12, lr: 1.40e-04 2022-05-08 14:25:06,443 INFO [train.py:715] (0/8) Epoch 16, batch 750, loss[loss=0.1225, simple_loss=0.1975, pruned_loss=0.0238, over 4820.00 frames.], tot_loss[loss=0.1328, simple_loss=0.206, pruned_loss=0.02977, over 949515.70 frames.], batch size: 27, lr: 1.40e-04 
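Each per-batch entry above reports three quantities: loss, simple_loss and pruned_loss. In this pruned-transducer recipe, simple_loss comes from the simple (linear) joiner used to obtain the pruning bounds, while pruned_loss comes from the full joiner evaluated on the pruned lattice. With simple_loss_scale = 0.5 from the configuration, the logged loss agrees, up to the four-digit rounding of the log, with 0.5 * simple_loss + pruned_loss once training is past model_warm_step; the exact combination is inferred from the configuration rather than quoted from train.py. A minimal, hypothetical check against a few entries above:

# Hypothetical sanity check (not part of train.py): confirm that the logged loss
# matches simple_loss_scale * simple_loss + pruned_loss, with
# simple_loss_scale = 0.5 taken from the training configuration.
logged = [
    # (loss, simple_loss, pruned_loss) copied from entries in this log
    (0.1338, 0.2080, 0.02979),  # Epoch 15, batch 33700 (tot_loss)
    (0.1271, 0.2045, 0.02485),  # Epoch 16, batch 0
    (0.1328, 0.2060, 0.02977),  # Epoch 16, batch 750 (tot_loss)
]
SIMPLE_LOSS_SCALE = 0.5
for loss, simple_loss, pruned_loss in logged:
    reconstructed = SIMPLE_LOSS_SCALE * simple_loss + pruned_loss
    # Tolerance covers the 4-significant-digit rounding used in the log lines.
    assert abs(reconstructed - loss) < 5e-4, (loss, reconstructed)
print("logged loss is consistent with 0.5 * simple_loss + pruned_loss")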
2022-05-08 14:25:45,231 INFO [train.py:715] (0/8) Epoch 16, batch 800, loss[loss=0.128, simple_loss=0.2027, pruned_loss=0.02669, over 4933.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2067, pruned_loss=0.02978, over 954660.70 frames.], batch size: 29, lr: 1.40e-04 2022-05-08 14:26:23,530 INFO [train.py:715] (0/8) Epoch 16, batch 850, loss[loss=0.1185, simple_loss=0.1897, pruned_loss=0.02368, over 4949.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2069, pruned_loss=0.02966, over 958775.11 frames.], batch size: 29, lr: 1.40e-04 2022-05-08 14:27:01,573 INFO [train.py:715] (0/8) Epoch 16, batch 900, loss[loss=0.1415, simple_loss=0.2164, pruned_loss=0.03334, over 4839.00 frames.], tot_loss[loss=0.134, simple_loss=0.208, pruned_loss=0.03, over 961607.32 frames.], batch size: 32, lr: 1.40e-04 2022-05-08 14:27:39,690 INFO [train.py:715] (0/8) Epoch 16, batch 950, loss[loss=0.1193, simple_loss=0.2026, pruned_loss=0.01803, over 4966.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2077, pruned_loss=0.02958, over 964297.19 frames.], batch size: 24, lr: 1.40e-04 2022-05-08 14:28:18,126 INFO [train.py:715] (0/8) Epoch 16, batch 1000, loss[loss=0.117, simple_loss=0.1976, pruned_loss=0.01822, over 4826.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2081, pruned_loss=0.02986, over 965875.12 frames.], batch size: 25, lr: 1.40e-04 2022-05-08 14:28:55,784 INFO [train.py:715] (0/8) Epoch 16, batch 1050, loss[loss=0.1215, simple_loss=0.2002, pruned_loss=0.02137, over 4802.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2086, pruned_loss=0.02993, over 966582.97 frames.], batch size: 25, lr: 1.40e-04 2022-05-08 14:29:33,184 INFO [train.py:715] (0/8) Epoch 16, batch 1100, loss[loss=0.1479, simple_loss=0.2128, pruned_loss=0.0415, over 4744.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2074, pruned_loss=0.0297, over 967875.27 frames.], batch size: 16, lr: 1.40e-04 2022-05-08 14:30:11,810 INFO [train.py:715] (0/8) Epoch 16, batch 1150, loss[loss=0.1199, simple_loss=0.1935, pruned_loss=0.02319, over 4855.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2069, pruned_loss=0.02932, over 968199.20 frames.], batch size: 13, lr: 1.40e-04 2022-05-08 14:30:49,878 INFO [train.py:715] (0/8) Epoch 16, batch 1200, loss[loss=0.1417, simple_loss=0.2244, pruned_loss=0.02955, over 4781.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2077, pruned_loss=0.02968, over 969168.49 frames.], batch size: 18, lr: 1.40e-04 2022-05-08 14:31:27,243 INFO [train.py:715] (0/8) Epoch 16, batch 1250, loss[loss=0.1298, simple_loss=0.2113, pruned_loss=0.02418, over 4985.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2075, pruned_loss=0.02977, over 969866.80 frames.], batch size: 28, lr: 1.40e-04 2022-05-08 14:32:05,201 INFO [train.py:715] (0/8) Epoch 16, batch 1300, loss[loss=0.118, simple_loss=0.1868, pruned_loss=0.02458, over 4778.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2073, pruned_loss=0.03002, over 969933.18 frames.], batch size: 14, lr: 1.40e-04 2022-05-08 14:32:43,362 INFO [train.py:715] (0/8) Epoch 16, batch 1350, loss[loss=0.1541, simple_loss=0.2162, pruned_loss=0.046, over 4923.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2074, pruned_loss=0.03021, over 969815.76 frames.], batch size: 29, lr: 1.40e-04 2022-05-08 14:33:21,094 INFO [train.py:715] (0/8) Epoch 16, batch 1400, loss[loss=0.1435, simple_loss=0.2213, pruned_loss=0.03285, over 4886.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2075, pruned_loss=0.02983, over 970188.40 frames.], batch size: 19, lr: 1.40e-04 2022-05-08 14:33:59,217 INFO [train.py:715] 
(0/8) Epoch 16, batch 1450, loss[loss=0.1578, simple_loss=0.2308, pruned_loss=0.04239, over 4838.00 frames.], tot_loss[loss=0.133, simple_loss=0.207, pruned_loss=0.02956, over 970431.37 frames.], batch size: 15, lr: 1.40e-04 2022-05-08 14:34:37,198 INFO [train.py:715] (0/8) Epoch 16, batch 1500, loss[loss=0.1155, simple_loss=0.1848, pruned_loss=0.02305, over 4794.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2082, pruned_loss=0.03024, over 970715.48 frames.], batch size: 12, lr: 1.40e-04 2022-05-08 14:35:14,919 INFO [train.py:715] (0/8) Epoch 16, batch 1550, loss[loss=0.1187, simple_loss=0.1971, pruned_loss=0.02008, over 4915.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2083, pruned_loss=0.0301, over 971097.36 frames.], batch size: 18, lr: 1.40e-04 2022-05-08 14:35:52,767 INFO [train.py:715] (0/8) Epoch 16, batch 1600, loss[loss=0.134, simple_loss=0.2163, pruned_loss=0.02581, over 4838.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2079, pruned_loss=0.02995, over 971330.76 frames.], batch size: 13, lr: 1.40e-04 2022-05-08 14:36:30,166 INFO [train.py:715] (0/8) Epoch 16, batch 1650, loss[loss=0.1201, simple_loss=0.1948, pruned_loss=0.02265, over 4800.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2079, pruned_loss=0.02998, over 971435.48 frames.], batch size: 12, lr: 1.40e-04 2022-05-08 14:37:07,991 INFO [train.py:715] (0/8) Epoch 16, batch 1700, loss[loss=0.1047, simple_loss=0.186, pruned_loss=0.01175, over 4785.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2078, pruned_loss=0.03025, over 971421.95 frames.], batch size: 14, lr: 1.40e-04 2022-05-08 14:37:46,150 INFO [train.py:715] (0/8) Epoch 16, batch 1750, loss[loss=0.1175, simple_loss=0.1828, pruned_loss=0.02607, over 4772.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2073, pruned_loss=0.02978, over 972217.16 frames.], batch size: 19, lr: 1.40e-04 2022-05-08 14:38:24,065 INFO [train.py:715] (0/8) Epoch 16, batch 1800, loss[loss=0.1439, simple_loss=0.2196, pruned_loss=0.0341, over 4897.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2079, pruned_loss=0.03041, over 971891.23 frames.], batch size: 19, lr: 1.40e-04 2022-05-08 14:39:02,375 INFO [train.py:715] (0/8) Epoch 16, batch 1850, loss[loss=0.1149, simple_loss=0.1943, pruned_loss=0.01774, over 4965.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2074, pruned_loss=0.03008, over 970982.76 frames.], batch size: 24, lr: 1.40e-04 2022-05-08 14:39:41,002 INFO [train.py:715] (0/8) Epoch 16, batch 1900, loss[loss=0.1422, simple_loss=0.2158, pruned_loss=0.03432, over 4957.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2069, pruned_loss=0.02986, over 973186.90 frames.], batch size: 39, lr: 1.40e-04 2022-05-08 14:40:18,869 INFO [train.py:715] (0/8) Epoch 16, batch 1950, loss[loss=0.1399, simple_loss=0.2117, pruned_loss=0.03401, over 4887.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2077, pruned_loss=0.03004, over 973244.49 frames.], batch size: 22, lr: 1.40e-04 2022-05-08 14:40:57,047 INFO [train.py:715] (0/8) Epoch 16, batch 2000, loss[loss=0.1237, simple_loss=0.2061, pruned_loss=0.02071, over 4822.00 frames.], tot_loss[loss=0.1337, simple_loss=0.208, pruned_loss=0.02976, over 973215.49 frames.], batch size: 15, lr: 1.40e-04 2022-05-08 14:41:35,845 INFO [train.py:715] (0/8) Epoch 16, batch 2050, loss[loss=0.1108, simple_loss=0.1844, pruned_loss=0.01856, over 4962.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2078, pruned_loss=0.02976, over 973324.76 frames.], batch size: 35, lr: 1.40e-04 2022-05-08 14:42:14,572 INFO [train.py:715] (0/8) Epoch 16, batch 2100, 
loss[loss=0.1213, simple_loss=0.1937, pruned_loss=0.02445, over 4818.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2071, pruned_loss=0.02953, over 972621.22 frames.], batch size: 25, lr: 1.40e-04 2022-05-08 14:42:52,427 INFO [train.py:715] (0/8) Epoch 16, batch 2150, loss[loss=0.1541, simple_loss=0.2271, pruned_loss=0.04051, over 4933.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2069, pruned_loss=0.02942, over 972311.35 frames.], batch size: 18, lr: 1.40e-04 2022-05-08 14:43:31,556 INFO [train.py:715] (0/8) Epoch 16, batch 2200, loss[loss=0.1364, simple_loss=0.2149, pruned_loss=0.02894, over 4797.00 frames.], tot_loss[loss=0.133, simple_loss=0.207, pruned_loss=0.02955, over 971843.14 frames.], batch size: 18, lr: 1.40e-04 2022-05-08 14:44:09,848 INFO [train.py:715] (0/8) Epoch 16, batch 2250, loss[loss=0.1344, simple_loss=0.209, pruned_loss=0.02994, over 4850.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2069, pruned_loss=0.02908, over 972341.64 frames.], batch size: 20, lr: 1.40e-04 2022-05-08 14:44:47,481 INFO [train.py:715] (0/8) Epoch 16, batch 2300, loss[loss=0.1218, simple_loss=0.1968, pruned_loss=0.02339, over 4983.00 frames.], tot_loss[loss=0.133, simple_loss=0.2072, pruned_loss=0.0294, over 972613.56 frames.], batch size: 14, lr: 1.40e-04 2022-05-08 14:45:25,049 INFO [train.py:715] (0/8) Epoch 16, batch 2350, loss[loss=0.1245, simple_loss=0.1956, pruned_loss=0.0267, over 4765.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2072, pruned_loss=0.02978, over 973042.35 frames.], batch size: 14, lr: 1.40e-04 2022-05-08 14:46:03,341 INFO [train.py:715] (0/8) Epoch 16, batch 2400, loss[loss=0.1234, simple_loss=0.1989, pruned_loss=0.02401, over 4928.00 frames.], tot_loss[loss=0.1328, simple_loss=0.207, pruned_loss=0.02932, over 972748.90 frames.], batch size: 29, lr: 1.40e-04 2022-05-08 14:46:41,417 INFO [train.py:715] (0/8) Epoch 16, batch 2450, loss[loss=0.1388, simple_loss=0.2056, pruned_loss=0.036, over 4747.00 frames.], tot_loss[loss=0.133, simple_loss=0.2072, pruned_loss=0.02942, over 972733.42 frames.], batch size: 19, lr: 1.40e-04 2022-05-08 14:47:18,879 INFO [train.py:715] (0/8) Epoch 16, batch 2500, loss[loss=0.1231, simple_loss=0.18, pruned_loss=0.03308, over 4979.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2069, pruned_loss=0.02927, over 972002.18 frames.], batch size: 14, lr: 1.40e-04 2022-05-08 14:47:57,271 INFO [train.py:715] (0/8) Epoch 16, batch 2550, loss[loss=0.1444, simple_loss=0.2133, pruned_loss=0.03776, over 4945.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2067, pruned_loss=0.02907, over 972683.21 frames.], batch size: 35, lr: 1.40e-04 2022-05-08 14:48:35,424 INFO [train.py:715] (0/8) Epoch 16, batch 2600, loss[loss=0.1225, simple_loss=0.1994, pruned_loss=0.02279, over 4817.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2077, pruned_loss=0.0297, over 973500.94 frames.], batch size: 27, lr: 1.40e-04 2022-05-08 14:49:13,157 INFO [train.py:715] (0/8) Epoch 16, batch 2650, loss[loss=0.15, simple_loss=0.235, pruned_loss=0.03251, over 4984.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2068, pruned_loss=0.02915, over 973526.83 frames.], batch size: 25, lr: 1.40e-04 2022-05-08 14:49:51,045 INFO [train.py:715] (0/8) Epoch 16, batch 2700, loss[loss=0.1204, simple_loss=0.2015, pruned_loss=0.01963, over 4765.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2076, pruned_loss=0.0296, over 972465.57 frames.], batch size: 16, lr: 1.40e-04 2022-05-08 14:50:29,656 INFO [train.py:715] (0/8) Epoch 16, batch 2750, loss[loss=0.1195, simple_loss=0.1937, 
pruned_loss=0.02263, over 4819.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2077, pruned_loss=0.02966, over 972303.46 frames.], batch size: 27, lr: 1.40e-04 2022-05-08 14:51:08,569 INFO [train.py:715] (0/8) Epoch 16, batch 2800, loss[loss=0.1065, simple_loss=0.1867, pruned_loss=0.01313, over 4848.00 frames.], tot_loss[loss=0.134, simple_loss=0.208, pruned_loss=0.03003, over 972904.80 frames.], batch size: 13, lr: 1.40e-04 2022-05-08 14:51:46,947 INFO [train.py:715] (0/8) Epoch 16, batch 2850, loss[loss=0.1323, simple_loss=0.2098, pruned_loss=0.02747, over 4976.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2074, pruned_loss=0.03, over 973582.63 frames.], batch size: 14, lr: 1.40e-04 2022-05-08 14:52:24,996 INFO [train.py:715] (0/8) Epoch 16, batch 2900, loss[loss=0.175, simple_loss=0.2455, pruned_loss=0.05229, over 4692.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2074, pruned_loss=0.03018, over 973118.37 frames.], batch size: 15, lr: 1.40e-04 2022-05-08 14:53:03,771 INFO [train.py:715] (0/8) Epoch 16, batch 2950, loss[loss=0.1172, simple_loss=0.1864, pruned_loss=0.02402, over 4988.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2076, pruned_loss=0.03026, over 972885.68 frames.], batch size: 15, lr: 1.40e-04 2022-05-08 14:53:41,755 INFO [train.py:715] (0/8) Epoch 16, batch 3000, loss[loss=0.1417, simple_loss=0.2149, pruned_loss=0.03423, over 4810.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2079, pruned_loss=0.02996, over 972668.60 frames.], batch size: 15, lr: 1.40e-04 2022-05-08 14:53:41,756 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 14:53:51,191 INFO [train.py:742] (0/8) Epoch 16, validation: loss=0.105, simple_loss=0.1885, pruned_loss=0.01074, over 914524.00 frames. 2022-05-08 14:54:29,004 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-560000.pt 2022-05-08 14:54:31,352 INFO [train.py:715] (0/8) Epoch 16, batch 3050, loss[loss=0.152, simple_loss=0.2147, pruned_loss=0.04467, over 4793.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2076, pruned_loss=0.02967, over 971715.27 frames.], batch size: 17, lr: 1.40e-04 2022-05-08 14:55:09,455 INFO [train.py:715] (0/8) Epoch 16, batch 3100, loss[loss=0.1283, simple_loss=0.21, pruned_loss=0.02329, over 4820.00 frames.], tot_loss[loss=0.133, simple_loss=0.2071, pruned_loss=0.02941, over 972439.02 frames.], batch size: 26, lr: 1.40e-04 2022-05-08 14:55:47,847 INFO [train.py:715] (0/8) Epoch 16, batch 3150, loss[loss=0.1595, simple_loss=0.24, pruned_loss=0.03947, over 4884.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2087, pruned_loss=0.02998, over 972349.23 frames.], batch size: 22, lr: 1.40e-04 2022-05-08 14:56:26,002 INFO [train.py:715] (0/8) Epoch 16, batch 3200, loss[loss=0.1284, simple_loss=0.2115, pruned_loss=0.02259, over 4957.00 frames.], tot_loss[loss=0.134, simple_loss=0.2084, pruned_loss=0.02979, over 972595.90 frames.], batch size: 24, lr: 1.40e-04 2022-05-08 14:57:04,236 INFO [train.py:715] (0/8) Epoch 16, batch 3250, loss[loss=0.1326, simple_loss=0.2173, pruned_loss=0.02393, over 4880.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2082, pruned_loss=0.0297, over 972594.20 frames.], batch size: 19, lr: 1.40e-04 2022-05-08 14:57:42,061 INFO [train.py:715] (0/8) Epoch 16, batch 3300, loss[loss=0.1316, simple_loss=0.2003, pruned_loss=0.03145, over 4843.00 frames.], tot_loss[loss=0.134, simple_loss=0.208, pruned_loss=0.03003, over 973394.70 frames.], batch size: 30, lr: 1.40e-04 2022-05-08 14:58:20,051 INFO [train.py:715] (0/8) Epoch 16, batch 3350, 
loss[loss=0.1227, simple_loss=0.1986, pruned_loss=0.02338, over 4829.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2081, pruned_loss=0.03004, over 973146.76 frames.], batch size: 26, lr: 1.40e-04 2022-05-08 14:58:57,927 INFO [train.py:715] (0/8) Epoch 16, batch 3400, loss[loss=0.1311, simple_loss=0.2022, pruned_loss=0.03003, over 4766.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2076, pruned_loss=0.03014, over 972557.17 frames.], batch size: 14, lr: 1.40e-04 2022-05-08 14:59:35,861 INFO [train.py:715] (0/8) Epoch 16, batch 3450, loss[loss=0.1286, simple_loss=0.206, pruned_loss=0.02555, over 4846.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2073, pruned_loss=0.02991, over 972438.14 frames.], batch size: 20, lr: 1.40e-04 2022-05-08 15:00:13,950 INFO [train.py:715] (0/8) Epoch 16, batch 3500, loss[loss=0.1194, simple_loss=0.195, pruned_loss=0.02184, over 4913.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2069, pruned_loss=0.03012, over 973083.16 frames.], batch size: 17, lr: 1.40e-04 2022-05-08 15:00:51,752 INFO [train.py:715] (0/8) Epoch 16, batch 3550, loss[loss=0.163, simple_loss=0.2301, pruned_loss=0.04789, over 4882.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2074, pruned_loss=0.02992, over 972147.46 frames.], batch size: 16, lr: 1.40e-04 2022-05-08 15:01:30,175 INFO [train.py:715] (0/8) Epoch 16, batch 3600, loss[loss=0.134, simple_loss=0.214, pruned_loss=0.02703, over 4899.00 frames.], tot_loss[loss=0.134, simple_loss=0.2078, pruned_loss=0.03006, over 972069.85 frames.], batch size: 19, lr: 1.40e-04 2022-05-08 15:02:07,895 INFO [train.py:715] (0/8) Epoch 16, batch 3650, loss[loss=0.1399, simple_loss=0.2151, pruned_loss=0.03234, over 4775.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2091, pruned_loss=0.03052, over 971631.49 frames.], batch size: 14, lr: 1.40e-04 2022-05-08 15:02:46,540 INFO [train.py:715] (0/8) Epoch 16, batch 3700, loss[loss=0.1579, simple_loss=0.2311, pruned_loss=0.04235, over 4975.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2081, pruned_loss=0.03032, over 971363.78 frames.], batch size: 15, lr: 1.40e-04 2022-05-08 15:03:25,023 INFO [train.py:715] (0/8) Epoch 16, batch 3750, loss[loss=0.1422, simple_loss=0.2115, pruned_loss=0.03645, over 4777.00 frames.], tot_loss[loss=0.134, simple_loss=0.2076, pruned_loss=0.03017, over 970428.15 frames.], batch size: 18, lr: 1.40e-04 2022-05-08 15:04:03,388 INFO [train.py:715] (0/8) Epoch 16, batch 3800, loss[loss=0.1307, simple_loss=0.2112, pruned_loss=0.02513, over 4896.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2084, pruned_loss=0.03044, over 971288.47 frames.], batch size: 19, lr: 1.40e-04 2022-05-08 15:04:42,252 INFO [train.py:715] (0/8) Epoch 16, batch 3850, loss[loss=0.1541, simple_loss=0.2253, pruned_loss=0.04142, over 4663.00 frames.], tot_loss[loss=0.134, simple_loss=0.2074, pruned_loss=0.03029, over 971543.50 frames.], batch size: 14, lr: 1.40e-04 2022-05-08 15:05:21,009 INFO [train.py:715] (0/8) Epoch 16, batch 3900, loss[loss=0.1227, simple_loss=0.1876, pruned_loss=0.02893, over 4807.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2077, pruned_loss=0.03036, over 971272.35 frames.], batch size: 13, lr: 1.39e-04 2022-05-08 15:05:58,856 INFO [train.py:715] (0/8) Epoch 16, batch 3950, loss[loss=0.1215, simple_loss=0.2027, pruned_loss=0.02018, over 4902.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2078, pruned_loss=0.03045, over 971446.75 frames.], batch size: 19, lr: 1.39e-04 2022-05-08 15:06:36,781 INFO [train.py:715] (0/8) Epoch 16, batch 4000, loss[loss=0.136, simple_loss=0.2092, 
pruned_loss=0.03139, over 4757.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2086, pruned_loss=0.03107, over 971297.24 frames.], batch size: 19, lr: 1.39e-04 2022-05-08 15:07:14,741 INFO [train.py:715] (0/8) Epoch 16, batch 4050, loss[loss=0.1475, simple_loss=0.2208, pruned_loss=0.03713, over 4745.00 frames.], tot_loss[loss=0.1356, simple_loss=0.209, pruned_loss=0.03106, over 971812.31 frames.], batch size: 19, lr: 1.39e-04 2022-05-08 15:07:52,143 INFO [train.py:715] (0/8) Epoch 16, batch 4100, loss[loss=0.1191, simple_loss=0.1962, pruned_loss=0.02104, over 4917.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2083, pruned_loss=0.03052, over 971817.75 frames.], batch size: 18, lr: 1.39e-04 2022-05-08 15:08:29,795 INFO [train.py:715] (0/8) Epoch 16, batch 4150, loss[loss=0.1062, simple_loss=0.1779, pruned_loss=0.01726, over 4923.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2086, pruned_loss=0.03048, over 972100.99 frames.], batch size: 18, lr: 1.39e-04 2022-05-08 15:09:07,461 INFO [train.py:715] (0/8) Epoch 16, batch 4200, loss[loss=0.1289, simple_loss=0.2088, pruned_loss=0.02451, over 4775.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2087, pruned_loss=0.03055, over 971794.05 frames.], batch size: 18, lr: 1.39e-04 2022-05-08 15:09:45,631 INFO [train.py:715] (0/8) Epoch 16, batch 4250, loss[loss=0.1467, simple_loss=0.2212, pruned_loss=0.03612, over 4845.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2081, pruned_loss=0.03026, over 971985.25 frames.], batch size: 15, lr: 1.39e-04 2022-05-08 15:10:23,339 INFO [train.py:715] (0/8) Epoch 16, batch 4300, loss[loss=0.1102, simple_loss=0.1864, pruned_loss=0.01702, over 4862.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2089, pruned_loss=0.03031, over 971850.08 frames.], batch size: 22, lr: 1.39e-04 2022-05-08 15:11:01,192 INFO [train.py:715] (0/8) Epoch 16, batch 4350, loss[loss=0.1175, simple_loss=0.178, pruned_loss=0.02848, over 4770.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2089, pruned_loss=0.03076, over 971132.26 frames.], batch size: 18, lr: 1.39e-04 2022-05-08 15:11:39,302 INFO [train.py:715] (0/8) Epoch 16, batch 4400, loss[loss=0.1285, simple_loss=0.2124, pruned_loss=0.02229, over 4967.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2086, pruned_loss=0.03036, over 971060.39 frames.], batch size: 39, lr: 1.39e-04 2022-05-08 15:12:17,132 INFO [train.py:715] (0/8) Epoch 16, batch 4450, loss[loss=0.1187, simple_loss=0.2061, pruned_loss=0.01561, over 4984.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2086, pruned_loss=0.03048, over 970624.46 frames.], batch size: 28, lr: 1.39e-04 2022-05-08 15:12:54,748 INFO [train.py:715] (0/8) Epoch 16, batch 4500, loss[loss=0.144, simple_loss=0.2151, pruned_loss=0.0364, over 4948.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2086, pruned_loss=0.03048, over 971191.70 frames.], batch size: 21, lr: 1.39e-04 2022-05-08 15:13:32,856 INFO [train.py:715] (0/8) Epoch 16, batch 4550, loss[loss=0.137, simple_loss=0.2083, pruned_loss=0.0329, over 4875.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2082, pruned_loss=0.03028, over 971759.69 frames.], batch size: 32, lr: 1.39e-04 2022-05-08 15:14:11,251 INFO [train.py:715] (0/8) Epoch 16, batch 4600, loss[loss=0.136, simple_loss=0.1997, pruned_loss=0.03608, over 4951.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2084, pruned_loss=0.03041, over 972353.14 frames.], batch size: 24, lr: 1.39e-04 2022-05-08 15:14:49,227 INFO [train.py:715] (0/8) Epoch 16, batch 4650, loss[loss=0.1247, simple_loss=0.1977, pruned_loss=0.02586, over 4821.00 
frames.], tot_loss[loss=0.1341, simple_loss=0.208, pruned_loss=0.03009, over 971939.84 frames.], batch size: 15, lr: 1.39e-04 2022-05-08 15:15:27,627 INFO [train.py:715] (0/8) Epoch 16, batch 4700, loss[loss=0.1337, simple_loss=0.218, pruned_loss=0.02467, over 4811.00 frames.], tot_loss[loss=0.134, simple_loss=0.2077, pruned_loss=0.03015, over 972215.42 frames.], batch size: 15, lr: 1.39e-04 2022-05-08 15:16:06,224 INFO [train.py:715] (0/8) Epoch 16, batch 4750, loss[loss=0.1452, simple_loss=0.2213, pruned_loss=0.03455, over 4866.00 frames.], tot_loss[loss=0.134, simple_loss=0.2079, pruned_loss=0.02999, over 972527.59 frames.], batch size: 22, lr: 1.39e-04 2022-05-08 15:16:44,825 INFO [train.py:715] (0/8) Epoch 16, batch 4800, loss[loss=0.1513, simple_loss=0.2337, pruned_loss=0.03445, over 4802.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2074, pruned_loss=0.02963, over 973369.23 frames.], batch size: 21, lr: 1.39e-04 2022-05-08 15:17:23,105 INFO [train.py:715] (0/8) Epoch 16, batch 4850, loss[loss=0.1479, simple_loss=0.2269, pruned_loss=0.03451, over 4924.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2077, pruned_loss=0.02975, over 974342.26 frames.], batch size: 18, lr: 1.39e-04 2022-05-08 15:18:01,806 INFO [train.py:715] (0/8) Epoch 16, batch 4900, loss[loss=0.1345, simple_loss=0.2123, pruned_loss=0.02835, over 4756.00 frames.], tot_loss[loss=0.1338, simple_loss=0.208, pruned_loss=0.02982, over 973624.68 frames.], batch size: 19, lr: 1.39e-04 2022-05-08 15:18:40,680 INFO [train.py:715] (0/8) Epoch 16, batch 4950, loss[loss=0.1333, simple_loss=0.2104, pruned_loss=0.02815, over 4912.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2087, pruned_loss=0.02996, over 973684.36 frames.], batch size: 22, lr: 1.39e-04 2022-05-08 15:19:18,921 INFO [train.py:715] (0/8) Epoch 16, batch 5000, loss[loss=0.1281, simple_loss=0.1988, pruned_loss=0.02869, over 4874.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2088, pruned_loss=0.03028, over 973154.96 frames.], batch size: 16, lr: 1.39e-04 2022-05-08 15:19:57,136 INFO [train.py:715] (0/8) Epoch 16, batch 5050, loss[loss=0.1283, simple_loss=0.2109, pruned_loss=0.0229, over 4837.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2095, pruned_loss=0.03062, over 973310.54 frames.], batch size: 15, lr: 1.39e-04 2022-05-08 15:20:35,443 INFO [train.py:715] (0/8) Epoch 16, batch 5100, loss[loss=0.1426, simple_loss=0.2223, pruned_loss=0.03145, over 4938.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2086, pruned_loss=0.03036, over 972387.93 frames.], batch size: 21, lr: 1.39e-04 2022-05-08 15:21:13,346 INFO [train.py:715] (0/8) Epoch 16, batch 5150, loss[loss=0.1443, simple_loss=0.218, pruned_loss=0.03534, over 4801.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2083, pruned_loss=0.03004, over 972652.13 frames.], batch size: 25, lr: 1.39e-04 2022-05-08 15:21:50,904 INFO [train.py:715] (0/8) Epoch 16, batch 5200, loss[loss=0.1324, simple_loss=0.2149, pruned_loss=0.02501, over 4897.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2088, pruned_loss=0.03009, over 972821.08 frames.], batch size: 19, lr: 1.39e-04 2022-05-08 15:22:28,862 INFO [train.py:715] (0/8) Epoch 16, batch 5250, loss[loss=0.1281, simple_loss=0.204, pruned_loss=0.02606, over 4948.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2089, pruned_loss=0.03002, over 972937.04 frames.], batch size: 21, lr: 1.39e-04 2022-05-08 15:23:07,100 INFO [train.py:715] (0/8) Epoch 16, batch 5300, loss[loss=0.1447, simple_loss=0.2183, pruned_loss=0.03548, over 4817.00 frames.], tot_loss[loss=0.1347, 
simple_loss=0.2091, pruned_loss=0.03011, over 973481.67 frames.], batch size: 25, lr: 1.39e-04 2022-05-08 15:23:45,223 INFO [train.py:715] (0/8) Epoch 16, batch 5350, loss[loss=0.1069, simple_loss=0.1853, pruned_loss=0.01429, over 4875.00 frames.], tot_loss[loss=0.134, simple_loss=0.2087, pruned_loss=0.02964, over 972992.26 frames.], batch size: 16, lr: 1.39e-04 2022-05-08 15:24:23,033 INFO [train.py:715] (0/8) Epoch 16, batch 5400, loss[loss=0.1439, simple_loss=0.215, pruned_loss=0.03637, over 4947.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2081, pruned_loss=0.02938, over 973099.20 frames.], batch size: 21, lr: 1.39e-04 2022-05-08 15:25:00,886 INFO [train.py:715] (0/8) Epoch 16, batch 5450, loss[loss=0.1454, simple_loss=0.2155, pruned_loss=0.03771, over 4835.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2088, pruned_loss=0.02993, over 974013.30 frames.], batch size: 15, lr: 1.39e-04 2022-05-08 15:25:38,706 INFO [train.py:715] (0/8) Epoch 16, batch 5500, loss[loss=0.1222, simple_loss=0.1912, pruned_loss=0.0266, over 4935.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2086, pruned_loss=0.02977, over 973542.23 frames.], batch size: 29, lr: 1.39e-04 2022-05-08 15:26:16,322 INFO [train.py:715] (0/8) Epoch 16, batch 5550, loss[loss=0.1415, simple_loss=0.2134, pruned_loss=0.03481, over 4699.00 frames.], tot_loss[loss=0.134, simple_loss=0.2084, pruned_loss=0.02975, over 973041.35 frames.], batch size: 15, lr: 1.39e-04 2022-05-08 15:26:54,071 INFO [train.py:715] (0/8) Epoch 16, batch 5600, loss[loss=0.1524, simple_loss=0.2263, pruned_loss=0.03926, over 4921.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2078, pruned_loss=0.02926, over 973047.25 frames.], batch size: 18, lr: 1.39e-04 2022-05-08 15:27:32,726 INFO [train.py:715] (0/8) Epoch 16, batch 5650, loss[loss=0.1339, simple_loss=0.2139, pruned_loss=0.02695, over 4934.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2078, pruned_loss=0.02939, over 973517.20 frames.], batch size: 23, lr: 1.39e-04 2022-05-08 15:28:10,543 INFO [train.py:715] (0/8) Epoch 16, batch 5700, loss[loss=0.1378, simple_loss=0.2048, pruned_loss=0.03535, over 4716.00 frames.], tot_loss[loss=0.134, simple_loss=0.2084, pruned_loss=0.02977, over 973516.69 frames.], batch size: 15, lr: 1.39e-04 2022-05-08 15:28:48,365 INFO [train.py:715] (0/8) Epoch 16, batch 5750, loss[loss=0.1306, simple_loss=0.2066, pruned_loss=0.02725, over 4928.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2071, pruned_loss=0.02916, over 973969.31 frames.], batch size: 23, lr: 1.39e-04 2022-05-08 15:29:26,210 INFO [train.py:715] (0/8) Epoch 16, batch 5800, loss[loss=0.1449, simple_loss=0.217, pruned_loss=0.0364, over 4856.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2077, pruned_loss=0.0297, over 972709.01 frames.], batch size: 32, lr: 1.39e-04 2022-05-08 15:30:04,473 INFO [train.py:715] (0/8) Epoch 16, batch 5850, loss[loss=0.138, simple_loss=0.2093, pruned_loss=0.03333, over 4790.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2076, pruned_loss=0.02955, over 972918.77 frames.], batch size: 18, lr: 1.39e-04 2022-05-08 15:30:42,014 INFO [train.py:715] (0/8) Epoch 16, batch 5900, loss[loss=0.1249, simple_loss=0.2012, pruned_loss=0.02423, over 4874.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2065, pruned_loss=0.02938, over 971994.11 frames.], batch size: 32, lr: 1.39e-04 2022-05-08 15:31:19,658 INFO [train.py:715] (0/8) Epoch 16, batch 5950, loss[loss=0.1504, simple_loss=0.2374, pruned_loss=0.0317, over 4985.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2072, pruned_loss=0.02991, 
over 972019.72 frames.], batch size: 25, lr: 1.39e-04 2022-05-08 15:31:58,425 INFO [train.py:715] (0/8) Epoch 16, batch 6000, loss[loss=0.1624, simple_loss=0.2244, pruned_loss=0.05016, over 4877.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2076, pruned_loss=0.03053, over 971974.06 frames.], batch size: 16, lr: 1.39e-04 2022-05-08 15:31:58,427 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 15:32:07,945 INFO [train.py:742] (0/8) Epoch 16, validation: loss=0.105, simple_loss=0.1885, pruned_loss=0.01082, over 914524.00 frames. 2022-05-08 15:32:46,976 INFO [train.py:715] (0/8) Epoch 16, batch 6050, loss[loss=0.1442, simple_loss=0.234, pruned_loss=0.02722, over 4889.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2085, pruned_loss=0.03061, over 972436.77 frames.], batch size: 22, lr: 1.39e-04 2022-05-08 15:33:25,024 INFO [train.py:715] (0/8) Epoch 16, batch 6100, loss[loss=0.1424, simple_loss=0.2208, pruned_loss=0.03195, over 4931.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2081, pruned_loss=0.0303, over 973115.47 frames.], batch size: 18, lr: 1.39e-04 2022-05-08 15:34:02,791 INFO [train.py:715] (0/8) Epoch 16, batch 6150, loss[loss=0.1407, simple_loss=0.2095, pruned_loss=0.03593, over 4961.00 frames.], tot_loss[loss=0.1341, simple_loss=0.208, pruned_loss=0.03009, over 972118.47 frames.], batch size: 21, lr: 1.39e-04 2022-05-08 15:34:40,932 INFO [train.py:715] (0/8) Epoch 16, batch 6200, loss[loss=0.1472, simple_loss=0.2158, pruned_loss=0.03927, over 4875.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2083, pruned_loss=0.03004, over 972630.95 frames.], batch size: 32, lr: 1.39e-04 2022-05-08 15:35:19,468 INFO [train.py:715] (0/8) Epoch 16, batch 6250, loss[loss=0.1539, simple_loss=0.21, pruned_loss=0.04886, over 4851.00 frames.], tot_loss[loss=0.1344, simple_loss=0.208, pruned_loss=0.03041, over 972584.00 frames.], batch size: 32, lr: 1.39e-04 2022-05-08 15:35:57,112 INFO [train.py:715] (0/8) Epoch 16, batch 6300, loss[loss=0.1328, simple_loss=0.2146, pruned_loss=0.02547, over 4951.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2075, pruned_loss=0.03015, over 972915.35 frames.], batch size: 24, lr: 1.39e-04 2022-05-08 15:36:34,882 INFO [train.py:715] (0/8) Epoch 16, batch 6350, loss[loss=0.1219, simple_loss=0.1965, pruned_loss=0.0237, over 4930.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2075, pruned_loss=0.02993, over 972620.64 frames.], batch size: 23, lr: 1.39e-04 2022-05-08 15:37:13,399 INFO [train.py:715] (0/8) Epoch 16, batch 6400, loss[loss=0.1571, simple_loss=0.2317, pruned_loss=0.04129, over 4961.00 frames.], tot_loss[loss=0.134, simple_loss=0.2074, pruned_loss=0.03033, over 973076.20 frames.], batch size: 15, lr: 1.39e-04 2022-05-08 15:37:51,666 INFO [train.py:715] (0/8) Epoch 16, batch 6450, loss[loss=0.1629, simple_loss=0.2353, pruned_loss=0.04521, over 4954.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2069, pruned_loss=0.02984, over 973323.40 frames.], batch size: 35, lr: 1.39e-04 2022-05-08 15:38:29,438 INFO [train.py:715] (0/8) Epoch 16, batch 6500, loss[loss=0.1344, simple_loss=0.2128, pruned_loss=0.02798, over 4978.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2065, pruned_loss=0.02967, over 973721.96 frames.], batch size: 35, lr: 1.39e-04 2022-05-08 15:39:07,579 INFO [train.py:715] (0/8) Epoch 16, batch 6550, loss[loss=0.117, simple_loss=0.1888, pruned_loss=0.02263, over 4746.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2063, pruned_loss=0.0295, over 974261.66 frames.], batch size: 16, lr: 1.39e-04 2022-05-08 15:39:46,025 INFO 
[train.py:715] (0/8) Epoch 16, batch 6600, loss[loss=0.1863, simple_loss=0.2744, pruned_loss=0.04906, over 4879.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2074, pruned_loss=0.02982, over 974228.98 frames.], batch size: 22, lr: 1.39e-04 2022-05-08 15:40:23,827 INFO [train.py:715] (0/8) Epoch 16, batch 6650, loss[loss=0.1299, simple_loss=0.1991, pruned_loss=0.03035, over 4991.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2071, pruned_loss=0.02975, over 973970.08 frames.], batch size: 14, lr: 1.39e-04 2022-05-08 15:41:01,684 INFO [train.py:715] (0/8) Epoch 16, batch 6700, loss[loss=0.1317, simple_loss=0.2081, pruned_loss=0.02769, over 4990.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2072, pruned_loss=0.02979, over 974162.14 frames.], batch size: 25, lr: 1.39e-04 2022-05-08 15:41:39,710 INFO [train.py:715] (0/8) Epoch 16, batch 6750, loss[loss=0.1396, simple_loss=0.219, pruned_loss=0.0301, over 4910.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2076, pruned_loss=0.03013, over 973949.39 frames.], batch size: 19, lr: 1.39e-04 2022-05-08 15:42:17,828 INFO [train.py:715] (0/8) Epoch 16, batch 6800, loss[loss=0.1212, simple_loss=0.2029, pruned_loss=0.01975, over 4939.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2072, pruned_loss=0.03017, over 973364.58 frames.], batch size: 29, lr: 1.39e-04 2022-05-08 15:42:54,809 INFO [train.py:715] (0/8) Epoch 16, batch 6850, loss[loss=0.1121, simple_loss=0.1894, pruned_loss=0.01735, over 4792.00 frames.], tot_loss[loss=0.1332, simple_loss=0.207, pruned_loss=0.02965, over 973577.84 frames.], batch size: 14, lr: 1.39e-04 2022-05-08 15:43:32,593 INFO [train.py:715] (0/8) Epoch 16, batch 6900, loss[loss=0.1329, simple_loss=0.2066, pruned_loss=0.02966, over 4900.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2075, pruned_loss=0.02972, over 972831.58 frames.], batch size: 19, lr: 1.39e-04 2022-05-08 15:44:10,711 INFO [train.py:715] (0/8) Epoch 16, batch 6950, loss[loss=0.141, simple_loss=0.2151, pruned_loss=0.03348, over 4933.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2079, pruned_loss=0.02978, over 972128.63 frames.], batch size: 23, lr: 1.39e-04 2022-05-08 15:44:48,417 INFO [train.py:715] (0/8) Epoch 16, batch 7000, loss[loss=0.119, simple_loss=0.1955, pruned_loss=0.02125, over 4782.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2084, pruned_loss=0.03008, over 971912.41 frames.], batch size: 17, lr: 1.39e-04 2022-05-08 15:45:26,354 INFO [train.py:715] (0/8) Epoch 16, batch 7050, loss[loss=0.1132, simple_loss=0.1906, pruned_loss=0.0179, over 4866.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2078, pruned_loss=0.02987, over 971972.10 frames.], batch size: 20, lr: 1.39e-04 2022-05-08 15:46:04,186 INFO [train.py:715] (0/8) Epoch 16, batch 7100, loss[loss=0.1372, simple_loss=0.2188, pruned_loss=0.02776, over 4845.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2071, pruned_loss=0.0296, over 972440.68 frames.], batch size: 30, lr: 1.39e-04 2022-05-08 15:46:42,643 INFO [train.py:715] (0/8) Epoch 16, batch 7150, loss[loss=0.1616, simple_loss=0.2352, pruned_loss=0.04402, over 4950.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2074, pruned_loss=0.02953, over 971957.59 frames.], batch size: 35, lr: 1.39e-04 2022-05-08 15:47:19,955 INFO [train.py:715] (0/8) Epoch 16, batch 7200, loss[loss=0.1319, simple_loss=0.2116, pruned_loss=0.02605, over 4839.00 frames.], tot_loss[loss=0.1326, simple_loss=0.207, pruned_loss=0.0291, over 971542.08 frames.], batch size: 26, lr: 1.39e-04 2022-05-08 15:47:57,926 INFO [train.py:715] (0/8) Epoch 16, batch 
7250, loss[loss=0.1875, simple_loss=0.2561, pruned_loss=0.05939, over 4754.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2071, pruned_loss=0.02955, over 972461.41 frames.], batch size: 16, lr: 1.39e-04 2022-05-08 15:48:36,997 INFO [train.py:715] (0/8) Epoch 16, batch 7300, loss[loss=0.135, simple_loss=0.203, pruned_loss=0.03347, over 4976.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2072, pruned_loss=0.02952, over 971422.34 frames.], batch size: 15, lr: 1.39e-04 2022-05-08 15:49:15,796 INFO [train.py:715] (0/8) Epoch 16, batch 7350, loss[loss=0.1171, simple_loss=0.1996, pruned_loss=0.01726, over 4937.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2073, pruned_loss=0.02964, over 971450.60 frames.], batch size: 29, lr: 1.39e-04 2022-05-08 15:49:55,244 INFO [train.py:715] (0/8) Epoch 16, batch 7400, loss[loss=0.1274, simple_loss=0.1989, pruned_loss=0.02792, over 4972.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2074, pruned_loss=0.02988, over 972579.74 frames.], batch size: 24, lr: 1.39e-04 2022-05-08 15:50:34,945 INFO [train.py:715] (0/8) Epoch 16, batch 7450, loss[loss=0.1471, simple_loss=0.2247, pruned_loss=0.03476, over 4898.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2077, pruned_loss=0.02992, over 972598.29 frames.], batch size: 19, lr: 1.39e-04 2022-05-08 15:51:14,626 INFO [train.py:715] (0/8) Epoch 16, batch 7500, loss[loss=0.1406, simple_loss=0.2146, pruned_loss=0.03328, over 4790.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2093, pruned_loss=0.03055, over 972314.71 frames.], batch size: 12, lr: 1.39e-04 2022-05-08 15:51:53,689 INFO [train.py:715] (0/8) Epoch 16, batch 7550, loss[loss=0.1373, simple_loss=0.2013, pruned_loss=0.03665, over 4844.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2077, pruned_loss=0.02984, over 972634.25 frames.], batch size: 32, lr: 1.39e-04 2022-05-08 15:52:33,698 INFO [train.py:715] (0/8) Epoch 16, batch 7600, loss[loss=0.1459, simple_loss=0.2156, pruned_loss=0.03809, over 4965.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2082, pruned_loss=0.03004, over 973358.44 frames.], batch size: 39, lr: 1.39e-04 2022-05-08 15:53:14,080 INFO [train.py:715] (0/8) Epoch 16, batch 7650, loss[loss=0.1712, simple_loss=0.2488, pruned_loss=0.0468, over 4868.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2078, pruned_loss=0.02989, over 972236.04 frames.], batch size: 32, lr: 1.39e-04 2022-05-08 15:53:54,219 INFO [train.py:715] (0/8) Epoch 16, batch 7700, loss[loss=0.153, simple_loss=0.2394, pruned_loss=0.0333, over 4785.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2083, pruned_loss=0.02995, over 971512.19 frames.], batch size: 17, lr: 1.39e-04 2022-05-08 15:54:33,731 INFO [train.py:715] (0/8) Epoch 16, batch 7750, loss[loss=0.1493, simple_loss=0.2232, pruned_loss=0.03763, over 4939.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2091, pruned_loss=0.03021, over 971867.23 frames.], batch size: 21, lr: 1.39e-04 2022-05-08 15:55:13,925 INFO [train.py:715] (0/8) Epoch 16, batch 7800, loss[loss=0.1272, simple_loss=0.1906, pruned_loss=0.03194, over 4780.00 frames.], tot_loss[loss=0.134, simple_loss=0.2083, pruned_loss=0.02984, over 972149.00 frames.], batch size: 14, lr: 1.39e-04 2022-05-08 15:55:54,768 INFO [train.py:715] (0/8) Epoch 16, batch 7850, loss[loss=0.1262, simple_loss=0.2069, pruned_loss=0.02271, over 4807.00 frames.], tot_loss[loss=0.134, simple_loss=0.2085, pruned_loss=0.0298, over 972539.48 frames.], batch size: 25, lr: 1.39e-04 2022-05-08 15:56:34,179 INFO [train.py:715] (0/8) Epoch 16, batch 7900, loss[loss=0.1293, 
simple_loss=0.2007, pruned_loss=0.0289, over 4805.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2089, pruned_loss=0.03011, over 973467.27 frames.], batch size: 25, lr: 1.39e-04 2022-05-08 15:57:14,075 INFO [train.py:715] (0/8) Epoch 16, batch 7950, loss[loss=0.146, simple_loss=0.2085, pruned_loss=0.04175, over 4977.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2092, pruned_loss=0.03026, over 972768.17 frames.], batch size: 15, lr: 1.39e-04 2022-05-08 15:57:54,567 INFO [train.py:715] (0/8) Epoch 16, batch 8000, loss[loss=0.1242, simple_loss=0.1997, pruned_loss=0.02439, over 4837.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2093, pruned_loss=0.03056, over 973401.56 frames.], batch size: 26, lr: 1.39e-04 2022-05-08 15:58:34,605 INFO [train.py:715] (0/8) Epoch 16, batch 8050, loss[loss=0.1155, simple_loss=0.1961, pruned_loss=0.01741, over 4980.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2088, pruned_loss=0.0304, over 972822.13 frames.], batch size: 24, lr: 1.39e-04 2022-05-08 15:59:14,258 INFO [train.py:715] (0/8) Epoch 16, batch 8100, loss[loss=0.147, simple_loss=0.2203, pruned_loss=0.03683, over 4879.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2084, pruned_loss=0.03052, over 971874.84 frames.], batch size: 16, lr: 1.39e-04 2022-05-08 15:59:54,678 INFO [train.py:715] (0/8) Epoch 16, batch 8150, loss[loss=0.1692, simple_loss=0.2426, pruned_loss=0.04793, over 4801.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2087, pruned_loss=0.0309, over 971994.57 frames.], batch size: 17, lr: 1.39e-04 2022-05-08 16:00:35,753 INFO [train.py:715] (0/8) Epoch 16, batch 8200, loss[loss=0.1472, simple_loss=0.2204, pruned_loss=0.03699, over 4948.00 frames.], tot_loss[loss=0.1362, simple_loss=0.2099, pruned_loss=0.03121, over 971585.99 frames.], batch size: 35, lr: 1.39e-04 2022-05-08 16:01:15,836 INFO [train.py:715] (0/8) Epoch 16, batch 8250, loss[loss=0.1183, simple_loss=0.1918, pruned_loss=0.02245, over 4910.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2095, pruned_loss=0.03075, over 971818.37 frames.], batch size: 23, lr: 1.39e-04 2022-05-08 16:01:55,595 INFO [train.py:715] (0/8) Epoch 16, batch 8300, loss[loss=0.1189, simple_loss=0.184, pruned_loss=0.02685, over 4775.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2089, pruned_loss=0.03059, over 972620.25 frames.], batch size: 18, lr: 1.39e-04 2022-05-08 16:02:36,303 INFO [train.py:715] (0/8) Epoch 16, batch 8350, loss[loss=0.1096, simple_loss=0.1894, pruned_loss=0.01496, over 4838.00 frames.], tot_loss[loss=0.1351, simple_loss=0.209, pruned_loss=0.03058, over 972717.04 frames.], batch size: 25, lr: 1.39e-04 2022-05-08 16:03:16,601 INFO [train.py:715] (0/8) Epoch 16, batch 8400, loss[loss=0.1108, simple_loss=0.1857, pruned_loss=0.01799, over 4891.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2082, pruned_loss=0.03018, over 972487.71 frames.], batch size: 19, lr: 1.39e-04 2022-05-08 16:03:55,135 INFO [train.py:715] (0/8) Epoch 16, batch 8450, loss[loss=0.1216, simple_loss=0.2053, pruned_loss=0.01899, over 4872.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2074, pruned_loss=0.0297, over 972240.08 frames.], batch size: 22, lr: 1.39e-04 2022-05-08 16:04:34,539 INFO [train.py:715] (0/8) Epoch 16, batch 8500, loss[loss=0.1632, simple_loss=0.2333, pruned_loss=0.04655, over 4779.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2071, pruned_loss=0.02959, over 972185.52 frames.], batch size: 14, lr: 1.39e-04 2022-05-08 16:05:13,259 INFO [train.py:715] (0/8) Epoch 16, batch 8550, loss[loss=0.1351, simple_loss=0.2196, pruned_loss=0.0253, 
over 4970.00 frames.], tot_loss[loss=0.133, simple_loss=0.2069, pruned_loss=0.02958, over 971944.77 frames.], batch size: 15, lr: 1.39e-04 2022-05-08 16:05:51,577 INFO [train.py:715] (0/8) Epoch 16, batch 8600, loss[loss=0.1126, simple_loss=0.1938, pruned_loss=0.01569, over 4973.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2064, pruned_loss=0.02955, over 971755.97 frames.], batch size: 24, lr: 1.39e-04 2022-05-08 16:06:29,594 INFO [train.py:715] (0/8) Epoch 16, batch 8650, loss[loss=0.15, simple_loss=0.2308, pruned_loss=0.0346, over 4982.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2074, pruned_loss=0.02972, over 972480.37 frames.], batch size: 26, lr: 1.39e-04 2022-05-08 16:07:08,667 INFO [train.py:715] (0/8) Epoch 16, batch 8700, loss[loss=0.1378, simple_loss=0.2059, pruned_loss=0.03482, over 4839.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2063, pruned_loss=0.02954, over 971771.20 frames.], batch size: 15, lr: 1.39e-04 2022-05-08 16:07:47,714 INFO [train.py:715] (0/8) Epoch 16, batch 8750, loss[loss=0.1471, simple_loss=0.2128, pruned_loss=0.04066, over 4799.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2073, pruned_loss=0.03024, over 972745.18 frames.], batch size: 12, lr: 1.39e-04 2022-05-08 16:08:26,276 INFO [train.py:715] (0/8) Epoch 16, batch 8800, loss[loss=0.1325, simple_loss=0.2053, pruned_loss=0.02988, over 4810.00 frames.], tot_loss[loss=0.1335, simple_loss=0.207, pruned_loss=0.02999, over 972957.36 frames.], batch size: 26, lr: 1.39e-04 2022-05-08 16:09:04,972 INFO [train.py:715] (0/8) Epoch 16, batch 8850, loss[loss=0.1303, simple_loss=0.1993, pruned_loss=0.03062, over 4852.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2075, pruned_loss=0.03001, over 972367.34 frames.], batch size: 13, lr: 1.39e-04 2022-05-08 16:09:44,462 INFO [train.py:715] (0/8) Epoch 16, batch 8900, loss[loss=0.1281, simple_loss=0.2006, pruned_loss=0.02779, over 4838.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2073, pruned_loss=0.03009, over 972132.16 frames.], batch size: 13, lr: 1.39e-04 2022-05-08 16:10:22,904 INFO [train.py:715] (0/8) Epoch 16, batch 8950, loss[loss=0.1531, simple_loss=0.2307, pruned_loss=0.03775, over 4795.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2084, pruned_loss=0.03065, over 971538.27 frames.], batch size: 21, lr: 1.39e-04 2022-05-08 16:11:01,134 INFO [train.py:715] (0/8) Epoch 16, batch 9000, loss[loss=0.1236, simple_loss=0.2024, pruned_loss=0.02242, over 4933.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2076, pruned_loss=0.03, over 972040.01 frames.], batch size: 39, lr: 1.39e-04 2022-05-08 16:11:01,135 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 16:11:23,895 INFO [train.py:742] (0/8) Epoch 16, validation: loss=0.105, simple_loss=0.1884, pruned_loss=0.01076, over 914524.00 frames. 
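The "Computing validation loss" entries above occur every 3000 training batches (epoch-16 batches 3000, 6000 and 9000 so far), matching valid_interval = 3000 in the configuration, and each is computed over the same 914524-frame dev set; through epoch 16 the dev loss stays essentially flat at about 0.105. Periodic checkpoints are written every save_every_n = 8000 global batches (checkpoint-560000.pt above), in addition to per-epoch files such as epoch-15.pt. A small, hypothetical helper (not part of icefall) for pulling the validation results out of a log in this format:

import re

# Hypothetical log-parsing helper, not part of icefall: collect the periodic
# validation entries so the dev loss can be tracked over the course of training.
VALID_RE = re.compile(
    r"Epoch (\d+), validation: loss=([\d.]+), "
    r"simple_loss=([\d.]+), pruned_loss=([\d.]+)"
)

def validation_history(log_text):
    """Return (epoch, loss, simple_loss, pruned_loss) tuples in log order."""
    return [tuple(float(g) for g in m.groups()) for m in VALID_RE.finditer(log_text)]

# Applied to the text above, this yields, in order:
#   (16.0, 0.105, 0.1885, 0.01074)   # epoch 16, batch 3000
#   (16.0, 0.105, 0.1885, 0.01082)   # epoch 16, batch 6000
#   (16.0, 0.105, 0.1884, 0.01076)   # epoch 16, batch 9000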
2022-05-08 16:12:02,816 INFO [train.py:715] (0/8) Epoch 16, batch 9050, loss[loss=0.1234, simple_loss=0.1962, pruned_loss=0.02533, over 4980.00 frames.], tot_loss[loss=0.1334, simple_loss=0.207, pruned_loss=0.02989, over 972659.92 frames.], batch size: 15, lr: 1.39e-04 2022-05-08 16:12:41,944 INFO [train.py:715] (0/8) Epoch 16, batch 9100, loss[loss=0.127, simple_loss=0.2027, pruned_loss=0.02567, over 4694.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2077, pruned_loss=0.03033, over 971819.10 frames.], batch size: 15, lr: 1.39e-04 2022-05-08 16:13:20,952 INFO [train.py:715] (0/8) Epoch 16, batch 9150, loss[loss=0.119, simple_loss=0.1888, pruned_loss=0.02458, over 4755.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2071, pruned_loss=0.02988, over 971999.82 frames.], batch size: 19, lr: 1.39e-04 2022-05-08 16:13:58,476 INFO [train.py:715] (0/8) Epoch 16, batch 9200, loss[loss=0.1149, simple_loss=0.1954, pruned_loss=0.01718, over 4859.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2075, pruned_loss=0.03002, over 972241.47 frames.], batch size: 32, lr: 1.39e-04 2022-05-08 16:14:37,125 INFO [train.py:715] (0/8) Epoch 16, batch 9250, loss[loss=0.1183, simple_loss=0.1889, pruned_loss=0.02386, over 4917.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2075, pruned_loss=0.02985, over 972544.01 frames.], batch size: 23, lr: 1.39e-04 2022-05-08 16:15:16,081 INFO [train.py:715] (0/8) Epoch 16, batch 9300, loss[loss=0.1309, simple_loss=0.207, pruned_loss=0.02734, over 4829.00 frames.], tot_loss[loss=0.134, simple_loss=0.2078, pruned_loss=0.03009, over 972463.85 frames.], batch size: 27, lr: 1.39e-04 2022-05-08 16:15:54,775 INFO [train.py:715] (0/8) Epoch 16, batch 9350, loss[loss=0.1223, simple_loss=0.1974, pruned_loss=0.02354, over 4910.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2067, pruned_loss=0.02941, over 973091.30 frames.], batch size: 19, lr: 1.39e-04 2022-05-08 16:16:33,099 INFO [train.py:715] (0/8) Epoch 16, batch 9400, loss[loss=0.1261, simple_loss=0.1979, pruned_loss=0.02711, over 4901.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2069, pruned_loss=0.02975, over 972295.19 frames.], batch size: 18, lr: 1.39e-04 2022-05-08 16:17:11,619 INFO [train.py:715] (0/8) Epoch 16, batch 9450, loss[loss=0.1094, simple_loss=0.1811, pruned_loss=0.01889, over 4798.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2078, pruned_loss=0.02997, over 971532.59 frames.], batch size: 13, lr: 1.39e-04 2022-05-08 16:17:50,524 INFO [train.py:715] (0/8) Epoch 16, batch 9500, loss[loss=0.1243, simple_loss=0.1811, pruned_loss=0.03379, over 4870.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2075, pruned_loss=0.03009, over 971881.80 frames.], batch size: 32, lr: 1.39e-04 2022-05-08 16:18:28,782 INFO [train.py:715] (0/8) Epoch 16, batch 9550, loss[loss=0.1434, simple_loss=0.2126, pruned_loss=0.03715, over 4824.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2078, pruned_loss=0.03001, over 971962.00 frames.], batch size: 15, lr: 1.39e-04 2022-05-08 16:19:08,104 INFO [train.py:715] (0/8) Epoch 16, batch 9600, loss[loss=0.1304, simple_loss=0.2076, pruned_loss=0.02662, over 4825.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2078, pruned_loss=0.03, over 972676.72 frames.], batch size: 26, lr: 1.39e-04 2022-05-08 16:19:47,950 INFO [train.py:715] (0/8) Epoch 16, batch 9650, loss[loss=0.1065, simple_loss=0.1877, pruned_loss=0.01262, over 4867.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2074, pruned_loss=0.02996, over 972035.57 frames.], batch size: 16, lr: 1.39e-04 2022-05-08 16:20:27,612 INFO 
[train.py:715] (0/8) Epoch 16, batch 9700, loss[loss=0.1211, simple_loss=0.198, pruned_loss=0.02209, over 4908.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2077, pruned_loss=0.03043, over 971461.23 frames.], batch size: 29, lr: 1.39e-04 2022-05-08 16:21:08,021 INFO [train.py:715] (0/8) Epoch 16, batch 9750, loss[loss=0.1507, simple_loss=0.2411, pruned_loss=0.03013, over 4822.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2087, pruned_loss=0.03048, over 971347.41 frames.], batch size: 27, lr: 1.39e-04 2022-05-08 16:21:49,082 INFO [train.py:715] (0/8) Epoch 16, batch 9800, loss[loss=0.1103, simple_loss=0.1874, pruned_loss=0.01666, over 4797.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2084, pruned_loss=0.0304, over 971662.01 frames.], batch size: 21, lr: 1.39e-04 2022-05-08 16:22:29,515 INFO [train.py:715] (0/8) Epoch 16, batch 9850, loss[loss=0.1433, simple_loss=0.224, pruned_loss=0.03127, over 4832.00 frames.], tot_loss[loss=0.134, simple_loss=0.2078, pruned_loss=0.03009, over 970749.73 frames.], batch size: 15, lr: 1.39e-04 2022-05-08 16:23:09,305 INFO [train.py:715] (0/8) Epoch 16, batch 9900, loss[loss=0.1307, simple_loss=0.2019, pruned_loss=0.02974, over 4777.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2086, pruned_loss=0.03064, over 971512.88 frames.], batch size: 18, lr: 1.39e-04 2022-05-08 16:23:49,503 INFO [train.py:715] (0/8) Epoch 16, batch 9950, loss[loss=0.1347, simple_loss=0.2063, pruned_loss=0.03151, over 4963.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2088, pruned_loss=0.03072, over 971557.46 frames.], batch size: 14, lr: 1.39e-04 2022-05-08 16:24:30,462 INFO [train.py:715] (0/8) Epoch 16, batch 10000, loss[loss=0.1425, simple_loss=0.214, pruned_loss=0.03551, over 4709.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2091, pruned_loss=0.03055, over 971693.98 frames.], batch size: 15, lr: 1.39e-04 2022-05-08 16:25:09,393 INFO [train.py:715] (0/8) Epoch 16, batch 10050, loss[loss=0.1355, simple_loss=0.2132, pruned_loss=0.02892, over 4761.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2095, pruned_loss=0.03055, over 972776.46 frames.], batch size: 16, lr: 1.39e-04 2022-05-08 16:25:49,623 INFO [train.py:715] (0/8) Epoch 16, batch 10100, loss[loss=0.1449, simple_loss=0.2261, pruned_loss=0.03182, over 4770.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2092, pruned_loss=0.03033, over 972702.46 frames.], batch size: 19, lr: 1.39e-04 2022-05-08 16:26:30,393 INFO [train.py:715] (0/8) Epoch 16, batch 10150, loss[loss=0.1546, simple_loss=0.2161, pruned_loss=0.04654, over 4887.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2092, pruned_loss=0.03069, over 972569.80 frames.], batch size: 22, lr: 1.39e-04 2022-05-08 16:27:10,596 INFO [train.py:715] (0/8) Epoch 16, batch 10200, loss[loss=0.1509, simple_loss=0.2235, pruned_loss=0.03912, over 4912.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2085, pruned_loss=0.03011, over 972844.94 frames.], batch size: 17, lr: 1.39e-04 2022-05-08 16:27:49,587 INFO [train.py:715] (0/8) Epoch 16, batch 10250, loss[loss=0.1403, simple_loss=0.2064, pruned_loss=0.03712, over 4870.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2083, pruned_loss=0.02974, over 973017.10 frames.], batch size: 20, lr: 1.39e-04 2022-05-08 16:28:29,495 INFO [train.py:715] (0/8) Epoch 16, batch 10300, loss[loss=0.138, simple_loss=0.2036, pruned_loss=0.03625, over 4910.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2082, pruned_loss=0.02971, over 973074.75 frames.], batch size: 18, lr: 1.39e-04 2022-05-08 16:29:09,153 INFO [train.py:715] (0/8) Epoch 16, 
batch 10350, loss[loss=0.1489, simple_loss=0.2214, pruned_loss=0.03822, over 4758.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2087, pruned_loss=0.03031, over 971952.60 frames.], batch size: 19, lr: 1.39e-04 2022-05-08 16:29:47,473 INFO [train.py:715] (0/8) Epoch 16, batch 10400, loss[loss=0.1231, simple_loss=0.1966, pruned_loss=0.02483, over 4744.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2085, pruned_loss=0.03004, over 971154.35 frames.], batch size: 19, lr: 1.39e-04 2022-05-08 16:30:26,266 INFO [train.py:715] (0/8) Epoch 16, batch 10450, loss[loss=0.123, simple_loss=0.1974, pruned_loss=0.02425, over 4915.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2085, pruned_loss=0.03036, over 969721.12 frames.], batch size: 29, lr: 1.39e-04 2022-05-08 16:31:05,203 INFO [train.py:715] (0/8) Epoch 16, batch 10500, loss[loss=0.1342, simple_loss=0.2184, pruned_loss=0.02503, over 4802.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2081, pruned_loss=0.02978, over 969579.89 frames.], batch size: 25, lr: 1.39e-04 2022-05-08 16:31:44,637 INFO [train.py:715] (0/8) Epoch 16, batch 10550, loss[loss=0.1266, simple_loss=0.1989, pruned_loss=0.02721, over 4790.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2076, pruned_loss=0.02967, over 970428.41 frames.], batch size: 24, lr: 1.39e-04 2022-05-08 16:32:22,613 INFO [train.py:715] (0/8) Epoch 16, batch 10600, loss[loss=0.152, simple_loss=0.2208, pruned_loss=0.04159, over 4950.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2076, pruned_loss=0.02959, over 972180.59 frames.], batch size: 35, lr: 1.39e-04 2022-05-08 16:33:01,315 INFO [train.py:715] (0/8) Epoch 16, batch 10650, loss[loss=0.117, simple_loss=0.1826, pruned_loss=0.02572, over 4918.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2075, pruned_loss=0.02921, over 972359.37 frames.], batch size: 19, lr: 1.39e-04 2022-05-08 16:33:40,766 INFO [train.py:715] (0/8) Epoch 16, batch 10700, loss[loss=0.1421, simple_loss=0.2181, pruned_loss=0.03301, over 4826.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2074, pruned_loss=0.0291, over 973182.46 frames.], batch size: 26, lr: 1.39e-04 2022-05-08 16:34:19,601 INFO [train.py:715] (0/8) Epoch 16, batch 10750, loss[loss=0.1266, simple_loss=0.2084, pruned_loss=0.02235, over 4798.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2071, pruned_loss=0.02897, over 973190.55 frames.], batch size: 25, lr: 1.39e-04 2022-05-08 16:34:58,502 INFO [train.py:715] (0/8) Epoch 16, batch 10800, loss[loss=0.1096, simple_loss=0.1835, pruned_loss=0.01785, over 4985.00 frames.], tot_loss[loss=0.1326, simple_loss=0.207, pruned_loss=0.02908, over 973574.54 frames.], batch size: 25, lr: 1.39e-04 2022-05-08 16:35:37,664 INFO [train.py:715] (0/8) Epoch 16, batch 10850, loss[loss=0.1186, simple_loss=0.184, pruned_loss=0.02659, over 4914.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2065, pruned_loss=0.02925, over 972916.10 frames.], batch size: 19, lr: 1.39e-04 2022-05-08 16:36:17,325 INFO [train.py:715] (0/8) Epoch 16, batch 10900, loss[loss=0.1468, simple_loss=0.2185, pruned_loss=0.03758, over 4975.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2073, pruned_loss=0.02943, over 973207.12 frames.], batch size: 15, lr: 1.39e-04 2022-05-08 16:36:55,549 INFO [train.py:715] (0/8) Epoch 16, batch 10950, loss[loss=0.1204, simple_loss=0.203, pruned_loss=0.01888, over 4859.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2082, pruned_loss=0.02965, over 972688.73 frames.], batch size: 20, lr: 1.39e-04 2022-05-08 16:37:34,518 INFO [train.py:715] (0/8) Epoch 16, batch 11000, 
loss[loss=0.1429, simple_loss=0.2224, pruned_loss=0.03171, over 4975.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2086, pruned_loss=0.03005, over 973404.99 frames.], batch size: 28, lr: 1.39e-04 2022-05-08 16:38:13,973 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-568000.pt 2022-05-08 16:38:16,604 INFO [train.py:715] (0/8) Epoch 16, batch 11050, loss[loss=0.1277, simple_loss=0.1932, pruned_loss=0.03114, over 4806.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2074, pruned_loss=0.02949, over 974146.41 frames.], batch size: 14, lr: 1.39e-04 2022-05-08 16:38:55,218 INFO [train.py:715] (0/8) Epoch 16, batch 11100, loss[loss=0.1093, simple_loss=0.1865, pruned_loss=0.01605, over 4890.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2065, pruned_loss=0.0293, over 973378.07 frames.], batch size: 22, lr: 1.39e-04 2022-05-08 16:39:33,647 INFO [train.py:715] (0/8) Epoch 16, batch 11150, loss[loss=0.1194, simple_loss=0.1956, pruned_loss=0.02157, over 4755.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2066, pruned_loss=0.02945, over 973227.57 frames.], batch size: 19, lr: 1.39e-04 2022-05-08 16:40:12,895 INFO [train.py:715] (0/8) Epoch 16, batch 11200, loss[loss=0.1588, simple_loss=0.229, pruned_loss=0.04432, over 4890.00 frames.], tot_loss[loss=0.133, simple_loss=0.2069, pruned_loss=0.02953, over 972808.96 frames.], batch size: 17, lr: 1.39e-04 2022-05-08 16:40:51,681 INFO [train.py:715] (0/8) Epoch 16, batch 11250, loss[loss=0.09637, simple_loss=0.1717, pruned_loss=0.01052, over 4771.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2074, pruned_loss=0.02964, over 973016.20 frames.], batch size: 12, lr: 1.39e-04 2022-05-08 16:41:29,872 INFO [train.py:715] (0/8) Epoch 16, batch 11300, loss[loss=0.1161, simple_loss=0.1843, pruned_loss=0.02394, over 4788.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2072, pruned_loss=0.02951, over 972411.14 frames.], batch size: 18, lr: 1.39e-04 2022-05-08 16:42:08,147 INFO [train.py:715] (0/8) Epoch 16, batch 11350, loss[loss=0.1567, simple_loss=0.2402, pruned_loss=0.03664, over 4875.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2079, pruned_loss=0.02989, over 971514.22 frames.], batch size: 16, lr: 1.39e-04 2022-05-08 16:42:47,126 INFO [train.py:715] (0/8) Epoch 16, batch 11400, loss[loss=0.1153, simple_loss=0.1887, pruned_loss=0.02093, over 4813.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2074, pruned_loss=0.02992, over 971390.44 frames.], batch size: 12, lr: 1.39e-04 2022-05-08 16:43:25,153 INFO [train.py:715] (0/8) Epoch 16, batch 11450, loss[loss=0.1357, simple_loss=0.2027, pruned_loss=0.03438, over 4911.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2073, pruned_loss=0.02961, over 971651.22 frames.], batch size: 17, lr: 1.39e-04 2022-05-08 16:44:03,092 INFO [train.py:715] (0/8) Epoch 16, batch 11500, loss[loss=0.1337, simple_loss=0.206, pruned_loss=0.0307, over 4827.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2064, pruned_loss=0.02914, over 971903.69 frames.], batch size: 26, lr: 1.39e-04 2022-05-08 16:44:41,799 INFO [train.py:715] (0/8) Epoch 16, batch 11550, loss[loss=0.1245, simple_loss=0.2149, pruned_loss=0.01702, over 4926.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2077, pruned_loss=0.02963, over 972214.31 frames.], batch size: 18, lr: 1.39e-04 2022-05-08 16:45:20,365 INFO [train.py:715] (0/8) Epoch 16, batch 11600, loss[loss=0.1595, simple_loss=0.2375, pruned_loss=0.04077, over 4970.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2075, pruned_loss=0.0294, over 972849.81 
frames.], batch size: 15, lr: 1.39e-04 2022-05-08 16:45:57,959 INFO [train.py:715] (0/8) Epoch 16, batch 11650, loss[loss=0.1245, simple_loss=0.2012, pruned_loss=0.02393, over 4910.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2067, pruned_loss=0.02882, over 972985.82 frames.], batch size: 19, lr: 1.39e-04 2022-05-08 16:46:36,438 INFO [train.py:715] (0/8) Epoch 16, batch 11700, loss[loss=0.12, simple_loss=0.1949, pruned_loss=0.02256, over 4834.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2064, pruned_loss=0.02866, over 972649.45 frames.], batch size: 15, lr: 1.39e-04 2022-05-08 16:47:15,520 INFO [train.py:715] (0/8) Epoch 16, batch 11750, loss[loss=0.1248, simple_loss=0.1958, pruned_loss=0.02687, over 4825.00 frames.], tot_loss[loss=0.1325, simple_loss=0.207, pruned_loss=0.02899, over 972915.59 frames.], batch size: 15, lr: 1.39e-04 2022-05-08 16:47:53,670 INFO [train.py:715] (0/8) Epoch 16, batch 11800, loss[loss=0.1188, simple_loss=0.1926, pruned_loss=0.0225, over 4920.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2069, pruned_loss=0.02889, over 974194.15 frames.], batch size: 23, lr: 1.39e-04 2022-05-08 16:48:31,490 INFO [train.py:715] (0/8) Epoch 16, batch 11850, loss[loss=0.1346, simple_loss=0.2089, pruned_loss=0.03012, over 4926.00 frames.], tot_loss[loss=0.1328, simple_loss=0.207, pruned_loss=0.02934, over 974210.79 frames.], batch size: 23, lr: 1.39e-04 2022-05-08 16:49:10,173 INFO [train.py:715] (0/8) Epoch 16, batch 11900, loss[loss=0.1158, simple_loss=0.1864, pruned_loss=0.02265, over 4687.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2071, pruned_loss=0.02956, over 972639.99 frames.], batch size: 15, lr: 1.39e-04 2022-05-08 16:49:48,593 INFO [train.py:715] (0/8) Epoch 16, batch 11950, loss[loss=0.1365, simple_loss=0.2086, pruned_loss=0.03217, over 4980.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2067, pruned_loss=0.02945, over 972628.59 frames.], batch size: 14, lr: 1.39e-04 2022-05-08 16:50:26,414 INFO [train.py:715] (0/8) Epoch 16, batch 12000, loss[loss=0.1079, simple_loss=0.1923, pruned_loss=0.01177, over 4800.00 frames.], tot_loss[loss=0.132, simple_loss=0.2062, pruned_loss=0.02887, over 972625.60 frames.], batch size: 21, lr: 1.38e-04 2022-05-08 16:50:26,415 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 16:50:37,201 INFO [train.py:742] (0/8) Epoch 16, validation: loss=0.1049, simple_loss=0.1884, pruned_loss=0.01072, over 914524.00 frames. 
2022-05-08 16:51:16,046 INFO [train.py:715] (0/8) Epoch 16, batch 12050, loss[loss=0.1388, simple_loss=0.2086, pruned_loss=0.03446, over 4918.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2072, pruned_loss=0.02967, over 972078.11 frames.], batch size: 19, lr: 1.38e-04 2022-05-08 16:51:55,268 INFO [train.py:715] (0/8) Epoch 16, batch 12100, loss[loss=0.1947, simple_loss=0.2756, pruned_loss=0.05695, over 4923.00 frames.], tot_loss[loss=0.134, simple_loss=0.2078, pruned_loss=0.03007, over 972103.16 frames.], batch size: 18, lr: 1.38e-04 2022-05-08 16:52:34,706 INFO [train.py:715] (0/8) Epoch 16, batch 12150, loss[loss=0.1279, simple_loss=0.2051, pruned_loss=0.02538, over 4868.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2075, pruned_loss=0.03017, over 972080.42 frames.], batch size: 22, lr: 1.38e-04 2022-05-08 16:53:12,371 INFO [train.py:715] (0/8) Epoch 16, batch 12200, loss[loss=0.1146, simple_loss=0.1865, pruned_loss=0.02132, over 4984.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2072, pruned_loss=0.02971, over 972298.92 frames.], batch size: 15, lr: 1.38e-04 2022-05-08 16:53:50,651 INFO [train.py:715] (0/8) Epoch 16, batch 12250, loss[loss=0.1525, simple_loss=0.2211, pruned_loss=0.04191, over 4832.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2075, pruned_loss=0.02964, over 971995.55 frames.], batch size: 15, lr: 1.38e-04 2022-05-08 16:54:29,688 INFO [train.py:715] (0/8) Epoch 16, batch 12300, loss[loss=0.1231, simple_loss=0.1971, pruned_loss=0.02451, over 4923.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2064, pruned_loss=0.02894, over 971977.62 frames.], batch size: 18, lr: 1.38e-04 2022-05-08 16:55:08,770 INFO [train.py:715] (0/8) Epoch 16, batch 12350, loss[loss=0.1576, simple_loss=0.2227, pruned_loss=0.04626, over 4866.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2061, pruned_loss=0.02865, over 971943.18 frames.], batch size: 32, lr: 1.38e-04 2022-05-08 16:55:47,029 INFO [train.py:715] (0/8) Epoch 16, batch 12400, loss[loss=0.1066, simple_loss=0.181, pruned_loss=0.01615, over 4756.00 frames.], tot_loss[loss=0.1328, simple_loss=0.207, pruned_loss=0.02931, over 971148.31 frames.], batch size: 12, lr: 1.38e-04 2022-05-08 16:56:26,127 INFO [train.py:715] (0/8) Epoch 16, batch 12450, loss[loss=0.1252, simple_loss=0.2007, pruned_loss=0.02481, over 4927.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2069, pruned_loss=0.02927, over 971796.02 frames.], batch size: 23, lr: 1.38e-04 2022-05-08 16:57:05,987 INFO [train.py:715] (0/8) Epoch 16, batch 12500, loss[loss=0.1272, simple_loss=0.2005, pruned_loss=0.02699, over 4663.00 frames.], tot_loss[loss=0.133, simple_loss=0.207, pruned_loss=0.02948, over 971442.77 frames.], batch size: 13, lr: 1.38e-04 2022-05-08 16:57:44,596 INFO [train.py:715] (0/8) Epoch 16, batch 12550, loss[loss=0.1183, simple_loss=0.1811, pruned_loss=0.02778, over 4784.00 frames.], tot_loss[loss=0.1331, simple_loss=0.207, pruned_loss=0.02957, over 971733.76 frames.], batch size: 12, lr: 1.38e-04 2022-05-08 16:58:23,180 INFO [train.py:715] (0/8) Epoch 16, batch 12600, loss[loss=0.1241, simple_loss=0.1905, pruned_loss=0.02885, over 4798.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2073, pruned_loss=0.02963, over 971054.17 frames.], batch size: 21, lr: 1.38e-04 2022-05-08 16:59:01,904 INFO [train.py:715] (0/8) Epoch 16, batch 12650, loss[loss=0.1141, simple_loss=0.1902, pruned_loss=0.01901, over 4747.00 frames.], tot_loss[loss=0.133, simple_loss=0.2071, pruned_loss=0.02951, over 970930.12 frames.], batch size: 16, lr: 1.38e-04 2022-05-08 16:59:40,536 
INFO [train.py:715] (0/8) Epoch 16, batch 12700, loss[loss=0.134, simple_loss=0.213, pruned_loss=0.02749, over 4986.00 frames.], tot_loss[loss=0.1331, simple_loss=0.207, pruned_loss=0.02961, over 971483.18 frames.], batch size: 28, lr: 1.38e-04 2022-05-08 17:00:18,080 INFO [train.py:715] (0/8) Epoch 16, batch 12750, loss[loss=0.1188, simple_loss=0.1916, pruned_loss=0.02302, over 4921.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2067, pruned_loss=0.02973, over 972011.59 frames.], batch size: 21, lr: 1.38e-04 2022-05-08 17:00:57,704 INFO [train.py:715] (0/8) Epoch 16, batch 12800, loss[loss=0.1242, simple_loss=0.1917, pruned_loss=0.02837, over 4915.00 frames.], tot_loss[loss=0.1336, simple_loss=0.207, pruned_loss=0.03007, over 971861.92 frames.], batch size: 23, lr: 1.38e-04 2022-05-08 17:01:36,681 INFO [train.py:715] (0/8) Epoch 16, batch 12850, loss[loss=0.1336, simple_loss=0.2085, pruned_loss=0.02938, over 4919.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2057, pruned_loss=0.02938, over 971628.78 frames.], batch size: 19, lr: 1.38e-04 2022-05-08 17:02:15,044 INFO [train.py:715] (0/8) Epoch 16, batch 12900, loss[loss=0.1558, simple_loss=0.2229, pruned_loss=0.0443, over 4774.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2059, pruned_loss=0.02954, over 971833.44 frames.], batch size: 18, lr: 1.38e-04 2022-05-08 17:02:53,757 INFO [train.py:715] (0/8) Epoch 16, batch 12950, loss[loss=0.1286, simple_loss=0.2111, pruned_loss=0.02308, over 4848.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2058, pruned_loss=0.02942, over 971374.71 frames.], batch size: 20, lr: 1.38e-04 2022-05-08 17:03:32,776 INFO [train.py:715] (0/8) Epoch 16, batch 13000, loss[loss=0.1572, simple_loss=0.2528, pruned_loss=0.03084, over 4750.00 frames.], tot_loss[loss=0.133, simple_loss=0.2071, pruned_loss=0.02942, over 971460.41 frames.], batch size: 16, lr: 1.38e-04 2022-05-08 17:04:11,279 INFO [train.py:715] (0/8) Epoch 16, batch 13050, loss[loss=0.1388, simple_loss=0.2081, pruned_loss=0.0347, over 4987.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2076, pruned_loss=0.02986, over 971060.37 frames.], batch size: 31, lr: 1.38e-04 2022-05-08 17:04:49,800 INFO [train.py:715] (0/8) Epoch 16, batch 13100, loss[loss=0.12, simple_loss=0.1979, pruned_loss=0.02106, over 4693.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2075, pruned_loss=0.02986, over 971428.43 frames.], batch size: 15, lr: 1.38e-04 2022-05-08 17:05:28,953 INFO [train.py:715] (0/8) Epoch 16, batch 13150, loss[loss=0.1324, simple_loss=0.2129, pruned_loss=0.02596, over 4938.00 frames.], tot_loss[loss=0.1331, simple_loss=0.207, pruned_loss=0.02956, over 971712.31 frames.], batch size: 35, lr: 1.38e-04 2022-05-08 17:06:08,074 INFO [train.py:715] (0/8) Epoch 16, batch 13200, loss[loss=0.1515, simple_loss=0.2151, pruned_loss=0.04398, over 4788.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2076, pruned_loss=0.03026, over 971321.73 frames.], batch size: 18, lr: 1.38e-04 2022-05-08 17:06:46,152 INFO [train.py:715] (0/8) Epoch 16, batch 13250, loss[loss=0.1195, simple_loss=0.1984, pruned_loss=0.02027, over 4984.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2072, pruned_loss=0.03022, over 972151.61 frames.], batch size: 15, lr: 1.38e-04 2022-05-08 17:07:25,005 INFO [train.py:715] (0/8) Epoch 16, batch 13300, loss[loss=0.1191, simple_loss=0.198, pruned_loss=0.02011, over 4861.00 frames.], tot_loss[loss=0.133, simple_loss=0.2066, pruned_loss=0.02974, over 973101.44 frames.], batch size: 20, lr: 1.38e-04 2022-05-08 17:08:04,358 INFO [train.py:715] (0/8) 
Epoch 16, batch 13350, loss[loss=0.1441, simple_loss=0.2126, pruned_loss=0.03778, over 4787.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2066, pruned_loss=0.02961, over 973763.34 frames.], batch size: 17, lr: 1.38e-04 2022-05-08 17:08:42,681 INFO [train.py:715] (0/8) Epoch 16, batch 13400, loss[loss=0.1359, simple_loss=0.2075, pruned_loss=0.0322, over 4864.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2073, pruned_loss=0.02982, over 973542.19 frames.], batch size: 20, lr: 1.38e-04 2022-05-08 17:09:21,145 INFO [train.py:715] (0/8) Epoch 16, batch 13450, loss[loss=0.1321, simple_loss=0.2031, pruned_loss=0.03055, over 4858.00 frames.], tot_loss[loss=0.133, simple_loss=0.2071, pruned_loss=0.02943, over 972684.71 frames.], batch size: 30, lr: 1.38e-04 2022-05-08 17:10:00,905 INFO [train.py:715] (0/8) Epoch 16, batch 13500, loss[loss=0.1396, simple_loss=0.2128, pruned_loss=0.03318, over 4817.00 frames.], tot_loss[loss=0.1336, simple_loss=0.208, pruned_loss=0.02954, over 972836.25 frames.], batch size: 27, lr: 1.38e-04 2022-05-08 17:10:39,235 INFO [train.py:715] (0/8) Epoch 16, batch 13550, loss[loss=0.1577, simple_loss=0.2239, pruned_loss=0.04574, over 4854.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2083, pruned_loss=0.02967, over 972818.99 frames.], batch size: 30, lr: 1.38e-04 2022-05-08 17:11:17,349 INFO [train.py:715] (0/8) Epoch 16, batch 13600, loss[loss=0.1358, simple_loss=0.2083, pruned_loss=0.0316, over 4994.00 frames.], tot_loss[loss=0.1336, simple_loss=0.208, pruned_loss=0.02955, over 973507.21 frames.], batch size: 16, lr: 1.38e-04 2022-05-08 17:11:56,183 INFO [train.py:715] (0/8) Epoch 16, batch 13650, loss[loss=0.1309, simple_loss=0.2044, pruned_loss=0.02871, over 4960.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2077, pruned_loss=0.02948, over 973075.85 frames.], batch size: 24, lr: 1.38e-04 2022-05-08 17:12:35,106 INFO [train.py:715] (0/8) Epoch 16, batch 13700, loss[loss=0.1167, simple_loss=0.1857, pruned_loss=0.02384, over 4839.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2069, pruned_loss=0.02938, over 972471.20 frames.], batch size: 15, lr: 1.38e-04 2022-05-08 17:13:13,497 INFO [train.py:715] (0/8) Epoch 16, batch 13750, loss[loss=0.1323, simple_loss=0.212, pruned_loss=0.02626, over 4806.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2075, pruned_loss=0.02978, over 972147.15 frames.], batch size: 21, lr: 1.38e-04 2022-05-08 17:13:52,012 INFO [train.py:715] (0/8) Epoch 16, batch 13800, loss[loss=0.1305, simple_loss=0.2039, pruned_loss=0.02855, over 4771.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2068, pruned_loss=0.02997, over 972331.61 frames.], batch size: 14, lr: 1.38e-04 2022-05-08 17:14:30,648 INFO [train.py:715] (0/8) Epoch 16, batch 13850, loss[loss=0.1202, simple_loss=0.2073, pruned_loss=0.01657, over 4752.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2069, pruned_loss=0.02932, over 972208.98 frames.], batch size: 19, lr: 1.38e-04 2022-05-08 17:15:08,620 INFO [train.py:715] (0/8) Epoch 16, batch 13900, loss[loss=0.1364, simple_loss=0.2133, pruned_loss=0.02979, over 4972.00 frames.], tot_loss[loss=0.1331, simple_loss=0.207, pruned_loss=0.0296, over 972044.09 frames.], batch size: 15, lr: 1.38e-04 2022-05-08 17:15:46,308 INFO [train.py:715] (0/8) Epoch 16, batch 13950, loss[loss=0.1273, simple_loss=0.2002, pruned_loss=0.0272, over 4814.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2067, pruned_loss=0.02936, over 971430.57 frames.], batch size: 27, lr: 1.38e-04 2022-05-08 17:16:24,660 INFO [train.py:715] (0/8) Epoch 16, batch 14000, 
loss[loss=0.1426, simple_loss=0.2199, pruned_loss=0.03265, over 4902.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2073, pruned_loss=0.02964, over 972022.05 frames.], batch size: 17, lr: 1.38e-04 2022-05-08 17:17:03,281 INFO [train.py:715] (0/8) Epoch 16, batch 14050, loss[loss=0.1138, simple_loss=0.1925, pruned_loss=0.01757, over 4758.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2076, pruned_loss=0.02991, over 972212.56 frames.], batch size: 19, lr: 1.38e-04 2022-05-08 17:17:41,055 INFO [train.py:715] (0/8) Epoch 16, batch 14100, loss[loss=0.1182, simple_loss=0.2053, pruned_loss=0.01555, over 4987.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2078, pruned_loss=0.02987, over 972564.69 frames.], batch size: 28, lr: 1.38e-04 2022-05-08 17:18:18,774 INFO [train.py:715] (0/8) Epoch 16, batch 14150, loss[loss=0.1436, simple_loss=0.2274, pruned_loss=0.02992, over 4878.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2071, pruned_loss=0.02951, over 971238.36 frames.], batch size: 19, lr: 1.38e-04 2022-05-08 17:18:57,313 INFO [train.py:715] (0/8) Epoch 16, batch 14200, loss[loss=0.1364, simple_loss=0.2103, pruned_loss=0.03127, over 4945.00 frames.], tot_loss[loss=0.1338, simple_loss=0.208, pruned_loss=0.02982, over 971660.21 frames.], batch size: 21, lr: 1.38e-04 2022-05-08 17:19:36,005 INFO [train.py:715] (0/8) Epoch 16, batch 14250, loss[loss=0.156, simple_loss=0.2292, pruned_loss=0.04137, over 4883.00 frames.], tot_loss[loss=0.134, simple_loss=0.2079, pruned_loss=0.0301, over 971527.38 frames.], batch size: 39, lr: 1.38e-04 2022-05-08 17:20:14,624 INFO [train.py:715] (0/8) Epoch 16, batch 14300, loss[loss=0.1226, simple_loss=0.2007, pruned_loss=0.02224, over 4875.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2073, pruned_loss=0.03004, over 971509.77 frames.], batch size: 16, lr: 1.38e-04 2022-05-08 17:20:53,329 INFO [train.py:715] (0/8) Epoch 16, batch 14350, loss[loss=0.1123, simple_loss=0.1886, pruned_loss=0.01806, over 4860.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2077, pruned_loss=0.02984, over 972023.56 frames.], batch size: 20, lr: 1.38e-04 2022-05-08 17:21:32,523 INFO [train.py:715] (0/8) Epoch 16, batch 14400, loss[loss=0.1209, simple_loss=0.198, pruned_loss=0.02194, over 4966.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2076, pruned_loss=0.03007, over 972609.69 frames.], batch size: 35, lr: 1.38e-04 2022-05-08 17:22:10,259 INFO [train.py:715] (0/8) Epoch 16, batch 14450, loss[loss=0.1145, simple_loss=0.1882, pruned_loss=0.02041, over 4869.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2073, pruned_loss=0.02952, over 972527.46 frames.], batch size: 16, lr: 1.38e-04 2022-05-08 17:22:49,088 INFO [train.py:715] (0/8) Epoch 16, batch 14500, loss[loss=0.1282, simple_loss=0.2097, pruned_loss=0.02337, over 4888.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2077, pruned_loss=0.02989, over 972132.52 frames.], batch size: 22, lr: 1.38e-04 2022-05-08 17:23:28,024 INFO [train.py:715] (0/8) Epoch 16, batch 14550, loss[loss=0.1354, simple_loss=0.2092, pruned_loss=0.03083, over 4772.00 frames.], tot_loss[loss=0.134, simple_loss=0.208, pruned_loss=0.02998, over 972474.33 frames.], batch size: 14, lr: 1.38e-04 2022-05-08 17:24:06,691 INFO [train.py:715] (0/8) Epoch 16, batch 14600, loss[loss=0.1248, simple_loss=0.204, pruned_loss=0.02285, over 4797.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2084, pruned_loss=0.03042, over 972663.35 frames.], batch size: 21, lr: 1.38e-04 2022-05-08 17:24:44,962 INFO [train.py:715] (0/8) Epoch 16, batch 14650, loss[loss=0.1381, 
simple_loss=0.2031, pruned_loss=0.03649, over 4981.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2078, pruned_loss=0.0303, over 972794.40 frames.], batch size: 15, lr: 1.38e-04 2022-05-08 17:25:23,538 INFO [train.py:715] (0/8) Epoch 16, batch 14700, loss[loss=0.1238, simple_loss=0.2063, pruned_loss=0.02064, over 4904.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2074, pruned_loss=0.03, over 972647.27 frames.], batch size: 19, lr: 1.38e-04 2022-05-08 17:26:02,837 INFO [train.py:715] (0/8) Epoch 16, batch 14750, loss[loss=0.1323, simple_loss=0.2075, pruned_loss=0.02854, over 4880.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2076, pruned_loss=0.03029, over 973149.52 frames.], batch size: 16, lr: 1.38e-04 2022-05-08 17:26:40,631 INFO [train.py:715] (0/8) Epoch 16, batch 14800, loss[loss=0.1241, simple_loss=0.2009, pruned_loss=0.0236, over 4774.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2078, pruned_loss=0.03021, over 972834.48 frames.], batch size: 19, lr: 1.38e-04 2022-05-08 17:27:19,697 INFO [train.py:715] (0/8) Epoch 16, batch 14850, loss[loss=0.1228, simple_loss=0.1976, pruned_loss=0.02401, over 4974.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2084, pruned_loss=0.03016, over 972734.06 frames.], batch size: 28, lr: 1.38e-04 2022-05-08 17:27:58,603 INFO [train.py:715] (0/8) Epoch 16, batch 14900, loss[loss=0.1234, simple_loss=0.1913, pruned_loss=0.02773, over 4861.00 frames.], tot_loss[loss=0.134, simple_loss=0.208, pruned_loss=0.02999, over 972297.42 frames.], batch size: 13, lr: 1.38e-04 2022-05-08 17:28:37,056 INFO [train.py:715] (0/8) Epoch 16, batch 14950, loss[loss=0.1592, simple_loss=0.2305, pruned_loss=0.04397, over 4933.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2077, pruned_loss=0.02977, over 972184.78 frames.], batch size: 23, lr: 1.38e-04 2022-05-08 17:29:16,114 INFO [train.py:715] (0/8) Epoch 16, batch 15000, loss[loss=0.1402, simple_loss=0.2217, pruned_loss=0.02938, over 4886.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2072, pruned_loss=0.02958, over 972086.52 frames.], batch size: 19, lr: 1.38e-04 2022-05-08 17:29:16,115 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 17:29:25,727 INFO [train.py:742] (0/8) Epoch 16, validation: loss=0.1049, simple_loss=0.1884, pruned_loss=0.01069, over 914524.00 frames. 
2022-05-08 17:30:03,997 INFO [train.py:715] (0/8) Epoch 16, batch 15050, loss[loss=0.1341, simple_loss=0.2118, pruned_loss=0.02816, over 4859.00 frames.], tot_loss[loss=0.1341, simple_loss=0.208, pruned_loss=0.03008, over 971570.73 frames.], batch size: 20, lr: 1.38e-04 2022-05-08 17:30:42,063 INFO [train.py:715] (0/8) Epoch 16, batch 15100, loss[loss=0.1173, simple_loss=0.1979, pruned_loss=0.01834, over 4765.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2084, pruned_loss=0.03026, over 971324.86 frames.], batch size: 19, lr: 1.38e-04 2022-05-08 17:31:20,868 INFO [train.py:715] (0/8) Epoch 16, batch 15150, loss[loss=0.1605, simple_loss=0.246, pruned_loss=0.03752, over 4886.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2091, pruned_loss=0.03023, over 971963.57 frames.], batch size: 22, lr: 1.38e-04 2022-05-08 17:31:58,563 INFO [train.py:715] (0/8) Epoch 16, batch 15200, loss[loss=0.1221, simple_loss=0.1983, pruned_loss=0.02289, over 4839.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2097, pruned_loss=0.03048, over 972770.64 frames.], batch size: 15, lr: 1.38e-04 2022-05-08 17:32:36,115 INFO [train.py:715] (0/8) Epoch 16, batch 15250, loss[loss=0.1397, simple_loss=0.2163, pruned_loss=0.03159, over 4916.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2086, pruned_loss=0.03026, over 971815.08 frames.], batch size: 17, lr: 1.38e-04 2022-05-08 17:33:14,307 INFO [train.py:715] (0/8) Epoch 16, batch 15300, loss[loss=0.1559, simple_loss=0.2443, pruned_loss=0.03377, over 4805.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2092, pruned_loss=0.03071, over 971670.88 frames.], batch size: 21, lr: 1.38e-04 2022-05-08 17:33:52,454 INFO [train.py:715] (0/8) Epoch 16, batch 15350, loss[loss=0.1345, simple_loss=0.2075, pruned_loss=0.03074, over 4975.00 frames.], tot_loss[loss=0.1356, simple_loss=0.2092, pruned_loss=0.03099, over 971352.44 frames.], batch size: 14, lr: 1.38e-04 2022-05-08 17:34:30,723 INFO [train.py:715] (0/8) Epoch 16, batch 15400, loss[loss=0.1361, simple_loss=0.2006, pruned_loss=0.03578, over 4697.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2091, pruned_loss=0.03064, over 971105.70 frames.], batch size: 15, lr: 1.38e-04 2022-05-08 17:35:08,736 INFO [train.py:715] (0/8) Epoch 16, batch 15450, loss[loss=0.1077, simple_loss=0.189, pruned_loss=0.0132, over 4773.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2092, pruned_loss=0.03087, over 971500.83 frames.], batch size: 18, lr: 1.38e-04 2022-05-08 17:35:47,166 INFO [train.py:715] (0/8) Epoch 16, batch 15500, loss[loss=0.1163, simple_loss=0.191, pruned_loss=0.02076, over 4801.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2085, pruned_loss=0.03036, over 971689.05 frames.], batch size: 21, lr: 1.38e-04 2022-05-08 17:36:24,797 INFO [train.py:715] (0/8) Epoch 16, batch 15550, loss[loss=0.1237, simple_loss=0.1893, pruned_loss=0.02909, over 4853.00 frames.], tot_loss[loss=0.1342, simple_loss=0.208, pruned_loss=0.03023, over 971938.43 frames.], batch size: 20, lr: 1.38e-04 2022-05-08 17:37:02,464 INFO [train.py:715] (0/8) Epoch 16, batch 15600, loss[loss=0.1437, simple_loss=0.2124, pruned_loss=0.03752, over 4792.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2085, pruned_loss=0.03037, over 971996.71 frames.], batch size: 12, lr: 1.38e-04 2022-05-08 17:37:41,081 INFO [train.py:715] (0/8) Epoch 16, batch 15650, loss[loss=0.1207, simple_loss=0.1998, pruned_loss=0.02082, over 4772.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2087, pruned_loss=0.0302, over 972038.84 frames.], batch size: 17, lr: 1.38e-04 2022-05-08 17:38:19,114 
INFO [train.py:715] (0/8) Epoch 16, batch 15700, loss[loss=0.1425, simple_loss=0.2189, pruned_loss=0.03311, over 4805.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2084, pruned_loss=0.02996, over 972819.80 frames.], batch size: 21, lr: 1.38e-04 2022-05-08 17:38:56,849 INFO [train.py:715] (0/8) Epoch 16, batch 15750, loss[loss=0.1484, simple_loss=0.2208, pruned_loss=0.03799, over 4891.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2089, pruned_loss=0.03029, over 973066.06 frames.], batch size: 32, lr: 1.38e-04 2022-05-08 17:39:34,740 INFO [train.py:715] (0/8) Epoch 16, batch 15800, loss[loss=0.1551, simple_loss=0.2212, pruned_loss=0.04447, over 4806.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2088, pruned_loss=0.03026, over 972218.08 frames.], batch size: 25, lr: 1.38e-04 2022-05-08 17:40:13,082 INFO [train.py:715] (0/8) Epoch 16, batch 15850, loss[loss=0.1349, simple_loss=0.2103, pruned_loss=0.02975, over 4893.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2087, pruned_loss=0.03016, over 973382.49 frames.], batch size: 22, lr: 1.38e-04 2022-05-08 17:40:50,710 INFO [train.py:715] (0/8) Epoch 16, batch 15900, loss[loss=0.1381, simple_loss=0.2082, pruned_loss=0.034, over 4820.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2089, pruned_loss=0.03039, over 972599.95 frames.], batch size: 15, lr: 1.38e-04 2022-05-08 17:41:28,314 INFO [train.py:715] (0/8) Epoch 16, batch 15950, loss[loss=0.1096, simple_loss=0.1863, pruned_loss=0.01645, over 4908.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2079, pruned_loss=0.02976, over 972616.16 frames.], batch size: 18, lr: 1.38e-04 2022-05-08 17:42:06,728 INFO [train.py:715] (0/8) Epoch 16, batch 16000, loss[loss=0.1368, simple_loss=0.2047, pruned_loss=0.03446, over 4785.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2082, pruned_loss=0.02998, over 972903.60 frames.], batch size: 17, lr: 1.38e-04 2022-05-08 17:42:44,833 INFO [train.py:715] (0/8) Epoch 16, batch 16050, loss[loss=0.1432, simple_loss=0.2175, pruned_loss=0.03451, over 4849.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2086, pruned_loss=0.03034, over 972478.46 frames.], batch size: 13, lr: 1.38e-04 2022-05-08 17:43:22,457 INFO [train.py:715] (0/8) Epoch 16, batch 16100, loss[loss=0.1617, simple_loss=0.211, pruned_loss=0.05618, over 4789.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2088, pruned_loss=0.03046, over 971644.49 frames.], batch size: 14, lr: 1.38e-04 2022-05-08 17:43:59,969 INFO [train.py:715] (0/8) Epoch 16, batch 16150, loss[loss=0.166, simple_loss=0.2287, pruned_loss=0.05164, over 4846.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2083, pruned_loss=0.03045, over 971727.81 frames.], batch size: 30, lr: 1.38e-04 2022-05-08 17:44:38,348 INFO [train.py:715] (0/8) Epoch 16, batch 16200, loss[loss=0.1407, simple_loss=0.204, pruned_loss=0.03866, over 4811.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2085, pruned_loss=0.03042, over 971185.20 frames.], batch size: 26, lr: 1.38e-04 2022-05-08 17:45:15,914 INFO [train.py:715] (0/8) Epoch 16, batch 16250, loss[loss=0.1396, simple_loss=0.2157, pruned_loss=0.03169, over 4926.00 frames.], tot_loss[loss=0.1353, simple_loss=0.2093, pruned_loss=0.03066, over 971224.94 frames.], batch size: 21, lr: 1.38e-04 2022-05-08 17:45:53,544 INFO [train.py:715] (0/8) Epoch 16, batch 16300, loss[loss=0.1171, simple_loss=0.1787, pruned_loss=0.02776, over 4787.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2089, pruned_loss=0.03044, over 971189.71 frames.], batch size: 12, lr: 1.38e-04 2022-05-08 17:46:31,872 INFO [train.py:715] 
(0/8) Epoch 16, batch 16350, loss[loss=0.119, simple_loss=0.1938, pruned_loss=0.02208, over 4846.00 frames.], tot_loss[loss=0.1347, simple_loss=0.209, pruned_loss=0.03016, over 971456.63 frames.], batch size: 26, lr: 1.38e-04 2022-05-08 17:47:10,529 INFO [train.py:715] (0/8) Epoch 16, batch 16400, loss[loss=0.1292, simple_loss=0.2084, pruned_loss=0.02495, over 4823.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2091, pruned_loss=0.03022, over 971991.75 frames.], batch size: 26, lr: 1.38e-04 2022-05-08 17:47:47,570 INFO [train.py:715] (0/8) Epoch 16, batch 16450, loss[loss=0.1251, simple_loss=0.206, pruned_loss=0.02211, over 4906.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2081, pruned_loss=0.02965, over 972382.27 frames.], batch size: 19, lr: 1.38e-04 2022-05-08 17:48:25,519 INFO [train.py:715] (0/8) Epoch 16, batch 16500, loss[loss=0.122, simple_loss=0.1933, pruned_loss=0.02534, over 4877.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2079, pruned_loss=0.02944, over 972054.05 frames.], batch size: 20, lr: 1.38e-04 2022-05-08 17:49:04,091 INFO [train.py:715] (0/8) Epoch 16, batch 16550, loss[loss=0.1145, simple_loss=0.1924, pruned_loss=0.01826, over 4881.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2078, pruned_loss=0.02959, over 971114.18 frames.], batch size: 20, lr: 1.38e-04 2022-05-08 17:49:41,513 INFO [train.py:715] (0/8) Epoch 16, batch 16600, loss[loss=0.1499, simple_loss=0.2244, pruned_loss=0.03767, over 4880.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2079, pruned_loss=0.02969, over 970740.70 frames.], batch size: 32, lr: 1.38e-04 2022-05-08 17:50:19,530 INFO [train.py:715] (0/8) Epoch 16, batch 16650, loss[loss=0.1184, simple_loss=0.1943, pruned_loss=0.02126, over 4837.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2071, pruned_loss=0.02977, over 971549.69 frames.], batch size: 13, lr: 1.38e-04 2022-05-08 17:50:57,793 INFO [train.py:715] (0/8) Epoch 16, batch 16700, loss[loss=0.1321, simple_loss=0.2089, pruned_loss=0.02762, over 4811.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2086, pruned_loss=0.03023, over 972293.43 frames.], batch size: 15, lr: 1.38e-04 2022-05-08 17:51:35,935 INFO [train.py:715] (0/8) Epoch 16, batch 16750, loss[loss=0.1166, simple_loss=0.2003, pruned_loss=0.01638, over 4909.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2073, pruned_loss=0.02915, over 972102.00 frames.], batch size: 19, lr: 1.38e-04 2022-05-08 17:52:13,451 INFO [train.py:715] (0/8) Epoch 16, batch 16800, loss[loss=0.1504, simple_loss=0.2236, pruned_loss=0.03861, over 4820.00 frames.], tot_loss[loss=0.1326, simple_loss=0.207, pruned_loss=0.02913, over 971062.29 frames.], batch size: 26, lr: 1.38e-04 2022-05-08 17:52:51,535 INFO [train.py:715] (0/8) Epoch 16, batch 16850, loss[loss=0.1733, simple_loss=0.2436, pruned_loss=0.05147, over 4971.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2077, pruned_loss=0.02968, over 971674.83 frames.], batch size: 15, lr: 1.38e-04 2022-05-08 17:53:29,997 INFO [train.py:715] (0/8) Epoch 16, batch 16900, loss[loss=0.1395, simple_loss=0.2037, pruned_loss=0.03764, over 4876.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2084, pruned_loss=0.03029, over 972328.00 frames.], batch size: 32, lr: 1.38e-04 2022-05-08 17:54:07,589 INFO [train.py:715] (0/8) Epoch 16, batch 16950, loss[loss=0.1725, simple_loss=0.2381, pruned_loss=0.05348, over 4795.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2084, pruned_loss=0.03039, over 973076.81 frames.], batch size: 18, lr: 1.38e-04 2022-05-08 17:54:45,474 INFO [train.py:715] (0/8) Epoch 16, batch 
17000, loss[loss=0.1215, simple_loss=0.1984, pruned_loss=0.02234, over 4897.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2081, pruned_loss=0.03045, over 971799.30 frames.], batch size: 18, lr: 1.38e-04 2022-05-08 17:55:23,670 INFO [train.py:715] (0/8) Epoch 16, batch 17050, loss[loss=0.1278, simple_loss=0.2071, pruned_loss=0.0243, over 4878.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2084, pruned_loss=0.03034, over 971232.83 frames.], batch size: 22, lr: 1.38e-04 2022-05-08 17:56:02,255 INFO [train.py:715] (0/8) Epoch 16, batch 17100, loss[loss=0.1219, simple_loss=0.197, pruned_loss=0.02344, over 4885.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2077, pruned_loss=0.0297, over 971804.98 frames.], batch size: 16, lr: 1.38e-04 2022-05-08 17:56:39,331 INFO [train.py:715] (0/8) Epoch 16, batch 17150, loss[loss=0.125, simple_loss=0.1959, pruned_loss=0.02702, over 4949.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2088, pruned_loss=0.03007, over 971932.72 frames.], batch size: 24, lr: 1.38e-04 2022-05-08 17:57:17,462 INFO [train.py:715] (0/8) Epoch 16, batch 17200, loss[loss=0.1269, simple_loss=0.1971, pruned_loss=0.02837, over 4954.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2088, pruned_loss=0.03037, over 972489.74 frames.], batch size: 21, lr: 1.38e-04 2022-05-08 17:57:56,360 INFO [train.py:715] (0/8) Epoch 16, batch 17250, loss[loss=0.1495, simple_loss=0.2172, pruned_loss=0.04088, over 4870.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2089, pruned_loss=0.03026, over 972038.01 frames.], batch size: 16, lr: 1.38e-04 2022-05-08 17:58:33,735 INFO [train.py:715] (0/8) Epoch 16, batch 17300, loss[loss=0.1307, simple_loss=0.2125, pruned_loss=0.02444, over 4821.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2082, pruned_loss=0.02986, over 972813.47 frames.], batch size: 13, lr: 1.38e-04 2022-05-08 17:59:11,265 INFO [train.py:715] (0/8) Epoch 16, batch 17350, loss[loss=0.1201, simple_loss=0.197, pruned_loss=0.02162, over 4871.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2084, pruned_loss=0.02989, over 973078.06 frames.], batch size: 16, lr: 1.38e-04 2022-05-08 17:59:49,075 INFO [train.py:715] (0/8) Epoch 16, batch 17400, loss[loss=0.1272, simple_loss=0.2093, pruned_loss=0.02255, over 4763.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2082, pruned_loss=0.02993, over 973313.59 frames.], batch size: 19, lr: 1.38e-04 2022-05-08 18:00:27,742 INFO [train.py:715] (0/8) Epoch 16, batch 17450, loss[loss=0.1383, simple_loss=0.2115, pruned_loss=0.03251, over 4955.00 frames.], tot_loss[loss=0.134, simple_loss=0.2082, pruned_loss=0.02991, over 972628.87 frames.], batch size: 23, lr: 1.38e-04 2022-05-08 18:01:04,504 INFO [train.py:715] (0/8) Epoch 16, batch 17500, loss[loss=0.1194, simple_loss=0.1903, pruned_loss=0.02429, over 4800.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2088, pruned_loss=0.02998, over 972718.27 frames.], batch size: 12, lr: 1.38e-04 2022-05-08 18:01:42,652 INFO [train.py:715] (0/8) Epoch 16, batch 17550, loss[loss=0.1285, simple_loss=0.213, pruned_loss=0.02198, over 4786.00 frames.], tot_loss[loss=0.134, simple_loss=0.2084, pruned_loss=0.02983, over 972591.76 frames.], batch size: 18, lr: 1.38e-04 2022-05-08 18:02:21,338 INFO [train.py:715] (0/8) Epoch 16, batch 17600, loss[loss=0.1077, simple_loss=0.1797, pruned_loss=0.01786, over 4650.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2083, pruned_loss=0.02976, over 972544.51 frames.], batch size: 13, lr: 1.38e-04 2022-05-08 18:02:58,690 INFO [train.py:715] (0/8) Epoch 16, batch 17650, loss[loss=0.09921, 
simple_loss=0.1679, pruned_loss=0.01523, over 4843.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2076, pruned_loss=0.02947, over 972538.80 frames.], batch size: 12, lr: 1.38e-04 2022-05-08 18:03:36,634 INFO [train.py:715] (0/8) Epoch 16, batch 17700, loss[loss=0.1325, simple_loss=0.2122, pruned_loss=0.02645, over 4894.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2071, pruned_loss=0.0293, over 972065.86 frames.], batch size: 19, lr: 1.38e-04 2022-05-08 18:04:15,002 INFO [train.py:715] (0/8) Epoch 16, batch 17750, loss[loss=0.1354, simple_loss=0.2072, pruned_loss=0.03181, over 4951.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2066, pruned_loss=0.02904, over 972564.34 frames.], batch size: 15, lr: 1.38e-04 2022-05-08 18:04:53,065 INFO [train.py:715] (0/8) Epoch 16, batch 17800, loss[loss=0.1296, simple_loss=0.2009, pruned_loss=0.02916, over 4764.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2069, pruned_loss=0.02949, over 972569.87 frames.], batch size: 19, lr: 1.38e-04 2022-05-08 18:05:30,276 INFO [train.py:715] (0/8) Epoch 16, batch 17850, loss[loss=0.1304, simple_loss=0.2051, pruned_loss=0.02791, over 4844.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2071, pruned_loss=0.02963, over 972223.69 frames.], batch size: 30, lr: 1.38e-04 2022-05-08 18:06:08,447 INFO [train.py:715] (0/8) Epoch 16, batch 17900, loss[loss=0.1313, simple_loss=0.2098, pruned_loss=0.02643, over 4787.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2071, pruned_loss=0.02972, over 972474.80 frames.], batch size: 17, lr: 1.38e-04 2022-05-08 18:06:46,888 INFO [train.py:715] (0/8) Epoch 16, batch 17950, loss[loss=0.1376, simple_loss=0.2118, pruned_loss=0.03174, over 4856.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2073, pruned_loss=0.02991, over 973351.06 frames.], batch size: 15, lr: 1.38e-04 2022-05-08 18:07:24,271 INFO [train.py:715] (0/8) Epoch 16, batch 18000, loss[loss=0.1398, simple_loss=0.2163, pruned_loss=0.03168, over 4960.00 frames.], tot_loss[loss=0.133, simple_loss=0.207, pruned_loss=0.0295, over 973722.70 frames.], batch size: 15, lr: 1.38e-04 2022-05-08 18:07:24,273 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 18:07:33,812 INFO [train.py:742] (0/8) Epoch 16, validation: loss=0.105, simple_loss=0.1884, pruned_loss=0.01082, over 914524.00 frames. 
2022-05-08 18:08:11,764 INFO [train.py:715] (0/8) Epoch 16, batch 18050, loss[loss=0.1343, simple_loss=0.2024, pruned_loss=0.03316, over 4738.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2078, pruned_loss=0.02989, over 972909.12 frames.], batch size: 12, lr: 1.38e-04 2022-05-08 18:08:50,176 INFO [train.py:715] (0/8) Epoch 16, batch 18100, loss[loss=0.1343, simple_loss=0.2091, pruned_loss=0.02975, over 4836.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2073, pruned_loss=0.02945, over 972976.76 frames.], batch size: 26, lr: 1.38e-04 2022-05-08 18:09:28,821 INFO [train.py:715] (0/8) Epoch 16, batch 18150, loss[loss=0.1276, simple_loss=0.2085, pruned_loss=0.02333, over 4908.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2079, pruned_loss=0.02982, over 972639.65 frames.], batch size: 18, lr: 1.38e-04 2022-05-08 18:10:07,469 INFO [train.py:715] (0/8) Epoch 16, batch 18200, loss[loss=0.1165, simple_loss=0.1867, pruned_loss=0.02315, over 4978.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2092, pruned_loss=0.03017, over 972838.70 frames.], batch size: 15, lr: 1.38e-04 2022-05-08 18:10:45,079 INFO [train.py:715] (0/8) Epoch 16, batch 18250, loss[loss=0.1306, simple_loss=0.2111, pruned_loss=0.02507, over 4942.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2091, pruned_loss=0.03023, over 973336.17 frames.], batch size: 29, lr: 1.38e-04 2022-05-08 18:11:23,842 INFO [train.py:715] (0/8) Epoch 16, batch 18300, loss[loss=0.1408, simple_loss=0.2273, pruned_loss=0.02716, over 4875.00 frames.], tot_loss[loss=0.1355, simple_loss=0.2097, pruned_loss=0.03063, over 973292.91 frames.], batch size: 16, lr: 1.38e-04 2022-05-08 18:12:02,944 INFO [train.py:715] (0/8) Epoch 16, batch 18350, loss[loss=0.119, simple_loss=0.1936, pruned_loss=0.02223, over 4894.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2086, pruned_loss=0.03029, over 973245.20 frames.], batch size: 19, lr: 1.38e-04 2022-05-08 18:12:40,716 INFO [train.py:715] (0/8) Epoch 16, batch 18400, loss[loss=0.1405, simple_loss=0.2219, pruned_loss=0.02953, over 4771.00 frames.], tot_loss[loss=0.1341, simple_loss=0.208, pruned_loss=0.03009, over 972972.85 frames.], batch size: 19, lr: 1.38e-04 2022-05-08 18:13:19,240 INFO [train.py:715] (0/8) Epoch 16, batch 18450, loss[loss=0.1257, simple_loss=0.1979, pruned_loss=0.02668, over 4852.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2078, pruned_loss=0.02995, over 972631.21 frames.], batch size: 30, lr: 1.38e-04 2022-05-08 18:13:57,847 INFO [train.py:715] (0/8) Epoch 16, batch 18500, loss[loss=0.1121, simple_loss=0.1933, pruned_loss=0.01541, over 4922.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2075, pruned_loss=0.02966, over 971744.51 frames.], batch size: 29, lr: 1.38e-04 2022-05-08 18:14:36,369 INFO [train.py:715] (0/8) Epoch 16, batch 18550, loss[loss=0.142, simple_loss=0.2135, pruned_loss=0.03527, over 4802.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2067, pruned_loss=0.02932, over 972109.38 frames.], batch size: 14, lr: 1.38e-04 2022-05-08 18:15:13,853 INFO [train.py:715] (0/8) Epoch 16, batch 18600, loss[loss=0.1235, simple_loss=0.1942, pruned_loss=0.0264, over 4818.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2072, pruned_loss=0.02964, over 971790.40 frames.], batch size: 15, lr: 1.38e-04 2022-05-08 18:15:52,136 INFO [train.py:715] (0/8) Epoch 16, batch 18650, loss[loss=0.1239, simple_loss=0.1987, pruned_loss=0.02452, over 4829.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2071, pruned_loss=0.02974, over 971232.92 frames.], batch size: 13, lr: 1.38e-04 2022-05-08 
18:16:30,639 INFO [train.py:715] (0/8) Epoch 16, batch 18700, loss[loss=0.1054, simple_loss=0.1727, pruned_loss=0.01909, over 4830.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2069, pruned_loss=0.02938, over 971870.60 frames.], batch size: 13, lr: 1.38e-04 2022-05-08 18:17:08,137 INFO [train.py:715] (0/8) Epoch 16, batch 18750, loss[loss=0.1403, simple_loss=0.2188, pruned_loss=0.03095, over 4972.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2079, pruned_loss=0.02964, over 972225.41 frames.], batch size: 24, lr: 1.38e-04 2022-05-08 18:17:45,510 INFO [train.py:715] (0/8) Epoch 16, batch 18800, loss[loss=0.1261, simple_loss=0.2064, pruned_loss=0.02293, over 4976.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2076, pruned_loss=0.02948, over 972271.58 frames.], batch size: 25, lr: 1.38e-04 2022-05-08 18:18:23,819 INFO [train.py:715] (0/8) Epoch 16, batch 18850, loss[loss=0.1213, simple_loss=0.2004, pruned_loss=0.02112, over 4811.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2077, pruned_loss=0.02966, over 972445.62 frames.], batch size: 21, lr: 1.38e-04 2022-05-08 18:19:02,090 INFO [train.py:715] (0/8) Epoch 16, batch 18900, loss[loss=0.1498, simple_loss=0.2275, pruned_loss=0.036, over 4897.00 frames.], tot_loss[loss=0.133, simple_loss=0.2074, pruned_loss=0.02932, over 972396.80 frames.], batch size: 39, lr: 1.38e-04 2022-05-08 18:19:39,519 INFO [train.py:715] (0/8) Epoch 16, batch 18950, loss[loss=0.1565, simple_loss=0.2316, pruned_loss=0.04068, over 4736.00 frames.], tot_loss[loss=0.133, simple_loss=0.2073, pruned_loss=0.02935, over 973450.16 frames.], batch size: 16, lr: 1.38e-04 2022-05-08 18:20:17,357 INFO [train.py:715] (0/8) Epoch 16, batch 19000, loss[loss=0.1342, simple_loss=0.2132, pruned_loss=0.02758, over 4880.00 frames.], tot_loss[loss=0.134, simple_loss=0.2083, pruned_loss=0.02983, over 972502.79 frames.], batch size: 22, lr: 1.38e-04 2022-05-08 18:20:55,963 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-576000.pt 2022-05-08 18:20:58,554 INFO [train.py:715] (0/8) Epoch 16, batch 19050, loss[loss=0.1292, simple_loss=0.2062, pruned_loss=0.02615, over 4817.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2077, pruned_loss=0.0296, over 972161.25 frames.], batch size: 15, lr: 1.38e-04 2022-05-08 18:21:36,425 INFO [train.py:715] (0/8) Epoch 16, batch 19100, loss[loss=0.1312, simple_loss=0.2054, pruned_loss=0.02849, over 4950.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2076, pruned_loss=0.0297, over 972554.97 frames.], batch size: 21, lr: 1.38e-04 2022-05-08 18:22:14,086 INFO [train.py:715] (0/8) Epoch 16, batch 19150, loss[loss=0.1524, simple_loss=0.2282, pruned_loss=0.03826, over 4807.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2082, pruned_loss=0.03024, over 973250.75 frames.], batch size: 14, lr: 1.38e-04 2022-05-08 18:22:52,366 INFO [train.py:715] (0/8) Epoch 16, batch 19200, loss[loss=0.1337, simple_loss=0.2099, pruned_loss=0.0287, over 4816.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2077, pruned_loss=0.02986, over 972708.76 frames.], batch size: 27, lr: 1.38e-04 2022-05-08 18:23:31,017 INFO [train.py:715] (0/8) Epoch 16, batch 19250, loss[loss=0.1173, simple_loss=0.1992, pruned_loss=0.01769, over 4792.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2076, pruned_loss=0.02979, over 972855.64 frames.], batch size: 18, lr: 1.38e-04 2022-05-08 18:24:08,552 INFO [train.py:715] (0/8) Epoch 16, batch 19300, loss[loss=0.1509, simple_loss=0.2301, pruned_loss=0.03587, over 4922.00 frames.], tot_loss[loss=0.1335, 
simple_loss=0.208, pruned_loss=0.02952, over 972410.69 frames.], batch size: 18, lr: 1.38e-04 2022-05-08 18:24:46,580 INFO [train.py:715] (0/8) Epoch 16, batch 19350, loss[loss=0.1454, simple_loss=0.2108, pruned_loss=0.04, over 4740.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2084, pruned_loss=0.02968, over 971818.42 frames.], batch size: 16, lr: 1.38e-04 2022-05-08 18:25:25,225 INFO [train.py:715] (0/8) Epoch 16, batch 19400, loss[loss=0.1434, simple_loss=0.2153, pruned_loss=0.0358, over 4986.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2079, pruned_loss=0.02956, over 972347.94 frames.], batch size: 16, lr: 1.38e-04 2022-05-08 18:26:03,256 INFO [train.py:715] (0/8) Epoch 16, batch 19450, loss[loss=0.1524, simple_loss=0.2279, pruned_loss=0.03847, over 4828.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2078, pruned_loss=0.02933, over 971677.56 frames.], batch size: 13, lr: 1.38e-04 2022-05-08 18:26:40,796 INFO [train.py:715] (0/8) Epoch 16, batch 19500, loss[loss=0.1276, simple_loss=0.2033, pruned_loss=0.02593, over 4861.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2079, pruned_loss=0.02943, over 972031.95 frames.], batch size: 13, lr: 1.38e-04 2022-05-08 18:27:18,955 INFO [train.py:715] (0/8) Epoch 16, batch 19550, loss[loss=0.1211, simple_loss=0.1859, pruned_loss=0.02812, over 4773.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2081, pruned_loss=0.02961, over 971606.77 frames.], batch size: 12, lr: 1.38e-04 2022-05-08 18:27:57,188 INFO [train.py:715] (0/8) Epoch 16, batch 19600, loss[loss=0.1558, simple_loss=0.2372, pruned_loss=0.0372, over 4910.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2079, pruned_loss=0.02973, over 971203.39 frames.], batch size: 19, lr: 1.38e-04 2022-05-08 18:28:34,598 INFO [train.py:715] (0/8) Epoch 16, batch 19650, loss[loss=0.1229, simple_loss=0.1983, pruned_loss=0.02372, over 4757.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2074, pruned_loss=0.02981, over 971602.00 frames.], batch size: 12, lr: 1.38e-04 2022-05-08 18:29:12,873 INFO [train.py:715] (0/8) Epoch 16, batch 19700, loss[loss=0.1379, simple_loss=0.2212, pruned_loss=0.02726, over 4760.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2081, pruned_loss=0.0296, over 972054.65 frames.], batch size: 19, lr: 1.38e-04 2022-05-08 18:29:51,090 INFO [train.py:715] (0/8) Epoch 16, batch 19750, loss[loss=0.144, simple_loss=0.2185, pruned_loss=0.03473, over 4840.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2085, pruned_loss=0.02993, over 972326.44 frames.], batch size: 32, lr: 1.38e-04 2022-05-08 18:30:28,915 INFO [train.py:715] (0/8) Epoch 16, batch 19800, loss[loss=0.1523, simple_loss=0.2234, pruned_loss=0.04056, over 4918.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2093, pruned_loss=0.02993, over 972746.50 frames.], batch size: 17, lr: 1.38e-04 2022-05-08 18:31:06,633 INFO [train.py:715] (0/8) Epoch 16, batch 19850, loss[loss=0.1243, simple_loss=0.2046, pruned_loss=0.02202, over 4808.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2098, pruned_loss=0.03, over 971938.60 frames.], batch size: 25, lr: 1.38e-04 2022-05-08 18:31:44,939 INFO [train.py:715] (0/8) Epoch 16, batch 19900, loss[loss=0.1161, simple_loss=0.1891, pruned_loss=0.02149, over 4895.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2089, pruned_loss=0.02977, over 973250.22 frames.], batch size: 22, lr: 1.38e-04 2022-05-08 18:32:22,969 INFO [train.py:715] (0/8) Epoch 16, batch 19950, loss[loss=0.1204, simple_loss=0.1994, pruned_loss=0.02073, over 4761.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2081, 
pruned_loss=0.02939, over 971768.74 frames.], batch size: 19, lr: 1.38e-04 2022-05-08 18:33:00,611 INFO [train.py:715] (0/8) Epoch 16, batch 20000, loss[loss=0.1151, simple_loss=0.193, pruned_loss=0.01859, over 4801.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2075, pruned_loss=0.0291, over 971859.67 frames.], batch size: 24, lr: 1.38e-04 2022-05-08 18:33:38,890 INFO [train.py:715] (0/8) Epoch 16, batch 20050, loss[loss=0.1241, simple_loss=0.2015, pruned_loss=0.02339, over 4853.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2068, pruned_loss=0.02899, over 972011.42 frames.], batch size: 20, lr: 1.38e-04 2022-05-08 18:34:17,303 INFO [train.py:715] (0/8) Epoch 16, batch 20100, loss[loss=0.1429, simple_loss=0.222, pruned_loss=0.03189, over 4935.00 frames.], tot_loss[loss=0.133, simple_loss=0.2074, pruned_loss=0.02934, over 972788.45 frames.], batch size: 29, lr: 1.38e-04 2022-05-08 18:34:54,666 INFO [train.py:715] (0/8) Epoch 16, batch 20150, loss[loss=0.1337, simple_loss=0.2103, pruned_loss=0.02849, over 4938.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2072, pruned_loss=0.02906, over 973338.81 frames.], batch size: 29, lr: 1.38e-04 2022-05-08 18:35:32,574 INFO [train.py:715] (0/8) Epoch 16, batch 20200, loss[loss=0.1329, simple_loss=0.201, pruned_loss=0.03234, over 4695.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2077, pruned_loss=0.02956, over 972786.87 frames.], batch size: 15, lr: 1.38e-04 2022-05-08 18:36:10,893 INFO [train.py:715] (0/8) Epoch 16, batch 20250, loss[loss=0.1369, simple_loss=0.2071, pruned_loss=0.03332, over 4846.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2083, pruned_loss=0.02992, over 972605.68 frames.], batch size: 32, lr: 1.38e-04 2022-05-08 18:36:49,188 INFO [train.py:715] (0/8) Epoch 16, batch 20300, loss[loss=0.1311, simple_loss=0.2091, pruned_loss=0.02653, over 4688.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2085, pruned_loss=0.03002, over 972603.06 frames.], batch size: 15, lr: 1.38e-04 2022-05-08 18:37:27,013 INFO [train.py:715] (0/8) Epoch 16, batch 20350, loss[loss=0.1397, simple_loss=0.2041, pruned_loss=0.03763, over 4833.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2078, pruned_loss=0.02985, over 972197.35 frames.], batch size: 15, lr: 1.37e-04 2022-05-08 18:38:05,171 INFO [train.py:715] (0/8) Epoch 16, batch 20400, loss[loss=0.15, simple_loss=0.2261, pruned_loss=0.03697, over 4782.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2079, pruned_loss=0.02985, over 971636.61 frames.], batch size: 12, lr: 1.37e-04 2022-05-08 18:38:43,163 INFO [train.py:715] (0/8) Epoch 16, batch 20450, loss[loss=0.1021, simple_loss=0.178, pruned_loss=0.01309, over 4964.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2074, pruned_loss=0.02993, over 972145.13 frames.], batch size: 24, lr: 1.37e-04 2022-05-08 18:39:21,069 INFO [train.py:715] (0/8) Epoch 16, batch 20500, loss[loss=0.115, simple_loss=0.1947, pruned_loss=0.0176, over 4960.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2071, pruned_loss=0.02975, over 972439.34 frames.], batch size: 35, lr: 1.37e-04 2022-05-08 18:39:58,713 INFO [train.py:715] (0/8) Epoch 16, batch 20550, loss[loss=0.1284, simple_loss=0.1973, pruned_loss=0.02973, over 4791.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2073, pruned_loss=0.02961, over 972412.23 frames.], batch size: 14, lr: 1.37e-04 2022-05-08 18:40:37,504 INFO [train.py:715] (0/8) Epoch 16, batch 20600, loss[loss=0.1227, simple_loss=0.1922, pruned_loss=0.02661, over 4956.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2073, pruned_loss=0.02924, over 
972255.28 frames.], batch size: 39, lr: 1.37e-04 2022-05-08 18:41:15,470 INFO [train.py:715] (0/8) Epoch 16, batch 20650, loss[loss=0.1147, simple_loss=0.1757, pruned_loss=0.02684, over 4868.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2073, pruned_loss=0.02945, over 972967.42 frames.], batch size: 16, lr: 1.37e-04 2022-05-08 18:41:52,931 INFO [train.py:715] (0/8) Epoch 16, batch 20700, loss[loss=0.1376, simple_loss=0.215, pruned_loss=0.03011, over 4761.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2062, pruned_loss=0.02881, over 973214.11 frames.], batch size: 19, lr: 1.37e-04 2022-05-08 18:42:31,437 INFO [train.py:715] (0/8) Epoch 16, batch 20750, loss[loss=0.1157, simple_loss=0.1863, pruned_loss=0.02253, over 4828.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2058, pruned_loss=0.02838, over 973555.46 frames.], batch size: 27, lr: 1.37e-04 2022-05-08 18:43:09,453 INFO [train.py:715] (0/8) Epoch 16, batch 20800, loss[loss=0.1182, simple_loss=0.1948, pruned_loss=0.02074, over 4795.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2057, pruned_loss=0.0284, over 973966.17 frames.], batch size: 17, lr: 1.37e-04 2022-05-08 18:43:47,987 INFO [train.py:715] (0/8) Epoch 16, batch 20850, loss[loss=0.1295, simple_loss=0.2037, pruned_loss=0.0276, over 4816.00 frames.], tot_loss[loss=0.132, simple_loss=0.2061, pruned_loss=0.02896, over 972956.11 frames.], batch size: 13, lr: 1.37e-04 2022-05-08 18:44:25,951 INFO [train.py:715] (0/8) Epoch 16, batch 20900, loss[loss=0.1165, simple_loss=0.1904, pruned_loss=0.02128, over 4855.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2065, pruned_loss=0.0294, over 972944.76 frames.], batch size: 32, lr: 1.37e-04 2022-05-08 18:45:05,221 INFO [train.py:715] (0/8) Epoch 16, batch 20950, loss[loss=0.1037, simple_loss=0.1791, pruned_loss=0.01418, over 4821.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2064, pruned_loss=0.02948, over 972004.84 frames.], batch size: 25, lr: 1.37e-04 2022-05-08 18:45:43,428 INFO [train.py:715] (0/8) Epoch 16, batch 21000, loss[loss=0.1227, simple_loss=0.2048, pruned_loss=0.02036, over 4794.00 frames.], tot_loss[loss=0.1322, simple_loss=0.206, pruned_loss=0.02921, over 971853.25 frames.], batch size: 24, lr: 1.37e-04 2022-05-08 18:45:43,429 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 18:45:53,029 INFO [train.py:742] (0/8) Epoch 16, validation: loss=0.1047, simple_loss=0.1882, pruned_loss=0.0106, over 914524.00 frames. 
2022-05-08 18:46:31,911 INFO [train.py:715] (0/8) Epoch 16, batch 21050, loss[loss=0.1503, simple_loss=0.2184, pruned_loss=0.04115, over 4692.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2068, pruned_loss=0.02992, over 971464.64 frames.], batch size: 15, lr: 1.37e-04 2022-05-08 18:47:10,471 INFO [train.py:715] (0/8) Epoch 16, batch 21100, loss[loss=0.1304, simple_loss=0.2026, pruned_loss=0.02906, over 4846.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2066, pruned_loss=0.02946, over 972041.01 frames.], batch size: 20, lr: 1.37e-04 2022-05-08 18:47:49,068 INFO [train.py:715] (0/8) Epoch 16, batch 21150, loss[loss=0.1312, simple_loss=0.2181, pruned_loss=0.02213, over 4835.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2066, pruned_loss=0.02905, over 972012.20 frames.], batch size: 26, lr: 1.37e-04 2022-05-08 18:48:27,787 INFO [train.py:715] (0/8) Epoch 16, batch 21200, loss[loss=0.1344, simple_loss=0.1932, pruned_loss=0.0378, over 4770.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2076, pruned_loss=0.0294, over 972436.57 frames.], batch size: 12, lr: 1.37e-04 2022-05-08 18:49:06,845 INFO [train.py:715] (0/8) Epoch 16, batch 21250, loss[loss=0.1391, simple_loss=0.2115, pruned_loss=0.03331, over 4759.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2091, pruned_loss=0.03008, over 972689.83 frames.], batch size: 19, lr: 1.37e-04 2022-05-08 18:49:44,917 INFO [train.py:715] (0/8) Epoch 16, batch 21300, loss[loss=0.1519, simple_loss=0.2201, pruned_loss=0.04184, over 4866.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2091, pruned_loss=0.03036, over 972330.35 frames.], batch size: 20, lr: 1.37e-04 2022-05-08 18:50:23,519 INFO [train.py:715] (0/8) Epoch 16, batch 21350, loss[loss=0.1299, simple_loss=0.2066, pruned_loss=0.02666, over 4942.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2087, pruned_loss=0.03028, over 972403.57 frames.], batch size: 21, lr: 1.37e-04 2022-05-08 18:51:01,533 INFO [train.py:715] (0/8) Epoch 16, batch 21400, loss[loss=0.148, simple_loss=0.2213, pruned_loss=0.03735, over 4839.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2081, pruned_loss=0.02976, over 972779.57 frames.], batch size: 15, lr: 1.37e-04 2022-05-08 18:51:39,052 INFO [train.py:715] (0/8) Epoch 16, batch 21450, loss[loss=0.1476, simple_loss=0.2212, pruned_loss=0.03707, over 4913.00 frames.], tot_loss[loss=0.134, simple_loss=0.2084, pruned_loss=0.0298, over 973799.94 frames.], batch size: 23, lr: 1.37e-04 2022-05-08 18:52:17,447 INFO [train.py:715] (0/8) Epoch 16, batch 21500, loss[loss=0.1176, simple_loss=0.2007, pruned_loss=0.01724, over 4972.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2087, pruned_loss=0.02995, over 974018.21 frames.], batch size: 15, lr: 1.37e-04 2022-05-08 18:52:55,410 INFO [train.py:715] (0/8) Epoch 16, batch 21550, loss[loss=0.1374, simple_loss=0.2053, pruned_loss=0.03477, over 4912.00 frames.], tot_loss[loss=0.134, simple_loss=0.2082, pruned_loss=0.02984, over 973001.61 frames.], batch size: 18, lr: 1.37e-04 2022-05-08 18:53:33,000 INFO [train.py:715] (0/8) Epoch 16, batch 21600, loss[loss=0.1545, simple_loss=0.2196, pruned_loss=0.04469, over 4864.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2079, pruned_loss=0.02945, over 972119.51 frames.], batch size: 34, lr: 1.37e-04 2022-05-08 18:54:11,334 INFO [train.py:715] (0/8) Epoch 16, batch 21650, loss[loss=0.1153, simple_loss=0.1862, pruned_loss=0.02223, over 4868.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2074, pruned_loss=0.02914, over 972520.75 frames.], batch size: 20, lr: 1.37e-04 2022-05-08 18:54:49,119 
INFO [train.py:715] (0/8) Epoch 16, batch 21700, loss[loss=0.1385, simple_loss=0.2192, pruned_loss=0.02888, over 4880.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2073, pruned_loss=0.02917, over 971971.38 frames.], batch size: 16, lr: 1.37e-04 2022-05-08 18:55:27,320 INFO [train.py:715] (0/8) Epoch 16, batch 21750, loss[loss=0.1691, simple_loss=0.2367, pruned_loss=0.05074, over 4891.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2069, pruned_loss=0.02948, over 972859.26 frames.], batch size: 17, lr: 1.37e-04 2022-05-08 18:56:04,815 INFO [train.py:715] (0/8) Epoch 16, batch 21800, loss[loss=0.1456, simple_loss=0.2213, pruned_loss=0.03493, over 4793.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2077, pruned_loss=0.03007, over 972589.33 frames.], batch size: 24, lr: 1.37e-04 2022-05-08 18:56:42,917 INFO [train.py:715] (0/8) Epoch 16, batch 21850, loss[loss=0.1378, simple_loss=0.2143, pruned_loss=0.03059, over 4816.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2082, pruned_loss=0.0301, over 972716.01 frames.], batch size: 13, lr: 1.37e-04 2022-05-08 18:57:20,559 INFO [train.py:715] (0/8) Epoch 16, batch 21900, loss[loss=0.1428, simple_loss=0.2185, pruned_loss=0.03348, over 4816.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2077, pruned_loss=0.0297, over 972101.73 frames.], batch size: 26, lr: 1.37e-04 2022-05-08 18:57:57,976 INFO [train.py:715] (0/8) Epoch 16, batch 21950, loss[loss=0.1396, simple_loss=0.213, pruned_loss=0.03317, over 4949.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2081, pruned_loss=0.02984, over 971686.49 frames.], batch size: 21, lr: 1.37e-04 2022-05-08 18:58:36,382 INFO [train.py:715] (0/8) Epoch 16, batch 22000, loss[loss=0.1395, simple_loss=0.217, pruned_loss=0.03098, over 4814.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2073, pruned_loss=0.02939, over 972098.00 frames.], batch size: 15, lr: 1.37e-04 2022-05-08 18:59:13,996 INFO [train.py:715] (0/8) Epoch 16, batch 22050, loss[loss=0.1542, simple_loss=0.22, pruned_loss=0.04416, over 4874.00 frames.], tot_loss[loss=0.1331, simple_loss=0.207, pruned_loss=0.02959, over 970742.30 frames.], batch size: 20, lr: 1.37e-04 2022-05-08 18:59:52,234 INFO [train.py:715] (0/8) Epoch 16, batch 22100, loss[loss=0.1415, simple_loss=0.2176, pruned_loss=0.0327, over 4830.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2072, pruned_loss=0.02965, over 971136.16 frames.], batch size: 27, lr: 1.37e-04 2022-05-08 19:00:29,949 INFO [train.py:715] (0/8) Epoch 16, batch 22150, loss[loss=0.1294, simple_loss=0.2109, pruned_loss=0.02393, over 4852.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2076, pruned_loss=0.02982, over 971625.56 frames.], batch size: 15, lr: 1.37e-04 2022-05-08 19:01:08,385 INFO [train.py:715] (0/8) Epoch 16, batch 22200, loss[loss=0.1095, simple_loss=0.1861, pruned_loss=0.01641, over 4939.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2068, pruned_loss=0.02939, over 972427.92 frames.], batch size: 29, lr: 1.37e-04 2022-05-08 19:01:46,148 INFO [train.py:715] (0/8) Epoch 16, batch 22250, loss[loss=0.1116, simple_loss=0.1836, pruned_loss=0.01978, over 4908.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2078, pruned_loss=0.0299, over 971984.98 frames.], batch size: 17, lr: 1.37e-04 2022-05-08 19:02:24,235 INFO [train.py:715] (0/8) Epoch 16, batch 22300, loss[loss=0.1334, simple_loss=0.2065, pruned_loss=0.03019, over 4802.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2077, pruned_loss=0.0297, over 973273.76 frames.], batch size: 14, lr: 1.37e-04 2022-05-08 19:03:02,792 INFO [train.py:715] (0/8) 
Epoch 16, batch 22350, loss[loss=0.1495, simple_loss=0.2294, pruned_loss=0.0348, over 4968.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2085, pruned_loss=0.02999, over 973416.52 frames.], batch size: 15, lr: 1.37e-04 2022-05-08 19:03:40,841 INFO [train.py:715] (0/8) Epoch 16, batch 22400, loss[loss=0.152, simple_loss=0.2224, pruned_loss=0.04082, over 4841.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2088, pruned_loss=0.03006, over 972644.07 frames.], batch size: 30, lr: 1.37e-04 2022-05-08 19:04:19,196 INFO [train.py:715] (0/8) Epoch 16, batch 22450, loss[loss=0.1224, simple_loss=0.1859, pruned_loss=0.02941, over 4740.00 frames.], tot_loss[loss=0.135, simple_loss=0.2085, pruned_loss=0.03072, over 971438.06 frames.], batch size: 12, lr: 1.37e-04 2022-05-08 19:04:57,326 INFO [train.py:715] (0/8) Epoch 16, batch 22500, loss[loss=0.1277, simple_loss=0.2052, pruned_loss=0.0251, over 4830.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2077, pruned_loss=0.03036, over 971513.12 frames.], batch size: 13, lr: 1.37e-04 2022-05-08 19:05:35,511 INFO [train.py:715] (0/8) Epoch 16, batch 22550, loss[loss=0.1127, simple_loss=0.1889, pruned_loss=0.01829, over 4825.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2075, pruned_loss=0.03043, over 971007.86 frames.], batch size: 26, lr: 1.37e-04 2022-05-08 19:06:13,251 INFO [train.py:715] (0/8) Epoch 16, batch 22600, loss[loss=0.1522, simple_loss=0.2261, pruned_loss=0.03914, over 4927.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2076, pruned_loss=0.03035, over 971770.21 frames.], batch size: 23, lr: 1.37e-04 2022-05-08 19:06:50,939 INFO [train.py:715] (0/8) Epoch 16, batch 22650, loss[loss=0.1172, simple_loss=0.1943, pruned_loss=0.02005, over 4829.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2073, pruned_loss=0.02975, over 971364.08 frames.], batch size: 26, lr: 1.37e-04 2022-05-08 19:07:29,632 INFO [train.py:715] (0/8) Epoch 16, batch 22700, loss[loss=0.1236, simple_loss=0.1947, pruned_loss=0.02622, over 4862.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2073, pruned_loss=0.02973, over 971409.52 frames.], batch size: 38, lr: 1.37e-04 2022-05-08 19:08:07,675 INFO [train.py:715] (0/8) Epoch 16, batch 22750, loss[loss=0.1348, simple_loss=0.2201, pruned_loss=0.02469, over 4964.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2071, pruned_loss=0.02925, over 971947.02 frames.], batch size: 15, lr: 1.37e-04 2022-05-08 19:08:45,788 INFO [train.py:715] (0/8) Epoch 16, batch 22800, loss[loss=0.1154, simple_loss=0.1926, pruned_loss=0.01914, over 4910.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2071, pruned_loss=0.02937, over 971949.02 frames.], batch size: 23, lr: 1.37e-04 2022-05-08 19:09:23,697 INFO [train.py:715] (0/8) Epoch 16, batch 22850, loss[loss=0.1198, simple_loss=0.1895, pruned_loss=0.02509, over 4907.00 frames.], tot_loss[loss=0.133, simple_loss=0.2072, pruned_loss=0.02941, over 971847.74 frames.], batch size: 18, lr: 1.37e-04 2022-05-08 19:10:01,844 INFO [train.py:715] (0/8) Epoch 16, batch 22900, loss[loss=0.1404, simple_loss=0.2167, pruned_loss=0.03199, over 4836.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2076, pruned_loss=0.02954, over 971431.65 frames.], batch size: 15, lr: 1.37e-04 2022-05-08 19:10:39,880 INFO [train.py:715] (0/8) Epoch 16, batch 22950, loss[loss=0.1335, simple_loss=0.2006, pruned_loss=0.03322, over 4938.00 frames.], tot_loss[loss=0.1338, simple_loss=0.208, pruned_loss=0.02977, over 971918.16 frames.], batch size: 29, lr: 1.37e-04 2022-05-08 19:11:17,827 INFO [train.py:715] (0/8) Epoch 16, batch 23000, 
loss[loss=0.1469, simple_loss=0.224, pruned_loss=0.03484, over 4938.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2083, pruned_loss=0.03011, over 972188.94 frames.], batch size: 21, lr: 1.37e-04 2022-05-08 19:11:56,365 INFO [train.py:715] (0/8) Epoch 16, batch 23050, loss[loss=0.1303, simple_loss=0.2104, pruned_loss=0.02513, over 4850.00 frames.], tot_loss[loss=0.1341, simple_loss=0.208, pruned_loss=0.03007, over 972441.66 frames.], batch size: 32, lr: 1.37e-04 2022-05-08 19:12:34,513 INFO [train.py:715] (0/8) Epoch 16, batch 23100, loss[loss=0.1311, simple_loss=0.2024, pruned_loss=0.02992, over 4867.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2075, pruned_loss=0.0298, over 972623.00 frames.], batch size: 22, lr: 1.37e-04 2022-05-08 19:13:12,450 INFO [train.py:715] (0/8) Epoch 16, batch 23150, loss[loss=0.1325, simple_loss=0.2171, pruned_loss=0.02396, over 4757.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2074, pruned_loss=0.02995, over 972977.01 frames.], batch size: 16, lr: 1.37e-04 2022-05-08 19:13:50,194 INFO [train.py:715] (0/8) Epoch 16, batch 23200, loss[loss=0.1284, simple_loss=0.2051, pruned_loss=0.02582, over 4852.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2067, pruned_loss=0.02946, over 972493.64 frames.], batch size: 30, lr: 1.37e-04 2022-05-08 19:14:28,508 INFO [train.py:715] (0/8) Epoch 16, batch 23250, loss[loss=0.129, simple_loss=0.2047, pruned_loss=0.02664, over 4778.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2062, pruned_loss=0.02937, over 972689.91 frames.], batch size: 18, lr: 1.37e-04 2022-05-08 19:15:06,175 INFO [train.py:715] (0/8) Epoch 16, batch 23300, loss[loss=0.135, simple_loss=0.1978, pruned_loss=0.03613, over 4857.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2066, pruned_loss=0.02984, over 971972.21 frames.], batch size: 32, lr: 1.37e-04 2022-05-08 19:15:44,246 INFO [train.py:715] (0/8) Epoch 16, batch 23350, loss[loss=0.1404, simple_loss=0.208, pruned_loss=0.03645, over 4824.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2077, pruned_loss=0.03037, over 971916.75 frames.], batch size: 13, lr: 1.37e-04 2022-05-08 19:16:21,894 INFO [train.py:715] (0/8) Epoch 16, batch 23400, loss[loss=0.1219, simple_loss=0.2049, pruned_loss=0.01949, over 4777.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2076, pruned_loss=0.03034, over 971830.27 frames.], batch size: 17, lr: 1.37e-04 2022-05-08 19:16:59,782 INFO [train.py:715] (0/8) Epoch 16, batch 23450, loss[loss=0.09388, simple_loss=0.1647, pruned_loss=0.01152, over 4834.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2066, pruned_loss=0.02964, over 972266.22 frames.], batch size: 13, lr: 1.37e-04 2022-05-08 19:17:37,688 INFO [train.py:715] (0/8) Epoch 16, batch 23500, loss[loss=0.1409, simple_loss=0.2197, pruned_loss=0.03111, over 4921.00 frames.], tot_loss[loss=0.133, simple_loss=0.2067, pruned_loss=0.02961, over 971895.60 frames.], batch size: 18, lr: 1.37e-04 2022-05-08 19:18:15,671 INFO [train.py:715] (0/8) Epoch 16, batch 23550, loss[loss=0.1114, simple_loss=0.1809, pruned_loss=0.02102, over 4755.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2059, pruned_loss=0.02931, over 971715.09 frames.], batch size: 19, lr: 1.37e-04 2022-05-08 19:18:54,221 INFO [train.py:715] (0/8) Epoch 16, batch 23600, loss[loss=0.1278, simple_loss=0.2084, pruned_loss=0.0236, over 4638.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2065, pruned_loss=0.02918, over 971422.34 frames.], batch size: 13, lr: 1.37e-04 2022-05-08 19:19:31,587 INFO [train.py:715] (0/8) Epoch 16, batch 23650, loss[loss=0.1567, 
simple_loss=0.2147, pruned_loss=0.0493, over 4852.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2064, pruned_loss=0.02932, over 971336.95 frames.], batch size: 30, lr: 1.37e-04 2022-05-08 19:20:09,498 INFO [train.py:715] (0/8) Epoch 16, batch 23700, loss[loss=0.1402, simple_loss=0.208, pruned_loss=0.03625, over 4854.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2059, pruned_loss=0.02921, over 971652.93 frames.], batch size: 30, lr: 1.37e-04 2022-05-08 19:20:47,873 INFO [train.py:715] (0/8) Epoch 16, batch 23750, loss[loss=0.1257, simple_loss=0.2045, pruned_loss=0.02348, over 4804.00 frames.], tot_loss[loss=0.132, simple_loss=0.2056, pruned_loss=0.02915, over 971237.23 frames.], batch size: 21, lr: 1.37e-04 2022-05-08 19:21:25,948 INFO [train.py:715] (0/8) Epoch 16, batch 23800, loss[loss=0.1353, simple_loss=0.2078, pruned_loss=0.03139, over 4832.00 frames.], tot_loss[loss=0.1336, simple_loss=0.207, pruned_loss=0.03008, over 972176.34 frames.], batch size: 15, lr: 1.37e-04 2022-05-08 19:22:04,204 INFO [train.py:715] (0/8) Epoch 16, batch 23850, loss[loss=0.147, simple_loss=0.2153, pruned_loss=0.0393, over 4934.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2065, pruned_loss=0.02958, over 972651.10 frames.], batch size: 39, lr: 1.37e-04 2022-05-08 19:22:42,140 INFO [train.py:715] (0/8) Epoch 16, batch 23900, loss[loss=0.1278, simple_loss=0.2059, pruned_loss=0.02479, over 4807.00 frames.], tot_loss[loss=0.1322, simple_loss=0.206, pruned_loss=0.02917, over 971813.80 frames.], batch size: 25, lr: 1.37e-04 2022-05-08 19:23:20,417 INFO [train.py:715] (0/8) Epoch 16, batch 23950, loss[loss=0.1402, simple_loss=0.2118, pruned_loss=0.03432, over 4871.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2059, pruned_loss=0.0288, over 971579.36 frames.], batch size: 16, lr: 1.37e-04 2022-05-08 19:23:57,816 INFO [train.py:715] (0/8) Epoch 16, batch 24000, loss[loss=0.1245, simple_loss=0.1977, pruned_loss=0.02561, over 4778.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2058, pruned_loss=0.02865, over 971472.16 frames.], batch size: 17, lr: 1.37e-04 2022-05-08 19:23:57,817 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 19:24:07,636 INFO [train.py:742] (0/8) Epoch 16, validation: loss=0.1049, simple_loss=0.1883, pruned_loss=0.01074, over 914524.00 frames. 
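Each batch entry above reports three figures: loss, simple_loss and pruned_loss. The printed values are consistent with the combined loss being roughly 0.5 * simple_loss + pruned_loss (for example, at batch 24000: 0.5 * 0.2058 + 0.02865 ≈ 0.1316). The snippet below is a minimal sketch that checks this relation on numbers copied from the log; the 0.5 weight is inferred from the printed values rather than read out of train.py, so treat it as an assumption.

# Sanity-check that the combined loss printed in these entries is consistent
# with loss ≈ 0.5 * simple_loss + pruned_loss. The 0.5 weight is an assumption
# inferred from the printed numbers, not taken from the training code.
samples = [
    # (loss, simple_loss, pruned_loss) copied from entries above
    (0.1333, 0.2068, 0.02992),  # epoch 16, batch 21050, tot_loss
    (0.1316, 0.2058, 0.02865),  # epoch 16, batch 24000, tot_loss
    (0.1049, 0.1883, 0.01074),  # epoch 16, validation at batch 24000
]
for loss, simple, pruned in samples:
    recon = 0.5 * simple + pruned
    # the log prints roughly four significant digits, so allow a small tolerance
    assert abs(recon - loss) < 5e-4, (loss, recon)
print("loss ≈ 0.5*simple_loss + pruned_loss holds for the sampled entries")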
2022-05-08 19:24:46,403 INFO [train.py:715] (0/8) Epoch 16, batch 24050, loss[loss=0.1626, simple_loss=0.2419, pruned_loss=0.04164, over 4848.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2058, pruned_loss=0.02847, over 971473.25 frames.], batch size: 30, lr: 1.37e-04 2022-05-08 19:25:24,727 INFO [train.py:715] (0/8) Epoch 16, batch 24100, loss[loss=0.1523, simple_loss=0.2209, pruned_loss=0.04188, over 4984.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2065, pruned_loss=0.02879, over 971644.50 frames.], batch size: 39, lr: 1.37e-04 2022-05-08 19:26:03,112 INFO [train.py:715] (0/8) Epoch 16, batch 24150, loss[loss=0.121, simple_loss=0.2013, pruned_loss=0.02036, over 4918.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2074, pruned_loss=0.02898, over 971704.02 frames.], batch size: 29, lr: 1.37e-04 2022-05-08 19:26:40,868 INFO [train.py:715] (0/8) Epoch 16, batch 24200, loss[loss=0.146, simple_loss=0.2193, pruned_loss=0.03633, over 4961.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2068, pruned_loss=0.02867, over 971995.22 frames.], batch size: 15, lr: 1.37e-04 2022-05-08 19:27:19,227 INFO [train.py:715] (0/8) Epoch 16, batch 24250, loss[loss=0.1158, simple_loss=0.1949, pruned_loss=0.01831, over 4861.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2071, pruned_loss=0.02853, over 972303.96 frames.], batch size: 20, lr: 1.37e-04 2022-05-08 19:27:57,170 INFO [train.py:715] (0/8) Epoch 16, batch 24300, loss[loss=0.1517, simple_loss=0.2152, pruned_loss=0.04411, over 4986.00 frames.], tot_loss[loss=0.1326, simple_loss=0.207, pruned_loss=0.0291, over 972380.48 frames.], batch size: 31, lr: 1.37e-04 2022-05-08 19:28:35,669 INFO [train.py:715] (0/8) Epoch 16, batch 24350, loss[loss=0.1167, simple_loss=0.2032, pruned_loss=0.01514, over 4931.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2069, pruned_loss=0.02899, over 971944.45 frames.], batch size: 29, lr: 1.37e-04 2022-05-08 19:29:13,221 INFO [train.py:715] (0/8) Epoch 16, batch 24400, loss[loss=0.128, simple_loss=0.2106, pruned_loss=0.02273, over 4806.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2063, pruned_loss=0.02898, over 972162.37 frames.], batch size: 21, lr: 1.37e-04 2022-05-08 19:29:50,779 INFO [train.py:715] (0/8) Epoch 16, batch 24450, loss[loss=0.1467, simple_loss=0.2153, pruned_loss=0.03902, over 4833.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2066, pruned_loss=0.0292, over 972343.08 frames.], batch size: 26, lr: 1.37e-04 2022-05-08 19:30:28,694 INFO [train.py:715] (0/8) Epoch 16, batch 24500, loss[loss=0.1468, simple_loss=0.2184, pruned_loss=0.03761, over 4908.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2066, pruned_loss=0.02921, over 973485.36 frames.], batch size: 39, lr: 1.37e-04 2022-05-08 19:31:06,545 INFO [train.py:715] (0/8) Epoch 16, batch 24550, loss[loss=0.1186, simple_loss=0.1899, pruned_loss=0.02368, over 4796.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2066, pruned_loss=0.02946, over 973017.23 frames.], batch size: 18, lr: 1.37e-04 2022-05-08 19:31:43,989 INFO [train.py:715] (0/8) Epoch 16, batch 24600, loss[loss=0.1318, simple_loss=0.2101, pruned_loss=0.02674, over 4793.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2068, pruned_loss=0.02938, over 972577.95 frames.], batch size: 24, lr: 1.37e-04 2022-05-08 19:32:21,344 INFO [train.py:715] (0/8) Epoch 16, batch 24650, loss[loss=0.1253, simple_loss=0.1952, pruned_loss=0.02771, over 4900.00 frames.], tot_loss[loss=0.133, simple_loss=0.2073, pruned_loss=0.0294, over 972525.09 frames.], batch size: 19, lr: 1.37e-04 2022-05-08 19:32:59,494 
INFO [train.py:715] (0/8) Epoch 16, batch 24700, loss[loss=0.145, simple_loss=0.227, pruned_loss=0.03146, over 4843.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2075, pruned_loss=0.02933, over 971534.63 frames.], batch size: 15, lr: 1.37e-04 2022-05-08 19:33:37,065 INFO [train.py:715] (0/8) Epoch 16, batch 24750, loss[loss=0.1199, simple_loss=0.1877, pruned_loss=0.02609, over 4825.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2075, pruned_loss=0.02942, over 970982.20 frames.], batch size: 15, lr: 1.37e-04 2022-05-08 19:34:14,864 INFO [train.py:715] (0/8) Epoch 16, batch 24800, loss[loss=0.1314, simple_loss=0.1979, pruned_loss=0.03243, over 4774.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2077, pruned_loss=0.02954, over 971331.76 frames.], batch size: 18, lr: 1.37e-04 2022-05-08 19:34:52,610 INFO [train.py:715] (0/8) Epoch 16, batch 24850, loss[loss=0.1514, simple_loss=0.2331, pruned_loss=0.0348, over 4862.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2075, pruned_loss=0.02971, over 971721.10 frames.], batch size: 16, lr: 1.37e-04 2022-05-08 19:35:30,366 INFO [train.py:715] (0/8) Epoch 16, batch 24900, loss[loss=0.1438, simple_loss=0.2094, pruned_loss=0.03909, over 4960.00 frames.], tot_loss[loss=0.133, simple_loss=0.2072, pruned_loss=0.02943, over 972344.09 frames.], batch size: 35, lr: 1.37e-04 2022-05-08 19:36:08,064 INFO [train.py:715] (0/8) Epoch 16, batch 24950, loss[loss=0.1264, simple_loss=0.2024, pruned_loss=0.02522, over 4776.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2074, pruned_loss=0.02924, over 972699.40 frames.], batch size: 17, lr: 1.37e-04 2022-05-08 19:36:45,483 INFO [train.py:715] (0/8) Epoch 16, batch 25000, loss[loss=0.1308, simple_loss=0.2106, pruned_loss=0.02551, over 4762.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2067, pruned_loss=0.02892, over 972455.13 frames.], batch size: 19, lr: 1.37e-04 2022-05-08 19:37:23,736 INFO [train.py:715] (0/8) Epoch 16, batch 25050, loss[loss=0.1256, simple_loss=0.1976, pruned_loss=0.02681, over 4840.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2062, pruned_loss=0.02857, over 973065.35 frames.], batch size: 15, lr: 1.37e-04 2022-05-08 19:38:02,496 INFO [train.py:715] (0/8) Epoch 16, batch 25100, loss[loss=0.1195, simple_loss=0.195, pruned_loss=0.02203, over 4961.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2058, pruned_loss=0.02834, over 972328.90 frames.], batch size: 24, lr: 1.37e-04 2022-05-08 19:38:40,220 INFO [train.py:715] (0/8) Epoch 16, batch 25150, loss[loss=0.1341, simple_loss=0.2081, pruned_loss=0.03, over 4948.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2058, pruned_loss=0.0287, over 971932.61 frames.], batch size: 21, lr: 1.37e-04 2022-05-08 19:39:18,057 INFO [train.py:715] (0/8) Epoch 16, batch 25200, loss[loss=0.1339, simple_loss=0.2025, pruned_loss=0.03265, over 4933.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2062, pruned_loss=0.02899, over 971789.36 frames.], batch size: 18, lr: 1.37e-04 2022-05-08 19:39:56,031 INFO [train.py:715] (0/8) Epoch 16, batch 25250, loss[loss=0.1433, simple_loss=0.2156, pruned_loss=0.0355, over 4691.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2071, pruned_loss=0.02922, over 971609.25 frames.], batch size: 15, lr: 1.37e-04 2022-05-08 19:40:33,642 INFO [train.py:715] (0/8) Epoch 16, batch 25300, loss[loss=0.132, simple_loss=0.2134, pruned_loss=0.02528, over 4850.00 frames.], tot_loss[loss=0.133, simple_loss=0.2077, pruned_loss=0.02916, over 971135.30 frames.], batch size: 20, lr: 1.37e-04 2022-05-08 19:41:10,908 INFO [train.py:715] (0/8) 
Epoch 16, batch 25350, loss[loss=0.1398, simple_loss=0.2111, pruned_loss=0.03423, over 4789.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2074, pruned_loss=0.02968, over 971133.86 frames.], batch size: 18, lr: 1.37e-04 2022-05-08 19:41:49,012 INFO [train.py:715] (0/8) Epoch 16, batch 25400, loss[loss=0.126, simple_loss=0.1976, pruned_loss=0.02717, over 4855.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2068, pruned_loss=0.02946, over 971636.18 frames.], batch size: 34, lr: 1.37e-04 2022-05-08 19:42:27,346 INFO [train.py:715] (0/8) Epoch 16, batch 25450, loss[loss=0.152, simple_loss=0.2319, pruned_loss=0.03602, over 4955.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2069, pruned_loss=0.02948, over 972158.51 frames.], batch size: 24, lr: 1.37e-04 2022-05-08 19:43:04,840 INFO [train.py:715] (0/8) Epoch 16, batch 25500, loss[loss=0.1366, simple_loss=0.2123, pruned_loss=0.03041, over 4952.00 frames.], tot_loss[loss=0.133, simple_loss=0.2072, pruned_loss=0.02944, over 972661.89 frames.], batch size: 29, lr: 1.37e-04 2022-05-08 19:43:42,830 INFO [train.py:715] (0/8) Epoch 16, batch 25550, loss[loss=0.1358, simple_loss=0.2062, pruned_loss=0.03267, over 4978.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2079, pruned_loss=0.02976, over 973377.84 frames.], batch size: 15, lr: 1.37e-04 2022-05-08 19:44:21,341 INFO [train.py:715] (0/8) Epoch 16, batch 25600, loss[loss=0.1479, simple_loss=0.2206, pruned_loss=0.03757, over 4832.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2081, pruned_loss=0.02967, over 972499.88 frames.], batch size: 26, lr: 1.37e-04 2022-05-08 19:45:00,125 INFO [train.py:715] (0/8) Epoch 16, batch 25650, loss[loss=0.1206, simple_loss=0.2005, pruned_loss=0.02037, over 4748.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2073, pruned_loss=0.02926, over 971780.72 frames.], batch size: 16, lr: 1.37e-04 2022-05-08 19:45:38,352 INFO [train.py:715] (0/8) Epoch 16, batch 25700, loss[loss=0.1331, simple_loss=0.2114, pruned_loss=0.02734, over 4955.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2076, pruned_loss=0.02965, over 971940.46 frames.], batch size: 21, lr: 1.37e-04 2022-05-08 19:46:16,985 INFO [train.py:715] (0/8) Epoch 16, batch 25750, loss[loss=0.1212, simple_loss=0.1873, pruned_loss=0.02757, over 4839.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2078, pruned_loss=0.02983, over 972278.14 frames.], batch size: 32, lr: 1.37e-04 2022-05-08 19:46:55,622 INFO [train.py:715] (0/8) Epoch 16, batch 25800, loss[loss=0.1223, simple_loss=0.199, pruned_loss=0.02284, over 4905.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2077, pruned_loss=0.02969, over 972349.55 frames.], batch size: 19, lr: 1.37e-04 2022-05-08 19:47:34,228 INFO [train.py:715] (0/8) Epoch 16, batch 25850, loss[loss=0.172, simple_loss=0.2411, pruned_loss=0.05144, over 4857.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2075, pruned_loss=0.02984, over 972336.96 frames.], batch size: 32, lr: 1.37e-04 2022-05-08 19:48:13,049 INFO [train.py:715] (0/8) Epoch 16, batch 25900, loss[loss=0.136, simple_loss=0.2067, pruned_loss=0.03262, over 4823.00 frames.], tot_loss[loss=0.1332, simple_loss=0.207, pruned_loss=0.02969, over 972384.83 frames.], batch size: 15, lr: 1.37e-04 2022-05-08 19:48:52,467 INFO [train.py:715] (0/8) Epoch 16, batch 25950, loss[loss=0.1391, simple_loss=0.209, pruned_loss=0.03457, over 4809.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2068, pruned_loss=0.02944, over 972425.18 frames.], batch size: 13, lr: 1.37e-04 2022-05-08 19:49:32,203 INFO [train.py:715] (0/8) Epoch 16, batch 26000, 
loss[loss=0.1033, simple_loss=0.1735, pruned_loss=0.01654, over 4820.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2066, pruned_loss=0.02915, over 973040.96 frames.], batch size: 12, lr: 1.37e-04 2022-05-08 19:50:11,555 INFO [train.py:715] (0/8) Epoch 16, batch 26050, loss[loss=0.08701, simple_loss=0.1575, pruned_loss=0.008236, over 4812.00 frames.], tot_loss[loss=0.133, simple_loss=0.207, pruned_loss=0.02946, over 973215.08 frames.], batch size: 13, lr: 1.37e-04 2022-05-08 19:50:50,793 INFO [train.py:715] (0/8) Epoch 16, batch 26100, loss[loss=0.1306, simple_loss=0.1987, pruned_loss=0.03127, over 4769.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2065, pruned_loss=0.02941, over 972817.25 frames.], batch size: 14, lr: 1.37e-04 2022-05-08 19:51:30,058 INFO [train.py:715] (0/8) Epoch 16, batch 26150, loss[loss=0.121, simple_loss=0.1948, pruned_loss=0.0236, over 4982.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2065, pruned_loss=0.02947, over 973090.10 frames.], batch size: 15, lr: 1.37e-04 2022-05-08 19:52:08,699 INFO [train.py:715] (0/8) Epoch 16, batch 26200, loss[loss=0.1135, simple_loss=0.1799, pruned_loss=0.02354, over 4692.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2065, pruned_loss=0.02948, over 972816.55 frames.], batch size: 12, lr: 1.37e-04 2022-05-08 19:52:48,173 INFO [train.py:715] (0/8) Epoch 16, batch 26250, loss[loss=0.1102, simple_loss=0.1848, pruned_loss=0.01776, over 4701.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2069, pruned_loss=0.02944, over 972737.62 frames.], batch size: 15, lr: 1.37e-04 2022-05-08 19:53:27,328 INFO [train.py:715] (0/8) Epoch 16, batch 26300, loss[loss=0.1296, simple_loss=0.204, pruned_loss=0.02758, over 4982.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2079, pruned_loss=0.02985, over 972747.21 frames.], batch size: 35, lr: 1.37e-04 2022-05-08 19:54:06,988 INFO [train.py:715] (0/8) Epoch 16, batch 26350, loss[loss=0.1332, simple_loss=0.2005, pruned_loss=0.03291, over 4988.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2085, pruned_loss=0.02999, over 972750.67 frames.], batch size: 25, lr: 1.37e-04 2022-05-08 19:54:46,279 INFO [train.py:715] (0/8) Epoch 16, batch 26400, loss[loss=0.1192, simple_loss=0.2004, pruned_loss=0.01895, over 4936.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2079, pruned_loss=0.02973, over 971921.37 frames.], batch size: 21, lr: 1.37e-04 2022-05-08 19:55:26,166 INFO [train.py:715] (0/8) Epoch 16, batch 26450, loss[loss=0.1114, simple_loss=0.186, pruned_loss=0.01841, over 4813.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2072, pruned_loss=0.02947, over 971456.86 frames.], batch size: 26, lr: 1.37e-04 2022-05-08 19:56:05,122 INFO [train.py:715] (0/8) Epoch 16, batch 26500, loss[loss=0.1182, simple_loss=0.1897, pruned_loss=0.02338, over 4747.00 frames.], tot_loss[loss=0.133, simple_loss=0.2068, pruned_loss=0.02959, over 971675.40 frames.], batch size: 19, lr: 1.37e-04 2022-05-08 19:56:44,033 INFO [train.py:715] (0/8) Epoch 16, batch 26550, loss[loss=0.1348, simple_loss=0.2095, pruned_loss=0.03006, over 4967.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2061, pruned_loss=0.02912, over 971850.75 frames.], batch size: 24, lr: 1.37e-04 2022-05-08 19:57:23,099 INFO [train.py:715] (0/8) Epoch 16, batch 26600, loss[loss=0.1287, simple_loss=0.2102, pruned_loss=0.02357, over 4875.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2062, pruned_loss=0.0292, over 972057.56 frames.], batch size: 16, lr: 1.37e-04 2022-05-08 19:58:02,095 INFO [train.py:715] (0/8) Epoch 16, batch 26650, loss[loss=0.1345, 
simple_loss=0.203, pruned_loss=0.033, over 4800.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2061, pruned_loss=0.02918, over 971383.38 frames.], batch size: 17, lr: 1.37e-04 2022-05-08 19:58:41,424 INFO [train.py:715] (0/8) Epoch 16, batch 26700, loss[loss=0.1238, simple_loss=0.1959, pruned_loss=0.02582, over 4771.00 frames.], tot_loss[loss=0.132, simple_loss=0.2061, pruned_loss=0.02893, over 971667.79 frames.], batch size: 18, lr: 1.37e-04 2022-05-08 19:59:20,658 INFO [train.py:715] (0/8) Epoch 16, batch 26750, loss[loss=0.1215, simple_loss=0.1913, pruned_loss=0.02584, over 4816.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2059, pruned_loss=0.02886, over 971695.77 frames.], batch size: 13, lr: 1.37e-04 2022-05-08 20:00:00,472 INFO [train.py:715] (0/8) Epoch 16, batch 26800, loss[loss=0.1117, simple_loss=0.1782, pruned_loss=0.02257, over 4776.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2061, pruned_loss=0.02885, over 971767.93 frames.], batch size: 12, lr: 1.37e-04 2022-05-08 20:00:39,346 INFO [train.py:715] (0/8) Epoch 16, batch 26850, loss[loss=0.1176, simple_loss=0.1946, pruned_loss=0.02031, over 4771.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2054, pruned_loss=0.02878, over 971861.52 frames.], batch size: 18, lr: 1.37e-04 2022-05-08 20:01:18,825 INFO [train.py:715] (0/8) Epoch 16, batch 26900, loss[loss=0.1057, simple_loss=0.1786, pruned_loss=0.01643, over 4924.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2058, pruned_loss=0.02916, over 972266.24 frames.], batch size: 23, lr: 1.37e-04 2022-05-08 20:01:58,323 INFO [train.py:715] (0/8) Epoch 16, batch 26950, loss[loss=0.1346, simple_loss=0.2118, pruned_loss=0.02871, over 4808.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2062, pruned_loss=0.02909, over 971971.01 frames.], batch size: 25, lr: 1.37e-04 2022-05-08 20:02:37,501 INFO [train.py:715] (0/8) Epoch 16, batch 27000, loss[loss=0.1322, simple_loss=0.199, pruned_loss=0.03267, over 4982.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2083, pruned_loss=0.03007, over 972535.47 frames.], batch size: 35, lr: 1.37e-04 2022-05-08 20:02:37,502 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 20:02:47,199 INFO [train.py:742] (0/8) Epoch 16, validation: loss=0.1048, simple_loss=0.1883, pruned_loss=0.01067, over 914524.00 frames. 
2022-05-08 20:03:26,294 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-584000.pt 2022-05-08 20:03:29,121 INFO [train.py:715] (0/8) Epoch 16, batch 27050, loss[loss=0.1456, simple_loss=0.2201, pruned_loss=0.03551, over 4943.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2088, pruned_loss=0.03019, over 973215.13 frames.], batch size: 23, lr: 1.37e-04 2022-05-08 20:04:08,232 INFO [train.py:715] (0/8) Epoch 16, batch 27100, loss[loss=0.119, simple_loss=0.194, pruned_loss=0.02204, over 4782.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2088, pruned_loss=0.03027, over 973131.26 frames.], batch size: 14, lr: 1.37e-04 2022-05-08 20:04:47,170 INFO [train.py:715] (0/8) Epoch 16, batch 27150, loss[loss=0.1357, simple_loss=0.2154, pruned_loss=0.02801, over 4808.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2085, pruned_loss=0.03007, over 971951.47 frames.], batch size: 25, lr: 1.37e-04 2022-05-08 20:05:26,603 INFO [train.py:715] (0/8) Epoch 16, batch 27200, loss[loss=0.1236, simple_loss=0.1984, pruned_loss=0.02443, over 4950.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2084, pruned_loss=0.02974, over 972936.83 frames.], batch size: 21, lr: 1.37e-04 2022-05-08 20:06:05,789 INFO [train.py:715] (0/8) Epoch 16, batch 27250, loss[loss=0.1184, simple_loss=0.201, pruned_loss=0.01785, over 4837.00 frames.], tot_loss[loss=0.1334, simple_loss=0.208, pruned_loss=0.02941, over 972710.99 frames.], batch size: 13, lr: 1.37e-04 2022-05-08 20:06:45,174 INFO [train.py:715] (0/8) Epoch 16, batch 27300, loss[loss=0.1439, simple_loss=0.2283, pruned_loss=0.02974, over 4865.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2084, pruned_loss=0.02972, over 972189.02 frames.], batch size: 16, lr: 1.37e-04 2022-05-08 20:07:24,242 INFO [train.py:715] (0/8) Epoch 16, batch 27350, loss[loss=0.1339, simple_loss=0.2101, pruned_loss=0.02892, over 4807.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2081, pruned_loss=0.02978, over 972288.12 frames.], batch size: 21, lr: 1.37e-04 2022-05-08 20:08:03,611 INFO [train.py:715] (0/8) Epoch 16, batch 27400, loss[loss=0.1187, simple_loss=0.2086, pruned_loss=0.01441, over 4916.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2077, pruned_loss=0.02981, over 972677.89 frames.], batch size: 23, lr: 1.37e-04 2022-05-08 20:08:42,902 INFO [train.py:715] (0/8) Epoch 16, batch 27450, loss[loss=0.1554, simple_loss=0.2248, pruned_loss=0.043, over 4911.00 frames.], tot_loss[loss=0.1338, simple_loss=0.208, pruned_loss=0.02978, over 972869.65 frames.], batch size: 17, lr: 1.37e-04 2022-05-08 20:09:21,912 INFO [train.py:715] (0/8) Epoch 16, batch 27500, loss[loss=0.146, simple_loss=0.2204, pruned_loss=0.03583, over 4959.00 frames.], tot_loss[loss=0.134, simple_loss=0.2084, pruned_loss=0.02983, over 972667.97 frames.], batch size: 24, lr: 1.37e-04 2022-05-08 20:10:01,268 INFO [train.py:715] (0/8) Epoch 16, batch 27550, loss[loss=0.1274, simple_loss=0.2065, pruned_loss=0.02419, over 4911.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2089, pruned_loss=0.03005, over 972978.56 frames.], batch size: 39, lr: 1.37e-04 2022-05-08 20:10:41,118 INFO [train.py:715] (0/8) Epoch 16, batch 27600, loss[loss=0.1204, simple_loss=0.1947, pruned_loss=0.02312, over 4893.00 frames.], tot_loss[loss=0.1352, simple_loss=0.2097, pruned_loss=0.03034, over 973190.99 frames.], batch size: 17, lr: 1.37e-04 2022-05-08 20:11:20,144 INFO [train.py:715] (0/8) Epoch 16, batch 27650, loss[loss=0.1164, simple_loss=0.1956, pruned_loss=0.0186, over 4790.00 frames.], 
tot_loss[loss=0.1343, simple_loss=0.2088, pruned_loss=0.02991, over 972525.50 frames.], batch size: 18, lr: 1.37e-04 2022-05-08 20:11:59,672 INFO [train.py:715] (0/8) Epoch 16, batch 27700, loss[loss=0.1387, simple_loss=0.2258, pruned_loss=0.02583, over 4932.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2087, pruned_loss=0.02979, over 973314.62 frames.], batch size: 21, lr: 1.37e-04 2022-05-08 20:12:38,978 INFO [train.py:715] (0/8) Epoch 16, batch 27750, loss[loss=0.1286, simple_loss=0.2116, pruned_loss=0.02279, over 4776.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2089, pruned_loss=0.0301, over 973050.66 frames.], batch size: 18, lr: 1.37e-04 2022-05-08 20:13:18,192 INFO [train.py:715] (0/8) Epoch 16, batch 27800, loss[loss=0.1515, simple_loss=0.2318, pruned_loss=0.03558, over 4697.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2088, pruned_loss=0.02979, over 972353.34 frames.], batch size: 15, lr: 1.37e-04 2022-05-08 20:13:57,551 INFO [train.py:715] (0/8) Epoch 16, batch 27850, loss[loss=0.1217, simple_loss=0.2005, pruned_loss=0.02141, over 4749.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2089, pruned_loss=0.03011, over 971756.52 frames.], batch size: 16, lr: 1.37e-04 2022-05-08 20:14:36,984 INFO [train.py:715] (0/8) Epoch 16, batch 27900, loss[loss=0.1404, simple_loss=0.2143, pruned_loss=0.03322, over 4836.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2089, pruned_loss=0.02998, over 971767.10 frames.], batch size: 15, lr: 1.37e-04 2022-05-08 20:15:16,663 INFO [train.py:715] (0/8) Epoch 16, batch 27950, loss[loss=0.1101, simple_loss=0.1847, pruned_loss=0.01777, over 4795.00 frames.], tot_loss[loss=0.134, simple_loss=0.2083, pruned_loss=0.02985, over 971511.13 frames.], batch size: 18, lr: 1.37e-04 2022-05-08 20:15:55,979 INFO [train.py:715] (0/8) Epoch 16, batch 28000, loss[loss=0.1092, simple_loss=0.1856, pruned_loss=0.01643, over 4815.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2071, pruned_loss=0.02907, over 971628.35 frames.], batch size: 12, lr: 1.37e-04 2022-05-08 20:16:35,538 INFO [train.py:715] (0/8) Epoch 16, batch 28050, loss[loss=0.1417, simple_loss=0.2215, pruned_loss=0.03097, over 4994.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2078, pruned_loss=0.02964, over 972598.09 frames.], batch size: 20, lr: 1.37e-04 2022-05-08 20:17:15,205 INFO [train.py:715] (0/8) Epoch 16, batch 28100, loss[loss=0.1279, simple_loss=0.2085, pruned_loss=0.02368, over 4960.00 frames.], tot_loss[loss=0.1336, simple_loss=0.208, pruned_loss=0.02964, over 972268.73 frames.], batch size: 21, lr: 1.37e-04 2022-05-08 20:17:54,187 INFO [train.py:715] (0/8) Epoch 16, batch 28150, loss[loss=0.1396, simple_loss=0.2092, pruned_loss=0.03501, over 4777.00 frames.], tot_loss[loss=0.134, simple_loss=0.2083, pruned_loss=0.02981, over 973060.00 frames.], batch size: 14, lr: 1.37e-04 2022-05-08 20:18:33,940 INFO [train.py:715] (0/8) Epoch 16, batch 28200, loss[loss=0.119, simple_loss=0.1998, pruned_loss=0.01912, over 4823.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2078, pruned_loss=0.02987, over 973072.74 frames.], batch size: 25, lr: 1.37e-04 2022-05-08 20:19:13,269 INFO [train.py:715] (0/8) Epoch 16, batch 28250, loss[loss=0.1422, simple_loss=0.2073, pruned_loss=0.0386, over 4944.00 frames.], tot_loss[loss=0.1351, simple_loss=0.2094, pruned_loss=0.03047, over 972401.40 frames.], batch size: 23, lr: 1.37e-04 2022-05-08 20:19:51,888 INFO [train.py:715] (0/8) Epoch 16, batch 28300, loss[loss=0.1262, simple_loss=0.2046, pruned_loss=0.02385, over 4903.00 frames.], tot_loss[loss=0.1343, 
simple_loss=0.2085, pruned_loss=0.03009, over 972194.65 frames.], batch size: 18, lr: 1.37e-04 2022-05-08 20:20:31,603 INFO [train.py:715] (0/8) Epoch 16, batch 28350, loss[loss=0.1206, simple_loss=0.1984, pruned_loss=0.02143, over 4987.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2082, pruned_loss=0.03008, over 972621.35 frames.], batch size: 25, lr: 1.37e-04 2022-05-08 20:21:11,557 INFO [train.py:715] (0/8) Epoch 16, batch 28400, loss[loss=0.1276, simple_loss=0.1986, pruned_loss=0.02826, over 4849.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2084, pruned_loss=0.0301, over 972422.18 frames.], batch size: 32, lr: 1.37e-04 2022-05-08 20:21:51,017 INFO [train.py:715] (0/8) Epoch 16, batch 28450, loss[loss=0.1265, simple_loss=0.208, pruned_loss=0.02251, over 4908.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2079, pruned_loss=0.02961, over 972551.30 frames.], batch size: 19, lr: 1.37e-04 2022-05-08 20:22:29,719 INFO [train.py:715] (0/8) Epoch 16, batch 28500, loss[loss=0.1399, simple_loss=0.2047, pruned_loss=0.03758, over 4787.00 frames.], tot_loss[loss=0.1337, simple_loss=0.208, pruned_loss=0.02972, over 972353.85 frames.], batch size: 12, lr: 1.37e-04 2022-05-08 20:23:09,883 INFO [train.py:715] (0/8) Epoch 16, batch 28550, loss[loss=0.118, simple_loss=0.1911, pruned_loss=0.02243, over 4794.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2084, pruned_loss=0.03, over 971957.28 frames.], batch size: 17, lr: 1.37e-04 2022-05-08 20:23:49,358 INFO [train.py:715] (0/8) Epoch 16, batch 28600, loss[loss=0.1414, simple_loss=0.2194, pruned_loss=0.03167, over 4856.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2077, pruned_loss=0.0297, over 972560.12 frames.], batch size: 20, lr: 1.37e-04 2022-05-08 20:24:28,944 INFO [train.py:715] (0/8) Epoch 16, batch 28650, loss[loss=0.1384, simple_loss=0.2155, pruned_loss=0.03069, over 4987.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2079, pruned_loss=0.02973, over 972165.11 frames.], batch size: 28, lr: 1.37e-04 2022-05-08 20:25:08,098 INFO [train.py:715] (0/8) Epoch 16, batch 28700, loss[loss=0.1393, simple_loss=0.2044, pruned_loss=0.03708, over 4974.00 frames.], tot_loss[loss=0.1346, simple_loss=0.209, pruned_loss=0.03008, over 972971.28 frames.], batch size: 35, lr: 1.37e-04 2022-05-08 20:25:47,664 INFO [train.py:715] (0/8) Epoch 16, batch 28750, loss[loss=0.1415, simple_loss=0.2178, pruned_loss=0.03261, over 4796.00 frames.], tot_loss[loss=0.1348, simple_loss=0.209, pruned_loss=0.03025, over 973185.42 frames.], batch size: 18, lr: 1.37e-04 2022-05-08 20:26:27,376 INFO [train.py:715] (0/8) Epoch 16, batch 28800, loss[loss=0.1152, simple_loss=0.1922, pruned_loss=0.01913, over 4939.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2084, pruned_loss=0.02968, over 972691.51 frames.], batch size: 21, lr: 1.36e-04 2022-05-08 20:27:06,537 INFO [train.py:715] (0/8) Epoch 16, batch 28850, loss[loss=0.1255, simple_loss=0.2029, pruned_loss=0.02401, over 4981.00 frames.], tot_loss[loss=0.1342, simple_loss=0.209, pruned_loss=0.02975, over 972799.57 frames.], batch size: 15, lr: 1.36e-04 2022-05-08 20:27:46,353 INFO [train.py:715] (0/8) Epoch 16, batch 28900, loss[loss=0.151, simple_loss=0.2092, pruned_loss=0.04641, over 4955.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2088, pruned_loss=0.02972, over 974099.88 frames.], batch size: 35, lr: 1.36e-04 2022-05-08 20:28:25,935 INFO [train.py:715] (0/8) Epoch 16, batch 28950, loss[loss=0.1723, simple_loss=0.2387, pruned_loss=0.05295, over 4849.00 frames.], tot_loss[loss=0.134, simple_loss=0.2085, 
pruned_loss=0.02978, over 974351.10 frames.], batch size: 30, lr: 1.36e-04 2022-05-08 20:29:05,873 INFO [train.py:715] (0/8) Epoch 16, batch 29000, loss[loss=0.1337, simple_loss=0.2121, pruned_loss=0.02764, over 4791.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2077, pruned_loss=0.02947, over 974002.69 frames.], batch size: 17, lr: 1.36e-04 2022-05-08 20:29:45,337 INFO [train.py:715] (0/8) Epoch 16, batch 29050, loss[loss=0.1284, simple_loss=0.202, pruned_loss=0.02746, over 4848.00 frames.], tot_loss[loss=0.1335, simple_loss=0.208, pruned_loss=0.02951, over 972816.33 frames.], batch size: 30, lr: 1.36e-04 2022-05-08 20:30:25,180 INFO [train.py:715] (0/8) Epoch 16, batch 29100, loss[loss=0.1818, simple_loss=0.254, pruned_loss=0.05478, over 4904.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2076, pruned_loss=0.02951, over 972904.00 frames.], batch size: 19, lr: 1.36e-04 2022-05-08 20:31:06,237 INFO [train.py:715] (0/8) Epoch 16, batch 29150, loss[loss=0.1533, simple_loss=0.2361, pruned_loss=0.03528, over 4802.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2073, pruned_loss=0.02911, over 973030.77 frames.], batch size: 14, lr: 1.36e-04 2022-05-08 20:31:46,265 INFO [train.py:715] (0/8) Epoch 16, batch 29200, loss[loss=0.1283, simple_loss=0.2038, pruned_loss=0.02637, over 4821.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2077, pruned_loss=0.02928, over 973371.35 frames.], batch size: 15, lr: 1.36e-04 2022-05-08 20:32:27,444 INFO [train.py:715] (0/8) Epoch 16, batch 29250, loss[loss=0.1277, simple_loss=0.2008, pruned_loss=0.02729, over 4797.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2071, pruned_loss=0.02908, over 972806.04 frames.], batch size: 24, lr: 1.36e-04 2022-05-08 20:33:08,438 INFO [train.py:715] (0/8) Epoch 16, batch 29300, loss[loss=0.1594, simple_loss=0.2303, pruned_loss=0.04427, over 4685.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2072, pruned_loss=0.0291, over 972449.46 frames.], batch size: 15, lr: 1.36e-04 2022-05-08 20:33:49,838 INFO [train.py:715] (0/8) Epoch 16, batch 29350, loss[loss=0.1437, simple_loss=0.2192, pruned_loss=0.03415, over 4828.00 frames.], tot_loss[loss=0.1337, simple_loss=0.208, pruned_loss=0.02972, over 971762.40 frames.], batch size: 26, lr: 1.36e-04 2022-05-08 20:34:30,964 INFO [train.py:715] (0/8) Epoch 16, batch 29400, loss[loss=0.1143, simple_loss=0.1877, pruned_loss=0.02041, over 4956.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2065, pruned_loss=0.02952, over 972177.87 frames.], batch size: 21, lr: 1.36e-04 2022-05-08 20:35:12,716 INFO [train.py:715] (0/8) Epoch 16, batch 29450, loss[loss=0.1262, simple_loss=0.208, pruned_loss=0.02225, over 4943.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2055, pruned_loss=0.02893, over 973064.00 frames.], batch size: 29, lr: 1.36e-04 2022-05-08 20:35:54,214 INFO [train.py:715] (0/8) Epoch 16, batch 29500, loss[loss=0.1582, simple_loss=0.2216, pruned_loss=0.04746, over 4845.00 frames.], tot_loss[loss=0.1322, simple_loss=0.206, pruned_loss=0.02918, over 973577.32 frames.], batch size: 30, lr: 1.36e-04 2022-05-08 20:36:36,039 INFO [train.py:715] (0/8) Epoch 16, batch 29550, loss[loss=0.141, simple_loss=0.2185, pruned_loss=0.03176, over 4927.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2065, pruned_loss=0.02937, over 973485.65 frames.], batch size: 23, lr: 1.36e-04 2022-05-08 20:37:17,259 INFO [train.py:715] (0/8) Epoch 16, batch 29600, loss[loss=0.1295, simple_loss=0.2009, pruned_loss=0.02904, over 4944.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2067, pruned_loss=0.02929, over 
973252.22 frames.], batch size: 35, lr: 1.36e-04 2022-05-08 20:37:59,053 INFO [train.py:715] (0/8) Epoch 16, batch 29650, loss[loss=0.1483, simple_loss=0.2118, pruned_loss=0.04238, over 4968.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2066, pruned_loss=0.02924, over 973118.87 frames.], batch size: 35, lr: 1.36e-04 2022-05-08 20:38:40,540 INFO [train.py:715] (0/8) Epoch 16, batch 29700, loss[loss=0.1338, simple_loss=0.2136, pruned_loss=0.027, over 4887.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2062, pruned_loss=0.02921, over 972466.52 frames.], batch size: 19, lr: 1.36e-04 2022-05-08 20:39:21,784 INFO [train.py:715] (0/8) Epoch 16, batch 29750, loss[loss=0.1158, simple_loss=0.1895, pruned_loss=0.02106, over 4914.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2073, pruned_loss=0.02925, over 972906.46 frames.], batch size: 18, lr: 1.36e-04 2022-05-08 20:40:02,892 INFO [train.py:715] (0/8) Epoch 16, batch 29800, loss[loss=0.1296, simple_loss=0.2189, pruned_loss=0.02018, over 4978.00 frames.], tot_loss[loss=0.133, simple_loss=0.2075, pruned_loss=0.02924, over 971791.47 frames.], batch size: 24, lr: 1.36e-04 2022-05-08 20:40:44,786 INFO [train.py:715] (0/8) Epoch 16, batch 29850, loss[loss=0.1222, simple_loss=0.1877, pruned_loss=0.02836, over 4856.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2071, pruned_loss=0.0291, over 972078.52 frames.], batch size: 32, lr: 1.36e-04 2022-05-08 20:41:26,315 INFO [train.py:715] (0/8) Epoch 16, batch 29900, loss[loss=0.1296, simple_loss=0.209, pruned_loss=0.02509, over 4792.00 frames.], tot_loss[loss=0.133, simple_loss=0.2076, pruned_loss=0.02917, over 972461.10 frames.], batch size: 18, lr: 1.36e-04 2022-05-08 20:42:07,621 INFO [train.py:715] (0/8) Epoch 16, batch 29950, loss[loss=0.1104, simple_loss=0.1822, pruned_loss=0.01926, over 4822.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2076, pruned_loss=0.02935, over 972344.85 frames.], batch size: 26, lr: 1.36e-04 2022-05-08 20:42:50,228 INFO [train.py:715] (0/8) Epoch 16, batch 30000, loss[loss=0.1376, simple_loss=0.1913, pruned_loss=0.04191, over 4929.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2077, pruned_loss=0.02985, over 972622.54 frames.], batch size: 35, lr: 1.36e-04 2022-05-08 20:42:50,229 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 20:43:01,794 INFO [train.py:742] (0/8) Epoch 16, validation: loss=0.1047, simple_loss=0.1883, pruned_loss=0.01058, over 914524.00 frames. 
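A checkpoint was written earlier in this log to pruned_transducer_stateless2/exp/v2/checkpoint-584000.pt. The sketch below shows one way to inspect such a file offline; it assumes the checkpoint is an ordinary torch.save'd dict, and the key names it prints are whatever the file actually contains rather than anything guaranteed here.

# Minimal sketch: inspect the checkpoint saved earlier in this log.
# Assumes the file is a plain torch.save'd dict; the exact keys inside
# ("model", "optimizer", ...) are not guaranteed by the log itself.
import torch

ckpt_path = "pruned_transducer_stateless2/exp/v2/checkpoint-584000.pt"
# On recent PyTorch you may need weights_only=False here, since the file
# typically holds more than bare tensors.
ckpt = torch.load(ckpt_path, map_location="cpu")
assert isinstance(ckpt, dict), type(ckpt)
for key, value in ckpt.items():
    # state dicts summarize as their entry count; scalars (epoch, batch) print as-is
    summary = f"{len(value)} entries" if hasattr(value, "__len__") else value
    print(f"{key}: {summary}")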
2022-05-08 20:43:44,298 INFO [train.py:715] (0/8) Epoch 16, batch 30050, loss[loss=0.1254, simple_loss=0.2031, pruned_loss=0.02383, over 4911.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2083, pruned_loss=0.03023, over 972315.30 frames.], batch size: 18, lr: 1.36e-04 2022-05-08 20:44:26,131 INFO [train.py:715] (0/8) Epoch 16, batch 30100, loss[loss=0.1451, simple_loss=0.2148, pruned_loss=0.03766, over 4842.00 frames.], tot_loss[loss=0.1348, simple_loss=0.2089, pruned_loss=0.03033, over 971953.84 frames.], batch size: 30, lr: 1.36e-04 2022-05-08 20:45:06,932 INFO [train.py:715] (0/8) Epoch 16, batch 30150, loss[loss=0.1338, simple_loss=0.2132, pruned_loss=0.02725, over 4828.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2085, pruned_loss=0.03015, over 971564.07 frames.], batch size: 30, lr: 1.36e-04 2022-05-08 20:45:48,623 INFO [train.py:715] (0/8) Epoch 16, batch 30200, loss[loss=0.1338, simple_loss=0.2193, pruned_loss=0.02416, over 4800.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2088, pruned_loss=0.02999, over 971345.18 frames.], batch size: 21, lr: 1.36e-04 2022-05-08 20:46:29,886 INFO [train.py:715] (0/8) Epoch 16, batch 30250, loss[loss=0.1298, simple_loss=0.1986, pruned_loss=0.03054, over 4938.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2079, pruned_loss=0.02996, over 971444.37 frames.], batch size: 29, lr: 1.36e-04 2022-05-08 20:47:09,987 INFO [train.py:715] (0/8) Epoch 16, batch 30300, loss[loss=0.1371, simple_loss=0.2186, pruned_loss=0.02775, over 4778.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2078, pruned_loss=0.02995, over 971763.15 frames.], batch size: 17, lr: 1.36e-04 2022-05-08 20:47:50,183 INFO [train.py:715] (0/8) Epoch 16, batch 30350, loss[loss=0.14, simple_loss=0.2175, pruned_loss=0.03124, over 4980.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2075, pruned_loss=0.02993, over 971787.66 frames.], batch size: 14, lr: 1.36e-04 2022-05-08 20:48:30,610 INFO [train.py:715] (0/8) Epoch 16, batch 30400, loss[loss=0.1486, simple_loss=0.2289, pruned_loss=0.03412, over 4797.00 frames.], tot_loss[loss=0.133, simple_loss=0.2073, pruned_loss=0.02939, over 971219.41 frames.], batch size: 21, lr: 1.36e-04 2022-05-08 20:49:10,255 INFO [train.py:715] (0/8) Epoch 16, batch 30450, loss[loss=0.142, simple_loss=0.206, pruned_loss=0.03894, over 4837.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2079, pruned_loss=0.02968, over 971687.69 frames.], batch size: 30, lr: 1.36e-04 2022-05-08 20:49:49,410 INFO [train.py:715] (0/8) Epoch 16, batch 30500, loss[loss=0.1333, simple_loss=0.2078, pruned_loss=0.02941, over 4784.00 frames.], tot_loss[loss=0.134, simple_loss=0.2083, pruned_loss=0.02983, over 971655.18 frames.], batch size: 14, lr: 1.36e-04 2022-05-08 20:50:29,338 INFO [train.py:715] (0/8) Epoch 16, batch 30550, loss[loss=0.1126, simple_loss=0.189, pruned_loss=0.01812, over 4792.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2079, pruned_loss=0.02988, over 972069.26 frames.], batch size: 14, lr: 1.36e-04 2022-05-08 20:51:09,800 INFO [train.py:715] (0/8) Epoch 16, batch 30600, loss[loss=0.1515, simple_loss=0.2236, pruned_loss=0.03969, over 4982.00 frames.], tot_loss[loss=0.1342, simple_loss=0.208, pruned_loss=0.03017, over 972448.85 frames.], batch size: 15, lr: 1.36e-04 2022-05-08 20:51:49,045 INFO [train.py:715] (0/8) Epoch 16, batch 30650, loss[loss=0.1028, simple_loss=0.183, pruned_loss=0.0113, over 4958.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2087, pruned_loss=0.03009, over 973140.70 frames.], batch size: 24, lr: 1.36e-04 2022-05-08 20:52:28,798 INFO 
[train.py:715] (0/8) Epoch 16, batch 30700, loss[loss=0.1343, simple_loss=0.1957, pruned_loss=0.03649, over 4956.00 frames.], tot_loss[loss=0.134, simple_loss=0.2081, pruned_loss=0.02994, over 972402.04 frames.], batch size: 14, lr: 1.36e-04 2022-05-08 20:53:10,031 INFO [train.py:715] (0/8) Epoch 16, batch 30750, loss[loss=0.1499, simple_loss=0.2208, pruned_loss=0.03949, over 4904.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2084, pruned_loss=0.02959, over 972406.34 frames.], batch size: 17, lr: 1.36e-04 2022-05-08 20:53:49,624 INFO [train.py:715] (0/8) Epoch 16, batch 30800, loss[loss=0.1212, simple_loss=0.1915, pruned_loss=0.02547, over 4945.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2078, pruned_loss=0.02935, over 973516.38 frames.], batch size: 29, lr: 1.36e-04 2022-05-08 20:54:28,445 INFO [train.py:715] (0/8) Epoch 16, batch 30850, loss[loss=0.1415, simple_loss=0.2072, pruned_loss=0.03785, over 4977.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2081, pruned_loss=0.02976, over 972827.71 frames.], batch size: 14, lr: 1.36e-04 2022-05-08 20:55:08,441 INFO [train.py:715] (0/8) Epoch 16, batch 30900, loss[loss=0.2167, simple_loss=0.276, pruned_loss=0.07873, over 4989.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2086, pruned_loss=0.02998, over 972450.51 frames.], batch size: 31, lr: 1.36e-04 2022-05-08 20:55:47,873 INFO [train.py:715] (0/8) Epoch 16, batch 30950, loss[loss=0.1216, simple_loss=0.2001, pruned_loss=0.02154, over 4885.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2083, pruned_loss=0.03022, over 972116.40 frames.], batch size: 22, lr: 1.36e-04 2022-05-08 20:56:26,905 INFO [train.py:715] (0/8) Epoch 16, batch 31000, loss[loss=0.1264, simple_loss=0.1881, pruned_loss=0.03233, over 4994.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2079, pruned_loss=0.0302, over 972366.19 frames.], batch size: 14, lr: 1.36e-04 2022-05-08 20:57:06,083 INFO [train.py:715] (0/8) Epoch 16, batch 31050, loss[loss=0.14, simple_loss=0.2121, pruned_loss=0.03394, over 4821.00 frames.], tot_loss[loss=0.134, simple_loss=0.2079, pruned_loss=0.03007, over 971461.53 frames.], batch size: 27, lr: 1.36e-04 2022-05-08 20:57:45,840 INFO [train.py:715] (0/8) Epoch 16, batch 31100, loss[loss=0.1405, simple_loss=0.218, pruned_loss=0.03148, over 4957.00 frames.], tot_loss[loss=0.134, simple_loss=0.2081, pruned_loss=0.02995, over 971603.14 frames.], batch size: 15, lr: 1.36e-04 2022-05-08 20:58:25,688 INFO [train.py:715] (0/8) Epoch 16, batch 31150, loss[loss=0.1499, simple_loss=0.2393, pruned_loss=0.03022, over 4890.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2088, pruned_loss=0.03012, over 971494.50 frames.], batch size: 17, lr: 1.36e-04 2022-05-08 20:59:04,404 INFO [train.py:715] (0/8) Epoch 16, batch 31200, loss[loss=0.1341, simple_loss=0.215, pruned_loss=0.02656, over 4893.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2084, pruned_loss=0.03032, over 971864.85 frames.], batch size: 19, lr: 1.36e-04 2022-05-08 20:59:44,072 INFO [train.py:715] (0/8) Epoch 16, batch 31250, loss[loss=0.1339, simple_loss=0.216, pruned_loss=0.02595, over 4920.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2076, pruned_loss=0.03005, over 971940.33 frames.], batch size: 29, lr: 1.36e-04 2022-05-08 21:00:23,613 INFO [train.py:715] (0/8) Epoch 16, batch 31300, loss[loss=0.1121, simple_loss=0.1941, pruned_loss=0.01501, over 4939.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2072, pruned_loss=0.02974, over 972201.20 frames.], batch size: 23, lr: 1.36e-04 2022-05-08 21:01:03,208 INFO [train.py:715] (0/8) Epoch 
16, batch 31350, loss[loss=0.1164, simple_loss=0.1959, pruned_loss=0.0185, over 4700.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2075, pruned_loss=0.02962, over 972392.05 frames.], batch size: 15, lr: 1.36e-04 2022-05-08 21:01:42,655 INFO [train.py:715] (0/8) Epoch 16, batch 31400, loss[loss=0.1596, simple_loss=0.2274, pruned_loss=0.04585, over 4827.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2078, pruned_loss=0.02976, over 972508.01 frames.], batch size: 15, lr: 1.36e-04 2022-05-08 21:02:22,718 INFO [train.py:715] (0/8) Epoch 16, batch 31450, loss[loss=0.131, simple_loss=0.2133, pruned_loss=0.02432, over 4912.00 frames.], tot_loss[loss=0.1335, simple_loss=0.208, pruned_loss=0.02953, over 972968.48 frames.], batch size: 17, lr: 1.36e-04 2022-05-08 21:03:01,697 INFO [train.py:715] (0/8) Epoch 16, batch 31500, loss[loss=0.1065, simple_loss=0.1788, pruned_loss=0.01706, over 4796.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2081, pruned_loss=0.02943, over 973859.01 frames.], batch size: 12, lr: 1.36e-04 2022-05-08 21:03:40,542 INFO [train.py:715] (0/8) Epoch 16, batch 31550, loss[loss=0.1377, simple_loss=0.2081, pruned_loss=0.03367, over 4776.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2074, pruned_loss=0.02942, over 973317.72 frames.], batch size: 14, lr: 1.36e-04 2022-05-08 21:04:19,805 INFO [train.py:715] (0/8) Epoch 16, batch 31600, loss[loss=0.1548, simple_loss=0.2251, pruned_loss=0.04222, over 4908.00 frames.], tot_loss[loss=0.1339, simple_loss=0.208, pruned_loss=0.0299, over 973512.56 frames.], batch size: 19, lr: 1.36e-04 2022-05-08 21:04:58,917 INFO [train.py:715] (0/8) Epoch 16, batch 31650, loss[loss=0.1575, simple_loss=0.2164, pruned_loss=0.0493, over 4805.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2082, pruned_loss=0.03003, over 972769.85 frames.], batch size: 26, lr: 1.36e-04 2022-05-08 21:05:37,856 INFO [train.py:715] (0/8) Epoch 16, batch 31700, loss[loss=0.1016, simple_loss=0.168, pruned_loss=0.01765, over 4820.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2076, pruned_loss=0.02971, over 972547.87 frames.], batch size: 13, lr: 1.36e-04 2022-05-08 21:06:17,107 INFO [train.py:715] (0/8) Epoch 16, batch 31750, loss[loss=0.1176, simple_loss=0.1978, pruned_loss=0.01867, over 4883.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2063, pruned_loss=0.02909, over 972765.98 frames.], batch size: 22, lr: 1.36e-04 2022-05-08 21:06:56,939 INFO [train.py:715] (0/8) Epoch 16, batch 31800, loss[loss=0.1419, simple_loss=0.208, pruned_loss=0.03789, over 4768.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2063, pruned_loss=0.02906, over 972566.50 frames.], batch size: 14, lr: 1.36e-04 2022-05-08 21:07:36,867 INFO [train.py:715] (0/8) Epoch 16, batch 31850, loss[loss=0.122, simple_loss=0.1978, pruned_loss=0.02312, over 4824.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2064, pruned_loss=0.02888, over 972564.00 frames.], batch size: 27, lr: 1.36e-04 2022-05-08 21:08:15,870 INFO [train.py:715] (0/8) Epoch 16, batch 31900, loss[loss=0.1813, simple_loss=0.2574, pruned_loss=0.05255, over 4772.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2068, pruned_loss=0.02911, over 971621.98 frames.], batch size: 19, lr: 1.36e-04 2022-05-08 21:08:55,065 INFO [train.py:715] (0/8) Epoch 16, batch 31950, loss[loss=0.2003, simple_loss=0.2632, pruned_loss=0.06873, over 4975.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2077, pruned_loss=0.02947, over 972276.23 frames.], batch size: 15, lr: 1.36e-04 2022-05-08 21:09:34,440 INFO [train.py:715] (0/8) Epoch 16, batch 32000, 
loss[loss=0.1168, simple_loss=0.1923, pruned_loss=0.02064, over 4788.00 frames.], tot_loss[loss=0.1327, simple_loss=0.207, pruned_loss=0.02918, over 972189.35 frames.], batch size: 12, lr: 1.36e-04 2022-05-08 21:10:13,574 INFO [train.py:715] (0/8) Epoch 16, batch 32050, loss[loss=0.1391, simple_loss=0.2016, pruned_loss=0.03831, over 4869.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2059, pruned_loss=0.02874, over 971952.82 frames.], batch size: 16, lr: 1.36e-04 2022-05-08 21:10:53,095 INFO [train.py:715] (0/8) Epoch 16, batch 32100, loss[loss=0.1162, simple_loss=0.1869, pruned_loss=0.02271, over 4644.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2061, pruned_loss=0.02885, over 972110.36 frames.], batch size: 13, lr: 1.36e-04 2022-05-08 21:11:32,629 INFO [train.py:715] (0/8) Epoch 16, batch 32150, loss[loss=0.1728, simple_loss=0.2372, pruned_loss=0.0542, over 4986.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2057, pruned_loss=0.0285, over 971761.52 frames.], batch size: 28, lr: 1.36e-04 2022-05-08 21:12:12,707 INFO [train.py:715] (0/8) Epoch 16, batch 32200, loss[loss=0.1402, simple_loss=0.2201, pruned_loss=0.03016, over 4784.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2056, pruned_loss=0.02865, over 971942.51 frames.], batch size: 18, lr: 1.36e-04 2022-05-08 21:12:51,831 INFO [train.py:715] (0/8) Epoch 16, batch 32250, loss[loss=0.1572, simple_loss=0.2396, pruned_loss=0.03737, over 4918.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2059, pruned_loss=0.02871, over 972565.51 frames.], batch size: 39, lr: 1.36e-04 2022-05-08 21:13:31,326 INFO [train.py:715] (0/8) Epoch 16, batch 32300, loss[loss=0.161, simple_loss=0.2421, pruned_loss=0.03995, over 4977.00 frames.], tot_loss[loss=0.1324, simple_loss=0.207, pruned_loss=0.02885, over 972793.70 frames.], batch size: 14, lr: 1.36e-04 2022-05-08 21:14:11,344 INFO [train.py:715] (0/8) Epoch 16, batch 32350, loss[loss=0.1323, simple_loss=0.2161, pruned_loss=0.02424, over 4968.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2065, pruned_loss=0.02856, over 973395.34 frames.], batch size: 24, lr: 1.36e-04 2022-05-08 21:14:50,971 INFO [train.py:715] (0/8) Epoch 16, batch 32400, loss[loss=0.1415, simple_loss=0.2106, pruned_loss=0.03623, over 4834.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2069, pruned_loss=0.02904, over 972694.31 frames.], batch size: 30, lr: 1.36e-04 2022-05-08 21:15:30,002 INFO [train.py:715] (0/8) Epoch 16, batch 32450, loss[loss=0.1447, simple_loss=0.2191, pruned_loss=0.03521, over 4814.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2067, pruned_loss=0.02898, over 972649.05 frames.], batch size: 25, lr: 1.36e-04 2022-05-08 21:16:10,034 INFO [train.py:715] (0/8) Epoch 16, batch 32500, loss[loss=0.1205, simple_loss=0.1966, pruned_loss=0.02217, over 4807.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2069, pruned_loss=0.02884, over 972641.29 frames.], batch size: 15, lr: 1.36e-04 2022-05-08 21:16:49,324 INFO [train.py:715] (0/8) Epoch 16, batch 32550, loss[loss=0.1104, simple_loss=0.1849, pruned_loss=0.01792, over 4740.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2064, pruned_loss=0.02846, over 973054.77 frames.], batch size: 12, lr: 1.36e-04 2022-05-08 21:17:28,287 INFO [train.py:715] (0/8) Epoch 16, batch 32600, loss[loss=0.1386, simple_loss=0.2128, pruned_loss=0.03218, over 4925.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2061, pruned_loss=0.02873, over 972322.31 frames.], batch size: 23, lr: 1.36e-04 2022-05-08 21:18:07,199 INFO [train.py:715] (0/8) Epoch 16, batch 32650, loss[loss=0.1436, 
simple_loss=0.2195, pruned_loss=0.03387, over 4785.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2069, pruned_loss=0.02938, over 972637.96 frames.], batch size: 18, lr: 1.36e-04 2022-05-08 21:18:46,393 INFO [train.py:715] (0/8) Epoch 16, batch 32700, loss[loss=0.1321, simple_loss=0.2086, pruned_loss=0.02776, over 4881.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2065, pruned_loss=0.02932, over 972881.38 frames.], batch size: 16, lr: 1.36e-04 2022-05-08 21:19:25,736 INFO [train.py:715] (0/8) Epoch 16, batch 32750, loss[loss=0.1241, simple_loss=0.1929, pruned_loss=0.02767, over 4825.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2067, pruned_loss=0.0296, over 972545.60 frames.], batch size: 25, lr: 1.36e-04 2022-05-08 21:20:05,461 INFO [train.py:715] (0/8) Epoch 16, batch 32800, loss[loss=0.1899, simple_loss=0.2497, pruned_loss=0.06502, over 4851.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2076, pruned_loss=0.02986, over 972233.59 frames.], batch size: 34, lr: 1.36e-04 2022-05-08 21:20:44,824 INFO [train.py:715] (0/8) Epoch 16, batch 32850, loss[loss=0.1315, simple_loss=0.1913, pruned_loss=0.03581, over 4772.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2071, pruned_loss=0.0295, over 972081.71 frames.], batch size: 12, lr: 1.36e-04 2022-05-08 21:21:24,459 INFO [train.py:715] (0/8) Epoch 16, batch 32900, loss[loss=0.1416, simple_loss=0.2106, pruned_loss=0.03633, over 4885.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2076, pruned_loss=0.02963, over 972831.60 frames.], batch size: 22, lr: 1.36e-04 2022-05-08 21:22:03,427 INFO [train.py:715] (0/8) Epoch 16, batch 32950, loss[loss=0.1166, simple_loss=0.1912, pruned_loss=0.02104, over 4978.00 frames.], tot_loss[loss=0.1327, simple_loss=0.207, pruned_loss=0.02915, over 973313.35 frames.], batch size: 15, lr: 1.36e-04 2022-05-08 21:22:42,572 INFO [train.py:715] (0/8) Epoch 16, batch 33000, loss[loss=0.1336, simple_loss=0.2036, pruned_loss=0.03177, over 4845.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2075, pruned_loss=0.02911, over 973410.09 frames.], batch size: 34, lr: 1.36e-04 2022-05-08 21:22:42,574 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 21:22:55,771 INFO [train.py:742] (0/8) Epoch 16, validation: loss=0.105, simple_loss=0.1884, pruned_loss=0.01078, over 914524.00 frames. 
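A note on how the three numbers in each entry fit together (my reading of the pruned-transducer loss, not something the log itself states): the reported loss looks like a weighted sum of the simple and pruned RNN-T losses, with the simple term weighted by roughly 0.5 and the pruned term by 1.0. The validation entry just above is consistent with that reading (0.5 * 0.1884 + 0.01078 ≈ 0.105), and the running tot_loss entries check out the same way. A minimal arithmetic sketch under that assumption:

    # Assumed combination: loss = 0.5 * simple_loss + 1.0 * pruned_loss.
    # The two scale factors are an assumption about the recipe, not read from this log.
    def combined_loss(simple_loss, pruned_loss, simple_scale=0.5, pruned_scale=1.0):
        return simple_scale * simple_loss + pruned_scale * pruned_loss

    # Epoch 16 validation entry above: loss=0.105, simple_loss=0.1884, pruned_loss=0.01078
    print(round(combined_loss(0.1884, 0.01078), 4))  # 0.105
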
2022-05-08 21:23:35,557 INFO [train.py:715] (0/8) Epoch 16, batch 33050, loss[loss=0.1353, simple_loss=0.2049, pruned_loss=0.03292, over 4880.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2072, pruned_loss=0.0292, over 972261.53 frames.], batch size: 20, lr: 1.36e-04 2022-05-08 21:24:14,880 INFO [train.py:715] (0/8) Epoch 16, batch 33100, loss[loss=0.147, simple_loss=0.2195, pruned_loss=0.03726, over 4935.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2071, pruned_loss=0.02918, over 972150.66 frames.], batch size: 23, lr: 1.36e-04 2022-05-08 21:24:54,244 INFO [train.py:715] (0/8) Epoch 16, batch 33150, loss[loss=0.1191, simple_loss=0.1974, pruned_loss=0.02042, over 4833.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2066, pruned_loss=0.0289, over 972491.46 frames.], batch size: 26, lr: 1.36e-04 2022-05-08 21:25:34,101 INFO [train.py:715] (0/8) Epoch 16, batch 33200, loss[loss=0.1517, simple_loss=0.2211, pruned_loss=0.04116, over 4935.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2072, pruned_loss=0.02912, over 972675.30 frames.], batch size: 18, lr: 1.36e-04 2022-05-08 21:26:13,845 INFO [train.py:715] (0/8) Epoch 16, batch 33250, loss[loss=0.1652, simple_loss=0.2468, pruned_loss=0.04183, over 4852.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2077, pruned_loss=0.02947, over 972604.14 frames.], batch size: 20, lr: 1.36e-04 2022-05-08 21:26:53,434 INFO [train.py:715] (0/8) Epoch 16, batch 33300, loss[loss=0.134, simple_loss=0.2091, pruned_loss=0.02945, over 4964.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2069, pruned_loss=0.02919, over 973198.19 frames.], batch size: 24, lr: 1.36e-04 2022-05-08 21:27:32,748 INFO [train.py:715] (0/8) Epoch 16, batch 33350, loss[loss=0.1563, simple_loss=0.2421, pruned_loss=0.03527, over 4777.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2067, pruned_loss=0.02919, over 973724.23 frames.], batch size: 14, lr: 1.36e-04 2022-05-08 21:28:12,217 INFO [train.py:715] (0/8) Epoch 16, batch 33400, loss[loss=0.1201, simple_loss=0.1942, pruned_loss=0.02305, over 4953.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2078, pruned_loss=0.02975, over 973694.79 frames.], batch size: 21, lr: 1.36e-04 2022-05-08 21:28:51,454 INFO [train.py:715] (0/8) Epoch 16, batch 33450, loss[loss=0.1012, simple_loss=0.1715, pruned_loss=0.01548, over 4757.00 frames.], tot_loss[loss=0.134, simple_loss=0.2082, pruned_loss=0.02992, over 972709.99 frames.], batch size: 12, lr: 1.36e-04 2022-05-08 21:29:30,507 INFO [train.py:715] (0/8) Epoch 16, batch 33500, loss[loss=0.152, simple_loss=0.2204, pruned_loss=0.04183, over 4775.00 frames.], tot_loss[loss=0.134, simple_loss=0.2086, pruned_loss=0.02973, over 971440.61 frames.], batch size: 14, lr: 1.36e-04 2022-05-08 21:30:09,461 INFO [train.py:715] (0/8) Epoch 16, batch 33550, loss[loss=0.1221, simple_loss=0.1971, pruned_loss=0.02357, over 4787.00 frames.], tot_loss[loss=0.1337, simple_loss=0.208, pruned_loss=0.02973, over 971348.89 frames.], batch size: 24, lr: 1.36e-04 2022-05-08 21:30:49,228 INFO [train.py:715] (0/8) Epoch 16, batch 33600, loss[loss=0.119, simple_loss=0.1968, pruned_loss=0.0206, over 4868.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2079, pruned_loss=0.02964, over 971447.10 frames.], batch size: 22, lr: 1.36e-04 2022-05-08 21:31:28,202 INFO [train.py:715] (0/8) Epoch 16, batch 33650, loss[loss=0.1179, simple_loss=0.1895, pruned_loss=0.02313, over 4847.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2087, pruned_loss=0.02998, over 970817.05 frames.], batch size: 32, lr: 1.36e-04 2022-05-08 21:32:07,844 INFO 
[train.py:715] (0/8) Epoch 16, batch 33700, loss[loss=0.1356, simple_loss=0.2136, pruned_loss=0.02886, over 4796.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2082, pruned_loss=0.0297, over 971166.95 frames.], batch size: 24, lr: 1.36e-04 2022-05-08 21:32:46,795 INFO [train.py:715] (0/8) Epoch 16, batch 33750, loss[loss=0.13, simple_loss=0.2119, pruned_loss=0.02403, over 4688.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2075, pruned_loss=0.02944, over 971359.18 frames.], batch size: 15, lr: 1.36e-04 2022-05-08 21:33:25,785 INFO [train.py:715] (0/8) Epoch 16, batch 33800, loss[loss=0.1498, simple_loss=0.2304, pruned_loss=0.03458, over 4987.00 frames.], tot_loss[loss=0.1341, simple_loss=0.209, pruned_loss=0.02961, over 971593.23 frames.], batch size: 25, lr: 1.36e-04 2022-05-08 21:34:05,047 INFO [train.py:715] (0/8) Epoch 16, batch 33850, loss[loss=0.1153, simple_loss=0.1913, pruned_loss=0.01964, over 4830.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2079, pruned_loss=0.0294, over 971708.67 frames.], batch size: 26, lr: 1.36e-04 2022-05-08 21:34:44,272 INFO [train.py:715] (0/8) Epoch 16, batch 33900, loss[loss=0.1318, simple_loss=0.2087, pruned_loss=0.02744, over 4967.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2075, pruned_loss=0.02901, over 972142.11 frames.], batch size: 35, lr: 1.36e-04 2022-05-08 21:35:24,621 INFO [train.py:715] (0/8) Epoch 16, batch 33950, loss[loss=0.1445, simple_loss=0.2223, pruned_loss=0.03337, over 4855.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2081, pruned_loss=0.02939, over 972884.19 frames.], batch size: 15, lr: 1.36e-04 2022-05-08 21:36:03,123 INFO [train.py:715] (0/8) Epoch 16, batch 34000, loss[loss=0.09842, simple_loss=0.1754, pruned_loss=0.01074, over 4847.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2084, pruned_loss=0.02991, over 973087.83 frames.], batch size: 12, lr: 1.36e-04 2022-05-08 21:36:43,149 INFO [train.py:715] (0/8) Epoch 16, batch 34050, loss[loss=0.1188, simple_loss=0.1965, pruned_loss=0.0205, over 4754.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2078, pruned_loss=0.02955, over 972408.48 frames.], batch size: 19, lr: 1.36e-04 2022-05-08 21:37:22,557 INFO [train.py:715] (0/8) Epoch 16, batch 34100, loss[loss=0.1263, simple_loss=0.1917, pruned_loss=0.0304, over 4846.00 frames.], tot_loss[loss=0.134, simple_loss=0.2086, pruned_loss=0.02967, over 972326.53 frames.], batch size: 32, lr: 1.36e-04 2022-05-08 21:38:01,718 INFO [train.py:715] (0/8) Epoch 16, batch 34150, loss[loss=0.1706, simple_loss=0.2461, pruned_loss=0.04754, over 4967.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2083, pruned_loss=0.02964, over 971991.35 frames.], batch size: 25, lr: 1.36e-04 2022-05-08 21:38:41,100 INFO [train.py:715] (0/8) Epoch 16, batch 34200, loss[loss=0.1331, simple_loss=0.2043, pruned_loss=0.03097, over 4984.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2081, pruned_loss=0.02966, over 971579.25 frames.], batch size: 31, lr: 1.36e-04 2022-05-08 21:39:20,450 INFO [train.py:715] (0/8) Epoch 16, batch 34250, loss[loss=0.149, simple_loss=0.2196, pruned_loss=0.03923, over 4842.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2087, pruned_loss=0.02979, over 971938.26 frames.], batch size: 15, lr: 1.36e-04 2022-05-08 21:40:00,527 INFO [train.py:715] (0/8) Epoch 16, batch 34300, loss[loss=0.1222, simple_loss=0.1914, pruned_loss=0.02651, over 4751.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2087, pruned_loss=0.03006, over 972112.03 frames.], batch size: 12, lr: 1.36e-04 2022-05-08 21:40:39,497 INFO [train.py:715] (0/8) Epoch 
16, batch 34350, loss[loss=0.137, simple_loss=0.207, pruned_loss=0.03347, over 4971.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2078, pruned_loss=0.02993, over 971775.21 frames.], batch size: 35, lr: 1.36e-04 2022-05-08 21:41:18,849 INFO [train.py:715] (0/8) Epoch 16, batch 34400, loss[loss=0.1485, simple_loss=0.2204, pruned_loss=0.03829, over 4935.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2075, pruned_loss=0.0297, over 972310.22 frames.], batch size: 29, lr: 1.36e-04 2022-05-08 21:41:58,452 INFO [train.py:715] (0/8) Epoch 16, batch 34450, loss[loss=0.1235, simple_loss=0.2013, pruned_loss=0.02281, over 4886.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2075, pruned_loss=0.02945, over 971701.29 frames.], batch size: 19, lr: 1.36e-04 2022-05-08 21:42:37,726 INFO [train.py:715] (0/8) Epoch 16, batch 34500, loss[loss=0.1421, simple_loss=0.2096, pruned_loss=0.03727, over 4851.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2078, pruned_loss=0.02961, over 971745.38 frames.], batch size: 20, lr: 1.36e-04 2022-05-08 21:43:17,124 INFO [train.py:715] (0/8) Epoch 16, batch 34550, loss[loss=0.1474, simple_loss=0.2122, pruned_loss=0.04131, over 4855.00 frames.], tot_loss[loss=0.1335, simple_loss=0.208, pruned_loss=0.0295, over 971992.50 frames.], batch size: 30, lr: 1.36e-04 2022-05-08 21:43:56,247 INFO [train.py:715] (0/8) Epoch 16, batch 34600, loss[loss=0.1403, simple_loss=0.2207, pruned_loss=0.02994, over 4829.00 frames.], tot_loss[loss=0.1335, simple_loss=0.208, pruned_loss=0.02948, over 971834.89 frames.], batch size: 15, lr: 1.36e-04 2022-05-08 21:44:36,198 INFO [train.py:715] (0/8) Epoch 16, batch 34650, loss[loss=0.1027, simple_loss=0.1828, pruned_loss=0.01132, over 4819.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2066, pruned_loss=0.02901, over 972566.53 frames.], batch size: 25, lr: 1.36e-04 2022-05-08 21:45:15,694 INFO [train.py:715] (0/8) Epoch 16, batch 34700, loss[loss=0.1208, simple_loss=0.1955, pruned_loss=0.02308, over 4822.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2065, pruned_loss=0.02907, over 971732.00 frames.], batch size: 15, lr: 1.36e-04 2022-05-08 21:45:54,802 INFO [train.py:715] (0/8) Epoch 16, batch 34750, loss[loss=0.1286, simple_loss=0.2024, pruned_loss=0.02745, over 4724.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2069, pruned_loss=0.02939, over 970598.45 frames.], batch size: 16, lr: 1.36e-04 2022-05-08 21:46:32,018 INFO [train.py:715] (0/8) Epoch 16, batch 34800, loss[loss=0.1146, simple_loss=0.1957, pruned_loss=0.01669, over 4933.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2066, pruned_loss=0.02941, over 970958.09 frames.], batch size: 23, lr: 1.36e-04 2022-05-08 21:46:40,540 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/epoch-16.pt 2022-05-08 21:47:23,864 INFO [train.py:715] (0/8) Epoch 17, batch 0, loss[loss=0.1287, simple_loss=0.1982, pruned_loss=0.02961, over 4901.00 frames.], tot_loss[loss=0.1287, simple_loss=0.1982, pruned_loss=0.02961, over 4901.00 frames.], batch size: 19, lr: 1.32e-04 2022-05-08 21:48:03,333 INFO [train.py:715] (0/8) Epoch 17, batch 50, loss[loss=0.1068, simple_loss=0.1726, pruned_loss=0.02047, over 4739.00 frames.], tot_loss[loss=0.1354, simple_loss=0.2074, pruned_loss=0.03164, over 219562.50 frames.], batch size: 12, lr: 1.32e-04 2022-05-08 21:48:44,383 INFO [train.py:715] (0/8) Epoch 17, batch 100, loss[loss=0.1166, simple_loss=0.1914, pruned_loss=0.02092, over 4939.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2066, pruned_loss=0.03062, over 387084.79 
frames.], batch size: 21, lr: 1.32e-04 2022-05-08 21:49:25,329 INFO [train.py:715] (0/8) Epoch 17, batch 150, loss[loss=0.1356, simple_loss=0.2132, pruned_loss=0.02904, over 4830.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2066, pruned_loss=0.02999, over 516033.61 frames.], batch size: 15, lr: 1.32e-04 2022-05-08 21:50:06,392 INFO [train.py:715] (0/8) Epoch 17, batch 200, loss[loss=0.132, simple_loss=0.2055, pruned_loss=0.02927, over 4764.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2072, pruned_loss=0.03002, over 616686.94 frames.], batch size: 16, lr: 1.32e-04 2022-05-08 21:50:40,193 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-592000.pt 2022-05-08 21:50:49,384 INFO [train.py:715] (0/8) Epoch 17, batch 250, loss[loss=0.1205, simple_loss=0.2022, pruned_loss=0.01944, over 4971.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2071, pruned_loss=0.02936, over 695828.22 frames.], batch size: 28, lr: 1.32e-04 2022-05-08 21:51:30,988 INFO [train.py:715] (0/8) Epoch 17, batch 300, loss[loss=0.14, simple_loss=0.2182, pruned_loss=0.03095, over 4832.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2067, pruned_loss=0.02916, over 756267.63 frames.], batch size: 15, lr: 1.32e-04 2022-05-08 21:52:11,853 INFO [train.py:715] (0/8) Epoch 17, batch 350, loss[loss=0.1158, simple_loss=0.1919, pruned_loss=0.01986, over 4803.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2069, pruned_loss=0.02929, over 803841.18 frames.], batch size: 21, lr: 1.32e-04 2022-05-08 21:52:52,793 INFO [train.py:715] (0/8) Epoch 17, batch 400, loss[loss=0.1336, simple_loss=0.22, pruned_loss=0.0236, over 4852.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2074, pruned_loss=0.02954, over 841359.04 frames.], batch size: 20, lr: 1.32e-04 2022-05-08 21:53:33,713 INFO [train.py:715] (0/8) Epoch 17, batch 450, loss[loss=0.1242, simple_loss=0.1888, pruned_loss=0.02984, over 4845.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2072, pruned_loss=0.02932, over 869922.69 frames.], batch size: 15, lr: 1.32e-04 2022-05-08 21:54:14,776 INFO [train.py:715] (0/8) Epoch 17, batch 500, loss[loss=0.1188, simple_loss=0.1977, pruned_loss=0.01993, over 4911.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2075, pruned_loss=0.02958, over 893185.98 frames.], batch size: 23, lr: 1.32e-04 2022-05-08 21:54:56,775 INFO [train.py:715] (0/8) Epoch 17, batch 550, loss[loss=0.1683, simple_loss=0.2334, pruned_loss=0.05157, over 4850.00 frames.], tot_loss[loss=0.134, simple_loss=0.2083, pruned_loss=0.0299, over 911151.07 frames.], batch size: 32, lr: 1.32e-04 2022-05-08 21:55:37,893 INFO [train.py:715] (0/8) Epoch 17, batch 600, loss[loss=0.1549, simple_loss=0.2296, pruned_loss=0.04004, over 4928.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2086, pruned_loss=0.02961, over 925101.48 frames.], batch size: 18, lr: 1.32e-04 2022-05-08 21:56:20,083 INFO [train.py:715] (0/8) Epoch 17, batch 650, loss[loss=0.155, simple_loss=0.2287, pruned_loss=0.04062, over 4990.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2084, pruned_loss=0.0296, over 935662.77 frames.], batch size: 14, lr: 1.32e-04 2022-05-08 21:57:01,707 INFO [train.py:715] (0/8) Epoch 17, batch 700, loss[loss=0.1198, simple_loss=0.1897, pruned_loss=0.02499, over 4769.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2077, pruned_loss=0.02942, over 942841.26 frames.], batch size: 19, lr: 1.32e-04 2022-05-08 21:57:42,593 INFO [train.py:715] (0/8) Epoch 17, batch 750, loss[loss=0.1359, simple_loss=0.2117, pruned_loss=0.03003, over 4782.00 frames.], 
tot_loss[loss=0.1334, simple_loss=0.2077, pruned_loss=0.02955, over 949011.95 frames.], batch size: 17, lr: 1.32e-04 2022-05-08 21:58:23,385 INFO [train.py:715] (0/8) Epoch 17, batch 800, loss[loss=0.1138, simple_loss=0.1901, pruned_loss=0.01872, over 4783.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2078, pruned_loss=0.02979, over 955054.97 frames.], batch size: 17, lr: 1.32e-04 2022-05-08 21:59:03,980 INFO [train.py:715] (0/8) Epoch 17, batch 850, loss[loss=0.1145, simple_loss=0.1873, pruned_loss=0.02088, over 4763.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2078, pruned_loss=0.02962, over 958561.52 frames.], batch size: 19, lr: 1.32e-04 2022-05-08 21:59:45,376 INFO [train.py:715] (0/8) Epoch 17, batch 900, loss[loss=0.138, simple_loss=0.2116, pruned_loss=0.0322, over 4780.00 frames.], tot_loss[loss=0.134, simple_loss=0.2081, pruned_loss=0.02998, over 960746.38 frames.], batch size: 17, lr: 1.32e-04 2022-05-08 22:00:26,262 INFO [train.py:715] (0/8) Epoch 17, batch 950, loss[loss=0.1142, simple_loss=0.1858, pruned_loss=0.0213, over 4739.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2071, pruned_loss=0.02954, over 962927.39 frames.], batch size: 16, lr: 1.32e-04 2022-05-08 22:01:07,642 INFO [train.py:715] (0/8) Epoch 17, batch 1000, loss[loss=0.1235, simple_loss=0.1972, pruned_loss=0.02495, over 4809.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2071, pruned_loss=0.02966, over 965058.99 frames.], batch size: 14, lr: 1.32e-04 2022-05-08 22:01:48,895 INFO [train.py:715] (0/8) Epoch 17, batch 1050, loss[loss=0.1104, simple_loss=0.1825, pruned_loss=0.01914, over 4815.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2066, pruned_loss=0.02943, over 966132.32 frames.], batch size: 12, lr: 1.32e-04 2022-05-08 22:02:29,886 INFO [train.py:715] (0/8) Epoch 17, batch 1100, loss[loss=0.1177, simple_loss=0.1898, pruned_loss=0.02285, over 4908.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2069, pruned_loss=0.02983, over 967477.68 frames.], batch size: 18, lr: 1.32e-04 2022-05-08 22:03:10,394 INFO [train.py:715] (0/8) Epoch 17, batch 1150, loss[loss=0.1587, simple_loss=0.2328, pruned_loss=0.0423, over 4822.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2078, pruned_loss=0.03021, over 968660.75 frames.], batch size: 27, lr: 1.32e-04 2022-05-08 22:03:51,869 INFO [train.py:715] (0/8) Epoch 17, batch 1200, loss[loss=0.1024, simple_loss=0.1686, pruned_loss=0.01807, over 4650.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2069, pruned_loss=0.0299, over 969065.51 frames.], batch size: 13, lr: 1.32e-04 2022-05-08 22:04:32,826 INFO [train.py:715] (0/8) Epoch 17, batch 1250, loss[loss=0.1353, simple_loss=0.2146, pruned_loss=0.02796, over 4956.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2061, pruned_loss=0.02961, over 969758.98 frames.], batch size: 24, lr: 1.32e-04 2022-05-08 22:05:13,883 INFO [train.py:715] (0/8) Epoch 17, batch 1300, loss[loss=0.1246, simple_loss=0.1978, pruned_loss=0.02567, over 4961.00 frames.], tot_loss[loss=0.1336, simple_loss=0.207, pruned_loss=0.03006, over 969762.98 frames.], batch size: 24, lr: 1.32e-04 2022-05-08 22:05:55,245 INFO [train.py:715] (0/8) Epoch 17, batch 1350, loss[loss=0.1171, simple_loss=0.1846, pruned_loss=0.02481, over 4784.00 frames.], tot_loss[loss=0.1334, simple_loss=0.207, pruned_loss=0.02991, over 970223.61 frames.], batch size: 14, lr: 1.32e-04 2022-05-08 22:06:36,195 INFO [train.py:715] (0/8) Epoch 17, batch 1400, loss[loss=0.1444, simple_loss=0.2221, pruned_loss=0.03339, over 4770.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2074, 
pruned_loss=0.0297, over 969894.68 frames.], batch size: 14, lr: 1.32e-04 2022-05-08 22:07:16,801 INFO [train.py:715] (0/8) Epoch 17, batch 1450, loss[loss=0.1344, simple_loss=0.2077, pruned_loss=0.03049, over 4988.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2071, pruned_loss=0.02928, over 969766.75 frames.], batch size: 14, lr: 1.32e-04 2022-05-08 22:07:57,517 INFO [train.py:715] (0/8) Epoch 17, batch 1500, loss[loss=0.09593, simple_loss=0.1687, pruned_loss=0.01159, over 4935.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2074, pruned_loss=0.02936, over 969881.93 frames.], batch size: 29, lr: 1.32e-04 2022-05-08 22:08:39,013 INFO [train.py:715] (0/8) Epoch 17, batch 1550, loss[loss=0.1489, simple_loss=0.2289, pruned_loss=0.03446, over 4957.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2088, pruned_loss=0.03013, over 970275.17 frames.], batch size: 15, lr: 1.32e-04 2022-05-08 22:09:20,353 INFO [train.py:715] (0/8) Epoch 17, batch 1600, loss[loss=0.1318, simple_loss=0.204, pruned_loss=0.02981, over 4635.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2088, pruned_loss=0.03001, over 970846.21 frames.], batch size: 13, lr: 1.32e-04 2022-05-08 22:10:01,026 INFO [train.py:715] (0/8) Epoch 17, batch 1650, loss[loss=0.1538, simple_loss=0.2232, pruned_loss=0.04221, over 4840.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2086, pruned_loss=0.02979, over 971098.72 frames.], batch size: 15, lr: 1.32e-04 2022-05-08 22:10:42,380 INFO [train.py:715] (0/8) Epoch 17, batch 1700, loss[loss=0.1152, simple_loss=0.1935, pruned_loss=0.01844, over 4925.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2082, pruned_loss=0.03008, over 971116.71 frames.], batch size: 23, lr: 1.32e-04 2022-05-08 22:11:23,608 INFO [train.py:715] (0/8) Epoch 17, batch 1750, loss[loss=0.1315, simple_loss=0.2027, pruned_loss=0.03018, over 4977.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2086, pruned_loss=0.0302, over 971810.53 frames.], batch size: 31, lr: 1.32e-04 2022-05-08 22:12:04,547 INFO [train.py:715] (0/8) Epoch 17, batch 1800, loss[loss=0.1456, simple_loss=0.2063, pruned_loss=0.04244, over 4749.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2081, pruned_loss=0.03027, over 971854.75 frames.], batch size: 16, lr: 1.32e-04 2022-05-08 22:12:45,637 INFO [train.py:715] (0/8) Epoch 17, batch 1850, loss[loss=0.1224, simple_loss=0.1967, pruned_loss=0.02407, over 4985.00 frames.], tot_loss[loss=0.1343, simple_loss=0.208, pruned_loss=0.03025, over 972446.01 frames.], batch size: 14, lr: 1.32e-04 2022-05-08 22:13:27,331 INFO [train.py:715] (0/8) Epoch 17, batch 1900, loss[loss=0.131, simple_loss=0.2098, pruned_loss=0.02607, over 4775.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2083, pruned_loss=0.03016, over 971505.14 frames.], batch size: 17, lr: 1.32e-04 2022-05-08 22:14:08,485 INFO [train.py:715] (0/8) Epoch 17, batch 1950, loss[loss=0.1262, simple_loss=0.2039, pruned_loss=0.02426, over 4724.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2077, pruned_loss=0.02974, over 971544.59 frames.], batch size: 12, lr: 1.32e-04 2022-05-08 22:14:49,364 INFO [train.py:715] (0/8) Epoch 17, batch 2000, loss[loss=0.1571, simple_loss=0.2158, pruned_loss=0.04917, over 4745.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2082, pruned_loss=0.03007, over 972289.26 frames.], batch size: 16, lr: 1.32e-04 2022-05-08 22:15:30,343 INFO [train.py:715] (0/8) Epoch 17, batch 2050, loss[loss=0.1271, simple_loss=0.2033, pruned_loss=0.02545, over 4982.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2087, pruned_loss=0.03039, over 972410.25 
frames.], batch size: 35, lr: 1.32e-04 2022-05-08 22:16:11,452 INFO [train.py:715] (0/8) Epoch 17, batch 2100, loss[loss=0.1271, simple_loss=0.1953, pruned_loss=0.02942, over 4782.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2079, pruned_loss=0.03001, over 971912.96 frames.], batch size: 18, lr: 1.32e-04 2022-05-08 22:16:52,788 INFO [train.py:715] (0/8) Epoch 17, batch 2150, loss[loss=0.1229, simple_loss=0.1895, pruned_loss=0.02813, over 4887.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2076, pruned_loss=0.02999, over 971737.11 frames.], batch size: 22, lr: 1.32e-04 2022-05-08 22:17:34,185 INFO [train.py:715] (0/8) Epoch 17, batch 2200, loss[loss=0.1107, simple_loss=0.1809, pruned_loss=0.02023, over 4811.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2065, pruned_loss=0.02922, over 971187.39 frames.], batch size: 27, lr: 1.32e-04 2022-05-08 22:18:15,308 INFO [train.py:715] (0/8) Epoch 17, batch 2250, loss[loss=0.1482, simple_loss=0.2147, pruned_loss=0.04087, over 4813.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2065, pruned_loss=0.02919, over 971463.67 frames.], batch size: 13, lr: 1.32e-04 2022-05-08 22:18:56,042 INFO [train.py:715] (0/8) Epoch 17, batch 2300, loss[loss=0.111, simple_loss=0.1888, pruned_loss=0.0166, over 4816.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2056, pruned_loss=0.02876, over 971297.50 frames.], batch size: 13, lr: 1.32e-04 2022-05-08 22:19:36,480 INFO [train.py:715] (0/8) Epoch 17, batch 2350, loss[loss=0.1147, simple_loss=0.1924, pruned_loss=0.01846, over 4969.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2052, pruned_loss=0.02881, over 971916.90 frames.], batch size: 24, lr: 1.32e-04 2022-05-08 22:20:17,317 INFO [train.py:715] (0/8) Epoch 17, batch 2400, loss[loss=0.1264, simple_loss=0.1962, pruned_loss=0.02827, over 4795.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2055, pruned_loss=0.02878, over 971842.98 frames.], batch size: 12, lr: 1.32e-04 2022-05-08 22:20:58,226 INFO [train.py:715] (0/8) Epoch 17, batch 2450, loss[loss=0.1289, simple_loss=0.1993, pruned_loss=0.0292, over 4953.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2061, pruned_loss=0.02904, over 973247.96 frames.], batch size: 35, lr: 1.32e-04 2022-05-08 22:21:39,040 INFO [train.py:715] (0/8) Epoch 17, batch 2500, loss[loss=0.1528, simple_loss=0.2315, pruned_loss=0.03709, over 4699.00 frames.], tot_loss[loss=0.1328, simple_loss=0.207, pruned_loss=0.02924, over 972807.41 frames.], batch size: 15, lr: 1.32e-04 2022-05-08 22:22:20,001 INFO [train.py:715] (0/8) Epoch 17, batch 2550, loss[loss=0.1053, simple_loss=0.1786, pruned_loss=0.01601, over 4928.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2072, pruned_loss=0.02917, over 973092.97 frames.], batch size: 29, lr: 1.32e-04 2022-05-08 22:23:00,960 INFO [train.py:715] (0/8) Epoch 17, batch 2600, loss[loss=0.1158, simple_loss=0.1929, pruned_loss=0.01941, over 4955.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2078, pruned_loss=0.02948, over 973053.47 frames.], batch size: 21, lr: 1.32e-04 2022-05-08 22:23:42,201 INFO [train.py:715] (0/8) Epoch 17, batch 2650, loss[loss=0.1105, simple_loss=0.1846, pruned_loss=0.01817, over 4650.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2073, pruned_loss=0.02923, over 972428.76 frames.], batch size: 13, lr: 1.32e-04 2022-05-08 22:24:22,846 INFO [train.py:715] (0/8) Epoch 17, batch 2700, loss[loss=0.1222, simple_loss=0.1927, pruned_loss=0.02581, over 4930.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2075, pruned_loss=0.02936, over 973096.56 frames.], batch size: 21, lr: 
1.32e-04 2022-05-08 22:25:04,064 INFO [train.py:715] (0/8) Epoch 17, batch 2750, loss[loss=0.1543, simple_loss=0.2218, pruned_loss=0.04338, over 4843.00 frames.], tot_loss[loss=0.133, simple_loss=0.2073, pruned_loss=0.02934, over 972141.11 frames.], batch size: 20, lr: 1.32e-04 2022-05-08 22:25:44,593 INFO [train.py:715] (0/8) Epoch 17, batch 2800, loss[loss=0.1263, simple_loss=0.1944, pruned_loss=0.02915, over 4768.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2066, pruned_loss=0.02912, over 972148.18 frames.], batch size: 18, lr: 1.32e-04 2022-05-08 22:26:25,555 INFO [train.py:715] (0/8) Epoch 17, batch 2850, loss[loss=0.158, simple_loss=0.2283, pruned_loss=0.04384, over 4875.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2064, pruned_loss=0.02908, over 972084.63 frames.], batch size: 32, lr: 1.32e-04 2022-05-08 22:27:06,300 INFO [train.py:715] (0/8) Epoch 17, batch 2900, loss[loss=0.137, simple_loss=0.2055, pruned_loss=0.03422, over 4890.00 frames.], tot_loss[loss=0.133, simple_loss=0.2071, pruned_loss=0.0294, over 972959.58 frames.], batch size: 22, lr: 1.32e-04 2022-05-08 22:27:47,265 INFO [train.py:715] (0/8) Epoch 17, batch 2950, loss[loss=0.09938, simple_loss=0.1749, pruned_loss=0.01195, over 4808.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2059, pruned_loss=0.02877, over 972427.85 frames.], batch size: 12, lr: 1.32e-04 2022-05-08 22:28:28,416 INFO [train.py:715] (0/8) Epoch 17, batch 3000, loss[loss=0.1072, simple_loss=0.1785, pruned_loss=0.01794, over 4763.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2054, pruned_loss=0.02874, over 972262.85 frames.], batch size: 19, lr: 1.32e-04 2022-05-08 22:28:28,417 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 22:28:43,494 INFO [train.py:742] (0/8) Epoch 17, validation: loss=0.1047, simple_loss=0.1882, pruned_loss=0.01063, over 914524.00 frames. 
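The learning rate in these entries drops from 1.36e-04 to 1.32e-04 exactly at the epoch-16 → 17 boundary and drifts down to 1.31e-04 a few thousand batches into epoch 17. That pattern is consistent with an Eden-style schedule that decays smoothly in both the global batch count and the epoch count. The sketch below reproduces the two boundary values under that assumption; the formula and the constants 0.003 / 5000 / 4 are assumptions about the recipe rather than facts stated in these entries, and the global batch index is anchored on the checkpoint-592000.pt file written early in epoch 17:

    # Assumed Eden-style schedule: decay ~ batch^-0.5 and ~ epoch^-0.5 once past
    # the lr_batches / lr_epochs "knees". All constants here are assumptions.
    def eden_lr(global_batch, epoch, initial_lr=0.003, lr_batches=5000, lr_epochs=4):
        batch_factor = ((global_batch ** 2 + lr_batches ** 2) / lr_batches ** 2) ** -0.25
        epoch_factor = ((epoch ** 2 + lr_epochs ** 2) / lr_epochs ** 2) ** -0.25
        return initial_lr * batch_factor * epoch_factor

    # checkpoint-592000.pt was written a few hundred batches into epoch 17,
    # so the global batch index at the epoch boundary is roughly 592_000.
    print(f"{eden_lr(592_000, 16):.2e}")  # 1.36e-04 (end of epoch 16)
    print(f"{eden_lr(592_000, 17):.2e}")  # 1.32e-04 (start of epoch 17)
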
2022-05-08 22:29:24,682 INFO [train.py:715] (0/8) Epoch 17, batch 3050, loss[loss=0.1076, simple_loss=0.1771, pruned_loss=0.01907, over 4799.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2056, pruned_loss=0.02943, over 971083.41 frames.], batch size: 12, lr: 1.32e-04 2022-05-08 22:30:05,284 INFO [train.py:715] (0/8) Epoch 17, batch 3100, loss[loss=0.1314, simple_loss=0.2146, pruned_loss=0.02407, over 4862.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2056, pruned_loss=0.02942, over 971238.30 frames.], batch size: 20, lr: 1.32e-04 2022-05-08 22:30:46,407 INFO [train.py:715] (0/8) Epoch 17, batch 3150, loss[loss=0.1541, simple_loss=0.2426, pruned_loss=0.03283, over 4701.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2061, pruned_loss=0.02973, over 970951.39 frames.], batch size: 15, lr: 1.32e-04 2022-05-08 22:31:26,341 INFO [train.py:715] (0/8) Epoch 17, batch 3200, loss[loss=0.1254, simple_loss=0.1925, pruned_loss=0.02911, over 4903.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2062, pruned_loss=0.0298, over 971611.88 frames.], batch size: 22, lr: 1.32e-04 2022-05-08 22:32:07,676 INFO [train.py:715] (0/8) Epoch 17, batch 3250, loss[loss=0.1325, simple_loss=0.2042, pruned_loss=0.03044, over 4801.00 frames.], tot_loss[loss=0.1324, simple_loss=0.206, pruned_loss=0.02945, over 970906.25 frames.], batch size: 24, lr: 1.32e-04 2022-05-08 22:32:47,737 INFO [train.py:715] (0/8) Epoch 17, batch 3300, loss[loss=0.1579, simple_loss=0.242, pruned_loss=0.03688, over 4941.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2059, pruned_loss=0.02892, over 971807.45 frames.], batch size: 21, lr: 1.32e-04 2022-05-08 22:33:28,437 INFO [train.py:715] (0/8) Epoch 17, batch 3350, loss[loss=0.1449, simple_loss=0.2151, pruned_loss=0.03733, over 4956.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2066, pruned_loss=0.02887, over 972391.22 frames.], batch size: 24, lr: 1.32e-04 2022-05-08 22:34:09,185 INFO [train.py:715] (0/8) Epoch 17, batch 3400, loss[loss=0.1267, simple_loss=0.1993, pruned_loss=0.027, over 4945.00 frames.], tot_loss[loss=0.133, simple_loss=0.2069, pruned_loss=0.02949, over 972225.79 frames.], batch size: 21, lr: 1.32e-04 2022-05-08 22:34:50,593 INFO [train.py:715] (0/8) Epoch 17, batch 3450, loss[loss=0.1273, simple_loss=0.2055, pruned_loss=0.02457, over 4897.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2064, pruned_loss=0.02941, over 972399.15 frames.], batch size: 22, lr: 1.32e-04 2022-05-08 22:35:30,943 INFO [train.py:715] (0/8) Epoch 17, batch 3500, loss[loss=0.1098, simple_loss=0.1824, pruned_loss=0.01861, over 4886.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2053, pruned_loss=0.02901, over 972010.98 frames.], batch size: 22, lr: 1.32e-04 2022-05-08 22:36:11,164 INFO [train.py:715] (0/8) Epoch 17, batch 3550, loss[loss=0.1191, simple_loss=0.1946, pruned_loss=0.02178, over 4988.00 frames.], tot_loss[loss=0.132, simple_loss=0.2057, pruned_loss=0.02916, over 972357.39 frames.], batch size: 27, lr: 1.32e-04 2022-05-08 22:36:52,130 INFO [train.py:715] (0/8) Epoch 17, batch 3600, loss[loss=0.1575, simple_loss=0.2381, pruned_loss=0.03843, over 4898.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2059, pruned_loss=0.02948, over 972152.08 frames.], batch size: 19, lr: 1.32e-04 2022-05-08 22:37:31,756 INFO [train.py:715] (0/8) Epoch 17, batch 3650, loss[loss=0.13, simple_loss=0.2028, pruned_loss=0.02857, over 4802.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2062, pruned_loss=0.02951, over 972142.21 frames.], batch size: 21, lr: 1.32e-04 2022-05-08 22:38:11,923 INFO 
[train.py:715] (0/8) Epoch 17, batch 3700, loss[loss=0.1261, simple_loss=0.2056, pruned_loss=0.02336, over 4902.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2062, pruned_loss=0.0296, over 972374.95 frames.], batch size: 17, lr: 1.32e-04 2022-05-08 22:38:52,845 INFO [train.py:715] (0/8) Epoch 17, batch 3750, loss[loss=0.1228, simple_loss=0.1977, pruned_loss=0.0239, over 4840.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2062, pruned_loss=0.02973, over 972370.27 frames.], batch size: 30, lr: 1.32e-04 2022-05-08 22:39:33,612 INFO [train.py:715] (0/8) Epoch 17, batch 3800, loss[loss=0.1093, simple_loss=0.1833, pruned_loss=0.01766, over 4837.00 frames.], tot_loss[loss=0.132, simple_loss=0.2055, pruned_loss=0.02925, over 972032.46 frames.], batch size: 13, lr: 1.32e-04 2022-05-08 22:40:14,220 INFO [train.py:715] (0/8) Epoch 17, batch 3850, loss[loss=0.1131, simple_loss=0.189, pruned_loss=0.01857, over 4977.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2062, pruned_loss=0.02942, over 972889.45 frames.], batch size: 14, lr: 1.32e-04 2022-05-08 22:40:54,277 INFO [train.py:715] (0/8) Epoch 17, batch 3900, loss[loss=0.1365, simple_loss=0.2179, pruned_loss=0.02758, over 4964.00 frames.], tot_loss[loss=0.132, simple_loss=0.2059, pruned_loss=0.02904, over 973610.58 frames.], batch size: 24, lr: 1.32e-04 2022-05-08 22:41:35,758 INFO [train.py:715] (0/8) Epoch 17, batch 3950, loss[loss=0.11, simple_loss=0.1989, pruned_loss=0.01052, over 4751.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2056, pruned_loss=0.02892, over 972404.38 frames.], batch size: 16, lr: 1.32e-04 2022-05-08 22:42:15,633 INFO [train.py:715] (0/8) Epoch 17, batch 4000, loss[loss=0.1182, simple_loss=0.203, pruned_loss=0.01664, over 4808.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2057, pruned_loss=0.02888, over 972992.32 frames.], batch size: 25, lr: 1.32e-04 2022-05-08 22:42:56,136 INFO [train.py:715] (0/8) Epoch 17, batch 4050, loss[loss=0.1562, simple_loss=0.2152, pruned_loss=0.04863, over 4944.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2058, pruned_loss=0.02929, over 973138.13 frames.], batch size: 35, lr: 1.32e-04 2022-05-08 22:43:36,610 INFO [train.py:715] (0/8) Epoch 17, batch 4100, loss[loss=0.1448, simple_loss=0.2108, pruned_loss=0.0394, over 4826.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2055, pruned_loss=0.02892, over 973297.01 frames.], batch size: 15, lr: 1.32e-04 2022-05-08 22:44:17,667 INFO [train.py:715] (0/8) Epoch 17, batch 4150, loss[loss=0.1384, simple_loss=0.2151, pruned_loss=0.03087, over 4818.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2054, pruned_loss=0.0288, over 972847.99 frames.], batch size: 15, lr: 1.32e-04 2022-05-08 22:44:56,903 INFO [train.py:715] (0/8) Epoch 17, batch 4200, loss[loss=0.1159, simple_loss=0.1931, pruned_loss=0.0193, over 4781.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2055, pruned_loss=0.02896, over 972799.71 frames.], batch size: 18, lr: 1.32e-04 2022-05-08 22:45:36,941 INFO [train.py:715] (0/8) Epoch 17, batch 4250, loss[loss=0.1327, simple_loss=0.1998, pruned_loss=0.03286, over 4892.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2063, pruned_loss=0.02915, over 972457.86 frames.], batch size: 19, lr: 1.32e-04 2022-05-08 22:46:18,117 INFO [train.py:715] (0/8) Epoch 17, batch 4300, loss[loss=0.1405, simple_loss=0.2057, pruned_loss=0.03767, over 4952.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2072, pruned_loss=0.02934, over 972708.78 frames.], batch size: 29, lr: 1.31e-04 2022-05-08 22:46:58,165 INFO [train.py:715] (0/8) Epoch 17, batch 4350, 
loss[loss=0.139, simple_loss=0.2132, pruned_loss=0.03239, over 4854.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2078, pruned_loss=0.02956, over 972898.76 frames.], batch size: 34, lr: 1.31e-04 2022-05-08 22:47:38,038 INFO [train.py:715] (0/8) Epoch 17, batch 4400, loss[loss=0.1311, simple_loss=0.2074, pruned_loss=0.02739, over 4968.00 frames.], tot_loss[loss=0.133, simple_loss=0.2074, pruned_loss=0.02931, over 972307.21 frames.], batch size: 25, lr: 1.31e-04 2022-05-08 22:48:18,893 INFO [train.py:715] (0/8) Epoch 17, batch 4450, loss[loss=0.143, simple_loss=0.2161, pruned_loss=0.03493, over 4846.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2069, pruned_loss=0.02921, over 971984.78 frames.], batch size: 32, lr: 1.31e-04 2022-05-08 22:48:59,884 INFO [train.py:715] (0/8) Epoch 17, batch 4500, loss[loss=0.1108, simple_loss=0.1847, pruned_loss=0.01846, over 4765.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2065, pruned_loss=0.02899, over 971028.67 frames.], batch size: 16, lr: 1.31e-04 2022-05-08 22:49:39,756 INFO [train.py:715] (0/8) Epoch 17, batch 4550, loss[loss=0.1505, simple_loss=0.2204, pruned_loss=0.04029, over 4806.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2075, pruned_loss=0.02964, over 971093.37 frames.], batch size: 18, lr: 1.31e-04 2022-05-08 22:50:20,189 INFO [train.py:715] (0/8) Epoch 17, batch 4600, loss[loss=0.1245, simple_loss=0.2047, pruned_loss=0.02212, over 4986.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2078, pruned_loss=0.02933, over 970909.70 frames.], batch size: 28, lr: 1.31e-04 2022-05-08 22:51:01,212 INFO [train.py:715] (0/8) Epoch 17, batch 4650, loss[loss=0.1422, simple_loss=0.2212, pruned_loss=0.03166, over 4775.00 frames.], tot_loss[loss=0.133, simple_loss=0.2076, pruned_loss=0.02924, over 970851.11 frames.], batch size: 18, lr: 1.31e-04 2022-05-08 22:51:41,121 INFO [train.py:715] (0/8) Epoch 17, batch 4700, loss[loss=0.1155, simple_loss=0.1831, pruned_loss=0.024, over 4991.00 frames.], tot_loss[loss=0.133, simple_loss=0.2074, pruned_loss=0.02933, over 971864.65 frames.], batch size: 16, lr: 1.31e-04 2022-05-08 22:52:21,065 INFO [train.py:715] (0/8) Epoch 17, batch 4750, loss[loss=0.1337, simple_loss=0.214, pruned_loss=0.02675, over 4755.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2074, pruned_loss=0.02914, over 971639.26 frames.], batch size: 16, lr: 1.31e-04 2022-05-08 22:53:02,042 INFO [train.py:715] (0/8) Epoch 17, batch 4800, loss[loss=0.1387, simple_loss=0.2151, pruned_loss=0.03115, over 4897.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2071, pruned_loss=0.02935, over 972899.82 frames.], batch size: 19, lr: 1.31e-04 2022-05-08 22:53:42,800 INFO [train.py:715] (0/8) Epoch 17, batch 4850, loss[loss=0.1137, simple_loss=0.1929, pruned_loss=0.01721, over 4885.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2071, pruned_loss=0.02892, over 973436.23 frames.], batch size: 22, lr: 1.31e-04 2022-05-08 22:54:22,667 INFO [train.py:715] (0/8) Epoch 17, batch 4900, loss[loss=0.1394, simple_loss=0.2091, pruned_loss=0.03489, over 4880.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2068, pruned_loss=0.02908, over 972179.75 frames.], batch size: 32, lr: 1.31e-04 2022-05-08 22:55:03,095 INFO [train.py:715] (0/8) Epoch 17, batch 4950, loss[loss=0.1572, simple_loss=0.2275, pruned_loss=0.04342, over 4837.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2068, pruned_loss=0.02933, over 972800.77 frames.], batch size: 15, lr: 1.31e-04 2022-05-08 22:55:44,139 INFO [train.py:715] (0/8) Epoch 17, batch 5000, loss[loss=0.117, simple_loss=0.1967, 
pruned_loss=0.0187, over 4823.00 frames.], tot_loss[loss=0.133, simple_loss=0.2073, pruned_loss=0.02939, over 972390.23 frames.], batch size: 26, lr: 1.31e-04 2022-05-08 22:56:24,627 INFO [train.py:715] (0/8) Epoch 17, batch 5050, loss[loss=0.1679, simple_loss=0.2375, pruned_loss=0.04914, over 4776.00 frames.], tot_loss[loss=0.1325, simple_loss=0.207, pruned_loss=0.02906, over 972932.04 frames.], batch size: 14, lr: 1.31e-04 2022-05-08 22:57:04,161 INFO [train.py:715] (0/8) Epoch 17, batch 5100, loss[loss=0.1642, simple_loss=0.2187, pruned_loss=0.05481, over 4842.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2065, pruned_loss=0.02863, over 972763.52 frames.], batch size: 13, lr: 1.31e-04 2022-05-08 22:57:44,982 INFO [train.py:715] (0/8) Epoch 17, batch 5150, loss[loss=0.147, simple_loss=0.2211, pruned_loss=0.03645, over 4777.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2067, pruned_loss=0.02889, over 972810.56 frames.], batch size: 18, lr: 1.31e-04 2022-05-08 22:58:26,132 INFO [train.py:715] (0/8) Epoch 17, batch 5200, loss[loss=0.1369, simple_loss=0.212, pruned_loss=0.03089, over 4948.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2067, pruned_loss=0.02882, over 973095.56 frames.], batch size: 18, lr: 1.31e-04 2022-05-08 22:59:05,340 INFO [train.py:715] (0/8) Epoch 17, batch 5250, loss[loss=0.1417, simple_loss=0.2167, pruned_loss=0.03332, over 4845.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2065, pruned_loss=0.02869, over 972986.46 frames.], batch size: 30, lr: 1.31e-04 2022-05-08 22:59:44,887 INFO [train.py:715] (0/8) Epoch 17, batch 5300, loss[loss=0.1208, simple_loss=0.1945, pruned_loss=0.02349, over 4860.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2061, pruned_loss=0.0288, over 972721.53 frames.], batch size: 20, lr: 1.31e-04 2022-05-08 23:00:25,447 INFO [train.py:715] (0/8) Epoch 17, batch 5350, loss[loss=0.1407, simple_loss=0.2147, pruned_loss=0.03341, over 4876.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2059, pruned_loss=0.0286, over 972500.57 frames.], batch size: 32, lr: 1.31e-04 2022-05-08 23:01:06,236 INFO [train.py:715] (0/8) Epoch 17, batch 5400, loss[loss=0.1496, simple_loss=0.2252, pruned_loss=0.03703, over 4875.00 frames.], tot_loss[loss=0.1316, simple_loss=0.206, pruned_loss=0.02859, over 972120.34 frames.], batch size: 32, lr: 1.31e-04 2022-05-08 23:01:45,348 INFO [train.py:715] (0/8) Epoch 17, batch 5450, loss[loss=0.1594, simple_loss=0.2326, pruned_loss=0.04306, over 4925.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2073, pruned_loss=0.02922, over 972127.12 frames.], batch size: 23, lr: 1.31e-04 2022-05-08 23:02:26,547 INFO [train.py:715] (0/8) Epoch 17, batch 5500, loss[loss=0.1508, simple_loss=0.222, pruned_loss=0.03983, over 4925.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2081, pruned_loss=0.02973, over 971916.27 frames.], batch size: 18, lr: 1.31e-04 2022-05-08 23:03:07,878 INFO [train.py:715] (0/8) Epoch 17, batch 5550, loss[loss=0.1253, simple_loss=0.2003, pruned_loss=0.02509, over 4793.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2077, pruned_loss=0.02975, over 971339.29 frames.], batch size: 24, lr: 1.31e-04 2022-05-08 23:03:46,992 INFO [train.py:715] (0/8) Epoch 17, batch 5600, loss[loss=0.137, simple_loss=0.2206, pruned_loss=0.02671, over 4762.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2078, pruned_loss=0.02954, over 971903.73 frames.], batch size: 17, lr: 1.31e-04 2022-05-08 23:04:27,249 INFO [train.py:715] (0/8) Epoch 17, batch 5650, loss[loss=0.1275, simple_loss=0.2018, pruned_loss=0.0266, over 4876.00 
frames.], tot_loss[loss=0.1335, simple_loss=0.2081, pruned_loss=0.02942, over 971885.86 frames.], batch size: 16, lr: 1.31e-04 2022-05-08 23:05:08,282 INFO [train.py:715] (0/8) Epoch 17, batch 5700, loss[loss=0.1451, simple_loss=0.2105, pruned_loss=0.03988, over 4849.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2079, pruned_loss=0.0293, over 971771.11 frames.], batch size: 32, lr: 1.31e-04 2022-05-08 23:05:48,492 INFO [train.py:715] (0/8) Epoch 17, batch 5750, loss[loss=0.1636, simple_loss=0.2265, pruned_loss=0.05038, over 4851.00 frames.], tot_loss[loss=0.1333, simple_loss=0.208, pruned_loss=0.02927, over 971693.08 frames.], batch size: 13, lr: 1.31e-04 2022-05-08 23:06:27,749 INFO [train.py:715] (0/8) Epoch 17, batch 5800, loss[loss=0.1752, simple_loss=0.2537, pruned_loss=0.0484, over 4758.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2083, pruned_loss=0.02992, over 971893.87 frames.], batch size: 16, lr: 1.31e-04 2022-05-08 23:07:08,773 INFO [train.py:715] (0/8) Epoch 17, batch 5850, loss[loss=0.1274, simple_loss=0.1916, pruned_loss=0.03158, over 4735.00 frames.], tot_loss[loss=0.133, simple_loss=0.2073, pruned_loss=0.02935, over 972411.22 frames.], batch size: 16, lr: 1.31e-04 2022-05-08 23:07:49,077 INFO [train.py:715] (0/8) Epoch 17, batch 5900, loss[loss=0.1245, simple_loss=0.2051, pruned_loss=0.02193, over 4983.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2073, pruned_loss=0.02901, over 972178.01 frames.], batch size: 27, lr: 1.31e-04 2022-05-08 23:08:29,703 INFO [train.py:715] (0/8) Epoch 17, batch 5950, loss[loss=0.1473, simple_loss=0.2238, pruned_loss=0.03542, over 4958.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2073, pruned_loss=0.02924, over 971806.14 frames.], batch size: 24, lr: 1.31e-04 2022-05-08 23:09:09,143 INFO [train.py:715] (0/8) Epoch 17, batch 6000, loss[loss=0.1782, simple_loss=0.2516, pruned_loss=0.05236, over 4763.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2074, pruned_loss=0.02936, over 971820.47 frames.], batch size: 18, lr: 1.31e-04 2022-05-08 23:09:09,144 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 23:09:23,455 INFO [train.py:742] (0/8) Epoch 17, validation: loss=0.1047, simple_loss=0.1881, pruned_loss=0.01069, over 914524.00 frames. 
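For plotting loss curves or comparing runs it is usually easier to extract the tot_loss and validation numbers from entries like the ones above than to read them in place. A minimal parsing sketch; the filename train-log.txt is hypothetical, and the regular expressions simply mirror the entry format shown in this log:

    import re

    # tot_loss of a training entry: "Epoch E, batch B, ... tot_loss[loss=X"
    train_re = re.compile(
        r"Epoch (\d+), batch (\d+),.*?tot_loss\[loss=([\d.]+)", re.DOTALL
    )
    # validation entry: "Epoch E, validation: loss=X"
    valid_re = re.compile(r"Epoch (\d+), validation: loss=([\d.]+)")

    with open("train-log.txt") as f:  # hypothetical filename
        text = f.read()

    # re.DOTALL lets a match span entries that wrap across physical lines.
    train_points = [(int(e), int(b), float(x)) for e, b, x in train_re.findall(text)]
    valid_points = [(int(e), float(x)) for e, x in valid_re.findall(text)]
    print(len(train_points), train_points[-1])
    print(len(valid_points), valid_points[-1])
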
2022-05-08 23:10:02,838 INFO [train.py:715] (0/8) Epoch 17, batch 6050, loss[loss=0.1177, simple_loss=0.1872, pruned_loss=0.02412, over 4953.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2075, pruned_loss=0.02964, over 971336.59 frames.], batch size: 21, lr: 1.31e-04 2022-05-08 23:10:43,305 INFO [train.py:715] (0/8) Epoch 17, batch 6100, loss[loss=0.1443, simple_loss=0.228, pruned_loss=0.03024, over 4887.00 frames.], tot_loss[loss=0.133, simple_loss=0.207, pruned_loss=0.02945, over 971645.69 frames.], batch size: 19, lr: 1.31e-04 2022-05-08 23:11:22,424 INFO [train.py:715] (0/8) Epoch 17, batch 6150, loss[loss=0.1309, simple_loss=0.2095, pruned_loss=0.02616, over 4875.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2072, pruned_loss=0.02947, over 972041.80 frames.], batch size: 22, lr: 1.31e-04 2022-05-08 23:12:02,006 INFO [train.py:715] (0/8) Epoch 17, batch 6200, loss[loss=0.1768, simple_loss=0.2392, pruned_loss=0.05721, over 4919.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2076, pruned_loss=0.02971, over 972345.20 frames.], batch size: 39, lr: 1.31e-04 2022-05-08 23:12:42,481 INFO [train.py:715] (0/8) Epoch 17, batch 6250, loss[loss=0.1299, simple_loss=0.2021, pruned_loss=0.02887, over 4905.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2066, pruned_loss=0.02918, over 972673.49 frames.], batch size: 17, lr: 1.31e-04 2022-05-08 23:13:22,265 INFO [train.py:715] (0/8) Epoch 17, batch 6300, loss[loss=0.1218, simple_loss=0.1844, pruned_loss=0.02957, over 4973.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2062, pruned_loss=0.02923, over 972930.86 frames.], batch size: 14, lr: 1.31e-04 2022-05-08 23:14:01,680 INFO [train.py:715] (0/8) Epoch 17, batch 6350, loss[loss=0.1099, simple_loss=0.1782, pruned_loss=0.02074, over 4750.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2066, pruned_loss=0.02926, over 972449.64 frames.], batch size: 19, lr: 1.31e-04 2022-05-08 23:14:41,497 INFO [train.py:715] (0/8) Epoch 17, batch 6400, loss[loss=0.1459, simple_loss=0.2175, pruned_loss=0.03713, over 4966.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2067, pruned_loss=0.02926, over 972594.74 frames.], batch size: 39, lr: 1.31e-04 2022-05-08 23:15:21,775 INFO [train.py:715] (0/8) Epoch 17, batch 6450, loss[loss=0.1289, simple_loss=0.2079, pruned_loss=0.02497, over 4913.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2066, pruned_loss=0.02924, over 972685.54 frames.], batch size: 18, lr: 1.31e-04 2022-05-08 23:16:01,143 INFO [train.py:715] (0/8) Epoch 17, batch 6500, loss[loss=0.1311, simple_loss=0.209, pruned_loss=0.02665, over 4786.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2063, pruned_loss=0.02925, over 972909.94 frames.], batch size: 21, lr: 1.31e-04 2022-05-08 23:16:40,474 INFO [train.py:715] (0/8) Epoch 17, batch 6550, loss[loss=0.1379, simple_loss=0.2041, pruned_loss=0.03582, over 4872.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2066, pruned_loss=0.02937, over 972455.85 frames.], batch size: 16, lr: 1.31e-04 2022-05-08 23:17:20,846 INFO [train.py:715] (0/8) Epoch 17, batch 6600, loss[loss=0.1235, simple_loss=0.1964, pruned_loss=0.02535, over 4845.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2062, pruned_loss=0.0291, over 971672.96 frames.], batch size: 20, lr: 1.31e-04 2022-05-08 23:18:01,034 INFO [train.py:715] (0/8) Epoch 17, batch 6650, loss[loss=0.1033, simple_loss=0.184, pruned_loss=0.01127, over 4909.00 frames.], tot_loss[loss=0.132, simple_loss=0.2061, pruned_loss=0.02894, over 971615.62 frames.], batch size: 23, lr: 1.31e-04 2022-05-08 23:18:40,482 INFO 
[train.py:715] (0/8) Epoch 17, batch 6700, loss[loss=0.1274, simple_loss=0.2131, pruned_loss=0.02085, over 4780.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2058, pruned_loss=0.02847, over 971322.20 frames.], batch size: 18, lr: 1.31e-04 2022-05-08 23:19:20,728 INFO [train.py:715] (0/8) Epoch 17, batch 6750, loss[loss=0.1367, simple_loss=0.2124, pruned_loss=0.03047, over 4694.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2059, pruned_loss=0.02881, over 970709.04 frames.], batch size: 15, lr: 1.31e-04 2022-05-08 23:20:00,495 INFO [train.py:715] (0/8) Epoch 17, batch 6800, loss[loss=0.1089, simple_loss=0.1816, pruned_loss=0.01812, over 4790.00 frames.], tot_loss[loss=0.132, simple_loss=0.206, pruned_loss=0.02905, over 970479.35 frames.], batch size: 17, lr: 1.31e-04 2022-05-08 23:20:41,163 INFO [train.py:715] (0/8) Epoch 17, batch 6850, loss[loss=0.1303, simple_loss=0.2052, pruned_loss=0.02771, over 4893.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2065, pruned_loss=0.02911, over 970078.68 frames.], batch size: 16, lr: 1.31e-04 2022-05-08 23:21:20,237 INFO [train.py:715] (0/8) Epoch 17, batch 6900, loss[loss=0.1579, simple_loss=0.2335, pruned_loss=0.04112, over 4939.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2075, pruned_loss=0.02941, over 970532.08 frames.], batch size: 29, lr: 1.31e-04 2022-05-08 23:22:00,926 INFO [train.py:715] (0/8) Epoch 17, batch 6950, loss[loss=0.1213, simple_loss=0.198, pruned_loss=0.02229, over 4959.00 frames.], tot_loss[loss=0.133, simple_loss=0.2073, pruned_loss=0.02936, over 970866.94 frames.], batch size: 15, lr: 1.31e-04 2022-05-08 23:22:40,662 INFO [train.py:715] (0/8) Epoch 17, batch 7000, loss[loss=0.147, simple_loss=0.2179, pruned_loss=0.03811, over 4882.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2069, pruned_loss=0.02925, over 971932.71 frames.], batch size: 16, lr: 1.31e-04 2022-05-08 23:23:20,233 INFO [train.py:715] (0/8) Epoch 17, batch 7050, loss[loss=0.116, simple_loss=0.1925, pruned_loss=0.01977, over 4852.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2079, pruned_loss=0.02977, over 971440.22 frames.], batch size: 13, lr: 1.31e-04 2022-05-08 23:24:00,497 INFO [train.py:715] (0/8) Epoch 17, batch 7100, loss[loss=0.1335, simple_loss=0.215, pruned_loss=0.02605, over 4911.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2075, pruned_loss=0.02959, over 972326.96 frames.], batch size: 39, lr: 1.31e-04 2022-05-08 23:24:40,019 INFO [train.py:715] (0/8) Epoch 17, batch 7150, loss[loss=0.1132, simple_loss=0.1894, pruned_loss=0.01852, over 4800.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2067, pruned_loss=0.02944, over 973296.15 frames.], batch size: 21, lr: 1.31e-04 2022-05-08 23:25:19,631 INFO [train.py:715] (0/8) Epoch 17, batch 7200, loss[loss=0.1298, simple_loss=0.1989, pruned_loss=0.03036, over 4936.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2079, pruned_loss=0.0299, over 973593.42 frames.], batch size: 23, lr: 1.31e-04 2022-05-08 23:25:58,579 INFO [train.py:715] (0/8) Epoch 17, batch 7250, loss[loss=0.1593, simple_loss=0.2326, pruned_loss=0.043, over 4931.00 frames.], tot_loss[loss=0.133, simple_loss=0.2069, pruned_loss=0.02959, over 973322.08 frames.], batch size: 21, lr: 1.31e-04 2022-05-08 23:26:39,074 INFO [train.py:715] (0/8) Epoch 17, batch 7300, loss[loss=0.1293, simple_loss=0.1975, pruned_loss=0.03054, over 4776.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2077, pruned_loss=0.02975, over 972451.04 frames.], batch size: 17, lr: 1.31e-04 2022-05-08 23:27:18,022 INFO [train.py:715] (0/8) Epoch 17, batch 7350, 
loss[loss=0.102, simple_loss=0.1694, pruned_loss=0.01725, over 4770.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2075, pruned_loss=0.0299, over 972824.94 frames.], batch size: 12, lr: 1.31e-04 2022-05-08 23:27:56,385 INFO [train.py:715] (0/8) Epoch 17, batch 7400, loss[loss=0.1509, simple_loss=0.2319, pruned_loss=0.0349, over 4941.00 frames.], tot_loss[loss=0.134, simple_loss=0.2079, pruned_loss=0.03001, over 973236.84 frames.], batch size: 35, lr: 1.31e-04 2022-05-08 23:28:36,426 INFO [train.py:715] (0/8) Epoch 17, batch 7450, loss[loss=0.146, simple_loss=0.2172, pruned_loss=0.03742, over 4973.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2075, pruned_loss=0.02957, over 972698.96 frames.], batch size: 15, lr: 1.31e-04 2022-05-08 23:29:15,430 INFO [train.py:715] (0/8) Epoch 17, batch 7500, loss[loss=0.1377, simple_loss=0.2088, pruned_loss=0.03328, over 4878.00 frames.], tot_loss[loss=0.133, simple_loss=0.2073, pruned_loss=0.02932, over 972352.42 frames.], batch size: 22, lr: 1.31e-04 2022-05-08 23:29:55,162 INFO [train.py:715] (0/8) Epoch 17, batch 7550, loss[loss=0.1524, simple_loss=0.2282, pruned_loss=0.03833, over 4907.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2074, pruned_loss=0.02921, over 971375.11 frames.], batch size: 39, lr: 1.31e-04 2022-05-08 23:30:34,490 INFO [train.py:715] (0/8) Epoch 17, batch 7600, loss[loss=0.1263, simple_loss=0.2025, pruned_loss=0.02508, over 4830.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2082, pruned_loss=0.02964, over 971293.05 frames.], batch size: 15, lr: 1.31e-04 2022-05-08 23:31:14,610 INFO [train.py:715] (0/8) Epoch 17, batch 7650, loss[loss=0.1443, simple_loss=0.2265, pruned_loss=0.03099, over 4877.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2083, pruned_loss=0.02966, over 971286.67 frames.], batch size: 20, lr: 1.31e-04 2022-05-08 23:31:54,495 INFO [train.py:715] (0/8) Epoch 17, batch 7700, loss[loss=0.1309, simple_loss=0.214, pruned_loss=0.02392, over 4939.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2087, pruned_loss=0.03005, over 971792.69 frames.], batch size: 21, lr: 1.31e-04 2022-05-08 23:32:33,790 INFO [train.py:715] (0/8) Epoch 17, batch 7750, loss[loss=0.1503, simple_loss=0.2331, pruned_loss=0.03378, over 4760.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2083, pruned_loss=0.02976, over 970886.95 frames.], batch size: 16, lr: 1.31e-04 2022-05-08 23:33:14,390 INFO [train.py:715] (0/8) Epoch 17, batch 7800, loss[loss=0.1001, simple_loss=0.1753, pruned_loss=0.01248, over 4828.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2083, pruned_loss=0.02979, over 971110.28 frames.], batch size: 26, lr: 1.31e-04 2022-05-08 23:33:54,605 INFO [train.py:715] (0/8) Epoch 17, batch 7850, loss[loss=0.1374, simple_loss=0.2199, pruned_loss=0.02746, over 4987.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2091, pruned_loss=0.03002, over 971924.21 frames.], batch size: 26, lr: 1.31e-04 2022-05-08 23:34:34,853 INFO [train.py:715] (0/8) Epoch 17, batch 7900, loss[loss=0.1256, simple_loss=0.2016, pruned_loss=0.02484, over 4708.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2088, pruned_loss=0.02983, over 972005.19 frames.], batch size: 15, lr: 1.31e-04 2022-05-08 23:35:13,812 INFO [train.py:715] (0/8) Epoch 17, batch 7950, loss[loss=0.1323, simple_loss=0.2144, pruned_loss=0.02506, over 4921.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2079, pruned_loss=0.02939, over 972816.60 frames.], batch size: 18, lr: 1.31e-04 2022-05-08 23:35:53,564 INFO [train.py:715] (0/8) Epoch 17, batch 8000, loss[loss=0.1239, simple_loss=0.1915, 
pruned_loss=0.02812, over 4993.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2069, pruned_loss=0.02914, over 973553.19 frames.], batch size: 14, lr: 1.31e-04 2022-05-08 23:36:33,456 INFO [train.py:715] (0/8) Epoch 17, batch 8050, loss[loss=0.1247, simple_loss=0.2051, pruned_loss=0.02214, over 4804.00 frames.], tot_loss[loss=0.1328, simple_loss=0.207, pruned_loss=0.02926, over 972985.00 frames.], batch size: 24, lr: 1.31e-04 2022-05-08 23:37:12,790 INFO [train.py:715] (0/8) Epoch 17, batch 8100, loss[loss=0.1148, simple_loss=0.183, pruned_loss=0.02334, over 4752.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2067, pruned_loss=0.02884, over 973387.68 frames.], batch size: 12, lr: 1.31e-04 2022-05-08 23:37:52,693 INFO [train.py:715] (0/8) Epoch 17, batch 8150, loss[loss=0.1518, simple_loss=0.2272, pruned_loss=0.03823, over 4809.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2069, pruned_loss=0.02926, over 973133.14 frames.], batch size: 21, lr: 1.31e-04 2022-05-08 23:38:32,360 INFO [train.py:715] (0/8) Epoch 17, batch 8200, loss[loss=0.1207, simple_loss=0.2011, pruned_loss=0.02012, over 4734.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2071, pruned_loss=0.02924, over 972724.97 frames.], batch size: 16, lr: 1.31e-04 2022-05-08 23:39:04,816 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-600000.pt 2022-05-08 23:39:14,693 INFO [train.py:715] (0/8) Epoch 17, batch 8250, loss[loss=0.1182, simple_loss=0.1893, pruned_loss=0.02349, over 4834.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2077, pruned_loss=0.02991, over 972300.35 frames.], batch size: 15, lr: 1.31e-04 2022-05-08 23:39:53,905 INFO [train.py:715] (0/8) Epoch 17, batch 8300, loss[loss=0.1427, simple_loss=0.2283, pruned_loss=0.02857, over 4825.00 frames.], tot_loss[loss=0.1345, simple_loss=0.2087, pruned_loss=0.03021, over 972432.11 frames.], batch size: 25, lr: 1.31e-04 2022-05-08 23:40:33,623 INFO [train.py:715] (0/8) Epoch 17, batch 8350, loss[loss=0.1695, simple_loss=0.2615, pruned_loss=0.0388, over 4951.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2083, pruned_loss=0.03005, over 971651.52 frames.], batch size: 21, lr: 1.31e-04 2022-05-08 23:41:13,215 INFO [train.py:715] (0/8) Epoch 17, batch 8400, loss[loss=0.1205, simple_loss=0.1968, pruned_loss=0.02213, over 4985.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2074, pruned_loss=0.02968, over 972199.81 frames.], batch size: 25, lr: 1.31e-04 2022-05-08 23:41:52,761 INFO [train.py:715] (0/8) Epoch 17, batch 8450, loss[loss=0.1073, simple_loss=0.1857, pruned_loss=0.01445, over 4829.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2066, pruned_loss=0.0295, over 971654.97 frames.], batch size: 27, lr: 1.31e-04 2022-05-08 23:42:32,328 INFO [train.py:715] (0/8) Epoch 17, batch 8500, loss[loss=0.1156, simple_loss=0.2001, pruned_loss=0.01558, over 4816.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2064, pruned_loss=0.02967, over 972017.22 frames.], batch size: 25, lr: 1.31e-04 2022-05-08 23:43:12,146 INFO [train.py:715] (0/8) Epoch 17, batch 8550, loss[loss=0.1322, simple_loss=0.2139, pruned_loss=0.02527, over 4903.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2068, pruned_loss=0.02953, over 971854.60 frames.], batch size: 19, lr: 1.31e-04 2022-05-08 23:43:52,007 INFO [train.py:715] (0/8) Epoch 17, batch 8600, loss[loss=0.1313, simple_loss=0.2124, pruned_loss=0.02504, over 4873.00 frames.], tot_loss[loss=0.133, simple_loss=0.2068, pruned_loss=0.02962, over 971525.56 frames.], batch size: 22, lr: 1.31e-04 2022-05-08 
23:44:31,014 INFO [train.py:715] (0/8) Epoch 17, batch 8650, loss[loss=0.1066, simple_loss=0.1904, pruned_loss=0.01141, over 4791.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2067, pruned_loss=0.02923, over 971615.52 frames.], batch size: 14, lr: 1.31e-04 2022-05-08 23:45:10,886 INFO [train.py:715] (0/8) Epoch 17, batch 8700, loss[loss=0.1207, simple_loss=0.1999, pruned_loss=0.02082, over 4929.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2071, pruned_loss=0.02952, over 971789.18 frames.], batch size: 29, lr: 1.31e-04 2022-05-08 23:45:50,293 INFO [train.py:715] (0/8) Epoch 17, batch 8750, loss[loss=0.1469, simple_loss=0.2136, pruned_loss=0.04009, over 4768.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2072, pruned_loss=0.02956, over 972130.91 frames.], batch size: 17, lr: 1.31e-04 2022-05-08 23:46:29,855 INFO [train.py:715] (0/8) Epoch 17, batch 8800, loss[loss=0.1672, simple_loss=0.2352, pruned_loss=0.04964, over 4756.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2069, pruned_loss=0.02965, over 971921.37 frames.], batch size: 16, lr: 1.31e-04 2022-05-08 23:47:09,588 INFO [train.py:715] (0/8) Epoch 17, batch 8850, loss[loss=0.1285, simple_loss=0.2018, pruned_loss=0.02764, over 4962.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2063, pruned_loss=0.02941, over 972372.17 frames.], batch size: 21, lr: 1.31e-04 2022-05-08 23:47:48,797 INFO [train.py:715] (0/8) Epoch 17, batch 8900, loss[loss=0.153, simple_loss=0.2118, pruned_loss=0.04708, over 4928.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2057, pruned_loss=0.02905, over 972219.54 frames.], batch size: 35, lr: 1.31e-04 2022-05-08 23:48:28,443 INFO [train.py:715] (0/8) Epoch 17, batch 8950, loss[loss=0.146, simple_loss=0.2202, pruned_loss=0.0359, over 4886.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2059, pruned_loss=0.02923, over 971741.89 frames.], batch size: 22, lr: 1.31e-04 2022-05-08 23:49:07,473 INFO [train.py:715] (0/8) Epoch 17, batch 9000, loss[loss=0.136, simple_loss=0.2178, pruned_loss=0.02707, over 4823.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2065, pruned_loss=0.02939, over 971055.11 frames.], batch size: 25, lr: 1.31e-04 2022-05-08 23:49:07,474 INFO [train.py:733] (0/8) Computing validation loss 2022-05-08 23:49:17,247 INFO [train.py:742] (0/8) Epoch 17, validation: loss=0.1048, simple_loss=0.1882, pruned_loss=0.01072, over 914524.00 frames. 
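A note on how the three reported numbers relate: in both the per-batch entries and the validation entry just above, the combined loss is consistent with a weighted sum of simple_loss and pruned_loss using a simple-loss weight of 0.5 (for the validation entry, 0.5 * 0.1882 + 0.01072 ≈ 0.1048). The sketch below only checks this relation against values copied from the log; the 0.5 weight is inferred from those numbers, not read out of train.py.

```python
# Minimal sketch: the logged combined loss is consistent with
# 0.5 * simple_loss + pruned_loss (weight inferred from the log, not from train.py).
samples = [
    # (loss, simple_loss, pruned_loss) copied from entries above
    (0.1048, 0.1882, 0.01072),  # Epoch 17, validation
    (0.1340, 0.2079, 0.03001),  # Epoch 17, batch 7400, tot_loss
    (0.1326, 0.2069, 0.02914),  # Epoch 17, batch 8000, tot_loss
]

SIMPLE_LOSS_WEIGHT = 0.5  # assumption inferred from the logged values

for loss, simple_loss, pruned_loss in samples:
    combined = SIMPLE_LOSS_WEIGHT * simple_loss + pruned_loss
    # The log rounds to about four significant digits, hence the tolerance.
    assert abs(combined - loss) < 5e-4, (loss, combined)
print("logged loss matches 0.5 * simple_loss + pruned_loss for all samples")
```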
2022-05-08 23:49:56,408 INFO [train.py:715] (0/8) Epoch 17, batch 9050, loss[loss=0.1497, simple_loss=0.2398, pruned_loss=0.02984, over 4923.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2079, pruned_loss=0.02987, over 971187.74 frames.], batch size: 35, lr: 1.31e-04 2022-05-08 23:50:36,246 INFO [train.py:715] (0/8) Epoch 17, batch 9100, loss[loss=0.1151, simple_loss=0.195, pruned_loss=0.01762, over 4927.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2085, pruned_loss=0.03018, over 971618.32 frames.], batch size: 29, lr: 1.31e-04 2022-05-08 23:51:15,867 INFO [train.py:715] (0/8) Epoch 17, batch 9150, loss[loss=0.1503, simple_loss=0.2146, pruned_loss=0.04299, over 4831.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2076, pruned_loss=0.02943, over 971801.03 frames.], batch size: 30, lr: 1.31e-04 2022-05-08 23:51:54,745 INFO [train.py:715] (0/8) Epoch 17, batch 9200, loss[loss=0.1302, simple_loss=0.2045, pruned_loss=0.02796, over 4803.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2078, pruned_loss=0.02962, over 972073.93 frames.], batch size: 25, lr: 1.31e-04 2022-05-08 23:52:34,933 INFO [train.py:715] (0/8) Epoch 17, batch 9250, loss[loss=0.1479, simple_loss=0.2131, pruned_loss=0.04137, over 4973.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2072, pruned_loss=0.02932, over 973544.86 frames.], batch size: 24, lr: 1.31e-04 2022-05-08 23:53:14,613 INFO [train.py:715] (0/8) Epoch 17, batch 9300, loss[loss=0.1595, simple_loss=0.2267, pruned_loss=0.04615, over 4691.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2074, pruned_loss=0.02916, over 973156.24 frames.], batch size: 15, lr: 1.31e-04 2022-05-08 23:53:53,949 INFO [train.py:715] (0/8) Epoch 17, batch 9350, loss[loss=0.1296, simple_loss=0.1988, pruned_loss=0.03016, over 4848.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2079, pruned_loss=0.02954, over 973396.99 frames.], batch size: 32, lr: 1.31e-04 2022-05-08 23:54:33,279 INFO [train.py:715] (0/8) Epoch 17, batch 9400, loss[loss=0.1461, simple_loss=0.2161, pruned_loss=0.03806, over 4824.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2075, pruned_loss=0.02916, over 973194.83 frames.], batch size: 15, lr: 1.31e-04 2022-05-08 23:55:13,698 INFO [train.py:715] (0/8) Epoch 17, batch 9450, loss[loss=0.1452, simple_loss=0.2121, pruned_loss=0.03914, over 4775.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2074, pruned_loss=0.0295, over 972269.22 frames.], batch size: 18, lr: 1.31e-04 2022-05-08 23:55:53,690 INFO [train.py:715] (0/8) Epoch 17, batch 9500, loss[loss=0.1268, simple_loss=0.2028, pruned_loss=0.0254, over 4773.00 frames.], tot_loss[loss=0.1336, simple_loss=0.208, pruned_loss=0.02961, over 972585.96 frames.], batch size: 14, lr: 1.31e-04 2022-05-08 23:56:32,924 INFO [train.py:715] (0/8) Epoch 17, batch 9550, loss[loss=0.1044, simple_loss=0.1782, pruned_loss=0.01536, over 4874.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2079, pruned_loss=0.02972, over 971995.89 frames.], batch size: 22, lr: 1.31e-04 2022-05-08 23:57:12,482 INFO [train.py:715] (0/8) Epoch 17, batch 9600, loss[loss=0.1218, simple_loss=0.1977, pruned_loss=0.02294, over 4939.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2082, pruned_loss=0.02959, over 972028.57 frames.], batch size: 29, lr: 1.31e-04 2022-05-08 23:57:52,756 INFO [train.py:715] (0/8) Epoch 17, batch 9650, loss[loss=0.1492, simple_loss=0.2134, pruned_loss=0.04252, over 4830.00 frames.], tot_loss[loss=0.133, simple_loss=0.2073, pruned_loss=0.02936, over 972029.82 frames.], batch size: 15, lr: 1.31e-04 2022-05-08 23:58:31,947 INFO 
[train.py:715] (0/8) Epoch 17, batch 9700, loss[loss=0.1431, simple_loss=0.2113, pruned_loss=0.03739, over 4909.00 frames.], tot_loss[loss=0.133, simple_loss=0.2073, pruned_loss=0.02938, over 972036.60 frames.], batch size: 17, lr: 1.31e-04 2022-05-08 23:59:11,713 INFO [train.py:715] (0/8) Epoch 17, batch 9750, loss[loss=0.1303, simple_loss=0.1996, pruned_loss=0.03045, over 4798.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2077, pruned_loss=0.02926, over 972711.67 frames.], batch size: 14, lr: 1.31e-04 2022-05-08 23:59:51,455 INFO [train.py:715] (0/8) Epoch 17, batch 9800, loss[loss=0.108, simple_loss=0.1792, pruned_loss=0.01843, over 4741.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2079, pruned_loss=0.02939, over 972980.96 frames.], batch size: 12, lr: 1.31e-04 2022-05-09 00:00:31,044 INFO [train.py:715] (0/8) Epoch 17, batch 9850, loss[loss=0.1228, simple_loss=0.1906, pruned_loss=0.02751, over 4980.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2081, pruned_loss=0.02959, over 972658.16 frames.], batch size: 35, lr: 1.31e-04 2022-05-09 00:01:10,440 INFO [train.py:715] (0/8) Epoch 17, batch 9900, loss[loss=0.1318, simple_loss=0.2061, pruned_loss=0.02873, over 4807.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2079, pruned_loss=0.0294, over 971991.14 frames.], batch size: 14, lr: 1.31e-04 2022-05-09 00:01:49,847 INFO [train.py:715] (0/8) Epoch 17, batch 9950, loss[loss=0.1569, simple_loss=0.2228, pruned_loss=0.04548, over 4708.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2073, pruned_loss=0.02944, over 972543.47 frames.], batch size: 15, lr: 1.31e-04 2022-05-09 00:02:30,138 INFO [train.py:715] (0/8) Epoch 17, batch 10000, loss[loss=0.1444, simple_loss=0.2148, pruned_loss=0.03703, over 4981.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2079, pruned_loss=0.02995, over 973273.35 frames.], batch size: 27, lr: 1.31e-04 2022-05-09 00:03:09,387 INFO [train.py:715] (0/8) Epoch 17, batch 10050, loss[loss=0.1112, simple_loss=0.1978, pruned_loss=0.01232, over 4978.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2075, pruned_loss=0.02946, over 974343.02 frames.], batch size: 28, lr: 1.31e-04 2022-05-09 00:03:48,273 INFO [train.py:715] (0/8) Epoch 17, batch 10100, loss[loss=0.1465, simple_loss=0.2159, pruned_loss=0.03852, over 4970.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2071, pruned_loss=0.02897, over 974389.34 frames.], batch size: 15, lr: 1.31e-04 2022-05-09 00:04:27,590 INFO [train.py:715] (0/8) Epoch 17, batch 10150, loss[loss=0.1182, simple_loss=0.1873, pruned_loss=0.0245, over 4938.00 frames.], tot_loss[loss=0.1326, simple_loss=0.207, pruned_loss=0.02909, over 974618.63 frames.], batch size: 21, lr: 1.31e-04 2022-05-09 00:05:06,927 INFO [train.py:715] (0/8) Epoch 17, batch 10200, loss[loss=0.1354, simple_loss=0.2067, pruned_loss=0.03204, over 4861.00 frames.], tot_loss[loss=0.1324, simple_loss=0.207, pruned_loss=0.02883, over 974388.62 frames.], batch size: 20, lr: 1.31e-04 2022-05-09 00:05:44,868 INFO [train.py:715] (0/8) Epoch 17, batch 10250, loss[loss=0.1489, simple_loss=0.2274, pruned_loss=0.03516, over 4735.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2067, pruned_loss=0.02892, over 973833.97 frames.], batch size: 16, lr: 1.31e-04 2022-05-09 00:06:24,646 INFO [train.py:715] (0/8) Epoch 17, batch 10300, loss[loss=0.136, simple_loss=0.2142, pruned_loss=0.02894, over 4778.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2074, pruned_loss=0.02913, over 973809.85 frames.], batch size: 18, lr: 1.31e-04 2022-05-09 00:07:04,574 INFO [train.py:715] (0/8) Epoch 17, 
batch 10350, loss[loss=0.1275, simple_loss=0.2126, pruned_loss=0.02119, over 4716.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2076, pruned_loss=0.02894, over 973247.65 frames.], batch size: 15, lr: 1.31e-04 2022-05-09 00:07:43,242 INFO [train.py:715] (0/8) Epoch 17, batch 10400, loss[loss=0.1488, simple_loss=0.2142, pruned_loss=0.04173, over 4955.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2079, pruned_loss=0.02935, over 973972.95 frames.], batch size: 35, lr: 1.31e-04 2022-05-09 00:08:22,340 INFO [train.py:715] (0/8) Epoch 17, batch 10450, loss[loss=0.118, simple_loss=0.193, pruned_loss=0.02151, over 4932.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2073, pruned_loss=0.02905, over 973843.35 frames.], batch size: 23, lr: 1.31e-04 2022-05-09 00:09:02,375 INFO [train.py:715] (0/8) Epoch 17, batch 10500, loss[loss=0.142, simple_loss=0.2242, pruned_loss=0.02993, over 4911.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2073, pruned_loss=0.02913, over 973799.03 frames.], batch size: 39, lr: 1.31e-04 2022-05-09 00:09:41,411 INFO [train.py:715] (0/8) Epoch 17, batch 10550, loss[loss=0.149, simple_loss=0.2324, pruned_loss=0.03287, over 4698.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2069, pruned_loss=0.02887, over 973273.84 frames.], batch size: 15, lr: 1.31e-04 2022-05-09 00:10:19,761 INFO [train.py:715] (0/8) Epoch 17, batch 10600, loss[loss=0.1416, simple_loss=0.2055, pruned_loss=0.03881, over 4861.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2069, pruned_loss=0.02895, over 972602.66 frames.], batch size: 20, lr: 1.31e-04 2022-05-09 00:10:59,062 INFO [train.py:715] (0/8) Epoch 17, batch 10650, loss[loss=0.1977, simple_loss=0.2911, pruned_loss=0.05216, over 4689.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2069, pruned_loss=0.02892, over 973049.91 frames.], batch size: 15, lr: 1.31e-04 2022-05-09 00:11:38,574 INFO [train.py:715] (0/8) Epoch 17, batch 10700, loss[loss=0.1502, simple_loss=0.2279, pruned_loss=0.0363, over 4813.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2062, pruned_loss=0.02847, over 972511.96 frames.], batch size: 25, lr: 1.31e-04 2022-05-09 00:12:17,254 INFO [train.py:715] (0/8) Epoch 17, batch 10750, loss[loss=0.1295, simple_loss=0.2074, pruned_loss=0.0258, over 4847.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2063, pruned_loss=0.02846, over 972022.54 frames.], batch size: 32, lr: 1.31e-04 2022-05-09 00:12:56,255 INFO [train.py:715] (0/8) Epoch 17, batch 10800, loss[loss=0.1247, simple_loss=0.1982, pruned_loss=0.02561, over 4865.00 frames.], tot_loss[loss=0.132, simple_loss=0.2063, pruned_loss=0.02881, over 972970.86 frames.], batch size: 22, lr: 1.31e-04 2022-05-09 00:13:36,019 INFO [train.py:715] (0/8) Epoch 17, batch 10850, loss[loss=0.1282, simple_loss=0.2058, pruned_loss=0.02532, over 4969.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2061, pruned_loss=0.02877, over 973082.76 frames.], batch size: 15, lr: 1.31e-04 2022-05-09 00:14:15,587 INFO [train.py:715] (0/8) Epoch 17, batch 10900, loss[loss=0.1365, simple_loss=0.2044, pruned_loss=0.03435, over 4821.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2057, pruned_loss=0.02843, over 972563.54 frames.], batch size: 15, lr: 1.31e-04 2022-05-09 00:14:53,757 INFO [train.py:715] (0/8) Epoch 17, batch 10950, loss[loss=0.1402, simple_loss=0.2163, pruned_loss=0.0321, over 4891.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2072, pruned_loss=0.02909, over 971919.60 frames.], batch size: 32, lr: 1.31e-04 2022-05-09 00:15:33,874 INFO [train.py:715] (0/8) Epoch 17, batch 11000, 
loss[loss=0.1175, simple_loss=0.1967, pruned_loss=0.01918, over 4985.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2062, pruned_loss=0.02875, over 971208.64 frames.], batch size: 28, lr: 1.31e-04 2022-05-09 00:16:13,743 INFO [train.py:715] (0/8) Epoch 17, batch 11050, loss[loss=0.122, simple_loss=0.2005, pruned_loss=0.0218, over 4701.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2062, pruned_loss=0.02866, over 971489.29 frames.], batch size: 15, lr: 1.31e-04 2022-05-09 00:16:52,422 INFO [train.py:715] (0/8) Epoch 17, batch 11100, loss[loss=0.106, simple_loss=0.1875, pruned_loss=0.01224, over 4811.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2057, pruned_loss=0.02859, over 971404.02 frames.], batch size: 26, lr: 1.31e-04 2022-05-09 00:17:31,477 INFO [train.py:715] (0/8) Epoch 17, batch 11150, loss[loss=0.1165, simple_loss=0.197, pruned_loss=0.01798, over 4939.00 frames.], tot_loss[loss=0.1319, simple_loss=0.206, pruned_loss=0.02893, over 970934.43 frames.], batch size: 21, lr: 1.31e-04 2022-05-09 00:18:11,484 INFO [train.py:715] (0/8) Epoch 17, batch 11200, loss[loss=0.1723, simple_loss=0.2469, pruned_loss=0.04889, over 4839.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2071, pruned_loss=0.02906, over 970915.11 frames.], batch size: 15, lr: 1.31e-04 2022-05-09 00:18:51,603 INFO [train.py:715] (0/8) Epoch 17, batch 11250, loss[loss=0.132, simple_loss=0.2036, pruned_loss=0.03015, over 4870.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2078, pruned_loss=0.02953, over 970739.89 frames.], batch size: 22, lr: 1.31e-04 2022-05-09 00:19:29,833 INFO [train.py:715] (0/8) Epoch 17, batch 11300, loss[loss=0.1423, simple_loss=0.2194, pruned_loss=0.03257, over 4798.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2072, pruned_loss=0.02928, over 970849.28 frames.], batch size: 21, lr: 1.31e-04 2022-05-09 00:20:09,302 INFO [train.py:715] (0/8) Epoch 17, batch 11350, loss[loss=0.1315, simple_loss=0.2099, pruned_loss=0.02657, over 4808.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2083, pruned_loss=0.0297, over 971008.46 frames.], batch size: 13, lr: 1.31e-04 2022-05-09 00:20:49,485 INFO [train.py:715] (0/8) Epoch 17, batch 11400, loss[loss=0.1184, simple_loss=0.1931, pruned_loss=0.02185, over 4756.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2076, pruned_loss=0.0298, over 971046.51 frames.], batch size: 16, lr: 1.31e-04 2022-05-09 00:21:28,498 INFO [train.py:715] (0/8) Epoch 17, batch 11450, loss[loss=0.1138, simple_loss=0.187, pruned_loss=0.02026, over 4889.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2074, pruned_loss=0.02968, over 972748.01 frames.], batch size: 22, lr: 1.31e-04 2022-05-09 00:22:07,510 INFO [train.py:715] (0/8) Epoch 17, batch 11500, loss[loss=0.1165, simple_loss=0.1923, pruned_loss=0.02041, over 4975.00 frames.], tot_loss[loss=0.1341, simple_loss=0.208, pruned_loss=0.0301, over 972203.09 frames.], batch size: 14, lr: 1.31e-04 2022-05-09 00:22:47,218 INFO [train.py:715] (0/8) Epoch 17, batch 11550, loss[loss=0.1492, simple_loss=0.2303, pruned_loss=0.03409, over 4810.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2079, pruned_loss=0.02977, over 972897.98 frames.], batch size: 24, lr: 1.31e-04 2022-05-09 00:23:27,157 INFO [train.py:715] (0/8) Epoch 17, batch 11600, loss[loss=0.1398, simple_loss=0.2121, pruned_loss=0.03377, over 4934.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2073, pruned_loss=0.02983, over 973260.50 frames.], batch size: 18, lr: 1.31e-04 2022-05-09 00:24:05,126 INFO [train.py:715] (0/8) Epoch 17, batch 11650, loss[loss=0.1222, 
simple_loss=0.2016, pruned_loss=0.02138, over 4808.00 frames.], tot_loss[loss=0.1332, simple_loss=0.207, pruned_loss=0.02968, over 972716.10 frames.], batch size: 21, lr: 1.31e-04 2022-05-09 00:24:44,949 INFO [train.py:715] (0/8) Epoch 17, batch 11700, loss[loss=0.135, simple_loss=0.2102, pruned_loss=0.02986, over 4854.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2069, pruned_loss=0.02941, over 973052.22 frames.], batch size: 20, lr: 1.31e-04 2022-05-09 00:25:24,934 INFO [train.py:715] (0/8) Epoch 17, batch 11750, loss[loss=0.1598, simple_loss=0.2285, pruned_loss=0.04557, over 4694.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2064, pruned_loss=0.02933, over 973003.97 frames.], batch size: 15, lr: 1.31e-04 2022-05-09 00:26:03,879 INFO [train.py:715] (0/8) Epoch 17, batch 11800, loss[loss=0.128, simple_loss=0.2114, pruned_loss=0.0223, over 4949.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2062, pruned_loss=0.02907, over 972752.54 frames.], batch size: 24, lr: 1.31e-04 2022-05-09 00:26:42,873 INFO [train.py:715] (0/8) Epoch 17, batch 11850, loss[loss=0.1212, simple_loss=0.1949, pruned_loss=0.02379, over 4816.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2068, pruned_loss=0.02909, over 972832.02 frames.], batch size: 13, lr: 1.31e-04 2022-05-09 00:27:22,142 INFO [train.py:715] (0/8) Epoch 17, batch 11900, loss[loss=0.1249, simple_loss=0.2028, pruned_loss=0.02355, over 4787.00 frames.], tot_loss[loss=0.1318, simple_loss=0.206, pruned_loss=0.02883, over 971983.14 frames.], batch size: 21, lr: 1.31e-04 2022-05-09 00:28:01,951 INFO [train.py:715] (0/8) Epoch 17, batch 11950, loss[loss=0.127, simple_loss=0.2039, pruned_loss=0.0251, over 4772.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2053, pruned_loss=0.02843, over 972119.21 frames.], batch size: 18, lr: 1.31e-04 2022-05-09 00:28:40,964 INFO [train.py:715] (0/8) Epoch 17, batch 12000, loss[loss=0.1378, simple_loss=0.2109, pruned_loss=0.03238, over 4879.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2058, pruned_loss=0.02875, over 972284.92 frames.], batch size: 16, lr: 1.31e-04 2022-05-09 00:28:40,965 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 00:28:52,720 INFO [train.py:742] (0/8) Epoch 17, validation: loss=0.1048, simple_loss=0.1882, pruned_loss=0.0107, over 914524.00 frames. 
2022-05-09 00:29:31,823 INFO [train.py:715] (0/8) Epoch 17, batch 12050, loss[loss=0.1319, simple_loss=0.2021, pruned_loss=0.03085, over 4706.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2058, pruned_loss=0.02891, over 972206.41 frames.], batch size: 15, lr: 1.31e-04 2022-05-09 00:30:10,917 INFO [train.py:715] (0/8) Epoch 17, batch 12100, loss[loss=0.1055, simple_loss=0.1688, pruned_loss=0.02107, over 4835.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2053, pruned_loss=0.02859, over 972147.00 frames.], batch size: 12, lr: 1.31e-04 2022-05-09 00:30:50,929 INFO [train.py:715] (0/8) Epoch 17, batch 12150, loss[loss=0.1326, simple_loss=0.1988, pruned_loss=0.0332, over 4830.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2051, pruned_loss=0.02839, over 972431.07 frames.], batch size: 15, lr: 1.31e-04 2022-05-09 00:31:29,660 INFO [train.py:715] (0/8) Epoch 17, batch 12200, loss[loss=0.1509, simple_loss=0.2171, pruned_loss=0.04236, over 4954.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2053, pruned_loss=0.02857, over 971362.06 frames.], batch size: 24, lr: 1.31e-04 2022-05-09 00:32:08,194 INFO [train.py:715] (0/8) Epoch 17, batch 12250, loss[loss=0.1224, simple_loss=0.1966, pruned_loss=0.02412, over 4988.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2057, pruned_loss=0.02845, over 971260.34 frames.], batch size: 28, lr: 1.31e-04 2022-05-09 00:32:47,685 INFO [train.py:715] (0/8) Epoch 17, batch 12300, loss[loss=0.1435, simple_loss=0.2148, pruned_loss=0.03608, over 4916.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2058, pruned_loss=0.02855, over 971853.63 frames.], batch size: 23, lr: 1.31e-04 2022-05-09 00:33:26,858 INFO [train.py:715] (0/8) Epoch 17, batch 12350, loss[loss=0.1281, simple_loss=0.2022, pruned_loss=0.02702, over 4986.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2064, pruned_loss=0.02855, over 971646.46 frames.], batch size: 28, lr: 1.31e-04 2022-05-09 00:34:05,565 INFO [train.py:715] (0/8) Epoch 17, batch 12400, loss[loss=0.166, simple_loss=0.2414, pruned_loss=0.04524, over 4972.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2059, pruned_loss=0.02891, over 971884.37 frames.], batch size: 25, lr: 1.31e-04 2022-05-09 00:34:44,614 INFO [train.py:715] (0/8) Epoch 17, batch 12450, loss[loss=0.1312, simple_loss=0.1991, pruned_loss=0.03167, over 4959.00 frames.], tot_loss[loss=0.1318, simple_loss=0.206, pruned_loss=0.02885, over 971975.21 frames.], batch size: 24, lr: 1.31e-04 2022-05-09 00:35:24,996 INFO [train.py:715] (0/8) Epoch 17, batch 12500, loss[loss=0.1262, simple_loss=0.1917, pruned_loss=0.03032, over 4908.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2057, pruned_loss=0.02872, over 972423.74 frames.], batch size: 17, lr: 1.31e-04 2022-05-09 00:36:03,576 INFO [train.py:715] (0/8) Epoch 17, batch 12550, loss[loss=0.1302, simple_loss=0.1943, pruned_loss=0.03303, over 4816.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2055, pruned_loss=0.02868, over 972731.65 frames.], batch size: 13, lr: 1.31e-04 2022-05-09 00:36:42,924 INFO [train.py:715] (0/8) Epoch 17, batch 12600, loss[loss=0.125, simple_loss=0.208, pruned_loss=0.02099, over 4794.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2059, pruned_loss=0.02892, over 972394.66 frames.], batch size: 25, lr: 1.31e-04 2022-05-09 00:37:22,855 INFO [train.py:715] (0/8) Epoch 17, batch 12650, loss[loss=0.1619, simple_loss=0.2231, pruned_loss=0.05035, over 4892.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2054, pruned_loss=0.02883, over 972391.10 frames.], batch size: 32, lr: 1.31e-04 2022-05-09 00:38:02,851 
INFO [train.py:715] (0/8) Epoch 17, batch 12700, loss[loss=0.1302, simple_loss=0.1966, pruned_loss=0.03189, over 4802.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2058, pruned_loss=0.02917, over 971887.11 frames.], batch size: 24, lr: 1.31e-04 2022-05-09 00:38:42,157 INFO [train.py:715] (0/8) Epoch 17, batch 12750, loss[loss=0.1845, simple_loss=0.2672, pruned_loss=0.05095, over 4803.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2065, pruned_loss=0.02957, over 971490.31 frames.], batch size: 25, lr: 1.31e-04 2022-05-09 00:39:20,959 INFO [train.py:715] (0/8) Epoch 17, batch 12800, loss[loss=0.1082, simple_loss=0.1854, pruned_loss=0.01545, over 4977.00 frames.], tot_loss[loss=0.133, simple_loss=0.2069, pruned_loss=0.02953, over 972495.43 frames.], batch size: 14, lr: 1.31e-04 2022-05-09 00:40:00,595 INFO [train.py:715] (0/8) Epoch 17, batch 12850, loss[loss=0.1274, simple_loss=0.2174, pruned_loss=0.01875, over 4831.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2065, pruned_loss=0.02903, over 972106.96 frames.], batch size: 25, lr: 1.31e-04 2022-05-09 00:40:39,037 INFO [train.py:715] (0/8) Epoch 17, batch 12900, loss[loss=0.1198, simple_loss=0.194, pruned_loss=0.02283, over 4917.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2074, pruned_loss=0.02988, over 971642.50 frames.], batch size: 18, lr: 1.31e-04 2022-05-09 00:41:18,422 INFO [train.py:715] (0/8) Epoch 17, batch 12950, loss[loss=0.1392, simple_loss=0.2127, pruned_loss=0.03285, over 4816.00 frames.], tot_loss[loss=0.133, simple_loss=0.2066, pruned_loss=0.02971, over 970569.36 frames.], batch size: 26, lr: 1.31e-04 2022-05-09 00:41:57,019 INFO [train.py:715] (0/8) Epoch 17, batch 13000, loss[loss=0.1652, simple_loss=0.2357, pruned_loss=0.04738, over 4832.00 frames.], tot_loss[loss=0.1332, simple_loss=0.207, pruned_loss=0.02972, over 970919.50 frames.], batch size: 27, lr: 1.31e-04 2022-05-09 00:42:36,100 INFO [train.py:715] (0/8) Epoch 17, batch 13050, loss[loss=0.1191, simple_loss=0.1946, pruned_loss=0.0218, over 4970.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2066, pruned_loss=0.02944, over 971754.91 frames.], batch size: 35, lr: 1.31e-04 2022-05-09 00:43:15,220 INFO [train.py:715] (0/8) Epoch 17, batch 13100, loss[loss=0.134, simple_loss=0.208, pruned_loss=0.02999, over 4755.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2062, pruned_loss=0.02917, over 971960.67 frames.], batch size: 19, lr: 1.31e-04 2022-05-09 00:43:54,020 INFO [train.py:715] (0/8) Epoch 17, batch 13150, loss[loss=0.1267, simple_loss=0.2016, pruned_loss=0.02591, over 4784.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2068, pruned_loss=0.02938, over 971901.82 frames.], batch size: 14, lr: 1.31e-04 2022-05-09 00:44:33,796 INFO [train.py:715] (0/8) Epoch 17, batch 13200, loss[loss=0.1164, simple_loss=0.1942, pruned_loss=0.01934, over 4938.00 frames.], tot_loss[loss=0.1329, simple_loss=0.207, pruned_loss=0.02938, over 972455.73 frames.], batch size: 23, lr: 1.31e-04 2022-05-09 00:45:12,323 INFO [train.py:715] (0/8) Epoch 17, batch 13250, loss[loss=0.1229, simple_loss=0.1962, pruned_loss=0.02476, over 4959.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2072, pruned_loss=0.0297, over 971694.08 frames.], batch size: 15, lr: 1.31e-04 2022-05-09 00:45:51,623 INFO [train.py:715] (0/8) Epoch 17, batch 13300, loss[loss=0.1437, simple_loss=0.2106, pruned_loss=0.0384, over 4910.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2066, pruned_loss=0.02943, over 971661.29 frames.], batch size: 17, lr: 1.31e-04 2022-05-09 00:46:30,707 INFO [train.py:715] (0/8) 
Epoch 17, batch 13350, loss[loss=0.1568, simple_loss=0.2324, pruned_loss=0.04055, over 4980.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2064, pruned_loss=0.02928, over 972098.54 frames.], batch size: 25, lr: 1.31e-04 2022-05-09 00:47:09,940 INFO [train.py:715] (0/8) Epoch 17, batch 13400, loss[loss=0.1339, simple_loss=0.1948, pruned_loss=0.03653, over 4911.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2059, pruned_loss=0.02891, over 971396.91 frames.], batch size: 18, lr: 1.31e-04 2022-05-09 00:47:49,250 INFO [train.py:715] (0/8) Epoch 17, batch 13450, loss[loss=0.1451, simple_loss=0.2154, pruned_loss=0.03743, over 4898.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2063, pruned_loss=0.02923, over 970701.57 frames.], batch size: 19, lr: 1.30e-04 2022-05-09 00:48:27,711 INFO [train.py:715] (0/8) Epoch 17, batch 13500, loss[loss=0.1316, simple_loss=0.2082, pruned_loss=0.02755, over 4927.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2068, pruned_loss=0.02917, over 972275.14 frames.], batch size: 29, lr: 1.30e-04 2022-05-09 00:49:07,420 INFO [train.py:715] (0/8) Epoch 17, batch 13550, loss[loss=0.1195, simple_loss=0.1894, pruned_loss=0.02476, over 4780.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2074, pruned_loss=0.02938, over 972504.92 frames.], batch size: 12, lr: 1.30e-04 2022-05-09 00:49:45,772 INFO [train.py:715] (0/8) Epoch 17, batch 13600, loss[loss=0.1222, simple_loss=0.1912, pruned_loss=0.02655, over 4812.00 frames.], tot_loss[loss=0.1326, simple_loss=0.207, pruned_loss=0.02905, over 972723.92 frames.], batch size: 26, lr: 1.30e-04 2022-05-09 00:50:24,814 INFO [train.py:715] (0/8) Epoch 17, batch 13650, loss[loss=0.1016, simple_loss=0.1722, pruned_loss=0.01549, over 4884.00 frames.], tot_loss[loss=0.133, simple_loss=0.2073, pruned_loss=0.02934, over 972278.97 frames.], batch size: 16, lr: 1.30e-04 2022-05-09 00:51:04,636 INFO [train.py:715] (0/8) Epoch 17, batch 13700, loss[loss=0.1182, simple_loss=0.1932, pruned_loss=0.02165, over 4893.00 frames.], tot_loss[loss=0.132, simple_loss=0.2057, pruned_loss=0.02917, over 972142.16 frames.], batch size: 17, lr: 1.30e-04 2022-05-09 00:51:43,955 INFO [train.py:715] (0/8) Epoch 17, batch 13750, loss[loss=0.1319, simple_loss=0.2116, pruned_loss=0.02613, over 4981.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2056, pruned_loss=0.02909, over 972676.54 frames.], batch size: 35, lr: 1.30e-04 2022-05-09 00:52:24,098 INFO [train.py:715] (0/8) Epoch 17, batch 13800, loss[loss=0.1099, simple_loss=0.189, pruned_loss=0.01541, over 4831.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2065, pruned_loss=0.02942, over 972465.52 frames.], batch size: 27, lr: 1.30e-04 2022-05-09 00:53:03,510 INFO [train.py:715] (0/8) Epoch 17, batch 13850, loss[loss=0.1148, simple_loss=0.1905, pruned_loss=0.01951, over 4990.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2068, pruned_loss=0.02971, over 972429.05 frames.], batch size: 20, lr: 1.30e-04 2022-05-09 00:53:43,316 INFO [train.py:715] (0/8) Epoch 17, batch 13900, loss[loss=0.1234, simple_loss=0.2076, pruned_loss=0.01962, over 4924.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2066, pruned_loss=0.02934, over 972328.98 frames.], batch size: 29, lr: 1.30e-04 2022-05-09 00:54:22,806 INFO [train.py:715] (0/8) Epoch 17, batch 13950, loss[loss=0.1175, simple_loss=0.1841, pruned_loss=0.02545, over 4878.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2056, pruned_loss=0.02885, over 971453.23 frames.], batch size: 32, lr: 1.30e-04 2022-05-09 00:55:02,838 INFO [train.py:715] (0/8) Epoch 17, batch 14000, 
loss[loss=0.1339, simple_loss=0.2016, pruned_loss=0.0331, over 4815.00 frames.], tot_loss[loss=0.1311, simple_loss=0.205, pruned_loss=0.02861, over 971883.31 frames.], batch size: 27, lr: 1.30e-04 2022-05-09 00:55:42,000 INFO [train.py:715] (0/8) Epoch 17, batch 14050, loss[loss=0.1635, simple_loss=0.2341, pruned_loss=0.04642, over 4702.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2059, pruned_loss=0.02934, over 971260.38 frames.], batch size: 15, lr: 1.30e-04 2022-05-09 00:56:21,074 INFO [train.py:715] (0/8) Epoch 17, batch 14100, loss[loss=0.1532, simple_loss=0.2318, pruned_loss=0.03732, over 4944.00 frames.], tot_loss[loss=0.132, simple_loss=0.2056, pruned_loss=0.02924, over 971290.25 frames.], batch size: 21, lr: 1.30e-04 2022-05-09 00:57:01,244 INFO [train.py:715] (0/8) Epoch 17, batch 14150, loss[loss=0.1607, simple_loss=0.2318, pruned_loss=0.04476, over 4857.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2061, pruned_loss=0.02938, over 971682.64 frames.], batch size: 38, lr: 1.30e-04 2022-05-09 00:57:40,316 INFO [train.py:715] (0/8) Epoch 17, batch 14200, loss[loss=0.111, simple_loss=0.1907, pruned_loss=0.01564, over 4756.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2063, pruned_loss=0.02937, over 972549.39 frames.], batch size: 19, lr: 1.30e-04 2022-05-09 00:58:19,831 INFO [train.py:715] (0/8) Epoch 17, batch 14250, loss[loss=0.1266, simple_loss=0.1976, pruned_loss=0.02779, over 4767.00 frames.], tot_loss[loss=0.133, simple_loss=0.2069, pruned_loss=0.02949, over 972240.85 frames.], batch size: 18, lr: 1.30e-04 2022-05-09 00:58:58,998 INFO [train.py:715] (0/8) Epoch 17, batch 14300, loss[loss=0.101, simple_loss=0.1779, pruned_loss=0.01206, over 4825.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2061, pruned_loss=0.02925, over 972378.82 frames.], batch size: 12, lr: 1.30e-04 2022-05-09 00:59:38,850 INFO [train.py:715] (0/8) Epoch 17, batch 14350, loss[loss=0.1174, simple_loss=0.195, pruned_loss=0.01992, over 4750.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2069, pruned_loss=0.02964, over 972511.47 frames.], batch size: 19, lr: 1.30e-04 2022-05-09 01:00:17,886 INFO [train.py:715] (0/8) Epoch 17, batch 14400, loss[loss=0.1148, simple_loss=0.1905, pruned_loss=0.01953, over 4862.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2074, pruned_loss=0.02961, over 972146.88 frames.], batch size: 13, lr: 1.30e-04 2022-05-09 01:00:56,577 INFO [train.py:715] (0/8) Epoch 17, batch 14450, loss[loss=0.113, simple_loss=0.1929, pruned_loss=0.01655, over 4690.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2066, pruned_loss=0.02904, over 971918.36 frames.], batch size: 15, lr: 1.30e-04 2022-05-09 01:01:36,312 INFO [train.py:715] (0/8) Epoch 17, batch 14500, loss[loss=0.1232, simple_loss=0.201, pruned_loss=0.02275, over 4946.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2066, pruned_loss=0.02898, over 972119.42 frames.], batch size: 24, lr: 1.30e-04 2022-05-09 01:02:15,671 INFO [train.py:715] (0/8) Epoch 17, batch 14550, loss[loss=0.1315, simple_loss=0.2056, pruned_loss=0.02869, over 4933.00 frames.], tot_loss[loss=0.1327, simple_loss=0.207, pruned_loss=0.02925, over 972847.85 frames.], batch size: 21, lr: 1.30e-04 2022-05-09 01:02:54,145 INFO [train.py:715] (0/8) Epoch 17, batch 14600, loss[loss=0.1652, simple_loss=0.2312, pruned_loss=0.04963, over 4934.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2075, pruned_loss=0.02944, over 973635.97 frames.], batch size: 23, lr: 1.30e-04 2022-05-09 01:03:33,784 INFO [train.py:715] (0/8) Epoch 17, batch 14650, loss[loss=0.104, 
simple_loss=0.1743, pruned_loss=0.01687, over 4775.00 frames.], tot_loss[loss=0.133, simple_loss=0.2075, pruned_loss=0.02929, over 973489.05 frames.], batch size: 12, lr: 1.30e-04 2022-05-09 01:04:13,437 INFO [train.py:715] (0/8) Epoch 17, batch 14700, loss[loss=0.1279, simple_loss=0.223, pruned_loss=0.0164, over 4775.00 frames.], tot_loss[loss=0.1316, simple_loss=0.206, pruned_loss=0.02857, over 973068.88 frames.], batch size: 18, lr: 1.30e-04 2022-05-09 01:04:52,649 INFO [train.py:715] (0/8) Epoch 17, batch 14750, loss[loss=0.1485, simple_loss=0.2268, pruned_loss=0.03508, over 4746.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2063, pruned_loss=0.0288, over 972794.34 frames.], batch size: 16, lr: 1.30e-04 2022-05-09 01:05:31,532 INFO [train.py:715] (0/8) Epoch 17, batch 14800, loss[loss=0.1311, simple_loss=0.1984, pruned_loss=0.03188, over 4975.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2062, pruned_loss=0.02899, over 973315.61 frames.], batch size: 35, lr: 1.30e-04 2022-05-09 01:06:11,599 INFO [train.py:715] (0/8) Epoch 17, batch 14850, loss[loss=0.1249, simple_loss=0.1945, pruned_loss=0.02764, over 4833.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2066, pruned_loss=0.02894, over 973590.04 frames.], batch size: 27, lr: 1.30e-04 2022-05-09 01:06:50,382 INFO [train.py:715] (0/8) Epoch 17, batch 14900, loss[loss=0.1324, simple_loss=0.2093, pruned_loss=0.02771, over 4975.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2067, pruned_loss=0.02939, over 973012.81 frames.], batch size: 28, lr: 1.30e-04 2022-05-09 01:07:29,327 INFO [train.py:715] (0/8) Epoch 17, batch 14950, loss[loss=0.1391, simple_loss=0.2077, pruned_loss=0.03524, over 4975.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2064, pruned_loss=0.02932, over 973507.70 frames.], batch size: 25, lr: 1.30e-04 2022-05-09 01:08:09,011 INFO [train.py:715] (0/8) Epoch 17, batch 15000, loss[loss=0.1174, simple_loss=0.1893, pruned_loss=0.02278, over 4783.00 frames.], tot_loss[loss=0.133, simple_loss=0.2071, pruned_loss=0.02947, over 973663.39 frames.], batch size: 17, lr: 1.30e-04 2022-05-09 01:08:09,012 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 01:08:19,084 INFO [train.py:742] (0/8) Epoch 17, validation: loss=0.1046, simple_loss=0.1881, pruned_loss=0.01059, over 914524.00 frames. 
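Every training entry in this log follows the same pattern, `Epoch E, batch B, loss[...], tot_loss[loss=..., ...], batch size: S, lr: L`, so the running loss curve and learning-rate schedule can be recovered from the text with a small regular expression. The helper below is hypothetical (the function name and regex are mine, not part of train.py); it is checked against the batch 15000 entry just above.

```python
import re

# Hypothetical post-processing helper for this log; the regex mirrors the
# "Epoch E, batch B, ... tot_loss[loss=..., ...] ... lr: L" pattern above.
ENTRY_RE = re.compile(
    r"Epoch (?P<epoch>\d+), batch (?P<batch>\d+), .*?"
    r"tot_loss\[loss=(?P<tot_loss>[0-9.]+),.*?lr: (?P<lr>[0-9.e-]+)"
)

def parse_entry(line: str):
    """Return (epoch, batch, tot_loss, lr) for one training entry, or None."""
    m = ENTRY_RE.search(line)
    if m is None:
        return None
    return (int(m["epoch"]), int(m["batch"]),
            float(m["tot_loss"]), float(m["lr"]))

# Example copied from the batch 15000 entry above.
example = ("Epoch 17, batch 15000, loss[loss=0.1174, simple_loss=0.1893, "
           "pruned_loss=0.02278, over 4783.00 frames.], tot_loss[loss=0.133, "
           "simple_loss=0.2071, pruned_loss=0.02947, over 973663.39 frames.], "
           "batch size: 17, lr: 1.30e-04")
print(parse_entry(example))  # (17, 15000, 0.133, 0.00013)
```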
2022-05-09 01:08:59,142 INFO [train.py:715] (0/8) Epoch 17, batch 15050, loss[loss=0.1412, simple_loss=0.2104, pruned_loss=0.03599, over 4975.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2067, pruned_loss=0.02941, over 973456.87 frames.], batch size: 35, lr: 1.30e-04 2022-05-09 01:09:38,649 INFO [train.py:715] (0/8) Epoch 17, batch 15100, loss[loss=0.1507, simple_loss=0.2373, pruned_loss=0.03207, over 4900.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2076, pruned_loss=0.02992, over 972958.70 frames.], batch size: 19, lr: 1.30e-04 2022-05-09 01:10:17,574 INFO [train.py:715] (0/8) Epoch 17, batch 15150, loss[loss=0.1139, simple_loss=0.2012, pruned_loss=0.01331, over 4989.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2068, pruned_loss=0.02942, over 972414.89 frames.], batch size: 25, lr: 1.30e-04 2022-05-09 01:10:56,609 INFO [train.py:715] (0/8) Epoch 17, batch 15200, loss[loss=0.1124, simple_loss=0.1947, pruned_loss=0.01505, over 4874.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2062, pruned_loss=0.02927, over 972485.72 frames.], batch size: 22, lr: 1.30e-04 2022-05-09 01:11:36,233 INFO [train.py:715] (0/8) Epoch 17, batch 15250, loss[loss=0.1512, simple_loss=0.2281, pruned_loss=0.03717, over 4909.00 frames.], tot_loss[loss=0.1319, simple_loss=0.206, pruned_loss=0.0289, over 972376.05 frames.], batch size: 17, lr: 1.30e-04 2022-05-09 01:12:15,602 INFO [train.py:715] (0/8) Epoch 17, batch 15300, loss[loss=0.1313, simple_loss=0.2044, pruned_loss=0.02909, over 4849.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2062, pruned_loss=0.02917, over 972390.18 frames.], batch size: 32, lr: 1.30e-04 2022-05-09 01:12:53,855 INFO [train.py:715] (0/8) Epoch 17, batch 15350, loss[loss=0.1345, simple_loss=0.2084, pruned_loss=0.03037, over 4892.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2065, pruned_loss=0.02958, over 972130.56 frames.], batch size: 22, lr: 1.30e-04 2022-05-09 01:13:33,403 INFO [train.py:715] (0/8) Epoch 17, batch 15400, loss[loss=0.1186, simple_loss=0.1904, pruned_loss=0.02343, over 4787.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2069, pruned_loss=0.02941, over 972229.01 frames.], batch size: 14, lr: 1.30e-04 2022-05-09 01:14:12,475 INFO [train.py:715] (0/8) Epoch 17, batch 15450, loss[loss=0.1243, simple_loss=0.1957, pruned_loss=0.0265, over 4755.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2056, pruned_loss=0.02894, over 971939.09 frames.], batch size: 18, lr: 1.30e-04 2022-05-09 01:14:51,817 INFO [train.py:715] (0/8) Epoch 17, batch 15500, loss[loss=0.1152, simple_loss=0.1888, pruned_loss=0.02081, over 4862.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2059, pruned_loss=0.02946, over 972923.32 frames.], batch size: 12, lr: 1.30e-04 2022-05-09 01:15:30,648 INFO [train.py:715] (0/8) Epoch 17, batch 15550, loss[loss=0.1325, simple_loss=0.2188, pruned_loss=0.02306, over 4791.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2062, pruned_loss=0.02941, over 973494.39 frames.], batch size: 21, lr: 1.30e-04 2022-05-09 01:16:10,370 INFO [train.py:715] (0/8) Epoch 17, batch 15600, loss[loss=0.135, simple_loss=0.2145, pruned_loss=0.0278, over 4828.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2073, pruned_loss=0.02988, over 972674.20 frames.], batch size: 25, lr: 1.30e-04 2022-05-09 01:16:49,789 INFO [train.py:715] (0/8) Epoch 17, batch 15650, loss[loss=0.1677, simple_loss=0.2374, pruned_loss=0.049, over 4980.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2085, pruned_loss=0.03045, over 973336.43 frames.], batch size: 15, lr: 1.30e-04 2022-05-09 01:17:27,917 
INFO [train.py:715] (0/8) Epoch 17, batch 15700, loss[loss=0.1221, simple_loss=0.1987, pruned_loss=0.0227, over 4979.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2075, pruned_loss=0.02981, over 972974.38 frames.], batch size: 14, lr: 1.30e-04 2022-05-09 01:18:07,729 INFO [train.py:715] (0/8) Epoch 17, batch 15750, loss[loss=0.1195, simple_loss=0.1958, pruned_loss=0.02163, over 4808.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2074, pruned_loss=0.02957, over 972908.72 frames.], batch size: 25, lr: 1.30e-04 2022-05-09 01:18:47,133 INFO [train.py:715] (0/8) Epoch 17, batch 15800, loss[loss=0.1306, simple_loss=0.2065, pruned_loss=0.02741, over 4899.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2073, pruned_loss=0.02947, over 972923.60 frames.], batch size: 19, lr: 1.30e-04 2022-05-09 01:19:26,081 INFO [train.py:715] (0/8) Epoch 17, batch 15850, loss[loss=0.1206, simple_loss=0.1995, pruned_loss=0.02086, over 4777.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2077, pruned_loss=0.0296, over 973361.10 frames.], batch size: 18, lr: 1.30e-04 2022-05-09 01:20:04,718 INFO [train.py:715] (0/8) Epoch 17, batch 15900, loss[loss=0.1613, simple_loss=0.2445, pruned_loss=0.03905, over 4950.00 frames.], tot_loss[loss=0.134, simple_loss=0.2083, pruned_loss=0.02981, over 973725.81 frames.], batch size: 39, lr: 1.30e-04 2022-05-09 01:20:44,128 INFO [train.py:715] (0/8) Epoch 17, batch 15950, loss[loss=0.144, simple_loss=0.2165, pruned_loss=0.03573, over 4758.00 frames.], tot_loss[loss=0.1337, simple_loss=0.208, pruned_loss=0.02971, over 972923.40 frames.], batch size: 19, lr: 1.30e-04 2022-05-09 01:21:23,626 INFO [train.py:715] (0/8) Epoch 17, batch 16000, loss[loss=0.161, simple_loss=0.2371, pruned_loss=0.04243, over 4840.00 frames.], tot_loss[loss=0.134, simple_loss=0.2079, pruned_loss=0.03001, over 972486.35 frames.], batch size: 20, lr: 1.30e-04 2022-05-09 01:22:02,017 INFO [train.py:715] (0/8) Epoch 17, batch 16050, loss[loss=0.1362, simple_loss=0.2057, pruned_loss=0.03332, over 4749.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2072, pruned_loss=0.02954, over 972823.60 frames.], batch size: 16, lr: 1.30e-04 2022-05-09 01:22:42,059 INFO [train.py:715] (0/8) Epoch 17, batch 16100, loss[loss=0.1239, simple_loss=0.2021, pruned_loss=0.02284, over 4926.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2076, pruned_loss=0.02961, over 973007.86 frames.], batch size: 23, lr: 1.30e-04 2022-05-09 01:23:21,959 INFO [train.py:715] (0/8) Epoch 17, batch 16150, loss[loss=0.1334, simple_loss=0.2162, pruned_loss=0.02532, over 4928.00 frames.], tot_loss[loss=0.133, simple_loss=0.2073, pruned_loss=0.02938, over 972631.41 frames.], batch size: 39, lr: 1.30e-04 2022-05-09 01:24:01,719 INFO [train.py:715] (0/8) Epoch 17, batch 16200, loss[loss=0.1816, simple_loss=0.2521, pruned_loss=0.05557, over 4689.00 frames.], tot_loss[loss=0.133, simple_loss=0.2072, pruned_loss=0.02937, over 973037.59 frames.], batch size: 15, lr: 1.30e-04 2022-05-09 01:24:33,583 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-608000.pt 2022-05-09 01:24:43,127 INFO [train.py:715] (0/8) Epoch 17, batch 16250, loss[loss=0.1548, simple_loss=0.2269, pruned_loss=0.0413, over 4894.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2075, pruned_loss=0.02968, over 972922.44 frames.], batch size: 17, lr: 1.30e-04 2022-05-09 01:25:23,138 INFO [train.py:715] (0/8) Epoch 17, batch 16300, loss[loss=0.1422, simple_loss=0.2207, pruned_loss=0.03181, over 4737.00 frames.], tot_loss[loss=0.1339, 
simple_loss=0.208, pruned_loss=0.02986, over 973706.82 frames.], batch size: 16, lr: 1.30e-04 2022-05-09 01:26:02,217 INFO [train.py:715] (0/8) Epoch 17, batch 16350, loss[loss=0.1232, simple_loss=0.201, pruned_loss=0.02273, over 4782.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2086, pruned_loss=0.03009, over 973138.18 frames.], batch size: 17, lr: 1.30e-04 2022-05-09 01:26:40,869 INFO [train.py:715] (0/8) Epoch 17, batch 16400, loss[loss=0.1304, simple_loss=0.1987, pruned_loss=0.03098, over 4845.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2082, pruned_loss=0.03002, over 972581.43 frames.], batch size: 13, lr: 1.30e-04 2022-05-09 01:27:20,585 INFO [train.py:715] (0/8) Epoch 17, batch 16450, loss[loss=0.1263, simple_loss=0.2023, pruned_loss=0.02516, over 4744.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2078, pruned_loss=0.02964, over 972062.83 frames.], batch size: 12, lr: 1.30e-04 2022-05-09 01:28:00,548 INFO [train.py:715] (0/8) Epoch 17, batch 16500, loss[loss=0.1246, simple_loss=0.191, pruned_loss=0.02905, over 4884.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2087, pruned_loss=0.03005, over 972177.37 frames.], batch size: 22, lr: 1.30e-04 2022-05-09 01:28:39,570 INFO [train.py:715] (0/8) Epoch 17, batch 16550, loss[loss=0.1267, simple_loss=0.1989, pruned_loss=0.02722, over 4816.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2081, pruned_loss=0.03004, over 971834.55 frames.], batch size: 26, lr: 1.30e-04 2022-05-09 01:29:18,072 INFO [train.py:715] (0/8) Epoch 17, batch 16600, loss[loss=0.1277, simple_loss=0.2061, pruned_loss=0.02464, over 4748.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2073, pruned_loss=0.02964, over 971646.59 frames.], batch size: 12, lr: 1.30e-04 2022-05-09 01:29:58,261 INFO [train.py:715] (0/8) Epoch 17, batch 16650, loss[loss=0.1305, simple_loss=0.2079, pruned_loss=0.02656, over 4735.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2068, pruned_loss=0.02926, over 970607.41 frames.], batch size: 16, lr: 1.30e-04 2022-05-09 01:30:38,044 INFO [train.py:715] (0/8) Epoch 17, batch 16700, loss[loss=0.1211, simple_loss=0.2016, pruned_loss=0.02027, over 4982.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2073, pruned_loss=0.02971, over 971272.60 frames.], batch size: 28, lr: 1.30e-04 2022-05-09 01:31:16,488 INFO [train.py:715] (0/8) Epoch 17, batch 16750, loss[loss=0.1481, simple_loss=0.2282, pruned_loss=0.03399, over 4829.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2085, pruned_loss=0.03013, over 971832.48 frames.], batch size: 26, lr: 1.30e-04 2022-05-09 01:31:56,307 INFO [train.py:715] (0/8) Epoch 17, batch 16800, loss[loss=0.1488, simple_loss=0.2216, pruned_loss=0.03797, over 4753.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2081, pruned_loss=0.02971, over 971749.11 frames.], batch size: 19, lr: 1.30e-04 2022-05-09 01:32:35,730 INFO [train.py:715] (0/8) Epoch 17, batch 16850, loss[loss=0.1204, simple_loss=0.1984, pruned_loss=0.02124, over 4769.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2075, pruned_loss=0.02937, over 971479.08 frames.], batch size: 18, lr: 1.30e-04 2022-05-09 01:33:15,641 INFO [train.py:715] (0/8) Epoch 17, batch 16900, loss[loss=0.1222, simple_loss=0.1975, pruned_loss=0.02347, over 4802.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2082, pruned_loss=0.02979, over 971074.30 frames.], batch size: 26, lr: 1.30e-04 2022-05-09 01:33:53,863 INFO [train.py:715] (0/8) Epoch 17, batch 16950, loss[loss=0.1498, simple_loss=0.2274, pruned_loss=0.0361, over 4819.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2082, 
pruned_loss=0.02953, over 971168.07 frames.], batch size: 27, lr: 1.30e-04 2022-05-09 01:34:33,422 INFO [train.py:715] (0/8) Epoch 17, batch 17000, loss[loss=0.1282, simple_loss=0.2107, pruned_loss=0.02283, over 4883.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2071, pruned_loss=0.02919, over 970779.38 frames.], batch size: 16, lr: 1.30e-04 2022-05-09 01:35:12,910 INFO [train.py:715] (0/8) Epoch 17, batch 17050, loss[loss=0.1326, simple_loss=0.2061, pruned_loss=0.02957, over 4912.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2068, pruned_loss=0.02919, over 970998.44 frames.], batch size: 18, lr: 1.30e-04 2022-05-09 01:35:51,173 INFO [train.py:715] (0/8) Epoch 17, batch 17100, loss[loss=0.1329, simple_loss=0.2116, pruned_loss=0.02707, over 4983.00 frames.], tot_loss[loss=0.1323, simple_loss=0.207, pruned_loss=0.02877, over 971048.24 frames.], batch size: 35, lr: 1.30e-04 2022-05-09 01:36:30,681 INFO [train.py:715] (0/8) Epoch 17, batch 17150, loss[loss=0.1393, simple_loss=0.2124, pruned_loss=0.03311, over 4790.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2074, pruned_loss=0.02921, over 971811.85 frames.], batch size: 17, lr: 1.30e-04 2022-05-09 01:37:10,050 INFO [train.py:715] (0/8) Epoch 17, batch 17200, loss[loss=0.1104, simple_loss=0.1826, pruned_loss=0.01913, over 4640.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2069, pruned_loss=0.02897, over 971396.38 frames.], batch size: 13, lr: 1.30e-04 2022-05-09 01:37:48,554 INFO [train.py:715] (0/8) Epoch 17, batch 17250, loss[loss=0.1243, simple_loss=0.2072, pruned_loss=0.02071, over 4864.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2067, pruned_loss=0.02882, over 971508.59 frames.], batch size: 32, lr: 1.30e-04 2022-05-09 01:38:26,818 INFO [train.py:715] (0/8) Epoch 17, batch 17300, loss[loss=0.1315, simple_loss=0.2083, pruned_loss=0.0273, over 4957.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2074, pruned_loss=0.02949, over 971610.48 frames.], batch size: 24, lr: 1.30e-04 2022-05-09 01:39:06,137 INFO [train.py:715] (0/8) Epoch 17, batch 17350, loss[loss=0.125, simple_loss=0.2041, pruned_loss=0.0229, over 4964.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2069, pruned_loss=0.02876, over 971466.83 frames.], batch size: 35, lr: 1.30e-04 2022-05-09 01:39:45,338 INFO [train.py:715] (0/8) Epoch 17, batch 17400, loss[loss=0.1108, simple_loss=0.1848, pruned_loss=0.01844, over 4943.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2072, pruned_loss=0.02889, over 972017.72 frames.], batch size: 21, lr: 1.30e-04 2022-05-09 01:40:23,319 INFO [train.py:715] (0/8) Epoch 17, batch 17450, loss[loss=0.1351, simple_loss=0.2093, pruned_loss=0.03048, over 4795.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2072, pruned_loss=0.02913, over 971803.62 frames.], batch size: 21, lr: 1.30e-04 2022-05-09 01:41:03,008 INFO [train.py:715] (0/8) Epoch 17, batch 17500, loss[loss=0.1192, simple_loss=0.1976, pruned_loss=0.02035, over 4864.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2073, pruned_loss=0.02895, over 971814.87 frames.], batch size: 13, lr: 1.30e-04 2022-05-09 01:41:42,132 INFO [train.py:715] (0/8) Epoch 17, batch 17550, loss[loss=0.1375, simple_loss=0.2191, pruned_loss=0.02796, over 4901.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2079, pruned_loss=0.02879, over 971774.73 frames.], batch size: 19, lr: 1.30e-04 2022-05-09 01:42:20,887 INFO [train.py:715] (0/8) Epoch 17, batch 17600, loss[loss=0.1507, simple_loss=0.2278, pruned_loss=0.03679, over 4822.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2077, pruned_loss=0.02889, 
over 971766.54 frames.], batch size: 26, lr: 1.30e-04 2022-05-09 01:42:59,400 INFO [train.py:715] (0/8) Epoch 17, batch 17650, loss[loss=0.1261, simple_loss=0.202, pruned_loss=0.02516, over 4793.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2078, pruned_loss=0.02927, over 971664.46 frames.], batch size: 24, lr: 1.30e-04 2022-05-09 01:43:38,881 INFO [train.py:715] (0/8) Epoch 17, batch 17700, loss[loss=0.1152, simple_loss=0.1852, pruned_loss=0.02257, over 4751.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2072, pruned_loss=0.02905, over 971055.57 frames.], batch size: 16, lr: 1.30e-04 2022-05-09 01:44:17,597 INFO [train.py:715] (0/8) Epoch 17, batch 17750, loss[loss=0.1085, simple_loss=0.1816, pruned_loss=0.01769, over 4822.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2075, pruned_loss=0.02943, over 970881.77 frames.], batch size: 27, lr: 1.30e-04 2022-05-09 01:44:56,090 INFO [train.py:715] (0/8) Epoch 17, batch 17800, loss[loss=0.1157, simple_loss=0.1983, pruned_loss=0.01659, over 4942.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2069, pruned_loss=0.02898, over 970494.92 frames.], batch size: 29, lr: 1.30e-04 2022-05-09 01:45:35,675 INFO [train.py:715] (0/8) Epoch 17, batch 17850, loss[loss=0.1185, simple_loss=0.1925, pruned_loss=0.02226, over 4792.00 frames.], tot_loss[loss=0.1328, simple_loss=0.207, pruned_loss=0.02931, over 970613.97 frames.], batch size: 24, lr: 1.30e-04 2022-05-09 01:46:14,669 INFO [train.py:715] (0/8) Epoch 17, batch 17900, loss[loss=0.1224, simple_loss=0.2054, pruned_loss=0.01967, over 4766.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2064, pruned_loss=0.02874, over 970579.57 frames.], batch size: 14, lr: 1.30e-04 2022-05-09 01:46:54,011 INFO [train.py:715] (0/8) Epoch 17, batch 17950, loss[loss=0.1134, simple_loss=0.1889, pruned_loss=0.01892, over 4777.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2074, pruned_loss=0.02944, over 971278.57 frames.], batch size: 18, lr: 1.30e-04 2022-05-09 01:47:32,272 INFO [train.py:715] (0/8) Epoch 17, batch 18000, loss[loss=0.1246, simple_loss=0.1996, pruned_loss=0.02482, over 4831.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2076, pruned_loss=0.02959, over 971341.08 frames.], batch size: 20, lr: 1.30e-04 2022-05-09 01:47:32,273 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 01:47:42,063 INFO [train.py:742] (0/8) Epoch 17, validation: loss=0.1047, simple_loss=0.1881, pruned_loss=0.01066, over 914524.00 frames. 
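Earlier in this section the log also records two checkpoint saves, pruned_transducer_stateless2/exp/v2/checkpoint-600000.pt and checkpoint-608000.pt. The sketch below is one way such a file could be inspected offline with plain PyTorch; treating the checkpoint as a dict whose "model" entry holds the model state dict is an assumption about the recipe's checkpoint layout, not something the log itself states.

```python
import torch

# Inspect a checkpoint file saved during this run (path taken from the log).
# Assumption: the file is a plain dict and the model weights live under "model";
# adjust the key if the recipe's checkpoint layout differs.
ckpt_path = "pruned_transducer_stateless2/exp/v2/checkpoint-608000.pt"

ckpt = torch.load(ckpt_path, map_location="cpu")
print(sorted(ckpt.keys()))

state_dict = ckpt.get("model")  # assumed key name
if state_dict is not None:
    num_params = sum(v.numel() for v in state_dict.values())
    print(f"parameters in saved model: {num_params}")
```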
2022-05-09 01:48:20,784 INFO [train.py:715] (0/8) Epoch 17, batch 18050, loss[loss=0.129, simple_loss=0.2132, pruned_loss=0.02237, over 4968.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2074, pruned_loss=0.02954, over 971372.08 frames.], batch size: 24, lr: 1.30e-04 2022-05-09 01:49:00,410 INFO [train.py:715] (0/8) Epoch 17, batch 18100, loss[loss=0.119, simple_loss=0.1979, pruned_loss=0.02002, over 4803.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2077, pruned_loss=0.02945, over 971306.17 frames.], batch size: 21, lr: 1.30e-04 2022-05-09 01:49:39,796 INFO [train.py:715] (0/8) Epoch 17, batch 18150, loss[loss=0.1152, simple_loss=0.1877, pruned_loss=0.02133, over 4788.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2075, pruned_loss=0.02946, over 970441.10 frames.], batch size: 17, lr: 1.30e-04 2022-05-09 01:50:17,778 INFO [train.py:715] (0/8) Epoch 17, batch 18200, loss[loss=0.1044, simple_loss=0.1767, pruned_loss=0.01606, over 4991.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2074, pruned_loss=0.02952, over 970794.27 frames.], batch size: 14, lr: 1.30e-04 2022-05-09 01:50:57,528 INFO [train.py:715] (0/8) Epoch 17, batch 18250, loss[loss=0.1468, simple_loss=0.2299, pruned_loss=0.03183, over 4737.00 frames.], tot_loss[loss=0.1318, simple_loss=0.206, pruned_loss=0.02878, over 970975.37 frames.], batch size: 16, lr: 1.30e-04 2022-05-09 01:51:37,058 INFO [train.py:715] (0/8) Epoch 17, batch 18300, loss[loss=0.132, simple_loss=0.2182, pruned_loss=0.02287, over 4794.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2058, pruned_loss=0.02822, over 970304.60 frames.], batch size: 21, lr: 1.30e-04 2022-05-09 01:52:15,573 INFO [train.py:715] (0/8) Epoch 17, batch 18350, loss[loss=0.1209, simple_loss=0.1994, pruned_loss=0.02125, over 4846.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2056, pruned_loss=0.02833, over 970920.14 frames.], batch size: 30, lr: 1.30e-04 2022-05-09 01:52:55,003 INFO [train.py:715] (0/8) Epoch 17, batch 18400, loss[loss=0.1359, simple_loss=0.2059, pruned_loss=0.03298, over 4779.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2067, pruned_loss=0.02873, over 970485.10 frames.], batch size: 21, lr: 1.30e-04 2022-05-09 01:53:33,899 INFO [train.py:715] (0/8) Epoch 17, batch 18450, loss[loss=0.1178, simple_loss=0.191, pruned_loss=0.02229, over 4785.00 frames.], tot_loss[loss=0.1327, simple_loss=0.207, pruned_loss=0.02916, over 970386.18 frames.], batch size: 17, lr: 1.30e-04 2022-05-09 01:54:13,084 INFO [train.py:715] (0/8) Epoch 17, batch 18500, loss[loss=0.1514, simple_loss=0.2275, pruned_loss=0.03767, over 4932.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2064, pruned_loss=0.02863, over 970820.86 frames.], batch size: 29, lr: 1.30e-04 2022-05-09 01:54:51,414 INFO [train.py:715] (0/8) Epoch 17, batch 18550, loss[loss=0.1293, simple_loss=0.2053, pruned_loss=0.02669, over 4695.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2062, pruned_loss=0.02848, over 971027.37 frames.], batch size: 15, lr: 1.30e-04 2022-05-09 01:55:30,376 INFO [train.py:715] (0/8) Epoch 17, batch 18600, loss[loss=0.141, simple_loss=0.2184, pruned_loss=0.03187, over 4776.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2064, pruned_loss=0.02865, over 970412.12 frames.], batch size: 17, lr: 1.30e-04 2022-05-09 01:56:09,532 INFO [train.py:715] (0/8) Epoch 17, batch 18650, loss[loss=0.1181, simple_loss=0.1907, pruned_loss=0.02274, over 4818.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2059, pruned_loss=0.0283, over 970928.64 frames.], batch size: 15, lr: 1.30e-04 2022-05-09 01:56:47,377 
INFO [train.py:715] (0/8) Epoch 17, batch 18700, loss[loss=0.1542, simple_loss=0.2221, pruned_loss=0.04315, over 4862.00 frames.], tot_loss[loss=0.132, simple_loss=0.2065, pruned_loss=0.02872, over 971580.22 frames.], batch size: 16, lr: 1.30e-04 2022-05-09 01:57:27,055 INFO [train.py:715] (0/8) Epoch 17, batch 18750, loss[loss=0.1214, simple_loss=0.1864, pruned_loss=0.0282, over 4827.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2073, pruned_loss=0.02921, over 971248.49 frames.], batch size: 30, lr: 1.30e-04 2022-05-09 01:58:06,643 INFO [train.py:715] (0/8) Epoch 17, batch 18800, loss[loss=0.1595, simple_loss=0.2429, pruned_loss=0.03805, over 4920.00 frames.], tot_loss[loss=0.1325, simple_loss=0.207, pruned_loss=0.02899, over 970745.90 frames.], batch size: 18, lr: 1.30e-04 2022-05-09 01:58:45,347 INFO [train.py:715] (0/8) Epoch 17, batch 18850, loss[loss=0.1168, simple_loss=0.1891, pruned_loss=0.02227, over 4892.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2073, pruned_loss=0.02928, over 971314.92 frames.], batch size: 17, lr: 1.30e-04 2022-05-09 01:59:23,453 INFO [train.py:715] (0/8) Epoch 17, batch 18900, loss[loss=0.1342, simple_loss=0.1946, pruned_loss=0.0369, over 4869.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2076, pruned_loss=0.02992, over 972330.57 frames.], batch size: 12, lr: 1.30e-04 2022-05-09 02:00:02,550 INFO [train.py:715] (0/8) Epoch 17, batch 18950, loss[loss=0.125, simple_loss=0.1984, pruned_loss=0.0258, over 4983.00 frames.], tot_loss[loss=0.1332, simple_loss=0.207, pruned_loss=0.02966, over 972416.42 frames.], batch size: 26, lr: 1.30e-04 2022-05-09 02:00:41,833 INFO [train.py:715] (0/8) Epoch 17, batch 19000, loss[loss=0.1342, simple_loss=0.2092, pruned_loss=0.02963, over 4741.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2063, pruned_loss=0.02954, over 972557.01 frames.], batch size: 16, lr: 1.30e-04 2022-05-09 02:01:20,327 INFO [train.py:715] (0/8) Epoch 17, batch 19050, loss[loss=0.1266, simple_loss=0.2058, pruned_loss=0.02369, over 4936.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2066, pruned_loss=0.02961, over 972505.41 frames.], batch size: 23, lr: 1.30e-04 2022-05-09 02:01:59,756 INFO [train.py:715] (0/8) Epoch 17, batch 19100, loss[loss=0.1289, simple_loss=0.2149, pruned_loss=0.02151, over 4753.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2059, pruned_loss=0.02932, over 972003.36 frames.], batch size: 16, lr: 1.30e-04 2022-05-09 02:02:38,887 INFO [train.py:715] (0/8) Epoch 17, batch 19150, loss[loss=0.1388, simple_loss=0.2222, pruned_loss=0.02768, over 4953.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2054, pruned_loss=0.02861, over 972611.53 frames.], batch size: 24, lr: 1.30e-04 2022-05-09 02:03:17,328 INFO [train.py:715] (0/8) Epoch 17, batch 19200, loss[loss=0.1292, simple_loss=0.1992, pruned_loss=0.02956, over 4901.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2052, pruned_loss=0.02871, over 972757.01 frames.], batch size: 17, lr: 1.30e-04 2022-05-09 02:03:56,164 INFO [train.py:715] (0/8) Epoch 17, batch 19250, loss[loss=0.1324, simple_loss=0.2075, pruned_loss=0.02865, over 4753.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2047, pruned_loss=0.02842, over 971048.57 frames.], batch size: 16, lr: 1.30e-04 2022-05-09 02:04:35,737 INFO [train.py:715] (0/8) Epoch 17, batch 19300, loss[loss=0.112, simple_loss=0.1882, pruned_loss=0.01789, over 4983.00 frames.], tot_loss[loss=0.1303, simple_loss=0.2042, pruned_loss=0.02818, over 972097.74 frames.], batch size: 28, lr: 1.30e-04 2022-05-09 02:05:15,460 INFO [train.py:715] (0/8) 
Epoch 17, batch 19350, loss[loss=0.1711, simple_loss=0.23, pruned_loss=0.0561, over 4818.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2051, pruned_loss=0.02815, over 972365.19 frames.], batch size: 27, lr: 1.30e-04 2022-05-09 02:05:54,627 INFO [train.py:715] (0/8) Epoch 17, batch 19400, loss[loss=0.1269, simple_loss=0.2049, pruned_loss=0.02442, over 4751.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2056, pruned_loss=0.02827, over 972272.71 frames.], batch size: 19, lr: 1.30e-04 2022-05-09 02:06:34,193 INFO [train.py:715] (0/8) Epoch 17, batch 19450, loss[loss=0.1276, simple_loss=0.1995, pruned_loss=0.02789, over 4872.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2057, pruned_loss=0.0286, over 972925.86 frames.], batch size: 22, lr: 1.30e-04 2022-05-09 02:07:13,757 INFO [train.py:715] (0/8) Epoch 17, batch 19500, loss[loss=0.1317, simple_loss=0.2038, pruned_loss=0.02979, over 4775.00 frames.], tot_loss[loss=0.131, simple_loss=0.2052, pruned_loss=0.02838, over 973021.01 frames.], batch size: 18, lr: 1.30e-04 2022-05-09 02:07:53,344 INFO [train.py:715] (0/8) Epoch 17, batch 19550, loss[loss=0.1035, simple_loss=0.1801, pruned_loss=0.01348, over 4708.00 frames.], tot_loss[loss=0.131, simple_loss=0.2051, pruned_loss=0.02848, over 973153.62 frames.], batch size: 15, lr: 1.30e-04 2022-05-09 02:08:31,622 INFO [train.py:715] (0/8) Epoch 17, batch 19600, loss[loss=0.1274, simple_loss=0.2008, pruned_loss=0.02703, over 4788.00 frames.], tot_loss[loss=0.1309, simple_loss=0.205, pruned_loss=0.02841, over 972954.76 frames.], batch size: 21, lr: 1.30e-04 2022-05-09 02:09:11,586 INFO [train.py:715] (0/8) Epoch 17, batch 19650, loss[loss=0.1469, simple_loss=0.2171, pruned_loss=0.0384, over 4863.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2057, pruned_loss=0.02874, over 973043.09 frames.], batch size: 32, lr: 1.30e-04 2022-05-09 02:09:51,448 INFO [train.py:715] (0/8) Epoch 17, batch 19700, loss[loss=0.1382, simple_loss=0.2164, pruned_loss=0.03003, over 4781.00 frames.], tot_loss[loss=0.133, simple_loss=0.2072, pruned_loss=0.02938, over 972728.30 frames.], batch size: 17, lr: 1.30e-04 2022-05-09 02:10:30,060 INFO [train.py:715] (0/8) Epoch 17, batch 19750, loss[loss=0.1158, simple_loss=0.1972, pruned_loss=0.01718, over 4819.00 frames.], tot_loss[loss=0.1337, simple_loss=0.208, pruned_loss=0.0297, over 973159.46 frames.], batch size: 26, lr: 1.30e-04 2022-05-09 02:11:09,367 INFO [train.py:715] (0/8) Epoch 17, batch 19800, loss[loss=0.1099, simple_loss=0.1864, pruned_loss=0.01668, over 4764.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2074, pruned_loss=0.02956, over 972364.31 frames.], batch size: 12, lr: 1.30e-04 2022-05-09 02:11:47,961 INFO [train.py:715] (0/8) Epoch 17, batch 19850, loss[loss=0.1177, simple_loss=0.2053, pruned_loss=0.01508, over 4934.00 frames.], tot_loss[loss=0.132, simple_loss=0.2065, pruned_loss=0.02879, over 972046.16 frames.], batch size: 18, lr: 1.30e-04 2022-05-09 02:12:26,930 INFO [train.py:715] (0/8) Epoch 17, batch 19900, loss[loss=0.1188, simple_loss=0.2012, pruned_loss=0.01822, over 4810.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2062, pruned_loss=0.02863, over 970856.29 frames.], batch size: 21, lr: 1.30e-04 2022-05-09 02:13:05,189 INFO [train.py:715] (0/8) Epoch 17, batch 19950, loss[loss=0.1078, simple_loss=0.1913, pruned_loss=0.01218, over 4805.00 frames.], tot_loss[loss=0.132, simple_loss=0.2065, pruned_loss=0.02874, over 971576.46 frames.], batch size: 21, lr: 1.30e-04 2022-05-09 02:13:44,429 INFO [train.py:715] (0/8) Epoch 17, batch 20000, 
loss[loss=0.111, simple_loss=0.1783, pruned_loss=0.02187, over 4821.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2064, pruned_loss=0.0287, over 972810.11 frames.], batch size: 12, lr: 1.30e-04 2022-05-09 02:14:24,054 INFO [train.py:715] (0/8) Epoch 17, batch 20050, loss[loss=0.1558, simple_loss=0.2269, pruned_loss=0.04234, over 4765.00 frames.], tot_loss[loss=0.132, simple_loss=0.206, pruned_loss=0.02901, over 972263.34 frames.], batch size: 17, lr: 1.30e-04 2022-05-09 02:15:03,202 INFO [train.py:715] (0/8) Epoch 17, batch 20100, loss[loss=0.1114, simple_loss=0.1863, pruned_loss=0.01829, over 4768.00 frames.], tot_loss[loss=0.1319, simple_loss=0.206, pruned_loss=0.02889, over 971952.32 frames.], batch size: 12, lr: 1.30e-04 2022-05-09 02:15:42,011 INFO [train.py:715] (0/8) Epoch 17, batch 20150, loss[loss=0.1385, simple_loss=0.2095, pruned_loss=0.03374, over 4701.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2062, pruned_loss=0.02924, over 972427.33 frames.], batch size: 15, lr: 1.30e-04 2022-05-09 02:16:22,282 INFO [train.py:715] (0/8) Epoch 17, batch 20200, loss[loss=0.1148, simple_loss=0.1914, pruned_loss=0.01912, over 4890.00 frames.], tot_loss[loss=0.1319, simple_loss=0.206, pruned_loss=0.02889, over 972644.40 frames.], batch size: 22, lr: 1.30e-04 2022-05-09 02:17:02,697 INFO [train.py:715] (0/8) Epoch 17, batch 20250, loss[loss=0.09894, simple_loss=0.1682, pruned_loss=0.01483, over 4836.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2058, pruned_loss=0.02884, over 972567.15 frames.], batch size: 13, lr: 1.30e-04 2022-05-09 02:17:40,776 INFO [train.py:715] (0/8) Epoch 17, batch 20300, loss[loss=0.1214, simple_loss=0.1995, pruned_loss=0.02165, over 4920.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2058, pruned_loss=0.02858, over 972164.04 frames.], batch size: 23, lr: 1.30e-04 2022-05-09 02:18:20,505 INFO [train.py:715] (0/8) Epoch 17, batch 20350, loss[loss=0.1272, simple_loss=0.1988, pruned_loss=0.02782, over 4970.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2058, pruned_loss=0.02827, over 971439.31 frames.], batch size: 15, lr: 1.30e-04 2022-05-09 02:19:00,638 INFO [train.py:715] (0/8) Epoch 17, batch 20400, loss[loss=0.134, simple_loss=0.2137, pruned_loss=0.02712, over 4863.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2061, pruned_loss=0.02846, over 971756.26 frames.], batch size: 22, lr: 1.30e-04 2022-05-09 02:19:39,221 INFO [train.py:715] (0/8) Epoch 17, batch 20450, loss[loss=0.1217, simple_loss=0.1941, pruned_loss=0.02462, over 4847.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2068, pruned_loss=0.02876, over 970923.52 frames.], batch size: 13, lr: 1.30e-04 2022-05-09 02:20:17,924 INFO [train.py:715] (0/8) Epoch 17, batch 20500, loss[loss=0.1367, simple_loss=0.2129, pruned_loss=0.03027, over 4856.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2069, pruned_loss=0.02886, over 970952.34 frames.], batch size: 20, lr: 1.30e-04 2022-05-09 02:20:57,776 INFO [train.py:715] (0/8) Epoch 17, batch 20550, loss[loss=0.1282, simple_loss=0.2022, pruned_loss=0.02707, over 4893.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2072, pruned_loss=0.02881, over 971216.74 frames.], batch size: 22, lr: 1.30e-04 2022-05-09 02:21:36,910 INFO [train.py:715] (0/8) Epoch 17, batch 20600, loss[loss=0.1285, simple_loss=0.2067, pruned_loss=0.02515, over 4826.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2073, pruned_loss=0.02863, over 972071.64 frames.], batch size: 27, lr: 1.30e-04 2022-05-09 02:22:15,100 INFO [train.py:715] (0/8) Epoch 17, batch 20650, loss[loss=0.1337, 
simple_loss=0.201, pruned_loss=0.03319, over 4993.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2063, pruned_loss=0.02833, over 972842.62 frames.], batch size: 16, lr: 1.30e-04 2022-05-09 02:22:54,069 INFO [train.py:715] (0/8) Epoch 17, batch 20700, loss[loss=0.142, simple_loss=0.2097, pruned_loss=0.03718, over 4980.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2071, pruned_loss=0.02883, over 973827.86 frames.], batch size: 35, lr: 1.30e-04 2022-05-09 02:23:33,730 INFO [train.py:715] (0/8) Epoch 17, batch 20750, loss[loss=0.1527, simple_loss=0.2215, pruned_loss=0.04193, over 4754.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2075, pruned_loss=0.02895, over 973227.30 frames.], batch size: 19, lr: 1.30e-04 2022-05-09 02:24:12,678 INFO [train.py:715] (0/8) Epoch 17, batch 20800, loss[loss=0.1572, simple_loss=0.2212, pruned_loss=0.04654, over 4826.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2072, pruned_loss=0.02885, over 973059.98 frames.], batch size: 30, lr: 1.30e-04 2022-05-09 02:24:51,252 INFO [train.py:715] (0/8) Epoch 17, batch 20850, loss[loss=0.1215, simple_loss=0.1951, pruned_loss=0.02391, over 4981.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2068, pruned_loss=0.02871, over 972717.98 frames.], batch size: 16, lr: 1.30e-04 2022-05-09 02:25:30,263 INFO [train.py:715] (0/8) Epoch 17, batch 20900, loss[loss=0.1318, simple_loss=0.2066, pruned_loss=0.02856, over 4937.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2063, pruned_loss=0.02876, over 972758.81 frames.], batch size: 21, lr: 1.30e-04 2022-05-09 02:26:10,245 INFO [train.py:715] (0/8) Epoch 17, batch 20950, loss[loss=0.1386, simple_loss=0.214, pruned_loss=0.03162, over 4938.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2065, pruned_loss=0.02867, over 972685.36 frames.], batch size: 18, lr: 1.30e-04 2022-05-09 02:26:48,268 INFO [train.py:715] (0/8) Epoch 17, batch 21000, loss[loss=0.1102, simple_loss=0.1836, pruned_loss=0.01846, over 4877.00 frames.], tot_loss[loss=0.1325, simple_loss=0.207, pruned_loss=0.02899, over 972449.81 frames.], batch size: 16, lr: 1.30e-04 2022-05-09 02:26:48,269 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 02:27:00,912 INFO [train.py:742] (0/8) Epoch 17, validation: loss=0.1049, simple_loss=0.1882, pruned_loss=0.01077, over 914524.00 frames. 
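Each per-batch entry above follows one fixed pattern: the current batch's loss[...] components, a running tot_loss[...] over the frames seen so far, the batch size, and the learning rate. For anyone post-processing this log, a small parser is enough to recover those numbers. The sketch below is illustrative only: the regular expression and the parse_entries helper are written against the format visible in these lines and are not part of train.py.

import re

# Matches one per-batch entry as printed in this log, e.g.
#   Epoch 17, batch 21000, loss[loss=0.1102, simple_loss=0.1836, pruned_loss=0.01846,
#   over 4877.00 frames.], tot_loss[loss=0.1325, simple_loss=0.207, pruned_loss=0.02899,
#   over 972449.81 frames.], batch size: 16, lr: 1.30e-04
ENTRY_RE = re.compile(
    r"Epoch (?P<epoch>\d+), batch (?P<batch>\d+), "
    r"loss\[loss=(?P<loss>[\d.]+), simple_loss=(?P<simple_loss>[\d.]+), "
    r"pruned_loss=(?P<pruned_loss>[\d.]+), over (?P<frames>[\d.]+) frames\.\], "
    r"tot_loss\[loss=(?P<tot_loss>[\d.]+), simple_loss=(?P<tot_simple_loss>[\d.]+), "
    r"pruned_loss=(?P<tot_pruned_loss>[\d.]+), over (?P<tot_frames>[\d.]+) frames\.\], "
    r"batch size: (?P<batch_size>\d+), lr: (?P<lr>[\d.e-]+)"
)

def parse_entries(text):
    """Yield one dict per per-batch entry found in `text` (validation lines are skipped)."""
    for m in ENTRY_RE.finditer(text):
        fields = m.groupdict()
        yield {k: int(v) if k in ("epoch", "batch", "batch_size") else float(v)
               for k, v in fields.items()}

if __name__ == "__main__":
    sample = ("Epoch 17, batch 21000, loss[loss=0.1102, simple_loss=0.1836, "
              "pruned_loss=0.01846, over 4877.00 frames.], tot_loss[loss=0.1325, "
              "simple_loss=0.207, pruned_loss=0.02899, over 972449.81 frames.], "
              "batch size: 16, lr: 1.30e-04")
    print(next(parse_entries(sample)))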
2022-05-09 02:27:38,926 INFO [train.py:715] (0/8) Epoch 17, batch 21050, loss[loss=0.1341, simple_loss=0.2126, pruned_loss=0.02777, over 4813.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2074, pruned_loss=0.02904, over 972716.09 frames.], batch size: 27, lr: 1.30e-04 2022-05-09 02:28:18,319 INFO [train.py:715] (0/8) Epoch 17, batch 21100, loss[loss=0.1254, simple_loss=0.1951, pruned_loss=0.02782, over 4900.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2069, pruned_loss=0.02916, over 972488.02 frames.], batch size: 18, lr: 1.30e-04 2022-05-09 02:28:58,368 INFO [train.py:715] (0/8) Epoch 17, batch 21150, loss[loss=0.1462, simple_loss=0.2158, pruned_loss=0.03832, over 4943.00 frames.], tot_loss[loss=0.1329, simple_loss=0.207, pruned_loss=0.02935, over 973126.44 frames.], batch size: 14, lr: 1.30e-04 2022-05-09 02:29:37,026 INFO [train.py:715] (0/8) Epoch 17, batch 21200, loss[loss=0.1247, simple_loss=0.2008, pruned_loss=0.02426, over 4840.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2067, pruned_loss=0.02916, over 974126.02 frames.], batch size: 30, lr: 1.30e-04 2022-05-09 02:30:15,712 INFO [train.py:715] (0/8) Epoch 17, batch 21250, loss[loss=0.1573, simple_loss=0.2304, pruned_loss=0.04208, over 4706.00 frames.], tot_loss[loss=0.133, simple_loss=0.2073, pruned_loss=0.02935, over 974780.64 frames.], batch size: 15, lr: 1.30e-04 2022-05-09 02:30:55,577 INFO [train.py:715] (0/8) Epoch 17, batch 21300, loss[loss=0.1397, simple_loss=0.2196, pruned_loss=0.02989, over 4836.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2068, pruned_loss=0.02887, over 974412.69 frames.], batch size: 15, lr: 1.30e-04 2022-05-09 02:31:35,365 INFO [train.py:715] (0/8) Epoch 17, batch 21350, loss[loss=0.1148, simple_loss=0.1869, pruned_loss=0.02132, over 4770.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2071, pruned_loss=0.02933, over 974313.41 frames.], batch size: 14, lr: 1.30e-04 2022-05-09 02:32:13,591 INFO [train.py:715] (0/8) Epoch 17, batch 21400, loss[loss=0.1257, simple_loss=0.1976, pruned_loss=0.02686, over 4975.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2068, pruned_loss=0.02897, over 974524.51 frames.], batch size: 33, lr: 1.30e-04 2022-05-09 02:32:53,758 INFO [train.py:715] (0/8) Epoch 17, batch 21450, loss[loss=0.1536, simple_loss=0.2357, pruned_loss=0.03572, over 4841.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2072, pruned_loss=0.02924, over 973559.90 frames.], batch size: 27, lr: 1.30e-04 2022-05-09 02:33:33,549 INFO [train.py:715] (0/8) Epoch 17, batch 21500, loss[loss=0.1354, simple_loss=0.2173, pruned_loss=0.0267, over 4834.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2067, pruned_loss=0.02895, over 971881.42 frames.], batch size: 13, lr: 1.30e-04 2022-05-09 02:34:12,043 INFO [train.py:715] (0/8) Epoch 17, batch 21550, loss[loss=0.1244, simple_loss=0.207, pruned_loss=0.02088, over 4928.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2067, pruned_loss=0.02901, over 971423.91 frames.], batch size: 23, lr: 1.30e-04 2022-05-09 02:34:51,492 INFO [train.py:715] (0/8) Epoch 17, batch 21600, loss[loss=0.1422, simple_loss=0.2186, pruned_loss=0.03289, over 4819.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2069, pruned_loss=0.0293, over 971200.48 frames.], batch size: 15, lr: 1.30e-04 2022-05-09 02:35:31,958 INFO [train.py:715] (0/8) Epoch 17, batch 21650, loss[loss=0.1307, simple_loss=0.1925, pruned_loss=0.03446, over 4694.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2067, pruned_loss=0.02909, over 970817.98 frames.], batch size: 15, lr: 1.30e-04 2022-05-09 02:36:11,046 
INFO [train.py:715] (0/8) Epoch 17, batch 21700, loss[loss=0.1393, simple_loss=0.2167, pruned_loss=0.03098, over 4854.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2068, pruned_loss=0.02915, over 970720.03 frames.], batch size: 20, lr: 1.30e-04 2022-05-09 02:36:49,708 INFO [train.py:715] (0/8) Epoch 17, batch 21750, loss[loss=0.1514, simple_loss=0.2331, pruned_loss=0.03487, over 4848.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2074, pruned_loss=0.02894, over 971189.15 frames.], batch size: 20, lr: 1.30e-04 2022-05-09 02:37:29,247 INFO [train.py:715] (0/8) Epoch 17, batch 21800, loss[loss=0.1402, simple_loss=0.2131, pruned_loss=0.03371, over 4936.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2071, pruned_loss=0.02867, over 971468.67 frames.], batch size: 21, lr: 1.30e-04 2022-05-09 02:38:08,213 INFO [train.py:715] (0/8) Epoch 17, batch 21850, loss[loss=0.1115, simple_loss=0.1819, pruned_loss=0.02056, over 4846.00 frames.], tot_loss[loss=0.1324, simple_loss=0.207, pruned_loss=0.02888, over 971964.73 frames.], batch size: 15, lr: 1.30e-04 2022-05-09 02:38:47,457 INFO [train.py:715] (0/8) Epoch 17, batch 21900, loss[loss=0.154, simple_loss=0.229, pruned_loss=0.0395, over 4822.00 frames.], tot_loss[loss=0.133, simple_loss=0.2073, pruned_loss=0.0293, over 971339.65 frames.], batch size: 15, lr: 1.30e-04 2022-05-09 02:39:25,950 INFO [train.py:715] (0/8) Epoch 17, batch 21950, loss[loss=0.1203, simple_loss=0.1917, pruned_loss=0.02444, over 4848.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2068, pruned_loss=0.0292, over 972015.13 frames.], batch size: 32, lr: 1.30e-04 2022-05-09 02:40:05,671 INFO [train.py:715] (0/8) Epoch 17, batch 22000, loss[loss=0.1335, simple_loss=0.1929, pruned_loss=0.03705, over 4781.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2071, pruned_loss=0.02929, over 971782.91 frames.], batch size: 14, lr: 1.30e-04 2022-05-09 02:40:45,437 INFO [train.py:715] (0/8) Epoch 17, batch 22050, loss[loss=0.1342, simple_loss=0.2055, pruned_loss=0.03145, over 4835.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2072, pruned_loss=0.02947, over 971667.29 frames.], batch size: 15, lr: 1.30e-04 2022-05-09 02:41:23,862 INFO [train.py:715] (0/8) Epoch 17, batch 22100, loss[loss=0.1175, simple_loss=0.1864, pruned_loss=0.02429, over 4855.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2064, pruned_loss=0.02901, over 971323.17 frames.], batch size: 13, lr: 1.30e-04 2022-05-09 02:42:03,595 INFO [train.py:715] (0/8) Epoch 17, batch 22150, loss[loss=0.1343, simple_loss=0.2121, pruned_loss=0.02819, over 4788.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2069, pruned_loss=0.02937, over 971638.83 frames.], batch size: 18, lr: 1.30e-04 2022-05-09 02:42:43,494 INFO [train.py:715] (0/8) Epoch 17, batch 22200, loss[loss=0.1376, simple_loss=0.204, pruned_loss=0.03561, over 4905.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2064, pruned_loss=0.02936, over 970816.20 frames.], batch size: 39, lr: 1.30e-04 2022-05-09 02:43:22,388 INFO [train.py:715] (0/8) Epoch 17, batch 22250, loss[loss=0.133, simple_loss=0.2067, pruned_loss=0.02966, over 4884.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2079, pruned_loss=0.02996, over 972381.98 frames.], batch size: 22, lr: 1.30e-04 2022-05-09 02:44:01,343 INFO [train.py:715] (0/8) Epoch 17, batch 22300, loss[loss=0.1552, simple_loss=0.2398, pruned_loss=0.03531, over 4951.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2086, pruned_loss=0.03037, over 971799.20 frames.], batch size: 24, lr: 1.30e-04 2022-05-09 02:44:41,265 INFO [train.py:715] (0/8) 
Epoch 17, batch 22350, loss[loss=0.1244, simple_loss=0.198, pruned_loss=0.02533, over 4910.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2083, pruned_loss=0.03018, over 972221.41 frames.], batch size: 17, lr: 1.30e-04 2022-05-09 02:45:20,835 INFO [train.py:715] (0/8) Epoch 17, batch 22400, loss[loss=0.1555, simple_loss=0.2353, pruned_loss=0.03782, over 4796.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2075, pruned_loss=0.0297, over 971824.40 frames.], batch size: 24, lr: 1.30e-04 2022-05-09 02:45:59,650 INFO [train.py:715] (0/8) Epoch 17, batch 22450, loss[loss=0.1575, simple_loss=0.2311, pruned_loss=0.04193, over 4975.00 frames.], tot_loss[loss=0.1341, simple_loss=0.208, pruned_loss=0.03005, over 972859.07 frames.], batch size: 24, lr: 1.30e-04 2022-05-09 02:46:38,625 INFO [train.py:715] (0/8) Epoch 17, batch 22500, loss[loss=0.1347, simple_loss=0.204, pruned_loss=0.03273, over 4928.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2075, pruned_loss=0.02969, over 972403.48 frames.], batch size: 21, lr: 1.30e-04 2022-05-09 02:47:18,394 INFO [train.py:715] (0/8) Epoch 17, batch 22550, loss[loss=0.1275, simple_loss=0.2037, pruned_loss=0.02559, over 4866.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2077, pruned_loss=0.0297, over 972861.10 frames.], batch size: 15, lr: 1.30e-04 2022-05-09 02:47:56,725 INFO [train.py:715] (0/8) Epoch 17, batch 22600, loss[loss=0.1249, simple_loss=0.2003, pruned_loss=0.02475, over 4968.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2078, pruned_loss=0.02974, over 972654.96 frames.], batch size: 15, lr: 1.30e-04 2022-05-09 02:48:36,268 INFO [train.py:715] (0/8) Epoch 17, batch 22650, loss[loss=0.141, simple_loss=0.2227, pruned_loss=0.02965, over 4822.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2064, pruned_loss=0.02901, over 972597.48 frames.], batch size: 26, lr: 1.30e-04 2022-05-09 02:49:15,728 INFO [train.py:715] (0/8) Epoch 17, batch 22700, loss[loss=0.1629, simple_loss=0.2312, pruned_loss=0.04733, over 4839.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2069, pruned_loss=0.02905, over 972350.84 frames.], batch size: 30, lr: 1.30e-04 2022-05-09 02:49:54,663 INFO [train.py:715] (0/8) Epoch 17, batch 22750, loss[loss=0.1401, simple_loss=0.2181, pruned_loss=0.03108, over 4909.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2061, pruned_loss=0.02877, over 972329.32 frames.], batch size: 39, lr: 1.30e-04 2022-05-09 02:50:33,046 INFO [train.py:715] (0/8) Epoch 17, batch 22800, loss[loss=0.1194, simple_loss=0.1931, pruned_loss=0.02286, over 4927.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2062, pruned_loss=0.02849, over 972851.72 frames.], batch size: 29, lr: 1.30e-04 2022-05-09 02:51:12,436 INFO [train.py:715] (0/8) Epoch 17, batch 22850, loss[loss=0.1079, simple_loss=0.1802, pruned_loss=0.01783, over 4961.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2058, pruned_loss=0.02827, over 972062.05 frames.], batch size: 21, lr: 1.29e-04 2022-05-09 02:51:52,338 INFO [train.py:715] (0/8) Epoch 17, batch 22900, loss[loss=0.1306, simple_loss=0.2055, pruned_loss=0.02782, over 4910.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2061, pruned_loss=0.02839, over 971959.28 frames.], batch size: 18, lr: 1.29e-04 2022-05-09 02:52:30,189 INFO [train.py:715] (0/8) Epoch 17, batch 22950, loss[loss=0.1144, simple_loss=0.1913, pruned_loss=0.01876, over 4918.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2064, pruned_loss=0.02844, over 972101.48 frames.], batch size: 23, lr: 1.29e-04 2022-05-09 02:53:10,087 INFO [train.py:715] (0/8) Epoch 17, batch 23000, 
loss[loss=0.1135, simple_loss=0.1916, pruned_loss=0.01776, over 4667.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2062, pruned_loss=0.02878, over 972147.12 frames.], batch size: 13, lr: 1.29e-04 2022-05-09 02:53:50,343 INFO [train.py:715] (0/8) Epoch 17, batch 23050, loss[loss=0.148, simple_loss=0.2105, pruned_loss=0.04274, over 4945.00 frames.], tot_loss[loss=0.1318, simple_loss=0.206, pruned_loss=0.0288, over 971926.91 frames.], batch size: 39, lr: 1.29e-04 2022-05-09 02:54:29,511 INFO [train.py:715] (0/8) Epoch 17, batch 23100, loss[loss=0.1128, simple_loss=0.195, pruned_loss=0.01528, over 4819.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2059, pruned_loss=0.02867, over 972930.61 frames.], batch size: 27, lr: 1.29e-04 2022-05-09 02:55:07,922 INFO [train.py:715] (0/8) Epoch 17, batch 23150, loss[loss=0.1438, simple_loss=0.2056, pruned_loss=0.04107, over 4922.00 frames.], tot_loss[loss=0.1317, simple_loss=0.206, pruned_loss=0.02874, over 973751.22 frames.], batch size: 19, lr: 1.29e-04 2022-05-09 02:55:47,702 INFO [train.py:715] (0/8) Epoch 17, batch 23200, loss[loss=0.1333, simple_loss=0.2159, pruned_loss=0.02534, over 4870.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2069, pruned_loss=0.02904, over 974244.03 frames.], batch size: 16, lr: 1.29e-04 2022-05-09 02:56:27,704 INFO [train.py:715] (0/8) Epoch 17, batch 23250, loss[loss=0.1335, simple_loss=0.2059, pruned_loss=0.03059, over 4932.00 frames.], tot_loss[loss=0.133, simple_loss=0.2072, pruned_loss=0.02938, over 974318.48 frames.], batch size: 23, lr: 1.29e-04 2022-05-09 02:57:05,636 INFO [train.py:715] (0/8) Epoch 17, batch 23300, loss[loss=0.145, simple_loss=0.2111, pruned_loss=0.03944, over 4946.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2068, pruned_loss=0.02914, over 973918.44 frames.], batch size: 35, lr: 1.29e-04 2022-05-09 02:57:44,993 INFO [train.py:715] (0/8) Epoch 17, batch 23350, loss[loss=0.1499, simple_loss=0.2083, pruned_loss=0.0457, over 4699.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2073, pruned_loss=0.02968, over 973210.44 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 02:58:25,086 INFO [train.py:715] (0/8) Epoch 17, batch 23400, loss[loss=0.1211, simple_loss=0.1899, pruned_loss=0.02615, over 4815.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2064, pruned_loss=0.02961, over 973158.83 frames.], batch size: 13, lr: 1.29e-04 2022-05-09 02:59:03,866 INFO [train.py:715] (0/8) Epoch 17, batch 23450, loss[loss=0.1064, simple_loss=0.1834, pruned_loss=0.01475, over 4974.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2069, pruned_loss=0.02921, over 973175.14 frames.], batch size: 24, lr: 1.29e-04 2022-05-09 02:59:42,960 INFO [train.py:715] (0/8) Epoch 17, batch 23500, loss[loss=0.1273, simple_loss=0.2025, pruned_loss=0.02609, over 4918.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2064, pruned_loss=0.02896, over 972720.37 frames.], batch size: 23, lr: 1.29e-04 2022-05-09 03:00:22,277 INFO [train.py:715] (0/8) Epoch 17, batch 23550, loss[loss=0.1051, simple_loss=0.1839, pruned_loss=0.01317, over 4980.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2062, pruned_loss=0.02839, over 973474.30 frames.], batch size: 28, lr: 1.29e-04 2022-05-09 03:01:01,965 INFO [train.py:715] (0/8) Epoch 17, batch 23600, loss[loss=0.1396, simple_loss=0.2134, pruned_loss=0.03291, over 4644.00 frames.], tot_loss[loss=0.132, simple_loss=0.2066, pruned_loss=0.02867, over 972965.68 frames.], batch size: 13, lr: 1.29e-04 2022-05-09 03:01:40,303 INFO [train.py:715] (0/8) Epoch 17, batch 23650, loss[loss=0.1017, 
simple_loss=0.1742, pruned_loss=0.01462, over 4738.00 frames.], tot_loss[loss=0.132, simple_loss=0.2065, pruned_loss=0.02879, over 972709.28 frames.], batch size: 12, lr: 1.29e-04 2022-05-09 03:02:19,920 INFO [train.py:715] (0/8) Epoch 17, batch 23700, loss[loss=0.1389, simple_loss=0.2022, pruned_loss=0.03776, over 4790.00 frames.], tot_loss[loss=0.132, simple_loss=0.2065, pruned_loss=0.02874, over 972923.20 frames.], batch size: 14, lr: 1.29e-04 2022-05-09 03:02:59,507 INFO [train.py:715] (0/8) Epoch 17, batch 23750, loss[loss=0.1443, simple_loss=0.2249, pruned_loss=0.03186, over 4796.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2066, pruned_loss=0.02864, over 972694.52 frames.], batch size: 13, lr: 1.29e-04 2022-05-09 03:03:38,380 INFO [train.py:715] (0/8) Epoch 17, batch 23800, loss[loss=0.1467, simple_loss=0.2274, pruned_loss=0.033, over 4962.00 frames.], tot_loss[loss=0.1325, simple_loss=0.207, pruned_loss=0.02897, over 973224.72 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 03:04:16,662 INFO [train.py:715] (0/8) Epoch 17, batch 23850, loss[loss=0.1387, simple_loss=0.2079, pruned_loss=0.0348, over 4799.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2072, pruned_loss=0.02889, over 973231.28 frames.], batch size: 21, lr: 1.29e-04 2022-05-09 03:04:56,706 INFO [train.py:715] (0/8) Epoch 17, batch 23900, loss[loss=0.1184, simple_loss=0.1934, pruned_loss=0.02169, over 4914.00 frames.], tot_loss[loss=0.133, simple_loss=0.2078, pruned_loss=0.02914, over 972076.08 frames.], batch size: 23, lr: 1.29e-04 2022-05-09 03:05:35,870 INFO [train.py:715] (0/8) Epoch 17, batch 23950, loss[loss=0.1197, simple_loss=0.2118, pruned_loss=0.0138, over 4708.00 frames.], tot_loss[loss=0.133, simple_loss=0.208, pruned_loss=0.029, over 971489.54 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 03:06:14,198 INFO [train.py:715] (0/8) Epoch 17, batch 24000, loss[loss=0.1308, simple_loss=0.2105, pruned_loss=0.02556, over 4697.00 frames.], tot_loss[loss=0.133, simple_loss=0.2078, pruned_loss=0.02909, over 971376.07 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 03:06:14,199 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 03:06:24,068 INFO [train.py:742] (0/8) Epoch 17, validation: loss=0.1047, simple_loss=0.1881, pruned_loss=0.01067, over 914524.00 frames. 
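Throughout this stretch the reported loss is consistent with a weighted sum of its two components in which simple_loss carries a 0.5 weight and pruned_loss a weight of 1; for instance, the validation entry just above gives 0.5 * 0.1881 + 0.01067 ≈ 0.1047. That weighting is inferred from the numbers, not stated anywhere in the log itself, so the check below only re-does the arithmetic on a few values copied from nearby entries.

# Consistency check: loss ≈ 0.5 * simple_loss + pruned_loss for values copied
# from the surrounding entries.  The 0.5 weight is an assumption read off the
# numbers themselves, not taken from the training code.
samples = [
    # (reported loss, simple_loss, pruned_loss)
    (0.1047, 0.1881, 0.01067),  # epoch 17 validation at batch 24000
    (0.1262, 0.1938, 0.02932),  # epoch 17, batch 24050, per-batch loss
    (0.1325, 0.2067, 0.02916),  # epoch 17, batch 21200, running tot_loss
]

for reported, simple, pruned in samples:
    recombined = 0.5 * simple + pruned
    print(f"reported={reported:.4f}  0.5*simple+pruned={recombined:.4f}  "
          f"diff={abs(reported - recombined):.5f}")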
2022-05-09 03:07:02,577 INFO [train.py:715] (0/8) Epoch 17, batch 24050, loss[loss=0.1262, simple_loss=0.1938, pruned_loss=0.02932, over 4906.00 frames.], tot_loss[loss=0.1333, simple_loss=0.208, pruned_loss=0.0293, over 970889.13 frames.], batch size: 23, lr: 1.29e-04 2022-05-09 03:07:41,974 INFO [train.py:715] (0/8) Epoch 17, batch 24100, loss[loss=0.1163, simple_loss=0.1902, pruned_loss=0.02121, over 4944.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2081, pruned_loss=0.02957, over 971255.16 frames.], batch size: 29, lr: 1.29e-04 2022-05-09 03:08:22,150 INFO [train.py:715] (0/8) Epoch 17, batch 24150, loss[loss=0.122, simple_loss=0.1993, pruned_loss=0.02239, over 4979.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2079, pruned_loss=0.02938, over 971420.82 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 03:09:00,902 INFO [train.py:715] (0/8) Epoch 17, batch 24200, loss[loss=0.1316, simple_loss=0.1984, pruned_loss=0.03239, over 4757.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2072, pruned_loss=0.02927, over 971871.82 frames.], batch size: 16, lr: 1.29e-04 2022-05-09 03:09:33,369 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-616000.pt 2022-05-09 03:09:42,451 INFO [train.py:715] (0/8) Epoch 17, batch 24250, loss[loss=0.1504, simple_loss=0.2212, pruned_loss=0.03978, over 4702.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2066, pruned_loss=0.02901, over 971914.64 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 03:10:23,058 INFO [train.py:715] (0/8) Epoch 17, batch 24300, loss[loss=0.1318, simple_loss=0.2185, pruned_loss=0.02254, over 4777.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2071, pruned_loss=0.02905, over 972070.68 frames.], batch size: 17, lr: 1.29e-04 2022-05-09 03:11:02,608 INFO [train.py:715] (0/8) Epoch 17, batch 24350, loss[loss=0.1454, simple_loss=0.2171, pruned_loss=0.03688, over 4647.00 frames.], tot_loss[loss=0.1328, simple_loss=0.207, pruned_loss=0.02926, over 971300.80 frames.], batch size: 13, lr: 1.29e-04 2022-05-09 03:11:41,989 INFO [train.py:715] (0/8) Epoch 17, batch 24400, loss[loss=0.1352, simple_loss=0.2039, pruned_loss=0.03324, over 4879.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2071, pruned_loss=0.02933, over 971184.28 frames.], batch size: 22, lr: 1.29e-04 2022-05-09 03:12:21,138 INFO [train.py:715] (0/8) Epoch 17, batch 24450, loss[loss=0.09925, simple_loss=0.1775, pruned_loss=0.01052, over 4760.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2065, pruned_loss=0.02912, over 971163.69 frames.], batch size: 18, lr: 1.29e-04 2022-05-09 03:13:01,328 INFO [train.py:715] (0/8) Epoch 17, batch 24500, loss[loss=0.1494, simple_loss=0.2138, pruned_loss=0.04248, over 4879.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2061, pruned_loss=0.02884, over 971535.46 frames.], batch size: 32, lr: 1.29e-04 2022-05-09 03:13:40,453 INFO [train.py:715] (0/8) Epoch 17, batch 24550, loss[loss=0.1864, simple_loss=0.2529, pruned_loss=0.05993, over 4779.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2065, pruned_loss=0.02901, over 971479.44 frames.], batch size: 17, lr: 1.29e-04 2022-05-09 03:14:19,284 INFO [train.py:715] (0/8) Epoch 17, batch 24600, loss[loss=0.1326, simple_loss=0.2078, pruned_loss=0.02875, over 4943.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2071, pruned_loss=0.02913, over 971057.00 frames.], batch size: 39, lr: 1.29e-04 2022-05-09 03:14:59,438 INFO [train.py:715] (0/8) Epoch 17, batch 24650, loss[loss=0.1304, simple_loss=0.2088, pruned_loss=0.02597, over 4817.00 frames.], 
tot_loss[loss=0.1331, simple_loss=0.2076, pruned_loss=0.02933, over 971082.38 frames.], batch size: 13, lr: 1.29e-04 2022-05-09 03:15:39,738 INFO [train.py:715] (0/8) Epoch 17, batch 24700, loss[loss=0.1744, simple_loss=0.2401, pruned_loss=0.05438, over 4743.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2083, pruned_loss=0.02971, over 971350.27 frames.], batch size: 16, lr: 1.29e-04 2022-05-09 03:16:18,261 INFO [train.py:715] (0/8) Epoch 17, batch 24750, loss[loss=0.1489, simple_loss=0.2262, pruned_loss=0.03586, over 4879.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2088, pruned_loss=0.02988, over 972159.23 frames.], batch size: 22, lr: 1.29e-04 2022-05-09 03:16:58,093 INFO [train.py:715] (0/8) Epoch 17, batch 24800, loss[loss=0.1216, simple_loss=0.1967, pruned_loss=0.02321, over 4928.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2087, pruned_loss=0.02987, over 972243.18 frames.], batch size: 21, lr: 1.29e-04 2022-05-09 03:17:37,934 INFO [train.py:715] (0/8) Epoch 17, batch 24850, loss[loss=0.1344, simple_loss=0.2149, pruned_loss=0.02699, over 4879.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2076, pruned_loss=0.02925, over 972262.21 frames.], batch size: 22, lr: 1.29e-04 2022-05-09 03:18:17,565 INFO [train.py:715] (0/8) Epoch 17, batch 24900, loss[loss=0.118, simple_loss=0.1886, pruned_loss=0.02376, over 4812.00 frames.], tot_loss[loss=0.1328, simple_loss=0.207, pruned_loss=0.02925, over 972856.85 frames.], batch size: 21, lr: 1.29e-04 2022-05-09 03:18:56,112 INFO [train.py:715] (0/8) Epoch 17, batch 24950, loss[loss=0.1411, simple_loss=0.2156, pruned_loss=0.03332, over 4803.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2057, pruned_loss=0.02878, over 973371.05 frames.], batch size: 21, lr: 1.29e-04 2022-05-09 03:19:35,618 INFO [train.py:715] (0/8) Epoch 17, batch 25000, loss[loss=0.1355, simple_loss=0.2096, pruned_loss=0.03071, over 4891.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2049, pruned_loss=0.02834, over 972762.01 frames.], batch size: 19, lr: 1.29e-04 2022-05-09 03:20:14,001 INFO [train.py:715] (0/8) Epoch 17, batch 25050, loss[loss=0.1501, simple_loss=0.2238, pruned_loss=0.03821, over 4884.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2054, pruned_loss=0.02851, over 972600.52 frames.], batch size: 22, lr: 1.29e-04 2022-05-09 03:20:52,997 INFO [train.py:715] (0/8) Epoch 17, batch 25100, loss[loss=0.125, simple_loss=0.2042, pruned_loss=0.02295, over 4766.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2058, pruned_loss=0.02841, over 972436.68 frames.], batch size: 19, lr: 1.29e-04 2022-05-09 03:21:32,979 INFO [train.py:715] (0/8) Epoch 17, batch 25150, loss[loss=0.142, simple_loss=0.2186, pruned_loss=0.03274, over 4761.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2068, pruned_loss=0.02887, over 972584.92 frames.], batch size: 14, lr: 1.29e-04 2022-05-09 03:22:12,873 INFO [train.py:715] (0/8) Epoch 17, batch 25200, loss[loss=0.1208, simple_loss=0.188, pruned_loss=0.02683, over 4702.00 frames.], tot_loss[loss=0.132, simple_loss=0.2062, pruned_loss=0.02894, over 972577.33 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 03:22:51,914 INFO [train.py:715] (0/8) Epoch 17, batch 25250, loss[loss=0.1338, simple_loss=0.2078, pruned_loss=0.02995, over 4866.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2058, pruned_loss=0.02876, over 972219.06 frames.], batch size: 32, lr: 1.29e-04 2022-05-09 03:23:31,035 INFO [train.py:715] (0/8) Epoch 17, batch 25300, loss[loss=0.1231, simple_loss=0.1969, pruned_loss=0.02462, over 4928.00 frames.], tot_loss[loss=0.1319, 
simple_loss=0.206, pruned_loss=0.02892, over 972225.47 frames.], batch size: 21, lr: 1.29e-04 2022-05-09 03:24:11,040 INFO [train.py:715] (0/8) Epoch 17, batch 25350, loss[loss=0.1354, simple_loss=0.2164, pruned_loss=0.02725, over 4806.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2058, pruned_loss=0.0289, over 972251.14 frames.], batch size: 24, lr: 1.29e-04 2022-05-09 03:24:49,785 INFO [train.py:715] (0/8) Epoch 17, batch 25400, loss[loss=0.1308, simple_loss=0.2102, pruned_loss=0.02574, over 4964.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2051, pruned_loss=0.02834, over 972365.38 frames.], batch size: 24, lr: 1.29e-04 2022-05-09 03:25:28,939 INFO [train.py:715] (0/8) Epoch 17, batch 25450, loss[loss=0.1071, simple_loss=0.1822, pruned_loss=0.01594, over 4946.00 frames.], tot_loss[loss=0.131, simple_loss=0.2052, pruned_loss=0.0284, over 973161.16 frames.], batch size: 29, lr: 1.29e-04 2022-05-09 03:26:08,064 INFO [train.py:715] (0/8) Epoch 17, batch 25500, loss[loss=0.1349, simple_loss=0.213, pruned_loss=0.02836, over 4700.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2055, pruned_loss=0.02845, over 973128.45 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 03:26:47,841 INFO [train.py:715] (0/8) Epoch 17, batch 25550, loss[loss=0.1284, simple_loss=0.2088, pruned_loss=0.02396, over 4809.00 frames.], tot_loss[loss=0.132, simple_loss=0.2066, pruned_loss=0.02873, over 974215.30 frames.], batch size: 21, lr: 1.29e-04 2022-05-09 03:27:26,920 INFO [train.py:715] (0/8) Epoch 17, batch 25600, loss[loss=0.1277, simple_loss=0.2035, pruned_loss=0.026, over 4693.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2067, pruned_loss=0.02888, over 974039.46 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 03:28:05,425 INFO [train.py:715] (0/8) Epoch 17, batch 25650, loss[loss=0.1247, simple_loss=0.2058, pruned_loss=0.02181, over 4808.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2067, pruned_loss=0.02893, over 973610.68 frames.], batch size: 21, lr: 1.29e-04 2022-05-09 03:28:45,197 INFO [train.py:715] (0/8) Epoch 17, batch 25700, loss[loss=0.1335, simple_loss=0.2015, pruned_loss=0.03273, over 4981.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2057, pruned_loss=0.02853, over 973163.14 frames.], batch size: 35, lr: 1.29e-04 2022-05-09 03:29:24,285 INFO [train.py:715] (0/8) Epoch 17, batch 25750, loss[loss=0.1296, simple_loss=0.2191, pruned_loss=0.02003, over 4907.00 frames.], tot_loss[loss=0.131, simple_loss=0.2052, pruned_loss=0.02837, over 973027.59 frames.], batch size: 19, lr: 1.29e-04 2022-05-09 03:30:03,677 INFO [train.py:715] (0/8) Epoch 17, batch 25800, loss[loss=0.115, simple_loss=0.1988, pruned_loss=0.01556, over 4971.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2053, pruned_loss=0.02822, over 972713.09 frames.], batch size: 24, lr: 1.29e-04 2022-05-09 03:30:43,161 INFO [train.py:715] (0/8) Epoch 17, batch 25850, loss[loss=0.1252, simple_loss=0.2038, pruned_loss=0.02333, over 4989.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2056, pruned_loss=0.02797, over 972196.13 frames.], batch size: 28, lr: 1.29e-04 2022-05-09 03:31:22,525 INFO [train.py:715] (0/8) Epoch 17, batch 25900, loss[loss=0.1324, simple_loss=0.2047, pruned_loss=0.03003, over 4792.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2054, pruned_loss=0.0281, over 972820.61 frames.], batch size: 21, lr: 1.29e-04 2022-05-09 03:32:01,047 INFO [train.py:715] (0/8) Epoch 17, batch 25950, loss[loss=0.1408, simple_loss=0.2172, pruned_loss=0.03219, over 4740.00 frames.], tot_loss[loss=0.1304, simple_loss=0.205, 
pruned_loss=0.02786, over 972088.57 frames.], batch size: 16, lr: 1.29e-04 2022-05-09 03:32:39,479 INFO [train.py:715] (0/8) Epoch 17, batch 26000, loss[loss=0.1289, simple_loss=0.2005, pruned_loss=0.0287, over 4780.00 frames.], tot_loss[loss=0.1301, simple_loss=0.2044, pruned_loss=0.02785, over 971814.47 frames.], batch size: 12, lr: 1.29e-04 2022-05-09 03:33:19,120 INFO [train.py:715] (0/8) Epoch 17, batch 26050, loss[loss=0.133, simple_loss=0.2076, pruned_loss=0.02917, over 4739.00 frames.], tot_loss[loss=0.13, simple_loss=0.2043, pruned_loss=0.0279, over 971365.05 frames.], batch size: 16, lr: 1.29e-04 2022-05-09 03:33:57,728 INFO [train.py:715] (0/8) Epoch 17, batch 26100, loss[loss=0.1486, simple_loss=0.2334, pruned_loss=0.03189, over 4722.00 frames.], tot_loss[loss=0.1302, simple_loss=0.2045, pruned_loss=0.02791, over 971623.73 frames.], batch size: 16, lr: 1.29e-04 2022-05-09 03:34:37,125 INFO [train.py:715] (0/8) Epoch 17, batch 26150, loss[loss=0.1293, simple_loss=0.2005, pruned_loss=0.02903, over 4922.00 frames.], tot_loss[loss=0.1308, simple_loss=0.205, pruned_loss=0.02832, over 971566.81 frames.], batch size: 17, lr: 1.29e-04 2022-05-09 03:35:16,507 INFO [train.py:715] (0/8) Epoch 17, batch 26200, loss[loss=0.1316, simple_loss=0.2052, pruned_loss=0.02902, over 4780.00 frames.], tot_loss[loss=0.132, simple_loss=0.2063, pruned_loss=0.02881, over 971300.88 frames.], batch size: 17, lr: 1.29e-04 2022-05-09 03:35:56,479 INFO [train.py:715] (0/8) Epoch 17, batch 26250, loss[loss=0.1588, simple_loss=0.2269, pruned_loss=0.04529, over 4948.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2064, pruned_loss=0.0289, over 971891.14 frames.], batch size: 39, lr: 1.29e-04 2022-05-09 03:36:35,142 INFO [train.py:715] (0/8) Epoch 17, batch 26300, loss[loss=0.1234, simple_loss=0.1908, pruned_loss=0.02805, over 4807.00 frames.], tot_loss[loss=0.1326, simple_loss=0.207, pruned_loss=0.02913, over 972562.66 frames.], batch size: 12, lr: 1.29e-04 2022-05-09 03:37:13,920 INFO [train.py:715] (0/8) Epoch 17, batch 26350, loss[loss=0.142, simple_loss=0.2241, pruned_loss=0.02992, over 4881.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2069, pruned_loss=0.02926, over 971903.83 frames.], batch size: 16, lr: 1.29e-04 2022-05-09 03:37:53,865 INFO [train.py:715] (0/8) Epoch 17, batch 26400, loss[loss=0.1365, simple_loss=0.2091, pruned_loss=0.03192, over 4907.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2076, pruned_loss=0.02947, over 972056.25 frames.], batch size: 18, lr: 1.29e-04 2022-05-09 03:38:32,577 INFO [train.py:715] (0/8) Epoch 17, batch 26450, loss[loss=0.1282, simple_loss=0.2083, pruned_loss=0.02404, over 4925.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2077, pruned_loss=0.02986, over 971365.64 frames.], batch size: 29, lr: 1.29e-04 2022-05-09 03:39:11,786 INFO [train.py:715] (0/8) Epoch 17, batch 26500, loss[loss=0.1094, simple_loss=0.1803, pruned_loss=0.01925, over 4974.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2079, pruned_loss=0.02957, over 971829.21 frames.], batch size: 14, lr: 1.29e-04 2022-05-09 03:39:51,009 INFO [train.py:715] (0/8) Epoch 17, batch 26550, loss[loss=0.1182, simple_loss=0.1989, pruned_loss=0.01874, over 4838.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2077, pruned_loss=0.02954, over 971910.07 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 03:40:29,941 INFO [train.py:715] (0/8) Epoch 17, batch 26600, loss[loss=0.1392, simple_loss=0.2139, pruned_loss=0.03224, over 4769.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2075, pruned_loss=0.02936, over 
972354.62 frames.], batch size: 14, lr: 1.29e-04 2022-05-09 03:41:08,348 INFO [train.py:715] (0/8) Epoch 17, batch 26650, loss[loss=0.1269, simple_loss=0.1985, pruned_loss=0.02763, over 4957.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2066, pruned_loss=0.02916, over 973079.62 frames.], batch size: 14, lr: 1.29e-04 2022-05-09 03:41:47,383 INFO [train.py:715] (0/8) Epoch 17, batch 26700, loss[loss=0.1314, simple_loss=0.2046, pruned_loss=0.02909, over 4896.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2078, pruned_loss=0.02984, over 972998.29 frames.], batch size: 19, lr: 1.29e-04 2022-05-09 03:42:26,784 INFO [train.py:715] (0/8) Epoch 17, batch 26750, loss[loss=0.1444, simple_loss=0.2105, pruned_loss=0.03915, over 4914.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2076, pruned_loss=0.02986, over 972486.52 frames.], batch size: 18, lr: 1.29e-04 2022-05-09 03:43:05,133 INFO [train.py:715] (0/8) Epoch 17, batch 26800, loss[loss=0.1344, simple_loss=0.2069, pruned_loss=0.03096, over 4781.00 frames.], tot_loss[loss=0.1338, simple_loss=0.208, pruned_loss=0.02979, over 972673.48 frames.], batch size: 17, lr: 1.29e-04 2022-05-09 03:43:43,931 INFO [train.py:715] (0/8) Epoch 17, batch 26850, loss[loss=0.1207, simple_loss=0.1919, pruned_loss=0.02478, over 4798.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2085, pruned_loss=0.02987, over 972194.60 frames.], batch size: 24, lr: 1.29e-04 2022-05-09 03:44:23,814 INFO [train.py:715] (0/8) Epoch 17, batch 26900, loss[loss=0.1333, simple_loss=0.2129, pruned_loss=0.02687, over 4915.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2078, pruned_loss=0.02956, over 972252.46 frames.], batch size: 17, lr: 1.29e-04 2022-05-09 03:45:02,976 INFO [train.py:715] (0/8) Epoch 17, batch 26950, loss[loss=0.1562, simple_loss=0.2301, pruned_loss=0.04113, over 4979.00 frames.], tot_loss[loss=0.133, simple_loss=0.2076, pruned_loss=0.02922, over 973420.42 frames.], batch size: 35, lr: 1.29e-04 2022-05-09 03:45:41,689 INFO [train.py:715] (0/8) Epoch 17, batch 27000, loss[loss=0.142, simple_loss=0.2206, pruned_loss=0.0317, over 4807.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2074, pruned_loss=0.02905, over 972352.31 frames.], batch size: 21, lr: 1.29e-04 2022-05-09 03:45:41,691 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 03:45:51,480 INFO [train.py:742] (0/8) Epoch 17, validation: loss=0.1047, simple_loss=0.188, pruned_loss=0.0107, over 914524.00 frames. 
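At 03:09:33 above, checkpoint.py reports saving pruned_transducer_stateless2/exp/v2/checkpoint-616000.pt in the middle of this stretch of epoch 17. A minimal sketch for inspecting such a file offline, assuming only that it is an ordinary torch-serialized dict (the key names printed are whatever the file actually contains; nothing here is specific to this recipe):

import torch

# Path copied from the checkpoint.py line above.
ckpt_path = "pruned_transducer_stateless2/exp/v2/checkpoint-616000.pt"

# Load on CPU so no GPU is needed just to look inside the file.
checkpoint = torch.load(ckpt_path, map_location="cpu")

# Print the top-level structure: nested dicts (e.g. parameter state) are
# summarised by their entry count rather than dumped in full.
for key, value in checkpoint.items():
    if isinstance(value, dict):
        print(f"{key}: dict with {len(value)} entries")
    else:
        print(f"{key}: {type(value).__name__}")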
2022-05-09 03:46:30,443 INFO [train.py:715] (0/8) Epoch 17, batch 27050, loss[loss=0.114, simple_loss=0.1868, pruned_loss=0.02057, over 4785.00 frames.], tot_loss[loss=0.1336, simple_loss=0.208, pruned_loss=0.02965, over 972264.77 frames.], batch size: 12, lr: 1.29e-04 2022-05-09 03:47:09,962 INFO [train.py:715] (0/8) Epoch 17, batch 27100, loss[loss=0.1131, simple_loss=0.1974, pruned_loss=0.01436, over 4978.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2074, pruned_loss=0.02935, over 972281.61 frames.], batch size: 25, lr: 1.29e-04 2022-05-09 03:47:49,461 INFO [train.py:715] (0/8) Epoch 17, batch 27150, loss[loss=0.1226, simple_loss=0.1997, pruned_loss=0.0228, over 4804.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2071, pruned_loss=0.02893, over 973199.52 frames.], batch size: 21, lr: 1.29e-04 2022-05-09 03:48:27,665 INFO [train.py:715] (0/8) Epoch 17, batch 27200, loss[loss=0.1388, simple_loss=0.2033, pruned_loss=0.03712, over 4852.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2072, pruned_loss=0.02909, over 973871.12 frames.], batch size: 30, lr: 1.29e-04 2022-05-09 03:49:06,446 INFO [train.py:715] (0/8) Epoch 17, batch 27250, loss[loss=0.1215, simple_loss=0.1915, pruned_loss=0.02577, over 4749.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2068, pruned_loss=0.02899, over 972575.07 frames.], batch size: 19, lr: 1.29e-04 2022-05-09 03:49:46,071 INFO [train.py:715] (0/8) Epoch 17, batch 27300, loss[loss=0.131, simple_loss=0.2084, pruned_loss=0.02676, over 4934.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2068, pruned_loss=0.02904, over 972705.23 frames.], batch size: 21, lr: 1.29e-04 2022-05-09 03:50:25,159 INFO [train.py:715] (0/8) Epoch 17, batch 27350, loss[loss=0.1035, simple_loss=0.1695, pruned_loss=0.01873, over 4771.00 frames.], tot_loss[loss=0.1329, simple_loss=0.207, pruned_loss=0.02938, over 971901.14 frames.], batch size: 14, lr: 1.29e-04 2022-05-09 03:51:04,594 INFO [train.py:715] (0/8) Epoch 17, batch 27400, loss[loss=0.1363, simple_loss=0.2144, pruned_loss=0.02908, over 4932.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2082, pruned_loss=0.02948, over 972399.65 frames.], batch size: 21, lr: 1.29e-04 2022-05-09 03:51:43,498 INFO [train.py:715] (0/8) Epoch 17, batch 27450, loss[loss=0.1196, simple_loss=0.1904, pruned_loss=0.02441, over 4815.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2081, pruned_loss=0.02933, over 972795.15 frames.], batch size: 12, lr: 1.29e-04 2022-05-09 03:52:23,143 INFO [train.py:715] (0/8) Epoch 17, batch 27500, loss[loss=0.1419, simple_loss=0.2111, pruned_loss=0.03632, over 4632.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2077, pruned_loss=0.02933, over 972462.56 frames.], batch size: 13, lr: 1.29e-04 2022-05-09 03:53:01,812 INFO [train.py:715] (0/8) Epoch 17, batch 27550, loss[loss=0.1526, simple_loss=0.2189, pruned_loss=0.04311, over 4973.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2078, pruned_loss=0.02927, over 972960.12 frames.], batch size: 14, lr: 1.29e-04 2022-05-09 03:53:40,302 INFO [train.py:715] (0/8) Epoch 17, batch 27600, loss[loss=0.1621, simple_loss=0.2346, pruned_loss=0.04478, over 4871.00 frames.], tot_loss[loss=0.1333, simple_loss=0.208, pruned_loss=0.02926, over 973243.78 frames.], batch size: 22, lr: 1.29e-04 2022-05-09 03:54:19,255 INFO [train.py:715] (0/8) Epoch 17, batch 27650, loss[loss=0.1127, simple_loss=0.1924, pruned_loss=0.01648, over 4875.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2073, pruned_loss=0.02926, over 973320.01 frames.], batch size: 16, lr: 1.29e-04 2022-05-09 03:54:57,848 
INFO [train.py:715] (0/8) Epoch 17, batch 27700, loss[loss=0.1293, simple_loss=0.1973, pruned_loss=0.0306, over 4864.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2078, pruned_loss=0.02938, over 974238.20 frames.], batch size: 32, lr: 1.29e-04 2022-05-09 03:55:37,179 INFO [train.py:715] (0/8) Epoch 17, batch 27750, loss[loss=0.1159, simple_loss=0.1825, pruned_loss=0.02469, over 4911.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2063, pruned_loss=0.02876, over 973790.36 frames.], batch size: 19, lr: 1.29e-04 2022-05-09 03:56:16,912 INFO [train.py:715] (0/8) Epoch 17, batch 27800, loss[loss=0.1516, simple_loss=0.2217, pruned_loss=0.04075, over 4892.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2059, pruned_loss=0.02869, over 973848.13 frames.], batch size: 19, lr: 1.29e-04 2022-05-09 03:56:57,476 INFO [train.py:715] (0/8) Epoch 17, batch 27850, loss[loss=0.1403, simple_loss=0.2201, pruned_loss=0.03021, over 4840.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2074, pruned_loss=0.02943, over 973790.40 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 03:57:37,273 INFO [train.py:715] (0/8) Epoch 17, batch 27900, loss[loss=0.1318, simple_loss=0.2074, pruned_loss=0.02813, over 4792.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2074, pruned_loss=0.02975, over 972912.86 frames.], batch size: 17, lr: 1.29e-04 2022-05-09 03:58:16,548 INFO [train.py:715] (0/8) Epoch 17, batch 27950, loss[loss=0.1264, simple_loss=0.2063, pruned_loss=0.02319, over 4990.00 frames.], tot_loss[loss=0.133, simple_loss=0.2068, pruned_loss=0.02961, over 972860.06 frames.], batch size: 25, lr: 1.29e-04 2022-05-09 03:58:56,515 INFO [train.py:715] (0/8) Epoch 17, batch 28000, loss[loss=0.1452, simple_loss=0.2276, pruned_loss=0.03144, over 4750.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2075, pruned_loss=0.02985, over 973202.13 frames.], batch size: 16, lr: 1.29e-04 2022-05-09 03:59:36,519 INFO [train.py:715] (0/8) Epoch 17, batch 28050, loss[loss=0.1515, simple_loss=0.2119, pruned_loss=0.04558, over 4772.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2074, pruned_loss=0.02967, over 972040.78 frames.], batch size: 14, lr: 1.29e-04 2022-05-09 04:00:15,247 INFO [train.py:715] (0/8) Epoch 17, batch 28100, loss[loss=0.1241, simple_loss=0.2015, pruned_loss=0.02333, over 4825.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2075, pruned_loss=0.02958, over 973397.39 frames.], batch size: 27, lr: 1.29e-04 2022-05-09 04:00:54,611 INFO [train.py:715] (0/8) Epoch 17, batch 28150, loss[loss=0.1036, simple_loss=0.1697, pruned_loss=0.0187, over 4810.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2071, pruned_loss=0.02921, over 973519.94 frames.], batch size: 12, lr: 1.29e-04 2022-05-09 04:01:33,613 INFO [train.py:715] (0/8) Epoch 17, batch 28200, loss[loss=0.1043, simple_loss=0.1834, pruned_loss=0.01262, over 4836.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2073, pruned_loss=0.02922, over 973866.34 frames.], batch size: 26, lr: 1.29e-04 2022-05-09 04:02:12,002 INFO [train.py:715] (0/8) Epoch 17, batch 28250, loss[loss=0.1321, simple_loss=0.2113, pruned_loss=0.02646, over 4903.00 frames.], tot_loss[loss=0.1328, simple_loss=0.207, pruned_loss=0.02932, over 974097.12 frames.], batch size: 19, lr: 1.29e-04 2022-05-09 04:02:50,447 INFO [train.py:715] (0/8) Epoch 17, batch 28300, loss[loss=0.1353, simple_loss=0.2185, pruned_loss=0.02612, over 4743.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2062, pruned_loss=0.02867, over 974410.30 frames.], batch size: 16, lr: 1.29e-04 2022-05-09 04:03:29,617 INFO [train.py:715] 
(0/8) Epoch 17, batch 28350, loss[loss=0.1112, simple_loss=0.1904, pruned_loss=0.01599, over 4809.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2063, pruned_loss=0.02865, over 973389.56 frames.], batch size: 27, lr: 1.29e-04 2022-05-09 04:04:09,194 INFO [train.py:715] (0/8) Epoch 17, batch 28400, loss[loss=0.1433, simple_loss=0.2214, pruned_loss=0.03258, over 4953.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2071, pruned_loss=0.02934, over 972561.85 frames.], batch size: 39, lr: 1.29e-04 2022-05-09 04:04:48,214 INFO [train.py:715] (0/8) Epoch 17, batch 28450, loss[loss=0.1147, simple_loss=0.1899, pruned_loss=0.01974, over 4798.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2065, pruned_loss=0.02884, over 972928.61 frames.], batch size: 21, lr: 1.29e-04 2022-05-09 04:05:26,443 INFO [train.py:715] (0/8) Epoch 17, batch 28500, loss[loss=0.1545, simple_loss=0.2262, pruned_loss=0.04143, over 4743.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2064, pruned_loss=0.02859, over 972639.66 frames.], batch size: 16, lr: 1.29e-04 2022-05-09 04:06:06,460 INFO [train.py:715] (0/8) Epoch 17, batch 28550, loss[loss=0.1581, simple_loss=0.2143, pruned_loss=0.05095, over 4940.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2079, pruned_loss=0.02946, over 972251.40 frames.], batch size: 21, lr: 1.29e-04 2022-05-09 04:06:45,099 INFO [train.py:715] (0/8) Epoch 17, batch 28600, loss[loss=0.14, simple_loss=0.2157, pruned_loss=0.03214, over 4891.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2079, pruned_loss=0.02957, over 972689.48 frames.], batch size: 19, lr: 1.29e-04 2022-05-09 04:07:23,870 INFO [train.py:715] (0/8) Epoch 17, batch 28650, loss[loss=0.1447, simple_loss=0.2159, pruned_loss=0.03681, over 4955.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2073, pruned_loss=0.02949, over 973066.45 frames.], batch size: 35, lr: 1.29e-04 2022-05-09 04:08:02,255 INFO [train.py:715] (0/8) Epoch 17, batch 28700, loss[loss=0.1153, simple_loss=0.1887, pruned_loss=0.02095, over 4748.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2071, pruned_loss=0.02905, over 973172.47 frames.], batch size: 19, lr: 1.29e-04 2022-05-09 04:08:41,576 INFO [train.py:715] (0/8) Epoch 17, batch 28750, loss[loss=0.1196, simple_loss=0.1944, pruned_loss=0.02242, over 4794.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2078, pruned_loss=0.02961, over 973343.56 frames.], batch size: 24, lr: 1.29e-04 2022-05-09 04:09:20,209 INFO [train.py:715] (0/8) Epoch 17, batch 28800, loss[loss=0.1157, simple_loss=0.1955, pruned_loss=0.01797, over 4810.00 frames.], tot_loss[loss=0.133, simple_loss=0.2074, pruned_loss=0.02929, over 972981.84 frames.], batch size: 25, lr: 1.29e-04 2022-05-09 04:09:58,907 INFO [train.py:715] (0/8) Epoch 17, batch 28850, loss[loss=0.1422, simple_loss=0.2208, pruned_loss=0.03184, over 4780.00 frames.], tot_loss[loss=0.133, simple_loss=0.2075, pruned_loss=0.02921, over 972231.66 frames.], batch size: 18, lr: 1.29e-04 2022-05-09 04:10:37,989 INFO [train.py:715] (0/8) Epoch 17, batch 28900, loss[loss=0.1522, simple_loss=0.2184, pruned_loss=0.04304, over 4975.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2074, pruned_loss=0.02916, over 972393.51 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 04:11:16,518 INFO [train.py:715] (0/8) Epoch 17, batch 28950, loss[loss=0.1218, simple_loss=0.1933, pruned_loss=0.02512, over 4917.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2078, pruned_loss=0.02956, over 973264.38 frames.], batch size: 17, lr: 1.29e-04 2022-05-09 04:11:54,922 INFO [train.py:715] (0/8) Epoch 17, batch 
29000, loss[loss=0.1273, simple_loss=0.1977, pruned_loss=0.02841, over 4989.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2073, pruned_loss=0.02922, over 973214.63 frames.], batch size: 14, lr: 1.29e-04 2022-05-09 04:12:33,658 INFO [train.py:715] (0/8) Epoch 17, batch 29050, loss[loss=0.1229, simple_loss=0.2002, pruned_loss=0.02275, over 4831.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2071, pruned_loss=0.02891, over 972849.87 frames.], batch size: 13, lr: 1.29e-04 2022-05-09 04:13:13,011 INFO [train.py:715] (0/8) Epoch 17, batch 29100, loss[loss=0.1305, simple_loss=0.2049, pruned_loss=0.02798, over 4850.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2074, pruned_loss=0.02883, over 972695.35 frames.], batch size: 32, lr: 1.29e-04 2022-05-09 04:13:51,909 INFO [train.py:715] (0/8) Epoch 17, batch 29150, loss[loss=0.1325, simple_loss=0.2048, pruned_loss=0.03003, over 4915.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2072, pruned_loss=0.02882, over 972649.26 frames.], batch size: 17, lr: 1.29e-04 2022-05-09 04:14:30,016 INFO [train.py:715] (0/8) Epoch 17, batch 29200, loss[loss=0.1302, simple_loss=0.2154, pruned_loss=0.02256, over 4969.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2065, pruned_loss=0.0286, over 972008.57 frames.], batch size: 24, lr: 1.29e-04 2022-05-09 04:15:09,516 INFO [train.py:715] (0/8) Epoch 17, batch 29250, loss[loss=0.1198, simple_loss=0.1974, pruned_loss=0.02105, over 4988.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2061, pruned_loss=0.02888, over 972650.22 frames.], batch size: 25, lr: 1.29e-04 2022-05-09 04:15:49,146 INFO [train.py:715] (0/8) Epoch 17, batch 29300, loss[loss=0.1231, simple_loss=0.2051, pruned_loss=0.02052, over 4941.00 frames.], tot_loss[loss=0.1319, simple_loss=0.206, pruned_loss=0.02888, over 972670.48 frames.], batch size: 21, lr: 1.29e-04 2022-05-09 04:16:27,571 INFO [train.py:715] (0/8) Epoch 17, batch 29350, loss[loss=0.1464, simple_loss=0.2095, pruned_loss=0.04166, over 4861.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2056, pruned_loss=0.02895, over 972904.74 frames.], batch size: 32, lr: 1.29e-04 2022-05-09 04:17:06,155 INFO [train.py:715] (0/8) Epoch 17, batch 29400, loss[loss=0.183, simple_loss=0.2452, pruned_loss=0.06035, over 4843.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2066, pruned_loss=0.02914, over 973419.99 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 04:17:45,838 INFO [train.py:715] (0/8) Epoch 17, batch 29450, loss[loss=0.1209, simple_loss=0.1888, pruned_loss=0.02646, over 4756.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2067, pruned_loss=0.02927, over 972809.32 frames.], batch size: 19, lr: 1.29e-04 2022-05-09 04:18:24,962 INFO [train.py:715] (0/8) Epoch 17, batch 29500, loss[loss=0.1021, simple_loss=0.1792, pruned_loss=0.01247, over 4938.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2058, pruned_loss=0.02895, over 972920.27 frames.], batch size: 18, lr: 1.29e-04 2022-05-09 04:19:03,882 INFO [train.py:715] (0/8) Epoch 17, batch 29550, loss[loss=0.1269, simple_loss=0.2055, pruned_loss=0.02417, over 4890.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2065, pruned_loss=0.02911, over 973826.36 frames.], batch size: 16, lr: 1.29e-04 2022-05-09 04:19:43,165 INFO [train.py:715] (0/8) Epoch 17, batch 29600, loss[loss=0.1133, simple_loss=0.1923, pruned_loss=0.0171, over 4868.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2072, pruned_loss=0.02902, over 973259.57 frames.], batch size: 20, lr: 1.29e-04 2022-05-09 04:20:22,741 INFO [train.py:715] (0/8) Epoch 17, batch 29650, 
loss[loss=0.1525, simple_loss=0.22, pruned_loss=0.04251, over 4789.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2061, pruned_loss=0.02882, over 973250.32 frames.], batch size: 14, lr: 1.29e-04 2022-05-09 04:21:01,515 INFO [train.py:715] (0/8) Epoch 17, batch 29700, loss[loss=0.1278, simple_loss=0.2027, pruned_loss=0.02647, over 4911.00 frames.], tot_loss[loss=0.1318, simple_loss=0.206, pruned_loss=0.02881, over 972519.45 frames.], batch size: 29, lr: 1.29e-04 2022-05-09 04:21:40,463 INFO [train.py:715] (0/8) Epoch 17, batch 29750, loss[loss=0.1218, simple_loss=0.1936, pruned_loss=0.02499, over 4945.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2059, pruned_loss=0.02863, over 972045.99 frames.], batch size: 21, lr: 1.29e-04 2022-05-09 04:22:20,620 INFO [train.py:715] (0/8) Epoch 17, batch 29800, loss[loss=0.115, simple_loss=0.183, pruned_loss=0.02351, over 4764.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2058, pruned_loss=0.02821, over 972750.64 frames.], batch size: 12, lr: 1.29e-04 2022-05-09 04:22:59,616 INFO [train.py:715] (0/8) Epoch 17, batch 29850, loss[loss=0.1354, simple_loss=0.2084, pruned_loss=0.03123, over 4844.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2055, pruned_loss=0.0281, over 972247.39 frames.], batch size: 32, lr: 1.29e-04 2022-05-09 04:23:38,911 INFO [train.py:715] (0/8) Epoch 17, batch 29900, loss[loss=0.1506, simple_loss=0.22, pruned_loss=0.04055, over 4832.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2061, pruned_loss=0.02873, over 971523.82 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 04:24:18,619 INFO [train.py:715] (0/8) Epoch 17, batch 29950, loss[loss=0.1252, simple_loss=0.2053, pruned_loss=0.02257, over 4954.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2064, pruned_loss=0.02866, over 970998.60 frames.], batch size: 29, lr: 1.29e-04 2022-05-09 04:24:58,026 INFO [train.py:715] (0/8) Epoch 17, batch 30000, loss[loss=0.1612, simple_loss=0.2389, pruned_loss=0.04172, over 4880.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2073, pruned_loss=0.02909, over 972044.31 frames.], batch size: 22, lr: 1.29e-04 2022-05-09 04:24:58,027 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 04:25:08,262 INFO [train.py:742] (0/8) Epoch 17, validation: loss=0.1047, simple_loss=0.188, pruned_loss=0.01065, over 914524.00 frames. 
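The four validation passes in this stretch (batches 21000, 24000, 27000 and 30000 of epoch 17) report 0.1049, 0.1047, 0.1047 and 0.1047 over the same 914524 frames, while the printed learning rate moves from 1.30e-04 to 1.29e-04, so the validation loss is essentially flat here. A tiny summary, with the values copied by hand from the entries above:

# Validation losses reported in this stretch of epoch 17
# (batch index -> validation loss over 914524 frames), copied from the log.
validation = {
    21000: 0.1049,
    24000: 0.1047,
    27000: 0.1047,
    30000: 0.1047,
}

for batch, loss in sorted(validation.items()):
    print(f"batch {batch:>5}: validation loss = {loss:.4f}")

# Net movement between the first and last validation pass shown here: about
# -0.0002, i.e. the model is barely improving on the dev set at this point.
change = validation[max(validation)] - validation[min(validation)]
print(f"net change: {change:+.4f}")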
2022-05-09 04:25:48,085 INFO [train.py:715] (0/8) Epoch 17, batch 30050, loss[loss=0.1552, simple_loss=0.2296, pruned_loss=0.04043, over 4908.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2071, pruned_loss=0.02893, over 972826.39 frames.], batch size: 19, lr: 1.29e-04 2022-05-09 04:26:27,726 INFO [train.py:715] (0/8) Epoch 17, batch 30100, loss[loss=0.1218, simple_loss=0.2026, pruned_loss=0.02054, over 4752.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2068, pruned_loss=0.02873, over 972705.35 frames.], batch size: 19, lr: 1.29e-04 2022-05-09 04:27:06,811 INFO [train.py:715] (0/8) Epoch 17, batch 30150, loss[loss=0.1332, simple_loss=0.2167, pruned_loss=0.02482, over 4971.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2071, pruned_loss=0.02886, over 973314.83 frames.], batch size: 39, lr: 1.29e-04 2022-05-09 04:27:46,308 INFO [train.py:715] (0/8) Epoch 17, batch 30200, loss[loss=0.1762, simple_loss=0.2521, pruned_loss=0.0502, over 4984.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2071, pruned_loss=0.02887, over 973387.60 frames.], batch size: 25, lr: 1.29e-04 2022-05-09 04:28:25,425 INFO [train.py:715] (0/8) Epoch 17, batch 30250, loss[loss=0.1327, simple_loss=0.2012, pruned_loss=0.03206, over 4886.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2073, pruned_loss=0.02899, over 972776.54 frames.], batch size: 16, lr: 1.29e-04 2022-05-09 04:29:04,416 INFO [train.py:715] (0/8) Epoch 17, batch 30300, loss[loss=0.1318, simple_loss=0.2047, pruned_loss=0.02946, over 4911.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2075, pruned_loss=0.02905, over 973017.12 frames.], batch size: 17, lr: 1.29e-04 2022-05-09 04:29:44,186 INFO [train.py:715] (0/8) Epoch 17, batch 30350, loss[loss=0.1402, simple_loss=0.214, pruned_loss=0.0332, over 4940.00 frames.], tot_loss[loss=0.1325, simple_loss=0.207, pruned_loss=0.02903, over 973824.45 frames.], batch size: 29, lr: 1.29e-04 2022-05-09 04:30:23,367 INFO [train.py:715] (0/8) Epoch 17, batch 30400, loss[loss=0.2045, simple_loss=0.271, pruned_loss=0.06897, over 4695.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2075, pruned_loss=0.02943, over 973113.68 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 04:31:02,097 INFO [train.py:715] (0/8) Epoch 17, batch 30450, loss[loss=0.1305, simple_loss=0.2049, pruned_loss=0.02803, over 4751.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2068, pruned_loss=0.02897, over 972620.13 frames.], batch size: 16, lr: 1.29e-04 2022-05-09 04:31:41,824 INFO [train.py:715] (0/8) Epoch 17, batch 30500, loss[loss=0.125, simple_loss=0.2, pruned_loss=0.02502, over 4880.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2069, pruned_loss=0.02914, over 972462.12 frames.], batch size: 22, lr: 1.29e-04 2022-05-09 04:32:21,636 INFO [train.py:715] (0/8) Epoch 17, batch 30550, loss[loss=0.1242, simple_loss=0.2071, pruned_loss=0.02068, over 4702.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2072, pruned_loss=0.02922, over 972086.65 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 04:33:01,422 INFO [train.py:715] (0/8) Epoch 17, batch 30600, loss[loss=0.1329, simple_loss=0.1997, pruned_loss=0.03308, over 4974.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2072, pruned_loss=0.02962, over 971908.95 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 04:33:40,315 INFO [train.py:715] (0/8) Epoch 17, batch 30650, loss[loss=0.1302, simple_loss=0.2077, pruned_loss=0.02638, over 4916.00 frames.], tot_loss[loss=0.133, simple_loss=0.2071, pruned_loss=0.02949, over 970417.77 frames.], batch size: 19, lr: 1.29e-04 2022-05-09 04:34:20,057 INFO 
[train.py:715] (0/8) Epoch 17, batch 30700, loss[loss=0.1243, simple_loss=0.2034, pruned_loss=0.02259, over 4782.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2067, pruned_loss=0.02951, over 971209.69 frames.], batch size: 14, lr: 1.29e-04 2022-05-09 04:34:59,085 INFO [train.py:715] (0/8) Epoch 17, batch 30750, loss[loss=0.1218, simple_loss=0.2044, pruned_loss=0.01958, over 4922.00 frames.], tot_loss[loss=0.1332, simple_loss=0.207, pruned_loss=0.02964, over 971052.74 frames.], batch size: 29, lr: 1.29e-04 2022-05-09 04:35:38,913 INFO [train.py:715] (0/8) Epoch 17, batch 30800, loss[loss=0.1345, simple_loss=0.2102, pruned_loss=0.02937, over 4973.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2079, pruned_loss=0.03026, over 971400.75 frames.], batch size: 14, lr: 1.29e-04 2022-05-09 04:36:18,140 INFO [train.py:715] (0/8) Epoch 17, batch 30850, loss[loss=0.1279, simple_loss=0.2091, pruned_loss=0.02329, over 4681.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2081, pruned_loss=0.0303, over 971826.61 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 04:36:58,358 INFO [train.py:715] (0/8) Epoch 17, batch 30900, loss[loss=0.1179, simple_loss=0.1888, pruned_loss=0.02343, over 4888.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2078, pruned_loss=0.03002, over 972195.20 frames.], batch size: 22, lr: 1.29e-04 2022-05-09 04:37:38,028 INFO [train.py:715] (0/8) Epoch 17, batch 30950, loss[loss=0.1463, simple_loss=0.2265, pruned_loss=0.03306, over 4909.00 frames.], tot_loss[loss=0.134, simple_loss=0.2081, pruned_loss=0.03001, over 972262.79 frames.], batch size: 17, lr: 1.29e-04 2022-05-09 04:38:17,299 INFO [train.py:715] (0/8) Epoch 17, batch 31000, loss[loss=0.1456, simple_loss=0.2151, pruned_loss=0.038, over 4961.00 frames.], tot_loss[loss=0.134, simple_loss=0.2078, pruned_loss=0.03009, over 972214.60 frames.], batch size: 39, lr: 1.29e-04 2022-05-09 04:38:57,009 INFO [train.py:715] (0/8) Epoch 17, batch 31050, loss[loss=0.1265, simple_loss=0.2033, pruned_loss=0.02485, over 4931.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2073, pruned_loss=0.02973, over 972472.16 frames.], batch size: 23, lr: 1.29e-04 2022-05-09 04:39:36,077 INFO [train.py:715] (0/8) Epoch 17, batch 31100, loss[loss=0.1403, simple_loss=0.2199, pruned_loss=0.03029, over 4927.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2078, pruned_loss=0.02953, over 972699.17 frames.], batch size: 18, lr: 1.29e-04 2022-05-09 04:40:15,209 INFO [train.py:715] (0/8) Epoch 17, batch 31150, loss[loss=0.1178, simple_loss=0.1966, pruned_loss=0.01955, over 4795.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2072, pruned_loss=0.0291, over 972167.97 frames.], batch size: 24, lr: 1.29e-04 2022-05-09 04:40:54,494 INFO [train.py:715] (0/8) Epoch 17, batch 31200, loss[loss=0.1426, simple_loss=0.2148, pruned_loss=0.03518, over 4787.00 frames.], tot_loss[loss=0.133, simple_loss=0.2076, pruned_loss=0.02921, over 972047.88 frames.], batch size: 18, lr: 1.29e-04 2022-05-09 04:41:34,593 INFO [train.py:715] (0/8) Epoch 17, batch 31250, loss[loss=0.155, simple_loss=0.2446, pruned_loss=0.03274, over 4972.00 frames.], tot_loss[loss=0.134, simple_loss=0.2089, pruned_loss=0.02961, over 971353.03 frames.], batch size: 24, lr: 1.29e-04 2022-05-09 04:42:13,892 INFO [train.py:715] (0/8) Epoch 17, batch 31300, loss[loss=0.1347, simple_loss=0.2017, pruned_loss=0.03383, over 4973.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2079, pruned_loss=0.02956, over 971167.48 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 04:42:53,279 INFO [train.py:715] (0/8) Epoch 
17, batch 31350, loss[loss=0.1243, simple_loss=0.1964, pruned_loss=0.02608, over 4747.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2089, pruned_loss=0.02984, over 971232.06 frames.], batch size: 19, lr: 1.29e-04 2022-05-09 04:43:32,645 INFO [train.py:715] (0/8) Epoch 17, batch 31400, loss[loss=0.1456, simple_loss=0.2261, pruned_loss=0.03256, over 4827.00 frames.], tot_loss[loss=0.1344, simple_loss=0.2092, pruned_loss=0.02977, over 971360.22 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 04:44:11,253 INFO [train.py:715] (0/8) Epoch 17, batch 31450, loss[loss=0.1157, simple_loss=0.1889, pruned_loss=0.02128, over 4864.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2086, pruned_loss=0.02948, over 972201.77 frames.], batch size: 32, lr: 1.29e-04 2022-05-09 04:44:51,211 INFO [train.py:715] (0/8) Epoch 17, batch 31500, loss[loss=0.1502, simple_loss=0.2241, pruned_loss=0.03814, over 4932.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2082, pruned_loss=0.02966, over 973028.44 frames.], batch size: 23, lr: 1.29e-04 2022-05-09 04:45:29,936 INFO [train.py:715] (0/8) Epoch 17, batch 31550, loss[loss=0.1084, simple_loss=0.1806, pruned_loss=0.01811, over 4818.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2079, pruned_loss=0.02981, over 972455.14 frames.], batch size: 12, lr: 1.29e-04 2022-05-09 04:46:09,494 INFO [train.py:715] (0/8) Epoch 17, batch 31600, loss[loss=0.1157, simple_loss=0.1899, pruned_loss=0.02078, over 4833.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2072, pruned_loss=0.02946, over 972606.75 frames.], batch size: 13, lr: 1.29e-04 2022-05-09 04:46:48,897 INFO [train.py:715] (0/8) Epoch 17, batch 31650, loss[loss=0.1206, simple_loss=0.1961, pruned_loss=0.02253, over 4935.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2074, pruned_loss=0.02958, over 972848.38 frames.], batch size: 23, lr: 1.29e-04 2022-05-09 04:47:28,179 INFO [train.py:715] (0/8) Epoch 17, batch 31700, loss[loss=0.171, simple_loss=0.2301, pruned_loss=0.05593, over 4749.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2075, pruned_loss=0.02938, over 972526.14 frames.], batch size: 16, lr: 1.29e-04 2022-05-09 04:48:07,937 INFO [train.py:715] (0/8) Epoch 17, batch 31750, loss[loss=0.1561, simple_loss=0.2283, pruned_loss=0.04196, over 4777.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2073, pruned_loss=0.02949, over 972929.74 frames.], batch size: 18, lr: 1.29e-04 2022-05-09 04:48:47,177 INFO [train.py:715] (0/8) Epoch 17, batch 31800, loss[loss=0.1353, simple_loss=0.2102, pruned_loss=0.03027, over 4814.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2073, pruned_loss=0.02947, over 972377.49 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 04:49:27,379 INFO [train.py:715] (0/8) Epoch 17, batch 31850, loss[loss=0.143, simple_loss=0.2183, pruned_loss=0.03381, over 4833.00 frames.], tot_loss[loss=0.1328, simple_loss=0.207, pruned_loss=0.02927, over 971721.53 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 04:50:06,507 INFO [train.py:715] (0/8) Epoch 17, batch 31900, loss[loss=0.126, simple_loss=0.1974, pruned_loss=0.0273, over 4843.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2066, pruned_loss=0.02905, over 972002.97 frames.], batch size: 15, lr: 1.29e-04 2022-05-09 04:50:45,988 INFO [train.py:715] (0/8) Epoch 17, batch 31950, loss[loss=0.1261, simple_loss=0.1973, pruned_loss=0.02745, over 4927.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2062, pruned_loss=0.02866, over 972171.85 frames.], batch size: 29, lr: 1.29e-04 2022-05-09 04:51:25,761 INFO [train.py:715] (0/8) Epoch 17, batch 32000, 
loss[loss=0.1259, simple_loss=0.1961, pruned_loss=0.02781, over 4836.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2064, pruned_loss=0.02864, over 971745.97 frames.], batch size: 27, lr: 1.29e-04 2022-05-09 04:52:04,645 INFO [train.py:715] (0/8) Epoch 17, batch 32050, loss[loss=0.1648, simple_loss=0.2259, pruned_loss=0.05187, over 4942.00 frames.], tot_loss[loss=0.132, simple_loss=0.2067, pruned_loss=0.02866, over 971586.17 frames.], batch size: 21, lr: 1.29e-04 2022-05-09 04:52:44,367 INFO [train.py:715] (0/8) Epoch 17, batch 32100, loss[loss=0.1335, simple_loss=0.213, pruned_loss=0.02704, over 4877.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2063, pruned_loss=0.02831, over 971507.82 frames.], batch size: 16, lr: 1.29e-04 2022-05-09 04:53:23,404 INFO [train.py:715] (0/8) Epoch 17, batch 32150, loss[loss=0.1181, simple_loss=0.1927, pruned_loss=0.02173, over 4845.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2065, pruned_loss=0.02856, over 970964.29 frames.], batch size: 26, lr: 1.29e-04 2022-05-09 04:54:02,757 INFO [train.py:715] (0/8) Epoch 17, batch 32200, loss[loss=0.1535, simple_loss=0.2245, pruned_loss=0.04127, over 4974.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2063, pruned_loss=0.02863, over 971635.08 frames.], batch size: 35, lr: 1.29e-04 2022-05-09 04:54:35,604 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-624000.pt 2022-05-09 04:54:45,061 INFO [train.py:715] (0/8) Epoch 17, batch 32250, loss[loss=0.1573, simple_loss=0.223, pruned_loss=0.0458, over 4869.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2063, pruned_loss=0.02852, over 971441.26 frames.], batch size: 16, lr: 1.29e-04 2022-05-09 04:55:24,421 INFO [train.py:715] (0/8) Epoch 17, batch 32300, loss[loss=0.1469, simple_loss=0.1998, pruned_loss=0.04703, over 4853.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2061, pruned_loss=0.0286, over 970942.98 frames.], batch size: 34, lr: 1.29e-04 2022-05-09 04:56:04,322 INFO [train.py:715] (0/8) Epoch 17, batch 32350, loss[loss=0.1383, simple_loss=0.2164, pruned_loss=0.03015, over 4966.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2061, pruned_loss=0.02842, over 972024.71 frames.], batch size: 24, lr: 1.29e-04 2022-05-09 04:56:43,384 INFO [train.py:715] (0/8) Epoch 17, batch 32400, loss[loss=0.1326, simple_loss=0.206, pruned_loss=0.02964, over 4974.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2055, pruned_loss=0.0281, over 971657.14 frames.], batch size: 33, lr: 1.29e-04 2022-05-09 04:57:22,530 INFO [train.py:715] (0/8) Epoch 17, batch 32450, loss[loss=0.1397, simple_loss=0.2203, pruned_loss=0.02957, over 4803.00 frames.], tot_loss[loss=0.131, simple_loss=0.2056, pruned_loss=0.02824, over 971612.71 frames.], batch size: 25, lr: 1.28e-04 2022-05-09 04:58:02,556 INFO [train.py:715] (0/8) Epoch 17, batch 32500, loss[loss=0.1162, simple_loss=0.1923, pruned_loss=0.02007, over 4644.00 frames.], tot_loss[loss=0.1312, simple_loss=0.206, pruned_loss=0.02823, over 971074.65 frames.], batch size: 13, lr: 1.28e-04 2022-05-09 04:58:41,967 INFO [train.py:715] (0/8) Epoch 17, batch 32550, loss[loss=0.1372, simple_loss=0.2018, pruned_loss=0.03626, over 4765.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2062, pruned_loss=0.02822, over 971281.04 frames.], batch size: 18, lr: 1.28e-04 2022-05-09 04:59:21,559 INFO [train.py:715] (0/8) Epoch 17, batch 32600, loss[loss=0.1234, simple_loss=0.2017, pruned_loss=0.0225, over 4747.00 frames.], tot_loss[loss=0.131, simple_loss=0.2057, pruned_loss=0.02812, over 971563.53 frames.], 
batch size: 16, lr: 1.28e-04 2022-05-09 05:00:01,069 INFO [train.py:715] (0/8) Epoch 17, batch 32650, loss[loss=0.1362, simple_loss=0.2118, pruned_loss=0.03032, over 4931.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2066, pruned_loss=0.02866, over 971667.62 frames.], batch size: 23, lr: 1.28e-04 2022-05-09 05:00:39,805 INFO [train.py:715] (0/8) Epoch 17, batch 32700, loss[loss=0.124, simple_loss=0.2024, pruned_loss=0.02283, over 4794.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2067, pruned_loss=0.02903, over 972299.63 frames.], batch size: 24, lr: 1.28e-04 2022-05-09 05:01:19,988 INFO [train.py:715] (0/8) Epoch 17, batch 32750, loss[loss=0.1137, simple_loss=0.1856, pruned_loss=0.02089, over 4919.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2071, pruned_loss=0.0292, over 972897.94 frames.], batch size: 18, lr: 1.28e-04 2022-05-09 05:01:59,334 INFO [train.py:715] (0/8) Epoch 17, batch 32800, loss[loss=0.128, simple_loss=0.2018, pruned_loss=0.02705, over 4803.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2066, pruned_loss=0.02889, over 972188.18 frames.], batch size: 21, lr: 1.28e-04 2022-05-09 05:02:38,973 INFO [train.py:715] (0/8) Epoch 17, batch 32850, loss[loss=0.1342, simple_loss=0.2112, pruned_loss=0.02865, over 4850.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2059, pruned_loss=0.02849, over 972339.89 frames.], batch size: 20, lr: 1.28e-04 2022-05-09 05:03:18,518 INFO [train.py:715] (0/8) Epoch 17, batch 32900, loss[loss=0.1195, simple_loss=0.1956, pruned_loss=0.02169, over 4890.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2062, pruned_loss=0.0286, over 972858.01 frames.], batch size: 22, lr: 1.28e-04 2022-05-09 05:03:58,023 INFO [train.py:715] (0/8) Epoch 17, batch 32950, loss[loss=0.1178, simple_loss=0.1833, pruned_loss=0.02615, over 4832.00 frames.], tot_loss[loss=0.1315, simple_loss=0.206, pruned_loss=0.02853, over 972063.50 frames.], batch size: 13, lr: 1.28e-04 2022-05-09 05:04:36,960 INFO [train.py:715] (0/8) Epoch 17, batch 33000, loss[loss=0.126, simple_loss=0.201, pruned_loss=0.02549, over 4812.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2061, pruned_loss=0.02855, over 971678.01 frames.], batch size: 21, lr: 1.28e-04 2022-05-09 05:04:36,961 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 05:04:49,646 INFO [train.py:742] (0/8) Epoch 17, validation: loss=0.1049, simple_loss=0.1881, pruned_loss=0.0108, over 914524.00 frames. 
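Entries in this format are easy to post-process. The helper below is a stand-alone sketch (not part of the recipe; it simply mirrors the layout of the lines above and assumes one log entry per line, as train.py writes them) for pulling out the running tot_loss and learning rate so the curves can be tabulated or plotted:

```python
import re
from typing import Iterator, Tuple

# Matches training entries such as:
# "... Epoch 17, batch 33000, loss[...], tot_loss[loss=0.1316, ...], batch size: 21, lr: 1.28e-04"
ENTRY = re.compile(
    r"Epoch (?P<epoch>\d+), batch (?P<batch>\d+), .*?"
    r"tot_loss\[loss=(?P<tot_loss>[0-9.]+),.*?\], "
    r"batch size: \d+, lr: (?P<lr>[0-9.e-]+)"
)

def parse_train_log(path: str) -> Iterator[Tuple[int, int, float, float]]:
    """Yield (epoch, batch, tot_loss, lr) for every training entry in the log."""
    with open(path) as f:
        for line in f:
            m = ENTRY.search(line)
            if m:
                yield (int(m["epoch"]), int(m["batch"]),
                       float(m["tot_loss"]), float(m["lr"]))

# Usage (hypothetical log path):
# for epoch, batch, tot_loss, lr in parse_train_log("pruned_transducer_stateless2/exp/v2/log-train"):
#     print(epoch, batch, tot_loss, lr)
```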
2022-05-09 05:05:28,987 INFO [train.py:715] (0/8) Epoch 17, batch 33050, loss[loss=0.1248, simple_loss=0.1842, pruned_loss=0.03271, over 4858.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2066, pruned_loss=0.02889, over 971657.30 frames.], batch size: 13, lr: 1.28e-04 2022-05-09 05:06:08,146 INFO [train.py:715] (0/8) Epoch 17, batch 33100, loss[loss=0.1124, simple_loss=0.1923, pruned_loss=0.01625, over 4901.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2067, pruned_loss=0.02849, over 971915.85 frames.], batch size: 22, lr: 1.28e-04 2022-05-09 05:06:47,449 INFO [train.py:715] (0/8) Epoch 17, batch 33150, loss[loss=0.1332, simple_loss=0.2054, pruned_loss=0.03048, over 4823.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2068, pruned_loss=0.02849, over 972313.31 frames.], batch size: 13, lr: 1.28e-04 2022-05-09 05:07:27,180 INFO [train.py:715] (0/8) Epoch 17, batch 33200, loss[loss=0.1175, simple_loss=0.1989, pruned_loss=0.01805, over 4801.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2071, pruned_loss=0.02891, over 972892.12 frames.], batch size: 21, lr: 1.28e-04 2022-05-09 05:08:06,793 INFO [train.py:715] (0/8) Epoch 17, batch 33250, loss[loss=0.1572, simple_loss=0.229, pruned_loss=0.04266, over 4914.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2072, pruned_loss=0.02886, over 973274.29 frames.], batch size: 19, lr: 1.28e-04 2022-05-09 05:08:46,105 INFO [train.py:715] (0/8) Epoch 17, batch 33300, loss[loss=0.1323, simple_loss=0.2056, pruned_loss=0.02953, over 4817.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2076, pruned_loss=0.02928, over 974190.12 frames.], batch size: 27, lr: 1.28e-04 2022-05-09 05:09:25,683 INFO [train.py:715] (0/8) Epoch 17, batch 33350, loss[loss=0.1445, simple_loss=0.214, pruned_loss=0.03744, over 4925.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2073, pruned_loss=0.02916, over 973491.84 frames.], batch size: 18, lr: 1.28e-04 2022-05-09 05:10:05,486 INFO [train.py:715] (0/8) Epoch 17, batch 33400, loss[loss=0.1381, simple_loss=0.2142, pruned_loss=0.03097, over 4883.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2081, pruned_loss=0.02944, over 973484.35 frames.], batch size: 16, lr: 1.28e-04 2022-05-09 05:10:44,825 INFO [train.py:715] (0/8) Epoch 17, batch 33450, loss[loss=0.1521, simple_loss=0.2209, pruned_loss=0.04169, over 4884.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2078, pruned_loss=0.02931, over 973190.29 frames.], batch size: 16, lr: 1.28e-04 2022-05-09 05:11:24,374 INFO [train.py:715] (0/8) Epoch 17, batch 33500, loss[loss=0.144, simple_loss=0.2205, pruned_loss=0.03375, over 4972.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2081, pruned_loss=0.02952, over 973358.97 frames.], batch size: 15, lr: 1.28e-04 2022-05-09 05:12:04,584 INFO [train.py:715] (0/8) Epoch 17, batch 33550, loss[loss=0.151, simple_loss=0.2294, pruned_loss=0.03626, over 4994.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2079, pruned_loss=0.02943, over 973173.95 frames.], batch size: 16, lr: 1.28e-04 2022-05-09 05:12:44,744 INFO [train.py:715] (0/8) Epoch 17, batch 33600, loss[loss=0.1275, simple_loss=0.1962, pruned_loss=0.02934, over 4918.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2071, pruned_loss=0.02885, over 972351.21 frames.], batch size: 23, lr: 1.28e-04 2022-05-09 05:13:23,722 INFO [train.py:715] (0/8) Epoch 17, batch 33650, loss[loss=0.1097, simple_loss=0.1927, pruned_loss=0.0134, over 4805.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2065, pruned_loss=0.02839, over 972830.77 frames.], batch size: 21, lr: 1.28e-04 2022-05-09 05:14:03,358 
INFO [train.py:715] (0/8) Epoch 17, batch 33700, loss[loss=0.1531, simple_loss=0.2231, pruned_loss=0.04156, over 4856.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2066, pruned_loss=0.02881, over 972923.75 frames.], batch size: 30, lr: 1.28e-04 2022-05-09 05:14:42,576 INFO [train.py:715] (0/8) Epoch 17, batch 33750, loss[loss=0.1298, simple_loss=0.2035, pruned_loss=0.02803, over 4878.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2073, pruned_loss=0.02913, over 973124.22 frames.], batch size: 16, lr: 1.28e-04 2022-05-09 05:15:21,393 INFO [train.py:715] (0/8) Epoch 17, batch 33800, loss[loss=0.1238, simple_loss=0.1938, pruned_loss=0.02694, over 4828.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2072, pruned_loss=0.0291, over 973118.32 frames.], batch size: 13, lr: 1.28e-04 2022-05-09 05:16:01,529 INFO [train.py:715] (0/8) Epoch 17, batch 33850, loss[loss=0.1295, simple_loss=0.2059, pruned_loss=0.02649, over 4848.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2071, pruned_loss=0.02911, over 973899.08 frames.], batch size: 20, lr: 1.28e-04 2022-05-09 05:16:41,834 INFO [train.py:715] (0/8) Epoch 17, batch 33900, loss[loss=0.1351, simple_loss=0.2111, pruned_loss=0.02953, over 4925.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2074, pruned_loss=0.02917, over 974540.30 frames.], batch size: 29, lr: 1.28e-04 2022-05-09 05:17:21,087 INFO [train.py:715] (0/8) Epoch 17, batch 33950, loss[loss=0.145, simple_loss=0.22, pruned_loss=0.03498, over 4802.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2067, pruned_loss=0.02904, over 974214.09 frames.], batch size: 14, lr: 1.28e-04 2022-05-09 05:18:00,096 INFO [train.py:715] (0/8) Epoch 17, batch 34000, loss[loss=0.1127, simple_loss=0.1896, pruned_loss=0.01785, over 4973.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2062, pruned_loss=0.02883, over 974001.46 frames.], batch size: 14, lr: 1.28e-04 2022-05-09 05:18:39,510 INFO [train.py:715] (0/8) Epoch 17, batch 34050, loss[loss=0.1564, simple_loss=0.2214, pruned_loss=0.04575, over 4906.00 frames.], tot_loss[loss=0.1328, simple_loss=0.207, pruned_loss=0.0293, over 973313.40 frames.], batch size: 17, lr: 1.28e-04 2022-05-09 05:19:19,502 INFO [train.py:715] (0/8) Epoch 17, batch 34100, loss[loss=0.136, simple_loss=0.2115, pruned_loss=0.03029, over 4896.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2074, pruned_loss=0.02938, over 972327.19 frames.], batch size: 39, lr: 1.28e-04 2022-05-09 05:19:58,305 INFO [train.py:715] (0/8) Epoch 17, batch 34150, loss[loss=0.1501, simple_loss=0.2291, pruned_loss=0.03557, over 4867.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2084, pruned_loss=0.02956, over 971302.75 frames.], batch size: 16, lr: 1.28e-04 2022-05-09 05:20:37,451 INFO [train.py:715] (0/8) Epoch 17, batch 34200, loss[loss=0.1279, simple_loss=0.1949, pruned_loss=0.03041, over 4988.00 frames.], tot_loss[loss=0.134, simple_loss=0.2083, pruned_loss=0.02986, over 972649.14 frames.], batch size: 31, lr: 1.28e-04 2022-05-09 05:21:16,559 INFO [train.py:715] (0/8) Epoch 17, batch 34250, loss[loss=0.1271, simple_loss=0.1907, pruned_loss=0.03179, over 4887.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2076, pruned_loss=0.02942, over 973602.03 frames.], batch size: 32, lr: 1.28e-04 2022-05-09 05:21:55,279 INFO [train.py:715] (0/8) Epoch 17, batch 34300, loss[loss=0.1361, simple_loss=0.2029, pruned_loss=0.03469, over 4779.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2071, pruned_loss=0.02936, over 973609.18 frames.], batch size: 14, lr: 1.28e-04 2022-05-09 05:22:34,165 INFO [train.py:715] (0/8) 
Epoch 17, batch 34350, loss[loss=0.149, simple_loss=0.2238, pruned_loss=0.03705, over 4992.00 frames.], tot_loss[loss=0.1329, simple_loss=0.207, pruned_loss=0.02945, over 973351.13 frames.], batch size: 20, lr: 1.28e-04 2022-05-09 05:23:13,524 INFO [train.py:715] (0/8) Epoch 17, batch 34400, loss[loss=0.1407, simple_loss=0.2152, pruned_loss=0.03307, over 4768.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2075, pruned_loss=0.0294, over 972845.15 frames.], batch size: 19, lr: 1.28e-04 2022-05-09 05:23:52,514 INFO [train.py:715] (0/8) Epoch 17, batch 34450, loss[loss=0.1349, simple_loss=0.2155, pruned_loss=0.02712, over 4864.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2066, pruned_loss=0.02931, over 972951.01 frames.], batch size: 20, lr: 1.28e-04 2022-05-09 05:24:30,969 INFO [train.py:715] (0/8) Epoch 17, batch 34500, loss[loss=0.135, simple_loss=0.2063, pruned_loss=0.03186, over 4988.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2062, pruned_loss=0.02896, over 973531.03 frames.], batch size: 16, lr: 1.28e-04 2022-05-09 05:25:09,846 INFO [train.py:715] (0/8) Epoch 17, batch 34550, loss[loss=0.1226, simple_loss=0.2035, pruned_loss=0.02084, over 4836.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2063, pruned_loss=0.0287, over 972640.19 frames.], batch size: 25, lr: 1.28e-04 2022-05-09 05:25:48,995 INFO [train.py:715] (0/8) Epoch 17, batch 34600, loss[loss=0.1548, simple_loss=0.2341, pruned_loss=0.03772, over 4966.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2068, pruned_loss=0.02891, over 972493.50 frames.], batch size: 15, lr: 1.28e-04 2022-05-09 05:26:27,693 INFO [train.py:715] (0/8) Epoch 17, batch 34650, loss[loss=0.133, simple_loss=0.2156, pruned_loss=0.02518, over 4782.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2059, pruned_loss=0.02877, over 971953.18 frames.], batch size: 17, lr: 1.28e-04 2022-05-09 05:27:06,961 INFO [train.py:715] (0/8) Epoch 17, batch 34700, loss[loss=0.1623, simple_loss=0.2324, pruned_loss=0.04605, over 4877.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2064, pruned_loss=0.0291, over 972249.39 frames.], batch size: 16, lr: 1.28e-04 2022-05-09 05:27:45,509 INFO [train.py:715] (0/8) Epoch 17, batch 34750, loss[loss=0.1313, simple_loss=0.1958, pruned_loss=0.03343, over 4834.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2054, pruned_loss=0.02884, over 971473.27 frames.], batch size: 15, lr: 1.28e-04 2022-05-09 05:28:22,198 INFO [train.py:715] (0/8) Epoch 17, batch 34800, loss[loss=0.1156, simple_loss=0.1855, pruned_loss=0.02283, over 4785.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2056, pruned_loss=0.02885, over 971421.42 frames.], batch size: 12, lr: 1.28e-04 2022-05-09 05:28:31,602 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/epoch-17.pt 2022-05-09 05:29:12,356 INFO [train.py:715] (0/8) Epoch 18, batch 0, loss[loss=0.1378, simple_loss=0.218, pruned_loss=0.02883, over 4814.00 frames.], tot_loss[loss=0.1378, simple_loss=0.218, pruned_loss=0.02883, over 4814.00 frames.], batch size: 27, lr: 1.25e-04 2022-05-09 05:29:51,056 INFO [train.py:715] (0/8) Epoch 18, batch 50, loss[loss=0.1526, simple_loss=0.2258, pruned_loss=0.03966, over 4861.00 frames.], tot_loss[loss=0.131, simple_loss=0.2041, pruned_loss=0.02895, over 220279.26 frames.], batch size: 16, lr: 1.25e-04 2022-05-09 05:30:31,042 INFO [train.py:715] (0/8) Epoch 18, batch 100, loss[loss=0.1394, simple_loss=0.2136, pruned_loss=0.03256, over 4855.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2049, pruned_loss=0.02976, over 387021.76 
frames.], batch size: 32, lr: 1.25e-04 2022-05-09 05:31:10,957 INFO [train.py:715] (0/8) Epoch 18, batch 150, loss[loss=0.1253, simple_loss=0.1954, pruned_loss=0.02759, over 4896.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2055, pruned_loss=0.02902, over 517115.64 frames.], batch size: 38, lr: 1.25e-04 2022-05-09 05:31:50,256 INFO [train.py:715] (0/8) Epoch 18, batch 200, loss[loss=0.1187, simple_loss=0.1964, pruned_loss=0.02045, over 4966.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2062, pruned_loss=0.02896, over 618728.76 frames.], batch size: 24, lr: 1.25e-04 2022-05-09 05:32:29,110 INFO [train.py:715] (0/8) Epoch 18, batch 250, loss[loss=0.1426, simple_loss=0.2176, pruned_loss=0.03387, over 4881.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2076, pruned_loss=0.02916, over 697495.15 frames.], batch size: 19, lr: 1.25e-04 2022-05-09 05:33:08,561 INFO [train.py:715] (0/8) Epoch 18, batch 300, loss[loss=0.1465, simple_loss=0.2148, pruned_loss=0.03908, over 4846.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2079, pruned_loss=0.02955, over 757846.67 frames.], batch size: 13, lr: 1.25e-04 2022-05-09 05:33:48,412 INFO [train.py:715] (0/8) Epoch 18, batch 350, loss[loss=0.2478, simple_loss=0.3178, pruned_loss=0.0889, over 4963.00 frames.], tot_loss[loss=0.1334, simple_loss=0.208, pruned_loss=0.02945, over 805309.72 frames.], batch size: 39, lr: 1.25e-04 2022-05-09 05:34:27,357 INFO [train.py:715] (0/8) Epoch 18, batch 400, loss[loss=0.1337, simple_loss=0.2058, pruned_loss=0.03084, over 4773.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2071, pruned_loss=0.02868, over 841782.71 frames.], batch size: 17, lr: 1.25e-04 2022-05-09 05:35:07,146 INFO [train.py:715] (0/8) Epoch 18, batch 450, loss[loss=0.1258, simple_loss=0.1965, pruned_loss=0.02753, over 4984.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2071, pruned_loss=0.02857, over 871366.83 frames.], batch size: 35, lr: 1.25e-04 2022-05-09 05:35:47,327 INFO [train.py:715] (0/8) Epoch 18, batch 500, loss[loss=0.1083, simple_loss=0.1835, pruned_loss=0.01658, over 4760.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2066, pruned_loss=0.02878, over 894779.32 frames.], batch size: 19, lr: 1.25e-04 2022-05-09 05:36:27,089 INFO [train.py:715] (0/8) Epoch 18, batch 550, loss[loss=0.1548, simple_loss=0.2373, pruned_loss=0.03619, over 4892.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2071, pruned_loss=0.02899, over 911994.48 frames.], batch size: 19, lr: 1.25e-04 2022-05-09 05:37:06,104 INFO [train.py:715] (0/8) Epoch 18, batch 600, loss[loss=0.1761, simple_loss=0.2514, pruned_loss=0.05042, over 4794.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2071, pruned_loss=0.02912, over 924941.59 frames.], batch size: 24, lr: 1.25e-04 2022-05-09 05:37:45,636 INFO [train.py:715] (0/8) Epoch 18, batch 650, loss[loss=0.134, simple_loss=0.2099, pruned_loss=0.0291, over 4807.00 frames.], tot_loss[loss=0.1328, simple_loss=0.207, pruned_loss=0.02927, over 935234.94 frames.], batch size: 21, lr: 1.25e-04 2022-05-09 05:38:25,474 INFO [train.py:715] (0/8) Epoch 18, batch 700, loss[loss=0.1417, simple_loss=0.2247, pruned_loss=0.02934, over 4865.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2063, pruned_loss=0.0289, over 943814.78 frames.], batch size: 32, lr: 1.25e-04 2022-05-09 05:39:04,428 INFO [train.py:715] (0/8) Epoch 18, batch 750, loss[loss=0.1375, simple_loss=0.2066, pruned_loss=0.03424, over 4759.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2058, pruned_loss=0.02906, over 949647.84 frames.], batch size: 19, lr: 1.25e-04 2022-05-09 
05:39:43,254 INFO [train.py:715] (0/8) Epoch 18, batch 800, loss[loss=0.1074, simple_loss=0.1759, pruned_loss=0.01949, over 4803.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2058, pruned_loss=0.02896, over 954654.22 frames.], batch size: 21, lr: 1.25e-04 2022-05-09 05:40:22,747 INFO [train.py:715] (0/8) Epoch 18, batch 850, loss[loss=0.1374, simple_loss=0.2108, pruned_loss=0.03197, over 4771.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2056, pruned_loss=0.02875, over 958399.18 frames.], batch size: 14, lr: 1.25e-04 2022-05-09 05:41:02,297 INFO [train.py:715] (0/8) Epoch 18, batch 900, loss[loss=0.1271, simple_loss=0.1975, pruned_loss=0.02837, over 4871.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2055, pruned_loss=0.02864, over 961641.15 frames.], batch size: 22, lr: 1.25e-04 2022-05-09 05:41:41,276 INFO [train.py:715] (0/8) Epoch 18, batch 950, loss[loss=0.1514, simple_loss=0.2295, pruned_loss=0.03663, over 4931.00 frames.], tot_loss[loss=0.1317, simple_loss=0.206, pruned_loss=0.02866, over 964407.28 frames.], batch size: 29, lr: 1.25e-04 2022-05-09 05:42:20,887 INFO [train.py:715] (0/8) Epoch 18, batch 1000, loss[loss=0.1381, simple_loss=0.2216, pruned_loss=0.02731, over 4951.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2064, pruned_loss=0.02884, over 967209.33 frames.], batch size: 21, lr: 1.25e-04 2022-05-09 05:43:00,526 INFO [train.py:715] (0/8) Epoch 18, batch 1050, loss[loss=0.1262, simple_loss=0.2061, pruned_loss=0.02317, over 4857.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2063, pruned_loss=0.02867, over 967531.06 frames.], batch size: 20, lr: 1.25e-04 2022-05-09 05:43:39,929 INFO [train.py:715] (0/8) Epoch 18, batch 1100, loss[loss=0.1081, simple_loss=0.1873, pruned_loss=0.01451, over 4833.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2066, pruned_loss=0.02939, over 969037.08 frames.], batch size: 26, lr: 1.25e-04 2022-05-09 05:44:18,724 INFO [train.py:715] (0/8) Epoch 18, batch 1150, loss[loss=0.1348, simple_loss=0.2007, pruned_loss=0.03446, over 4941.00 frames.], tot_loss[loss=0.132, simple_loss=0.206, pruned_loss=0.02896, over 970033.40 frames.], batch size: 35, lr: 1.25e-04 2022-05-09 05:44:58,550 INFO [train.py:715] (0/8) Epoch 18, batch 1200, loss[loss=0.1111, simple_loss=0.1877, pruned_loss=0.01728, over 4906.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2069, pruned_loss=0.0291, over 970051.52 frames.], batch size: 17, lr: 1.25e-04 2022-05-09 05:45:38,521 INFO [train.py:715] (0/8) Epoch 18, batch 1250, loss[loss=0.1161, simple_loss=0.1907, pruned_loss=0.02071, over 4790.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2071, pruned_loss=0.0289, over 970002.12 frames.], batch size: 12, lr: 1.25e-04 2022-05-09 05:46:17,551 INFO [train.py:715] (0/8) Epoch 18, batch 1300, loss[loss=0.1377, simple_loss=0.2049, pruned_loss=0.03523, over 4860.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2074, pruned_loss=0.02894, over 971351.15 frames.], batch size: 30, lr: 1.25e-04 2022-05-09 05:46:56,370 INFO [train.py:715] (0/8) Epoch 18, batch 1350, loss[loss=0.143, simple_loss=0.217, pruned_loss=0.0345, over 4808.00 frames.], tot_loss[loss=0.1327, simple_loss=0.207, pruned_loss=0.02917, over 971200.39 frames.], batch size: 15, lr: 1.25e-04 2022-05-09 05:47:35,779 INFO [train.py:715] (0/8) Epoch 18, batch 1400, loss[loss=0.1177, simple_loss=0.19, pruned_loss=0.02266, over 4921.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2063, pruned_loss=0.02857, over 971336.00 frames.], batch size: 23, lr: 1.25e-04 2022-05-09 05:48:15,004 INFO [train.py:715] (0/8) Epoch 
18, batch 1450, loss[loss=0.1719, simple_loss=0.2387, pruned_loss=0.05254, over 4978.00 frames.], tot_loss[loss=0.1312, simple_loss=0.206, pruned_loss=0.02816, over 971167.38 frames.], batch size: 24, lr: 1.25e-04 2022-05-09 05:48:53,402 INFO [train.py:715] (0/8) Epoch 18, batch 1500, loss[loss=0.1178, simple_loss=0.1949, pruned_loss=0.02036, over 4832.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2068, pruned_loss=0.02833, over 972043.37 frames.], batch size: 30, lr: 1.25e-04 2022-05-09 05:49:32,904 INFO [train.py:715] (0/8) Epoch 18, batch 1550, loss[loss=0.1268, simple_loss=0.2114, pruned_loss=0.0211, over 4777.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2074, pruned_loss=0.02878, over 971858.10 frames.], batch size: 18, lr: 1.25e-04 2022-05-09 05:50:12,320 INFO [train.py:715] (0/8) Epoch 18, batch 1600, loss[loss=0.1627, simple_loss=0.2277, pruned_loss=0.04881, over 4948.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2073, pruned_loss=0.02875, over 972223.69 frames.], batch size: 35, lr: 1.25e-04 2022-05-09 05:50:51,518 INFO [train.py:715] (0/8) Epoch 18, batch 1650, loss[loss=0.1718, simple_loss=0.2372, pruned_loss=0.0532, over 4869.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2073, pruned_loss=0.02881, over 971658.59 frames.], batch size: 16, lr: 1.25e-04 2022-05-09 05:51:30,464 INFO [train.py:715] (0/8) Epoch 18, batch 1700, loss[loss=0.135, simple_loss=0.2182, pruned_loss=0.02585, over 4945.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2081, pruned_loss=0.02924, over 972089.00 frames.], batch size: 15, lr: 1.25e-04 2022-05-09 05:52:09,882 INFO [train.py:715] (0/8) Epoch 18, batch 1750, loss[loss=0.1171, simple_loss=0.1966, pruned_loss=0.01881, over 4812.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2072, pruned_loss=0.02852, over 971828.10 frames.], batch size: 21, lr: 1.25e-04 2022-05-09 05:52:49,170 INFO [train.py:715] (0/8) Epoch 18, batch 1800, loss[loss=0.1553, simple_loss=0.2185, pruned_loss=0.04603, over 4984.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2069, pruned_loss=0.02907, over 971871.74 frames.], batch size: 39, lr: 1.25e-04 2022-05-09 05:53:27,454 INFO [train.py:715] (0/8) Epoch 18, batch 1850, loss[loss=0.1402, simple_loss=0.1948, pruned_loss=0.0428, over 4853.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2072, pruned_loss=0.02914, over 971579.61 frames.], batch size: 30, lr: 1.25e-04 2022-05-09 05:54:06,243 INFO [train.py:715] (0/8) Epoch 18, batch 1900, loss[loss=0.1327, simple_loss=0.2101, pruned_loss=0.02766, over 4987.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2068, pruned_loss=0.02883, over 972776.81 frames.], batch size: 31, lr: 1.25e-04 2022-05-09 05:54:45,621 INFO [train.py:715] (0/8) Epoch 18, batch 1950, loss[loss=0.1339, simple_loss=0.1961, pruned_loss=0.03582, over 4834.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2063, pruned_loss=0.02879, over 973450.37 frames.], batch size: 13, lr: 1.25e-04 2022-05-09 05:55:24,353 INFO [train.py:715] (0/8) Epoch 18, batch 2000, loss[loss=0.1096, simple_loss=0.1826, pruned_loss=0.0183, over 4871.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2055, pruned_loss=0.02845, over 973711.28 frames.], batch size: 32, lr: 1.25e-04 2022-05-09 05:56:02,841 INFO [train.py:715] (0/8) Epoch 18, batch 2050, loss[loss=0.1196, simple_loss=0.192, pruned_loss=0.0236, over 4906.00 frames.], tot_loss[loss=0.131, simple_loss=0.2054, pruned_loss=0.02835, over 974435.02 frames.], batch size: 17, lr: 1.25e-04 2022-05-09 05:56:42,079 INFO [train.py:715] (0/8) Epoch 18, batch 2100, loss[loss=0.1183, 
simple_loss=0.1964, pruned_loss=0.02009, over 4950.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2061, pruned_loss=0.02884, over 974497.70 frames.], batch size: 21, lr: 1.25e-04 2022-05-09 05:57:21,525 INFO [train.py:715] (0/8) Epoch 18, batch 2150, loss[loss=0.1376, simple_loss=0.2009, pruned_loss=0.03712, over 4696.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2069, pruned_loss=0.0292, over 973288.68 frames.], batch size: 15, lr: 1.25e-04 2022-05-09 05:57:59,831 INFO [train.py:715] (0/8) Epoch 18, batch 2200, loss[loss=0.1357, simple_loss=0.2137, pruned_loss=0.02885, over 4977.00 frames.], tot_loss[loss=0.132, simple_loss=0.2065, pruned_loss=0.02878, over 973196.65 frames.], batch size: 25, lr: 1.25e-04 2022-05-09 05:58:39,478 INFO [train.py:715] (0/8) Epoch 18, batch 2250, loss[loss=0.1611, simple_loss=0.2437, pruned_loss=0.03929, over 4862.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2072, pruned_loss=0.02927, over 973465.09 frames.], batch size: 16, lr: 1.25e-04 2022-05-09 05:59:18,828 INFO [train.py:715] (0/8) Epoch 18, batch 2300, loss[loss=0.1297, simple_loss=0.2079, pruned_loss=0.02576, over 4906.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2064, pruned_loss=0.02895, over 972817.17 frames.], batch size: 17, lr: 1.25e-04 2022-05-09 05:59:57,617 INFO [train.py:715] (0/8) Epoch 18, batch 2350, loss[loss=0.1471, simple_loss=0.2243, pruned_loss=0.03496, over 4910.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2057, pruned_loss=0.02862, over 972760.20 frames.], batch size: 17, lr: 1.25e-04 2022-05-09 06:00:36,229 INFO [train.py:715] (0/8) Epoch 18, batch 2400, loss[loss=0.1113, simple_loss=0.1876, pruned_loss=0.01751, over 4985.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2056, pruned_loss=0.02861, over 973046.67 frames.], batch size: 28, lr: 1.25e-04 2022-05-09 06:01:15,692 INFO [train.py:715] (0/8) Epoch 18, batch 2450, loss[loss=0.115, simple_loss=0.1891, pruned_loss=0.02039, over 4982.00 frames.], tot_loss[loss=0.1317, simple_loss=0.206, pruned_loss=0.02871, over 973534.25 frames.], batch size: 25, lr: 1.25e-04 2022-05-09 06:01:55,087 INFO [train.py:715] (0/8) Epoch 18, batch 2500, loss[loss=0.1382, simple_loss=0.2155, pruned_loss=0.03048, over 4847.00 frames.], tot_loss[loss=0.1318, simple_loss=0.206, pruned_loss=0.02882, over 973439.36 frames.], batch size: 20, lr: 1.25e-04 2022-05-09 06:02:33,094 INFO [train.py:715] (0/8) Epoch 18, batch 2550, loss[loss=0.1378, simple_loss=0.2135, pruned_loss=0.03108, over 4912.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2069, pruned_loss=0.02913, over 972966.06 frames.], batch size: 18, lr: 1.25e-04 2022-05-09 06:03:11,866 INFO [train.py:715] (0/8) Epoch 18, batch 2600, loss[loss=0.1379, simple_loss=0.2026, pruned_loss=0.03658, over 4956.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2066, pruned_loss=0.02907, over 973595.75 frames.], batch size: 24, lr: 1.25e-04 2022-05-09 06:03:51,785 INFO [train.py:715] (0/8) Epoch 18, batch 2650, loss[loss=0.123, simple_loss=0.1945, pruned_loss=0.02575, over 4763.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2055, pruned_loss=0.0285, over 973456.68 frames.], batch size: 12, lr: 1.25e-04 2022-05-09 06:04:30,528 INFO [train.py:715] (0/8) Epoch 18, batch 2700, loss[loss=0.132, simple_loss=0.2034, pruned_loss=0.03026, over 4892.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2059, pruned_loss=0.02862, over 972612.51 frames.], batch size: 19, lr: 1.25e-04 2022-05-09 06:05:08,885 INFO [train.py:715] (0/8) Epoch 18, batch 2750, loss[loss=0.1284, simple_loss=0.2032, 
pruned_loss=0.02679, over 4814.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2058, pruned_loss=0.02862, over 972439.37 frames.], batch size: 27, lr: 1.25e-04 2022-05-09 06:05:47,972 INFO [train.py:715] (0/8) Epoch 18, batch 2800, loss[loss=0.1162, simple_loss=0.1874, pruned_loss=0.02252, over 4834.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2061, pruned_loss=0.0288, over 971932.39 frames.], batch size: 12, lr: 1.25e-04 2022-05-09 06:06:27,518 INFO [train.py:715] (0/8) Epoch 18, batch 2850, loss[loss=0.1229, simple_loss=0.1858, pruned_loss=0.02997, over 4800.00 frames.], tot_loss[loss=0.1319, simple_loss=0.206, pruned_loss=0.0289, over 971748.67 frames.], batch size: 12, lr: 1.25e-04 2022-05-09 06:07:06,089 INFO [train.py:715] (0/8) Epoch 18, batch 2900, loss[loss=0.1304, simple_loss=0.2105, pruned_loss=0.02515, over 4975.00 frames.], tot_loss[loss=0.1332, simple_loss=0.207, pruned_loss=0.02965, over 972209.08 frames.], batch size: 24, lr: 1.25e-04 2022-05-09 06:07:44,916 INFO [train.py:715] (0/8) Epoch 18, batch 2950, loss[loss=0.1407, simple_loss=0.2112, pruned_loss=0.03509, over 4749.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2071, pruned_loss=0.02988, over 970922.20 frames.], batch size: 16, lr: 1.25e-04 2022-05-09 06:08:24,282 INFO [train.py:715] (0/8) Epoch 18, batch 3000, loss[loss=0.2274, simple_loss=0.2887, pruned_loss=0.08305, over 4966.00 frames.], tot_loss[loss=0.133, simple_loss=0.2066, pruned_loss=0.0297, over 971408.03 frames.], batch size: 15, lr: 1.25e-04 2022-05-09 06:08:24,282 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 06:08:34,097 INFO [train.py:742] (0/8) Epoch 18, validation: loss=0.1047, simple_loss=0.1881, pruned_loss=0.01065, over 914524.00 frames. 2022-05-09 06:09:14,107 INFO [train.py:715] (0/8) Epoch 18, batch 3050, loss[loss=0.1761, simple_loss=0.2469, pruned_loss=0.05262, over 4830.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2064, pruned_loss=0.02911, over 972164.42 frames.], batch size: 15, lr: 1.25e-04 2022-05-09 06:09:52,622 INFO [train.py:715] (0/8) Epoch 18, batch 3100, loss[loss=0.1328, simple_loss=0.2161, pruned_loss=0.02477, over 4975.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2071, pruned_loss=0.02935, over 973354.75 frames.], batch size: 15, lr: 1.25e-04 2022-05-09 06:10:31,511 INFO [train.py:715] (0/8) Epoch 18, batch 3150, loss[loss=0.1183, simple_loss=0.1951, pruned_loss=0.02075, over 4935.00 frames.], tot_loss[loss=0.132, simple_loss=0.2064, pruned_loss=0.02884, over 973378.57 frames.], batch size: 21, lr: 1.25e-04 2022-05-09 06:11:10,546 INFO [train.py:715] (0/8) Epoch 18, batch 3200, loss[loss=0.1389, simple_loss=0.2157, pruned_loss=0.03103, over 4990.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2068, pruned_loss=0.02887, over 973356.54 frames.], batch size: 24, lr: 1.24e-04 2022-05-09 06:11:50,030 INFO [train.py:715] (0/8) Epoch 18, batch 3250, loss[loss=0.1708, simple_loss=0.2351, pruned_loss=0.05323, over 4936.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2074, pruned_loss=0.02917, over 972965.30 frames.], batch size: 18, lr: 1.24e-04 2022-05-09 06:12:28,193 INFO [train.py:715] (0/8) Epoch 18, batch 3300, loss[loss=0.1202, simple_loss=0.1899, pruned_loss=0.02524, over 4635.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2077, pruned_loss=0.02905, over 973087.65 frames.], batch size: 13, lr: 1.24e-04 2022-05-09 06:13:07,649 INFO [train.py:715] (0/8) Epoch 18, batch 3350, loss[loss=0.1402, simple_loss=0.2218, pruned_loss=0.02929, over 4813.00 frames.], tot_loss[loss=0.1334, simple_loss=0.208, 
pruned_loss=0.02941, over 973283.19 frames.], batch size: 26, lr: 1.24e-04 2022-05-09 06:13:47,789 INFO [train.py:715] (0/8) Epoch 18, batch 3400, loss[loss=0.1146, simple_loss=0.187, pruned_loss=0.02106, over 4873.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2067, pruned_loss=0.02879, over 973809.81 frames.], batch size: 16, lr: 1.24e-04 2022-05-09 06:14:26,394 INFO [train.py:715] (0/8) Epoch 18, batch 3450, loss[loss=0.1242, simple_loss=0.1968, pruned_loss=0.02579, over 4938.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2063, pruned_loss=0.02829, over 973887.94 frames.], batch size: 21, lr: 1.24e-04 2022-05-09 06:15:05,249 INFO [train.py:715] (0/8) Epoch 18, batch 3500, loss[loss=0.1242, simple_loss=0.1985, pruned_loss=0.02495, over 4825.00 frames.], tot_loss[loss=0.132, simple_loss=0.2066, pruned_loss=0.02874, over 973308.97 frames.], batch size: 13, lr: 1.24e-04 2022-05-09 06:15:45,332 INFO [train.py:715] (0/8) Epoch 18, batch 3550, loss[loss=0.1272, simple_loss=0.1942, pruned_loss=0.03013, over 4796.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2067, pruned_loss=0.02902, over 973395.08 frames.], batch size: 14, lr: 1.24e-04 2022-05-09 06:16:24,511 INFO [train.py:715] (0/8) Epoch 18, batch 3600, loss[loss=0.1458, simple_loss=0.2233, pruned_loss=0.03414, over 4898.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2062, pruned_loss=0.02877, over 972992.54 frames.], batch size: 17, lr: 1.24e-04 2022-05-09 06:17:03,260 INFO [train.py:715] (0/8) Epoch 18, batch 3650, loss[loss=0.1251, simple_loss=0.2092, pruned_loss=0.02049, over 4762.00 frames.], tot_loss[loss=0.1315, simple_loss=0.206, pruned_loss=0.02847, over 973178.54 frames.], batch size: 18, lr: 1.24e-04 2022-05-09 06:17:42,725 INFO [train.py:715] (0/8) Epoch 18, batch 3700, loss[loss=0.1428, simple_loss=0.2103, pruned_loss=0.03764, over 4758.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2063, pruned_loss=0.02855, over 973297.17 frames.], batch size: 14, lr: 1.24e-04 2022-05-09 06:18:21,998 INFO [train.py:715] (0/8) Epoch 18, batch 3750, loss[loss=0.1557, simple_loss=0.225, pruned_loss=0.04323, over 4778.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2054, pruned_loss=0.02849, over 972480.55 frames.], batch size: 14, lr: 1.24e-04 2022-05-09 06:18:59,951 INFO [train.py:715] (0/8) Epoch 18, batch 3800, loss[loss=0.1109, simple_loss=0.1839, pruned_loss=0.01891, over 4821.00 frames.], tot_loss[loss=0.1304, simple_loss=0.2046, pruned_loss=0.02814, over 971675.41 frames.], batch size: 12, lr: 1.24e-04 2022-05-09 06:19:39,329 INFO [train.py:715] (0/8) Epoch 18, batch 3850, loss[loss=0.1442, simple_loss=0.2193, pruned_loss=0.03452, over 4766.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2051, pruned_loss=0.02869, over 972062.47 frames.], batch size: 19, lr: 1.24e-04 2022-05-09 06:20:19,351 INFO [train.py:715] (0/8) Epoch 18, batch 3900, loss[loss=0.1392, simple_loss=0.2196, pruned_loss=0.02943, over 4780.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2061, pruned_loss=0.02886, over 971802.26 frames.], batch size: 14, lr: 1.24e-04 2022-05-09 06:20:57,823 INFO [train.py:715] (0/8) Epoch 18, batch 3950, loss[loss=0.1356, simple_loss=0.216, pruned_loss=0.02753, over 4868.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2057, pruned_loss=0.029, over 972007.49 frames.], batch size: 16, lr: 1.24e-04 2022-05-09 06:21:37,234 INFO [train.py:715] (0/8) Epoch 18, batch 4000, loss[loss=0.137, simple_loss=0.2026, pruned_loss=0.0357, over 4848.00 frames.], tot_loss[loss=0.132, simple_loss=0.2059, pruned_loss=0.02909, over 973454.01 
frames.], batch size: 13, lr: 1.24e-04 2022-05-09 06:22:16,732 INFO [train.py:715] (0/8) Epoch 18, batch 4050, loss[loss=0.1345, simple_loss=0.2187, pruned_loss=0.02512, over 4779.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2068, pruned_loss=0.0291, over 973127.09 frames.], batch size: 18, lr: 1.24e-04 2022-05-09 06:22:56,009 INFO [train.py:715] (0/8) Epoch 18, batch 4100, loss[loss=0.1425, simple_loss=0.2109, pruned_loss=0.03709, over 4895.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2076, pruned_loss=0.02933, over 973078.88 frames.], batch size: 19, lr: 1.24e-04 2022-05-09 06:23:34,761 INFO [train.py:715] (0/8) Epoch 18, batch 4150, loss[loss=0.1356, simple_loss=0.2148, pruned_loss=0.0282, over 4779.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2076, pruned_loss=0.02904, over 972748.89 frames.], batch size: 18, lr: 1.24e-04 2022-05-09 06:24:14,198 INFO [train.py:715] (0/8) Epoch 18, batch 4200, loss[loss=0.1349, simple_loss=0.2124, pruned_loss=0.02875, over 4903.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2068, pruned_loss=0.02896, over 972531.86 frames.], batch size: 17, lr: 1.24e-04 2022-05-09 06:24:53,578 INFO [train.py:715] (0/8) Epoch 18, batch 4250, loss[loss=0.1169, simple_loss=0.1878, pruned_loss=0.02301, over 4850.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2069, pruned_loss=0.02925, over 973558.77 frames.], batch size: 34, lr: 1.24e-04 2022-05-09 06:25:32,488 INFO [train.py:715] (0/8) Epoch 18, batch 4300, loss[loss=0.1164, simple_loss=0.2007, pruned_loss=0.01604, over 4900.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2067, pruned_loss=0.02891, over 972884.84 frames.], batch size: 19, lr: 1.24e-04 2022-05-09 06:26:12,600 INFO [train.py:715] (0/8) Epoch 18, batch 4350, loss[loss=0.1386, simple_loss=0.2175, pruned_loss=0.02987, over 4913.00 frames.], tot_loss[loss=0.1313, simple_loss=0.206, pruned_loss=0.02831, over 973275.64 frames.], batch size: 18, lr: 1.24e-04 2022-05-09 06:26:52,058 INFO [train.py:715] (0/8) Epoch 18, batch 4400, loss[loss=0.1242, simple_loss=0.2062, pruned_loss=0.0211, over 4899.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2053, pruned_loss=0.028, over 973164.77 frames.], batch size: 17, lr: 1.24e-04 2022-05-09 06:27:31,550 INFO [train.py:715] (0/8) Epoch 18, batch 4450, loss[loss=0.1109, simple_loss=0.1897, pruned_loss=0.01606, over 4787.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2052, pruned_loss=0.02808, over 972821.68 frames.], batch size: 12, lr: 1.24e-04 2022-05-09 06:28:09,901 INFO [train.py:715] (0/8) Epoch 18, batch 4500, loss[loss=0.1382, simple_loss=0.2086, pruned_loss=0.03392, over 4760.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2059, pruned_loss=0.02872, over 973154.16 frames.], batch size: 14, lr: 1.24e-04 2022-05-09 06:28:49,167 INFO [train.py:715] (0/8) Epoch 18, batch 4550, loss[loss=0.1667, simple_loss=0.2346, pruned_loss=0.04935, over 4820.00 frames.], tot_loss[loss=0.132, simple_loss=0.2062, pruned_loss=0.02891, over 973874.55 frames.], batch size: 27, lr: 1.24e-04 2022-05-09 06:29:29,011 INFO [train.py:715] (0/8) Epoch 18, batch 4600, loss[loss=0.1463, simple_loss=0.2208, pruned_loss=0.03585, over 4973.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2061, pruned_loss=0.02871, over 973736.15 frames.], batch size: 28, lr: 1.24e-04 2022-05-09 06:30:07,895 INFO [train.py:715] (0/8) Epoch 18, batch 4650, loss[loss=0.1454, simple_loss=0.2187, pruned_loss=0.03604, over 4993.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2065, pruned_loss=0.02904, over 974639.32 frames.], batch size: 16, lr: 1.24e-04 
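Note how the learning rate in these entries decays on two timescales: it drifts down slowly within an epoch (1.25e-04 at the start of epoch 18, 1.24e-04 a few thousand batches later) and steps down more sharply at each epoch boundary (1.28e-04 at the end of epoch 17 versus 1.25e-04 at the start of epoch 18). A minimal sketch of an Eden-style schedule, as used in icefall's pruned transducer recipes, is shown below; treat the exact rule and its hyper-parameters as an assumption about this run rather than something stated in the log:

```python
# Sketch of an Eden-style learning-rate rule (assumed, not read from this log):
#   lr = initial_lr * ((batch^2 + B^2)/B^2)^-0.25 * ((epoch^2 + E^2)/E^2)^-0.25
# where B (lr_batches) and E (lr_epochs) are recipe options.
def eden_lr(initial_lr: float, batch: int, epoch: int,
            lr_batches: float, lr_epochs: float) -> float:
    batch_factor = ((batch ** 2 + lr_batches ** 2) / lr_batches ** 2) ** -0.25
    epoch_factor = ((epoch ** 2 + lr_epochs ** 2) / lr_epochs ** 2) ** -0.25
    return initial_lr * batch_factor * epoch_factor

# Once batch >> lr_batches the batch factor changes very slowly (the
# 1.25e-04 -> 1.24e-04 drift above), while the epoch factor drops once
# per epoch (the step seen at the epoch 17 / epoch 18 boundary).
```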
2022-05-09 06:30:47,011 INFO [train.py:715] (0/8) Epoch 18, batch 4700, loss[loss=0.1434, simple_loss=0.2112, pruned_loss=0.03777, over 4964.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2064, pruned_loss=0.02958, over 974224.47 frames.], batch size: 15, lr: 1.24e-04 2022-05-09 06:31:26,065 INFO [train.py:715] (0/8) Epoch 18, batch 4750, loss[loss=0.1402, simple_loss=0.2201, pruned_loss=0.03017, over 4909.00 frames.], tot_loss[loss=0.1334, simple_loss=0.207, pruned_loss=0.02993, over 973230.73 frames.], batch size: 39, lr: 1.24e-04 2022-05-09 06:32:06,181 INFO [train.py:715] (0/8) Epoch 18, batch 4800, loss[loss=0.1607, simple_loss=0.228, pruned_loss=0.04669, over 4927.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2075, pruned_loss=0.03005, over 973605.87 frames.], batch size: 23, lr: 1.24e-04 2022-05-09 06:32:44,915 INFO [train.py:715] (0/8) Epoch 18, batch 4850, loss[loss=0.1188, simple_loss=0.2011, pruned_loss=0.0182, over 4885.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2068, pruned_loss=0.02951, over 972797.97 frames.], batch size: 19, lr: 1.24e-04 2022-05-09 06:33:24,374 INFO [train.py:715] (0/8) Epoch 18, batch 4900, loss[loss=0.1112, simple_loss=0.1781, pruned_loss=0.02214, over 4832.00 frames.], tot_loss[loss=0.1321, simple_loss=0.206, pruned_loss=0.02914, over 972661.70 frames.], batch size: 13, lr: 1.24e-04 2022-05-09 06:34:04,557 INFO [train.py:715] (0/8) Epoch 18, batch 4950, loss[loss=0.1207, simple_loss=0.1978, pruned_loss=0.0218, over 4741.00 frames.], tot_loss[loss=0.1319, simple_loss=0.206, pruned_loss=0.02895, over 972040.21 frames.], batch size: 16, lr: 1.24e-04 2022-05-09 06:34:43,673 INFO [train.py:715] (0/8) Epoch 18, batch 5000, loss[loss=0.1175, simple_loss=0.1966, pruned_loss=0.0192, over 4685.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2058, pruned_loss=0.02857, over 971581.15 frames.], batch size: 15, lr: 1.24e-04 2022-05-09 06:35:22,353 INFO [train.py:715] (0/8) Epoch 18, batch 5050, loss[loss=0.1348, simple_loss=0.2131, pruned_loss=0.02829, over 4907.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2056, pruned_loss=0.02867, over 971550.87 frames.], batch size: 18, lr: 1.24e-04 2022-05-09 06:36:01,524 INFO [train.py:715] (0/8) Epoch 18, batch 5100, loss[loss=0.121, simple_loss=0.1914, pruned_loss=0.02528, over 4880.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2065, pruned_loss=0.02884, over 970902.59 frames.], batch size: 32, lr: 1.24e-04 2022-05-09 06:36:41,085 INFO [train.py:715] (0/8) Epoch 18, batch 5150, loss[loss=0.1026, simple_loss=0.1775, pruned_loss=0.01387, over 4987.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2061, pruned_loss=0.02875, over 970919.94 frames.], batch size: 28, lr: 1.24e-04 2022-05-09 06:37:19,650 INFO [train.py:715] (0/8) Epoch 18, batch 5200, loss[loss=0.1121, simple_loss=0.1828, pruned_loss=0.02065, over 4934.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2064, pruned_loss=0.02887, over 971743.01 frames.], batch size: 23, lr: 1.24e-04 2022-05-09 06:37:59,019 INFO [train.py:715] (0/8) Epoch 18, batch 5250, loss[loss=0.1207, simple_loss=0.1935, pruned_loss=0.02389, over 4892.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2065, pruned_loss=0.02898, over 971700.63 frames.], batch size: 16, lr: 1.24e-04 2022-05-09 06:38:38,939 INFO [train.py:715] (0/8) Epoch 18, batch 5300, loss[loss=0.09819, simple_loss=0.1695, pruned_loss=0.01342, over 4773.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2067, pruned_loss=0.02896, over 972146.75 frames.], batch size: 12, lr: 1.24e-04 2022-05-09 06:39:18,962 INFO 
[train.py:715] (0/8) Epoch 18, batch 5350, loss[loss=0.1285, simple_loss=0.2006, pruned_loss=0.02817, over 4783.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2057, pruned_loss=0.02856, over 972129.23 frames.], batch size: 17, lr: 1.24e-04 2022-05-09 06:39:57,057 INFO [train.py:715] (0/8) Epoch 18, batch 5400, loss[loss=0.1068, simple_loss=0.1721, pruned_loss=0.02072, over 4691.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2061, pruned_loss=0.02838, over 972406.78 frames.], batch size: 12, lr: 1.24e-04 2022-05-09 06:40:22,256 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-632000.pt 2022-05-09 06:40:38,714 INFO [train.py:715] (0/8) Epoch 18, batch 5450, loss[loss=0.139, simple_loss=0.2069, pruned_loss=0.03553, over 4829.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2064, pruned_loss=0.02847, over 972645.97 frames.], batch size: 13, lr: 1.24e-04 2022-05-09 06:41:19,093 INFO [train.py:715] (0/8) Epoch 18, batch 5500, loss[loss=0.1063, simple_loss=0.1771, pruned_loss=0.01775, over 4756.00 frames.], tot_loss[loss=0.132, simple_loss=0.2068, pruned_loss=0.02856, over 973126.41 frames.], batch size: 16, lr: 1.24e-04 2022-05-09 06:41:58,076 INFO [train.py:715] (0/8) Epoch 18, batch 5550, loss[loss=0.1438, simple_loss=0.2102, pruned_loss=0.03864, over 4967.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2071, pruned_loss=0.02908, over 972182.49 frames.], batch size: 35, lr: 1.24e-04 2022-05-09 06:42:36,881 INFO [train.py:715] (0/8) Epoch 18, batch 5600, loss[loss=0.1494, simple_loss=0.2085, pruned_loss=0.04513, over 4758.00 frames.], tot_loss[loss=0.133, simple_loss=0.2075, pruned_loss=0.0293, over 972735.65 frames.], batch size: 14, lr: 1.24e-04 2022-05-09 06:43:15,923 INFO [train.py:715] (0/8) Epoch 18, batch 5650, loss[loss=0.1261, simple_loss=0.2004, pruned_loss=0.02593, over 4977.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2078, pruned_loss=0.02944, over 973171.40 frames.], batch size: 24, lr: 1.24e-04 2022-05-09 06:43:55,546 INFO [train.py:715] (0/8) Epoch 18, batch 5700, loss[loss=0.1203, simple_loss=0.1883, pruned_loss=0.02617, over 4977.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2077, pruned_loss=0.02955, over 973053.22 frames.], batch size: 14, lr: 1.24e-04 2022-05-09 06:44:33,667 INFO [train.py:715] (0/8) Epoch 18, batch 5750, loss[loss=0.1294, simple_loss=0.2014, pruned_loss=0.02864, over 4980.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2072, pruned_loss=0.02924, over 973385.38 frames.], batch size: 25, lr: 1.24e-04 2022-05-09 06:45:12,598 INFO [train.py:715] (0/8) Epoch 18, batch 5800, loss[loss=0.1419, simple_loss=0.2134, pruned_loss=0.03525, over 4692.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2075, pruned_loss=0.02986, over 973630.46 frames.], batch size: 15, lr: 1.24e-04 2022-05-09 06:45:52,337 INFO [train.py:715] (0/8) Epoch 18, batch 5850, loss[loss=0.125, simple_loss=0.2063, pruned_loss=0.0219, over 4822.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2079, pruned_loss=0.02987, over 974012.79 frames.], batch size: 27, lr: 1.24e-04 2022-05-09 06:46:31,447 INFO [train.py:715] (0/8) Epoch 18, batch 5900, loss[loss=0.1427, simple_loss=0.2206, pruned_loss=0.03243, over 4738.00 frames.], tot_loss[loss=0.1347, simple_loss=0.2091, pruned_loss=0.0302, over 973240.17 frames.], batch size: 16, lr: 1.24e-04 2022-05-09 06:47:10,141 INFO [train.py:715] (0/8) Epoch 18, batch 5950, loss[loss=0.1434, simple_loss=0.2203, pruned_loss=0.03326, over 4737.00 frames.], tot_loss[loss=0.1346, simple_loss=0.2089, 
pruned_loss=0.03009, over 973352.49 frames.], batch size: 16, lr: 1.24e-04 2022-05-09 06:47:49,547 INFO [train.py:715] (0/8) Epoch 18, batch 6000, loss[loss=0.1109, simple_loss=0.1927, pruned_loss=0.01455, over 4815.00 frames.], tot_loss[loss=0.134, simple_loss=0.2084, pruned_loss=0.02984, over 973846.27 frames.], batch size: 26, lr: 1.24e-04 2022-05-09 06:47:49,548 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 06:47:59,476 INFO [train.py:742] (0/8) Epoch 18, validation: loss=0.1047, simple_loss=0.188, pruned_loss=0.01075, over 914524.00 frames. 2022-05-09 06:48:39,111 INFO [train.py:715] (0/8) Epoch 18, batch 6050, loss[loss=0.151, simple_loss=0.2154, pruned_loss=0.04325, over 4971.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2083, pruned_loss=0.02973, over 973255.22 frames.], batch size: 35, lr: 1.24e-04 2022-05-09 06:49:18,283 INFO [train.py:715] (0/8) Epoch 18, batch 6100, loss[loss=0.1205, simple_loss=0.1903, pruned_loss=0.02538, over 4806.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2079, pruned_loss=0.02959, over 973363.63 frames.], batch size: 14, lr: 1.24e-04 2022-05-09 06:49:56,625 INFO [train.py:715] (0/8) Epoch 18, batch 6150, loss[loss=0.1304, simple_loss=0.2013, pruned_loss=0.02979, over 4914.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2075, pruned_loss=0.02912, over 973396.24 frames.], batch size: 17, lr: 1.24e-04 2022-05-09 06:50:35,918 INFO [train.py:715] (0/8) Epoch 18, batch 6200, loss[loss=0.09825, simple_loss=0.168, pruned_loss=0.01424, over 4845.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2069, pruned_loss=0.02907, over 972551.73 frames.], batch size: 13, lr: 1.24e-04 2022-05-09 06:51:15,497 INFO [train.py:715] (0/8) Epoch 18, batch 6250, loss[loss=0.1448, simple_loss=0.2207, pruned_loss=0.0344, over 4736.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2067, pruned_loss=0.02892, over 972171.25 frames.], batch size: 16, lr: 1.24e-04 2022-05-09 06:51:54,529 INFO [train.py:715] (0/8) Epoch 18, batch 6300, loss[loss=0.1333, simple_loss=0.2169, pruned_loss=0.02487, over 4847.00 frames.], tot_loss[loss=0.132, simple_loss=0.2064, pruned_loss=0.02885, over 972769.42 frames.], batch size: 20, lr: 1.24e-04 2022-05-09 06:52:33,701 INFO [train.py:715] (0/8) Epoch 18, batch 6350, loss[loss=0.1428, simple_loss=0.2106, pruned_loss=0.03749, over 4980.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2063, pruned_loss=0.02938, over 972740.59 frames.], batch size: 14, lr: 1.24e-04 2022-05-09 06:53:12,891 INFO [train.py:715] (0/8) Epoch 18, batch 6400, loss[loss=0.1042, simple_loss=0.1692, pruned_loss=0.01959, over 4760.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2064, pruned_loss=0.02969, over 972797.77 frames.], batch size: 12, lr: 1.24e-04 2022-05-09 06:53:52,074 INFO [train.py:715] (0/8) Epoch 18, batch 6450, loss[loss=0.1275, simple_loss=0.1941, pruned_loss=0.03042, over 4963.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2062, pruned_loss=0.02956, over 973135.37 frames.], batch size: 35, lr: 1.24e-04 2022-05-09 06:54:30,356 INFO [train.py:715] (0/8) Epoch 18, batch 6500, loss[loss=0.1282, simple_loss=0.1969, pruned_loss=0.02972, over 4642.00 frames.], tot_loss[loss=0.1322, simple_loss=0.206, pruned_loss=0.02922, over 973026.83 frames.], batch size: 13, lr: 1.24e-04 2022-05-09 06:55:08,636 INFO [train.py:715] (0/8) Epoch 18, batch 6550, loss[loss=0.1456, simple_loss=0.2156, pruned_loss=0.0378, over 4845.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2063, pruned_loss=0.02907, over 972406.35 frames.], batch size: 32, lr: 1.24e-04 
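[Editor's note] Each per-batch record above reports three figures: loss, simple_loss and pruned_loss. In this stretch of the log the combined value is consistent with loss ≈ 0.5 * simple_loss + pruned_loss (for example, 0.5 * 0.2064 + 0.02958 ≈ 0.1328 at batch 4700, and 0.5 * 0.188 + 0.01075 ≈ 0.1047 for the validation record above). The check below only re-derives the combined number from values already printed in this log; the 0.5 weighting is inferred from this arithmetic, not quoted from train.py.

# Sanity check on the loss decomposition as it appears in this log:
#   loss ≈ 0.5 * simple_loss + pruned_loss
# The 0.5 weight is inferred from the logged numbers, not taken from the code.
logged = [
    # (loss, simple_loss, pruned_loss)
    (0.1328, 0.2064, 0.02958),   # epoch 18, batch 4700 (tot_loss)
    (0.1340, 0.2084, 0.02984),   # epoch 18, batch 6000 (tot_loss)
    (0.1047, 0.1880, 0.01075),   # epoch 18 validation at batch 6000
]

for loss, simple_loss, pruned_loss in logged:
    combined = 0.5 * simple_loss + pruned_loss
    assert abs(combined - loss) < 5e-4, (loss, combined)
    print(f"logged={loss:.4f}  0.5*simple+pruned={combined:.4f}")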
2022-05-09 06:55:48,100 INFO [train.py:715] (0/8) Epoch 18, batch 6600, loss[loss=0.1309, simple_loss=0.1983, pruned_loss=0.03179, over 4985.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2065, pruned_loss=0.02916, over 972147.16 frames.], batch size: 20, lr: 1.24e-04 2022-05-09 06:56:27,457 INFO [train.py:715] (0/8) Epoch 18, batch 6650, loss[loss=0.1388, simple_loss=0.208, pruned_loss=0.03483, over 4845.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2059, pruned_loss=0.02877, over 972463.87 frames.], batch size: 13, lr: 1.24e-04 2022-05-09 06:57:05,501 INFO [train.py:715] (0/8) Epoch 18, batch 6700, loss[loss=0.1567, simple_loss=0.2393, pruned_loss=0.03705, over 4936.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2068, pruned_loss=0.02978, over 972948.01 frames.], batch size: 21, lr: 1.24e-04 2022-05-09 06:57:44,485 INFO [train.py:715] (0/8) Epoch 18, batch 6750, loss[loss=0.1137, simple_loss=0.1857, pruned_loss=0.02081, over 4861.00 frames.], tot_loss[loss=0.133, simple_loss=0.2069, pruned_loss=0.02955, over 972038.55 frames.], batch size: 20, lr: 1.24e-04 2022-05-09 06:58:23,845 INFO [train.py:715] (0/8) Epoch 18, batch 6800, loss[loss=0.1346, simple_loss=0.2074, pruned_loss=0.03087, over 4931.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2057, pruned_loss=0.02868, over 971747.83 frames.], batch size: 23, lr: 1.24e-04 2022-05-09 06:59:02,552 INFO [train.py:715] (0/8) Epoch 18, batch 6850, loss[loss=0.1107, simple_loss=0.1875, pruned_loss=0.01695, over 4923.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2064, pruned_loss=0.02848, over 972379.00 frames.], batch size: 39, lr: 1.24e-04 2022-05-09 06:59:40,720 INFO [train.py:715] (0/8) Epoch 18, batch 6900, loss[loss=0.1317, simple_loss=0.2086, pruned_loss=0.02745, over 4873.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2063, pruned_loss=0.02871, over 971871.76 frames.], batch size: 16, lr: 1.24e-04 2022-05-09 07:00:20,315 INFO [train.py:715] (0/8) Epoch 18, batch 6950, loss[loss=0.1544, simple_loss=0.2274, pruned_loss=0.04071, over 4846.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2068, pruned_loss=0.02896, over 971810.99 frames.], batch size: 15, lr: 1.24e-04 2022-05-09 07:00:59,039 INFO [train.py:715] (0/8) Epoch 18, batch 7000, loss[loss=0.1095, simple_loss=0.1913, pruned_loss=0.01391, over 4905.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2067, pruned_loss=0.02857, over 971628.85 frames.], batch size: 19, lr: 1.24e-04 2022-05-09 07:01:37,420 INFO [train.py:715] (0/8) Epoch 18, batch 7050, loss[loss=0.1591, simple_loss=0.2266, pruned_loss=0.04583, over 4972.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2066, pruned_loss=0.02904, over 971922.09 frames.], batch size: 15, lr: 1.24e-04 2022-05-09 07:02:16,624 INFO [train.py:715] (0/8) Epoch 18, batch 7100, loss[loss=0.1098, simple_loss=0.1913, pruned_loss=0.01418, over 4964.00 frames.], tot_loss[loss=0.1317, simple_loss=0.206, pruned_loss=0.02871, over 972564.13 frames.], batch size: 24, lr: 1.24e-04 2022-05-09 07:02:56,201 INFO [train.py:715] (0/8) Epoch 18, batch 7150, loss[loss=0.1385, simple_loss=0.2241, pruned_loss=0.02648, over 4906.00 frames.], tot_loss[loss=0.132, simple_loss=0.2065, pruned_loss=0.02879, over 972277.72 frames.], batch size: 17, lr: 1.24e-04 2022-05-09 07:03:34,830 INFO [train.py:715] (0/8) Epoch 18, batch 7200, loss[loss=0.1337, simple_loss=0.2074, pruned_loss=0.03003, over 4825.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2066, pruned_loss=0.02863, over 971742.01 frames.], batch size: 26, lr: 1.24e-04 2022-05-09 07:04:13,058 INFO 
[train.py:715] (0/8) Epoch 18, batch 7250, loss[loss=0.1231, simple_loss=0.1873, pruned_loss=0.02942, over 4755.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2069, pruned_loss=0.02881, over 971867.31 frames.], batch size: 16, lr: 1.24e-04 2022-05-09 07:04:52,160 INFO [train.py:715] (0/8) Epoch 18, batch 7300, loss[loss=0.1248, simple_loss=0.1997, pruned_loss=0.02492, over 4731.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2077, pruned_loss=0.029, over 972132.76 frames.], batch size: 16, lr: 1.24e-04 2022-05-09 07:05:31,283 INFO [train.py:715] (0/8) Epoch 18, batch 7350, loss[loss=0.1171, simple_loss=0.1987, pruned_loss=0.01773, over 4950.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2072, pruned_loss=0.02874, over 971774.06 frames.], batch size: 29, lr: 1.24e-04 2022-05-09 07:06:09,355 INFO [train.py:715] (0/8) Epoch 18, batch 7400, loss[loss=0.1276, simple_loss=0.2096, pruned_loss=0.02282, over 4947.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2078, pruned_loss=0.02901, over 971881.83 frames.], batch size: 29, lr: 1.24e-04 2022-05-09 07:06:48,510 INFO [train.py:715] (0/8) Epoch 18, batch 7450, loss[loss=0.1261, simple_loss=0.1993, pruned_loss=0.02646, over 4753.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2081, pruned_loss=0.02931, over 972265.53 frames.], batch size: 19, lr: 1.24e-04 2022-05-09 07:07:27,759 INFO [train.py:715] (0/8) Epoch 18, batch 7500, loss[loss=0.1424, simple_loss=0.2187, pruned_loss=0.03298, over 4925.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2079, pruned_loss=0.02919, over 972407.59 frames.], batch size: 18, lr: 1.24e-04 2022-05-09 07:08:05,372 INFO [train.py:715] (0/8) Epoch 18, batch 7550, loss[loss=0.09903, simple_loss=0.17, pruned_loss=0.01401, over 4787.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2068, pruned_loss=0.02844, over 972758.77 frames.], batch size: 18, lr: 1.24e-04 2022-05-09 07:08:43,907 INFO [train.py:715] (0/8) Epoch 18, batch 7600, loss[loss=0.1525, simple_loss=0.2313, pruned_loss=0.03689, over 4965.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2066, pruned_loss=0.02863, over 973212.32 frames.], batch size: 24, lr: 1.24e-04 2022-05-09 07:09:23,636 INFO [train.py:715] (0/8) Epoch 18, batch 7650, loss[loss=0.1327, simple_loss=0.2135, pruned_loss=0.02593, over 4783.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2065, pruned_loss=0.02856, over 972501.14 frames.], batch size: 17, lr: 1.24e-04 2022-05-09 07:10:02,903 INFO [train.py:715] (0/8) Epoch 18, batch 7700, loss[loss=0.131, simple_loss=0.2039, pruned_loss=0.02899, over 4982.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2065, pruned_loss=0.02845, over 972986.44 frames.], batch size: 35, lr: 1.24e-04 2022-05-09 07:10:41,607 INFO [train.py:715] (0/8) Epoch 18, batch 7750, loss[loss=0.1405, simple_loss=0.2166, pruned_loss=0.03222, over 4758.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2064, pruned_loss=0.02872, over 973564.73 frames.], batch size: 16, lr: 1.24e-04 2022-05-09 07:11:21,214 INFO [train.py:715] (0/8) Epoch 18, batch 7800, loss[loss=0.1003, simple_loss=0.1782, pruned_loss=0.01123, over 4882.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2063, pruned_loss=0.02846, over 972944.02 frames.], batch size: 22, lr: 1.24e-04 2022-05-09 07:12:01,096 INFO [train.py:715] (0/8) Epoch 18, batch 7850, loss[loss=0.1204, simple_loss=0.2025, pruned_loss=0.01915, over 4945.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2065, pruned_loss=0.02838, over 972974.27 frames.], batch size: 21, lr: 1.24e-04 2022-05-09 07:12:40,474 INFO [train.py:715] (0/8) Epoch 18, 
batch 7900, loss[loss=0.1112, simple_loss=0.1894, pruned_loss=0.01647, over 4983.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2063, pruned_loss=0.02843, over 973680.14 frames.], batch size: 25, lr: 1.24e-04 2022-05-09 07:13:19,674 INFO [train.py:715] (0/8) Epoch 18, batch 7950, loss[loss=0.1346, simple_loss=0.2138, pruned_loss=0.02767, over 4795.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2068, pruned_loss=0.02903, over 973018.35 frames.], batch size: 24, lr: 1.24e-04 2022-05-09 07:13:59,113 INFO [train.py:715] (0/8) Epoch 18, batch 8000, loss[loss=0.1185, simple_loss=0.2059, pruned_loss=0.01556, over 4888.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2067, pruned_loss=0.02872, over 973068.93 frames.], batch size: 19, lr: 1.24e-04 2022-05-09 07:14:38,129 INFO [train.py:715] (0/8) Epoch 18, batch 8050, loss[loss=0.1251, simple_loss=0.1992, pruned_loss=0.02549, over 4985.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2064, pruned_loss=0.02854, over 973180.64 frames.], batch size: 25, lr: 1.24e-04 2022-05-09 07:15:16,607 INFO [train.py:715] (0/8) Epoch 18, batch 8100, loss[loss=0.1291, simple_loss=0.2011, pruned_loss=0.02852, over 4868.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2059, pruned_loss=0.02815, over 972611.28 frames.], batch size: 20, lr: 1.24e-04 2022-05-09 07:15:55,248 INFO [train.py:715] (0/8) Epoch 18, batch 8150, loss[loss=0.1514, simple_loss=0.2374, pruned_loss=0.03266, over 4861.00 frames.], tot_loss[loss=0.131, simple_loss=0.2054, pruned_loss=0.02832, over 971889.59 frames.], batch size: 16, lr: 1.24e-04 2022-05-09 07:16:34,308 INFO [train.py:715] (0/8) Epoch 18, batch 8200, loss[loss=0.1279, simple_loss=0.2128, pruned_loss=0.02148, over 4823.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2053, pruned_loss=0.02819, over 972367.63 frames.], batch size: 26, lr: 1.24e-04 2022-05-09 07:17:12,926 INFO [train.py:715] (0/8) Epoch 18, batch 8250, loss[loss=0.145, simple_loss=0.2176, pruned_loss=0.0362, over 4899.00 frames.], tot_loss[loss=0.1316, simple_loss=0.206, pruned_loss=0.02861, over 971782.19 frames.], batch size: 17, lr: 1.24e-04 2022-05-09 07:17:51,218 INFO [train.py:715] (0/8) Epoch 18, batch 8300, loss[loss=0.1386, simple_loss=0.2189, pruned_loss=0.02916, over 4944.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2067, pruned_loss=0.02909, over 972288.03 frames.], batch size: 21, lr: 1.24e-04 2022-05-09 07:18:31,281 INFO [train.py:715] (0/8) Epoch 18, batch 8350, loss[loss=0.1461, simple_loss=0.2096, pruned_loss=0.04132, over 4993.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2068, pruned_loss=0.02903, over 972290.17 frames.], batch size: 14, lr: 1.24e-04 2022-05-09 07:19:10,480 INFO [train.py:715] (0/8) Epoch 18, batch 8400, loss[loss=0.133, simple_loss=0.2068, pruned_loss=0.02961, over 4990.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2071, pruned_loss=0.02918, over 972610.81 frames.], batch size: 25, lr: 1.24e-04 2022-05-09 07:19:48,918 INFO [train.py:715] (0/8) Epoch 18, batch 8450, loss[loss=0.1328, simple_loss=0.2154, pruned_loss=0.02512, over 4950.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2067, pruned_loss=0.02906, over 972310.90 frames.], batch size: 21, lr: 1.24e-04 2022-05-09 07:20:28,154 INFO [train.py:715] (0/8) Epoch 18, batch 8500, loss[loss=0.1101, simple_loss=0.1875, pruned_loss=0.01631, over 4938.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2066, pruned_loss=0.0291, over 972296.32 frames.], batch size: 29, lr: 1.24e-04 2022-05-09 07:21:07,334 INFO [train.py:715] (0/8) Epoch 18, batch 8550, loss[loss=0.123, 
simple_loss=0.1975, pruned_loss=0.02421, over 4848.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2068, pruned_loss=0.02904, over 972897.43 frames.], batch size: 34, lr: 1.24e-04 2022-05-09 07:21:46,033 INFO [train.py:715] (0/8) Epoch 18, batch 8600, loss[loss=0.1591, simple_loss=0.2282, pruned_loss=0.04496, over 4825.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2065, pruned_loss=0.02907, over 972676.33 frames.], batch size: 30, lr: 1.24e-04 2022-05-09 07:22:24,237 INFO [train.py:715] (0/8) Epoch 18, batch 8650, loss[loss=0.1422, simple_loss=0.2185, pruned_loss=0.03292, over 4833.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2072, pruned_loss=0.02961, over 972359.28 frames.], batch size: 26, lr: 1.24e-04 2022-05-09 07:23:03,807 INFO [train.py:715] (0/8) Epoch 18, batch 8700, loss[loss=0.109, simple_loss=0.1888, pruned_loss=0.01464, over 4961.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2067, pruned_loss=0.02929, over 972035.74 frames.], batch size: 29, lr: 1.24e-04 2022-05-09 07:23:43,632 INFO [train.py:715] (0/8) Epoch 18, batch 8750, loss[loss=0.1419, simple_loss=0.1977, pruned_loss=0.04305, over 4806.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2067, pruned_loss=0.0294, over 972722.25 frames.], batch size: 14, lr: 1.24e-04 2022-05-09 07:24:23,134 INFO [train.py:715] (0/8) Epoch 18, batch 8800, loss[loss=0.1398, simple_loss=0.2078, pruned_loss=0.03592, over 4822.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2064, pruned_loss=0.02929, over 971991.73 frames.], batch size: 25, lr: 1.24e-04 2022-05-09 07:25:01,506 INFO [train.py:715] (0/8) Epoch 18, batch 8850, loss[loss=0.1335, simple_loss=0.2099, pruned_loss=0.02861, over 4934.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2053, pruned_loss=0.02872, over 971463.73 frames.], batch size: 21, lr: 1.24e-04 2022-05-09 07:25:41,121 INFO [train.py:715] (0/8) Epoch 18, batch 8900, loss[loss=0.1319, simple_loss=0.203, pruned_loss=0.03036, over 4992.00 frames.], tot_loss[loss=0.1303, simple_loss=0.2046, pruned_loss=0.02803, over 971358.66 frames.], batch size: 20, lr: 1.24e-04 2022-05-09 07:26:19,640 INFO [train.py:715] (0/8) Epoch 18, batch 8950, loss[loss=0.1128, simple_loss=0.1982, pruned_loss=0.01372, over 4877.00 frames.], tot_loss[loss=0.1303, simple_loss=0.2043, pruned_loss=0.02814, over 970869.01 frames.], batch size: 16, lr: 1.24e-04 2022-05-09 07:26:58,100 INFO [train.py:715] (0/8) Epoch 18, batch 9000, loss[loss=0.106, simple_loss=0.1778, pruned_loss=0.01708, over 4742.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2048, pruned_loss=0.02835, over 971601.60 frames.], batch size: 16, lr: 1.24e-04 2022-05-09 07:26:58,101 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 07:27:08,041 INFO [train.py:742] (0/8) Epoch 18, validation: loss=0.1045, simple_loss=0.1879, pruned_loss=0.01057, over 914524.00 frames. 
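[Editor's note] The "Computing validation loss" records above report a single figure "over 914524.00 frames", i.e. the dev set is scored as a frame-weighted average rather than a per-batch mean. Below is a minimal sketch of that pattern, assuming the usual PyTorch evaluation loop; model, valid_dl and compute_loss are illustrative placeholders, not the actual train.py API.

import torch

def validate(model, valid_dl, compute_loss):
    """Minimal sketch of a validation pass: iterate the dev dataloader
    without gradients and average the loss weighted by frame counts, which
    is how one number "over 914524.00 frames" can be reported.
    `compute_loss` is assumed to return (loss_sum, num_frames) per batch."""
    model.eval()
    total_loss, total_frames = 0.0, 0.0
    with torch.no_grad():
        for batch in valid_dl:
            loss_sum, num_frames = compute_loss(model, batch)
            total_loss += float(loss_sum)
            total_frames += float(num_frames)
    model.train()
    return total_loss / max(total_frames, 1.0)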
2022-05-09 07:27:46,930 INFO [train.py:715] (0/8) Epoch 18, batch 9050, loss[loss=0.1261, simple_loss=0.2052, pruned_loss=0.02346, over 4793.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2051, pruned_loss=0.02867, over 972209.03 frames.], batch size: 24, lr: 1.24e-04 2022-05-09 07:28:26,538 INFO [train.py:715] (0/8) Epoch 18, batch 9100, loss[loss=0.1289, simple_loss=0.1973, pruned_loss=0.03025, over 4876.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2058, pruned_loss=0.02903, over 972043.51 frames.], batch size: 32, lr: 1.24e-04 2022-05-09 07:29:05,695 INFO [train.py:715] (0/8) Epoch 18, batch 9150, loss[loss=0.1313, simple_loss=0.2129, pruned_loss=0.02486, over 4749.00 frames.], tot_loss[loss=0.131, simple_loss=0.2049, pruned_loss=0.02849, over 972114.22 frames.], batch size: 19, lr: 1.24e-04 2022-05-09 07:29:43,360 INFO [train.py:715] (0/8) Epoch 18, batch 9200, loss[loss=0.1222, simple_loss=0.1982, pruned_loss=0.02314, over 4830.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2053, pruned_loss=0.0288, over 971981.27 frames.], batch size: 13, lr: 1.24e-04 2022-05-09 07:30:22,558 INFO [train.py:715] (0/8) Epoch 18, batch 9250, loss[loss=0.1196, simple_loss=0.1925, pruned_loss=0.02338, over 4940.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2056, pruned_loss=0.02849, over 971509.90 frames.], batch size: 23, lr: 1.24e-04 2022-05-09 07:31:01,718 INFO [train.py:715] (0/8) Epoch 18, batch 9300, loss[loss=0.1266, simple_loss=0.2009, pruned_loss=0.02619, over 4854.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2061, pruned_loss=0.02853, over 971138.52 frames.], batch size: 13, lr: 1.24e-04 2022-05-09 07:31:39,924 INFO [train.py:715] (0/8) Epoch 18, batch 9350, loss[loss=0.1737, simple_loss=0.2439, pruned_loss=0.05176, over 4918.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2067, pruned_loss=0.02895, over 971319.57 frames.], batch size: 17, lr: 1.24e-04 2022-05-09 07:32:18,513 INFO [train.py:715] (0/8) Epoch 18, batch 9400, loss[loss=0.1119, simple_loss=0.186, pruned_loss=0.01892, over 4974.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2065, pruned_loss=0.02881, over 971466.49 frames.], batch size: 25, lr: 1.24e-04 2022-05-09 07:32:58,075 INFO [train.py:715] (0/8) Epoch 18, batch 9450, loss[loss=0.1374, simple_loss=0.2099, pruned_loss=0.0324, over 4982.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2074, pruned_loss=0.02918, over 971641.35 frames.], batch size: 25, lr: 1.24e-04 2022-05-09 07:33:36,478 INFO [train.py:715] (0/8) Epoch 18, batch 9500, loss[loss=0.1394, simple_loss=0.2184, pruned_loss=0.03021, over 4963.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2073, pruned_loss=0.02919, over 971828.11 frames.], batch size: 21, lr: 1.24e-04 2022-05-09 07:34:14,736 INFO [train.py:715] (0/8) Epoch 18, batch 9550, loss[loss=0.1026, simple_loss=0.1709, pruned_loss=0.01708, over 4909.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2057, pruned_loss=0.02802, over 971945.08 frames.], batch size: 17, lr: 1.24e-04 2022-05-09 07:34:53,873 INFO [train.py:715] (0/8) Epoch 18, batch 9600, loss[loss=0.1631, simple_loss=0.2451, pruned_loss=0.04053, over 4943.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2064, pruned_loss=0.02835, over 971913.99 frames.], batch size: 29, lr: 1.24e-04 2022-05-09 07:35:33,428 INFO [train.py:715] (0/8) Epoch 18, batch 9650, loss[loss=0.1283, simple_loss=0.2086, pruned_loss=0.024, over 4866.00 frames.], tot_loss[loss=0.1321, simple_loss=0.207, pruned_loss=0.02855, over 971391.35 frames.], batch size: 20, lr: 1.24e-04 2022-05-09 07:36:12,258 INFO 
[train.py:715] (0/8) Epoch 18, batch 9700, loss[loss=0.1133, simple_loss=0.1791, pruned_loss=0.02379, over 4820.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2073, pruned_loss=0.0289, over 971092.08 frames.], batch size: 13, lr: 1.24e-04 2022-05-09 07:36:50,930 INFO [train.py:715] (0/8) Epoch 18, batch 9750, loss[loss=0.1282, simple_loss=0.1919, pruned_loss=0.03224, over 4966.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2066, pruned_loss=0.02892, over 970905.77 frames.], batch size: 35, lr: 1.24e-04 2022-05-09 07:37:31,026 INFO [train.py:715] (0/8) Epoch 18, batch 9800, loss[loss=0.1606, simple_loss=0.2244, pruned_loss=0.04837, over 4869.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2067, pruned_loss=0.02921, over 971425.01 frames.], batch size: 22, lr: 1.24e-04 2022-05-09 07:38:09,631 INFO [train.py:715] (0/8) Epoch 18, batch 9850, loss[loss=0.129, simple_loss=0.1921, pruned_loss=0.03295, over 4980.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2064, pruned_loss=0.02916, over 971645.50 frames.], batch size: 35, lr: 1.24e-04 2022-05-09 07:38:47,995 INFO [train.py:715] (0/8) Epoch 18, batch 9900, loss[loss=0.1233, simple_loss=0.2057, pruned_loss=0.02042, over 4790.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2064, pruned_loss=0.0291, over 971502.95 frames.], batch size: 17, lr: 1.24e-04 2022-05-09 07:39:27,316 INFO [train.py:715] (0/8) Epoch 18, batch 9950, loss[loss=0.1241, simple_loss=0.2083, pruned_loss=0.0199, over 4977.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2069, pruned_loss=0.02866, over 971937.15 frames.], batch size: 24, lr: 1.24e-04 2022-05-09 07:40:06,406 INFO [train.py:715] (0/8) Epoch 18, batch 10000, loss[loss=0.143, simple_loss=0.2186, pruned_loss=0.03369, over 4960.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2066, pruned_loss=0.02851, over 971496.95 frames.], batch size: 24, lr: 1.24e-04 2022-05-09 07:40:45,256 INFO [train.py:715] (0/8) Epoch 18, batch 10050, loss[loss=0.1336, simple_loss=0.2042, pruned_loss=0.03156, over 4869.00 frames.], tot_loss[loss=0.131, simple_loss=0.2058, pruned_loss=0.02804, over 971804.70 frames.], batch size: 16, lr: 1.24e-04 2022-05-09 07:41:23,496 INFO [train.py:715] (0/8) Epoch 18, batch 10100, loss[loss=0.09555, simple_loss=0.1638, pruned_loss=0.01367, over 4849.00 frames.], tot_loss[loss=0.131, simple_loss=0.2057, pruned_loss=0.02816, over 972326.02 frames.], batch size: 13, lr: 1.24e-04 2022-05-09 07:42:02,485 INFO [train.py:715] (0/8) Epoch 18, batch 10150, loss[loss=0.1246, simple_loss=0.1921, pruned_loss=0.02856, over 4927.00 frames.], tot_loss[loss=0.131, simple_loss=0.2057, pruned_loss=0.02813, over 972675.14 frames.], batch size: 29, lr: 1.24e-04 2022-05-09 07:42:41,660 INFO [train.py:715] (0/8) Epoch 18, batch 10200, loss[loss=0.1203, simple_loss=0.1959, pruned_loss=0.02236, over 4928.00 frames.], tot_loss[loss=0.1302, simple_loss=0.2051, pruned_loss=0.02769, over 973281.91 frames.], batch size: 29, lr: 1.24e-04 2022-05-09 07:43:20,197 INFO [train.py:715] (0/8) Epoch 18, batch 10250, loss[loss=0.1262, simple_loss=0.1977, pruned_loss=0.02732, over 4825.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2059, pruned_loss=0.02835, over 972173.54 frames.], batch size: 26, lr: 1.24e-04 2022-05-09 07:43:59,313 INFO [train.py:715] (0/8) Epoch 18, batch 10300, loss[loss=0.1023, simple_loss=0.1806, pruned_loss=0.01199, over 4845.00 frames.], tot_loss[loss=0.1316, simple_loss=0.206, pruned_loss=0.0286, over 972198.98 frames.], batch size: 20, lr: 1.24e-04 2022-05-09 07:44:39,637 INFO [train.py:715] (0/8) Epoch 18, 
batch 10350, loss[loss=0.1256, simple_loss=0.2102, pruned_loss=0.02053, over 4790.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2062, pruned_loss=0.02875, over 972321.75 frames.], batch size: 24, lr: 1.24e-04 2022-05-09 07:45:18,120 INFO [train.py:715] (0/8) Epoch 18, batch 10400, loss[loss=0.1428, simple_loss=0.2201, pruned_loss=0.03268, over 4962.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2066, pruned_loss=0.02899, over 972779.45 frames.], batch size: 24, lr: 1.24e-04 2022-05-09 07:45:56,568 INFO [train.py:715] (0/8) Epoch 18, batch 10450, loss[loss=0.126, simple_loss=0.209, pruned_loss=0.02153, over 4986.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2061, pruned_loss=0.02836, over 973043.91 frames.], batch size: 28, lr: 1.24e-04 2022-05-09 07:46:36,299 INFO [train.py:715] (0/8) Epoch 18, batch 10500, loss[loss=0.1312, simple_loss=0.2012, pruned_loss=0.03059, over 4974.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2056, pruned_loss=0.02855, over 972789.10 frames.], batch size: 35, lr: 1.24e-04 2022-05-09 07:47:15,165 INFO [train.py:715] (0/8) Epoch 18, batch 10550, loss[loss=0.1248, simple_loss=0.1991, pruned_loss=0.02519, over 4969.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2066, pruned_loss=0.02911, over 972680.26 frames.], batch size: 25, lr: 1.24e-04 2022-05-09 07:47:53,898 INFO [train.py:715] (0/8) Epoch 18, batch 10600, loss[loss=0.1768, simple_loss=0.2541, pruned_loss=0.04972, over 4881.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2073, pruned_loss=0.02947, over 971642.86 frames.], batch size: 16, lr: 1.24e-04 2022-05-09 07:48:33,501 INFO [train.py:715] (0/8) Epoch 18, batch 10650, loss[loss=0.1272, simple_loss=0.202, pruned_loss=0.02623, over 4836.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2068, pruned_loss=0.02916, over 971861.06 frames.], batch size: 26, lr: 1.24e-04 2022-05-09 07:49:13,192 INFO [train.py:715] (0/8) Epoch 18, batch 10700, loss[loss=0.148, simple_loss=0.2174, pruned_loss=0.03936, over 4882.00 frames.], tot_loss[loss=0.133, simple_loss=0.2075, pruned_loss=0.02922, over 971461.13 frames.], batch size: 16, lr: 1.24e-04 2022-05-09 07:49:52,111 INFO [train.py:715] (0/8) Epoch 18, batch 10750, loss[loss=0.113, simple_loss=0.1725, pruned_loss=0.02672, over 4833.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2071, pruned_loss=0.02903, over 971861.52 frames.], batch size: 12, lr: 1.24e-04 2022-05-09 07:50:31,125 INFO [train.py:715] (0/8) Epoch 18, batch 10800, loss[loss=0.1338, simple_loss=0.2088, pruned_loss=0.02941, over 4771.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2064, pruned_loss=0.0286, over 972239.72 frames.], batch size: 16, lr: 1.24e-04 2022-05-09 07:51:10,555 INFO [train.py:715] (0/8) Epoch 18, batch 10850, loss[loss=0.1094, simple_loss=0.1806, pruned_loss=0.01916, over 4826.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2059, pruned_loss=0.02823, over 971954.20 frames.], batch size: 12, lr: 1.24e-04 2022-05-09 07:51:49,056 INFO [train.py:715] (0/8) Epoch 18, batch 10900, loss[loss=0.1242, simple_loss=0.197, pruned_loss=0.02572, over 4874.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2057, pruned_loss=0.02834, over 971793.88 frames.], batch size: 22, lr: 1.24e-04 2022-05-09 07:52:27,636 INFO [train.py:715] (0/8) Epoch 18, batch 10950, loss[loss=0.1229, simple_loss=0.2046, pruned_loss=0.0206, over 4660.00 frames.], tot_loss[loss=0.131, simple_loss=0.2056, pruned_loss=0.0282, over 971512.20 frames.], batch size: 13, lr: 1.24e-04 2022-05-09 07:53:07,688 INFO [train.py:715] (0/8) Epoch 18, batch 11000, loss[loss=0.132, 
simple_loss=0.2024, pruned_loss=0.03077, over 4912.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2065, pruned_loss=0.02848, over 972517.66 frames.], batch size: 22, lr: 1.24e-04 2022-05-09 07:53:46,744 INFO [train.py:715] (0/8) Epoch 18, batch 11050, loss[loss=0.1176, simple_loss=0.1983, pruned_loss=0.01848, over 4792.00 frames.], tot_loss[loss=0.132, simple_loss=0.2066, pruned_loss=0.02867, over 971883.82 frames.], batch size: 18, lr: 1.24e-04 2022-05-09 07:54:26,297 INFO [train.py:715] (0/8) Epoch 18, batch 11100, loss[loss=0.1585, simple_loss=0.2279, pruned_loss=0.04454, over 4959.00 frames.], tot_loss[loss=0.1327, simple_loss=0.207, pruned_loss=0.02915, over 971992.68 frames.], batch size: 35, lr: 1.24e-04 2022-05-09 07:55:05,194 INFO [train.py:715] (0/8) Epoch 18, batch 11150, loss[loss=0.1479, simple_loss=0.2135, pruned_loss=0.04114, over 4915.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2063, pruned_loss=0.02872, over 972032.07 frames.], batch size: 39, lr: 1.24e-04 2022-05-09 07:55:44,747 INFO [train.py:715] (0/8) Epoch 18, batch 11200, loss[loss=0.09982, simple_loss=0.18, pruned_loss=0.009802, over 4890.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2069, pruned_loss=0.02934, over 971793.78 frames.], batch size: 22, lr: 1.24e-04 2022-05-09 07:56:23,189 INFO [train.py:715] (0/8) Epoch 18, batch 11250, loss[loss=0.1119, simple_loss=0.1788, pruned_loss=0.02249, over 4835.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2056, pruned_loss=0.02858, over 971697.08 frames.], batch size: 13, lr: 1.24e-04 2022-05-09 07:57:01,928 INFO [train.py:715] (0/8) Epoch 18, batch 11300, loss[loss=0.117, simple_loss=0.1939, pruned_loss=0.0201, over 4885.00 frames.], tot_loss[loss=0.1321, simple_loss=0.206, pruned_loss=0.02904, over 972320.22 frames.], batch size: 22, lr: 1.24e-04 2022-05-09 07:57:41,018 INFO [train.py:715] (0/8) Epoch 18, batch 11350, loss[loss=0.117, simple_loss=0.1955, pruned_loss=0.01929, over 4777.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2059, pruned_loss=0.02876, over 972171.95 frames.], batch size: 12, lr: 1.24e-04 2022-05-09 07:58:20,188 INFO [train.py:715] (0/8) Epoch 18, batch 11400, loss[loss=0.1155, simple_loss=0.1864, pruned_loss=0.02227, over 4759.00 frames.], tot_loss[loss=0.131, simple_loss=0.2055, pruned_loss=0.02827, over 971559.41 frames.], batch size: 19, lr: 1.24e-04 2022-05-09 07:58:59,554 INFO [train.py:715] (0/8) Epoch 18, batch 11450, loss[loss=0.1139, simple_loss=0.1999, pruned_loss=0.01391, over 4825.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2054, pruned_loss=0.02844, over 971145.90 frames.], batch size: 26, lr: 1.24e-04 2022-05-09 07:59:38,057 INFO [train.py:715] (0/8) Epoch 18, batch 11500, loss[loss=0.1943, simple_loss=0.2569, pruned_loss=0.06588, over 4861.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2058, pruned_loss=0.02871, over 971748.97 frames.], batch size: 32, lr: 1.24e-04 2022-05-09 08:00:17,724 INFO [train.py:715] (0/8) Epoch 18, batch 11550, loss[loss=0.1572, simple_loss=0.2261, pruned_loss=0.04418, over 4866.00 frames.], tot_loss[loss=0.1322, simple_loss=0.206, pruned_loss=0.02917, over 971621.16 frames.], batch size: 20, lr: 1.24e-04 2022-05-09 08:00:57,125 INFO [train.py:715] (0/8) Epoch 18, batch 11600, loss[loss=0.1409, simple_loss=0.2196, pruned_loss=0.03106, over 4962.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2065, pruned_loss=0.02918, over 971396.34 frames.], batch size: 24, lr: 1.24e-04 2022-05-09 08:01:35,946 INFO [train.py:715] (0/8) Epoch 18, batch 11650, loss[loss=0.1337, simple_loss=0.2017, 
pruned_loss=0.0328, over 4989.00 frames.], tot_loss[loss=0.133, simple_loss=0.2074, pruned_loss=0.02937, over 972249.60 frames.], batch size: 14, lr: 1.24e-04 2022-05-09 08:02:15,653 INFO [train.py:715] (0/8) Epoch 18, batch 11700, loss[loss=0.108, simple_loss=0.1841, pruned_loss=0.01596, over 4875.00 frames.], tot_loss[loss=0.1327, simple_loss=0.207, pruned_loss=0.02924, over 971707.56 frames.], batch size: 22, lr: 1.24e-04 2022-05-09 08:02:54,929 INFO [train.py:715] (0/8) Epoch 18, batch 11750, loss[loss=0.1216, simple_loss=0.2014, pruned_loss=0.02092, over 4932.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2066, pruned_loss=0.02926, over 971257.76 frames.], batch size: 23, lr: 1.24e-04 2022-05-09 08:03:34,974 INFO [train.py:715] (0/8) Epoch 18, batch 11800, loss[loss=0.1267, simple_loss=0.2019, pruned_loss=0.02573, over 4941.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2065, pruned_loss=0.02924, over 971497.50 frames.], batch size: 21, lr: 1.24e-04 2022-05-09 08:04:13,539 INFO [train.py:715] (0/8) Epoch 18, batch 11850, loss[loss=0.1306, simple_loss=0.1949, pruned_loss=0.03318, over 4783.00 frames.], tot_loss[loss=0.132, simple_loss=0.206, pruned_loss=0.02903, over 971414.97 frames.], batch size: 19, lr: 1.24e-04 2022-05-09 08:04:53,376 INFO [train.py:715] (0/8) Epoch 18, batch 11900, loss[loss=0.1121, simple_loss=0.1937, pruned_loss=0.01531, over 4930.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2057, pruned_loss=0.02905, over 971895.36 frames.], batch size: 23, lr: 1.24e-04 2022-05-09 08:05:32,228 INFO [train.py:715] (0/8) Epoch 18, batch 11950, loss[loss=0.144, simple_loss=0.2113, pruned_loss=0.03834, over 4967.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2064, pruned_loss=0.02911, over 972519.01 frames.], batch size: 15, lr: 1.24e-04 2022-05-09 08:06:10,824 INFO [train.py:715] (0/8) Epoch 18, batch 12000, loss[loss=0.1059, simple_loss=0.184, pruned_loss=0.01394, over 4778.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2055, pruned_loss=0.02852, over 971962.81 frames.], batch size: 17, lr: 1.24e-04 2022-05-09 08:06:10,825 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 08:06:20,737 INFO [train.py:742] (0/8) Epoch 18, validation: loss=0.1046, simple_loss=0.188, pruned_loss=0.01063, over 914524.00 frames. 
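[Editor's note] The per-batch loss[...] values above swing noticeably (roughly 0.106 to 0.144 within a few hundred batches) while tot_loss[..., over ~972000 frames] stays near 0.132, so tot_loss behaves like a frame-weighted average over a window of recent batches. The accumulator below is only an illustration of how such a smoothed figure could be produced; the class name and window size are hypothetical and the real train.py bookkeeping may differ.

from collections import deque

class RunningLoss:
    """Frame-weighted running average over the most recent batches.
    Illustrative only: shows how a smooth tot_loss over roughly 970k frames
    could be derived from noisy per-batch losses."""

    def __init__(self, max_batches: int = 200):
        # Each entry is (loss_sum_for_batch, num_frames_in_batch).
        self.window = deque(maxlen=max_batches)

    def update(self, batch_loss: float, num_frames: float) -> None:
        self.window.append((batch_loss * num_frames, num_frames))

    @property
    def frames(self) -> float:
        return sum(f for _, f in self.window)

    @property
    def value(self) -> float:
        loss_sum = sum(l for l, _ in self.window)
        return loss_sum / max(self.frames, 1.0)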
2022-05-09 08:07:00,013 INFO [train.py:715] (0/8) Epoch 18, batch 12050, loss[loss=0.1201, simple_loss=0.1913, pruned_loss=0.02447, over 4774.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2055, pruned_loss=0.02837, over 973021.03 frames.], batch size: 18, lr: 1.24e-04 2022-05-09 08:07:39,517 INFO [train.py:715] (0/8) Epoch 18, batch 12100, loss[loss=0.1489, simple_loss=0.2171, pruned_loss=0.04031, over 4870.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2056, pruned_loss=0.02836, over 972593.37 frames.], batch size: 34, lr: 1.24e-04 2022-05-09 08:08:19,047 INFO [train.py:715] (0/8) Epoch 18, batch 12150, loss[loss=0.108, simple_loss=0.1826, pruned_loss=0.01672, over 4835.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2056, pruned_loss=0.02826, over 972009.67 frames.], batch size: 13, lr: 1.24e-04 2022-05-09 08:08:59,340 INFO [train.py:715] (0/8) Epoch 18, batch 12200, loss[loss=0.147, simple_loss=0.2172, pruned_loss=0.03845, over 4974.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2067, pruned_loss=0.02877, over 971937.19 frames.], batch size: 15, lr: 1.24e-04 2022-05-09 08:09:38,274 INFO [train.py:715] (0/8) Epoch 18, batch 12250, loss[loss=0.125, simple_loss=0.2074, pruned_loss=0.0213, over 4881.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2063, pruned_loss=0.02855, over 972414.68 frames.], batch size: 20, lr: 1.24e-04 2022-05-09 08:10:18,804 INFO [train.py:715] (0/8) Epoch 18, batch 12300, loss[loss=0.1487, simple_loss=0.2235, pruned_loss=0.03695, over 4875.00 frames.], tot_loss[loss=0.132, simple_loss=0.2064, pruned_loss=0.02877, over 972147.49 frames.], batch size: 16, lr: 1.24e-04 2022-05-09 08:10:58,225 INFO [train.py:715] (0/8) Epoch 18, batch 12350, loss[loss=0.1436, simple_loss=0.2195, pruned_loss=0.03385, over 4817.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2072, pruned_loss=0.02926, over 972536.40 frames.], batch size: 25, lr: 1.24e-04 2022-05-09 08:11:37,140 INFO [train.py:715] (0/8) Epoch 18, batch 12400, loss[loss=0.1283, simple_loss=0.2151, pruned_loss=0.02072, over 4912.00 frames.], tot_loss[loss=0.1323, simple_loss=0.207, pruned_loss=0.02877, over 972869.42 frames.], batch size: 18, lr: 1.24e-04 2022-05-09 08:12:16,682 INFO [train.py:715] (0/8) Epoch 18, batch 12450, loss[loss=0.139, simple_loss=0.2103, pruned_loss=0.03382, over 4799.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2067, pruned_loss=0.02894, over 972580.89 frames.], batch size: 14, lr: 1.24e-04 2022-05-09 08:12:55,934 INFO [train.py:715] (0/8) Epoch 18, batch 12500, loss[loss=0.1196, simple_loss=0.1883, pruned_loss=0.02549, over 4908.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2066, pruned_loss=0.02891, over 972978.61 frames.], batch size: 17, lr: 1.24e-04 2022-05-09 08:13:36,319 INFO [train.py:715] (0/8) Epoch 18, batch 12550, loss[loss=0.1332, simple_loss=0.2079, pruned_loss=0.02927, over 4803.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2072, pruned_loss=0.02948, over 973087.52 frames.], batch size: 25, lr: 1.24e-04 2022-05-09 08:14:14,818 INFO [train.py:715] (0/8) Epoch 18, batch 12600, loss[loss=0.1608, simple_loss=0.2343, pruned_loss=0.04365, over 4777.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2075, pruned_loss=0.02938, over 972662.90 frames.], batch size: 14, lr: 1.24e-04 2022-05-09 08:14:54,512 INFO [train.py:715] (0/8) Epoch 18, batch 12650, loss[loss=0.1168, simple_loss=0.1924, pruned_loss=0.02065, over 4865.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2077, pruned_loss=0.02949, over 972222.30 frames.], batch size: 20, lr: 1.24e-04 2022-05-09 08:15:33,311 
INFO [train.py:715] (0/8) Epoch 18, batch 12700, loss[loss=0.1196, simple_loss=0.1946, pruned_loss=0.02229, over 4974.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2081, pruned_loss=0.02955, over 972454.49 frames.], batch size: 25, lr: 1.24e-04 2022-05-09 08:16:12,929 INFO [train.py:715] (0/8) Epoch 18, batch 12750, loss[loss=0.1374, simple_loss=0.2043, pruned_loss=0.03525, over 4980.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2079, pruned_loss=0.02943, over 973217.25 frames.], batch size: 35, lr: 1.24e-04 2022-05-09 08:16:52,482 INFO [train.py:715] (0/8) Epoch 18, batch 12800, loss[loss=0.1371, simple_loss=0.2119, pruned_loss=0.03111, over 4841.00 frames.], tot_loss[loss=0.133, simple_loss=0.2072, pruned_loss=0.02936, over 973593.33 frames.], batch size: 15, lr: 1.24e-04 2022-05-09 08:17:31,836 INFO [train.py:715] (0/8) Epoch 18, batch 12850, loss[loss=0.1215, simple_loss=0.2022, pruned_loss=0.02038, over 4933.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2073, pruned_loss=0.02907, over 972613.95 frames.], batch size: 29, lr: 1.24e-04 2022-05-09 08:18:11,707 INFO [train.py:715] (0/8) Epoch 18, batch 12900, loss[loss=0.1307, simple_loss=0.1929, pruned_loss=0.03423, over 4956.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2072, pruned_loss=0.02917, over 972921.80 frames.], batch size: 15, lr: 1.24e-04 2022-05-09 08:18:50,197 INFO [train.py:715] (0/8) Epoch 18, batch 12950, loss[loss=0.1474, simple_loss=0.237, pruned_loss=0.0289, over 4954.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2068, pruned_loss=0.02884, over 973208.55 frames.], batch size: 29, lr: 1.24e-04 2022-05-09 08:19:30,193 INFO [train.py:715] (0/8) Epoch 18, batch 13000, loss[loss=0.1249, simple_loss=0.2055, pruned_loss=0.02218, over 4812.00 frames.], tot_loss[loss=0.1322, simple_loss=0.207, pruned_loss=0.02867, over 972785.59 frames.], batch size: 24, lr: 1.24e-04 2022-05-09 08:20:09,528 INFO [train.py:715] (0/8) Epoch 18, batch 13050, loss[loss=0.1265, simple_loss=0.2081, pruned_loss=0.02242, over 4860.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2072, pruned_loss=0.0287, over 972859.09 frames.], batch size: 20, lr: 1.24e-04 2022-05-09 08:20:48,609 INFO [train.py:715] (0/8) Epoch 18, batch 13100, loss[loss=0.1336, simple_loss=0.2074, pruned_loss=0.02988, over 4864.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2072, pruned_loss=0.02916, over 972162.29 frames.], batch size: 30, lr: 1.24e-04 2022-05-09 08:21:28,136 INFO [train.py:715] (0/8) Epoch 18, batch 13150, loss[loss=0.1282, simple_loss=0.211, pruned_loss=0.02266, over 4692.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2062, pruned_loss=0.02853, over 972215.58 frames.], batch size: 15, lr: 1.24e-04 2022-05-09 08:22:07,408 INFO [train.py:715] (0/8) Epoch 18, batch 13200, loss[loss=0.131, simple_loss=0.2096, pruned_loss=0.02621, over 4969.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2062, pruned_loss=0.02852, over 972355.16 frames.], batch size: 35, lr: 1.24e-04 2022-05-09 08:22:47,220 INFO [train.py:715] (0/8) Epoch 18, batch 13250, loss[loss=0.117, simple_loss=0.1919, pruned_loss=0.02104, over 4764.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2064, pruned_loss=0.02847, over 972030.06 frames.], batch size: 19, lr: 1.24e-04 2022-05-09 08:23:25,808 INFO [train.py:715] (0/8) Epoch 18, batch 13300, loss[loss=0.1499, simple_loss=0.219, pruned_loss=0.04034, over 4949.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2069, pruned_loss=0.02906, over 972070.87 frames.], batch size: 21, lr: 1.24e-04 2022-05-09 08:24:05,547 INFO [train.py:715] (0/8) 
Epoch 18, batch 13350, loss[loss=0.1233, simple_loss=0.1943, pruned_loss=0.02612, over 4883.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2063, pruned_loss=0.02901, over 971571.24 frames.], batch size: 22, lr: 1.24e-04 2022-05-09 08:24:44,550 INFO [train.py:715] (0/8) Epoch 18, batch 13400, loss[loss=0.1137, simple_loss=0.1896, pruned_loss=0.01891, over 4984.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2064, pruned_loss=0.02897, over 971691.92 frames.], batch size: 25, lr: 1.24e-04 2022-05-09 08:25:08,686 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-640000.pt 2022-05-09 08:25:25,444 INFO [train.py:715] (0/8) Epoch 18, batch 13450, loss[loss=0.1314, simple_loss=0.2082, pruned_loss=0.02733, over 4866.00 frames.], tot_loss[loss=0.132, simple_loss=0.2061, pruned_loss=0.02894, over 971908.15 frames.], batch size: 20, lr: 1.23e-04 2022-05-09 08:26:05,135 INFO [train.py:715] (0/8) Epoch 18, batch 13500, loss[loss=0.114, simple_loss=0.1977, pruned_loss=0.0152, over 4941.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2064, pruned_loss=0.02868, over 972229.09 frames.], batch size: 29, lr: 1.23e-04 2022-05-09 08:26:44,080 INFO [train.py:715] (0/8) Epoch 18, batch 13550, loss[loss=0.1266, simple_loss=0.199, pruned_loss=0.02708, over 4969.00 frames.], tot_loss[loss=0.1327, simple_loss=0.207, pruned_loss=0.0292, over 971305.98 frames.], batch size: 24, lr: 1.23e-04 2022-05-09 08:27:23,341 INFO [train.py:715] (0/8) Epoch 18, batch 13600, loss[loss=0.1298, simple_loss=0.2002, pruned_loss=0.02973, over 4824.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2077, pruned_loss=0.0295, over 971363.11 frames.], batch size: 15, lr: 1.23e-04 2022-05-09 08:28:02,152 INFO [train.py:715] (0/8) Epoch 18, batch 13650, loss[loss=0.1457, simple_loss=0.2189, pruned_loss=0.03629, over 4899.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2071, pruned_loss=0.02934, over 971320.61 frames.], batch size: 19, lr: 1.23e-04 2022-05-09 08:28:41,561 INFO [train.py:715] (0/8) Epoch 18, batch 13700, loss[loss=0.1261, simple_loss=0.1962, pruned_loss=0.02793, over 4930.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2077, pruned_loss=0.02939, over 971547.03 frames.], batch size: 29, lr: 1.23e-04 2022-05-09 08:29:20,646 INFO [train.py:715] (0/8) Epoch 18, batch 13750, loss[loss=0.1241, simple_loss=0.1911, pruned_loss=0.02853, over 4803.00 frames.], tot_loss[loss=0.133, simple_loss=0.207, pruned_loss=0.02948, over 972213.91 frames.], batch size: 21, lr: 1.23e-04 2022-05-09 08:29:59,711 INFO [train.py:715] (0/8) Epoch 18, batch 13800, loss[loss=0.146, simple_loss=0.2217, pruned_loss=0.0351, over 4816.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2068, pruned_loss=0.02935, over 972498.48 frames.], batch size: 27, lr: 1.23e-04 2022-05-09 08:30:39,479 INFO [train.py:715] (0/8) Epoch 18, batch 13850, loss[loss=0.1267, simple_loss=0.214, pruned_loss=0.01965, over 4783.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2064, pruned_loss=0.02923, over 972537.97 frames.], batch size: 23, lr: 1.23e-04 2022-05-09 08:31:18,291 INFO [train.py:715] (0/8) Epoch 18, batch 13900, loss[loss=0.1271, simple_loss=0.2014, pruned_loss=0.02644, over 4866.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2058, pruned_loss=0.02919, over 972187.05 frames.], batch size: 20, lr: 1.23e-04 2022-05-09 08:31:57,750 INFO [train.py:715] (0/8) Epoch 18, batch 13950, loss[loss=0.1327, simple_loss=0.1971, pruned_loss=0.03416, over 4973.00 frames.], tot_loss[loss=0.1321, simple_loss=0.206, pruned_loss=0.02908, 
over 972631.55 frames.], batch size: 35, lr: 1.23e-04 2022-05-09 08:32:37,348 INFO [train.py:715] (0/8) Epoch 18, batch 14000, loss[loss=0.1357, simple_loss=0.2189, pruned_loss=0.02622, over 4889.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2068, pruned_loss=0.02937, over 972996.74 frames.], batch size: 22, lr: 1.23e-04 2022-05-09 08:33:17,110 INFO [train.py:715] (0/8) Epoch 18, batch 14050, loss[loss=0.1424, simple_loss=0.215, pruned_loss=0.03489, over 4759.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2071, pruned_loss=0.0291, over 973931.07 frames.], batch size: 16, lr: 1.23e-04 2022-05-09 08:33:56,294 INFO [train.py:715] (0/8) Epoch 18, batch 14100, loss[loss=0.132, simple_loss=0.2112, pruned_loss=0.0264, over 4907.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2071, pruned_loss=0.02929, over 973189.14 frames.], batch size: 22, lr: 1.23e-04 2022-05-09 08:34:35,396 INFO [train.py:715] (0/8) Epoch 18, batch 14150, loss[loss=0.1288, simple_loss=0.2024, pruned_loss=0.02756, over 4854.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2072, pruned_loss=0.02914, over 972649.84 frames.], batch size: 20, lr: 1.23e-04 2022-05-09 08:35:14,782 INFO [train.py:715] (0/8) Epoch 18, batch 14200, loss[loss=0.1436, simple_loss=0.2312, pruned_loss=0.02799, over 4940.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2078, pruned_loss=0.02933, over 972219.17 frames.], batch size: 23, lr: 1.23e-04 2022-05-09 08:35:54,058 INFO [train.py:715] (0/8) Epoch 18, batch 14250, loss[loss=0.1626, simple_loss=0.2363, pruned_loss=0.0445, over 4938.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2081, pruned_loss=0.02965, over 972734.71 frames.], batch size: 21, lr: 1.23e-04 2022-05-09 08:36:33,984 INFO [train.py:715] (0/8) Epoch 18, batch 14300, loss[loss=0.1373, simple_loss=0.2186, pruned_loss=0.02802, over 4919.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2077, pruned_loss=0.02945, over 972181.15 frames.], batch size: 19, lr: 1.23e-04 2022-05-09 08:37:13,315 INFO [train.py:715] (0/8) Epoch 18, batch 14350, loss[loss=0.1323, simple_loss=0.207, pruned_loss=0.0288, over 4805.00 frames.], tot_loss[loss=0.133, simple_loss=0.2072, pruned_loss=0.02943, over 972318.71 frames.], batch size: 21, lr: 1.23e-04 2022-05-09 08:37:52,857 INFO [train.py:715] (0/8) Epoch 18, batch 14400, loss[loss=0.1486, simple_loss=0.2204, pruned_loss=0.03837, over 4970.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2068, pruned_loss=0.0295, over 972080.39 frames.], batch size: 15, lr: 1.23e-04 2022-05-09 08:38:32,499 INFO [train.py:715] (0/8) Epoch 18, batch 14450, loss[loss=0.1499, simple_loss=0.2288, pruned_loss=0.03549, over 4787.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2068, pruned_loss=0.02939, over 971922.50 frames.], batch size: 17, lr: 1.23e-04 2022-05-09 08:39:11,248 INFO [train.py:715] (0/8) Epoch 18, batch 14500, loss[loss=0.1014, simple_loss=0.1657, pruned_loss=0.01854, over 4751.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2061, pruned_loss=0.02871, over 972014.83 frames.], batch size: 12, lr: 1.23e-04 2022-05-09 08:39:50,386 INFO [train.py:715] (0/8) Epoch 18, batch 14550, loss[loss=0.1466, simple_loss=0.2318, pruned_loss=0.03065, over 4910.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2058, pruned_loss=0.02852, over 972237.35 frames.], batch size: 23, lr: 1.23e-04 2022-05-09 08:40:29,520 INFO [train.py:715] (0/8) Epoch 18, batch 14600, loss[loss=0.1173, simple_loss=0.1939, pruned_loss=0.0204, over 4791.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2061, pruned_loss=0.02843, over 971882.90 frames.], 
batch size: 21, lr: 1.23e-04 2022-05-09 08:41:09,221 INFO [train.py:715] (0/8) Epoch 18, batch 14650, loss[loss=0.1225, simple_loss=0.1996, pruned_loss=0.02272, over 4894.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2062, pruned_loss=0.0284, over 972020.58 frames.], batch size: 22, lr: 1.23e-04 2022-05-09 08:41:48,677 INFO [train.py:715] (0/8) Epoch 18, batch 14700, loss[loss=0.1072, simple_loss=0.1831, pruned_loss=0.01565, over 4979.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2072, pruned_loss=0.02912, over 972883.19 frames.], batch size: 15, lr: 1.23e-04 2022-05-09 08:42:28,034 INFO [train.py:715] (0/8) Epoch 18, batch 14750, loss[loss=0.09721, simple_loss=0.1664, pruned_loss=0.01402, over 4818.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2063, pruned_loss=0.02877, over 973205.14 frames.], batch size: 13, lr: 1.23e-04 2022-05-09 08:43:07,463 INFO [train.py:715] (0/8) Epoch 18, batch 14800, loss[loss=0.1421, simple_loss=0.2201, pruned_loss=0.03203, over 4943.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2065, pruned_loss=0.02885, over 972628.50 frames.], batch size: 21, lr: 1.23e-04 2022-05-09 08:43:46,218 INFO [train.py:715] (0/8) Epoch 18, batch 14850, loss[loss=0.1451, simple_loss=0.2103, pruned_loss=0.03994, over 4775.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2071, pruned_loss=0.0293, over 973193.67 frames.], batch size: 12, lr: 1.23e-04 2022-05-09 08:44:25,879 INFO [train.py:715] (0/8) Epoch 18, batch 14900, loss[loss=0.1595, simple_loss=0.2238, pruned_loss=0.04765, over 4761.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2069, pruned_loss=0.02906, over 972997.21 frames.], batch size: 16, lr: 1.23e-04 2022-05-09 08:45:05,547 INFO [train.py:715] (0/8) Epoch 18, batch 14950, loss[loss=0.1197, simple_loss=0.19, pruned_loss=0.02473, over 4847.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2068, pruned_loss=0.02901, over 972912.02 frames.], batch size: 30, lr: 1.23e-04 2022-05-09 08:45:44,813 INFO [train.py:715] (0/8) Epoch 18, batch 15000, loss[loss=0.1301, simple_loss=0.204, pruned_loss=0.02805, over 4916.00 frames.], tot_loss[loss=0.1325, simple_loss=0.207, pruned_loss=0.02896, over 973183.96 frames.], batch size: 18, lr: 1.23e-04 2022-05-09 08:45:44,814 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 08:45:54,766 INFO [train.py:742] (0/8) Epoch 18, validation: loss=0.1048, simple_loss=0.1881, pruned_loss=0.01071, over 914524.00 frames. 
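[Editor's note] The two checkpoint records in this section write pruned_transducer_stateless2/exp/v2/checkpoint-632000.pt and checkpoint-640000.pt, exactly 8000 batches apart, so saving appears to be keyed to the global batch counter rather than to epoch boundaries; the learning rate also ticks down from 1.24e-04 to 1.23e-04 mid-epoch, consistent with a scheduler that decays smoothly with progress. The sketch below shows batch-count-driven checkpointing under those assumptions; the interval is inferred from the two filenames, and the function and argument names are placeholders, not icefall's actual checkpoint.py API.

from pathlib import Path
import torch

def maybe_save_checkpoint(model, optimizer, batch_idx_train: int,
                          exp_dir: Path, save_every_n: int = 8000) -> None:
    """Illustrative periodic checkpointing keyed to the global batch counter.
    The 8000-batch interval matches the gap between checkpoint-632000.pt and
    checkpoint-640000.pt seen in this log; everything else is a placeholder."""
    if batch_idx_train == 0 or batch_idx_train % save_every_n != 0:
        return
    ckpt_path = exp_dir / f"checkpoint-{batch_idx_train}.pt"
    torch.save(
        {
            "model": model.state_dict(),
            "optimizer": optimizer.state_dict(),
            "batch_idx_train": batch_idx_train,
        },
        ckpt_path,
    )
    print(f"Saving checkpoint to {ckpt_path}")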
2022-05-09 08:46:34,347 INFO [train.py:715] (0/8) Epoch 18, batch 15050, loss[loss=0.1338, simple_loss=0.2097, pruned_loss=0.02898, over 4761.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2072, pruned_loss=0.02902, over 972085.13 frames.], batch size: 19, lr: 1.23e-04 2022-05-09 08:47:13,522 INFO [train.py:715] (0/8) Epoch 18, batch 15100, loss[loss=0.1171, simple_loss=0.1885, pruned_loss=0.02283, over 4956.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2072, pruned_loss=0.0289, over 971530.82 frames.], batch size: 21, lr: 1.23e-04 2022-05-09 08:47:53,254 INFO [train.py:715] (0/8) Epoch 18, batch 15150, loss[loss=0.1331, simple_loss=0.2196, pruned_loss=0.02329, over 4803.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2071, pruned_loss=0.02885, over 971459.94 frames.], batch size: 25, lr: 1.23e-04 2022-05-09 08:48:32,388 INFO [train.py:715] (0/8) Epoch 18, batch 15200, loss[loss=0.1523, simple_loss=0.2243, pruned_loss=0.04013, over 4806.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2074, pruned_loss=0.02878, over 971382.91 frames.], batch size: 21, lr: 1.23e-04 2022-05-09 08:49:11,929 INFO [train.py:715] (0/8) Epoch 18, batch 15250, loss[loss=0.1192, simple_loss=0.1966, pruned_loss=0.02091, over 4732.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2071, pruned_loss=0.0287, over 970820.21 frames.], batch size: 16, lr: 1.23e-04 2022-05-09 08:49:51,790 INFO [train.py:715] (0/8) Epoch 18, batch 15300, loss[loss=0.1102, simple_loss=0.1871, pruned_loss=0.01661, over 4760.00 frames.], tot_loss[loss=0.1324, simple_loss=0.207, pruned_loss=0.02885, over 971498.22 frames.], batch size: 19, lr: 1.23e-04 2022-05-09 08:50:31,164 INFO [train.py:715] (0/8) Epoch 18, batch 15350, loss[loss=0.1041, simple_loss=0.1794, pruned_loss=0.01438, over 4955.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2063, pruned_loss=0.02853, over 971826.08 frames.], batch size: 15, lr: 1.23e-04 2022-05-09 08:51:10,082 INFO [train.py:715] (0/8) Epoch 18, batch 15400, loss[loss=0.1345, simple_loss=0.2028, pruned_loss=0.03309, over 4995.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2066, pruned_loss=0.02827, over 971997.67 frames.], batch size: 14, lr: 1.23e-04 2022-05-09 08:51:49,372 INFO [train.py:715] (0/8) Epoch 18, batch 15450, loss[loss=0.1788, simple_loss=0.2405, pruned_loss=0.05858, over 4915.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2064, pruned_loss=0.02867, over 972138.30 frames.], batch size: 17, lr: 1.23e-04 2022-05-09 08:52:28,994 INFO [train.py:715] (0/8) Epoch 18, batch 15500, loss[loss=0.126, simple_loss=0.1876, pruned_loss=0.03214, over 4740.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2058, pruned_loss=0.02877, over 972576.18 frames.], batch size: 12, lr: 1.23e-04 2022-05-09 08:53:08,162 INFO [train.py:715] (0/8) Epoch 18, batch 15550, loss[loss=0.1216, simple_loss=0.201, pruned_loss=0.02109, over 4637.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2056, pruned_loss=0.02843, over 972164.25 frames.], batch size: 13, lr: 1.23e-04 2022-05-09 08:53:47,889 INFO [train.py:715] (0/8) Epoch 18, batch 15600, loss[loss=0.1269, simple_loss=0.2137, pruned_loss=0.02009, over 4816.00 frames.], tot_loss[loss=0.131, simple_loss=0.2055, pruned_loss=0.02827, over 972135.58 frames.], batch size: 26, lr: 1.23e-04 2022-05-09 08:54:28,012 INFO [train.py:715] (0/8) Epoch 18, batch 15650, loss[loss=0.1199, simple_loss=0.1926, pruned_loss=0.02363, over 4919.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2062, pruned_loss=0.02864, over 971459.07 frames.], batch size: 23, lr: 1.23e-04 2022-05-09 08:55:07,610 
INFO [train.py:715] (0/8) Epoch 18, batch 15700, loss[loss=0.155, simple_loss=0.2426, pruned_loss=0.0337, over 4917.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2069, pruned_loss=0.02892, over 971840.84 frames.], batch size: 18, lr: 1.23e-04 2022-05-09 08:55:46,514 INFO [train.py:715] (0/8) Epoch 18, batch 15750, loss[loss=0.132, simple_loss=0.2177, pruned_loss=0.02315, over 4956.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2064, pruned_loss=0.02857, over 971615.98 frames.], batch size: 39, lr: 1.23e-04 2022-05-09 08:56:25,956 INFO [train.py:715] (0/8) Epoch 18, batch 15800, loss[loss=0.11, simple_loss=0.1903, pruned_loss=0.0148, over 4773.00 frames.], tot_loss[loss=0.1319, simple_loss=0.206, pruned_loss=0.02891, over 972058.82 frames.], batch size: 12, lr: 1.23e-04 2022-05-09 08:57:05,866 INFO [train.py:715] (0/8) Epoch 18, batch 15850, loss[loss=0.1381, simple_loss=0.2123, pruned_loss=0.03195, over 4987.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2066, pruned_loss=0.02917, over 972231.51 frames.], batch size: 27, lr: 1.23e-04 2022-05-09 08:57:45,096 INFO [train.py:715] (0/8) Epoch 18, batch 15900, loss[loss=0.1733, simple_loss=0.2576, pruned_loss=0.04448, over 4985.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2066, pruned_loss=0.02905, over 973026.82 frames.], batch size: 15, lr: 1.23e-04 2022-05-09 08:58:24,415 INFO [train.py:715] (0/8) Epoch 18, batch 15950, loss[loss=0.1364, simple_loss=0.2282, pruned_loss=0.02225, over 4826.00 frames.], tot_loss[loss=0.132, simple_loss=0.2064, pruned_loss=0.0288, over 973102.29 frames.], batch size: 26, lr: 1.23e-04 2022-05-09 08:59:04,889 INFO [train.py:715] (0/8) Epoch 18, batch 16000, loss[loss=0.1419, simple_loss=0.2258, pruned_loss=0.02899, over 4823.00 frames.], tot_loss[loss=0.1318, simple_loss=0.206, pruned_loss=0.02875, over 972187.62 frames.], batch size: 26, lr: 1.23e-04 2022-05-09 08:59:45,377 INFO [train.py:715] (0/8) Epoch 18, batch 16050, loss[loss=0.1301, simple_loss=0.2122, pruned_loss=0.024, over 4799.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2072, pruned_loss=0.02913, over 972945.41 frames.], batch size: 25, lr: 1.23e-04 2022-05-09 09:00:24,419 INFO [train.py:715] (0/8) Epoch 18, batch 16100, loss[loss=0.1457, simple_loss=0.2189, pruned_loss=0.03621, over 4801.00 frames.], tot_loss[loss=0.1323, simple_loss=0.207, pruned_loss=0.02886, over 973307.43 frames.], batch size: 24, lr: 1.23e-04 2022-05-09 09:01:03,600 INFO [train.py:715] (0/8) Epoch 18, batch 16150, loss[loss=0.1518, simple_loss=0.2185, pruned_loss=0.04257, over 4985.00 frames.], tot_loss[loss=0.132, simple_loss=0.2065, pruned_loss=0.02874, over 972551.94 frames.], batch size: 31, lr: 1.23e-04 2022-05-09 09:01:43,693 INFO [train.py:715] (0/8) Epoch 18, batch 16200, loss[loss=0.1305, simple_loss=0.2098, pruned_loss=0.0256, over 4926.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2072, pruned_loss=0.02915, over 973563.63 frames.], batch size: 23, lr: 1.23e-04 2022-05-09 09:02:22,642 INFO [train.py:715] (0/8) Epoch 18, batch 16250, loss[loss=0.1046, simple_loss=0.1675, pruned_loss=0.0208, over 4775.00 frames.], tot_loss[loss=0.133, simple_loss=0.2078, pruned_loss=0.02907, over 973196.93 frames.], batch size: 12, lr: 1.23e-04 2022-05-09 09:03:01,670 INFO [train.py:715] (0/8) Epoch 18, batch 16300, loss[loss=0.1401, simple_loss=0.217, pruned_loss=0.03159, over 4885.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2073, pruned_loss=0.02865, over 973454.29 frames.], batch size: 19, lr: 1.23e-04 2022-05-09 09:03:41,211 INFO [train.py:715] (0/8) Epoch 18, 
batch 16350, loss[loss=0.1439, simple_loss=0.2213, pruned_loss=0.03329, over 4780.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2073, pruned_loss=0.02889, over 973191.43 frames.], batch size: 18, lr: 1.23e-04 2022-05-09 09:04:20,326 INFO [train.py:715] (0/8) Epoch 18, batch 16400, loss[loss=0.1427, simple_loss=0.2088, pruned_loss=0.03835, over 4891.00 frames.], tot_loss[loss=0.1334, simple_loss=0.208, pruned_loss=0.02939, over 973362.79 frames.], batch size: 18, lr: 1.23e-04 2022-05-09 09:04:59,284 INFO [train.py:715] (0/8) Epoch 18, batch 16450, loss[loss=0.1161, simple_loss=0.1939, pruned_loss=0.01911, over 4916.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2077, pruned_loss=0.02951, over 972723.63 frames.], batch size: 18, lr: 1.23e-04 2022-05-09 09:05:38,806 INFO [train.py:715] (0/8) Epoch 18, batch 16500, loss[loss=0.1291, simple_loss=0.2104, pruned_loss=0.02389, over 4779.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2079, pruned_loss=0.02967, over 971457.56 frames.], batch size: 18, lr: 1.23e-04 2022-05-09 09:06:18,647 INFO [train.py:715] (0/8) Epoch 18, batch 16550, loss[loss=0.1374, simple_loss=0.2127, pruned_loss=0.03107, over 4986.00 frames.], tot_loss[loss=0.133, simple_loss=0.2073, pruned_loss=0.02934, over 971966.17 frames.], batch size: 15, lr: 1.23e-04 2022-05-09 09:06:57,075 INFO [train.py:715] (0/8) Epoch 18, batch 16600, loss[loss=0.1298, simple_loss=0.2169, pruned_loss=0.02132, over 4810.00 frames.], tot_loss[loss=0.133, simple_loss=0.2075, pruned_loss=0.02928, over 971604.19 frames.], batch size: 21, lr: 1.23e-04 2022-05-09 09:07:36,511 INFO [train.py:715] (0/8) Epoch 18, batch 16650, loss[loss=0.1311, simple_loss=0.203, pruned_loss=0.02962, over 4843.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2067, pruned_loss=0.02914, over 971520.10 frames.], batch size: 13, lr: 1.23e-04 2022-05-09 09:08:15,857 INFO [train.py:715] (0/8) Epoch 18, batch 16700, loss[loss=0.124, simple_loss=0.2016, pruned_loss=0.02321, over 4981.00 frames.], tot_loss[loss=0.1316, simple_loss=0.206, pruned_loss=0.02856, over 971165.73 frames.], batch size: 27, lr: 1.23e-04 2022-05-09 09:08:55,197 INFO [train.py:715] (0/8) Epoch 18, batch 16750, loss[loss=0.1209, simple_loss=0.194, pruned_loss=0.02392, over 4847.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2057, pruned_loss=0.02821, over 971189.58 frames.], batch size: 12, lr: 1.23e-04 2022-05-09 09:09:34,643 INFO [train.py:715] (0/8) Epoch 18, batch 16800, loss[loss=0.1281, simple_loss=0.2005, pruned_loss=0.02784, over 4757.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2059, pruned_loss=0.02828, over 971449.81 frames.], batch size: 16, lr: 1.23e-04 2022-05-09 09:10:13,847 INFO [train.py:715] (0/8) Epoch 18, batch 16850, loss[loss=0.0891, simple_loss=0.1636, pruned_loss=0.007299, over 4858.00 frames.], tot_loss[loss=0.1313, simple_loss=0.206, pruned_loss=0.02826, over 971994.90 frames.], batch size: 20, lr: 1.23e-04 2022-05-09 09:10:53,308 INFO [train.py:715] (0/8) Epoch 18, batch 16900, loss[loss=0.1364, simple_loss=0.208, pruned_loss=0.03244, over 4894.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2066, pruned_loss=0.02823, over 972330.87 frames.], batch size: 17, lr: 1.23e-04 2022-05-09 09:11:32,152 INFO [train.py:715] (0/8) Epoch 18, batch 16950, loss[loss=0.1432, simple_loss=0.2154, pruned_loss=0.03546, over 4988.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2066, pruned_loss=0.02846, over 972451.60 frames.], batch size: 31, lr: 1.23e-04 2022-05-09 09:12:11,611 INFO [train.py:715] (0/8) Epoch 18, batch 17000, 
loss[loss=0.1078, simple_loss=0.185, pruned_loss=0.01532, over 4982.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2064, pruned_loss=0.02821, over 972044.41 frames.], batch size: 14, lr: 1.23e-04 2022-05-09 09:12:51,061 INFO [train.py:715] (0/8) Epoch 18, batch 17050, loss[loss=0.1473, simple_loss=0.2133, pruned_loss=0.0407, over 4745.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2061, pruned_loss=0.0285, over 972397.55 frames.], batch size: 16, lr: 1.23e-04 2022-05-09 09:13:30,536 INFO [train.py:715] (0/8) Epoch 18, batch 17100, loss[loss=0.1311, simple_loss=0.2028, pruned_loss=0.02968, over 4825.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2058, pruned_loss=0.02834, over 972091.09 frames.], batch size: 15, lr: 1.23e-04 2022-05-09 09:14:10,114 INFO [train.py:715] (0/8) Epoch 18, batch 17150, loss[loss=0.1622, simple_loss=0.2228, pruned_loss=0.05085, over 4977.00 frames.], tot_loss[loss=0.1318, simple_loss=0.206, pruned_loss=0.02876, over 972084.10 frames.], batch size: 15, lr: 1.23e-04 2022-05-09 09:14:49,238 INFO [train.py:715] (0/8) Epoch 18, batch 17200, loss[loss=0.1226, simple_loss=0.2075, pruned_loss=0.01888, over 4859.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2067, pruned_loss=0.02894, over 972297.09 frames.], batch size: 20, lr: 1.23e-04 2022-05-09 09:15:28,957 INFO [train.py:715] (0/8) Epoch 18, batch 17250, loss[loss=0.1669, simple_loss=0.2287, pruned_loss=0.05261, over 4919.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2072, pruned_loss=0.02878, over 971888.55 frames.], batch size: 17, lr: 1.23e-04 2022-05-09 09:16:08,221 INFO [train.py:715] (0/8) Epoch 18, batch 17300, loss[loss=0.1559, simple_loss=0.2252, pruned_loss=0.04332, over 4886.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2068, pruned_loss=0.02869, over 971683.09 frames.], batch size: 16, lr: 1.23e-04 2022-05-09 09:16:48,151 INFO [train.py:715] (0/8) Epoch 18, batch 17350, loss[loss=0.1268, simple_loss=0.2059, pruned_loss=0.02389, over 4945.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2066, pruned_loss=0.0285, over 971917.98 frames.], batch size: 24, lr: 1.23e-04 2022-05-09 09:17:27,212 INFO [train.py:715] (0/8) Epoch 18, batch 17400, loss[loss=0.147, simple_loss=0.2149, pruned_loss=0.03951, over 4706.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2066, pruned_loss=0.02854, over 971876.93 frames.], batch size: 15, lr: 1.23e-04 2022-05-09 09:18:07,004 INFO [train.py:715] (0/8) Epoch 18, batch 17450, loss[loss=0.1351, simple_loss=0.2099, pruned_loss=0.03011, over 4791.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2067, pruned_loss=0.0285, over 971960.32 frames.], batch size: 18, lr: 1.23e-04 2022-05-09 09:18:46,084 INFO [train.py:715] (0/8) Epoch 18, batch 17500, loss[loss=0.1353, simple_loss=0.2174, pruned_loss=0.02658, over 4930.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2059, pruned_loss=0.0279, over 971827.05 frames.], batch size: 39, lr: 1.23e-04 2022-05-09 09:19:24,710 INFO [train.py:715] (0/8) Epoch 18, batch 17550, loss[loss=0.1356, simple_loss=0.2205, pruned_loss=0.02532, over 4917.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2056, pruned_loss=0.02784, over 971589.12 frames.], batch size: 29, lr: 1.23e-04 2022-05-09 09:20:04,279 INFO [train.py:715] (0/8) Epoch 18, batch 17600, loss[loss=0.1012, simple_loss=0.1703, pruned_loss=0.01607, over 4757.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2055, pruned_loss=0.02839, over 971444.14 frames.], batch size: 12, lr: 1.23e-04 2022-05-09 09:20:43,546 INFO [train.py:715] (0/8) Epoch 18, batch 17650, loss[loss=0.1385, 
simple_loss=0.2064, pruned_loss=0.03525, over 4914.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2057, pruned_loss=0.02846, over 971910.83 frames.], batch size: 18, lr: 1.23e-04 2022-05-09 09:21:22,850 INFO [train.py:715] (0/8) Epoch 18, batch 17700, loss[loss=0.1282, simple_loss=0.1914, pruned_loss=0.03252, over 4788.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2058, pruned_loss=0.02847, over 972094.87 frames.], batch size: 17, lr: 1.23e-04 2022-05-09 09:22:01,948 INFO [train.py:715] (0/8) Epoch 18, batch 17750, loss[loss=0.1226, simple_loss=0.1872, pruned_loss=0.029, over 4846.00 frames.], tot_loss[loss=0.132, simple_loss=0.2062, pruned_loss=0.02886, over 972162.97 frames.], batch size: 32, lr: 1.23e-04 2022-05-09 09:22:41,546 INFO [train.py:715] (0/8) Epoch 18, batch 17800, loss[loss=0.1171, simple_loss=0.1947, pruned_loss=0.01976, over 4825.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2057, pruned_loss=0.02855, over 973492.17 frames.], batch size: 25, lr: 1.23e-04 2022-05-09 09:23:20,833 INFO [train.py:715] (0/8) Epoch 18, batch 17850, loss[loss=0.1107, simple_loss=0.1865, pruned_loss=0.01748, over 4801.00 frames.], tot_loss[loss=0.1315, simple_loss=0.206, pruned_loss=0.02852, over 971952.10 frames.], batch size: 21, lr: 1.23e-04 2022-05-09 09:23:59,344 INFO [train.py:715] (0/8) Epoch 18, batch 17900, loss[loss=0.1356, simple_loss=0.2119, pruned_loss=0.02968, over 4829.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2057, pruned_loss=0.0283, over 971999.13 frames.], batch size: 15, lr: 1.23e-04 2022-05-09 09:24:39,458 INFO [train.py:715] (0/8) Epoch 18, batch 17950, loss[loss=0.1279, simple_loss=0.1961, pruned_loss=0.02988, over 4969.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2063, pruned_loss=0.02852, over 972650.92 frames.], batch size: 15, lr: 1.23e-04 2022-05-09 09:25:18,520 INFO [train.py:715] (0/8) Epoch 18, batch 18000, loss[loss=0.1244, simple_loss=0.2101, pruned_loss=0.0193, over 4943.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2062, pruned_loss=0.02841, over 972607.34 frames.], batch size: 29, lr: 1.23e-04 2022-05-09 09:25:18,521 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 09:25:28,383 INFO [train.py:742] (0/8) Epoch 18, validation: loss=0.1046, simple_loss=0.1878, pruned_loss=0.01063, over 914524.00 frames. 
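In the per-batch entries above, the reported loss is consistent with 0.5 * simple_loss + pruned_loss; the 0.5 weight is inferred from the logged values themselves rather than read out of train.py, so treat this as a sketch of how the columns relate, not a statement of the training code. A minimal check against the tot_loss values at epoch 18, batch 18000:

# Illustrative check only: the logged loss appears to equal 0.5 * simple_loss + pruned_loss.
# Numbers copied from the tot_loss[...] entry at epoch 18, batch 18000 above.
simple_loss, pruned_loss = 0.2062, 0.02841
print(round(0.5 * simple_loss + pruned_loss, 4))   # 0.1315, matching the logged loss=0.1315

The validation line just above (loss=0.1046, simple_loss=0.1878, pruned_loss=0.01063) satisfies the same relation to within rounding of the logged values.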
2022-05-09 09:26:07,770 INFO [train.py:715] (0/8) Epoch 18, batch 18050, loss[loss=0.1105, simple_loss=0.1848, pruned_loss=0.01813, over 4837.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2054, pruned_loss=0.02811, over 972487.06 frames.], batch size: 13, lr: 1.23e-04 2022-05-09 09:26:47,165 INFO [train.py:715] (0/8) Epoch 18, batch 18100, loss[loss=0.1313, simple_loss=0.2015, pruned_loss=0.03052, over 4747.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2054, pruned_loss=0.02812, over 972568.18 frames.], batch size: 16, lr: 1.23e-04 2022-05-09 09:27:26,268 INFO [train.py:715] (0/8) Epoch 18, batch 18150, loss[loss=0.1334, simple_loss=0.2079, pruned_loss=0.02947, over 4912.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2053, pruned_loss=0.02825, over 972635.02 frames.], batch size: 19, lr: 1.23e-04 2022-05-09 09:28:06,059 INFO [train.py:715] (0/8) Epoch 18, batch 18200, loss[loss=0.1308, simple_loss=0.2057, pruned_loss=0.02794, over 4824.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2054, pruned_loss=0.02799, over 972504.12 frames.], batch size: 26, lr: 1.23e-04 2022-05-09 09:28:45,751 INFO [train.py:715] (0/8) Epoch 18, batch 18250, loss[loss=0.114, simple_loss=0.1912, pruned_loss=0.0184, over 4764.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2058, pruned_loss=0.02864, over 972706.55 frames.], batch size: 19, lr: 1.23e-04 2022-05-09 09:29:24,151 INFO [train.py:715] (0/8) Epoch 18, batch 18300, loss[loss=0.1455, simple_loss=0.2204, pruned_loss=0.03529, over 4789.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2069, pruned_loss=0.02901, over 973000.65 frames.], batch size: 14, lr: 1.23e-04 2022-05-09 09:30:03,823 INFO [train.py:715] (0/8) Epoch 18, batch 18350, loss[loss=0.1286, simple_loss=0.201, pruned_loss=0.02812, over 4809.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2078, pruned_loss=0.0297, over 972710.10 frames.], batch size: 21, lr: 1.23e-04 2022-05-09 09:30:43,380 INFO [train.py:715] (0/8) Epoch 18, batch 18400, loss[loss=0.1366, simple_loss=0.2146, pruned_loss=0.0293, over 4761.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2074, pruned_loss=0.02951, over 972095.01 frames.], batch size: 14, lr: 1.23e-04 2022-05-09 09:31:22,380 INFO [train.py:715] (0/8) Epoch 18, batch 18450, loss[loss=0.138, simple_loss=0.2139, pruned_loss=0.03103, over 4773.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2067, pruned_loss=0.02897, over 972538.74 frames.], batch size: 18, lr: 1.23e-04 2022-05-09 09:32:01,510 INFO [train.py:715] (0/8) Epoch 18, batch 18500, loss[loss=0.1405, simple_loss=0.2129, pruned_loss=0.03404, over 4987.00 frames.], tot_loss[loss=0.133, simple_loss=0.2073, pruned_loss=0.02932, over 971988.43 frames.], batch size: 15, lr: 1.23e-04 2022-05-09 09:32:40,862 INFO [train.py:715] (0/8) Epoch 18, batch 18550, loss[loss=0.1937, simple_loss=0.2597, pruned_loss=0.0638, over 4922.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2075, pruned_loss=0.02956, over 972657.28 frames.], batch size: 23, lr: 1.23e-04 2022-05-09 09:33:20,072 INFO [train.py:715] (0/8) Epoch 18, batch 18600, loss[loss=0.1393, simple_loss=0.2277, pruned_loss=0.02547, over 4809.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2073, pruned_loss=0.02916, over 972057.00 frames.], batch size: 21, lr: 1.23e-04 2022-05-09 09:33:58,711 INFO [train.py:715] (0/8) Epoch 18, batch 18650, loss[loss=0.1423, simple_loss=0.2163, pruned_loss=0.0341, over 4938.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2071, pruned_loss=0.0294, over 972623.18 frames.], batch size: 21, lr: 1.23e-04 2022-05-09 09:34:38,205 INFO 
[train.py:715] (0/8) Epoch 18, batch 18700, loss[loss=0.1616, simple_loss=0.2331, pruned_loss=0.04503, over 4912.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2078, pruned_loss=0.02978, over 972124.46 frames.], batch size: 17, lr: 1.23e-04 2022-05-09 09:35:17,420 INFO [train.py:715] (0/8) Epoch 18, batch 18750, loss[loss=0.1259, simple_loss=0.1923, pruned_loss=0.02971, over 4861.00 frames.], tot_loss[loss=0.1343, simple_loss=0.2083, pruned_loss=0.0301, over 972941.24 frames.], batch size: 16, lr: 1.23e-04 2022-05-09 09:35:56,635 INFO [train.py:715] (0/8) Epoch 18, batch 18800, loss[loss=0.1491, simple_loss=0.2221, pruned_loss=0.03809, over 4842.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2083, pruned_loss=0.02992, over 971934.59 frames.], batch size: 30, lr: 1.23e-04 2022-05-09 09:36:35,995 INFO [train.py:715] (0/8) Epoch 18, batch 18850, loss[loss=0.117, simple_loss=0.197, pruned_loss=0.01853, over 4986.00 frames.], tot_loss[loss=0.1342, simple_loss=0.2085, pruned_loss=0.02999, over 972630.01 frames.], batch size: 28, lr: 1.23e-04 2022-05-09 09:37:15,844 INFO [train.py:715] (0/8) Epoch 18, batch 18900, loss[loss=0.1312, simple_loss=0.2039, pruned_loss=0.02923, over 4861.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2073, pruned_loss=0.0294, over 972012.14 frames.], batch size: 16, lr: 1.23e-04 2022-05-09 09:37:54,909 INFO [train.py:715] (0/8) Epoch 18, batch 18950, loss[loss=0.1312, simple_loss=0.2106, pruned_loss=0.02591, over 4974.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2079, pruned_loss=0.02961, over 972396.69 frames.], batch size: 24, lr: 1.23e-04 2022-05-09 09:38:33,354 INFO [train.py:715] (0/8) Epoch 18, batch 19000, loss[loss=0.1058, simple_loss=0.1822, pruned_loss=0.01468, over 4887.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2074, pruned_loss=0.02956, over 972743.51 frames.], batch size: 22, lr: 1.23e-04 2022-05-09 09:39:12,865 INFO [train.py:715] (0/8) Epoch 18, batch 19050, loss[loss=0.1382, simple_loss=0.2061, pruned_loss=0.03515, over 4840.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2075, pruned_loss=0.0294, over 972684.63 frames.], batch size: 32, lr: 1.23e-04 2022-05-09 09:39:51,871 INFO [train.py:715] (0/8) Epoch 18, batch 19100, loss[loss=0.1296, simple_loss=0.2047, pruned_loss=0.02728, over 4938.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2076, pruned_loss=0.02937, over 972964.52 frames.], batch size: 18, lr: 1.23e-04 2022-05-09 09:40:31,188 INFO [train.py:715] (0/8) Epoch 18, batch 19150, loss[loss=0.1576, simple_loss=0.2282, pruned_loss=0.04353, over 4921.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2069, pruned_loss=0.0294, over 973810.56 frames.], batch size: 23, lr: 1.23e-04 2022-05-09 09:41:11,055 INFO [train.py:715] (0/8) Epoch 18, batch 19200, loss[loss=0.1361, simple_loss=0.2113, pruned_loss=0.03048, over 4957.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2068, pruned_loss=0.02935, over 974173.48 frames.], batch size: 21, lr: 1.23e-04 2022-05-09 09:41:50,578 INFO [train.py:715] (0/8) Epoch 18, batch 19250, loss[loss=0.1177, simple_loss=0.2036, pruned_loss=0.01595, over 4801.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2076, pruned_loss=0.02994, over 973383.70 frames.], batch size: 25, lr: 1.23e-04 2022-05-09 09:42:29,654 INFO [train.py:715] (0/8) Epoch 18, batch 19300, loss[loss=0.138, simple_loss=0.2132, pruned_loss=0.03139, over 4969.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2065, pruned_loss=0.02951, over 973082.57 frames.], batch size: 39, lr: 1.23e-04 2022-05-09 09:43:08,119 INFO [train.py:715] (0/8) 
Epoch 18, batch 19350, loss[loss=0.126, simple_loss=0.1976, pruned_loss=0.02723, over 4856.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2068, pruned_loss=0.02931, over 973137.38 frames.], batch size: 20, lr: 1.23e-04 2022-05-09 09:43:47,525 INFO [train.py:715] (0/8) Epoch 18, batch 19400, loss[loss=0.1453, simple_loss=0.2125, pruned_loss=0.03907, over 4826.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2067, pruned_loss=0.02926, over 972052.25 frames.], batch size: 15, lr: 1.23e-04 2022-05-09 09:44:26,731 INFO [train.py:715] (0/8) Epoch 18, batch 19450, loss[loss=0.1294, simple_loss=0.1986, pruned_loss=0.03008, over 4976.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2064, pruned_loss=0.02901, over 972497.43 frames.], batch size: 15, lr: 1.23e-04 2022-05-09 09:45:05,482 INFO [train.py:715] (0/8) Epoch 18, batch 19500, loss[loss=0.1174, simple_loss=0.1831, pruned_loss=0.02588, over 4835.00 frames.], tot_loss[loss=0.131, simple_loss=0.2053, pruned_loss=0.02832, over 972472.98 frames.], batch size: 12, lr: 1.23e-04 2022-05-09 09:45:44,652 INFO [train.py:715] (0/8) Epoch 18, batch 19550, loss[loss=0.1224, simple_loss=0.2024, pruned_loss=0.0212, over 4817.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2051, pruned_loss=0.02808, over 972174.79 frames.], batch size: 12, lr: 1.23e-04 2022-05-09 09:46:24,061 INFO [train.py:715] (0/8) Epoch 18, batch 19600, loss[loss=0.1393, simple_loss=0.2135, pruned_loss=0.03257, over 4931.00 frames.], tot_loss[loss=0.131, simple_loss=0.2056, pruned_loss=0.02819, over 972930.95 frames.], batch size: 39, lr: 1.23e-04 2022-05-09 09:47:02,884 INFO [train.py:715] (0/8) Epoch 18, batch 19650, loss[loss=0.2181, simple_loss=0.2943, pruned_loss=0.07097, over 4949.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2069, pruned_loss=0.02876, over 972553.25 frames.], batch size: 39, lr: 1.23e-04 2022-05-09 09:47:41,714 INFO [train.py:715] (0/8) Epoch 18, batch 19700, loss[loss=0.1623, simple_loss=0.2406, pruned_loss=0.04201, over 4834.00 frames.], tot_loss[loss=0.133, simple_loss=0.2075, pruned_loss=0.02927, over 972740.78 frames.], batch size: 30, lr: 1.23e-04 2022-05-09 09:48:21,732 INFO [train.py:715] (0/8) Epoch 18, batch 19750, loss[loss=0.1589, simple_loss=0.227, pruned_loss=0.0454, over 4755.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2073, pruned_loss=0.02922, over 973089.28 frames.], batch size: 19, lr: 1.23e-04 2022-05-09 09:49:01,596 INFO [train.py:715] (0/8) Epoch 18, batch 19800, loss[loss=0.1171, simple_loss=0.1947, pruned_loss=0.01971, over 4827.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2073, pruned_loss=0.02922, over 972962.46 frames.], batch size: 27, lr: 1.23e-04 2022-05-09 09:49:40,674 INFO [train.py:715] (0/8) Epoch 18, batch 19850, loss[loss=0.1077, simple_loss=0.179, pruned_loss=0.01819, over 4907.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2075, pruned_loss=0.02983, over 972682.01 frames.], batch size: 18, lr: 1.23e-04 2022-05-09 09:50:20,123 INFO [train.py:715] (0/8) Epoch 18, batch 19900, loss[loss=0.1678, simple_loss=0.2364, pruned_loss=0.04961, over 4970.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2068, pruned_loss=0.02945, over 972571.22 frames.], batch size: 35, lr: 1.23e-04 2022-05-09 09:50:59,802 INFO [train.py:715] (0/8) Epoch 18, batch 19950, loss[loss=0.1341, simple_loss=0.2193, pruned_loss=0.02443, over 4753.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2064, pruned_loss=0.0293, over 973376.05 frames.], batch size: 19, lr: 1.23e-04 2022-05-09 09:51:39,045 INFO [train.py:715] (0/8) Epoch 18, batch 20000, 
loss[loss=0.1286, simple_loss=0.1984, pruned_loss=0.0294, over 4825.00 frames.], tot_loss[loss=0.133, simple_loss=0.2067, pruned_loss=0.02967, over 973335.63 frames.], batch size: 15, lr: 1.23e-04 2022-05-09 09:52:18,800 INFO [train.py:715] (0/8) Epoch 18, batch 20050, loss[loss=0.1296, simple_loss=0.2121, pruned_loss=0.02353, over 4857.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2054, pruned_loss=0.02903, over 972828.07 frames.], batch size: 20, lr: 1.23e-04 2022-05-09 09:52:59,019 INFO [train.py:715] (0/8) Epoch 18, batch 20100, loss[loss=0.1439, simple_loss=0.2224, pruned_loss=0.03271, over 4913.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2064, pruned_loss=0.02944, over 973031.27 frames.], batch size: 29, lr: 1.23e-04 2022-05-09 09:53:39,145 INFO [train.py:715] (0/8) Epoch 18, batch 20150, loss[loss=0.1074, simple_loss=0.1918, pruned_loss=0.01151, over 4949.00 frames.], tot_loss[loss=0.1321, simple_loss=0.206, pruned_loss=0.02912, over 972216.39 frames.], batch size: 24, lr: 1.23e-04 2022-05-09 09:54:18,208 INFO [train.py:715] (0/8) Epoch 18, batch 20200, loss[loss=0.1483, simple_loss=0.208, pruned_loss=0.04429, over 4955.00 frames.], tot_loss[loss=0.133, simple_loss=0.207, pruned_loss=0.02946, over 972854.49 frames.], batch size: 24, lr: 1.23e-04 2022-05-09 09:54:57,191 INFO [train.py:715] (0/8) Epoch 18, batch 20250, loss[loss=0.1333, simple_loss=0.2095, pruned_loss=0.02852, over 4931.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2068, pruned_loss=0.02912, over 973915.41 frames.], batch size: 23, lr: 1.23e-04 2022-05-09 09:55:36,872 INFO [train.py:715] (0/8) Epoch 18, batch 20300, loss[loss=0.1563, simple_loss=0.2223, pruned_loss=0.04512, over 4860.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2061, pruned_loss=0.02904, over 973413.84 frames.], batch size: 30, lr: 1.23e-04 2022-05-09 09:56:16,005 INFO [train.py:715] (0/8) Epoch 18, batch 20350, loss[loss=0.1256, simple_loss=0.209, pruned_loss=0.02108, over 4960.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2065, pruned_loss=0.02933, over 973057.32 frames.], batch size: 24, lr: 1.23e-04 2022-05-09 09:56:55,262 INFO [train.py:715] (0/8) Epoch 18, batch 20400, loss[loss=0.1273, simple_loss=0.2091, pruned_loss=0.02271, over 4787.00 frames.], tot_loss[loss=0.132, simple_loss=0.2059, pruned_loss=0.02902, over 973146.11 frames.], batch size: 17, lr: 1.23e-04 2022-05-09 09:57:34,098 INFO [train.py:715] (0/8) Epoch 18, batch 20450, loss[loss=0.1315, simple_loss=0.2052, pruned_loss=0.02897, over 4968.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2053, pruned_loss=0.02854, over 973104.23 frames.], batch size: 25, lr: 1.23e-04 2022-05-09 09:58:14,207 INFO [train.py:715] (0/8) Epoch 18, batch 20500, loss[loss=0.128, simple_loss=0.2023, pruned_loss=0.02682, over 4781.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2059, pruned_loss=0.02878, over 972309.84 frames.], batch size: 17, lr: 1.23e-04 2022-05-09 09:58:52,921 INFO [train.py:715] (0/8) Epoch 18, batch 20550, loss[loss=0.1617, simple_loss=0.2268, pruned_loss=0.04828, over 4954.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2061, pruned_loss=0.02927, over 971950.83 frames.], batch size: 14, lr: 1.23e-04 2022-05-09 09:59:31,852 INFO [train.py:715] (0/8) Epoch 18, batch 20600, loss[loss=0.1459, simple_loss=0.228, pruned_loss=0.0319, over 4804.00 frames.], tot_loss[loss=0.132, simple_loss=0.2058, pruned_loss=0.02908, over 972272.72 frames.], batch size: 13, lr: 1.23e-04 2022-05-09 10:00:10,870 INFO [train.py:715] (0/8) Epoch 18, batch 20650, loss[loss=0.1657, 
simple_loss=0.2296, pruned_loss=0.05092, over 4990.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2065, pruned_loss=0.02902, over 971552.13 frames.], batch size: 33, lr: 1.23e-04 2022-05-09 10:00:50,420 INFO [train.py:715] (0/8) Epoch 18, batch 20700, loss[loss=0.1223, simple_loss=0.1997, pruned_loss=0.02241, over 4864.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2055, pruned_loss=0.02869, over 971963.59 frames.], batch size: 16, lr: 1.23e-04 2022-05-09 10:01:28,859 INFO [train.py:715] (0/8) Epoch 18, batch 20750, loss[loss=0.1409, simple_loss=0.225, pruned_loss=0.02836, over 4787.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2056, pruned_loss=0.02876, over 972149.95 frames.], batch size: 17, lr: 1.23e-04 2022-05-09 10:02:08,330 INFO [train.py:715] (0/8) Epoch 18, batch 20800, loss[loss=0.1285, simple_loss=0.1968, pruned_loss=0.03012, over 4758.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2057, pruned_loss=0.02858, over 972471.49 frames.], batch size: 16, lr: 1.23e-04 2022-05-09 10:02:47,768 INFO [train.py:715] (0/8) Epoch 18, batch 20850, loss[loss=0.1406, simple_loss=0.2163, pruned_loss=0.03244, over 4938.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2054, pruned_loss=0.02852, over 972600.23 frames.], batch size: 35, lr: 1.23e-04 2022-05-09 10:03:26,622 INFO [train.py:715] (0/8) Epoch 18, batch 20900, loss[loss=0.1336, simple_loss=0.21, pruned_loss=0.02861, over 4878.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2059, pruned_loss=0.02867, over 972589.76 frames.], batch size: 16, lr: 1.23e-04 2022-05-09 10:04:05,322 INFO [train.py:715] (0/8) Epoch 18, batch 20950, loss[loss=0.1513, simple_loss=0.2206, pruned_loss=0.04101, over 4976.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2063, pruned_loss=0.02906, over 973430.94 frames.], batch size: 35, lr: 1.23e-04 2022-05-09 10:04:44,842 INFO [train.py:715] (0/8) Epoch 18, batch 21000, loss[loss=0.1382, simple_loss=0.2126, pruned_loss=0.03186, over 4911.00 frames.], tot_loss[loss=0.1317, simple_loss=0.206, pruned_loss=0.02873, over 973490.01 frames.], batch size: 19, lr: 1.23e-04 2022-05-09 10:04:44,843 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 10:04:54,817 INFO [train.py:742] (0/8) Epoch 18, validation: loss=0.1046, simple_loss=0.1879, pruned_loss=0.01059, over 914524.00 frames. 
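To pull numbers out of a dump like this, a small ad hoc parser is enough. The pattern below targets exactly the per-batch entry format shown above; PATTERN and parse_train_log are hypothetical names for this sketch, not part of icefall.

import re

# Extract (epoch, batch, tot_loss, lr) from per-batch entries of the form shown above.
# re.S lets the match cross the arbitrary physical line breaks in this dump.
PATTERN = re.compile(
    r"Epoch (\d+), batch (\d+),.*?tot_loss\[loss=([\d.]+).*?\], batch size: \d+, lr: ([\d.e-]+)",
    re.S,
)

def parse_train_log(text):
    """Yield (epoch, batch, tot_loss, lr) tuples from a chunk of log text."""
    for m in PATTERN.finditer(text):
        yield int(m.group(1)), int(m.group(2)), float(m.group(3)), float(m.group(4))

Feeding this stretch of the log through parse_train_log gives tot_loss and lr per batch index, which is convenient for plotting the training curve.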
2022-05-09 10:05:34,563 INFO [train.py:715] (0/8) Epoch 18, batch 21050, loss[loss=0.1593, simple_loss=0.2258, pruned_loss=0.04642, over 4858.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2066, pruned_loss=0.02894, over 973358.85 frames.], batch size: 32, lr: 1.23e-04 2022-05-09 10:06:14,352 INFO [train.py:715] (0/8) Epoch 18, batch 21100, loss[loss=0.1227, simple_loss=0.1947, pruned_loss=0.02539, over 4800.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2058, pruned_loss=0.02862, over 973805.99 frames.], batch size: 21, lr: 1.23e-04 2022-05-09 10:06:53,520 INFO [train.py:715] (0/8) Epoch 18, batch 21150, loss[loss=0.1317, simple_loss=0.1979, pruned_loss=0.03277, over 4806.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2062, pruned_loss=0.02911, over 974329.84 frames.], batch size: 14, lr: 1.23e-04 2022-05-09 10:07:33,001 INFO [train.py:715] (0/8) Epoch 18, batch 21200, loss[loss=0.1367, simple_loss=0.2197, pruned_loss=0.02683, over 4982.00 frames.], tot_loss[loss=0.132, simple_loss=0.2062, pruned_loss=0.02892, over 974098.15 frames.], batch size: 28, lr: 1.23e-04 2022-05-09 10:08:12,705 INFO [train.py:715] (0/8) Epoch 18, batch 21250, loss[loss=0.1179, simple_loss=0.1844, pruned_loss=0.02572, over 4875.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2065, pruned_loss=0.02906, over 973503.97 frames.], batch size: 32, lr: 1.23e-04 2022-05-09 10:08:51,644 INFO [train.py:715] (0/8) Epoch 18, batch 21300, loss[loss=0.143, simple_loss=0.2365, pruned_loss=0.02476, over 4691.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2059, pruned_loss=0.02879, over 973010.59 frames.], batch size: 15, lr: 1.23e-04 2022-05-09 10:09:30,190 INFO [train.py:715] (0/8) Epoch 18, batch 21350, loss[loss=0.109, simple_loss=0.1881, pruned_loss=0.01494, over 4780.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2057, pruned_loss=0.0284, over 973670.91 frames.], batch size: 12, lr: 1.23e-04 2022-05-09 10:10:09,580 INFO [train.py:715] (0/8) Epoch 18, batch 21400, loss[loss=0.1182, simple_loss=0.189, pruned_loss=0.02371, over 4825.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2059, pruned_loss=0.0286, over 973214.15 frames.], batch size: 13, lr: 1.23e-04 2022-05-09 10:10:33,520 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-648000.pt 2022-05-09 10:10:51,760 INFO [train.py:715] (0/8) Epoch 18, batch 21450, loss[loss=0.1136, simple_loss=0.1833, pruned_loss=0.02193, over 4814.00 frames.], tot_loss[loss=0.132, simple_loss=0.2064, pruned_loss=0.02879, over 972410.54 frames.], batch size: 27, lr: 1.23e-04 2022-05-09 10:11:30,941 INFO [train.py:715] (0/8) Epoch 18, batch 21500, loss[loss=0.1143, simple_loss=0.1856, pruned_loss=0.02146, over 4733.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2067, pruned_loss=0.02891, over 973030.03 frames.], batch size: 12, lr: 1.23e-04 2022-05-09 10:12:09,691 INFO [train.py:715] (0/8) Epoch 18, batch 21550, loss[loss=0.1058, simple_loss=0.1726, pruned_loss=0.01953, over 4762.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2067, pruned_loss=0.02848, over 973023.49 frames.], batch size: 12, lr: 1.23e-04 2022-05-09 10:12:49,086 INFO [train.py:715] (0/8) Epoch 18, batch 21600, loss[loss=0.1377, simple_loss=0.2111, pruned_loss=0.03215, over 4970.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2073, pruned_loss=0.02876, over 973068.13 frames.], batch size: 24, lr: 1.23e-04 2022-05-09 10:13:28,301 INFO [train.py:715] (0/8) Epoch 18, batch 21650, loss[loss=0.1205, simple_loss=0.2137, pruned_loss=0.01362, over 4759.00 frames.], 
tot_loss[loss=0.1324, simple_loss=0.2073, pruned_loss=0.02881, over 973445.25 frames.], batch size: 19, lr: 1.23e-04 2022-05-09 10:14:06,693 INFO [train.py:715] (0/8) Epoch 18, batch 21700, loss[loss=0.1122, simple_loss=0.1896, pruned_loss=0.01743, over 4746.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2074, pruned_loss=0.02848, over 973648.54 frames.], batch size: 16, lr: 1.23e-04 2022-05-09 10:14:45,678 INFO [train.py:715] (0/8) Epoch 18, batch 21750, loss[loss=0.1296, simple_loss=0.2106, pruned_loss=0.02428, over 4854.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2067, pruned_loss=0.02828, over 973724.95 frames.], batch size: 20, lr: 1.23e-04 2022-05-09 10:15:24,819 INFO [train.py:715] (0/8) Epoch 18, batch 21800, loss[loss=0.1171, simple_loss=0.1931, pruned_loss=0.02055, over 4826.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2062, pruned_loss=0.02805, over 973216.21 frames.], batch size: 27, lr: 1.23e-04 2022-05-09 10:16:04,132 INFO [train.py:715] (0/8) Epoch 18, batch 21850, loss[loss=0.1441, simple_loss=0.2248, pruned_loss=0.0317, over 4861.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2065, pruned_loss=0.02812, over 973235.20 frames.], batch size: 22, lr: 1.23e-04 2022-05-09 10:16:43,558 INFO [train.py:715] (0/8) Epoch 18, batch 21900, loss[loss=0.1123, simple_loss=0.1821, pruned_loss=0.02128, over 4962.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2064, pruned_loss=0.02835, over 974054.59 frames.], batch size: 35, lr: 1.23e-04 2022-05-09 10:17:23,079 INFO [train.py:715] (0/8) Epoch 18, batch 21950, loss[loss=0.1192, simple_loss=0.1883, pruned_loss=0.02508, over 4636.00 frames.], tot_loss[loss=0.1308, simple_loss=0.206, pruned_loss=0.02779, over 973359.87 frames.], batch size: 13, lr: 1.23e-04 2022-05-09 10:18:02,133 INFO [train.py:715] (0/8) Epoch 18, batch 22000, loss[loss=0.1412, simple_loss=0.224, pruned_loss=0.02921, over 4763.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2063, pruned_loss=0.02814, over 972642.31 frames.], batch size: 18, lr: 1.23e-04 2022-05-09 10:18:41,238 INFO [train.py:715] (0/8) Epoch 18, batch 22050, loss[loss=0.1371, simple_loss=0.2007, pruned_loss=0.03672, over 4842.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2057, pruned_loss=0.02798, over 972668.39 frames.], batch size: 13, lr: 1.23e-04 2022-05-09 10:19:20,729 INFO [train.py:715] (0/8) Epoch 18, batch 22100, loss[loss=0.1465, simple_loss=0.2329, pruned_loss=0.03002, over 4975.00 frames.], tot_loss[loss=0.131, simple_loss=0.2059, pruned_loss=0.02804, over 973002.40 frames.], batch size: 15, lr: 1.23e-04 2022-05-09 10:19:59,601 INFO [train.py:715] (0/8) Epoch 18, batch 22150, loss[loss=0.1409, simple_loss=0.215, pruned_loss=0.03334, over 4772.00 frames.], tot_loss[loss=0.1295, simple_loss=0.2042, pruned_loss=0.02739, over 972181.06 frames.], batch size: 19, lr: 1.23e-04 2022-05-09 10:20:39,095 INFO [train.py:715] (0/8) Epoch 18, batch 22200, loss[loss=0.1342, simple_loss=0.2064, pruned_loss=0.03101, over 4773.00 frames.], tot_loss[loss=0.1298, simple_loss=0.2042, pruned_loss=0.02768, over 972309.54 frames.], batch size: 18, lr: 1.23e-04 2022-05-09 10:21:17,769 INFO [train.py:715] (0/8) Epoch 18, batch 22250, loss[loss=0.1439, simple_loss=0.2089, pruned_loss=0.0394, over 4846.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2051, pruned_loss=0.02803, over 971983.04 frames.], batch size: 34, lr: 1.23e-04 2022-05-09 10:21:57,017 INFO [train.py:715] (0/8) Epoch 18, batch 22300, loss[loss=0.1285, simple_loss=0.2128, pruned_loss=0.02209, over 4700.00 frames.], tot_loss[loss=0.1309, 
simple_loss=0.2052, pruned_loss=0.02828, over 972295.58 frames.], batch size: 15, lr: 1.23e-04 2022-05-09 10:22:35,717 INFO [train.py:715] (0/8) Epoch 18, batch 22350, loss[loss=0.1164, simple_loss=0.1925, pruned_loss=0.02013, over 4791.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2054, pruned_loss=0.02811, over 971680.08 frames.], batch size: 24, lr: 1.23e-04 2022-05-09 10:23:14,494 INFO [train.py:715] (0/8) Epoch 18, batch 22400, loss[loss=0.1389, simple_loss=0.2103, pruned_loss=0.03375, over 4801.00 frames.], tot_loss[loss=0.1305, simple_loss=0.2053, pruned_loss=0.02783, over 972361.53 frames.], batch size: 14, lr: 1.23e-04 2022-05-09 10:23:53,396 INFO [train.py:715] (0/8) Epoch 18, batch 22450, loss[loss=0.1404, simple_loss=0.2171, pruned_loss=0.03181, over 4975.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2057, pruned_loss=0.02796, over 973056.63 frames.], batch size: 24, lr: 1.23e-04 2022-05-09 10:24:32,482 INFO [train.py:715] (0/8) Epoch 18, batch 22500, loss[loss=0.1273, simple_loss=0.2041, pruned_loss=0.02531, over 4795.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2056, pruned_loss=0.02807, over 973262.07 frames.], batch size: 24, lr: 1.23e-04 2022-05-09 10:25:11,262 INFO [train.py:715] (0/8) Epoch 18, batch 22550, loss[loss=0.1068, simple_loss=0.176, pruned_loss=0.01883, over 4790.00 frames.], tot_loss[loss=0.131, simple_loss=0.2056, pruned_loss=0.0282, over 972628.14 frames.], batch size: 17, lr: 1.23e-04 2022-05-09 10:25:50,055 INFO [train.py:715] (0/8) Epoch 18, batch 22600, loss[loss=0.1124, simple_loss=0.1845, pruned_loss=0.02021, over 4796.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2058, pruned_loss=0.02846, over 972161.38 frames.], batch size: 24, lr: 1.23e-04 2022-05-09 10:26:29,077 INFO [train.py:715] (0/8) Epoch 18, batch 22650, loss[loss=0.1187, simple_loss=0.1884, pruned_loss=0.02452, over 4797.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2066, pruned_loss=0.02833, over 972877.52 frames.], batch size: 14, lr: 1.23e-04 2022-05-09 10:27:07,863 INFO [train.py:715] (0/8) Epoch 18, batch 22700, loss[loss=0.1218, simple_loss=0.1997, pruned_loss=0.02196, over 4736.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2067, pruned_loss=0.02824, over 972225.13 frames.], batch size: 16, lr: 1.23e-04 2022-05-09 10:27:46,834 INFO [train.py:715] (0/8) Epoch 18, batch 22750, loss[loss=0.1312, simple_loss=0.2039, pruned_loss=0.02922, over 4883.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2073, pruned_loss=0.02863, over 973152.55 frames.], batch size: 16, lr: 1.23e-04 2022-05-09 10:28:26,216 INFO [train.py:715] (0/8) Epoch 18, batch 22800, loss[loss=0.132, simple_loss=0.2198, pruned_loss=0.02207, over 4872.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2074, pruned_loss=0.02851, over 972762.30 frames.], batch size: 22, lr: 1.23e-04 2022-05-09 10:29:04,920 INFO [train.py:715] (0/8) Epoch 18, batch 22850, loss[loss=0.1341, simple_loss=0.2131, pruned_loss=0.02761, over 4929.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2077, pruned_loss=0.02889, over 972196.36 frames.], batch size: 18, lr: 1.23e-04 2022-05-09 10:29:43,878 INFO [train.py:715] (0/8) Epoch 18, batch 22900, loss[loss=0.1449, simple_loss=0.2185, pruned_loss=0.03568, over 4957.00 frames.], tot_loss[loss=0.1336, simple_loss=0.2085, pruned_loss=0.02939, over 972058.07 frames.], batch size: 35, lr: 1.23e-04 2022-05-09 10:30:22,778 INFO [train.py:715] (0/8) Epoch 18, batch 22950, loss[loss=0.1448, simple_loss=0.2188, pruned_loss=0.03538, over 4839.00 frames.], tot_loss[loss=0.1341, simple_loss=0.2087, 
pruned_loss=0.02978, over 973194.04 frames.], batch size: 15, lr: 1.23e-04 2022-05-09 10:31:02,202 INFO [train.py:715] (0/8) Epoch 18, batch 23000, loss[loss=0.1499, simple_loss=0.2295, pruned_loss=0.03515, over 4804.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2081, pruned_loss=0.02965, over 973132.23 frames.], batch size: 21, lr: 1.23e-04 2022-05-09 10:31:40,968 INFO [train.py:715] (0/8) Epoch 18, batch 23050, loss[loss=0.1171, simple_loss=0.1873, pruned_loss=0.02346, over 4852.00 frames.], tot_loss[loss=0.1349, simple_loss=0.2091, pruned_loss=0.03035, over 973458.94 frames.], batch size: 20, lr: 1.23e-04 2022-05-09 10:32:20,093 INFO [train.py:715] (0/8) Epoch 18, batch 23100, loss[loss=0.1328, simple_loss=0.1931, pruned_loss=0.03626, over 4866.00 frames.], tot_loss[loss=0.134, simple_loss=0.2082, pruned_loss=0.02992, over 973115.10 frames.], batch size: 32, lr: 1.23e-04 2022-05-09 10:32:59,659 INFO [train.py:715] (0/8) Epoch 18, batch 23150, loss[loss=0.1171, simple_loss=0.1881, pruned_loss=0.02307, over 4780.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2078, pruned_loss=0.02982, over 972364.34 frames.], batch size: 12, lr: 1.23e-04 2022-05-09 10:33:38,765 INFO [train.py:715] (0/8) Epoch 18, batch 23200, loss[loss=0.1308, simple_loss=0.2148, pruned_loss=0.02341, over 4859.00 frames.], tot_loss[loss=0.134, simple_loss=0.2081, pruned_loss=0.03001, over 972904.71 frames.], batch size: 20, lr: 1.23e-04 2022-05-09 10:34:17,632 INFO [train.py:715] (0/8) Epoch 18, batch 23250, loss[loss=0.1297, simple_loss=0.2141, pruned_loss=0.02266, over 4939.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2079, pruned_loss=0.02984, over 972478.34 frames.], batch size: 21, lr: 1.23e-04 2022-05-09 10:34:56,934 INFO [train.py:715] (0/8) Epoch 18, batch 23300, loss[loss=0.1187, simple_loss=0.2028, pruned_loss=0.01734, over 4786.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2071, pruned_loss=0.02935, over 971361.34 frames.], batch size: 17, lr: 1.23e-04 2022-05-09 10:35:36,580 INFO [train.py:715] (0/8) Epoch 18, batch 23350, loss[loss=0.1165, simple_loss=0.1855, pruned_loss=0.02372, over 4773.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2066, pruned_loss=0.02909, over 970916.19 frames.], batch size: 18, lr: 1.23e-04 2022-05-09 10:36:15,525 INFO [train.py:715] (0/8) Epoch 18, batch 23400, loss[loss=0.1055, simple_loss=0.1702, pruned_loss=0.02033, over 4815.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2065, pruned_loss=0.02892, over 970817.19 frames.], batch size: 13, lr: 1.23e-04 2022-05-09 10:36:54,046 INFO [train.py:715] (0/8) Epoch 18, batch 23450, loss[loss=0.1335, simple_loss=0.2044, pruned_loss=0.03135, over 4972.00 frames.], tot_loss[loss=0.1318, simple_loss=0.206, pruned_loss=0.02876, over 971191.25 frames.], batch size: 24, lr: 1.23e-04 2022-05-09 10:37:33,551 INFO [train.py:715] (0/8) Epoch 18, batch 23500, loss[loss=0.1467, simple_loss=0.2179, pruned_loss=0.03771, over 4953.00 frames.], tot_loss[loss=0.1328, simple_loss=0.207, pruned_loss=0.02927, over 971021.92 frames.], batch size: 15, lr: 1.23e-04 2022-05-09 10:38:12,433 INFO [train.py:715] (0/8) Epoch 18, batch 23550, loss[loss=0.1438, simple_loss=0.2195, pruned_loss=0.03411, over 4931.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2081, pruned_loss=0.02966, over 970996.37 frames.], batch size: 18, lr: 1.23e-04 2022-05-09 10:38:51,086 INFO [train.py:715] (0/8) Epoch 18, batch 23600, loss[loss=0.1632, simple_loss=0.2379, pruned_loss=0.04425, over 4815.00 frames.], tot_loss[loss=0.134, simple_loss=0.2086, pruned_loss=0.02976, 
over 970942.73 frames.], batch size: 13, lr: 1.23e-04 2022-05-09 10:39:30,019 INFO [train.py:715] (0/8) Epoch 18, batch 23650, loss[loss=0.1271, simple_loss=0.2099, pruned_loss=0.02213, over 4739.00 frames.], tot_loss[loss=0.1339, simple_loss=0.2085, pruned_loss=0.02965, over 971368.69 frames.], batch size: 16, lr: 1.23e-04 2022-05-09 10:40:08,661 INFO [train.py:715] (0/8) Epoch 18, batch 23700, loss[loss=0.1265, simple_loss=0.2004, pruned_loss=0.0263, over 4778.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2075, pruned_loss=0.02956, over 970411.43 frames.], batch size: 14, lr: 1.23e-04 2022-05-09 10:40:47,461 INFO [train.py:715] (0/8) Epoch 18, batch 23750, loss[loss=0.1186, simple_loss=0.2057, pruned_loss=0.01575, over 4956.00 frames.], tot_loss[loss=0.1323, simple_loss=0.207, pruned_loss=0.0288, over 971248.94 frames.], batch size: 21, lr: 1.23e-04 2022-05-09 10:41:26,880 INFO [train.py:715] (0/8) Epoch 18, batch 23800, loss[loss=0.1428, simple_loss=0.2247, pruned_loss=0.03039, over 4922.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2072, pruned_loss=0.02862, over 971938.02 frames.], batch size: 29, lr: 1.23e-04 2022-05-09 10:42:06,533 INFO [train.py:715] (0/8) Epoch 18, batch 23850, loss[loss=0.1114, simple_loss=0.1913, pruned_loss=0.01571, over 4976.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2072, pruned_loss=0.02856, over 972231.43 frames.], batch size: 24, lr: 1.23e-04 2022-05-09 10:42:45,346 INFO [train.py:715] (0/8) Epoch 18, batch 23900, loss[loss=0.1029, simple_loss=0.1652, pruned_loss=0.02026, over 4832.00 frames.], tot_loss[loss=0.132, simple_loss=0.2069, pruned_loss=0.02858, over 971834.41 frames.], batch size: 12, lr: 1.23e-04 2022-05-09 10:43:24,100 INFO [train.py:715] (0/8) Epoch 18, batch 23950, loss[loss=0.1152, simple_loss=0.1886, pruned_loss=0.02089, over 4947.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2068, pruned_loss=0.02869, over 972262.92 frames.], batch size: 21, lr: 1.22e-04 2022-05-09 10:44:03,429 INFO [train.py:715] (0/8) Epoch 18, batch 24000, loss[loss=0.1233, simple_loss=0.1992, pruned_loss=0.02375, over 4958.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2062, pruned_loss=0.02827, over 972433.33 frames.], batch size: 24, lr: 1.22e-04 2022-05-09 10:44:03,429 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 10:44:13,351 INFO [train.py:742] (0/8) Epoch 18, validation: loss=0.1045, simple_loss=0.1878, pruned_loss=0.01057, over 914524.00 frames. 
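The checkpoint.py entry a few lines above shows a periodic checkpoint being written to pruned_transducer_stateless2/exp/v2/checkpoint-648000.pt (the number is presumably a global batch counter). A minimal way to inspect such a file, assuming nothing about its layout beyond it being a torch-serialized dict:

import torch

# Load the periodically saved checkpoint on CPU and list its top-level keys.
# The exact contents (model/optimizer/sampler state, etc.) are not assumed here.
ckpt = torch.load(
    "pruned_transducer_stateless2/exp/v2/checkpoint-648000.pt",
    map_location="cpu",
)
print(sorted(ckpt.keys()))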
2022-05-09 10:44:52,989 INFO [train.py:715] (0/8) Epoch 18, batch 24050, loss[loss=0.125, simple_loss=0.1985, pruned_loss=0.02579, over 4847.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2069, pruned_loss=0.02897, over 972849.87 frames.], batch size: 30, lr: 1.22e-04 2022-05-09 10:45:31,815 INFO [train.py:715] (0/8) Epoch 18, batch 24100, loss[loss=0.1696, simple_loss=0.2245, pruned_loss=0.05735, over 4838.00 frames.], tot_loss[loss=0.1326, simple_loss=0.207, pruned_loss=0.02912, over 972954.46 frames.], batch size: 13, lr: 1.22e-04 2022-05-09 10:46:10,745 INFO [train.py:715] (0/8) Epoch 18, batch 24150, loss[loss=0.1245, simple_loss=0.2079, pruned_loss=0.02053, over 4937.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2082, pruned_loss=0.02967, over 973338.09 frames.], batch size: 23, lr: 1.22e-04 2022-05-09 10:46:50,169 INFO [train.py:715] (0/8) Epoch 18, batch 24200, loss[loss=0.1187, simple_loss=0.1952, pruned_loss=0.02112, over 4873.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2072, pruned_loss=0.02951, over 973370.89 frames.], batch size: 22, lr: 1.22e-04 2022-05-09 10:47:29,224 INFO [train.py:715] (0/8) Epoch 18, batch 24250, loss[loss=0.179, simple_loss=0.2546, pruned_loss=0.05168, over 4689.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2074, pruned_loss=0.0295, over 972796.93 frames.], batch size: 15, lr: 1.22e-04 2022-05-09 10:48:08,101 INFO [train.py:715] (0/8) Epoch 18, batch 24300, loss[loss=0.1324, simple_loss=0.2077, pruned_loss=0.02852, over 4783.00 frames.], tot_loss[loss=0.132, simple_loss=0.2064, pruned_loss=0.02883, over 972573.47 frames.], batch size: 17, lr: 1.22e-04 2022-05-09 10:48:46,576 INFO [train.py:715] (0/8) Epoch 18, batch 24350, loss[loss=0.1258, simple_loss=0.1989, pruned_loss=0.02636, over 4982.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2071, pruned_loss=0.02917, over 972821.58 frames.], batch size: 24, lr: 1.22e-04 2022-05-09 10:49:25,636 INFO [train.py:715] (0/8) Epoch 18, batch 24400, loss[loss=0.1226, simple_loss=0.1944, pruned_loss=0.02535, over 4968.00 frames.], tot_loss[loss=0.1326, simple_loss=0.207, pruned_loss=0.02911, over 972613.03 frames.], batch size: 15, lr: 1.22e-04 2022-05-09 10:50:04,246 INFO [train.py:715] (0/8) Epoch 18, batch 24450, loss[loss=0.1215, simple_loss=0.2003, pruned_loss=0.02138, over 4907.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2075, pruned_loss=0.02931, over 973348.18 frames.], batch size: 18, lr: 1.22e-04 2022-05-09 10:50:42,844 INFO [train.py:715] (0/8) Epoch 18, batch 24500, loss[loss=0.1265, simple_loss=0.1953, pruned_loss=0.02881, over 4938.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2068, pruned_loss=0.02909, over 972729.03 frames.], batch size: 18, lr: 1.22e-04 2022-05-09 10:51:22,302 INFO [train.py:715] (0/8) Epoch 18, batch 24550, loss[loss=0.1451, simple_loss=0.2096, pruned_loss=0.04026, over 4841.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2072, pruned_loss=0.02959, over 972612.81 frames.], batch size: 30, lr: 1.22e-04 2022-05-09 10:52:01,506 INFO [train.py:715] (0/8) Epoch 18, batch 24600, loss[loss=0.1234, simple_loss=0.201, pruned_loss=0.02283, over 4820.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2065, pruned_loss=0.0293, over 972064.25 frames.], batch size: 13, lr: 1.22e-04 2022-05-09 10:52:40,232 INFO [train.py:715] (0/8) Epoch 18, batch 24650, loss[loss=0.1139, simple_loss=0.1893, pruned_loss=0.01925, over 4873.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2071, pruned_loss=0.02951, over 972431.60 frames.], batch size: 22, lr: 1.22e-04 2022-05-09 10:53:18,837 
INFO [train.py:715] (0/8) Epoch 18, batch 24700, loss[loss=0.1396, simple_loss=0.2213, pruned_loss=0.02892, over 4775.00 frames.], tot_loss[loss=0.1338, simple_loss=0.2079, pruned_loss=0.02989, over 972088.54 frames.], batch size: 17, lr: 1.22e-04 2022-05-09 10:53:58,061 INFO [train.py:715] (0/8) Epoch 18, batch 24750, loss[loss=0.1391, simple_loss=0.2142, pruned_loss=0.03198, over 4839.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2065, pruned_loss=0.02931, over 972084.85 frames.], batch size: 15, lr: 1.22e-04 2022-05-09 10:54:37,026 INFO [train.py:715] (0/8) Epoch 18, batch 24800, loss[loss=0.1108, simple_loss=0.1862, pruned_loss=0.01775, over 4785.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2064, pruned_loss=0.02931, over 972031.34 frames.], batch size: 17, lr: 1.22e-04 2022-05-09 10:55:16,442 INFO [train.py:715] (0/8) Epoch 18, batch 24850, loss[loss=0.1059, simple_loss=0.1846, pruned_loss=0.0136, over 4840.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2058, pruned_loss=0.02901, over 972461.38 frames.], batch size: 15, lr: 1.22e-04 2022-05-09 10:55:55,499 INFO [train.py:715] (0/8) Epoch 18, batch 24900, loss[loss=0.1443, simple_loss=0.2219, pruned_loss=0.03335, over 4686.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2057, pruned_loss=0.02869, over 971718.87 frames.], batch size: 15, lr: 1.22e-04 2022-05-09 10:56:35,063 INFO [train.py:715] (0/8) Epoch 18, batch 24950, loss[loss=0.1184, simple_loss=0.2031, pruned_loss=0.01682, over 4813.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2064, pruned_loss=0.02899, over 972052.75 frames.], batch size: 27, lr: 1.22e-04 2022-05-09 10:57:14,187 INFO [train.py:715] (0/8) Epoch 18, batch 25000, loss[loss=0.1237, simple_loss=0.2086, pruned_loss=0.01943, over 4947.00 frames.], tot_loss[loss=0.1315, simple_loss=0.206, pruned_loss=0.02853, over 971428.76 frames.], batch size: 21, lr: 1.22e-04 2022-05-09 10:57:52,841 INFO [train.py:715] (0/8) Epoch 18, batch 25050, loss[loss=0.1332, simple_loss=0.1939, pruned_loss=0.03626, over 4793.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2057, pruned_loss=0.02861, over 971785.13 frames.], batch size: 12, lr: 1.22e-04 2022-05-09 10:58:32,134 INFO [train.py:715] (0/8) Epoch 18, batch 25100, loss[loss=0.1247, simple_loss=0.1978, pruned_loss=0.02579, over 4749.00 frames.], tot_loss[loss=0.1315, simple_loss=0.206, pruned_loss=0.02849, over 972399.78 frames.], batch size: 19, lr: 1.22e-04 2022-05-09 10:59:11,688 INFO [train.py:715] (0/8) Epoch 18, batch 25150, loss[loss=0.1451, simple_loss=0.2266, pruned_loss=0.03176, over 4932.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2068, pruned_loss=0.02893, over 972763.22 frames.], batch size: 24, lr: 1.22e-04 2022-05-09 10:59:50,260 INFO [train.py:715] (0/8) Epoch 18, batch 25200, loss[loss=0.1124, simple_loss=0.1817, pruned_loss=0.02151, over 4836.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2063, pruned_loss=0.02899, over 971920.88 frames.], batch size: 13, lr: 1.22e-04 2022-05-09 11:00:29,819 INFO [train.py:715] (0/8) Epoch 18, batch 25250, loss[loss=0.1217, simple_loss=0.1951, pruned_loss=0.02414, over 4878.00 frames.], tot_loss[loss=0.133, simple_loss=0.2071, pruned_loss=0.02947, over 971618.86 frames.], batch size: 32, lr: 1.22e-04 2022-05-09 11:01:09,545 INFO [train.py:715] (0/8) Epoch 18, batch 25300, loss[loss=0.1168, simple_loss=0.1891, pruned_loss=0.02226, over 4912.00 frames.], tot_loss[loss=0.133, simple_loss=0.2072, pruned_loss=0.02942, over 970955.02 frames.], batch size: 17, lr: 1.22e-04 2022-05-09 11:01:48,676 INFO [train.py:715] 
(0/8) Epoch 18, batch 25350, loss[loss=0.1251, simple_loss=0.2061, pruned_loss=0.02207, over 4970.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2078, pruned_loss=0.02946, over 971029.15 frames.], batch size: 28, lr: 1.22e-04 2022-05-09 11:02:27,381 INFO [train.py:715] (0/8) Epoch 18, batch 25400, loss[loss=0.1368, simple_loss=0.2169, pruned_loss=0.02832, over 4929.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2073, pruned_loss=0.02918, over 970341.57 frames.], batch size: 18, lr: 1.22e-04 2022-05-09 11:03:06,959 INFO [train.py:715] (0/8) Epoch 18, batch 25450, loss[loss=0.1208, simple_loss=0.1795, pruned_loss=0.03099, over 4867.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2066, pruned_loss=0.02876, over 970870.42 frames.], batch size: 16, lr: 1.22e-04 2022-05-09 11:03:45,937 INFO [train.py:715] (0/8) Epoch 18, batch 25500, loss[loss=0.1283, simple_loss=0.2133, pruned_loss=0.02169, over 4818.00 frames.], tot_loss[loss=0.133, simple_loss=0.2077, pruned_loss=0.02914, over 971320.09 frames.], batch size: 27, lr: 1.22e-04 2022-05-09 11:04:24,920 INFO [train.py:715] (0/8) Epoch 18, batch 25550, loss[loss=0.1341, simple_loss=0.1984, pruned_loss=0.03491, over 4926.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2076, pruned_loss=0.02885, over 971182.75 frames.], batch size: 18, lr: 1.22e-04 2022-05-09 11:05:04,556 INFO [train.py:715] (0/8) Epoch 18, batch 25600, loss[loss=0.1215, simple_loss=0.2, pruned_loss=0.02155, over 4980.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2078, pruned_loss=0.02897, over 972057.62 frames.], batch size: 24, lr: 1.22e-04 2022-05-09 11:05:44,103 INFO [train.py:715] (0/8) Epoch 18, batch 25650, loss[loss=0.1069, simple_loss=0.1718, pruned_loss=0.02101, over 4794.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2076, pruned_loss=0.02896, over 973051.17 frames.], batch size: 12, lr: 1.22e-04 2022-05-09 11:06:23,309 INFO [train.py:715] (0/8) Epoch 18, batch 25700, loss[loss=0.1501, simple_loss=0.2298, pruned_loss=0.03524, over 4794.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2077, pruned_loss=0.02923, over 972851.36 frames.], batch size: 21, lr: 1.22e-04 2022-05-09 11:07:02,566 INFO [train.py:715] (0/8) Epoch 18, batch 25750, loss[loss=0.1249, simple_loss=0.2, pruned_loss=0.0249, over 4873.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2072, pruned_loss=0.02907, over 972496.03 frames.], batch size: 16, lr: 1.22e-04 2022-05-09 11:07:41,972 INFO [train.py:715] (0/8) Epoch 18, batch 25800, loss[loss=0.1674, simple_loss=0.2358, pruned_loss=0.04951, over 4990.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2075, pruned_loss=0.02934, over 972864.21 frames.], batch size: 25, lr: 1.22e-04 2022-05-09 11:08:20,794 INFO [train.py:715] (0/8) Epoch 18, batch 25850, loss[loss=0.1321, simple_loss=0.1997, pruned_loss=0.03227, over 4921.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2075, pruned_loss=0.02953, over 972603.60 frames.], batch size: 23, lr: 1.22e-04 2022-05-09 11:08:59,114 INFO [train.py:715] (0/8) Epoch 18, batch 25900, loss[loss=0.1325, simple_loss=0.2062, pruned_loss=0.02944, over 4944.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2068, pruned_loss=0.02948, over 972101.47 frames.], batch size: 39, lr: 1.22e-04 2022-05-09 11:09:38,434 INFO [train.py:715] (0/8) Epoch 18, batch 25950, loss[loss=0.1203, simple_loss=0.1939, pruned_loss=0.02336, over 4835.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2075, pruned_loss=0.02951, over 972632.54 frames.], batch size: 15, lr: 1.22e-04 2022-05-09 11:10:17,511 INFO [train.py:715] (0/8) Epoch 18, batch 
26000, loss[loss=0.1478, simple_loss=0.2153, pruned_loss=0.04022, over 4707.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2073, pruned_loss=0.02926, over 972766.05 frames.], batch size: 15, lr: 1.22e-04 2022-05-09 11:10:56,981 INFO [train.py:715] (0/8) Epoch 18, batch 26050, loss[loss=0.1661, simple_loss=0.2613, pruned_loss=0.03546, over 4869.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2067, pruned_loss=0.02879, over 971692.68 frames.], batch size: 16, lr: 1.22e-04 2022-05-09 11:11:36,110 INFO [train.py:715] (0/8) Epoch 18, batch 26100, loss[loss=0.125, simple_loss=0.2001, pruned_loss=0.02496, over 4896.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2059, pruned_loss=0.02844, over 971810.39 frames.], batch size: 19, lr: 1.22e-04 2022-05-09 11:12:15,718 INFO [train.py:715] (0/8) Epoch 18, batch 26150, loss[loss=0.1305, simple_loss=0.2106, pruned_loss=0.02517, over 4871.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2058, pruned_loss=0.02852, over 971463.63 frames.], batch size: 20, lr: 1.22e-04 2022-05-09 11:12:54,902 INFO [train.py:715] (0/8) Epoch 18, batch 26200, loss[loss=0.1332, simple_loss=0.1982, pruned_loss=0.03412, over 4992.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2053, pruned_loss=0.02826, over 971951.13 frames.], batch size: 14, lr: 1.22e-04 2022-05-09 11:13:33,233 INFO [train.py:715] (0/8) Epoch 18, batch 26250, loss[loss=0.1401, simple_loss=0.2123, pruned_loss=0.03394, over 4923.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2065, pruned_loss=0.0288, over 972766.35 frames.], batch size: 18, lr: 1.22e-04 2022-05-09 11:14:12,856 INFO [train.py:715] (0/8) Epoch 18, batch 26300, loss[loss=0.1222, simple_loss=0.2025, pruned_loss=0.0209, over 4984.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2059, pruned_loss=0.02844, over 972094.19 frames.], batch size: 28, lr: 1.22e-04 2022-05-09 11:14:51,542 INFO [train.py:715] (0/8) Epoch 18, batch 26350, loss[loss=0.1233, simple_loss=0.1931, pruned_loss=0.02678, over 4988.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2064, pruned_loss=0.02869, over 972069.17 frames.], batch size: 28, lr: 1.22e-04 2022-05-09 11:15:30,592 INFO [train.py:715] (0/8) Epoch 18, batch 26400, loss[loss=0.137, simple_loss=0.2045, pruned_loss=0.03475, over 4794.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2064, pruned_loss=0.02908, over 971601.97 frames.], batch size: 21, lr: 1.22e-04 2022-05-09 11:16:09,484 INFO [train.py:715] (0/8) Epoch 18, batch 26450, loss[loss=0.1365, simple_loss=0.2139, pruned_loss=0.02954, over 4929.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2062, pruned_loss=0.02877, over 972070.09 frames.], batch size: 17, lr: 1.22e-04 2022-05-09 11:16:49,038 INFO [train.py:715] (0/8) Epoch 18, batch 26500, loss[loss=0.1113, simple_loss=0.1799, pruned_loss=0.02133, over 4848.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2059, pruned_loss=0.02874, over 971875.63 frames.], batch size: 32, lr: 1.22e-04 2022-05-09 11:17:28,069 INFO [train.py:715] (0/8) Epoch 18, batch 26550, loss[loss=0.1504, simple_loss=0.2218, pruned_loss=0.0395, over 4777.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2058, pruned_loss=0.02866, over 972071.50 frames.], batch size: 18, lr: 1.22e-04 2022-05-09 11:18:06,862 INFO [train.py:715] (0/8) Epoch 18, batch 26600, loss[loss=0.1583, simple_loss=0.2413, pruned_loss=0.0376, over 4771.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2063, pruned_loss=0.02864, over 972189.70 frames.], batch size: 19, lr: 1.22e-04 2022-05-09 11:18:46,128 INFO [train.py:715] (0/8) Epoch 18, batch 26650, loss[loss=0.1315, 
simple_loss=0.2018, pruned_loss=0.03055, over 4932.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2068, pruned_loss=0.02895, over 972535.35 frames.], batch size: 23, lr: 1.22e-04 2022-05-09 11:19:25,270 INFO [train.py:715] (0/8) Epoch 18, batch 26700, loss[loss=0.1097, simple_loss=0.1849, pruned_loss=0.01725, over 4945.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2066, pruned_loss=0.02904, over 973006.48 frames.], batch size: 23, lr: 1.22e-04 2022-05-09 11:20:05,264 INFO [train.py:715] (0/8) Epoch 18, batch 26750, loss[loss=0.1215, simple_loss=0.2017, pruned_loss=0.02064, over 4963.00 frames.], tot_loss[loss=0.132, simple_loss=0.2062, pruned_loss=0.02884, over 973024.32 frames.], batch size: 24, lr: 1.22e-04 2022-05-09 11:20:43,661 INFO [train.py:715] (0/8) Epoch 18, batch 26800, loss[loss=0.1344, simple_loss=0.2087, pruned_loss=0.03003, over 4824.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2064, pruned_loss=0.02904, over 972376.50 frames.], batch size: 26, lr: 1.22e-04 2022-05-09 11:21:23,696 INFO [train.py:715] (0/8) Epoch 18, batch 26850, loss[loss=0.1496, simple_loss=0.2206, pruned_loss=0.03928, over 4937.00 frames.], tot_loss[loss=0.1319, simple_loss=0.206, pruned_loss=0.02895, over 971331.64 frames.], batch size: 39, lr: 1.22e-04 2022-05-09 11:22:03,377 INFO [train.py:715] (0/8) Epoch 18, batch 26900, loss[loss=0.1237, simple_loss=0.1951, pruned_loss=0.02622, over 4806.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2066, pruned_loss=0.02933, over 971593.11 frames.], batch size: 14, lr: 1.22e-04 2022-05-09 11:22:41,420 INFO [train.py:715] (0/8) Epoch 18, batch 26950, loss[loss=0.1063, simple_loss=0.1776, pruned_loss=0.01748, over 4787.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2069, pruned_loss=0.02931, over 970753.19 frames.], batch size: 14, lr: 1.22e-04 2022-05-09 11:23:20,804 INFO [train.py:715] (0/8) Epoch 18, batch 27000, loss[loss=0.1472, simple_loss=0.2219, pruned_loss=0.03629, over 4765.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2076, pruned_loss=0.02969, over 970883.11 frames.], batch size: 18, lr: 1.22e-04 2022-05-09 11:23:20,805 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 11:23:30,797 INFO [train.py:742] (0/8) Epoch 18, validation: loss=0.1044, simple_loss=0.1877, pruned_loss=0.01055, over 914524.00 frames. 
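Across this stretch the validation loss logged above moves from 0.1046 (batch 18000) and 0.1046 (batch 21000) to 0.1045 (batch 24000) and 0.1044 (batch 27000), while the learning rate ticks down from 1.23e-04 to 1.22e-04 around batch 23950, so the dev loss is still improving slightly late in epoch 18. A tiny best-so-far tracker over those logged values (a hypothetical helper, not something train.py provides):

# Track whether the newest validation loss is a new best; purely illustrative.
def update_best(history, new_loss):
    best = min(history) if history else float("inf")
    return new_loss < best, min(best, new_loss)

valid_losses = [0.1046, 0.1046, 0.1045]      # validation passes at batches 18000, 21000, 24000
print(update_best(valid_losses, 0.1044))     # (True, 0.1044) -- the batch 27000 pass above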
2022-05-09 11:24:11,107 INFO [train.py:715] (0/8) Epoch 18, batch 27050, loss[loss=0.1382, simple_loss=0.2121, pruned_loss=0.0321, over 4827.00 frames.], tot_loss[loss=0.1334, simple_loss=0.2074, pruned_loss=0.02971, over 971135.14 frames.], batch size: 15, lr: 1.22e-04 2022-05-09 11:24:50,011 INFO [train.py:715] (0/8) Epoch 18, batch 27100, loss[loss=0.1211, simple_loss=0.2049, pruned_loss=0.01867, over 4800.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2063, pruned_loss=0.02936, over 972117.26 frames.], batch size: 21, lr: 1.22e-04 2022-05-09 11:25:29,326 INFO [train.py:715] (0/8) Epoch 18, batch 27150, loss[loss=0.1198, simple_loss=0.1941, pruned_loss=0.02274, over 4868.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2055, pruned_loss=0.02885, over 972484.57 frames.], batch size: 32, lr: 1.22e-04 2022-05-09 11:26:08,686 INFO [train.py:715] (0/8) Epoch 18, batch 27200, loss[loss=0.1168, simple_loss=0.1857, pruned_loss=0.02399, over 4813.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2052, pruned_loss=0.0282, over 972600.23 frames.], batch size: 12, lr: 1.22e-04 2022-05-09 11:26:47,915 INFO [train.py:715] (0/8) Epoch 18, batch 27250, loss[loss=0.1633, simple_loss=0.2378, pruned_loss=0.04439, over 4831.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2056, pruned_loss=0.02841, over 972522.15 frames.], batch size: 15, lr: 1.22e-04 2022-05-09 11:27:26,981 INFO [train.py:715] (0/8) Epoch 18, batch 27300, loss[loss=0.1244, simple_loss=0.2008, pruned_loss=0.02402, over 4792.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2061, pruned_loss=0.02846, over 972294.30 frames.], batch size: 24, lr: 1.22e-04 2022-05-09 11:28:05,821 INFO [train.py:715] (0/8) Epoch 18, batch 27350, loss[loss=0.1421, simple_loss=0.2124, pruned_loss=0.03594, over 4959.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2058, pruned_loss=0.0282, over 972731.06 frames.], batch size: 15, lr: 1.22e-04 2022-05-09 11:28:46,002 INFO [train.py:715] (0/8) Epoch 18, batch 27400, loss[loss=0.1358, simple_loss=0.2109, pruned_loss=0.03034, over 4922.00 frames.], tot_loss[loss=0.131, simple_loss=0.2061, pruned_loss=0.02796, over 973518.97 frames.], batch size: 18, lr: 1.22e-04 2022-05-09 11:29:25,403 INFO [train.py:715] (0/8) Epoch 18, batch 27450, loss[loss=0.1335, simple_loss=0.2079, pruned_loss=0.02958, over 4955.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2065, pruned_loss=0.02836, over 973370.43 frames.], batch size: 21, lr: 1.22e-04 2022-05-09 11:30:04,444 INFO [train.py:715] (0/8) Epoch 18, batch 27500, loss[loss=0.1229, simple_loss=0.1839, pruned_loss=0.03094, over 4958.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2065, pruned_loss=0.02865, over 973346.15 frames.], batch size: 15, lr: 1.22e-04 2022-05-09 11:30:44,160 INFO [train.py:715] (0/8) Epoch 18, batch 27550, loss[loss=0.139, simple_loss=0.2125, pruned_loss=0.03273, over 4987.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2068, pruned_loss=0.02886, over 973319.29 frames.], batch size: 33, lr: 1.22e-04 2022-05-09 11:31:23,280 INFO [train.py:715] (0/8) Epoch 18, batch 27600, loss[loss=0.1532, simple_loss=0.2259, pruned_loss=0.04027, over 4965.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2068, pruned_loss=0.02897, over 973012.40 frames.], batch size: 15, lr: 1.22e-04 2022-05-09 11:32:01,946 INFO [train.py:715] (0/8) Epoch 18, batch 27650, loss[loss=0.1163, simple_loss=0.1864, pruned_loss=0.02313, over 4817.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2077, pruned_loss=0.02964, over 973037.94 frames.], batch size: 26, lr: 1.22e-04 2022-05-09 11:32:40,858 
INFO [train.py:715] (0/8) Epoch 18, batch 27700, loss[loss=0.1385, simple_loss=0.2058, pruned_loss=0.03557, over 4854.00 frames.], tot_loss[loss=0.1337, simple_loss=0.2081, pruned_loss=0.02962, over 972452.50 frames.], batch size: 32, lr: 1.22e-04 2022-05-09 11:33:20,155 INFO [train.py:715] (0/8) Epoch 18, batch 27750, loss[loss=0.1343, simple_loss=0.2029, pruned_loss=0.03281, over 4897.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2076, pruned_loss=0.0293, over 973041.48 frames.], batch size: 17, lr: 1.22e-04 2022-05-09 11:33:59,615 INFO [train.py:715] (0/8) Epoch 18, batch 27800, loss[loss=0.1422, simple_loss=0.2281, pruned_loss=0.02821, over 4949.00 frames.], tot_loss[loss=0.133, simple_loss=0.2077, pruned_loss=0.02916, over 972843.26 frames.], batch size: 39, lr: 1.22e-04 2022-05-09 11:34:38,869 INFO [train.py:715] (0/8) Epoch 18, batch 27850, loss[loss=0.1461, simple_loss=0.2134, pruned_loss=0.03943, over 4849.00 frames.], tot_loss[loss=0.1332, simple_loss=0.208, pruned_loss=0.02919, over 973714.97 frames.], batch size: 32, lr: 1.22e-04 2022-05-09 11:35:18,480 INFO [train.py:715] (0/8) Epoch 18, batch 27900, loss[loss=0.1165, simple_loss=0.1911, pruned_loss=0.02089, over 4834.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2081, pruned_loss=0.02915, over 973032.35 frames.], batch size: 15, lr: 1.22e-04 2022-05-09 11:35:57,737 INFO [train.py:715] (0/8) Epoch 18, batch 27950, loss[loss=0.1174, simple_loss=0.1859, pruned_loss=0.02445, over 4816.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2077, pruned_loss=0.02932, over 972083.19 frames.], batch size: 25, lr: 1.22e-04 2022-05-09 11:36:36,980 INFO [train.py:715] (0/8) Epoch 18, batch 28000, loss[loss=0.134, simple_loss=0.2079, pruned_loss=0.03006, over 4779.00 frames.], tot_loss[loss=0.133, simple_loss=0.2075, pruned_loss=0.02921, over 972164.61 frames.], batch size: 18, lr: 1.22e-04 2022-05-09 11:37:16,529 INFO [train.py:715] (0/8) Epoch 18, batch 28050, loss[loss=0.143, simple_loss=0.215, pruned_loss=0.03548, over 4815.00 frames.], tot_loss[loss=0.134, simple_loss=0.2082, pruned_loss=0.02994, over 972022.63 frames.], batch size: 26, lr: 1.22e-04 2022-05-09 11:37:56,319 INFO [train.py:715] (0/8) Epoch 18, batch 28100, loss[loss=0.1165, simple_loss=0.1999, pruned_loss=0.01655, over 4787.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2073, pruned_loss=0.0296, over 971862.06 frames.], batch size: 14, lr: 1.22e-04 2022-05-09 11:38:35,509 INFO [train.py:715] (0/8) Epoch 18, batch 28150, loss[loss=0.1608, simple_loss=0.2165, pruned_loss=0.05252, over 4831.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2072, pruned_loss=0.02931, over 972537.02 frames.], batch size: 15, lr: 1.22e-04 2022-05-09 11:39:13,842 INFO [train.py:715] (0/8) Epoch 18, batch 28200, loss[loss=0.1305, simple_loss=0.1954, pruned_loss=0.03285, over 4687.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2068, pruned_loss=0.02917, over 972352.11 frames.], batch size: 15, lr: 1.22e-04 2022-05-09 11:39:53,464 INFO [train.py:715] (0/8) Epoch 18, batch 28250, loss[loss=0.1171, simple_loss=0.1971, pruned_loss=0.0185, over 4948.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2063, pruned_loss=0.0287, over 972875.25 frames.], batch size: 29, lr: 1.22e-04 2022-05-09 11:40:32,293 INFO [train.py:715] (0/8) Epoch 18, batch 28300, loss[loss=0.1157, simple_loss=0.1887, pruned_loss=0.02135, over 4866.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2069, pruned_loss=0.02883, over 972004.13 frames.], batch size: 20, lr: 1.22e-04 2022-05-09 11:41:11,195 INFO [train.py:715] (0/8) 
Epoch 18, batch 28350, loss[loss=0.1112, simple_loss=0.1887, pruned_loss=0.01687, over 4810.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2065, pruned_loss=0.02837, over 972384.80 frames.], batch size: 13, lr: 1.22e-04 2022-05-09 11:41:50,493 INFO [train.py:715] (0/8) Epoch 18, batch 28400, loss[loss=0.1249, simple_loss=0.2022, pruned_loss=0.02382, over 4969.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2056, pruned_loss=0.02802, over 972718.95 frames.], batch size: 24, lr: 1.22e-04 2022-05-09 11:42:29,771 INFO [train.py:715] (0/8) Epoch 18, batch 28450, loss[loss=0.1178, simple_loss=0.1943, pruned_loss=0.02064, over 4916.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2065, pruned_loss=0.02834, over 972659.17 frames.], batch size: 23, lr: 1.22e-04 2022-05-09 11:43:08,843 INFO [train.py:715] (0/8) Epoch 18, batch 28500, loss[loss=0.1239, simple_loss=0.1977, pruned_loss=0.025, over 4790.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2068, pruned_loss=0.02874, over 972574.86 frames.], batch size: 17, lr: 1.22e-04 2022-05-09 11:43:47,953 INFO [train.py:715] (0/8) Epoch 18, batch 28550, loss[loss=0.1679, simple_loss=0.2485, pruned_loss=0.0436, over 4893.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2064, pruned_loss=0.02871, over 971823.21 frames.], batch size: 19, lr: 1.22e-04 2022-05-09 11:44:27,957 INFO [train.py:715] (0/8) Epoch 18, batch 28600, loss[loss=0.1177, simple_loss=0.1959, pruned_loss=0.0197, over 4801.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2067, pruned_loss=0.02879, over 972014.49 frames.], batch size: 21, lr: 1.22e-04 2022-05-09 11:45:06,661 INFO [train.py:715] (0/8) Epoch 18, batch 28650, loss[loss=0.1335, simple_loss=0.211, pruned_loss=0.02799, over 4770.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2064, pruned_loss=0.02916, over 972260.41 frames.], batch size: 17, lr: 1.22e-04 2022-05-09 11:45:45,610 INFO [train.py:715] (0/8) Epoch 18, batch 28700, loss[loss=0.111, simple_loss=0.1929, pruned_loss=0.01452, over 4908.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2067, pruned_loss=0.02884, over 972501.83 frames.], batch size: 19, lr: 1.22e-04 2022-05-09 11:46:25,177 INFO [train.py:715] (0/8) Epoch 18, batch 28750, loss[loss=0.1513, simple_loss=0.2144, pruned_loss=0.04411, over 4843.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2059, pruned_loss=0.02879, over 973391.39 frames.], batch size: 30, lr: 1.22e-04 2022-05-09 11:47:04,220 INFO [train.py:715] (0/8) Epoch 18, batch 28800, loss[loss=0.1325, simple_loss=0.2098, pruned_loss=0.02757, over 4979.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2056, pruned_loss=0.02869, over 973821.75 frames.], batch size: 33, lr: 1.22e-04 2022-05-09 11:47:43,075 INFO [train.py:715] (0/8) Epoch 18, batch 28850, loss[loss=0.15, simple_loss=0.233, pruned_loss=0.03351, over 4736.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2064, pruned_loss=0.02859, over 973350.83 frames.], batch size: 16, lr: 1.22e-04 2022-05-09 11:48:21,622 INFO [train.py:715] (0/8) Epoch 18, batch 28900, loss[loss=0.1435, simple_loss=0.2179, pruned_loss=0.03456, over 4756.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2058, pruned_loss=0.02847, over 972864.52 frames.], batch size: 16, lr: 1.22e-04 2022-05-09 11:49:01,742 INFO [train.py:715] (0/8) Epoch 18, batch 28950, loss[loss=0.126, simple_loss=0.1935, pruned_loss=0.02926, over 4966.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2055, pruned_loss=0.02811, over 972940.41 frames.], batch size: 35, lr: 1.22e-04 2022-05-09 11:49:40,556 INFO [train.py:715] (0/8) Epoch 18, batch 29000, 
loss[loss=0.1476, simple_loss=0.2102, pruned_loss=0.04248, over 4781.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2055, pruned_loss=0.02807, over 972366.95 frames.], batch size: 14, lr: 1.22e-04 2022-05-09 11:50:19,731 INFO [train.py:715] (0/8) Epoch 18, batch 29050, loss[loss=0.1129, simple_loss=0.1854, pruned_loss=0.02019, over 4836.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2058, pruned_loss=0.02823, over 972012.84 frames.], batch size: 30, lr: 1.22e-04 2022-05-09 11:50:59,123 INFO [train.py:715] (0/8) Epoch 18, batch 29100, loss[loss=0.1456, simple_loss=0.2244, pruned_loss=0.03341, over 4840.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2061, pruned_loss=0.02814, over 972846.23 frames.], batch size: 32, lr: 1.22e-04 2022-05-09 11:51:38,415 INFO [train.py:715] (0/8) Epoch 18, batch 29150, loss[loss=0.1562, simple_loss=0.2356, pruned_loss=0.03843, over 4759.00 frames.], tot_loss[loss=0.1305, simple_loss=0.205, pruned_loss=0.02796, over 971677.95 frames.], batch size: 16, lr: 1.22e-04 2022-05-09 11:52:17,129 INFO [train.py:715] (0/8) Epoch 18, batch 29200, loss[loss=0.1576, simple_loss=0.2296, pruned_loss=0.04277, over 4893.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2054, pruned_loss=0.02806, over 972288.54 frames.], batch size: 22, lr: 1.22e-04 2022-05-09 11:52:55,643 INFO [train.py:715] (0/8) Epoch 18, batch 29250, loss[loss=0.1275, simple_loss=0.2044, pruned_loss=0.02524, over 4823.00 frames.], tot_loss[loss=0.1305, simple_loss=0.2049, pruned_loss=0.02804, over 972162.11 frames.], batch size: 26, lr: 1.22e-04 2022-05-09 11:53:35,210 INFO [train.py:715] (0/8) Epoch 18, batch 29300, loss[loss=0.1134, simple_loss=0.1818, pruned_loss=0.02251, over 4977.00 frames.], tot_loss[loss=0.1304, simple_loss=0.2047, pruned_loss=0.02807, over 972189.87 frames.], batch size: 28, lr: 1.22e-04 2022-05-09 11:54:13,912 INFO [train.py:715] (0/8) Epoch 18, batch 29350, loss[loss=0.1288, simple_loss=0.2014, pruned_loss=0.02814, over 4788.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2053, pruned_loss=0.02847, over 971689.35 frames.], batch size: 14, lr: 1.22e-04 2022-05-09 11:54:52,606 INFO [train.py:715] (0/8) Epoch 18, batch 29400, loss[loss=0.1156, simple_loss=0.1891, pruned_loss=0.02102, over 4926.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2058, pruned_loss=0.02876, over 973239.18 frames.], batch size: 21, lr: 1.22e-04 2022-05-09 11:55:17,121 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-656000.pt 2022-05-09 11:55:33,950 INFO [train.py:715] (0/8) Epoch 18, batch 29450, loss[loss=0.1079, simple_loss=0.184, pruned_loss=0.01588, over 4756.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2057, pruned_loss=0.02841, over 973745.90 frames.], batch size: 14, lr: 1.22e-04 2022-05-09 11:56:12,976 INFO [train.py:715] (0/8) Epoch 18, batch 29500, loss[loss=0.1432, simple_loss=0.2116, pruned_loss=0.03737, over 4981.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2058, pruned_loss=0.02858, over 973493.80 frames.], batch size: 24, lr: 1.22e-04 2022-05-09 11:56:52,073 INFO [train.py:715] (0/8) Epoch 18, batch 29550, loss[loss=0.1598, simple_loss=0.237, pruned_loss=0.04132, over 4941.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2067, pruned_loss=0.02901, over 973357.11 frames.], batch size: 23, lr: 1.22e-04 2022-05-09 11:57:30,048 INFO [train.py:715] (0/8) Epoch 18, batch 29600, loss[loss=0.1355, simple_loss=0.2106, pruned_loss=0.03018, over 4941.00 frames.], tot_loss[loss=0.1326, simple_loss=0.207, pruned_loss=0.02914, over 973742.91 
frames.], batch size: 21, lr: 1.22e-04 2022-05-09 11:58:09,261 INFO [train.py:715] (0/8) Epoch 18, batch 29650, loss[loss=0.1529, simple_loss=0.2194, pruned_loss=0.04319, over 4870.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2069, pruned_loss=0.02893, over 973638.32 frames.], batch size: 32, lr: 1.22e-04 2022-05-09 11:58:48,233 INFO [train.py:715] (0/8) Epoch 18, batch 29700, loss[loss=0.1243, simple_loss=0.1971, pruned_loss=0.02576, over 4828.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2065, pruned_loss=0.02887, over 972759.94 frames.], batch size: 25, lr: 1.22e-04 2022-05-09 11:59:26,595 INFO [train.py:715] (0/8) Epoch 18, batch 29750, loss[loss=0.1256, simple_loss=0.1988, pruned_loss=0.02617, over 4817.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2077, pruned_loss=0.02923, over 972973.09 frames.], batch size: 21, lr: 1.22e-04 2022-05-09 12:00:05,914 INFO [train.py:715] (0/8) Epoch 18, batch 29800, loss[loss=0.112, simple_loss=0.1709, pruned_loss=0.02657, over 4892.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2072, pruned_loss=0.02903, over 973206.82 frames.], batch size: 22, lr: 1.22e-04 2022-05-09 12:00:45,625 INFO [train.py:715] (0/8) Epoch 18, batch 29850, loss[loss=0.1181, simple_loss=0.1987, pruned_loss=0.01878, over 4974.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2064, pruned_loss=0.02872, over 973152.78 frames.], batch size: 35, lr: 1.22e-04 2022-05-09 12:01:24,704 INFO [train.py:715] (0/8) Epoch 18, batch 29900, loss[loss=0.1298, simple_loss=0.2049, pruned_loss=0.02735, over 4969.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2069, pruned_loss=0.02919, over 973377.00 frames.], batch size: 24, lr: 1.22e-04 2022-05-09 12:02:03,295 INFO [train.py:715] (0/8) Epoch 18, batch 29950, loss[loss=0.1372, simple_loss=0.2079, pruned_loss=0.03324, over 4763.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2066, pruned_loss=0.02883, over 972508.11 frames.], batch size: 18, lr: 1.22e-04 2022-05-09 12:02:43,059 INFO [train.py:715] (0/8) Epoch 18, batch 30000, loss[loss=0.1453, simple_loss=0.2128, pruned_loss=0.0389, over 4906.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2062, pruned_loss=0.02814, over 972580.05 frames.], batch size: 23, lr: 1.22e-04 2022-05-09 12:02:43,060 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 12:02:52,969 INFO [train.py:742] (0/8) Epoch 18, validation: loss=0.1047, simple_loss=0.188, pruned_loss=0.01071, over 914524.00 frames. 
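[Editor's note] Across the records shown here the reported loss is consistent, to log precision, with a fixed combination of the two components: loss ≈ 0.5 * simple_loss + pruned_loss (for the validation summary just above, 0.5 * 0.188 + 0.01071 = 0.10471 ≈ 0.1047). The 0.5 weight is inferred from these logged numbers, not read out of the recipe. A small standalone check, with the tuples copied from records above:

# Check (to log precision) that loss ~= 0.5 * simple_loss + pruned_loss.
# The 0.5 weight is an inference from the logged values, not a config claim.
records = [
    # (loss, simple_loss, pruned_loss)
    (0.1329, 0.2073, 0.02926),  # epoch 18, batch 26000, tot_loss
    (0.1313, 0.2062, 0.02814),  # epoch 18, batch 30000, tot_loss
    (0.1047, 0.1880, 0.01071),  # epoch 18, validation at batch 30000
]
for loss, simple, pruned in records:
    recon = 0.5 * simple + pruned
    print(f"logged={loss:.4f} reconstructed={recon:.4f} diff={abs(loss - recon):.4f}")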
2022-05-09 12:03:33,189 INFO [train.py:715] (0/8) Epoch 18, batch 30050, loss[loss=0.1483, simple_loss=0.2145, pruned_loss=0.04106, over 4913.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2071, pruned_loss=0.02858, over 973050.62 frames.], batch size: 17, lr: 1.22e-04 2022-05-09 12:04:12,316 INFO [train.py:715] (0/8) Epoch 18, batch 30100, loss[loss=0.1479, simple_loss=0.2202, pruned_loss=0.03781, over 4952.00 frames.], tot_loss[loss=0.1333, simple_loss=0.2077, pruned_loss=0.02944, over 973180.39 frames.], batch size: 35, lr: 1.22e-04 2022-05-09 12:04:50,502 INFO [train.py:715] (0/8) Epoch 18, batch 30150, loss[loss=0.1172, simple_loss=0.1956, pruned_loss=0.0194, over 4854.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2066, pruned_loss=0.0291, over 972235.62 frames.], batch size: 20, lr: 1.22e-04 2022-05-09 12:05:29,935 INFO [train.py:715] (0/8) Epoch 18, batch 30200, loss[loss=0.1583, simple_loss=0.2264, pruned_loss=0.0451, over 4973.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2076, pruned_loss=0.02935, over 972725.53 frames.], batch size: 14, lr: 1.22e-04 2022-05-09 12:06:09,179 INFO [train.py:715] (0/8) Epoch 18, batch 30250, loss[loss=0.1387, simple_loss=0.211, pruned_loss=0.03322, over 4874.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2068, pruned_loss=0.02875, over 971969.30 frames.], batch size: 16, lr: 1.22e-04 2022-05-09 12:06:48,869 INFO [train.py:715] (0/8) Epoch 18, batch 30300, loss[loss=0.1178, simple_loss=0.1916, pruned_loss=0.02201, over 4961.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2059, pruned_loss=0.02875, over 971416.99 frames.], batch size: 24, lr: 1.22e-04 2022-05-09 12:07:27,506 INFO [train.py:715] (0/8) Epoch 18, batch 30350, loss[loss=0.1425, simple_loss=0.2223, pruned_loss=0.03134, over 4700.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2057, pruned_loss=0.02849, over 971979.86 frames.], batch size: 15, lr: 1.22e-04 2022-05-09 12:08:07,405 INFO [train.py:715] (0/8) Epoch 18, batch 30400, loss[loss=0.1333, simple_loss=0.2019, pruned_loss=0.03237, over 4757.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2057, pruned_loss=0.02871, over 971387.79 frames.], batch size: 19, lr: 1.22e-04 2022-05-09 12:08:46,435 INFO [train.py:715] (0/8) Epoch 18, batch 30450, loss[loss=0.1344, simple_loss=0.2083, pruned_loss=0.03024, over 4741.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2058, pruned_loss=0.0286, over 971283.15 frames.], batch size: 16, lr: 1.22e-04 2022-05-09 12:09:24,921 INFO [train.py:715] (0/8) Epoch 18, batch 30500, loss[loss=0.1199, simple_loss=0.1988, pruned_loss=0.0205, over 4806.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2072, pruned_loss=0.0289, over 971616.47 frames.], batch size: 21, lr: 1.22e-04 2022-05-09 12:10:04,132 INFO [train.py:715] (0/8) Epoch 18, batch 30550, loss[loss=0.1296, simple_loss=0.1948, pruned_loss=0.03222, over 4933.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2065, pruned_loss=0.02838, over 971109.57 frames.], batch size: 23, lr: 1.22e-04 2022-05-09 12:10:42,812 INFO [train.py:715] (0/8) Epoch 18, batch 30600, loss[loss=0.1375, simple_loss=0.2258, pruned_loss=0.0246, over 4764.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2066, pruned_loss=0.02856, over 971660.29 frames.], batch size: 19, lr: 1.22e-04 2022-05-09 12:11:21,479 INFO [train.py:715] (0/8) Epoch 18, batch 30650, loss[loss=0.1261, simple_loss=0.2098, pruned_loss=0.02115, over 4949.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2061, pruned_loss=0.0283, over 972188.23 frames.], batch size: 21, lr: 1.22e-04 2022-05-09 12:12:00,152 
INFO [train.py:715] (0/8) Epoch 18, batch 30700, loss[loss=0.1266, simple_loss=0.2032, pruned_loss=0.02504, over 4875.00 frames.], tot_loss[loss=0.131, simple_loss=0.2057, pruned_loss=0.02812, over 972487.93 frames.], batch size: 22, lr: 1.22e-04 2022-05-09 12:12:39,281 INFO [train.py:715] (0/8) Epoch 18, batch 30750, loss[loss=0.116, simple_loss=0.1887, pruned_loss=0.02163, over 4802.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2065, pruned_loss=0.02855, over 973197.46 frames.], batch size: 13, lr: 1.22e-04 2022-05-09 12:13:18,033 INFO [train.py:715] (0/8) Epoch 18, batch 30800, loss[loss=0.1607, simple_loss=0.2356, pruned_loss=0.04293, over 4896.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2066, pruned_loss=0.02864, over 972871.59 frames.], batch size: 17, lr: 1.22e-04 2022-05-09 12:13:56,472 INFO [train.py:715] (0/8) Epoch 18, batch 30850, loss[loss=0.1494, simple_loss=0.2338, pruned_loss=0.03252, over 4897.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2059, pruned_loss=0.02833, over 973089.00 frames.], batch size: 17, lr: 1.22e-04 2022-05-09 12:14:35,505 INFO [train.py:715] (0/8) Epoch 18, batch 30900, loss[loss=0.1567, simple_loss=0.2276, pruned_loss=0.04293, over 4931.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2064, pruned_loss=0.02844, over 973343.92 frames.], batch size: 39, lr: 1.22e-04 2022-05-09 12:15:14,125 INFO [train.py:715] (0/8) Epoch 18, batch 30950, loss[loss=0.122, simple_loss=0.2024, pruned_loss=0.02079, over 4966.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2071, pruned_loss=0.02868, over 973762.04 frames.], batch size: 35, lr: 1.22e-04 2022-05-09 12:15:52,433 INFO [train.py:715] (0/8) Epoch 18, batch 31000, loss[loss=0.136, simple_loss=0.2042, pruned_loss=0.0339, over 4849.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2067, pruned_loss=0.02893, over 973538.24 frames.], batch size: 30, lr: 1.22e-04 2022-05-09 12:16:31,405 INFO [train.py:715] (0/8) Epoch 18, batch 31050, loss[loss=0.1157, simple_loss=0.185, pruned_loss=0.02316, over 4782.00 frames.], tot_loss[loss=0.132, simple_loss=0.2066, pruned_loss=0.02868, over 972901.02 frames.], batch size: 13, lr: 1.22e-04 2022-05-09 12:17:10,957 INFO [train.py:715] (0/8) Epoch 18, batch 31100, loss[loss=0.1568, simple_loss=0.2333, pruned_loss=0.0402, over 4869.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2062, pruned_loss=0.02901, over 971964.78 frames.], batch size: 32, lr: 1.22e-04 2022-05-09 12:17:49,902 INFO [train.py:715] (0/8) Epoch 18, batch 31150, loss[loss=0.1045, simple_loss=0.1852, pruned_loss=0.01192, over 4771.00 frames.], tot_loss[loss=0.1315, simple_loss=0.206, pruned_loss=0.02856, over 972561.01 frames.], batch size: 19, lr: 1.22e-04 2022-05-09 12:18:28,834 INFO [train.py:715] (0/8) Epoch 18, batch 31200, loss[loss=0.1547, simple_loss=0.2374, pruned_loss=0.03598, over 4699.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2064, pruned_loss=0.02846, over 971940.13 frames.], batch size: 15, lr: 1.22e-04 2022-05-09 12:19:08,214 INFO [train.py:715] (0/8) Epoch 18, batch 31250, loss[loss=0.132, simple_loss=0.2083, pruned_loss=0.02782, over 4979.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2059, pruned_loss=0.02849, over 972526.72 frames.], batch size: 25, lr: 1.22e-04 2022-05-09 12:19:47,254 INFO [train.py:715] (0/8) Epoch 18, batch 31300, loss[loss=0.12, simple_loss=0.1958, pruned_loss=0.0221, over 4855.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2047, pruned_loss=0.02826, over 972788.26 frames.], batch size: 20, lr: 1.22e-04 2022-05-09 12:20:25,874 INFO [train.py:715] (0/8) Epoch 
18, batch 31350, loss[loss=0.1187, simple_loss=0.1937, pruned_loss=0.0219, over 4755.00 frames.], tot_loss[loss=0.1305, simple_loss=0.2048, pruned_loss=0.02812, over 972981.30 frames.], batch size: 19, lr: 1.22e-04 2022-05-09 12:21:05,050 INFO [train.py:715] (0/8) Epoch 18, batch 31400, loss[loss=0.1106, simple_loss=0.1818, pruned_loss=0.0197, over 4778.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2051, pruned_loss=0.02804, over 972673.36 frames.], batch size: 12, lr: 1.22e-04 2022-05-09 12:21:44,553 INFO [train.py:715] (0/8) Epoch 18, batch 31450, loss[loss=0.1231, simple_loss=0.189, pruned_loss=0.02865, over 4743.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2053, pruned_loss=0.02808, over 972397.25 frames.], batch size: 16, lr: 1.22e-04 2022-05-09 12:22:23,385 INFO [train.py:715] (0/8) Epoch 18, batch 31500, loss[loss=0.1243, simple_loss=0.1986, pruned_loss=0.025, over 4961.00 frames.], tot_loss[loss=0.1304, simple_loss=0.205, pruned_loss=0.0279, over 972609.14 frames.], batch size: 15, lr: 1.22e-04 2022-05-09 12:23:01,621 INFO [train.py:715] (0/8) Epoch 18, batch 31550, loss[loss=0.1465, simple_loss=0.2294, pruned_loss=0.03174, over 4788.00 frames.], tot_loss[loss=0.1302, simple_loss=0.2046, pruned_loss=0.02787, over 972758.14 frames.], batch size: 14, lr: 1.22e-04 2022-05-09 12:23:41,439 INFO [train.py:715] (0/8) Epoch 18, batch 31600, loss[loss=0.1499, simple_loss=0.2319, pruned_loss=0.03399, over 4797.00 frames.], tot_loss[loss=0.1303, simple_loss=0.2049, pruned_loss=0.02791, over 972985.95 frames.], batch size: 24, lr: 1.22e-04 2022-05-09 12:24:20,709 INFO [train.py:715] (0/8) Epoch 18, batch 31650, loss[loss=0.1145, simple_loss=0.1929, pruned_loss=0.01804, over 4631.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2055, pruned_loss=0.02793, over 973520.32 frames.], batch size: 13, lr: 1.22e-04 2022-05-09 12:24:59,684 INFO [train.py:715] (0/8) Epoch 18, batch 31700, loss[loss=0.1138, simple_loss=0.1789, pruned_loss=0.02439, over 4972.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2051, pruned_loss=0.02826, over 973419.33 frames.], batch size: 35, lr: 1.22e-04 2022-05-09 12:25:38,809 INFO [train.py:715] (0/8) Epoch 18, batch 31750, loss[loss=0.1375, simple_loss=0.2133, pruned_loss=0.03078, over 4930.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2054, pruned_loss=0.028, over 973769.51 frames.], batch size: 29, lr: 1.22e-04 2022-05-09 12:26:18,650 INFO [train.py:715] (0/8) Epoch 18, batch 31800, loss[loss=0.1294, simple_loss=0.2019, pruned_loss=0.02843, over 4782.00 frames.], tot_loss[loss=0.1314, simple_loss=0.206, pruned_loss=0.02842, over 972956.39 frames.], batch size: 18, lr: 1.22e-04 2022-05-09 12:26:58,018 INFO [train.py:715] (0/8) Epoch 18, batch 31850, loss[loss=0.1107, simple_loss=0.1876, pruned_loss=0.01688, over 4744.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2061, pruned_loss=0.02873, over 972699.52 frames.], batch size: 16, lr: 1.22e-04 2022-05-09 12:27:36,976 INFO [train.py:715] (0/8) Epoch 18, batch 31900, loss[loss=0.1206, simple_loss=0.2007, pruned_loss=0.02029, over 4757.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2066, pruned_loss=0.02896, over 973054.02 frames.], batch size: 16, lr: 1.22e-04 2022-05-09 12:28:16,143 INFO [train.py:715] (0/8) Epoch 18, batch 31950, loss[loss=0.1416, simple_loss=0.2204, pruned_loss=0.03137, over 4857.00 frames.], tot_loss[loss=0.1317, simple_loss=0.206, pruned_loss=0.02867, over 972818.81 frames.], batch size: 20, lr: 1.22e-04 2022-05-09 12:28:54,456 INFO [train.py:715] (0/8) Epoch 18, batch 32000, 
loss[loss=0.1234, simple_loss=0.1871, pruned_loss=0.02984, over 4886.00 frames.], tot_loss[loss=0.1319, simple_loss=0.206, pruned_loss=0.0289, over 972103.69 frames.], batch size: 32, lr: 1.22e-04 2022-05-09 12:29:32,616 INFO [train.py:715] (0/8) Epoch 18, batch 32050, loss[loss=0.1407, simple_loss=0.208, pruned_loss=0.0367, over 4751.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2056, pruned_loss=0.02866, over 972588.92 frames.], batch size: 16, lr: 1.22e-04 2022-05-09 12:30:11,873 INFO [train.py:715] (0/8) Epoch 18, batch 32100, loss[loss=0.1095, simple_loss=0.1815, pruned_loss=0.01875, over 4980.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2054, pruned_loss=0.02845, over 972653.83 frames.], batch size: 14, lr: 1.22e-04 2022-05-09 12:30:51,372 INFO [train.py:715] (0/8) Epoch 18, batch 32150, loss[loss=0.1179, simple_loss=0.1857, pruned_loss=0.02503, over 4781.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2054, pruned_loss=0.02859, over 972142.47 frames.], batch size: 18, lr: 1.22e-04 2022-05-09 12:31:30,532 INFO [train.py:715] (0/8) Epoch 18, batch 32200, loss[loss=0.1311, simple_loss=0.2114, pruned_loss=0.02541, over 4795.00 frames.], tot_loss[loss=0.132, simple_loss=0.2062, pruned_loss=0.02887, over 972168.01 frames.], batch size: 24, lr: 1.22e-04 2022-05-09 12:32:08,909 INFO [train.py:715] (0/8) Epoch 18, batch 32250, loss[loss=0.1361, simple_loss=0.2115, pruned_loss=0.03038, over 4919.00 frames.], tot_loss[loss=0.1317, simple_loss=0.206, pruned_loss=0.02876, over 971966.60 frames.], batch size: 39, lr: 1.22e-04 2022-05-09 12:32:48,155 INFO [train.py:715] (0/8) Epoch 18, batch 32300, loss[loss=0.1546, simple_loss=0.2296, pruned_loss=0.03977, over 4915.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2065, pruned_loss=0.02903, over 972332.61 frames.], batch size: 39, lr: 1.22e-04 2022-05-09 12:33:26,711 INFO [train.py:715] (0/8) Epoch 18, batch 32350, loss[loss=0.1176, simple_loss=0.1959, pruned_loss=0.0196, over 4980.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2062, pruned_loss=0.02867, over 973347.87 frames.], batch size: 40, lr: 1.22e-04 2022-05-09 12:34:05,359 INFO [train.py:715] (0/8) Epoch 18, batch 32400, loss[loss=0.1379, simple_loss=0.2192, pruned_loss=0.0283, over 4767.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2072, pruned_loss=0.029, over 972599.12 frames.], batch size: 19, lr: 1.22e-04 2022-05-09 12:34:44,777 INFO [train.py:715] (0/8) Epoch 18, batch 32450, loss[loss=0.1215, simple_loss=0.1949, pruned_loss=0.024, over 4937.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2066, pruned_loss=0.02895, over 972998.92 frames.], batch size: 21, lr: 1.22e-04 2022-05-09 12:35:23,649 INFO [train.py:715] (0/8) Epoch 18, batch 32500, loss[loss=0.1343, simple_loss=0.2163, pruned_loss=0.02621, over 4887.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2069, pruned_loss=0.02901, over 974085.70 frames.], batch size: 16, lr: 1.22e-04 2022-05-09 12:36:02,851 INFO [train.py:715] (0/8) Epoch 18, batch 32550, loss[loss=0.1186, simple_loss=0.2015, pruned_loss=0.0179, over 4807.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2066, pruned_loss=0.02916, over 974409.35 frames.], batch size: 25, lr: 1.22e-04 2022-05-09 12:36:42,023 INFO [train.py:715] (0/8) Epoch 18, batch 32600, loss[loss=0.1599, simple_loss=0.2259, pruned_loss=0.04698, over 4951.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2067, pruned_loss=0.02916, over 973709.56 frames.], batch size: 35, lr: 1.22e-04 2022-05-09 12:37:21,455 INFO [train.py:715] (0/8) Epoch 18, batch 32650, loss[loss=0.1484, 
simple_loss=0.2226, pruned_loss=0.03707, over 4799.00 frames.], tot_loss[loss=0.132, simple_loss=0.2061, pruned_loss=0.0289, over 973200.18 frames.], batch size: 24, lr: 1.22e-04 2022-05-09 12:37:59,892 INFO [train.py:715] (0/8) Epoch 18, batch 32700, loss[loss=0.1281, simple_loss=0.1898, pruned_loss=0.03316, over 4853.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2063, pruned_loss=0.02908, over 972683.59 frames.], batch size: 13, lr: 1.22e-04 2022-05-09 12:38:38,639 INFO [train.py:715] (0/8) Epoch 18, batch 32750, loss[loss=0.1329, simple_loss=0.2129, pruned_loss=0.02639, over 4948.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2061, pruned_loss=0.02891, over 973514.82 frames.], batch size: 21, lr: 1.22e-04 2022-05-09 12:39:17,958 INFO [train.py:715] (0/8) Epoch 18, batch 32800, loss[loss=0.1273, simple_loss=0.2064, pruned_loss=0.0241, over 4752.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2063, pruned_loss=0.02894, over 973144.16 frames.], batch size: 19, lr: 1.22e-04 2022-05-09 12:39:57,147 INFO [train.py:715] (0/8) Epoch 18, batch 32850, loss[loss=0.1348, simple_loss=0.2014, pruned_loss=0.03407, over 4846.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2062, pruned_loss=0.02898, over 972567.57 frames.], batch size: 15, lr: 1.22e-04 2022-05-09 12:40:35,664 INFO [train.py:715] (0/8) Epoch 18, batch 32900, loss[loss=0.1345, simple_loss=0.199, pruned_loss=0.03506, over 4811.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2063, pruned_loss=0.02893, over 972592.05 frames.], batch size: 13, lr: 1.22e-04 2022-05-09 12:41:14,762 INFO [train.py:715] (0/8) Epoch 18, batch 32950, loss[loss=0.1328, simple_loss=0.2046, pruned_loss=0.03052, over 4825.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2058, pruned_loss=0.0289, over 972853.05 frames.], batch size: 25, lr: 1.22e-04 2022-05-09 12:41:53,956 INFO [train.py:715] (0/8) Epoch 18, batch 33000, loss[loss=0.152, simple_loss=0.2299, pruned_loss=0.03703, over 4765.00 frames.], tot_loss[loss=0.132, simple_loss=0.2064, pruned_loss=0.02877, over 972478.30 frames.], batch size: 18, lr: 1.22e-04 2022-05-09 12:41:53,957 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 12:42:03,827 INFO [train.py:742] (0/8) Epoch 18, validation: loss=0.1046, simple_loss=0.1878, pruned_loss=0.01068, over 914524.00 frames. 
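[Editor's note] The checkpoint written a few records above (pruned_transducer_stateless2/exp/v2/checkpoint-656000.pt) is an ordinary torch-serialized file, so it can be opened on CPU for inspection. A minimal sketch, assuming the file is available locally; that it deserializes to a dict, and the particular keys it contains, are assumptions about the training script and may differ.

# Sketch: inspect a saved checkpoint on CPU.
# Assumption: the file was written with torch.save and holds a dict;
# the key names (e.g. a "model" state_dict) depend on the training script.
import torch

ckpt_path = "pruned_transducer_stateless2/exp/v2/checkpoint-656000.pt"
ckpt = torch.load(ckpt_path, map_location="cpu")

print(type(ckpt))
if isinstance(ckpt, dict):
    for key in ckpt:
        print(key)
    # If model weights are stored under a "model" key, they could be loaded
    # into a matching nn.Module via model.load_state_dict(ckpt["model"]).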
2022-05-09 12:42:43,654 INFO [train.py:715] (0/8) Epoch 18, batch 33050, loss[loss=0.1489, simple_loss=0.2267, pruned_loss=0.03551, over 4926.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2058, pruned_loss=0.02865, over 972877.95 frames.], batch size: 23, lr: 1.22e-04 2022-05-09 12:43:22,615 INFO [train.py:715] (0/8) Epoch 18, batch 33100, loss[loss=0.1215, simple_loss=0.1912, pruned_loss=0.02593, over 4940.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2056, pruned_loss=0.02833, over 973189.32 frames.], batch size: 21, lr: 1.22e-04 2022-05-09 12:44:02,102 INFO [train.py:715] (0/8) Epoch 18, batch 33150, loss[loss=0.1111, simple_loss=0.1888, pruned_loss=0.01672, over 4874.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2061, pruned_loss=0.02873, over 973769.54 frames.], batch size: 22, lr: 1.22e-04 2022-05-09 12:44:41,943 INFO [train.py:715] (0/8) Epoch 18, batch 33200, loss[loss=0.1119, simple_loss=0.1865, pruned_loss=0.0187, over 4753.00 frames.], tot_loss[loss=0.1318, simple_loss=0.206, pruned_loss=0.02884, over 972881.49 frames.], batch size: 16, lr: 1.22e-04 2022-05-09 12:45:20,899 INFO [train.py:715] (0/8) Epoch 18, batch 33250, loss[loss=0.1366, simple_loss=0.2191, pruned_loss=0.02701, over 4989.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2056, pruned_loss=0.02873, over 972961.85 frames.], batch size: 25, lr: 1.22e-04 2022-05-09 12:45:59,527 INFO [train.py:715] (0/8) Epoch 18, batch 33300, loss[loss=0.1393, simple_loss=0.2112, pruned_loss=0.03375, over 4815.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2065, pruned_loss=0.02908, over 973049.65 frames.], batch size: 12, lr: 1.22e-04 2022-05-09 12:46:38,973 INFO [train.py:715] (0/8) Epoch 18, batch 33350, loss[loss=0.1399, simple_loss=0.2132, pruned_loss=0.03332, over 4867.00 frames.], tot_loss[loss=0.1315, simple_loss=0.206, pruned_loss=0.02849, over 972626.88 frames.], batch size: 16, lr: 1.22e-04 2022-05-09 12:47:18,350 INFO [train.py:715] (0/8) Epoch 18, batch 33400, loss[loss=0.1326, simple_loss=0.2054, pruned_loss=0.02993, over 4757.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2057, pruned_loss=0.02823, over 973178.38 frames.], batch size: 16, lr: 1.22e-04 2022-05-09 12:47:57,077 INFO [train.py:715] (0/8) Epoch 18, batch 33450, loss[loss=0.1521, simple_loss=0.2246, pruned_loss=0.03985, over 4769.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2065, pruned_loss=0.02851, over 972813.12 frames.], batch size: 19, lr: 1.22e-04 2022-05-09 12:48:36,021 INFO [train.py:715] (0/8) Epoch 18, batch 33500, loss[loss=0.1149, simple_loss=0.1823, pruned_loss=0.02373, over 4830.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2071, pruned_loss=0.02897, over 973218.29 frames.], batch size: 12, lr: 1.22e-04 2022-05-09 12:49:15,395 INFO [train.py:715] (0/8) Epoch 18, batch 33550, loss[loss=0.1204, simple_loss=0.2006, pruned_loss=0.02012, over 4781.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2057, pruned_loss=0.02828, over 972849.39 frames.], batch size: 17, lr: 1.22e-04 2022-05-09 12:49:54,438 INFO [train.py:715] (0/8) Epoch 18, batch 33600, loss[loss=0.1313, simple_loss=0.2172, pruned_loss=0.02274, over 4859.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2067, pruned_loss=0.02856, over 972601.43 frames.], batch size: 34, lr: 1.22e-04 2022-05-09 12:50:32,500 INFO [train.py:715] (0/8) Epoch 18, batch 33650, loss[loss=0.1201, simple_loss=0.2021, pruned_loss=0.01906, over 4984.00 frames.], tot_loss[loss=0.1322, simple_loss=0.207, pruned_loss=0.02868, over 973135.66 frames.], batch size: 25, lr: 1.22e-04 2022-05-09 
12:51:11,945 INFO [train.py:715] (0/8) Epoch 18, batch 33700, loss[loss=0.1407, simple_loss=0.2035, pruned_loss=0.03897, over 4853.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2067, pruned_loss=0.02891, over 972928.69 frames.], batch size: 30, lr: 1.22e-04 2022-05-09 12:51:51,113 INFO [train.py:715] (0/8) Epoch 18, batch 33750, loss[loss=0.1392, simple_loss=0.214, pruned_loss=0.03222, over 4930.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2063, pruned_loss=0.02864, over 973403.55 frames.], batch size: 23, lr: 1.22e-04 2022-05-09 12:52:30,428 INFO [train.py:715] (0/8) Epoch 18, batch 33800, loss[loss=0.1377, simple_loss=0.205, pruned_loss=0.03517, over 4787.00 frames.], tot_loss[loss=0.132, simple_loss=0.2069, pruned_loss=0.02855, over 974140.43 frames.], batch size: 17, lr: 1.22e-04 2022-05-09 12:53:09,705 INFO [train.py:715] (0/8) Epoch 18, batch 33850, loss[loss=0.1263, simple_loss=0.2016, pruned_loss=0.02553, over 4876.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2066, pruned_loss=0.02832, over 972993.98 frames.], batch size: 16, lr: 1.22e-04 2022-05-09 12:53:49,535 INFO [train.py:715] (0/8) Epoch 18, batch 33900, loss[loss=0.1137, simple_loss=0.1953, pruned_loss=0.01606, over 4917.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2064, pruned_loss=0.02839, over 972687.77 frames.], batch size: 18, lr: 1.22e-04 2022-05-09 12:54:28,736 INFO [train.py:715] (0/8) Epoch 18, batch 33950, loss[loss=0.1337, simple_loss=0.206, pruned_loss=0.03066, over 4853.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2065, pruned_loss=0.02852, over 972435.87 frames.], batch size: 20, lr: 1.22e-04 2022-05-09 12:55:07,055 INFO [train.py:715] (0/8) Epoch 18, batch 34000, loss[loss=0.1535, simple_loss=0.2192, pruned_loss=0.04396, over 4874.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2072, pruned_loss=0.0291, over 972475.69 frames.], batch size: 20, lr: 1.22e-04 2022-05-09 12:55:46,476 INFO [train.py:715] (0/8) Epoch 18, batch 34050, loss[loss=0.1302, simple_loss=0.2008, pruned_loss=0.0298, over 4857.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2071, pruned_loss=0.02891, over 973430.57 frames.], batch size: 34, lr: 1.22e-04 2022-05-09 12:56:25,890 INFO [train.py:715] (0/8) Epoch 18, batch 34100, loss[loss=0.1121, simple_loss=0.1829, pruned_loss=0.02065, over 4962.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2065, pruned_loss=0.02853, over 973176.81 frames.], batch size: 14, lr: 1.22e-04 2022-05-09 12:57:05,032 INFO [train.py:715] (0/8) Epoch 18, batch 34150, loss[loss=0.1198, simple_loss=0.1964, pruned_loss=0.02165, over 4915.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2064, pruned_loss=0.0285, over 972593.03 frames.], batch size: 17, lr: 1.22e-04 2022-05-09 12:57:44,080 INFO [train.py:715] (0/8) Epoch 18, batch 34200, loss[loss=0.1105, simple_loss=0.1837, pruned_loss=0.01864, over 4926.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2057, pruned_loss=0.02847, over 972652.18 frames.], batch size: 23, lr: 1.22e-04 2022-05-09 12:58:23,224 INFO [train.py:715] (0/8) Epoch 18, batch 34250, loss[loss=0.154, simple_loss=0.2327, pruned_loss=0.03762, over 4970.00 frames.], tot_loss[loss=0.1305, simple_loss=0.2047, pruned_loss=0.02815, over 972613.47 frames.], batch size: 21, lr: 1.22e-04 2022-05-09 12:59:02,031 INFO [train.py:715] (0/8) Epoch 18, batch 34300, loss[loss=0.1326, simple_loss=0.2034, pruned_loss=0.03091, over 4800.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2062, pruned_loss=0.02884, over 972164.85 frames.], batch size: 13, lr: 1.22e-04 2022-05-09 12:59:40,332 INFO 
[train.py:715] (0/8) Epoch 18, batch 34350, loss[loss=0.1221, simple_loss=0.2082, pruned_loss=0.01799, over 4897.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2072, pruned_loss=0.02888, over 971739.75 frames.], batch size: 39, lr: 1.22e-04 2022-05-09 13:00:19,858 INFO [train.py:715] (0/8) Epoch 18, batch 34400, loss[loss=0.1881, simple_loss=0.2515, pruned_loss=0.06242, over 4808.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2073, pruned_loss=0.02912, over 971709.72 frames.], batch size: 14, lr: 1.22e-04 2022-05-09 13:00:59,441 INFO [train.py:715] (0/8) Epoch 18, batch 34450, loss[loss=0.1206, simple_loss=0.1985, pruned_loss=0.02128, over 4971.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2066, pruned_loss=0.02858, over 971793.53 frames.], batch size: 28, lr: 1.22e-04 2022-05-09 13:01:39,369 INFO [train.py:715] (0/8) Epoch 18, batch 34500, loss[loss=0.1401, simple_loss=0.2154, pruned_loss=0.03238, over 4694.00 frames.], tot_loss[loss=0.132, simple_loss=0.207, pruned_loss=0.02852, over 971894.12 frames.], batch size: 15, lr: 1.22e-04 2022-05-09 13:02:18,892 INFO [train.py:715] (0/8) Epoch 18, batch 34550, loss[loss=0.1208, simple_loss=0.1983, pruned_loss=0.02167, over 4869.00 frames.], tot_loss[loss=0.1331, simple_loss=0.208, pruned_loss=0.02912, over 971213.42 frames.], batch size: 39, lr: 1.22e-04 2022-05-09 13:02:58,571 INFO [train.py:715] (0/8) Epoch 18, batch 34600, loss[loss=0.1215, simple_loss=0.1945, pruned_loss=0.02426, over 4909.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2074, pruned_loss=0.02883, over 971470.85 frames.], batch size: 18, lr: 1.22e-04 2022-05-09 13:03:37,753 INFO [train.py:715] (0/8) Epoch 18, batch 34650, loss[loss=0.1345, simple_loss=0.2075, pruned_loss=0.03069, over 4736.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2069, pruned_loss=0.02882, over 972412.49 frames.], batch size: 16, lr: 1.22e-04 2022-05-09 13:04:17,378 INFO [train.py:715] (0/8) Epoch 18, batch 34700, loss[loss=0.1369, simple_loss=0.2072, pruned_loss=0.03333, over 4932.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2063, pruned_loss=0.02876, over 972309.65 frames.], batch size: 18, lr: 1.21e-04 2022-05-09 13:04:56,516 INFO [train.py:715] (0/8) Epoch 18, batch 34750, loss[loss=0.1043, simple_loss=0.1764, pruned_loss=0.01606, over 4834.00 frames.], tot_loss[loss=0.1318, simple_loss=0.206, pruned_loss=0.02885, over 972078.96 frames.], batch size: 15, lr: 1.21e-04 2022-05-09 13:05:34,142 INFO [train.py:715] (0/8) Epoch 18, batch 34800, loss[loss=0.1251, simple_loss=0.1935, pruned_loss=0.02839, over 4799.00 frames.], tot_loss[loss=0.131, simple_loss=0.205, pruned_loss=0.02845, over 970274.74 frames.], batch size: 12, lr: 1.21e-04 2022-05-09 13:05:42,979 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/epoch-18.pt 2022-05-09 13:06:24,913 INFO [train.py:715] (0/8) Epoch 19, batch 0, loss[loss=0.1277, simple_loss=0.2045, pruned_loss=0.02546, over 4747.00 frames.], tot_loss[loss=0.1277, simple_loss=0.2045, pruned_loss=0.02546, over 4747.00 frames.], batch size: 16, lr: 1.18e-04 2022-05-09 13:07:03,500 INFO [train.py:715] (0/8) Epoch 19, batch 50, loss[loss=0.1403, simple_loss=0.202, pruned_loss=0.03934, over 4748.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2049, pruned_loss=0.02818, over 219077.00 frames.], batch size: 19, lr: 1.18e-04 2022-05-09 13:07:43,523 INFO [train.py:715] (0/8) Epoch 19, batch 100, loss[loss=0.1369, simple_loss=0.1998, pruned_loss=0.03701, over 4947.00 frames.], tot_loss[loss=0.1335, simple_loss=0.207, 
pruned_loss=0.03001, over 386005.42 frames.], batch size: 15, lr: 1.18e-04 2022-05-09 13:08:23,948 INFO [train.py:715] (0/8) Epoch 19, batch 150, loss[loss=0.1232, simple_loss=0.2141, pruned_loss=0.01611, over 4989.00 frames.], tot_loss[loss=0.1321, simple_loss=0.206, pruned_loss=0.02905, over 516235.76 frames.], batch size: 28, lr: 1.18e-04 2022-05-09 13:09:04,145 INFO [train.py:715] (0/8) Epoch 19, batch 200, loss[loss=0.1343, simple_loss=0.2129, pruned_loss=0.02787, over 4882.00 frames.], tot_loss[loss=0.1319, simple_loss=0.206, pruned_loss=0.02888, over 618195.58 frames.], batch size: 22, lr: 1.18e-04 2022-05-09 13:09:44,074 INFO [train.py:715] (0/8) Epoch 19, batch 250, loss[loss=0.15, simple_loss=0.2274, pruned_loss=0.03629, over 4780.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2061, pruned_loss=0.02924, over 696314.64 frames.], batch size: 18, lr: 1.18e-04 2022-05-09 13:10:24,214 INFO [train.py:715] (0/8) Epoch 19, batch 300, loss[loss=0.161, simple_loss=0.2476, pruned_loss=0.03724, over 4923.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2055, pruned_loss=0.02899, over 757663.85 frames.], batch size: 18, lr: 1.18e-04 2022-05-09 13:11:04,668 INFO [train.py:715] (0/8) Epoch 19, batch 350, loss[loss=0.1469, simple_loss=0.2049, pruned_loss=0.04446, over 4637.00 frames.], tot_loss[loss=0.1311, simple_loss=0.205, pruned_loss=0.0286, over 804698.05 frames.], batch size: 13, lr: 1.18e-04 2022-05-09 13:11:43,715 INFO [train.py:715] (0/8) Epoch 19, batch 400, loss[loss=0.1697, simple_loss=0.2225, pruned_loss=0.0585, over 4770.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2057, pruned_loss=0.0291, over 841227.22 frames.], batch size: 14, lr: 1.18e-04 2022-05-09 13:12:24,047 INFO [train.py:715] (0/8) Epoch 19, batch 450, loss[loss=0.1358, simple_loss=0.2103, pruned_loss=0.03065, over 4937.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2062, pruned_loss=0.02926, over 869673.42 frames.], batch size: 21, lr: 1.18e-04 2022-05-09 13:13:04,619 INFO [train.py:715] (0/8) Epoch 19, batch 500, loss[loss=0.1429, simple_loss=0.217, pruned_loss=0.03446, over 4840.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2059, pruned_loss=0.02872, over 892790.52 frames.], batch size: 13, lr: 1.18e-04 2022-05-09 13:13:44,279 INFO [train.py:715] (0/8) Epoch 19, batch 550, loss[loss=0.1683, simple_loss=0.2303, pruned_loss=0.05318, over 4904.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2058, pruned_loss=0.02875, over 909808.53 frames.], batch size: 18, lr: 1.18e-04 2022-05-09 13:14:24,232 INFO [train.py:715] (0/8) Epoch 19, batch 600, loss[loss=0.1227, simple_loss=0.1976, pruned_loss=0.02387, over 4764.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2061, pruned_loss=0.02885, over 923774.93 frames.], batch size: 14, lr: 1.18e-04 2022-05-09 13:15:04,545 INFO [train.py:715] (0/8) Epoch 19, batch 650, loss[loss=0.1299, simple_loss=0.2042, pruned_loss=0.02783, over 4953.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2069, pruned_loss=0.02885, over 935201.71 frames.], batch size: 21, lr: 1.18e-04 2022-05-09 13:15:44,872 INFO [train.py:715] (0/8) Epoch 19, batch 700, loss[loss=0.1143, simple_loss=0.1929, pruned_loss=0.01787, over 4814.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2072, pruned_loss=0.02916, over 942555.18 frames.], batch size: 25, lr: 1.18e-04 2022-05-09 13:16:24,132 INFO [train.py:715] (0/8) Epoch 19, batch 750, loss[loss=0.1361, simple_loss=0.2121, pruned_loss=0.03001, over 4811.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2065, pruned_loss=0.02882, over 949132.40 frames.], batch 
size: 27, lr: 1.18e-04 2022-05-09 13:17:03,937 INFO [train.py:715] (0/8) Epoch 19, batch 800, loss[loss=0.1707, simple_loss=0.2408, pruned_loss=0.05025, over 4973.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2071, pruned_loss=0.02909, over 955076.85 frames.], batch size: 15, lr: 1.18e-04 2022-05-09 13:17:44,199 INFO [train.py:715] (0/8) Epoch 19, batch 850, loss[loss=0.1424, simple_loss=0.2087, pruned_loss=0.03808, over 4774.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2066, pruned_loss=0.02897, over 959008.65 frames.], batch size: 14, lr: 1.18e-04 2022-05-09 13:18:24,381 INFO [train.py:715] (0/8) Epoch 19, batch 900, loss[loss=0.1187, simple_loss=0.1969, pruned_loss=0.02026, over 4922.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2066, pruned_loss=0.0285, over 962233.19 frames.], batch size: 29, lr: 1.18e-04 2022-05-09 13:19:03,891 INFO [train.py:715] (0/8) Epoch 19, batch 950, loss[loss=0.1353, simple_loss=0.2067, pruned_loss=0.03193, over 4942.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2062, pruned_loss=0.02861, over 964546.63 frames.], batch size: 39, lr: 1.18e-04 2022-05-09 13:19:43,252 INFO [train.py:715] (0/8) Epoch 19, batch 1000, loss[loss=0.1374, simple_loss=0.2165, pruned_loss=0.02914, over 4930.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2058, pruned_loss=0.02853, over 966976.89 frames.], batch size: 23, lr: 1.18e-04 2022-05-09 13:20:23,194 INFO [train.py:715] (0/8) Epoch 19, batch 1050, loss[loss=0.1468, simple_loss=0.2308, pruned_loss=0.03143, over 4767.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2059, pruned_loss=0.02843, over 967864.25 frames.], batch size: 18, lr: 1.18e-04 2022-05-09 13:21:02,189 INFO [train.py:715] (0/8) Epoch 19, batch 1100, loss[loss=0.1274, simple_loss=0.2075, pruned_loss=0.02369, over 4971.00 frames.], tot_loss[loss=0.132, simple_loss=0.2063, pruned_loss=0.02883, over 968426.14 frames.], batch size: 24, lr: 1.18e-04 2022-05-09 13:21:42,017 INFO [train.py:715] (0/8) Epoch 19, batch 1150, loss[loss=0.1394, simple_loss=0.2102, pruned_loss=0.03432, over 4856.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2064, pruned_loss=0.02868, over 969406.47 frames.], batch size: 20, lr: 1.18e-04 2022-05-09 13:22:21,963 INFO [train.py:715] (0/8) Epoch 19, batch 1200, loss[loss=0.1465, simple_loss=0.2063, pruned_loss=0.04332, over 4859.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2054, pruned_loss=0.02807, over 969840.23 frames.], batch size: 34, lr: 1.18e-04 2022-05-09 13:23:01,715 INFO [train.py:715] (0/8) Epoch 19, batch 1250, loss[loss=0.1258, simple_loss=0.1967, pruned_loss=0.02746, over 4892.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2054, pruned_loss=0.028, over 969355.28 frames.], batch size: 39, lr: 1.18e-04 2022-05-09 13:23:41,060 INFO [train.py:715] (0/8) Epoch 19, batch 1300, loss[loss=0.09966, simple_loss=0.1762, pruned_loss=0.01158, over 4969.00 frames.], tot_loss[loss=0.1305, simple_loss=0.2051, pruned_loss=0.02795, over 969689.58 frames.], batch size: 25, lr: 1.18e-04 2022-05-09 13:24:20,594 INFO [train.py:715] (0/8) Epoch 19, batch 1350, loss[loss=0.1347, simple_loss=0.2088, pruned_loss=0.03031, over 4883.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2055, pruned_loss=0.02814, over 969741.37 frames.], batch size: 22, lr: 1.18e-04 2022-05-09 13:25:00,616 INFO [train.py:715] (0/8) Epoch 19, batch 1400, loss[loss=0.1448, simple_loss=0.2143, pruned_loss=0.03764, over 4902.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2057, pruned_loss=0.02826, over 970454.22 frames.], batch size: 17, lr: 1.18e-04 2022-05-09 
13:25:39,916 INFO [train.py:715] (0/8) Epoch 19, batch 1450, loss[loss=0.1226, simple_loss=0.1951, pruned_loss=0.02509, over 4770.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2058, pruned_loss=0.02806, over 970937.84 frames.], batch size: 19, lr: 1.18e-04 2022-05-09 13:26:20,230 INFO [train.py:715] (0/8) Epoch 19, batch 1500, loss[loss=0.1291, simple_loss=0.1987, pruned_loss=0.02975, over 4902.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2059, pruned_loss=0.02815, over 971222.74 frames.], batch size: 17, lr: 1.18e-04 2022-05-09 13:27:00,280 INFO [train.py:715] (0/8) Epoch 19, batch 1550, loss[loss=0.1206, simple_loss=0.1948, pruned_loss=0.02314, over 4886.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2068, pruned_loss=0.02855, over 970569.72 frames.], batch size: 22, lr: 1.18e-04 2022-05-09 13:27:40,367 INFO [train.py:715] (0/8) Epoch 19, batch 1600, loss[loss=0.1364, simple_loss=0.2092, pruned_loss=0.03177, over 4934.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2064, pruned_loss=0.02841, over 970643.64 frames.], batch size: 23, lr: 1.18e-04 2022-05-09 13:28:19,704 INFO [train.py:715] (0/8) Epoch 19, batch 1650, loss[loss=0.1525, simple_loss=0.2323, pruned_loss=0.03641, over 4951.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2069, pruned_loss=0.02863, over 971260.26 frames.], batch size: 23, lr: 1.18e-04 2022-05-09 13:28:59,069 INFO [train.py:715] (0/8) Epoch 19, batch 1700, loss[loss=0.1469, simple_loss=0.2016, pruned_loss=0.04611, over 4782.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2063, pruned_loss=0.0286, over 971303.08 frames.], batch size: 17, lr: 1.18e-04 2022-05-09 13:29:39,055 INFO [train.py:715] (0/8) Epoch 19, batch 1750, loss[loss=0.1244, simple_loss=0.2064, pruned_loss=0.02118, over 4805.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2062, pruned_loss=0.02849, over 971302.78 frames.], batch size: 24, lr: 1.18e-04 2022-05-09 13:30:18,169 INFO [train.py:715] (0/8) Epoch 19, batch 1800, loss[loss=0.1359, simple_loss=0.2173, pruned_loss=0.02723, over 4825.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2061, pruned_loss=0.02826, over 971238.70 frames.], batch size: 26, lr: 1.18e-04 2022-05-09 13:30:57,612 INFO [train.py:715] (0/8) Epoch 19, batch 1850, loss[loss=0.1439, simple_loss=0.2068, pruned_loss=0.04051, over 4843.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2061, pruned_loss=0.02825, over 971046.32 frames.], batch size: 15, lr: 1.18e-04 2022-05-09 13:31:36,856 INFO [train.py:715] (0/8) Epoch 19, batch 1900, loss[loss=0.1618, simple_loss=0.2258, pruned_loss=0.0489, over 4862.00 frames.], tot_loss[loss=0.132, simple_loss=0.2062, pruned_loss=0.02892, over 971044.86 frames.], batch size: 32, lr: 1.18e-04 2022-05-09 13:32:16,775 INFO [train.py:715] (0/8) Epoch 19, batch 1950, loss[loss=0.1224, simple_loss=0.1998, pruned_loss=0.02256, over 4958.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2064, pruned_loss=0.029, over 970955.20 frames.], batch size: 24, lr: 1.18e-04 2022-05-09 13:32:55,072 INFO [train.py:715] (0/8) Epoch 19, batch 2000, loss[loss=0.1538, simple_loss=0.2313, pruned_loss=0.03813, over 4895.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2064, pruned_loss=0.02898, over 971703.24 frames.], batch size: 17, lr: 1.18e-04 2022-05-09 13:33:34,212 INFO [train.py:715] (0/8) Epoch 19, batch 2050, loss[loss=0.1217, simple_loss=0.1904, pruned_loss=0.02649, over 4870.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2063, pruned_loss=0.02903, over 971987.34 frames.], batch size: 30, lr: 1.18e-04 2022-05-09 13:34:13,312 INFO [train.py:715] 
(0/8) Epoch 19, batch 2100, loss[loss=0.1248, simple_loss=0.2039, pruned_loss=0.02284, over 4906.00 frames.], tot_loss[loss=0.131, simple_loss=0.2049, pruned_loss=0.02857, over 972695.98 frames.], batch size: 18, lr: 1.18e-04 2022-05-09 13:34:52,130 INFO [train.py:715] (0/8) Epoch 19, batch 2150, loss[loss=0.1347, simple_loss=0.2094, pruned_loss=0.02995, over 4930.00 frames.], tot_loss[loss=0.132, simple_loss=0.2058, pruned_loss=0.02906, over 972096.48 frames.], batch size: 18, lr: 1.18e-04 2022-05-09 13:35:31,124 INFO [train.py:715] (0/8) Epoch 19, batch 2200, loss[loss=0.1262, simple_loss=0.211, pruned_loss=0.02069, over 4986.00 frames.], tot_loss[loss=0.131, simple_loss=0.2052, pruned_loss=0.02837, over 972004.83 frames.], batch size: 25, lr: 1.18e-04 2022-05-09 13:36:09,821 INFO [train.py:715] (0/8) Epoch 19, batch 2250, loss[loss=0.1152, simple_loss=0.1862, pruned_loss=0.0221, over 4991.00 frames.], tot_loss[loss=0.131, simple_loss=0.2053, pruned_loss=0.02834, over 972305.51 frames.], batch size: 14, lr: 1.18e-04 2022-05-09 13:36:49,415 INFO [train.py:715] (0/8) Epoch 19, batch 2300, loss[loss=0.1363, simple_loss=0.2084, pruned_loss=0.03215, over 4852.00 frames.], tot_loss[loss=0.1302, simple_loss=0.2046, pruned_loss=0.02796, over 972544.67 frames.], batch size: 30, lr: 1.18e-04 2022-05-09 13:37:28,006 INFO [train.py:715] (0/8) Epoch 19, batch 2350, loss[loss=0.1439, simple_loss=0.2126, pruned_loss=0.03761, over 4961.00 frames.], tot_loss[loss=0.1305, simple_loss=0.2046, pruned_loss=0.02819, over 972749.31 frames.], batch size: 35, lr: 1.18e-04 2022-05-09 13:38:07,166 INFO [train.py:715] (0/8) Epoch 19, batch 2400, loss[loss=0.1143, simple_loss=0.1884, pruned_loss=0.02016, over 4804.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2056, pruned_loss=0.0283, over 972743.41 frames.], batch size: 12, lr: 1.18e-04 2022-05-09 13:38:46,611 INFO [train.py:715] (0/8) Epoch 19, batch 2450, loss[loss=0.1547, simple_loss=0.2291, pruned_loss=0.04017, over 4875.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2056, pruned_loss=0.02828, over 973402.33 frames.], batch size: 16, lr: 1.18e-04 2022-05-09 13:39:25,447 INFO [train.py:715] (0/8) Epoch 19, batch 2500, loss[loss=0.127, simple_loss=0.202, pruned_loss=0.02598, over 4703.00 frames.], tot_loss[loss=0.1304, simple_loss=0.2051, pruned_loss=0.0278, over 973398.97 frames.], batch size: 15, lr: 1.18e-04 2022-05-09 13:40:04,477 INFO [train.py:715] (0/8) Epoch 19, batch 2550, loss[loss=0.1423, simple_loss=0.2196, pruned_loss=0.03251, over 4786.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2054, pruned_loss=0.0281, over 972885.13 frames.], batch size: 17, lr: 1.18e-04 2022-05-09 13:40:44,005 INFO [train.py:715] (0/8) Epoch 19, batch 2600, loss[loss=0.1357, simple_loss=0.2022, pruned_loss=0.03455, over 4792.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2064, pruned_loss=0.02867, over 972304.52 frames.], batch size: 14, lr: 1.18e-04 2022-05-09 13:41:01,349 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-664000.pt 2022-05-09 13:41:26,470 INFO [train.py:715] (0/8) Epoch 19, batch 2650, loss[loss=0.1518, simple_loss=0.2395, pruned_loss=0.03206, over 4814.00 frames.], tot_loss[loss=0.132, simple_loss=0.2064, pruned_loss=0.02885, over 973314.24 frames.], batch size: 21, lr: 1.18e-04 2022-05-09 13:42:05,376 INFO [train.py:715] (0/8) Epoch 19, batch 2700, loss[loss=0.1376, simple_loss=0.2153, pruned_loss=0.02997, over 4893.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2068, pruned_loss=0.02894, over 
973719.08 frames.], batch size: 19, lr: 1.18e-04 2022-05-09 13:42:44,049 INFO [train.py:715] (0/8) Epoch 19, batch 2750, loss[loss=0.1303, simple_loss=0.2038, pruned_loss=0.02843, over 4961.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2066, pruned_loss=0.02893, over 974056.47 frames.], batch size: 14, lr: 1.18e-04 2022-05-09 13:43:23,787 INFO [train.py:715] (0/8) Epoch 19, batch 2800, loss[loss=0.1533, simple_loss=0.2227, pruned_loss=0.04198, over 4755.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2065, pruned_loss=0.02866, over 973430.22 frames.], batch size: 16, lr: 1.18e-04 2022-05-09 13:44:03,072 INFO [train.py:715] (0/8) Epoch 19, batch 2850, loss[loss=0.1306, simple_loss=0.2041, pruned_loss=0.02857, over 4832.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2067, pruned_loss=0.02888, over 973983.23 frames.], batch size: 26, lr: 1.18e-04 2022-05-09 13:44:42,000 INFO [train.py:715] (0/8) Epoch 19, batch 2900, loss[loss=0.1319, simple_loss=0.2076, pruned_loss=0.02804, over 4780.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2059, pruned_loss=0.02859, over 973181.07 frames.], batch size: 17, lr: 1.18e-04 2022-05-09 13:45:20,759 INFO [train.py:715] (0/8) Epoch 19, batch 2950, loss[loss=0.1627, simple_loss=0.2289, pruned_loss=0.04822, over 4870.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2053, pruned_loss=0.02881, over 973633.25 frames.], batch size: 16, lr: 1.18e-04 2022-05-09 13:46:00,072 INFO [train.py:715] (0/8) Epoch 19, batch 3000, loss[loss=0.1619, simple_loss=0.2267, pruned_loss=0.0486, over 4784.00 frames.], tot_loss[loss=0.1321, simple_loss=0.206, pruned_loss=0.02912, over 973855.61 frames.], batch size: 17, lr: 1.18e-04 2022-05-09 13:46:00,073 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 13:46:10,051 INFO [train.py:742] (0/8) Epoch 19, validation: loss=0.1045, simple_loss=0.1877, pruned_loss=0.01062, over 914524.00 frames. 
2022-05-09 13:46:50,336 INFO [train.py:715] (0/8) Epoch 19, batch 3050, loss[loss=0.1406, simple_loss=0.2169, pruned_loss=0.03215, over 4902.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2058, pruned_loss=0.029, over 972596.23 frames.], batch size: 19, lr: 1.18e-04 2022-05-09 13:47:29,686 INFO [train.py:715] (0/8) Epoch 19, batch 3100, loss[loss=0.1232, simple_loss=0.2075, pruned_loss=0.01946, over 4982.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2064, pruned_loss=0.02908, over 973240.56 frames.], batch size: 25, lr: 1.18e-04 2022-05-09 13:48:08,830 INFO [train.py:715] (0/8) Epoch 19, batch 3150, loss[loss=0.1599, simple_loss=0.2386, pruned_loss=0.04058, over 4928.00 frames.], tot_loss[loss=0.133, simple_loss=0.2074, pruned_loss=0.02928, over 973985.72 frames.], batch size: 23, lr: 1.18e-04 2022-05-09 13:48:48,669 INFO [train.py:715] (0/8) Epoch 19, batch 3200, loss[loss=0.1653, simple_loss=0.2294, pruned_loss=0.05061, over 4974.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2072, pruned_loss=0.02946, over 973422.75 frames.], batch size: 35, lr: 1.18e-04 2022-05-09 13:49:27,691 INFO [train.py:715] (0/8) Epoch 19, batch 3250, loss[loss=0.1523, simple_loss=0.224, pruned_loss=0.04027, over 4958.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2075, pruned_loss=0.02936, over 972664.15 frames.], batch size: 35, lr: 1.18e-04 2022-05-09 13:50:07,131 INFO [train.py:715] (0/8) Epoch 19, batch 3300, loss[loss=0.1063, simple_loss=0.175, pruned_loss=0.01883, over 4771.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2072, pruned_loss=0.02917, over 972490.30 frames.], batch size: 18, lr: 1.18e-04 2022-05-09 13:50:46,358 INFO [train.py:715] (0/8) Epoch 19, batch 3350, loss[loss=0.1402, simple_loss=0.2214, pruned_loss=0.02946, over 4824.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2075, pruned_loss=0.02906, over 972692.46 frames.], batch size: 15, lr: 1.18e-04 2022-05-09 13:51:26,502 INFO [train.py:715] (0/8) Epoch 19, batch 3400, loss[loss=0.1102, simple_loss=0.1861, pruned_loss=0.01713, over 4915.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2067, pruned_loss=0.02854, over 972823.34 frames.], batch size: 19, lr: 1.18e-04 2022-05-09 13:52:05,357 INFO [train.py:715] (0/8) Epoch 19, batch 3450, loss[loss=0.1528, simple_loss=0.2354, pruned_loss=0.0351, over 4834.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2065, pruned_loss=0.02839, over 973385.91 frames.], batch size: 15, lr: 1.18e-04 2022-05-09 13:52:44,610 INFO [train.py:715] (0/8) Epoch 19, batch 3500, loss[loss=0.1392, simple_loss=0.2096, pruned_loss=0.03444, over 4738.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2073, pruned_loss=0.02882, over 973205.41 frames.], batch size: 16, lr: 1.18e-04 2022-05-09 13:53:23,730 INFO [train.py:715] (0/8) Epoch 19, batch 3550, loss[loss=0.112, simple_loss=0.1911, pruned_loss=0.01642, over 4924.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2069, pruned_loss=0.02877, over 973049.51 frames.], batch size: 23, lr: 1.18e-04 2022-05-09 13:54:02,619 INFO [train.py:715] (0/8) Epoch 19, batch 3600, loss[loss=0.1258, simple_loss=0.1992, pruned_loss=0.02617, over 4904.00 frames.], tot_loss[loss=0.131, simple_loss=0.2053, pruned_loss=0.02836, over 972522.00 frames.], batch size: 17, lr: 1.18e-04 2022-05-09 13:54:42,250 INFO [train.py:715] (0/8) Epoch 19, batch 3650, loss[loss=0.1041, simple_loss=0.1673, pruned_loss=0.02049, over 4762.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2055, pruned_loss=0.02884, over 972135.57 frames.], batch size: 12, lr: 1.18e-04 2022-05-09 13:55:21,394 INFO 
[train.py:715] (0/8) Epoch 19, batch 3700, loss[loss=0.1292, simple_loss=0.1972, pruned_loss=0.03053, over 4773.00 frames.], tot_loss[loss=0.1309, simple_loss=0.205, pruned_loss=0.02844, over 972438.32 frames.], batch size: 18, lr: 1.18e-04 2022-05-09 13:56:01,849 INFO [train.py:715] (0/8) Epoch 19, batch 3750, loss[loss=0.1159, simple_loss=0.1888, pruned_loss=0.02156, over 4947.00 frames.], tot_loss[loss=0.1304, simple_loss=0.2044, pruned_loss=0.02823, over 972700.62 frames.], batch size: 23, lr: 1.18e-04 2022-05-09 13:56:40,835 INFO [train.py:715] (0/8) Epoch 19, batch 3800, loss[loss=0.122, simple_loss=0.2055, pruned_loss=0.01929, over 4812.00 frames.], tot_loss[loss=0.1308, simple_loss=0.205, pruned_loss=0.02832, over 972309.79 frames.], batch size: 24, lr: 1.18e-04 2022-05-09 13:57:19,815 INFO [train.py:715] (0/8) Epoch 19, batch 3850, loss[loss=0.16, simple_loss=0.2373, pruned_loss=0.04138, over 4762.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2059, pruned_loss=0.02836, over 972048.72 frames.], batch size: 18, lr: 1.18e-04 2022-05-09 13:57:59,510 INFO [train.py:715] (0/8) Epoch 19, batch 3900, loss[loss=0.1121, simple_loss=0.1778, pruned_loss=0.02316, over 4869.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2057, pruned_loss=0.02866, over 970715.22 frames.], batch size: 32, lr: 1.18e-04 2022-05-09 13:58:38,557 INFO [train.py:715] (0/8) Epoch 19, batch 3950, loss[loss=0.146, simple_loss=0.2259, pruned_loss=0.03305, over 4965.00 frames.], tot_loss[loss=0.132, simple_loss=0.2063, pruned_loss=0.0288, over 971426.15 frames.], batch size: 24, lr: 1.18e-04 2022-05-09 13:59:17,192 INFO [train.py:715] (0/8) Epoch 19, batch 4000, loss[loss=0.1263, simple_loss=0.2052, pruned_loss=0.0237, over 4809.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2053, pruned_loss=0.02856, over 972297.55 frames.], batch size: 25, lr: 1.18e-04 2022-05-09 13:59:56,643 INFO [train.py:715] (0/8) Epoch 19, batch 4050, loss[loss=0.1588, simple_loss=0.2242, pruned_loss=0.04668, over 4992.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2058, pruned_loss=0.02888, over 971844.67 frames.], batch size: 14, lr: 1.18e-04 2022-05-09 14:00:36,774 INFO [train.py:715] (0/8) Epoch 19, batch 4100, loss[loss=0.1439, simple_loss=0.218, pruned_loss=0.03495, over 4971.00 frames.], tot_loss[loss=0.1319, simple_loss=0.206, pruned_loss=0.02888, over 971486.84 frames.], batch size: 39, lr: 1.18e-04 2022-05-09 14:01:15,966 INFO [train.py:715] (0/8) Epoch 19, batch 4150, loss[loss=0.1404, simple_loss=0.2103, pruned_loss=0.03531, over 4879.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2056, pruned_loss=0.02851, over 972174.17 frames.], batch size: 22, lr: 1.18e-04 2022-05-09 14:01:54,738 INFO [train.py:715] (0/8) Epoch 19, batch 4200, loss[loss=0.1348, simple_loss=0.2115, pruned_loss=0.02902, over 4893.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2055, pruned_loss=0.02834, over 972363.31 frames.], batch size: 19, lr: 1.18e-04 2022-05-09 14:02:33,998 INFO [train.py:715] (0/8) Epoch 19, batch 4250, loss[loss=0.1589, simple_loss=0.2282, pruned_loss=0.04483, over 4787.00 frames.], tot_loss[loss=0.1303, simple_loss=0.2046, pruned_loss=0.02799, over 973058.83 frames.], batch size: 14, lr: 1.18e-04 2022-05-09 14:03:13,060 INFO [train.py:715] (0/8) Epoch 19, batch 4300, loss[loss=0.1122, simple_loss=0.182, pruned_loss=0.02122, over 4876.00 frames.], tot_loss[loss=0.1301, simple_loss=0.2048, pruned_loss=0.02773, over 972832.09 frames.], batch size: 16, lr: 1.18e-04 2022-05-09 14:03:52,546 INFO [train.py:715] (0/8) Epoch 19, batch 4350, 
loss[loss=0.1111, simple_loss=0.1859, pruned_loss=0.01815, over 4954.00 frames.], tot_loss[loss=0.1304, simple_loss=0.2051, pruned_loss=0.02783, over 973413.34 frames.], batch size: 21, lr: 1.18e-04 2022-05-09 14:04:31,610 INFO [train.py:715] (0/8) Epoch 19, batch 4400, loss[loss=0.1411, simple_loss=0.2119, pruned_loss=0.03517, over 4644.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2056, pruned_loss=0.02807, over 973626.51 frames.], batch size: 13, lr: 1.18e-04 2022-05-09 14:05:11,664 INFO [train.py:715] (0/8) Epoch 19, batch 4450, loss[loss=0.1648, simple_loss=0.2267, pruned_loss=0.05144, over 4878.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2061, pruned_loss=0.0285, over 973701.82 frames.], batch size: 16, lr: 1.18e-04 2022-05-09 14:05:50,508 INFO [train.py:715] (0/8) Epoch 19, batch 4500, loss[loss=0.1469, simple_loss=0.208, pruned_loss=0.04292, over 4843.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2066, pruned_loss=0.02845, over 973167.35 frames.], batch size: 32, lr: 1.18e-04 2022-05-09 14:06:29,204 INFO [train.py:715] (0/8) Epoch 19, batch 4550, loss[loss=0.1185, simple_loss=0.1992, pruned_loss=0.01891, over 4879.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2066, pruned_loss=0.02841, over 973452.25 frames.], batch size: 16, lr: 1.18e-04 2022-05-09 14:07:08,888 INFO [train.py:715] (0/8) Epoch 19, batch 4600, loss[loss=0.1298, simple_loss=0.2085, pruned_loss=0.02559, over 4940.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2068, pruned_loss=0.0283, over 973704.37 frames.], batch size: 21, lr: 1.18e-04 2022-05-09 14:07:48,136 INFO [train.py:715] (0/8) Epoch 19, batch 4650, loss[loss=0.1133, simple_loss=0.1883, pruned_loss=0.01911, over 4818.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2064, pruned_loss=0.0285, over 972801.99 frames.], batch size: 27, lr: 1.18e-04 2022-05-09 14:08:27,121 INFO [train.py:715] (0/8) Epoch 19, batch 4700, loss[loss=0.1123, simple_loss=0.1835, pruned_loss=0.02058, over 4822.00 frames.], tot_loss[loss=0.1316, simple_loss=0.206, pruned_loss=0.02856, over 973015.91 frames.], batch size: 27, lr: 1.18e-04 2022-05-09 14:09:06,335 INFO [train.py:715] (0/8) Epoch 19, batch 4750, loss[loss=0.1253, simple_loss=0.1973, pruned_loss=0.0267, over 4864.00 frames.], tot_loss[loss=0.132, simple_loss=0.2061, pruned_loss=0.02889, over 972096.57 frames.], batch size: 32, lr: 1.18e-04 2022-05-09 14:09:46,293 INFO [train.py:715] (0/8) Epoch 19, batch 4800, loss[loss=0.1421, simple_loss=0.2052, pruned_loss=0.03952, over 4686.00 frames.], tot_loss[loss=0.1313, simple_loss=0.206, pruned_loss=0.02832, over 972132.16 frames.], batch size: 15, lr: 1.18e-04 2022-05-09 14:10:25,676 INFO [train.py:715] (0/8) Epoch 19, batch 4850, loss[loss=0.1113, simple_loss=0.1859, pruned_loss=0.01836, over 4984.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2056, pruned_loss=0.02838, over 972962.87 frames.], batch size: 25, lr: 1.18e-04 2022-05-09 14:11:04,337 INFO [train.py:715] (0/8) Epoch 19, batch 4900, loss[loss=0.1102, simple_loss=0.1851, pruned_loss=0.01767, over 4902.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2058, pruned_loss=0.02843, over 972315.54 frames.], batch size: 17, lr: 1.18e-04 2022-05-09 14:11:44,092 INFO [train.py:715] (0/8) Epoch 19, batch 4950, loss[loss=0.1237, simple_loss=0.1958, pruned_loss=0.02576, over 4902.00 frames.], tot_loss[loss=0.131, simple_loss=0.2055, pruned_loss=0.02828, over 972310.30 frames.], batch size: 17, lr: 1.18e-04 2022-05-09 14:12:23,739 INFO [train.py:715] (0/8) Epoch 19, batch 5000, loss[loss=0.1317, simple_loss=0.2098, 
pruned_loss=0.02681, over 4951.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2059, pruned_loss=0.02844, over 971956.34 frames.], batch size: 21, lr: 1.18e-04 2022-05-09 14:13:02,751 INFO [train.py:715] (0/8) Epoch 19, batch 5050, loss[loss=0.1122, simple_loss=0.1905, pruned_loss=0.01695, over 4815.00 frames.], tot_loss[loss=0.1313, simple_loss=0.206, pruned_loss=0.02829, over 972035.67 frames.], batch size: 26, lr: 1.18e-04 2022-05-09 14:13:41,114 INFO [train.py:715] (0/8) Epoch 19, batch 5100, loss[loss=0.1256, simple_loss=0.199, pruned_loss=0.02611, over 4818.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2065, pruned_loss=0.02849, over 972340.34 frames.], batch size: 14, lr: 1.18e-04 2022-05-09 14:14:21,159 INFO [train.py:715] (0/8) Epoch 19, batch 5150, loss[loss=0.1209, simple_loss=0.202, pruned_loss=0.01992, over 4879.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2061, pruned_loss=0.02821, over 972782.03 frames.], batch size: 22, lr: 1.18e-04 2022-05-09 14:15:00,186 INFO [train.py:715] (0/8) Epoch 19, batch 5200, loss[loss=0.1386, simple_loss=0.2011, pruned_loss=0.03798, over 4752.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2053, pruned_loss=0.02792, over 972786.30 frames.], batch size: 19, lr: 1.18e-04 2022-05-09 14:15:38,853 INFO [train.py:715] (0/8) Epoch 19, batch 5250, loss[loss=0.1154, simple_loss=0.1899, pruned_loss=0.02045, over 4942.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2054, pruned_loss=0.02796, over 972627.28 frames.], batch size: 21, lr: 1.18e-04 2022-05-09 14:16:18,535 INFO [train.py:715] (0/8) Epoch 19, batch 5300, loss[loss=0.1144, simple_loss=0.1875, pruned_loss=0.02067, over 4882.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2056, pruned_loss=0.02859, over 973376.07 frames.], batch size: 16, lr: 1.18e-04 2022-05-09 14:16:58,483 INFO [train.py:715] (0/8) Epoch 19, batch 5350, loss[loss=0.1285, simple_loss=0.2039, pruned_loss=0.02651, over 4985.00 frames.], tot_loss[loss=0.1302, simple_loss=0.2046, pruned_loss=0.02784, over 974254.34 frames.], batch size: 25, lr: 1.18e-04 2022-05-09 14:17:38,564 INFO [train.py:715] (0/8) Epoch 19, batch 5400, loss[loss=0.1394, simple_loss=0.2108, pruned_loss=0.03402, over 4895.00 frames.], tot_loss[loss=0.13, simple_loss=0.2045, pruned_loss=0.02778, over 974026.66 frames.], batch size: 19, lr: 1.18e-04 2022-05-09 14:18:17,825 INFO [train.py:715] (0/8) Epoch 19, batch 5450, loss[loss=0.1241, simple_loss=0.1998, pruned_loss=0.02424, over 4922.00 frames.], tot_loss[loss=0.1304, simple_loss=0.2049, pruned_loss=0.02792, over 973324.68 frames.], batch size: 21, lr: 1.18e-04 2022-05-09 14:18:58,015 INFO [train.py:715] (0/8) Epoch 19, batch 5500, loss[loss=0.1609, simple_loss=0.2378, pruned_loss=0.042, over 4961.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2062, pruned_loss=0.02833, over 973927.23 frames.], batch size: 24, lr: 1.18e-04 2022-05-09 14:19:37,201 INFO [train.py:715] (0/8) Epoch 19, batch 5550, loss[loss=0.1191, simple_loss=0.2014, pruned_loss=0.01844, over 4764.00 frames.], tot_loss[loss=0.131, simple_loss=0.206, pruned_loss=0.02799, over 972988.06 frames.], batch size: 18, lr: 1.18e-04 2022-05-09 14:20:16,791 INFO [train.py:715] (0/8) Epoch 19, batch 5600, loss[loss=0.1387, simple_loss=0.2028, pruned_loss=0.03729, over 4875.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2056, pruned_loss=0.02787, over 972523.74 frames.], batch size: 32, lr: 1.18e-04 2022-05-09 14:20:56,101 INFO [train.py:715] (0/8) Epoch 19, batch 5650, loss[loss=0.1076, simple_loss=0.1734, pruned_loss=0.02095, over 4792.00 
frames.], tot_loss[loss=0.1308, simple_loss=0.2055, pruned_loss=0.02802, over 972669.97 frames.], batch size: 12, lr: 1.18e-04 2022-05-09 14:21:35,828 INFO [train.py:715] (0/8) Epoch 19, batch 5700, loss[loss=0.131, simple_loss=0.212, pruned_loss=0.02501, over 4771.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2054, pruned_loss=0.02792, over 972500.36 frames.], batch size: 18, lr: 1.18e-04 2022-05-09 14:22:15,336 INFO [train.py:715] (0/8) Epoch 19, batch 5750, loss[loss=0.123, simple_loss=0.2012, pruned_loss=0.02238, over 4894.00 frames.], tot_loss[loss=0.13, simple_loss=0.2046, pruned_loss=0.02773, over 972095.48 frames.], batch size: 17, lr: 1.18e-04 2022-05-09 14:22:53,918 INFO [train.py:715] (0/8) Epoch 19, batch 5800, loss[loss=0.125, simple_loss=0.197, pruned_loss=0.02647, over 4980.00 frames.], tot_loss[loss=0.1298, simple_loss=0.2043, pruned_loss=0.02771, over 972853.29 frames.], batch size: 14, lr: 1.18e-04 2022-05-09 14:23:33,192 INFO [train.py:715] (0/8) Epoch 19, batch 5850, loss[loss=0.1229, simple_loss=0.187, pruned_loss=0.02936, over 4816.00 frames.], tot_loss[loss=0.1305, simple_loss=0.2051, pruned_loss=0.02789, over 972780.93 frames.], batch size: 12, lr: 1.18e-04 2022-05-09 14:24:11,657 INFO [train.py:715] (0/8) Epoch 19, batch 5900, loss[loss=0.1236, simple_loss=0.1975, pruned_loss=0.02485, over 4961.00 frames.], tot_loss[loss=0.1303, simple_loss=0.2048, pruned_loss=0.02795, over 972470.98 frames.], batch size: 35, lr: 1.18e-04 2022-05-09 14:24:51,077 INFO [train.py:715] (0/8) Epoch 19, batch 5950, loss[loss=0.1234, simple_loss=0.2104, pruned_loss=0.01821, over 4865.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2058, pruned_loss=0.02848, over 973055.45 frames.], batch size: 20, lr: 1.18e-04 2022-05-09 14:25:30,281 INFO [train.py:715] (0/8) Epoch 19, batch 6000, loss[loss=0.1248, simple_loss=0.2043, pruned_loss=0.02265, over 4777.00 frames.], tot_loss[loss=0.132, simple_loss=0.2067, pruned_loss=0.02861, over 972691.95 frames.], batch size: 18, lr: 1.18e-04 2022-05-09 14:25:30,282 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 14:25:40,197 INFO [train.py:742] (0/8) Epoch 19, validation: loss=0.1046, simple_loss=0.1878, pruned_loss=0.01067, over 914524.00 frames. 
2022-05-09 14:26:19,489 INFO [train.py:715] (0/8) Epoch 19, batch 6050, loss[loss=0.1699, simple_loss=0.2388, pruned_loss=0.05053, over 4780.00 frames.], tot_loss[loss=0.1323, simple_loss=0.207, pruned_loss=0.0288, over 973121.35 frames.], batch size: 17, lr: 1.18e-04 2022-05-09 14:26:58,346 INFO [train.py:715] (0/8) Epoch 19, batch 6100, loss[loss=0.1314, simple_loss=0.2023, pruned_loss=0.03026, over 4846.00 frames.], tot_loss[loss=0.132, simple_loss=0.2068, pruned_loss=0.02858, over 972759.39 frames.], batch size: 13, lr: 1.18e-04 2022-05-09 14:27:37,409 INFO [train.py:715] (0/8) Epoch 19, batch 6150, loss[loss=0.1301, simple_loss=0.2054, pruned_loss=0.02745, over 4903.00 frames.], tot_loss[loss=0.1321, simple_loss=0.207, pruned_loss=0.02862, over 972780.73 frames.], batch size: 17, lr: 1.18e-04 2022-05-09 14:28:15,610 INFO [train.py:715] (0/8) Epoch 19, batch 6200, loss[loss=0.1226, simple_loss=0.2015, pruned_loss=0.02182, over 4987.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2062, pruned_loss=0.02838, over 972950.40 frames.], batch size: 25, lr: 1.18e-04 2022-05-09 14:28:55,852 INFO [train.py:715] (0/8) Epoch 19, batch 6250, loss[loss=0.12, simple_loss=0.1913, pruned_loss=0.02433, over 4802.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2052, pruned_loss=0.02828, over 972861.20 frames.], batch size: 25, lr: 1.18e-04 2022-05-09 14:29:35,035 INFO [train.py:715] (0/8) Epoch 19, batch 6300, loss[loss=0.119, simple_loss=0.194, pruned_loss=0.02199, over 4850.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2052, pruned_loss=0.0286, over 973418.11 frames.], batch size: 20, lr: 1.18e-04 2022-05-09 14:30:14,722 INFO [train.py:715] (0/8) Epoch 19, batch 6350, loss[loss=0.1637, simple_loss=0.2208, pruned_loss=0.05324, over 4969.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2056, pruned_loss=0.02896, over 973244.25 frames.], batch size: 24, lr: 1.18e-04 2022-05-09 14:30:54,199 INFO [train.py:715] (0/8) Epoch 19, batch 6400, loss[loss=0.1356, simple_loss=0.222, pruned_loss=0.02462, over 4956.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2055, pruned_loss=0.02865, over 973032.80 frames.], batch size: 21, lr: 1.18e-04 2022-05-09 14:31:33,475 INFO [train.py:715] (0/8) Epoch 19, batch 6450, loss[loss=0.1237, simple_loss=0.1967, pruned_loss=0.02531, over 4958.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2054, pruned_loss=0.02813, over 972040.09 frames.], batch size: 35, lr: 1.18e-04 2022-05-09 14:32:12,985 INFO [train.py:715] (0/8) Epoch 19, batch 6500, loss[loss=0.1213, simple_loss=0.2012, pruned_loss=0.02068, over 4961.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2052, pruned_loss=0.02832, over 972588.61 frames.], batch size: 15, lr: 1.18e-04 2022-05-09 14:32:51,555 INFO [train.py:715] (0/8) Epoch 19, batch 6550, loss[loss=0.1034, simple_loss=0.1796, pruned_loss=0.01358, over 4913.00 frames.], tot_loss[loss=0.132, simple_loss=0.2061, pruned_loss=0.02892, over 973298.69 frames.], batch size: 35, lr: 1.18e-04 2022-05-09 14:33:31,041 INFO [train.py:715] (0/8) Epoch 19, batch 6600, loss[loss=0.1251, simple_loss=0.1921, pruned_loss=0.02908, over 4828.00 frames.], tot_loss[loss=0.132, simple_loss=0.2064, pruned_loss=0.02881, over 973274.67 frames.], batch size: 15, lr: 1.18e-04 2022-05-09 14:34:10,194 INFO [train.py:715] (0/8) Epoch 19, batch 6650, loss[loss=0.1023, simple_loss=0.1748, pruned_loss=0.01492, over 4819.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2064, pruned_loss=0.02914, over 972326.18 frames.], batch size: 13, lr: 1.18e-04 2022-05-09 14:34:48,933 INFO [train.py:715] 
(0/8) Epoch 19, batch 6700, loss[loss=0.1044, simple_loss=0.1712, pruned_loss=0.01884, over 4786.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2065, pruned_loss=0.02936, over 972051.94 frames.], batch size: 12, lr: 1.18e-04 2022-05-09 14:35:28,064 INFO [train.py:715] (0/8) Epoch 19, batch 6750, loss[loss=0.1245, simple_loss=0.2065, pruned_loss=0.02119, over 4807.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2062, pruned_loss=0.02885, over 972513.15 frames.], batch size: 25, lr: 1.18e-04 2022-05-09 14:36:07,536 INFO [train.py:715] (0/8) Epoch 19, batch 6800, loss[loss=0.1157, simple_loss=0.1957, pruned_loss=0.01787, over 4991.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2064, pruned_loss=0.0289, over 973371.10 frames.], batch size: 26, lr: 1.18e-04 2022-05-09 14:36:46,930 INFO [train.py:715] (0/8) Epoch 19, batch 6850, loss[loss=0.1309, simple_loss=0.206, pruned_loss=0.02789, over 4953.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2066, pruned_loss=0.02878, over 973780.52 frames.], batch size: 29, lr: 1.18e-04 2022-05-09 14:37:25,089 INFO [train.py:715] (0/8) Epoch 19, batch 6900, loss[loss=0.1394, simple_loss=0.219, pruned_loss=0.02995, over 4988.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2074, pruned_loss=0.02893, over 972243.89 frames.], batch size: 25, lr: 1.18e-04 2022-05-09 14:38:04,133 INFO [train.py:715] (0/8) Epoch 19, batch 6950, loss[loss=0.1465, simple_loss=0.2324, pruned_loss=0.03031, over 4987.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2079, pruned_loss=0.02896, over 972772.12 frames.], batch size: 24, lr: 1.18e-04 2022-05-09 14:38:43,601 INFO [train.py:715] (0/8) Epoch 19, batch 7000, loss[loss=0.1114, simple_loss=0.1895, pruned_loss=0.01664, over 4846.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2075, pruned_loss=0.02874, over 972551.77 frames.], batch size: 20, lr: 1.18e-04 2022-05-09 14:39:22,851 INFO [train.py:715] (0/8) Epoch 19, batch 7050, loss[loss=0.1309, simple_loss=0.2031, pruned_loss=0.0293, over 4835.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2067, pruned_loss=0.02859, over 971394.45 frames.], batch size: 15, lr: 1.18e-04 2022-05-09 14:40:02,432 INFO [train.py:715] (0/8) Epoch 19, batch 7100, loss[loss=0.1388, simple_loss=0.2121, pruned_loss=0.03274, over 4695.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2066, pruned_loss=0.02839, over 971417.57 frames.], batch size: 15, lr: 1.18e-04 2022-05-09 14:40:42,072 INFO [train.py:715] (0/8) Epoch 19, batch 7150, loss[loss=0.13, simple_loss=0.2115, pruned_loss=0.02427, over 4861.00 frames.], tot_loss[loss=0.131, simple_loss=0.2058, pruned_loss=0.02809, over 971155.90 frames.], batch size: 32, lr: 1.18e-04 2022-05-09 14:41:20,977 INFO [train.py:715] (0/8) Epoch 19, batch 7200, loss[loss=0.1625, simple_loss=0.2403, pruned_loss=0.04239, over 4978.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2063, pruned_loss=0.02817, over 971790.75 frames.], batch size: 25, lr: 1.18e-04 2022-05-09 14:41:59,734 INFO [train.py:715] (0/8) Epoch 19, batch 7250, loss[loss=0.13, simple_loss=0.2026, pruned_loss=0.02869, over 4965.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2066, pruned_loss=0.02845, over 971595.73 frames.], batch size: 15, lr: 1.18e-04 2022-05-09 14:42:39,096 INFO [train.py:715] (0/8) Epoch 19, batch 7300, loss[loss=0.1197, simple_loss=0.2026, pruned_loss=0.01839, over 4786.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2059, pruned_loss=0.0285, over 971699.63 frames.], batch size: 21, lr: 1.18e-04 2022-05-09 14:43:18,260 INFO [train.py:715] (0/8) Epoch 19, batch 7350, 
loss[loss=0.1425, simple_loss=0.2231, pruned_loss=0.03091, over 4950.00 frames.], tot_loss[loss=0.132, simple_loss=0.2068, pruned_loss=0.02862, over 973001.64 frames.], batch size: 29, lr: 1.18e-04 2022-05-09 14:43:57,160 INFO [train.py:715] (0/8) Epoch 19, batch 7400, loss[loss=0.1229, simple_loss=0.1964, pruned_loss=0.02472, over 4903.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2073, pruned_loss=0.02869, over 972501.94 frames.], batch size: 19, lr: 1.18e-04 2022-05-09 14:44:37,620 INFO [train.py:715] (0/8) Epoch 19, batch 7450, loss[loss=0.1224, simple_loss=0.1953, pruned_loss=0.02469, over 4831.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2068, pruned_loss=0.0288, over 971932.14 frames.], batch size: 12, lr: 1.18e-04 2022-05-09 14:45:17,481 INFO [train.py:715] (0/8) Epoch 19, batch 7500, loss[loss=0.1361, simple_loss=0.2047, pruned_loss=0.03381, over 4843.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2072, pruned_loss=0.02935, over 971910.51 frames.], batch size: 32, lr: 1.18e-04 2022-05-09 14:45:56,700 INFO [train.py:715] (0/8) Epoch 19, batch 7550, loss[loss=0.1495, simple_loss=0.2199, pruned_loss=0.0396, over 4688.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2072, pruned_loss=0.02897, over 971631.54 frames.], batch size: 15, lr: 1.18e-04 2022-05-09 14:46:36,051 INFO [train.py:715] (0/8) Epoch 19, batch 7600, loss[loss=0.1121, simple_loss=0.1759, pruned_loss=0.02418, over 4921.00 frames.], tot_loss[loss=0.1322, simple_loss=0.207, pruned_loss=0.02868, over 972419.17 frames.], batch size: 18, lr: 1.18e-04 2022-05-09 14:47:16,831 INFO [train.py:715] (0/8) Epoch 19, batch 7650, loss[loss=0.1377, simple_loss=0.2133, pruned_loss=0.03108, over 4841.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2078, pruned_loss=0.02921, over 972768.88 frames.], batch size: 13, lr: 1.18e-04 2022-05-09 14:47:56,151 INFO [train.py:715] (0/8) Epoch 19, batch 7700, loss[loss=0.14, simple_loss=0.2138, pruned_loss=0.03306, over 4969.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2072, pruned_loss=0.02898, over 972740.62 frames.], batch size: 35, lr: 1.18e-04 2022-05-09 14:48:34,961 INFO [train.py:715] (0/8) Epoch 19, batch 7750, loss[loss=0.1022, simple_loss=0.1688, pruned_loss=0.01782, over 4827.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2071, pruned_loss=0.02927, over 972738.73 frames.], batch size: 12, lr: 1.18e-04 2022-05-09 14:49:14,657 INFO [train.py:715] (0/8) Epoch 19, batch 7800, loss[loss=0.1415, simple_loss=0.225, pruned_loss=0.02899, over 4885.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2066, pruned_loss=0.02897, over 972915.57 frames.], batch size: 19, lr: 1.18e-04 2022-05-09 14:49:54,092 INFO [train.py:715] (0/8) Epoch 19, batch 7850, loss[loss=0.1162, simple_loss=0.1931, pruned_loss=0.01964, over 4913.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2058, pruned_loss=0.02857, over 972499.63 frames.], batch size: 19, lr: 1.18e-04 2022-05-09 14:50:33,360 INFO [train.py:715] (0/8) Epoch 19, batch 7900, loss[loss=0.1184, simple_loss=0.1995, pruned_loss=0.01861, over 4821.00 frames.], tot_loss[loss=0.132, simple_loss=0.2065, pruned_loss=0.02879, over 972378.31 frames.], batch size: 25, lr: 1.18e-04 2022-05-09 14:51:11,762 INFO [train.py:715] (0/8) Epoch 19, batch 7950, loss[loss=0.1255, simple_loss=0.1984, pruned_loss=0.02624, over 4796.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2069, pruned_loss=0.02891, over 971949.26 frames.], batch size: 24, lr: 1.18e-04 2022-05-09 14:51:51,088 INFO [train.py:715] (0/8) Epoch 19, batch 8000, loss[loss=0.1369, simple_loss=0.2094, 
pruned_loss=0.03217, over 4808.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2071, pruned_loss=0.02879, over 972599.86 frames.], batch size: 13, lr: 1.18e-04 2022-05-09 14:52:30,287 INFO [train.py:715] (0/8) Epoch 19, batch 8050, loss[loss=0.1141, simple_loss=0.187, pruned_loss=0.02063, over 4779.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2061, pruned_loss=0.02843, over 973053.54 frames.], batch size: 14, lr: 1.18e-04 2022-05-09 14:53:08,821 INFO [train.py:715] (0/8) Epoch 19, batch 8100, loss[loss=0.1258, simple_loss=0.1981, pruned_loss=0.02676, over 4930.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2063, pruned_loss=0.02893, over 972752.92 frames.], batch size: 23, lr: 1.18e-04 2022-05-09 14:53:48,273 INFO [train.py:715] (0/8) Epoch 19, batch 8150, loss[loss=0.1092, simple_loss=0.1837, pruned_loss=0.01732, over 4789.00 frames.], tot_loss[loss=0.132, simple_loss=0.2061, pruned_loss=0.02894, over 972237.75 frames.], batch size: 17, lr: 1.18e-04 2022-05-09 14:54:27,922 INFO [train.py:715] (0/8) Epoch 19, batch 8200, loss[loss=0.1122, simple_loss=0.1888, pruned_loss=0.01774, over 4981.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2051, pruned_loss=0.02829, over 971725.02 frames.], batch size: 28, lr: 1.18e-04 2022-05-09 14:55:06,907 INFO [train.py:715] (0/8) Epoch 19, batch 8250, loss[loss=0.1309, simple_loss=0.2128, pruned_loss=0.02447, over 4956.00 frames.], tot_loss[loss=0.1305, simple_loss=0.205, pruned_loss=0.02796, over 971371.02 frames.], batch size: 15, lr: 1.18e-04 2022-05-09 14:55:45,532 INFO [train.py:715] (0/8) Epoch 19, batch 8300, loss[loss=0.1634, simple_loss=0.2182, pruned_loss=0.05427, over 4821.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2047, pruned_loss=0.02837, over 971757.24 frames.], batch size: 15, lr: 1.18e-04 2022-05-09 14:56:25,222 INFO [train.py:715] (0/8) Epoch 19, batch 8350, loss[loss=0.1469, simple_loss=0.2163, pruned_loss=0.03879, over 4971.00 frames.], tot_loss[loss=0.1319, simple_loss=0.206, pruned_loss=0.02888, over 972276.23 frames.], batch size: 35, lr: 1.18e-04 2022-05-09 14:57:04,443 INFO [train.py:715] (0/8) Epoch 19, batch 8400, loss[loss=0.1315, simple_loss=0.2023, pruned_loss=0.03036, over 4905.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2056, pruned_loss=0.02867, over 971324.63 frames.], batch size: 17, lr: 1.18e-04 2022-05-09 14:57:43,450 INFO [train.py:715] (0/8) Epoch 19, batch 8450, loss[loss=0.1373, simple_loss=0.2196, pruned_loss=0.02752, over 4761.00 frames.], tot_loss[loss=0.1315, simple_loss=0.206, pruned_loss=0.02852, over 972036.30 frames.], batch size: 12, lr: 1.18e-04 2022-05-09 14:58:23,231 INFO [train.py:715] (0/8) Epoch 19, batch 8500, loss[loss=0.1407, simple_loss=0.228, pruned_loss=0.02667, over 4957.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2057, pruned_loss=0.0284, over 972398.23 frames.], batch size: 29, lr: 1.18e-04 2022-05-09 14:59:01,923 INFO [train.py:715] (0/8) Epoch 19, batch 8550, loss[loss=0.139, simple_loss=0.2145, pruned_loss=0.03174, over 4853.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2058, pruned_loss=0.02854, over 972362.58 frames.], batch size: 22, lr: 1.18e-04 2022-05-09 14:59:41,019 INFO [train.py:715] (0/8) Epoch 19, batch 8600, loss[loss=0.1372, simple_loss=0.2074, pruned_loss=0.03351, over 4925.00 frames.], tot_loss[loss=0.1315, simple_loss=0.206, pruned_loss=0.02848, over 972389.08 frames.], batch size: 18, lr: 1.18e-04 2022-05-09 15:00:20,524 INFO [train.py:715] (0/8) Epoch 19, batch 8650, loss[loss=0.1256, simple_loss=0.1988, pruned_loss=0.02619, over 4795.00 
frames.], tot_loss[loss=0.1312, simple_loss=0.206, pruned_loss=0.02816, over 971982.14 frames.], batch size: 21, lr: 1.18e-04 2022-05-09 15:01:00,038 INFO [train.py:715] (0/8) Epoch 19, batch 8700, loss[loss=0.1162, simple_loss=0.1913, pruned_loss=0.02056, over 4974.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2061, pruned_loss=0.02869, over 972169.46 frames.], batch size: 28, lr: 1.18e-04 2022-05-09 15:01:39,201 INFO [train.py:715] (0/8) Epoch 19, batch 8750, loss[loss=0.118, simple_loss=0.1938, pruned_loss=0.02114, over 4797.00 frames.], tot_loss[loss=0.132, simple_loss=0.2063, pruned_loss=0.02886, over 971838.91 frames.], batch size: 12, lr: 1.18e-04 2022-05-09 15:02:17,955 INFO [train.py:715] (0/8) Epoch 19, batch 8800, loss[loss=0.1275, simple_loss=0.2018, pruned_loss=0.02659, over 4877.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2062, pruned_loss=0.02846, over 972406.72 frames.], batch size: 20, lr: 1.18e-04 2022-05-09 15:02:57,606 INFO [train.py:715] (0/8) Epoch 19, batch 8850, loss[loss=0.1099, simple_loss=0.1756, pruned_loss=0.02212, over 4804.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2056, pruned_loss=0.02814, over 972149.68 frames.], batch size: 12, lr: 1.18e-04 2022-05-09 15:03:36,662 INFO [train.py:715] (0/8) Epoch 19, batch 8900, loss[loss=0.1131, simple_loss=0.1837, pruned_loss=0.02119, over 4728.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2055, pruned_loss=0.02841, over 971778.66 frames.], batch size: 12, lr: 1.18e-04 2022-05-09 15:04:16,004 INFO [train.py:715] (0/8) Epoch 19, batch 8950, loss[loss=0.1523, simple_loss=0.2209, pruned_loss=0.04185, over 4984.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2059, pruned_loss=0.02883, over 972229.62 frames.], batch size: 33, lr: 1.18e-04 2022-05-09 15:04:54,899 INFO [train.py:715] (0/8) Epoch 19, batch 9000, loss[loss=0.1462, simple_loss=0.2204, pruned_loss=0.03601, over 4962.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2065, pruned_loss=0.029, over 973362.74 frames.], batch size: 24, lr: 1.18e-04 2022-05-09 15:04:54,900 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 15:05:04,820 INFO [train.py:742] (0/8) Epoch 19, validation: loss=0.1047, simple_loss=0.1879, pruned_loss=0.01072, over 914524.00 frames. 
2022-05-09 15:05:44,265 INFO [train.py:715] (0/8) Epoch 19, batch 9050, loss[loss=0.1341, simple_loss=0.1972, pruned_loss=0.03554, over 4795.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2063, pruned_loss=0.02908, over 973909.10 frames.], batch size: 14, lr: 1.18e-04 2022-05-09 15:06:23,986 INFO [train.py:715] (0/8) Epoch 19, batch 9100, loss[loss=0.1274, simple_loss=0.2101, pruned_loss=0.02233, over 4903.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2058, pruned_loss=0.02867, over 973854.84 frames.], batch size: 19, lr: 1.18e-04 2022-05-09 15:07:03,252 INFO [train.py:715] (0/8) Epoch 19, batch 9150, loss[loss=0.157, simple_loss=0.2256, pruned_loss=0.0442, over 4949.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2057, pruned_loss=0.02858, over 973684.36 frames.], batch size: 39, lr: 1.18e-04 2022-05-09 15:07:42,030 INFO [train.py:715] (0/8) Epoch 19, batch 9200, loss[loss=0.1219, simple_loss=0.1839, pruned_loss=0.02991, over 4985.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2049, pruned_loss=0.0283, over 972987.88 frames.], batch size: 14, lr: 1.18e-04 2022-05-09 15:08:21,754 INFO [train.py:715] (0/8) Epoch 19, batch 9250, loss[loss=0.1215, simple_loss=0.1965, pruned_loss=0.02328, over 4973.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2054, pruned_loss=0.02837, over 972361.27 frames.], batch size: 24, lr: 1.18e-04 2022-05-09 15:09:00,951 INFO [train.py:715] (0/8) Epoch 19, batch 9300, loss[loss=0.132, simple_loss=0.2047, pruned_loss=0.02964, over 4763.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2065, pruned_loss=0.02862, over 972597.03 frames.], batch size: 19, lr: 1.18e-04 2022-05-09 15:09:39,864 INFO [train.py:715] (0/8) Epoch 19, batch 9350, loss[loss=0.1089, simple_loss=0.1847, pruned_loss=0.01652, over 4841.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2063, pruned_loss=0.02847, over 972429.09 frames.], batch size: 13, lr: 1.18e-04 2022-05-09 15:10:19,955 INFO [train.py:715] (0/8) Epoch 19, batch 9400, loss[loss=0.1223, simple_loss=0.2001, pruned_loss=0.02223, over 4833.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2062, pruned_loss=0.02856, over 972606.83 frames.], batch size: 13, lr: 1.18e-04 2022-05-09 15:11:00,057 INFO [train.py:715] (0/8) Epoch 19, batch 9450, loss[loss=0.1158, simple_loss=0.1969, pruned_loss=0.01738, over 4746.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2063, pruned_loss=0.02829, over 972890.72 frames.], batch size: 19, lr: 1.18e-04 2022-05-09 15:11:38,883 INFO [train.py:715] (0/8) Epoch 19, batch 9500, loss[loss=0.1192, simple_loss=0.1946, pruned_loss=0.0219, over 4767.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2063, pruned_loss=0.02823, over 972983.50 frames.], batch size: 18, lr: 1.18e-04 2022-05-09 15:12:18,092 INFO [train.py:715] (0/8) Epoch 19, batch 9550, loss[loss=0.1324, simple_loss=0.2058, pruned_loss=0.02951, over 4950.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2065, pruned_loss=0.02863, over 973041.77 frames.], batch size: 21, lr: 1.18e-04 2022-05-09 15:12:57,472 INFO [train.py:715] (0/8) Epoch 19, batch 9600, loss[loss=0.1364, simple_loss=0.2087, pruned_loss=0.03202, over 4908.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2066, pruned_loss=0.02895, over 972121.64 frames.], batch size: 17, lr: 1.18e-04 2022-05-09 15:13:36,647 INFO [train.py:715] (0/8) Epoch 19, batch 9650, loss[loss=0.1239, simple_loss=0.1988, pruned_loss=0.02448, over 4918.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2061, pruned_loss=0.02833, over 972753.07 frames.], batch size: 17, lr: 1.18e-04 2022-05-09 15:14:14,973 INFO 
[train.py:715] (0/8) Epoch 19, batch 9700, loss[loss=0.1412, simple_loss=0.2074, pruned_loss=0.03749, over 4837.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2059, pruned_loss=0.02817, over 972945.03 frames.], batch size: 30, lr: 1.18e-04 2022-05-09 15:14:54,702 INFO [train.py:715] (0/8) Epoch 19, batch 9750, loss[loss=0.1198, simple_loss=0.193, pruned_loss=0.02335, over 4793.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2056, pruned_loss=0.02836, over 972076.84 frames.], batch size: 12, lr: 1.18e-04 2022-05-09 15:15:34,782 INFO [train.py:715] (0/8) Epoch 19, batch 9800, loss[loss=0.1144, simple_loss=0.1876, pruned_loss=0.02056, over 4746.00 frames.], tot_loss[loss=0.1307, simple_loss=0.205, pruned_loss=0.02818, over 972343.52 frames.], batch size: 16, lr: 1.18e-04 2022-05-09 15:16:14,504 INFO [train.py:715] (0/8) Epoch 19, batch 9850, loss[loss=0.134, simple_loss=0.2229, pruned_loss=0.02252, over 4808.00 frames.], tot_loss[loss=0.1305, simple_loss=0.2053, pruned_loss=0.02779, over 972442.48 frames.], batch size: 24, lr: 1.18e-04 2022-05-09 15:16:53,381 INFO [train.py:715] (0/8) Epoch 19, batch 9900, loss[loss=0.1382, simple_loss=0.2049, pruned_loss=0.03581, over 4916.00 frames.], tot_loss[loss=0.1305, simple_loss=0.2053, pruned_loss=0.02789, over 972615.42 frames.], batch size: 17, lr: 1.18e-04 2022-05-09 15:17:33,332 INFO [train.py:715] (0/8) Epoch 19, batch 9950, loss[loss=0.1223, simple_loss=0.197, pruned_loss=0.02375, over 4830.00 frames.], tot_loss[loss=0.1312, simple_loss=0.206, pruned_loss=0.02818, over 973185.07 frames.], batch size: 13, lr: 1.18e-04 2022-05-09 15:18:12,857 INFO [train.py:715] (0/8) Epoch 19, batch 10000, loss[loss=0.1363, simple_loss=0.2109, pruned_loss=0.03082, over 4939.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2059, pruned_loss=0.02822, over 973188.46 frames.], batch size: 21, lr: 1.18e-04 2022-05-09 15:18:51,541 INFO [train.py:715] (0/8) Epoch 19, batch 10050, loss[loss=0.109, simple_loss=0.1896, pruned_loss=0.01422, over 4879.00 frames.], tot_loss[loss=0.131, simple_loss=0.2059, pruned_loss=0.02809, over 972774.46 frames.], batch size: 16, lr: 1.18e-04 2022-05-09 15:19:31,274 INFO [train.py:715] (0/8) Epoch 19, batch 10100, loss[loss=0.1329, simple_loss=0.2121, pruned_loss=0.02681, over 4933.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2063, pruned_loss=0.02796, over 972614.33 frames.], batch size: 29, lr: 1.17e-04 2022-05-09 15:20:10,774 INFO [train.py:715] (0/8) Epoch 19, batch 10150, loss[loss=0.1334, simple_loss=0.2109, pruned_loss=0.02796, over 4886.00 frames.], tot_loss[loss=0.1303, simple_loss=0.2057, pruned_loss=0.02743, over 971868.96 frames.], batch size: 32, lr: 1.17e-04 2022-05-09 15:20:49,757 INFO [train.py:715] (0/8) Epoch 19, batch 10200, loss[loss=0.1637, simple_loss=0.2332, pruned_loss=0.04708, over 4774.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2057, pruned_loss=0.02785, over 972213.04 frames.], batch size: 17, lr: 1.17e-04 2022-05-09 15:21:29,136 INFO [train.py:715] (0/8) Epoch 19, batch 10250, loss[loss=0.1291, simple_loss=0.1982, pruned_loss=0.02996, over 4738.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2061, pruned_loss=0.02813, over 972025.47 frames.], batch size: 16, lr: 1.17e-04 2022-05-09 15:22:09,266 INFO [train.py:715] (0/8) Epoch 19, batch 10300, loss[loss=0.1435, simple_loss=0.2205, pruned_loss=0.03324, over 4845.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2061, pruned_loss=0.02826, over 972512.94 frames.], batch size: 25, lr: 1.17e-04 2022-05-09 15:22:48,849 INFO [train.py:715] (0/8) Epoch 19, 
batch 10350, loss[loss=0.1612, simple_loss=0.2297, pruned_loss=0.04633, over 4781.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2062, pruned_loss=0.02856, over 972735.61 frames.], batch size: 18, lr: 1.17e-04 2022-05-09 15:23:27,532 INFO [train.py:715] (0/8) Epoch 19, batch 10400, loss[loss=0.1422, simple_loss=0.2091, pruned_loss=0.03767, over 4985.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2061, pruned_loss=0.0283, over 972704.48 frames.], batch size: 25, lr: 1.17e-04 2022-05-09 15:24:07,290 INFO [train.py:715] (0/8) Epoch 19, batch 10450, loss[loss=0.1411, simple_loss=0.217, pruned_loss=0.03262, over 4977.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2055, pruned_loss=0.02807, over 972256.17 frames.], batch size: 25, lr: 1.17e-04 2022-05-09 15:24:47,015 INFO [train.py:715] (0/8) Epoch 19, batch 10500, loss[loss=0.1227, simple_loss=0.2064, pruned_loss=0.01946, over 4878.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2055, pruned_loss=0.02805, over 972367.73 frames.], batch size: 32, lr: 1.17e-04 2022-05-09 15:25:25,923 INFO [train.py:715] (0/8) Epoch 19, batch 10550, loss[loss=0.1399, simple_loss=0.2051, pruned_loss=0.03736, over 4881.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2054, pruned_loss=0.02791, over 973142.60 frames.], batch size: 32, lr: 1.17e-04 2022-05-09 15:26:04,896 INFO [train.py:715] (0/8) Epoch 19, batch 10600, loss[loss=0.1418, simple_loss=0.2162, pruned_loss=0.03365, over 4751.00 frames.], tot_loss[loss=0.1311, simple_loss=0.206, pruned_loss=0.02807, over 972911.00 frames.], batch size: 16, lr: 1.17e-04 2022-05-09 15:26:22,047 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-672000.pt 2022-05-09 15:26:47,168 INFO [train.py:715] (0/8) Epoch 19, batch 10650, loss[loss=0.1275, simple_loss=0.1927, pruned_loss=0.03114, over 4944.00 frames.], tot_loss[loss=0.1304, simple_loss=0.2052, pruned_loss=0.02782, over 973338.66 frames.], batch size: 21, lr: 1.17e-04 2022-05-09 15:27:26,342 INFO [train.py:715] (0/8) Epoch 19, batch 10700, loss[loss=0.119, simple_loss=0.1804, pruned_loss=0.0288, over 4856.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2064, pruned_loss=0.02843, over 974298.26 frames.], batch size: 30, lr: 1.17e-04 2022-05-09 15:28:05,701 INFO [train.py:715] (0/8) Epoch 19, batch 10750, loss[loss=0.1245, simple_loss=0.199, pruned_loss=0.02499, over 4818.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2056, pruned_loss=0.02801, over 973185.54 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 15:28:45,278 INFO [train.py:715] (0/8) Epoch 19, batch 10800, loss[loss=0.1387, simple_loss=0.2139, pruned_loss=0.03174, over 4986.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2061, pruned_loss=0.02835, over 974053.15 frames.], batch size: 14, lr: 1.17e-04 2022-05-09 15:29:25,025 INFO [train.py:715] (0/8) Epoch 19, batch 10850, loss[loss=0.1318, simple_loss=0.1971, pruned_loss=0.03321, over 4834.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2061, pruned_loss=0.02873, over 973060.04 frames.], batch size: 13, lr: 1.17e-04 2022-05-09 15:30:03,630 INFO [train.py:715] (0/8) Epoch 19, batch 10900, loss[loss=0.1175, simple_loss=0.193, pruned_loss=0.02105, over 4814.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2055, pruned_loss=0.02815, over 972727.88 frames.], batch size: 12, lr: 1.17e-04 2022-05-09 15:30:42,648 INFO [train.py:715] (0/8) Epoch 19, batch 10950, loss[loss=0.1439, simple_loss=0.1987, pruned_loss=0.04457, over 4700.00 frames.], tot_loss[loss=0.1306, simple_loss=0.205, pruned_loss=0.02812, over 
972647.53 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 15:31:22,353 INFO [train.py:715] (0/8) Epoch 19, batch 11000, loss[loss=0.1133, simple_loss=0.1871, pruned_loss=0.01981, over 4974.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2059, pruned_loss=0.02819, over 971511.56 frames.], batch size: 40, lr: 1.17e-04 2022-05-09 15:32:02,208 INFO [train.py:715] (0/8) Epoch 19, batch 11050, loss[loss=0.1399, simple_loss=0.2179, pruned_loss=0.0309, over 4972.00 frames.], tot_loss[loss=0.1302, simple_loss=0.2053, pruned_loss=0.02756, over 972184.17 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 15:32:40,674 INFO [train.py:715] (0/8) Epoch 19, batch 11100, loss[loss=0.1466, simple_loss=0.2224, pruned_loss=0.03538, over 4986.00 frames.], tot_loss[loss=0.1297, simple_loss=0.205, pruned_loss=0.02724, over 971679.13 frames.], batch size: 31, lr: 1.17e-04 2022-05-09 15:33:20,043 INFO [train.py:715] (0/8) Epoch 19, batch 11150, loss[loss=0.1373, simple_loss=0.2064, pruned_loss=0.03412, over 4915.00 frames.], tot_loss[loss=0.1305, simple_loss=0.2056, pruned_loss=0.02771, over 971323.81 frames.], batch size: 18, lr: 1.17e-04 2022-05-09 15:33:59,452 INFO [train.py:715] (0/8) Epoch 19, batch 11200, loss[loss=0.1646, simple_loss=0.2212, pruned_loss=0.05404, over 4887.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2054, pruned_loss=0.028, over 971788.45 frames.], batch size: 16, lr: 1.17e-04 2022-05-09 15:34:38,832 INFO [train.py:715] (0/8) Epoch 19, batch 11250, loss[loss=0.1387, simple_loss=0.2112, pruned_loss=0.03312, over 4886.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2059, pruned_loss=0.02811, over 972135.06 frames.], batch size: 22, lr: 1.17e-04 2022-05-09 15:35:18,202 INFO [train.py:715] (0/8) Epoch 19, batch 11300, loss[loss=0.1369, simple_loss=0.2074, pruned_loss=0.0332, over 4873.00 frames.], tot_loss[loss=0.131, simple_loss=0.2056, pruned_loss=0.02824, over 971295.12 frames.], batch size: 22, lr: 1.17e-04 2022-05-09 15:35:56,985 INFO [train.py:715] (0/8) Epoch 19, batch 11350, loss[loss=0.1171, simple_loss=0.198, pruned_loss=0.01817, over 4801.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2063, pruned_loss=0.02836, over 971238.12 frames.], batch size: 25, lr: 1.17e-04 2022-05-09 15:36:36,597 INFO [train.py:715] (0/8) Epoch 19, batch 11400, loss[loss=0.1262, simple_loss=0.2061, pruned_loss=0.02315, over 4938.00 frames.], tot_loss[loss=0.1313, simple_loss=0.206, pruned_loss=0.02836, over 971254.93 frames.], batch size: 29, lr: 1.17e-04 2022-05-09 15:37:16,205 INFO [train.py:715] (0/8) Epoch 19, batch 11450, loss[loss=0.1148, simple_loss=0.1959, pruned_loss=0.01691, over 4809.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2056, pruned_loss=0.02787, over 970370.10 frames.], batch size: 21, lr: 1.17e-04 2022-05-09 15:37:56,153 INFO [train.py:715] (0/8) Epoch 19, batch 11500, loss[loss=0.1357, simple_loss=0.198, pruned_loss=0.0367, over 4887.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2057, pruned_loss=0.02793, over 971738.55 frames.], batch size: 22, lr: 1.17e-04 2022-05-09 15:38:35,579 INFO [train.py:715] (0/8) Epoch 19, batch 11550, loss[loss=0.1036, simple_loss=0.1784, pruned_loss=0.01436, over 4796.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2058, pruned_loss=0.028, over 972278.58 frames.], batch size: 14, lr: 1.17e-04 2022-05-09 15:39:14,554 INFO [train.py:715] (0/8) Epoch 19, batch 11600, loss[loss=0.1243, simple_loss=0.1955, pruned_loss=0.02657, over 4776.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2064, pruned_loss=0.02815, over 972262.44 frames.], batch 
size: 18, lr: 1.17e-04 2022-05-09 15:39:54,375 INFO [train.py:715] (0/8) Epoch 19, batch 11650, loss[loss=0.1259, simple_loss=0.1999, pruned_loss=0.02599, over 4865.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2061, pruned_loss=0.02823, over 972333.22 frames.], batch size: 20, lr: 1.17e-04 2022-05-09 15:40:33,462 INFO [train.py:715] (0/8) Epoch 19, batch 11700, loss[loss=0.1275, simple_loss=0.1991, pruned_loss=0.02792, over 4978.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2067, pruned_loss=0.02841, over 971879.68 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 15:41:13,001 INFO [train.py:715] (0/8) Epoch 19, batch 11750, loss[loss=0.132, simple_loss=0.2133, pruned_loss=0.02535, over 4779.00 frames.], tot_loss[loss=0.132, simple_loss=0.2067, pruned_loss=0.02862, over 971940.55 frames.], batch size: 18, lr: 1.17e-04 2022-05-09 15:41:52,541 INFO [train.py:715] (0/8) Epoch 19, batch 11800, loss[loss=0.1339, simple_loss=0.2047, pruned_loss=0.03156, over 4850.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2062, pruned_loss=0.02847, over 972927.38 frames.], batch size: 20, lr: 1.17e-04 2022-05-09 15:42:32,201 INFO [train.py:715] (0/8) Epoch 19, batch 11850, loss[loss=0.1226, simple_loss=0.1915, pruned_loss=0.02688, over 4926.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2066, pruned_loss=0.02894, over 972872.78 frames.], batch size: 18, lr: 1.17e-04 2022-05-09 15:43:11,794 INFO [train.py:715] (0/8) Epoch 19, batch 11900, loss[loss=0.1377, simple_loss=0.2103, pruned_loss=0.03255, over 4983.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2066, pruned_loss=0.02889, over 973137.37 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 15:43:51,292 INFO [train.py:715] (0/8) Epoch 19, batch 11950, loss[loss=0.1252, simple_loss=0.1986, pruned_loss=0.02592, over 4749.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2056, pruned_loss=0.02851, over 972459.55 frames.], batch size: 19, lr: 1.17e-04 2022-05-09 15:44:30,454 INFO [train.py:715] (0/8) Epoch 19, batch 12000, loss[loss=0.128, simple_loss=0.2058, pruned_loss=0.0251, over 4960.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2058, pruned_loss=0.02843, over 972775.48 frames.], batch size: 24, lr: 1.17e-04 2022-05-09 15:44:30,455 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 15:44:40,312 INFO [train.py:742] (0/8) Epoch 19, validation: loss=0.1044, simple_loss=0.1877, pruned_loss=0.01054, over 914524.00 frames. 
2022-05-09 15:45:20,291 INFO [train.py:715] (0/8) Epoch 19, batch 12050, loss[loss=0.1103, simple_loss=0.1915, pruned_loss=0.01456, over 4873.00 frames.], tot_loss[loss=0.1301, simple_loss=0.2048, pruned_loss=0.02771, over 972193.48 frames.], batch size: 22, lr: 1.17e-04 2022-05-09 15:46:00,174 INFO [train.py:715] (0/8) Epoch 19, batch 12100, loss[loss=0.1105, simple_loss=0.1843, pruned_loss=0.01835, over 4762.00 frames.], tot_loss[loss=0.131, simple_loss=0.2057, pruned_loss=0.02816, over 971584.47 frames.], batch size: 12, lr: 1.17e-04 2022-05-09 15:46:39,303 INFO [train.py:715] (0/8) Epoch 19, batch 12150, loss[loss=0.1237, simple_loss=0.1912, pruned_loss=0.0281, over 4985.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2058, pruned_loss=0.02834, over 972200.62 frames.], batch size: 28, lr: 1.17e-04 2022-05-09 15:47:18,768 INFO [train.py:715] (0/8) Epoch 19, batch 12200, loss[loss=0.1577, simple_loss=0.2302, pruned_loss=0.04266, over 4803.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2066, pruned_loss=0.02848, over 971354.83 frames.], batch size: 12, lr: 1.17e-04 2022-05-09 15:47:58,231 INFO [train.py:715] (0/8) Epoch 19, batch 12250, loss[loss=0.1397, simple_loss=0.2109, pruned_loss=0.03427, over 4950.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2072, pruned_loss=0.0288, over 971197.31 frames.], batch size: 14, lr: 1.17e-04 2022-05-09 15:48:37,866 INFO [train.py:715] (0/8) Epoch 19, batch 12300, loss[loss=0.1367, simple_loss=0.215, pruned_loss=0.02918, over 4742.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2073, pruned_loss=0.02885, over 969950.01 frames.], batch size: 19, lr: 1.17e-04 2022-05-09 15:49:17,545 INFO [train.py:715] (0/8) Epoch 19, batch 12350, loss[loss=0.1402, simple_loss=0.2149, pruned_loss=0.03278, over 4982.00 frames.], tot_loss[loss=0.132, simple_loss=0.2068, pruned_loss=0.02863, over 971023.84 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 15:49:56,102 INFO [train.py:715] (0/8) Epoch 19, batch 12400, loss[loss=0.1562, simple_loss=0.2183, pruned_loss=0.04708, over 4754.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2062, pruned_loss=0.02852, over 971209.47 frames.], batch size: 19, lr: 1.17e-04 2022-05-09 15:50:35,581 INFO [train.py:715] (0/8) Epoch 19, batch 12450, loss[loss=0.1199, simple_loss=0.1987, pruned_loss=0.02052, over 4782.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2058, pruned_loss=0.02828, over 971913.11 frames.], batch size: 18, lr: 1.17e-04 2022-05-09 15:51:14,299 INFO [train.py:715] (0/8) Epoch 19, batch 12500, loss[loss=0.1035, simple_loss=0.1758, pruned_loss=0.01558, over 4865.00 frames.], tot_loss[loss=0.131, simple_loss=0.2056, pruned_loss=0.02816, over 972443.44 frames.], batch size: 20, lr: 1.17e-04 2022-05-09 15:51:53,637 INFO [train.py:715] (0/8) Epoch 19, batch 12550, loss[loss=0.1381, simple_loss=0.2158, pruned_loss=0.03022, over 4753.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2059, pruned_loss=0.02844, over 972041.28 frames.], batch size: 14, lr: 1.17e-04 2022-05-09 15:52:33,121 INFO [train.py:715] (0/8) Epoch 19, batch 12600, loss[loss=0.1539, simple_loss=0.225, pruned_loss=0.0414, over 4849.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2059, pruned_loss=0.02827, over 972330.12 frames.], batch size: 34, lr: 1.17e-04 2022-05-09 15:53:12,608 INFO [train.py:715] (0/8) Epoch 19, batch 12650, loss[loss=0.1107, simple_loss=0.1879, pruned_loss=0.01674, over 4824.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2065, pruned_loss=0.02838, over 972568.23 frames.], batch size: 13, lr: 1.17e-04 2022-05-09 15:53:51,566 
INFO [train.py:715] (0/8) Epoch 19, batch 12700, loss[loss=0.1408, simple_loss=0.217, pruned_loss=0.03231, over 4972.00 frames.], tot_loss[loss=0.132, simple_loss=0.2066, pruned_loss=0.02867, over 971854.51 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 15:54:30,741 INFO [train.py:715] (0/8) Epoch 19, batch 12750, loss[loss=0.1392, simple_loss=0.2205, pruned_loss=0.02892, over 4977.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2067, pruned_loss=0.02881, over 971369.27 frames.], batch size: 39, lr: 1.17e-04 2022-05-09 15:55:10,385 INFO [train.py:715] (0/8) Epoch 19, batch 12800, loss[loss=0.1294, simple_loss=0.2, pruned_loss=0.02939, over 4906.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2069, pruned_loss=0.02882, over 972105.68 frames.], batch size: 18, lr: 1.17e-04 2022-05-09 15:55:49,787 INFO [train.py:715] (0/8) Epoch 19, batch 12850, loss[loss=0.1316, simple_loss=0.2088, pruned_loss=0.0272, over 4787.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2072, pruned_loss=0.02876, over 972658.27 frames.], batch size: 14, lr: 1.17e-04 2022-05-09 15:56:28,728 INFO [train.py:715] (0/8) Epoch 19, batch 12900, loss[loss=0.1031, simple_loss=0.1757, pruned_loss=0.01524, over 4769.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2063, pruned_loss=0.02853, over 972676.59 frames.], batch size: 12, lr: 1.17e-04 2022-05-09 15:57:08,289 INFO [train.py:715] (0/8) Epoch 19, batch 12950, loss[loss=0.1412, simple_loss=0.2041, pruned_loss=0.0392, over 4979.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2057, pruned_loss=0.02807, over 973128.38 frames.], batch size: 25, lr: 1.17e-04 2022-05-09 15:57:47,530 INFO [train.py:715] (0/8) Epoch 19, batch 13000, loss[loss=0.1598, simple_loss=0.2341, pruned_loss=0.04273, over 4914.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2055, pruned_loss=0.02797, over 972757.62 frames.], batch size: 18, lr: 1.17e-04 2022-05-09 15:58:26,676 INFO [train.py:715] (0/8) Epoch 19, batch 13050, loss[loss=0.1196, simple_loss=0.1923, pruned_loss=0.0234, over 4991.00 frames.], tot_loss[loss=0.1304, simple_loss=0.2053, pruned_loss=0.0277, over 972744.99 frames.], batch size: 14, lr: 1.17e-04 2022-05-09 15:59:05,568 INFO [train.py:715] (0/8) Epoch 19, batch 13100, loss[loss=0.1262, simple_loss=0.2099, pruned_loss=0.02131, over 4954.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2055, pruned_loss=0.02808, over 972658.14 frames.], batch size: 21, lr: 1.17e-04 2022-05-09 15:59:44,831 INFO [train.py:715] (0/8) Epoch 19, batch 13150, loss[loss=0.1447, simple_loss=0.224, pruned_loss=0.03273, over 4866.00 frames.], tot_loss[loss=0.131, simple_loss=0.2054, pruned_loss=0.02829, over 972782.88 frames.], batch size: 32, lr: 1.17e-04 2022-05-09 16:00:24,454 INFO [train.py:715] (0/8) Epoch 19, batch 13200, loss[loss=0.1094, simple_loss=0.1806, pruned_loss=0.0191, over 4759.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2051, pruned_loss=0.02818, over 972193.79 frames.], batch size: 16, lr: 1.17e-04 2022-05-09 16:01:03,677 INFO [train.py:715] (0/8) Epoch 19, batch 13250, loss[loss=0.1488, simple_loss=0.2098, pruned_loss=0.04388, over 4960.00 frames.], tot_loss[loss=0.1304, simple_loss=0.2047, pruned_loss=0.02806, over 971888.18 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 16:01:42,979 INFO [train.py:715] (0/8) Epoch 19, batch 13300, loss[loss=0.1271, simple_loss=0.1944, pruned_loss=0.02995, over 4775.00 frames.], tot_loss[loss=0.1305, simple_loss=0.2047, pruned_loss=0.02817, over 971620.13 frames.], batch size: 14, lr: 1.17e-04 2022-05-09 16:02:22,612 INFO [train.py:715] (0/8) 
Epoch 19, batch 13350, loss[loss=0.1397, simple_loss=0.2175, pruned_loss=0.03094, over 4887.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2054, pruned_loss=0.02836, over 972870.59 frames.], batch size: 22, lr: 1.17e-04 2022-05-09 16:03:01,276 INFO [train.py:715] (0/8) Epoch 19, batch 13400, loss[loss=0.1081, simple_loss=0.1819, pruned_loss=0.01717, over 4806.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2055, pruned_loss=0.02805, over 972413.08 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 16:03:40,772 INFO [train.py:715] (0/8) Epoch 19, batch 13450, loss[loss=0.1322, simple_loss=0.2087, pruned_loss=0.02789, over 4913.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2057, pruned_loss=0.02828, over 972391.05 frames.], batch size: 18, lr: 1.17e-04 2022-05-09 16:04:20,044 INFO [train.py:715] (0/8) Epoch 19, batch 13500, loss[loss=0.1296, simple_loss=0.2029, pruned_loss=0.02822, over 4968.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2061, pruned_loss=0.02848, over 973101.89 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 16:04:59,468 INFO [train.py:715] (0/8) Epoch 19, batch 13550, loss[loss=0.1008, simple_loss=0.1719, pruned_loss=0.0149, over 4883.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2055, pruned_loss=0.02813, over 972936.29 frames.], batch size: 22, lr: 1.17e-04 2022-05-09 16:05:38,404 INFO [train.py:715] (0/8) Epoch 19, batch 13600, loss[loss=0.119, simple_loss=0.1972, pruned_loss=0.02041, over 4884.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2052, pruned_loss=0.02827, over 972618.50 frames.], batch size: 16, lr: 1.17e-04 2022-05-09 16:06:17,569 INFO [train.py:715] (0/8) Epoch 19, batch 13650, loss[loss=0.1552, simple_loss=0.2186, pruned_loss=0.04592, over 4855.00 frames.], tot_loss[loss=0.132, simple_loss=0.2063, pruned_loss=0.02884, over 972434.65 frames.], batch size: 32, lr: 1.17e-04 2022-05-09 16:06:57,012 INFO [train.py:715] (0/8) Epoch 19, batch 13700, loss[loss=0.1332, simple_loss=0.2063, pruned_loss=0.03008, over 4803.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2061, pruned_loss=0.0287, over 972615.62 frames.], batch size: 21, lr: 1.17e-04 2022-05-09 16:07:35,729 INFO [train.py:715] (0/8) Epoch 19, batch 13750, loss[loss=0.1132, simple_loss=0.1977, pruned_loss=0.01437, over 4927.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2062, pruned_loss=0.02857, over 972241.09 frames.], batch size: 29, lr: 1.17e-04 2022-05-09 16:08:14,997 INFO [train.py:715] (0/8) Epoch 19, batch 13800, loss[loss=0.1378, simple_loss=0.2035, pruned_loss=0.03604, over 4850.00 frames.], tot_loss[loss=0.132, simple_loss=0.2063, pruned_loss=0.02883, over 972585.72 frames.], batch size: 13, lr: 1.17e-04 2022-05-09 16:08:55,057 INFO [train.py:715] (0/8) Epoch 19, batch 13850, loss[loss=0.1628, simple_loss=0.2316, pruned_loss=0.04697, over 4914.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2074, pruned_loss=0.02908, over 972323.38 frames.], batch size: 39, lr: 1.17e-04 2022-05-09 16:09:34,675 INFO [train.py:715] (0/8) Epoch 19, batch 13900, loss[loss=0.1208, simple_loss=0.189, pruned_loss=0.02631, over 4862.00 frames.], tot_loss[loss=0.132, simple_loss=0.2068, pruned_loss=0.02862, over 972382.95 frames.], batch size: 32, lr: 1.17e-04 2022-05-09 16:10:14,441 INFO [train.py:715] (0/8) Epoch 19, batch 13950, loss[loss=0.1282, simple_loss=0.2013, pruned_loss=0.02753, over 4860.00 frames.], tot_loss[loss=0.131, simple_loss=0.2054, pruned_loss=0.02834, over 972745.39 frames.], batch size: 20, lr: 1.17e-04 2022-05-09 16:10:53,215 INFO [train.py:715] (0/8) Epoch 19, batch 14000, 
loss[loss=0.1232, simple_loss=0.2023, pruned_loss=0.02209, over 4940.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2054, pruned_loss=0.02822, over 972894.09 frames.], batch size: 21, lr: 1.17e-04 2022-05-09 16:11:32,644 INFO [train.py:715] (0/8) Epoch 19, batch 14050, loss[loss=0.1449, simple_loss=0.2191, pruned_loss=0.03532, over 4961.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2055, pruned_loss=0.02819, over 972533.89 frames.], batch size: 31, lr: 1.17e-04 2022-05-09 16:12:11,844 INFO [train.py:715] (0/8) Epoch 19, batch 14100, loss[loss=0.1564, simple_loss=0.233, pruned_loss=0.03989, over 4867.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2063, pruned_loss=0.02838, over 971831.00 frames.], batch size: 39, lr: 1.17e-04 2022-05-09 16:12:51,003 INFO [train.py:715] (0/8) Epoch 19, batch 14150, loss[loss=0.1228, simple_loss=0.2003, pruned_loss=0.02267, over 4829.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2061, pruned_loss=0.02847, over 972050.08 frames.], batch size: 26, lr: 1.17e-04 2022-05-09 16:13:30,121 INFO [train.py:715] (0/8) Epoch 19, batch 14200, loss[loss=0.1163, simple_loss=0.1914, pruned_loss=0.02061, over 4908.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2055, pruned_loss=0.02817, over 971679.23 frames.], batch size: 19, lr: 1.17e-04 2022-05-09 16:14:08,938 INFO [train.py:715] (0/8) Epoch 19, batch 14250, loss[loss=0.144, simple_loss=0.212, pruned_loss=0.03795, over 4969.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2061, pruned_loss=0.02862, over 971602.77 frames.], batch size: 31, lr: 1.17e-04 2022-05-09 16:14:48,114 INFO [train.py:715] (0/8) Epoch 19, batch 14300, loss[loss=0.1121, simple_loss=0.1904, pruned_loss=0.01693, over 4784.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2068, pruned_loss=0.02885, over 972064.55 frames.], batch size: 14, lr: 1.17e-04 2022-05-09 16:15:27,230 INFO [train.py:715] (0/8) Epoch 19, batch 14350, loss[loss=0.1508, simple_loss=0.2232, pruned_loss=0.03914, over 4755.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2063, pruned_loss=0.02855, over 972820.57 frames.], batch size: 14, lr: 1.17e-04 2022-05-09 16:16:06,785 INFO [train.py:715] (0/8) Epoch 19, batch 14400, loss[loss=0.1233, simple_loss=0.2093, pruned_loss=0.01862, over 4811.00 frames.], tot_loss[loss=0.131, simple_loss=0.206, pruned_loss=0.02801, over 973043.53 frames.], batch size: 25, lr: 1.17e-04 2022-05-09 16:16:45,663 INFO [train.py:715] (0/8) Epoch 19, batch 14450, loss[loss=0.1431, simple_loss=0.2133, pruned_loss=0.03648, over 4787.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2052, pruned_loss=0.02816, over 972255.68 frames.], batch size: 14, lr: 1.17e-04 2022-05-09 16:17:24,665 INFO [train.py:715] (0/8) Epoch 19, batch 14500, loss[loss=0.1006, simple_loss=0.1725, pruned_loss=0.01432, over 4827.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2059, pruned_loss=0.02835, over 972418.37 frames.], batch size: 13, lr: 1.17e-04 2022-05-09 16:18:03,481 INFO [train.py:715] (0/8) Epoch 19, batch 14550, loss[loss=0.1212, simple_loss=0.1964, pruned_loss=0.02298, over 4805.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2062, pruned_loss=0.0283, over 972034.49 frames.], batch size: 21, lr: 1.17e-04 2022-05-09 16:18:43,154 INFO [train.py:715] (0/8) Epoch 19, batch 14600, loss[loss=0.1429, simple_loss=0.2083, pruned_loss=0.03873, over 4971.00 frames.], tot_loss[loss=0.1305, simple_loss=0.2052, pruned_loss=0.0279, over 972417.11 frames.], batch size: 14, lr: 1.17e-04 2022-05-09 16:19:22,223 INFO [train.py:715] (0/8) Epoch 19, batch 14650, loss[loss=0.1307, 
simple_loss=0.1964, pruned_loss=0.0325, over 4923.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2054, pruned_loss=0.02804, over 971735.34 frames.], batch size: 18, lr: 1.17e-04 2022-05-09 16:20:01,162 INFO [train.py:715] (0/8) Epoch 19, batch 14700, loss[loss=0.1189, simple_loss=0.1923, pruned_loss=0.02275, over 4987.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2059, pruned_loss=0.02824, over 972117.13 frames.], batch size: 14, lr: 1.17e-04 2022-05-09 16:20:40,520 INFO [train.py:715] (0/8) Epoch 19, batch 14750, loss[loss=0.1179, simple_loss=0.1892, pruned_loss=0.02333, over 4823.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2057, pruned_loss=0.02866, over 972085.63 frames.], batch size: 13, lr: 1.17e-04 2022-05-09 16:21:19,780 INFO [train.py:715] (0/8) Epoch 19, batch 14800, loss[loss=0.1227, simple_loss=0.193, pruned_loss=0.02616, over 4986.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2058, pruned_loss=0.02876, over 972065.58 frames.], batch size: 25, lr: 1.17e-04 2022-05-09 16:21:58,085 INFO [train.py:715] (0/8) Epoch 19, batch 14850, loss[loss=0.1367, simple_loss=0.2214, pruned_loss=0.02597, over 4856.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2056, pruned_loss=0.0284, over 972322.60 frames.], batch size: 30, lr: 1.17e-04 2022-05-09 16:22:37,372 INFO [train.py:715] (0/8) Epoch 19, batch 14900, loss[loss=0.1326, simple_loss=0.2022, pruned_loss=0.03147, over 4968.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2068, pruned_loss=0.02875, over 972422.55 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 16:23:16,322 INFO [train.py:715] (0/8) Epoch 19, batch 14950, loss[loss=0.1517, simple_loss=0.2235, pruned_loss=0.03993, over 4889.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2068, pruned_loss=0.02894, over 972536.59 frames.], batch size: 32, lr: 1.17e-04 2022-05-09 16:23:55,094 INFO [train.py:715] (0/8) Epoch 19, batch 15000, loss[loss=0.1129, simple_loss=0.1908, pruned_loss=0.01751, over 4812.00 frames.], tot_loss[loss=0.1327, simple_loss=0.207, pruned_loss=0.02919, over 973033.12 frames.], batch size: 21, lr: 1.17e-04 2022-05-09 16:23:55,095 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 16:24:07,490 INFO [train.py:742] (0/8) Epoch 19, validation: loss=0.1045, simple_loss=0.1877, pruned_loss=0.01064, over 914524.00 frames. 
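Each training entry above carries two blocks of statistics: the losses on the current batch (the loss[...] block, with its own frame count) and a tot_loss[...] block that appears to be a frame-weighted running summary, since its frame count sits near 972k frames while a single batch covers only a few thousand. For tabulating or plotting these curves it helps to parse the entries; the snippet below is a hypothetical helper written against the log format shown here, not part of icefall, and it deliberately skips the separate validation entries such as the one just above.

```python
import re

# Hypothetical parser for the training entries in this log (not icefall code).
# It extracts the per-batch block, the running tot_loss block, batch size and lr.
ENTRY = re.compile(
    r"Epoch (?P<epoch>\d+), batch (?P<batch>\d+), "
    r"loss\[loss=(?P<loss>[\d.]+), simple_loss=(?P<simple>[\d.]+), "
    r"pruned_loss=(?P<pruned>[\d.]+), over (?P<frames>[\d.]+) frames\.\], "
    r"tot_loss\[loss=(?P<tot_loss>[\d.]+), simple_loss=(?P<tot_simple>[\d.]+), "
    r"pruned_loss=(?P<tot_pruned>[\d.]+), over (?P<tot_frames>[\d.]+) frames\.\], "
    r"batch size: (?P<bs>\d+), lr: (?P<lr>[\d.e+-]+)"
)

def parse(text: str):
    """Return one dict per training entry found in `text`."""
    return [m.groupdict() for m in ENTRY.finditer(text)]
```

Applied to the raw log text, parse() yields one record per training entry; the validation entries ("Epoch 19, validation: ...") use a slightly different format and would need a second, similar pattern if they are wanted as well.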
2022-05-09 16:24:46,706 INFO [train.py:715] (0/8) Epoch 19, batch 15050, loss[loss=0.1467, simple_loss=0.2214, pruned_loss=0.03605, over 4790.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2069, pruned_loss=0.0289, over 972802.25 frames.], batch size: 21, lr: 1.17e-04 2022-05-09 16:25:26,174 INFO [train.py:715] (0/8) Epoch 19, batch 15100, loss[loss=0.1298, simple_loss=0.2014, pruned_loss=0.02909, over 4895.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2068, pruned_loss=0.02847, over 972515.79 frames.], batch size: 29, lr: 1.17e-04 2022-05-09 16:26:05,805 INFO [train.py:715] (0/8) Epoch 19, batch 15150, loss[loss=0.1254, simple_loss=0.1983, pruned_loss=0.02625, over 4980.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2066, pruned_loss=0.02846, over 973214.55 frames.], batch size: 28, lr: 1.17e-04 2022-05-09 16:26:45,268 INFO [train.py:715] (0/8) Epoch 19, batch 15200, loss[loss=0.1213, simple_loss=0.201, pruned_loss=0.02085, over 4808.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2065, pruned_loss=0.02812, over 973066.23 frames.], batch size: 25, lr: 1.17e-04 2022-05-09 16:27:24,251 INFO [train.py:715] (0/8) Epoch 19, batch 15250, loss[loss=0.123, simple_loss=0.1923, pruned_loss=0.02683, over 4986.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2067, pruned_loss=0.02874, over 972638.71 frames.], batch size: 33, lr: 1.17e-04 2022-05-09 16:28:04,175 INFO [train.py:715] (0/8) Epoch 19, batch 15300, loss[loss=0.1266, simple_loss=0.2022, pruned_loss=0.02553, over 4868.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2073, pruned_loss=0.02866, over 971674.53 frames.], batch size: 20, lr: 1.17e-04 2022-05-09 16:28:43,718 INFO [train.py:715] (0/8) Epoch 19, batch 15350, loss[loss=0.1237, simple_loss=0.2012, pruned_loss=0.02308, over 4979.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2077, pruned_loss=0.02905, over 972232.19 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 16:29:23,530 INFO [train.py:715] (0/8) Epoch 19, batch 15400, loss[loss=0.1296, simple_loss=0.2086, pruned_loss=0.02535, over 4910.00 frames.], tot_loss[loss=0.133, simple_loss=0.2076, pruned_loss=0.02913, over 971966.74 frames.], batch size: 17, lr: 1.17e-04 2022-05-09 16:30:03,012 INFO [train.py:715] (0/8) Epoch 19, batch 15450, loss[loss=0.1384, simple_loss=0.2148, pruned_loss=0.03096, over 4973.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2072, pruned_loss=0.02929, over 972170.89 frames.], batch size: 35, lr: 1.17e-04 2022-05-09 16:30:42,451 INFO [train.py:715] (0/8) Epoch 19, batch 15500, loss[loss=0.1287, simple_loss=0.2067, pruned_loss=0.02532, over 4909.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2074, pruned_loss=0.02942, over 972127.17 frames.], batch size: 23, lr: 1.17e-04 2022-05-09 16:31:21,336 INFO [train.py:715] (0/8) Epoch 19, batch 15550, loss[loss=0.1431, simple_loss=0.2191, pruned_loss=0.03358, over 4917.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2067, pruned_loss=0.02897, over 973433.52 frames.], batch size: 18, lr: 1.17e-04 2022-05-09 16:32:00,418 INFO [train.py:715] (0/8) Epoch 19, batch 15600, loss[loss=0.1233, simple_loss=0.1947, pruned_loss=0.02599, over 4834.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2063, pruned_loss=0.029, over 973167.95 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 16:32:40,088 INFO [train.py:715] (0/8) Epoch 19, batch 15650, loss[loss=0.1203, simple_loss=0.1964, pruned_loss=0.02209, over 4883.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2049, pruned_loss=0.02845, over 972809.68 frames.], batch size: 22, lr: 1.17e-04 2022-05-09 16:33:19,038 
INFO [train.py:715] (0/8) Epoch 19, batch 15700, loss[loss=0.1535, simple_loss=0.2291, pruned_loss=0.03896, over 4984.00 frames.], tot_loss[loss=0.1309, simple_loss=0.205, pruned_loss=0.02842, over 973625.91 frames.], batch size: 28, lr: 1.17e-04 2022-05-09 16:33:59,135 INFO [train.py:715] (0/8) Epoch 19, batch 15750, loss[loss=0.1423, simple_loss=0.2268, pruned_loss=0.02889, over 4821.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2046, pruned_loss=0.02829, over 973046.73 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 16:34:38,393 INFO [train.py:715] (0/8) Epoch 19, batch 15800, loss[loss=0.1239, simple_loss=0.1903, pruned_loss=0.02876, over 4803.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2052, pruned_loss=0.02833, over 973502.33 frames.], batch size: 25, lr: 1.17e-04 2022-05-09 16:35:17,528 INFO [train.py:715] (0/8) Epoch 19, batch 15850, loss[loss=0.1152, simple_loss=0.1884, pruned_loss=0.02099, over 4919.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2061, pruned_loss=0.02851, over 973116.66 frames.], batch size: 17, lr: 1.17e-04 2022-05-09 16:35:56,473 INFO [train.py:715] (0/8) Epoch 19, batch 15900, loss[loss=0.1322, simple_loss=0.2021, pruned_loss=0.03112, over 4947.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2069, pruned_loss=0.02879, over 973010.95 frames.], batch size: 21, lr: 1.17e-04 2022-05-09 16:36:35,618 INFO [train.py:715] (0/8) Epoch 19, batch 15950, loss[loss=0.1458, simple_loss=0.2155, pruned_loss=0.03801, over 4915.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2069, pruned_loss=0.02863, over 972904.43 frames.], batch size: 18, lr: 1.17e-04 2022-05-09 16:37:15,269 INFO [train.py:715] (0/8) Epoch 19, batch 16000, loss[loss=0.1121, simple_loss=0.1881, pruned_loss=0.01802, over 4948.00 frames.], tot_loss[loss=0.1322, simple_loss=0.207, pruned_loss=0.02875, over 972976.04 frames.], batch size: 35, lr: 1.17e-04 2022-05-09 16:37:53,932 INFO [train.py:715] (0/8) Epoch 19, batch 16050, loss[loss=0.1317, simple_loss=0.2085, pruned_loss=0.02747, over 4805.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2071, pruned_loss=0.02878, over 972613.12 frames.], batch size: 25, lr: 1.17e-04 2022-05-09 16:38:33,242 INFO [train.py:715] (0/8) Epoch 19, batch 16100, loss[loss=0.1297, simple_loss=0.2134, pruned_loss=0.02296, over 4814.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2063, pruned_loss=0.02845, over 971424.18 frames.], batch size: 27, lr: 1.17e-04 2022-05-09 16:39:12,540 INFO [train.py:715] (0/8) Epoch 19, batch 16150, loss[loss=0.1171, simple_loss=0.1935, pruned_loss=0.02029, over 4932.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2065, pruned_loss=0.02856, over 971895.39 frames.], batch size: 21, lr: 1.17e-04 2022-05-09 16:39:51,595 INFO [train.py:715] (0/8) Epoch 19, batch 16200, loss[loss=0.1311, simple_loss=0.2066, pruned_loss=0.0278, over 4782.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2067, pruned_loss=0.02834, over 972794.53 frames.], batch size: 18, lr: 1.17e-04 2022-05-09 16:40:29,889 INFO [train.py:715] (0/8) Epoch 19, batch 16250, loss[loss=0.1191, simple_loss=0.1829, pruned_loss=0.02764, over 4758.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2069, pruned_loss=0.02834, over 971749.36 frames.], batch size: 14, lr: 1.17e-04 2022-05-09 16:41:08,936 INFO [train.py:715] (0/8) Epoch 19, batch 16300, loss[loss=0.132, simple_loss=0.2096, pruned_loss=0.02716, over 4923.00 frames.], tot_loss[loss=0.1305, simple_loss=0.2058, pruned_loss=0.02753, over 972365.98 frames.], batch size: 29, lr: 1.17e-04 2022-05-09 16:41:48,390 INFO [train.py:715] 
(0/8) Epoch 19, batch 16350, loss[loss=0.1211, simple_loss=0.1951, pruned_loss=0.02355, over 4919.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2056, pruned_loss=0.02782, over 972736.16 frames.], batch size: 18, lr: 1.17e-04 2022-05-09 16:42:26,933 INFO [train.py:715] (0/8) Epoch 19, batch 16400, loss[loss=0.1305, simple_loss=0.2037, pruned_loss=0.02869, over 4942.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2062, pruned_loss=0.02821, over 972064.95 frames.], batch size: 24, lr: 1.17e-04 2022-05-09 16:43:05,774 INFO [train.py:715] (0/8) Epoch 19, batch 16450, loss[loss=0.1028, simple_loss=0.1785, pruned_loss=0.01358, over 4785.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2057, pruned_loss=0.0281, over 971731.66 frames.], batch size: 17, lr: 1.17e-04 2022-05-09 16:43:44,323 INFO [train.py:715] (0/8) Epoch 19, batch 16500, loss[loss=0.1417, simple_loss=0.2186, pruned_loss=0.03243, over 4988.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2058, pruned_loss=0.02819, over 972016.72 frames.], batch size: 14, lr: 1.17e-04 2022-05-09 16:44:23,744 INFO [train.py:715] (0/8) Epoch 19, batch 16550, loss[loss=0.1166, simple_loss=0.1865, pruned_loss=0.02339, over 4785.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2062, pruned_loss=0.0285, over 971936.86 frames.], batch size: 14, lr: 1.17e-04 2022-05-09 16:45:02,746 INFO [train.py:715] (0/8) Epoch 19, batch 16600, loss[loss=0.115, simple_loss=0.1955, pruned_loss=0.01724, over 4928.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2065, pruned_loss=0.02833, over 972728.96 frames.], batch size: 21, lr: 1.17e-04 2022-05-09 16:45:41,758 INFO [train.py:715] (0/8) Epoch 19, batch 16650, loss[loss=0.1234, simple_loss=0.1993, pruned_loss=0.02373, over 4982.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2057, pruned_loss=0.02799, over 973235.02 frames.], batch size: 25, lr: 1.17e-04 2022-05-09 16:46:21,729 INFO [train.py:715] (0/8) Epoch 19, batch 16700, loss[loss=0.08794, simple_loss=0.16, pruned_loss=0.007919, over 4863.00 frames.], tot_loss[loss=0.1305, simple_loss=0.2052, pruned_loss=0.02788, over 973048.17 frames.], batch size: 20, lr: 1.17e-04 2022-05-09 16:47:00,842 INFO [train.py:715] (0/8) Epoch 19, batch 16750, loss[loss=0.1388, simple_loss=0.2119, pruned_loss=0.03283, over 4687.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2057, pruned_loss=0.02822, over 973074.89 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 16:47:40,567 INFO [train.py:715] (0/8) Epoch 19, batch 16800, loss[loss=0.1135, simple_loss=0.1921, pruned_loss=0.01747, over 4946.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2055, pruned_loss=0.028, over 972788.12 frames.], batch size: 39, lr: 1.17e-04 2022-05-09 16:48:19,965 INFO [train.py:715] (0/8) Epoch 19, batch 16850, loss[loss=0.1281, simple_loss=0.209, pruned_loss=0.02361, over 4822.00 frames.], tot_loss[loss=0.1305, simple_loss=0.2051, pruned_loss=0.02798, over 972492.12 frames.], batch size: 13, lr: 1.17e-04 2022-05-09 16:48:59,490 INFO [train.py:715] (0/8) Epoch 19, batch 16900, loss[loss=0.1434, simple_loss=0.2142, pruned_loss=0.03634, over 4923.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2055, pruned_loss=0.02832, over 972394.93 frames.], batch size: 18, lr: 1.17e-04 2022-05-09 16:49:38,105 INFO [train.py:715] (0/8) Epoch 19, batch 16950, loss[loss=0.1184, simple_loss=0.1957, pruned_loss=0.02049, over 4734.00 frames.], tot_loss[loss=0.1302, simple_loss=0.2047, pruned_loss=0.02782, over 972368.33 frames.], batch size: 16, lr: 1.17e-04 2022-05-09 16:50:17,669 INFO [train.py:715] (0/8) Epoch 19, batch 
17000, loss[loss=0.22, simple_loss=0.2618, pruned_loss=0.08909, over 4851.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2057, pruned_loss=0.02839, over 973187.25 frames.], batch size: 20, lr: 1.17e-04 2022-05-09 16:50:57,095 INFO [train.py:715] (0/8) Epoch 19, batch 17050, loss[loss=0.1295, simple_loss=0.2139, pruned_loss=0.02255, over 4770.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2052, pruned_loss=0.02807, over 973294.76 frames.], batch size: 17, lr: 1.17e-04 2022-05-09 16:51:36,153 INFO [train.py:715] (0/8) Epoch 19, batch 17100, loss[loss=0.1111, simple_loss=0.1938, pruned_loss=0.01419, over 4876.00 frames.], tot_loss[loss=0.1305, simple_loss=0.2054, pruned_loss=0.02784, over 973437.15 frames.], batch size: 20, lr: 1.17e-04 2022-05-09 16:52:15,335 INFO [train.py:715] (0/8) Epoch 19, batch 17150, loss[loss=0.1186, simple_loss=0.1878, pruned_loss=0.0247, over 4966.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2056, pruned_loss=0.02806, over 973748.42 frames.], batch size: 28, lr: 1.17e-04 2022-05-09 16:52:54,352 INFO [train.py:715] (0/8) Epoch 19, batch 17200, loss[loss=0.117, simple_loss=0.1916, pruned_loss=0.02116, over 4912.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2054, pruned_loss=0.02794, over 974404.36 frames.], batch size: 17, lr: 1.17e-04 2022-05-09 16:53:33,089 INFO [train.py:715] (0/8) Epoch 19, batch 17250, loss[loss=0.1197, simple_loss=0.186, pruned_loss=0.02668, over 4920.00 frames.], tot_loss[loss=0.1304, simple_loss=0.2048, pruned_loss=0.02801, over 974208.32 frames.], batch size: 18, lr: 1.17e-04 2022-05-09 16:54:12,078 INFO [train.py:715] (0/8) Epoch 19, batch 17300, loss[loss=0.1479, simple_loss=0.2141, pruned_loss=0.04083, over 4844.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2053, pruned_loss=0.02843, over 973978.16 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 16:54:51,725 INFO [train.py:715] (0/8) Epoch 19, batch 17350, loss[loss=0.1111, simple_loss=0.1838, pruned_loss=0.01917, over 4785.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2052, pruned_loss=0.0282, over 973858.50 frames.], batch size: 21, lr: 1.17e-04 2022-05-09 16:55:31,272 INFO [train.py:715] (0/8) Epoch 19, batch 17400, loss[loss=0.1692, simple_loss=0.2354, pruned_loss=0.05147, over 4970.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2065, pruned_loss=0.0289, over 973577.26 frames.], batch size: 35, lr: 1.17e-04 2022-05-09 16:56:10,516 INFO [train.py:715] (0/8) Epoch 19, batch 17450, loss[loss=0.1453, simple_loss=0.2203, pruned_loss=0.03511, over 4824.00 frames.], tot_loss[loss=0.132, simple_loss=0.2065, pruned_loss=0.02872, over 973148.26 frames.], batch size: 25, lr: 1.17e-04 2022-05-09 16:56:49,865 INFO [train.py:715] (0/8) Epoch 19, batch 17500, loss[loss=0.1504, simple_loss=0.2248, pruned_loss=0.03795, over 4980.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2063, pruned_loss=0.02863, over 972708.84 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 16:57:29,143 INFO [train.py:715] (0/8) Epoch 19, batch 17550, loss[loss=0.1226, simple_loss=0.1856, pruned_loss=0.02974, over 4986.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2069, pruned_loss=0.02877, over 973017.51 frames.], batch size: 28, lr: 1.17e-04 2022-05-09 16:58:08,751 INFO [train.py:715] (0/8) Epoch 19, batch 17600, loss[loss=0.1248, simple_loss=0.2005, pruned_loss=0.02452, over 4922.00 frames.], tot_loss[loss=0.1313, simple_loss=0.206, pruned_loss=0.0283, over 973681.94 frames.], batch size: 17, lr: 1.17e-04 2022-05-09 16:58:47,928 INFO [train.py:715] (0/8) Epoch 19, batch 17650, loss[loss=0.1299, 
simple_loss=0.1966, pruned_loss=0.03159, over 4767.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2056, pruned_loss=0.02806, over 974384.00 frames.], batch size: 18, lr: 1.17e-04 2022-05-09 16:59:27,072 INFO [train.py:715] (0/8) Epoch 19, batch 17700, loss[loss=0.1076, simple_loss=0.188, pruned_loss=0.01365, over 4738.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2056, pruned_loss=0.02806, over 974610.02 frames.], batch size: 16, lr: 1.17e-04 2022-05-09 17:00:06,647 INFO [train.py:715] (0/8) Epoch 19, batch 17750, loss[loss=0.1447, simple_loss=0.2151, pruned_loss=0.03711, over 4816.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2057, pruned_loss=0.0284, over 974395.18 frames.], batch size: 21, lr: 1.17e-04 2022-05-09 17:00:45,236 INFO [train.py:715] (0/8) Epoch 19, batch 17800, loss[loss=0.1454, simple_loss=0.2208, pruned_loss=0.03503, over 4830.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2052, pruned_loss=0.02804, over 973906.79 frames.], batch size: 30, lr: 1.17e-04 2022-05-09 17:01:24,010 INFO [train.py:715] (0/8) Epoch 19, batch 17850, loss[loss=0.1424, simple_loss=0.207, pruned_loss=0.03888, over 4976.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2057, pruned_loss=0.02871, over 973388.83 frames.], batch size: 35, lr: 1.17e-04 2022-05-09 17:02:03,485 INFO [train.py:715] (0/8) Epoch 19, batch 17900, loss[loss=0.1411, simple_loss=0.201, pruned_loss=0.04062, over 4778.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2064, pruned_loss=0.02914, over 973623.35 frames.], batch size: 12, lr: 1.17e-04 2022-05-09 17:02:41,967 INFO [train.py:715] (0/8) Epoch 19, batch 17950, loss[loss=0.0829, simple_loss=0.1458, pruned_loss=0.009992, over 4754.00 frames.], tot_loss[loss=0.133, simple_loss=0.2068, pruned_loss=0.02957, over 972500.62 frames.], batch size: 12, lr: 1.17e-04 2022-05-09 17:03:21,253 INFO [train.py:715] (0/8) Epoch 19, batch 18000, loss[loss=0.1433, simple_loss=0.2131, pruned_loss=0.0368, over 4885.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2058, pruned_loss=0.02884, over 971601.05 frames.], batch size: 16, lr: 1.17e-04 2022-05-09 17:03:21,254 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 17:03:31,129 INFO [train.py:742] (0/8) Epoch 19, validation: loss=0.1046, simple_loss=0.1877, pruned_loss=0.01074, over 914524.00 frames. 
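The "over N frames." annotation attached to every loss suggests the values are per-frame quantities, so batches of different sizes can be combined by weighting each loss with its frame count. Below is a minimal sketch of that kind of frame-weighted accumulation, fed with two batch entries taken from the log above; it is illustrative only and is not a copy of icefall's own metrics tracking.

```python
class FrameWeightedLoss:
    """Accumulate loss * frames and total frames; report loss per frame.

    Illustrative only: shows how a value like
    tot_loss[loss=..., over 972500.62 frames.] can be read as a
    frame-weighted average, not icefall's actual implementation.
    """

    def __init__(self) -> None:
        self.loss_sum = 0.0   # sum of per-frame loss weighted by frame count
        self.frames = 0.0     # total number of frames accumulated

    def update(self, loss_per_frame: float, num_frames: float) -> None:
        self.loss_sum += loss_per_frame * num_frames
        self.frames += num_frames

    @property
    def value(self) -> float:
        return self.loss_sum / max(self.frames, 1.0)


tracker = FrameWeightedLoss()
tracker.update(0.0829, 4754.0)   # batch 17950 in the entries above
tracker.update(0.1433, 4885.0)   # batch 18000 in the entries above
print(f"loss over {tracker.frames:.0f} frames: {tracker.value:.4f}")
```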
2022-05-09 17:04:10,639 INFO [train.py:715] (0/8) Epoch 19, batch 18050, loss[loss=0.1173, simple_loss=0.1965, pruned_loss=0.01904, over 4925.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2057, pruned_loss=0.02853, over 972133.06 frames.], batch size: 29, lr: 1.17e-04 2022-05-09 17:04:50,206 INFO [train.py:715] (0/8) Epoch 19, batch 18100, loss[loss=0.1384, simple_loss=0.2226, pruned_loss=0.02713, over 4857.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2055, pruned_loss=0.02812, over 971645.76 frames.], batch size: 13, lr: 1.17e-04 2022-05-09 17:05:30,065 INFO [train.py:715] (0/8) Epoch 19, batch 18150, loss[loss=0.1492, simple_loss=0.227, pruned_loss=0.03573, over 4871.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2055, pruned_loss=0.02787, over 971806.92 frames.], batch size: 20, lr: 1.17e-04 2022-05-09 17:06:09,190 INFO [train.py:715] (0/8) Epoch 19, batch 18200, loss[loss=0.1354, simple_loss=0.2035, pruned_loss=0.03366, over 4797.00 frames.], tot_loss[loss=0.1301, simple_loss=0.2051, pruned_loss=0.02754, over 971402.55 frames.], batch size: 17, lr: 1.17e-04 2022-05-09 17:06:48,108 INFO [train.py:715] (0/8) Epoch 19, batch 18250, loss[loss=0.1911, simple_loss=0.2693, pruned_loss=0.05644, over 4894.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2057, pruned_loss=0.02796, over 972132.54 frames.], batch size: 22, lr: 1.17e-04 2022-05-09 17:07:28,072 INFO [train.py:715] (0/8) Epoch 19, batch 18300, loss[loss=0.1259, simple_loss=0.1925, pruned_loss=0.02958, over 4787.00 frames.], tot_loss[loss=0.1302, simple_loss=0.205, pruned_loss=0.02772, over 972098.02 frames.], batch size: 14, lr: 1.17e-04 2022-05-09 17:08:07,530 INFO [train.py:715] (0/8) Epoch 19, batch 18350, loss[loss=0.1297, simple_loss=0.2074, pruned_loss=0.02601, over 4826.00 frames.], tot_loss[loss=0.1301, simple_loss=0.2047, pruned_loss=0.02778, over 971819.07 frames.], batch size: 26, lr: 1.17e-04 2022-05-09 17:08:47,415 INFO [train.py:715] (0/8) Epoch 19, batch 18400, loss[loss=0.1408, simple_loss=0.2098, pruned_loss=0.03592, over 4865.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2053, pruned_loss=0.02807, over 972541.33 frames.], batch size: 16, lr: 1.17e-04 2022-05-09 17:09:26,672 INFO [train.py:715] (0/8) Epoch 19, batch 18450, loss[loss=0.1213, simple_loss=0.1977, pruned_loss=0.02241, over 4890.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2057, pruned_loss=0.0282, over 972509.54 frames.], batch size: 22, lr: 1.17e-04 2022-05-09 17:10:06,115 INFO [train.py:715] (0/8) Epoch 19, batch 18500, loss[loss=0.1263, simple_loss=0.1968, pruned_loss=0.02787, over 4780.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2055, pruned_loss=0.02793, over 972429.79 frames.], batch size: 14, lr: 1.17e-04 2022-05-09 17:10:45,347 INFO [train.py:715] (0/8) Epoch 19, batch 18550, loss[loss=0.1291, simple_loss=0.1994, pruned_loss=0.02933, over 4895.00 frames.], tot_loss[loss=0.1305, simple_loss=0.2051, pruned_loss=0.02789, over 973026.12 frames.], batch size: 39, lr: 1.17e-04 2022-05-09 17:11:24,398 INFO [train.py:715] (0/8) Epoch 19, batch 18600, loss[loss=0.1556, simple_loss=0.2155, pruned_loss=0.04785, over 4954.00 frames.], tot_loss[loss=0.1302, simple_loss=0.2045, pruned_loss=0.02793, over 972998.82 frames.], batch size: 35, lr: 1.17e-04 2022-05-09 17:11:41,867 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-680000.pt 2022-05-09 17:12:06,311 INFO [train.py:715] (0/8) Epoch 19, batch 18650, loss[loss=0.1617, simple_loss=0.2338, pruned_loss=0.04478, over 4748.00 frames.], 
tot_loss[loss=0.1303, simple_loss=0.2044, pruned_loss=0.02808, over 972499.85 frames.], batch size: 16, lr: 1.17e-04 2022-05-09 17:12:45,150 INFO [train.py:715] (0/8) Epoch 19, batch 18700, loss[loss=0.115, simple_loss=0.1951, pruned_loss=0.01749, over 4809.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2052, pruned_loss=0.02814, over 972489.18 frames.], batch size: 25, lr: 1.17e-04 2022-05-09 17:13:24,440 INFO [train.py:715] (0/8) Epoch 19, batch 18750, loss[loss=0.1161, simple_loss=0.195, pruned_loss=0.01857, over 4829.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2053, pruned_loss=0.02817, over 972019.83 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 17:14:04,378 INFO [train.py:715] (0/8) Epoch 19, batch 18800, loss[loss=0.1526, simple_loss=0.236, pruned_loss=0.03464, over 4929.00 frames.], tot_loss[loss=0.131, simple_loss=0.2057, pruned_loss=0.02821, over 971884.07 frames.], batch size: 23, lr: 1.17e-04 2022-05-09 17:14:44,261 INFO [train.py:715] (0/8) Epoch 19, batch 18850, loss[loss=0.1624, simple_loss=0.2304, pruned_loss=0.04719, over 4775.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2059, pruned_loss=0.02832, over 970913.84 frames.], batch size: 17, lr: 1.17e-04 2022-05-09 17:15:23,450 INFO [train.py:715] (0/8) Epoch 19, batch 18900, loss[loss=0.1704, simple_loss=0.2193, pruned_loss=0.06077, over 4807.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2065, pruned_loss=0.02855, over 970571.60 frames.], batch size: 12, lr: 1.17e-04 2022-05-09 17:16:02,831 INFO [train.py:715] (0/8) Epoch 19, batch 18950, loss[loss=0.1579, simple_loss=0.2243, pruned_loss=0.04575, over 4926.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2064, pruned_loss=0.02862, over 972006.05 frames.], batch size: 23, lr: 1.17e-04 2022-05-09 17:16:42,872 INFO [train.py:715] (0/8) Epoch 19, batch 19000, loss[loss=0.1276, simple_loss=0.2046, pruned_loss=0.02532, over 4896.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2062, pruned_loss=0.02828, over 973266.33 frames.], batch size: 19, lr: 1.17e-04 2022-05-09 17:17:22,372 INFO [train.py:715] (0/8) Epoch 19, batch 19050, loss[loss=0.1394, simple_loss=0.2071, pruned_loss=0.0358, over 4846.00 frames.], tot_loss[loss=0.131, simple_loss=0.2058, pruned_loss=0.02813, over 973546.95 frames.], batch size: 13, lr: 1.17e-04 2022-05-09 17:18:01,437 INFO [train.py:715] (0/8) Epoch 19, batch 19100, loss[loss=0.1357, simple_loss=0.2108, pruned_loss=0.03032, over 4701.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2054, pruned_loss=0.02792, over 973090.77 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 17:18:41,052 INFO [train.py:715] (0/8) Epoch 19, batch 19150, loss[loss=0.1311, simple_loss=0.207, pruned_loss=0.02755, over 4727.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2064, pruned_loss=0.02873, over 972209.23 frames.], batch size: 16, lr: 1.17e-04 2022-05-09 17:19:20,395 INFO [train.py:715] (0/8) Epoch 19, batch 19200, loss[loss=0.1248, simple_loss=0.1928, pruned_loss=0.02839, over 4977.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2065, pruned_loss=0.02857, over 972290.13 frames.], batch size: 14, lr: 1.17e-04 2022-05-09 17:19:59,881 INFO [train.py:715] (0/8) Epoch 19, batch 19250, loss[loss=0.1346, simple_loss=0.1972, pruned_loss=0.03602, over 4865.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2062, pruned_loss=0.02867, over 972594.76 frames.], batch size: 32, lr: 1.17e-04 2022-05-09 17:20:39,158 INFO [train.py:715] (0/8) Epoch 19, batch 19300, loss[loss=0.1288, simple_loss=0.1916, pruned_loss=0.03295, over 4878.00 frames.], tot_loss[loss=0.1316, 
simple_loss=0.206, pruned_loss=0.02859, over 972736.00 frames.], batch size: 32, lr: 1.17e-04 2022-05-09 17:21:19,519 INFO [train.py:715] (0/8) Epoch 19, batch 19350, loss[loss=0.1137, simple_loss=0.1854, pruned_loss=0.02095, over 4910.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2049, pruned_loss=0.02819, over 972321.73 frames.], batch size: 18, lr: 1.17e-04 2022-05-09 17:21:58,943 INFO [train.py:715] (0/8) Epoch 19, batch 19400, loss[loss=0.1233, simple_loss=0.2086, pruned_loss=0.01903, over 4987.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2052, pruned_loss=0.0281, over 971384.30 frames.], batch size: 28, lr: 1.17e-04 2022-05-09 17:22:38,649 INFO [train.py:715] (0/8) Epoch 19, batch 19450, loss[loss=0.1408, simple_loss=0.2135, pruned_loss=0.03403, over 4814.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2052, pruned_loss=0.02812, over 972456.28 frames.], batch size: 25, lr: 1.17e-04 2022-05-09 17:23:18,388 INFO [train.py:715] (0/8) Epoch 19, batch 19500, loss[loss=0.1113, simple_loss=0.1821, pruned_loss=0.02031, over 4754.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2054, pruned_loss=0.02858, over 973156.73 frames.], batch size: 16, lr: 1.17e-04 2022-05-09 17:23:57,781 INFO [train.py:715] (0/8) Epoch 19, batch 19550, loss[loss=0.1283, simple_loss=0.2044, pruned_loss=0.02606, over 4779.00 frames.], tot_loss[loss=0.1307, simple_loss=0.205, pruned_loss=0.02817, over 972856.40 frames.], batch size: 17, lr: 1.17e-04 2022-05-09 17:24:36,947 INFO [train.py:715] (0/8) Epoch 19, batch 19600, loss[loss=0.1248, simple_loss=0.2083, pruned_loss=0.02067, over 4807.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2054, pruned_loss=0.02794, over 973404.71 frames.], batch size: 21, lr: 1.17e-04 2022-05-09 17:25:17,640 INFO [train.py:715] (0/8) Epoch 19, batch 19650, loss[loss=0.1307, simple_loss=0.1991, pruned_loss=0.03115, over 4965.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2052, pruned_loss=0.02823, over 973241.35 frames.], batch size: 35, lr: 1.17e-04 2022-05-09 17:25:56,967 INFO [train.py:715] (0/8) Epoch 19, batch 19700, loss[loss=0.1148, simple_loss=0.1911, pruned_loss=0.01927, over 4743.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2052, pruned_loss=0.02809, over 974119.81 frames.], batch size: 16, lr: 1.17e-04 2022-05-09 17:26:35,807 INFO [train.py:715] (0/8) Epoch 19, batch 19750, loss[loss=0.112, simple_loss=0.1823, pruned_loss=0.02087, over 4867.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2067, pruned_loss=0.02925, over 973227.02 frames.], batch size: 20, lr: 1.17e-04 2022-05-09 17:27:16,059 INFO [train.py:715] (0/8) Epoch 19, batch 19800, loss[loss=0.1143, simple_loss=0.1759, pruned_loss=0.02638, over 4867.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2066, pruned_loss=0.02904, over 972542.07 frames.], batch size: 16, lr: 1.17e-04 2022-05-09 17:27:55,917 INFO [train.py:715] (0/8) Epoch 19, batch 19850, loss[loss=0.1267, simple_loss=0.1865, pruned_loss=0.03349, over 4779.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2067, pruned_loss=0.0291, over 973072.81 frames.], batch size: 12, lr: 1.17e-04 2022-05-09 17:28:35,201 INFO [train.py:715] (0/8) Epoch 19, batch 19900, loss[loss=0.1422, simple_loss=0.2052, pruned_loss=0.03957, over 4820.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2057, pruned_loss=0.02897, over 972591.81 frames.], batch size: 26, lr: 1.17e-04 2022-05-09 17:29:13,880 INFO [train.py:715] (0/8) Epoch 19, batch 19950, loss[loss=0.1435, simple_loss=0.2155, pruned_loss=0.03574, over 4987.00 frames.], tot_loss[loss=0.132, simple_loss=0.2064, 
pruned_loss=0.02881, over 973117.45 frames.], batch size: 27, lr: 1.17e-04 2022-05-09 17:29:53,615 INFO [train.py:715] (0/8) Epoch 19, batch 20000, loss[loss=0.1649, simple_loss=0.23, pruned_loss=0.04983, over 4757.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2059, pruned_loss=0.02869, over 973048.34 frames.], batch size: 16, lr: 1.17e-04 2022-05-09 17:30:33,007 INFO [train.py:715] (0/8) Epoch 19, batch 20050, loss[loss=0.135, simple_loss=0.2184, pruned_loss=0.02575, over 4961.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2053, pruned_loss=0.02841, over 973361.45 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 17:31:12,639 INFO [train.py:715] (0/8) Epoch 19, batch 20100, loss[loss=0.1246, simple_loss=0.1986, pruned_loss=0.02537, over 4889.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2057, pruned_loss=0.02865, over 973155.45 frames.], batch size: 16, lr: 1.17e-04 2022-05-09 17:31:52,171 INFO [train.py:715] (0/8) Epoch 19, batch 20150, loss[loss=0.1456, simple_loss=0.2193, pruned_loss=0.03593, over 4873.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2059, pruned_loss=0.0288, over 972877.19 frames.], batch size: 16, lr: 1.17e-04 2022-05-09 17:32:31,815 INFO [train.py:715] (0/8) Epoch 19, batch 20200, loss[loss=0.1295, simple_loss=0.2053, pruned_loss=0.02684, over 4821.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2054, pruned_loss=0.02872, over 972953.02 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 17:33:11,345 INFO [train.py:715] (0/8) Epoch 19, batch 20250, loss[loss=0.1416, simple_loss=0.2214, pruned_loss=0.03087, over 4898.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2049, pruned_loss=0.02852, over 973238.98 frames.], batch size: 19, lr: 1.17e-04 2022-05-09 17:33:50,681 INFO [train.py:715] (0/8) Epoch 19, batch 20300, loss[loss=0.1206, simple_loss=0.1943, pruned_loss=0.02351, over 4900.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2057, pruned_loss=0.02881, over 973670.72 frames.], batch size: 17, lr: 1.17e-04 2022-05-09 17:34:30,223 INFO [train.py:715] (0/8) Epoch 19, batch 20350, loss[loss=0.118, simple_loss=0.1995, pruned_loss=0.01824, over 4894.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2054, pruned_loss=0.02862, over 973053.46 frames.], batch size: 19, lr: 1.17e-04 2022-05-09 17:35:09,429 INFO [train.py:715] (0/8) Epoch 19, batch 20400, loss[loss=0.1137, simple_loss=0.1823, pruned_loss=0.02255, over 4919.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2049, pruned_loss=0.02841, over 973007.16 frames.], batch size: 17, lr: 1.17e-04 2022-05-09 17:35:48,299 INFO [train.py:715] (0/8) Epoch 19, batch 20450, loss[loss=0.1496, simple_loss=0.2205, pruned_loss=0.03937, over 4927.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2057, pruned_loss=0.02866, over 973765.96 frames.], batch size: 18, lr: 1.17e-04 2022-05-09 17:36:28,041 INFO [train.py:715] (0/8) Epoch 19, batch 20500, loss[loss=0.1141, simple_loss=0.1889, pruned_loss=0.0196, over 4858.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2056, pruned_loss=0.02877, over 973047.20 frames.], batch size: 30, lr: 1.17e-04 2022-05-09 17:37:07,744 INFO [train.py:715] (0/8) Epoch 19, batch 20550, loss[loss=0.1194, simple_loss=0.2001, pruned_loss=0.01939, over 4701.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2061, pruned_loss=0.02879, over 973706.59 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 17:37:46,558 INFO [train.py:715] (0/8) Epoch 19, batch 20600, loss[loss=0.13, simple_loss=0.1975, pruned_loss=0.03126, over 4783.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2072, pruned_loss=0.02899, over 
973057.08 frames.], batch size: 17, lr: 1.17e-04 2022-05-09 17:38:26,010 INFO [train.py:715] (0/8) Epoch 19, batch 20650, loss[loss=0.1339, simple_loss=0.213, pruned_loss=0.02745, over 4973.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2079, pruned_loss=0.02917, over 973308.36 frames.], batch size: 21, lr: 1.17e-04 2022-05-09 17:39:05,342 INFO [train.py:715] (0/8) Epoch 19, batch 20700, loss[loss=0.1401, simple_loss=0.2083, pruned_loss=0.03591, over 4994.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2065, pruned_loss=0.02862, over 973257.32 frames.], batch size: 14, lr: 1.17e-04 2022-05-09 17:39:44,822 INFO [train.py:715] (0/8) Epoch 19, batch 20750, loss[loss=0.1287, simple_loss=0.2074, pruned_loss=0.02506, over 4978.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2066, pruned_loss=0.02878, over 972714.08 frames.], batch size: 25, lr: 1.17e-04 2022-05-09 17:40:23,533 INFO [train.py:715] (0/8) Epoch 19, batch 20800, loss[loss=0.1593, simple_loss=0.2352, pruned_loss=0.04174, over 4838.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2062, pruned_loss=0.02858, over 972779.26 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 17:41:02,808 INFO [train.py:715] (0/8) Epoch 19, batch 20850, loss[loss=0.09264, simple_loss=0.1603, pruned_loss=0.01249, over 4790.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2055, pruned_loss=0.02851, over 972245.08 frames.], batch size: 12, lr: 1.17e-04 2022-05-09 17:41:42,478 INFO [train.py:715] (0/8) Epoch 19, batch 20900, loss[loss=0.1545, simple_loss=0.2252, pruned_loss=0.04193, over 4876.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2055, pruned_loss=0.02865, over 972331.32 frames.], batch size: 20, lr: 1.17e-04 2022-05-09 17:42:21,287 INFO [train.py:715] (0/8) Epoch 19, batch 20950, loss[loss=0.1317, simple_loss=0.2163, pruned_loss=0.02355, over 4919.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2049, pruned_loss=0.02844, over 972880.04 frames.], batch size: 17, lr: 1.17e-04 2022-05-09 17:43:01,033 INFO [train.py:715] (0/8) Epoch 19, batch 21000, loss[loss=0.09514, simple_loss=0.1659, pruned_loss=0.0122, over 4985.00 frames.], tot_loss[loss=0.1305, simple_loss=0.2043, pruned_loss=0.02832, over 973579.86 frames.], batch size: 14, lr: 1.17e-04 2022-05-09 17:43:01,034 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 17:43:11,506 INFO [train.py:742] (0/8) Epoch 19, validation: loss=0.1045, simple_loss=0.1878, pruned_loss=0.01062, over 914524.00 frames. 
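A few entries above, checkpoint.py reports writing checkpoint-680000.pt under pruned_transducer_stateless2/exp/v2. Below is a hedged sketch of inspecting such a file with plain PyTorch, assuming it is an ordinary torch.save dictionary; the exact keys are recipe-specific, so list them before relying on any particular one.

```python
import torch

# Path taken from the checkpoint log entry above; adjust to your experiment dir.
ckpt_path = "pruned_transducer_stateless2/exp/v2/checkpoint-680000.pt"

# Load onto CPU so inspection does not require a GPU.
ckpt = torch.load(ckpt_path, map_location="cpu")

# Assumed to be a dict; print what it actually contains (model weights,
# optimizer state, training counters, ...) before using any specific key.
if isinstance(ckpt, dict):
    for key, value in ckpt.items():
        print(f"{key}: {type(value).__name__}")
```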
2022-05-09 17:43:51,335 INFO [train.py:715] (0/8) Epoch 19, batch 21050, loss[loss=0.1529, simple_loss=0.2232, pruned_loss=0.04132, over 4899.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2057, pruned_loss=0.0285, over 974304.65 frames.], batch size: 19, lr: 1.17e-04 2022-05-09 17:44:31,301 INFO [train.py:715] (0/8) Epoch 19, batch 21100, loss[loss=0.1409, simple_loss=0.2195, pruned_loss=0.03119, over 4984.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2053, pruned_loss=0.02847, over 974614.79 frames.], batch size: 26, lr: 1.17e-04 2022-05-09 17:45:10,116 INFO [train.py:715] (0/8) Epoch 19, batch 21150, loss[loss=0.1467, simple_loss=0.2135, pruned_loss=0.03992, over 4920.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2052, pruned_loss=0.02854, over 974921.60 frames.], batch size: 23, lr: 1.17e-04 2022-05-09 17:45:49,701 INFO [train.py:715] (0/8) Epoch 19, batch 21200, loss[loss=0.1241, simple_loss=0.1923, pruned_loss=0.02797, over 4739.00 frames.], tot_loss[loss=0.1301, simple_loss=0.2043, pruned_loss=0.02792, over 973295.52 frames.], batch size: 16, lr: 1.17e-04 2022-05-09 17:46:28,949 INFO [train.py:715] (0/8) Epoch 19, batch 21250, loss[loss=0.1152, simple_loss=0.1918, pruned_loss=0.01929, over 4825.00 frames.], tot_loss[loss=0.1297, simple_loss=0.204, pruned_loss=0.02767, over 972817.17 frames.], batch size: 15, lr: 1.17e-04 2022-05-09 17:47:07,990 INFO [train.py:715] (0/8) Epoch 19, batch 21300, loss[loss=0.1275, simple_loss=0.2011, pruned_loss=0.02696, over 4917.00 frames.], tot_loss[loss=0.1298, simple_loss=0.204, pruned_loss=0.02779, over 971708.19 frames.], batch size: 19, lr: 1.17e-04 2022-05-09 17:47:46,803 INFO [train.py:715] (0/8) Epoch 19, batch 21350, loss[loss=0.1175, simple_loss=0.1931, pruned_loss=0.02097, over 4930.00 frames.], tot_loss[loss=0.1305, simple_loss=0.2047, pruned_loss=0.02816, over 972291.87 frames.], batch size: 29, lr: 1.17e-04 2022-05-09 17:48:26,338 INFO [train.py:715] (0/8) Epoch 19, batch 21400, loss[loss=0.1343, simple_loss=0.2227, pruned_loss=0.02298, over 4772.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2052, pruned_loss=0.02824, over 972453.63 frames.], batch size: 16, lr: 1.17e-04 2022-05-09 17:49:05,853 INFO [train.py:715] (0/8) Epoch 19, batch 21450, loss[loss=0.1254, simple_loss=0.1938, pruned_loss=0.02852, over 4826.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2059, pruned_loss=0.02855, over 971791.98 frames.], batch size: 30, lr: 1.17e-04 2022-05-09 17:49:44,644 INFO [train.py:715] (0/8) Epoch 19, batch 21500, loss[loss=0.1418, simple_loss=0.2157, pruned_loss=0.03401, over 4842.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2061, pruned_loss=0.0283, over 972127.12 frames.], batch size: 30, lr: 1.17e-04 2022-05-09 17:50:24,355 INFO [train.py:715] (0/8) Epoch 19, batch 21550, loss[loss=0.1038, simple_loss=0.1794, pruned_loss=0.01404, over 4816.00 frames.], tot_loss[loss=0.1304, simple_loss=0.2046, pruned_loss=0.02809, over 972070.91 frames.], batch size: 13, lr: 1.17e-04 2022-05-09 17:51:04,079 INFO [train.py:715] (0/8) Epoch 19, batch 21600, loss[loss=0.1402, simple_loss=0.2164, pruned_loss=0.03199, over 4948.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2055, pruned_loss=0.02845, over 972142.94 frames.], batch size: 23, lr: 1.17e-04 2022-05-09 17:51:43,838 INFO [train.py:715] (0/8) Epoch 19, batch 21650, loss[loss=0.1278, simple_loss=0.1992, pruned_loss=0.02823, over 4777.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2062, pruned_loss=0.02859, over 972500.34 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 
17:52:22,732 INFO [train.py:715] (0/8) Epoch 19, batch 21700, loss[loss=0.125, simple_loss=0.1959, pruned_loss=0.02708, over 4780.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2059, pruned_loss=0.0282, over 972102.74 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 17:53:02,146 INFO [train.py:715] (0/8) Epoch 19, batch 21750, loss[loss=0.1325, simple_loss=0.2148, pruned_loss=0.02514, over 4806.00 frames.], tot_loss[loss=0.1311, simple_loss=0.206, pruned_loss=0.02806, over 972540.78 frames.], batch size: 24, lr: 1.16e-04 2022-05-09 17:53:43,112 INFO [train.py:715] (0/8) Epoch 19, batch 21800, loss[loss=0.1437, simple_loss=0.22, pruned_loss=0.03372, over 4760.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2057, pruned_loss=0.02835, over 972642.78 frames.], batch size: 14, lr: 1.16e-04 2022-05-09 17:54:22,934 INFO [train.py:715] (0/8) Epoch 19, batch 21850, loss[loss=0.1407, simple_loss=0.2162, pruned_loss=0.03255, over 4972.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2059, pruned_loss=0.02819, over 973342.91 frames.], batch size: 15, lr: 1.16e-04 2022-05-09 17:55:03,312 INFO [train.py:715] (0/8) Epoch 19, batch 21900, loss[loss=0.1234, simple_loss=0.1968, pruned_loss=0.02504, over 4811.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2057, pruned_loss=0.02812, over 972817.54 frames.], batch size: 25, lr: 1.16e-04 2022-05-09 17:55:43,306 INFO [train.py:715] (0/8) Epoch 19, batch 21950, loss[loss=0.1105, simple_loss=0.1762, pruned_loss=0.02236, over 4815.00 frames.], tot_loss[loss=0.1304, simple_loss=0.205, pruned_loss=0.02786, over 972374.82 frames.], batch size: 12, lr: 1.16e-04 2022-05-09 17:56:22,525 INFO [train.py:715] (0/8) Epoch 19, batch 22000, loss[loss=0.1188, simple_loss=0.2003, pruned_loss=0.01863, over 4740.00 frames.], tot_loss[loss=0.13, simple_loss=0.2049, pruned_loss=0.02753, over 972003.96 frames.], batch size: 16, lr: 1.16e-04 2022-05-09 17:57:01,817 INFO [train.py:715] (0/8) Epoch 19, batch 22050, loss[loss=0.1147, simple_loss=0.193, pruned_loss=0.01822, over 4943.00 frames.], tot_loss[loss=0.1297, simple_loss=0.2045, pruned_loss=0.02746, over 971606.76 frames.], batch size: 29, lr: 1.16e-04 2022-05-09 17:57:41,387 INFO [train.py:715] (0/8) Epoch 19, batch 22100, loss[loss=0.1311, simple_loss=0.204, pruned_loss=0.0291, over 4894.00 frames.], tot_loss[loss=0.1297, simple_loss=0.2044, pruned_loss=0.02753, over 971482.63 frames.], batch size: 16, lr: 1.16e-04 2022-05-09 17:58:21,387 INFO [train.py:715] (0/8) Epoch 19, batch 22150, loss[loss=0.1355, simple_loss=0.2035, pruned_loss=0.03377, over 4756.00 frames.], tot_loss[loss=0.1292, simple_loss=0.2041, pruned_loss=0.02718, over 971601.38 frames.], batch size: 16, lr: 1.16e-04 2022-05-09 17:59:00,687 INFO [train.py:715] (0/8) Epoch 19, batch 22200, loss[loss=0.1101, simple_loss=0.1855, pruned_loss=0.01732, over 4820.00 frames.], tot_loss[loss=0.1303, simple_loss=0.2055, pruned_loss=0.02758, over 971170.78 frames.], batch size: 25, lr: 1.16e-04 2022-05-09 17:59:40,599 INFO [train.py:715] (0/8) Epoch 19, batch 22250, loss[loss=0.1568, simple_loss=0.2354, pruned_loss=0.03911, over 4888.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2058, pruned_loss=0.02789, over 972162.88 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 18:00:20,344 INFO [train.py:715] (0/8) Epoch 19, batch 22300, loss[loss=0.1183, simple_loss=0.1899, pruned_loss=0.02332, over 4987.00 frames.], tot_loss[loss=0.1302, simple_loss=0.2054, pruned_loss=0.02751, over 972478.16 frames.], batch size: 16, lr: 1.16e-04 2022-05-09 18:00:59,297 INFO 
[train.py:715] (0/8) Epoch 19, batch 22350, loss[loss=0.1261, simple_loss=0.2039, pruned_loss=0.02418, over 4933.00 frames.], tot_loss[loss=0.1302, simple_loss=0.2054, pruned_loss=0.02753, over 972476.10 frames.], batch size: 21, lr: 1.16e-04 2022-05-09 18:01:38,318 INFO [train.py:715] (0/8) Epoch 19, batch 22400, loss[loss=0.1191, simple_loss=0.1929, pruned_loss=0.02265, over 4786.00 frames.], tot_loss[loss=0.1312, simple_loss=0.206, pruned_loss=0.02823, over 971570.72 frames.], batch size: 14, lr: 1.16e-04 2022-05-09 18:02:17,606 INFO [train.py:715] (0/8) Epoch 19, batch 22450, loss[loss=0.1327, simple_loss=0.2043, pruned_loss=0.0305, over 4781.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2053, pruned_loss=0.02818, over 972347.29 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 18:02:57,541 INFO [train.py:715] (0/8) Epoch 19, batch 22500, loss[loss=0.1343, simple_loss=0.2137, pruned_loss=0.0275, over 4795.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2054, pruned_loss=0.02872, over 972121.67 frames.], batch size: 24, lr: 1.16e-04 2022-05-09 18:03:36,396 INFO [train.py:715] (0/8) Epoch 19, batch 22550, loss[loss=0.1158, simple_loss=0.1988, pruned_loss=0.01645, over 4977.00 frames.], tot_loss[loss=0.1306, simple_loss=0.205, pruned_loss=0.02814, over 973097.22 frames.], batch size: 24, lr: 1.16e-04 2022-05-09 18:04:16,057 INFO [train.py:715] (0/8) Epoch 19, batch 22600, loss[loss=0.1393, simple_loss=0.2118, pruned_loss=0.03342, over 4932.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2053, pruned_loss=0.02841, over 973077.74 frames.], batch size: 29, lr: 1.16e-04 2022-05-09 18:04:55,709 INFO [train.py:715] (0/8) Epoch 19, batch 22650, loss[loss=0.122, simple_loss=0.1933, pruned_loss=0.02536, over 4822.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2061, pruned_loss=0.02873, over 973623.43 frames.], batch size: 21, lr: 1.16e-04 2022-05-09 18:05:34,683 INFO [train.py:715] (0/8) Epoch 19, batch 22700, loss[loss=0.1126, simple_loss=0.1774, pruned_loss=0.02396, over 4855.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2065, pruned_loss=0.02883, over 973580.71 frames.], batch size: 32, lr: 1.16e-04 2022-05-09 18:06:13,669 INFO [train.py:715] (0/8) Epoch 19, batch 22750, loss[loss=0.1532, simple_loss=0.2187, pruned_loss=0.04383, over 4838.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2071, pruned_loss=0.02883, over 973191.96 frames.], batch size: 32, lr: 1.16e-04 2022-05-09 18:06:53,393 INFO [train.py:715] (0/8) Epoch 19, batch 22800, loss[loss=0.1306, simple_loss=0.2014, pruned_loss=0.02991, over 4737.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2069, pruned_loss=0.02887, over 971905.12 frames.], batch size: 16, lr: 1.16e-04 2022-05-09 18:07:33,726 INFO [train.py:715] (0/8) Epoch 19, batch 22850, loss[loss=0.1397, simple_loss=0.2109, pruned_loss=0.03426, over 4986.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2076, pruned_loss=0.02929, over 971593.82 frames.], batch size: 33, lr: 1.16e-04 2022-05-09 18:08:11,743 INFO [train.py:715] (0/8) Epoch 19, batch 22900, loss[loss=0.1333, simple_loss=0.2139, pruned_loss=0.02636, over 4785.00 frames.], tot_loss[loss=0.1335, simple_loss=0.2079, pruned_loss=0.02955, over 970730.62 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 18:08:51,141 INFO [train.py:715] (0/8) Epoch 19, batch 22950, loss[loss=0.129, simple_loss=0.2061, pruned_loss=0.02598, over 4811.00 frames.], tot_loss[loss=0.1329, simple_loss=0.2077, pruned_loss=0.02912, over 971435.09 frames.], batch size: 25, lr: 1.16e-04 2022-05-09 18:09:31,749 INFO [train.py:715] (0/8) 
Epoch 19, batch 23000, loss[loss=0.1407, simple_loss=0.2124, pruned_loss=0.03446, over 4919.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2072, pruned_loss=0.02882, over 971475.94 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 18:10:12,241 INFO [train.py:715] (0/8) Epoch 19, batch 23050, loss[loss=0.1116, simple_loss=0.1797, pruned_loss=0.0218, over 4868.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2069, pruned_loss=0.02903, over 972075.55 frames.], batch size: 20, lr: 1.16e-04 2022-05-09 18:10:52,461 INFO [train.py:715] (0/8) Epoch 19, batch 23100, loss[loss=0.1265, simple_loss=0.2, pruned_loss=0.02652, over 4901.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2066, pruned_loss=0.02879, over 971572.37 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 18:11:33,199 INFO [train.py:715] (0/8) Epoch 19, batch 23150, loss[loss=0.1247, simple_loss=0.2048, pruned_loss=0.02234, over 4825.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2072, pruned_loss=0.0289, over 971923.43 frames.], batch size: 26, lr: 1.16e-04 2022-05-09 18:12:14,186 INFO [train.py:715] (0/8) Epoch 19, batch 23200, loss[loss=0.1239, simple_loss=0.2002, pruned_loss=0.02379, over 4755.00 frames.], tot_loss[loss=0.1322, simple_loss=0.207, pruned_loss=0.02871, over 971325.00 frames.], batch size: 19, lr: 1.16e-04 2022-05-09 18:12:53,608 INFO [train.py:715] (0/8) Epoch 19, batch 23250, loss[loss=0.1353, simple_loss=0.2108, pruned_loss=0.0299, over 4753.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2064, pruned_loss=0.02843, over 971193.38 frames.], batch size: 19, lr: 1.16e-04 2022-05-09 18:13:34,365 INFO [train.py:715] (0/8) Epoch 19, batch 23300, loss[loss=0.1238, simple_loss=0.2007, pruned_loss=0.02341, over 4974.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2068, pruned_loss=0.02852, over 971356.02 frames.], batch size: 28, lr: 1.16e-04 2022-05-09 18:14:16,099 INFO [train.py:715] (0/8) Epoch 19, batch 23350, loss[loss=0.1329, simple_loss=0.2026, pruned_loss=0.03158, over 4846.00 frames.], tot_loss[loss=0.132, simple_loss=0.2068, pruned_loss=0.02858, over 971833.24 frames.], batch size: 15, lr: 1.16e-04 2022-05-09 18:14:56,714 INFO [train.py:715] (0/8) Epoch 19, batch 23400, loss[loss=0.1472, simple_loss=0.2187, pruned_loss=0.03784, over 4703.00 frames.], tot_loss[loss=0.1324, simple_loss=0.207, pruned_loss=0.02894, over 971223.77 frames.], batch size: 15, lr: 1.16e-04 2022-05-09 18:15:37,873 INFO [train.py:715] (0/8) Epoch 19, batch 23450, loss[loss=0.1253, simple_loss=0.1985, pruned_loss=0.02609, over 4730.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2064, pruned_loss=0.0287, over 971422.93 frames.], batch size: 16, lr: 1.16e-04 2022-05-09 18:16:19,140 INFO [train.py:715] (0/8) Epoch 19, batch 23500, loss[loss=0.1265, simple_loss=0.191, pruned_loss=0.03098, over 4819.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2064, pruned_loss=0.02888, over 970948.42 frames.], batch size: 13, lr: 1.16e-04 2022-05-09 18:17:00,516 INFO [train.py:715] (0/8) Epoch 19, batch 23550, loss[loss=0.1454, simple_loss=0.2217, pruned_loss=0.03453, over 4823.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2063, pruned_loss=0.02867, over 970816.17 frames.], batch size: 27, lr: 1.16e-04 2022-05-09 18:17:41,323 INFO [train.py:715] (0/8) Epoch 19, batch 23600, loss[loss=0.1813, simple_loss=0.2558, pruned_loss=0.05338, over 4735.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2059, pruned_loss=0.02852, over 971189.03 frames.], batch size: 16, lr: 1.16e-04 2022-05-09 18:18:22,139 INFO [train.py:715] (0/8) Epoch 19, batch 23650, 
loss[loss=0.1192, simple_loss=0.1885, pruned_loss=0.02491, over 4791.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2058, pruned_loss=0.0282, over 971159.95 frames.], batch size: 24, lr: 1.16e-04 2022-05-09 18:19:04,123 INFO [train.py:715] (0/8) Epoch 19, batch 23700, loss[loss=0.1113, simple_loss=0.1824, pruned_loss=0.02009, over 4850.00 frames.], tot_loss[loss=0.131, simple_loss=0.2055, pruned_loss=0.02827, over 970427.14 frames.], batch size: 20, lr: 1.16e-04 2022-05-09 18:19:44,518 INFO [train.py:715] (0/8) Epoch 19, batch 23750, loss[loss=0.1187, simple_loss=0.2001, pruned_loss=0.01872, over 4770.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2055, pruned_loss=0.02841, over 971557.34 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 18:20:24,721 INFO [train.py:715] (0/8) Epoch 19, batch 23800, loss[loss=0.1365, simple_loss=0.2136, pruned_loss=0.02969, over 4750.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2064, pruned_loss=0.02869, over 971642.57 frames.], batch size: 16, lr: 1.16e-04 2022-05-09 18:21:05,129 INFO [train.py:715] (0/8) Epoch 19, batch 23850, loss[loss=0.1401, simple_loss=0.2298, pruned_loss=0.02517, over 4915.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2062, pruned_loss=0.02903, over 971246.28 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 18:21:45,586 INFO [train.py:715] (0/8) Epoch 19, batch 23900, loss[loss=0.1174, simple_loss=0.1965, pruned_loss=0.01909, over 4933.00 frames.], tot_loss[loss=0.131, simple_loss=0.2052, pruned_loss=0.02837, over 972405.97 frames.], batch size: 23, lr: 1.16e-04 2022-05-09 18:22:24,880 INFO [train.py:715] (0/8) Epoch 19, batch 23950, loss[loss=0.15, simple_loss=0.2288, pruned_loss=0.03554, over 4903.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2047, pruned_loss=0.02827, over 971966.57 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 18:23:05,241 INFO [train.py:715] (0/8) Epoch 19, batch 24000, loss[loss=0.1287, simple_loss=0.2088, pruned_loss=0.02434, over 4885.00 frames.], tot_loss[loss=0.1302, simple_loss=0.2042, pruned_loss=0.02815, over 971678.51 frames.], batch size: 19, lr: 1.16e-04 2022-05-09 18:23:05,243 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 18:23:15,158 INFO [train.py:742] (0/8) Epoch 19, validation: loss=0.1046, simple_loss=0.1878, pruned_loss=0.01073, over 914524.00 frames. 
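Late in this stretch the logged learning rate steps from 1.17e-04 to 1.16e-04 while the validation loss stays around 0.1045. The snippet below is a hypothetical scan over the raw log file that reports each batch at which the logged lr value changes; the train.log filename is an assumption (only the exp directory appears in the log entries above).

```python
import re
from pathlib import Path

# Hypothetical log path; point this at the actual training log file.
LOG = Path("pruned_transducer_stateless2/exp/v2/train.log")

# Match the "batch N, ... lr: X" tail of each training entry shown above.
PAT = re.compile(r"batch (\d+),.*?lr: ([0-9.]+e-\d+)")

prev_lr = None
for m in PAT.finditer(LOG.read_text()):
    batch, lr = int(m.group(1)), m.group(2)
    if lr != prev_lr:
        # Report each point where the logged learning rate changes,
        # e.g. the 1.17e-04 -> 1.16e-04 step visible above.
        print(f"batch {batch}: lr = {lr}")
        prev_lr = lr
```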
2022-05-09 18:23:55,483 INFO [train.py:715] (0/8) Epoch 19, batch 24050, loss[loss=0.145, simple_loss=0.2122, pruned_loss=0.03892, over 4953.00 frames.], tot_loss[loss=0.1299, simple_loss=0.2041, pruned_loss=0.02786, over 971911.43 frames.], batch size: 35, lr: 1.16e-04 2022-05-09 18:24:36,267 INFO [train.py:715] (0/8) Epoch 19, batch 24100, loss[loss=0.1134, simple_loss=0.184, pruned_loss=0.02144, over 4904.00 frames.], tot_loss[loss=0.1295, simple_loss=0.2041, pruned_loss=0.02748, over 971720.74 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 18:25:16,108 INFO [train.py:715] (0/8) Epoch 19, batch 24150, loss[loss=0.1353, simple_loss=0.2199, pruned_loss=0.02537, over 4924.00 frames.], tot_loss[loss=0.1289, simple_loss=0.2038, pruned_loss=0.02701, over 972046.26 frames.], batch size: 39, lr: 1.16e-04 2022-05-09 18:25:56,276 INFO [train.py:715] (0/8) Epoch 19, batch 24200, loss[loss=0.1398, simple_loss=0.2101, pruned_loss=0.03474, over 4876.00 frames.], tot_loss[loss=0.1294, simple_loss=0.2038, pruned_loss=0.0275, over 973213.04 frames.], batch size: 22, lr: 1.16e-04 2022-05-09 18:26:36,609 INFO [train.py:715] (0/8) Epoch 19, batch 24250, loss[loss=0.1286, simple_loss=0.2069, pruned_loss=0.02511, over 4930.00 frames.], tot_loss[loss=0.1302, simple_loss=0.2044, pruned_loss=0.02802, over 973072.20 frames.], batch size: 23, lr: 1.16e-04 2022-05-09 18:27:17,346 INFO [train.py:715] (0/8) Epoch 19, batch 24300, loss[loss=0.127, simple_loss=0.1916, pruned_loss=0.03121, over 4864.00 frames.], tot_loss[loss=0.1298, simple_loss=0.204, pruned_loss=0.02778, over 972576.75 frames.], batch size: 32, lr: 1.16e-04 2022-05-09 18:27:56,388 INFO [train.py:715] (0/8) Epoch 19, batch 24350, loss[loss=0.1466, simple_loss=0.2148, pruned_loss=0.03919, over 4825.00 frames.], tot_loss[loss=0.1302, simple_loss=0.2047, pruned_loss=0.02781, over 972235.37 frames.], batch size: 12, lr: 1.16e-04 2022-05-09 18:28:36,031 INFO [train.py:715] (0/8) Epoch 19, batch 24400, loss[loss=0.1217, simple_loss=0.1932, pruned_loss=0.0251, over 4934.00 frames.], tot_loss[loss=0.1304, simple_loss=0.205, pruned_loss=0.02787, over 972327.72 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 18:29:16,435 INFO [train.py:715] (0/8) Epoch 19, batch 24450, loss[loss=0.1213, simple_loss=0.1899, pruned_loss=0.02633, over 4845.00 frames.], tot_loss[loss=0.1301, simple_loss=0.2045, pruned_loss=0.02789, over 972046.72 frames.], batch size: 13, lr: 1.16e-04 2022-05-09 18:29:55,847 INFO [train.py:715] (0/8) Epoch 19, batch 24500, loss[loss=0.1333, simple_loss=0.2075, pruned_loss=0.02957, over 4879.00 frames.], tot_loss[loss=0.1303, simple_loss=0.2045, pruned_loss=0.02809, over 971552.34 frames.], batch size: 32, lr: 1.16e-04 2022-05-09 18:30:34,371 INFO [train.py:715] (0/8) Epoch 19, batch 24550, loss[loss=0.1474, simple_loss=0.2212, pruned_loss=0.03683, over 4944.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2055, pruned_loss=0.02839, over 972015.30 frames.], batch size: 39, lr: 1.16e-04 2022-05-09 18:31:13,259 INFO [train.py:715] (0/8) Epoch 19, batch 24600, loss[loss=0.1444, simple_loss=0.2129, pruned_loss=0.03795, over 4984.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2061, pruned_loss=0.02841, over 971832.77 frames.], batch size: 14, lr: 1.16e-04 2022-05-09 18:31:52,754 INFO [train.py:715] (0/8) Epoch 19, batch 24650, loss[loss=0.1244, simple_loss=0.2013, pruned_loss=0.02374, over 4976.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2058, pruned_loss=0.02823, over 970932.62 frames.], batch size: 15, lr: 1.16e-04 2022-05-09 18:32:31,484 
INFO [train.py:715] (0/8) Epoch 19, batch 24700, loss[loss=0.1256, simple_loss=0.2068, pruned_loss=0.02222, over 4990.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2063, pruned_loss=0.02831, over 971038.63 frames.], batch size: 25, lr: 1.16e-04 2022-05-09 18:33:10,025 INFO [train.py:715] (0/8) Epoch 19, batch 24750, loss[loss=0.1139, simple_loss=0.2009, pruned_loss=0.01339, over 4795.00 frames.], tot_loss[loss=0.1312, simple_loss=0.206, pruned_loss=0.02823, over 971716.56 frames.], batch size: 24, lr: 1.16e-04 2022-05-09 18:33:50,379 INFO [train.py:715] (0/8) Epoch 19, batch 24800, loss[loss=0.124, simple_loss=0.1971, pruned_loss=0.02541, over 4899.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2064, pruned_loss=0.02836, over 971446.75 frames.], batch size: 19, lr: 1.16e-04 2022-05-09 18:34:30,020 INFO [train.py:715] (0/8) Epoch 19, batch 24850, loss[loss=0.1152, simple_loss=0.1893, pruned_loss=0.02049, over 4918.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2068, pruned_loss=0.02837, over 971722.29 frames.], batch size: 29, lr: 1.16e-04 2022-05-09 18:35:09,084 INFO [train.py:715] (0/8) Epoch 19, batch 24900, loss[loss=0.1214, simple_loss=0.1919, pruned_loss=0.02545, over 4861.00 frames.], tot_loss[loss=0.132, simple_loss=0.2074, pruned_loss=0.02833, over 971928.72 frames.], batch size: 32, lr: 1.16e-04 2022-05-09 18:35:48,550 INFO [train.py:715] (0/8) Epoch 19, batch 24950, loss[loss=0.1284, simple_loss=0.2102, pruned_loss=0.02325, over 4902.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2074, pruned_loss=0.0286, over 971744.80 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 18:36:28,347 INFO [train.py:715] (0/8) Epoch 19, batch 25000, loss[loss=0.09958, simple_loss=0.1746, pruned_loss=0.01227, over 4792.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2072, pruned_loss=0.02889, over 971805.93 frames.], batch size: 24, lr: 1.16e-04 2022-05-09 18:37:07,182 INFO [train.py:715] (0/8) Epoch 19, batch 25050, loss[loss=0.1153, simple_loss=0.1862, pruned_loss=0.02216, over 4777.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2066, pruned_loss=0.02849, over 971448.34 frames.], batch size: 12, lr: 1.16e-04 2022-05-09 18:37:46,484 INFO [train.py:715] (0/8) Epoch 19, batch 25100, loss[loss=0.1289, simple_loss=0.2, pruned_loss=0.02894, over 4904.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2065, pruned_loss=0.02867, over 972246.24 frames.], batch size: 19, lr: 1.16e-04 2022-05-09 18:38:26,081 INFO [train.py:715] (0/8) Epoch 19, batch 25150, loss[loss=0.1306, simple_loss=0.2208, pruned_loss=0.02021, over 4918.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2061, pruned_loss=0.0287, over 971761.15 frames.], batch size: 19, lr: 1.16e-04 2022-05-09 18:39:05,717 INFO [train.py:715] (0/8) Epoch 19, batch 25200, loss[loss=0.129, simple_loss=0.2007, pruned_loss=0.0287, over 4850.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2054, pruned_loss=0.02854, over 971211.26 frames.], batch size: 30, lr: 1.16e-04 2022-05-09 18:39:44,329 INFO [train.py:715] (0/8) Epoch 19, batch 25250, loss[loss=0.1315, simple_loss=0.1978, pruned_loss=0.03265, over 4939.00 frames.], tot_loss[loss=0.132, simple_loss=0.2064, pruned_loss=0.02877, over 971428.06 frames.], batch size: 35, lr: 1.16e-04 2022-05-09 18:40:23,570 INFO [train.py:715] (0/8) Epoch 19, batch 25300, loss[loss=0.1252, simple_loss=0.1995, pruned_loss=0.02546, over 4970.00 frames.], tot_loss[loss=0.131, simple_loss=0.2053, pruned_loss=0.02833, over 971470.26 frames.], batch size: 28, lr: 1.16e-04 2022-05-09 18:41:03,213 INFO [train.py:715] (0/8) 
Epoch 19, batch 25350, loss[loss=0.1413, simple_loss=0.2233, pruned_loss=0.02963, over 4806.00 frames.], tot_loss[loss=0.131, simple_loss=0.2054, pruned_loss=0.0283, over 971674.16 frames.], batch size: 21, lr: 1.16e-04 2022-05-09 18:41:42,424 INFO [train.py:715] (0/8) Epoch 19, batch 25400, loss[loss=0.1244, simple_loss=0.2019, pruned_loss=0.02343, over 4754.00 frames.], tot_loss[loss=0.1304, simple_loss=0.2051, pruned_loss=0.02789, over 971241.06 frames.], batch size: 19, lr: 1.16e-04 2022-05-09 18:42:21,495 INFO [train.py:715] (0/8) Epoch 19, batch 25450, loss[loss=0.1342, simple_loss=0.2085, pruned_loss=0.02998, over 4862.00 frames.], tot_loss[loss=0.1299, simple_loss=0.2049, pruned_loss=0.02742, over 971945.08 frames.], batch size: 34, lr: 1.16e-04 2022-05-09 18:43:00,713 INFO [train.py:715] (0/8) Epoch 19, batch 25500, loss[loss=0.139, simple_loss=0.2031, pruned_loss=0.03745, over 4864.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2053, pruned_loss=0.02795, over 972204.80 frames.], batch size: 32, lr: 1.16e-04 2022-05-09 18:43:39,822 INFO [train.py:715] (0/8) Epoch 19, batch 25550, loss[loss=0.1355, simple_loss=0.2121, pruned_loss=0.0295, over 4809.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2055, pruned_loss=0.02792, over 971624.68 frames.], batch size: 21, lr: 1.16e-04 2022-05-09 18:44:18,029 INFO [train.py:715] (0/8) Epoch 19, batch 25600, loss[loss=0.1773, simple_loss=0.2583, pruned_loss=0.04817, over 4904.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2059, pruned_loss=0.02798, over 972723.84 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 18:44:56,958 INFO [train.py:715] (0/8) Epoch 19, batch 25650, loss[loss=0.1396, simple_loss=0.215, pruned_loss=0.03209, over 4892.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2057, pruned_loss=0.02809, over 972110.13 frames.], batch size: 19, lr: 1.16e-04 2022-05-09 18:45:36,002 INFO [train.py:715] (0/8) Epoch 19, batch 25700, loss[loss=0.1283, simple_loss=0.2046, pruned_loss=0.02596, over 4776.00 frames.], tot_loss[loss=0.13, simple_loss=0.2048, pruned_loss=0.0276, over 972254.48 frames.], batch size: 14, lr: 1.16e-04 2022-05-09 18:46:14,561 INFO [train.py:715] (0/8) Epoch 19, batch 25750, loss[loss=0.1314, simple_loss=0.1928, pruned_loss=0.03498, over 4975.00 frames.], tot_loss[loss=0.1303, simple_loss=0.2049, pruned_loss=0.02788, over 972171.57 frames.], batch size: 14, lr: 1.16e-04 2022-05-09 18:46:53,572 INFO [train.py:715] (0/8) Epoch 19, batch 25800, loss[loss=0.131, simple_loss=0.206, pruned_loss=0.02797, over 4781.00 frames.], tot_loss[loss=0.13, simple_loss=0.2045, pruned_loss=0.02777, over 971962.96 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 18:47:32,967 INFO [train.py:715] (0/8) Epoch 19, batch 25850, loss[loss=0.1394, simple_loss=0.2228, pruned_loss=0.02797, over 4974.00 frames.], tot_loss[loss=0.1305, simple_loss=0.2052, pruned_loss=0.02789, over 971238.10 frames.], batch size: 24, lr: 1.16e-04 2022-05-09 18:48:12,298 INFO [train.py:715] (0/8) Epoch 19, batch 25900, loss[loss=0.1114, simple_loss=0.1868, pruned_loss=0.01795, over 4825.00 frames.], tot_loss[loss=0.1305, simple_loss=0.2051, pruned_loss=0.0279, over 971906.54 frames.], batch size: 25, lr: 1.16e-04 2022-05-09 18:48:50,877 INFO [train.py:715] (0/8) Epoch 19, batch 25950, loss[loss=0.1663, simple_loss=0.2439, pruned_loss=0.04439, over 4903.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2057, pruned_loss=0.02798, over 972126.75 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 18:49:30,501 INFO [train.py:715] (0/8) Epoch 19, batch 26000, 
loss[loss=0.1173, simple_loss=0.1899, pruned_loss=0.02235, over 4782.00 frames.], tot_loss[loss=0.1304, simple_loss=0.2052, pruned_loss=0.02778, over 971971.48 frames.], batch size: 14, lr: 1.16e-04 2022-05-09 18:50:10,473 INFO [train.py:715] (0/8) Epoch 19, batch 26050, loss[loss=0.1217, simple_loss=0.1973, pruned_loss=0.02307, over 4970.00 frames.], tot_loss[loss=0.1304, simple_loss=0.2049, pruned_loss=0.02796, over 972594.15 frames.], batch size: 21, lr: 1.16e-04 2022-05-09 18:50:49,156 INFO [train.py:715] (0/8) Epoch 19, batch 26100, loss[loss=0.119, simple_loss=0.1955, pruned_loss=0.02123, over 4787.00 frames.], tot_loss[loss=0.13, simple_loss=0.2048, pruned_loss=0.02767, over 971528.60 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 18:51:28,554 INFO [train.py:715] (0/8) Epoch 19, batch 26150, loss[loss=0.1206, simple_loss=0.1978, pruned_loss=0.0217, over 4931.00 frames.], tot_loss[loss=0.1299, simple_loss=0.2043, pruned_loss=0.02769, over 971916.24 frames.], batch size: 29, lr: 1.16e-04 2022-05-09 18:52:07,555 INFO [train.py:715] (0/8) Epoch 19, batch 26200, loss[loss=0.139, simple_loss=0.2163, pruned_loss=0.03082, over 4730.00 frames.], tot_loss[loss=0.1294, simple_loss=0.2041, pruned_loss=0.02728, over 972079.60 frames.], batch size: 16, lr: 1.16e-04 2022-05-09 18:52:47,076 INFO [train.py:715] (0/8) Epoch 19, batch 26250, loss[loss=0.1281, simple_loss=0.2102, pruned_loss=0.02301, over 4793.00 frames.], tot_loss[loss=0.1291, simple_loss=0.2039, pruned_loss=0.02721, over 971443.93 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 18:53:25,457 INFO [train.py:715] (0/8) Epoch 19, batch 26300, loss[loss=0.1277, simple_loss=0.2011, pruned_loss=0.02715, over 4762.00 frames.], tot_loss[loss=0.1297, simple_loss=0.2046, pruned_loss=0.02742, over 972299.56 frames.], batch size: 19, lr: 1.16e-04 2022-05-09 18:54:04,837 INFO [train.py:715] (0/8) Epoch 19, batch 26350, loss[loss=0.118, simple_loss=0.1932, pruned_loss=0.0214, over 4817.00 frames.], tot_loss[loss=0.129, simple_loss=0.2042, pruned_loss=0.02689, over 973319.12 frames.], batch size: 21, lr: 1.16e-04 2022-05-09 18:54:44,064 INFO [train.py:715] (0/8) Epoch 19, batch 26400, loss[loss=0.1395, simple_loss=0.2163, pruned_loss=0.03129, over 4969.00 frames.], tot_loss[loss=0.13, simple_loss=0.2051, pruned_loss=0.0275, over 973378.27 frames.], batch size: 25, lr: 1.16e-04 2022-05-09 18:55:23,158 INFO [train.py:715] (0/8) Epoch 19, batch 26450, loss[loss=0.1312, simple_loss=0.2152, pruned_loss=0.02358, over 4978.00 frames.], tot_loss[loss=0.1299, simple_loss=0.2048, pruned_loss=0.02753, over 974773.76 frames.], batch size: 24, lr: 1.16e-04 2022-05-09 18:56:02,214 INFO [train.py:715] (0/8) Epoch 19, batch 26500, loss[loss=0.12, simple_loss=0.196, pruned_loss=0.02201, over 4768.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2053, pruned_loss=0.02794, over 974653.58 frames.], batch size: 14, lr: 1.16e-04 2022-05-09 18:56:40,883 INFO [train.py:715] (0/8) Epoch 19, batch 26550, loss[loss=0.1177, simple_loss=0.1983, pruned_loss=0.01854, over 4797.00 frames.], tot_loss[loss=0.1304, simple_loss=0.2053, pruned_loss=0.02772, over 974286.21 frames.], batch size: 24, lr: 1.16e-04 2022-05-09 18:57:21,642 INFO [train.py:715] (0/8) Epoch 19, batch 26600, loss[loss=0.1099, simple_loss=0.1882, pruned_loss=0.01579, over 4803.00 frames.], tot_loss[loss=0.1302, simple_loss=0.2053, pruned_loss=0.02753, over 974657.03 frames.], batch size: 21, lr: 1.16e-04 2022-05-09 18:57:38,183 INFO [checkpoint.py:70] (0/8) Saving checkpoint to 
pruned_transducer_stateless2/exp/v2/checkpoint-688000.pt 2022-05-09 18:58:02,778 INFO [train.py:715] (0/8) Epoch 19, batch 26650, loss[loss=0.1456, simple_loss=0.2145, pruned_loss=0.03836, over 4859.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2057, pruned_loss=0.02777, over 974895.12 frames.], batch size: 20, lr: 1.16e-04 2022-05-09 18:58:41,692 INFO [train.py:715] (0/8) Epoch 19, batch 26700, loss[loss=0.141, simple_loss=0.2087, pruned_loss=0.03669, over 4798.00 frames.], tot_loss[loss=0.1298, simple_loss=0.2045, pruned_loss=0.0276, over 974684.26 frames.], batch size: 24, lr: 1.16e-04 2022-05-09 18:59:21,008 INFO [train.py:715] (0/8) Epoch 19, batch 26750, loss[loss=0.1479, simple_loss=0.2158, pruned_loss=0.04006, over 4774.00 frames.], tot_loss[loss=0.1299, simple_loss=0.2043, pruned_loss=0.02769, over 973611.87 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 19:00:00,952 INFO [train.py:715] (0/8) Epoch 19, batch 26800, loss[loss=0.1295, simple_loss=0.1989, pruned_loss=0.03009, over 4843.00 frames.], tot_loss[loss=0.1301, simple_loss=0.2045, pruned_loss=0.02782, over 972519.82 frames.], batch size: 15, lr: 1.16e-04 2022-05-09 19:00:41,174 INFO [train.py:715] (0/8) Epoch 19, batch 26850, loss[loss=0.1273, simple_loss=0.2014, pruned_loss=0.0266, over 4903.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2051, pruned_loss=0.02824, over 972412.51 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 19:01:20,363 INFO [train.py:715] (0/8) Epoch 19, batch 26900, loss[loss=0.1307, simple_loss=0.1993, pruned_loss=0.03107, over 4893.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2055, pruned_loss=0.0287, over 973033.70 frames.], batch size: 19, lr: 1.16e-04 2022-05-09 19:02:00,253 INFO [train.py:715] (0/8) Epoch 19, batch 26950, loss[loss=0.1405, simple_loss=0.2147, pruned_loss=0.03315, over 4831.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2057, pruned_loss=0.02859, over 972572.97 frames.], batch size: 26, lr: 1.16e-04 2022-05-09 19:02:39,712 INFO [train.py:715] (0/8) Epoch 19, batch 27000, loss[loss=0.1138, simple_loss=0.1842, pruned_loss=0.0217, over 4958.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2061, pruned_loss=0.02853, over 972306.35 frames.], batch size: 24, lr: 1.16e-04 2022-05-09 19:02:39,713 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 19:02:49,600 INFO [train.py:742] (0/8) Epoch 19, validation: loss=0.1047, simple_loss=0.1878, pruned_loss=0.0108, over 914524.00 frames. 
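Every 3000 batches (24000, 27000, 30000, 33000 in this section) the trainer logs "Computing validation loss" and then a validation summary over the same 914524.00 frames of dev data, as in the entry just above. A small, hedged sketch of how one might collect those summaries and flag whether the latest one still improves on the best seen so far; the helper name and the improvement threshold are arbitrary choices for illustration, not anything the recipe itself does.

```python
import re

VALID_RE = re.compile(r"Epoch (?P<epoch>\d+), validation: loss=(?P<loss>[0-9.e+-]+)")

def validation_history(path, min_delta=1e-4):
    """Collect validation losses from the log and mark which ones improved
    on the best value seen so far (the threshold is an arbitrary choice)."""
    best = float("inf")
    history = []
    with open(path) as f:
        for line in f:
            for m in VALID_RE.finditer(line):
                cur = float(m["loss"])
                history.append((int(m["epoch"]), cur, cur < best - min_delta))
                best = min(best, cur)
    return history

# For the four validation entries in this section (0.1046, 0.1047, 0.1045,
# 0.1048) the curve is essentially flat, so only the first value would
# register as an improvement under this threshold.
```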
2022-05-09 19:03:29,470 INFO [train.py:715] (0/8) Epoch 19, batch 27050, loss[loss=0.1416, simple_loss=0.2188, pruned_loss=0.03221, over 4757.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2057, pruned_loss=0.02824, over 972671.83 frames.], batch size: 19, lr: 1.16e-04 2022-05-09 19:04:09,786 INFO [train.py:715] (0/8) Epoch 19, batch 27100, loss[loss=0.125, simple_loss=0.1987, pruned_loss=0.02565, over 4976.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2053, pruned_loss=0.02807, over 972146.19 frames.], batch size: 15, lr: 1.16e-04 2022-05-09 19:04:50,659 INFO [train.py:715] (0/8) Epoch 19, batch 27150, loss[loss=0.1395, simple_loss=0.2096, pruned_loss=0.03472, over 4903.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2053, pruned_loss=0.02803, over 972334.16 frames.], batch size: 19, lr: 1.16e-04 2022-05-09 19:05:30,591 INFO [train.py:715] (0/8) Epoch 19, batch 27200, loss[loss=0.1405, simple_loss=0.2081, pruned_loss=0.03643, over 4914.00 frames.], tot_loss[loss=0.131, simple_loss=0.2059, pruned_loss=0.02805, over 971869.18 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 19:06:11,126 INFO [train.py:715] (0/8) Epoch 19, batch 27250, loss[loss=0.1195, simple_loss=0.1888, pruned_loss=0.02505, over 4881.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2056, pruned_loss=0.02801, over 972922.25 frames.], batch size: 16, lr: 1.16e-04 2022-05-09 19:06:52,922 INFO [train.py:715] (0/8) Epoch 19, batch 27300, loss[loss=0.1331, simple_loss=0.2191, pruned_loss=0.02349, over 4823.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2059, pruned_loss=0.02786, over 974194.39 frames.], batch size: 27, lr: 1.16e-04 2022-05-09 19:07:33,647 INFO [train.py:715] (0/8) Epoch 19, batch 27350, loss[loss=0.1393, simple_loss=0.219, pruned_loss=0.02978, over 4833.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2055, pruned_loss=0.02785, over 973758.71 frames.], batch size: 32, lr: 1.16e-04 2022-05-09 19:08:14,912 INFO [train.py:715] (0/8) Epoch 19, batch 27400, loss[loss=0.1293, simple_loss=0.2127, pruned_loss=0.02301, over 4825.00 frames.], tot_loss[loss=0.1301, simple_loss=0.2051, pruned_loss=0.02753, over 973190.08 frames.], batch size: 26, lr: 1.16e-04 2022-05-09 19:08:54,855 INFO [train.py:715] (0/8) Epoch 19, batch 27450, loss[loss=0.1238, simple_loss=0.1993, pruned_loss=0.02419, over 4888.00 frames.], tot_loss[loss=0.1301, simple_loss=0.2052, pruned_loss=0.02746, over 971950.40 frames.], batch size: 19, lr: 1.16e-04 2022-05-09 19:09:36,480 INFO [train.py:715] (0/8) Epoch 19, batch 27500, loss[loss=0.1297, simple_loss=0.2135, pruned_loss=0.02296, over 4769.00 frames.], tot_loss[loss=0.1301, simple_loss=0.2051, pruned_loss=0.02751, over 971454.53 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 19:10:17,083 INFO [train.py:715] (0/8) Epoch 19, batch 27550, loss[loss=0.1506, simple_loss=0.2239, pruned_loss=0.03864, over 4731.00 frames.], tot_loss[loss=0.1303, simple_loss=0.2052, pruned_loss=0.02764, over 971213.76 frames.], batch size: 16, lr: 1.16e-04 2022-05-09 19:10:57,702 INFO [train.py:715] (0/8) Epoch 19, batch 27600, loss[loss=0.1157, simple_loss=0.1861, pruned_loss=0.02265, over 4963.00 frames.], tot_loss[loss=0.1307, simple_loss=0.206, pruned_loss=0.02773, over 971500.23 frames.], batch size: 24, lr: 1.16e-04 2022-05-09 19:11:38,778 INFO [train.py:715] (0/8) Epoch 19, batch 27650, loss[loss=0.1631, simple_loss=0.2252, pruned_loss=0.05051, over 4690.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2061, pruned_loss=0.02784, over 972047.25 frames.], batch size: 15, lr: 1.16e-04 2022-05-09 
19:12:19,402 INFO [train.py:715] (0/8) Epoch 19, batch 27700, loss[loss=0.1421, simple_loss=0.2039, pruned_loss=0.0402, over 4848.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2059, pruned_loss=0.02837, over 972089.68 frames.], batch size: 32, lr: 1.16e-04 2022-05-09 19:13:00,034 INFO [train.py:715] (0/8) Epoch 19, batch 27750, loss[loss=0.1469, simple_loss=0.2188, pruned_loss=0.03748, over 4742.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2059, pruned_loss=0.02827, over 972264.63 frames.], batch size: 16, lr: 1.16e-04 2022-05-09 19:13:40,111 INFO [train.py:715] (0/8) Epoch 19, batch 27800, loss[loss=0.1226, simple_loss=0.2031, pruned_loss=0.0211, over 4906.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2067, pruned_loss=0.02884, over 972113.02 frames.], batch size: 19, lr: 1.16e-04 2022-05-09 19:14:21,132 INFO [train.py:715] (0/8) Epoch 19, batch 27850, loss[loss=0.1031, simple_loss=0.1751, pruned_loss=0.01558, over 4834.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2061, pruned_loss=0.02871, over 972328.78 frames.], batch size: 13, lr: 1.16e-04 2022-05-09 19:15:01,160 INFO [train.py:715] (0/8) Epoch 19, batch 27900, loss[loss=0.1275, simple_loss=0.205, pruned_loss=0.02501, over 4926.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2059, pruned_loss=0.02884, over 971576.46 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 19:15:41,279 INFO [train.py:715] (0/8) Epoch 19, batch 27950, loss[loss=0.1274, simple_loss=0.2076, pruned_loss=0.02362, over 4688.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2054, pruned_loss=0.02875, over 971317.95 frames.], batch size: 15, lr: 1.16e-04 2022-05-09 19:16:21,259 INFO [train.py:715] (0/8) Epoch 19, batch 28000, loss[loss=0.1095, simple_loss=0.1794, pruned_loss=0.01982, over 4966.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2054, pruned_loss=0.0287, over 972037.37 frames.], batch size: 14, lr: 1.16e-04 2022-05-09 19:17:02,092 INFO [train.py:715] (0/8) Epoch 19, batch 28050, loss[loss=0.133, simple_loss=0.2024, pruned_loss=0.0318, over 4804.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2057, pruned_loss=0.02892, over 971305.28 frames.], batch size: 14, lr: 1.16e-04 2022-05-09 19:17:42,528 INFO [train.py:715] (0/8) Epoch 19, batch 28100, loss[loss=0.1348, simple_loss=0.2037, pruned_loss=0.033, over 4913.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2057, pruned_loss=0.02892, over 970526.95 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 19:18:22,475 INFO [train.py:715] (0/8) Epoch 19, batch 28150, loss[loss=0.1516, simple_loss=0.2267, pruned_loss=0.03822, over 4888.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2058, pruned_loss=0.029, over 971720.62 frames.], batch size: 19, lr: 1.16e-04 2022-05-09 19:19:02,903 INFO [train.py:715] (0/8) Epoch 19, batch 28200, loss[loss=0.1193, simple_loss=0.2062, pruned_loss=0.01617, over 4974.00 frames.], tot_loss[loss=0.132, simple_loss=0.2059, pruned_loss=0.02903, over 973165.92 frames.], batch size: 24, lr: 1.16e-04 2022-05-09 19:19:42,612 INFO [train.py:715] (0/8) Epoch 19, batch 28250, loss[loss=0.1081, simple_loss=0.1859, pruned_loss=0.01518, over 4829.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2055, pruned_loss=0.02862, over 973087.75 frames.], batch size: 26, lr: 1.16e-04 2022-05-09 19:20:22,519 INFO [train.py:715] (0/8) Epoch 19, batch 28300, loss[loss=0.1488, simple_loss=0.2251, pruned_loss=0.03621, over 4957.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2062, pruned_loss=0.02899, over 973432.38 frames.], batch size: 21, lr: 1.16e-04 2022-05-09 19:21:02,195 INFO 
[train.py:715] (0/8) Epoch 19, batch 28350, loss[loss=0.1363, simple_loss=0.2141, pruned_loss=0.02931, over 4830.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2066, pruned_loss=0.02884, over 974140.03 frames.], batch size: 26, lr: 1.16e-04 2022-05-09 19:21:42,213 INFO [train.py:715] (0/8) Epoch 19, batch 28400, loss[loss=0.1507, simple_loss=0.2208, pruned_loss=0.04034, over 4845.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2058, pruned_loss=0.02871, over 973821.73 frames.], batch size: 30, lr: 1.16e-04 2022-05-09 19:22:22,328 INFO [train.py:715] (0/8) Epoch 19, batch 28450, loss[loss=0.1313, simple_loss=0.1897, pruned_loss=0.03639, over 4859.00 frames.], tot_loss[loss=0.131, simple_loss=0.2051, pruned_loss=0.02841, over 974013.57 frames.], batch size: 32, lr: 1.16e-04 2022-05-09 19:23:02,155 INFO [train.py:715] (0/8) Epoch 19, batch 28500, loss[loss=0.122, simple_loss=0.193, pruned_loss=0.02551, over 4823.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2051, pruned_loss=0.0287, over 973436.70 frames.], batch size: 25, lr: 1.16e-04 2022-05-09 19:23:42,829 INFO [train.py:715] (0/8) Epoch 19, batch 28550, loss[loss=0.1126, simple_loss=0.1906, pruned_loss=0.01731, over 4791.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2055, pruned_loss=0.02855, over 972470.45 frames.], batch size: 24, lr: 1.16e-04 2022-05-09 19:24:22,310 INFO [train.py:715] (0/8) Epoch 19, batch 28600, loss[loss=0.1265, simple_loss=0.2044, pruned_loss=0.02433, over 4968.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2055, pruned_loss=0.02846, over 972177.07 frames.], batch size: 39, lr: 1.16e-04 2022-05-09 19:25:02,349 INFO [train.py:715] (0/8) Epoch 19, batch 28650, loss[loss=0.1585, simple_loss=0.2302, pruned_loss=0.04337, over 4915.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2059, pruned_loss=0.02849, over 972600.30 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 19:25:43,121 INFO [train.py:715] (0/8) Epoch 19, batch 28700, loss[loss=0.124, simple_loss=0.1869, pruned_loss=0.03054, over 4905.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2061, pruned_loss=0.02846, over 973053.07 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 19:26:22,653 INFO [train.py:715] (0/8) Epoch 19, batch 28750, loss[loss=0.1214, simple_loss=0.1967, pruned_loss=0.02304, over 4862.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2059, pruned_loss=0.02796, over 973418.31 frames.], batch size: 20, lr: 1.16e-04 2022-05-09 19:27:02,560 INFO [train.py:715] (0/8) Epoch 19, batch 28800, loss[loss=0.1224, simple_loss=0.1904, pruned_loss=0.02718, over 4772.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2066, pruned_loss=0.02822, over 971931.08 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 19:27:41,936 INFO [train.py:715] (0/8) Epoch 19, batch 28850, loss[loss=0.1428, simple_loss=0.2263, pruned_loss=0.02963, over 4887.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2064, pruned_loss=0.02811, over 972867.47 frames.], batch size: 19, lr: 1.16e-04 2022-05-09 19:28:21,324 INFO [train.py:715] (0/8) Epoch 19, batch 28900, loss[loss=0.1357, simple_loss=0.2088, pruned_loss=0.03126, over 4836.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2064, pruned_loss=0.02827, over 973111.67 frames.], batch size: 30, lr: 1.16e-04 2022-05-09 19:28:59,436 INFO [train.py:715] (0/8) Epoch 19, batch 28950, loss[loss=0.1047, simple_loss=0.1782, pruned_loss=0.01563, over 4967.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2069, pruned_loss=0.02843, over 972391.96 frames.], batch size: 35, lr: 1.16e-04 2022-05-09 19:29:38,325 INFO [train.py:715] (0/8) 
Epoch 19, batch 29000, loss[loss=0.1322, simple_loss=0.2105, pruned_loss=0.02691, over 4744.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2061, pruned_loss=0.02828, over 971409.91 frames.], batch size: 16, lr: 1.16e-04 2022-05-09 19:30:17,558 INFO [train.py:715] (0/8) Epoch 19, batch 29050, loss[loss=0.1455, simple_loss=0.2323, pruned_loss=0.02934, over 4803.00 frames.], tot_loss[loss=0.1308, simple_loss=0.206, pruned_loss=0.02783, over 971946.65 frames.], batch size: 21, lr: 1.16e-04 2022-05-09 19:30:56,437 INFO [train.py:715] (0/8) Epoch 19, batch 29100, loss[loss=0.1218, simple_loss=0.1974, pruned_loss=0.02306, over 4986.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2056, pruned_loss=0.02789, over 971867.79 frames.], batch size: 25, lr: 1.16e-04 2022-05-09 19:31:35,382 INFO [train.py:715] (0/8) Epoch 19, batch 29150, loss[loss=0.1226, simple_loss=0.1954, pruned_loss=0.02488, over 4973.00 frames.], tot_loss[loss=0.1305, simple_loss=0.2053, pruned_loss=0.0278, over 972114.39 frames.], batch size: 14, lr: 1.16e-04 2022-05-09 19:32:14,164 INFO [train.py:715] (0/8) Epoch 19, batch 29200, loss[loss=0.1241, simple_loss=0.2014, pruned_loss=0.02341, over 4842.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2064, pruned_loss=0.02823, over 972084.99 frames.], batch size: 20, lr: 1.16e-04 2022-05-09 19:32:53,529 INFO [train.py:715] (0/8) Epoch 19, batch 29250, loss[loss=0.1254, simple_loss=0.2061, pruned_loss=0.02241, over 4910.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2061, pruned_loss=0.02806, over 971474.61 frames.], batch size: 39, lr: 1.16e-04 2022-05-09 19:33:32,155 INFO [train.py:715] (0/8) Epoch 19, batch 29300, loss[loss=0.129, simple_loss=0.2062, pruned_loss=0.02587, over 4791.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2066, pruned_loss=0.02823, over 971788.06 frames.], batch size: 24, lr: 1.16e-04 2022-05-09 19:34:11,670 INFO [train.py:715] (0/8) Epoch 19, batch 29350, loss[loss=0.1194, simple_loss=0.1888, pruned_loss=0.02499, over 4779.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2066, pruned_loss=0.02845, over 971668.99 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 19:34:50,602 INFO [train.py:715] (0/8) Epoch 19, batch 29400, loss[loss=0.1442, simple_loss=0.212, pruned_loss=0.0382, over 4787.00 frames.], tot_loss[loss=0.1323, simple_loss=0.207, pruned_loss=0.02875, over 971132.39 frames.], batch size: 14, lr: 1.16e-04 2022-05-09 19:35:29,739 INFO [train.py:715] (0/8) Epoch 19, batch 29450, loss[loss=0.1302, simple_loss=0.2064, pruned_loss=0.02704, over 4986.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2062, pruned_loss=0.02866, over 970502.36 frames.], batch size: 33, lr: 1.16e-04 2022-05-09 19:36:09,171 INFO [train.py:715] (0/8) Epoch 19, batch 29500, loss[loss=0.1343, simple_loss=0.208, pruned_loss=0.0303, over 4938.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2067, pruned_loss=0.02872, over 971329.93 frames.], batch size: 23, lr: 1.16e-04 2022-05-09 19:36:48,558 INFO [train.py:715] (0/8) Epoch 19, batch 29550, loss[loss=0.1257, simple_loss=0.1991, pruned_loss=0.02616, over 4790.00 frames.], tot_loss[loss=0.1326, simple_loss=0.2072, pruned_loss=0.02896, over 971889.20 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 19:37:28,167 INFO [train.py:715] (0/8) Epoch 19, batch 29600, loss[loss=0.117, simple_loss=0.191, pruned_loss=0.02147, over 4927.00 frames.], tot_loss[loss=0.1323, simple_loss=0.207, pruned_loss=0.0288, over 972847.12 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 19:38:07,298 INFO [train.py:715] (0/8) Epoch 19, batch 29650, 
loss[loss=0.1144, simple_loss=0.193, pruned_loss=0.01789, over 4961.00 frames.], tot_loss[loss=0.1325, simple_loss=0.207, pruned_loss=0.02898, over 972510.25 frames.], batch size: 29, lr: 1.16e-04 2022-05-09 19:38:47,450 INFO [train.py:715] (0/8) Epoch 19, batch 29700, loss[loss=0.1459, simple_loss=0.2151, pruned_loss=0.0384, over 4941.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2066, pruned_loss=0.02907, over 972489.18 frames.], batch size: 35, lr: 1.16e-04 2022-05-09 19:39:26,741 INFO [train.py:715] (0/8) Epoch 19, batch 29750, loss[loss=0.1271, simple_loss=0.2086, pruned_loss=0.02286, over 4955.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2064, pruned_loss=0.02869, over 973060.53 frames.], batch size: 21, lr: 1.16e-04 2022-05-09 19:40:06,088 INFO [train.py:715] (0/8) Epoch 19, batch 29800, loss[loss=0.1518, simple_loss=0.2318, pruned_loss=0.0359, over 4767.00 frames.], tot_loss[loss=0.1326, simple_loss=0.207, pruned_loss=0.02909, over 972898.07 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 19:40:45,394 INFO [train.py:715] (0/8) Epoch 19, batch 29850, loss[loss=0.1233, simple_loss=0.2019, pruned_loss=0.02235, over 4946.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2067, pruned_loss=0.02897, over 973438.79 frames.], batch size: 29, lr: 1.16e-04 2022-05-09 19:41:24,807 INFO [train.py:715] (0/8) Epoch 19, batch 29900, loss[loss=0.1343, simple_loss=0.2071, pruned_loss=0.03071, over 4886.00 frames.], tot_loss[loss=0.132, simple_loss=0.2061, pruned_loss=0.0289, over 973553.00 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 19:42:04,765 INFO [train.py:715] (0/8) Epoch 19, batch 29950, loss[loss=0.1221, simple_loss=0.1933, pruned_loss=0.02548, over 4808.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2058, pruned_loss=0.02868, over 972900.46 frames.], batch size: 13, lr: 1.16e-04 2022-05-09 19:42:43,616 INFO [train.py:715] (0/8) Epoch 19, batch 30000, loss[loss=0.1553, simple_loss=0.2234, pruned_loss=0.04361, over 4770.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2062, pruned_loss=0.02909, over 973022.72 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 19:42:43,617 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 19:42:53,508 INFO [train.py:742] (0/8) Epoch 19, validation: loss=0.1045, simple_loss=0.1877, pruned_loss=0.01067, over 914524.00 frames. 
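Each tot_loss above is reported "over roughly 970k frames", while the loss[...] in front of it covers the current batch alone (a few thousand frames). A frame-weighted running average makes that relationship concrete; the sketch below is only an illustration of why tot_loss moves so slowly from one entry to the next, not the actual accumulation code in train.py, which may reset or decay its statistics.

```python
class RunningLoss:
    """Frame-weighted running average, illustrating the kind of number the
    tot_loss column reports; the real bookkeeping in train.py may differ."""

    def __init__(self):
        self.weighted_sum = 0.0  # sum of per-frame loss * number of frames
        self.frames = 0.0

    def update(self, batch_loss, batch_frames):
        self.weighted_sum += batch_loss * batch_frames
        self.frames += batch_frames

    @property
    def value(self):
        return self.weighted_sum / max(self.frames, 1.0)

avg = RunningLoss()
avg.update(0.1322, 970_000)  # roughly the aggregate state logged above
avg.update(0.1553, 4770)     # one noisy batch (batch 30000's loss[...] entry)
print(f"{avg.value:.4f}")    # barely moves, ~0.1323
```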
2022-05-09 19:43:32,625 INFO [train.py:715] (0/8) Epoch 19, batch 30050, loss[loss=0.1282, simple_loss=0.1979, pruned_loss=0.02927, over 4936.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2067, pruned_loss=0.02886, over 972831.58 frames.], batch size: 21, lr: 1.16e-04 2022-05-09 19:44:12,190 INFO [train.py:715] (0/8) Epoch 19, batch 30100, loss[loss=0.1546, simple_loss=0.229, pruned_loss=0.0401, over 4830.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2068, pruned_loss=0.02874, over 972818.29 frames.], batch size: 15, lr: 1.16e-04 2022-05-09 19:44:51,309 INFO [train.py:715] (0/8) Epoch 19, batch 30150, loss[loss=0.1499, simple_loss=0.2343, pruned_loss=0.03274, over 4801.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2064, pruned_loss=0.02858, over 973386.04 frames.], batch size: 21, lr: 1.16e-04 2022-05-09 19:45:31,084 INFO [train.py:715] (0/8) Epoch 19, batch 30200, loss[loss=0.1514, simple_loss=0.2137, pruned_loss=0.04457, over 4766.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2069, pruned_loss=0.02919, over 973266.40 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 19:46:09,570 INFO [train.py:715] (0/8) Epoch 19, batch 30250, loss[loss=0.1441, simple_loss=0.2212, pruned_loss=0.03346, over 4896.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2074, pruned_loss=0.02913, over 973804.46 frames.], batch size: 39, lr: 1.16e-04 2022-05-09 19:46:48,896 INFO [train.py:715] (0/8) Epoch 19, batch 30300, loss[loss=0.1517, simple_loss=0.2284, pruned_loss=0.03754, over 4936.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2067, pruned_loss=0.02877, over 972857.29 frames.], batch size: 23, lr: 1.16e-04 2022-05-09 19:47:28,506 INFO [train.py:715] (0/8) Epoch 19, batch 30350, loss[loss=0.1141, simple_loss=0.1944, pruned_loss=0.01689, over 4753.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2067, pruned_loss=0.02877, over 971550.24 frames.], batch size: 19, lr: 1.16e-04 2022-05-09 19:48:08,086 INFO [train.py:715] (0/8) Epoch 19, batch 30400, loss[loss=0.1658, simple_loss=0.2402, pruned_loss=0.0457, over 4889.00 frames.], tot_loss[loss=0.1327, simple_loss=0.2071, pruned_loss=0.02912, over 971368.88 frames.], batch size: 22, lr: 1.16e-04 2022-05-09 19:48:47,850 INFO [train.py:715] (0/8) Epoch 19, batch 30450, loss[loss=0.1139, simple_loss=0.1947, pruned_loss=0.01658, over 4914.00 frames.], tot_loss[loss=0.1324, simple_loss=0.2069, pruned_loss=0.02902, over 971719.70 frames.], batch size: 29, lr: 1.16e-04 2022-05-09 19:49:26,656 INFO [train.py:715] (0/8) Epoch 19, batch 30500, loss[loss=0.1213, simple_loss=0.1937, pruned_loss=0.02441, over 4948.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2076, pruned_loss=0.02945, over 971866.26 frames.], batch size: 29, lr: 1.16e-04 2022-05-09 19:50:06,591 INFO [train.py:715] (0/8) Epoch 19, batch 30550, loss[loss=0.1326, simple_loss=0.2004, pruned_loss=0.0324, over 4828.00 frames.], tot_loss[loss=0.1324, simple_loss=0.207, pruned_loss=0.02888, over 972177.67 frames.], batch size: 13, lr: 1.16e-04 2022-05-09 19:50:45,745 INFO [train.py:715] (0/8) Epoch 19, batch 30600, loss[loss=0.1228, simple_loss=0.1986, pruned_loss=0.0235, over 4824.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2062, pruned_loss=0.02835, over 973053.57 frames.], batch size: 26, lr: 1.16e-04 2022-05-09 19:51:25,834 INFO [train.py:715] (0/8) Epoch 19, batch 30650, loss[loss=0.1131, simple_loss=0.1885, pruned_loss=0.01887, over 4816.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2062, pruned_loss=0.02847, over 972745.41 frames.], batch size: 26, lr: 1.16e-04 2022-05-09 19:52:05,610 
INFO [train.py:715] (0/8) Epoch 19, batch 30700, loss[loss=0.1399, simple_loss=0.2085, pruned_loss=0.03561, over 4922.00 frames.], tot_loss[loss=0.1318, simple_loss=0.2063, pruned_loss=0.02858, over 972627.47 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 19:52:45,194 INFO [train.py:715] (0/8) Epoch 19, batch 30750, loss[loss=0.1051, simple_loss=0.1788, pruned_loss=0.01572, over 4826.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2069, pruned_loss=0.02911, over 973237.86 frames.], batch size: 26, lr: 1.16e-04 2022-05-09 19:53:25,712 INFO [train.py:715] (0/8) Epoch 19, batch 30800, loss[loss=0.1403, simple_loss=0.2041, pruned_loss=0.03822, over 4923.00 frames.], tot_loss[loss=0.1322, simple_loss=0.2065, pruned_loss=0.02898, over 973421.49 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 19:54:05,654 INFO [train.py:715] (0/8) Epoch 19, batch 30850, loss[loss=0.1385, simple_loss=0.2164, pruned_loss=0.03033, over 4979.00 frames.], tot_loss[loss=0.1327, simple_loss=0.207, pruned_loss=0.02918, over 973238.78 frames.], batch size: 28, lr: 1.16e-04 2022-05-09 19:54:46,446 INFO [train.py:715] (0/8) Epoch 19, batch 30900, loss[loss=0.1196, simple_loss=0.2018, pruned_loss=0.01872, over 4970.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2062, pruned_loss=0.02875, over 974187.48 frames.], batch size: 24, lr: 1.16e-04 2022-05-09 19:55:26,512 INFO [train.py:715] (0/8) Epoch 19, batch 30950, loss[loss=0.1452, simple_loss=0.2163, pruned_loss=0.03702, over 4902.00 frames.], tot_loss[loss=0.1323, simple_loss=0.2066, pruned_loss=0.02903, over 974364.58 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 19:56:07,124 INFO [train.py:715] (0/8) Epoch 19, batch 31000, loss[loss=0.1231, simple_loss=0.2018, pruned_loss=0.02222, over 4791.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2061, pruned_loss=0.02853, over 973861.23 frames.], batch size: 21, lr: 1.16e-04 2022-05-09 19:56:47,788 INFO [train.py:715] (0/8) Epoch 19, batch 31050, loss[loss=0.1307, simple_loss=0.2013, pruned_loss=0.03006, over 4810.00 frames.], tot_loss[loss=0.131, simple_loss=0.2053, pruned_loss=0.02838, over 973031.49 frames.], batch size: 21, lr: 1.16e-04 2022-05-09 19:57:28,114 INFO [train.py:715] (0/8) Epoch 19, batch 31100, loss[loss=0.1446, simple_loss=0.2114, pruned_loss=0.03888, over 4852.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2057, pruned_loss=0.02836, over 972109.50 frames.], batch size: 16, lr: 1.16e-04 2022-05-09 19:58:08,816 INFO [train.py:715] (0/8) Epoch 19, batch 31150, loss[loss=0.1363, simple_loss=0.2141, pruned_loss=0.02931, over 4810.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2061, pruned_loss=0.02857, over 972056.10 frames.], batch size: 21, lr: 1.16e-04 2022-05-09 19:58:49,192 INFO [train.py:715] (0/8) Epoch 19, batch 31200, loss[loss=0.1294, simple_loss=0.2066, pruned_loss=0.02615, over 4787.00 frames.], tot_loss[loss=0.1311, simple_loss=0.206, pruned_loss=0.02816, over 971934.09 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 19:59:30,189 INFO [train.py:715] (0/8) Epoch 19, batch 31250, loss[loss=0.1292, simple_loss=0.2058, pruned_loss=0.02624, over 4746.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2063, pruned_loss=0.02828, over 971573.21 frames.], batch size: 16, lr: 1.16e-04 2022-05-09 20:00:09,923 INFO [train.py:715] (0/8) Epoch 19, batch 31300, loss[loss=0.1203, simple_loss=0.2004, pruned_loss=0.02012, over 4881.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2064, pruned_loss=0.02852, over 971503.35 frames.], batch size: 16, lr: 1.16e-04 2022-05-09 20:00:50,567 INFO [train.py:715] 
(0/8) Epoch 19, batch 31350, loss[loss=0.1415, simple_loss=0.2164, pruned_loss=0.03329, over 4817.00 frames.], tot_loss[loss=0.133, simple_loss=0.2073, pruned_loss=0.02931, over 972006.54 frames.], batch size: 25, lr: 1.16e-04 2022-05-09 20:01:31,161 INFO [train.py:715] (0/8) Epoch 19, batch 31400, loss[loss=0.1296, simple_loss=0.1982, pruned_loss=0.03052, over 4783.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2075, pruned_loss=0.02946, over 972940.07 frames.], batch size: 21, lr: 1.16e-04 2022-05-09 20:02:11,464 INFO [train.py:715] (0/8) Epoch 19, batch 31450, loss[loss=0.1436, simple_loss=0.2123, pruned_loss=0.03746, over 4916.00 frames.], tot_loss[loss=0.1332, simple_loss=0.2075, pruned_loss=0.02952, over 972924.24 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 20:02:52,730 INFO [train.py:715] (0/8) Epoch 19, batch 31500, loss[loss=0.1522, simple_loss=0.219, pruned_loss=0.04267, over 4976.00 frames.], tot_loss[loss=0.1331, simple_loss=0.2073, pruned_loss=0.02939, over 972155.94 frames.], batch size: 35, lr: 1.16e-04 2022-05-09 20:03:32,847 INFO [train.py:715] (0/8) Epoch 19, batch 31550, loss[loss=0.1385, simple_loss=0.2057, pruned_loss=0.03567, over 4742.00 frames.], tot_loss[loss=0.1328, simple_loss=0.2069, pruned_loss=0.02932, over 972357.08 frames.], batch size: 16, lr: 1.16e-04 2022-05-09 20:04:13,428 INFO [train.py:715] (0/8) Epoch 19, batch 31600, loss[loss=0.1188, simple_loss=0.192, pruned_loss=0.0228, over 4768.00 frames.], tot_loss[loss=0.1325, simple_loss=0.2069, pruned_loss=0.02905, over 973125.35 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 20:04:53,551 INFO [train.py:715] (0/8) Epoch 19, batch 31650, loss[loss=0.1093, simple_loss=0.191, pruned_loss=0.01383, over 4848.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2059, pruned_loss=0.02873, over 972746.64 frames.], batch size: 13, lr: 1.16e-04 2022-05-09 20:05:33,937 INFO [train.py:715] (0/8) Epoch 19, batch 31700, loss[loss=0.1178, simple_loss=0.1947, pruned_loss=0.02045, over 4901.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2058, pruned_loss=0.02826, over 972964.07 frames.], batch size: 23, lr: 1.16e-04 2022-05-09 20:06:14,403 INFO [train.py:715] (0/8) Epoch 19, batch 31750, loss[loss=0.1156, simple_loss=0.1958, pruned_loss=0.01767, over 4815.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2057, pruned_loss=0.02823, over 972463.12 frames.], batch size: 26, lr: 1.16e-04 2022-05-09 20:06:54,584 INFO [train.py:715] (0/8) Epoch 19, batch 31800, loss[loss=0.1276, simple_loss=0.2059, pruned_loss=0.02465, over 4778.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2061, pruned_loss=0.02785, over 972151.21 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 20:07:35,675 INFO [train.py:715] (0/8) Epoch 19, batch 31850, loss[loss=0.118, simple_loss=0.1891, pruned_loss=0.02339, over 4769.00 frames.], tot_loss[loss=0.1309, simple_loss=0.206, pruned_loss=0.0279, over 972361.44 frames.], batch size: 14, lr: 1.16e-04 2022-05-09 20:08:15,977 INFO [train.py:715] (0/8) Epoch 19, batch 31900, loss[loss=0.1227, simple_loss=0.1992, pruned_loss=0.02309, over 4930.00 frames.], tot_loss[loss=0.1303, simple_loss=0.205, pruned_loss=0.0278, over 972561.35 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 20:08:56,574 INFO [train.py:715] (0/8) Epoch 19, batch 31950, loss[loss=0.1117, simple_loss=0.1832, pruned_loss=0.02007, over 4756.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2052, pruned_loss=0.02809, over 973015.21 frames.], batch size: 19, lr: 1.16e-04 2022-05-09 20:09:36,652 INFO [train.py:715] (0/8) Epoch 19, batch 32000, 
loss[loss=0.1309, simple_loss=0.2019, pruned_loss=0.02997, over 4748.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2057, pruned_loss=0.02836, over 973338.41 frames.], batch size: 19, lr: 1.16e-04 2022-05-09 20:10:16,973 INFO [train.py:715] (0/8) Epoch 19, batch 32050, loss[loss=0.1335, simple_loss=0.2117, pruned_loss=0.02769, over 4811.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2056, pruned_loss=0.02855, over 972322.50 frames.], batch size: 25, lr: 1.16e-04 2022-05-09 20:10:57,300 INFO [train.py:715] (0/8) Epoch 19, batch 32100, loss[loss=0.1194, simple_loss=0.1939, pruned_loss=0.02245, over 4777.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2061, pruned_loss=0.02863, over 972804.19 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 20:11:37,101 INFO [train.py:715] (0/8) Epoch 19, batch 32150, loss[loss=0.1157, simple_loss=0.1918, pruned_loss=0.01981, over 4955.00 frames.], tot_loss[loss=0.132, simple_loss=0.2067, pruned_loss=0.02865, over 972156.72 frames.], batch size: 21, lr: 1.16e-04 2022-05-09 20:12:18,363 INFO [train.py:715] (0/8) Epoch 19, batch 32200, loss[loss=0.1219, simple_loss=0.2053, pruned_loss=0.01919, over 4949.00 frames.], tot_loss[loss=0.1313, simple_loss=0.206, pruned_loss=0.0283, over 972896.27 frames.], batch size: 14, lr: 1.16e-04 2022-05-09 20:12:58,104 INFO [train.py:715] (0/8) Epoch 19, batch 32250, loss[loss=0.1186, simple_loss=0.2009, pruned_loss=0.01815, over 4840.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2056, pruned_loss=0.02815, over 972741.47 frames.], batch size: 13, lr: 1.16e-04 2022-05-09 20:13:38,490 INFO [train.py:715] (0/8) Epoch 19, batch 32300, loss[loss=0.1012, simple_loss=0.1737, pruned_loss=0.01432, over 4839.00 frames.], tot_loss[loss=0.1303, simple_loss=0.2051, pruned_loss=0.02775, over 972823.24 frames.], batch size: 27, lr: 1.16e-04 2022-05-09 20:14:19,663 INFO [train.py:715] (0/8) Epoch 19, batch 32350, loss[loss=0.1224, simple_loss=0.1929, pruned_loss=0.02595, over 4654.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2059, pruned_loss=0.02817, over 971672.04 frames.], batch size: 13, lr: 1.16e-04 2022-05-09 20:15:00,204 INFO [train.py:715] (0/8) Epoch 19, batch 32400, loss[loss=0.131, simple_loss=0.2012, pruned_loss=0.03039, over 4790.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2052, pruned_loss=0.02802, over 970646.07 frames.], batch size: 12, lr: 1.16e-04 2022-05-09 20:15:40,800 INFO [train.py:715] (0/8) Epoch 19, batch 32450, loss[loss=0.147, simple_loss=0.2182, pruned_loss=0.0379, over 4757.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2062, pruned_loss=0.02824, over 970775.92 frames.], batch size: 16, lr: 1.16e-04 2022-05-09 20:16:20,795 INFO [train.py:715] (0/8) Epoch 19, batch 32500, loss[loss=0.13, simple_loss=0.194, pruned_loss=0.03294, over 4960.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2055, pruned_loss=0.02817, over 970942.64 frames.], batch size: 35, lr: 1.16e-04 2022-05-09 20:17:01,566 INFO [train.py:715] (0/8) Epoch 19, batch 32550, loss[loss=0.1868, simple_loss=0.2598, pruned_loss=0.05694, over 4863.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2064, pruned_loss=0.02866, over 971655.92 frames.], batch size: 16, lr: 1.16e-04 2022-05-09 20:17:41,578 INFO [train.py:715] (0/8) Epoch 19, batch 32600, loss[loss=0.1292, simple_loss=0.2028, pruned_loss=0.02783, over 4878.00 frames.], tot_loss[loss=0.1317, simple_loss=0.206, pruned_loss=0.02867, over 971869.34 frames.], batch size: 22, lr: 1.16e-04 2022-05-09 20:18:21,655 INFO [train.py:715] (0/8) Epoch 19, batch 32650, loss[loss=0.136, 
simple_loss=0.2041, pruned_loss=0.03392, over 4909.00 frames.], tot_loss[loss=0.1314, simple_loss=0.2061, pruned_loss=0.02839, over 971619.26 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 20:19:02,300 INFO [train.py:715] (0/8) Epoch 19, batch 32700, loss[loss=0.1499, simple_loss=0.2309, pruned_loss=0.03439, over 4811.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2064, pruned_loss=0.02854, over 971389.08 frames.], batch size: 25, lr: 1.16e-04 2022-05-09 20:19:42,119 INFO [train.py:715] (0/8) Epoch 19, batch 32750, loss[loss=0.1236, simple_loss=0.2009, pruned_loss=0.02315, over 4821.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2058, pruned_loss=0.02805, over 972424.78 frames.], batch size: 26, lr: 1.16e-04 2022-05-09 20:20:21,838 INFO [train.py:715] (0/8) Epoch 19, batch 32800, loss[loss=0.1089, simple_loss=0.1829, pruned_loss=0.01751, over 4929.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2058, pruned_loss=0.02804, over 972882.72 frames.], batch size: 23, lr: 1.16e-04 2022-05-09 20:21:00,653 INFO [train.py:715] (0/8) Epoch 19, batch 32850, loss[loss=0.122, simple_loss=0.1926, pruned_loss=0.02568, over 4785.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2058, pruned_loss=0.02842, over 971895.43 frames.], batch size: 12, lr: 1.16e-04 2022-05-09 20:21:39,682 INFO [train.py:715] (0/8) Epoch 19, batch 32900, loss[loss=0.1378, simple_loss=0.2213, pruned_loss=0.02717, over 4818.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2052, pruned_loss=0.02826, over 972764.77 frames.], batch size: 26, lr: 1.16e-04 2022-05-09 20:22:18,346 INFO [train.py:715] (0/8) Epoch 19, batch 32950, loss[loss=0.1185, simple_loss=0.1961, pruned_loss=0.02041, over 4773.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2055, pruned_loss=0.0287, over 972077.86 frames.], batch size: 18, lr: 1.16e-04 2022-05-09 20:22:57,659 INFO [train.py:715] (0/8) Epoch 19, batch 33000, loss[loss=0.1329, simple_loss=0.2132, pruned_loss=0.02634, over 4744.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2055, pruned_loss=0.02843, over 971933.11 frames.], batch size: 16, lr: 1.16e-04 2022-05-09 20:22:57,660 INFO [train.py:733] (0/8) Computing validation loss 2022-05-09 20:23:07,492 INFO [train.py:742] (0/8) Epoch 19, validation: loss=0.1048, simple_loss=0.1878, pruned_loss=0.01088, over 914524.00 frames. 
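The three numbers inside each bracket are internally consistent: for the entries in this section, loss agrees to within rounding with 0.5 * simple_loss + pruned_loss (for example 0.5 * 0.2124 + 0.03446 = 0.1407 for batch 23000). The weighting itself is set by the recipe's configuration rather than stated in the log, so the snippet below only verifies the arithmetic on values copied from the entries above.

```python
# Values copied verbatim from entries in this log section.
entries = [
    ("batch 23000 loss[...]",     0.1407, 0.2124, 0.03446),
    ("batch 24000 tot_loss[...]", 0.1302, 0.2042, 0.02815),
    ("validation at batch 33000", 0.1048, 0.1878, 0.01088),
]
for name, loss, simple_loss, pruned_loss in entries:
    recon = 0.5 * simple_loss + pruned_loss
    print(f"{name}: logged={loss:.4f}  0.5*simple+pruned={recon:.5f}")
```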
2022-05-09 20:23:46,768 INFO [train.py:715] (0/8) Epoch 19, batch 33050, loss[loss=0.1342, simple_loss=0.2189, pruned_loss=0.02478, over 4827.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2051, pruned_loss=0.02804, over 973026.58 frames.], batch size: 27, lr: 1.16e-04 2022-05-09 20:24:26,210 INFO [train.py:715] (0/8) Epoch 19, batch 33100, loss[loss=0.1235, simple_loss=0.1935, pruned_loss=0.02677, over 4699.00 frames.], tot_loss[loss=0.1304, simple_loss=0.2048, pruned_loss=0.02798, over 972202.92 frames.], batch size: 15, lr: 1.16e-04 2022-05-09 20:25:05,026 INFO [train.py:715] (0/8) Epoch 19, batch 33150, loss[loss=0.1215, simple_loss=0.2034, pruned_loss=0.01973, over 4941.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2054, pruned_loss=0.02806, over 971326.35 frames.], batch size: 21, lr: 1.16e-04 2022-05-09 20:25:44,206 INFO [train.py:715] (0/8) Epoch 19, batch 33200, loss[loss=0.108, simple_loss=0.176, pruned_loss=0.02002, over 4845.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2063, pruned_loss=0.02837, over 971364.54 frames.], batch size: 12, lr: 1.16e-04 2022-05-09 20:26:23,769 INFO [train.py:715] (0/8) Epoch 19, batch 33250, loss[loss=0.1336, simple_loss=0.1972, pruned_loss=0.03498, over 4911.00 frames.], tot_loss[loss=0.1311, simple_loss=0.2057, pruned_loss=0.02828, over 971799.54 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 20:27:03,190 INFO [train.py:715] (0/8) Epoch 19, batch 33300, loss[loss=0.1408, simple_loss=0.222, pruned_loss=0.02976, over 4840.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2066, pruned_loss=0.02847, over 972063.54 frames.], batch size: 26, lr: 1.16e-04 2022-05-09 20:27:42,908 INFO [train.py:715] (0/8) Epoch 19, batch 33350, loss[loss=0.1345, simple_loss=0.2126, pruned_loss=0.02818, over 4782.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2066, pruned_loss=0.02837, over 972239.11 frames.], batch size: 17, lr: 1.16e-04 2022-05-09 20:28:22,073 INFO [train.py:715] (0/8) Epoch 19, batch 33400, loss[loss=0.1233, simple_loss=0.2021, pruned_loss=0.02227, over 4753.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2063, pruned_loss=0.02837, over 972880.96 frames.], batch size: 19, lr: 1.16e-04 2022-05-09 20:29:01,051 INFO [train.py:715] (0/8) Epoch 19, batch 33450, loss[loss=0.1473, simple_loss=0.2137, pruned_loss=0.04051, over 4992.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2064, pruned_loss=0.02809, over 972562.98 frames.], batch size: 16, lr: 1.16e-04 2022-05-09 20:29:40,018 INFO [train.py:715] (0/8) Epoch 19, batch 33500, loss[loss=0.1198, simple_loss=0.2015, pruned_loss=0.01901, over 4920.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2062, pruned_loss=0.02775, over 972361.45 frames.], batch size: 23, lr: 1.16e-04 2022-05-09 20:30:18,901 INFO [train.py:715] (0/8) Epoch 19, batch 33550, loss[loss=0.1623, simple_loss=0.2326, pruned_loss=0.04599, over 4688.00 frames.], tot_loss[loss=0.1306, simple_loss=0.2054, pruned_loss=0.02792, over 972028.51 frames.], batch size: 15, lr: 1.15e-04 2022-05-09 20:30:58,228 INFO [train.py:715] (0/8) Epoch 19, batch 33600, loss[loss=0.1872, simple_loss=0.2501, pruned_loss=0.0621, over 4839.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2062, pruned_loss=0.02858, over 972211.29 frames.], batch size: 30, lr: 1.15e-04 2022-05-09 20:31:37,212 INFO [train.py:715] (0/8) Epoch 19, batch 33650, loss[loss=0.1286, simple_loss=0.2044, pruned_loss=0.02642, over 4634.00 frames.], tot_loss[loss=0.1321, simple_loss=0.2065, pruned_loss=0.02881, over 971555.82 frames.], batch size: 13, lr: 1.15e-04 2022-05-09 
20:32:16,614 INFO [train.py:715] (0/8) Epoch 19, batch 33700, loss[loss=0.1136, simple_loss=0.1878, pruned_loss=0.01971, over 4963.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2064, pruned_loss=0.02845, over 971885.61 frames.], batch size: 24, lr: 1.15e-04 2022-05-09 20:32:55,330 INFO [train.py:715] (0/8) Epoch 19, batch 33750, loss[loss=0.1565, simple_loss=0.2311, pruned_loss=0.04094, over 4810.00 frames.], tot_loss[loss=0.1316, simple_loss=0.2062, pruned_loss=0.02851, over 971951.43 frames.], batch size: 21, lr: 1.15e-04 2022-05-09 20:33:34,122 INFO [train.py:715] (0/8) Epoch 19, batch 33800, loss[loss=0.1446, simple_loss=0.2245, pruned_loss=0.03234, over 4815.00 frames.], tot_loss[loss=0.1313, simple_loss=0.2058, pruned_loss=0.02837, over 971938.22 frames.], batch size: 13, lr: 1.15e-04 2022-05-09 20:34:12,731 INFO [train.py:715] (0/8) Epoch 19, batch 33850, loss[loss=0.1736, simple_loss=0.2315, pruned_loss=0.05787, over 4882.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2057, pruned_loss=0.02794, over 972518.44 frames.], batch size: 16, lr: 1.15e-04 2022-05-09 20:34:51,525 INFO [train.py:715] (0/8) Epoch 19, batch 33900, loss[loss=0.1181, simple_loss=0.2004, pruned_loss=0.01793, over 4988.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2062, pruned_loss=0.02841, over 972910.29 frames.], batch size: 28, lr: 1.15e-04 2022-05-09 20:35:31,242 INFO [train.py:715] (0/8) Epoch 19, batch 33950, loss[loss=0.121, simple_loss=0.1867, pruned_loss=0.02761, over 4752.00 frames.], tot_loss[loss=0.1309, simple_loss=0.2053, pruned_loss=0.02821, over 972742.55 frames.], batch size: 12, lr: 1.15e-04 2022-05-09 20:36:10,891 INFO [train.py:715] (0/8) Epoch 19, batch 34000, loss[loss=0.1427, simple_loss=0.215, pruned_loss=0.03516, over 4854.00 frames.], tot_loss[loss=0.1315, simple_loss=0.206, pruned_loss=0.02846, over 973020.27 frames.], batch size: 20, lr: 1.15e-04 2022-05-09 20:36:50,178 INFO [train.py:715] (0/8) Epoch 19, batch 34050, loss[loss=0.1292, simple_loss=0.2106, pruned_loss=0.02391, over 4813.00 frames.], tot_loss[loss=0.1315, simple_loss=0.2059, pruned_loss=0.02852, over 972012.53 frames.], batch size: 21, lr: 1.15e-04 2022-05-09 20:37:28,967 INFO [train.py:715] (0/8) Epoch 19, batch 34100, loss[loss=0.13, simple_loss=0.2069, pruned_loss=0.02655, over 4823.00 frames.], tot_loss[loss=0.1308, simple_loss=0.2055, pruned_loss=0.02809, over 971827.32 frames.], batch size: 13, lr: 1.15e-04 2022-05-09 20:38:08,478 INFO [train.py:715] (0/8) Epoch 19, batch 34150, loss[loss=0.1389, simple_loss=0.2185, pruned_loss=0.02963, over 4834.00 frames.], tot_loss[loss=0.1304, simple_loss=0.2048, pruned_loss=0.02803, over 971709.32 frames.], batch size: 15, lr: 1.15e-04 2022-05-09 20:38:48,055 INFO [train.py:715] (0/8) Epoch 19, batch 34200, loss[loss=0.1526, simple_loss=0.2298, pruned_loss=0.03765, over 4907.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2054, pruned_loss=0.02847, over 971749.90 frames.], batch size: 17, lr: 1.15e-04 2022-05-09 20:39:27,598 INFO [train.py:715] (0/8) Epoch 19, batch 34250, loss[loss=0.1139, simple_loss=0.1997, pruned_loss=0.01409, over 4800.00 frames.], tot_loss[loss=0.1301, simple_loss=0.2044, pruned_loss=0.02784, over 972063.56 frames.], batch size: 21, lr: 1.15e-04 2022-05-09 20:40:06,924 INFO [train.py:715] (0/8) Epoch 19, batch 34300, loss[loss=0.1283, simple_loss=0.205, pruned_loss=0.0258, over 4782.00 frames.], tot_loss[loss=0.1293, simple_loss=0.2036, pruned_loss=0.02752, over 971579.57 frames.], batch size: 17, lr: 1.15e-04 2022-05-09 20:40:46,125 INFO 
[train.py:715] (0/8) Epoch 19, batch 34350, loss[loss=0.1212, simple_loss=0.191, pruned_loss=0.0257, over 4783.00 frames.], tot_loss[loss=0.1292, simple_loss=0.2038, pruned_loss=0.0273, over 971136.25 frames.], batch size: 12, lr: 1.15e-04 2022-05-09 20:41:25,882 INFO [train.py:715] (0/8) Epoch 19, batch 34400, loss[loss=0.1338, simple_loss=0.2088, pruned_loss=0.02936, over 4774.00 frames.], tot_loss[loss=0.1299, simple_loss=0.2045, pruned_loss=0.02765, over 970708.98 frames.], batch size: 17, lr: 1.15e-04 2022-05-09 20:42:05,038 INFO [train.py:715] (0/8) Epoch 19, batch 34450, loss[loss=0.1279, simple_loss=0.2, pruned_loss=0.02785, over 4742.00 frames.], tot_loss[loss=0.1298, simple_loss=0.2045, pruned_loss=0.02759, over 971086.41 frames.], batch size: 16, lr: 1.15e-04 2022-05-09 20:42:44,558 INFO [train.py:715] (0/8) Epoch 19, batch 34500, loss[loss=0.1442, simple_loss=0.2198, pruned_loss=0.03426, over 4892.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2052, pruned_loss=0.02808, over 971905.57 frames.], batch size: 16, lr: 1.15e-04 2022-05-09 20:43:24,265 INFO [train.py:715] (0/8) Epoch 19, batch 34550, loss[loss=0.1381, simple_loss=0.2205, pruned_loss=0.0278, over 4974.00 frames.], tot_loss[loss=0.1317, simple_loss=0.2063, pruned_loss=0.02856, over 972770.53 frames.], batch size: 35, lr: 1.15e-04 2022-05-09 20:44:03,126 INFO [train.py:715] (0/8) Epoch 19, batch 34600, loss[loss=0.1257, simple_loss=0.2002, pruned_loss=0.02563, over 4861.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2058, pruned_loss=0.02835, over 972789.22 frames.], batch size: 20, lr: 1.15e-04 2022-05-09 20:44:20,630 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/checkpoint-696000.pt 2022-05-09 20:44:45,164 INFO [train.py:715] (0/8) Epoch 19, batch 34650, loss[loss=0.1237, simple_loss=0.1925, pruned_loss=0.02747, over 4915.00 frames.], tot_loss[loss=0.1319, simple_loss=0.2068, pruned_loss=0.02849, over 972589.32 frames.], batch size: 18, lr: 1.15e-04 2022-05-09 20:45:24,630 INFO [train.py:715] (0/8) Epoch 19, batch 34700, loss[loss=0.1247, simple_loss=0.1988, pruned_loss=0.02532, over 4810.00 frames.], tot_loss[loss=0.1312, simple_loss=0.2059, pruned_loss=0.02825, over 972513.30 frames.], batch size: 26, lr: 1.15e-04 2022-05-09 20:46:02,678 INFO [train.py:715] (0/8) Epoch 19, batch 34750, loss[loss=0.1184, simple_loss=0.1979, pruned_loss=0.0195, over 4819.00 frames.], tot_loss[loss=0.1305, simple_loss=0.205, pruned_loss=0.02803, over 971802.66 frames.], batch size: 27, lr: 1.15e-04 2022-05-09 20:46:39,958 INFO [train.py:715] (0/8) Epoch 19, batch 34800, loss[loss=0.104, simple_loss=0.1753, pruned_loss=0.01635, over 4802.00 frames.], tot_loss[loss=0.1307, simple_loss=0.2051, pruned_loss=0.02819, over 972278.42 frames.], batch size: 12, lr: 1.15e-04 2022-05-09 20:46:48,547 INFO [checkpoint.py:70] (0/8) Saving checkpoint to pruned_transducer_stateless2/exp/v2/epoch-19.pt 2022-05-09 20:46:56,186 INFO [train.py:915] (0/8) Done!
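The run closes by writing checkpoint-696000.pt and then the epoch-level epoch-19.pt under pruned_transducer_stateless2/exp/v2/ before logging "Done!". A minimal sketch for peeking inside such a file with plain PyTorch; it assumes only that the checkpoint is a torch-saved dict and does not rely on any icefall-specific loader or key layout.

```python
import torch

# Load on CPU so the file can be inspected without a GPU.
ckpt = torch.load(
    "pruned_transducer_stateless2/exp/v2/epoch-19.pt", map_location="cpu"
)

# Checkpoints written by typical training loops are plain dicts; list the
# top-level keys (model weights, optimizer state, bookkeeping, ...) without
# assuming any particular structure.
if isinstance(ckpt, dict):
    for key, value in ckpt.items():
        print(key, type(value).__name__)
else:
    print(type(ckpt).__name__)
```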