AlexN committed on
Commit de5eebc
Parent(s): 65ca3d0

Training in progress, step 8500
.ipynb_checkpoints/preprocessor_config-checkpoint.json ADDED
@@ -0,0 +1,9 @@
+ {
+   "do_normalize": true,
+   "feature_extractor_type": "Wav2Vec2FeatureExtractor",
+   "feature_size": 1,
+   "padding_side": "right",
+   "padding_value": 0,
+   "return_attention_mask": true,
+   "sampling_rate": 16000
+ }
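As a side note, the config above is a plain JSON file, so its invariants are easy to sanity-check before feeding audio to the model. A minimal stdlib-only sketch (in `transformers` this file would normally be consumed via `Wav2Vec2FeatureExtractor.from_pretrained`; the checks below are illustrative assumptions about correct input preparation):

```python
import json

# The preprocessor config added in this commit, reproduced verbatim.
config_text = """{
    "do_normalize": true,
    "feature_extractor_type": "Wav2Vec2FeatureExtractor",
    "feature_size": 1,
    "padding_side": "right",
    "padding_value": 0,
    "return_attention_mask": true,
    "sampling_rate": 16000
}"""

config = json.loads(config_text)

# Sanity checks: the extractor expects mono input (feature_size 1)
# resampled to 16 kHz, with zero-padding on the right.
assert config["feature_size"] == 1
assert config["sampling_rate"] == 16000
assert config["padding_side"] == "right"
print(config["feature_extractor_type"])  # Wav2Vec2FeatureExtractor
```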
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:c7d5c5665cb2f10cbee4073c92b5d77e0f1dafd3e633c65ef200d4ca4dca469e
+ oid sha256:2286263c5fbbc1e83cd9d35130517d66808e769f2dfa0b9424795299c9f1570b
  size 1262817457
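What the diff above actually changes is a Git LFS pointer file: the large `pytorch_model.bin` lives in LFS storage, and Git tracks only this small text stub (version line, content hash, byte size). A minimal sketch of reading one, assuming the simple `key value` line format shown above:

```python
# Parse a Git LFS pointer file (the small text stub that stands in
# for a large binary such as pytorch_model.bin in the Git history).
def parse_lfs_pointer(text: str) -> dict:
    fields = {}
    for line in text.strip().splitlines():
        # Each pointer line is "<key> <value>", split on the first space.
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:2286263c5fbbc1e83cd9d35130517d66808e769f2dfa0b9424795299c9f1570b
size 1262817457"""

info = parse_lfs_pointer(pointer)
print(info["size"])  # 1262817457
```

Because only the `oid` hash changed while `size` stayed at 1262817457, this commit replaced the weights with a new checkpoint of identical byte size.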
wandb/run-20220129_131141-h6nhqm30/files/output.log CHANGED
@@ -8470,3 +8470,534 @@ Configuration saved in ./checkpoint-8000/config.json
  {'eval_loss': 0.3026394248008728, 'eval_wer': 0.4099970068841664, 'eval_runtime': 64.4321, 'eval_samples_per_second': 28.542, 'eval_steps_per_second': 0.45, 'epoch': 1.86}
  Model weights saved in ./checkpoint-8000/pytorch_model.bin
  Configuration saved in ./checkpoint-8000/preprocessor_config.json
8473
+ Configuration saved in ./preprocessor_config.json
8474
+ Deleting older checkpoint [checkpoint-6500] due to args.save_total_limit
8475
+
8476
+
8477
+
8478
+
8479
+
8480
+
8481
+
8482
+
8483
+
8484
+
8485
+
8486
+
8487
+
8488
+
8489
+
8490
+
8491
+
8492
+
8493
+
8494
+
8495
+
8496
+
8497
+
8498
+
8499
+
8500
+
8501
+
8502
+
8503
+
8504
+
8505
+
8506
+
8507
+
8508
+
8509
+
8510
+
8511
+
8512
+
8513
+
8514
+
8515
+
8516
+
8517
+
8518
+
8519
+
8520
+
8521
+
8522
+
8523
+
8524
+
8525
+
8526
+
8527
+
8528
+
8529
+
8530
+
8531
+
8532
+
8533
+
8534
+
8535
+
8536
+
8537
+
8538
+
8539
+
8540
+
8541
+
8542
+
8543
+
8544
+
8545
+
8546
+
8547
+
8548
+
8549
+
8550
+
8551
+
8552
+
8553
+
8554
+
8555
+
8556
+
8557
+
8558
+
8559
+
8560
+
8561
+
8562
+
8563
+
8564
+
8565
+
8566
+
8567
+
8568
+
8569
+
8570
+
8571
+ 38%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–‹ | 8099/21520 [6:41:43<7:19:00, 1.96s/it]
8572
+
8573
+
8574
+
8575
+
8576
+
8577
+
8578
+
8579
+
8580
+
8581
+
8582
+
8583
+
8584
+
8585
+
8586
+
8587
+
8588
+
8589
+
8590
+
8591
+
8592
+
8593
+
8594
+
8595
+
8596
+
8597
+
8598
+
8599
+
8600
+
8601
+
8602
+
8603
+
8604
+
8605
+
8606
+
8607
+
8608
+
8609
+
8610
+
8611
+
8612
+
8613
+
8614
+
8615
+
8616
+
8617
+
8618
+
8619
+
8620
+
8621
+
8622
+
8623
+
8624
+
8625
+
8626
+
8627
+
8628
+
8629
+
8630
+
8631
+
8632
+
8633
+
8634
+
8635
+
8636
+
8637
+
8638
+
8639
+
8640
+
8641
+
8642
+
8643
+
8644
+
8645
+
8646
+
8647
+
8648
+
8649
+
8650
+
8651
+
8652
+
8653
+
8654
+
8655
+
8656
+
8657
+
8658
+
8659
+
8660
+
8661
+
8662
+
8663
+
8664
+
8665
+
8666
+
8667
+
8668
+
8669
+
8670
+ 38%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ– | 8200/21520 [6:46:08<7:12:30, 1.95s/it]
8671
+
8672
+
8673
+
8674
+
8675
+
8676
+
8677
+
8678
+
8679
+
8680
+
8681
+
8682
+
8683
+
8684
+
8685
+
8686
+
8687
+
8688
+
8689
+
8690
+
8691
+
8692
+
8693
+
8694
+
8695
+
8696
+
8697
+
8698
+
8699
+
8700
+
8701
+
8702
+
8703
+
8704
+
8705
+
8706
+
8707
+
8708
+
8709
+
8710
+
8711
+
8712
+
8713
+
8714
+
8715
+
8716
+
8717
+
8718
+
8719
+
8720
+
8721
+
8722
+
8723
+
8724
+
8725
+
8726
+
8727
+
8728
+
8729
+
8730
+
8731
+
8732
+
8733
+
8734
+
8735
+
8736
+
8737
+
8738
+
8739
+
8740
+
8741
+
8742
+
8743
+
8744
+
8745
+
8746
+
8747
+
8748
+
8749
+
8750
+
8751
+
8752
+
8753
+
8754
+
8755
+
8756
+
8757
+
8758
+
8759
+
8760
+
8761
+
8762
+
8763
+
8764
+
8765
+
8766
+
8767
+
8768
+
8769
+ 39%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ– | 8301/21520 [6:50:33<7:07:31, 1.94s/it]
8770
+
8771
+
8772
+
8773
+
8774
+
8775
+
8776
+
8777
+
8778
+
8779
+
8780
+
8781
+
8782
+
8783
+
8784
+
8785
+
8786
+
8787
+
8788
+
8789
+
8790
+
8791
+
8792
+
8793
+
8794
+
8795
+
8796
+
8797
+
8798
+
8799
+
8800
+
8801
+
8802
+
8803
+
8804
+
8805
+
8806
+
8807
+
8808
+
8809
+
8810
+
8811
+
8812
+
8813
+
8814
+
8815
+
8816
+
8817
+
8818
+
8819
+
8820
+
8821
+
8822
+
8823
+
8824
+
8825
+
8826
+
8827
+
8828
+
8829
+
8830
+
8831
+
8832
+
8833
+
8834
+
8835
+
8836
+
8837
+
8838
+
8839
+
8840
+
8841
+
8842
+
8843
+
8844
+
8845
+
8846
+
8847
+
8848
+
8849
+
8850
+
8851
+
8852
+
8853
+
8854
+
8855
+
8856
+
8857
+
8858
+
8859
+
8860
+
8861
+
8862
+
8863
+
8864
+
8865
+
8866
+
8867
+
8868
+ 39%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–‰ | 8400/21520 [6:54:55<7:36:42, 2.09s/it]
8869
+
8870
+
8871
+
8872
+
8873
+
8874
+
8875
+
8876
+
8877
+
8878
+
8879
+
8880
+
8881
+
8882
+
8883
+
8884
+
8885
+
8886
+
8887
+
8888
+
8889
+
8890
+
8891
+
8892
+
8893
+
8894
+
8895
+
8896
+
8897
+
8898
+
8899
+
8900
+
8901
+
8902
+
8903
+
8904
+
8905
+
8906
+
8907
+
8908
+
8909
+
8910
+
8911
+
8912
+
8913
+
8914
+
8915
+
8916
+
8917
+
8918
+
8919
+
8920
+
8921
+
8922
+
8923
+
8924
+
8925
+
8926
+
8927
+
8928
+
8929
+
8930
+
8931
+
8932
+
8933
+
8934
+
8935
+
8936
+
8937
+
8938
+
8939
+
8940
+
8941
+
8942
+
8943
+
8944
+
8945
+
8946
+
8947
+
8948
+
8949
+
8950
+
8951
+
8952
+
8953
+
8954
+
8955
+
8956
+
8957
+
8958
+
8959
+
8960
+
8961
+
8962
+
8963
+
8964
+
8965
+
8966
+ 39%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–Œ | 8499/21520 [6:59:15<7:12:41, 1.99s/it]
8967
+ 39%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–Œ | 8500/21520 [6:59:17<7:12:02, 1.99s/it]The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
8968
+ ***** Running Evaluation *****
8969
+ Num examples = 1839
8970
+ Batch size = 64
8971
+
8972
+
8973
+
8974
+
8975
+
8976
+
8977
+
8978
+
8979
+
8980
+
8981
+
8982
+
8983
+
8984
+
8985
+
8986
+
8987
+
8988
+
8989
+
8990
+
8991
+
8992
+
8993
+
8994
+
8995
+
8996
+
8997
+
8998
+
8999
+ 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 29/29 [00:56<00:00, 1.81s/it]
9000
+
9001
+ Configuration saved in ./checkpoint-8500/config.json
9002
+ Model weights saved in ./checkpoint-8500/pytorch_model.bin
9003
+ Configuration saved in ./checkpoint-8500/preprocessor_config.json
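The `eval_wer` value logged above (0.4099…) is the word error rate: word-level edit distance between the reference transcript and the model's hypothesis, divided by the number of reference words. A minimal sketch of the standard dynamic-programming computation (the training run itself presumably uses a library metric; this is for illustration only):

```python
# Word error rate: Levenshtein distance over words, normalized by
# the reference length. wer = (S + D + I) / N.
def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between ref[:i] and hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# One substitution out of three reference words -> 1/3.
print(wer("the cat sat", "the cat sit"))
```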
wandb/run-20220129_131141-h6nhqm30/files/wandb-summary.json CHANGED
The diff for this file is too large to render. See raw diff
 
wandb/run-20220129_131141-h6nhqm30/logs/debug-internal.log CHANGED
The diff for this file is too large to render. See raw diff
 
wandb/run-20220129_131141-h6nhqm30/run-h6nhqm30.wandb CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:65aa39669da36f28c6653cdb0f6c9fe3d2f0eba29768f21178ba8144dbee3edf
- size 58269618
+ oid sha256:624404f6778fa63cf42e5d65fc81506fe3d0200ae05be92d55229cbe4a878ab4
+ size 62254538