AlexN committed on
Commit 4239b0f • 1 Parent(s): 1c3a61f

Training in progress, step 13500
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:53344aa3fbfefd43146561feb4fa3eb830c7f27886570a5f85e506bea94e7511
+oid sha256:7be356c0416d66c909300c8a24b255d6bc972bb2572b661bdcc3e0167f8aaba0
 size 1262817457
wandb/run-20220129_215451-1vipdbow/files/output.log CHANGED
@@ -15404,3 +15404,593 @@ Configuration saved in ./checkpoint-13000/config.json
 {'eval_loss': 0.23965783417224884, 'eval_wer': 0.3697904786731402, 'eval_runtime': 296.6078, 'eval_samples_per_second': 19.527, 'eval_steps_per_second': 0.307, 'epoch': 1.88}
 Model weights saved in ./checkpoint-13000/pytorch_model.bin
 Configuration saved in ./checkpoint-13000/preprocessor_config.json
+Configuration saved in ./preprocessor_config.json
+Deleting older checkpoint [checkpoint-12000] due to args.save_total_limit
+ 95% | 13099/13822 [14:10:56<29:12, 2.42s/it]
+ 95% | 13200/13822 [14:16:07<25:03, 2.42s/it]
+ 96% | 13299/13822 [14:21:16<21:13, 2.44s/it]
+ 97% | 13399/13822 [14:26:30<17:29, 2.48s/it]
+ 98% | 13500/13822 [14:31:44<13:14, 2.47s/it]
+The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
+***** Running Evaluation *****
+  Num examples = 5792
+  Batch size = 64
+{'loss': 0.9271, 'learning_rate': 2.71871449440026e-06, 'epoch': 1.95}
+100% | 91/91 [04:06<00:00, 2.31s/it]
+Configuration saved in ./checkpoint-13500/config.json
+Model weights saved in ./checkpoint-13500/pytorch_model.bin
+Configuration saved in ./checkpoint-13500/preprocessor_config.json
wandb/run-20220129_215451-1vipdbow/files/wandb-summary.json CHANGED
The diff for this file is too large to render. See raw diff
 
wandb/run-20220129_215451-1vipdbow/logs/debug-internal.log CHANGED
The diff for this file is too large to render. See raw diff
 
wandb/run-20220129_215451-1vipdbow/run-1vipdbow.wandb CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:7f60b1d3e5c98f817df9ee12e5f0c570e79ea375a520c38899d66a70a2c7d02e
-size 97915340
+oid sha256:7195c49ec5c00f9774a22e41f6b170281eed92a46d04c69a0b70ffa6a3274c54
+size 102030924