2022-02-28 14:41:13,233 INFO    MainThread:224433 [wandb_setup.py:_flush():75] Loading settings from /home/sanchit_huggingface_co/.config/wandb/settings
2022-02-28 14:41:13,233 INFO    MainThread:224433 [wandb_setup.py:_flush():75] Loading settings from /home/sanchit_huggingface_co/wav2vec2-gpt2-wandb-grid-search/wandb/settings
2022-02-28 14:41:13,233 INFO    MainThread:224433 [wandb_setup.py:_flush():75] Loading settings from environment variables: {}
2022-02-28 14:41:13,233 INFO    MainThread:224433 [wandb_setup.py:_flush():75] Inferring run settings from compute environment: {'program_relpath': 'run_speech_recognition_seq2seq.py', 'program': '/home/sanchit_huggingface_co/wav2vec2-gpt2-wandb-grid-search/run_speech_recognition_seq2seq.py'}
2022-02-28 14:41:13,234 INFO    MainThread:224433 [wandb_init.py:_log_setup():386] Logging user logs to /home/sanchit_huggingface_co/wav2vec2-gpt2-wandb-grid-search/wandb/run-20220228_144113-18osrjm3/logs/debug.log
2022-02-28 14:41:13,234 INFO    MainThread:224433 [wandb_init.py:_log_setup():387] Logging internal logs to /home/sanchit_huggingface_co/wav2vec2-gpt2-wandb-grid-search/wandb/run-20220228_144113-18osrjm3/logs/debug-internal.log
2022-02-28 14:41:13,234 INFO    MainThread:224433 [wandb_init.py:init():420] calling init triggers
2022-02-28 14:41:13,234 INFO    MainThread:224433 [wandb_init.py:init():425] wandb.init called with sweep_config: {}
config: {}
2022-02-28 14:41:13,234 INFO    MainThread:224433 [wandb_init.py:init():471] starting backend
2022-02-28 14:41:13,234 INFO    MainThread:224433 [backend.py:_multiprocessing_setup():99] multiprocessing start_methods=fork,spawn,forkserver, using: spawn
2022-02-28 14:41:13,293 INFO    MainThread:224433 [backend.py:ensure_launched():219] starting backend process...
2022-02-28 14:41:13,350 INFO    MainThread:224433 [backend.py:ensure_launched():224] started backend process with pid: 224575
2022-02-28 14:41:13,352 INFO    MainThread:224433 [wandb_init.py:init():480] backend started and connected
2022-02-28 14:41:13,363 INFO    MainThread:224433 [wandb_init.py:init():550] updated telemetry
2022-02-28 14:41:13,498 INFO    MainThread:224433 [wandb_init.py:init():581] communicating current version
2022-02-28 14:41:14,258 INFO    MainThread:224433 [wandb_init.py:init():586] got version response 
2022-02-28 14:41:14,258 INFO    MainThread:224433 [wandb_init.py:init():596] communicating run to backend with 30 second timeout
2022-02-28 14:41:14,354 INFO    MainThread:224433 [wandb_init.py:init():624] starting run threads in backend
2022-02-28 14:41:14,469 INFO    MainThread:224433 [wandb_run.py:_console_start():1827] atexit reg
2022-02-28 14:41:14,469 INFO    MainThread:224433 [wandb_run.py:_redirect():1701] redirect: SettingsConsole.REDIRECT
2022-02-28 14:41:14,470 INFO    MainThread:224433 [wandb_run.py:_redirect():1706] Redirecting console.
2022-02-28 14:41:14,472 INFO    MainThread:224433 [wandb_run.py:_redirect():1762] Redirects installed.
2022-02-28 14:41:14,472 INFO    MainThread:224433 [wandb_init.py:init():651] run started, returning control to user process
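The init sequence above is driven by the Hugging Face Trainer rather than an explicit wandb.init call in user code: the training-arguments dump below shows report_to: ['wandb'], so transformers' WandbCallback performs the setup. A minimal sketch of the equivalent explicit call; the stand-in below is an assumption, the real run was started by run_speech_recognition_seq2seq.py:

import wandb

# Hypothetical stand-in for what transformers.integrations.WandbCallback
# does on setup. Settings are resolved in the order logged above:
# ~/.config/wandb/settings, then the repo-local wandb/settings, then
# environment variables.
run = wandb.init(
    dir=".",  # run files land in wandb/run-<timestamp>-<id>/ under this dir
    settings=wandb.Settings(start_method="spawn"),  # matches "using: spawn" above
)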
2022-02-28 14:41:14,474 INFO    MainThread:224433 [wandb_run.py:_config_callback():966] config_cb None None {'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'torch.float32', 'use_bfloat16': False, 'pruned_heads': {}, 'tie_word_embeddings': False, 'is_encoder_decoder': True, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 50, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'architectures': ['SpeechEncoderDecoderModel'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'bos_token_id': None, 'pad_token_id': 50256, 'eos_token_id': 50256, 'sep_token_id': None, 'decoder_start_token_id': 50256, 'task_specific_params': None, 'problem_type': None, '_name_or_path': './', 'transformers_version': None, 'decoder': {'vocab_size': 50257, 'n_positions': 1024, 'n_embd': 1024, 'n_layer': 24, 'n_head': 16, 'n_inner': None, 'activation_function': 'gelu_new', 'resid_pdrop': 0.0, 'embd_pdrop': 0.0, 'attn_pdrop': 0.0, 'layer_norm_epsilon': 1e-05, 'initializer_range': 0.02, 'summary_type': 'cls_index', 'summary_use_proj': True, 'summary_activation': None, 'summary_first_dropout': 0.0, 'summary_proj_to_labels': True, 'scale_attn_weights': True, 'use_cache': False, 'scale_attn_by_inverse_layer_idx': False, 'reorder_and_upcast_attn': False, 'bos_token_id': 50256, 'eos_token_id': 50256, 'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': None, 'use_bfloat16': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': True, 'cross_attention_hidden_size': None, 'add_cross_attention': True, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'architectures': ['GPT2LMHeadModel'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'pad_token_id': None, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': {'text-generation': {'do_sample': True, 'max_length': 50}}, 'problem_type': None, '_name_or_path': 'gpt2-medium', 'transformers_version': '4.17.0.dev0', 'n_ctx': 1024, 'n_special': 0, 'predict_special_tokens': True, 'model_type': 'gpt2'}, 'encoder': {'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': None, 'use_bfloat16': False, 
'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'architectures': ['Wav2Vec2ForPreTraining'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'bos_token_id': 1, 'pad_token_id': 0, 'eos_token_id': 2, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': 'facebook/wav2vec2-large-lv60', 'transformers_version': '4.17.0.dev0', 'feat_extract_dropout': 0.0, 'gradient_checkpointing': False, 'hidden_dropout_prob': 0.0, 'num_feat_extract_layers': 7, 'hidden_size': 1024, 'feat_extract_norm': 'layer', 'feat_extract_activation': 'gelu', 'conv_dim': [512, 512, 512, 512, 512, 512, 512], 'conv_stride': [5, 2, 2, 2, 2, 2, 2], 'conv_kernel': [10, 3, 3, 3, 3, 2, 2], 'conv_bias': True, 'num_conv_pos_embeddings': 128, 'num_conv_pos_embedding_groups': 16, 'num_hidden_layers': 24, 'intermediate_size': 4096, 'hidden_act': 'gelu', 'num_attention_heads': 16, 'hidden_dropout': 0.0, 'attention_dropout': 0.0, 'activation_dropout': 0.0, 'feat_proj_dropout': 0.0, 'final_dropout': 0.0, 'layerdrop': 0.0, 'layer_norm_eps': 1e-05, 'initializer_range': 0.02, 'vocab_size': 32, 'do_stable_layer_norm': True, 'use_weighted_layer_sum': False, 'apply_spec_augment': False, 'mask_time_prob': 0.0, 'mask_time_length': 10, 'mask_time_min_masks': 2, 'mask_feature_prob': 0.0, 'mask_feature_length': 10, 'mask_feature_min_masks': 0, 'num_codevectors_per_group': 320, 'num_codevector_groups': 2, 'contrastive_logits_temperature': 0.1, 'feat_quantizer_dropout': 0.0, 'num_negatives': 100, 'codevector_dim': 768, 'proj_codevector_dim': 768, 'diversity_loss_weight': 0.1, 'ctc_loss_reduction': 'sum', 'ctc_zero_infinity': False, 'add_adapter': True, 'adapter_kernel_size': 3, 'adapter_stride': 2, 'num_adapter_layers': 3, 'output_hidden_size': 1024, 'classifier_proj_size': 256, 'tdnn_dim': [512, 512, 512, 512, 1500], 'tdnn_kernel': [5, 3, 3, 1, 1], 'tdnn_dilation': [1, 2, 3, 1, 1], 'xvector_output_dim': 512, 'model_type': 'wav2vec2'}, 'model_type': 'speech-encoder-decoder', 'processor_class': 'Wav2Vec2Processor', 'use_cache': False, 'output_dir': './', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': True, 'do_predict': False, 'evaluation_strategy': 'steps', 'prediction_loss_only': False, 'per_device_train_batch_size': 8, 'per_device_eval_batch_size': 8, 'per_gpu_train_batch_size': 'None', 'per_gpu_eval_batch_size': 'None', 'gradient_accumulation_steps': 8, 'eval_accumulation_steps': 'None', 'learning_rate': 1e-05, 'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 1.0, 'max_steps': -1, 'lr_scheduler_type': 'linear', 'warmup_ratio': 0.0, 'warmup_steps': 500, 'log_level': -1, 'log_level_replica': -1, 'log_on_each_node': True, 
'logging_dir': './runs/Feb28_14-40-31_sanchit--v100', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 1, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 500, 'save_total_limit': 1, 'save_on_each_node': False, 'no_cuda': False, 'seed': 42, 'bf16': False, 'fp16': True, 'fp16_opt_level': 'O1', 'half_precision_backend': 'amp', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': 'None', 'local_rank': -1, 'xpu_backend': 'None', 'tpu_num_cores': 'None', 'tpu_metrics_debug': False, 'debug': '[]', 'dataloader_drop_last': False, 'eval_steps': 500, 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': './', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': 'None', 'load_best_model_at_end': False, 'metric_for_best_model': 'None', 'greater_is_better': 'None', 'ignore_data_skip': False, 'sharded_ddp': '[]', 'deepspeed': 'None', 'label_smoothing_factor': 0.0, 'optim': 'adamw_hf', 'adafactor': False, 'group_by_length': True, 'length_column_name': 'input_length', 'report_to': "['wandb']", 'ddp_find_unused_parameters': 'None', 'ddp_bucket_cap_mb': 'None', 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': True, 'resume_from_checkpoint': 'None', 'hub_model_id': 'None', 'hub_strategy': 'every_save', 'hub_token': '<HUB_TOKEN>', 'gradient_checkpointing': True, 'fp16_backend': 'auto', 'push_to_hub_model_id': 'None', 'push_to_hub_organization': 'None', 'push_to_hub_token': '<PUSH_TO_HUB_TOKEN>', '_n_gpu': 1, 'mp_parameters': '', 'sortish_sampler': False, 'predict_with_generate': True, 'generation_max_length': 40, 'generation_num_beams': 1, 'train_batch_size': 8, 'eval_batch_size': 8}
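The config_cb dump above records a SpeechEncoderDecoderModel pairing a facebook/wav2vec2-large-lv60 encoder (with a convolutional adapter) and a gpt2-medium decoder, followed by the run's training arguments. A minimal sketch, assuming the standard transformers encoder-decoder API, of a setup that would produce this config; the hyperparameter values are taken from the log, but the construction itself is an assumption, not code from the run:

from transformers import SpeechEncoderDecoderModel, Seq2SeqTrainingArguments

# Model: wav2vec2 encoder + GPT-2 decoder with cross-attention, as in the
# logged 'architectures': ['SpeechEncoderDecoderModel'].
model = SpeechEncoderDecoderModel.from_encoder_decoder_pretrained(
    "facebook/wav2vec2-large-lv60",
    "gpt2-medium",
    encoder_add_adapter=True,  # encoder config shows 'add_adapter': True
)

# Token ids and generation defaults from the logged top-level config;
# GPT-2 has no pad token, so its EOS id (50256) is reused.
model.config.decoder_start_token_id = 50256
model.config.pad_token_id = 50256
model.config.eos_token_id = 50256
model.config.max_length = 50
model.config.use_cache = False

# Training arguments matching the logged values (effective batch size
# 8 * 8 = 64 via gradient accumulation).
training_args = Seq2SeqTrainingArguments(
    output_dir="./",
    do_train=True,
    do_eval=True,
    evaluation_strategy="steps",
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=8,
    learning_rate=1e-5,
    warmup_steps=500,
    num_train_epochs=1.0,
    logging_steps=1,
    eval_steps=500,
    save_steps=500,
    save_total_limit=1,
    fp16=True,
    gradient_checkpointing=True,
    group_by_length=True,
    length_column_name="input_length",
    predict_with_generate=True,
    generation_max_length=40,
    generation_num_beams=1,
    report_to=["wandb"],
    push_to_hub=True,
)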
2022-02-28 14:41:14,477 INFO    MainThread:224433 [wandb_watch.py:watch():43] Watching
2022-02-28 16:32:24,188 INFO    MainThread:224433 [wandb_run.py:_atexit_cleanup():1797] got exitcode: 1
2022-02-28 16:32:24,191 INFO    MainThread:224433 [wandb_run.py:_restore():1769] restore
2022-02-28 16:32:26,392 INFO    MainThread:224433 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
  wandb_count: 1
}
pusher_stats {
  uploaded_bytes: 2093
  total_bytes: 2093
}

2022-02-28 16:32:26,512 INFO    MainThread:224433 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
  wandb_count: 1
}
pusher_stats {
  uploaded_bytes: 2093
  total_bytes: 2093
}

2022-02-28 16:32:26,665 INFO    MainThread:224433 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
  wandb_count: 1
}
pusher_stats {
  uploaded_bytes: 2093
  total_bytes: 2093
}

2022-02-28 16:32:27,493 INFO    MainThread:224433 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
  wandb_count: 1
}
pusher_stats {
  uploaded_bytes: 2093
  total_bytes: 2093
}

2022-02-28 16:32:27,717 INFO    MainThread:224433 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
  wandb_count: 2
}
pusher_stats {
  uploaded_bytes: 2093
  total_bytes: 972408
}

2022-02-28 16:32:27,819 INFO    MainThread:224433 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
  wandb_count: 5
}
pusher_stats {
  uploaded_bytes: 2093
  total_bytes: 2832309
}

2022-02-28 16:32:27,921 INFO    MainThread:224433 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
  wandb_count: 5
}
pusher_stats {
  uploaded_bytes: 2832309
  total_bytes: 2832309
}

2022-02-28 16:32:28,022 INFO    MainThread:224433 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
  wandb_count: 5
}
pusher_stats {
  uploaded_bytes: 2832309
  total_bytes: 2832309
}

2022-02-28 16:32:28,124 INFO    MainThread:224433 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
  wandb_count: 5
}
pusher_stats {
  uploaded_bytes: 2832309
  total_bytes: 2832309
}

2022-02-28 16:32:28,226 INFO    MainThread:224433 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
  wandb_count: 5
}
pusher_stats {
  uploaded_bytes: 2832309
  total_bytes: 2832309
}

2022-02-28 16:32:29,750 INFO    MainThread:224433 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
  wandb_count: 5
}
pusher_stats {
  uploaded_bytes: 2832309
  total_bytes: 2832309
}

2022-02-28 16:32:29,922 INFO    MainThread:224433 [wandb_run.py:_wait_for_finish():1929] got exit ret: done: true
exit_result {
}
file_counts {
  wandb_count: 5
}
pusher_stats {
  uploaded_bytes: 2832309
  total_bytes: 2832309
}
local_info {
}

2022-02-28 16:32:31,095 INFO    MainThread:224433 [wandb_run.py:_append_history():2144] rendering history
2022-02-28 16:32:31,096 INFO    MainThread:224433 [wandb_run.py:_append_summary():2102] rendering summary
2022-02-28 16:32:31,097 INFO    MainThread:224433 [wandb_run.py:_append_files():2194] logging synced files
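The exit sequence above (got exitcode: 1, then file-pusher polling until done: true) is wandb's atexit cleanup for a script that terminated with a non-zero status. A hypothetical explicit equivalent, assuming the public finish API; the real run relied on the registered atexit hook rather than this call:

import wandb

# Stand-in for the atexit cleanup recorded above: mark the run failed
# (exit code 1) and block until the file pusher has flushed all files
# (5 files, 2,832,309 bytes in the log).
wandb.finish(exit_code=1)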