2022-01-27 12:12:47,332 INFO    MainThread:15970 [wandb_setup.py:_flush():69] setting env: {}
2022-01-27 12:12:47,332 INFO    MainThread:15970 [wandb_setup.py:_flush():69] setting login settings: {}
2022-01-27 12:12:47,332 INFO    MainThread:15970 [wandb_init.py:_log_setup():342] Logging user logs to /home/patrick/experiments/xls-r-300m-sv-cv8/wandb/run-20220127_121247-o8za41vn/logs/debug.log
2022-01-27 12:12:47,332 INFO    MainThread:15970 [wandb_init.py:_log_setup():343] Logging internal logs to /home/patrick/experiments/xls-r-300m-sv-cv8/wandb/run-20220127_121247-o8za41vn/logs/debug-internal.log
2022-01-27 12:12:47,333 INFO    MainThread:15970 [wandb_init.py:init():375] calling init triggers
2022-01-27 12:12:47,333 INFO    MainThread:15970 [wandb_init.py:init():380] wandb.init called with sweep_config: {}
config: {}
2022-01-27 12:12:47,333 INFO    MainThread:15970 [wandb_init.py:init():424] starting backend
2022-01-27 12:12:47,333 INFO    MainThread:15970 [backend.py:_multiprocessing_setup():70] multiprocessing start_methods=fork,spawn,forkserver, using: spawn
2022-01-27 12:12:47,360 INFO    MainThread:15970 [backend.py:ensure_launched():135] starting backend process...
2022-01-27 12:12:47,383 INFO    MainThread:15970 [backend.py:ensure_launched():139] started backend process with pid: 16113
2022-01-27 12:12:47,384 INFO    MainThread:15970 [wandb_init.py:init():429] backend started and connected
2022-01-27 12:12:47,385 INFO    MainThread:15970 [wandb_init.py:init():477] updated telemetry
2022-01-27 12:12:47,385 INFO    MainThread:15970 [wandb_init.py:init():500] communicating current version
2022-01-27 12:12:47,846 INFO    MainThread:15970 [wandb_init.py:init():505] got version response upgrade_message: "wandb version 0.12.9 is available!  To upgrade, please run:\n $ pip install wandb --upgrade"

2022-01-27 12:12:47,846 INFO    MainThread:15970 [wandb_init.py:init():513] communicating run to backend with 30 second timeout
2022-01-27 12:12:48,083 INFO    MainThread:15970 [wandb_init.py:init():540] starting run threads in backend
2022-01-27 12:12:53,087 INFO    MainThread:15970 [wandb_run.py:_console_start():1601] atexit reg
2022-01-27 12:12:53,087 INFO    MainThread:15970 [wandb_run.py:_redirect():1475] redirect: SettingsConsole.REDIRECT
2022-01-27 12:12:53,087 INFO    MainThread:15970 [wandb_run.py:_redirect():1480] Redirecting console.
2022-01-27 12:12:53,089 INFO    MainThread:15970 [wandb_run.py:_redirect():1536] Redirects installed.
2022-01-27 12:12:53,089 INFO    MainThread:15970 [wandb_init.py:init():565] run started, returning control to user process
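
The init sequence above is what a single wandb.init() call produces; the Hugging Face Trainer's WandbCallback makes that call internally when report_to includes "wandb". A minimal user-side sketch (offline mode is an assumption here, used only so the snippet runs without credentials):

    import wandb

    # Triggers the backend spawn, telemetry update, version check, and
    # console redirection recorded in the log lines above.
    run = wandb.init(mode="offline")  # assumption: offline, no login needed
    print(run.id)
    run.finish()
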
2022-01-27 12:12:53,091 INFO    MainThread:15970 [wandb_run.py:_config_callback():843] config_cb None None {'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'float32', 'use_bfloat16': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'architectures': ['Wav2Vec2ForPreTraining'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'bos_token_id': 1, 'pad_token_id': 34, 'eos_token_id': 2, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': 'facebook/wav2vec2-xls-r-300m', 'transformers_version': '4.16.0.dev0', 'feat_extract_dropout': 0.0, 'model_type': 'wav2vec2', 'num_feat_extract_layers': 7, 'hidden_size': 1024, 'feat_extract_norm': 'layer', 'feat_extract_activation': 'gelu', 'conv_dim': [512, 512, 512, 512, 512, 512, 512], 'conv_stride': [5, 2, 2, 2, 2, 2, 2], 'conv_kernel': [10, 3, 3, 3, 3, 2, 2], 'conv_bias': True, 'num_conv_pos_embeddings': 128, 'num_conv_pos_embedding_groups': 16, 'num_hidden_layers': 24, 'intermediate_size': 4096, 'hidden_act': 'gelu', 'num_attention_heads': 16, 'hidden_dropout': 0.0, 'attention_dropout': 0.0, 'activation_dropout': 0.1, 'feat_proj_dropout': 0.0, 'final_dropout': 0.0, 'layerdrop': 0.0, 'layer_norm_eps': 1e-05, 'initializer_range': 0.02, 'vocab_size': 37, 'do_stable_layer_norm': True, 'use_weighted_layer_sum': False, 'apply_spec_augment': True, 'mask_time_prob': 0.75, 'mask_time_length': 10, 'mask_time_min_masks': 2, 'mask_feature_prob': 0.25, 'mask_feature_length': 64, 'mask_feature_min_masks': 0, 'num_codevectors_per_group': 320, 'num_codevector_groups': 2, 'contrastive_logits_temperature': 0.1, 'feat_quantizer_dropout': 0.0, 'num_negatives': 100, 'codevector_dim': 768, 'proj_codevector_dim': 768, 'diversity_loss_weight': 0.1, 'ctc_loss_reduction': 'mean', 'ctc_zero_infinity': False, 'add_adapter': False, 'adapter_kernel_size': 3, 'adapter_stride': 2, 'num_adapter_layers': 3, 'output_hidden_size': 1024, 'classifier_proj_size': 256, 'tdnn_dim': [512, 512, 512, 512, 1500], 'tdnn_kernel': [5, 3, 3, 1, 1], 'tdnn_dilation': [1, 2, 3, 1, 1], 'xvector_output_dim': 512, 'output_dir': './', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': True, 'do_predict': False, 'evaluation_strategy': 'steps', 'prediction_loss_only': False, 'per_device_train_batch_size': 8, 'per_device_eval_batch_size': 8, 'per_gpu_train_batch_size': 'None', 'per_gpu_eval_batch_size': 'None', 'gradient_accumulation_steps': 4, 'eval_accumulation_steps': 'None', 'learning_rate': 7.5e-05, 'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 50.0, 'max_steps': -1, 'lr_scheduler_type': 'linear', 'warmup_ratio': 0.0, 'warmup_steps': 2000, 'log_level': -1, 'log_level_replica': -1, 'log_on_each_node': True, 'logging_dir': './runs/Jan27_12-08-15_brutasse', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 100, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 500, 'save_total_limit': 3, 'save_on_each_node': False, 'no_cuda': False, 'seed': 42, 'bf16': False, 'fp16': True, 'fp16_opt_level': 'O1', 'half_precision_backend': 'amp', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': 'None', 'local_rank': -1, 'xpu_backend': 'None', 'tpu_num_cores': 'None', 'tpu_metrics_debug': False, 'debug': '[]', 'dataloader_drop_last': False, 'eval_steps': 500, 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': './', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': 'None', 'load_best_model_at_end': False, 'metric_for_best_model': 'None', 'greater_is_better': 'None', 'ignore_data_skip': False, 'sharded_ddp': '[]', 'deepspeed': 'None', 'label_smoothing_factor': 0.0, 'optim': 'adamw_hf', 'adafactor': False, 'group_by_length': True, 'length_column_name': 'input_length', 'report_to': "['tensorboard', 'wandb', 'codecarbon']", 'ddp_find_unused_parameters': 'None', 'ddp_bucket_cap_mb': 'None', 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': True, 'resume_from_checkpoint': 'None', 'hub_model_id': 'None', 'hub_strategy': 'every_save', 'hub_token': '<HUB_TOKEN>', 'gradient_checkpointing': True, 'fp16_backend': 'auto', 'push_to_hub_model_id': 'None', 'push_to_hub_organization': 'None', 'push_to_hub_token': '<PUSH_TO_HUB_TOKEN>', '_n_gpu': 1, 'mp_parameters': '', 'train_batch_size': 8, 'eval_batch_size': 8}
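
The config_cb entry above flattens the Wav2Vec2 model config and the TrainingArguments into a single dict for wandb. A minimal sketch of the key training arguments as they would be constructed in user code (values copied from the log; the constructor shape is an assumption based on the transformers 4.16-era API):

    from transformers import TrainingArguments

    training_args = TrainingArguments(
        output_dir="./",
        overwrite_output_dir=True,
        do_train=True,
        do_eval=True,
        evaluation_strategy="steps",
        per_device_train_batch_size=8,
        per_device_eval_batch_size=8,
        gradient_accumulation_steps=4,  # effective batch size: 8 * 4 = 32
        learning_rate=7.5e-05,
        lr_scheduler_type="linear",
        warmup_steps=2000,
        num_train_epochs=50.0,
        fp16=True,
        gradient_checkpointing=True,
        group_by_length=True,
        length_column_name="input_length",
        logging_steps=100,
        eval_steps=500,
        save_steps=500,
        save_total_limit=3,
        push_to_hub=True,
        report_to=["tensorboard", "wandb", "codecarbon"],
    )
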
2022-01-27 12:12:53,094 INFO    MainThread:15970 [wandb_watch.py:watch():43] Watching
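
The "Watching" entry corresponds to a wandb.watch() call on the model, which registers hooks so gradients and parameters are logged during training. A runnable sketch with a stand-in module (the real run watches the Wav2Vec2 model; log="gradients" and log_freq=100 are assumptions mirroring logging_steps above):

    import torch
    import wandb

    model = torch.nn.Linear(10, 2)  # stand-in for the actual Wav2Vec2 model
    run = wandb.init(mode="offline")  # assumption: offline, credential-free demo
    wandb.watch(model, log="gradients", log_freq=100)
    run.finish()
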
2022-01-27 20:36:06,837 INFO    MainThread:15970 [wandb_run.py:_atexit_cleanup():1571] got exitcode: 0
2022-01-27 20:36:06,838 INFO    MainThread:15970 [wandb_run.py:_restore():1543] restore
2022-01-27 20:36:09,501 INFO    MainThread:15970 [wandb_run.py:_wait_for_finish():1693] got exit ret: file_counts {
  wandb_count: 1
}
pusher_stats {
  uploaded_bytes: 2244
  total_bytes: 2244
}

2022-01-27 20:36:10,037 INFO    MainThread:15970 [wandb_run.py:_wait_for_finish():1693] got exit ret: file_counts {
  wandb_count: 1
}
pusher_stats {
  uploaded_bytes: 2244
  total_bytes: 2244
}

2022-01-27 20:36:10,627 INFO    MainThread:15970 [wandb_run.py:_wait_for_finish():1693] got exit ret: file_counts {
  wandb_count: 6
}
pusher_stats {
  uploaded_bytes: 2244
  total_bytes: 2750979
}

2022-01-27 20:36:10,729 INFO    MainThread:15970 [wandb_run.py:_wait_for_finish():1693] got exit ret: file_counts {
  wandb_count: 6
}
pusher_stats {
  uploaded_bytes: 2244
  total_bytes: 2750979
}

2022-01-27 20:36:10,830 INFO    MainThread:15970 [wandb_run.py:_wait_for_finish():1693] got exit ret: file_counts {
  wandb_count: 6
}
pusher_stats {
  uploaded_bytes: 515593
  total_bytes: 2750979
}

2022-01-27 20:36:10,931 INFO    MainThread:15970 [wandb_run.py:_wait_for_finish():1693] got exit ret: file_counts {
  wandb_count: 6
}
pusher_stats {
  uploaded_bytes: 1897531
  total_bytes: 2750979
}

2022-01-27 20:36:11,033 INFO    MainThread:15970 [wandb_run.py:_wait_for_finish():1693] got exit ret: file_counts {
  wandb_count: 6
}
pusher_stats {
  uploaded_bytes: 2750979
  total_bytes: 2750979
}

2022-01-27 20:36:11,135 INFO    MainThread:15970 [wandb_run.py:_wait_for_finish():1693] got exit ret: file_counts {
  wandb_count: 6
}
pusher_stats {
  uploaded_bytes: 2750979
  total_bytes: 2750979
}

2022-01-27 20:36:11,237 INFO    MainThread:15970 [wandb_run.py:_wait_for_finish():1693] got exit ret: file_counts {
  wandb_count: 6
}
pusher_stats {
  uploaded_bytes: 2750979
  total_bytes: 2750979
}

2022-01-27 20:36:11,339 INFO    MainThread:15970 [wandb_run.py:_wait_for_finish():1693] got exit ret: file_counts {
  wandb_count: 6
}
pusher_stats {
  uploaded_bytes: 2750979
  total_bytes: 2750979
}

2022-01-27 20:36:11,441 INFO    MainThread:15970 [wandb_run.py:_wait_for_finish():1693] got exit ret: file_counts {
  wandb_count: 6
}
pusher_stats {
  uploaded_bytes: 2750979
  total_bytes: 2750979
}

2022-01-27 20:36:12,158 INFO    MainThread:15970 [wandb_run.py:_wait_for_finish():1693] got exit ret: file_counts {
  wandb_count: 6
}
pusher_stats {
  uploaded_bytes: 2750979
  total_bytes: 2750979
}

2022-01-27 20:36:12,260 INFO    MainThread:15970 [wandb_run.py:_wait_for_finish():1693] got exit ret: done: true
exit_result {
}
file_counts {
  wandb_count: 6
}
pusher_stats {
  uploaded_bytes: 2750979
  total_bytes: 2750979
}

2022-01-27 20:36:13,565 INFO    MainThread:15970 [wandb_run.py:_show_summary():1848] rendering summary
2022-01-27 20:36:13,566 INFO    MainThread:15970 [wandb_run.py:_show_history():1886] rendering history
2022-01-27 20:36:13,567 INFO    MainThread:15970 [wandb_run.py:_show_files():1915] logging synced files