2022-01-09 22:02:41,000 INFO MainThread:31164 [wandb_setup.py:_flush():69] setting env: {}
2022-01-09 22:02:41,000 INFO MainThread:31164 [wandb_setup.py:_flush():69] setting login settings: {}
2022-01-09 22:02:41,000 INFO MainThread:31164 [wandb_init.py:_log_setup():342] Logging user logs to /home/patrick/hugging_face/examples/xls-r-300m-sv/wandb/run-20220109_220240-1g372i3v/logs/debug.log
2022-01-09 22:02:41,000 INFO MainThread:31164 [wandb_init.py:_log_setup():343] Logging internal logs to /home/patrick/hugging_face/examples/xls-r-300m-sv/wandb/run-20220109_220240-1g372i3v/logs/debug-internal.log
2022-01-09 22:02:41,001 INFO MainThread:31164 [wandb_init.py:_jupyter_setup():294] configuring jupyter hooks
2022-01-09 22:02:41,001 INFO MainThread:31164 [wandb_init.py:init():375] calling init triggers
2022-01-09 22:02:41,001 INFO MainThread:31164 [wandb_init.py:init():380] wandb.init called with sweep_config: {} config: {}
2022-01-09 22:02:41,001 INFO MainThread:31164 [wandb_init.py:init():424] starting backend
2022-01-09 22:02:41,001 INFO MainThread:31164 [backend.py:_multiprocessing_setup():70] multiprocessing start_methods=fork,spawn,forkserver, using: spawn
2022-01-09 22:02:41,019 INFO MainThread:31164 [backend.py:ensure_launched():135] starting backend process...
2022-01-09 22:02:41,034 INFO MainThread:31164 [backend.py:ensure_launched():139] started backend process with pid: 31514
2022-01-09 22:02:41,035 INFO MainThread:31164 [wandb_init.py:init():429] backend started and connected
2022-01-09 22:02:41,036 INFO MainThread:31164 [wandb_run.py:_label_probe_notebook():815] probe notebook
2022-01-09 22:02:41,036 INFO MainThread:31164 [wandb_run.py:_label_probe_notebook():822] Unable to probe notebook: 'NoneType' object has no attribute 'get'
2022-01-09 22:02:41,036 INFO MainThread:31164 [wandb_init.py:init():477] updated telemetry
2022-01-09 22:02:41,037 INFO MainThread:31164 [wandb_init.py:init():500] communicating current version
2022-01-09 22:02:41,464 INFO MainThread:31164 [wandb_init.py:init():505] got version response upgrade_message: "wandb version 0.12.9 is available! To upgrade, please run:\n $ pip install wandb --upgrade"
2022-01-09 22:02:41,464 INFO MainThread:31164 [wandb_init.py:init():513] communicating run to backend with 30 second timeout
2022-01-09 22:02:41,634 INFO MainThread:31164 [wandb_init.py:init():540] starting run threads in backend
2022-01-09 22:02:46,638 INFO MainThread:31164 [wandb_run.py:_console_start():1601] atexit reg
2022-01-09 22:02:46,638 INFO MainThread:31164 [wandb_run.py:_redirect():1475] redirect: SettingsConsole.WRAP
2022-01-09 22:02:46,639 INFO MainThread:31164 [wandb_run.py:_redirect():1512] Wrapping output streams.
2022-01-09 22:02:46,639 INFO MainThread:31164 [wandb_run.py:_redirect():1536] Redirects installed.
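The sequence above is the normal wandb.init() startup: settings flush, backend process spawn, version check, and console redirection. The "Unable to probe notebook" message only means the Jupyter probe found no notebook front end and is harmless when running a plain script. A minimal Python sketch of the call that produces a debug.log like this one (the project name is an assumption; the log does not record it):

    import wandb

    # init spawns the backend process ("started backend process with
    # pid: ...") and installs the console wrappers ("Redirects
    # installed.") seen in the log above.
    run = wandb.init(project="xls-r-300m-sv")  # project name assumed

    # ... training runs here; wandb.log({...}) records metrics ...

    run.finish()  # flushes data and shuts down the backend process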
2022-01-09 22:02:46,639 INFO MainThread:31164 [wandb_init.py:init():565] run started, returning control to user process
2022-01-09 22:02:46,642 INFO MainThread:31164 [wandb_run.py:_config_callback():843] config_cb None None {'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'float32', 'use_bfloat16': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'architectures': ['Wav2Vec2ForPreTraining'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'bos_token_id': 1, 'pad_token_id': 34, 'eos_token_id': 2, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': 'facebook/wav2vec2-xls-r-300m', 'transformers_version': '4.16.0.dev0', 'feat_extract_dropout': 0.0, 'model_type': 'wav2vec2', 'num_feat_extract_layers': 7, 'hidden_size': 1024, 'feat_extract_norm': 'layer', 'feat_extract_activation': 'gelu', 'conv_dim': [512, 512, 512, 512, 512, 512, 512], 'conv_stride': [5, 2, 2, 2, 2, 2, 2], 'conv_kernel': [10, 3, 3, 3, 3, 2, 2], 'conv_bias': True, 'num_conv_pos_embeddings': 128, 'num_conv_pos_embedding_groups': 16, 'num_hidden_layers': 24, 'intermediate_size': 4096, 'hidden_act': 'gelu', 'num_attention_heads': 16, 'hidden_dropout': 0.0, 'attention_dropout': 0.0, 'activation_dropout': 0.1, 'feat_proj_dropout': 0.0, 'final_dropout': 0.0, 'layerdrop': 0.0, 'layer_norm_eps': 1e-05, 'initializer_range': 0.02, 'vocab_size': 37, 'do_stable_layer_norm': True, 'use_weighted_layer_sum': False, 'apply_spec_augment': True, 'mask_time_prob': 0.75, 'mask_time_length': 10, 'mask_time_min_masks': 2, 'mask_feature_prob': 0.25, 'mask_feature_length': 64, 'mask_feature_min_masks': 0, 'num_codevectors_per_group': 320, 'num_codevector_groups': 2, 'contrastive_logits_temperature': 0.1, 'feat_quantizer_dropout': 0.0, 'num_negatives': 100, 'codevector_dim': 768, 'proj_codevector_dim': 768, 'diversity_loss_weight': 0.1, 'ctc_loss_reduction': 'mean', 'ctc_zero_infinity': False, 'add_adapter': False, 'adapter_kernel_size': 3, 'adapter_stride': 2, 'num_adapter_layers': 3, 'output_hidden_size': 1024, 'classifier_proj_size': 256, 'tdnn_dim': [512, 512, 512, 512, 1500], 'tdnn_kernel': [5, 3, 3, 1, 1], 'tdnn_dilation': [1, 2, 3, 1, 1], 'xvector_output_dim': 512, 'output_dir': './', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': True, 'do_predict': False, 'evaluation_strategy': 'steps', 'prediction_loss_only': False, 'per_device_train_batch_size': 8, 'per_device_eval_batch_size': 8, 'per_gpu_train_batch_size': 'None', 'per_gpu_eval_batch_size': 'None', 'gradient_accumulation_steps': 4, 'eval_accumulation_steps': 'None', 'learning_rate': 7.5e-05, 'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 50.0, 'max_steps': -1, 'lr_scheduler_type': 'linear', 'warmup_ratio': 0.0, 'warmup_steps': 2000, 'log_level': -1, 'log_level_replica': -1, 'log_on_each_node': True, 'logging_dir': './runs/Jan09_22-00-50_brutasse', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 100, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 500, 'save_total_limit': 3, 'save_on_each_node': False, 'no_cuda': False, 'seed': 42, 'bf16': False, 'fp16': True, 'fp16_opt_level': 'O1', 'half_precision_backend': 'amp', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': 'None', 'local_rank': -1, 'xpu_backend': 'None', 'tpu_num_cores': 'None', 'tpu_metrics_debug': False, 'debug': '[]', 'dataloader_drop_last': False, 'eval_steps': 500, 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': './', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': 'None', 'load_best_model_at_end': False, 'metric_for_best_model': 'None', 'greater_is_better': 'None', 'ignore_data_skip': False, 'sharded_ddp': '[]', 'deepspeed': 'None', 'label_smoothing_factor': 0.0, 'adafactor': False, 'group_by_length': True, 'length_column_name': 'input_length', 'report_to': "['tensorboard', 'wandb', 'codecarbon']", 'ddp_find_unused_parameters': 'None', 'ddp_bucket_cap_mb': 'None', 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': True, 'resume_from_checkpoint': 'None', 'hub_model_id': 'None', 'hub_strategy': 'every_save', 'hub_token': '', 'gradient_checkpointing': True, 'fp16_backend': 'auto', 'push_to_hub_model_id': 'None', 'push_to_hub_organization': 'None', 'push_to_hub_token': '', '_n_gpu': 1, 'mp_parameters': '', 'train_batch_size': 8, 'eval_batch_size': 8}
2022-01-09 22:02:46,645 INFO MainThread:31164 [wandb_watch.py:watch():43] Watching
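The config_cb entry is the run config that the Hugging Face Trainer's WandbCallback reports: the Wav2Vec2 model config merged with the TrainingArguments (string values such as 'None' and '[]' are wandb's stringification of Python None and empty lists). A rough reconstruction of the corresponding setup from the logged values, showing only the salient overrides; this is a sketch inferred from the log, not the actual training script:

    from transformers import TrainingArguments, Wav2Vec2Config

    # Model config: facebook/wav2vec2-xls-r-300m with the fine-tuning
    # overrides visible in the log (CTC vocab/padding, dropouts,
    # SpecAugment masking); from_pretrained kwargs override the
    # defaults fetched from the Hub.
    config = Wav2Vec2Config.from_pretrained(
        "facebook/wav2vec2-xls-r-300m",
        vocab_size=37,
        pad_token_id=34,
        activation_dropout=0.1,
        hidden_dropout=0.0,
        attention_dropout=0.0,
        feat_proj_dropout=0.0,
        layerdrop=0.0,
        mask_time_prob=0.75,
        mask_feature_prob=0.25,
        mask_feature_length=64,
        ctc_loss_reduction="mean",
    )

    # Training arguments matching the logged values.
    training_args = TrainingArguments(
        output_dir="./",
        overwrite_output_dir=True,
        do_train=True,
        do_eval=True,
        evaluation_strategy="steps",
        per_device_train_batch_size=8,
        gradient_accumulation_steps=4,  # effective batch 8 * 4 = 32 on 1 GPU
        per_device_eval_batch_size=8,
        learning_rate=7.5e-05,
        warmup_steps=2000,
        num_train_epochs=50,
        logging_steps=100,
        eval_steps=500,
        save_steps=500,
        save_total_limit=3,
        fp16=True,
        gradient_checkpointing=True,
        group_by_length=True,              # bucket samples by length
        length_column_name="input_length",
        push_to_hub=True,
        seed=42,
    )

With report_to including 'wandb', the Trainer calls wandb.init() itself and logs this merged config, which is exactly the dict captured by config_cb above.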