2024-03-12 14:29:03,625 INFO MainThread:2490955 [wandb_setup.py:_flush():76] Current SDK version is 0.16.3
2024-03-12 14:29:03,626 INFO MainThread:2490955 [wandb_setup.py:_flush():76] Configure stats pid to 2490955
2024-03-12 14:29:03,626 INFO MainThread:2490955 [wandb_setup.py:_flush():76] Loading settings from /home/lilei/.config/wandb/settings
2024-03-12 14:29:03,626 INFO MainThread:2490955 [wandb_setup.py:_flush():76] Loading settings from /home/lilei/prismatic-vlms/wandb/settings
2024-03-12 14:29:03,626 INFO MainThread:2490955 [wandb_setup.py:_flush():76] Loading settings from environment variables: {}
2024-03-12 14:29:03,626 INFO MainThread:2490955 [wandb_setup.py:_flush():76] Applying setup settings: {'_disable_service': False}
2024-03-12 14:29:03,626 INFO MainThread:2490955 [wandb_setup.py:_flush():76] Inferring run settings from compute environment: {'program_relpath': 'scripts/pretrain.py', 'program_abspath': '/home/lilei/prismatic-vlms/scripts/pretrain.py', 'program': '/home/lilei/prismatic-vlms/scripts/pretrain.py'}
2024-03-12 14:29:03,626 INFO MainThread:2490955 [wandb_init.py:_log_setup():526] Logging user logs to runs/qformer2_256/wandb/run-20240312_142903-yrs4wcl6/logs/debug.log
2024-03-12 14:29:03,626 INFO MainThread:2490955 [wandb_init.py:_log_setup():527] Logging internal logs to runs/qformer2_256/wandb/run-20240312_142903-yrs4wcl6/logs/debug-internal.log
2024-03-12 14:29:03,626 INFO MainThread:2490955 [wandb_init.py:init():566] calling init triggers
2024-03-12 14:29:03,626 INFO MainThread:2490955 [wandb_init.py:init():573] wandb.init called with sweep_config: {}
config: {'model': {'type': 'one-stage+7b', 'model_id': 'one-stage+7b_qformer2_256', 'arch_specifier': 'qformer2_256', 'vision_backbone_id': 'clip-vit-l-336px', 'llm_backbone_id': 'vicuna-v15-7b', 'image_resize_strategy': 'letterbox', 'llm_max_length': 2048, 'align_epochs': 1, 'align_max_steps': None, 'align_global_batch_size': 256, 'align_per_device_batch_size': 16, 'align_learning_rate': 0.001, 'align_weight_decay': 0.0, 'align_max_grad_norm': 1.0, 'align_lr_scheduler_type': 'linear-warmup+cosine-decay', 'align_warmup_ratio': 0.03, 'align_train_strategy': 'fsdp-shard-grad-op', 'finetune_epochs': 1, 'finetune_max_steps': None, 'finetune_global_batch_size': 128, 'finetune_per_device_batch_size': 16, 'finetune_learning_rate': 2e-05, 'finetune_weight_decay': 0.1, 'finetune_max_grad_norm': 1.0, 'finetune_lr_scheduler_type': 'linear-warmup+cosine-decay', 'finetune_warmup_ratio': 0.03, 'finetune_train_strategy': 'fsdp-full-shard', 'enable_gradient_checkpointing': True, 'enable_mixed_precision_training': True, 'reduce_in_full_precision': False}, 'dataset': {'type': 'llava-v15', 'dataset_id': 'llava-v15', 'align_stage_components': ['download/llava-laion-cc-sbu-558k/chat.json', 'download/llava-laion-cc-sbu-558k'], 'finetune_stage_components': ['download/llava-v1.5-instruct/llava_v1_5_mix665k.json', 'download/llava-v1.5-instruct'], 'dataset_root_dir': 'data'}, 'stage': 'finetune', 'pretrained_checkpoint': None, 'run_id': 'qformer2_256', 'run_root_dir': 'runs', 'seed': 7, 'hf_token': '.hf_token', 'trackers': ['jsonl', 'wandb'], 'wandb_project': 'hf-vlms', 'wandb_entity': 'lilei_stones', 'max_length': 4096}
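The config dump above is the dictionary handed to wandb when the run is created. Below is a minimal illustrative sketch (not the actual prismatic-vlms training code) of how a run with this project, entity, run name, and run directory could be initialized; the nested config is truncated here for brevity.

import wandb

# Values are taken from the config logged above; the full model/dataset dicts
# are omitted here but would be passed the same way.
config = {
    "model": {"model_id": "one-stage+7b_qformer2_256", "arch_specifier": "qformer2_256"},
    "dataset": {"dataset_id": "llava-v15"},
    "stage": "finetune",
    "seed": 7,
}

run = wandb.init(
    project="hf-vlms",        # 'wandb_project' in the logged config
    entity="lilei_stones",    # 'wandb_entity' in the logged config
    name="qformer2_256",      # 'run_id' in the logged config
    dir="runs/qformer2_256",  # matches the run directory in the _log_setup lines
    config=config,
)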
2024-03-12 14:29:03,626 INFO MainThread:2490955 [wandb_init.py:init():616] starting backend
2024-03-12 14:29:03,626 INFO MainThread:2490955 [wandb_init.py:init():620] setting up manager
2024-03-12 14:29:03,633 INFO MainThread:2490955 [backend.py:_multiprocessing_setup():105] multiprocessing start_methods=fork,spawn,forkserver, using: spawn
2024-03-12 14:29:03,636 INFO MainThread:2490955 [wandb_init.py:init():628] backend started and connected
2024-03-12 14:29:03,649 INFO MainThread:2490955 [wandb_init.py:init():720] updated telemetry
2024-03-12 14:29:03,675 INFO MainThread:2490955 [wandb_init.py:init():753] communicating run to backend with 90.0 second timeout
2024-03-12 14:29:07,875 INFO MainThread:2490955 [wandb_run.py:_on_init():2262] communicating current version
2024-03-12 14:29:07,923 INFO MainThread:2490955 [wandb_run.py:_on_init():2271] got version response upgrade_message: "wandb version 0.16.4 is available! To upgrade, please run:\n $ pip install wandb --upgrade"
2024-03-12 14:29:07,923 INFO MainThread:2490955 [wandb_init.py:init():804] starting run threads in backend
2024-03-12 14:29:13,120 INFO MainThread:2490955 [wandb_run.py:_console_start():2241] atexit reg
2024-03-12 14:29:13,120 INFO MainThread:2490955 [wandb_run.py:_redirect():2096] redirect: wrap_raw
2024-03-12 14:29:13,120 INFO MainThread:2490955 [wandb_run.py:_redirect():2161] Wrapping output streams.
2024-03-12 14:29:13,120 INFO MainThread:2490955 [wandb_run.py:_redirect():2186] Redirects installed.
2024-03-12 14:29:13,121 INFO MainThread:2490955 [wandb_init.py:init():847] run started, returning control to user process
2024-03-12 22:26:21,885 INFO MainThread:2490955 [wandb_run.py:_finish():1970] finishing run lilei_stones/hf-vlms/yrs4wcl6
2024-03-12 22:26:21,886 INFO MainThread:2490955 [wandb_run.py:_atexit_cleanup():2210] got exitcode: 0
2024-03-12 22:26:21,886 INFO MainThread:2490955 [wandb_run.py:_restore():2193] restore
2024-03-12 22:26:21,886 INFO MainThread:2490955 [wandb_run.py:_restore():2199] restore done
2024-03-12 22:26:29,403 INFO MainThread:2490955 [wandb_run.py:_footer_history_summary_info():3866] rendering history
2024-03-12 22:26:29,403 INFO MainThread:2490955 [wandb_run.py:_footer_history_summary_info():3898] rendering summary
2024-03-12 22:26:29,414 INFO MainThread:2490955 [wandb_run.py:_footer_sync_info():3825] logging synced files
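The shutdown lines above (finishing run, exitcode 0, restore, history/summary rendering, file sync) are what wandb emits when the run is closed, either by an explicit finish call or by the atexit hook registered at 14:29:13. A hypothetical, self-contained sketch; the metric name and values are placeholders, not from this log.

import wandb

run = wandb.init(project="hf-vlms", entity="lilei_stones", name="qformer2_256")
for step in range(3):
    run.log({"train/loss": 1.0 / (step + 1)}, step=step)  # placeholder metric
run.finish()  # triggers the 'finishing run', 'got exitcode: 0', and sync lines seen above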