(26.5k rows)

| 0 (stringclasses, 12 values) | values (float64) |
|---|---|
| megatron.core.transformer.attention.forward.linear_proj | 2.165376 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5.116032 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.120672 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.2208 |
| megatron.core.transformer.mlp.forward.activation | 0.027168 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.51344 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.77312 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.120384 |
| megatron.core.transformer.attention.forward.qkv | 0.128992 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002944 |
| megatron.core.transformer.attention.forward.core_attention | 2.807168 |
| megatron.core.transformer.attention.forward.linear_proj | 1.878208 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 4.838144 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.120224 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.223904 |
| megatron.core.transformer.mlp.forward.activation | 0.02736 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.514784 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.778272 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.121088 |
| megatron.core.transformer.attention.forward.qkv | 0.075424 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002816 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00288 |
| megatron.core.transformer.attention.forward.core_attention | 55.33728 |
| megatron.core.transformer.attention.forward.linear_proj | 1.959776 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 57.39632 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.064928 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.11872 |
| megatron.core.transformer.mlp.forward.activation | 0.017728 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.28912 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.437152 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.064288 |
| megatron.core.transformer.attention.forward.qkv | 0.072256 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002848 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002848 |
| megatron.core.transformer.attention.forward.core_attention | 3.74704 |
| megatron.core.transformer.attention.forward.linear_proj | 2.03712 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5.879456 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.064928 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.118912 |
| megatron.core.transformer.mlp.forward.activation | 0.017568 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.293152 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.441088 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.064896 |
| megatron.core.transformer.attention.forward.qkv | 0.042848 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002912 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00288 |
| megatron.core.transformer.attention.forward.core_attention | 0.872128 |
| megatron.core.transformer.attention.forward.linear_proj | 7.652096 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 8.590848 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.036864 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.060064 |
| megatron.core.transformer.mlp.forward.activation | 0.01152 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 2.041504 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.124576 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.037792 |
| megatron.core.transformer.attention.forward.qkv | 0.043424 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00288 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002912 |
| megatron.core.transformer.attention.forward.core_attention | 0.885536 |
| megatron.core.transformer.attention.forward.linear_proj | 4.974112 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5.926592 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.037376 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.060032 |
| megatron.core.transformer.mlp.forward.activation | 0.011776 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.48288 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.56656 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.037408 |
| megatron.core.transformer.attention.forward.qkv | 1.937376 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003008 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003008 |
| megatron.core.transformer.attention.forward.core_attention | 516.364868 |
| megatron.core.transformer.attention.forward.linear_proj | 0.679232 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 519.354431 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 384.165802 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 3.158976 |
| megatron.core.transformer.mlp.forward.activation | 109.003075 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 2.896672 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 115.774307 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 140.308258 |
| megatron.core.transformer.attention.forward.qkv | 1.4904 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00304 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
| megatron.core.transformer.attention.forward.core_attention | 4.07632 |
| megatron.core.transformer.attention.forward.linear_proj | 0.64912 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.31472 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.094144 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 2.876032 |
| megatron.core.transformer.mlp.forward.activation | 0.334336 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 2.670752 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 5.893216 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.09376 |
| megatron.core.transformer.attention.forward.qkv | 1.365216 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003008 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.07664 |
| megatron.core.transformer.attention.forward.core_attention | 982.801208 |
| megatron.core.transformer.attention.forward.linear_proj | 0.730112 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 985.333191 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 416.220154 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.8448 |
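
The preview is easier to work with programmatically than by eye. Below is a minimal sketch, assuming the two columns have been exported to a CSV file named `megatron_timings.csv` with illustrative header names `module` and `value`; the file name, the header names, and the unit of the float column are assumptions, not part of the dataset. It ranks the instrumented call sites by total recorded time using pandas.

```python
import pandas as pd

# Minimal sketch: "megatron_timings.csv" and the column names below are assumptions;
# the dataset itself only exposes an unnamed string column and a float64 column.
df = pd.read_csv("megatron_timings.csv", names=["module", "value"])

# Aggregate per instrumented call site and rank by total recorded time.
summary = (
    df.groupby("module")["value"]
      .agg(["sum", "mean", "max", "count"])
      .sort_values("sum", ascending=False)
)

# The string column has 12 distinct labels, so this prints the full ranking.
print(summary.head(12))
```

On just the rows visible in the preview above, for example, core_attention and self_attention would top such a ranking.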