| module | time |
|---|---|
megatron.core.transformer.attention.forward.qkv | 266.536835 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.109184 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.079296 |
megatron.core.transformer.attention.forward.core_attention | 8,810.711914 |
megatron.core.transformer.attention.forward.linear_proj | 291.54538 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 9,370.21875 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 189.861862 |
megatron.core.transformer.mlp.forward.linear_fc1 | 1.017088 |
megatron.core.transformer.mlp.forward.activation | 350.586517 |
megatron.core.transformer.mlp.forward.linear_fc2 | 299.409973 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 651.77533 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.0256 |
megatron.core.transformer.attention.forward.qkv | 0.05712 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002944 |
megatron.core.transformer.attention.forward.core_attention | 1,527.846558 |
megatron.core.transformer.attention.forward.linear_proj | 2.495776 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,530.423462 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.023328 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.10688 |
megatron.core.transformer.mlp.forward.activation | 0.016832 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.14784 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.284 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.024 |
megatron.core.transformer.attention.forward.qkv | 370.296051 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.116608 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.092288 |
megatron.core.transformer.attention.forward.core_attention | 9,135.602539 |
megatron.core.transformer.attention.forward.linear_proj | 459.757446 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 9,967.209961 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 414.264099 |
megatron.core.transformer.mlp.forward.linear_fc1 | 1.733888 |
megatron.core.transformer.mlp.forward.activation | 779.439697 |
megatron.core.transformer.mlp.forward.linear_fc2 | 115.2472 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 897.046631 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.068032 |
megatron.core.transformer.attention.forward.qkv | 0.18848 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003072 |
megatron.core.transformer.attention.forward.core_attention | 1,689.543945 |
megatron.core.transformer.attention.forward.linear_proj | 2.874016 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,692.630249 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.06576 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.383872 |
megatron.core.transformer.mlp.forward.activation | 0.047072 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.491136 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.933472 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.06512 |
megatron.core.transformer.attention.forward.qkv | 223.739838 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.113184 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.092736 |
megatron.core.transformer.attention.forward.core_attention | 7,628.444336 |
megatron.core.transformer.attention.forward.linear_proj | 4.434176 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 7,858.064453 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 672.165955 |
megatron.core.transformer.mlp.forward.linear_fc1 | 4.688704 |
megatron.core.transformer.mlp.forward.activation | 285.001831 |
megatron.core.transformer.mlp.forward.linear_fc2 | 2.729664 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 293.049438 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.236256 |
megatron.core.transformer.attention.forward.qkv | 0.705408 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.858464 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.114016 |
megatron.core.transformer.attention.forward.core_attention | 1,711.918335 |
megatron.core.transformer.attention.forward.linear_proj | 0.827552 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,715.774414 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.234464 |
megatron.core.transformer.mlp.forward.linear_fc1 | 1.574944 |
megatron.core.transformer.mlp.forward.activation | 0.169888 |
megatron.core.transformer.mlp.forward.linear_fc2 | 1.913184 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 3.66976 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.233632 |
megatron.core.transformer.attention.forward.qkv | 241.449829 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.128992 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.08912 |
megatron.core.transformer.attention.forward.core_attention | 5,033.308105 |
megatron.core.transformer.attention.forward.linear_proj | 11.987808 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5,288.118652 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 254.712158 |
megatron.core.transformer.mlp.forward.linear_fc1 | 3.705248 |
megatron.core.transformer.mlp.forward.activation | 283.935211 |
megatron.core.transformer.mlp.forward.linear_fc2 | 16.154177 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 304.677612 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.016928 |
megatron.core.transformer.attention.forward.qkv | 0.035424 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.003072 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003104 |
megatron.core.transformer.attention.forward.core_attention | 2,412.901855 |
megatron.core.transformer.attention.forward.linear_proj | 0.15856 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 2,413.120361 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.016768 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.056 |
megatron.core.transformer.mlp.forward.activation | 0.011648 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.0912 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.17024 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.016352 |
megatron.core.transformer.attention.forward.qkv | 235.00029 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.003104 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003072 |
megatron.core.transformer.attention.forward.core_attention | 7,542.206543 |
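The table repeats the same twelve Megatron-Core forward sub-modules across several profiling passes. A minimal sketch of aggregating such rows per module, assuming each row is a `(module, value)` pair as in the table (the sample values are copied from the first rows above; the aggregation itself is illustrative, not part of Megatron):

```python
from collections import defaultdict

# Illustrative sample rows in the same "module | value" shape as the table;
# values are copied from its first few entries.
rows = [
    ("megatron.core.transformer.attention.forward.qkv", 266.536835),
    ("megatron.core.transformer.attention.forward.core_attention", 8810.711914),
    ("megatron.core.transformer.attention.forward.qkv", 0.05712),
    ("megatron.core.transformer.attention.forward.core_attention", 1527.846558),
]

# Sum the recorded value for each module across all passes.
totals = defaultdict(float)
for module, value in rows:
    totals[module] += value

# Print modules from most to least expensive.
for module, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{module}: {total:.6f}")
```

Sorting by the aggregated total makes the dominant cost (here `core_attention`) stand out immediately, which is usually the first question asked of a profile like this.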
*(table truncated in preview)*