MaxView

Log Summary

XPK Start: Fri Apr 17 14:09:12 UTC 2026
PyTorch was not found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
2026-04-17 14:09:36.921730: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0417 14:09:37.100418 133882125645632 max_utils.py:273] Attempting to initialize the jax distributed system...
I0417 14:09:46.140721 133882125645632 distributed.py:149] Starting JAX distributed service on [::]:8482
I0417 14:09:46.143069 133882125645632 distributed.py:172] Connecting to JAX distributed service on mt-13-scan-layers-false-wkj01-slice-job-0-0.mt-13-scan-layers-false-wkj01:8482
I0417 14:09:47.585783 133882125645632 max_utils.py:284] Jax distributed system initialized!
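
The two max_utils lines above wrap the standard JAX multi-host bootstrap. A minimal sketch of the underlying call, with placeholder values rather than this run's; on Cloud TPU, jax.distributed.initialize() can usually infer all three arguments from the environment:

    import jax

    jax.distributed.initialize(
        coordinator_address="10.0.0.1:8482",  # placeholder host; port from the log above
        num_processes=8,                      # placeholder host count
        process_id=0,                         # this host's rank
    )
    print(jax.device_count(), jax.process_index())
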
I0417 14:09:52.779340 133882125645632 max_utils.py:800] System Information: Jax Version: 0.9.2
I0417 14:09:52.779442 133882125645632 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0417 14:09:52.779482 133882125645632 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Mar 4 2026 11:32:08 (1772652728) cl/878335365
I0417 14:09:52.779518 133882125645632 train_utils.py:348] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0417 14:09:53.456602 133882125645632 maxtext_utils.py:1548] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
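
The shape logged above is MaxText's device mesh across its parallelism axes: every axis is 1 except a single 32-way axis, which the "fsdp: 32" line further down confirms is the fsdp axis. A minimal sketch of an equivalent mesh, assuming 32 attached devices; axis names other than 'fsdp' are illustrative:

    import jax
    from jax.sharding import Mesh
    from jax.experimental import mesh_utils

    # All parallelism collapsed onto one 32-way 'fsdp' axis.
    devices = mesh_utils.create_device_mesh((1, 32, 1))
    mesh = Mesh(devices, axis_names=("data", "fsdp", "tensor"))
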
I0417 14:09:53.456897 133882125645632 checkpointing.py:677] Setting up checkpoint logger...
I0417 14:09:53.456953 133882125645632 checkpointing.py:233] Creating checkpoint manager with ocdbt=True and zarr3=True
I0417 14:09:53.456998 133882125645632 pytree_checkpoint_handler.py:592] save_device_host_concurrent_bytes=None
I0417 14:09:53.457326 133882125645632 base_pytree_checkpoint_handler.py:441] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x79c325051cd0>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0417 14:09:56.402965 133882125645632 checkpointing.py:265] Enabling policy for fixed interval checkpointing.
I0417 14:09:56.403192 133882125645632 checkpoint_manager.py:708] [process=6][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x79afa46a0b90>}, handler_registry=None
I0417 14:09:56.403428 133882125645632 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x79afa46a0b90>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0417 14:09:56.403477 133882125645632 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x79afa46a0dd0>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0417 14:09:56.403519 133882125645632 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x79afa46a0b90>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x79afa46a0b90>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x79afa46a0dd0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x79afa46a0dd0>}).
I0417 14:09:56.403850 133882125645632 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.34
I0417 14:09:56.403919 133882125645632 async_checkpointer.py:192] [process=6][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x79afa42d0a40> timeout: 1200 secs and primary_host=0 for async checkpoint writes
I0417 14:09:57.577494 133882125645632 checkpoint_manager.py:1812] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_main_20260417_125726/linen_xpk_main_20260417_125726_13_scan_layers_false/checkpoints
I0417 14:09:58.872760 133882125645632 checkpoint_manager.py:929] [process=6][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_main_20260417_125726/linen_xpk_main_20260417_125726_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x79afa46a2990>
I0417 14:09:58.872969 133882125645632 checkpointing.py:301] Checkpoint manager created!
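
The options dump above boils down to a few knobs: async saves enabled, every checkpoint kept (max_to_keep=None), and a FixedIntervalPolicy writing every 10 steps. A pared-down sketch of an equivalent Orbax manager, using this run's real checkpoint directory:

    import orbax.checkpoint as ocp

    options = ocp.CheckpointManagerOptions(
        save_interval_steps=10,            # stands in for FixedIntervalPolicy(interval=10)
        max_to_keep=None,                  # keep every checkpoint
        enable_async_checkpointing=True,   # background GCS writes
    )
    mgr = ocp.CheckpointManager(
        "gs://lance-maxtext/linen_ckpt_xpk_main_20260417_125726/"
        "linen_xpk_main_20260417_125726_13_scan_layers_false/checkpoints",
        options=options,
    )
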
I0417 14:09:59.848685 133882125645632 nnx_wrappers.py:437] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0417 14:09:59.848791 133882125645632 nnx_wrappers.py:437] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0417 14:10:00.229001 133882125645632 attentions.py:1088] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0417 14:10:00.229094 133882125645632 attentions.py:1088] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0417 14:10:00.245480 133882125645632 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0417 14:10:00.245555 133882125645632 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0417 14:10:00.275456 133882125645632 attentions.py:1154] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0417 14:10:00.275537 133882125645632 attentions.py:1154] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0417 14:10:00.291782 133882125645632 attentions.py:1155] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0417 14:10:00.291851 133882125645632 attentions.py:1155] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0417 14:10:00.307933 133882125645632 attentions.py:1156] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0417 14:10:00.307994 133882125645632 attentions.py:1156] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0417 14:10:00.337342 133882125645632 attentions.py:1197] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0417 14:10:00.337408 133882125645632 attentions.py:1197] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0417 14:10:00.359730 133882125645632 linears.py:525] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0417 14:10:00.359804 133882125645632 linears.py:525] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
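
Each Logical/Physical pair above is one application of MaxText's logical axis rules: only the batch dimension of activations is sharded, over the fsdp axis. A self-contained sketch of the mapping with an illustrative rule set (MaxText's real rules come from its config):

    import flax.linen as nn
    from jax.sharding import PartitionSpec as P

    rules = (("activation_batch", "fsdp"),)   # illustrative; unmatched axes stay replicated
    spec = nn.logical_to_mesh_axes(
        ("activation_batch", "activation_attn_length", "activation_attn_embed"),
        rules,
    )
    assert spec == P("fsdp", None, None)      # the Physical line logged above
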
I0417 14:10:03.216854 133882125645632 checkpointing.py:577] checkpoint manager exists so trying to load this run's existing checkpoint
I0417 14:10:03.216984 133882125645632 checkpointing.py:665] No existing checkpoints found, not restoring checkpoint.
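
The restore-or-initialize decision above looks roughly like this with Orbax, reusing mgr from the earlier sketch; init_state is a hypothetical placeholder for MaxText's state setup:

    latest = mgr.latest_step()       # None here: the directory had 0 checkpoint steps
    if latest is None:
        state = init_state()         # hypothetical: build a fresh train state
    else:
        state = mgr.restore(latest)  # sketch; real code passes typed restore args
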
fsdp: 32
I0417 14:10:10.403972 133882125645632 maxtext_utils.py:1651]  params/params/decoder/decoder_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0417 14:10:10.404105 133882125645632 maxtext_utils.py:1651]  params/params/decoder/layers_0/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0417 14:10:10.404158 133882125645632 maxtext_utils.py:1651]  params/params/decoder/layers_0/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0417 14:10:10.404215 133882125645632 maxtext_utils.py:1651]  params/params/decoder/layers_0/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0417 14:10:10.404256 133882125645632 maxtext_utils.py:1651]  params/params/decoder/layers_0/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0417 14:10:10.404292 133882125645632 maxtext_utils.py:1651]  params/params/decoder/layers_0/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0417 14:10:10.404343 133882125645632 maxtext_utils.py:1651]  params/params/decoder/layers_0/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0417 14:10:10.404392 133882125645632 maxtext_utils.py:1651]  params/params/decoder/layers_0/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0417 14:10:10.404430 133882125645632 maxtext_utils.py:1651]  params/params/decoder/layers_0/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0417 14:10:10.404466 133882125645632 maxtext_utils.py:1651]  params/params/decoder/layers_0/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
    [... the identical nine-parameter block repeats for layers_1 through layers_15, logged in lexicographic order (layers_1, layers_10 .. layers_15, layers_2 .. layers_9); each layer's mlp/wi_0, mlp/wi_1, mlp/wo, post/pre_self_attention_layer_norm/scale, and self_attention query/key/value/out kernels carry the same shapes, logical specs, and physical shardings as layers_0 above ...]
I0417 14:10:10.408613 133882125645632 maxtext_utils.py:1651]  params/params/decoder/logits_dense/kernel
    Shape:     float32[2048,32000]
    Logical:   P('embed_vocab', 'vocab')
    Physical:  ('fsdp', None)
I0417 14:10:10.408657 133882125645632 maxtext_utils.py:1651]  params/params/token_embedder/embedding
    Shape:     float32[32000,2048]
    Logical:   P('vocab', 'embed_vocab')
    Physical:  (None, 'fsdp')
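
Every Logical/Physical pair in the parameter dump is consistent with one tiny rule set: 'embed' and 'embed_vocab' shard over fsdp, everything else is replicated. A sketch reproducing the logged specs (the rules are inferred from the dump, not read from this run's config):

    import flax.linen as nn
    from jax.sharding import PartitionSpec as P

    rules = (("embed", "fsdp"), ("embed_vocab", "fsdp"))

    assert nn.logical_to_mesh_axes(("embed", "mlp"), rules) == P("fsdp", None)
    assert nn.logical_to_mesh_axes(("mlp", "embed"), rules) == P(None, "fsdp")
    assert nn.logical_to_mesh_axes(("vocab", "embed_vocab"), rules) == P(None, "fsdp")
    assert nn.logical_to_mesh_axes(("norm",), rules) == P(None)
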
I0417 14:10:14.845350 133882125645632 train.py:155] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0417 14:10:14.845448 133882125645632 train.py:155] train/xent Physical: float32[32,2048]............................................ ('fsdp', None).
I0417 14:10:14.860895 133882125645632 train.py:162] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0417 14:10:14.860956 133882125645632 train.py:162] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
I0417 14:11:11.177747 133882125645632 max_utils.py:791] Total memory size: 1.8 GB, Output size: 0.4 GB, Temp size: 1.5 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0417 14:11:11.178946 133882125645632 metric_logger.py:301] number parameters: 1.104 billion
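
The 1.104 billion figure checks out against the shapes in the dump; quick arithmetic:

    mlp   = 3 * 2048 * 7168          # wi_0, wi_1, wo kernels
    attn  = 4 * 2048 * 16 * 128      # query, key, value, out kernels
    norms = 2 * 2048                 # pre/post attention layer norms
    per_layer = mlp + attn + norms   # 60,821,504
    total = 16 * per_layer + 2048 + 2 * 2048 * 32000
    # 16 layers, plus decoder_norm, logits_dense, and token_embedder
    print(total)                     # 1,104,218,112, i.e. ~1.104 billion
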
I0417 14:12:11.699013 133882125645632 checkpointing.py:772] Waiting for step 0 to finish before checkpoint...
I0417 14:12:11.931576 133882125645632 checkpointing.py:776] Waited 0.2325429916381836 seconds for step 0 to finish before starting checkpointing.
I0417 14:12:11.935030 133882125645632 checkpoint_manager.py:2009] [process=6][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0417 14:12:11.936959 133882125645632 checkpoint_manager.py:1512] [process=6] Saving checkpoint at step 0
I0417 14:12:11.938300 133882125645632 event_tracking.py:70] [process=6] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_main_20260417_125726/linen_xpk_main_20260417_125726_13_scan_layers_false/checkpoints/0.
I0417 14:12:12.724001 133882125645632 signaling_client.py:364] Using JaxDistributedSignalingClient
I0417 14:12:12.724977 133882125645632 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0417 14:12:12.725102 133882125645632 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0417 14:12:13.025957 133882125645632 base_pytree_checkpoint_handler.py:154] [process=6][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.302480s
I0417 14:12:13.026124 133882125645632 base_pytree_checkpoint_handler.py:130] [process=6] /jax/orbax/write/blocking_gbytes_per_sec: 4.584 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.33652782440185547 s) (per-host)
I0417 14:12:13.026174 133882125645632 base_pytree_checkpoint_handler.py:768] [process=6][thread=MainThread] Initiated Pytree async_save. Time taken: 0.336588s (batch_requests_ready=0.017071s, total_serialization_initiated=0.319449s, others=0.000068s)
I0417 14:12:13.026426 133882125645632 composite_checkpoint_handler.py:715] [process=6][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.340720s (all_items=0.000020s, per_item={'items': '0.00001955'}, temp_paths=0.340700)
I0417 14:12:13.027253 133882125645632 event_tracking.py:125] [process=6] [async] Finished blocking save in 1.09 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_main_20260417_125726/linen_xpk_main_20260417_125726_13_scan_layers_false/checkpoints/0.
I0417 14:12:13.027547 133756671768320 async_checkpointer.py:76] [process=6][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-17 14:32:13.027515
I0417 14:12:13.080077 133882125645632 checkpoint_manager.py:1560] [process=6][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0417 14:12:13.080442 133759908034304 async_checkpointer.py:280] [process=6][thread=save_finalize] Waiting for background save thread=async_save.
I0417 14:12:13.080612 133882125645632 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_main_20260417_125726/linen_xpk_main_20260417_125726_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776435131.9350126, 'wait_for_prev_duration_secs': 6.151199340820312e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776435131.936998, 'checkpointer_blocking_duration_secs': 1.090702772140503, 'get_old_steps_start_time': 1776435133.0277274, 'get_old_steps_duration_secs': 3.0517578125e-05, 'checkpoint_manager_blocking_start_time': 1776435131.9331968, 'checkpoint_manager_blocking_duration_secs': 1.1473755836486816}
I0417 14:12:13.080800 133882125645632 checkpointing.py:408] Started an asynchronous checkpoint save for step 0
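
The save flow just logged, in miniature: the blocking part (about 1.09 s above) is the device-to-host copy; the GCS write then continues on a background thread. A sketch reusing mgr and state from the sketches above:

    import orbax.checkpoint as ocp

    mgr.save(step, args=ocp.args.StandardSave(state))  # returns after the D2H transfer
    ...                                                # training continues meanwhile
    mgr.wait_until_finished()                          # joins the background write
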
I0417 14:12:13.080876 133882125645632 max_utils.py:750] 
Memstats: After params initialized:
I0417 14:12:13.080931 133882125645632 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_24(process=6,(0,6,0,0))
I0417 14:12:13.080965 133882125645632 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_25(process=6,(1,6,0,0))
I0417 14:12:13.080993 133882125645632 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_28(process=6,(0,7,0,0))
I0417 14:12:13.081018 133882125645632 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_29(process=6,(1,7,0,0))
I0417 14:12:13.438658 133882125645632 metric_logger.py:196] completed step: 0, seconds: 60.520, TFLOP/s/device: 0.225, Tokens/s/device: 33.840, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0417 14:12:13.574902 133882125645632 metric_logger.py:196] completed step: 1, seconds: 1.730, TFLOP/s/device: 7.852, Tokens/s/device: 1183.548, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0417 14:12:13.990739 133882125645632 metric_logger.py:196] completed step: 2, seconds: 0.015, TFLOP/s/device: 889.094, Tokens/s/device: 134013.873, total_weights: 65536, loss: 10.268, lm_loss: 10.268, perplexity: 28794.377
I0417 14:12:14.126906 133882125645632 metric_logger.py:196] completed step: 3, seconds: 0.410, TFLOP/s/device: 33.108, Tokens/s/device: 4990.375, total_weights: 65536, loss: 9.741, lm_loss: 9.741, perplexity: 16998.465
I0417 14:12:14.403157 133882125645632 metric_logger.py:196] completed step: 4, seconds: 0.143, TFLOP/s/device: 95.285, Tokens/s/device: 14362.355, total_weights: 65536, loss: 9.285, lm_loss: 9.285, perplexity: 10778.518
I0417 14:12:14.414533 133882125645632 metric_logger.py:196] completed step: 5, seconds: 0.136, TFLOP/s/device: 99.922, Tokens/s/device: 15061.260, total_weights: 65536, loss: 8.900, lm_loss: 8.900, perplexity: 7335.202
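
The logged perplexity is just exp(loss); the printed losses are rounded to three decimals, so the check is approximate:

    import math

    math.exp(10.877)   # ~52,944 vs the logged 52938.617 (step 0)
    math.exp(8.900)    # ~7,331  vs the logged 7335.202  (step 5)
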
I0417 14:12:16.414193    2770 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0417 14:12:18.931079 133756128052992 array_metadata_store.py:203] [process=6][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_main_20260417_125726/linen_xpk_main_20260417_125726_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_6
I0417 14:12:39.034192 133882125645632 metric_logger.py:196] completed step: 6, seconds: 0.277, TFLOP/s/device: 49.007, Tokens/s/device: 7386.888, total_weights: 65536, loss: 8.602, lm_loss: 8.602, perplexity: 5440.185
I0417 14:12:39.173880 133882125645632 metric_logger.py:196] completed step: 7, seconds: 24.488, TFLOP/s/device: 0.555, Tokens/s/device: 83.632, total_weights: 65536, loss: 8.393, lm_loss: 8.393, perplexity: 4418.134
I0417 14:12:39.310178 133882125645632 metric_logger.py:196] completed step: 8, seconds: 0.143, TFLOP/s/device: 95.194, Tokens/s/device: 14348.670, total_weights: 65536, loss: 8.264, lm_loss: 8.264, perplexity: 3882.514
I0417 14:12:39.447414 133882125645632 checkpointing.py:772] Waiting for step 9 to finish before checkpoint...
I0417 14:12:39.450716 133882125645632 checkpointing.py:776] Waited 0.003323078155517578 seconds for step 9 to finish before starting checkpointing.
I0417 14:12:39.453689 133882125645632 checkpoint_manager.py:2020] [process=6][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0417 14:12:46.001099 133756671768320 base_pytree_checkpoint_handler.py:130] [process=6] /jax/orbax/write/gbytes_per_sec: 47.419 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 33.31145167350769 s) (per-host)
I0417 14:12:46.001224 133756671768320 async_checkpointer.py:90] [process=6][thread=async_save] 3 Handler Commit operations completed. Time taken: 32.973564s.
I0417 14:12:55.426475 133756671768320 async_checkpointer.py:160] [process=6][thread=async_save] Background save thread done. Time taken: 42.398798s.
I0417 14:12:55.426772 133759908034304 async_checkpointer.py:288] [process=6][thread=save_finalize] Done with waiting for background save thread=async_save.
I0417 14:12:55.426937 133759908034304 async_checkpointer.py:298] [process=6][thread=save_finalize] No errors found in background save thread=async_save.
I0417 14:12:55.426993 133759908034304 checkpoint_manager.py:2137] [process=6][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0417 14:12:55.429321 133759908034304 checkpoint_manager.py:2146] [process=6][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0417 14:12:55.429522 133882125645632 checkpoint_manager.py:2032] [process=6][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0417 14:12:55.429675 133882125645632 checkpoint_manager.py:1452] Waiting for previous save to complete took 15.975984 seconds. If this number is high, consider checkpointing less frequently.
I0417 14:12:55.432323 133882125645632 checkpoint_manager.py:1512] [process=6] Saving checkpoint at step 9
I0417 14:12:55.434499 133882125645632 event_tracking.py:70] [process=6] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_main_20260417_125726/linen_xpk_main_20260417_125726_13_scan_layers_false/checkpoints/9.
I0417 14:12:56.168013 133882125645632 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0417 14:12:56.168200 133882125645632 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0417 14:12:56.295601 133882125645632 base_pytree_checkpoint_handler.py:154] [process=6][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.128936s
I0417 14:12:56.295768 133882125645632 base_pytree_checkpoint_handler.py:130] [process=6] /jax/orbax/write/blocking_gbytes_per_sec: 9.551 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.161515474319458 s) (per-host)
I0417 14:12:56.295843 133882125645632 base_pytree_checkpoint_handler.py:768] [process=6][thread=MainThread] Initiated Pytree async_save. Time taken: 0.161581s (batch_requests_ready=0.016713s, total_serialization_initiated=0.144795s, others=0.000072s)
I0417 14:12:56.296130 133882125645632 composite_checkpoint_handler.py:715] [process=6][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.166025s (all_items=0.000016s, per_item={'items': '0.00001645'}, temp_paths=0.166008)
I0417 14:12:56.296912 133882125645632 event_tracking.py:125] [process=6] [async] Finished blocking save in 0.86 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_main_20260417_125726/linen_xpk_main_20260417_125726_13_scan_layers_false/checkpoints/9.
I0417 14:12:56.297284 133759908034304 async_checkpointer.py:76] [process=6][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-17 14:32:56.297247
I0417 14:12:56.302838 133882125645632 checkpoint_manager.py:1560] [process=6][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0417 14:12:56.303134 133756663375616 async_checkpointer.py:280] [process=6][thread=save_finalize] Waiting for background save thread=async_save.
I0417 14:12:56.303296 133882125645632 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_main_20260417_125726/linen_xpk_main_20260417_125726_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776435159.453659, 'wait_for_prev_duration_secs': 15.9759840965271, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776435175.4323633, 'checkpointer_blocking_duration_secs': 0.8650622367858887, 'get_old_steps_start_time': 1776435176.2974505, 'get_old_steps_duration_secs': 3.0517578125e-05, 'checkpoint_manager_blocking_start_time': 1776435159.4516065, 'checkpoint_manager_blocking_duration_secs': 16.851653337478638}
I0417 14:12:56.303508 133882125645632 checkpointing.py:408] Started an asynchronous checkpoint save for step 9
I0417 14:12:56.303556 133882125645632 checkpoint_manager.py:2020] [process=6][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0417 14:13:03.961447 133757185009408 array_metadata_store.py:203] [process=6][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_main_20260417_125726/linen_xpk_main_20260417_125726_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_6
I0417 14:13:39.257130 133759908034304 base_pytree_checkpoint_handler.py:130] [process=6] /jax/orbax/write/gbytes_per_sec: 36.630 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 43.12285113334656 s) (per-host)
I0417 14:13:39.257241 133759908034304 async_checkpointer.py:90] [process=6][thread=async_save] 3 Handler Commit operations completed. Time taken: 42.959846s.
I0417 14:13:47.301070 133759908034304 async_checkpointer.py:160] [process=6][thread=async_save] Background save thread done. Time taken: 51.003660s.
I0417 14:13:47.301352 133756663375616 async_checkpointer.py:288] [process=6][thread=save_finalize] Done with waiting for background save thread=async_save.
I0417 14:13:47.301474 133756663375616 async_checkpointer.py:298] [process=6][thread=save_finalize] No errors found in background save thread=async_save.
I0417 14:13:47.301525 133756663375616 checkpoint_manager.py:2137] [process=6][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0417 14:13:47.304095 133756663375616 checkpoint_manager.py:2146] [process=6][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0417 14:13:47.304272 133882125645632 checkpoint_manager.py:2032] [process=6][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0417 14:13:47.304417 133882125645632 checkpoint_manager.py:2009] [process=6][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0417 14:13:47.305408 133882125645632 metric_logger.py:196] completed step: 9, seconds: 0.139, TFLOP/s/device: 97.790, Tokens/s/device: 14739.963, total_weights: 65536, loss: 8.188, lm_loss: 8.188, perplexity: 3599.028
Per train step:
 Total TFLOPs: 13.59 
 split as 93.93% learnable weight flops and 6.07% attention flops
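
These figures are mutually consistent with the standard 6·N·T training-flops estimate, assuming the embedding-table lookup is excluded from the matmul count:

    matmul_params = 1_104_218_112 - 32000 * 2048   # drop the token_embedder table
    tokens = 32 * 2048                             # global batch x sequence length
    weight_tflops = 6 * matmul_params * tokens / 32 / 1e12
    print(weight_tflops)        # ~12.76 per device, i.e. 93.93% of the 13.59 total
    print(13.59 / 0.139)        # ~97.8 TFLOP/s/device, matching step 9's 97.790
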
XPK End: Fri Apr 17 14:13:58 UTC 2026
EXIT_CODE=0