MaxView

Log Summary

XPK Start: Sun Apr 19 18:56:00 UTC 2026
PyTorch was not found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
2026-04-19 18:56:24.504832: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0419 18:56:24.683186 136899099592512 max_utils.py:273] Attempting to initialize the jax distributed system...
I0419 18:56:33.724684 136899099592512 distributed.py:149] Starting JAX distributed service on [::]:8482
I0419 18:56:33.727057 136899099592512 distributed.py:172] Connecting to JAX distributed service on mt-13-scan-layers-false-gyis8-slice-job-0-0.mt-13-scan-layers-false-gyis8:8482
I0419 18:56:35.413758 136899099592512 max_utils.py:284] Jax distributed system initialized!
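
The lines above show the JAX distributed runtime being brought up before any device work. A minimal sketch of the same pattern, assuming the coordinator address, process count, and process index are supplied by the job environment (the environment-variable names below are illustrative, not MaxText's actual configuration keys):

    import os
    import jax

    # Initialize the multi-host JAX runtime. On TPU slices, calling
    # jax.distributed.initialize() with no arguments is usually enough because
    # the runtime discovers the coordinator itself; explicit values are shown
    # here only to make the moving parts visible.
    jax.distributed.initialize(
        coordinator_address=os.environ.get("COORDINATOR_ADDRESS", "localhost:8482"),
        num_processes=int(os.environ.get("NUM_PROCESSES", "8")),
        process_id=int(os.environ.get("PROCESS_ID", "0")),
    )

    # After initialization every host sees the full device set.
    print(jax.process_index(), jax.device_count(), jax.local_device_count())
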
I0419 18:56:41.560554 136899099592512 max_utils.py:800] System Information: Jax Version: 0.9.2
I0419 18:56:41.560658 136899099592512 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0419 18:56:41.560698 136899099592512 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Mar 4 2026 11:32:08 (1772652728) cl/878335365
I0419 18:56:41.560731 136899099592512 train_utils.py:348] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0419 18:56:42.251157 136899099592512 maxtext_utils.py:1551] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
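
The 13-axis mesh shape (1, 1, 1, 32, 1, ...) places all 32 chips on a single non-trivial axis; the "fsdp: 32" print below and the physical shardings later in the log indicate that axis is fsdp. A sketch of an equivalent single-axis mesh (the real MaxText mesh carries additional size-1 named axes that are omitted here):

    import jax
    from jax.experimental import mesh_utils
    from jax.sharding import Mesh

    # Lay out all devices along one 'fsdp' axis, mirroring the only
    # non-trivial dimension of the mesh shape reported in the log.
    devices = mesh_utils.create_device_mesh((jax.device_count(),))
    mesh = Mesh(devices, axis_names=("fsdp",))
    print(mesh.shape)  # {'fsdp': 32} on this 32-chip slice
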
I0419 18:56:42.251455 136899099592512 checkpointing.py:677] Setting up checkpoint logger...
I0419 18:56:42.251510 136899099592512 checkpointing.py:233] Creating checkpoint manager with ocdbt=True and zarr3=True
I0419 18:56:42.251554 136899099592512 pytree_checkpoint_handler.py:592] save_device_host_concurrent_bytes=None
I0419 18:56:42.251884 136899099592512 base_pytree_checkpoint_handler.py:441] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7c819684d9d0>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0419 18:56:45.530897 136899099592512 checkpointing.py:265] Enabling policy for fixed interval checkpointing.
I0419 18:56:45.531130 136899099592512 checkpoint_manager.py:708] [process=0][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7c6d88441430>}, handler_registry=None
I0419 18:56:45.531382 136899099592512 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7c6d88441430>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0419 18:56:45.531432 136899099592512 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7c6d88446b40>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0419 18:56:45.531466 136899099592512 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7c6d88441430>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7c6d88441430>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7c6d88446b40>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7c6d88446b40>}).
I0419 18:56:45.531774 136899099592512 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.34
I0419 18:56:45.531843 136899099592512 async_checkpointer.py:192] [process=0][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x7c6d607c7c40> timeout: 1200 secs and primary_host=0 for async checkpoint writes
I0419 18:56:47.087750 136899099592512 checkpoint_manager.py:1812] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints
I0419 18:56:47.096864 136899099592512 checkpoint_manager.py:929] [process=0][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7c6da0171ee0>
I0419 18:56:47.096984 136899099592512 checkpointing.py:301] Checkpoint manager created!
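
The checkpoint setup above (ocdbt and zarr3 enabled, async writes with a 1200 s barrier timeout, saves on a fixed 10-step interval) is Orbax's CheckpointManager. A hedged sketch of configuring a similar manager with the public orbax.checkpoint API; save_interval_steps=10 is used here as an approximation of the FixedIntervalPolicy(interval=10) shown in the log:

    import orbax.checkpoint as ocp

    # Async checkpoint manager writing to the run's GCS checkpoint directory
    # roughly every 10 steps, matching the options echoed above.
    options = ocp.CheckpointManagerOptions(
        save_interval_steps=10,
        enable_async_checkpointing=True,
        create=True,
    )
    manager = ocp.CheckpointManager(
        "gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/"
        "linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints",
        options=options,
    )
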
I0419 18:56:48.021091 136899099592512 nnx_wrappers.py:437] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0419 18:56:48.021216 136899099592512 nnx_wrappers.py:437] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0419 18:56:48.397418 136899099592512 attentions.py:1088] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0419 18:56:48.397507 136899099592512 attentions.py:1088] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0419 18:56:48.413501 136899099592512 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0419 18:56:48.413567 136899099592512 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0419 18:56:48.443361 136899099592512 attentions.py:1154] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0419 18:56:48.443446 136899099592512 attentions.py:1154] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0419 18:56:48.459665 136899099592512 attentions.py:1155] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0419 18:56:48.459726 136899099592512 attentions.py:1155] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0419 18:56:48.475591 136899099592512 attentions.py:1156] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0419 18:56:48.475650 136899099592512 attentions.py:1156] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0419 18:56:48.505039 136899099592512 attentions.py:1197] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0419 18:56:48.505109 136899099592512 attentions.py:1197] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0419 18:56:48.526617 136899099592512 linears.py:525] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0419 18:56:48.526680 136899099592512 linears.py:525] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
I0419 18:56:51.367701 136899099592512 checkpointing.py:577] checkpoint manager exists so trying to load this run's existing checkpoint
I0419 18:56:51.367826 136899099592512 checkpointing.py:665] No existing checkpoints found, not restoring checkpoint.
fsdp: 32
I0419 18:56:58.581845 136899099592512 maxtext_utils.py:1654]  params/params/decoder/decoder_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.581973 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_0/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.582026 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_0/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.582082 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_0/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0419 18:56:58.582121 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_0/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.582157 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_0/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.582210 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_0/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.582261 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_0/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0419 18:56:58.582300 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_0/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.582334 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_0/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.582384 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_1/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.582424 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_1/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.582458 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_1/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0419 18:56:58.582488 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_1/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.582517 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_1/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.582549 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_1/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.582584 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_1/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0419 18:56:58.582618 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_1/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.582651 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_1/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.582684 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_10/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.582714 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_10/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.582744 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_10/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0419 18:56:58.582773 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_10/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.582801 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_10/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.582832 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_10/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.582865 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_10/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0419 18:56:58.582896 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_10/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.582926 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_10/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.582957 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_11/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.582987 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_11/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.583016 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_11/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0419 18:56:58.583044 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_11/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.583072 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_11/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.583101 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_11/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.583131 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_11/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0419 18:56:58.583161 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_11/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.583191 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_11/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.583220 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_12/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.583249 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_12/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.583278 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_12/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0419 18:56:58.583306 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_12/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.583333 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_12/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.583389 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_12/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.583430 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_12/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0419 18:56:58.583462 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_12/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.583493 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_12/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.583523 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_13/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.583552 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_13/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.583581 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_13/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0419 18:56:58.583609 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_13/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.583636 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_13/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.583665 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_13/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.583695 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_13/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0419 18:56:58.583724 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_13/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.583753 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_13/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.583782 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_14/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.583811 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_14/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.583841 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_14/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0419 18:56:58.583867 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_14/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.583894 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_14/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.583924 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_14/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.583953 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_14/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0419 18:56:58.583982 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_14/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.584011 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_14/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.584040 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_15/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.584068 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_15/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.584097 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_15/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0419 18:56:58.584125 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_15/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.584152 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_15/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.584181 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_15/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.584209 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_15/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0419 18:56:58.584238 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_15/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.584267 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_15/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.584296 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_2/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.584325 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_2/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.584354 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_2/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0419 18:56:58.584400 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_2/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.584430 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_2/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.584460 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_2/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.584491 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_2/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0419 18:56:58.584523 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_2/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.584553 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_2/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.584581 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_3/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.584610 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_3/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.584639 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_3/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0419 18:56:58.584666 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_3/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.584692 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_3/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.584721 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_3/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.584750 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_3/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0419 18:56:58.584778 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_3/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.584808 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_3/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.584837 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_4/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.584867 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_4/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.584896 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_4/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0419 18:56:58.584923 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_4/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.584950 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_4/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.584980 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_4/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.585009 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_4/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0419 18:56:58.585038 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_4/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.585067 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_4/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.585096 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_5/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.585125 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_5/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.585157 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_5/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0419 18:56:58.585183 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_5/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.585212 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_5/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.585243 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_5/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.585273 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_5/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0419 18:56:58.585304 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_5/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.585334 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_5/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.585376 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_6/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.585413 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_6/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.585443 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_6/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0419 18:56:58.585470 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_6/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.585497 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_6/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.585526 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_6/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.585555 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_6/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0419 18:56:58.585584 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_6/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.585613 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_6/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.585642 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_7/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.585671 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_7/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.585700 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_7/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0419 18:56:58.585727 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_7/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.585754 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_7/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.585782 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_7/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.585811 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_7/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0419 18:56:58.585839 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_7/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.585868 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_7/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.585896 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_8/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.585925 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_8/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.585952 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_8/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0419 18:56:58.585979 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_8/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.586005 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_8/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.586033 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_8/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.586061 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_8/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0419 18:56:58.586090 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_8/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.586118 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_8/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.586148 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_9/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.586178 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_9/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 18:56:58.586206 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_9/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0419 18:56:58.586233 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_9/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.586260 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_9/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0419 18:56:58.586289 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_9/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.586318 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_9/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0419 18:56:58.586346 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_9/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.586394 136899099592512 maxtext_utils.py:1654]  params/params/decoder/layers_9/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 18:56:58.586447 136899099592512 maxtext_utils.py:1654]  params/params/decoder/logits_dense/kernel
    Shape:     float32[2048,32000]
    Logical:   P('embed_vocab', 'vocab')
    Physical:  ('fsdp', None)
I0419 18:56:58.586492 136899099592512 maxtext_utils.py:1654]  params/params/token_embedder/embedding
    Shape:     float32[32000,2048]
    Logical:   P('vocab', 'embed_vocab')
    Physical:  (None, 'fsdp')
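
Each parameter above carries a logical PartitionSpec (e.g. P('embed', 'mlp')) and the physical mesh axes it resolves to (e.g. ('fsdp', None)): with this mesh, only the embedding-like logical axes map to fsdp and every other parameter dimension is replicated. A sketch of writing such resolved shardings directly with jax.sharding, assuming the single-axis mesh from the earlier sketch:

    from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

    def mlp_kernel_sharding(mesh: Mesh) -> NamedSharding:
        # Physical sharding for a [2048, 7168] wi kernel whose logical spec
        # P('embed', 'mlp') resolved to ('fsdp', None): shard the embed
        # dimension 32 ways, replicate the mlp dimension.
        return NamedSharding(mesh, P("fsdp", None))

    def wo_kernel_sharding(mesh: Mesh) -> NamedSharding:
        # P('mlp', 'embed') resolved to (None, 'fsdp'): the sharded axis
        # follows wherever 'embed' lands.
        return NamedSharding(mesh, P(None, "fsdp"))
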
I0419 18:57:03.050986 136899099592512 train.py:155] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0419 18:57:03.051080 136899099592512 train.py:155] train/xent Physical: float32[32,2048]............................................ ('fsdp', None).
I0419 18:57:03.066507 136899099592512 train.py:162] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0419 18:57:03.066567 136899099592512 train.py:162] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
I0419 18:58:00.112921 136899099592512 max_utils.py:791] Total memory size: 1.8 GB, Output size: 0.4 GB, Temp size: 1.5 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0419 18:58:00.121006 136899099592512 metric_logger.py:301] number parameters: 1.104 billion
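
The 1.104 billion figure is consistent with the shapes listed above: 16 decoder layers with model dimension 2048, MLP width 7168, 16 attention heads of dimension 128, and a 32,000-token vocabulary. A quick arithmetic check using only the logged shapes:

    # Per-layer parameters, from the shapes logged above.
    mlp = 3 * 2048 * 7168           # wi_0, wi_1, wo kernels
    attn = 4 * 2048 * 16 * 128      # query, key, value, out kernels
    norms = 2 * 2048                # pre/post self-attention layer norms
    per_layer = mlp + attn + norms  # 60,821,504

    # Embedding, unembedding, and the final decoder norm.
    embed = 32000 * 2048            # token_embedder/embedding
    unembed = 2048 * 32000          # logits_dense/kernel
    final_norm = 2048               # decoder_norm/scale

    total = 16 * per_layer + embed + unembed + final_norm
    print(total)                    # 1,104,218,112, i.e. ~1.104 billion
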
I0419 18:59:02.044846 136899099592512 checkpointing.py:772] Waiting for step 0 to finish before checkpoint...
I0419 18:59:02.271443 136899099592512 checkpointing.py:776] Waited 0.2265777587890625 seconds for step 0 to finish before starting checkpointing.
I0419 18:59:02.275105 136899099592512 checkpoint_manager.py:2009] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0419 18:59:02.277005 136899099592512 checkpoint_manager.py:1512] [process=0] Saving checkpoint at step 0
I0419 18:59:02.278359 136899099592512 event_tracking.py:70] [process=0] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/0.
I0419 18:59:02.571056 136899099592512 signaling_client.py:364] Using JaxDistributedSignalingClient
I0419 18:59:02.617152 136899099592512 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0419 18:59:02.617305 136899099592512 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0419 18:59:02.910907 136899099592512 base_pytree_checkpoint_handler.py:154] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.295090s
I0419 18:59:02.911739 136899099592512 base_pytree_checkpoint_handler.py:130] [process=0] /jax/orbax/write/blocking_gbytes_per_sec: 4.631 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.3331174850463867 s) (per-host)
I0419 18:59:02.911798 136899099592512 base_pytree_checkpoint_handler.py:768] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 0.333193s (batch_requests_ready=0.018197s, total_serialization_initiated=0.314245s, others=0.000751s)
I0419 18:59:02.911908 136899099592512 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.337069s (all_items=0.000017s, per_item={'items': '0.00001717'}, temp_paths=0.337052)
I0419 18:59:02.912713 136899099592512 event_tracking.py:125] [process=0] [async] Finished blocking save in 0.64 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/0.
I0419 18:59:02.913007 136770280748800 async_checkpointer.py:76] [process=0][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-19 19:19:02.912976
I0419 18:59:02.962284 136899099592512 checkpoint_manager.py:1560] [process=0][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0419 18:59:02.962645 136769743877888 async_checkpointer.py:280] [process=0][thread=save_finalize] Waiting for background save thread=async_save.
I0419 18:59:02.962805 136899099592512 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776625142.2750876, 'wait_for_prev_duration_secs': 5.936622619628906e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776625142.277044, 'checkpointer_blocking_duration_secs': 0.6361031532287598, 'get_old_steps_start_time': 1776625142.9131753, 'get_old_steps_duration_secs': 2.8133392333984375e-05, 'checkpoint_manager_blocking_start_time': 1776625142.2726257, 'checkpoint_manager_blocking_duration_secs': 0.6901395320892334}
I0419 18:59:02.962995 136899099592512 checkpointing.py:408] Started an asynchronous checkpoint save for step 0
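
The blocking part of the save (device-to-host transfer plus write initiation) took about 0.64 s; the remaining GCS writes run on a background thread while training continues. A hedged sketch of the same call pattern with Orbax; the manager and state names refer to the manager from the earlier sketch and the training-state pytree, and are illustrative:

    import orbax.checkpoint as ocp

    def save_async(manager: ocp.CheckpointManager, step: int, state):
        # Returns once arrays are copied off-device and the background write
        # thread has started, so the next training step can run immediately.
        manager.save(step, args=ocp.args.StandardSave(state))

    def finish_pending_saves(manager: ocp.CheckpointManager):
        # Blocks until background writes and directory finalization complete;
        # this wait is what produces the later "Waiting for previous save to
        # complete took ... seconds" warning when saves are scheduled too
        # close together.
        manager.wait_until_finished()
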
I0419 18:59:02.963048 136899099592512 max_utils.py:750] 
Memstats: After params initialized:
I0419 18:59:02.963098 136899099592512 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_0(process=0,(0,0,0,0))
I0419 18:59:02.963131 136899099592512 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_1(process=0,(1,0,0,0))
I0419 18:59:02.963162 136899099592512 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_4(process=0,(0,1,0,0))
I0419 18:59:02.963186 136899099592512 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_5(process=0,(1,1,0,0))
I0419 18:59:03.258085 136770272356096 atomicity.py:140] Creating tmp directory gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/0
I0419 18:59:03.350915 136899099592512 metric_logger.py:196] completed step: 0, seconds: 61.337, TFLOP/s/device: 0.222, Tokens/s/device: 33.389, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0419 18:59:03.353817 136899099592512 metric_logger.py:281] To see full metrics 'tensorboard --logdir=gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/tensorboard/'
I0419 18:59:03.776097 136899099592512 metric_logger.py:196] completed step: 1, seconds: 1.297, TFLOP/s/device: 10.479, Tokens/s/device: 1579.540, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0419 18:59:03.907622 136899099592512 metric_logger.py:196] completed step: 2, seconds: 0.431, TFLOP/s/device: 31.495, Tokens/s/device: 4747.334, total_weights: 65536, loss: 10.268, lm_loss: 10.268, perplexity: 28793.863
I0419 18:59:04.046509 136899099592512 metric_logger.py:196] completed step: 3, seconds: 0.013, TFLOP/s/device: 1038.216, Tokens/s/device: 156491.174, total_weights: 65536, loss: 9.741, lm_loss: 9.741, perplexity: 17001.510
I0419 18:59:06.488120 136769717151488 checkpoint.py:188] Wrote Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776625146083516043, 'commit_timestamp_nsecs': None, 'custom_metadata': {}}, json={"item_handlers": null, "metrics": {}, "performance_metrics": {}, "init_timestamp_nsecs": 1776625146083516043, "commit_timestamp_nsecs": null, "custom_metadata": {}} to gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0419 18:59:06.764709 136770255570688 atomicity.py:140] Creating tmp directory gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/0/items
I0419 18:59:08.275256    2852 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0419 18:59:11.922166 136770238785280 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_0
I0419 18:59:29.209734 136899099592512 metric_logger.py:196] completed step: 4, seconds: 0.133, TFLOP/s/device: 102.489, Tokens/s/device: 15448.326, total_weights: 65536, loss: 9.285, lm_loss: 9.285, perplexity: 10780.271
I0419 18:59:29.225504 136899099592512 metric_logger.py:196] completed step: 5, seconds: 0.140, TFLOP/s/device: 97.253, Tokens/s/device: 14659.041, total_weights: 65536, loss: 8.901, lm_loss: 8.901, perplexity: 7336.094
I0419 18:59:29.355003 136899099592512 metric_logger.py:196] completed step: 6, seconds: 25.162, TFLOP/s/device: 0.540, Tokens/s/device: 81.391, total_weights: 65536, loss: 8.602, lm_loss: 8.602, perplexity: 5440.973
I0419 18:59:29.491124 136899099592512 metric_logger.py:196] completed step: 7, seconds: 0.013, TFLOP/s/device: 1075.271, Tokens/s/device: 162076.607, total_weights: 65536, loss: 8.394, lm_loss: 8.394, perplexity: 4418.633
I0419 18:59:29.631897 136899099592512 metric_logger.py:196] completed step: 8, seconds: 0.131, TFLOP/s/device: 103.742, Tokens/s/device: 15637.169, total_weights: 65536, loss: 8.264, lm_loss: 8.264, perplexity: 3882.280
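
The throughput figures are per device: total_weights of 65536 tokens per step across 32 chips gives 2048 tokens per chip per step, so Tokens/s/device is just 2048 divided by the step time (the logged seconds are rounded, so the check is approximate; step 0 also includes compilation, hence its low rate):

    tokens_per_device = 65536 // 32      # 2048 tokens per chip per step
    print(tokens_per_device / 61.337)    # ~33.4, matches step 0 (33.389)
    print(tokens_per_device / 1.297)     # ~1579, matches step 1 (1579.540)
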
I0419 18:59:29.767473 136899099592512 checkpointing.py:772] Waiting for step 9 to finish before checkpoint...
I0419 18:59:29.770952 136899099592512 checkpointing.py:776] Waited 0.0034961700439453125 seconds for step 9 to finish before starting checkpointing.
I0419 18:59:29.773668 136899099592512 checkpoint_manager.py:2020] [process=0][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0419 18:59:42.404230 136770230392576 base_pytree_checkpoint_handler.py:1282] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 34.162601s (commit=32.517947s, array_metadata_write=1.644654s)
I0419 18:59:42.405992 136770280748800 base_pytree_checkpoint_handler.py:130] [process=0] /jax/orbax/write/gbytes_per_sec: 39.661 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 39.82732653617859 s) (per-host)
I0419 18:59:42.406123 136770280748800 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 39.493008s.
I0419 18:59:43.967430 136770280748800 checkpoint.py:228] Read Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776625146083516043, 'commit_timestamp_nsecs': None, 'custom_metadata': {}} from gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0419 18:59:45.154966 136769717151488 checkpoint.py:247] Updated Metadata={'item_handlers': {'items': 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler'}, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776625146083516043, 'commit_timestamp_nsecs': None, 'custom_metadata': {}} to gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0419 18:59:46.076230 136770280748800 array_metadata_store.py:411] [process=0][thread=async_save] Validated ArrayMetadata from all 8 hosts. Time taken: 0.002259s.
I0419 18:59:46.363794 136770280748800 ocdbt_utils.py:49] Param validation support for Zarr3 will be added later (b/362328389).
I0419 18:59:47.773298 136770280748800 base_pytree_checkpoint_handler.py:1406] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 3.259255s. use_zarr3=True, enable_post_merge_validation=True, directory=gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/0/items
I0419 18:59:47.774956 136770280748800 atomicity.py:666] Finalizing gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/0/items
I0419 18:59:48.285036 136770280748800 atomicity.py:666] Finalizing gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/0
I0419 18:59:49.826486 136770280748800 atomicity.py:847] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/0`.
I0419 18:59:49.827350 136770280748800 event_tracking.py:138] [process=0] [async] Finished save (blocking + background) in 47.55 seconds @ gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/0
I0419 18:59:49.828814 136770280748800 async_checkpointer.py:160] [process=0][thread=async_save] Background save thread done. Time taken: 46.915697s.
I0419 18:59:49.829019 136769743877888 async_checkpointer.py:288] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save.
I0419 18:59:49.829149 136769743877888 async_checkpointer.py:298] [process=0][thread=save_finalize] No errors found in background save thread=async_save.
I0419 18:59:49.829233 136769743877888 checkpoint_manager.py:2137] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0419 18:59:49.831889 136769743877888 checkpoint_manager.py:2146] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0419 18:59:49.832077 136899099592512 checkpoint_manager.py:2032] [process=0][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0419 18:59:49.832213 136899099592512 checkpoint_manager.py:1452] Waiting for previous save to complete took 20.058545 seconds. If this number is high, consider checkpointing less frequently.
I0419 18:59:49.834684 136899099592512 checkpoint_manager.py:1512] [process=0] Saving checkpoint at step 9
I0419 18:59:49.836769 136899099592512 event_tracking.py:70] [process=0] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/9.
I0419 18:59:50.573454 136899099592512 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0419 18:59:50.573617 136899099592512 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0419 18:59:50.695451 136899099592512 base_pytree_checkpoint_handler.py:154] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.123321s
I0419 18:59:50.696310 136899099592512 base_pytree_checkpoint_handler.py:130] [process=0] /jax/orbax/write/blocking_gbytes_per_sec: 9.667 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.1595771312713623 s) (per-host)
I0419 18:59:50.696395 136899099592512 base_pytree_checkpoint_handler.py:768] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 0.159679s (batch_requests_ready=0.017618s, total_serialization_initiated=0.141256s, others=0.000805s)
I0419 18:59:50.696527 136899099592512 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.163775s (all_items=0.000017s, per_item={'items': '0.00001693'}, temp_paths=0.163758)
I0419 18:59:50.697300 136899099592512 event_tracking.py:125] [process=0] [async] Finished blocking save in 0.86 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/9.
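The "D2H" scheduled a few lines above is the device-to-host copy of each of the 444 arrays in the training state; with use_replica_parallel=True, Orbax can split the copy of replicated arrays across the replicas that hold them. Stripped of that machinery, the blocking phase boils down to fetching a jax.Array into host memory, as in this toy sketch (the array below is a stand-in, not one of this run's parameters):

    import jax
    import jax.numpy as jnp

    # Toy stand-in for one checkpointed array (this run transfers 444 of them).
    x = jnp.ones((2048, 1024), dtype=jnp.bfloat16)

    # Blocking device-to-host copy; aggregated over all arrays, this is what the
    # blocking_gbytes_per_sec line above measures.
    host_x = jax.device_get(x)
    print(f"{host_x.nbytes / 2**20:.1f} MiB now on host")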
I0419 18:59:50.697675 136769743877888 async_checkpointer.py:76] [process=0][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-19 19:19:50.697641
I0419 18:59:50.708956 136899099592512 checkpoint_manager.py:1560] [process=0][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0419 18:59:50.709209 136769732429568 async_checkpointer.py:280] [process=0][thread=save_finalize] Waiting for background save thread=async_save.
I0419 18:59:50.709350 136899099592512 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776625169.773636, 'wait_for_prev_duration_secs': 20.058544635772705, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776625189.834724, 'checkpointer_blocking_duration_secs': 0.863055944442749, 'get_old_steps_start_time': 1776625190.6978045, 'get_old_steps_duration_secs': 2.8371810913085938e-05, 'checkpoint_manager_blocking_start_time': 1776625169.7717824, 'checkpoint_manager_blocking_duration_secs': 20.937533855438232}
I0419 18:59:50.709537 136899099592512 checkpointing.py:408] Started an asynchronous checkpoint save for step 9
I0419 18:59:50.709583 136899099592512 checkpoint_manager.py:2020] [process=0][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
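The sequence above (blocking save finished in 0.86 s, background async_save thread started, save_finalize thread waiting) is the two-phase asynchronous checkpoint seen from inside Orbax. From the caller's side the same pattern looks roughly like the sketch below, assuming a recent orbax.checkpoint CheckpointManager API; the directory, options, and state PyTree are placeholders, not this run's configuration:

    import jax.numpy as jnp
    import orbax.checkpoint as ocp

    # Placeholder directory and options: none of this is this run's configuration.
    mngr = ocp.CheckpointManager(
        "/tmp/orbax_demo",  # stands in for the gs://... checkpoint directory
        options=ocp.CheckpointManagerOptions(enable_async_checkpointing=True),
    )

    state = {"params": jnp.zeros((1024,)), "step": 0}  # stand-in training PyTree

    for step in range(3):
        state = {**state, "step": step}  # stand-in for the real train step
        # Returns once the blocking phase (device-to-host copy) is done;
        # serialization to storage continues on the background async_save thread.
        mngr.save(step, args=ocp.args.StandardSave(state))

    # Block until the last background save and its finalize complete,
    # mirroring the wait_until_finished lines in this log.
    mngr.wait_until_finished()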
I0419 18:59:50.823539 136770280748800 atomicity.py:140] Creating tmp directory gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/9
I0419 18:59:52.998337 136769691973376 atomicity.py:140] Creating tmp directory gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/9/items
I0419 18:59:59.753016 136770238785280 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_0
I0419 19:00:38.330709 136770230392576 base_pytree_checkpoint_handler.py:1282] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 42.486325s (commit=40.939312s, array_metadata_write=1.547013s)
I0419 19:00:38.332500 136769743877888 base_pytree_checkpoint_handler.py:130] [process=0] /jax/orbax/write/gbytes_per_sec: 33.049 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 47.795769691467285 s) (per-host)
I0419 19:00:38.332563 136769743877888 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 47.634068s.
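The two gbytes_per_sec lines for step 9 quantify the asymmetry between the phases: roughly 1.5 GiB per host leaves the devices at ~9.7 GiB/s during the blocking copy, while the background commit to GCS sustains only ~33 MiB/s, which is why the whole save takes ~48 s even though training is stalled for well under a second. A quick consistency check on the reported figures (values copied from the log; the small discrepancies come from the rounded 1.5 GiB totals):

    GIB, MIB = 2**30, 2**20

    # Blocking phase (step 9): ~1.5 GiB copied device-to-host in ~0.160 s per host.
    blocking_gib, blocking_s = 1.5, 0.1596
    print(f"blocking   ~ {blocking_gib / blocking_s:.2f} GiB/s")                   # log: 9.667 GiB/s

    # Background phase (step 9): the same ~1.5 GiB committed to GCS in ~47.8 s.
    background_gib, background_s = 1.5, 47.796
    print(f"background ~ {background_gib * GIB / MIB / background_s:.1f} MiB/s")   # log: 33.049 MiB/s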
I0419 19:00:40.727640 136769743877888 array_metadata_store.py:411] [process=0][thread=async_save] Validated ArrayMetadata from all 8 hosts. Time taken: 0.002314s.
I0419 19:00:40.767216 136769743877888 ocdbt_utils.py:49] Param validation support for Zarr3 will be added later (b/362328389).
I0419 19:00:42.241482 136769743877888 base_pytree_checkpoint_handler.py:1406] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 2.672270s. use_zarr3=True, enable_post_merge_validation=True, directory=gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/9/items
I0419 19:00:42.244354 136769743877888 atomicity.py:666] Finalizing gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/9/items
I0419 19:00:42.778444 136769743877888 atomicity.py:666] Finalizing gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/9
I0419 19:00:44.566304 136769743877888 atomicity.py:847] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/9`.
I0419 19:00:44.567135 136769743877888 event_tracking.py:138] [process=0] [async] Finished save (blocking + background) in 54.73 seconds @ gs://lance-maxtext/linen_ckpt_xpk_main_20260419_180002/linen_xpk_main_20260419_180002_13_scan_layers_false/checkpoints/9
I0419 19:00:44.568446 136769743877888 async_checkpointer.py:160] [process=0][thread=async_save] Background save thread done. Time taken: 53.869948s.
I0419 19:00:44.568619 136769732429568 async_checkpointer.py:288] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save.
I0419 19:00:44.568683 136769732429568 async_checkpointer.py:298] [process=0][thread=save_finalize] No errors found in background save thread=async_save.
I0419 19:00:44.568752 136769732429568 checkpoint_manager.py:2137] [process=0][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0419 19:00:44.571619 136769732429568 checkpoint_manager.py:2146] [process=0][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0419 19:00:44.571786 136899099592512 checkpoint_manager.py:2032] [process=0][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0419 19:00:44.571929 136899099592512 checkpoint_manager.py:2009] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0419 19:00:44.572900 136899099592512 metric_logger.py:196] completed step: 9, seconds: 0.136, TFLOP/s/device: 100.008, Tokens/s/device: 15074.341, total_weights: 65536, loss: 8.188, lm_loss: 8.188, perplexity: 3598.871
Per train step: Total TFLOPs: 13.59, split as 93.93% learnable weight flops and 6.07% attention flops
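The step-9 metrics line and the per-step FLOP summary above are mutually consistent: the logged rates follow from the raw counts, as the short check below shows (values copied from the log; the small differences come from the rounded 0.136 s step time):

    # Values copied from this log.
    step_time_s = 0.136        # "completed step: 9, seconds: 0.136"
    tflops_per_step = 13.59    # per device, from the "Per train step" summary
    total_tokens = 65536       # total_weights for the global batch
    num_devices = 32           # device count reported at startup

    print(f"TFLOP/s/device  ~ {tflops_per_step / step_time_s:.1f}")                # log: 100.008
    print(f"Tokens/s/device ~ {total_tokens / num_devices / step_time_s:.0f}")     # log: 15074.341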
XPK End: Sun Apr 19 19:00:55 UTC 2026
EXIT_CODE=0