MaxView

Case: 13_scan_layers_false

Metrics: main (0ba93e21a) vs feat/nnx-post-train-fixes (d8cde296b)

Metric        main (0ba93e21a)    feat/nnx-post-train-fixes (d8cde296b)    Diff (feat/nnx-post-train-fixes − main)
Parameters    1.104 billion       1.104 billion
Final loss    5.9300              5.9300                                   0
TFLOP/s       125.131             124.407                                  -0.724
Tok/s         18861.1             18752.0                                  -109.145
Avg s/step    2.707               2.624                                    -0.083
Memory %      6.91                6.91                                     0
JAX           0.9.2               0.9.2

Diff = branch value − main value. Green = branch improved. Red = branch regressed.
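For reference, the Diff column can be reproduced from the two runs' summary metrics. The sketch below is illustrative only: it uses the rounded values from the table above, so small rounding differences against the dashboard (which subtracts unrounded values) are expected.

# Illustrative: recompute the Diff column as branch value − main value.
main_run   = {"Final loss": 5.9300, "TFLOP/s": 125.131, "Tok/s": 18861.1, "Avg s/step": 2.707, "Memory %": 6.91}
branch_run = {"Final loss": 5.9300, "TFLOP/s": 124.407, "Tok/s": 18752.0, "Avg s/step": 2.624, "Memory %": 6.91}

for metric, main_value in main_run.items():
    diff = branch_run[metric] - main_value
    print(f"{metric:12s} main={main_value:10.3f} branch={branch_run[metric]:10.3f} diff={diff:+.3f}")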

main  ·  0ba93e21a  ·  main_20260416_123638  ·  full log
2026-04-16 13:17:20.185490: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0416 13:17:20.304062 132981438057600 max_utils.py:238] Skipping jax distributed system due to skip_jax_distributed_system=True flag.
I0416 13:18:18.193039 132981438057600 max_utils.py:800] System Information: Jax Version: 0.9.2
I0416 13:18:18.193245 132981438057600 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0416 13:18:18.193280 132981438057600 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Apr 6 2026 20:48:10 (1775533690) cl/895581894
I0416 13:18:18.193307 132981438057600 train_utils.py:334] WARNING: 'dataset_path' might be pointing your local file system
I0416 13:18:18.193329 132981438057600 train_utils.py:347] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0416 13:18:18.193413 132981438057600 train.py:683] [DECOUPLED NO-OP] skipping cloud diagnostics wrapper.
W0416 13:18:18.285721 3862878 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0416 13:18:18.685256 132981438057600 maxtext_utils.py:1517] Num_devices: 8, shape (1, 1, 1, 8, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0416 13:18:18.685572 132981438057600 checkpointing.py:677] Setting up checkpoint logger...
I0416 13:18:18.685616 132981438057600 checkpointing.py:233] Creating checkpoint manager with ocdbt=True and zarr3=True
I0416 13:18:18.685658 132981438057600 pytree_checkpoint_handler.py:577] save_device_host_concurrent_bytes=None
I0416 13:18:18.686275 132981438057600 base_pytree_checkpoint_handler.py:411] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x78f1907f9550>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0416 13:18:21.130900 132981438057600 checkpointing.py:265] Enabling policy for fixed interval checkpointing.
I0416 13:18:21.131311 132981438057600 checkpoint_manager.py:702] [process=0][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x78eb14dc99a0>}, handler_registry=None
I0416 13:18:21.131844 132981438057600 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x78eb14dc99a0>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0416 13:18:21.131896 132981438057600 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x78eb1413a1b0>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0416 13:18:21.131932 132981438057600 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x78eb14dc99a0>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x78eb14dc99a0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x78eb1413a1b0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x78eb1413a1b0>}).
I0416 13:18:21.132646 132981438057600 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.28
I0416 13:18:21.132722 132981438057600 async_checkpointer.py:177] [process=0][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>.<lambda> at 0x78eb1412d260> timeout: 600 secs and primary_host=0 for async checkpoint writes
I0416 13:18:21.258694 132981438057600 checkpoint_manager.py:1788] Found 0 checkpoint steps in gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints
I0416 13:18:21.258953 132981438057600 checkpoint_manager.py:921] [process=0][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_hns=False, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False), root_directory=gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x78eb141383b0>
I0416 13:18:21.259065 132981438057600 checkpointing.py:301] Checkpoint manager created!
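The options logged above (ocdbt/zarr3, enable_async_checkpointing=True, a fixed save interval) correspond to a fairly standard Orbax setup. A minimal sketch of such an async CheckpointManager follows, assuming the public orbax.checkpoint API; the bucket path and interval are hypothetical, and this is not the MaxText wiring itself.

import orbax.checkpoint as ocp

# Illustrative async CheckpointManager, mirroring the logged options above.
options = ocp.CheckpointManagerOptions(
    save_interval_steps=10,          # illustrative fixed interval
    enable_async_checkpointing=True,
)
manager = ocp.CheckpointManager("gs://my-bucket/my-run/checkpoints", options=options)

# Inside the training loop: save() returns quickly and finishes in a background thread.
# manager.save(step, args=ocp.args.StandardSave(train_state))
# manager.wait_until_finished()  # block before restoring or exiting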
W0416 13:18:21.277935 3862878 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0416 13:18:21.469704 132981438057600 nnx_wrappers.py:437] Unknown Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0416 13:18:21.469794 132981438057600 nnx_wrappers.py:437] Unknown Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
I0416 13:18:21.566205 132981438057600 attentions.py:1088] attentions/inputs_q Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0416 13:18:21.566280 132981438057600 attentions.py:1088] attentions/inputs_q Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
I0416 13:18:21.580242 132981438057600 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0416 13:18:21.580289 132981438057600 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
I0416 13:18:21.606562 132981438057600 attentions.py:1154] attentions/query Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0416 13:18:21.606621 132981438057600 attentions.py:1154] attentions/query Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0416 13:18:21.620807 132981438057600 attentions.py:1155] attentions/key Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0416 13:18:21.620856 132981438057600 attentions.py:1155] attentions/key Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0416 13:18:21.634803 132981438057600 attentions.py:1156] attentions/value Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0416 13:18:21.634852 132981438057600 attentions.py:1156] attentions/value Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0416 13:18:21.661563 132981438057600 attentions.py:1197] attentions/out Logical: bfloat16[8,2048,16,128]..................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0416 13:18:21.661625 132981438057600 attentions.py:1197] attentions/out Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0416 13:18:21.680527 132981438057600 linears.py:525] linears/x Logical: bfloat16[8,2048,7168]....................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0416 13:18:21.680577 132981438057600 linears.py:525] linears/x Physical: bfloat16[8,2048,7168]....................................... ('fsdp', None, None).
I0416 13:18:24.200985 132981438057600 checkpointing.py:577] checkpoint manager exists so trying to load this run's existing checkpoint
I0416 13:18:24.201104 132981438057600 checkpointing.py:665] No existing checkpoints found, not restoring checkpoint.
W0416 13:18:24.749011 3862878 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
[DECOUPLED NO-OP] gcs_storage: using stubs.
[DECOUPLED NO-OP] mldiagnostics: using stub.
[DECOUPLED NO-OP] mldiagnostics: using stub.
[DECOUPLED NO-OP] mldiagnostics: using stub.
[DECOUPLED NO-OP] workload_monitor: using stub.
[DECOUPLED NO-OP] vertex_tensorboard: using stub.
fsdp: 8
I0416 13:18:27.519118 132981438057600 maxtext_utils.py:1620]  params/params/decoder/decoder_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.519231 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_0/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.519273 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_0/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.519314 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_0/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 13:18:27.519342 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_0/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.519366 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_0/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.519405 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_0/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.519442 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_0/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 13:18:27.519469 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_0/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.519494 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_0/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.519519 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_1/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.519542 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_1/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.519566 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_1/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 13:18:27.519587 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_1/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.519609 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_1/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.519633 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_1/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.519656 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_1/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 13:18:27.519679 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_1/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.519703 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_1/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.519727 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_10/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.519749 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_10/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.519771 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_10/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 13:18:27.519793 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_10/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.519814 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_10/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.519836 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_10/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.519864 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_10/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 13:18:27.519887 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_10/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.519910 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_10/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.519931 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_11/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.519952 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_11/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.519973 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_11/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 13:18:27.519993 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_11/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.520016 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_11/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.520040 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_11/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.520080 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_11/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 13:18:27.520103 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_11/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.520125 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_11/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.520146 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_12/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.520168 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_12/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.520191 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_12/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 13:18:27.520212 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_12/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.520232 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_12/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.520254 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_12/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.520277 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_12/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 13:18:27.520298 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_12/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.520321 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_12/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.520342 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_13/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.520364 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_13/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.520385 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_13/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 13:18:27.520405 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_13/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.520424 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_13/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.520446 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_13/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.520467 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_13/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 13:18:27.520488 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_13/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.520510 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_13/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.520530 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_14/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.520551 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_14/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.520572 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_14/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 13:18:27.520591 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_14/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.520611 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_14/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.520631 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_14/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.520652 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_14/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 13:18:27.520673 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_14/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.520694 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_14/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.520715 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_15/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.520736 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_15/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.520757 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_15/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 13:18:27.520777 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_15/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.520797 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_15/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.520817 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_15/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.520838 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_15/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 13:18:27.520865 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_15/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.520887 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_15/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.520907 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_2/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.520928 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_2/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.520949 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_2/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 13:18:27.520969 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_2/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.520989 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_2/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.521010 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_2/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.521032 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_2/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 13:18:27.521069 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_2/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.521094 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_2/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.521116 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_3/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.521137 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_3/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.521157 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_3/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 13:18:27.521177 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_3/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.521197 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_3/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.521218 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_3/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.521239 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_3/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 13:18:27.521260 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_3/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.521283 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_3/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.521304 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_4/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.521325 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_4/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.521346 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_4/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 13:18:27.521367 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_4/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.521386 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_4/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.521407 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_4/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.521428 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_4/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 13:18:27.521450 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_4/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.521470 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_4/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.521491 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_5/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.521512 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_5/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.521533 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_5/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 13:18:27.521552 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_5/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.521572 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_5/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.521593 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_5/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.521614 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_5/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 13:18:27.521636 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_5/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.521657 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_5/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.521677 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_6/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.521698 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_6/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.521719 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_6/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 13:18:27.521739 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_6/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.521758 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_6/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.521779 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_6/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.521800 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_6/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 13:18:27.521821 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_6/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.521842 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_6/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.521867 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_7/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.521888 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_7/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.521910 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_7/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 13:18:27.521930 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_7/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.521949 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_7/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.521970 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_7/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.521991 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_7/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 13:18:27.522012 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_7/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.522033 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_7/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.522066 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_8/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.522088 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_8/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.522108 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_8/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 13:18:27.522128 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_8/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.522148 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_8/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.522169 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_8/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.522190 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_8/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 13:18:27.522210 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_8/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.522231 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_8/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.522252 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_9/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.522272 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_9/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 13:18:27.522293 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_9/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 13:18:27.522313 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_9/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.522332 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_9/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 13:18:27.522353 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_9/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.522374 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_9/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 13:18:27.522395 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_9/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.522416 132981438057600 maxtext_utils.py:1620]  params/params/decoder/layers_9/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 13:18:27.522453 132981438057600 maxtext_utils.py:1620]  params/params/decoder/logits_dense/kernel
    Shape:     float32[2048,32000]
    Logical:   P('embed_vocab', 'vocab')
    Physical:  ('fsdp', None)
I0416 13:18:27.522485 132981438057600 maxtext_utils.py:1620]  params/params/token_embedder/embedding
    Shape:     float32[32000,2048]
    Logical:   P('vocab', 'embed_vocab')
    Physical:  (None, 'fsdp')

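The Logical/Physical pairs above show logical axis names being resolved onto the single 'fsdp' mesh axis of size 8: 'embed' shards over fsdp, everything else stays replicated. A minimal, self-contained sketch of that mapping using plain jax.sharding primitives; the rule table, parameter shape, and function name here are illustrative, not MaxText's actual configuration, and it assumes the 8-device host seen in this run.

import jax
import numpy as np
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# Illustrative: an 8-way mesh with a single 'fsdp' axis, as in "fsdp: 8" above.
mesh = Mesh(np.array(jax.devices()).reshape(8), axis_names=("fsdp",))

# Hypothetical logical -> mesh-axis rules; only 'embed' is sharded, the rest replicated.
rules = {"embed": "fsdp", "mlp": None, "heads": None, "kv": None, "norm": None}

def logical_to_physical(logical_axes):
    """Map logical axis names to a PartitionSpec over the mesh."""
    return P(*(rules.get(name) for name in logical_axes))

# Example: a kernel annotated P('embed', 'mlp') lands on ('fsdp', None),
# matching the wi_0/wi_1 entries in the dump above.
spec = logical_to_physical(("embed", "mlp"))
sharding = NamedSharding(mesh, spec)
print(spec, sharding)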
I0416 13:18:31.181861 132981438057600 train.py:155] train/xent Logical: float32[8,2048]............................................. ('activation_embed_and_logits_batch', 'activation_length').
I0416 13:18:31.181964 132981438057600 train.py:155] train/xent Physical: float32[8,2048]............................................. ('fsdp', None).
I0416 13:18:31.195710 132981438057600 train.py:162] train/z_loss Logical: float32[8,2048]............................................. ('activation_embed_and_logits_batch', 'activation_length').
I0416 13:18:31.195761 132981438057600 train.py:162] train/z_loss Physical: float32[8,2048]............................................. ('fsdp', None).
W0416 13:18:33.944787 3862878 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0416 13:18:36.219308 132981438057600 max_utils.py:791] Total memory size: 3.3 GB, Output size: 1.5 GB, Temp size: 1.8 GB, Argument size: 1.5 GB, Host temp size: 0.0 GB.
I0416 13:18:36.220002 132981438057600 max_utils.py:194] tensorboardX not available; using no-op SummaryWriter.
I0416 13:18:36.220455 132981438057600 metric_logger.py:289] number parameters: 1.104 billion
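The 1.104 billion figure is consistent with the shapes in the dump above: 16 decoder layers, each with three 2048x7168 MLP kernels, four 2048x16x128 attention kernels, and two norm scales, plus the 32000x2048 token embedding, the 2048x32000 logits matrix, and the final decoder norm. A quick back-of-the-envelope check (illustrative arithmetic only):

# Illustrative check of the reported parameter count from the shapes logged above.
layers = 16
mlp = 3 * 2048 * 7168               # wi_0, wi_1, wo
attn = 4 * 2048 * 16 * 128          # query, key, value, out
norms = 2 * 2048                    # pre/post attention layer norms
per_layer = mlp + attn + norms
embeddings = 2 * 32000 * 2048       # token_embedder + logits_dense
total = layers * per_layer + embeddings + 2048  # + decoder_norm scale
print(total, f"{total / 1e9:.3f} billion")      # 1104218112 -> 1.104 billion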
W0416 13:18:41.817562 3862878 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0416 13:18:43.503423 132981438057600 checkpointing.py:772] Waiting for step 0 to finish before checkpoint...
I0416 13:18:43.606295 132981438057600 checkpointing.py:776] Waited 0.10285758972167969 seconds for step 0 to finish before starting checkpointing.
I0416 13:18:43.607258 132981438057600 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0416 13:18:43.607434 132981438057600 checkpoint_manager.py:1501] [process=0] Saving checkpoint at step 0
I0416 13:18:43.607795 132981438057600 async_checkpointer.py:452] [process=0] Started async saving checkpoint to gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/0.
I0416 13:18:43.696353 132981438057600 signaling_client.py:373] Using ThreadSafeKeyValueSignalingClient
I0416 13:18:43.794396 132873982051904 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/0
I0416 13:18:43.828661 132981438057600 jax_array_handlers.py:347] Scheduling D2H of 444 prioritized jax.Array.
I0416 13:18:43.828760 132981438057600 replica_slices.py:410] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0416 13:18:45.395600 132981438057600 base_pytree_checkpoint_handler.py:153] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 1.567718s
I0416 13:18:45.396075 132981438057600 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/blocking_gbytes_per_sec: 7.265 GiB/s (total gbytes: 12.3 GiB) (time elapsed: a second) (per-host)
I0416 13:18:45.396138 132981438057600 base_pytree_checkpoint_handler.py:732] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 1.698832s (batch_requests_ready=0.105333s, total_serialization_initiated=1.593182s, others=0.000317s)
I0416 13:18:45.396271 132981438057600 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 1.699454s (all_items=0.000022s, per_item={'items': '0.00002193'}, temp_paths=1.699432)
I0416 13:18:45.397146 132873914943040 async_checkpointer.py:79] [process=0][thread=async_save] Background save thread started.
I0416 13:18:45.397284 132981438057600 async_checkpointer.py:561] Finished blocking save. Time taken: 1.789806s. Continuing background save to gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/0.
I0416 13:18:45.397529 132981438057600 checkpoint_manager.py:1549] [process=0][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0416 13:18:45.397751 132981438057600 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776345523.607245, 'wait_for_prev_duration_secs': 4.2438507080078125e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776345523.6074548, 'checkpointer_blocking_duration_secs': 1.7899744510650635, 'get_old_steps_start_time': 1776345525.397453, 'get_old_steps_duration_secs': 3.0517578125e-05, 'checkpoint_manager_blocking_start_time': 1776345523.6071634, 'checkpoint_manager_blocking_duration_secs': 1.7905516624450684}
I0416 13:18:45.397938 132981438057600 checkpointing.py:408] Started an asynchronous checkpoint save for step 0
I0416 13:18:45.397988 132981438057600 max_utils.py:750] 
Memstats: After params initialized:
I0416 13:18:45.398422 132981438057600 max_utils.py:756] 	Using (GB) 2.16 / 31.25 (6.912000%) on TPU_0(process=0,(0,0,0,0))
I0416 13:18:45.398455 132981438057600 max_utils.py:756] 	Using (GB) 2.16 / 31.25 (6.912000%) on TPU_1(process=0,(1,0,0,0))
I0416 13:18:45.398476 132981438057600 max_utils.py:756] 	Using (GB) 2.16 / 31.25 (6.912000%) on TPU_2(process=0,(0,1,0,0))
I0416 13:18:45.398496 132981438057600 max_utils.py:756] 	Using (GB) 2.16 / 31.25 (6.912000%) on TPU_3(process=0,(1,1,0,0))
I0416 13:18:45.398515 132981438057600 max_utils.py:756] 	Using (GB) 2.16 / 31.25 (6.912000%) on TPU_4(process=0,(0,2,0,0))
I0416 13:18:45.398533 132981438057600 max_utils.py:756] 	Using (GB) 2.16 / 31.25 (6.912000%) on TPU_5(process=0,(1,2,0,0))
I0416 13:18:45.398550 132981438057600 max_utils.py:756] 	Using (GB) 2.16 / 31.25 (6.912000%) on TPU_6(process=0,(0,3,0,0))
I0416 13:18:45.398568 132981438057600 max_utils.py:756] 	Using (GB) 2.16 / 31.25 (6.912000%) on TPU_7(process=0,(1,3,0,0))
W0416 13:18:45.405603 3862878 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0416 13:18:45.413635 132873971566144 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/0/items
W0416 13:18:45.413889 3862878 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
W0416 13:18:45.419561 3862878 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0416 13:18:45.570478 132873935914560 checkpoint.py:188] Wrote Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776345525311501426, 'commit_timestamp_nsecs': None, 'custom_metadata': {}}, json={"item_handlers": null, "metrics": {}, "performance_metrics": {}, "init_timestamp_nsecs": 1776345525311501426, "commit_timestamp_nsecs": null, "custom_metadata": {}} to gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0416 13:18:45.570811 132873992537664 async_checkpointer.py:265] [process=0][thread=save_finalize] Waiting for background save thread=async_save.
I0416 13:18:45.750132 132981438057600 metric_logger.py:185] completed step: 0, seconds: 7.282, TFLOP/s/device: 1.866, Tokens/s/device: 281.224, total_weights: 16384, loss: 10.887
I0416 13:18:45.751237 132981438057600 metric_logger.py:269] To see full metrics 'tensorboard --logdir=gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/tensorboard/'
I0416 13:18:45.833029 132981438057600 metric_logger.py:185] completed step: 1, seconds: 2.225, TFLOP/s/device: 6.106, Tokens/s/device: 920.341, total_weights: 16384, loss: 10.887
I0416 13:18:45.943138 132981438057600 metric_logger.py:185] completed step: 2, seconds: 0.031, TFLOP/s/device: 440.869, Tokens/s/device: 66452.513, total_weights: 16384, loss: 9.787
I0416 13:18:46.053031 132981438057600 metric_logger.py:185] completed step: 3, seconds: 0.085, TFLOP/s/device: 159.027, Tokens/s/device: 23970.318, total_weights: 16384, loss: 8.853
I0416 13:18:46.118339 3865302 google_auth_provider.cc:149] Using credentials at ~/.config/gcloud/application_default_credentials.json
I0416 13:18:46.118399 3865302 google_auth_provider.cc:156] Using OAuth2 AuthProvider
I0416 13:18:47.181801 132873982051904 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_0
I0416 13:19:03.344021 132873961080384 base_pytree_checkpoint_handler.py:1217] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 17.241085s (commit=16.682485s, array_metadata_write=0.558600s)
I0416 13:19:03.346597 132873914943040 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/gbytes_per_sec: 643.120 MiB/s (total gbytes: 12.3 GiB) (time elapsed: 19 seconds) (per-host)
I0416 13:19:03.346754 132873914943040 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 17.949411s.
I0416 13:19:03.579852 132873914943040 checkpoint.py:228] Read Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776345525311501426, 'commit_timestamp_nsecs': None, 'custom_metadata': {}} from gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0416 13:19:03.774219 132873914943040 array_metadata_store.py:367] [process=0][thread=async_save] Skipped cross-host ArrayMetadata validation because only one process is found: process_index=0.
I0416 13:19:03.976477 132873935914560 checkpoint.py:247] Updated Metadata={'item_handlers': {'items': 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler'}, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776345525311501426, 'commit_timestamp_nsecs': None, 'custom_metadata': {}} to gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0416 13:19:04.307667 132873914943040 ocdbt_utils.py:56] Param validation support for Zarr3 will be added later (b/362328389).
I0416 13:19:04.308933 132873914943040 base_pytree_checkpoint_handler.py:1342] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 0.681770s. use_zarr3=True, enable_post_merge_validation=True, directory=gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/0/items
I0416 13:19:04.310036 132873914943040 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/0/items
I0416 13:19:04.558149 132873914943040 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/0
I0416 13:19:05.250179 132873914943040 atomicity.py:794] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/0`.
I0416 13:19:05.251115 132873914943040 async_checkpointer.py:420] Finished async_save (blocking + background). Time taken: 21.643646s. directory=gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/0
I0416 13:19:05.251252 132873914943040 async_checkpointer.py:144] [process=0][thread=async_save] Background save thread done. Time taken: 19.853911s.
I0416 13:19:05.251475 132873992537664 async_checkpointer.py:273] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save.
I0416 13:19:05.251529 132873992537664 async_checkpointer.py:283] [process=0][thread=save_finalize] No errors found in background save thread=async_save.
I0416 13:19:05.251603 132873992537664 checkpoint_manager.py:2103] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0416 13:19:05.251668 132873992537664 checkpoint_manager.py:2112] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0416 13:19:07.627827 132981438057600 metric_logger.py:185] completed step: 4, seconds: 0.110, TFLOP/s/device: 123.153, Tokens/s/device: 18562.999, total_weights: 16384, loss: 7.982
I0416 13:19:07.646070 132981438057600 metric_logger.py:185] completed step: 5, seconds: 0.110, TFLOP/s/device: 123.479, Tokens/s/device: 18612.091, total_weights: 16384, loss: 7.248
I0416 13:19:07.745708 132981438057600 metric_logger.py:185] completed step: 6, seconds: 21.574, TFLOP/s/device: 0.630, Tokens/s/device: 94.927, total_weights: 16384, loss: 6.690
I0416 13:19:07.854184 132981438057600 metric_logger.py:185] completed step: 7, seconds: 0.013, TFLOP/s/device: 1044.441, Tokens/s/device: 157429.472, total_weights: 16384, loss: 6.307
I0416 13:19:07.964960 132981438057600 metric_logger.py:185] completed step: 8, seconds: 0.103, TFLOP/s/device: 132.281, Tokens/s/device: 19938.859, total_weights: 16384, loss: 6.070
I0416 13:19:08.074485 132981438057600 checkpointing.py:772] Waiting for step 9 to finish before checkpoint...
I0416 13:19:08.082635 132981438057600 checkpointing.py:776] Waited 0.008206844329833984 seconds for step 9 to finish before starting checkpointing.
I0416 13:19:08.083613 132981438057600 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0416 13:19:08.083822 132981438057600 checkpoint_manager.py:1501] [process=0] Saving checkpoint at step 9
I0416 13:19:08.084227 132981438057600 async_checkpointer.py:452] [process=0] Started async saving checkpoint to gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/9.
I0416 13:19:08.271134 132873992537664 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/9
I0416 13:19:08.310084 132981438057600 jax_array_handlers.py:347] Scheduling D2H of 444 prioritized jax.Array.
I0416 13:19:08.310281 132981438057600 replica_slices.py:410] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0416 13:19:08.562960 132981438057600 base_pytree_checkpoint_handler.py:153] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.253829s
I0416 13:19:08.563443 132981438057600 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/blocking_gbytes_per_sec: 31.549 GiB/s (total gbytes: 12.3 GiB) (time elapsed: 391 milliseconds) (per-host)
I0416 13:19:08.563532 132981438057600 base_pytree_checkpoint_handler.py:732] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 0.391299s (batch_requests_ready=0.106696s, total_serialization_initiated=0.284267s, others=0.000336s)
I0416 13:19:08.563669 132981438057600 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.392455s (all_items=0.000024s, per_item={'items': '0.00002408'}, temp_paths=0.392431)
I0416 13:19:08.564526 132874032383552 async_checkpointer.py:79] [process=0][thread=async_save] Background save thread started.
I0416 13:19:08.564675 132981438057600 async_checkpointer.py:561] Finished blocking save. Time taken: 0.480801s. Continuing background save to gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/9.
I0416 13:19:08.564899 132981438057600 checkpoint_manager.py:1549] [process=0][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0416 13:19:08.565106 132873982051904 async_checkpointer.py:265] [process=0][thread=save_finalize] Waiting for background save thread=async_save.
I0416 13:19:08.565221 132981438057600 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776345548.0835595, 'wait_for_prev_duration_secs': 8.392333984375e-05, 'time_between_consecutive_saves_sec': 2.8318750858306885, 'checkpointer_blocking_start_time': 1776345548.0838463, 'checkpointer_blocking_duration_secs': 0.48095107078552246, 'get_old_steps_start_time': 1776345548.564821, 'get_old_steps_duration_secs': 2.9325485229492188e-05, 'checkpoint_manager_blocking_start_time': 1776345548.0834973, 'checkpoint_manager_blocking_duration_secs': 0.4816863536834717}
I0416 13:19:08.565468 132981438057600 checkpointing.py:408] Started an asynchronous checkpoint save for step 9
I0416 13:19:08.565511 132981438057600 checkpoint_manager.py:1994] [process=0][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0416 13:19:08.974550 132874063840832 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/9/items
I0416 13:19:10.275752 132874053355072 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_0
I0416 13:19:38.679091 132874042869312 base_pytree_checkpoint_handler.py:1217] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 29.208808s (commit=28.777864s, array_metadata_write=0.430944s)
I0416 13:19:38.681432 132874032383552 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/gbytes_per_sec: 414.196 MiB/s (total gbytes: 12.3 GiB) (time elapsed: 30 seconds) (per-host)
I0416 13:19:38.681509 132874032383552 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 30.116776s.
I0416 13:19:39.099705 132874032383552 array_metadata_store.py:367] [process=0][thread=async_save] Skipped cross-host ArrayMetadata validation because only one process is found: process_index=0.
I0416 13:19:39.578901 132874032383552 ocdbt_utils.py:56] Param validation support for Zarr3 will be added later (b/362328389).
I0416 13:19:39.579726 132874032383552 base_pytree_checkpoint_handler.py:1342] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 0.627331s. use_zarr3=True, enable_post_merge_validation=True, directory=gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/9/items
I0416 13:19:39.580704 132874032383552 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/9/items
I0416 13:19:39.816982 132874032383552 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/9
I0416 13:19:40.516720 132874032383552 atomicity.py:794] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/9`.
I0416 13:19:40.517658 132874032383552 async_checkpointer.py:420] Finished async_save (blocking + background). Time taken: 32.433797s. directory=gs://wanglance-maxtext/linen_ckpt_main_20260416_123638/linen_main_20260416_123638_13_scan_layers_false/checkpoints/9
I0416 13:19:40.517738 132874032383552 async_checkpointer.py:144] [process=0][thread=async_save] Background save thread done. Time taken: 31.953007s.
I0416 13:19:40.517864 132873982051904 async_checkpointer.py:273] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save.
I0416 13:19:40.517917 132873982051904 async_checkpointer.py:283] [process=0][thread=save_finalize] No errors found in background save thread=async_save.
I0416 13:19:40.517976 132873982051904 checkpoint_manager.py:2103] [process=0][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0416 13:19:40.518061 132873982051904 checkpoint_manager.py:2112] [process=0][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0416 13:19:40.519181 132981438057600 checkpoint_manager.py:2006] [process=0][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0416 13:19:40.519377 132981438057600 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0416 13:19:40.520189 132981438057600 metric_logger.py:185] completed step: 9, seconds: 0.109, TFLOP/s/device: 125.131, Tokens/s/device: 18861.148, total_weights: 16384, loss: 5.930
Per train step:
 Total TFLOPs: 13.59 
 split as 93.93% learnable weight flops and 6.07% attention flops
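For reference, the throughput columns in the metric_logger lines follow directly from the per-step FLOP summary and the measured step time; a back-of-envelope sketch using the step-9 numbers above (variable names are illustrative, small differences come from the log's rounded step time):

```python
# All values copied from the log lines above; purely illustrative.
tflops_per_step_per_device = 13.59   # "Total TFLOPs" per train step
tokens_per_step = 16384              # total_weights per step
num_devices = 8
step_seconds = 0.109                 # "seconds" for step 9

tflops_per_sec = tflops_per_step_per_device / step_seconds                 # ~124.7 vs 125.131 logged
tokens_per_sec_per_device = tokens_per_step / num_devices / step_seconds   # ~18.8k vs 18861 logged
print(tflops_per_sec, tokens_per_sec_per_device)
```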
feat/nnx-post-train-fixes  ·  d8cde296b  ·  feat_nnx_post_train_fixes_20260416_181521  ·  full log
2026-04-16 18:56:25.602607: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0416 18:56:25.720034 137272899492992 max_utils.py:238] Skipping jax distributed system due to skip_jax_distributed_system=True flag.
I0416 18:57:22.329154 137272899492992 max_utils.py:800] System Information: Jax Version: 0.9.2
I0416 18:57:22.329266 137272899492992 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0416 18:57:22.329301 137272899492992 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Apr 6 2026 20:48:10 (1775533690) cl/895581894
I0416 18:57:22.329326 137272899492992 train_utils.py:364] WARNING: 'dataset_path' might be pointing your local file system
I0416 18:57:22.329347 137272899492992 train_utils.py:377] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0416 18:57:22.329425 137272899492992 train.py:811] [DECOUPLED NO-OP] skipping cloud diagnostics wrapper.
W0416 18:57:22.422446 4011682 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0416 18:57:22.820934 137272899492992 maxtext_utils.py:1687] Num_devices: 8, shape (1, 1, 1, 8, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0416 18:57:22.821113 137272899492992 maxtext_utils.py:1687] Num_devices: 8, shape (1, 1, 1, 8, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0416 18:57:22.821339 137272899492992 checkpointing.py:688] Setting up checkpoint logger...
I0416 18:57:22.821381 137272899492992 checkpointing.py:234] Creating checkpoint manager with ocdbt=True and zarr3=True
I0416 18:57:22.821421 137272899492992 pytree_checkpoint_handler.py:577] save_device_host_concurrent_bytes=None
I0416 18:57:22.822175 137272899492992 base_pytree_checkpoint_handler.py:411] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7cd8bfbf5010>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0416 18:57:25.186671 137272899492992 checkpointing.py:266] Enabling policy for fixed interval checkpointing.
I0416 18:57:25.186842 137272899492992 checkpoint_manager.py:702] [process=0][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7cd313d958b0>}, handler_registry=None
I0416 18:57:25.187098 137272899492992 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7cd313d958b0>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0416 18:57:25.187139 137272899492992 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7cd24393e2a0>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0416 18:57:25.187167 137272899492992 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7cd313d958b0>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7cd313d958b0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7cd24393e2a0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7cd24393e2a0>}).
I0416 18:57:25.187455 137272899492992 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.28
I0416 18:57:25.187505 137272899492992 async_checkpointer.py:177] [process=0][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>.<lambda> at 0x7cd24392dee0> timeout: 600 secs and primary_host=0 for async checkpoint writes
I0416 18:57:25.321161 137272899492992 checkpoint_manager.py:1788] Found 0 checkpoint steps in gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints
I0416 18:57:25.321380 137272899492992 checkpoint_manager.py:921] [process=0][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_hns=False, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False), root_directory=gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7cd245170920>
I0416 18:57:25.321460 137272899492992 checkpointing.py:302] Checkpoint manager created!
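The CheckpointManager line above lists the effective options; below is a minimal, hypothetical sketch of the same kind of async Orbax setup (placeholder directory, not MaxText's actual wiring):

```python
# Sketch only: one "items" PyTree handler, GCS root, fixed-interval saves,
# async background commit, mirroring the options logged above.
import orbax.checkpoint as ocp

options = ocp.CheckpointManagerOptions(
    save_interval_steps=10,           # matches FixedIntervalPolicy(interval=10)
    enable_async_checkpointing=True,  # blocking D2H, then background GCS commit
)
mngr = ocp.CheckpointManager("gs://<bucket>/<run>/checkpoints", options=options)

# Inside the training loop (state is the train-state PyTree):
# if mngr.should_save(step):
#     mngr.save(step, args=ocp.args.StandardSave(state))
# mngr.wait_until_finished()  # what the "[wait_until_finished]" lines correspond to
```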
W0416 18:57:25.337767 4011682 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0416 18:57:25.532895 137272899492992 nnx_wrappers.py:437] Unknown Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0416 18:57:25.533056 137272899492992 nnx_wrappers.py:437] Unknown Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
I0416 18:57:25.625521 137272899492992 attentions.py:1088] attentions/inputs_q Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0416 18:57:25.625606 137272899492992 attentions.py:1088] attentions/inputs_q Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
I0416 18:57:25.639631 137272899492992 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0416 18:57:25.639681 137272899492992 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
I0416 18:57:25.667598 137272899492992 attentions.py:1154] attentions/query Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0416 18:57:25.667655 137272899492992 attentions.py:1154] attentions/query Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0416 18:57:25.681857 137272899492992 attentions.py:1155] attentions/key Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0416 18:57:25.681904 137272899492992 attentions.py:1155] attentions/key Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0416 18:57:25.695924 137272899492992 attentions.py:1156] attentions/value Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0416 18:57:25.695971 137272899492992 attentions.py:1156] attentions/value Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0416 18:57:25.721959 137272899492992 attentions.py:1197] attentions/out Logical: bfloat16[8,2048,16,128]..................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0416 18:57:25.722022 137272899492992 attentions.py:1197] attentions/out Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0416 18:57:25.741372 137272899492992 linears.py:525] linears/x Logical: bfloat16[8,2048,7168]....................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0416 18:57:25.741425 137272899492992 linears.py:525] linears/x Physical: bfloat16[8,2048,7168]....................................... ('fsdp', None, None).
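The Logical/Physical pairs above come from the model's logical axis rules; as an illustrative sketch of what such a mapping means in plain JAX terms (mesh size and array shape taken from the log, everything else assumed, including a single host with 8 local devices):

```python
# An activation tagged ('activation_batch', 'activation_norm_length',
# 'activation_embed') lands on the mesh as ('fsdp', None, None),
# i.e. batch-sharded 8 ways.
import jax
import jax.numpy as jnp
from jax.experimental import mesh_utils
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

devices = mesh_utils.create_device_mesh((8,))         # fsdp: 8, as logged
mesh = Mesh(devices, axis_names=("fsdp",))
x = jnp.zeros((8, 2048, 2048), dtype=jnp.bfloat16)    # bfloat16[8,2048,2048]
x = jax.device_put(x, NamedSharding(mesh, P("fsdp", None, None)))
# Each device holds a (1, 2048, 2048) slice of the batch dimension.
```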
I0416 18:57:28.293857 137272899492992 checkpointing.py:578] checkpoint manager exists so trying to load this run's existing checkpoint
I0416 18:57:28.293961 137272899492992 checkpointing.py:676] No existing checkpoints found, not restoring checkpoint.
W0416 18:57:28.824801 4011682 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
[DECOUPLED NO-OP] gcs_storage: using stubs.
[DECOUPLED NO-OP] mldiagnostics: using stub.
[DECOUPLED NO-OP] mldiagnostics: using stub.
[DECOUPLED NO-OP] mldiagnostics: using stub.
[DECOUPLED NO-OP] workload_monitor: using stub.
[DECOUPLED NO-OP] vertex_tensorboard: using stub.
fsdp: 8
I0416 18:57:31.700072 137272899492992 maxtext_utils.py:1796]  params/params/decoder/decoder_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.700182 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_0/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.700222 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_0/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.700265 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_0/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 18:57:31.700294 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_0/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.700320 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_0/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.700359 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_0/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.700397 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_0/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 18:57:31.700423 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_0/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.700448 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_0/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.700472 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_1/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.700494 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_1/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.700517 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_1/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 18:57:31.700541 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_1/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.700563 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_1/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.700586 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_1/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.700609 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_1/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 18:57:31.700634 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_1/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.700658 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_1/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.700680 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_10/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.700701 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_10/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.700722 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_10/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 18:57:31.700743 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_10/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.700764 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_10/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.700787 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_10/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.700810 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_10/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 18:57:31.700838 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_10/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.700863 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_10/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.700886 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_11/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.700909 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_11/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.700931 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_11/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 18:57:31.700952 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_11/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.700973 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_11/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.700994 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_11/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.701016 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_11/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 18:57:31.701037 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_11/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.701076 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_11/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.701099 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_12/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.701121 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_12/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.701143 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_12/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 18:57:31.701164 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_12/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.701184 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_12/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.701206 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_12/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.701227 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_12/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 18:57:31.701249 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_12/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.701270 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_12/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.701291 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_13/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.701313 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_13/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.701334 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_13/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 18:57:31.701354 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_13/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.701375 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_13/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.701396 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_13/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.701418 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_13/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 18:57:31.701439 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_13/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.701461 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_13/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.701482 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_14/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.701503 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_14/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.701525 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_14/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 18:57:31.701546 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_14/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.701566 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_14/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.701588 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_14/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.701609 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_14/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 18:57:31.701632 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_14/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.701654 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_14/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.701676 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_15/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.701698 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_15/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.701720 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_15/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 18:57:31.701740 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_15/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.701761 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_15/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.701782 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_15/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.701804 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_15/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 18:57:31.701828 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_15/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.701855 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_15/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.701877 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_2/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.701901 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_2/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.701923 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_2/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 18:57:31.701944 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_2/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.701964 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_2/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.701987 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_2/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.702009 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_2/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 18:57:31.702032 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_2/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.702064 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_2/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.702086 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_3/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.702108 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_3/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.702129 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_3/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 18:57:31.702149 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_3/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.702169 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_3/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.702191 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_3/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.702212 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_3/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 18:57:31.702233 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_3/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.702254 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_3/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.702275 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_4/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.702296 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_4/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.702318 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_4/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 18:57:31.702337 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_4/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.702357 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_4/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.702377 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_4/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.702398 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_4/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 18:57:31.702419 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_4/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.702440 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_4/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.702461 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_5/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.702481 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_5/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.702502 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_5/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 18:57:31.702522 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_5/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.702541 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_5/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.702562 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_5/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.702584 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_5/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 18:57:31.702605 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_5/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.702627 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_5/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.702648 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_6/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.702669 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_6/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.702690 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_6/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 18:57:31.702709 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_6/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.702728 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_6/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.702749 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_6/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.702770 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_6/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 18:57:31.702791 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_6/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.702812 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_6/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.702836 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_7/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.702858 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_7/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.702878 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_7/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 18:57:31.702898 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_7/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.702918 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_7/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.702939 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_7/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.702960 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_7/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 18:57:31.702981 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_7/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.703001 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_7/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.703022 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_8/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.703054 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_8/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.703077 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_8/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 18:57:31.703096 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_8/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.703116 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_8/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.703138 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_8/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.703159 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_8/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 18:57:31.703180 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_8/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.703202 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_8/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.703222 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_9/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.703243 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_9/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0416 18:57:31.703264 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_9/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0416 18:57:31.703284 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_9/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.703304 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_9/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0416 18:57:31.703325 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_9/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.703346 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_9/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0416 18:57:31.703367 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_9/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.703388 137272899492992 maxtext_utils.py:1796]  params/params/decoder/layers_9/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0416 18:57:31.703424 137272899492992 maxtext_utils.py:1796]  params/params/decoder/logits_dense/kernel
    Shape:     float32[2048,32000]
    Logical:   P('embed_vocab', 'vocab')
    Physical:  ('fsdp', None)
I0416 18:57:31.703456 137272899492992 maxtext_utils.py:1796]  params/params/token_embedder/embedding
    Shape:     float32[32000,2048]
    Logical:   P('vocab', 'embed_vocab')
    Physical:  (None, 'fsdp')
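As an illustrative aside, the P('fsdp', None) entries above imply each of the 8 devices holds a 1/8 row-slice of those kernels; for example:

```python
# Back-of-envelope for one MLP kernel from the listing above (shapes copied
# from the log; purely illustrative).
fsdp = 8
wi_kernel = (2048, 7168)   # float32, physical sharding ('fsdp', None)
per_device = (wi_kernel[0] // fsdp, wi_kernel[1])           # (256, 7168) per device
per_device_mib = per_device[0] * per_device[1] * 4 / 2**20  # ~7 MiB of the ~56 MiB kernel
print(per_device, round(per_device_mib, 1))
```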

I0416 18:57:35.803982 137272899492992 train.py:157] train/xent Logical: float32[8,2048]............................................. ('activation_embed_and_logits_batch', 'activation_length').
I0416 18:57:35.804076 137272899492992 train.py:157] train/xent Physical: float32[8,2048]............................................. ('fsdp', None).
I0416 18:57:35.818023 137272899492992 train.py:164] train/z_loss Logical: float32[8,2048]............................................. ('activation_embed_and_logits_batch', 'activation_length').
I0416 18:57:35.818085 137272899492992 train.py:164] train/z_loss Physical: float32[8,2048]............................................. ('fsdp', None).
W0416 18:57:38.053884 4011682 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0416 18:57:40.325320 137272899492992 max_utils.py:791] Total memory size: 3.3 GB, Output size: 1.5 GB, Temp size: 1.8 GB, Argument size: 1.5 GB, Host temp size: 0.0 GB.
I0416 18:57:40.325891 137272899492992 max_utils.py:194] tensorboardX not available; using no-op SummaryWriter.
I0416 18:57:40.326352 137272899492992 metric_logger.py:289] number parameters: 1.104 billion
W0416 18:57:46.198442 4011682 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0416 18:57:47.871049 137272899492992 checkpointing.py:794] Waiting for step 0 to finish before checkpoint...
I0416 18:57:47.975911 137272899492992 checkpointing.py:798] Waited 0.10489416122436523 seconds for step 0 to finish before starting checkpointing.
I0416 18:57:47.977005 137272899492992 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0416 18:57:47.977244 137272899492992 checkpoint_manager.py:1501] [process=0] Saving checkpoint at step 0
I0416 18:57:47.977788 137272899492992 async_checkpointer.py:452] [process=0] Started async saving checkpoint to gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/0.
I0416 18:57:48.070324 137272899492992 signaling_client.py:373] Using ThreadSafeKeyValueSignalingClient
I0416 18:57:48.161396 137165484852800 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/0
I0416 18:57:48.789952 137272899492992 jax_array_handlers.py:347] Scheduling D2H of 444 prioritized jax.Array.
I0416 18:57:48.790434 137272899492992 replica_slices.py:410] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0416 18:57:49.390948 137272899492992 base_pytree_checkpoint_handler.py:153] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.601765s
I0416 18:57:49.391364 137272899492992 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/blocking_gbytes_per_sec: 9.349 GiB/s (total gbytes: 12.3 GiB) (time elapsed: a second) (per-host)
I0416 18:57:49.391413 137272899492992 base_pytree_checkpoint_handler.py:732] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 1.320112s (batch_requests_ready=0.101207s, total_serialization_initiated=1.218597s, others=0.000308s)
I0416 18:57:49.391507 137272899492992 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 1.320707s (all_items=0.000030s, per_item={'items': '0.00002980'}, temp_paths=1.320677)
I0416 18:57:49.392158 137165417743936 async_checkpointer.py:79] [process=0][thread=async_save] Background save thread started.
I0416 18:57:49.392251 137272899492992 async_checkpointer.py:561] Finished blocking save. Time taken: 1.414962s. Continuing background save to gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/0.
I0416 18:57:49.392430 137272899492992 checkpoint_manager.py:1549] [process=0][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0416 18:57:49.392615 137272899492992 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776365867.9769876, 'wait_for_prev_duration_secs': 4.76837158203125e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776365867.9772675, 'checkpointer_blocking_duration_secs': 1.4150924682617188, 'get_old_steps_start_time': 1776365869.3923764, 'get_old_steps_duration_secs': 1.9550323486328125e-05, 'checkpoint_manager_blocking_start_time': 1776365867.9768808, 'checkpoint_manager_blocking_duration_secs': 1.415705680847168}
I0416 18:57:49.392761 137272899492992 checkpointing.py:409] Started an asynchronous checkpoint save for step 0
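The blocking-phase numbers above are self-consistent; a quick illustrative check:

```python
# 444 arrays (~12.3 GiB of checkpoint state) are copied device-to-host in the
# blocking phase before the background GCS commit takes over.
total_gib = 12.3        # "total gbytes" from the log
blocking_secs = 1.32    # Pytree async_save blocking time from the log
print(total_gib / blocking_secs)   # ~9.3 GiB/s vs 9.349 GiB/s logged
```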
I0416 18:57:49.392800 137272899492992 max_utils.py:750] 
Memstats: After params initialized:
I0416 18:57:49.393127 137272899492992 max_utils.py:756] 	Using (GB) 2.16 / 31.25 (6.912000%) on TPU_0(process=0,(0,0,0,0))
I0416 18:57:49.393159 137272899492992 max_utils.py:756] 	Using (GB) 2.16 / 31.25 (6.912000%) on TPU_1(process=0,(1,0,0,0))
I0416 18:57:49.393179 137272899492992 max_utils.py:756] 	Using (GB) 2.16 / 31.25 (6.912000%) on TPU_2(process=0,(0,1,0,0))
I0416 18:57:49.393197 137272899492992 max_utils.py:756] 	Using (GB) 2.16 / 31.25 (6.912000%) on TPU_3(process=0,(1,1,0,0))
I0416 18:57:49.393213 137272899492992 max_utils.py:756] 	Using (GB) 2.16 / 31.25 (6.912000%) on TPU_4(process=0,(0,2,0,0))
I0416 18:57:49.393230 137272899492992 max_utils.py:756] 	Using (GB) 2.16 / 31.25 (6.912000%) on TPU_5(process=0,(1,2,0,0))
I0416 18:57:49.393248 137272899492992 max_utils.py:756] 	Using (GB) 2.16 / 31.25 (6.912000%) on TPU_6(process=0,(0,3,0,0))
I0416 18:57:49.393264 137272899492992 max_utils.py:756] 	Using (GB) 2.16 / 31.25 (6.912000%) on TPU_7(process=0,(1,3,0,0))
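The per-device figure in the Memstats block is simply live HBM bytes over per-chip capacity; a quick check of the 6.912% value, with the numbers copied from the lines above:

    used_gb, capacity_gb = 2.16, 31.25       # per-TPU usage / capacity from the Memstats lines
    print(f"{used_gb / capacity_gb:.6%}")    # -> 6.912000%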
W0416 18:57:49.398466 4011682 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
W0416 18:57:49.403768 4011682 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
W0416 18:57:49.408172 4011682 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0416 18:57:49.408267 137165474367040 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/0/items
I0416 18:57:49.559321 137165438715456 checkpoint.py:188] Wrote Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776365869324220056, 'commit_timestamp_nsecs': None, 'custom_metadata': {}}, json={"item_handlers": null, "metrics": {}, "performance_metrics": {}, "init_timestamp_nsecs": 1776365869324220056, "commit_timestamp_nsecs": null, "custom_metadata": {}} to gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0416 18:57:49.559562 137165407258176 async_checkpointer.py:265] [process=0][thread=save_finalize] Waiting for background save thread=async_save.
I0416 18:57:49.722228 137272899492992 metric_logger.py:185] completed step: 0, seconds: 7.544, TFLOP/s/device: 1.801, Tokens/s/device: 271.469, total_weights: 16384, loss: 10.887
I0416 18:57:49.723102 137272899492992 metric_logger.py:269] To see full metrics 'tensorboard --logdir=gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/tensorboard/'
I0416 18:57:49.814107 137272899492992 metric_logger.py:185] completed step: 1, seconds: 1.836, TFLOP/s/device: 7.399, Tokens/s/device: 1115.184, total_weights: 16384, loss: 10.887
I0416 18:57:49.924197 137272899492992 metric_logger.py:185] completed step: 2, seconds: 0.024, TFLOP/s/device: 574.800, Tokens/s/device: 86640.156, total_weights: 16384, loss: 9.787
I0416 18:57:50.034049 137272899492992 metric_logger.py:185] completed step: 3, seconds: 0.092, TFLOP/s/device: 146.905, Tokens/s/device: 22143.174, total_weights: 16384, loss: 8.853
I0416 18:57:50.089094 4013886 google_auth_provider.cc:149] Using credentials at ~/.config/gcloud/application_default_credentials.json
I0416 18:57:50.089155 4013886 google_auth_provider.cc:156] Using OAuth2 AuthProvider
I0416 18:57:50.855979 137165484852800 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_0
I0416 18:58:08.710565 137165428229696 base_pytree_checkpoint_handler.py:1217] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 18.652235s (commit=18.192127s, array_metadata_write=0.460107s)
I0416 18:58:08.712322 137165417743936 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/gbytes_per_sec: 612.219 MiB/s (total gbytes: 12.3 GiB) (time elapsed: 20 seconds) (per-host)
I0416 18:58:08.712426 137165417743936 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 19.320131s.
I0416 18:58:08.938730 137165417743936 checkpoint.py:228] Read Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776365869324220056, 'commit_timestamp_nsecs': None, 'custom_metadata': {}} from gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0416 18:58:09.124421 137165417743936 array_metadata_store.py:367] [process=0][thread=async_save] Skipped cross-host ArrayMetadata validation because only one process is found: process_index=0.
I0416 18:58:09.329246 137165438715456 checkpoint.py:247] Updated Metadata={'item_handlers': {'items': 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler'}, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776365869324220056, 'commit_timestamp_nsecs': None, 'custom_metadata': {}} to gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0416 18:58:09.563593 137165417743936 ocdbt_utils.py:56] Param validation support for Zarr3 will be added later (b/362328389).
I0416 18:58:09.564513 137165417743936 base_pytree_checkpoint_handler.py:1342] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 0.580104s. use_zarr3=True, enable_post_merge_validation=True, directory=gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/0/items
I0416 18:58:09.565215 137165417743936 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/0/items
I0416 18:58:09.815771 137165417743936 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/0
I0416 18:58:10.487547 137165417743936 atomicity.py:794] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/0`.
I0416 18:58:10.488167 137165417743936 async_checkpointer.py:420] Finished async_save (blocking + background). Time taken: 22.510884s. directory=gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/0
I0416 18:58:10.488238 137165417743936 async_checkpointer.py:144] [process=0][thread=async_save] Background save thread done. Time taken: 21.095943s.
I0416 18:58:10.488361 137165407258176 async_checkpointer.py:273] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save.
I0416 18:58:10.488411 137165407258176 async_checkpointer.py:283] [process=0][thread=save_finalize] No errors found in background save thread=async_save.
I0416 18:58:10.488465 137165407258176 checkpoint_manager.py:2103] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0416 18:58:10.488506 137165407258176 checkpoint_manager.py:2112] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
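The step-0 save above follows the usual Orbax async pattern: the blocking phase (device-to-host transfer) returns after ~1.4 s so training can continue, while the GCS commit, metadata write, and directory finalization run for another ~21 s on the async_save and save_finalize threads. A minimal sketch of driving that pattern with orbax.checkpoint directly; `directory`, `train_step`, `state`, `batch`, and `num_steps` are illustrative placeholders, not the MaxText wiring, and the interval value is illustrative as well:

    import orbax.checkpoint as ocp

    directory = "gs://<bucket>/checkpoints"                          # illustrative path
    options = ocp.CheckpointManagerOptions(save_interval_steps=9)    # fixed-interval save policy
    manager = ocp.CheckpointManager(directory, options=options)      # async checkpointer by default

    for step in range(num_steps):
        state, metrics = train_step(state, batch)                    # hypothetical training step
        # save() blocks only for the D2H transfer; the commit and directory
        # finalize continue on a background thread, as in the log above.
        manager.save(step, args=ocp.args.Composite(items=ocp.args.PyTreeSave(state)))

    manager.wait_until_finished()                                    # join the background save before exiting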
I0416 18:58:11.248819 137272899492992 metric_logger.py:185] completed step: 4, seconds: 0.110, TFLOP/s/device: 123.017, Tokens/s/device: 18542.495, total_weights: 16384, loss: 7.982
I0416 18:58:11.263188 137272899492992 metric_logger.py:185] completed step: 5, seconds: 0.110, TFLOP/s/device: 123.909, Tokens/s/device: 18676.929, total_weights: 16384, loss: 7.248
I0416 18:58:11.365335 137272899492992 metric_logger.py:185] completed step: 6, seconds: 21.215, TFLOP/s/device: 0.640, Tokens/s/device: 96.536, total_weights: 16384, loss: 6.690
I0416 18:58:11.474570 137272899492992 metric_logger.py:185] completed step: 7, seconds: 0.011, TFLOP/s/device: 1250.311, Tokens/s/device: 188460.477, total_weights: 16384, loss: 6.307
I0416 18:58:11.583833 137272899492992 metric_logger.py:185] completed step: 8, seconds: 0.105, TFLOP/s/device: 129.834, Tokens/s/device: 19569.995, total_weights: 16384, loss: 6.070
I0416 18:58:11.694426 137272899492992 checkpointing.py:794] Waiting for step 9 to finish before checkpoint...
I0416 18:58:11.700305 137272899492992 checkpointing.py:798] Waited 0.0058972835540771484 seconds for step 9 to finish before starting checkpointing.
I0416 18:58:11.701012 137272899492992 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0416 18:58:11.701184 137272899492992 checkpoint_manager.py:1501] [process=0] Saving checkpoint at step 9
I0416 18:58:11.701457 137272899492992 async_checkpointer.py:452] [process=0] Started async saving checkpoint to gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/9.
I0416 18:58:11.898821 137165407258176 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/9
I0416 18:58:11.899027 137272899492992 jax_array_handlers.py:347] Scheduling D2H of 444 prioritized jax.Array.
I0416 18:58:11.899341 137272899492992 replica_slices.py:410] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0416 18:58:12.120055 137272899492992 base_pytree_checkpoint_handler.py:153] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.221961s
I0416 18:58:12.120467 137272899492992 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/blocking_gbytes_per_sec: 35.758 GiB/s (total gbytes: 12.3 GiB) (time elapsed: 345 milliseconds) (per-host)
I0416 18:58:12.120523 137272899492992 base_pytree_checkpoint_handler.py:732] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 0.345210s (batch_requests_ready=0.095625s, total_serialization_initiated=0.249262s, others=0.000323s)
I0416 18:58:12.120626 137272899492992 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.345997s (all_items=0.000014s, per_item={'items': '0.00001383'}, temp_paths=0.345983)
I0416 18:58:12.121358 137165363217984 async_checkpointer.py:79] [process=0][thread=async_save] Background save thread started.
I0416 18:58:12.121451 137272899492992 async_checkpointer.py:561] Finished blocking save. Time taken: 0.420226s. Continuing background save to gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/9.
I0416 18:58:12.121620 137272899492992 checkpoint_manager.py:1549] [process=0][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0416 18:58:12.121885 137165352732224 async_checkpointer.py:265] [process=0][thread=save_finalize] Waiting for background save thread=async_save.
I0416 18:58:12.122012 137272899492992 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776365891.7009714, 'wait_for_prev_duration_secs': 6.842613220214844e-05, 'time_between_consecutive_saves_sec': 1.2124507427215576, 'checkpointer_blocking_start_time': 1776365891.7012064, 'checkpointer_blocking_duration_secs': 0.42033886909484863, 'get_old_steps_start_time': 1776365892.1215599, 'get_old_steps_duration_secs': 2.5033950805664062e-05, 'checkpoint_manager_blocking_start_time': 1776365891.700936, 'checkpoint_manager_blocking_duration_secs': 0.42104554176330566}
I0416 18:58:12.122216 137272899492992 checkpointing.py:409] Started an asynchronous checkpoint save for step 9
I0416 18:58:12.122250 137272899492992 checkpoint_manager.py:1994] [process=0][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0416 18:58:12.571703 137165474367040 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/9/items
I0416 18:58:13.800885 137165384189504 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_0
I0416 18:58:47.042068 137165373703744 base_pytree_checkpoint_handler.py:1217] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 34.004720s (commit=33.567101s, array_metadata_write=0.437619s)
I0416 18:58:47.043801 137165363217984 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/gbytes_per_sec: 358.303 MiB/s (total gbytes: 12.3 GiB) (time elapsed: 35 seconds) (per-host)
I0416 18:58:47.043914 137165363217984 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 34.922418s.
I0416 18:58:47.456325 137165363217984 array_metadata_store.py:367] [process=0][thread=async_save] Skipped cross-host ArrayMetadata validation because only one process is found: process_index=0.
I0416 18:58:47.896408 137165363217984 ocdbt_utils.py:56] Param validation support for Zarr3 will be added later (b/362328389).
I0416 18:58:47.897509 137165363217984 base_pytree_checkpoint_handler.py:1342] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 0.583935s. use_zarr3=True, enable_post_merge_validation=True, directory=gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/9/items
I0416 18:58:47.898274 137165363217984 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/9/items
I0416 18:58:48.147716 137165363217984 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/9
I0416 18:58:48.820723 137165363217984 atomicity.py:794] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/9`.
I0416 18:58:48.821365 137165363217984 async_checkpointer.py:420] Finished async_save (blocking + background). Time taken: 37.120147s. directory=gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_13_scan_layers_false/checkpoints/9
I0416 18:58:48.821445 137165363217984 async_checkpointer.py:144] [process=0][thread=async_save] Background save thread done. Time taken: 36.699952s.
I0416 18:58:48.821584 137165352732224 async_checkpointer.py:273] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save.
I0416 18:58:48.821686 137165352732224 async_checkpointer.py:283] [process=0][thread=save_finalize] No errors found in background save thread=async_save.
I0416 18:58:48.821745 137165352732224 checkpoint_manager.py:2103] [process=0][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0416 18:58:48.821787 137165352732224 checkpoint_manager.py:2112] [process=0][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0416 18:58:48.822449 137272899492992 checkpoint_manager.py:2006] [process=0][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0416 18:58:48.822935 137272899492992 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
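Both saves move the same ~12.3 GiB per host, so the different throughput figures above come entirely from the denominator: the blocking number covers only the device-to-host transfer, while the background number covers the full GCS commit. A quick consistency check against the logged values, with the times read approximately from the elapsed-time fields above:

    gib = 12.3
    print(gib / 1.32)           # step 0 blocking D2H:  ~9.3 GiB/s  (logged 9.349 GiB/s)
    print(gib / 0.345)          # step 9 blocking D2H:  ~35.7 GiB/s (logged 35.758 GiB/s)
    print(gib * 1024 / 20.6)    # step 0 GCS commit:    ~611 MiB/s  (logged 612.219 MiB/s)
    print(gib * 1024 / 35.2)    # step 9 GCS commit:    ~358 MiB/s  (logged 358.303 MiB/s)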
I0416 18:58:48.823495 137272899492992 metric_logger.py:185] completed step: 9, seconds: 0.109, TFLOP/s/device: 124.407, Tokens/s/device: 18752.003, total_weights: 16384, loss: 5.930
Per train step:
 Total TFLOPs: 13.59 
 split as 93.93% learnable weight flops and 6.07% attention flops
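The steady-state metric_logger rates follow from these per-step totals, assuming the 13.59 TFLOPs figure is per device per step (matching the per-device units in the log) and the 16,384 total_weights are split across the 8 devices; a worked check against the step-9 line:

    tflops_per_device_per_step = 13.59
    tokens_per_step, num_devices = 16384, 8
    step_seconds = 0.109                                     # step-9 time from the metric_logger line

    print(tflops_per_device_per_step / step_seconds)         # ~124.7 TFLOP/s/device (logged 124.407)
    print(tokens_per_step / num_devices / step_seconds)      # ~18789 tokens/s/device (logged 18752.003)

The small gap versus the logged values is rounding in the 0.109 s step time.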