MaxView

Case: 13_scan_layers_false

Metrics: main (b117f50cf) vs feat/nnx-set-defaults-true (73213e044)

Metric        main (b117f50cf)    feat/nnx-set-defaults-true (73213e044)   Diff (branch − main)
Parameters    1.104 billion       1.104 billion
Final loss    8.1880              8.1880                                    0
TFLOP/s       97.701              99.644                                    +1.943
Tok/s         14726.6             15019.4                                   +292.789
Avg s/step    2.937               2.780                                     -0.157
Memory %      2.62                2.62                                      0
JAX version   0.9.2               0.8.1

Diff = branch value − main value. For throughput metrics (TFLOP/s, Tok/s) a positive diff means the branch improved; for Avg s/step a negative diff does. TFLOP/s and Tok/s are the final-step (step 9) per-device values from each run's log; Avg s/step is the mean over steps 1-9, excluding the compile-dominated step 0 (for main: 26.436 s / 9 = 2.937).
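A quick recomputation of the Diff column from the table values (the Tok/s diff of +292.789 comes from unrounded run data; from the rounded table entries it lands at +292.8):

    # Sanity check of the Diff column; values copied from the table above.
    main_run   = {"TFLOP/s": 97.701, "Tok/s": 14726.6, "Avg s/step": 2.937}
    branch_run = {"TFLOP/s": 99.644, "Tok/s": 15019.4, "Avg s/step": 2.780}
    for k in main_run:
        diff = branch_run[k] - main_run[k]
        print(f"{k}: {diff:+.3f} ({100 * diff / main_run[k]:+.2f}%)")
    # TFLOP/s: +1.943 (+1.99%)
    # Tok/s: +292.800 (+1.99%)
    # Avg s/step: -0.157 (-5.35%)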

main · b117f50cf · main_20260424_070227 · full log:
XPK Start: Fri Apr 24 07:42:49 UTC 2026
PyTorch was not found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
`rope_parameters`'s factor field must be a float >= 1, got 40
`rope_parameters`'s beta_fast field must be a float, got 32
`rope_parameters`'s beta_slow field must be a float, got 1
DeepseekV32Config got `key=rope_scaling` in kwargs but hasn't set it as attribute. For RoPE standardization you need to set `self.rope_parameters` in model's config. 
2026-04-24 07:43:13.954729: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0424 07:43:14.166485 133699333535552 max_utils.py:273] Attempting to initialize the jax distributed system...
I0424 07:43:23.208752 133699333535552 distributed.py:149] Starting JAX distributed service on [::]:8482
I0424 07:43:23.211035 133699333535552 distributed.py:172] Connecting to JAX distributed service on mt-13-scan-layers-false-uyqu5-slice-job-0-0.mt-13-scan-layers-false-uyqu5:8482
I0424 07:43:24.382563 133699333535552 max_utils.py:284] Jax distributed system initialized!
I0424 07:43:29.447008 133699333535552 max_utils.py:800] System Information: Jax Version: 0.9.2
I0424 07:43:29.447111 133699333535552 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0424 07:43:29.447152 133699333535552 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Mar 4 2026 11:32:08 (1772652728) cl/878335365
I0424 07:43:29.447189 133699333535552 train_utils.py:361] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0424 07:43:30.144434 133699333535552 maxtext_utils.py:1604] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
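The mesh shape above has 13 axes with every dimension equal to 1 except a single 32-way axis, which the "fsdp: 32" line further down identifies as the FSDP axis. A minimal JAX sketch of an equivalent mesh, collapsed to that one axis (the other axis names and MaxText's actual mesh-construction code are omitted, so this is an approximation):

    import jax
    from jax.experimental import mesh_utils
    from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

    # One 'fsdp' axis of size 32; MaxText's real mesh keeps 12 more
    # size-1 axes around it, giving the (1, 1, 1, 32, 1, ...) shape above.
    devices = mesh_utils.create_device_mesh((32,))  # requires 32 attached devices
    mesh = Mesh(devices, axis_names=("fsdp",))

    # A kernel sharded like the ('fsdp', None) entries in the dump below:
    # dimension 0 split 32 ways across devices, dimension 1 replicated.
    wi_0_sharding = NamedSharding(mesh, P("fsdp", None))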
I0424 07:43:30.144727 133699333535552 checkpointing.py:677] Setting up checkpoint logger...
I0424 07:43:30.144783 133699333535552 checkpointing.py:233] Creating checkpoint manager with ocdbt=True and zarr3=True
I0424 07:43:30.144827 133699333535552 pytree_checkpoint_handler.py:592] save_device_host_concurrent_bytes=None
I0424 07:43:30.145168 133699333535552 base_pytree_checkpoint_handler.py:441] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7998c0d18f80>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0424 07:43:33.008356 133699333535552 checkpointing.py:265] Enabling policy for fixed interval checkpointing.
I0424 07:43:33.008596 133699333535552 checkpoint_manager.py:708] [process=5][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7985006a6300>}, handler_registry=None
I0424 07:43:33.008848 133699333535552 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7985006a6300>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0424 07:43:33.008899 133699333535552 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7985006a9ee0>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0424 07:43:33.008935 133699333535552 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7985006a6300>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7985006a6300>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7985006a9ee0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7985006a9ee0>}).
I0424 07:43:33.009264 133699333535552 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.34
I0424 07:43:33.009334 133699333535552 async_checkpointer.py:192] [process=5][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x7985002d9940> timeout: 1200 secs and primary_host=0 for async checkpoint writes
I0424 07:43:33.744480 133699333535552 checkpoint_manager.py:1812] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_13_scan_layers_false/checkpoints
I0424 07:43:33.961011 133699333535552 checkpoint_manager.py:929] [process=5][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7985006a7200>
I0424 07:43:33.961183 133699333535552 checkpointing.py:301] Checkpoint manager created!
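The logged CheckpointManagerOptions boil down to: write a checkpoint every 10 steps (via save_decision_policy=FixedIntervalPolicy(interval=10)), keep every step (max_to_keep=None), and commit asynchronously. A rough Orbax sketch with equivalent options (the bucket path is a stand-in; MaxText's checkpointing.py also wires up the ocdbt/zarr3 storage format and multiprocessing options not shown here):

    import orbax.checkpoint as ocp

    options = ocp.CheckpointManagerOptions(
        save_interval_steps=10,           # same cadence as FixedIntervalPolicy(interval=10)
        max_to_keep=None,                 # never garbage-collect old steps
        enable_async_checkpointing=True,  # background commit thread, as in the log
    )
    mgr = ocp.CheckpointManager("gs://some-bucket/checkpoints", options=options)

    # mgr.save(step, args=ocp.args.StandardSave(state)) returns once the blocking
    # device-to-host copy is done; the GCS write continues on a background thread,
    # and mgr.wait_until_finished() blocks until that write has committed.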
I0424 07:43:34.910640 133699333535552 nnx_wrappers.py:437] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0424 07:43:34.910767 133699333535552 nnx_wrappers.py:437] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0424 07:43:35.292994 133699333535552 attentions.py:1088] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0424 07:43:35.293088 133699333535552 attentions.py:1088] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0424 07:43:35.309375 133699333535552 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0424 07:43:35.309434 133699333535552 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0424 07:43:35.345121 133699333535552 attentions.py:1154] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0424 07:43:35.345231 133699333535552 attentions.py:1154] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0424 07:43:35.361546 133699333535552 attentions.py:1155] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0424 07:43:35.361610 133699333535552 attentions.py:1155] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0424 07:43:35.377858 133699333535552 attentions.py:1156] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0424 07:43:35.377921 133699333535552 attentions.py:1156] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0424 07:43:35.407475 133699333535552 attentions.py:1198] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0424 07:43:35.407545 133699333535552 attentions.py:1198] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0424 07:43:35.429471 133699333535552 linears.py:525] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0424 07:43:35.429538 133699333535552 linears.py:525] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
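Each annotation above is printed twice: once with logical axis names and once with the physical mesh axes they resolve to ('fsdp' or None). This is Flax's logical-axis-rules mechanism; a small sketch of the resolution step (the rule list is a guess covering only axes visible in this log, not MaxText's full logical_axis_rules config):

    import flax.linen as nn

    # Hypothetical subset of rules: logical axis name -> mesh axis name.
    # Logical names without a rule resolve to None (replicated).
    rules = (("activation_batch", "fsdp"), ("embed", "fsdp"))

    with nn.logical_axis_rules(rules):
        spec = nn.logical_to_mesh_axes(
            ("activation_batch", "activation_norm_length", "activation_embed"))
        print(spec)  # PartitionSpec('fsdp', None, None), as in the lines above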
I0424 07:43:38.330959 133699333535552 checkpointing.py:577] checkpoint manager exists so trying to load this run's existing checkpoint
I0424 07:43:38.331086 133699333535552 checkpointing.py:665] No existing checkpoints found, not restoring checkpoint.
fsdp: 32
I0424 07:43:45.522083 133699333535552 maxtext_utils.py:1707]  params/params/decoder/decoder_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0424 07:43:45.522214 133699333535552 maxtext_utils.py:1707]  params/params/decoder/layers_0/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 07:43:45.522265 133699333535552 maxtext_utils.py:1707]  params/params/decoder/layers_0/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 07:43:45.522319 133699333535552 maxtext_utils.py:1707]  params/params/decoder/layers_0/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0424 07:43:45.522358 133699333535552 maxtext_utils.py:1707]  params/params/decoder/layers_0/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0424 07:43:45.522392 133699333535552 maxtext_utils.py:1707]  params/params/decoder/layers_0/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0424 07:43:45.522442 133699333535552 maxtext_utils.py:1707]  params/params/decoder/layers_0/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 07:43:45.522490 133699333535552 maxtext_utils.py:1707]  params/params/decoder/layers_0/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0424 07:43:45.522528 133699333535552 maxtext_utils.py:1707]  params/params/decoder/layers_0/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0424 07:43:45.522561 133699333535552 maxtext_utils.py:1707]  params/params/decoder/layers_0/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 07:43:45.522595 133699333535552 maxtext_utils.py:1707]  params/params/decoder/layers_1/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 07:43:45.522627 133699333535552 maxtext_utils.py:1707]  params/params/decoder/layers_1/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 07:43:45.522673 133699333535552 maxtext_utils.py:1707]  params/params/decoder/layers_1/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0424 07:43:45.522708 133699333535552 maxtext_utils.py:1707]  params/params/decoder/layers_1/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0424 07:43:45.522739 133699333535552 maxtext_utils.py:1707]  params/params/decoder/layers_1/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0424 07:43:45.522773 133699333535552 maxtext_utils.py:1707]  params/params/decoder/layers_1/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 07:43:45.522804 133699333535552 maxtext_utils.py:1707]  params/params/decoder/layers_1/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0424 07:43:45.522836 133699333535552 maxtext_utils.py:1707]  params/params/decoder/layers_1/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0424 07:43:45.522868 133699333535552 maxtext_utils.py:1707]  params/params/decoder/layers_1/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
[ ... identical entries repeated for layers_2 through layers_15 (logged lexicographically as layers_10 … layers_9): each layer lists the same nine tensors (mlp/wi_0, mlp/wi_1, mlp/wo, pre/post attention norm scales, self_attention query/key/value/out kernels) with the same shapes and the same Logical → Physical mappings as layers_0 and layers_1 above ... ]
I0424 07:43:45.526636 133699333535552 maxtext_utils.py:1707]  params/params/decoder/logits_dense/kernel
    Shape:     float32[2048,32000]
    Logical:   P('embed_vocab', 'vocab')
    Physical:  ('fsdp', None)
I0424 07:43:45.526690 133699333535552 maxtext_utils.py:1707]  params/params/token_embedder/embedding
    Shape:     float32[32000,2048]
    Logical:   P('vocab', 'embed_vocab')
    Physical:  (None, 'fsdp')
I0424 07:43:50.086751 133699333535552 train.py:155] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0424 07:43:50.086842 133699333535552 train.py:155] train/xent Physical: float32[32,2048]............................................ ('fsdp', None).
I0424 07:43:50.102612 133699333535552 train.py:162] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0424 07:43:50.102682 133699333535552 train.py:162] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
I0424 07:44:46.885870 133699333535552 max_utils.py:791] Total memory size: 1.8 GB, Output size: 0.4 GB, Temp size: 1.5 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0424 07:44:46.887066 133699333535552 metric_logger.py:301] number parameters: 1.104 billion
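The 1.104 billion figure is reproducible from the shapes in the dump above (16 decoder layers, embed 2048, MLP 7168, 16 heads of head_dim 128, vocab 32000):

    # Parameter count from the logged float32 shapes.
    embed, mlp, heads, head_dim, vocab, layers = 2048, 7168, 16, 128, 32000, 16
    per_layer = (
        3 * embed * mlp                  # mlp/wi_0, mlp/wi_1, mlp/wo
        + 3 * embed * heads * head_dim   # self_attention query/key/value kernels
        + heads * head_dim * embed       # self_attention/out kernel
        + 2 * embed                      # pre/post attention layer-norm scales
    )
    total = (
        layers * per_layer
        + vocab * embed                  # token_embedder/embedding
        + embed * vocab                  # logits_dense/kernel
        + embed                          # decoder_norm/scale
    )
    print(total, f"= {total / 1e9:.3f} billion")  # 1104218112 = 1.104 billion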
I0424 07:45:47.711523 133699333535552 checkpointing.py:772] Waiting for step 0 to finish before checkpoint...
I0424 07:45:47.949041 133699333535552 checkpointing.py:776] Waited 0.23749637603759766 seconds for step 0 to finish before starting checkpointing.
I0424 07:45:47.952926 133699333535552 checkpoint_manager.py:2009] [process=5][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0424 07:45:47.954821 133699333535552 checkpoint_manager.py:1512] [process=5] Saving checkpoint at step 0
I0424 07:45:47.956431 133699333535552 event_tracking.py:70] [process=5] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_13_scan_layers_false/checkpoints/0.
I0424 07:45:48.309243 133699333535552 signaling_client.py:364] Using JaxDistributedSignalingClient
I0424 07:45:48.310277 133699333535552 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0424 07:45:48.310403 133699333535552 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0424 07:45:48.614904 133699333535552 base_pytree_checkpoint_handler.py:154] [process=5][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.306249s
I0424 07:45:48.615083 133699333535552 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/blocking_gbytes_per_sec: 4.541 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.3397238254547119 s) (per-host)
I0424 07:45:48.615134 133699333535552 base_pytree_checkpoint_handler.py:768] [process=5][thread=MainThread] Initiated Pytree async_save. Time taken: 0.339784s (batch_requests_ready=0.016798s, total_serialization_initiated=0.322918s, others=0.000068s)
I0424 07:45:48.615382 133699333535552 composite_checkpoint_handler.py:715] [process=5][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.344175s (all_items=0.000020s, per_item={'items': '0.00001955'}, temp_paths=0.344155)
I0424 07:45:48.616226 133699333535552 event_tracking.py:125] [process=5] [async] Finished blocking save in 0.66 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_13_scan_layers_false/checkpoints/0.
I0424 07:45:48.616605 133573403571968 async_checkpointer.py:76] [process=5][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-24 08:05:48.616566
I0424 07:45:48.659873 133699333535552 checkpoint_manager.py:1560] [process=5][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0424 07:45:48.660251 133573376288512 async_checkpointer.py:280] [process=5][thread=save_finalize] Waiting for background save thread=async_save.
I0424 07:45:48.660411 133699333535552 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1777016747.952907, 'wait_for_prev_duration_secs': 6.437301635742188e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1777016747.95486, 'checkpointer_blocking_duration_secs': 0.661959171295166, 'get_old_steps_start_time': 1777016748.6168456, 'get_old_steps_duration_secs': 3.0517578125e-05, 'checkpoint_manager_blocking_start_time': 1777016747.9505482, 'checkpoint_manager_blocking_duration_secs': 0.7098245620727539}
I0424 07:45:48.660601 133699333535552 checkpointing.py:408] Started an asynchronous checkpoint save for step 0
I0424 07:45:48.660670 133699333535552 max_utils.py:750] 
Memstats: After params initialized:
I0424 07:45:48.660719 133699333535552 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_18(process=5,(2,4,0,0))
I0424 07:45:48.660753 133699333535552 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_19(process=5,(3,4,0,0))
I0424 07:45:48.660784 133699333535552 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_22(process=5,(2,5,0,0))
I0424 07:45:48.660807 133699333535552 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_23(process=5,(3,5,0,0))
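Those per-device figures are also the source of the header table's Memory % row:

    print(f"{0.82 / 31.25:.6%}")  # 2.624000%, shown as 2.62 in the metrics table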
I0424 07:45:49.008773 133699333535552 metric_logger.py:196] completed step: 0, seconds: 60.824, TFLOP/s/device: 0.223, Tokens/s/device: 33.671, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0424 07:45:49.185976 133699333535552 metric_logger.py:196] completed step: 1, seconds: 1.289, TFLOP/s/device: 10.542, Tokens/s/device: 1589.020, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0424 07:45:49.616733 133699333535552 metric_logger.py:196] completed step: 2, seconds: 0.049, TFLOP/s/device: 275.294, Tokens/s/device: 41495.289, total_weights: 65536, loss: 10.268, lm_loss: 10.268, perplexity: 28797.213
I0424 07:45:49.757417 133699333535552 metric_logger.py:196] completed step: 3, seconds: 0.431, TFLOP/s/device: 31.505, Tokens/s/device: 4748.842, total_weights: 65536, loss: 9.741, lm_loss: 9.741, perplexity: 17000.262
I0424 07:45:50.037450 133699333535552 metric_logger.py:196] completed step: 4, seconds: 0.143, TFLOP/s/device: 95.240, Tokens/s/device: 14355.610, total_weights: 65536, loss: 9.285, lm_loss: 9.285, perplexity: 10779.323
I0424 07:45:50.048485 133699333535552 metric_logger.py:196] completed step: 5, seconds: 0.140, TFLOP/s/device: 96.920, Tokens/s/device: 14608.849, total_weights: 65536, loss: 8.900, lm_loss: 8.900, perplexity: 7335.285
I0424 07:45:51.864443    2768 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0424 07:45:53.950751 133573384681216 array_metadata_store.py:203] [process=5][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_5
I0424 07:46:14.002812 133699333535552 metric_logger.py:196] completed step: 6, seconds: 0.281, TFLOP/s/device: 48.300, Tokens/s/device: 7280.302, total_weights: 65536, loss: 8.602, lm_loss: 8.602, perplexity: 5440.390
I0424 07:46:14.141833 133699333535552 metric_logger.py:196] completed step: 7, seconds: 23.822, TFLOP/s/device: 0.570, Tokens/s/device: 85.969, total_weights: 65536, loss: 8.394, lm_loss: 8.394, perplexity: 4418.604
I0424 07:46:14.279134 133699333535552 metric_logger.py:196] completed step: 8, seconds: 0.142, TFLOP/s/device: 95.743, Tokens/s/device: 14431.479, total_weights: 65536, loss: 8.264, lm_loss: 8.264, perplexity: 3882.855
I0424 07:46:14.415261 133699333535552 checkpointing.py:772] Waiting for step 9 to finish before checkpoint...
I0424 07:46:14.418719 133699333535552 checkpointing.py:776] Waited 0.0034744739532470703 seconds for step 9 to finish before starting checkpointing.
I0424 07:46:14.421353 133699333535552 checkpoint_manager.py:2020] [process=5][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0424 07:46:21.630938 133573403571968 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/gbytes_per_sec: 47.356 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 33.355538845062256 s) (per-host)
I0424 07:46:21.631071 133573403571968 async_checkpointer.py:90] [process=5][thread=async_save] 3 Handler Commit operations completed. Time taken: 33.014289s.
I0424 07:46:31.042793 133573403571968 async_checkpointer.py:160] [process=5][thread=async_save] Background save thread done. Time taken: 42.425997s.
I0424 07:46:31.043078 133573376288512 async_checkpointer.py:288] [process=5][thread=save_finalize] Done with waiting for background save thread=async_save.
I0424 07:46:31.043241 133573376288512 async_checkpointer.py:298] [process=5][thread=save_finalize] No errors found in background save thread=async_save.
I0424 07:46:31.043301 133573376288512 checkpoint_manager.py:2137] [process=5][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0424 07:46:31.045552 133573376288512 checkpoint_manager.py:2146] [process=5][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0424 07:46:31.045767 133699333535552 checkpoint_manager.py:2032] [process=5][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0424 07:46:31.045924 133699333535552 checkpoint_manager.py:1452] Waiting for previous save to complete took 16.624571 seconds. If this number is high, consider checkpointing less frequently.
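The stall follows from the write bandwidth logged a few lines up: each checkpoint pushes 1.5 GiB per host to GCS at roughly 37-47 MiB/s, i.e. 33-42 s in the background thread, while the ten post-compilation steps between saves take only about 10 × 0.14 s ≈ 1.4 s. At a fixed interval of 10 steps, the step-9 save therefore always finds the step-0 write still in flight, and the 16.6 s reported here is the unfinished tail of that 42.4 s background save.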
I0424 07:46:31.048717 133699333535552 checkpoint_manager.py:1512] [process=5] Saving checkpoint at step 9
I0424 07:46:31.050873 133699333535552 event_tracking.py:70] [process=5] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_13_scan_layers_false/checkpoints/9.
I0424 07:46:31.380016 133699333535552 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0424 07:46:31.380617 133699333535552 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0424 07:46:31.507238 133699333535552 base_pytree_checkpoint_handler.py:154] [process=5][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.128628s
I0424 07:46:31.507400 133699333535552 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/blocking_gbytes_per_sec: 9.616 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.16042518615722656 s) (per-host)
I0424 07:46:31.507450 133699333535552 base_pytree_checkpoint_handler.py:768] [process=5][thread=MainThread] Initiated Pytree async_save. Time taken: 0.160484s (batch_requests_ready=0.016019s, total_serialization_initiated=0.144398s, others=0.000066s)
I0424 07:46:31.507755 133699333535552 composite_checkpoint_handler.py:715] [process=5][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.164935s (all_items=0.000017s, per_item={'items': '0.00001669'}, temp_paths=0.164918)
I0424 07:46:31.508531 133699333535552 event_tracking.py:125] [process=5] [async] Finished blocking save in 0.46 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_13_scan_layers_false/checkpoints/9.
I0424 07:46:31.508934 133573376288512 async_checkpointer.py:76] [process=5][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-24 08:06:31.508897
I0424 07:46:31.519488 133699333535552 checkpoint_manager.py:1560] [process=5][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0424 07:46:31.519797 133573273184000 async_checkpointer.py:280] [process=5][thread=save_finalize] Waiting for background save thread=async_save.
I0424 07:46:31.519954 133699333535552 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1777016774.4213216, 'wait_for_prev_duration_secs': 16.624570846557617, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1777016791.0487554, 'checkpointer_blocking_duration_secs': 0.4603233337402344, 'get_old_steps_start_time': 1777016791.5091026, 'get_old_steps_duration_secs': 3.0994415283203125e-05, 'checkpoint_manager_blocking_start_time': 1777016774.4195893, 'checkpoint_manager_blocking_duration_secs': 17.10032868385315}
I0424 07:46:31.520170 133699333535552 checkpointing.py:408] Started an asynchronous checkpoint save for step 9
I0424 07:46:31.520219 133699333535552 checkpoint_manager.py:2020] [process=5][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0424 07:46:37.015066 133573384681216 array_metadata_store.py:203] [process=5][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_5
I0424 07:47:13.691683 133573376288512 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/gbytes_per_sec: 37.303 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 42.34465575218201 s) (per-host)
I0424 07:47:13.691796 133573376288512 async_checkpointer.py:90] [process=5][thread=async_save] 3 Handler Commit operations completed. Time taken: 42.182746s.
I0424 07:47:21.109174 133573376288512 async_checkpointer.py:160] [process=5][thread=async_save] Background save thread done. Time taken: 49.600111s.
I0424 07:47:21.109472 133573273184000 async_checkpointer.py:288] [process=5][thread=save_finalize] Done with waiting for background save thread=async_save.
I0424 07:47:21.109598 133573273184000 async_checkpointer.py:298] [process=5][thread=save_finalize] No errors found in background save thread=async_save.
I0424 07:47:21.109648 133573273184000 checkpoint_manager.py:2137] [process=5][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0424 07:47:21.112200 133573273184000 checkpoint_manager.py:2146] [process=5][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0424 07:47:21.112384 133699333535552 checkpoint_manager.py:2032] [process=5][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0424 07:47:21.112529 133699333535552 checkpoint_manager.py:2009] [process=5][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0424 07:47:21.113527 133699333535552 metric_logger.py:196] completed step: 9, seconds: 0.139, TFLOP/s/device: 97.701, Tokens/s/device: 14726.609, total_weights: 65536, loss: 8.188, lm_loss: 8.188, perplexity: 3598.748
Per train step:
  Total TFLOPs: 13.59
  split as 93.93% learnable-weight FLOPs and 6.07% attention FLOPs
XPK End: Fri Apr 24 07:47:34 UTC 2026
EXIT_CODE=0
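
The step-9 throughput above is internally consistent: TFLOP/s/device is the per-step FLOP total divided by the step wall time, and tokens/s/device follows from total_weights (global batch 32 × sequence length 2048) spread over 32 devices. A back-of-the-envelope check, with all variable names ours and the small residual explained by the log rounding the per-step total to 13.59:

```python
# Back-of-the-envelope check of the logged step-9 metrics. Variable names
# are ours; the input values are copied from the log lines above.
total_tflops_per_step = 13.59   # "Total TFLOPs" per device and step
step_seconds = 0.139            # step-9 wall time from the log
tokens_per_step = 65536         # total_weights = 32 (batch) * 2048 (seq len)
num_devices = 32

print(total_tflops_per_step / step_seconds)          # ~97.8 TFLOP/s/device (log: 97.701)
print(tokens_per_step / num_devices / step_seconds)  # ~14734 tok/s/device (log: 14726.6)
```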
XPK Start: Fri Apr 24 15:35:58 UTC 2026
2026-04-24 15:36:02.209748: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1777044962.222863      10 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1777044962.226609      10 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1777044962.238412      10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1777044962.238432      10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1777044962.238435      10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1777044962.238437      10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
2026-04-24 15:36:21.407247: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0424 15:36:21.935934 137106215524160 max_utils.py:273] Attempting to initialize the jax distributed system...
INFO:2026-04-24 15:36:30,977:jax._src.distributed:140: Starting JAX distributed service on [::]:8482
I0424 15:36:30.977586 137106215524160 distributed.py:140] Starting JAX distributed service on [::]:8482
INFO:2026-04-24 15:36:30,979:jax._src.distributed:157: Connecting to JAX distributed service on mt-13-scan-layers-false-lbv43-slice-job-0-0.mt-13-scan-layers-false-lbv43:8482
I0424 15:36:30.979974 137106215524160 distributed.py:157] Connecting to JAX distributed service on mt-13-scan-layers-false-lbv43-slice-job-0-0.mt-13-scan-layers-false-lbv43:8482
I0424 15:36:32.036529 137106215524160 max_utils.py:284] Jax distributed system initialized!
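
For reference, max_utils wraps jax.distributed.initialize here. A minimal standalone sketch of what the init above amounts to is shown below; the coordinator address is taken from the log, while num_processes and process_id are assumptions (32 chips at 4 local devices per host). On Cloud TPU a bare initialize() with no arguments usually suffices.

```python
import jax

# Minimal sketch of the distributed init logged above (not the MaxText
# wrapper). On Cloud TPU the arguments are normally auto-detected, so
# jax.distributed.initialize() with no arguments is usually enough; the
# explicit form is shown for clarity.
jax.distributed.initialize(
    coordinator_address="mt-13-scan-layers-false-lbv43-slice-job-0-0:8482",  # from the log
    num_processes=8,   # assumption: 32 chips / 4 local devices per host
    process_id=6,      # this host logs as process=6
)
```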
I0424 15:36:38.845523 137106215524160 max_utils.py:800] System Information: Jax Version: 0.8.1
I0424 15:36:38.845630 137106215524160 max_utils.py:801] System Information: Jaxlib Version: 0.8.1
I0424 15:36:38.845672 137106215524160 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Nov 12 2025 14:16:36 (1762985796) cl/831091709
I0424 15:36:38.845705 137106215524160 train_utils.py:377] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0424 15:36:39.480221 137106215524160 maxtext_utils.py:1631] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0424 15:36:39.480815 137106215524160 maxtext_utils.py:1631] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0424 15:36:39.480997 137106215524160 checkpointing.py:688] Setting up checkpoint logger...
I0424 15:36:39.481047 137106215524160 checkpointing.py:234] Creating checkpoint manager with ocdbt=True and zarr3=True
I0424 15:36:39.481090 137106215524160 pytree_checkpoint_handler.py:589] save_device_host_concurrent_bytes=None
I0424 15:36:39.481475 137106215524160 base_pytree_checkpoint_handler.py:415] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7cb205996450>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0424 15:36:42.330947 137106215524160 checkpointing.py:266] Enabling policy for fixed interval checkpointing.
I0424 15:36:42.331431 137106215524160 checkpoint_manager.py:709] [process=6][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7c9f20c39940>}, handler_registry=None
I0424 15:36:42.331772 137106215524160 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7c9f20c39940>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0424 15:36:42.331828 137106215524160 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7c9f20c3ecc0>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0424 15:36:42.331866 137106215524160 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7c9f20c39940>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7c9f20c39940>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7c9f20c3ecc0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7c9f20c3ecc0>}).
I0424 15:36:42.332403 137106215524160 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.33
I0424 15:36:42.332485 137106215524160 async_checkpointer.py:177] [process=6][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x7c9e53c184a0> timeout: 600 secs and primary_host=0 for async checkpoint writes
I0424 15:36:43.103714 137106215524160 checkpoint_manager.py:1818] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260424_145744/linen_xpk_feat_nnx_set_defaults_true_20260424_145744_13_scan_layers_false/checkpoints
I0424 15:36:43.123782 137106215524160 checkpoint_manager.py:929] [process=6][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260424_145744/linen_xpk_feat_nnx_set_defaults_true_20260424_145744_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7c9d7c655ac0>
I0424 15:36:43.123911 137106215524160 checkpointing.py:302] Checkpoint manager created!
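
The options dump above boils down to: save every 10 steps (FixedIntervalPolicy(interval=10)), keep everything (LatestN(n=None)), and save asynchronously. A minimal Orbax sketch, not the MaxText checkpointing.py wrapper, that configures roughly the same behavior; the GCS path is illustrative only:

```python
import orbax.checkpoint as ocp

# A minimal sketch of an async CheckpointManager roughly matching the
# options logged above. Path and variable names are illustrative.
options = ocp.CheckpointManagerOptions(
    save_interval_steps=10,           # FixedIntervalPolicy(interval=10)
    max_to_keep=None,                 # LatestN(n=None): keep every checkpoint
    enable_async_checkpointing=True,  # background save thread, as in the log
)
mngr = ocp.CheckpointManager("gs://my-bucket/run/checkpoints", options=options)

# mngr.save(step, args=ocp.args.StandardSave(train_state))  # returns after the
#                                                           # blocking D2H phase
# mngr.wait_until_finished()                                # joins save_finalize
```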
I0424 15:36:44.045168 137106215524160 nnx_wrappers.py:455] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0424 15:36:44.045352 137106215524160 nnx_wrappers.py:455] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0424 15:36:44.838083 137106215524160 attentions.py:1084] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0424 15:36:44.838231 137106215524160 attentions.py:1084] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0424 15:36:44.994115 137106215524160 attentions.py:1085] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0424 15:36:44.994246 137106215524160 attentions.py:1085] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0424 15:36:45.159348 137106215524160 attentions.py:1150] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0424 15:36:45.159479 137106215524160 attentions.py:1150] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0424 15:36:45.313815 137106215524160 attentions.py:1151] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0424 15:36:45.313946 137106215524160 attentions.py:1151] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0424 15:36:45.466722 137106215524160 attentions.py:1152] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0424 15:36:45.466854 137106215524160 attentions.py:1152] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0424 15:36:45.627832 137106215524160 attentions.py:1193] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0424 15:36:45.627965 137106215524160 attentions.py:1193] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0424 15:36:45.651183 137106215524160 linears.py:541] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0424 15:36:45.651281 137106215524160 linears.py:541] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
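
Each Logical/Physical pair above shows a Flax logical-axis annotation being resolved against the device mesh: the batch-like logical axis lands on the 'fsdp' mesh axis and every other axis stays replicated. A self-contained sketch of that resolution, with the rule set made up by us to echo the log:

```python
import flax.linen as nn

# Hypothetical rules echoing the mapping in the log: batch-like logical axes
# shard over the 'fsdp' mesh axis; logical axes without a rule stay unsharded.
rules = (("activation_batch", "fsdp"),)

spec = nn.logical_to_mesh_axes(
    ("activation_batch", "activation_norm_length", "activation_embed"),
    rules,
)
print(spec)  # PartitionSpec('fsdp', None, None) -- the "Physical" line above
```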
I0424 15:37:00.828871 137106215524160 checkpointing.py:578] checkpoint manager exists so trying to load this run's existing checkpoint
I0424 15:37:00.829004 137106215524160 checkpointing.py:676] No existing checkpoints found, not restoring checkpoint.
fsdp: 32
I0424 15:37:09.770181 137106215524160 maxtext_utils.py:1740]  params/params/decoder/decoder_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.770323 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_0/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.770378 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_0/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.770437 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_0/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0424 15:37:09.770479 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_0/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.770515 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_0/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.770567 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_0/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.770620 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_0/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0424 15:37:09.770660 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_0/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.770697 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_0/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.770734 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_1/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.770769 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_1/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.770804 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_1/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0424 15:37:09.770837 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_1/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.770869 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_1/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.770903 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_1/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.770938 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_1/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0424 15:37:09.770972 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_1/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.771004 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_1/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.771036 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_10/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.771068 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_10/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.771111 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_10/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0424 15:37:09.771144 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_10/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.771175 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_10/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.771209 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_10/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.771241 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_10/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0424 15:37:09.771281 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_10/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.771319 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_10/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.771352 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_11/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.771384 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_11/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.771417 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_11/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0424 15:37:09.771446 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_11/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.771476 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_11/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.771506 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_11/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.771540 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_11/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0424 15:37:09.771571 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_11/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.771603 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_11/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.771635 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_12/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.771667 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_12/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.771699 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_12/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0424 15:37:09.771728 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_12/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.771757 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_12/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.771788 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_12/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.771820 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_12/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0424 15:37:09.771851 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_12/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.771882 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_12/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.771913 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_13/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.771943 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_13/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.771974 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_13/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0424 15:37:09.772002 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_13/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.772030 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_13/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.772060 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_13/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.772090 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_13/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0424 15:37:09.772146 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_13/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.772179 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_13/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.772211 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_14/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.772243 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_14/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.772274 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_14/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0424 15:37:09.772303 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_14/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.772337 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_14/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.772368 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_14/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.772399 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_14/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0424 15:37:09.772430 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_14/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.772462 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_14/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.772494 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_15/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.772526 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_15/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.772556 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_15/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0424 15:37:09.772585 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_15/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.772613 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_15/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.772644 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_15/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.772675 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_15/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0424 15:37:09.772706 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_15/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.772737 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_15/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.772767 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_2/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.772799 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_2/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.772829 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_2/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0424 15:37:09.772858 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_2/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.772886 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_2/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.772918 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_2/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.772952 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_2/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0424 15:37:09.772983 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_2/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.773015 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_2/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.773045 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_3/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.773075 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_3/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.773117 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_3/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0424 15:37:09.773147 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_3/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.773175 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_3/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.773206 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_3/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.773236 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_3/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0424 15:37:09.773268 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_3/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.773298 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_3/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.773332 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_4/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.773363 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_4/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.773392 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_4/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0424 15:37:09.773421 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_4/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.773448 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_4/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.773478 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_4/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.773508 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_4/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0424 15:37:09.773538 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_4/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.773568 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_4/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.773598 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_5/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.773628 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_5/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.773658 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_5/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0424 15:37:09.773686 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_5/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.773714 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_5/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.773744 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_5/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.773774 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_5/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0424 15:37:09.773805 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_5/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.773834 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_5/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.773866 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_6/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.773896 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_6/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.773926 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_6/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0424 15:37:09.773954 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_6/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.773982 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_6/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.774012 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_6/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.774045 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_6/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0424 15:37:09.774075 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_6/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.774116 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_6/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.774148 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_7/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.774179 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_7/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.774209 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_7/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0424 15:37:09.774237 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_7/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.774265 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_7/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.774295 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_7/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.774329 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_7/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0424 15:37:09.774359 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_7/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.774390 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_7/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.774420 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_8/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.774449 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_8/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.774481 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_8/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0424 15:37:09.774508 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_8/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.774536 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_8/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.774566 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_8/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.774596 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_8/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0424 15:37:09.774626 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_8/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.774656 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_8/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.774685 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_9/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.774715 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_9/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0424 15:37:09.774745 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_9/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0424 15:37:09.774772 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_9/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.774799 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_9/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0424 15:37:09.774830 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_9/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.774860 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_9/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0424 15:37:09.774890 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_9/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.774920 137106215524160 maxtext_utils.py:1740]  params/params/decoder/layers_9/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0424 15:37:09.774967 137106215524160 maxtext_utils.py:1740]  params/params/decoder/logits_dense/kernel
    Shape:     float32[2048,32000]
    Logical:   PartitionSpec('embed', 'vocab')
    Physical:  ('fsdp', None)
I0424 15:37:09.775012 137106215524160 maxtext_utils.py:1740]  params/params/token_embedder/embedding
    Shape:     float32[32000,2048]
    Logical:   PartitionSpec('vocab', 'embed')
    Physical:  (None, 'fsdp')

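The parameter dump above (maxtext_utils.py:1740) is a walk over the initialized parameter pytree. A generic sketch, not maxtext_utils itself, that produces a similar shape/sharding listing, assuming the leaves are jax.Arrays placed with a NamedSharding:

```python
import jax

# Generic sketch: print each parameter's path, dtype/shape, and physical
# PartitionSpec. Assumes leaves are jax.Arrays with a NamedSharding; other
# sharding types fall back to None.
def dump_shardings(params):
    def report(path, leaf):
        spec = getattr(leaf.sharding, "spec", None)
        print(jax.tree_util.keystr(path))
        print(f"    Shape:    {leaf.dtype}{list(leaf.shape)}")
        print(f"    Physical: {spec}")
    jax.tree_util.tree_map_with_path(report, params)
```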
I0424 15:37:13.724607 137106215524160 train.py:158] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0424 15:37:13.724702 137106215524160 train.py:158] train/xent Physical: float32[32,2048]............................................ ('fsdp', None).
I0424 15:37:13.739815 137106215524160 train.py:165] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0424 15:37:13.739874 137106215524160 train.py:165] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
I0424 15:38:10.797432 137106215524160 max_utils.py:791] Total memory size: 1.8 GB, Output size: 0.4 GB, Temp size: 1.4 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0424 15:38:10.800892 137106215524160 metric_logger.py:289] number parameters: 1.104 billion
I0424 15:39:13.085691 137106215524160 checkpointing.py:794] Waiting for step 0 to finish before checkpoint...
I0424 15:39:13.376921 137106215524160 checkpointing.py:798] Waited 0.2912111282348633 seconds for step 0 to finish before starting checkpointing.
I0424 15:39:13.382586 137106215524160 checkpoint_manager.py:2013] [process=6][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0424 15:39:13.384607 137106215524160 checkpoint_manager.py:1518] [process=6] Saving checkpoint at step 0
I0424 15:39:13.387704 137106215524160 async_checkpointer.py:452] [process=6] Started async saving checkpoint to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260424_145744/linen_xpk_feat_nnx_set_defaults_true_20260424_145744_13_scan_layers_false/checkpoints/0.
I0424 15:39:14.033688 137106215524160 signaling_client.py:364] Using JaxDistributedSignalingClient
I0424 15:39:14.034798 137106215524160 jax_array_handlers.py:358] Scheduling D2H of 444 prioritized jax.Array.
I0424 15:39:14.034919 137106215524160 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0424 15:39:14.361492 137106215524160 base_pytree_checkpoint_handler.py:153] [process=6][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.328326s
I0424 15:39:14.361664 137106215524160 base_pytree_checkpoint_handler.py:129] [process=6] /jax/checkpoint/write/blocking_gbytes_per_sec: 2.401 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.6425964832305908 s) (per-host)
I0424 15:39:14.361716 137106215524160 base_pytree_checkpoint_handler.py:737] [process=6][thread=MainThread] Initiated Pytree async_save. Time taken: 0.642659s (batch_requests_ready=0.295457s, total_serialization_initiated=0.347132s, others=0.000070s)
I0424 15:39:14.361987 137106215524160 composite_checkpoint_handler.py:715] [process=6][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.647974s (all_items=0.000019s, per_item={'items': '0.00001884'}, temp_paths=0.647955)
I0424 15:39:14.363053 136980463474432 async_checkpointer.py:79] [process=6][thread=async_save] Background save thread started.
I0424 15:39:14.363241 137106215524160 async_checkpointer.py:561] Finished blocking save. Time taken: 0.978562s. Continuing background save to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260424_145744/linen_xpk_feat_nnx_set_defaults_true_20260424_145744_13_scan_layers_false/checkpoints/0.
I0424 15:39:14.365265 137106215524160 checkpoint_manager.py:1566] [process=6][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0424 15:39:14.365566 136979884771072 async_checkpointer.py:265] [process=6][thread=save_finalize] Waiting for background save thread=async_save.
I0424 15:39:14.365695 137106215524160 standard_logger.py:34] {
    'step': 0, 'event_type': 'save',
    'directory': 'gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260424_145744/linen_xpk_feat_nnx_set_defaults_true_20260424_145744_13_scan_layers_false/checkpoints',
    'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False,
    'wait_for_prev_start_time': 1777045153.3825676, 'wait_for_prev_duration_secs': 6.890296936035156e-05,
    'time_between_consecutive_saves_sec': None,
    'checkpointer_blocking_start_time': 1777045153.3846452, 'checkpointer_blocking_duration_secs': 0.9787116050720215,
    'get_old_steps_start_time': 1777045154.3633783, 'get_old_steps_duration_secs': 2.5033950805664062e-05,
    'checkpoint_manager_blocking_start_time': 1777045153.380319, 'checkpoint_manager_blocking_duration_secs': 0.985339879989624}
I0424 15:39:14.365946 137106215524160 checkpointing.py:409] Started an asynchronous checkpoint save for step 0
I0424 15:39:14.365999 137106215524160 max_utils.py:750] 
Memstats: After params initialized:
I0424 15:39:14.366045 137106215524160 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_24(process=6,(0,6,0,0))
I0424 15:39:14.366077 137106215524160 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_25(process=6,(1,6,0,0))
I0424 15:39:14.366116 137106215524160 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_28(process=6,(0,7,0,0))
I0424 15:39:14.366142 137106215524160 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_29(process=6,(1,7,0,0))
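
The Memstats block above is presumably built on the per-device memory_stats() hook that JAX exposes on TPU/GPU devices. A sketch that reproduces the same "Using (GB) x / y" figures; note the log's "GB" is really GiB, since 0.82 / 31.25 matches bytes divided by 2**30:

```python
import jax

# Sketch of the per-device memstats report above. memory_stats() is
# available on TPU/GPU devices and returns None on backends without it.
GIB = 1024 ** 3
for d in jax.local_devices():
    stats = d.memory_stats()
    if stats:
        used = stats["bytes_in_use"] / GIB
        limit = stats["bytes_limit"] / GIB
        print(f"Using (GB) {used:.2f} / {limit:.2f} "
              f"({100 * used / limit:.6f}%) on {d}")
```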
I0424 15:39:14.679428 137106215524160 metric_logger.py:185] completed step: 0, seconds: 62.285, TFLOP/s/device: 0.218, Tokens/s/device: 32.881, total_weights: 65536, loss: 10.877
I0424 15:39:14.819979 137106215524160 metric_logger.py:185] completed step: 1, seconds: 1.583, TFLOP/s/device: 8.583, Tokens/s/device: 1293.786, total_weights: 65536, loss: 10.877
I0424 15:39:15.247817 137106215524160 metric_logger.py:185] completed step: 2, seconds: 0.017, TFLOP/s/device: 817.222, Tokens/s/device: 123180.561, total_weights: 65536, loss: 10.268
I0424 15:39:15.383889 137106215524160 metric_logger.py:185] completed step: 3, seconds: 0.427, TFLOP/s/device: 31.804, Tokens/s/device: 4793.873, total_weights: 65536, loss: 9.741
I0424 15:39:15.661419 137106215524160 metric_logger.py:185] completed step: 4, seconds: 0.143, TFLOP/s/device: 95.113, Tokens/s/device: 14336.416, total_weights: 65536, loss: 9.285
I0424 15:39:15.674620 137106215524160 metric_logger.py:185] completed step: 5, seconds: 0.136, TFLOP/s/device: 99.878, Tokens/s/device: 15054.728, total_weights: 65536, loss: 8.901
I0424 15:39:37.964846 137106215524160 metric_logger.py:185] completed step: 6, seconds: 0.279, TFLOP/s/device: 48.692, Tokens/s/device: 7339.371, total_weights: 65536, loss: 8.602
I0424 15:39:38.101294 137106215524160 metric_logger.py:185] completed step: 7, seconds: 22.161, TFLOP/s/device: 0.613, Tokens/s/device: 92.417, total_weights: 65536, loss: 8.393
I0424 15:39:38.237602 137106215524160 metric_logger.py:185] completed step: 8, seconds: 0.142, TFLOP/s/device: 95.973, Tokens/s/device: 14466.035, total_weights: 65536, loss: 8.264
I0424 15:39:38.373663 137106215524160 checkpointing.py:794] Waiting for step 9 to finish before checkpoint...
I0424 15:39:38.377467 137106215524160 checkpointing.py:798] Waited 0.0038208961486816406 seconds for step 9 to finish before starting checkpointing.
I0424 15:39:38.380150 137106215524160 checkpoint_manager.py:2024] [process=6][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0424 15:39:39.420104    2727 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0424 15:39:41.434871 136979901556480 array_metadata_store.py:203] [process=6][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260424_145744/linen_xpk_feat_nnx_set_defaults_true_20260424_145744_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_6
I0424 15:40:09.488428 136980463474432 base_pytree_checkpoint_handler.py:129] [process=6] /jax/checkpoint/write/gbytes_per_sec: 28.324 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 55.76931691169739 s) (per-host)
I0424 15:40:09.488579 136980463474432 async_checkpointer.py:90] [process=6][thread=async_save] 3 Handler Commit operations completed. Time taken: 55.125385s.
I0424 15:40:18.492988 136980463474432 async_checkpointer.py:144] [process=6][thread=async_save] Background save thread done. Time taken: 64.129789s.
I0424 15:40:18.493311 136979884771072 async_checkpointer.py:273] [process=6][thread=save_finalize] Done with waiting for background save thread=async_save.
I0424 15:40:18.493439 136979884771072 async_checkpointer.py:283] [process=6][thread=save_finalize] No errors found in background save thread=async_save.
I0424 15:40:18.493493 136979884771072 checkpoint_manager.py:2133] [process=6][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0424 15:40:18.505957 136979884771072 checkpoint_manager.py:2142] [process=6][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0424 15:40:18.506147 137106215524160 checkpoint_manager.py:2036] [process=6][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0424 15:40:18.506295 137106215524160 checkpoint_manager.py:1458] Waiting for previous save to complete took 40.126143 seconds. If this number is high, consider checkpointing less frequently.
I0424 15:40:18.508611 137106215524160 checkpoint_manager.py:1518] [process=6] Saving checkpoint at step 9
I0424 15:40:18.512143 137106215524160 async_checkpointer.py:452] [process=6] Started async saving checkpoint to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260424_145744/linen_xpk_feat_nnx_set_defaults_true_20260424_145744_13_scan_layers_false/checkpoints/9.
I0424 15:40:19.509655 137106215524160 jax_array_handlers.py:358] Scheduling D2H of 444 prioritized jax.Array.
I0424 15:40:19.509818 137106215524160 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0424 15:40:19.632321 137106215524160 base_pytree_checkpoint_handler.py:153] [process=6][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.124184s
I0424 15:40:19.632490 137106215524160 base_pytree_checkpoint_handler.py:129] [process=6] /jax/checkpoint/write/blocking_gbytes_per_sec: 3.752 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.41109299659729004 s) (per-host)
I0424 15:40:19.632539 137106215524160 base_pytree_checkpoint_handler.py:737] [process=6][thread=MainThread] Initiated Pytree async_save. Time taken: 0.411151s (batch_requests_ready=0.269329s, total_serialization_initiated=0.141757s, others=0.000066s)
I0424 15:40:19.632811 137106215524160 composite_checkpoint_handler.py:715] [process=6][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.415401s (all_items=0.000015s, per_item={'items': '0.00001478'}, temp_paths=0.415386)
I0424 15:40:19.633839 136980426340096 async_checkpointer.py:79] [process=6][thread=async_save] Background save thread started.
I0424 15:40:19.634025 137106215524160 async_checkpointer.py:561] Finished blocking save. Time taken: 1.125341s. Continuing background save to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260424_145744/linen_xpk_feat_nnx_set_defaults_true_20260424_145744_13_scan_layers_false/checkpoints/9.
I0424 15:40:20.001445 137106215524160 checkpoint_manager.py:1566] [process=6][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0424 15:40:20.001839 136979884771072 async_checkpointer.py:265] [process=6][thread=save_finalize] Waiting for background save thread=async_save.
I0424 15:40:20.002013 137106215524160 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260424_145744/linen_xpk_feat_nnx_set_defaults_true_20260424_145744_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1777045178.3801208, 'wait_for_prev_duration_secs': 40.126142740249634, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1777045218.5086505, 'checkpointer_blocking_duration_secs': 1.1254990100860596, 'get_old_steps_start_time': 1777045219.6341686, 'get_old_steps_duration_secs': 2.5033950805664062e-05, 'checkpoint_manager_blocking_start_time': 1777045178.378399, 'checkpoint_manager_blocking_duration_secs': 41.623576402664185}
I0424 15:40:20.002230 137106215524160 checkpointing.py:409] Started an asynchronous checkpoint save for step 9
I0424 15:40:20.002280 137106215524160 checkpoint_manager.py:2024] [process=6][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0424 15:40:24.824047 137019413485312 array_metadata_store.py:203] [process=6][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260424_145744/linen_xpk_feat_nnx_set_defaults_true_20260424_145744_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_6
I0424 15:41:00.898083 136980426340096 base_pytree_checkpoint_handler.py:129] [process=6] /jax/checkpoint/write/gbytes_per_sec: 37.901 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 41.67664337158203 s) (per-host)
I0424 15:41:00.898231 136980426340096 async_checkpointer.py:90] [process=6][thread=async_save] 3 Handler Commit operations completed. Time taken: 41.264270s.
I0424 15:41:09.079222 136980426340096 async_checkpointer.py:144] [process=6][thread=async_save] Background save thread done. Time taken: 49.445245s.
I0424 15:41:09.079470 136979884771072 async_checkpointer.py:273] [process=6][thread=save_finalize] Done with waiting for background save thread=async_save.
I0424 15:41:09.079529 136979884771072 async_checkpointer.py:283] [process=6][thread=save_finalize] No errors found in background save thread=async_save.
I0424 15:41:09.079577 136979884771072 checkpoint_manager.py:2133] [process=6][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0424 15:41:09.082063 136979884771072 checkpoint_manager.py:2142] [process=6][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0424 15:41:09.082246 137106215524160 checkpoint_manager.py:2036] [process=6][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0424 15:41:09.082393 137106215524160 checkpoint_manager.py:2013] [process=6][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0424 15:41:09.083029 137106215524160 metric_logger.py:185] completed step: 9, seconds: 0.136, TFLOP/s/device: 99.644, Tokens/s/device: 15019.398, total_weights: 65536, loss: 8.188
Per train step:
 Total TFLOPs: 13.59
 split as 93.93% learnable weight flops and 6.07% attention flops
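These summary figures cross-check against the step metrics: dividing 13.59 TFLOPs by a step's wall time reproduces its TFLOP/s/device, which suggests the "Total TFLOPs" figure is per device per train step:

    # Cross-check using only values from this log:
    tflops_per_step = 13.59          # per device, per train step
    print(tflops_per_step / 0.136)   # ~99.9  vs 99.644 TFLOP/s/device at step 9
    print(tflops_per_step / 62.285)  # ~0.218 vs 0.218 TFLOP/s/device at step 0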
XPK End: Fri Apr 24 15:41:23 UTC 2026
EXIT_CODE=0